Update README.md
README.md CHANGED
````diff
@@ -6,7 +6,7 @@ tags:
 - immunology
 - seq2seq
 - t5
-pipeline_tag:
+pipeline_tag: other
 ---
 
 # TCRT5 model (pre-trained)
@@ -17,7 +17,7 @@ This model is the pre-trained model used for finetuning [TCRT5](https://huggingf
 seq2seq model designed for the conditional generation of T-cell receptor (TCR) sequences given a target peptide-MHC (pMHC).
 It is a transformers model that is built on the [T5 architecture](https://github.com/google-research/text-to-text-transfer-transformer/tree/main/t5) and
 operationalized by the associated HuggingFace [abstraction](https://huggingface.co/docs/transformers/v4.46.2/en/model_doc/t5#transformers.T5ForConditionalGeneration).
-It is released along with [this paper](
+It is released along with [this paper](https://www.nature.com/articles/s42256-025-01096-6#citeas).
 
 ## Intended uses & limitations
 
@@ -111,10 +111,20 @@ Carbon emissions were estimated using the [Machine Learning Impact calculator](h
 ## BibTeX entry and citation info
 
 ```bibtex
-@
-
-
-
-
+@Article{Karthikeyan2025_tcrtranslate,
+author={Karthikeyan, Dhuvarakesh
+and Bennett, Sarah N.
+and Reynolds, Amy G.
+and Vincent, Benjamin G.
+and Rubinsteyn, Alex},
+title={Conditional generation of real antigen-specific T cell receptor sequences},
+journal={Nature Machine Intelligence},
+year={2025},
+month={Sep},
+day={08},
+abstract={Despite recent advances in T cell receptor (TCR) engineering, designing functional TCRs against arbitrary targets remains challenging due to complex rules governing cross-reactivity and limited paired data. Here we present TCR-TRANSLATE, a sequence-to-sequence framework that adapts low-resource machine translation techniques to generate antigen-specific TCR sequences against unseen epitopes. By evaluating 12 model variants of the BART and T5 model architectures, we identified key factors affecting performance and utility, revealing discordances between these objectives. Our flagship model, TCRT5, outperforms existing approaches on computational benchmarks, prioritizing functionally relevant sequences at higher ranks. Most significantly, we experimentally validated a computationally designed TCR against Wilms' tumour antigen, a therapeutically relevant target in leukaemia, excluded from our training and validation sets. Although the identified TCR shows cross-reactivity with pathogen-derived peptides, highlighting limitations in specificity, our work represents the successful computational design of a functional TCR construct against a non-viral epitope from the target sequence alone. Our findings establish a foundation for computational TCR design and reveal current limitations in data availability and methodology, providing a framework for accelerating personalized immunotherapy by reducing the search space for novel targets.},
+issn={2522-5839},
+doi={10.1038/s42256-025-01096-6},
+url={https://doi.org/10.1038/s42256-025-01096-6}
 }
 ```
````
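The second hunk above documents the model as a T5-based checkpoint exposed through `T5ForConditionalGeneration`. As a minimal sketch of what that implies for users, the snippet below loads such a checkpoint with the `transformers` library and samples candidate sequences; the repository id, the pMHC input serialization, and the generation settings are illustrative placeholders and are not taken from this model card.

```python
# Minimal sketch only. The repo id and pMHC input format are hypothetical
# placeholders; consult the model card itself for the real checkpoint name
# and the expected peptide/MHC serialization.
from transformers import AutoTokenizer, T5ForConditionalGeneration

REPO_ID = "your-namespace/tcrt5-pretrained"  # placeholder, not the actual repo id

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = T5ForConditionalGeneration.from_pretrained(REPO_ID)

# Hypothetical pMHC conditioning input: a peptide plus an MHC allele name.
pmhc = "NLVPMVATV HLA-A*02:01"

inputs = tokenizer(pmhc, return_tensors="pt")
candidates = model.generate(
    **inputs,
    do_sample=True,           # sample several candidate TCR sequences
    num_return_sequences=4,
    max_new_tokens=32,
)
print(tokenizer.batch_decode(candidates, skip_special_tokens=True))
```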