diff --git a/README.md b/README.md index 03efe3e..25f604e 100644 --- a/README.md +++ b/README.md @@ -1,3 +1,153 @@ +--- +license: apache-2.0 +tags: +- DNA +- RNA +- genomic +- metagenomic +language: +- en +--- # METAGENE-1 -METAGENE-1 \ No newline at end of file +## **Model Overview** +**METAGENE-1** is a 7B parameter metagenomic foundation model designed for pandemic monitoring, trained on over 1.5T base pairs of DNA and RNA sequenced from wastewater. + +![METAGENE-1 Overview](overview.png) + +**METAGENE-1** is a 7-billion-parameter autoregressive transformer language model, which we refer to as a *metagenomic foundation model*, trained on a novel corpus of diverse metagenomic DNA and RNA sequences comprising over 1.5 trillion base pairs. This dataset is sourced from a large collection of human wastewater samples, processed and sequenced using deep metagenomic (next-generation) sequencing methods. Unlike genomic models that focus on individual genomes or curated sets of specific species, METAGENE-1 aims to capture the full distribution of genomic information present across the human microbiome. After pretraining, this model is designed to aid tasks in biosurveillance, pandemic monitoring, and pathogen detection. + +We carry out byte-pair encoding (BPE) tokenization on our dataset, tailored for metagenomic sequences, and then pretrain our model. Our technical report details the pretraining data, tokenization strategy, and model architecture, highlighting the considerations and design choices that enable effective modeling of metagenomic data. 
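The actual METAGENE-1 vocabulary is learned over the full metagenomic corpus (see the technical report for details). As a rough, self-contained illustration of the BPE idea only, not the model's real tokenizer, the toy sketch below starts from single nucleotides and repeatedly merges the most frequent adjacent pair into a new multi-nucleotide token:

```python
from collections import Counter

def bpe_merges(sequence: str, num_merges: int = 3):
    """Toy BPE: repeatedly merge the most frequent adjacent token pair.

    Starts from single-character (nucleotide) tokens; each merge creates one
    new multi-nucleotide token, as in standard byte-pair encoding.
    """
    tokens = list(sequence)
    merges = []
    for _ in range(num_merges):
        pairs = Counter(zip(tokens, tokens[1:]))
        if not pairs:
            break
        (a, b), _count = pairs.most_common(1)[0]
        merges.append(a + b)
        # Re-tokenize: replace every adjacent (a, b) occurrence with the merged token
        merged, i = [], 0
        while i < len(tokens):
            if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(tokens[i])
                i += 1
        tokens = merged
    return tokens, merges

tokens, merges = bpe_merges("TCACCGTTCTACAATCCCAAGCTG", num_merges=3)
print("Learned merges:", merges)
print("Tokenized read:", tokens)
```

Concatenating the resulting tokens always reconstructs the original read; a trained tokenizer simply applies a fixed, corpus-wide merge list rather than learning merges per sequence.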
+ +## **Usage** +```python +import torch +from transformers import AutoTokenizer, AutoModelForCausalLM + +# Load the tokenizer and model +tokenizer = AutoTokenizer.from_pretrained("metagene-ai/METAGENE-1") +model = AutoModelForCausalLM.from_pretrained("metagene-ai/METAGENE-1", torch_dtype=torch.bfloat16) + +# Example input sequence +input_sequence = "TCACCGTTCTACAATCCCAAGCTGGAGTCAAGCTCAACAGGGTCTTC" + +# Tokenize the input sequence without adding special tokens (e.g., [EOS]) for generation +input_tokens = tokenizer.encode(input_sequence, return_tensors="pt", add_special_tokens=False) + +# Generate output from the model +generated_tokens = model.generate(input_tokens, max_length=32) + +# Decode the generated output and clean up the result +generated_sequence = tokenizer.decode(generated_tokens[0], skip_special_tokens=True) +generated_sequence = generated_sequence.replace(" ", "").replace("_", "") + +# Generated output: A Hexamita inflata 5.8S ribosomal RNA gene sequence +print(f"🔬 Generated Sequence:\n{generated_sequence}") +# TCACCGTTCTACAATCCCAAGCTGGAGTCAAGCTCAACAGGGTCTTCTTGCCCCGCTGAGGGTTACACTCGCCCGTTCCCGAGTCTGTGGTTTCGCGAAGATATGACCAGGGACAGTAAGAACC +``` + +## **Benchmark Performance** +We evaluate METAGENE-1 across three tasks: pathogen detection, zero-shot embedding benchmarks (**Gene-MTEB**), and genome understanding (**GUE**), achieving state-of-the-art performance on most benchmarks. For more details, check out our [paper](https://arxiv.org/abs/2501.02045). +### **Pathogen Detection** +The pathogen detection benchmark evaluates **METAGENE-1**’s ability to classify sequencing reads as human pathogens or non-pathogens across four distinct datasets, each derived from different sequencing deliveries and designed to mimic real-world conditions with limited training data. 
+| | **DNABERT-2** | **DNABERT-S** | **NT-2.5b-Multi** | **NT-2.5b-1000g** | **METAGENE-1** | +|-------------------------------|---------------|---------------|-------------------|-------------------|----------------| +| **Pathogen-Detect (avg.)** | 87.92 | 87.02 | 82.43 | 79.02 | **92.96** | +| **Pathogen-Detect-1** | 86.73 | 85.43 | 83.80 | 77.52 | **92.14** | +| **Pathogen-Detect-2** | 86.90 | 85.23 | 83.53 | 80.38 | **90.91** | +| **Pathogen-Detect-3** | 88.30 | 89.01 | 82.48 | 79.83 | **93.70** | +| **Pathogen-Detect-4** | 89.77 | 88.41 | 79.91 | 78.37 | **95.10** | + +### **Gene-MTEB** +The Gene-MTEB benchmark evaluates **METAGENE-1**’s ability to produce high-quality, zero-shot genomic representations through eight classification and eight clustering tasks. +| | **DNABERT-2** | **DNABERT-S** | **NT-2.5b-Multi** | **NT-2.5b-1000g** | **METAGENE-1** | +|--------------------------------|---------------|---------------|-------------------|-------------------|----------------| +| **Human-Virus (avg.)** | 0.564 | 0.570 | 0.675 | 0.710 | **0.775** | +| Human-Virus-1 | 0.594 | 0.605 | 0.671 | 0.721 | **0.828** | +| Human-Virus-2 | 0.507 | 0.510 | 0.652 | 0.624 | **0.742** | +| Human-Virus-3 | 0.606 | 0.612 | 0.758 | 0.740 | **0.835** | +| Human-Virus-4 | 0.550 | 0.551 | 0.620 | **0.755** | 0.697 | +| **HMPD (avg.)** | 0.397 | 0.403 | 0.449 | 0.451 | **0.465** | +| HMPD-single | 0.292 | 0.293 | 0.285 | 0.292 | **0.297** | +| HMPD-disease | 0.480 | 0.486 | 0.498 | 0.489 | **0.542** | +| HMPD-sex | 0.366 | 0.367 | 0.487 | 0.476 | **0.495** | +| HMPD-source | 0.451 | 0.465 | 0.523 | **0.545** | 0.526 | +| **HVR (avg.)** | 0.479 | 0.479 | 0.546 | 0.524 | **0.550** | +| HVR-p2p | 0.548 | 0.550 | 0.559 | **0.650** | 0.466 | +| HVR-s2s-align | 0.243 | 0.241 | 0.266 | **0.293** | 0.267 | +| HVR-s2s-small | 0.373 | 0.372 | 0.357 | 0.371 | **0.467** | +| HVR-s2s-tiny | 0.753 | 0.753 | 1.000 | 0.782 | **1.000** | +| **HMPR (avg.)** | 0.347 | 0.351 | 0.348 | 0.403 | **0.476** | 
+| HMPR-p2p | 0.566 | **0.580** | 0.471 | 0.543 | 0.479 | +| HMPR-s2s-align | 0.127 | 0.129 | 0.144 | **0.219** | 0.140 | +| HMPR-s2s-small | 0.419 | 0.421 | 0.443 | **0.459** | 0.432 | +| HMPR-s2s-tiny | 0.274 | 0.274 | 0.332 | 0.391 | **0.855** | +| **Global Average** | 0.475 | 0.479 | 0.525 | 0.545 | **0.590** | + +### **GUE** + +Next, we evaluate **METAGENE-1** on the GUE multi-species classification benchmark proposed in [DNABERT-2](https://arxiv.org/abs/2306.15006). This experiment is designed to assess the viability of **METAGENE-1** as a general-purpose genome foundation model. +| | **CNN** | **HyenaDNA** | **DNABERT** | **NT-2.5B-Multi** | **DNABERT-2** | **METAGENE-1** | +|--------------------------------|---------|--------------|-------------|-------------------|---------------|----------------| +| **TF-Mouse (avg.)** | 45.3 | 51.0 | 57.7 | 67.0 | 68.0 | **71.4** | +| 0 | 31.1 | 35.6 | 42.3 | **63.3** | 56.8 | 61.5 | +| 1 | 59.7 | 80.5 | 79.1 | 83.8 | **84.8** | 83.7 | +| 2 | 63.2 | 65.3 | 69.9 | 71.5 | 79.3 | **83.0** | +| 3 | 45.5 | 54.2 | 55.4 | 69.4 | 66.5 | **82.2** | +| 4 | 27.2 | 19.2 | 42.0 | 47.1 | **52.7** | 46.6 | +| **TF-Human (avg.)** | 50.7 | 56.0 | 64.4 | 62.6 | **70.1** | 68.3 | +| 0 | 54.0 | 62.3 | 68.0 | 66.6 | **72.0** | 68.9 | +| 1 | 63.2 | 67.9 | 70.9 | 66.6 | **76.1** | 70.8 | +| 2 | 45.2 | 46.9 | 60.5 | 58.7 | **66.5** | 65.9 | +| 3 | 29.8 | 41.8 | 53.0 | 51.7 | **58.5** | 58.1 | +| 4 | 61.5 | 61.2 | 69.8 | 69.3 | 77.4 | **77.9** | +| **EMP (avg.)** | 37.6 | 44.9 | 49.5 | 58.1 | 56.0 | **66.0** | +| H3 | 61.5 | 67.2 | 74.2 | 78.8 | 78.3 | **80.2** | +| H3K14ac | 29.7 | 32.0 | 42.1 | 56.2 | 52.6 | **64.9** | +| H3K36me3 | 38.6 | 48.3 | 48.5 | 62.0 | 56.9 | **66.7** | +| H3K4me1 | 26.1 | 35.8 | 43.0 | 55.3 | 50.5 | **55.3** | +| H3K4me2 | 25.8 | 25.8 | 31.3 | 36.5 | 31.1 | **51.2** | +| H3K4me3 | 20.5 | 23.1 | 28.9 | 40.3 | 36.3 | **58.5** | +| H3K79me3 | 46.3 | 54.1 | 60.1 | 64.7 | 67.4 | **73.0** | +| H3K9ac | 40.0 | 50.8 | 50.5 | 
56.0 | 55.6 | **65.5** | +| H4 | 62.3 | 73.7 | 78.3 | 81.7 | 80.7 | **82.7** | +| H4ac | 25.5 | 38.4 | 38.6 | 49.1 | 50.4 | **61.7** | +| **PD (avg.)** | 77.1 | 35.0 | 84.6 | **88.1** | 84.2 | 82.3 | +| All | 75.8 | 47.4 | 90.4 | **91.0** | 86.8 | 86.0 | +| No-TATA | 85.1 | 52.2 | 93.6 | 94.0 | **94.3** | 93.7 | +| TATA | 70.3 | 5.3 | 69.8 | **79.4** | 71.6 | 67.4 | +| **CPD (avg.)** | 62.5 | 48.4 | **73.0** | 71.6 | 70.5 | 69.9 | +| All | 58.1 | 37.0 | **70.9** | 70.3 | 69.4 | 66.4 | +| No-TATA | 60.1 | 35.4 | 69.8 | **71.6** | 68.0 | 68.3 | +| TATA | 69.3 | 72.9 | **78.2** | 73.0 | 74.2 | 75.1 | +| **SSD** | 76.8 | 72.7 | 84.1 | **89.3** | 85.0 | 87.8 | +| **COVID** | 22.2 | 23.3 | 62.2 | **73.0** | 71.9 | 72.5 | +| **Global Win %** | 0.0 | 0.0 | 7.1 | 21.4 | 25.0 | **46.4** | + +## **Safety Considerations** + +**METAGENE-1** provides valuable capabilities for biosurveillance and genomic anomaly detection, achieving state-of-the-art results across a broad range of benchmarks. While its current version poses minimal risk, we carefully weighed its benefits against potential misuse, particularly in synthetic biology, and emphasize the need for stricter safety considerations for future, more capable models. + +**Purpose and Capabilities**: **METAGENE-1** is specifically optimized to detect anomalies in short metagenomic reads (100-300 base pairs), making it well-suited for tasks like pathogen detection and biosurveillance. The model’s architectural constraints, such as its 512-token context length, limit its applicability to complex sequence design tasks, reducing misuse risks. + +**Open Source Impact**: We believe the open release of **METAGENE-1** will foster research in pathogen detection and biosurveillance by providing a valuable tool for scientists; it will also facilitate interpretability and controllability research in scientific foundation models. 
However, we emphasize the need for more rigorous safety evaluations before open-sourcing larger or more capable genomic models in the future. + +We have included more in-depth discussions on safety considerations in our [paper](https://arxiv.org/abs/2501.02045). + +## **Model Details** +- **Release Date**: Jan 06 2025 +- **Model License**: Apache 2.0 + +## **BibTeX** + +```BibTeX +@misc{liu2025metagene1metagenomicfoundationmodel, + title={METAGENE-1: Metagenomic Foundation Model for Pandemic Monitoring}, + author={Ollie Liu and Sami Jaghouar and Johannes Hagemann and Shangshang Wang and Jason Wiemels and Jeff Kaufman and Willie Neiswanger}, + year={2025}, + eprint={2501.02045}, + archivePrefix={arXiv}, + primaryClass={q-bio.GN}, + url={https://arxiv.org/abs/2501.02045}, +} +``` \ No newline at end of file diff --git a/config.json b/config.json new file mode 100644 index 0000000..48507f6 --- /dev/null +++ b/config.json @@ -0,0 +1,35 @@ +{ + "_name_or_path": "/project/neiswang_1391/MGFM/MGFM-serving/model_ckpts/safetensors/step-00086000", + "architectures": [ + "LlamaForCausalLM" + ], + "attention_bias": false, + "attention_dropout": 0.0, + "bos_token_id": 3, + "eos_token_id": 4, + "head_dim": 128, + "hidden_act": "silu", + "hidden_size": 4096, + "initializer_range": 0.02, + "intermediate_size": 11008, + "mask_token_id": 5, + "max_position_embeddings": 512, + "max_sequence_length": 2048, + "mlp_bias": false, + "model_type": "llama", + "num_attention_heads": 32, + "num_hidden_layers": 32, + "num_key_value_heads": 32, + "pad_token_id": 0, + "pretraining_tp": 1, + "rms_norm_eps": 1e-05, + "rope_scaling": null, + "rope_theta": 10000.0, + "sep_token_id": 2, + "tie_word_embeddings": false, + "torch_dtype": "float32", + "transformers_version": "4.46.3", + "unk_token_id": 1, + "use_cache": true, + "vocab_size": 1024 +} \ No newline at end of file diff --git a/configuration.json b/configuration.json new file mode 100644 index 0000000..159097f --- /dev/null +++ 
b/configuration.json @@ -0,0 +1 @@ +{"framework": "pytorch", "task": "others", "allow_remote": true} \ No newline at end of file diff --git a/generation_config.json b/generation_config.json new file mode 100644 index 0000000..85c9d83 --- /dev/null +++ b/generation_config.json @@ -0,0 +1,7 @@ +{ + "_from_model_config": true, + "bos_token_id": 3, + "eos_token_id": 4, + "pad_token_id": 0, + "transformers_version": "4.46.3" +} diff --git a/model-00001-of-00006.safetensors b/model-00001-of-00006.safetensors new file mode 100644 index 0000000..95ec341 --- /dev/null +++ b/model-00001-of-00006.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:7f8e8789e005c63157a5c61e1a5974109136d44259020ac9e779f33a0b487480 +size 4941093144 diff --git a/model-00002-of-00006.safetensors b/model-00002-of-00006.safetensors new file mode 100644 index 0000000..cb07d31 --- /dev/null +++ b/model-00002-of-00006.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:4fa6abadcb468608f19df31644f74c9151c28c8df3007a9b4f7946ea96bb6784 +size 4991424824 diff --git a/model-00003-of-00006.safetensors b/model-00003-of-00006.safetensors new file mode 100644 index 0000000..4645237 --- /dev/null +++ b/model-00003-of-00006.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8ecf5b3c2a7311ca4cb39a8832d276817ff0cafb328f70b1dab12a3a382774b7 +size 4924315880 diff --git a/model-00004-of-00006.safetensors b/model-00004-of-00006.safetensors new file mode 100644 index 0000000..267ddb7 --- /dev/null +++ b/model-00004-of-00006.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:d481bc4e96ae0b2302931395315a3dc373e556d4269c564d4b306f6a1e5c451e +size 4857206904 diff --git a/model-00005-of-00006.safetensors b/model-00005-of-00006.safetensors new file mode 100644 index 0000000..8f0011d --- /dev/null +++ b/model-00005-of-00006.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid 
sha256:25778d64535c016d32cedf1fceaf13a22bf52b3c557e129c2cdbb723eab017a6 +size 4857206904 diff --git a/model-00006-of-00006.safetensors b/model-00006-of-00006.safetensors new file mode 100644 index 0000000..4ec8321 --- /dev/null +++ b/model-00006-of-00006.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:8abf5e8a7aef8884cf729f8efb97f02f799876e31ab326cb99565e7c52444a39 +size 1367426832 diff --git a/model.safetensors.index.json b/model.safetensors.index.json new file mode 100644 index 0000000..9ba7f0f --- /dev/null +++ b/model.safetensors.index.json @@ -0,0 +1,298 @@ +{ + "metadata": { + "total_size": 25938640896 + }, + "weight_map": { + "lm_head.weight": "model-00006-of-00006.safetensors", + "model.embed_tokens.weight": "model-00001-of-00006.safetensors", + "model.layers.0.input_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.0.mlp.down_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.0.mlp.up_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.1.input_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.1.mlp.down_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.1.mlp.up_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00006.safetensors", + 
"model.layers.1.self_attn.o_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.10.input_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.10.mlp.down_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.10.mlp.gate_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.10.mlp.up_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.10.post_attention_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.10.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.10.self_attn.o_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.10.self_attn.q_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.10.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.input_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.11.mlp.down_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.mlp.gate_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.mlp.up_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.11.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.self_attn.o_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.self_attn.q_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.11.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.12.input_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.12.mlp.down_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.12.mlp.gate_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.12.mlp.up_proj.weight": "model-00003-of-00006.safetensors", 
+ "model.layers.12.post_attention_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.12.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.13.input_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.13.mlp.down_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.13.mlp.gate_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.13.mlp.up_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.13.post_attention_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.13.self_attn.k_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.13.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.13.self_attn.q_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.13.self_attn.v_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.input_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.14.mlp.down_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.mlp.gate_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.mlp.up_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.post_attention_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.14.self_attn.k_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.self_attn.q_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.14.self_attn.v_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.15.input_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.15.mlp.down_proj.weight": 
"model-00003-of-00006.safetensors", + "model.layers.15.mlp.gate_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.15.mlp.up_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.15.post_attention_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.15.self_attn.k_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.15.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.15.self_attn.q_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.15.self_attn.v_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.input_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.16.mlp.down_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.mlp.gate_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.mlp.up_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.post_attention_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.16.self_attn.k_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.self_attn.q_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.16.self_attn.v_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.17.input_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.17.mlp.down_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.17.mlp.gate_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.17.mlp.up_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.17.post_attention_layernorm.weight": "model-00003-of-00006.safetensors", + "model.layers.17.self_attn.k_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.17.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.17.self_attn.q_proj.weight": "model-00003-of-00006.safetensors", + 
"model.layers.17.self_attn.v_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.18.input_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.18.mlp.down_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.18.mlp.gate_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.18.mlp.up_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.18.post_attention_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.18.self_attn.k_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.18.self_attn.o_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.18.self_attn.q_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.18.self_attn.v_proj.weight": "model-00003-of-00006.safetensors", + "model.layers.19.input_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.19.mlp.down_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.19.mlp.gate_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.19.mlp.up_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.19.post_attention_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.19.self_attn.k_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.19.self_attn.o_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.19.self_attn.q_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.19.self_attn.v_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.2.input_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.2.mlp.down_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.2.mlp.up_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.2.self_attn.k_proj.weight": 
"model-00001-of-00006.safetensors", + "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.20.input_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.20.mlp.down_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.20.mlp.gate_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.20.mlp.up_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.20.post_attention_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.20.self_attn.k_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.20.self_attn.o_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.20.self_attn.q_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.20.self_attn.v_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.input_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.21.mlp.down_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.mlp.gate_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.mlp.up_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.post_attention_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.21.self_attn.k_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.self_attn.o_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.self_attn.q_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.21.self_attn.v_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.22.input_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.22.mlp.down_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.22.mlp.gate_proj.weight": "model-00004-of-00006.safetensors", + 
"model.layers.22.mlp.up_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.22.post_attention_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.22.self_attn.k_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.22.self_attn.o_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.22.self_attn.q_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.22.self_attn.v_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.input_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.23.mlp.down_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.mlp.gate_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.mlp.up_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.post_attention_layernorm.weight": "model-00004-of-00006.safetensors", + "model.layers.23.self_attn.k_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.self_attn.o_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.self_attn.q_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.23.self_attn.v_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.24.input_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.24.mlp.down_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.24.mlp.gate_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.24.mlp.up_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.24.post_attention_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.24.self_attn.k_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.24.self_attn.o_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.24.self_attn.q_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.24.self_attn.v_proj.weight": "model-00004-of-00006.safetensors", + "model.layers.25.input_layernorm.weight": 
"model-00005-of-00006.safetensors", + "model.layers.25.mlp.down_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.25.mlp.gate_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.25.mlp.up_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.25.post_attention_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.25.self_attn.k_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.25.self_attn.o_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.25.self_attn.q_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.25.self_attn.v_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.input_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.26.mlp.down_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.mlp.gate_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.mlp.up_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.post_attention_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.26.self_attn.k_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.self_attn.o_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.self_attn.q_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.26.self_attn.v_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.27.input_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.27.mlp.down_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.27.mlp.gate_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.27.mlp.up_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.27.post_attention_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.27.self_attn.k_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.27.self_attn.o_proj.weight": "model-00005-of-00006.safetensors", + 
"model.layers.27.self_attn.q_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.27.self_attn.v_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.input_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.28.mlp.down_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.mlp.gate_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.mlp.up_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.post_attention_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.28.self_attn.k_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.self_attn.o_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.self_attn.q_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.28.self_attn.v_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.input_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.29.mlp.down_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.mlp.gate_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.mlp.up_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.post_attention_layernorm.weight": "model-00005-of-00006.safetensors", + "model.layers.29.self_attn.k_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.self_attn.o_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.self_attn.q_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.29.self_attn.v_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.3.input_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.3.mlp.down_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.3.mlp.up_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.3.post_attention_layernorm.weight": 
"model-00001-of-00006.safetensors", + "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.30.input_layernorm.weight": "model-00006-of-00006.safetensors", + "model.layers.30.mlp.down_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.30.mlp.gate_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.30.mlp.up_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.30.post_attention_layernorm.weight": "model-00006-of-00006.safetensors", + "model.layers.30.self_attn.k_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.30.self_attn.o_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.30.self_attn.q_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.30.self_attn.v_proj.weight": "model-00005-of-00006.safetensors", + "model.layers.31.input_layernorm.weight": "model-00006-of-00006.safetensors", + "model.layers.31.mlp.down_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.31.mlp.gate_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.31.mlp.up_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.31.post_attention_layernorm.weight": "model-00006-of-00006.safetensors", + "model.layers.31.self_attn.k_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.31.self_attn.o_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.31.self_attn.q_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.31.self_attn.v_proj.weight": "model-00006-of-00006.safetensors", + "model.layers.4.input_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.4.mlp.down_proj.weight": "model-00001-of-00006.safetensors", + 
"model.layers.4.mlp.gate_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.4.mlp.up_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.input_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.5.mlp.down_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.mlp.up_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00006.safetensors", + "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.6.input_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.6.mlp.down_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.6.mlp.up_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00006.safetensors", + "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + 
"model.layers.7.input_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.7.mlp.down_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.7.mlp.up_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.7.post_attention_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.input_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.8.mlp.down_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.mlp.gate_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.mlp.up_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.post_attention_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.8.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.self_attn.o_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.self_attn.q_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.9.input_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.9.mlp.down_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.9.mlp.gate_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.9.mlp.up_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.9.post_attention_layernorm.weight": "model-00002-of-00006.safetensors", + "model.layers.9.self_attn.k_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.9.self_attn.o_proj.weight": "model-00002-of-00006.safetensors", + 
"model.layers.9.self_attn.q_proj.weight": "model-00002-of-00006.safetensors", + "model.layers.9.self_attn.v_proj.weight": "model-00002-of-00006.safetensors", + "model.norm.weight": "model-00006-of-00006.safetensors" + } +} diff --git a/overview.png b/overview.png new file mode 100644 index 0000000..0430870 Binary files /dev/null and b/overview.png differ diff --git a/special_tokens_map.json b/special_tokens_map.json new file mode 100644 index 0000000..65147a3 --- /dev/null +++ b/special_tokens_map.json @@ -0,0 +1,8 @@ +{ + "bos_token": "[BOS]", + "eos_token": "[EOS]", + "mask_token": "[MASK]", + "pad_token": "[PAD]", + "sep_token": "[SEP]", + "unk_token": "[UNK]" +} \ No newline at end of file diff --git a/tokenizer.json b/tokenizer.json new file mode 100644 index 0000000..07ee080 --- /dev/null +++ b/tokenizer.json @@ -0,0 +1,2191 @@ +{ + "version": "1.0", + "truncation": null, + "padding": null, + "added_tokens": [ + { + "id": 0, + "content": "[PAD]", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 1, + "content": "[UNK]", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 2, + "content": "[SEP]", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 3, + "content": "[BOS]", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 4, + "content": "[EOS]", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 5, + "content": "[MASK]", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + } + ], + "normalizer": { + "type": "Replace", + "pattern": { + "Regex": "^" + }, + "content": "_" + }, + "pre_tokenizer": { + "type": "Whitespace" + }, + "post_processor": { + "type": 
"TemplateProcessing", + "single": [ + { + "Sequence": { + "id": "A", + "type_id": 0 + } + }, + { + "SpecialToken": { + "id": "[EOS]", + "type_id": 0 + } + } + ], + "pair": [ + { + "Sequence": { + "id": "A", + "type_id": 0 + } + }, + { + "Sequence": { + "id": "B", + "type_id": 1 + } + } + ], + "special_tokens": { + "[EOS]": { + "id": "[EOS]", + "ids": [ + 4 + ], + "tokens": [ + "[EOS]" + ] + }, + "[MASK]": { + "id": "[MASK]", + "ids": [ + 5 + ], + "tokens": [ + "[MASK]" + ] + }, + "[PAD]": { + "id": "[PAD]", + "ids": [ + 0 + ], + "tokens": [ + "[PAD]" + ] + }, + "[SEP]": { + "id": "[SEP]", + "ids": [ + 2 + ], + "tokens": [ + "[SEP]" + ] + } + } + }, + "decoder": null, + "model": { + "type": "BPE", + "dropout": null, + "unk_token": "[UNK]", + "continuing_subword_prefix": null, + "end_of_word_suffix": null, + "fuse_unk": false, + "byte_fallback": false, + "ignore_merges": false, + "vocab": { + "[PAD]": 0, + "[UNK]": 1, + "[SEP]": 2, + "[BOS]": 3, + "[EOS]": 4, + "[MASK]": 5, + "_": 6, + "A": 7, + "C": 8, + "G": 9, + "T": 10, + "N": 11, + "TT": 12, + "CC": 13, + "AA": 14, + "GG": 15, + "TC": 16, + "GC": 17, + "TA": 18, + "GA": 19, + "CA": 20, + "TG": 21, + "TCC": 22, + "TAA": 23, + "TCA": 24, + "TGG": 25, + "TTA": 26, + "GCC": 27, + "TGC": 28, + "TTC": 29, + "GAA": 30, + "TGA": 31, + "GTT": 32, + "TAC": 33, + "GCA": 34, + "CCA": 35, + "GGA": 36, + "AAA": 37, + "GTA": 38, + "GGC": 39, + "GTC": 40, + "GAC": 41, + "AAC": 42, + "CCC": 43, + "CAC": 44, + "TAAA": 45, + "TTTT": 46, + "TCCA": 47, + "TTTC": 48, + "GTG": 49, + "TCAC": 50, + "TTCC": 51, + "TGCA": 52, + "TTAA": 53, + "TTCA": 54, + "TTAC": 55, + "TCCC": 56, + "GCCA": 57, + "TGGA": 58, + "TTGG": 59, + "TCAA": 60, + "TGAA": 61, + "GAAA": 62, + "TGGC": 63, + "TTTA": 64, + "TGCC": 65, + "GTAA": 66, + "TGAC": 67, + "TACC": 68, + "GACC": 69, + "GTTA": 70, + "TACA": 71, + "TCTC": 72, + "GTTC": 73, + "GTCA": 74, + "TTGC": 75, + "TTGA": 76, + "GTAC": 77, + "GGAA": 78, + "TATC": 79, + "GAAC": 80, + "GCAA": 81, + "GGCA": 82, 
+ "TAAC": 83, + "TGTC": 84, + "GCTC": 85, + "GTCC": 86, + "GACA": 87, + "TAGC": 88, + "GGCC": 89, + "CCAC": 90, + "GTGA": 91, + "AAAC": 92, + "GCAC": 93, + "GTGG": 94, + "GCCC": 95, + "AAAA": 96, + "_A": 97, + "TTTCA": 98, + "GCTA": 99, + "GTGC": 100, + "GATA": 101, + "TGTA": 102, + "TGTT": 103, + "GCTT": 104, + "CCCA": 105, + "TCTT": 106, + "GGAC": 107, + "TCGC": 108, + "GGTA": 109, + "TCGA": 110, + "TATT": 111, + "GGTT": 112, + "GATT": 113, + "TCCCC": 114, + "GAGC": 115, + "AACA": 116, + "AACC": 117, + "GGTC": 118, + "GATC": 119, + "GGGG": 120, + "TTACC": 121, + "CACC": 122, + "TATA": 123, + "GCGG": 124, + "GCGC": 125, + "TAGA": 126, + "CCCC": 127, + "GCGA": 128, + "TCTCA": 129, + "GGGA": 130, + "GTCAA": 131, + "GTTAC": 132, + "TCTA": 133, + "TCCAC": 134, + "TTTTC": 135, + "TCACC": 136, + "CATC": 137, + "TACAA": 138, + "TAGG": 139, + "GTAAA": 140, + "GAGA": 141, + "TTCCA": 142, + "TGAAA": 143, + "TCCAA": 144, + "GTACC": 145, + "GGTAC": 146, + "TAAAA": 147, + "TTCAA": 148, + "CCACC": 149, + "CACA": 150, + "GCACC": 151, + "TTTGA": 152, + "TGTG": 153, + "TTAGC": 154, + "GGGC": 155, + "GAGTT": 156, + "_C": 157, + "AAACC": 158, + "TGGAA": 159, + "TCGTT": 160, + "TGCCA": 161, + "TGTAA": 162, + "TTGTT": 163, + "TACCA": 164, + "GAGG": 165, + "TGCAA": 166, + "_TC": 167, + "TTTAA": 168, + "TAAAC": 169, + "GATG": 170, + "GAAAA": 171, + "TCAAA": 172, + "TGGAC": 173, + "GCCCA": 174, + "TCCCA": 175, + "TTTTA": 176, + "GCCAC": 177, + "TAGTT": 178, + "TCGAC": 179, + "GTCCA": 180, + "TTTGG": 181, + "GCGGC": 182, + "TTAAA": 183, + "TGTTA": 184, + "TCGG": 185, + "GTAAC": 186, + "TGTCA": 187, + "AAGC": 188, + "TGTTC": 189, + "TCTAA": 190, + "GCTAC": 191, + "TTCGG": 192, + "TTGGA": 193, + "GCTTTC": 194, + "TTACA": 195, + "CCAA": 196, + "TGCAC": 197, + "GAACA": 198, + "TCAAC": 199, + "TTTAAA": 200, + "GGGTT": 201, + "TTTCC": 202, + "TGAAC": 203, + "GCCCC": 204, + "TCGCA": 205, + "TCGCC": 206, + "TTGTA": 207, + "TCACA": 208, + "GATGA": 209, + "TATCC": 210, + "TTCAC": 211, + "GAAAC": 
212, + "GCTGA": 213, + "GCTTA": 214, + "TCTCC": 215, + "_TCC": 216, + "GCTCC": 217, + "GACAA": 218, + "TCTTC": 219, + "GCGCC": 220, + "GGGTA": 221, + "TGGCC": 222, + "TGTCC": 223, + "TTAAC": 224, + "TAACC": 225, + "GGACC": 226, + "TTGTC": 227, + "GTTCA": 228, + "TTGAC": 229, + "GCCCCA": 230, + "GTGGC": 231, + "GCTTC": 232, + "TATTC": 233, + "GTTAA": 234, + "TTTAC": 235, + "GGCAA": 236, + "GGTTC": 237, + "TAACA": 238, + "TGACC": 239, + "GTGCA": 240, + "TTGAA": 241, + "TGCCC": 242, + "TGGCA": 243, + "TTGCA": 244, + "GATGC": 245, + "AATA": 246, + "TAAGCC": 247, + "GCGCA": 248, + "TCTAC": 249, + "TTAACC": 250, + "GAACC": 251, + "_AC": 252, + "TGCTTC": 253, + "TATAA": 254, + "TATCA": 255, + "TGGGTT": 256, + "GTAGGA": 257, + "TTAGG": 258, + "TGTGG": 259, + "TTGCC": 260, + "TTTCCC": 261, + "TCCCCA": 262, + "TACAC": 263, + "GACCA": 264, + "TACCC": 265, + "GTGGA": 266, + "TGTGA": 267, + "GTCAC": 268, + "AACAA": 269, + "GCTAA": 270, + "AAACA": 271, + "TTCCC": 272, + "GGTGC": 273, + "GATCA": 274, + "GATAA": 275, + "GGTCC": 276, + "GATGG": 277, + "GAGCA": 278, + "GAGAA": 279, + "GATTA": 280, + "GGGGA": 281, + "GTGAA": 282, + "GAGGA": 283, + "TTTGC": 284, + "TCGAA": 285, + "GCACCC": 286, + "GTCCC": 287, + "GGCCA": 288, + "TTTTCC": 289, + "GT": 290, + "GGGCA": 291, + "TCTGC": 292, + "GCAAC": 293, + "GAGCC": 294, + "GGCAC": 295, + "TCATCC": 296, + "TTTCAC": 297, + "TTCACC": 298, + "TGACA": 299, + "GATCC": 300, + "TTTCACC": 301, + "TATTA": 302, + "GGGCC": 303, + "GTTCC": 304, + "TGTAC": 305, + "GTGCC": 306, + "TGCTGCC": 307, + "GGTGG": 308, + "TCTGG": 309, + "TATGC": 310, + "GTGAC": 311, + "TGCTGG": 312, + "GGGAA": 313, + "GGAAC": 314, + "TCGTC": 315, + "GCTG": 316, + "TAGGA": 317, + "GAGGC": 318, + "TATAC": 319, + "TAGCC": 320, + "GGGGC": 321, + "GCGTT": 322, + "GGAAA": 323, + "GACCC": 324, + "GGTAA": 325, + "TTCTGA": 326, + "TCTTA": 327, + "TAGAA": 328, + "GGGTC": 329, + "GTGTCTCA": 330, + "TTGGC": 331, + "TCAGC": 332, + "CCACA": 333, + "TCGGC": 334, + "TCCACC": 335, + "GCTGC": 
336, + "GACAC": 337, + "GTATTTA": 338, + "GCAAA": 339, + "TCTGA": 340, + "GCCAA": 341, + "GGCCC": 342, + "_CC": 343, + "TTTTCA": 344, + "TCCCGTAGGA": 345, + "TACTCA": 346, + "TTTG": 347, + "GATTC": 348, + "GTACA": 349, + "TATTAA": 350, + "TGCTGCCTCCCGTAGGA": 351, + "GGTCA": 352, + "GCGTC": 353, + "TAGTA": 354, + "TGAAAA": 355, + "GGTGA": 356, + "GGGACC": 357, + "GGACA": 358, + "TCACCC": 359, + "TGGACC": 360, + "TTCCAA": 361, + "CACCA": 362, + "GATAC": 363, + "TTTCAA": 364, + "CCCAA": 365, + "TCACGGTAC": 366, + "GGAGTT": 367, + "GCTGG": 368, + "TTAGCA": 369, + "GCTCA": 370, + "TCGTA": 371, + "TTCCCA": 372, + "TATTTCAC": 373, + "TAGTGA": 374, + "CCAGTG": 375, + "TTATAC": 376, + "TGCTGCCTCCCGTAGGAGTC": 377, + "TCTACC": 378, + "TTATC": 379, + "TAGCA": 380, + "TGTGC": 381, + "TCCTCC": 382, + "TATGA": 383, + "TGCTTCTAAGCC": 384, + "TATGG": 385, + "GGTTGA": 386, + "TTTGAGTT": 387, + "GCGAC": 388, + "TTACCGCGGC": 389, + "TCACGAC": 390, + "TTTTCACC": 391, + "TAGTC": 392, + "TTAGCC": 393, + "TTTCCA": 394, + "GTAGG": 395, + "GTTTCC": 396, + "TTTCCCTCACGGTAC": 397, + "TCTTTT": 398, + "TTGGCC": 399, + "TTCACA": 400, + "GGATCAC": 401, + "TCATTA": 402, + "GTCAAAC": 403, + "GGTTA": 404, + "GAGAC": 405, + "TTCAAA": 406, + "GCAGAC": 407, + "GCGAA": 408, + "GTACTCCCCA": 409, + "GTTTGA": 410, + "GAGTA": 411, + "GTTGC": 412, + "TACCAGGGTA": 413, + "TTCCAC": 414, + "GGGAC": 415, + "GTGGAC": 416, + "TGCTTTC": 417, + "TGTCTCACGAC": 418, + "TCTAATCC": 419, + "GCATTC": 420, + "GCCTTC": 421, + "GGATC": 422, + "GCCTCC": 423, + "TCAAAA": 424, + "TCTG": 425, + "TTCAAC": 426, + "GGTGCC": 427, + "TGCACC": 428, + "TAGAC": 429, + "TATTTT": 430, + "GAGTC": 431, + "TTACCGCGGCTGCTGG": 432, + "TTTGAA": 433, + "GTTAGCC": 434, + "TGCTCC": 435, + "TTGCCA": 436, + "TACACC": 437, + "TACCAGGGTATCTAATCC": 438, + "TAGCTAA": 439, + "TGTTCC": 440, + "TTGCAC": 441, + "CACAA": 442, + "TCTTCC": 443, + "GATTCC": 444, + "TGGGC": 445, + "TTACGCTTTC": 446, + "AACATCC": 447, + "TACGCA": 448, + "TCATC": 449, + "GCGTA": 
450, + "TTCTTC": 451, + "TAAAAC": 452, + "GCGGA": 453, + "TACGGC": 454, + "GGGTGG": 455, + "TTACGCTTTCTTTAAA": 456, + "TTGTCC": 457, + "TGTTTT": 458, + "TTGTAA": 459, + "TTCCTTTGAGTT": 460, + "TCCGGA": 461, + "TCGGA": 462, + "TCCACA": 463, + "GCCTTGG": 464, + "TGGCCA": 465, + "TGCTC": 466, + "GACTAA": 467, + "_ACC": 468, + "TATTCA": 469, + "TGAGCC": 470, + "TAGGC": 471, + "GGTG": 472, + "TTTGCC": 473, + "GCAGTT": 474, + "TTGGGACC": 475, + "TACACCA": 476, + "TTAGAA": 477, + "TCTTGC": 478, + "TGAGA": 479, + "AAACCA": 480, + "GGTTTCA": 481, + "TAAACA": 482, + "TACTC": 483, + "TCAACC": 484, + "GTTTC": 485, + "GGCGA": 486, + "GCTTTT": 487, + "_TG": 488, + "TTAAAA": 489, + "TTGTG": 490, + "GAACAA": 491, + "TTAACA": 492, + "TTGGCA": 493, + "GTATTTAGCC": 494, + "TATCAC": 495, + "TTACCGCGGCTGCTGGCAC": 496, + "GTTCAA": 497, + "TGCAAAA": 498, + "GGTTTC": 499, + "TCCCAA": 500, + "TCTCGTAC": 501, + "TTGTCA": 502, + "GCTACC": 503, + "TTTACC": 504, + "TCCTAC": 505, + "TTGGAA": 506, + "TCAGAC": 507, + "TAATAA": 508, + "GAACTGTCTCACGAC": 509, + "GATTAAC": 510, + "TCTTTC": 511, + "TTAGGA": 512, + "GCACA": 513, + "GGCTTC": 514, + "TTGGGTT": 515, + "GCAACA": 516, + "TGAGC": 517, + "TGGTA": 518, + "GAACTGTCTCACGACGTTC": 519, + "_TCCC": 520, + "GTGAAA": 521, + "_TCA": 522, + "TGGGG": 523, + "TTACCC": 524, + "GGGAAC": 525, + "GGGTTC": 526, + "GGATGG": 527, + "AACCTCC": 528, + "TGGCTGCTTCTAAGCC": 529, + "TGGGCC": 530, + "TTATTC": 531, + "GTATTACCGCGGCTGCTGGCAC": 532, + "TTTCACCCC": 533, + "AACAAC": 534, + "TTTACA": 535, + "GTAAAC": 536, + "GCCACC": 537, + "TTTTAA": 538, + "CCCAGCTC": 539, + "TTAGATG": 540, + "TAAGG": 541, + "TATG": 542, + "AAACAA": 543, + "GTGTCTCAGTT": 544, + "GCCCCAGGA": 545, + "TGTGTC": 546, + "GTCCCA": 547, + "GATTAC": 548, + "GTCCCC": 549, + "TACGCC": 550, + "TGGGTTGTT": 551, + "TTCGGA": 552, + "GAAGAA": 553, + "GATAGGGACC": 554, + "GTTACA": 555, + "TACCAA": 556, + "GTTACC": 557, + "TACGGA": 558, + "GACATCGA": 559, + "TGAGCCA": 560, + "TACCAGGGTATCTAATCCTGTT": 561, + 
"TATCGG": 562, + "TACGA": 563, + "TAAGCA": 564, + "TCACCAAC": 565, + "TGGACCGTGTCTCAGTT": 566, + "TAAGTT": 567, + "_AA": 568, + "GACCTTAGC": 569, + "TCTTTA": 570, + "TGACAA": 571, + "TCAACA": 572, + "GCTACA": 573, + "TCCGACC": 574, + "GAACCA": 575, + "GCGGCA": 576, + "TCATCA": 577, + "AATATTCC": 578, + "TCTTTCC": 579, + "GTCCAC": 580, + "GCCCAA": 581, + "GAAGA": 582, + "CCACCGGATCAC": 583, + "GCGTG": 584, + "TTCTCC": 585, + "GCAAAA": 586, + "GCCTGC": 587, + "_TGG": 588, + "GTCTC": 589, + "TCGCTTTC": 590, + "CCACTGCTGCCTCCCGTAGGAGTC": 591, + "TATACC": 592, + "TATTTA": 593, + "GGGGTTC": 594, + "GCAGA": 595, + "TGCCTTC": 596, + "GCAGG": 597, + "TTAGTA": 598, + "TGGTCC": 599, + "GTCACC": 600, + "TGATCC": 601, + "TGGACCGTGTCTCAGTTCCAGTG": 602, + "GCTTTAC": 603, + "TGCCAA": 604, + "GCCGTC": 605, + "TCTTTAAA": 606, + "_TCTC": 607, + "TGGTC": 608, + "GCCGTT": 609, + "GCTTCA": 610, + "TTCGGC": 611, + "TTGAAA": 612, + "GTATTCACC": 613, + "TAGTG": 614, + "GCAGGC": 615, + "GTTGTT": 616, + "TAAACC": 617, + "TACCAC": 618, + "TGAGG": 619, + "TTTAAC": 620, + "TAAAAA": 621, + "_CA": 622, + "GGAAAA": 623, + "TTTTCACCTTTCCCTCACGGTAC": 624, + "GGTGGA": 625, + "TCCCCCCA": 626, + "GTACTCCCCAGGC": 627, + "GCAGC": 628, + "TTAGA": 629, + "TAGCTGTC": 630, + "TACGC": 631, + "GTTTCA": 632, + "GGAGA": 633, + "GAATC": 634, + "GACCGCCCCA": 635, + "TGATCATCC": 636, + "GTAGA": 637, + "GCTTGTGC": 638, + "GTAACA": 639, + "GACCAA": 640, + "TGTTAA": 641, + "TGTTTA": 642, + "GCTATTACGCTTTCTTTAAA": 643, + "TTATA": 644, + "TCGTG": 645, + "TTACAA": 646, + "TACAAA": 647, + "TTATCC": 648, + "TTACGCAC": 649, + "TCACTTC": 650, + "TGCCCA": 651, + "GAACTGTCTCACGACGTTCTGAA": 652, + "TATTTCACTCCCC": 653, + "TGCCAC": 654, + "_GG": 655, + "GCCGC": 656, + "GCCCCC": 657, + "TAATAC": 658, + "TGATGA": 659, + "_TGC": 660, + "TTCGA": 661, + "TATTCC": 662, + "GAAAAA": 663, + "CCCTTCCA": 664, + "GGCACC": 665, + "TGTTGC": 666, + "GAAACA": 667, + "TATAAA": 668, + "TCATAA": 669, + "TTGTGC": 670, + "TATTTC": 671, + "TGCACA": 
672, + "GTCAATTCCTTTGAGTT": 673, + "GACCGCCCCAGTCAAAC": 674, + "TTGTAC": 675, + "GCTTGC": 676, + "GTAAAA": 677, + "TATCAA": 678, + "GACAAA": 679, + "GACATCGAGGTGCC": 680, + "TGCTCA": 681, + "GTATCA": 682, + "GGGAACGTATTCACC": 683, + "TCTCAA": 684, + "TTAACCA": 685, + "GCTCAC": 686, + "TCGGTA": 687, + "TATGCC": 688, + "GAAACC": 689, + "TCCTC": 690, + "TTGACC": 691, + "CACTGC": 692, + "GTATGC": 693, + "CCACGCTTTC": 694, + "GACTCGCTTTC": 695, + "TCAGGC": 696, + "GCTCCA": 697, + "GACTAACCC": 698, + "TGGCAA": 699, + "TCAGGA": 700, + "GCCAAA": 701, + "GCAGCA": 702, + "TTTTATCC": 703, + "TGGAAA": 704, + "TGTGAC": 705, + "TTGTGG": 706, + "GTCGAGTT": 707, + "GGACGTTA": 708, + "GGTTCC": 709, + "TGGGA": 710, + "TTATTA": 711, + "GAACCACCGGATCAC": 712, + "GAAGTT": 713, + "TCCAAA": 714, + "GCTTTA": 715, + "CCCAAC": 716, + "TCTGAC": 717, + "GGCCGAC": 718, + "TCAGA": 719, + "TAGGCA": 720, + "TCGCTACTCA": 721, + "TGTCAA": 722, + "TTGCAA": 723, + "GAACTGTCTCACGACGTTCTGAACCCAGCTC": 724, + "TAGGAA": 725, + "GCTGAA": 726, + "TGAGCCATTACC": 727, + "TGGACA": 728, + "GGTTTT": 729, + "TGTTTC": 730, + "TTGTTC": 731, + "CCAGTGA": 732, + "TTCTCCA": 733, + "TCTTGA": 734, + "TGGCGAACA": 735, + "TTTTCAAC": 736, + "GTGTGTA": 737, + "TAATCA": 738, + "CACTATC": 739, + "GTTTTA": 740, + "GTCGAGTTGCAGAC": 741, + "GTAACC": 742, + "TTTGAC": 743, + "TCATTATGCAAAA": 744, + "GTCATCC": 745, + "TGATC": 746, + "TACCTCCA": 747, + "TGGCGAACAGCCA": 748, + "GAAGC": 749, + "GAAGG": 750, + "GCTCGCC": 751, + "CATCGTT": 752, + "GGAATA": 753, + "TTTGCA": 754, + "CATCTTCC": 755, + "TGCGA": 756, + "TTGACA": 757, + "TACCCC": 758, + "TACACA": 759, + "GTTGAGC": 760, + "TGTGAA": 761, + "TGGTTC": 762, + "TGCTGA": 763, + "TGCATGC": 764, + "GCATAC": 765, + "TAACAA": 766, + "GACGG": 767, + "TGTAAA": 768, + "TCCCGAA": 769, + "TGGCGAACAGCCATACCC": 770, + "GCACTTCTGA": 771, + "GTACAA": 772, + "TAAGTC": 773, + "GTCAGTA": 774, + "TTTTAC": 775, + "TTCTA": 776, + "GCCGCC": 777, + "GCTATC": 778, + "TGAGCCATTACCTCACCAAC": 779, + 
"GGCCCC": 780, + "GATAAA": 781, + "TATACA": 782, + "TCGACTAGTGA": 783, + "TACTTTC": 784, + "GGTGAA": 785, + "TGTTCA": 786, + "CATCTTCCGCGCA": 787, + "_TGCC": 788, + "GATCTC": 789, + "GTGGAA": 790, + "TGTCCA": 791, + "GACACC": 792, + "GGTCTGGGTTGTT": 793, + "TTACCA": 794, + "GCCGGC": 795, + "GAAGGC": 796, + "TGCAAC": 797, + "GCTCCCC": 798, + "TGGCTGC": 799, + "GCCTCCGTTAC": 800, + "GCTCCC": 801, + "TCCGAA": 802, + "GTCAAA": 803, + "GGAGTTAGCC": 804, + "GAACCC": 805, + "TTACCAA": 806, + "_GCC": 807, + "GCTCGAC": 808, + "GAGTTC": 809, + "TGACTGATCATCC": 810, + "GACTTAA": 811, + "GGTACA": 812, + "GATCAA": 813, + "GGTCCTCTCGTAC": 814, + "AACTTC": 815, + "GATGGC": 816, + "TTTCACCCCTA": 817, + "GTTTCCCAC": 818, + "GGAACC": 819, + "TAATCC": 820, + "TTTTCGCC": 821, + "TGGCGAACAGCCATACCCTTGGGACC": 822, + "TAGCGATTCC": 823, + "GTCCAA": 824, + "GCAGCC": 825, + "CCAGTA": 826, + "GTGTCA": 827, + "TTTCACA": 828, + "GGAGGC": 829, + "TAATC": 830, + "TATATC": 831, + "TAATTC": 832, + "GTGGACTACCAGGGTATCTAATCCTGTT": 833, + "TGTTGG": 834, + "TGGTTTCA": 835, + "GTGGCA": 836, + "GATTTT": 837, + "GTGCAA": 838, + "TGCTGC": 839, + "TGAACC": 840, + "TTACGCACTCTTTAAA": 841, + "GGACGTTAGCACCC": 842, + "TCTTCA": 843, + "GCTTTTC": 844, + "CCACTGCTGCCTCCCGTAGGAGTCTGGACCGTGTCTCAGTTCCAGTG": 845, + "GAAGAC": 846, + "GCATTCGCACTTCTGA": 847, + "GTAGC": 848, + "GGGAAA": 849, + "TCCACAA": 850, + "GGCCAA": 851, + "TGGGGC": 852, + "GTGCCA": 853, + "TGAACA": 854, + "GCTGGC": 855, + "_GTA": 856, + "GGTCTGGGTTGTTTCCC": 857, + "TTAGATGTTTCA": 858, + "GCTTGCACCC": 859, + "TCCCCC": 860, + "GGAATTTC": 861, + "TTATACAAAA": 862, + "GTATTACCGCGGCTGCTGGCACGGAGTTAGCC": 863, + "GCCACA": 864, + "GATTTC": 865, + "GGACCA": 866, + "GTCACA": 867, + "CCCACC": 868, + "GGTCTC": 869, + "GGCTC": 870, + "TGTGTCGGTT": 871, + "TTTGGGACCTTAGC": 872, + "CACGAC": 873, + "TCGGTT": 874, + "TGTGTT": 875, + "GGCTCATTATGCAAAA": 876, + "TGCAAA": 877, + "GTTTA": 878, + "GTAATTCC": 879, + "TGGTTCAC": 880, + "GTACCTTTTATCC": 881, + "GCGTTC": 
882, + "GAGAAC": 883, + "TAAGTA": 884, + "TACCCACCA": 885, + "TCAGCC": 886, + "GGAAACC": 887, + "TACTTA": 888, + "TAACCA": 889, + "CACTTC": 890, + "TGCTAA": 891, + "_GC": 892, + "TTCGTAC": 893, + "TCAAAC": 894, + "TGCTCGAC": 895, + "TGAATGA": 896, + "TCTCAC": 897, + "GACTAC": 898, + "TTAGATA": 899, + "GGGGTTCTTTTCGCC": 900, + "TACCCA": 901, + "TCGGTATTCC": 902, + "GTCTA": 903, + "CCACAA": 904, + "TCCGG": 905, + "TAGGGC": 906, + "TAAGA": 907, + "GTCTCGCA": 908, + "GCTACAC": 909, + "TTTGGA": 910, + "TTAGTC": 911, + "TCCACCGCTTGTGC": 912, + "TTTACAA": 913, + "GGCAAA": 914, + "GTTATA": 915, + "TTAAAC": 916, + "TGTCAC": 917, + "TATATT": 918, + "GTATTTAGCCTTGGA": 919, + "GTCAAC": 920, + "GGGCCA": 921, + "TCATTC": 922, + "TGGCTGCTTCTAAGCCAACCTCC": 923, + "GTCGA": 924, + "GCTGACCCA": 925, + "TTTGGC": 926, + "GGGCGG": 927, + "TGTTGA": 928, + "TTCCCC": 929, + "TTTCGG": 930, + "GTTCCC": 931, + "TGATGG": 932, + "TGTAAC": 933, + "TATGTA": 934, + "GGATCA": 935, + "GTTAAA": 936, + "GGAACA": 937, + "TTATCCA": 938, + "GGCGG": 939, + "TGTCTC": 940, + "_GA": 941, + "TAGAAC": 942, + "GCTTTACGCCCA": 943, + "TACTGA": 944, + "GCAACC": 945, + "TTATACAAAAGGTAC": 946, + "TGAGTC": 947, + "GAGCTGAC": 948, + "GTTGA": 949, + "TGGAAC": 950, + "TTGGAC": 951, + "TGAAAC": 952, + "TCTCAAACCA": 953, + "GTAGGAAACC": 954, + "TTATGA": 955, + "TGTACA": 956, + "TTCTGC": 957, + "GTCAAAA": 958, + "AACACC": 959, + "TTCTCACC": 960, + "TTATAA": 961, + "GGATA": 962, + "TTATTCA": 963, + "TTACCCC": 964, + "GCATCA": 965, + "TTCGTGCA": 966, + "GTTTTC": 967, + "TAAATCA": 968, + "TGCTCCCCACGCTTTC": 969, + "GTGACA": 970, + "TGCCCC": 971, + "GGCCCA": 972, + "GCCCAC": 973, + "TATAGC": 974, + "GGCGGC": 975, + "AACTTCA": 976, + "TATGACC": 977, + "TCACAA": 978, + "GTGCTCTACC": 979, + "GCATGA": 980, + "GAATAA": 981, + "TACTGC": 982, + "GTCGG": 983, + "GCTTAC": 984, + "TATCCA": 985, + "GCTCGCCGCTAC": 986, + "TCGACTAGTGAGCTATTACGCTTTCTTTAAA": 987, + "TGTCCC": 988, + "GACCCC": 989, + "TGAGTA": 990, + 
"GGGGTTCTTTTCGCCTTTCCCTCACGGTAC": 991, + "TTTGTAA": 992, + "GAATCA": 993, + "GCACAA": 994, + "GAATTC": 995, + "GTTTGATTGGCC": 996, + "TTCCAAGCC": 997, + "CCAGCTA": 998, + "TGTCTCCC": 999, + "GTTGAGCGATGG": 1000, + "TCCAAC": 1001, + "GAACCCC": 1002, + "TGACGAGCA": 1003, + "TGAATGGCTGC": 1004, + "GTTACATCTTCCGCGCA": 1005, + "TCTCAGACCA": 1006, + "GGATGGCTGCTTCTAAGCCAACCTCC": 1007, + "TGTTAC": 1008, + "GACTCGCTTTCGCTAC": 1009, + "TGGGAC": 1010, + "GGTACC": 1011, + "GGAGC": 1012, + "GGGCCCCC": 1013, + "TAAACAA": 1014, + "GTCGCC": 1015, + "GAGGAA": 1016, + "GACTTTC": 1017, + "TTTTTGA": 1018, + "AAAGTT": 1019, + "AAAACC": 1020, + "TAGAAA": 1021, + "GGTTAA": 1022, + "GCGCAA": 1023 + }, + "merges": [ + "T T", + "C C", + "A A", + "G G", + "T C", + "G C", + "T A", + "G A", + "C A", + "T G", + "T CC", + "T AA", + "TC A", + "T GG", + "TT A", + "G CC", + "T GC", + "TT C", + "G AA", + "T GA", + "G TT", + "TA C", + "GC A", + "CC A", + "GG A", + "AA A", + "G TA", + "GG C", + "G TC", + "GA C", + "AA C", + "CC C", + "CA C", + "TAA A", + "TT TT", + "TCC A", + "TT TC", + "G TG", + "TCA C", + "TT CC", + "TGC A", + "TT AA", + "TT CA", + "TTA C", + "TCC C", + "GCC A", + "TGG A", + "TT GG", + "TC AA", + "TG AA", + "GAA A", + "TGG C", + "TT TA", + "TG CC", + "G TAA", + "TGA C", + "TA CC", + "GA CC", + "G TTA", + "TA CA", + "TC TC", + "G TTC", + "G TCA", + "TT GC", + "TT GA", + "G TAC", + "GG AA", + "TA TC", + "GAA C", + "GC AA", + "GG CA", + "TAA C", + "TG TC", + "GC TC", + "G TCC", + "GA CA", + "TA GC", + "GG CC", + "CCA C", + "G TGA", + "AAA C", + "GCA C", + "G TGG", + "GCC C", + "AA AA", + "_ A", + "TT TCA", + "GC TA", + "G TGC", + "GA TA", + "TG TA", + "TG TT", + "GC TT", + "CC CA", + "TC TT", + "GGA C", + "TC GC", + "GG TA", + "TC GA", + "TA TT", + "GG TT", + "GA TT", + "TCC CC", + "GA GC", + "AA CA", + "AA CC", + "GG TC", + "GA TC", + "GG GG", + "TTA CC", + "CA CC", + "TA TA", + "GC GG", + "GC GC", + "TA GA", + "CC CC", + "GC GA", + "TC TCA", + "GG GA", + "GTC AA", + "G TTAC", + "TC 
TA", + "TCCA C", + "TT TTC", + "TCA CC", + "CA TC", + "TAC AA", + "TA GG", + "G TAAA", + "GA GA", + "TT CCA", + "TG AAA", + "TCC AA", + "GTA CC", + "GG TAC", + "TAA AA", + "TTC AA", + "CCA CC", + "CA CA", + "GCA CC", + "TT TGA", + "TG TG", + "TTA GC", + "GG GC", + "GA GTT", + "_ C", + "AAA CC", + "TGG AA", + "TC GTT", + "TG CCA", + "TG TAA", + "TT GTT", + "TA CCA", + "GA GG", + "TGC AA", + "_ TC", + "TT TAA", + "TAAA C", + "GA TG", + "GAA AA", + "TC AAA", + "TGGA C", + "GCC CA", + "TCC CA", + "TT TTA", + "GCCA C", + "TA GTT", + "TC GAC", + "G TCCA", + "TT TGG", + "GC GGC", + "TT AAA", + "TG TTA", + "TC GG", + "GTAA C", + "TG TCA", + "AA GC", + "TG TTC", + "TC TAA", + "GC TAC", + "TTC GG", + "TT GGA", + "GC TTTC", + "TTA CA", + "CC AA", + "TGCA C", + "GAA CA", + "TC AAC", + "TT TAAA", + "GG GTT", + "TT TCC", + "TG AAC", + "GCC CC", + "TC GCA", + "TC GCC", + "TT GTA", + "TCA CA", + "GA TGA", + "TA TCC", + "TT CAC", + "GAAA C", + "GC TGA", + "GC TTA", + "TC TCC", + "_ TCC", + "GC TCC", + "GAC AA", + "TC TTC", + "GC GCC", + "GG GTA", + "TGG CC", + "TG TCC", + "TT AAC", + "TAA CC", + "GGA CC", + "TT GTC", + "GTT CA", + "TT GAC", + "GCC CCA", + "G TGGC", + "GC TTC", + "TA TTC", + "GTT AA", + "TT TAC", + "GGC AA", + "GG TTC", + "TAA CA", + "TGA CC", + "G TGCA", + "TT GAA", + "TG CCC", + "TGG CA", + "TT GCA", + "GA TGC", + "AA TA", + "TAA GCC", + "GC GCA", + "TC TAC", + "TTAA CC", + "GAA CC", + "_A C", + "TGC TTC", + "TA TAA", + "TA TCA", + "TGG GTT", + "GTA GGA", + "TTA GG", + "TG TGG", + "TT GCC", + "TT TCCC", + "TCC CCA", + "TA CAC", + "GA CCA", + "TA CCC", + "G TGGA", + "TG TGA", + "G TCAC", + "AAC AA", + "GC TAA", + "AAA CA", + "TT CCC", + "GG TGC", + "GA TCA", + "GA TAA", + "GG TCC", + "GA TGG", + "GA GCA", + "GA GAA", + "GA TTA", + "GG GGA", + "GTG AA", + "GA GGA", + "TT TGC", + "TC GAA", + "GCA CCC", + "G TCCC", + "GG CCA", + "TTTT CC", + "G T", + "GG GCA", + "TC TGC", + "GC AAC", + "GA GCC", + "GG CAC", + "TCA TCC", + "TT TCAC", + "TTCA CC", + "TGA CA", + "GA 
TCC", + "TTTCA CC", + "TA TTA", + "GG GCC", + "GTT CC", + "TG TAC", + "GTG CC", + "TGC TGCC", + "GG TGG", + "TC TGG", + "TA TGC", + "G TGAC", + "TGC TGG", + "GG GAA", + "GG AAC", + "TC GTC", + "GC TG", + "TA GGA", + "GA GGC", + "TA TAC", + "TA GCC", + "GG GGC", + "GC GTT", + "GG AAA", + "GA CCC", + "GG TAA", + "TTC TGA", + "TC TTA", + "TA GAA", + "GG GTC", + "GTG TCTCA", + "TT GGC", + "TCA GC", + "CCA CA", + "TC GGC", + "TCCA CC", + "GC TGC", + "GA CAC", + "GTA TTTA", + "GC AAA", + "TC TGA", + "GCC AA", + "GG CCC", + "_ CC", + "TTTT CA", + "TCCC GTAGGA", + "TAC TCA", + "TT TG", + "GA TTC", + "GTA CA", + "TA TTAA", + "TGCTGCC TCCCGTAGGA", + "GG TCA", + "GC GTC", + "TA GTA", + "TGAA AA", + "GG TGA", + "GG GACC", + "GGA CA", + "TCA CCC", + "TGGA CC", + "TTCC AA", + "CA CCA", + "GA TAC", + "TTTC AA", + "CCC AA", + "TCAC GGTAC", + "GGA GTT", + "GC TGG", + "TTA GCA", + "GC TCA", + "TC GTA", + "TTCC CA", + "TATT TCAC", + "TA GTGA", + "CCA GTG", + "TTA TAC", + "TGCTGCCTCCCGTAGGA GTC", + "TC TACC", + "TTA TC", + "TA GCA", + "TG TGC", + "TCC TCC", + "TA TGA", + "TGCTTC TAAGCC", + "TA TGG", + "GG TTGA", + "TTTGA GTT", + "GC GAC", + "TTACC GCGGC", + "TCAC GAC", + "TTTT CACC", + "TA GTC", + "TTA GCC", + "TT TCCA", + "GTA GG", + "GTT TCC", + "TTTCCC TCACGGTAC", + "TC TTTT", + "TTGG CC", + "TTCA CA", + "GGA TCAC", + "TCA TTA", + "GTC AAAC", + "GG TTA", + "GA GAC", + "TTC AAA", + "GCA GAC", + "GC GAA", + "GTAC TCCCCA", + "GTT TGA", + "GA GTA", + "GTT GC", + "TACCA GGGTA", + "TT CCAC", + "GG GAC", + "G TGGAC", + "TGC TTTC", + "TGTC TCACGAC", + "TCTAA TCC", + "GCA TTC", + "GCC TTC", + "GGA TC", + "GCC TCC", + "TCAA AA", + "TC TG", + "TTC AAC", + "GG TGCC", + "TGCA CC", + "TA GAC", + "TA TTTT", + "GA GTC", + "TTACCGCGGC TGCTGG", + "TT TGAA", + "GTTA GCC", + "TGC TCC", + "TT GCCA", + "TACA CC", + "TACCAGGGTA TCTAATCC", + "TAGC TAA", + "TG TTCC", + "TT GCAC", + "CAC AA", + "TC TTCC", + "GA TTCC", + "TGG GC", + "TTAC GCTTTC", + "AACA TCC", + "TAC GCA", + "TCA TC", + "GC GTA", + "TTC 
TTC", + "TAA AAC", + "GC GGA", + "TAC GGC", + "GG GTGG", + "TTACGCTTTC TTTAAA", + "TT GTCC", + "TG TTTT", + "TT GTAA", + "TTCC TTTGAGTT", + "TCC GGA", + "TC GGA", + "TCCA CA", + "GCC TTGG", + "TGG CCA", + "TGC TC", + "GAC TAA", + "_A CC", + "TA TTCA", + "TGA GCC", + "TA GGC", + "GG TG", + "TT TGCC", + "GCA GTT", + "TTGG GACC", + "TACA CCA", + "TTA GAA", + "TC TTGC", + "TGA GA", + "AAA CCA", + "GG TTTCA", + "TAAA CA", + "TAC TC", + "TCAA CC", + "GTT TC", + "GGC GA", + "GC TTTT", + "_ TG", + "TTAA AA", + "TT GTG", + "GAAC AA", + "TTAA CA", + "TTGG CA", + "GTATTTA GCC", + "TA TCAC", + "TTACCGCGGCTGCTGG CAC", + "GTTC AA", + "TGC AAAA", + "GG TTTC", + "TCCC AA", + "TCTC GTAC", + "TT GTCA", + "GC TACC", + "TTTA CC", + "TCC TAC", + "TTGG AA", + "TCA GAC", + "TAA TAA", + "GAAC TGTCTCACGAC", + "GATT AAC", + "TC TTTC", + "TTA GGA", + "GCA CA", + "GGC TTC", + "TTGG GTT", + "GCAA CA", + "TGA GC", + "TGG TA", + "GAACTGTCTCACGAC GTTC", + "_ TCCC", + "GTG AAA", + "_ TCA", + "TGG GG", + "TTA CCC", + "GG GAAC", + "GG GTTC", + "GGA TGG", + "AACC TCC", + "TGGC TGCTTCTAAGCC", + "TGG GCC", + "TTA TTC", + "GTA TTACCGCGGCTGCTGGCAC", + "TTTCA CCCC", + "AAC AAC", + "TTTA CA", + "GTAAA C", + "GCCA CC", + "TTTT AA", + "CCCA GCTC", + "TTA GATG", + "TAA GG", + "TA TG", + "AAAC AA", + "GTGTCTCA GTT", + "GCCCCA GGA", + "TG TGTC", + "GTCC CA", + "GA TTAC", + "GTCC CC", + "TAC GCC", + "TGGGTT GTT", + "TTC GGA", + "GAA GAA", + "GATA GGGACC", + "GTTA CA", + "TACC AA", + "GTTA CC", + "TAC GGA", + "GACA TCGA", + "TGA GCCA", + "TACCAGGGTATCTAATCC TGTT", + "TATC GG", + "TAC GA", + "TAA GCA", + "TCACC AAC", + "TGGACC GTGTCTCAGTT", + "TAA GTT", + "_ AA", + "GACC TTAGC", + "TC TTTA", + "TGAC AA", + "TCAA CA", + "GC TACA", + "TCC GACC", + "GAA CCA", + "GC GGCA", + "TCA TCA", + "AATA TTCC", + "TCTT TCC", + "G TCCAC", + "GCCC AA", + "GAA GA", + "CCACC GGATCAC", + "GC GTG", + "TTC TCC", + "GCAA AA", + "GCC TGC", + "_ TGG", + "GTC TC", + "TCGC TTTC", + "CCAC TGCTGCCTCCCGTAGGAGTC", + "TA TACC", + "TA TTTA", + 
"GGGG TTC", + "GCA GA", + "TGCC TTC", + "GCA GG", + "TTA GTA", + "TGG TCC", + "GTCA CC", + "TGA TCC", + "TGGACCGTGTCTCAGTT CCAGTG", + "GCTT TAC", + "TGCC AA", + "GCC GTC", + "TCTT TAAA", + "_ TCTC", + "TGG TC", + "GCC GTT", + "GC TTCA", + "TTC GGC", + "TT GAAA", + "GTA TTCACC", + "TA GTG", + "GCA GGC", + "GTT GTT", + "TAAA CC", + "TA CCAC", + "TGA GG", + "TT TAAC", + "TAA AAA", + "_ CA", + "GGAA AA", + "TTTTCACC TTTCCCTCACGGTAC", + "GG TGGA", + "TCCCC CCA", + "GTACTCCCCA GGC", + "GCA GC", + "TTA GA", + "TAGC TGTC", + "TAC GC", + "GTT TCA", + "GGA GA", + "GAA TC", + "GACC GCCCCA", + "TGA TCATCC", + "GTA GA", + "GCTT GTGC", + "GTAA CA", + "GACC AA", + "TG TTAA", + "TG TTTA", + "GCTA TTACGCTTTCTTTAAA", + "TTA TA", + "TC GTG", + "TTAC AA", + "TAC AAA", + "TTA TCC", + "TTAC GCAC", + "TCAC TTC", + "TGCC CA", + "GAACTGTCTCACGACGTTC TGAA", + "TATTTCAC TCCCC", + "TG CCAC", + "_ GG", + "GCC GC", + "GCC CCC", + "TAA TAC", + "TGA TGA", + "_ TGC", + "TTC GA", + "TA TTCC", + "GAA AAA", + "CCC TTCCA", + "GGCA CC", + "TG TTGC", + "GAAA CA", + "TA TAAA", + "TCA TAA", + "TT GTGC", + "TA TTTC", + "TGCA CA", + "GTCAA TTCCTTTGAGTT", + "GACCGCCCCA GTCAAAC", + "TT GTAC", + "GC TTGC", + "GTAA AA", + "TA TCAA", + "GAC AAA", + "GACATCGA GGTGCC", + "TGC TCA", + "GTA TCA", + "GGGAAC GTATTCACC", + "TC TCAA", + "TTAA CCA", + "GC TCAC", + "TC GGTA", + "TA TGCC", + "GAAA CC", + "TCC TC", + "TT GACC", + "CAC TGC", + "GTA TGC", + "CCAC GCTTTC", + "GAC TCGCTTTC", + "TCA GGC", + "GC TCCA", + "GACTAA CCC", + "TGGC AA", + "TCA GGA", + "GCC AAA", + "GCA GCA", + "TTTTA TCC", + "TGG AAA", + "TG TGAC", + "TT GTGG", + "GTC GAGTT", + "GGAC GTTA", + "GG TTCC", + "TGG GA", + "TTA TTA", + "GAA CCACCGGATCAC", + "GAA GTT", + "TCC AAA", + "GC TTTA", + "CCC AAC", + "TC TGAC", + "GGCC GAC", + "TCA GA", + "TA GGCA", + "TCGC TACTCA", + "TG TCAA", + "TTGC AA", + "GAACTGTCTCACGACGTTCTGAA CCCAGCTC", + "TA GGAA", + "GC TGAA", + "TGAGCCA TTACC", + "TGGA CA", + "GG TTTT", + "TG TTTC", + "TT GTTC", + "CCA GTGA", + "TTC 
TCCA", + "TC TTGA", + "TGGC GAACA", + "TTTTC AAC", + "GTG TGTA", + "TAA TCA", + "CAC TATC", + "GTT TTA", + "GTCGAGTT GCAGAC", + "GTAA CC", + "TT TGAC", + "TCATTA TGCAAAA", + "GTCA TCC", + "TGA TC", + "TACC TCCA", + "TGGCGAACA GCCA", + "GAA GC", + "GAA GG", + "GCTC GCC", + "CATC GTT", + "GGAA TA", + "TT TGCA", + "CATC TTCC", + "TGC GA", + "TTGA CA", + "TACC CC", + "TACA CA", + "GTT GAGC", + "TG TGAA", + "TGG TTC", + "TGC TGA", + "TGCA TGC", + "GCA TAC", + "TAAC AA", + "GAC GG", + "TG TAAA", + "TCCC GAA", + "TGGCGAACAGCCA TACCC", + "GCAC TTCTGA", + "GTAC AA", + "TAA GTC", + "GTCA GTA", + "TT TTAC", + "TTC TA", + "GCC GCC", + "GC TATC", + "TGAGCCATTACC TCACCAAC", + "GGCC CC", + "GA TAAA", + "TA TACA", + "TCGAC TAGTGA", + "TAC TTTC", + "GG TGAA", + "TG TTCA", + "CATCTTCC GCGCA", + "_ TGCC", + "GA TCTC", + "GTGG AA", + "TG TCCA", + "GACA CC", + "GGTC TGGGTTGTT", + "TTA CCA", + "GCC GGC", + "GAA GGC", + "TGC AAC", + "GC TCCCC", + "TGGC TGC", + "GCCTCC GTTAC", + "GC TCCC", + "TCC GAA", + "GTC AAA", + "GGA GTTAGCC", + "GAA CCC", + "TTACC AA", + "_ GCC", + "GCTC GAC", + "GA GTTC", + "TGAC TGATCATCC", + "GAC TTAA", + "GG TACA", + "GA TCAA", + "GGTCC TCTCGTAC", + "AAC TTC", + "GA TGGC", + "TTTCACCCC TA", + "GTTTCC CAC", + "GGAA CC", + "TAA TCC", + "TTTTC GCC", + "TGGCGAACAGCCATACCC TTGGGACC", + "TAGC GATTCC", + "GTCC AA", + "GCA GCC", + "CCA GTA", + "GTG TCA", + "TTTCA CA", + "GGA GGC", + "TAA TC", + "TA TATC", + "TAA TTC", + "GTGGAC TACCAGGGTATCTAATCCTGTT", + "TG TTGG", + "TGG TTTCA", + "GTGG CA", + "GA TTTT", + "GTGC AA", + "TGC TGC", + "TGAA CC", + "TTACGCAC TCTTTAAA", + "GGACGTTA GCACCC", + "TC TTCA", + "GCTT TTC", + "CCACTGCTGCCTCCCGTAGGAGTC TGGACCGTGTCTCAGTTCCAGTG", + "GAA GAC", + "GCATTC GCACTTCTGA", + "GTA GC", + "GG GAAA", + "TCCAC AA", + "GGCC AA", + "TGG GGC", + "GTG CCA", + "TGAA CA", + "GC TGGC", + "_ GTA", + "GGTCTGGGTTGTT TCCC", + "TTAGATG TTTCA", + "GCTT GCACCC", + "TCC CCC", + "GGAA TTTC", + "TTATAC AAAA", + "GTATTACCGCGGCTGCTGGCAC GGAGTTAGCC", + "GCCA CA", + 
"GA TTTC", + "GGA CCA", + "GTCA CA", + "CCCA CC", + "GG TCTC", + "GGC TC", + "TGTGTC GGTT", + "TTTGG GACCTTAGC", + "CAC GAC", + "TC GGTT", + "TG TGTT", + "GGC TCATTATGCAAAA", + "TGC AAA", + "GTT TA", + "GTAA TTCC", + "TGG TTCAC", + "GTACC TTTTATCC", + "GC GTTC", + "GA GAAC", + "TAA GTA", + "TACC CACCA", + "TCA GCC", + "GG AAACC", + "TAC TTA", + "TAA CCA", + "CAC TTC", + "TGC TAA", + "_ GC", + "TTC GTAC", + "TC AAAC", + "TGC TCGAC", + "TGAA TGA", + "TC TCAC", + "GAC TAC", + "TTA GATA", + "GGGGTTC TTTTCGCC", + "TACC CA", + "TCGGTA TTCC", + "GTC TA", + "CCAC AA", + "TCC GG", + "TAGG GC", + "TAA GA", + "GTC TCGCA", + "GCTA CAC", + "TT TGGA", + "TTA GTC", + "TCCACC GCTTGTGC", + "TT TACAA", + "GGC AAA", + "GTTA TA", + "TT AAAC", + "TG TCAC", + "TA TATT", + "GTATTTAGCC TTGGA", + "GTC AAC", + "GG GCCA", + "TCA TTC", + "TGGCTGCTTCTAAGCC AACCTCC", + "GTC GA", + "GCTGA CCCA", + "TT TGGC", + "GG GCGG", + "TG TTGA", + "TTCC CC", + "TTTC GG", + "GTT CCC", + "TGA TGG", + "TG TAAC", + "TA TGTA", + "GGA TCA", + "GTT AAA", + "GGAA CA", + "TTA TCCA", + "GGC GG", + "TG TCTC", + "_ GA", + "TA GAAC", + "GCTTTAC GCCCA", + "TAC TGA", + "GCAA CC", + "TTATACAAAA GGTAC", + "TGA GTC", + "GAGC TGAC", + "GTT GA", + "TGG AAC", + "TT GGAC", + "TG AAAC", + "TCTC AAACCA", + "GTAGG AAACC", + "TTA TGA", + "TG TACA", + "TTC TGC", + "GTC AAAA", + "AACA CC", + "TTC TCACC", + "TTA TAA", + "GGA TA", + "TTA TTCA", + "TTACC CC", + "GCA TCA", + "TTC GTGCA", + "GTT TTC", + "TAAA TCA", + "TGCTCC CCACGCTTTC", + "GTGA CA", + "TGCC CC", + "GGCC CA", + "GCC CAC", + "TA TAGC", + "GGC GGC", + "AAC TTCA", + "TA TGACC", + "TCAC AA", + "GTGC TCTACC", + "GCA TGA", + "GAA TAA", + "TAC TGC", + "GTC GG", + "GC TTAC", + "TA TCCA", + "GCTCGCC GCTAC", + "TCGACTAGTGA GCTATTACGCTTTCTTTAAA", + "TG TCCC", + "GACC CC", + "TGA GTA", + "GGGGTTCTTTTCGCC TTTCCCTCACGGTAC", + "TT TGTAA", + "GAA TCA", + "GCAC AA", + "GAA TTC", + "GTTTGA TTGGCC", + "TTCCAA GCC", + "CCA GCTA", + "TGTC TCCC", + "GTTGAGC GATGG", + "TCC AAC", + "GAA CCCC", + 
"TGAC GAGCA", + "TGAA TGGCTGC", + "GTTA CATCTTCCGCGCA", + "TCTCA GACCA", + "GGA TGGCTGCTTCTAAGCCAACCTCC", + "TG TTAC", + "GACTCGCTTTC GCTAC", + "TGG GAC", + "GG TACC", + "GGA GC", + "GGGCC CCC", + "TAAAC AA", + "GTC GCC", + "GA GGAA", + "GAC TTTC", + "TTTT TGA", + "AAA GTT", + "AAAA CC", + "TA GAAA", + "GG TTAA", + "GC GCAA" + ] + } +} \ No newline at end of file diff --git a/tokenizer.model b/tokenizer.model new file mode 100644 index 0000000..42d871a --- /dev/null +++ b/tokenizer.model @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:0bcccad15ab3d220976cbfaf12718e4236f2159e5d0915d95c91f8d866b6569c +size 6128 diff --git a/tokenizer_config.json b/tokenizer_config.json new file mode 100644 index 0000000..6c2197c --- /dev/null +++ b/tokenizer_config.json @@ -0,0 +1,64 @@ +{ + "added_tokens_decoder": { + "0": { + "content": "[PAD]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "1": { + "content": "[UNK]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "2": { + "content": "[SEP]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "3": { + "content": "[BOS]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "4": { + "content": "[EOS]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "5": { + "content": "[MASK]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + } + }, + "bos_token": "[BOS]", + "clean_up_tokenization_spaces": false, + "cls_token": null, + "do_lower_case": null, + "eos_token": "[EOS]", + "extra_special_tokens": {}, + "mask_token": "[MASK]", + "model_max_length": 512, + "pad_token": "[PAD]", + "sep_token": "[SEP]", + "tokenizer_class": "PreTrainedTokenizerFast", + 
"unk_token": "[UNK]" +}