Dataset Card for Tokenization Robustness
TokSuite Benchmark (Turkish Collection)
Dataset Description
This dataset is part of TokSuite, a comprehensive benchmark designed to measure how different tokenization strategies affect language model performance and robustness. This subset contains Turkish-language multiple-choice text-completion questions with real-world perturbations that probe tokenizer robustness.
- Curated by: R3 Research Team
- Language(s): Turkish (tr)
- License: MIT License
Dataset Summary
TokSuite addresses a fundamental challenge in language model research: understanding how tokenization choices impact model behavior in isolation. The Turkish subset measures model performance on canonical questions and their perturbed variants.
Key Features:
- 40 canonical questions covering general knowledge, geography, science, and language understanding
- Multiple perturbation types reflecting real-world text variations in Turkish
- Parallel structure with the other TokSuite language collections (English, Italian, Farsi, Chinese)
- Native speaker curation ensuring linguistic authenticity
Supported Tasks
- Multiple-Choice Question Answering: Text completion format with 4 answer choices, each of which completes the question prefix (a scoring sketch follows this list)
- Tokenizer Robustness Evaluation: Measuring performance degradation under various text perturbations
- Multilingual NLP Benchmarking: Evaluating language models on Turkish text understanding
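As a rough illustration of the completion format, the sketch below scores each choice by the log-probability a causal language model assigns to it as a continuation of the question prefix. It assumes the Hugging Face transformers library; the model name (gpt2) and the helper names are illustrative placeholders, not part of the benchmark.

```python
# Minimal sketch, assuming a causal LM from Hugging Face transformers.
# "gpt2" is only a placeholder model; substitute any causal LM you want to evaluate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

def completion_logprob(prefix: str, completion: str) -> float:
    """Sum of log-probabilities assigned to `completion` given `prefix`.

    Simplification: assumes the tokenization of `prefix` is a prefix of the
    tokenization of the full string, which can be off by one token at the boundary.
    """
    prefix_len = tok(prefix, return_tensors="pt").input_ids.shape[1]
    full_ids = tok(prefix + " " + completion, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(full_ids).logits
    logprobs = torch.log_softmax(logits[:, :-1], dim=-1)            # predicts tokens 1..T-1
    token_lp = logprobs.gather(-1, full_ids[:, 1:].unsqueeze(-1)).squeeze(-1)
    return token_lp[0, prefix_len - 1:].sum().item()                # completion tokens only

def predict(question: str, choices: list[str]) -> int:
    """Index of the choice with the highest completion log-probability."""
    scores = [completion_logprob(question, c) for c in choices]
    return max(range(len(choices)), key=scores.__getitem__)
```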
Languages
The dataset contains text in Turkish (language code: tur_Latn / tr).
Dataset Structure
Data Fields
| Field | Type | Description |
|---|---|---|
| `question` | string | The question text in Turkish |
| `choices` | list[string] | 4 multiple-choice answer options |
| `answer` | int64 | Index of the correct answer |
| `answer_label` | string | Letter label of the correct answer |
| `split` | string | Dataset split identifier |
| `subcategories` | string | Perturbation category |
| `lang` | string | Language code |
| `second_lang` | string | English translation or description of the question |
| `notes` | string | Additional context about the question or perturbation |
| `id` | string | Unique question identifier |
| `set_id` | float64 | Question set grouping identifier |
| `variation_id` | float64 | Variation number within a question set |
| `vanilla_cos_sim_to_canonical` | dict[string, float] | Cosine similarity scores to canonical form (raw tokens) |
| `trimmed_cos_sim_to_canonical` | dict[string, float] | Cosine similarity scores after token normalization |
| `token_counts` | dict[string, integer] | Number of tokens produced per tokenizer |
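For reference, a typical way to read these fields with the Hugging Face datasets library is sketched below; the Hub path is a placeholder, not the dataset's actual identifier.

```python
# Minimal sketch of accessing the fields above.
# "ORG/toksuite-turkish" is a placeholder path; use the dataset's real Hub identifier.
from datasets import load_dataset

ds = load_dataset("ORG/toksuite-turkish", split="test")

row = ds[0]
print(row["question"])                # Turkish question prefix
print(row["choices"])                 # the 4 completion options
print(row["choices"][row["answer"]])  # correct completion
print(row["answer_label"])            # its letter label, e.g. "C"
print(row["subcategories"])           # perturbation category, e.g. "Canonical"
print(row["token_counts"])            # tokens produced per tokenizer, e.g. {"gpt2": 12, ...}
```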
Dataset Creation
Curation Rationale
This dataset was created to:
- Systematically evaluate how different tokenization strategies handle Turkish
- Measure robustness against real-world text perturbations specific to Turkish
- Support research into the impact of tokenization on language model behavior
- Provide standardized benchmarks for Turkish language models
The questions were designed to be straightforward with high baseline accuracy, allowing researchers to cleanly measure performance degradation when perturbations are applied.
Source Data
Data Collection and Processing
- Canonical Questions: 40 baseline questions created in English
- Translation: Native Turkish speakers translated the questions into Turkish
- Perturbations: Each question underwent targeted perturbations designed to reflect Turkish characteristics
- Validation: Model-in-the-loop process ensured high baseline accuracy
Perturbation Categories
Canonical The baseline Turkish text written in standard, grammatically correct Turkish with no perturbations. This serves as the reference condition for evaluating the impact of all other perturbations.
Abbreviations Introduces common Turkish abbreviations and shortened forms (e.g., Dr., Prof., vb., sn.), testing tokenizer robustness to compressed lexical forms.
Capitalization Alters capitalization patterns by randomly capitalizing, lowercasing, or mixing case within words and sentences, simulating informal writing or casing errors.
Code / Language / Script Switching Mixes Turkish with English words or phrases within the same sentence, reflecting real-world code-switching common in technical, academic, or online Turkish text.
Contractions Applies contracted or fused forms common in informal Turkish writing (e.g., dropped vowels or merged suffix boundaries), stressing tokenizer handling of agglutinative morphology.
Date Formats Varies date representations (e.g., 12.03.2022, 12 Mart 2022, 03/12/22), testing sensitivity to formatting and punctuation variation.
Dialects Introduces regional Turkish dialectal or colloquial variants that preserve meaning but differ lexically or morphologically from Standard Turkish.
English Keyboard Simulates Turkish text typed on an English keyboard, leading to missing or substituted Turkish-specific characters (e.g., cok instead of çok, saglik instead of sağlık).
Grammatical Errors Injects plausible grammatical mistakes such as incorrect suffix usage, agreement errors, or case marking issues, reflecting non-standard or learner Turkish.
Keyboard Proximity Errors Introduces typos caused by pressing adjacent keys on a keyboard, simulating realistic typing errors without intentionally changing word choice.
Numerical Formats Varies numeric representations (e.g., 1.000 vs. 1000, comma vs. period usage for decimals), testing tokenizer sensitivity to locale-specific number formatting.
Orthographic Errors Applies spelling mistakes that violate standard Turkish orthography (e.g., incorrect consonant usage or misspelled suffixes) while remaining plausible to native readers.
Phonetic Spelling Replaces words with spellings based on pronunciation rather than standard orthography, reflecting informal or speech-inspired Turkish writing.
Plausible Diacritics Errors Introduces missing, incorrect, or substituted diacritics (e.g., s vs. ş, g vs. ğ, i vs. ı), testing tokenizer sensitivity to Turkish-specific characters.
Similar Words Substitutes words with closely related or easily confusable alternatives (e.g., near-synonyms or minimal lexical contrasts), preserving sentence plausibility.
Spelled-Out Forms Replaces numerals, abbreviations, or symbols with fully spelled-out Turkish equivalents, increasing sequence length and altering token boundaries.
Typographical Errors Introduces general typographical mistakes such as duplicated letters, missing characters, or minor corruption commonly found in fast or careless typing.
Web Search Query Rewrites questions in the style of Turkish web search queries, using keyword-heavy phrasing, omitted function words, and informal structure typical of search engine inputs.
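Since the subcategories field records which of these perturbations was applied to each row (with "Canonical" as the unperturbed baseline) and set_id ties every variant back to its source question, robustness can be summarized as the accuracy gap between the canonical condition and each perturbation category. A hedged sketch of that comparison is below; predict is a user-supplied stand-in (for example, the completion-scoring sketch earlier in this card), not part of the dataset.

```python
# Hedged sketch: per-category accuracy and its drop relative to the canonical baseline.
# `predict(question, choices) -> int` is a stand-in for your model's answer selection.
from collections import defaultdict

def accuracy_by_category(rows, predict):
    correct, total = defaultdict(int), defaultdict(int)
    for row in rows:
        cat = row["subcategories"]
        total[cat] += 1
        correct[cat] += int(predict(row["question"], row["choices"]) == row["answer"])
    return {cat: correct[cat] / total[cat] for cat in total}

def degradation(acc_by_cat):
    # Accuracy drop of each perturbation category relative to "Canonical".
    baseline = acc_by_cat["Canonical"]
    return {cat: baseline - acc for cat, acc in acc_by_cat.items() if cat != "Canonical"}
```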
Who are the source data producers?
Native Turkish speakers curated and validated all questions and perturbations. The TokSuite research team at R3 designed the overall benchmark framework.
Annotations
Annotation process
Questions were manually created and translated by native speakers. Each perturbation was carefully designed to reflect authentic variations encountered in real-world Turkish text processing.
Who are the annotators?
Native Turkish speakers with expertise in linguistics and NLP, working as part of the TokSuite project.
Personal and Sensitive Information
The dataset contains only general knowledge questions and does not include any personal or sensitive information.
Considerations for Using the Data
Social Impact of Dataset
This dataset contributes to improving language technology for Turkish speakers by enabling better understanding of tokenization challenges and supporting more robust multilingual models.
Discussion of Biases
- Language variety: The dataset uses Standard Turkish (Türkiye Türkçesi) and may not fully represent regional or dialectal variations.
- Script focus: Only the Latin script is used; Turkish-specific diacritics and keyboard-related variations are included as perturbations.
- Domain coverage: Questions focus on general knowledge and may not represent domain-specific Turkish language use.
- Question simplicity: Designed for high baseline accuracy, which may not reflect real-world task complexity.
Other Known Limitations
- Relatively small dataset size (evaluation-only; not intended for training)
- Multiple-choice completion format, which does not cover open-ended generation
- Perturbations are Turkish-specific and may not transfer directly to other languages
- Results may differ at larger model scales
Additional Information
Dataset Curators
The dataset was curated by the TokSuite research team at R3.
Licensing Information
MIT license
Citation Information
If you use this dataset in your research, please cite the TokSuite paper:
@inproceedings{toksuite2026,
title={TokSuite: Measuring the Impact of Tokenizer Choice on Language Model Behavior},
author={Altıntaş, Gül Sena and Ehghaghi, Malikeh and Lester, Brian and Liu, Fengyuan and Zhao, Wanru and Ciccone, Marco and Raffel, Colin},
booktitle={Preprint.},
year={2026},
url={TBD}
}
Paper: TokSuite: Measuring the Impact of Tokenizer Choice on Language Model Behavior
Contributions
This dataset is part of TokSuite, which includes:
- 14 language models with identical architectures but different tokenizers
- Multilingual benchmark datasets (English, Turkish, Italian, Farsi, Chinese)
- Comprehensive analysis of tokenization's impact on model behavior
Contact
For questions or issues related to this dataset, please refer to the TokSuite project or contact the authors of the paper.
Part of the TokSuite Project
Understanding Tokenization's Role in Language Model Behavior