Workshop on NLP and Large Language Models for the Iranian Language Family
co-located with
The 19th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2026)
SilkRoadNLP 2026 has now concluded

SilkRoadNLP 2026
SilkRoadNLP: An emerging community for NLP and LLMs in the Iranian language family.
SilkRoadNLP was established as the first ACL-affiliated forum dedicated to low-resource NLP and large language models for the Iranian language family.
At its core, the workshop brought together scholars, engineers, and community researchers to explore the linguistic, historical, and social dimensions of NLP for Iranian languages. These include Persian (Farsi), Dari, and Tajik, as well as Kurdish, Pashto, Balochi, Gilaki, Mazandarani, Luri, Ossetic, and related varieties—many of which remain severely under-represented in computational research despite their deep literary and oral traditions.
The rise of foundation models brings both promise and risk: without deliberate inclusion, LLMs can amplify linguistic inequality and obscure local nuance.
SilkRoadNLP is an ongoing effort to address this challenge by fostering cross-disciplinary collaboration among NLP, linguistics, and cultural studies. Our goal is to support the development of resources, evaluation frameworks, and models that reflect the linguistic diversity and cultural context of the region.
We are continuing this work through future workshops, shared tasks, and open community resources. We welcome researchers and collaborators interested in contributing to this growing community.
SilkRoadNLP 2026 marked the first dedicated NLP workshop for the languages of the Iranian language family.
Despite significant challenges (due to the war in the region where many of these languages are spoken), researchers came together to share work, build connections, and begin shaping a community. Thank you to all authors, participants, committee members, and organizers who helped make this workshop possible.
This is just the beginning. We are now focused on growing this effort through future workshops, shared tasks, and open resources.