Excerpts below. I thought the model names were interesting.

Toronto scaleup says Tiny Aya can run in regions where large-scale infrastructure isn’t always available.

Toronto AI company Cohere has released a suite of new multilingual AI models that support more than 70 languages on any device, even offline.

The base Tiny Aya model (Tiny Aya-Base) has more than 3.35 billion parameters (the settings that control an AI model's output and behavior), covering languages like Amharic, German, Latvian, Tagalog, and Zulu. That is a small count compared to well-known large language models like those behind ChatGPT, which have hundreds of billions of parameters.
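(An aside from me, not the article: the reason 3.35 billion parameters is small enough for phones and laptops comes down to bytes. A back-of-the-envelope sketch of the weights-only memory footprint at common precisions, ignoring activations and runtime overhead:)

```python
# Rough weights-only memory footprint for a 3.35B-parameter model.
# Ignores activations, KV cache, and runtime overhead, so real usage
# will be somewhat higher.
params = 3.35e9
for precision, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{precision:>9}: ~{params * bytes_per_param / 1e9:.1f} GB")
```

At 4-bit quantization that works out to under 2 GB, which is roughly what makes the "any device, even offline" claim plausible.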

Tiny Aya-Base powers the instruction-tuned Tiny Aya-Global model, which Cohere released on Tuesday alongside several specialized models for specific regions of the world. Tiny Aya-Earth is strongest for languages across Africa and West Asia; Tiny Aya-Fire is strongest for South Asian languages; and Tiny Aya-Water is strongest for the Asia-Pacific and European regions.

Cohere said in a blog post that this approach “allows each model to develop stronger linguistic grounding and cultural nuance,” resulting in systems that “feel more natural and reliable for the communities they are meant to serve.”

“The future of multilingual AI will not be one giant model,” the blog post reads. “It will be a vibrant ecosystem of many models, shaped by many voices.”

Cohere said Tiny Aya is designed to run on local devices, in classrooms, in community labs, and in regions where large-scale infrastructure isn’t always available, with the aim of bringing “high-quality AI” closer to researchers working on underrepresented languages and to developers building locally. The company said a university lab could use the model for offline translation, or it could serve as an AI education tool in classrooms and community settings, without relying on cloud APIs.

The models are here, I think: https://huggingface.co/collections/CohereLabs/tiny-aya
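If you want to try one offline, the standard Hugging Face transformers loading pattern should work. Sketch only: the repo id below is my guess based on the collection URL, and I'm assuming the instruction-tuned models ship a chat template; check the collection page for the real names.

```python
# Minimal sketch: load a Tiny Aya model with transformers and generate
# locally, with no cloud API involved. The repo id is a guess, not confirmed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CohereLabs/tiny-aya-global"  # hypothetical; see the collection page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Format a chat-style prompt, assuming the repo provides a chat template.
messages = [{"role": "user", "content": "Translate to Tagalog: Where is the library?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Once the weights are downloaded (from_pretrained caches them), nothing here needs a network connection.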