Reading List
Ivan Mehta / TechCrunch:
Cohere releases Tiny Aya, a family of 3.35B-parameter open-weight models supporting 70+ languages for offline use, trained on a single cluster of 64 H100 GPUs — Enterprise AI company Cohere launched a new family of multilingual models on the sidelines of the ongoing India AI Summit.