OLMo
Description of OLMo
OLMo is a family of fully open language models from the Allen Institute for AI, with the entire model pipeline released openly: datasets, code, weights, training logs, and methodology. This makes OLMo one of the most transparent platforms for AI research and development. The second generation, OLMo 2, includes 7B and 13B parameter models trained on trillions of tokens and released under the Apache 2.0 license, with a context window of roughly 4K tokens and quality on par with the best open-weight LLMs of their class.
The latest OLMo 2 and OLMo 3 releases expand the context window to tens of thousands of tokens, up to 128K, and add a thinking mode for step-by-step reasoning and tool use.
OLMo can be used to build research and experimental AI systems, enterprise assistants, RAG platforms over internal data, and specialized models for code, science, and analytics, all with full control over the training and fine-tuning pipeline.
The FreeBlock team helps you select the right OLMo configuration, design the architecture (including RAG, agents, and tools), fine-tune the model on your data, and deploy it in the cloud or on-prem. If you want to rely on a truly open stack and reduce vendor lock-in risk, order AI project development based on OLMo from FreeBlock.
Submit an application
write to us on Telegram
@FreeBlockDev
or by e-mail
info@freeblock.dev
yes, sometimes all you need is a PDF
download presentation
We process cookies. By staying on the site, you consent to the use of cookies in accordance with the privacy policy.