OpenCoder is an open and reproducible code LLM family comprising 1.5B and 8B base and chat models, supporting both English and Chinese. Trained from scratch on 2.5 trillion tokens, composed of 90% raw code and 10% code-related web data, it reaches the performance of top-tier code LLMs. The release provides not only model weights and inference code, but also the reproducible training data, the complete data-processing pipeline, rigorous experimental ablation results, and detailed training protocols, giving researchers an open foundation for building and innovating in code AI.
A state-of-the-art code LLM that beats Qwen2.5-Coder at equivalent size.
https://opencoder-llm.github.io/
https://arxiv.org/pdf/2411.04905
https://huggingface.co/infly/OpenCoder-1.5B
https://huggingface.co/infly/OpenCoder-1.5B-Instruct
https://huggingface.co/infly/OpenCoder-8B
https://huggingface.co/infly/OpenCoder-8B-Instruct