Arcee aims to reboot American open-source AI with new Trinity models released under Apache 2.0
For much of 2025, the frontier of open-weight language models has been defined not in Silicon Valley or New York City, but in Beijing and Hangzhou. Chinese research labs including Alibaba's Qwen, DeepSeek, Moonshot and Baidu have rapidly set the pace in developing large-scale, open Mixture-of-Experts (MoE) models, often with permissive licenses and leading benchmark performance. While OpenAI fielded its own open-source, general-purpose LLMs this summer as well (gpt-oss-20B and 120B), their uptake has been slowed by the many alternatives that perform as well or better.

Now, one small U.S. company is pushing back. Today, Arcee AI announced the release of Trinity Mini and Trinity Nano Preview, the first two models in its new "Trinity" family, an open-weight MoE model suite fully trained in the United States. Users can try the former directly for themselves in a chatbot format on Arcee's new website, chat.arcee.ai, and developers can download the code for both models on Hugging Face.
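For developers who want to try the weights locally rather than through the chatbot, the sketch below shows one plausible way to load an open-weight model from Hugging Face with the transformers library. The repo ID "arcee-ai/Trinity-Mini" is an assumption inferred from the naming in this article, not a confirmed identifier; check the Arcee AI organization page on Hugging Face for the actual one.

```python
# Minimal sketch: load an open-weight model from Hugging Face and generate text.
# NOTE: the repo ID below is assumed from the model name in the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "arcee-ai/Trinity-Mini"  # assumed repo ID, verify on Hugging Face

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a Mixture-of-Experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```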