Tiny startup Arcee AI built a 400B open source LLM from scratch to best Meta’s Llama

By Simon Osuji | January 28, 2026 | Creator Economy

Many in the industry think the winners of the AI model market have already been decided: Big Tech will own it (Google, Meta, Microsoft, a bit of Amazon) along with their model makers of choice, largely OpenAI and Anthropic. 

But tiny 30-person startup Arcee AI disagrees. The company just released a truly and permanently open (Apache-licensed), general-purpose foundation model called Trinity, and Arcee claims that at 400B parameters it is among the largest open-source foundation models ever trained and released by a U.S. company.

Arcee says Trinity compares to Meta’s Llama 4 Maverick 400B and to GLM-4.5, a high-performing open-source model from China’s Z.ai (a lab spun out of Tsinghua University), according to benchmark tests run on the base models (with very little post-training).

Image: Arcee AI benchmarks for its Trinity large LLM (preview version, base model). Credits: Arcee

Like other state-of-the-art (SOTA) models, Trinity is geared for coding and multi-step processes like agents. Still, despite its size, it’s not a true SOTA competitor yet because it currently supports only text.

More modalities are in the works: a vision model is currently in development, and a speech-to-text version is on the roadmap, CTO Lucas Atkins told TechCrunch. By comparison, Meta’s Llama 4 Maverick is already multimodal, supporting text and images.

But before adding more modalities to its roster, Arcee says, it wanted a base LLM that would impress its main target customers: developers and academics. The team particularly wants to woo U.S. companies of all sizes away from choosing open models from China.

“Ultimately, the winners of this game, and the only way to really win over the usage, is to have the best open-weight model,” Atkins said. “To win the hearts and minds of developers, you have to give them the best.”

The benchmarks show that the Trinity base model, currently in preview while more post-training takes place, is largely holding its own and, in some cases, slightly besting Llama on tests of coding and math, common sense, knowledge and reasoning.

The progress Arcee has made so far toward becoming a competitive AI lab is impressive. The large Trinity model follows two smaller models released in December: the 26B-parameter Trinity Mini, a fully post-trained reasoning model for tasks ranging from web apps to agents, and the 6B-parameter Trinity Nano, an experimental model designed to push the boundaries of models that are tiny yet chatty.

The kicker is that Arcee trained them all in six months for $20 million total, using 2,048 Nvidia Blackwell B300 GPUs. That came out of the roughly $50 million the company has raised so far, said founder and CEO Mark McQuade.

That kind of cash was “a lot for us,” said Atkins, who led the model building effort. Still, he acknowledged that it pales in comparison to how much bigger labs are spending right now.
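
For a rough sense of scale, here is a back-of-envelope sketch of what those reported figures imply, assuming (the article does not say this) that all 2,048 GPUs ran for the full six months:

```python
# Back-of-envelope estimate from the reported figures: 2,048 B300 GPUs,
# six months, $20M total. Full utilization is an assumption, not something
# the article states.
gpus = 2048
hours = 6 * 30 * 24            # ~4,320 hours in six months
gpu_hours = gpus * hours       # 8,847,360 GPU-hours
budget = 20_000_000            # reported total training spend, USD

print(f"GPU-hours: {gpu_hours:,}")                              # 8,847,360
print(f"Implied cost per GPU-hour: ${budget / gpu_hours:.2f}")  # ~$2.26
```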

The six-month timeline “was very calculated,” said Atkins, whose career before LLMs involved building voice agents for cars. “We are a younger startup that’s extremely hungry. We have a tremendous amount of talent and bright young researchers who, when given the opportunity to spend this amount of money and train a model of this size, we trusted that they’d rise to the occasion. And they certainly did, with many sleepless nights, many long hours.” 

McQuade, previously an early employee at open-source model marketplace Hugging Face, says Arcee didn’t start out wanting to become a new U.S. AI lab: The company was originally doing model customization for large enterprise clients like SK Telecom.

“We were only doing post-training. So we would take the great work of others: We would take a Llama model, we would take a Mistral model, we would take a Qwen model that was open source, and we would post-train it to make it better” for a company’s intended use, he said, including doing the reinforcement learning. 
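
As a rough illustration of that kind of post-training workflow, here is a minimal supervised fine-tuning sketch (one common post-training step) assuming a recent version of Hugging Face’s TRL library; the base checkpoint and toy dataset are placeholders, not Arcee’s actual pipeline or data.

```python
# Minimal post-training sketch using Hugging Face TRL's SFTTrainer.
# The base checkpoint and the two-example dataset are placeholders; a real
# customization job would use a client's domain data and far more compute.
from datasets import Dataset
from trl import SFTConfig, SFTTrainer

train_data = Dataset.from_list([
    {"text": "Question: What is post-training?\nAnswer: Adapting a pretrained base model to a specific task."},
    {"text": "Question: Why post-train?\nAnswer: To align an open-weight model with a company's intended use."},
])

trainer = SFTTrainer(
    model="Qwen/Qwen2.5-0.5B",   # any small open-weight base checkpoint
    train_dataset=train_data,
    args=SFTConfig(output_dir="./sft-demo", max_steps=10, per_device_train_batch_size=1),
)
trainer.train()  # writes a lightly fine-tuned checkpoint to ./sft-demo
```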

But as the client list grew, Atkins said, building their own model became a necessity, and McQuade was worried about relying on other companies. At the same time, many of the best open models were coming from China, which U.S. enterprises were leery of, or were barred from using.

It was a nerve-wracking decision. “I think there’s less than 20 companies in the world that have ever pre-trained and released their own model” at the size and level that Arcee was gunning for, McQuade said. 

The company started small, trying its hand at a tiny 4.5B-parameter model created in partnership with training company DatologyAI. The project’s success then encouraged bigger endeavors.

But if the U.S. already has Llama, why does it need another open-weight model? Atkins says that by choosing the open-source Apache license, the startup has committed to always keeping its models open. This comes after Meta CEO Mark Zuckerberg last year indicated his company might not always make all of its most advanced models open source.

“Llama can be looked at as not truly open source as it uses a Meta-controlled license with commercial and usage caveats,” he says. This has caused some open source organizations to claim that Llama isn’t open source compliant at all.

“Arcee exists because the U.S. needs a permanently open, Apache-licensed, frontier-grade alternative that can actually compete at today’s frontier,” McQuade said.

All Trinity models, large and small, can be downloaded for free. The largest version will be released in three flavors. Trinity Large Preview is a lightly post-trained instruct model, meaning it’s been trained to follow human instructions, not just predict the next word, which gears it for general chat usage. Trinity Large Base is the base model without post-training.

Then there’s TrueBase, a model without any instruct data or post-training, so enterprises or researchers that want to customize it won’t have to unwind any prior data, rules, or assumptions.
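
For developers who want to try the downloadable weights, a minimal loading sketch with Hugging Face transformers might look like the following; the repository ID is a hypothetical placeholder, and a 400B-parameter checkpoint would in practice need to be sharded across many GPUs.

```python
# Minimal sketch: load an open-weight Trinity checkpoint with transformers.
# The repo ID below is a hypothetical placeholder, not a confirmed release name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "arcee-ai/Trinity-Large-Preview"   # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",     # use the checkpoint's native precision
    device_map="auto",      # shard across available GPUs (requires accelerate)
)

prompt = "Explain the difference between a base model and an instruct model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```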

Arcee AI will eventually offer a hosted version of its general-release model at what it says will be competitive API pricing. That release is up to six weeks away as the startup continues to improve the model’s reasoning training.

API pricing for Trinity Mini is $0.045 / $0.15, and there is a rate-limited free tier available, too. Meanwhile, the company still sells post-training and customization options.
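
Assuming those figures are quoted per million input and output tokens respectively (the article does not spell out the units), a request-level cost estimate is straightforward:

```python
# Rough per-request cost estimate for Trinity Mini, assuming the quoted
# $0.045 / $0.15 figures are per million input / output tokens. The units
# are an assumption; the article does not state them explicitly.
PRICE_INPUT_PER_M = 0.045
PRICE_OUTPUT_PER_M = 0.15

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated dollar cost of one API call under the assumed pricing."""
    return (input_tokens / 1e6) * PRICE_INPUT_PER_M + (output_tokens / 1e6) * PRICE_OUTPUT_PER_M

# Example: a 2,000-token prompt with a 500-token completion.
print(f"${estimate_cost(2_000, 500):.6f}")   # $0.000165
```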
