Why We Open-Source Everything
The case for radical openness in AI — why releasing weights is the right thing, strategically and ethically.
Every time we release model weights, someone asks us: "Why would you give this away?"
The answer is simple: the open frontier is how we win.
The Proprietary Trap
Keeping model weights closed creates a competitive moat, but only a short-lived one. History shows this repeatedly: closed systems eventually face open alternatives that are "good enough," and then the network effects of the open ecosystem compound until the proprietary system can't compete.
We've seen it with operating systems, databases, web servers, and now AI infrastructure. The question isn't whether open models will match proprietary ones — it's when.
By building openly from day one, we're on the right side of that curve.
What We Keep Private
We're not naive about intellectual property. There's a meaningful distinction between:
- Model weights — released openly under the Zen Open License
- Training data — proprietary curation and filtering pipelines that we've invested in heavily
- Infrastructure — our serving stack, routing optimizations, and operational knowledge
- Research — papers we publish (soon™), insights we've earned through iteration
The weights are the democratizing artifact. The infrastructure and know-how are where sustainable business value lives.
The Safety Argument
Some argue that open-sourcing powerful models is dangerous. We take safety seriously — which is why we think openness is the safer path.
Closed models create information asymmetries where only the developer knows what the model can and can't do. Open models allow:
- Independent red-teaming and vulnerability research
- Academic study of emergent behaviors
- Community development of alignment techniques
- Regulatory oversight that's grounded in facts, not marketing
Monsters hide in the dark. The best disinfectant is sunlight.
The @zenlm Org
All Zen open-source code lives at the @zenlm npm org and the zenlm GitHub organization. This includes:
- @zenlm/models — canonical model catalog with specs and pricing
- @zenlm/ui — React components for model libraries and cards
- Inference server code, fine-tuning recipes, and more coming
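To make the catalog idea concrete, here is a minimal TypeScript sketch of what a model-catalog entry and lookup helper might look like. The field names (`contextWindow`, `pricePerMTokens`) and the `getModel` helper are illustrative assumptions, not the actual @zenlm/models API:

```typescript
// Hypothetical sketch of a model-catalog entry. Field names are
// illustrative assumptions, not the real @zenlm/models interface.
interface ModelSpec {
  id: string;              // canonical model identifier
  contextWindow: number;   // max tokens per request
  pricePerMTokens: number; // USD per million input tokens (illustrative)
}

// A tiny in-memory catalog standing in for the published one.
const catalog: ModelSpec[] = [
  { id: "zen-small", contextWindow: 32768, pricePerMTokens: 0.1 },
  { id: "zen-large", contextWindow: 131072, pricePerMTokens: 0.9 },
];

// Look up a model by id; returns undefined when it is not listed.
function getModel(id: string): ModelSpec | undefined {
  return catalog.find((m) => m.id === id);
}

console.log(getModel("zen-small")?.contextWindow); // 32768
```

The point of shipping a catalog as a typed package is that specs and pricing become data your build can check, rather than prose on a pricing page.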
Install and build with us:
```shell
npm install @zenlm/models @zenlm/ui
```

The open frontier needs builders. Come help us build it.