⚡ Zen LM

zen-coder

32B dense code model with 131K context for multi-language development.


A 32B dense transformer trained for software engineering. Supports multi-language code generation, refactoring, debugging, and documentation with a 131K context window for working with large codebases.

Specifications

Property         Value
Model ID         zen-coder
Parameters       32B
Architecture     Dense
Context Window   131K tokens
Status           Available
HuggingFace      zenlm/zen-coder

Capabilities

  • Multi-language code generation (Python, TypeScript, Go, Rust, C++, and more)
  • Code review and refactoring
  • Bug detection and debugging
  • Documentation generation
  • Test case creation
  • 131K context for repository-scale understanding
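To make use of the 131K window on repository-scale tasks, prompts still need to fit the budget. The helper below is a hypothetical sketch (not part of zen-coder's tooling) that greedily packs files into a prompt until the token budget is exhausted; the `count_tokens` function is injectable, and in practice you would back it with the zen-coder tokenizer rather than the rough word count used here.

```python
# Hypothetical helper: pack repository files into a prompt while staying
# under a token budget (e.g. 131K for zen-coder). `count_tokens` is
# injectable; in real use, back it with the model's tokenizer.
def fit_context(files, max_tokens, count_tokens):
    """Return the prefix of `files` whose combined token count fits."""
    chosen, used = [], 0
    for name, text in files:
        cost = count_tokens(text)
        if used + cost > max_tokens:
            break
        chosen.append(name)
        used += cost
    return chosen, used

# Rough stand-in tokenizer: one token per whitespace-separated word.
rough = lambda s: len(s.split())
files = [("a.py", "def add(a, b): return a + b"),
         ("b.py", "x " * 10)]
print(fit_context(files, 12, rough))  # only a.py fits the 12-token budget
```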

Usage

HuggingFace

pip install transformers torch

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-coder")
model = AutoModelForCausalLM.from_pretrained("zenlm/zen-coder", device_map="auto")

inputs = tokenizer("Write a Python function to merge two sorted lists:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
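Note that `generate()` returns the prompt followed by the completion, so the `decode` call above prints both. A small hypothetical helper (not part of the transformers API) to keep only the newly generated text:

```python
# Hypothetical helper: drop the echoed prompt prefix from decoded output.
def strip_prompt(decoded: str, prompt: str) -> str:
    """Return only the completion when the model echoes the prompt."""
    return decoded[len(prompt):].lstrip() if decoded.startswith(prompt) else decoded

decoded = "Write a merge function:\ndef merge(a, b): ..."
print(strip_prompt(decoded, "Write a merge function:"))
```

Alternatively, slice the output tensor before decoding, e.g. `tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)`, which skips the prompt tokens entirely.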

API

from hanzoai import Hanzo

client = Hanzo(api_key="hk-your-api-key")

response = client.chat.completions.create(
    model="zen-coder",
    messages=[{"role": "user", "content": "Write a Go HTTP server with graceful shutdown."}],
)
print(response.choices[0].message.content)
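For long completions, streaming avoids waiting for the full response. The sketch below assumes the hanzoai client follows the OpenAI-style streaming interface (passing `stream=True` yields chunks whose `choices[0].delta.content` carries incremental text, with `None` on the final chunk); the stub chunks stand in for a real stream so the helper can be shown self-contained.

```python
# Hedged sketch: consuming a streamed chat completion, assuming an
# OpenAI-style chunk shape. Stub chunks stand in for a real stream.
from types import SimpleNamespace

def collect_stream(chunks):
    """Join incremental text deltas from a streamed chat completion."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)

def stub(text):
    return SimpleNamespace(
        choices=[SimpleNamespace(delta=SimpleNamespace(content=text))])

print(collect_stream([stub("def "), stub("main():"), stub(None)]))
```

With a real client, the same helper would consume `client.chat.completions.create(model="zen-coder", messages=[...], stream=True)` directly, printing each delta as it arrives instead of collecting them.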
