
zen-code

Legacy 14B dense code model for general programming tasks.


A 14B dense transformer for code generation and understanding. This is a legacy model; for new projects, consider zen-coder (32B) or zen4-coder (480B MoE).

Specifications

| Property | Value |
| --- | --- |
| Model ID | zen-code |
| Parameters | 14B |
| Architecture | Dense |
| Context Window | 32K tokens |
| Status | Legacy |
| HuggingFace | zenlm/zen-code |

Capabilities

  • Code generation across common languages
  • Code explanation and documentation
  • Basic refactoring suggestions
  • Test generation
  • 32K context for moderate file sizes
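The 32K window caps how much source you can send in one prompt. Below is a minimal sketch of pre-chunking a large file on line boundaries before prompting; the `chunk_source` helper and the ~4 characters-per-token heuristic are illustrative assumptions, not part of the model's API (for exact counts, measure with the model's tokenizer instead):

```python
# Rough heuristic: ~4 characters per token for typical source code.
CHARS_PER_TOKEN = 4

def chunk_source(text: str, max_tokens: int = 8_000) -> list[str]:
    """Split source text into chunks that each fit well under the
    32K-token context window, breaking on line boundaries."""
    max_chars = max_tokens * CHARS_PER_TOKEN
    chunks, current, size = [], [], 0
    for line in text.splitlines(keepends=True):
        if size + len(line) > max_chars and current:
            chunks.append("".join(current))
            current, size = [], 0
        current.append(line)
        size += len(line)
    if current:
        chunks.append("".join(current))
    return chunks

source = "x = 1\n" * 50_000  # ~300K characters, far beyond one prompt
chunks = chunk_source(source)
print(len(chunks), "chunks")
```

Each chunk can then be sent as its own prompt (e.g. for per-file documentation or test generation), leaving token budget for the model's reply.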

Usage

HuggingFace

```shell
pip install transformers torch
```

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("zenlm/zen-code")
model = AutoModelForCausalLM.from_pretrained("zenlm/zen-code", device_map="auto")

inputs = tokenizer("Implement a linked list in Python:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

API

```python
from hanzoai import Hanzo

client = Hanzo(api_key="hk-your-api-key")

response = client.chat.completions.create(
    model="zen-code",
    messages=[{"role": "user", "content": "Write a REST API endpoint in Express.js for user registration."}],
)
print(response.choices[0].message.content)
```
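Hosted APIs can return transient failures (timeouts, rate limits), so production calls are usually wrapped in retries. A minimal sketch of exponential backoff with jitter, assuming the client raises ordinary exceptions on failure; the `with_retries` helper is illustrative and the `hanzoai` error classes are not shown here, so it catches a generic `Exception`:

```python
import random
import time

def with_retries(call, max_attempts: int = 4, base_delay: float = 0.5):
    """Invoke call() and retry on failure with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the last error
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.1))

# Usage with the client above (sketch):
# reply = with_retries(lambda: client.chat.completions.create(
#     model="zen-code",
#     messages=[{"role": "user", "content": "Refactor this function."}],
# ))
```

In real code you would narrow the `except` clause to the client's retryable error types so that authentication or validation errors fail fast.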
