# Apertus

## Table of Contents

- Model Summary
- How to use
- Evaluation
- Training
- Limitations
- Legal Aspects

## Model Summary

Apertus is a 70B and 8B parameter language model designed to push the boundaries of fully open, multilingual, and transparent models. The model supports over 1000 languages and long context, it uses only fully compliant and open training data, and it achieves performance comparable to models trained behind closed doors.

The model is a decoder-only transformer, pretrained on 15T tokens with a staged curriculum of web, code, and math data. The model uses a new xIELU activation function and is trained from scratch with the AdEMAMix optimizer. Post-training included supervised fine-tuning and alignment via QRPO.

### Key features

- Fully open model: open weights + open data + full training details, including all data and training recipes
- Massively multilingual: 1811 natively supported languages
- Compliant: Apertus is trained while respecting opt-out consent of data owners (even retrospectively) and avoiding memorization of training data

For more details, refer to our technical report.

## How to use

The modeling code for Apertus is available in transformers v4.56.0, so make sure to upgrade your transformers version. You can also load the model with the latest vLLM, which uses transformers as a backend.

```bash
pip install -U transformers
```

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "swiss-ai/Apertus-70B-2509"
device = "cuda"  # or "cpu" for CPU usage

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
).to(device)

prompt = "Give me a brief explanation of gravity in simple terms."
messages_think = [
    {"role": "user", "content": prompt}
]

# Apply the chat template and tokenize the prompt
text = tokenizer.apply_chat_template(
    messages_think,
    tokenize=False,
    add_generation_prompt=True,
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(**model_inputs, max_new_tokens=32768)

# Strip the prompt tokens and decode only the newly generated completion
output_ids = generated_ids[0][len(model_inputs.input_ids[0]):]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```
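For serving with vLLM, the snippet below is a minimal sketch of running the same chat prompt through vLLM's offline `LLM` API; the sampling settings and the `tensor_parallel_size` hint are illustrative assumptions, not recommendations from this card.

```python
# Minimal vLLM sketch (assumes a recent vLLM release with support for this model).
from vllm import LLM, SamplingParams

# For the 70B model you will likely need multiple GPUs, e.g. tensor_parallel_size=4 (assumption).
llm = LLM(model="swiss-ai/Apertus-70B-2509")

# Sampling values here are placeholders, not tuned recommendations.
sampling_params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=512)

messages = [
    {"role": "user", "content": "Give me a brief explanation of gravity in simple terms."}
]

# llm.chat applies the model's chat template before generation.
outputs = llm.chat(messages, sampling_params)
print(outputs[0].outputs[0].text)
```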