Transformer Lab is proud to be supported by Mozilla through the Mozilla Builders Program.

## What is Transformer Lab?

Transformer Lab is an open source platform that allows anyone to build, tune, and run Large Language Models locally, without writing code. We imagine a world where every software developer will incorporate large language models in their products. Transformer Lab allows users to do this without needing to know Python or have previous experience with machine learning.

Learn more about our vision.

## Detailed Feature List

- One-click download of hundreds of popular models: Llama3, Phi3, Mistral, Mixtral, Gemma, Command-R, and dozens more
- Download any LLM from Huggingface
- Finetune / train across different hardware
  - Finetune using MLX on Apple Silicon
  - Finetune using Huggingface on GPU
- RLHF and preference optimization
  - DPO
  - ORPO
  - SIMPO
  - Reward Modeling
- GRPO training
- Work with LLMs across operating systems: Windows app, macOS app, Linux
- Chat with models
  - Chat completions
  - Preset (templated) prompts
  - Chat history
  - Tweak generation parameters
  - Batch inference
- Use different inference engines
  - MLX on Apple Silicon
  - Huggingface Transformers
  - vLLM
  - Llama CPP
- Evaluate models
  - Eleuther Harness
  - LLM as a Judge
  - Objective metrics
  - Red-teaming evals
  - Eval visualization and graphing
- RAG (Retrieval Augmented Generation)
  - Drag-and-drop file UI
  - Works on Apple MLX, Transformers, and other engines
- Build datasets for training
  - Pull from hundreds of common datasets available on HuggingFace
  - Provide your own dataset using drag and drop
- Calculate embeddings
- Full REST API
- Run in the cloud
  - Run the user interface on your desktop/laptop while the engine runs on a remote or cloud machine
  - Or run everything locally on a single machine
- Convert models across platforms: convert from/to Huggingface, MLX, GGUF
- Plugin support
  - Easily pull from a library of existing plugins
  - Write your own plugins to extend functionality
- Prompt editing
  - Easily edit System Messages or Prompt Templates
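To give a sense of what the preference-optimization methods listed above (such as DPO) compute, here is a minimal, framework-free sketch of the DPO loss for a single preference pair. This is an illustration of the published DPO objective, not Transformer Lab's internal implementation; the function name and arguments are hypothetical.

```python
import math

def dpo_loss(logp_chosen, logp_rejected, ref_chosen, ref_rejected, beta=0.1):
    """DPO loss for one preference pair.

    logp_chosen / logp_rejected: summed log-probabilities of the chosen and
    rejected responses under the policy being trained.
    ref_chosen / ref_rejected: the same quantities under a frozen reference
    model. beta scales how strongly the policy is pushed away from the
    reference.
    """
    # Margin between the policy's and the reference's preference for "chosen".
    margin = beta * ((logp_chosen - ref_chosen) - (logp_rejected - ref_rejected))
    # -log(sigmoid(margin)): small when the policy already prefers "chosen".
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the policy matches the reference the margin is zero and the loss is log 2; as the policy learns to favor the chosen response relative to the reference, the loss decreases.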
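The embeddings feature above maps text to numeric vectors, and downstream uses such as RAG retrieval typically rank candidates by cosine similarity between those vectors. A minimal sketch of that scoring step, independent of Transformer Lab's actual API:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    # Dot product divided by the product of the L2 norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Identical directions score 1.0 and orthogonal vectors score 0.0, so a retriever can simply sort document embeddings by this value against the query embedding.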