Show HN: LocalScore – Local LLM Benchmark

Source: https://news.ycombinator.com/rss (Hits: 12)
Summary

Download LocalScore

There are two ways to run LocalScore. The easiest way to get started is to download one of the Official Models. If you already have .gguf models, you can run LocalScore with them directly.

Run with Official Models

Select the benchmark you want to run:

  Tiny   - 1B,  requires ~2 GB memory
  Small  - 8B,  requires ~6 GB memory
  Medium - 14B, requires ~10 GB memory

On MacOS/Linux, open your terminal and run:

  curl -OL https://localscore.ai/download/localscore-tiny
  chmod +x localscore-tiny
  ./localscore-tiny

On Windows, download LocalScore, then open cmd.exe and run:

  localscore-0.9.2.exe -m Llama-3.2-1B-Instruct-Q4_K_M.gguf

Run with your own .gguf

On MacOS/Linux, download LocalScore, then open your terminal and run:

  chmod +x localscore-0.9.2
  ./localscore-0.9.2 -m path/to/model.gguf

Having issues with the CLI client? Check out the troubleshooting guide. For further documentation on the LocalScore CLI, check out the README.
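The MacOS/Linux steps for the official models differ only in the tier name, so they can be parameterized. A minimal sketch: the "tiny" URL is taken from the page, and the assumption that "small" and "medium" follow the same URL pattern is mine, not confirmed by the source. The function only prints the commands rather than executing them:

```shell
#!/bin/sh
# Print the download-and-run commands for an official LocalScore benchmark
# tier on MacOS/Linux. Memory needs from the page: tiny ~2 GB, small ~6 GB,
# medium ~10 GB. URL pattern for small/medium is an assumption.
localscore_cmds() {
    tier="$1"   # tiny, small, or medium
    printf 'curl -OL https://localscore.ai/download/localscore-%s\n' "$tier"
    printf 'chmod +x localscore-%s\n' "$tier"
    printf './localscore-%s\n' "$tier"
}

localscore_cmds tiny
```

Running `localscore_cmds small` would print the same three commands for the 8B benchmark, assuming the download URLs follow the same naming scheme.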

First seen: 2025-04-06 13:14

Last seen: 2025-04-07 04:16