Build and deploy AI agent workflows in minutes.

## Build Workflows with Ease

Design agent workflows visually on a canvas—connect agents, tools, and blocks, then run them instantly.

## Supercharge with Copilot

Leverage Copilot to generate nodes, fix errors, and iterate on flows directly from natural language.

## Integrate Vector Databases

Upload documents to a vector store and let agents answer questions grounded in your specific content.

## Quickstart

### Self-hosted: NPM Package

```bash
npx simstudio
```

Then open http://localhost:3000.

> **Note:** Docker must be installed and running on your machine.

#### Options

| Flag | Description |
| --- | --- |
| `-p, --port <port>` | Port to run Sim on (default `3000`) |
| `--no-pull` | Skip pulling latest Docker images |

### Self-hosted: Docker Compose

```bash
# Clone the repository
git clone https://github.com/simstudioai/sim.git

# Navigate to the project directory
cd sim

# Start Sim
docker compose -f docker-compose.prod.yml up -d
```

Access the application at http://localhost:3000/.

### Using Local Models with Ollama

Run Sim with local AI models using Ollama—no external APIs required:

```bash
# Start with GPU support (automatically downloads the gemma3:4b model)
docker compose -f docker-compose.ollama.yml --profile setup up -d

# For CPU-only systems:
docker compose -f docker-compose.ollama.yml --profile cpu --profile setup up -d
```

Wait for the model to download, then visit http://localhost:3000. Add more models with:

```bash
docker compose -f docker-compose.ollama.yml exec ollama ollama pull llama3.1:8b
```

### Using an External Ollama Instance

If you already have Ollama running on your host machine (outside Docker), configure `OLLAMA_URL` to use `host.docker.internal` instead of `localhost`:

```bash
# Docker Desktop (macOS/Windows)
OLLAMA_URL=http://host.docker.internal:11434 docker compose -f docker-compose.prod.yml up -d

# Linux (add extra_hosts or use the host's IP)
docker compose -f docker-compose.prod.yml up -d
# Then set OLLAMA_URL to your host's IP
```

**Why?** When running inside Docker, `localhost` refers to the container itself, not your host machine.
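Before pointing Sim at an external Ollama instance, it can help to confirm the host's Ollama server is actually reachable from inside a container. This is a minimal sketch, not part of Sim itself: it assumes Docker and Ollama are both running, and uses Ollama's `/api/tags` endpoint (which lists installed models) as a health check. The `--add-host=host.docker.internal:host-gateway` flag is only needed on Linux, where `host.docker.internal` is not defined by default.

```shell
# Reachability check from inside a throwaway container (hypothetical sketch).
# A successful response confirms that OLLAMA_URL=http://host.docker.internal:11434
# will work from Sim's containers as well.
docker run --rm --add-host=host.docker.internal:host-gateway curlimages/curl \
  curl -sf http://host.docker.internal:11434/api/tags
```

If this command fails while `curl http://localhost:11434/api/tags` succeeds on the host, the problem is the container-to-host networking, not Ollama.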