Build, deploy, and manage AI workflows with visual tools. Fully open source, self-hostable, and built by the community, for the community.
Design complex AI workflows with our intuitive drag-and-drop interface. Connect LLM, knowledge base, API, and conditional logic nodes to build sophisticated applications without code.
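For illustration only, a workflow built in the visual editor might serialize to a graph of nodes and edges like the sketch below. The node types, keys, and IDs (`:knowledge_base`, `:llm`, `:condition`, `:api_call`) are hypothetical and not the project's actual schema.

```elixir
# Hypothetical serialized form of a visually-built workflow.
# Node types and keys are illustrative only, not the real schema.
workflow = %{
  nodes: [
    %{id: "kb_lookup", type: :knowledge_base, index: "product_docs"},
    %{id: "draft", type: :llm, model: "gpt-4o-mini", prompt: "Answer using {{kb_lookup.results}}"},
    %{id: "check", type: :condition, expression: "draft.confidence > 0.7"},
    %{id: "notify", type: :api_call, url: "https://example.com/webhooks/review"}
  ],
  edges: [
    {"kb_lookup", "draft"},
    {"draft", "check"},
    # Low-confidence drafts are routed to a human-review webhook.
    {"check", "notify"}
  ]
}
```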
MIT licensed with complete source code transparency. Fork, modify, and contribute back to the community. No vendor lock-in, no hidden fees, just pure open source innovation.
Deploy on your infrastructure with Docker or Kubernetes. Complete data sovereignty, custom configurations, and the security of keeping your AI workflows entirely under your control.
Built by developers, for developers. Active community contributions, extensive documentation, and collaborative development. Join our Discord and help shape the future of AI tooling.
Integrate with any LLM provider: OpenAI, Anthropic, local models via Ollama, or your own custom endpoints. Switch providers at any time with no lock-in.
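As a rough sketch of what provider-agnostic configuration can look like in an Elixir app (the application name, config keys, and module paths here are assumptions for illustration, not the project's documented settings):

```elixir
# Hypothetical runtime config showing how a provider swap could look.
# Keys and names are illustrative only.
import Config

config :my_app, :llm,
  provider: :ollama,                    # or :openai, :anthropic, :custom
  base_url: "http://localhost:11434",   # Ollama's default local endpoint
  model: "llama3.1"

# Switching to a hosted provider is a config change, not a code change:
# config :my_app, :llm,
#   provider: :openai,
#   api_key: System.fetch_env!("OPENAI_API_KEY"),
#   model: "gpt-4o-mini"
```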
Built on battle-tested Elixir/Phoenix with OTP supervision trees. Fault-tolerant, concurrent, and scalable architecture that handles real-world production workloads.
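To make the OTP claim concrete, here is a minimal sketch of the kind of supervision tree a Phoenix application like this typically starts. The module names, and the use of Ecto, Phoenix.PubSub, and a DynamicSupervisor for sessions, are assumptions for illustration, not the project's actual tree.

```elixir
# Hypothetical top-level supervisor; child modules are illustrative only.
defmodule MyApp.Application do
  use Application

  @impl true
  def start(_type, _args) do
    children = [
      MyApp.Repo,                          # data layer (Ecto)
      {Phoenix.PubSub, name: MyApp.PubSub},
      MyAppWeb.Endpoint,                   # HTTP / LiveView endpoint
      {DynamicSupervisor, name: MyApp.ChatSessions, strategy: :one_for_one}
    ]

    # :one_for_one restarts a crashed child in isolation,
    # without taking its siblings down with it.
    Supervisor.start_link(children, strategy: :one_for_one, name: MyApp.Supervisor)
  end
end
```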
Leveraging the power of the Actor Model and OTP for fault-tolerant, concurrent AI workflows. Open source architecture you can trust, modify, and extend.
Real-time UI updates and streaming responses
Fault-tolerant chat sessions and workflow execution (see the sketch after this list)
Background processing for RAG ingestion pipeline
Robust data layer for applications and analytics
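To make the fault-tolerant chat sessions point above concrete, the sketch below models one conversation as a GenServer process: its state is isolated, so a crash affects only that session, and a :one_for_one supervisor can restart it while other conversations keep running. Module, function, and field names are hypothetical, not the project's real code.

```elixir
# Hypothetical per-conversation process; names are illustrative only.
defmodule MyApp.ChatSession do
  use GenServer

  # One process per conversation; failures are isolated to that conversation.
  def start_link(conversation_id),
    do: GenServer.start_link(__MODULE__, conversation_id)

  def send_message(pid, text), do: GenServer.call(pid, {:message, text})

  @impl true
  def init(conversation_id), do: {:ok, %{id: conversation_id, history: []}}

  @impl true
  def handle_call({:message, text}, _from, state) do
    # A crash here takes down only this session; under a supervisor it is
    # restarted while every other session keeps running.
    reply = "stubbed model reply to: " <> text
    {:reply, reply, %{state | history: [{text, reply} | state.history]}}
  end
end

# Usage:
#   {:ok, pid} = MyApp.ChatSession.start_link("conv-123")
#   MyApp.ChatSession.send_message(pid, "Hello!")
```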
Free to use, modify, and distribute commercially
One-command deployment with docker-compose
Active Discord community and GitHub discussions