ollama + docker + home server
Self-hosted LLM
Using Docker, Node.js, and Ollama, I built a self-hosted AI chat interface with streaming responses and markdown rendering. It's served from my domain behind an nginx reverse proxy. Try it above; it's fully functional right here on the page.
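
The core of a setup like this is a small Node relay that forwards the browser's chat request to Ollama's `/api/chat` endpoint and pipes the newline-delimited JSON stream straight back, so tokens render as they arrive. Here's a minimal sketch in TypeScript, assuming Node 18+ (for built-in `fetch`), an Ollama container reachable at `http://ollama:11434`, and a pulled model named `llama3`; the names, port, and route are illustrative, not my exact configuration:

```ts
// Minimal streaming relay sketch. Assumptions (not my exact setup):
// Ollama at http://ollama:11434, model "llama3", relay on port 3000.
import http from "node:http";

const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://ollama:11434";

const server = http.createServer(async (req, res) => {
  if (req.method !== "POST" || req.url !== "/api/chat") {
    res.writeHead(404).end();
    return;
  }

  // Collect the client's JSON body: { messages: [{ role, content }, ...] }
  const chunks: Buffer[] = [];
  for await (const chunk of req) chunks.push(chunk as Buffer);
  const { messages } = JSON.parse(Buffer.concat(chunks).toString());

  // Ask Ollama for a streamed completion; it responds with
  // newline-delimited JSON, one object per token batch.
  const upstream = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", messages, stream: true }),
  });

  res.writeHead(200, { "Content-Type": "application/x-ndjson" });

  // Relay each chunk to the browser as soon as it arrives, so the
  // page can render the reply incrementally.
  for await (const chunk of upstream.body!) {
    res.write(chunk);
  }
  res.end();
});

server.listen(3000, () => console.log("chat relay listening on :3000"));
```

One nginx detail matters for streaming behind a reverse proxy: the location block needs `proxy_buffering off;`, otherwise nginx batches the response and the page appears to hang until the full reply is ready.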