
MarvinOS Local AI Stack: Fully Self-Hosted AI Experimentation

  



Imagine having a powerful, fully self-hosted AI experimentation platform that allows you to explore the possibilities of artificial intelligence without relying on cloud dependencies or compromising on security. That's exactly what we're excited to introduce today — the MarvinOS Local AI Stack!

This innovative tool is designed for local AI experimentation, secure internal LLM access, GPU-accelerated inference, and even supports offline or air-gapped environments. It also includes Retrieval-Augmented Generation (RAG) with local document embeddings, enabling long-term knowledge bases without leaving your machine.


Key Features & Benefits

MarvinOS Local AI Stack offers a wide range of features that make it an ideal choice for those who value autonomy and control over their AI experiments:

  • Fully local LLM inference: No cloud dependency means you're in complete control.

  • RAG with Qdrant vector database: Create local knowledge bases and retrieve relevant context with citations.

  • GPU-accelerated (NVIDIA): Leverage the power of your NVIDIA GPU to accelerate your experimentation.

  • HTTPS ingress with Nginx: Ensure secure access to your AI experiments through a trusted interface.

  • Web-based chat UI (Open WebUI): A seamless user experience for interacting with your LLM models.

  • Private web search grounding (SearXNG): Ground responses in live web results through your own metasearch instance, so queries never go directly from your users to commercial search providers.

  • Local OpenAI-compatible TTS: Generate speech through an OpenAI-compatible API, served entirely on your own hardware with no cloud dependency.

  • Stable Diffusion-based image creation: Unleash your creativity with our local GPU-accelerated image generation capabilities.
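To make the RAG feature concrete, here is a deliberately simplified Python sketch of the retrieve-then-prompt loop. It substitutes toy bag-of-words vectors for the neural embeddings and Qdrant index the stack actually ships with; all names here are illustrative, not part of MarvinOS.

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy "embedding": bag-of-words term counts. The real stack uses
    # a neural embedding model and stores its vectors in Qdrant.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank stored documents by similarity to the query, return the top k
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Qdrant stores document embeddings for similarity search.",
    "Nginx terminates TLS for the web UI.",
    "Ollama serves large language models on the local GPU.",
]
context = retrieve("Which service runs the language models?", docs)
# The retrieved passage is then prepended to the LLM prompt as context
prompt = f"Answer using this context:\n{context[0]}\n\nQuestion: ..."
```

The same flow happens inside the stack: documents are embedded once, stored locally, and the best matches are injected into the prompt at query time, which is what lets answers carry citations.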


Getting Started

Getting started is easy! Simply follow these steps:

git clone git@github.com:MarvinOS-online/fitlab-core.git
cd fitlab-core
docker compose up --build -d

By doing so, you'll have a fully functional AI experimentation platform that's tailored to your needs, including LLMs, RAG, TTS, Web Search, and Image Generation.
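Once the containers are up, a quick sanity check confirms everything is reachable. The service names below come from the stack's log commands; the ingress port and the exposed Ollama port are assumptions, so check your compose file.

```shell
# Confirm all containers report a healthy/running state
docker compose ps

# Hit the HTTPS ingress; -k skips verification of the default self-signed cert
curl -sk https://localhost/ -o /dev/null -w "%{http_code}\n"

# Ollama answers on its default port (11434) if it is exposed on the host
curl -s http://localhost:11434/api/tags
```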


Prerequisites

Before diving in, please ensure that:

  • You're running on a Linux host (recommended)

  • You're running Docker Engine 24 or later

  • You have Docker Compose v2 installed

  • NVIDIA drivers and Container Toolkit are installed

  • Verify GPU support:

docker run --rm --gpus all nvidia/cuda:12.2.0-base nvidia-smi
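If the GPU check fails, the usual cause is that Docker doesn't know about the NVIDIA runtime yet. A minimal sketch for Debian/Ubuntu hosts, assuming NVIDIA's apt repository is already configured (see the NVIDIA Container Toolkit install guide for the repository setup):

```shell
# Install the toolkit from NVIDIA's repository
sudo apt-get install -y nvidia-container-toolkit

# Register the NVIDIA runtime with Docker and restart the daemon
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```

After the daemon restarts, re-run the nvidia-smi container above to confirm the GPU is visible.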

Common Operations & Security

For a seamless experience:

  • Restart the stack:

docker compose restart

  • Stop the stack:

docker compose down

  • View logs in real-time:

docker compose logs -f open-webui
docker compose logs -f ollama
docker compose logs -f qdrant
docker compose logs -f ingress

Security notes:

  • Self-signed SSL certificates are used by default

  • Open WebUI authentication is enabled

  • Replace placeholder secrets before exposure

  • SearXNG is intended for internal use only
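The default self-signed certificate can be swapped for your own before exposing the stack more widely. A minimal sketch with openssl; the filenames and CN are placeholders, and where the ingress expects the files is defined in the stack's Nginx configuration:

```shell
# Generate a fresh self-signed certificate and private key
# (replace the CN with your actual hostname)
openssl req -x509 -newkey rsa:4096 -sha256 -days 365 -nodes \
  -keyout marvinos.key -out marvinos.crt \
  -subj "/CN=localhost"
```

For anything reachable beyond your LAN, prefer a certificate from a real CA (e.g. Let's Encrypt) over a self-signed one, since browsers will otherwise warn on every visit.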


Licensing & Attribution

MarvinOS Local AI Stack is proudly open source. Components are released under the following licenses:

  • Ollama — Apache 2.0

  • Open WebUI — MIT

  • Nginx — BSD-like

  • SearXNG — AGPLv3

  • AUTOMATIC1111 Stable Diffusion WebUI — AGPL-3.0

This orchestration layer is designed to be freely used, modified, and adapted to your needs.


Support & Donate

If you find MarvinOS Local AI Stack useful and want to help support ongoing development, consider contributing:

  • Donations help cover server costs, development time, and model hosting

  • Any contribution — big or small — keeps the project sustainable and improves future features like RAG, TTS, and GPU performance tuning

Your support ensures that MarvinOS remains free, open, and cutting-edge for everyone in the AI community.




Conclusion

MarvinOS Local AI Stack represents a significant step forward in empowering the AI community. With its self-hosted, GPU-accelerated design and local RAG capabilities, it's an ideal choice for those seeking control, security, and flexibility.

If you're ready to explore the vast potential of AI without compromising your values, join us today by trying MarvinOS Local AI Stack — and consider supporting the project to keep it growing!


Sources & Learn More:
https://github.com/MarvinOS-online/fitlab-core


