Perplexica, a Self-Hosted Open-Source Alternative to Perplexity AI

Tags: perplexica, self-hosting, ollama, open-source, search-engine, docker, ai-search

Summary

Perplexica is a free, open-source, self-hostable alternative to Perplexity AI search. It pairs with SearXNG for web search and Ollama for local LLM inference, meaning you can run a fully private AI-powered search engine on hardware as modest as a Raspberry Pi (API mode) or an old Mac Mini with 8 GB RAM (local model mode).
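The two-container pairing described here can be sketched as a Docker Compose file. This is an illustrative sketch only: the image names, volume path, and `SEARXNG_API_URL` environment variable are assumptions, not taken from the Perplexica repository, so check the project's own docker-compose.yaml for the canonical version.

```yaml
# Hypothetical docker-compose.yml sketch: Perplexica on port 3000,
# SearXNG on port 8080 as its search backend.
services:
  searxng:
    image: searxng/searxng:latest        # official SearXNG image
    ports:
      - "8080:8080"
    volumes:
      - ./searxng:/etc/searxng           # assumed location of settings.yml
  perplexica:
    image: itzcrazykns1337/perplexica:latest  # assumed image name
    ports:
      - "3000:3000"
    environment:
      - SEARXNG_API_URL=http://searxng:8080   # assumed variable name
    depends_on:
      - searxng
```

After `docker compose up -d`, the UI would be at http://localhost:3000. For local-model mode, Perplexica additionally needs the address of a running Ollama instance (commonly http://host.docker.internal:11434) configured in its settings.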

Key Insight

  • Perplexica uses two Docker containers: Perplexica itself (port 3000) and SearXNG (port 8080) as the search backend
  • Even a 2B-parameter model (Qwen 3.5 2B) running on 8 GB of RAM provides usable speed for AI search; at smaller model sizes you trade answer depth for speed
  • If you use cloud APIs (Claude, ChatGPT) instead of Ollama, hardware requirements drop to Raspberry Pi level since inference is offloaded
  • SearXNG is highly configurable: you can enable or disable individual search engines per category (web, images, video, news, social, IT, science), giving granular control over search sources
  • The setup includes a “Discover” tab that surfaces interesting content and a library of past searches
  • You can add system-level instructions (e.g., “be factual, no fluff”) that persist across all searches
  • Putting SearXNG behind a VPN container is mentioned as a privacy enhancement option
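The per-engine toggling mentioned above happens in SearXNG's settings.yml. The fragment below is a sketch of that mechanism under `use_default_settings`; the engine names are real SearXNG engines, but which ones you flip is purely illustrative, and the full file carries many more options (see the SearXNG documentation).

```yaml
# Fragment of SearXNG's settings.yml: toggle individual engines.
# 'disabled: true' removes an engine from its category's results.
use_default_settings: true
engines:
  - name: duckduckgo
    disabled: false      # keep in the general/web category
  - name: bing
    disabled: true       # drop Bing results entirely
  - name: youtube
    disabled: false      # videos category
  - name: reddit
    disabled: true       # social media category
```

Because Perplexica only talks to SearXNG over its API, changes here immediately shape which sources feed every AI search.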