Introducing QuickAI 1.0 · Learn More →

Your entire knowledge base,
instantly askable.

A private, grounded AI assistant for your second brain. Answers come from your local notes first, with optional web search to fill the gaps.

What was the feedback on the 'Nebula' campaign from last Tuesday's meeting?
Searching brainstorming_notes · 2 files
Sources: nebula_sync.md · design_audit.pdf
Draft a concise update for the design team highlighting these three points.
Ask a follow-up...
gemma-4 · ↓ 1.4k · ↑ 82ms

Native integration with every frontier model, plus fully local options

OpenAI
Anthropic
Gemini
Grok
Llama
Ollama

Bring your own API key — or run 100% offline with a local model.

Notes are where great ideas
go to hide.

You've spent years building a digital garden in Obsidian or Markdown. But when you actually need an answer, you're stuck digging through folders, chasing broken tags, and losing your flow. Your “Second Brain” has become a digital junk drawer.

Obsidian · Apple Notes · .md

You don't need more notes. You need a way to talk to the ones you already have.

One keypress to remember everything.

QuickAI brings the power of a large language model to your local file system. No training on your private data. No server of ours in the path. Just instant recall of every thought you've ever recorded.

How it works

Five primitives,
one keypress.

Grounded Truth

Every answer traces back to a file.

What was our decision on the auth rewrite?
You decided to drop the custom JWT flow and move to session cookies — primarily to simplify revocation.
Sources: meeting_notes.md · architecture_v3.md

Grounded, not guessed. Every claim ships with a clickable link to the exact markdown file it came from.
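Under the hood, grounding is a retrieval step over your files before the model answers. A minimal sketch of the idea in Python — the function name and keyword scoring are illustrative, not QuickAI's actual implementation (a real assistant would use embeddings):

```python
from pathlib import Path

def find_sources(notes_dir: str, query: str) -> list[str]:
    """Naive grounding: rank note files by how often they mention the query terms.
    The top matches become the clickable "Sources" line under each answer."""
    terms = [t.lower() for t in query.split()]
    hits = []
    for path in Path(notes_dir).rglob("*.md"):
        text = path.read_text(encoding="utf-8").lower()
        score = sum(text.count(t) for t in terms)
        if score > 0:
            hits.append((score, path.name))
    # Best-matching files first.
    return [name for _, name in sorted(hits, reverse=True)]
```

Because the answer is assembled from these matches, every claim can point back to a concrete file on disk.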

The Note Loop

Today's chat
is tomorrow's context.

.md

Save any chat back to your notes as a summarized markdown file. Your assistant's memory compounds.
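The loop itself is simple: a saved chat is just another markdown file in the folder the assistant already searches. A sketch under that assumption (the helper and naming scheme are ours, not QuickAI's):

```python
from datetime import date
from pathlib import Path

def save_chat(notes_dir: str, title: str, summary: str) -> Path:
    """Write a chat summary into the notes folder as a dated markdown file,
    so the very next question can retrieve it like any other note."""
    slug = title.lower().replace(" ", "_")
    path = Path(notes_dir) / f"{date.today().isoformat()}_{slug}.md"
    path.write_text(f"# {title}\n\n{summary}\n", encoding="utf-8")
    return path
```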

Contextual Recall

Highlight, then
reply.

The migration runs in two phases — backfill, then cutover. Both phases are…
Reply

Highlight text anywhere on your Mac, hit Reply, drop it in as context.

Local-first

We never see
your notes.

On-device by default. BYOK for frontier. No server of ours in the path.

Bring your own key

Your model,
your key.

Plug in GPT-5.4, Claude 4.7, Gemini 3.1 — whatever you already pay for. Or run fully offline with a local MoE model via Ollama. Your tokens stay in your keychain. We never see them.

Active model
  • GPT-5.4 · OpenAI · your key
  • Claude 4.7 Opus · Anthropic · your key
  • Gemini 3.1 Pro · Google · your key
  • Llama 4 Scout · Ollama, local · offline
  • Qwen 3.6 Agent · Ollama, local · offline

Stop searching.
Start synthesizing.

This is the workflow you were promised. A silent partner that knows exactly what you know.

How do I initialize the auth-provider in this project? I wrote a custom hook for it in January.
Reading app_structure · auth_setup.md
Sources: auth_provider.tsx · session_logic.rs
Can you provide that code block for me?
Ask a follow-up...
gemma-4 · ↓ 1.4k · ↑ 82ms

Pricing

One-time payment. Yours forever.

Every tier is a lifetime license with one year of free updates. We never see your notes.

Privacy First

Local-first Inference

Run fully offline via Ollama, or bring your own key when you want frontier power. We never route your notes through a server of ours — because we don't run one.

Speed

Instant Recall

Find answers across thousands of files in milliseconds. No cloud latency, no loading spinners.

Ownership

Zero Subscriptions

Pay once, own it forever. We believe premium tools shouldn't be a rent-trap for your knowledge.

Active
Early Bird
The first to fly.
$19 lifetime (was $69)
50 of 50 seats left · 0% claimed
  • Lifetime license
  • 1 year of free updates
  • Unlimited knowledge bases
  • Unlimited Save to knowledge
  • Multi-model picker
  • Direct line to the founder
Late Bird
Still a steal.
$29 lifetime (was $69)
  • Lifetime license
  • 1 year of free updates
  • Unlimited knowledge bases
  • Unlimited Save to knowledge
  • Multi-model picker

Unlocks when Early Bird sells out.

Regular
The full price, always available.
$69 lifetime
  • Lifetime license
  • 1 year of free updates
  • Unlimited knowledge bases
  • Unlimited Save to knowledge
  • Multi-model picker


Common questions.

Everything you need to know about privacy, licensing, and local AI.

How does 'Local Inference' actually work?

QuickAI runs inference on your Mac's hardware and searches your files locally. When you ask a question, the 'thinking' happens on your CPU/GPU, not in the cloud. We never route your notes through a server of ours — because we don't have one.
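Concretely, local models served by Ollama listen on a local HTTP endpoint (port 11434 by default), so a "query" never leaves the machine. A sketch of what such a request looks like, assuming an Ollama daemon is running — the helper names and prompt shape are illustrative, not QuickAI's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model: str, question: str, context: str) -> dict:
    """Assemble a request for a local model. The note text is inlined as
    context so the model answers from your files, not from the open web."""
    return {
        "model": model,
        "prompt": f"Answer using only these notes:\n{context}\n\nQuestion: {question}",
        "stream": False,
    }

def ask_local(model: str, question: str, context: str) -> str:
    """POST the request to the Ollama daemon on this machine and return its answer."""
    body = json.dumps(build_request(model, question, context)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Everything in that round trip — the prompt, the notes, and the generated answer — stays on localhost.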

Which AI models do you support?

Run fully offline with Llama 4 Scout, Qwen 3.6, or Gemma via Ollama or LM Studio. Or 'Bring Your Own Key' for GPT-5.4, Claude 4.7 Opus, or Gemini 3.1 Pro when you want frontier power. Either way, we never see your traffic.

What does 'Lifetime License' really mean?

You pay once, and the version of QuickAI you buy is yours forever. No monthly 'rent' for your own knowledge. This license includes one full year of feature updates. After that, you can keep using your version forever or renew for another year of updates.

Does it work with my existing notes?

Yes. QuickAI is a 'read-only' layer that sits on top of your existing folders. It works with Obsidian, Apple Notes, and any folder of Markdown or PDF files. No vendor lock-in, ever.