Extend any Android app's capabilities to any on-device AI assistant.
OACP gives apps a simple way to describe what they can do, and gives assistants a standard way to discover and invoke those actions on device.
Start in the right place
Protocol
OACP
The open standard. oacp.json declares capabilities, ContentProviders expose them, Intents invoke them.
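As a sketch of what such a declaration might look like (the field names below are illustrative, not the normative schema; see the protocol docs for the real format):

```json
{
  "oacp": "1.0",
  "app": "com.example.notes",
  "capabilities": [
    {
      "id": "create_note",
      "description": "Create a new note with a title and optional body",
      "parameters": {
        "title": { "type": "string", "required": true },
        "body":  { "type": "string", "required": false }
      },
      "invoke": {
        "type": "broadcast",
        "action": "com.example.notes.OACP_CREATE_NOTE"
      }
    }
  ]
}
```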
Read the protocol →

Open-source assistant
Hark
Voice assistant built on OACP. Two-stage on-device AI pipeline: EmbeddingGemma for intent, Qwen3 for slot filling.
Meet Hark →

SDKs
SDKs
Kotlin ships today. Flutter and React Native are under development and visible in the docs so contributors can follow along.
Get the SDK →

How it works
1. Describe the app. The app ships oacp.json and OACP.md so an assistant can understand its actions, parameters, and vocabulary.
2. Discover capabilities. An OACP-compatible assistant scans the device for .oacp ContentProviders and reads each manifest at runtime.
3. Match the request. The assistant uses its on-device pipeline to match the user's request to the best capability and extract any parameters.
4. Invoke the app. The assistant dispatches an Android intent: background tasks use broadcasts, UI flows use activities.
5. Return the result. If the action is async, the app sends a structured result back so the assistant can speak or display it.
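The discover-and-invoke steps above can be sketched with plain Android APIs. This is an illustrative sketch, not the OACP SDK's actual surface; the `.oacp` authority suffix convention, the broadcast action, and the `title` extra are assumptions:

```kotlin
import android.content.Context
import android.content.Intent
import android.content.pm.ProviderInfo

// Sketch: find installed apps exposing a ContentProvider whose authority
// ends in ".oacp", then invoke a background capability via broadcast.
fun discoverAndInvoke(context: Context) {
    val pm = context.packageManager
    // Passing null for processName lists providers from all installed packages.
    val providers: List<ProviderInfo> = pm.queryContentProviders(null, 0, 0)
    val oacpProviders = providers.filter { it.authority?.endsWith(".oacp") == true }

    for (provider in oacpProviders) {
        // Read the app's manifest through the provider here; the exact
        // content:// URI layout is defined by the protocol, not shown.

        // Invoke a background capability with an explicit broadcast
        // (assumed action and extra names, matching a hypothetical manifest).
        val intent = Intent("com.example.notes.OACP_CREATE_NOTE").apply {
            setPackage(provider.packageName) // explicit, so delivery is reliable
            putExtra("title", "Groceries")   // assumed parameter name
        }
        context.sendBroadcast(intent)
    }
}
```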
┌──────────────────────────┐ ┌──────────────────────────┐
│ Any Android app │ │ Hark (voice assistant) │
│ + OACP Kotlin SDK │◀──────▶│ on-device AI pipeline │
│ + oacp.json / OACP.md │ OACP │ discovers + dispatches │
└──────────────────────────┘ └──────────────────────────┘
▲ ▲
└──────── OACP Protocol ─────────────┘
                 (content providers + intents)

What Is Being Cooked
A fuller open-source assistant with on-device AI, a stronger protocol, more SDKs, and more real apps using OACP. The current docs are left a little unfinished on purpose so the next contributors can see where help is needed.
See the full roadmap