Getting Started

Add OACP to any Android app. You create two metadata files, one ContentProvider, and one BroadcastReceiver (or Activity). That's it.

What you'll create

| File | Purpose |
| --- | --- |
| `assets/oacp.json` | Declares your app's capabilities (actions, parameters, metadata) |
| `assets/OACP.md` | Plain-English context for the AI assistant |
| `OacpMetadataProvider.kt` | ContentProvider so assistants can discover your app (auto-provided if using the SDK) |
| Receiver or Activity | Handles incoming voice commands |

Step 1: Create oacp.json

Place this in app/src/main/assets/oacp.json.

For Flutter apps, place it in android/app/src/main/assets/oacp.json (NOT in Flutter's assets/ folder).

Minimal example (one capability)
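
A minimal sketch of what the file might contain, assembled from the fields described below. The overall shape (a `capabilities` array, the `id` and `description` fields, and the action name) is an assumption for illustration; check the OACP schema reference for the authoritative structure.

```json
{
  "appId": "__APPLICATION_ID__",
  "capabilities": [
    {
      "id": "flashlight_on",
      "description": "Turn on the device flashlight",
      "aliases": ["turn on torch", "enable flashlight"],
      "examples": ["I need a light"],
      "keywords": ["flashlight", "torch"],
      "invoke": {
        "android": {
          "type": "broadcast",
          "action": "__APPLICATION_ID__.ACTION_FLASHLIGHT_ON"
        }
      }
    }
  ]
}
```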

Key fields

| Field | What it does |
| --- | --- |
| `appId` | Use `"__APPLICATION_ID__"`; it is replaced with the real package name at runtime |
| `invoke.android.type` | `"broadcast"` for background actions, `"activity"` for actions needing UI |
| `aliases` | Alternate phrasings ("turn on torch", "enable flashlight") |
| `examples` | Real user utterances ("I need a light") |
| `keywords` | Ranking signals ("flashlight", "torch") |
| `disambiguationHints` | Rules for choosing between similar actions |

More metadata = better voice matching. A capability with 5 aliases and 5 examples resolves much more reliably than one with just a description.

Example with parameters

extrasMapping tells the assistant which Android Intent extra key to use for each parameter.
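
A sketch of a capability with one parameter. The `parameters` shape and the placement of `extrasMapping` inside `invoke.android` are assumptions for illustration; `EXTRA_TIMER_MINUTES` is a hypothetical extra key.

```json
{
  "id": "set_timer",
  "description": "Start a countdown timer",
  "parameters": {
    "minutes": {
      "type": "integer",
      "description": "Timer length in minutes"
    }
  },
  "invoke": {
    "android": {
      "type": "broadcast",
      "action": "__APPLICATION_ID__.ACTION_SET_TIMER",
      "extrasMapping": {
        "minutes": "EXTRA_TIMER_MINUTES"
      }
    }
  }
}
```

Here the assistant would put the resolved `minutes` value into the Intent extra named `EXTRA_TIMER_MINUTES`, which is the key your receiver reads.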

Step 2: Create OACP.md

Place alongside oacp.json in the same assets directory. This gives the AI richer context about your app.

Keep it concise. Explain what your app does, list capabilities in plain English, and note any disambiguation rules.
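
An illustrative example for a hypothetical flashlight app (the content and headings are entirely up to you):

```markdown
# MyTorch

MyTorch is a simple flashlight app.

## Capabilities

- Turn the flashlight on or off.
- Set a brightness level from 1 to 5.

## Disambiguation

- "Light" by itself means the flashlight, not screen brightness.
```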

Step 3: Add the ContentProvider

This is how assistants discover your app. If you're using the OACP Kotlin SDK, the OacpProvider is auto-registered via manifest merger, so you can skip this step entirely.

If you're not using the SDK (for example a pure Flutter or React Native project), you need a ContentProvider in native code:
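
A sketch of such a provider. The actual discovery contract (URIs, method names, result keys) is defined by the OACP spec; this version assumes assistants fetch the metadata files through `ContentProvider.call()`, which is an assumption, not the documented protocol.

```kotlin
import android.content.ContentProvider
import android.content.ContentValues
import android.database.Cursor
import android.net.Uri
import android.os.Bundle
import java.io.IOException

class OacpMetadataProvider : ContentProvider() {

    override fun onCreate(): Boolean = true

    // Hypothetical contract: call("getMetadata", "oacp.json" or "OACP.md", null)
    // returns the file contents under a "content" key.
    override fun call(method: String, arg: String?, extras: Bundle?): Bundle? {
        if (method != "getMetadata" || arg == null) return null
        val ctx = context ?: return null
        return try {
            val text = ctx.assets.open(arg).bufferedReader().use { it.readText() }
            Bundle().apply { putString("content", text) }
        } catch (e: IOException) {
            null
        }
    }

    // Table-style access is unused; the remaining required overrides are stubs.
    override fun query(uri: Uri, projection: Array<String>?, selection: String?,
                       selectionArgs: Array<String>?, sortOrder: String?): Cursor? = null
    override fun getType(uri: Uri): String? = null
    override fun insert(uri: Uri, values: ContentValues?): Uri? = null
    override fun delete(uri: Uri, selection: String?, selectionArgs: Array<String>?): Int = 0
    override fun update(uri: Uri, values: ContentValues?, selection: String?,
                        selectionArgs: Array<String>?): Int = 0
}
```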

For Flutter apps, change the asset paths to flutter_assets/assets/oacp.json and flutter_assets/assets/OACP.md.

Step 4: Handle voice commands

Option A: Background action (OacpReceiver)

Use this for actions that don't need UI, such as toggling settings or querying data. The SDK's OacpReceiver handles goAsync(), threading, result broadcasting, and error wrapping for you:
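
A sketch of a receiver subclass. The base-class API (`OacpReceiver`, an `onAction` callback, an `OacpResult` type) is assumed here; check the SDK for the real signatures. `FlashlightController` is a hypothetical helper standing in for your own logic.

```kotlin
import android.content.Context
import android.os.Bundle

class FlashlightReceiver : OacpReceiver() {

    override fun onAction(context: Context, action: String, extras: Bundle): OacpResult {
        return when {
            // endsWith(), not ==, so the same code matches both the release
            // and the .debug package-name prefixes (see note below).
            action.endsWith(".ACTION_FLASHLIGHT_ON") -> {
                FlashlightController.setEnabled(context, true)
                OacpResult.success()
            }
            action.endsWith(".ACTION_FLASHLIGHT_OFF") -> {
                FlashlightController.setEnabled(context, false)
                OacpResult.success()
            }
            else -> OacpResult.error("Unknown action: $action")
        }
    }
}
```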

Why endsWith instead of exact match? Because debug builds append .debug to the package name. Using endsWith(".ACTION_FLASHLIGHT_ON") works for both com.example.app.ACTION_FLASHLIGHT_ON and com.example.app.debug.ACTION_FLASHLIGHT_ON.

Option B: Foreground action (Activity)

Use this for actions that need visible UI, such as opening the camera or showing an article.
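
A minimal sketch of a foreground handler. `EXTRA_ARTICLE_ID` is a hypothetical extra key that would come from `extrasMapping` in your oacp.json.

```kotlin
import android.app.Activity
import android.os.Bundle

class OpenArticleActivity : Activity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val articleId = intent.getStringExtra("EXTRA_ARTICLE_ID")
        if (articleId == null) {
            // Nothing to show without the parameter.
            finish()
            return
        }
        // ...set a content view and display the requested article...
    }
}
```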

Step 5: Register in AndroidManifest.xml

Important: All OACP components must have android:exported="true". On Android 12+, this is required for components with intent filters.
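
A sketch of the relevant manifest entries, using the component and action names from the earlier examples (which are illustrative). The `<provider>` entry is only needed if you are not using the SDK; `${applicationId}` is the standard manifest placeholder.

```xml
<provider
    android:name=".OacpMetadataProvider"
    android:authorities="${applicationId}.oacp"
    android:exported="true" />

<receiver
    android:name=".FlashlightReceiver"
    android:exported="true">
    <intent-filter>
        <action android:name="${applicationId}.ACTION_FLASHLIGHT_ON" />
        <action android:name="${applicationId}.ACTION_FLASHLIGHT_OFF" />
    </intent-filter>
</receiver>
```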

Step 6: Test it with adb

Verify discovery
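
One way to check that the provider answers on the expected authority (the exact query the assistant issues depends on the OACP discovery contract, so treat this as a smoke test):

```shell
# Replace com.example.myapp with your applicationId.
adb shell content query --uri content://com.example.myapp.oacp
```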

Test actions
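
Broadcasts can be fired by hand with `am broadcast`. The component names (`.FlashlightReceiver`, `.TimerReceiver`) and the extra key are the hypothetical ones from the earlier examples:

```shell
# Background action with no parameters:
adb shell am broadcast \
  -a com.example.myapp.ACTION_FLASHLIGHT_ON \
  -n com.example.myapp/.FlashlightReceiver

# Action with an integer parameter mapped via extrasMapping:
adb shell am broadcast \
  -a com.example.myapp.ACTION_SET_TIMER \
  -n com.example.myapp/.TimerReceiver \
  --ei EXTRA_TIMER_MINUTES 5
```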

For debug builds, replace com.example.myapp with com.example.myapp.debug.

Common gotchas

| Issue | Fix |
| --- | --- |
| Discovery not working | Authority must be exactly `${applicationId}.oacp` |
| Actions not firing | Check `android:exported="true"` on the receiver/activity |
| Works in release but not debug | Use `endsWith()` for action matching, not exact strings |
| Flutter asset not found | Put `oacp.json` in `android/app/src/main/assets/`, not Flutter's `assets/` |
| Parameters are null | Check that `extrasMapping` in `oacp.json` matches the extra keys you're reading |

Next steps

Last Edited: April 9, 2026