
Cai

Press ⌥C on anything. Run custom actions, locally.

Select any text or image and transform it with AI, scripts, shortcuts, and more.
Zero app switching.

Download · macOS 14.0+ · Swift 5.9 · Hugging Face · Runs locally

Website · Docs · Download · Extensions


Cai demo — select text, press ⌥C, pick an action

What

Select any text or image anywhere, press ⌥C, and run AI prompts, shell scripts, or connectors like GitHub and Linear — without app switching. One hotkey, no config, everything runs locally.

No cloud. No telemetry. No accounts.

How It Works

  1. Select text or copy an image anywhere on your Mac
  2. Press ⌥C (Option+C)
  3. Cai detects the content type and shows relevant actions
  4. Pick an action with arrow keys or ⌘1–9
  5. Hit return to finish. The result is auto-copied to your clipboard — just ⌘V to paste

Examples:

  • Take a screenshot → Create GitHub issue
  • Select a recipe → Ask AI: "Extract ingredients for 2 people"
  • Select "serendipity" → Define, Explain, Translate, Search
  • Select "Let's meet Tuesday at 3pm at Starbucks" → Create calendar event, Open in Maps
  • Select an email in Mail → Reply, Summarize, Translate

Read the full How It Works guide

Features

  • Smart content detection — recognizes what you copied (text, image, URL, JSON, meeting, address) and shows the right actions
  • Built-in AI — Apple Intelligence on macOS 26+, or in-process MLX inference on Apple Silicon. No server, no cloud, no setup
  • GitHub & Linear — create issues from any selected text with AI-generated title, body, and duplicate detection
  • Custom actions — save reusable AI prompts, URL templates, and shell commands as one-click actions
  • Image to Text — on-device OCR via Apple Vision framework
  • Bring your own LLM — works with LM Studio, Ollama, any OpenAI-compatible server, or any model from HuggingFace mlx-community
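The content-detection step can be pictured with a small heuristic. This is a simplified sketch, not Cai's actual classifier — the real detector also recognizes meetings, addresses, and images, and `ContentKind`/`detectKind` are illustrative names:

```swift
import Foundation

// Classify copied text so the right actions can be shown.
// Simplified illustration of the idea, not Cai's implementation.
enum ContentKind: String {
    case url, json, text
}

func detectKind(_ raw: String) -> ContentKind {
    let s = raw.trimmingCharacters(in: .whitespacesAndNewlines)
    // URLs: must parse and carry an http(s) scheme
    if let u = URL(string: s), let scheme = u.scheme,
       scheme == "http" || scheme == "https" {
        return .url
    }
    // JSON: a top-level object or array must deserialize
    if let data = s.data(using: .utf8),
       (try? JSONSerialization.jsonObject(with: data)) != nil {
        return .json
    }
    return .text
}
```

Ordering matters here: URL and JSON are checked first because plain text matches anything, so it serves as the fallback.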

Also includes:

  • Custom output destinations (Mail, Notes, webhooks, AppleScript)
  • Follow-up questions
  • Context Snippets (pass per-app context to the LLM)
  • Clipboard history (last 100, search and pin)
  • Keyboard-first (arrow keys, ⌘1–9)
  • Community extensions

See all features in the docs
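The clipboard-history behavior ("last 100, search and pin") boils down to a capped, newest-first list that never evicts pinned entries. A minimal sketch of that data structure — not Cai's implementation, and all names here are hypothetical:

```swift
import Foundation

// Capped newest-first history with search and pinning.
// Illustrative sketch of the "last 100, search and pin" feature.
struct ClipboardHistory {
    private(set) var items: [(text: String, pinned: Bool)] = []
    let capacity = 100

    mutating func add(_ text: String) {
        items.insert((text: text, pinned: false), at: 0)
        // When over capacity, evict the oldest unpinned entry
        if items.count > capacity,
           let idx = items.lastIndex(where: { !$0.pinned }) {
            items.remove(at: idx)
        }
    }

    mutating func pin(at index: Int) {
        items[index].pinned = true
    }

    func search(_ query: String) -> [String] {
        let q = query.lowercased()
        return items.filter { $0.text.lowercased().contains(q) }
                    .map { $0.text }
    }
}
```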

Installation

Homebrew

brew tap cai-layer/cai && brew install --cask cai

Manual Download

  1. Download the .dmg from the latest release
  2. Open the DMG and drag Cai.app to your Applications folder

After Install

  1. Open the app and grant Accessibility permission when prompted
  2. On macOS 26+, Cai uses Apple Intelligence automatically. Otherwise, the built-in MLX model downloads on first launch — or skip if you already use LM Studio / Ollama

Full installation guide · LLM setup
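Pointing Cai at LM Studio or Ollama works because both expose an OpenAI-compatible `/v1/chat/completions` endpoint. A sketch of the request body that API expects — the model name is a placeholder, and the ports in the comment are those servers' usual defaults, not values Cai hard-codes:

```swift
import Foundation

// Request body for an OpenAI-compatible chat-completions endpoint.
// Minimal sketch: real requests can also carry temperature, streaming
// flags, etc.
struct ChatMessage: Codable {
    let role: String
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

func chatPayload(prompt: String, model: String) throws -> Data {
    let request = ChatRequest(
        model: model,
        messages: [ChatMessage(role: "user", content: prompt)]
    )
    return try JSONEncoder().encode(request)
}

// POST this as JSON to the server's base URL, e.g.
// http://localhost:1234/v1/chat/completions  (LM Studio default port)
// http://localhost:11434/v1/chat/completions (Ollama default port)
```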

Build from Source

git clone https://github.com/cai-layer/cai.git
cd cai/Cai
open Cai.xcodeproj

In Xcode: select the Cai scheme and My Mac as destination, then Product → Run (⌘R).

Note: The app requires Accessibility permission and runs without App Sandbox (required for global hotkey and CGEvent posting).

What's New

→ Check the full changelog

Documentation

Full documentation is at getcai.app/docs.


Requirements

  • macOS 14.0 (Sonoma) or later
  • Apple Silicon (M1 or later) for the built-in AI engine
  • 8 GB RAM minimum, 16 GB recommended for larger models
  • Accessibility permission (for global hotkey ⌥C)

Under the Hood

  • SwiftUI + AppKit — native macOS, no Electron
  • MLX-Swift — in-process LLM inference on Apple Silicon, no subprocess or server
  • No App Sandbox — global hotkey requires CGEvent posting outside the sandbox
  • MCP via ~200-line JSON-RPC client (Beta) — GitHub and Linear connectors with zero external MCP dependencies
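MCP messages travel in JSON-RPC 2.0 envelopes, which is why a small hand-rolled client suffices. An illustrative sketch of that envelope — field names follow the JSON-RPC 2.0 spec, not Cai's internal types, and `tools/list` is a standard MCP method shown here as an example:

```swift
import Foundation

// JSON-RPC 2.0 request envelope, the wire format MCP speaks.
// Illustrative only; params is narrowed to String values for brevity.
struct RPCRequest: Encodable {
    var jsonrpc = "2.0"
    let id: Int
    let method: String
    let params: [String: String]?
}

func encodeRPC(_ request: RPCRequest) throws -> Data {
    try JSONEncoder().encode(request)
}

// e.g. encodeRPC(RPCRequest(id: 1, method: "tools/list", params: nil))
```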

License

MIT
