AI

This page only applies to Raycast for Mac V1. Latest information for Raycast for Mac V2, Windows, and iOS is available in the New Raycast Manual.

Anyone can try Raycast AI with a limited number of free messages. For unlimited AI messages (subject to Raycast's Request limits), you'll need an active Pro subscription — upgrade here!

Read our AI Privacy and Security statement.

Welcome to the Raycast AI Manual! This doc will help you get familiar with the different features Raycast AI offers and how to start using them. Here you'll also find our approach to AI Privacy and Security, usage and request limits, and FAQs.


These are the main ways to interact with AI in Raycast.

  • Quick AI: The fastest way to get answers to your questions directly from within Raycast. Ideal for one-off questions. Here's a video to see it in action.
  • AI Chat: A standalone chat window where you can ask AI anything. Ideal as an assistant during your workday.
  • AI Commands: Built-in or custom prompts, ideal for common tasks such as improving your writing.
  • AI Extensions: Use an @-mention to ask any extension a question or to get help with your daily workflow. Check out the guide linked below or watch the video to learn more.
Quick AI

Follow the steps to ask AI anything:

  1. Open Raycast
  2. Type your question, e.g. "What were the specs of the first Macintosh?"
  3. Press Tab to switch to Quick AI
  4. Perform your next action, e.g. press J to continue the conversation in the AI Chat

Some more noteworthy features:

  • You can press Tab after opening Raycast to switch to Quick AI. This way you can access your previous question.
  • Quick AI is only for one-off questions. If you want to ask a follow-up question, continue in the chat.
  • You can hide the Ask AI Tab hint via Settings → Extensions → Raycast AI. The Tab key still works.
AI Chat

Follow the steps to interact with the AI Chat:

  1. Open Raycast
  2. Search for "AI Chat" to open the separate chat window
  3. Type in any question you want to start a conversation
  4. Continue the conversation with follow up questions
  5. Press ⌘ K, select "Ask AI" and search for "Make Shorter"
  6. Raycast applies the AI Command to shorten the previous message

Assign a global hotkey (e.g. J) via Preferences → Extensions → Raycast AI → AI Chat to open the chat quicker.

You can import AI Chat Presets from a JSON file using the Import AI Chat Presets command (available in Raycast 1.72.0 or later).

Import AI Chat Presets

The JSON format should be an array of AI Chat presets with properties as described below:

| Property           | Description                                          | Type                                                 | Required |
| ------------------ | ---------------------------------------------------- | ---------------------------------------------------- | -------- |
| `name`             | Name.                                                | `string`                                             | ✓        |
| `model`            | Model.                                               | `"openai-gpt-3.5-turbo" \| "openai-gpt-4"`           | ✓        |
| `creativity`       | Creativity (temperature).                            | `"none" \| "low" \| "medium" \| "high" \| "maximum"` | ✓        |
| `instructions`     | System instructions.                                 | `string`                                             | ✓        |
| `web_search`       | Enable web search. Defaults to `false`.              | `bool`                                               |          |
| `image_generation` | Enable image generation (beta). Defaults to `false`. | `bool`                                               |          |
[
  {
    "name": "Preset Name",
    "model": "openai-gpt-4",
    "creativity": "medium",
    "instructions": "Instructions for the AI.\nContinuing here.",
    "web_search": false,
    "image_generation": true
  }
]
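Before importing, it can be handy to check that a presets file matches the shape described above. The sketch below is an illustrative, unofficial validator for the documented fields, not a Raycast tool:

```python
import json

# Allowed values taken from the presets table above.
ALLOWED_MODELS = {"openai-gpt-3.5-turbo", "openai-gpt-4"}
ALLOWED_CREATIVITY = {"none", "low", "medium", "high", "maximum"}


def validate_presets(raw: str) -> list[str]:
    """Return a list of problems; an empty list means the file looks importable."""
    presets = json.loads(raw)
    if not isinstance(presets, list):
        return ["top-level value must be an array of presets"]
    problems = []
    for i, p in enumerate(presets):
        if not isinstance(p, dict):
            problems.append(f"preset {i}: must be an object")
            continue
        if not isinstance(p.get("name"), str):
            problems.append(f"preset {i}: 'name' must be a string")
        if p.get("model") not in ALLOWED_MODELS:
            problems.append(f"preset {i}: unknown 'model' {p.get('model')!r}")
        if p.get("creativity") not in ALLOWED_CREATIVITY:
            problems.append(f"preset {i}: unknown 'creativity' {p.get('creativity')!r}")
        if not isinstance(p.get("instructions"), str):
            problems.append(f"preset {i}: 'instructions' must be a string")
        for flag in ("web_search", "image_generation"):  # optional booleans
            if flag in p and not isinstance(p[flag], bool):
                problems.append(f"preset {i}: '{flag}' must be a boolean")
    return problems
```

Running it over the example file above should report no problems.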

After the import is completed, you'll see how many AI Chat presets were added and how many duplicates were skipped, if any. If a matching preset is found, it's considered a duplicate.

AI Chat Presets import result

You can find ready-made presets at presets.ray.so.

Explain Words AI Command
  1. Select the word "philosophy"
  2. Open Raycast and search for "Explain This in Simple Terms"
  3. Raycast AI responds with the definition of the word "philosophy"
Improve Writing AI Command
  1. Select a sentence, paragraph, or more text
  2. Open Raycast and search for "Improve Writing"
  3. Raycast AI responds with a text improved for grammar and style
Customize a built-in AI Command

To add a customized AI Command to your Raycast Root Search, continue with these steps:

  1. Go to "Search AI Commands" and find, for example, "Improve Writing"
  2. Press ⌘ D to duplicate the AI Command
  3. Give it a new title, for example "Improve Writing in My Personal Style", change the icon to Person Lines, and set the creativity to low
  4. Tweak the Prompt and define some rules that describe how you tend to write, for example: keep everything lowercase
  5. Press ⌘ ↵ to update the AI Command
  6. Now you can use the AI Command from your Root Search
Translate Text AI Command

Follow the steps to create a new AI Command that translates selected text to English:

  1. Open Raycast, search for "Create AI Command"
  2. Give it a Title like "Translate Selected Text to English"
  3. Type "Translate {selection} to English" (or any other language) in the Prompt field and hit ⌘ ↵ to save
  4. When you execute the AI Command, Raycast instantly shows a window with the translation of the selected text
Translate Text to Any Language

Follow the steps to create a new AI Command that translates selected text to any language:

  1. Create a new AI Command, or edit the one we just made by finding it and hitting E.
  2. Change the Title to something like "Translate Selected Text to..."
  3. Change the prompt to use an Argument with a dynamic placeholder, for example: Translate {selection} to {argument name="Language"}
  4. Press ⌘ ↵ to update the AI Command
  5. Now, any time you execute the AI Command, you can define the language you want.
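To illustrate what such a prompt expands to, here is a small sketch of the substitution idea. Raycast's actual placeholder engine is richer than this; the function below only handles the two placeholders used above:

```python
import re


def expand(prompt: str, selection: str, arguments: dict[str, str]) -> str:
    """Expand {selection} and {argument name="..."} placeholders in a prompt."""
    prompt = prompt.replace("{selection}", selection)
    # Replace each {argument name="X"} with the value the user supplied for X.
    return re.sub(r'\{argument name="([^"]+)"\}',
                  lambda m: arguments[m.group(1)], prompt)
```

For example, with "Hallo" selected and "English" entered as the Language argument, the prompt above expands to "Translate Hallo to English".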

Dynamic Placeholders can make your AI Commands even more powerful. Learn more about them in the next section.

You can import AI commands from a JSON file using the Import AI Commands command (available in Raycast 1.53.0+).

Import AI Commands

The JSON format should be an array of AI commands with properties as described below:

| Property     | Description                                                     | Type                                                 | Required |
| ------------ | --------------------------------------------------------------- | ---------------------------------------------------- | -------- |
| `title`      | Command title.                                                  | `string`                                             | ✓        |
| `prompt`     | Command prompt.                                                 | `string`                                             | ✓        |
| `creativity` | Command creativity (temperature).                               | `"none" \| "low" \| "medium" \| "high" \| "maximum"` |          |
| `icon`       | Command icon in kebab-case (e.g. AddPerson becomes add-person). | `string`                                             |          |
| `model`      | Command model (GPT-4 requires Pro).                             | `"openai-gpt-3.5-turbo" \| "openai-gpt-4"`           |          |
[
  {
    "title": "Write 10 Alternatives",
    "prompt": "Give me 10 alternative versions of the text. Ensure that the alternatives are all distinct from one another.\n\nText:\n\nAlternatives:",
    "creativity": "high",
    "icon": "shuffle"
  },
  {
    "title": "Find Synonyms",
    "prompt": "Find synonyms for the word {selection} and format the output as a list. Words should exist. Do not write any explanations. Do not include the original word in the list. The list should not have any duplicates.",
    "creativity": "medium",
    "icon": "text",
    "model": "openai-gpt-4"
  }
]

After the import is completed, you'll see how many AI commands were added and how many duplicates were skipped, if any. If a command has the same title and prompt, it's considered a duplicate.

AI Commands import result

You can find ready-made AI commands at prompts.ray.so.
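The duplicate rule described above (same title and prompt means skipped) can be sketched as a small merge function. This is an illustration of the behavior, not Raycast's implementation:

```python
def merge_commands(existing: list[dict], imported: list[dict]) -> tuple[list[dict], int]:
    """Merge imported AI commands into an existing list.

    A command whose (title, prompt) pair already exists counts as a duplicate
    and is skipped; the function returns the merged list and the skip count.
    """
    seen = {(c["title"], c["prompt"]) for c in existing}
    merged, skipped = list(existing), 0
    for cmd in imported:
        key = (cmd["title"], cmd["prompt"])
        if key in seen:
            skipped += 1
        else:
            seen.add(key)
            merged.append(cmd)
    return merged, skipped
```
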

Raycast AI Extensions let you interact with your tools and instruct them using natural language. Describe what you want to do, add the relevant AI Extensions to your message or chat, and let AI do the hard work.

AI Extensions

With AI Extensions, you can include an @-mention to ask any extension a question or to get help with your daily workflow. Check out the guide linked below or watch the video to learn more.

The Model Context Protocol, or MCP for short, is an open protocol that standardizes how applications provide context to LLMs. You can use MCP servers within Raycast to extend AI even further. MCP servers work and behave similarly to our AI Extensions. After installation, you can @-mention MCP servers in the root search, AI Commands, AI Chat, and Presets.

Only AI models whose model card in the AI model selector shows AI Extensions support will work with MCP.

Learn more on the Model Context Protocol page.

With BYOK you can now use Raycast AI with your own API Key from your AI provider accounts. We currently support: Anthropic, Google, OpenAI, and OpenRouter. This allows you to enjoy all of Raycast's powerful AI features, and send as many AI messages as you want at your own cost without a Pro subscription.

  1. Create and copy an API Key from your AI provider's settings dashboard.
  2. Go to Raycast AI Settings, find the section called Custom API Keys, and add your API Key for that provider.
  3. Validate the Key.
  4. From now on, when you select a model from that AI provider, it uses your API Key. You can see a small key icon in the model picker.

All models via OpenRouter are supported and available as models via BYOK. Requests are sent directly to OpenRouter's servers.

Note: Chats do not support remote tools and may not include web search.

Note: Only models we support via Raycast AI will be available as models via BYOK.

Any message you send incurs cost via your AI provider. This can amount to more than a Raycast Pro subscription. Be sure to keep an eye on the provider's API dashboard to prevent costly surprises.

If using BYOK (excluding OpenRouter), requests are processed through our servers in order to unify the model APIs, integrate fallback behaviors, and do some final prompt management. To learn more about how we handle BYOK and privacy, see AI Privacy + Security.

BYOK API Key settings
BYOK model picker with key icon

Local models allow you to run nearly any LLM locally, on your machine. This is possible through our integration with Ollama, giving you access to over 100 AI models from various providers like Google, Meta, and Microsoft, ranging from small 135M-parameter to massive 671B-parameter models.

Learn more: Local Models FAQ

  1. Download and install Ollama. Make sure to move the app to your Applications folder.
  2. Install local models offered through Ollama
    • Explore models via: ollama.com/search
    • Install them via:
      • Ollama in a terminal or,
      • Raycast AI Settings and type the model name in the "Add Ollama Model" input field.
        • By default, the :latest version available gets downloaded. You can opt for a specific version of the model by appending a colon : followed by the version tag, e.g. qwen3:4b.
  3. Choose your local model from the model picker, like you would any other model in Raycast
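The tagging convention above can be sketched in a few lines. This is an illustration of how Ollama resolves model references, not part of Raycast:

```python
def ollama_model_ref(name: str) -> str:
    """Return the model reference with an explicit tag.

    A bare model name pulls the :latest version; "name:tag" pins a
    specific one (e.g. qwen3:4b).
    """
    return name if ":" in name else f"{name}:latest"
```
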

If you want to see which models are installed, you can run ollama list in a terminal.

Some local models are very large and can take up hundreds of gigabytes of space on your hard drive. Also, depending on your machine's hardware, they can quickly become noticeably slower than what you're used to with cloud-provided LLMs.

Local Models in Raycast AI Settings
Add Ollama Model input field
Local model in model picker
Local model running
Ollama model details

With the ever-growing selection of AI Models from various providers, you might want to control which ones are available to you, and which get hidden in the model pickers. With the "Manage Models" command you can do exactly that. Find the ones you won't use, and disable them by hitting Return.

This command supports multi-select. You can ⌘ + Click on multiple rows, or use ⇧ + ↑ / ↓, just like selecting multiple items in Finder, and enable or disable multiple models in one go.

Manage Models command

Some AI models provide tools to generate images based on a prompt. OpenAI models allow you to generate images using DALL-E 2 or DALL-E 3. You can also use Stable Diffusion and Flux by finding them in Root search or including them in a prompt via an @-mention.

Not all models provide tools to generate images. When selecting a model, the detail view should include "Image Generation" in the supported tools section.

Model detail view showing Image Generation support

Simply request images using natural language in Quick AI or AI Chat.

  • Up to 4 images can be requested at once.
  • Several aspect ratios are supported. Landscape and portrait aspect ratios can be explicitly requested in the prompt, otherwise the default aspect ratio is square.
Generated images in AI Chat

Once images have been generated, there are several options to keep a copy.

  • Contextual Menu: Right-click on the image to copy, save, or share.
  • Quick Look: Double click to preview the image.
  • Action Panel (⌘ K): Several actions are provided to interact with images — like "Copy Images" which copies every image generated in the last message.
  • Drag & Drop: Images can be dragged and dropped directly into the desired app.

Provide more context in your chats using attachments.

  • Select Add Attachments… in the action panel (⌘ A)
  • Click the ⊕ button and select one of the attachment providers
  • Drag & drop or paste attachments directly into AI Chat
Adding an attachment in AI Chat

Quick Look an attachment with a double-click, or by selecting it and pressing space.

Quick Look preview of an AI Chat attachment

Some attachments may need System Permissions to work:

  • Screen Capture: Screen & System Audio Recording permission
  • My Schedule: Calendars access
  • Selected File From Finder: Automation → Raycast → Finder

Image attachments are securely uploaded to our servers for AI processing, enabling you to access them later. If you delete a message or chat that includes attachments, those attachments will be deleted from the server.

The following attachment types are supported:

  • Text-based files or content (.txt, .md, .csv…). Some attachment providers also fit in this category, like "My Schedule" or "Add Selected Text".
    • Text-based attachments work with any AI model.
  • PDF files — only the text content of the file is extracted.
  • Image types natively supported on Apple devices (.jpeg, .png, .webp, .gif, .heic…).
    • Some types may be converted to .png or .jpeg for compatibility.
    • Image-based attachments only work with models that support Vision, like GPT-4o.
Model detail view showing Vision support

Other composite file types are not supported (e.g. .docx, .pptx, .key).
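The support matrix above can be summarized as a small classifier. The extension sets below are the examples from this page, not an exhaustive list of what Raycast accepts:

```python
from pathlib import Path

# Example extensions from this page; Raycast may accept more.
TEXT_EXTS = {".txt", ".md", ".csv"}
IMAGE_EXTS = {".jpeg", ".jpg", ".png", ".webp", ".gif", ".heic"}


def attachment_kind(filename: str) -> str:
    """Classify an attachment per the support matrix above."""
    ext = Path(filename).suffix.lower()
    if ext in TEXT_EXTS:
        return "text (works with any model)"
    if ext == ".pdf":
        return "pdf (only text content is extracted)"
    if ext in IMAGE_EXTS:
        return "image (requires a Vision-capable model)"
    return "unsupported"
```
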

Create alternate conversation paths from any point in your chat history. Think of it as a "save point" where you can explore different directions without losing your original conversation.

To branch a chat: Simply press CMD Shift B to create a new branched chat. Alternatively, you can right click on a chat in the sidebar, and select Branch Chat.

To navigate back to a parent chat: Simply press CMD Option to jump back to the parent chat. Alternatively you can right click on a chat in the sidebar, and select Go to Parent Chat.

Chat Branching

Open Raycast Settings → AI Tab, and customize your experience: pick your favorite model for Quick AI, set the behavior of the AI Chat window, or set up a hotkey for Chat. You can also completely turn off and hide AI across Raycast with the big switch on the left side.

AI Settings

To completely turn off and hide AI across Raycast:

  1. Open Raycast Settings → AI Tab
  2. Use the big toggle switch on the left side

Or simply open the following deeplinks:

  • Disable: raycast://ai/settings/disable
  • Enable: raycast://ai/settings/enable
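The deeplinks above can also be triggered from a script via macOS's open command. The sketch below is an illustrative helper (the toggle_ai name is hypothetical, not a Raycast API):

```python
import subprocess
from urllib.parse import urlparse

# The two deeplinks documented above.
DEEPLINKS = {
    "disable": "raycast://ai/settings/disable",
    "enable": "raycast://ai/settings/enable",
}


def toggle_ai(action: str, dry_run: bool = False) -> str:
    """Open the matching Raycast AI deeplink; returns the URL used."""
    url = DEEPLINKS[action]
    assert urlparse(url).scheme == "raycast"
    if not dry_run:
        subprocess.run(["open", url], check=True)  # macOS only
    return url
```
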

As with all things AI, reliability is a tricky thing to get right. In order to improve and continually make Raycast AI more reliable and useful, we need to understand what works and what doesn't.

Providing feedback – the good and the bad – helps us to do this.

If you come across examples that do what you intended particularly well, or that missed the mark, you can report them to us by hitting the thumbs up 👍 or thumbs down 👎 button on the message.

Feedback buttons show when hovering any message
A feedback prompt will show after each message with an AI Extension

If you choose to report feedback, the full chat thread — including AI Extension tool calls and results — will be sent to us. This is entirely opt-in and you must give explicit consent before any data is sent. Please be mindful of any sensitive data the thread might contain. We may use this information to improve the reliability of our AI System.

We handle this data with the same privacy and security as outlined in our Terms of Service and Privacy Policy.

Consent prompt when submitting AI feedback
If you choose to submit feedback about an AI-generated message (thumbs up/down), we will prompt you to confirm that you are sharing the content of the chat with us and require your consent beforehand.

Some AI features are more experimental in nature: something we're testing out to collect feedback on, functionality that might not yet be reliable, or features that are still in development. Rather than building these behind closed doors, we want to share them so we can learn and build alongside those of you who are interested in trying the bleeding edge of our AI features.

You can access and control these at the bottom of the Raycast AI Settings screen. Find the "Experiments" section.

Join the conversation and share your feedback about these features in our Slack Community #ai-experiments channel.

Experimental features in AI Settings

Enable HTTP MCP servers using the Server-Sent Events (SSE) and Streamable HTTP protocols. This makes setup easier: you just connect to a remote MCP server, and the server itself is managed by its host. You no longer have to worry about complex steps, like running a local Node process. You can try it with e.g. https://mcp.linear.app

If you've got Ollama installed, you can try out tool calling with local models. Enabling this feature allows you to use any AI Extension in a chat with a local model.

Tool choice and streaming for tool calls aren't supported by Ollama just yet, which means this can be a bit unreliable. That's why we decided to make it available as an Experimental feature.

For advanced users: you can now add any OpenAI-compatible LLM provider to Raycast AI.

Custom providers are defined in a providers.yaml configuration file. Go to Settings → AI → Custom Providers and click the Reveal Providers Config button to open the file in the Finder. A template is created containing an example configuration.

You can also download the template configuration file.

To make it easier to choose the best model, we've introduced Raycast Auto model that, under the hood, selects the best provider and model for your request: it chooses a fast model for simple requests, a reasoning model for sophisticated tasks, the best coding model for programming requests, or a web-search model if it requires access to real-time data.

Check out the Usage Limits page in the new Raycast Manual for the latest usage limits for models available in Raycast AI.