
Concepts

To make “Open in ChatGPT” actually feel like opening this page in ChatGPT, the plugin uses a different strategy per provider, picked to match what each service supports today.

Three strategies cover every supported provider:

url-prompt. The provider supports a query-string parameter that prefills the chat input. The plugin builds a URL like:

https://chatgpt.com/?q=<URL-encoded prompt>

…and opens it in a new tab. The user lands directly in a session with a prefilled message they can submit.

Used by: ChatGPT, Perplexity, GitHub Copilot, T3 Chat, Cursor (as the fallback when over the byte cap), Grok, DuckDuckGo AI Chat, Kagi Assistant, Google AI Studio, You.com.
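A minimal sketch of the url-prompt flow, assuming a hypothetical `buildPromptUrl` helper (the plugin's real internals may differ):

```typescript
// Hypothetical helper: encode the resolved prompt into the provider's
// prefill query parameter.
function buildPromptUrl(base: string, param: string, prompt: string): string {
  const url = new URL(base);
  // searchParams.set handles the URL-encoding of the prompt text.
  url.searchParams.set(param, prompt);
  return url.toString();
}

// ChatGPT-style prefill:
const chatgpt = buildPromptUrl(
  "https://chatgpt.com/",
  "q",
  "Read https://example.com/guides/install.md. I want to ask questions about it.",
);
```

Opening `chatgpt` in a new tab lands the user in a session with the message prefilled.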

clipboard-open. The provider has no reliable URL prefill, so the plugin:

  1. Copies the resolved prompt (with the page URL substituted) to the clipboard.
  2. Opens the provider’s app/web entry point in a new tab.

The user pastes once. This is more friction than url-prompt, but it’s the only honest path for providers without URL prefill — anything else would silently drop the page context.

Used by: Claude, Gemini, DeepSeek, Mistral Le Chat, HuggingChat, Phind.
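The two steps above can be sketched as a small function with the browser APIs injected; the names here are illustrative, and the real clipboard API (`navigator.clipboard.writeText`) is promise-based rather than synchronous:

```typescript
type ClipboardOpenDeps = {
  copy: (text: string) => void; // in the browser: navigator.clipboard.writeText
  open: (url: string) => void;  // in the browser: window.open(url, "_blank")
};

// Illustrative sketch of the clipboard-open strategy.
function clipboardOpen(prompt: string, entryUrl: string, deps: ClipboardOpenDeps): void {
  deps.copy(prompt);   // 1. copy the resolved prompt
  deps.open(entryUrl); // 2. open the provider's entry point
}

// Exercised with stubs that record the calls:
const calls: string[] = [];
clipboardOpen(
  "Read https://example.com/guides/install.md. I want to ask questions about it.",
  "https://claude.ai/new",
  { copy: (t) => calls.push(`copy:${t.split(" ")[0]}`), open: (u) => calls.push(`open:${u}`) },
);
// calls → ["copy:Read", "open:https://claude.ai/new"]
```

Injecting the two side effects keeps the ordering visible: the prompt must be on the clipboard before the tab opens.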

inline-content. The provider accepts the full markdown body in the URL, not just a prompt. The plugin builds a URL using the special {prompt_with_markdown} placeholder:

https://cursor.com/link/prompt?text=<prompt + "\n\n" + page markdown>

This is the ideal experience — the LLM sees the actual content, not a URL it has to fetch — but URLs have hard length limits.

The maxBytes budget. When the assembled URL would exceed maxBytes (default 8000 for Cursor), the plugin falls back to the provider’s fallbackStrategy. For Cursor, the fallback is url-prompt, so long pages still open Cursor with a prompt referencing the page URL.

Used by: Cursor.
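A sketch of the byte-budget check, assuming a hypothetical `buildInlineUrl` helper; the exact shape of the fallback URL is an assumption, not the plugin's actual output:

```typescript
// Hypothetical helper: inline the markdown when it fits, otherwise fall
// back to a url-prompt-style link that only references the page URL.
function buildInlineUrl(
  prompt: string,
  markdown: string,
  pageUrl: string,
  maxBytes = 8000,
): string {
  const inline = new URL("https://cursor.com/link/prompt");
  inline.searchParams.set("text", `${prompt}\n\n${markdown}`);
  const full = inline.toString();

  // The budget is measured in bytes of the final URL, not characters.
  if (new TextEncoder().encode(full).length <= maxBytes) return full;

  // Over budget: url-prompt fallback with just the page URL.
  const fallback = new URL("https://cursor.com/link/prompt");
  fallback.searchParams.set("text", `Read ${pageUrl}. I want to ask questions about it.`);
  return fallback.toString();
}
```

Measuring after encoding matters: URL-encoding roughly triples the size of non-ASCII content, so a character count would overshoot the cap.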

Every strategy that mentions “the page” needs a stable URL where the page’s markdown body lives. The plugin injects a route at /[slug].md (configurable via markdownUrl) that serves Content Collection entries as text/markdown.

So if your page is at /guides/install/, the markdown route is at /guides/install.md. The dropdown’s View as Markdown action navigates there directly; Copy as Markdown fetches it client-side.
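The path mapping can be sketched as (helper name is illustrative):

```typescript
// "/guides/install/" and "/guides/install" both map to "/guides/install.md".
function markdownRoute(pagePath: string): string {
  const trimmed =
    pagePath.endsWith("/") && pagePath !== "/" ? pagePath.slice(0, -1) : pagePath;
  return `${trimmed}.md`;
}
```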

If your site already publishes per-page markdown — say through a custom HTML→Markdown pipeline at /raw/[...slug] — set injectRoute: false to disable the built-in route, and point markdownUrl at your existing one.
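A sketch of that configuration — the injectRoute and markdownUrl option names come from this page, but the package and import names here are assumptions:

```ts
// astro.config.mjs — the package name "open-in-llm" is hypothetical.
import { defineConfig } from "astro/config";
import openInLlm from "open-in-llm";

export default defineConfig({
  integrations: [
    openInLlm({
      injectRoute: false,            // don't register the built-in /[slug].md route
      markdownUrl: "/raw/[...slug]", // use the existing HTML→Markdown pipeline instead
    }),
  ],
});
```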

Every “Open in…” strategy starts from a prompt template. The default is:

Read {md_url}. I want to ask questions about it.

{md_url} is substituted with the absolute URL of the current page’s .md route — LLMs that fetch URLs get clean markdown instead of rendered HTML. Use {url} instead if you need the rendered page URL. You can customise the prompt globally via the prompt option, or per provider via providers.<id>.prompt. The customised prompt is what gets URL-encoded into the url-prompt and inline-content URL templates, or copied to the clipboard for clipboard-open providers.

Strategy choice is shaped by what each provider currently supports — not by what would make our integration code simpler. As of late 2025 / early 2026:

  • ChatGPT, Perplexity, T3 Chat, GitHub Copilot, Grok, DuckDuckGo, Kagi, Google AI Studio, You.com all expose query-string prefills (?q=, ?prompt=, etc.) that auto-populate the input box. url-prompt is the cleanest experience here. GitHub Copilot’s ?prompt= parameter was officially documented in December 2025.
  • Claude.ai’s ?q= parameter broke in October 2025. Gemini, DeepSeek, Mistral, HuggingChat, and Phind have no documented prefill. For these, clipboard-open is the only path that always preserves context.
  • Cursor uniquely supports inline-content via its official deeplinks API, with an 8 KB cap.

When provider URL schemes change, the plugin adapts in a point release; you do not need to update your config. If you want to override a provider’s strategy yourself, see providers.<id>.strategy.
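For example, if a provider’s prefill URL comes back before the plugin ships a fix, you could flip the strategy yourself. The option shape follows providers.<id>.strategy above; the integration name is the same hypothetical one as earlier:

```ts
openInLlm({
  providers: {
    claude: {
      strategy: "url-prompt", // override the default clipboard-open
    },
  },
});
```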