OpenClaw Plugin Analysis: GitHub Copilot Provider, Google Plugin, and Perplexity Plugin
Agent: E — Copilot & Google Plugin Capability Analysis
Date: 2026-04-15
For: Guillaume Descoteaux-Isabelle
Scope: @openclaw/github-copilot-provider, @openclaw/google-plugin, @openclaw/perplexity-plugin capabilities, synergy, and routing
Key Findings
- The GitHub Copilot provider is a bundled, first-class OpenClaw extension that uses your Copilot subscription to access GPT-4o, GPT-4.1, Claude Sonnet 4.5/4.6, o1/o3-mini, and potentially any new Copilot-supported model — all at $0 marginal cost within your subscription. Authentication uses the GitHub device-login flow (no API keys needed). It also provides memory search embeddings for free.
- The Google plugin is a multi-capability powerhouse — not just "Google Search." It provides: Gemini LLM chat completions, image generation (Gemini 3.1 Flash Image), video generation (Veo 3.1), music generation (Lyria 3), media understanding (image/audio/video), and web search via Gemini Grounding. Auth requires a `GEMINI_API_KEY` or Google OAuth.
- The Perplexity plugin is a web-search-only provider (not an LLM provider). It adds structured web search with domain/date filtering (native API) or AI-synthesized answers with citations (OpenRouter/Sonar). Requires its own `PERPLEXITY_API_KEY` or `OPENROUTER_API_KEY`.
- All three can run simultaneously alongside Ollama. OpenClaw uses a `primary` + `fallbacks` model array per agent. Each provider occupies a different capability niche (LLM inference vs. web search vs. media generation), so they complement rather than compete.
- Hermes Agent is a separate, Python-based framework — not a variant of OpenClaw. It has migration tools (`hermes claw migrate`) to import OpenClaw configs/skills, but plugins are not directly cross-compatible at runtime. Different architecture (Node.js vs. Python).
@openclaw/github-copilot-provider
What It Does
The github-copilot provider is a bundled OpenClaw extension (enabled by default) that turns your GitHub Copilot subscription into a full LLM provider for OpenClaw. It handles authentication, model discovery, transport selection, and token exchange — all without requiring a separate API key or VS Code.
Source location: extensions/github-copilot/ in the OpenClaw monorepo
Plugin manifest: extensions/github-copilot/openclaw.plugin.json
Integration Mechanism
The provider uses GitHub's Copilot API directly (not the Language Server):
- Device-login flow: `openclaw models auth login-github-copilot` triggers a GitHub device OAuth flow (visit URL → enter code → token stored)
- Token exchange: At runtime, OpenClaw exchanges the stored GitHub token for a short-lived Copilot API token
- Transport auto-selection: Claude model IDs route through the `anthropic-messages` transport; GPT/o-series/Gemini use the `openai-responses` transport — selected automatically based on model ID
// From extensions/github-copilot/models.ts
export function resolveCopilotTransportApi(
modelId: string,
): "anthropic-messages" | "openai-responses" {
return (normalizeOptionalLowercaseString(modelId) ?? "").includes("claude")
? "anthropic-messages"
: "openai-responses";
}
Models Available via Copilot
From extensions/github-copilot/models-defaults.ts, the default catalog includes:
| Model ID | Family | Notes |
|---|---|---|
| claude-sonnet-4.6 | Anthropic | Via Anthropic Messages transport |
| claude-sonnet-4.5 | Anthropic | Via Anthropic Messages transport |
| gpt-4o | OpenAI | Via OpenAI Responses transport |
| gpt-4.1 | OpenAI | Via OpenAI Responses transport |
| gpt-4.1-mini | OpenAI | Smaller/faster variant |
| gpt-4.1-nano | OpenAI | Smallest variant |
| o1 | OpenAI | Reasoning model |
| o1-mini | OpenAI | Reasoning model (smaller) |
| o3-mini | OpenAI | Reasoning model |
Critical: The provider has a forward-compat catch-all — any unknown model ID is accepted and synthesized as a dynamic model definition. If Copilot adds gpt-5.4 tomorrow, you just set it in config without waiting for an OpenClaw update. The Copilot API rejects unavailable models at request time.
Model availability depends on your GitHub plan (Free, Pro, Enterprise). If a model is rejected, try another ID.
All model costs are set to $0 — your Copilot subscription covers API usage through this provider.
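Because of the catch-all, pinning a brand-new Copilot model is purely a config change. A sketch reusing the hypothetical gpt-5.4 ID from above (the model reference format follows the other config examples in this report):

```json5
{
  agents: {
    defaults: {
      model: {
        // Unknown IDs are synthesized as dynamic model definitions;
        // the Copilot API rejects them at request time if unavailable.
        primary: "github-copilot/gpt-5.4",
        fallbacks: ["github-copilot/gpt-4o"]
      }
    }
  }
}
```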
Configuration Fields (2 fields)
From the plugin manifest configSchema:
| Field | Type | Purpose |
|---|---|---|
| discovery.enabled | boolean | Controls whether OpenClaw auto-discovers models from ambient Copilot credentials at startup. Set false to skip implicit discovery. |
The second "field" is the auth profile stored via the device-login flow (not user-editable config — managed by openclaw models auth login-github-copilot).
Environment Variable Resolution
| Priority | Variable | Notes |
|---|---|---|
| 1 | COPILOT_GITHUB_TOKEN | Highest priority, Copilot-specific |
| 2 | GH_TOKEN | GitHub CLI token (fallback) |
| 3 | GITHUB_TOKEN | Standard GitHub token (lowest) |
The device-login flow stores its token in the auth profile store and takes precedence over all env vars.
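As a sketch, the documented env-var precedence reduces to a simple nullish-coalescing chain. The helper name is hypothetical, not an actual OpenClaw export, and the real resolver additionally consults the auth profile store first:

```typescript
// Hypothetical helper illustrating the documented precedence:
// COPILOT_GITHUB_TOKEN > GH_TOKEN > GITHUB_TOKEN.
function resolveCopilotToken(
  env: Record<string, string | undefined>,
): string | undefined {
  return env.COPILOT_GITHUB_TOKEN ?? env.GH_TOKEN ?? env.GITHUB_TOKEN;
}
```

Note that `??` only skips `null`/`undefined`, so an empty-string variable would still win here; the real resolver may treat empty values differently.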
Memory Search Embeddings
The Copilot provider also serves as an embedding provider for OpenClaw's memory search:
- Auto-detected at priority 15 (after local embeddings, before paid OpenAI)
- Discovers embedding models from the Copilot `/models` endpoint
- Prefers `text-embedding-3-small`
- No separate API key needed — reuses your Copilot auth
{
agents: {
defaults: {
memorySearch: {
provider: "github-copilot",
model: "text-embedding-3-small" // optional override
}
}
}
}
Can It Run Alongside Ollama?
Yes, absolutely. OpenClaw's architecture supports multiple providers simultaneously. You can set Copilot as primary and Ollama as fallback, or vice versa:
{
agents: {
defaults: {
model: {
primary: "github-copilot/gpt-4o",
fallbacks: ["ollama/gemma4", "ollama/llama3.3"]
}
}
}
}
Does the Copilot Subscription Cover Usage?
Yes. All model costs through the Copilot provider are set to $0. The Copilot subscription covers API usage. Model availability depends on plan tier.
@openclaw/google-plugin
What It Does
The Google plugin is a multi-capability extension that provides access to the full Google Gemini ecosystem — far more than just search. It is enabled by default and registers under two provider IDs: google (API key) and google-gemini-cli (OAuth).
Source location: extensions/google/ in the OpenClaw monorepo
Key source files: api.ts, index.ts, gemini-cli-provider.ts, image-generation-provider.ts, media-understanding-provider.ts, web-search-provider.ts
Capabilities Matrix
| Capability | Supported | Details |
|---|---|---|
| Chat completions (LLM) | ✅ | Gemini 3.1 Pro, Flash, etc. |
| Image generation | ✅ | Up to 4 images/request, edit mode (5 inputs) |
| Video generation | ✅ | Veo 3.1, text-to-video, image-to-video (4-8s clips) |
| Music generation | ✅ | Lyria 3, mp3/wav, with lyrics/instrumental controls |
| Image understanding | ✅ | Analyze images via Gemini |
| Audio transcription | ✅ | Via media understanding |
| Video understanding | ✅ | Via media understanding |
| Web search (Grounding) | ✅ | Gemini Grounding for factual search |
| Thinking/reasoning | ✅ | Gemini 3.1+ with thinkingBudget |
| Gemma 4 models | ✅ | Local/cloud Gemma 4 with thinking support |
Configuration Fields (from plugin manifest)
The plugin manifest configSchema has a webSearch object with 2 fields:
| Field | Type | Purpose |
|---|---|---|
| webSearch.apiKey | string/object | Gemini API key for Google Search grounding (falls back to GEMINI_API_KEY env var) |
| webSearch.model | string | Optional Gemini model override for web search grounding |
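In config form, those two fields make up a small object like the following sketch — the exact placement of this block inside the overall OpenClaw config file is an assumption, not verified against the manifest docs:

```json5
{
  webSearch: {
    apiKey: "AIza-example-key",      // falls back to the GEMINI_API_KEY env var
    model: "gemini-3-flash-preview"  // optional grounding-model override
  }
}
```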
Authentication (2 methods)
Method 1: API Key (recommended)
- Set the `GEMINI_API_KEY` or `GOOGLE_API_KEY` environment variable
- Or run: `openclaw onboard --auth-choice gemini-api-key`
- Requires a Google AI Studio API key (free tier available)
Method 2: Gemini CLI OAuth (unofficial)
- Uses the `google-gemini-cli` provider
- Requires `gemini-cli` installed (`brew install gemini-cli` or `npm install -g @google/gemini-cli`)
- Login: `openclaw models auth login --provider google-gemini-cli --set-default`
- ⚠️ Unofficial integration — some users report Google account restrictions
Does It Require a Separate Google API Key?
Yes. The Google plugin does NOT work through Copilot. It requires its own GEMINI_API_KEY from Google AI Studio, OR OAuth through the Gemini CLI. This is completely independent of your GitHub Copilot subscription.
Plugin Manifest Contracts
From openclaw.plugin.json:
{
"contracts": {
"mediaUnderstandingProviders": ["google"],
"imageGenerationProviders": ["google"],
"musicGenerationProviders": ["google"],
"videoGenerationProviders": ["google"],
"webSearchProviders": ["gemini"]
}
}
This means the Google plugin registers itself as a provider for five distinct capability contracts — making it the most capability-rich single plugin in OpenClaw.
Example Models
| Model Ref | Use |
|---|---|
| google/gemini-3.1-pro-preview | Primary LLM chat |
| google/gemini-3-flash-preview | Fast/cheap chat |
| google/gemini-3.1-flash-image-preview | Image generation |
| google/gemini-3-pro-image-preview | Image generation (higher quality) |
| google/veo-3.1-fast-generate-preview | Video generation |
| google/lyria-3-clip-preview | Music generation |
| google/lyria-3-pro-preview | Music generation (pro) |
API Key Rotation
Google supports multiple API keys for rotation:
- `GEMINI_API_KEYS` (comma-separated)
- `GEMINI_API_KEY_1`, `GEMINI_API_KEY_2`, etc.
- `GOOGLE_API_KEY` (fallback)
- `OPENCLAW_LIVE_GEMINI_KEY` (single override)
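A sketch of how those variables could be gathered into a rotation pool — the helper name and the exact fallback semantics (using `GOOGLE_API_KEY` only when no rotation keys exist) are assumptions, not verified against the OpenClaw source:

```typescript
// Hypothetical key-collection helper for Gemini API key rotation.
function collectGeminiKeys(env: Record<string, string | undefined>): string[] {
  const keys: string[] = [];
  // Comma-separated list: GEMINI_API_KEYS="k1,k2"
  if (env.GEMINI_API_KEYS) {
    keys.push(...env.GEMINI_API_KEYS.split(",").map((k) => k.trim()));
  }
  // Numbered variables: GEMINI_API_KEY_1, GEMINI_API_KEY_2, ...
  for (let i = 1; env[`GEMINI_API_KEY_${i}`]; i++) {
    keys.push(env[`GEMINI_API_KEY_${i}`] as string);
  }
  // GOOGLE_API_KEY is documented as a fallback, so only use it
  // when no rotation keys were found (assumed precedence).
  if (keys.length === 0 && env.GOOGLE_API_KEY) keys.push(env.GOOGLE_API_KEY);
  return keys;
}
```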
@openclaw/perplexity-plugin
What It Does
The Perplexity plugin is a web search provider only — it is NOT a model/LLM provider. It gives OpenClaw agents the ability to search the web using Perplexity's API or via OpenRouter's Sonar model.
Source location: extensions/perplexity/ in the OpenClaw monorepo
Configuration Fields (3 fields)
From the plugin manifest configSchema.webSearch:
| Field | Type | Purpose |
|---|---|---|
| webSearch.apiKey | string/object | Perplexity or OpenRouter API key for web search |
| webSearch.baseUrl | string | Optional base URL override for Perplexity/OpenRouter endpoint |
| webSearch.model | string | Optional Sonar/OpenRouter model override |
Two Search Modes (Auto-Selected by Key Prefix)
| Key Prefix | Transport | Features |
|---|---|---|
| pplx- | Native Perplexity Search API | Structured results, domain/language/date filters |
| sk-or- | OpenRouter (Sonar) | AI-synthesized answers with citations |
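The prefix dispatch can be sketched as follows; the type and function names are illustrative, not OpenClaw internals:

```typescript
// Hypothetical sketch of key-prefix-based mode selection.
type SearchMode = "perplexity-native" | "openrouter-sonar";

function selectSearchMode(apiKey: string): SearchMode {
  if (apiKey.startsWith("pplx-")) return "perplexity-native";
  if (apiKey.startsWith("sk-or-")) return "openrouter-sonar";
  throw new Error("Unrecognized web-search API key prefix");
}
```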
Native API Filtering (pplx- keys only)
| Filter | Description | Example |
|---|---|---|
| Country | 2-letter code | us, de, jp |
| Language | ISO 639-1 | en, fr, zh |
| Date range | Recency window | day, week, month, year |
| Domain filters | Allow/deny list (max 20) | example.com |
| Content budget | Token limits | max_tokens, max_tokens_per_page |
Comparison with Google Plugin Web Search
| Feature | Google (Gemini Grounding) | Perplexity |
|---|---|---|
| Type | Gemini-powered grounding | Dedicated search API |
| Auth | GEMINI_API_KEY | PERPLEXITY_API_KEY or OPENROUTER_API_KEY |
| Filtering | Limited | Rich (country, language, date, domains) |
| Output | Integrated into LLM response | Structured results OR synthesized answers |
| Cost | Included in Gemini API | Separate Perplexity subscription or OpenRouter credits |
| Independence | Requires Google API key | Works with Perplexity OR OpenRouter key |
Does It Require a Separate Subscription?
Yes. Perplexity requires either:
- A Perplexity API key (`pplx-` prefix) — requires a Perplexity account/plan
- An OpenRouter API key (`sk-or-` prefix) — uses your OpenRouter balance
Neither is included in a GitHub Copilot subscription.
Plugin Synergy and Multi-Provider Routing
How OpenClaw Routes Between Providers
OpenClaw uses a primary + fallbacks model configuration per agent. This is NOT round-robin — it's an ordered failover chain:
{
agents: {
defaults: {
model: {
primary: "github-copilot/gpt-4o",
fallbacks: ["ollama/gemma4", "google/gemini-3-flash-preview"]
}
}
}
}
Routing rules:
- OpenClaw always tries the `primary` model first
- On failure (rate limit, timeout, context overflow, overload), it falls back to the next model in `fallbacks`
- Each provider plugin can classify its own error types via `classifyFailoverReason`
- Cooldown probes prevent hammering a failed provider
- Session-override persistence lets a specific conversation stick to a provider mid-session
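The try-in-order semantics described above can be sketched minimally. This is not OpenClaw's actual implementation, which also classifies errors and applies cooldowns:

```typescript
// Minimal sketch of an ordered failover chain: the primary is simply
// models[0]; each failure advances to the next fallback.
async function completeWithFailover(
  models: string[],
  call: (model: string) => Promise<string>,
): Promise<string> {
  let lastError: unknown = new Error("no models configured");
  for (const model of models) {
    try {
      return await call(model); // first success wins
    } catch (err) {
      lastError = err; // e.g. rate limit or timeout: try the next fallback
    }
  }
  throw lastError; // all models in the chain failed
}
```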
Can You Use Multiple Providers Simultaneously?
Yes, but with nuance:
- LLM inference: One model handles each request, but fallback chains span providers. You can set `primary: "github-copilot/claude-sonnet-4.6"` with `fallbacks: ["ollama/gemma4"]` — Copilot handles requests when available, Ollama takes over when it can't.
- Web search: Perplexity and Google (Gemini Grounding) are independent web search providers. You configure which one is active via `tools.web.search.provider`.
- Image generation: Google and OpenAI both register as image providers. You set `imageGenerationModel.primary` to choose which handles image requests.
- Embeddings: Copilot, Ollama, and OpenAI all register as embedding providers. Auto-detection tries them in priority order.
Recommended Synergy Configuration for Guillaume
Given a Copilot subscription + local Ollama + Google API key:
{
agents: {
defaults: {
// LLM: Copilot primary, Ollama fallback (free local backup)
model: {
primary: "github-copilot/gpt-4o",
fallbacks: ["ollama/gemma4"]
},
// Image gen: Google Gemini
imageGenerationModel: {
primary: "google/gemini-3.1-flash-image-preview"
},
// Memory embeddings: Copilot (free with subscription)
memorySearch: {
provider: "github-copilot"
}
}
},
tools: {
web: {
search: {
// Choose one: "perplexity", "gemini", or "ollama"
provider: "gemini"
}
}
}
}
How Cloud Plugins Complement Local Ollama
| Capability | Cloud (Copilot/Google) | Local (Ollama) |
|---|---|---|
| LLM quality | State-of-art (GPT-4o, Claude 4.6, Gemini 3.1 Pro) | Good but smaller models (Gemma 4, Llama 3.3) |
| Cost | $0 via Copilot sub / API key costs for Google | $0 (local compute) |
| Privacy | Data sent to cloud | Data stays local |
| Availability | Requires internet | Works offline |
| Latency | Network-dependent | Local, fast for small models |
| Image/video/music gen | Google provides these | Not available via Ollama |
| Web search | Gemini Grounding, Perplexity | Ollama Web Search (basic) |
| Embeddings | Copilot (text-embedding-3-small) | Ollama (nomic-embed-text, auto-pulled) |
Optimal pattern: Use Copilot/Google for quality-critical tasks and capabilities only available in the cloud (image gen, video gen, web search). Use Ollama as offline fallback and for privacy-sensitive work.
Hermes and Cross-Claw Compatibility
What Is Hermes?
Hermes Agent is a separate, Python-based AI agent framework built by Nous Research. It is NOT a variant or fork of OpenClaw. Key differences:
| Aspect | OpenClaw | Hermes Agent |
|---|---|---|
| Language | Node.js / TypeScript | Python |
| Philosophy | Gateway/orchestrator | Self-improving agent |
| Skills | Human-authored Markdown (SKILL.md) | Auto-generated from successful workflows |
| Memory | File-backed (SOUL.md, HEARTBEAT.md) | Multi-level (persistent notes, SQL, LLM summarization) |
| Learning | Static — no self-improvement | Continuous via replay, RL, skill extraction |
| Channels | 25+ platforms native | ~7 platforms |
| Plugin system | npm/manifest-driven | Python scripts + natural language skills |
| Security | File-based auth | Hardened container isolation, zero telemetry |
Plugin Cross-Compatibility
Plugins are NOT directly cross-compatible at runtime. The architectures are fundamentally different (Node.js vs Python, manifest-driven vs auto-learned).
However, Hermes provides migration tools:
- `hermes claw migrate` — imports OpenClaw settings, memories, skills, and API keys
- Markdown-defined skills transfer cleanly
- Complex Node.js/JavaScript plugins require manual Python rewrites
- Config files (SOUL.md, MEMORY.md, AGENTS.md) have mapped destinations in Hermes
Does Hermes Use the Same Plugin System?
No. OpenClaw plugins are npm packages with openclaw.plugin.json manifests that register capabilities (providers, channels, tools) via the Plugin SDK. Hermes skills are Python-based and auto-generated by the agent from its own task execution patterns. The two systems solve similar problems with completely different architectures.
Running Both Together
Many advanced users run both:
- OpenClaw as the messaging gateway/orchestrator (channel hub)
- Hermes as a specialist agent that learns and improves at specific tasks
- They can share API keys/credentials but maintain separate runtime environments
Evidence Quality
| Topic | Confidence | Source Quality |
|---|---|---|
| Copilot provider architecture | ✅ HIGH | Direct source code + official docs from openclaw/openclaw repo |
| Copilot model list | ✅ HIGH | Extracted from models-defaults.ts source code |
| Copilot auth flow | ✅ HIGH | Official docs/providers/github-copilot.md |
| Google plugin capabilities | ✅ HIGH | Official docs/providers/google.md + plugin manifest |
| Google config fields | ✅ HIGH | Directly from openclaw.plugin.json manifest |
| Perplexity plugin config | ✅ HIGH | Directly from openclaw.plugin.json manifest |
| Perplexity search modes | ✅ HIGH | Official docs/providers/perplexity-provider.md |
| Model failover/routing | ⚠️ MEDIUM | Inferred from config examples + partial doc; full failover doc not fetched |
| Hermes comparison | ⚠️ MEDIUM | Multiple comparison articles; no direct Hermes source code verified |
| Plugin cross-compatibility | ⚠️ MEDIUM | Based on migration docs referenced in search results |
| Copilot plan model availability | ⚠️ LOW-MED | Docs state "depends on your plan" without enumerating which plan gets which models |
Contradictions Found
- "2 fields" for Copilot config — The Copilot plugin manifest only shows 1 user-configurable field (`discovery.enabled`). The second "field" may refer to the auth profile managed via CLI, or to a UI hint (the `discovery` parent object). The user's source claiming "2 fields" may have been counting differently.
- "2 fields" for Google config — The Google plugin manifest shows 2 fields under `webSearch` (`apiKey`, `model`). However, the overall Google plugin has much broader config, including auth choices for both API key and OAuth. The "2 fields" matches the webSearch config specifically.
- "3 fields" for Perplexity config — Confirmed: `webSearch.apiKey`, `webSearch.baseUrl`, `webSearch.model`. Matches exactly.
- Web search claims Copilot gives Claude/Gemini — One generic web search result said Copilot does NOT provide Claude or Gemini access. This is contradicted by the actual OpenClaw source code, which explicitly lists `claude-sonnet-4.6` and `claude-sonnet-4.5` in the Copilot model defaults and auto-selects the `anthropic-messages` transport for Claude IDs. The confusion likely arises because the web search was answering about GitHub Copilot in VS Code, not the GitHub Copilot API accessed through OpenClaw.
- "OpenClaw" as an AI model — One search result confused OpenClaw (the agent framework) with an AI model name. OpenClaw is a framework, not a model.
Sources
Primary Sources (Official Documentation + Source Code)
- OpenClaw GitHub Repository: https://github.com/openclaw/openclaw (357K+ stars)
  - `docs/providers/github-copilot.md` — Copilot provider setup guide
  - `docs/providers/google.md` — Google/Gemini provider setup guide
  - `docs/providers/perplexity-provider.md` — Perplexity provider setup guide
  - `docs/providers/ollama.md` — Ollama provider setup guide
  - `docs/providers/index.md` — Provider directory
  - `docs/concepts/model-providers.md` — Model provider overview with routing config
  - `extensions/github-copilot/openclaw.plugin.json` — Copilot plugin manifest
  - `extensions/github-copilot/models-defaults.ts` — Default Copilot model catalog
  - `extensions/github-copilot/models.ts` — Transport selection + forward-compat logic
  - `extensions/google/openclaw.plugin.json` — Google plugin manifest
  - `extensions/perplexity/openclaw.plugin.json` — Perplexity plugin manifest
Secondary Sources (Comparison Articles)
- OpenClaw vs Hermes Agent — A Deep Technical Comparison: https://www.vibesparking.com/en/blog/ai/openclaw/2026-04-09-openclaw-vs-hermes-agent-deep-comparison/
- Hermes Agent Migration from OpenClaw: https://hermes-agent.nousresearch.com/docs/guides/migrate-from-openclaw/
- OpenClaw vs Hermes Agent (Petronella Tech): https://petronellatech.com/blog/openclaw-vs-hermes-agent-2026
- Hermes Agent vs OpenClaw (a2a-mcp.org): https://a2a-mcp.org/blog/hermes-agent-vs-openclaw
- Hermes Agent vs OpenClaw (Turing Post): https://www.turingpost.com/p/hermes
- Community Plugins docs: https://docs.openclaw.ai/plugins/community
- OpenClaw npm package: https://www.npmjs.com/package/openclaw