CoAgents (Early Access)

Getting started with CoAgents

LangGraph agents are written in Python, using the LangGraph Python SDK and an API runtime server such as FastAPI. This guide presumes you’re familiar with LangGraph and FastAPI, but if you’re new to either, you can find a detailed setup walkthrough for our bootstrap project on our GitHub repo.

To integrate LangGraph-based agents into your CopilotKit application, you’ll need the copilotkit Python package:

pip install copilotkit --extra-index-url https://copilotkit.gateway.scarf.sh/simple/

The CopilotKit integration is a simple endpoint, powered by any LangGraph agent, that you can add to your FastAPI application:

import uvicorn
from fastapi import FastAPI
from copilotkit.integrations.fastapi import add_fastapi_endpoint
from copilotkit import CopilotKitSDK, LangGraphAgent
from your_agent_package import research_agent
 
app = FastAPI()
sdk = CopilotKitSDK(
  agents=[
    LangGraphAgent(
      name="research_agent",
      # the description helps CopilotKit pick this agent when a matching skill is needed, e.g. researching
      description="Research the web.",
      agent=research_agent,
    )
  ]
)
 
add_fastapi_endpoint(app, sdk, "/copilotkit")
 
def main():
  """Run the uvicorn server."""
  uvicorn.run("your_package:app", host="127.0.0.1", port=8000, reload=True)

if __name__ == "__main__":
  main()

Self-hosting CopilotKit in your app

Step 1: Set up Copilot Runtime Endpoint

For now, to use CoAgents, you’ll need to self-host the Copilot Runtime. Soon CoAgents will be available through Copilot Cloud.

Copilot Runtime is designed to run as a handler for HTTP endpoints (typically /copilotkit or /api/copilotkit). We provide plug-and-play support for many common frameworks. For more information, check out the Self Hosting documentation.

Create a new route to handle the /api/copilotkit endpoint.

app/api/copilotkit/route.ts
import {
  CopilotRuntime,
  OpenAIAdapter,
  copilotRuntimeNextJSAppRouterEndpoint,
} from "@copilotkit/runtime";
import OpenAI from "openai";
import { NextRequest } from "next/server";
 
const openai = new OpenAI();
const serviceAdapter = new OpenAIAdapter({ openai });
 
const runtime = new CopilotRuntime();
 
export const POST = async (req: NextRequest) => {
  const { handleRequest } = copilotRuntimeNextJSAppRouterEndpoint({
    runtime,
    serviceAdapter,
    endpoint: "/api/copilotkit",
  });
 
  return handleRequest(req);
};

Your Copilot Runtime endpoint should be available at http://localhost:{port}/api/copilotkit.

To bring your own LLM via a service adapter, see the Service Adapters documentation.

Step 2: Point remote actions at the FastAPI endpoint you created with the CopilotKit Python SDK:

app/api/copilotkit/route.ts
// ...
// in your CopilotRuntime initialization from above, add a remote action
const runtime = new CopilotRuntime({
  remoteActions: [
    {
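      // BASE_URL points at the FastAPI server from the Python SDK step, e.g. http://localhost:8000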
      url: `${BASE_URL}/copilotkit`,
    },
  ],
});
 
// ...

Integrating agents into your CopilotKit frontend app

By default, CoAgents are like skills you provide to the CopilotKit SDK. CopilotKit takes care of starting a specific agent when it detects that the agent is needed; all you need to do is provide your agents to CopilotKit on the Python side, as in the sketch below.
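For example, each agent’s description is what lets CopilotKit route a request to the right agent when several are registered. A minimal sketch extending the SDK setup from above; the second agent (email_agent) is hypothetical:

sdk = CopilotKitSDK(
  agents=[
    LangGraphAgent(
      name="research_agent",
      description="Research the web.",
      agent=research_agent,
    ),
    LangGraphAgent(
      # hypothetical second agent: a contrasting description lets
      # CopilotKit route email-related requests here instead
      name="email_agent",
      description="Draft and send emails.",
      agent=email_agent,
    ),
  ]
)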

Sometimes, you’ll want a specific agent to run even before your users take action, ensuring that the user interacts with that agent directly. For these scenarios, you can “lock” a specific agent in your CopilotKit frontend by passing in the agent prop:

For more information on agent-lock mode and router mode, see Router Mode and Agent Lock.

<CopilotKit
  runtimeUrl="/api/copilotkit"
  agent="customer_service_agent"
>
  {/* your application */}
</CopilotKit>

You might also want to start your agent programmatically. For this, you can use the start and stop functions returned by the useCoAgent hook:

const { start, stop } = useCoAgent({ name: "car_rental_agent" });
 
start(); // or stop(), to halt the agent

Streaming state updates

CopilotKit will automatically sync state between your LangGraph agent and the frontend when entering or exiting a node.
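Concretely, the keys of your graph’s state schema are what get synced. A minimal sketch, with outline as a hypothetical custom field:

from typing import TypedDict

class AgentState(TypedDict):
  # chat history, kept in sync with the chat window
  messages: list
  # hypothetical custom field; the frontend can read and update it
  # via the useCoAgent hook described below
  outline: str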

For more detailed user feedback or observability, you can also configure CopilotKit to stream messages, LLM state updates, and tool calls from your LangGraph agent by setting a few options in our Python SDK.

from copilotkit.langchain import copilotkit_customize_config

config = copilotkit_customize_config(
  config,
  # this will stream messages back to the chat window
  emit_messages=True,
  # this will stream tool calls to CopilotKit (call frontend actions)
  emit_tool_calls=True,
  # this will stream tool calls *as if* they were state
  emit_intermediate_state=[
    {
      "state_key": "outline",
      "tool": "set_outline",
      "tool_argument": "outline",
    }
  ]
)
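For context, this call typically lives inside a LangGraph node, wrapping the RunnableConfig that LangGraph passes in before you invoke the model. A minimal sketch; the node name, model choice, and state keys are placeholders:

from copilotkit.langchain import copilotkit_customize_config
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4o")

async def outline_node(state, config):
  # enable message streaming for this node's LLM call
  config = copilotkit_customize_config(config, emit_messages=True)
  # passing the customized config through is what makes tokens stream
  response = await model.ainvoke(state["messages"], config)
  return {"messages": state["messages"] + [response]}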

Read more about state streaming

Working with agent state from the frontend

CopilotKit provides React hooks for accessing your agent’s state in the frontend, allowing you to get live updates in response to state changes, as well as to update the shared state from anywhere in your application.

We provide two hooks to access the shared agent state:

  1. useCoAgent to access the state of the agent from anywhere in your application.
const { state: carRentalState, setState: setCarRentalState } = useCoAgent({
  name: "car_rental_agent",
});
  2. useCoAgentAction to access the state and render it in the chat window.
useCoAgentAction({
  name: "car_rental_agent",
  // nodeName is optional: omit it to render for the agent as a whole
  nodeName: "confirm_node",
  // render can be a function or a string, just like useCopilotAction
  render: "Confirming rental...",
});

Read more about sharing agent state

Agent Q&A

One of the key ways agents differ from normal chatbots is their ability to interrupt execution and ask for user feedback, using standard LangGraph mechanisms (interrupts, the END node, etc.).

Most LangGraph agents should work out-of-the-box with CopilotKit, requiring only minimal configuration.

CoAgents receive the whole message history as context, even in nodes that are not directly exposed to the user, so they can take the user’s feedback into account and provide precisely tailored actions and responses.
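As a reference point, here is a minimal sketch of a graph that pauses for user input via LangGraph’s interrupt_before; the node names and state are hypothetical, and a recent LangGraph version is assumed:

from typing import TypedDict
from langgraph.graph import StateGraph, END
from langgraph.checkpoint.memory import MemorySaver

class AgentState(TypedDict):
  messages: list

def research(state: AgentState):
  # ... call your model and tools here ...
  return state

def ask_user(state: AgentState):
  # the question this node emits appears in the chat window
  return state

workflow = StateGraph(AgentState)
workflow.add_node("research", research)
workflow.add_node("ask_user", ask_user)
workflow.set_entry_point("research")
workflow.add_edge("research", "ask_user")
workflow.add_edge("ask_user", END)

# pausing before ask_user hands control to the user; their reply joins
# the message history before the graph resumes
graph = workflow.compile(
  checkpointer=MemorySaver(),
  interrupt_before=["ask_user"],
)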

Read more about agent Q&A

Want to Run CoAgents in Production?

We offer tailored solutions for Enterprise customers. We'd be happy to support you with custom use cases and with deploying and scaling CoAgents in production.
