Use this adapter to run LangChain as the backend for your copilot.

import { ChatOpenAI } from "@langchain/openai";
import { LangChainAdapter } from "@copilotkit/runtime";

// Inside your endpoint handler, where `copilotKit` is your CopilotRuntime
// instance and `req` is the incoming request:
return copilotKit.response(
  req,
  new LangChainAdapter({
    chainFn: async ({ messages, tools }) => {
      // Forward the chat history and tool definitions to the model and stream the reply.
      const model = new ChatOpenAI({ modelName: "gpt-4o" });
      return model.stream(messages, { tools });
    },
  })
);

The async chainFn handler can return any of the following (a sketch of the non-streaming variants follows this list):

  • a simple string response
  • a LangChain stream (IterableReadableStream)
  • a LangChain BaseMessageChunk object
  • a LangChain AIMessage object
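
For the non-streaming return types, here is a rough sketch, assuming the same messages and tools fields of ChainFnParameters used in the example above; the reply string and model name are placeholders:

import { ChatOpenAI } from "@langchain/openai";
import { LangChainAdapter } from "@copilotkit/runtime";

// Resolving to a plain string (a supported return type): the string is the assistant reply.
const stringAdapter = new LangChainAdapter({
  chainFn: async () => "Sorry, I can only help with billing questions.",
});

// Resolving to a message object: invoke() returns the full, non-streamed model reply
// (an AIMessage / message chunk), which is also a supported return type.
const messageAdapter = new LangChainAdapter({
  chainFn: async ({ messages, tools }) => {
    const model = new ChatOpenAI({ modelName: "gpt-4o" });
    return model.invoke(messages, { tools });
  },
});

Streaming (as in the first example) is generally preferable for chat responsiveness; the non-streaming forms suit fixed or fully precomputed replies.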

Constructor

  chainFn: (parameters: ChainFnParameters) => Promise<LangChainReturnType>   (required)
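
As a sketch of an alternative chainFn that binds the forwarded tool definitions onto the model with LangChain's bindTools helper rather than passing them as call options (model name and package paths as in the examples above; whether you bind or pass tools is a stylistic choice):

import { ChatOpenAI } from "@langchain/openai";
import { LangChainAdapter } from "@copilotkit/runtime";

const serviceAdapter = new LangChainAdapter({
  chainFn: async ({ messages, tools }) => {
    // Bind the tool definitions to the model, then stream the reply.
    const model = new ChatOpenAI({ modelName: "gpt-4o" }).bindTools(tools);
    return model.stream(messages);
  },
});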

process(request: CopilotRuntimeChatCompletionRequest)

  request: CopilotRuntimeChatCompletionRequest   (required)