Use this adapter to run LangChain as your backend.

return copilotKit.response(
  req,
  new LangChainAdapter(async (forwardedProps) => {
    // forwardedProps carries the chat messages and tool definitions forwarded from the frontend
    const model = new ChatOpenAI({ modelName: "gpt-4o" });
    return model.stream(forwardedProps.messages, {
      tools: forwardedProps.tools,
    });
  })
);

The async handler function can return any of the following (see the sketch after this list):

  • a simple string response
  • a LangChain stream (IterableReadableStream)
  • a LangChain BaseMessageChunk object
  • a LangChain AIMessage object
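
For example, a handler that does not need streaming can return a single message. The minimal sketch below reuses the same ChatOpenAI model as the example above and assumes the @langchain/openai package; invoke() resolves to one chat message, which the adapter accepts directly.

import { ChatOpenAI } from "@langchain/openai";

const adapter = new LangChainAdapter(async (forwardedProps) => {
  const model = new ChatOpenAI({ modelName: "gpt-4o" });
  // invoke() resolves to a single message rather than a token stream
  return model.invoke(forwardedProps.messages, {
    tools: forwardedProps.tools,
  });
});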

Constructor

constructor(private chainFn: (forwardedProps: any) => Promise<LangChainReturnType>)

To use LangChain as a backend, provide the adapter with a handler function that contains your custom LangChain logic (a fuller sketch follows the parameter listing below).

chainFn: (forwardedProps: any) => Promise<LangChainReturnType> (required)
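
As a sketch of a more involved chainFn, the handler can run a LangChain Expression Language chain and return its stream, which is an IterableReadableStream. The system prompt, model choice, and import paths below (including the CopilotKit backend package for LangChainAdapter) are illustrative assumptions, not a prescribed setup.

import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate, MessagesPlaceholder } from "@langchain/core/prompts";
// Assumed import path; use the CopilotKit backend package installed in your project
import { LangChainAdapter } from "@copilotkit/backend";

const adapter = new LangChainAdapter(async (forwardedProps) => {
  const prompt = ChatPromptTemplate.fromMessages([
    ["system", "You are a concise assistant."], // illustrative system prompt
    new MessagesPlaceholder("messages"),        // the forwarded chat history
  ]);
  const model = new ChatOpenAI({ modelName: "gpt-4o" });

  // Bind the forwarded tool definitions to the model, pipe the prompt into it,
  // and stream the result: chain.stream() yields an IterableReadableStream.
  const chain = prompt.pipe(model.bind({ tools: forwardedProps.tools }));
  return chain.stream({ messages: forwardedProps.messages });
});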

getResponse(forwardedProps: any)

forwardedProps: any (required)