I'm writing a small JavaScript application using the LangChain library. I have the following code snippet:
/* LangChain Imports */
import { OpenAI } from "langchain/llms/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";

// ========================================================================= //
// ============= Use LangChain to send request to OpenAI API ============== //
// ========================================================================= //
const openAILLMOptions = {
  modelName: chatModel.value,
  openAIApiKey: decryptedString,
  temperature: parseFloat(temperatureValue.value),
  topP: parseFloat(topP.value),
  maxTokens: parseInt(maxTokens.value),
  stop: stopSequences.value.length > 0 ? stopSequences.value : null,
  streaming: true,
};

const model = new OpenAI(openAILLMOptions);
const memory = new BufferMemory();
const chain = new ConversationChain({ llm: model, memory: memory });

try {
  const response = await chain.call(
    { input: content.value, signal: signal },
    undefined,
    [
      {
        handleLLMNewToken(token) {
          process.stdout.write(token);
        },
      },
    ]
  );
  // handle the response
} catch (error) {
  // handle errors
}
This doesn't work (I've tried it both in TypeScript and without typing the token handler). I've searched various forums, but they either implement streaming in Python or their solutions are unrelated to this problem. In short, I can successfully get a response from OpenAI through LangChain's ConversationChain() call, but I can't stream the response. Is there a solution?
Answer:
For reference, here is how I got streaming working:
/* LangChain Imports */
import { ChatOpenAI } from "langchain/chat_models/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
import { ChatPromptTemplate, MessagesPlaceholder } from "langchain/prompts";

const openAILLMOptions = {
  modelName: chatModel.value,
  cache: true,
  openAIApiKey: openAIDecryptedString,
  temperature: parseFloat(temperatureValue.value),
  topP: parseFloat(topP.value),
  maxTokens: parseInt(maxTokens.value),
  stop: stopSequences.value.length > 0 ? stopSequences.value : null,
  streaming: true,
  verbose: true,
};

const chat = new ChatOpenAI(openAILLMOptions);

const chatPrompt = ChatPromptTemplate.fromMessages([
  ["system", systemPrompt.value],
  new MessagesPlaceholder("history"),
  ["human", content.value],
]);

const chain = new ConversationChain({
  memory: new BufferMemory({ returnMessages: true, memoryKey: "history" }),
  prompt: chatPrompt,
  llm: chat,
});

await chain.call({
  input: content.value,
  signal: signal,
  callbacks: [
    {
      handleLLMNewToken(token) {
        aiResponse.value += token;
      },
    },
  ],
});
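The key differences from the code in the question: streaming: true is set on the model, and the callbacks array travels inside the call options object alongside input and signal, rather than as an extra positional argument to chain.call(). Below is a minimal, self-contained sketch of the same pattern with hypothetical placeholder values (a hard-coded model name, OPENAI_API_KEY read from the environment, a fixed prompt) standing in for the reactive .value bindings above; exact import paths and call signatures can vary across langchain versions:

import { ChatOpenAI } from "langchain/chat_models/openai";
import { BufferMemory } from "langchain/memory";
import { ConversationChain } from "langchain/chains";
import { ChatPromptTemplate, MessagesPlaceholder } from "langchain/prompts";

// Placeholder settings -- substitute your own key and model name.
const chat = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  openAIApiKey: process.env.OPENAI_API_KEY,
  temperature: 0.7,
  streaming: true, // without this, handleLLMNewToken is never invoked
});

const chatPrompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant."],
  new MessagesPlaceholder("history"),
  ["human", "{input}"],
]);

const chain = new ConversationChain({
  memory: new BufferMemory({ returnMessages: true, memoryKey: "history" }),
  prompt: chatPrompt,
  llm: chat,
});

let streamed = "";
await chain.call({
  input: "Tell me a one-line joke.",
  // Callbacks ride along in the call options, mirroring the answer above.
  callbacks: [
    {
      handleLLMNewToken(token) {
        streamed += token; // append each token as it arrives
      },
    },
  ],
});
console.log(streamed);

Accumulating tokens into a string (or a reactive ref such as aiResponse.value above) lets the UI render partial output while the full completion is still in flight.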