Next.js Edge Function: Module not found error ('path') and Internal Server Error when using the OpenAI integration

Description

I am running into a problem where the `path` module cannot be resolved. It happens when I try to send a message from the chat box.

Error message

Module not found: Can't resolve 'path'
https://nextjs.org/docs/messages/module-not-found

Import trace for requested module:
./node_modules/dotenv/config.js
./src/lib/db/index.ts
./src/app/api/chat/route.ts
./node_modules/next/dist/build/webpack/loaders/next-edge-app-route-loader/index.js?absolutePagePath=C%3A%5CUsers%5CDell%5C...%5Csummarize-my-pdf-ai%5Csrc%5Capp%5Capi%5Cchat%5Croute.ts&page=%2Fapi%2Fchat%2Froute&... (long encoded loader query string omitted)

Environment

  • Node.js version: v18.17.1
  • Framework: Next.js
  • OS: Windows 11

Code

route.ts

import { getContext } from "@/lib/context";
import { db } from "@/lib/db";
import { chats, messages as _messages } from "@/lib/db/schema";
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
import { eq } from "drizzle-orm";
import { NextResponse } from "next/server";
import { Message } from "ai/react";

export const runtime = "edge";

export async function POST(req: Request) {
  console.log("Called api");
  try {
    const { messages, chatId } = await req.json();
    console.log("Messages:", messages);
    console.log("Chat ID:", chatId);

    const _chats = await db.select().from(chats).where(eq(chats.id, chatId));
    // Log retrieved chats
    console.log("_chats:", _chats);
    if (_chats.length !== 1) {
      return NextResponse.json({ error: "Chat not found" }, { status: 404 });
    }

    const fileKey = _chats[0].fileKey;
    const lastMessage = messages[messages.length - 1];
    // Log fileKey and lastMessage
    console.log("File Key:", fileKey);
    console.log("Last Message:", lastMessage);

    const context = await getContext(lastMessage.content, fileKey);
    console.log("Context:", context);

    const prompt = {
      role: "system",
      content: `AI assistant is a brand new, powerful, human-like artificial intelligence.
      The traits of AI include expert knowledge, helpfulness, cleverness, and articulateness.
      AI is a well-behaved and well-mannered individual.
      AI is always friendly, kind, and inspiring, and he is eager to provide vivid and thoughtful responses to the user.
      AI has the sum of all knowledge in their brain, and is able to accurately answer nearly any question about any topic in conversation.
      AI assistant is a big fan of Pinecone and Vercel.
      START CONTEXT BLOCK
      ${context}
      END OF CONTEXT BLOCK
      AI assistant will take into account any CONTEXT BLOCK that is provided in a conversation.
      If the context does not provide the answer to the question, the AI assistant will say, "I'm sorry, but I don't know the answer to that question".
      AI assistant will not apologize for previous responses, but instead will indicate that new information was gained.
      AI assistant will not invent anything that is not drawn directly from the context.`,
    };

    const response = await streamText({
      model: openai("gpt-4o-mini"),
      messages: [
        prompt,
        ...messages.filter((message: Message) => message.role === "user"),
      ],
    });
    return response.toDataStreamResponse();
  } catch (error) {
    console.log(error);
    return NextResponse.json(
      { error: "Internal Server Error" },
      { status: 500 }
    );
  }
}

ChatComponent.tsx

"use client";

import React from "react";
import { Input } from "./ui/input";
import { useChat } from "ai/react";
import { Button } from "./ui/button";
import { SendIcon } from "lucide-react";
import MessageList from "./MessageList";

type Props = { chatId: number };

const ChatComponent = ({ chatId }: Props) => {
  console.log("Chat ID in ChatComponent:", chatId);
  const { input, handleInputChange, handleSubmit, messages } = useChat({
    api: "/api/chat",
    body: {
      chatId,
    },
  });

  // React.useEffect(() => {
  //   const messageContainer = document.getElementById("message-container");
  //   if (messageContainer) {
  //     messageContainer.scrollTo({
  //       top: messageContainer.scrollHeight,
  //       behavior: "smooth",
  //     });
  //   }
  // }, [messages]);

  return (
    <div
      className="relative max-h-screen overflow-scroll"
      id="message-container"
    >
      {/* Header */}
      <div className="sticky top-0 inset-x-0 p-2 bg-white h-fit">
        <h3 className="text-xl font-bold">Chat</h3>
      </div>

      {/* Message List */}
      <MessageList messages={messages} />

      <form
        onSubmit={handleSubmit}
        className="sticky bottom-0 px-2 py-4 inset-x-0 bg-white"
      >
        <div className="flex">
          <Input
            value={input}
            onChange={handleInputChange}
            placeholder="Ask any question..."
            className="w-full"
          />
          <Button className="bg-gradient-to-r from-sky-400 to-blue-500 ml-2">
            <SendIcon className="h-4 w-4" />
          </Button>
        </div>
      </form>
    </div>
  );
};

export default ChatComponent;

Logs

PS C:\Users\Dell\OneDrive\Desktop\100xDevs\summarize-my-pdf-ai> node -v
v18.17.1

Console output:

GET /chat/9?_rsc=a12k2 200 in 227ms
GET /chat/8?_rsc=18zah 200 in 383ms
○ Compiling /api/chat ...
⨯ ./node_modules/dotenv/lib/main.js:2:1
Module not found: Can't resolve 'path'
https://nextjs.org/docs/messages/module-not-found

Import trace for requested module:
./node_modules/dotenv/config.js
./src/lib/db/index.ts
./src/app/api/chat/route.ts
./node_modules/next/dist/build/webpack/loaders/next-edge-app-route-loader/index.js?... (long encoded loader query string omitted)

Working code

With the simplified code below, I am able to get a response in the chat box:

import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";
import { NextResponse } from "next/server";

export const runtime = "edge";

export async function POST(req: Request) {
  try {
    const { messages } = await req.json();
    const response = await streamText({
      model: openai("gpt-4o-mini"),
      messages,
    });
    return response.toDataStreamResponse();
  } catch (error) {
    console.log(error);
    return NextResponse.json(
      { error: "Internal Server Error" },
      { status: 500 }
    );
  }
}

This should give clear context for the problem, and it suggests the issue lies in configuration or dependencies rather than in the core chat-response functionality.

Steps to reproduce

  1. Run the application with the route.ts and ChatComponent.tsx code provided above.
  2. Observe the error when starting or compiling the project.

I tried to implement the chat feature in my Next.js application using the streamText function with the openai model, handling the POST request with the same simplified handler shown above under "Working code".

I expected this implementation to handle the incoming messages correctly and return a valid response to the chat box.

What is the actual result?

The code runs without errors and returns a response in the simplified setup, but in the fuller implementation that adds extra logic (database access and context handling) I run into problems: I get a 500 Internal Server Error and the response is not what I expect. The simplified code works as intended and returns the expected chat response.
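One thing that makes this harder to debug is that the catch block collapses every failure into a generic 500. A small helper like the hypothetical errorBody below (a sketch, not the original code) would surface the underlying error message during development while keeping the generic message in production:

```typescript
// Hypothetical helper for a route handler's catch block: expose the real
// error message in development instead of a bare "Internal Server Error".
function errorBody(error: unknown, isDev: boolean): { error: string } {
  // Errors caught in TypeScript are typed `unknown`; narrow before reading.
  const message = error instanceof Error ? error.message : String(error);
  // In production, keep the generic message to avoid leaking internals.
  return { error: isDev ? message : "Internal Server Error" };
}
```

In the route this could be used as `NextResponse.json(errorBody(error, process.env.NODE_ENV !== "production"), { status: 500 })`, so the 500 response itself tells you what failed.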



Answer:

I removed this line and it worked!

export const runtime = "edge";

This makes sense given the import trace: the route pulls in dotenv (via src/lib/db/index.ts), and dotenv uses Node's built-in path module. The Edge runtime does not provide Node built-ins, so any route that transitively imports dotenv fails to compile there. Dropping the runtime declaration lets the route run on the default Node.js runtime, where path is available.
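If keeping the route on the Edge runtime matters, an alternative worth trying (a sketch under the assumption that the only thing dotenv was doing here is populating process.env) is to drop the dotenv/config import from src/lib/db/index.ts entirely, since Next.js loads .env files on its own, and read variables directly from process.env. A hypothetical helper such as getRequiredEnv also fails fast with a clear message instead of a vague 500 later:

```typescript
// Hypothetical helper: read a required environment variable without dotenv.
// Next.js populates process.env from .env files itself, so this avoids the
// Node-only `path` dependency that breaks the Edge runtime build.
function getRequiredEnv(name: string): string {
  const value = process.env[name];
  if (value === undefined || value === "") {
    // Fail fast with a descriptive error instead of a generic 500 later.
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example (variable name is illustrative):
// const connectionString = getRequiredEnv("DATABASE_URL");
```

Whether the rest of the database layer (drizzle-orm and its driver) is Edge-compatible is a separate question; if it is not, staying on the Node.js runtime, as in the accepted fix, is the simpler path.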
