The requests we typically make in an application are all-or-nothing: we send the request and wait for the complete response. With a streaming mechanism, we can receive the information as it's generated, which gives us a quicker initial response and sets us up for different user experiences, including conversational-style messaging.
Using Edge Functions in Next.js, we can stream the response back to the application and create an experience similar to the official ChatGPT interface. You'll learn how to set up a new Edge Function, make a request directly to the OpenAI API, and listen for the streamed events in Next.js with eventsource-parser.
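To give a sense of how those pieces fit together, here is a minimal sketch of an Edge API route that requests a streamed completion from the OpenAI API and re-emits just the generated text to the client. It assumes a `pages/api/chat.ts` route, an `OPENAI_API_KEY` environment variable, and the `createParser(onParse)` callback API from eventsource-parser v1.x; names like `handler` and the `prompt` field are illustrative, not prescribed.

```ts
// pages/api/chat.ts — sketch of an Edge function that streams OpenAI output.
import { createParser, type ParsedEvent, type ReconnectInterval } from 'eventsource-parser';

// Run this route on the Edge runtime instead of the Node.js runtime.
export const config = { runtime: 'edge' };

export default async function handler(req: Request): Promise<Response> {
  const { prompt } = await req.json();

  // Request a streamed chat completion directly from the OpenAI API.
  const openaiRes = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: 'gpt-3.5-turbo',
      messages: [{ role: 'user', content: prompt }],
      stream: true,
    }),
  });

  const encoder = new TextEncoder();
  const decoder = new TextDecoder();

  // Parse the server-sent events from OpenAI and forward only the text deltas.
  const stream = new ReadableStream({
    async start(controller) {
      const parser = createParser((event: ParsedEvent | ReconnectInterval) => {
        if (event.type !== 'event') return;
        if (event.data === '[DONE]') {
          controller.close();
          return;
        }
        const json = JSON.parse(event.data);
        const text = json.choices?.[0]?.delta?.content ?? '';
        if (text) controller.enqueue(encoder.encode(text));
      });

      // Feed each raw chunk from the OpenAI response body into the SSE parser.
      const reader = openaiRes.body!.getReader();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        parser.feed(decoder.decode(value));
      }
    },
  });

  return new Response(stream);
}
```

On the client, the returned `Response` body can be read chunk by chunk with `response.body.getReader()` and appended to state as it arrives, which is what produces the ChatGPT-style "typing" effect.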