Pass your LLM token stream directly to our SDK instead. Streamstraight requires that you tag this stream with a unique streamId; we recommend using a unique identifier for the generation.
Typescript/Javascript
```typescript
import { streamstraightServer } from "@streamstraight/server";

const ssServer = await streamstraightServer(
  { apiKey: process.env.STREAMSTRAIGHT_API_KEY },
  { streamId: "your-stream-id" },
);

// llmStream can be any ReadableStream or AsyncIterable;
// basically anything streamed from an LLM provider SDK
await ssServer.stream(llmStream);
```
Streamstraight will now consume and forward the LLM stream to your client. If you would also like to process the LLM stream on your server, make sure to tee the stream so that both Streamstraight and your server can consume it.
Typescript/Javascript
```typescript
import { streamstraightServer } from "@streamstraight/server";

// ...

// .tee() is available on the ReadableStream interface
const [stream1, stream2] = llmStream.tee();

const ssServer = await streamstraightServer(
  { apiKey: process.env.STREAMSTRAIGHT_API_KEY },
  { streamId },
);

// Do not block on consuming the stream
void ssServer.stream(stream1);

// Process stream2 yourself.
```
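If the tee pattern is unfamiliar, here is a self-contained sketch showing that each branch of `.tee()` independently receives every chunk. The LLM stream is mocked with a `ReadableStream` of strings purely for illustration:

```typescript
// Mock LLM token stream for illustration; a real provider SDK
// stream would take its place.
const llmStream = new ReadableStream<string>({
  start(controller) {
    for (const token of ["Hello", ", ", "world"]) {
      controller.enqueue(token);
    }
    controller.close();
  },
});

// Each branch of .tee() independently yields every chunk.
const [forStreamstraight, forLocal] = llmStream.tee();

// Drain one branch into a single string using a reader.
async function collect(stream: ReadableStream<string>): Promise<string> {
  const reader = stream.getReader();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return text;
    text += value;
  }
}
```

Both `collect(forStreamstraight)` and `collect(forLocal)` resolve to the full completion, which is why teeing lets Streamstraight and your own handler each see every token.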
Streamstraight requires your frontend client to connect with a JWT. Our SDK contains helper functions that generate a JWT from your API key.
```typescript
import { fetchClientToken } from "@streamstraight/server";

export async function POST(request: Request) {
  if (request.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }

  // Ensure your user is authenticated

  const clientToken = await fetchClientToken({
    apiKey: process.env.STREAMSTRAIGHT_API_KEY,
  });

  return Response.json({ jwtToken: clientToken });
}
```
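On the client, the token can then be fetched from this route before connecting. A minimal sketch, assuming the handler above is mounted at `/api/streamstraight-token` (the path and helper name are illustrative, not part of the SDK):

```typescript
// Hypothetical helper that calls the token route above.
// "/api/streamstraight-token" is an assumed path; use wherever
// your framework exposes the POST handler.
async function getStreamstraightToken(): Promise<string> {
  const res = await fetch("/api/streamstraight-token", { method: "POST" });
  if (!res.ok) {
    throw new Error(`Token request failed with status ${res.status}`);
  }
  const { jwtToken } = await res.json();
  return jwtToken;
}
```

Pass the resulting token to the Streamstraight client when subscribing to the stream with the matching streamId.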