Effect Community · 8mo ago · 3 replies
Afonso Matos

Integrating LLM Observability with Langfuse Using Effect TypeScript

Hey, if anybody is trying to plug this into an LLM observability tool like Langfuse and needs those extra input/output fields, this is how I've done it. It only became possible after the latest AI refactor.

// Imports are my best guess for the current @effect/ai layout; adjust paths to your versions.
import { addSpanAttributes, CurrentSpanTransformer } from "@effect/ai/AiTelemetry";
import { OpenAiClient } from "@effect/ai-openai";
import { NodeHttpClient } from "@effect/platform-node";
import { Layer, Redacted, String } from "effect";

// Attribute helpers: write "output"- / "input"-prefixed span attributes with snake_cased
// keys, matching the input/output fields Langfuse picks up.
export const addOutput = addSpanAttributes("output", String.camelToSnake)<{
    value: string;
}>;

export const addInput = addSpanAttributes("input", String.camelToSnake)<{
    value: string;
    mime_type: string;
}>;

// Span transformer: copies the serialized prompt and the response text onto every AI span.
export const AddIO = Layer.sync(CurrentSpanTransformer, () => (q) => {
    addOutput(q.span, { value: q.response.text });
    addInput(q.span, { value: JSON.stringify(q.prompt), mime_type: "application/json" });
});

// OpenAI-compatible client pointed at OpenRouter, with the transformer provided to it.
export const OpenRouter = OpenAiClient.layer({
    apiKey: Redacted.make(process.env.OPENROUTER_API_KEY!),
    apiUrl: "https://openrouter.ai/api/v1"
}).pipe(Layer.provide(NodeHttpClient.layerUndici), Layer.provide(AddIO));
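
One more piece if you're copying this: the spans still have to reach Langfuse somehow. Langfuse can ingest OTLP traces, so something like the sketch below with @effect/opentelemetry should do it. The endpoint, env var names and service name here are placeholders, so check the Langfuse OTel docs for your region and keys.

import { NodeSdk } from "@effect/opentelemetry";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Exports all spans (including the AI spans decorated by AddIO) over OTLP.
// URL and basic-auth credentials are placeholders; use your Langfuse host and your own keys.
export const LangfuseTracing = NodeSdk.layer(() => ({
    resource: { serviceName: "my-app" },
    spanProcessor: new BatchSpanProcessor(
        new OTLPTraceExporter({
            url: "https://cloud.langfuse.com/api/public/otel/v1/traces",
            headers: {
                Authorization: `Basic ${Buffer.from(
                    `${process.env.LANGFUSE_PUBLIC_KEY}:${process.env.LANGFUSE_SECRET_KEY}`
                ).toString("base64")}`
            }
        })
    )
}));

Then it's just Effect.provide(OpenRouter) and Effect.provide(LangfuseTracing) around whatever program does the generation.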