
---
description: Cookbook with examples of the Langfuse Integration for Mirascope (Python).
category: Integrations
---

Cookbook: Mirascope x Langfuse Observability

Mirascope is a Python toolkit for building with LLMs. It lets developers write Pythonic code while benefiting from its abstractions for common LLM use cases and models.

Langfuse is an open source LLM engineering platform. It provides traces, evals, prompt management, and metrics to debug and improve your LLM application.

With the Langfuse <-> Mirascope integration, you can log your application to Langfuse by adding the @with_langfuse decorator.

Let's dive right in with some examples:

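First, install the dependencies and configure your credentials. A typical setup cell might look like the following (the exact package names and the keys shown are placeholders; get your project keys from the Langfuse project settings):

```python
import os

# Install the dependencies first, e.g.:
#   pip install langfuse mirascope openai

# Langfuse credentials from your project settings
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"

# OpenAI API key, used by the Mirascope calls below
os.environ["OPENAI_API_KEY"] = "sk-..."
```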

Log a first simple call

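A minimal traced call can be sketched as follows. This assumes Mirascope's v1-style API; the module paths (`mirascope.core.openai`, `mirascope.integrations.langfuse`) and the model name are assumptions, so adjust them to your installed versions:

```python
from mirascope.core import openai
from mirascope.integrations.langfuse import with_langfuse


@with_langfuse()
@openai.call("gpt-4o-mini")
def recommend_book(genre: str) -> str:
    # The returned string is used as the prompt for the call
    return f"Recommend a {genre} book"


response = recommend_book("fantasy")
print(response.content)
```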
I recommend **"The House in the Cerulean Sea" by TJ Klune**. It's a heartwarming fantasy that follows Linus Baker, a caseworker for magical children, who is sent on a special assignment to a mysterious orphanage. There, he discovers unique and lovable characters and confronts themes of acceptance, found family, and the importance of love and kindness. The book combines whimsy, humor, and poignant moments, making it a delightful read for fantasy lovers.

Let's use it together with the Langfuse decorator

We'll use

  • Mirascope's @with_langfuse() decorator to log the generation
  • the Langfuse @observe() decorator, which works with any Python function, to wrap the generate_facts function and group the generations into a single trace.
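A sketch of how the two decorators compose (same assumed module paths as above; the inner call function name `random_fact` is illustrative):

```python
from langfuse.decorators import observe
from mirascope.core import openai
from mirascope.integrations.langfuse import with_langfuse


@with_langfuse()
@openai.call("gpt-4o-mini")
def random_fact(topic: str) -> str:
    return f"Tell me a random interesting fact about {topic}"


@observe()
def generate_facts(number_facts: int) -> None:
    # Each inner call is logged as a generation;
    # @observe() groups them all into a single trace
    for _ in range(number_facts):
        print(random_fact("frogs").content)


generate_facts(3)
```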
Sure! Frogs can breathe through their skin, allowing them to absorb oxygen and release carbon dioxide directly into and out of their bloodstream. This process is known as cutaneous respiration.
Some species of frogs can absorb water through their skin, meaning they don't need to drink water with their mouths.
Frogs can breathe through their skin! This adaptation allows them to absorb oxygen directly from water, which is especially useful when they're submerged.

Head over to the Langfuse Traces table in Langfuse Cloud to see the entire chat history, token counts, cost, model, latencies, and more.

Example trace

Trace of complex Mirascope execution in Langfuse

That's a wrap.

There's a lot more you can do:

  • Mirascope: Head over to their docs to learn more about what you can do with the framework.
  • Langfuse: Have a look at Evals, Datasets, and Prompt Management to start exploring everything Langfuse can do.