Instructor - Observability & Tracing
What is Instructor? Instructor (GitHub) is a popular library for getting structured outputs from LLMs. It makes it easy to reliably extract structured data such as JSON from Large Language Models (LLMs) like GPT-3.5, GPT-4, and GPT-4-Vision, as well as open source models such as Mistral/Mixtral served via Together, Anyscale, Ollama, and llama-cpp-python. By leveraging modes such as Function Calling and Tool Calling, and constrained sampling modes like JSON mode and JSON Schema, Instructor stands out for its simplicity, transparency, and user-centric design. Under the hood, Instructor uses Pydantic to do the heavy lifting and provides a simple, easy-to-use API on top of it, helping you manage validation context, retries with Tenacity, and streaming of Lists and Partial responses.
What is Langfuse? Langfuse is an open source LLM engineering platform that helps teams trace API calls, monitor performance, and debug issues in their AI applications.
This is a cookbook with examples of the Langfuse Integration for Python.
Setup
Initialize the Langfuse client with your API keys from the project settings in the Langfuse UI and add them to your environment.
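A minimal sketch of setting the required environment variables from Python. The key values below are placeholders, not real credentials, and the `LANGFUSE_HOST` value assumes the EU cloud region; adjust for your own project.

```python
import os

# Replace the placeholders with your own keys from the project settings
# in the Langfuse UI. These values are illustrative, not real credentials.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # EU region

# The examples below also call the OpenAI API, so a key is needed there too.
os.environ["OPENAI_API_KEY"] = "sk-..."
```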
Get started
It is easy to use Instructor with Langfuse: we import the OpenAI client from the Langfuse OpenAI integration and simply patch it with Instructor. This works with both synchronous and asynchronous clients.
Langfuse-Instructor integration with the synchronous OpenAI client
Langfuse-Instructor integration with the asynchronous OpenAI client
Example
In this example, we first classify customer feedback into categories such as PRAISE, SUGGESTION, BUG, and QUESTION, and then score the relevance of each piece of feedback to the business on a scale of 0.0 to 1.0. We use the asynchronous OpenAI client AsyncOpenAI to classify and evaluate the feedback.
