LangChain

LangChain has a callback framework that lets you handle events from the various stages of a LangChain pipeline. You can use the AutoblocksCallbackHandler from any of the SDKs to send these events to Autoblocks to get a better understanding of the individual steps of your LLM chain.

See the LangChain documentation for details on where you can pass in the callback handler.

Quick Start

In order to run the examples below, you need:

  • to have langchain, openai, and autoblocks installed:
pip install openai langchain autoblocks
  • to have your OpenAI API key and Autoblocks ingestion key set as environment variables:
export OPENAI_API_KEY=<your-api-key>
export AUTOBLOCKS_INGESTION_KEY=<your-ingestion-key>

Then run the following:

from langchain.llms import OpenAI
from autoblocks.vendor.langchain import AutoblocksCallbackHandler

llm = OpenAI()
handler = AutoblocksCallbackHandler()
llm.predict(
  "Hello, world!",
  callbacks=[handler],
)

After running the above script, you should eventually see a trace on the explore page with events from the LLM pipeline.

Sending custom events alongside LangChain events

If you want to send custom events, such as user feedback events, in addition to the events sent automatically by the callback handler, you can access the underlying AutoblocksTracer instance via the handler's tracer property.

import uuid

from langchain.llms import OpenAI
from autoblocks.vendor.langchain import AutoblocksCallbackHandler

handler = AutoblocksCallbackHandler()

# Set the trace ID on the tracer directly
handler.tracer.set_trace_id(str(uuid.uuid4()))

# Events sent by the callback handler
# will have the trace ID set above
llm = OpenAI()
llm.predict(
  "Hello, world!",
  callbacks=[handler],
)

# This custom feedback event will also
# have the same trace ID set above
handler.tracer.send_event(
  "user.feedback",
  properties=dict(feedback="good"),
)

On the explore page, you should eventually see a new trace containing the same events as in the previous example, plus a user.feedback event.

Examples

See the LangChain examples in our examples repository.