Version: v1.x

LangChain Adapter

NLUX offers integration with LangChain through LangServe.

About LangChain and LangServe

LangChain is a popular framework for building backend services powered by large language models. It enables context-aware applications by connecting a language model to sources of context such as databases, APIs, and other services, and by relying on the language model to reason about that context.

LangServe is a library that comes with LangChain and allows developers to build RESTful APIs for interacting with language models. A LangServe API provides a set of pre-defined endpoints and an input/output schema that can be used to interact with the language model.
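
For context, a LangServe runnable exposes its standard endpoints (such as /invoke and /stream) under its base URL, all sharing the runnable's input/output schema. The sketch below is a rough illustration of calling /invoke directly, assuming the runnable accepts a plain string input; runnables with structured inputs expect an object matching their input schema, and the askPirate helper name is made up for this example.

// A minimal sketch of calling a LangServe /invoke endpoint directly.
// Assumption: the runnable accepts a plain string input; runnables with
// structured inputs expect an object matching their input schema instead.
const askPirate = async (prompt: string) => {
  const response = await fetch('https://pynlux.api.nlux.ai/pirate-speak/invoke', {
    method: 'POST',
    headers: {'Content-Type': 'application/json'},
    body: JSON.stringify({input: prompt}),
  });

  // LangServe wraps the runnable's result in an `output` field;
  // its shape depends on the runnable's output schema
  const {output} = await response.json();
  return output;
};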

LangServe vs Custom APIs

The NLUX LangChain adapter supports the standard LangServe endpoints.
If you have created custom APIs powered by LangChain that do not use LangServe, we recommend using NLUX's Custom Adapters feature to create adapters specific to your APIs.
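
For illustration, a minimal custom adapter for such an API could look like the sketch below. The https://example.com/my-langchain-api URL, the {prompt: ...} request payload, and the plain-text response are all hypothetical, and the adapter shape (a streamText method receiving an observer with next/complete/error) is an assumption; refer to the Custom Adapters guide for the exact adapter contract.

import {AiChat} from '@nlux/react';

// Hypothetical endpoint and payload. The adapter shape is assumed; see the
// Custom Adapters guide for the exact interface expected by NLUX.
const myCustomAdapter = {
  streamText: async (
    message: string,
    observer: {
      next: (chunk: string) => void;
      complete: () => void;
      error: (err: Error) => void;
    },
  ) => {
    try {
      const response = await fetch('https://example.com/my-langchain-api', {
        method: 'POST',
        headers: {'Content-Type': 'application/json'},
        body: JSON.stringify({prompt: message}),
      });

      // Forward the response text to the chat UI, then signal completion
      observer.next(await response.text());
      observer.complete();
    } catch (err) {
      observer.error(err as Error);
    }
  },
};

export default () => <AiChat adapter={myCustomAdapter} />;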

Supported Features

The NLUX LangServe adapter can do the following:

  • Sends user prompts to a LangServe API via the /invoke or /stream endpoints (the sketch after this list shows how to select between the two).
  • Handles the API responses and displays them in the AI chat UI, for both single responses and streamed text.
  • Lets developers customize the payloads sent to and received from the LangServe API.
  • Reads the LangServe API input schema and attempts to construct a request payload that matches it.
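
The choice between /invoke and /stream is driven by the adapter's data transfer mode, as in the sketch below. This assumes a dataTransferMode option where 'stream' (the default) uses /stream and 'fetch' selects a single response via /invoke; verify the exact option values in the adapter's API reference.

import {AiChat} from '@nlux/react';
import {useChatAdapter} from '@nlux/langchain-react';

export default () => {
  // Assumption: 'fetch' selects single-response mode backed by /invoke,
  // while the default 'stream' uses the /stream endpoint
  const adapter = useChatAdapter({
    url: 'https://pynlux.api.nlux.ai/pirate-speak',
    dataTransferMode: 'fetch',
  });

  return <AiChat adapter={adapter} />;
};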

It's also important to note what the NLUX LangServe adapter does not do:

  • It does not handle the /batch or /stream_log endpoints.
  • It only generates the request payload from the input schema for simple schemas (a string, or an object with a single string property). For more complex schemas, the inputPreProcessor option should be used to construct the payload, as shown in the sketch after this list.
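
For complex schemas, the payload can be built explicitly with inputPreProcessor. The sketch below assumes it receives the user prompt and returns the object used as the runnable's input (the exact signature may differ; check the adapter's API reference), for a hypothetical runnable expecting a {question, language} input:

import {AiChat} from '@nlux/react';
import {useChatAdapter} from '@nlux/langchain-react';

export default () => {
  const adapter = useChatAdapter({
    // Hypothetical runnable URL and input shape, for illustration only
    url: 'https://example.com/my-runnable',
    // Assumption: inputPreProcessor maps the user prompt to the object
    // sent as the runnable's input; check the exact signature in the docs
    inputPreProcessor: (userPrompt: string) => ({
      question: userPrompt,
      language: 'en',
    }),
  });

  return <AiChat adapter={adapter} />;
};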

If you have specific requirements that are not covered by the adapter, please submit a feature request here and we will consider adding it to the roadmap if enough users request it.

Example: LangServe Adapter With Default Settings

The example below shows how to use the NLUX LangServe adapter to connect to a LangServe API.

  • LangServe Runnable URL used: https://pynlux.api.nlux.ai/pirate-speak
  • Data transfer mode used: stream (default). The response is streamed to the UI as it's being generated.

import {AiChat} from '@nlux/react';
import {useChatAdapter} from '@nlux/langchain-react';
import '@nlux/themes/nova.css';

export default () => {
  // LangServe adapter that connects to a demo LangChain Runnable API
  const adapter = useChatAdapter({
    url: 'https://pynlux.api.nlux.ai/pirate-speak'
  });

  return (
    <AiChat
      adapter={adapter}
      personaOptions={{
        bot: {
          name: 'FeatherBot',
          picture: 'https://nlux.ai/images/demos/persona-feather-bot.png',
          tagline: 'Yer AI First Mate!',
        },
        user: {
          name: 'Alex',
          picture: 'https://nlux.ai/images/demos/persona-user.jpeg'
        }
      }}
      layoutOptions={{
        height: 320,
        maxWidth: 600
      }}
    />
  );
};