Get Started ― NLUX And ChatGPT via Node.js
This getting started guide will help you integrate the OpenAI ChatGPT model with the NLUX library using Node.js.
We will perform the following steps:
- Create an Express.js server that connects to the OpenAI API (steps 1-3)
- Create an AI chat component using NLUX and connect it to the Express.js server (steps 4-9)
Express.js is a back-end web application framework for building RESTful APIs with Node.js.
If you are not familiar with Express.js, you can learn more about it on the Express.js website.
We will also use nlbridge to create the server endpoint that
bridges the OpenAI API with the NLUX library.
nlbridge is a middleware library created by the NLUX team to
simplify the integration of LLMs with web applications.
1. Get Your OpenAI API Key
Start by getting a new API key from OpenAI.
- If you don't have an account, go to the OpenAI signup page and create an account.
- Go to the API keys page.
- Click the "Create new secret key" button.
- Give your API key a name and click "Create secret key".
- Copy the API key and save it in a safe place. You will need it to configure the OpenAI NLUX adapter.
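Rather than pasting the key into your code, you may prefer to keep it in an environment variable. A minimal sketch, assuming you export the key as OPENAI_API_KEY (a naming convention used in this guide, not something NLUX requires):

// Read the API key from the environment so it stays out of source control.
// Set it before starting your server, e.g.: export OPENAI_API_KEY=sk-...
const apiKey = process.env.OPENAI_API_KEY;
if (!apiKey) {
    throw new Error('The OPENAI_API_KEY environment variable is not set.');
}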
Back-end Node.js Steps
2. Create An Express.js Server
The following example was tested on Node.js v20.11.
We recommend using the latest LTS version of Node.js.
If you already have a Node.js + Express.js project set up, you can jump to the next section.
Set up a new Node.js and TypeScript project
We will start by setting up a new Node.js and TypeScript project and installing the dependencies.
Create a new directory for your project, navigate to it, and run the following commands:
NPM:
npm init --yes
npm install --save-dev typescript ts-node @types/node @types/express @types/cors
npm install express cors
npx tsc --init

Yarn:
yarn init --yes
yarn add --dev typescript ts-node @types/node @types/express @types/cors
yarn add express cors
yarn tsc --init

PNPM:
pnpm init
pnpm add -D typescript ts-node @types/node @types/express @types/cors
pnpm add express cors
pnpm tsc --init
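Running tsc --init generates a tsconfig.json. For reference, a minimal sketch of the compiler options this guide relies on (the generated file contains many more options, and its defaults are fine for this guide):

{
    "compilerOptions": {
        "target": "ES2020",       // modern Node.js versions support ES2020
        "module": "CommonJS",     // ts-node runs the server as CommonJS
        "esModuleInterop": true,  // allows `import express from 'express'`
        "strict": true            // enables strict type checking
    }
}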
Create a simple Express.js endpoint
Next, we will create a simple Express.js endpoint that returns a welcome message.
Create a file called index.ts and add the following code:
import express, { Express, Request, Response } from 'express';
import cors from 'cors';

const app: Express = express();
const port = 8080;

app.use(cors());
app.use(express.json());

app.get('/', (req: Request, res: Response) => {
    res.send('Welcome to NLUX + Node.js demo server!');
});

app.listen(port, () => {
    console.log(`[server]: Server is running at http://localhost:${port}`);
});
Run the Express.js server
Run your Express.js application using the following command:
NPM:
npx ts-node index.ts

Yarn:
yarn ts-node index.ts

PNPM:
pnpm ts-node index.ts
This will run your development server on http://localhost:8080.
When you navigate to this URL in your browser, you should see the welcome message returned by the endpoint.
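You can also check the endpoint from the command line, assuming the server is running locally on port 8080:

curl http://localhost:8080/
# Welcome to NLUX + Node.js demo server!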
Now that we have an Express.js server set up, let's add some LLM capabilities.
3. Set Up nlbridge Express.js Middleware
Now we will add a new endpoint to our Express.js app, powered by the @nlbridge/express library.
First, add the @nlbridge/express library to the project:
NPM:
npm install @nlbridge/express

Yarn:
yarn add @nlbridge/express

PNPM:
pnpm add @nlbridge/express
Then, modify the index.ts file to add a new endpoint:
import {defaultMiddleware} from '@nlbridge/express';

app.post('/chat-api',
    defaultMiddleware('openai', {
        apiKey: '<YOUR_OPENAI_API_KEY>',
        chatModel: 'gpt-3.5-turbo',
    }),
);
Make sure to replace <YOUR_OPENAI_API_KEY> with your actual OpenAI API key obtained in step 1.
Then restart your server. You will have a new endpoint at POST http://localhost:8080/chat-api that is powered by OpenAI's gpt-3.5-turbo model and ready for NLUX integration.
It's important to note that the new endpoint is created with the POST method.
This is a requirement for the nlbridge integration.
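If you prefer not to hard-code the key (see the environment-variable sketch in step 1), the same endpoint can read it from the environment. A sketch, assuming OPENAI_API_KEY is set before the server starts:

import {defaultMiddleware} from '@nlbridge/express';

// Same endpoint as above, with the key supplied via the environment.
app.post('/chat-api',
    defaultMiddleware('openai', {
        apiKey: process.env.OPENAI_API_KEY!,
        chatModel: 'gpt-3.5-turbo',
    }),
);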
Front-End Web App Steps
The following steps are specific to your front-end web application (either React JS or JavaScript).
Each step below includes instructions for both platforms: React JS ⚛️ and JavaScript 🟨.
4. Install NLUX Packages
Now that we have a Node.js server running with the nlbridge middleware, we can create a chat component using NLUX and connect it to the server.
React JS ⚛️
If you don't have a React JS app set up yet, and you are looking for a quick way to get started, you can use Vite's react-ts template to quickly set up a React JS app.
Set up a React JS project with Vite
Use the following npm commands to set up a React JS app with TypeScript using Vite's react-ts template:
npm create vite@latest my-ai-chat-app -- --template react-ts
cd my-ai-chat-app
npm install
npm run dev
The last command will start the development server and open the app in your default browser.
JavaScript 🟨
If you don't have a web development environment set up, and you are looking for a quick way to get started, you can use Vite's vanilla-ts template to quickly set up a development environment with support for ES6 and npm modules.
Set up a TypeScript project with Vite
Use the following npm commands to set up a new project with the vanilla-ts template:
npm create vite@latest my-ai-chat-app -- --template vanilla-ts
cd my-ai-chat-app
npm install
npm run dev
The last command will start the development server and open the app in your default browser.
React JS ⚛️
You can start by adding NLUX and the nlbridge adapter to your React JS app using your favorite package manager. At the root of your project, run the following command:
NPM:
npm install @nlux/react @nlux/nlbridge-react

Yarn:
yarn add @nlux/react @nlux/nlbridge-react

PNPM:
pnpm add @nlux/react @nlux/nlbridge-react
This will install the @nlux/react and @nlux/nlbridge-react packages.
JavaScript 🟨
You can start by adding NLUX and the nlbridge adapter to your TypeScript project using your favorite package manager. At the root of your project, run the following command:
NPM:
npm install @nlux/core @nlux/nlbridge

Yarn:
yarn add @nlux/core @nlux/nlbridge

PNPM:
pnpm add @nlux/core @nlux/nlbridge
This will install the @nlux/core and @nlux/nlbridge packages.
5. Import Component And Hook
React JS ⚛️
Import the useChatAdapter hook and the AiChat component in your JSX file:
import {AiChat} from '@nlux/react';
import {useChatAdapter} from '@nlux/nlbridge-react';
The AiChat component is the main chat component that you will use to display the chat UI.
The useChatAdapter hook is used to create an adapter for the nlbridge API.
JavaScript 🟨
Import the createAiChat and createChatAdapter functions from the @nlux/core and @nlux/nlbridge packages:
import {createAiChat} from '@nlux/core';
import {createChatAdapter} from '@nlux/nlbridge';
The createAiChat function creates the main chat component that you will use to display the chat UI.
The createChatAdapter function is used to create an adapter for the nlbridge API.
6. Create nlbridge Adapter
React JS ⚛️
You can use the useChatAdapter hook to create an nlbridge adapter.
You can optionally import ChatAdapterOptions from @nlux/nlbridge-react to define the type of the options object.
import {useChatAdapter, ChatAdapterOptions} from '@nlux/nlbridge-react';

const adapterOptions: ChatAdapterOptions = {
    url: 'http://localhost:8080/chat-api',
};

export const App = () => {
    const nlbridgeAdapter = useChatAdapter(adapterOptions);
};
The ChatAdapterOptions interface has one required property: url.
This is the URL of the nlbridge endpoint that the adapter should connect to.
In this example, we are connecting to the endpoint created in the previous section.
JavaScript 🟨
You can use the createChatAdapter function to create an nlbridge adapter as shown below:
const nlbridgeAdapter = createChatAdapter()
    .withUrl('http://localhost:8080/chat-api');
The createChatAdapter function returns an adapter builder that you can configure by chaining methods.
The withUrl(<URL>) method is used to specify the URL of the nlbridge endpoint that the adapter should connect to.
In this example, we are connecting to the endpoint created in the previous section.
7. Create Chat Component
React JS ⚛️
Now that we have the nlbridge adapter, we will create the chat component and pass the adapter to it.
import {AiChat} from '@nlux/react';
import {useChatAdapter, ChatAdapterOptions} from '@nlux/nlbridge-react';

const adapterOptions: ChatAdapterOptions = {
    url: 'http://localhost:8080/chat-api',
};

export const App = () => {
    const nlbridgeAdapter = useChatAdapter(adapterOptions);
    return (
        <AiChat
            adapter={nlbridgeAdapter}
            promptBoxOptions={{
                placeholder: 'How can I help you today?'
            }}
        />
    );
};
The AiChat component can take several parameters:
- The first parameter, adapter, is the only required one. It is the adapter that we created earlier.
- The second parameter that we provide here is an object that contains the prompt box options. In this case, we are passing a placeholder text to customize the prompt box.
For full documentation on how to customize the AiChat component, please refer to the AiChat documentation.
JavaScript 🟨
Now that we have the nlbridge adapter, we will create the chat component, pass the adapter to it, and mount it to the DOM.
const aiChat = createAiChat().withAdapter(nlbridgeAdapter);

document.addEventListener('DOMContentLoaded', () => {
    const chatContainer = document.getElementById('chat-container');
    aiChat.mount(chatContainer!);
});
The createAiChat() function returns a component builder that allows you to configure the chat component by chaining method calls. The withAdapter() method sets the adapter to be used by the chat component.
Note that aiChat.mount(<domElement>) should only be called after the DOM has been loaded.
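For the mount call to succeed, your page needs a matching container element. A minimal sketch of the relevant HTML, assuming the chat-container id used in the code above (adapt it to your own markup):

<!-- NLUX renders the chat UI inside this element once aiChat.mount() runs. -->
<div id="chat-container"></div>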
For full documentation on how to customize the aiChat component, please refer to the AiChat documentation.
8. Add CSS Styles
React JS ⚛️
NLUX comes with a default CSS theme that you can use to style the chat UI. There are 2 ways to import the stylesheet, depending on your setup.
Using JSX Bundler
You can import it in your JSX component file by installing the @nlux/themes package:
NPM:
npm install @nlux/themes

Yarn:
yarn add @nlux/themes

PNPM:
pnpm add @nlux/themes
Then import the default theme nova.css in your React component:
import '@nlux/themes/nova.css';
This will require a bundler such as Vite or Webpack that is configured to handle CSS imports for global styles. Most modern bundlers are configured to handle CSS imports.
Using HTML Link Tag
Alternatively, you can include the CSS stylesheet in your HTML file.
We provide a CDN link that you can use to include the stylesheet in your HTML file:
<link rel="stylesheet" href="https://themes.nlux.ai/v1.0.0/nova.css" />
This CDN link is not meant for production use, and it is only provided for convenience. Make sure you replace it with the latest version of the stylesheet before deploying your app to production.
JavaScript 🟨
NLUX comes with a default CSS theme that you can use to style the chat UI. There are 2 ways to import the stylesheet, depending on your setup.
Using JS Bundler
You can import it in your JS module by installing the @nlux/themes package:
NPM:
npm install @nlux/themes

Yarn:
yarn add @nlux/themes

PNPM:
pnpm add @nlux/themes
Then import the default theme nova.css in your JavaScript/TypeScript file:
import '@nlux/themes/nova.css';
This will require a bundler such as Vite or Webpack that is configured to handle CSS imports for global styles. Most modern bundlers are configured to handle CSS imports.
Using HTML Link Tag
Alternatively, you can include the CSS stylesheet in your HTML file.
We provide a CDN link that you can use to include the stylesheet in your HTML file:
<link rel="stylesheet" href="https://themes.nlux.ai/v1.0.0/nova.css" />
This CDN link is not meant for production use, and it is only provided for convenience. Make sure you replace it with the latest version of the stylesheet before deploying your app to production.
9. Run Your App
React JS ⚛️
Your final code will look like this:
import {AiChat} from '@nlux/react';
import {useChatAdapter, ChatAdapterOptions} from '@nlux/nlbridge-react';
import '@nlux/themes/nova.css';

const adapterOptions: ChatAdapterOptions = {
    url: 'http://localhost:8080/chat-api',
};

export const App = () => {
    const nlbridgeAdapter = useChatAdapter(adapterOptions);
    return (
        <AiChat
            adapter={nlbridgeAdapter}
            promptBoxOptions={{
                placeholder: 'How can I help you today?'
            }}
        />
    );
};
You can now run your app and test the chatbot.
The result is a fully functional chatbot UI, with NLUX handling all the UI interactions and the communication with the nlbridge server.
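If you set up your app with Vite's react-ts template from step 4, make sure the App component above is rendered by the entry point. A minimal src/main.tsx sketch, assuming App is a named export from ./App as in the code above:

import React from 'react';
import ReactDOM from 'react-dom/client';
import {App} from './App';

// Render the App component (which contains the AiChat UI) into the #root
// element provided by Vite's index.html.
ReactDOM.createRoot(document.getElementById('root')!).render(
    <React.StrictMode>
        <App />
    </React.StrictMode>,
);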
JavaScript 🟨
Once you have configured all of the above, your code will look like this:
import {createAiChat} from '@nlux/core';
import {createChatAdapter} from '@nlux/nlbridge';
import '@nlux/themes/nova.css';

const nlbridgeAdapter = createChatAdapter()
    .withUrl('http://localhost:8080/chat-api');

const aiChat = createAiChat().withAdapter(nlbridgeAdapter);

document.addEventListener('DOMContentLoaded', () => {
    const chatContainer = document.getElementById('chat-container');
    aiChat.mount(chatContainer!);
});
You can now run your app and test the chatbot.
The result is a fully functional chatbot UI, with NLUX handling all the UI interactions and the communication with the nlbridge server.