AI.js — AI Toolkit for Node.js
AI.js is the easiest way to add AI text, images, embeddings and vector search to your Node.js app.
await AI("the color of the sky is"); // blue
await AI.Image("a red rose"); // <image buffer: red rose>
await AI.Image.Concept("a red rose"); // { prompt: "a red rose in realist style, watercolor ...", buffer: <image buffer> }
await AI.Embeddings("hello world"); // Array(384)
const db = new AI.VectorDB();
await db.add("red");
await db.add("blue");
await db.search("redish"); // red
Under the hood, AI.js seamlessly integrates easy-to-use local and remote APIs:
- Text: LLM.js uses GPT-4, Gemini, Claude, Mistral or LLaMa in the same simple interface
- Image: Imagine.js supports local Stable Diffusion and remote services like Replicate and Stability AI
- Embeddings: Embeddings.js creates local, OpenAI or Mistral embeddings
- VectorDB: VectorDB.js searches similar embeddings in memory
- Audio: Coming Soon
- Video: Coming Soon
Features
- Easy to use
- Same simple interface for hundreds of models (OpenAI, Google, Anthropic, Mistral, LLaMa, Replicate, Stability AI, Hugging Face and more)
- Works locally and offline by default!
- Offers best-in-class options through remote APIs
- Advanced LLM features like one-shot prompts, chat history, streaming, JSON schema and more
- Image generation using the best open and paid models
- Image concepts to easily combine LLMs with image generators for impressive results
- Easy to use text embeddings and in-memory vector search
- MIT license
Installation
Install AI.js via npm:
npm install @themaximalist/ai.js
Enable at least one service by setting its API_KEY environment variable:
export OPENAI_API_KEY=sk-...
export ANTHROPIC_API_KEY=sk-ant-...
export GOOGLE_API_KEY=...
export STABILITY_API_KEY=sk-...
export REPLICATE_API_KEY=...
export MISTRAL_API_KEY=...
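If you prefer keeping keys in a local .env file, one common approach is the dotenv package. This is a general Node.js pattern, not part of AI.js itself:

// Optional: load API keys from a local .env file with the dotenv package.
// This is a common Node.js pattern, not something AI.js requires.
require("dotenv").config();

const AI = require("@themaximalist/ai.js");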
LLM
The default interface is text. AI.js
lets you send
one-off requests or build up complex message histories with the Large
Language Model (LLM).
const AI = require("@themaximalist/ai.js");
await AI("what is the codeword?"); // i don't know any codewords
const ai = new AI("the codeword is blue");
await ai.chat("what is the codeword?"); // blue
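Because the instance holds the conversation, follow-up messages can build on earlier ones. A short sketch (see LLM.js for the exact chat and history behavior):

// Sketch: follow-up chat() calls continue the same conversation.
const poet = new AI("you are a terse poet");
await poet.chat("write one line about the sky");
await poet.chat("now rewrite it about the sea"); // builds on the previous turn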
The default AI.js mode is LLM, and it runs on top of LLM.js. See that project for full documentation, which also applies to AI.js.
There are many features supported across dozens of popular models, like streaming, JSON support, max_tokens, temperature, seed and more.
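As a sketch, options can be passed alongside the prompt. The option names below (model, temperature, max_tokens, stream) follow LLM.js conventions and the model name is only an example, so check the LLM.js documentation for the exact parameters your provider supports:

// Sketch of per-request options -- names follow LLM.js conventions,
// verify against the LLM.js docs for your provider.
const stream = await AI("tell me a short story about the sea", {
    model: "gpt-4o",   // example model name
    temperature: 0.7,
    max_tokens: 200,
    stream: true,      // with streaming, iterate the chunks as they arrive
});

for await (const chunk of stream) {
    process.stdout.write(chunk);
}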
Images
AI.js provides powerful image generation functions through Automatic1111, Stability AI and Replicate. Make sure you have each service set up as needed, either running locally or with a valid API key environment variable.
const fs = require("fs");
const image = await AI.Image("a red rose");
fs.writeFileSync("rose.png", image);
See Imagine.js for all image generator documentation.
Imagine Concepts
AI.js
also provides a concept generator—a way of using
LLMs together with image generators.
const fs = require("fs");
const { prompt, buffer } = await AI.Image.Concept("a red rose");
console.log(prompt); // "a red rose in realist style, watercolor ..."
fs.writeFileSync("complex-rose.png", buffer);
This hits your LLM
provider and generates a complex
image prompt before sending it off to the image generation service.
Embeddings
AI.js lets you easily generate local or remote embeddings using Embeddings.js.
const embeddings = await AI.Embeddings("hello world"); // embedding array
It works with local embeddings, OpenAI and Mistral.
Embeddings can be used in any vector database like Pinecone, Chroma, PG Vector, etc…
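Because AI.Embeddings returns a plain array of numbers, you can also compare two embeddings yourself. A minimal cosine-similarity sketch in plain JavaScript (a helper written here, not part of the AI.js API):

// Cosine similarity between two embedding arrays -- plain JavaScript helper,
// not part of the AI.js API.
function cosineSimilarity(a, b) {
    let dot = 0, normA = 0, normB = 0;
    for (let i = 0; i < a.length; i++) {
        dot += a[i] * b[i];
        normA += a[i] * a[i];
        normB += b[i] * b[i];
    }
    return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

const hello = await AI.Embeddings("hello world");
const goodbye = await AI.Embeddings("goodbye world");
console.log(cosineSimilarity(hello, goodbye)); // closer to 1 means more similar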
See Embeddings.js for all options.
Vector Search
AI.js lets you quickly find similar text strings using a vector database. It runs in-memory and can use embeddings from AI.js.
To find similar strings, add a few to the database, and then search.
const db = new AI.VectorDB();
await db.add("orange");
await db.add("blue");
const result = await db.search("light orange");
// [ { input: 'orange', distance: 0.3109036684036255 } ]
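The result is an array of { input, distance } objects like the one above. A small sketch for picking the closest match, assuming results are ordered nearest-first as the example suggests:

// Sketch: add several strings, then take the first (closest) result.
// Assumes search() returns results ordered nearest-first.
const colors = new AI.VectorDB();
for (const color of ["red", "green", "blue"]) {
    await colors.add(color);
}

const matches = await colors.search("sky");
if (matches.length > 0) {
    console.log(matches[0].input); // likely "blue"
}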
See VectorDB.js for full documentation.
API
AI.js
includes these sub-projects:
- LLM.js — Simple interface to dozens of Large Language Models
- Imagine.js — Local and remote image generation library
- Embeddings.js — Simple local or OpenAI text embeddings
- VectorDB.js — Local text similarity search
Check out each individual project for full API documentation.
They can be used individually or together, as in AI.js.
Debug
AI.js uses the debug npm module across all of its sub-projects.
The namespace is the lowercase version of the project name.
View debug logs by setting the DEBUG
environment
variable.
> DEBUG=llm.js* node src/run_ai.js
# debug logs
You can combine multiple namespaces with a comma.
> DEBUG=llm.js*,imagine.js*,embeddings.js*,vectordb.js* node src/run_ai.js
# debug all logs
Projects
AI.js
is currently used in the following projects:
- Infinity Arcade — Play any text adventure game
License
MIT
Author
Created by The Maximalist; see our open-source projects.