
Introduction to LangChain.js

LangChain.js is a framework for building LLM-powered applications in TypeScript and JavaScript. It helps you chain together interoperable components and third-party integrations to simplify AI application development — all while future-proofing decisions as the underlying technology evolves.

What is LangChain.js?

LangChain provides a standard interface for working with:
  • Chat models - Interact with LLMs from various providers
  • Agents - Build autonomous systems that can use tools and make decisions
  • Prompts - Manage and optimize prompt templates
  • Embeddings - Generate and work with vector representations
  • Vector stores - Store and retrieve documents efficiently
  • Retrievers - Pull relevant information from various sources
  • Tools - Extend LLM capabilities with external functions

Why Use LangChain.js?

LangChain helps developers build applications powered by LLMs through a standard interface for agents, models, embeddings, vector stores, and more.

Real-time Data Augmentation

Easily connect LLMs to diverse data sources and external/internal systems, drawing from LangChain’s vast library of integrations with model providers, tools, vector stores, retrievers, and more.

Model Interoperability

Swap models in and out as your engineering team experiments to find the best choice for your application’s needs. As the industry frontier evolves, adapt quickly — LangChain’s abstractions keep you moving without losing momentum.

Rapid Prototyping

Quickly build and iterate on LLM applications with LangChain’s modular, component-based architecture. Test different approaches and workflows without rebuilding from scratch, accelerating your development cycle.

Production-Ready Features

Deploy reliable applications with built-in support for monitoring, evaluation, and debugging through integrations like LangSmith. Scale with confidence using battle-tested patterns and best practices.

Vibrant Community

Leverage a rich ecosystem of integrations, templates, and community-contributed components. Benefit from continuous improvements and stay up-to-date with the latest AI developments through an active open-source community.

Flexible Abstractions

Work at the level of abstraction that suits your needs - from high-level chains for quick starts to low-level components for fine-grained control. LangChain grows with your application’s complexity.

The LangChain Ecosystem

LangChain is part of a broader ecosystem of tools designed to help you build, test, and deploy LLM applications:

LangSmith

Unified developer platform for building, testing, and monitoring LLM applications. Debug poor-performing runs, evaluate agent trajectories, and gain production visibility.

LangGraph

Build agents that can reliably handle complex tasks with customizable architecture, long-term memory, and human-in-the-loop workflows. Trusted by LinkedIn, Uber, Klarna, and GitLab.

Deep Agents

Build sophisticated “deep” agents that go beyond simple tool-calling loops with planning tools, sub-agent spawning, and file system access for complex, multi-step tasks.

Monorepo Structure

LangChain.js is organized as a monorepo with multiple packages:

Core Packages

  • langchain - Main package with agents, prompts, and orchestration features
  • @langchain/core - Core abstractions and interfaces (base classes, runnables, messages)
  • @langchain/community - Community-maintained integrations
  • @langchain/textsplitters - Text splitting utilities for document processing

Provider Packages

First-party integrations are published as standalone packages:
  • @langchain/openai - OpenAI and Azure OpenAI integration
  • @langchain/anthropic - Anthropic (Claude) integration
  • @langchain/google-vertexai - Google Vertex AI integration
  • @langchain/google-genai - Google Generative AI integration
  • @langchain/mistralai - Mistral AI integration
  • @langchain/cohere - Cohere integration
  • @langchain/groq - Groq integration
  • @langchain/ollama - Ollama integration for local models
…and many more provider packages.

Supported Environments

LangChain.js is written in TypeScript and works across multiple JavaScript runtimes:

Node.js

Node.js 20.x, 22.x, and 24.x (both ESM and CommonJS)

Edge Runtimes

Cloudflare Workers, Vercel Edge Functions, Supabase Edge Functions

Browsers

Modern web browsers with full TypeScript support

Alternative Runtimes

Deno and Bun

The Runnable Interface

At the heart of LangChain is the Runnable interface from @langchain/core/runnables. All major components extend Runnable, providing a consistent API:

import { Runnable } from "@langchain/core/runnables";

// All these methods are available on any Runnable
await runnable.invoke(input);        // Single invocation
await runnable.batch(inputs);        // Batch processing
for await (const chunk of await runnable.stream(input)) {
  // Streaming output
}

This unified interface means you can compose different components together seamlessly using the pipe operator:

const chain = prompt.pipe(model).pipe(outputParser);
const result = await chain.invoke({ input: "Hello!" });

Next Steps

Installation

Install LangChain.js and its dependencies

Quickstart

Build your first LangChain application

API Reference

Explore the complete API documentation

Community

Join the LangChain community forum
