Install and Setup LangGraph: Your First Steps to Building AI Workflows

Ready to dive into building smart, dynamic AI applications that can think, loop, and adapt? LangGraph, created by the LangChain team, is your go-to tool for crafting stateful, graph-based workflows. Whether you’re dreaming of a customer support bot that solves issues step-by-step or a code reviewer that refines suggestions until they’re perfect, LangGraph makes it possible. In this beginner-friendly guide, we’ll walk you through installing and setting up LangGraph, ensuring you’re ready to start building with clear, practical steps. No advanced coding skills needed—just a bit of curiosity and a computer!


What You’ll Need to Get Started

Before we jump into installation, let’s cover the basics you’ll need:

  • Python: Version 3.8 or higher. LangGraph is Python-based, so you’ll need Python installed.
  • A Code Editor: Something like VS Code, PyCharm, or even a simple text editor.
  • Basic Terminal Knowledge: You’ll run a few commands to install packages.
  • An Internet Connection: To download LangGraph and its dependencies.
  • Optional: API Keys: If you plan to use AI models (like OpenAI’s GPT), you’ll need API keys, but we’ll cover that later.

If you’re new to LangGraph, check out Introduction to LangGraph for an overview of what it can do.


Step 1: Set Up Your Python Environment

LangGraph requires Python 3.8 or later. Let’s make sure your environment is ready.

Install Python

  1. Check if Python is installed: Open a terminal (Command Prompt on Windows, Terminal on macOS/Linux) and run:
python --version

or

python3 --version

If you see a version like Python 3.8.x or higher, you’re good. If not, download Python from python.org.

  2. Install Python: Follow the installer instructions. Make sure to check “Add Python to PATH” during setup on Windows.

  3. Verify installation: Run the version command again to confirm.

Create a Virtual Environment

A virtual environment keeps your project’s dependencies separate, avoiding conflicts. Here’s how to set one up:

  1. Create a project folder:

mkdir langgraph-project
cd langgraph-project

  2. Create a virtual environment:

python -m venv venv

  3. Activate the virtual environment:
    • On Windows:

venv\Scripts\activate

    • On macOS/Linux:

source venv/bin/activate

You’ll see (venv) in your terminal, indicating the environment is active.
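
When you’re done working on the project, you can leave the virtual environment at any time by running:

deactivate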

For more on environments, see Environment Setup.


Step 2: Install LangGraph

With your Python environment ready, let’s install LangGraph and its core dependencies.

Install LangGraph

LangGraph is available via pip, Python’s package manager. Run this command in your activated virtual environment:

pip install langgraph

This installs LangGraph and its basic requirements.

Since LangGraph builds on LangChain, you’ll likely want LangChain for tools like prompts and memory. Install it with:

pip install langchain
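
If you ever need to update these packages later, pip can upgrade them in place:

pip install --upgrade langgraph langchain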

Install an AI Model Integration (Optional)

LangGraph works with language models like OpenAI’s GPT or open-source models from Hugging Face. For this guide, we’ll use OpenAI’s models as an example. Install the OpenAI integration:

pip install langchain-openai

If you prefer open-source models, try:

pip install langchain-huggingface

Learn more about model options at OpenAI Integration or Hugging Face Integration.
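
If you go the open-source route, here’s a rough sketch of how a Hugging Face chat model could stand in for ChatOpenAI later in this guide. It assumes you’ve installed langchain-huggingface, set a HUGGINGFACEHUB_API_TOKEN, and picked a chat-capable model on the Hub (the repo_id below is just an example):

from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint

# Example model repo; swap in any chat-capable model you have access to.
llm = ChatHuggingFace(
    llm=HuggingFaceEndpoint(
        repo_id="HuggingFaceH4/zephyr-7b-beta",
        task="text-generation",
    )
)
print(llm.invoke("Hello, world!").content)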

Verify Installation

Check that everything installed correctly by running:

python -c "import langgraph; print(langgraph.__version__)"

You should see the LangGraph version number. If there’s an error, double-check your Python version and virtual environment.
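
If the version attribute isn’t exposed in your installed release, you can also ask pip which version it installed:

pip show langgraph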


Step 3: Configure API Keys (If Using External Models)

If you’re using an AI model like OpenAI’s GPT, you’ll need an API key to connect to it. Here’s how to set it up:

  1. Get an API Key: Sign up at platform.openai.com and create a key from the API keys page in your account settings.
  2. Set the API Key:
    • Option 1: Environment Variable (Recommended for security):

On macOS/Linux:

export OPENAI_API_KEY="your-api-key-here"

On Windows (Command Prompt):

set OPENAI_API_KEY=your-api-key-here

To make this permanent, add it to your shell profile (e.g., ~/.bashrc) or system environment variables.

    • Option 2: In Code (Less secure, for testing only):

import os
os.environ["OPENAI_API_KEY"] = "your-api-key-here"

  3. Verify Access: Test your key with a simple script (save as test_openai.py):
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-3.5-turbo")
response = llm.invoke("Hello, world!")
print(response.content)

Run it:

python test_openai.py

If you see a response, your API key is working!
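
If you’d rather not export the key in every new terminal session, another common pattern is to keep it in a local .env file and load it at startup. This assumes you also install the third-party python-dotenv package (not part of LangGraph):

pip install python-dotenv

Then, at the top of your script:

# Loads variables from a .env file in the current directory into os.environ.
# Your .env file would contain a line like: OPENAI_API_KEY=your-api-key-here
from dotenv import load_dotenv
load_dotenv()

Keep the .env file out of version control so your key stays private.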

For secure key management, see Security and API Keys.


Step 4: Build Your First LangGraph Workflow

Now that LangGraph is installed, let’s create a simple poem-writing bot to see it in action. This bot will generate a poem, check its quality, and retry if it’s too short.

The Goal

The bot:

  1. Takes a topic (e.g., “stars”).
  2. Generates a poem using an AI model.
  3. Checks if the poem is long enough (>50 characters).
  4. Stops if good; retries if not.

The Code

Create a file called poem_bot.py and add this:

from langgraph.graph import StateGraph, END
from typing import TypedDict
from langchain_openai import ChatOpenAI
from langchain.prompts import PromptTemplate

# State: The shared memory
class State(TypedDict):
    topic: str    # e.g., "stars"
    poem: str     # The generated poem
    is_good: bool # True if poem is good

# Task 1: Write a poem
def write_poem(state):
    llm = ChatOpenAI(model="gpt-3.5-turbo")
    template = PromptTemplate(input_variables=["topic"], template="Write a short poem about {topic}.")
    chain = template | llm
    poem = chain.invoke({"topic": state["topic"]}).content
    state["poem"] = poem
    return state

# Task 2: Check poem quality
def check_poem(state):
    state["is_good"] = len(state["poem"]) > 50
    return state

# Task 3: Decide next step
def decide_next(state):
    return "end" if state["is_good"] else "write_poem"

# Build the workflow
graph = StateGraph(State)
graph.add_node("write_poem", write_poem)
graph.add_node("check_poem", check_poem)
graph.add_edge("write_poem", "check_poem")
graph.add_conditional_edges("check_poem", decide_next, {"end": END, "write_poem": "write_poem"})
graph.set_entry_point("write_poem")

# Run the workflow
app = graph.compile()
result = app.invoke({"topic": "stars", "poem": "", "is_good": False})
print(result["poem"])

Run It

In your terminal (with the virtual environment active), run:

python poem_bot.py

You should see a poem about stars! If it’s too short, the bot will loop and try again until it meets the length requirement.
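
One thing to keep in mind: each retry is another step through the graph, and LangGraph enforces a recursion limit (25 steps by default) to stop runaway loops. If you expect many retries, you can raise it when invoking the compiled graph, as in this small variation on the last lines of poem_bot.py:

# Allow up to 50 graph steps before LangGraph raises a recursion error.
result = app.invoke(
    {"topic": "stars", "poem": "", "is_good": False},
    config={"recursion_limit": 50},
)
print(result["poem"])

Nodes also don’t have to return the whole state: returning just the keys they change (e.g., return {"poem": poem}) works too, and LangGraph merges the update into the shared state.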

For a similar project, try Simple Chatbot Example.


Troubleshooting Common Issues

If you hit bumps, here are quick fixes:

  • “ModuleNotFoundError”: Ensure LangGraph and dependencies are installed in the active virtual environment. Re-run pip install langgraph langchain-openai.
  • API Key Errors: Verify your API key is set correctly. Check Security and API Keys.
  • Python Version Issues: Confirm you’re using Python 3.8+. Run python --version.
  • Code Errors: If the example fails, check for typos or missing imports. See Graph Debugging.

For more help, visit Troubleshooting.


Enhancing Your Setup

Once you’re comfortable, you can level up your LangGraph setup with additional integrations and tooling.

For a real-world project, try Customer Support Example.


Tips for Success

  • Keep It Simple: Start with a small workflow like the poem bot before tackling complex projects.
  • Test Your Environment: Run pip list to confirm all packages are installed.
  • Secure Your Keys: Never hardcode API keys in production. Use environment variables.
  • Explore Examples: Check Best LangGraph Uses for inspiration.

For best practices, see Best Practices.


Conclusion

You’ve just set up LangGraph and built your first AI workflow—a poem-writing bot that thinks and retries until it gets it right! With LangGraph installed, you’re ready to create dynamic, stateful applications that can loop, branch, and adapt. Whether it’s a chatbot, a data processor, or a troubleshooting agent, LangGraph gives you the tools to make it happen.

Next, explore Core Concepts to understand LangGraph’s building blocks, or jump into Workflow Design for advanced techniques. For more ideas, check out Best LangGraph Uses. Happy building, and let’s create some awesome AI with LangGraph!
