Course Outline

Introduction to LangGraph and Graph Concepts

  • Why graphs for LLM apps: orchestration vs. simple chains
  • Nodes, edges, and state in LangGraph
  • Hello LangGraph: first runnable graph
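The "hello" graph in this module rests on three ideas: nodes as functions, edges as ordering, and a shared state passed between them. As a library-free sketch of those ideas (plain Python, no LangGraph dependency; the node names `greet` and `shout` are illustrative only, not part of any API):

```python
# A node is just a function from state to a partial state update.
def greet(state: dict) -> dict:
    return {"message": f"Hello, {state['name']}!"}

def shout(state: dict) -> dict:
    return {"message": state["message"].upper()}

# Edges are expressed here as a simple ordered list of nodes.
graph = [greet, shout]

def run(graph, state: dict) -> dict:
    for node in graph:
        state = {**state, **node(state)}  # merge each node's update
    return state

print(run(graph, {"name": "LangGraph"}))
# {'name': 'LangGraph', 'message': 'HELLO, LANGGRAPH!'}
```

LangGraph itself adds typed state, compiled graphs, and richer edge semantics on top of this basic loop.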

State Management and Prompt Chaining

  • Designing prompts as graph nodes
  • Passing state between nodes and handling outputs
  • Memory patterns: short-term vs. persisted context
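The short-term vs. persisted distinction above can be sketched without any framework: short-term context is a list carried inside the state, persisted context is that same state written out between runs. All names here (`add_user_turn`, `checkpoint`, etc.) are hypothetical, and the "model" node is a stub rather than a real LLM call:

```python
import json

# Short-term context: a "messages" list carried in the state and
# appended to by each node.
def add_user_turn(state: dict) -> dict:
    return {"messages": state["messages"] + [("user", state["input"])]}

def add_model_turn(state: dict) -> dict:
    # Placeholder for an LLM call; echoes the last user message.
    last = state["messages"][-1][1]
    return {"messages": state["messages"] + [("assistant", f"You said: {last}")]}

def checkpoint(state: dict, path: str) -> None:
    # Persisted context: dump the running state to disk between runs.
    with open(path, "w") as f:
        json.dump(state, f)

state = {"input": "hi", "messages": []}
for node in (add_user_turn, add_model_turn):
    state = {**state, **node(state)}
print(state["messages"])
```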

Branching, Control Flow, and Error Handling

  • Conditional routing and multi-path workflows
  • Retries, timeouts, and fallback strategies
  • Idempotency and safe re-runs
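The three bullets above fit together in one small pattern: a router node picks a path from the state, and each path is wrapped with retries plus a fallback. A plain-Python sketch (the routing rule and node names are toy assumptions, not LangGraph's conditional-edge API):

```python
import time

def classify(state: dict) -> dict:
    # Conditional routing: choose the next node based on the state.
    is_math = state["query"].strip("0123456789+ ") == ""
    return {"route": "math" if is_math else "chat"}

def math_node(state: dict) -> dict:
    return {"answer": str(eval(state["query"]))}  # toy example only

def chat_node(state: dict) -> dict:
    return {"answer": f"Let's talk about: {state['query']}"}

def with_retries(node, retries=3, delay=0.0):
    # Retry wrapper: re-run a flaky node, then fall back.
    def wrapped(state):
        for _ in range(retries):
            try:
                return node(state)
            except Exception:
                time.sleep(delay)
        return {"answer": "fallback: service unavailable"}
    return wrapped

routes = {"math": with_retries(math_node), "chat": with_retries(chat_node)}
state = {"query": "2+3"}
state = {**state, **classify(state)}
state = {**state, **routes[state["route"]](state)}
print(state["answer"])  # "5"
```

Idempotency then amounts to making each node safe to call again with the same state, so a retried or resumed run cannot duplicate side effects.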

Tools and External Integrations

  • Function/tool calling from graph nodes
  • Calling REST APIs and services within the graph
  • Working with structured outputs
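Tool calling and structured outputs meet in one pattern: the model emits a JSON request naming a tool, and a graph node validates and dispatches it. A minimal sketch, assuming a hypothetical `get_weather` tool with a stubbed response:

```python
import json

# Tool registry: plain functions the "model" may request by name.
def get_weather(city: str) -> dict:
    return {"city": city, "temp_c": 21}  # stubbed external call

TOOLS = {"get_weather": get_weather}

def call_tool(model_output: str) -> dict:
    # Structured output: the model returns a JSON tool request,
    # which the node parses and dispatches to the registry.
    request = json.loads(model_output)
    fn = TOOLS[request["tool"]]
    return fn(**request["args"])

print(call_tool('{"tool": "get_weather", "args": {"city": "Paris"}}'))
```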

Retrieval-Augmented Workflows

  • Document ingestion and chunking basics
  • Embeddings and vector stores (e.g., ChromaDB)
  • Grounded answering with citations
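Chunking, the first step of the pipeline above, is often introduced as fixed-size windows with overlap before moving to token- or sentence-aware splitters. A minimal sketch (sizes are arbitrary for illustration):

```python
def chunk(text: str, size: int = 40, overlap: int = 10) -> list[str]:
    # Fixed-size chunking with overlap: each window starts
    # (size - overlap) characters after the previous one.
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

doc = "LangGraph lets you build stateful, multi-step LLM workflows as graphs."
for c in chunk(doc):
    print(repr(c))
```

Each chunk would then be embedded and stored in a vector store such as ChromaDB for retrieval.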

Testing, Debugging, and Evaluation

  • Unit-style tests for nodes and paths
  • Tracing and observability
  • Quality checks: factuality, safety, and determinism
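Because nodes are plain functions of state, they can be unit-tested in isolation before wiring them into a graph. A sketch of that idea with a hypothetical `summarize_node` (the real version would call an LLM; here it is a deterministic stub so the tests are stable):

```python
def summarize_node(state: dict) -> dict:
    # Node under test; a real version would call an LLM here.
    text = state["text"]
    return {"summary": text[:20] + ("..." if len(text) > 20 else "")}

def test_truncates_long_text():
    out = summarize_node({"text": "x" * 50})
    assert out["summary"].endswith("...")

def test_keeps_short_text():
    assert summarize_node({"text": "short"})["summary"] == "short"

# pytest would discover these automatically; call directly for a quick check.
test_truncates_long_text()
test_keeps_short_text()
print("all node tests passed")
```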

Packaging and Deployment Fundamentals

  • Environment setup and dependency management
  • Serving graphs behind APIs
  • Versioning workflows and rolling updates

Summary and Next Steps

Requirements

  • A working knowledge of basic Python programming
  • Experience with REST APIs or CLI tools
  • Familiarity with LLM concepts and prompt engineering fundamentals

Audience

  • Developers and software engineers new to graph-based LLM orchestration
  • Prompt engineers and AI newcomers building multi-step LLM apps
  • Data practitioners exploring workflow automation with LLMs
Duration

  • 14 Hours
