LangGraph Agent combines LLMs with graph-structured state and workflow orchestration to build autonomous agents that can remember facts, reason over relationships, and call external functions or tools when needed. Developers define a shared state (memory) schema, model agent steps as graph nodes connected by edges, plug in custom tools or APIs, and orchestrate workflows through configurable planner and executor nodes. This approach improves context retention, enables knowledge-driven decision making, and supports dynamic tool invocation across diverse applications.
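A minimal sketch of this pattern, assuming the LangGraph Python library; the AgentState schema, the planner/executor node names, and the stubbed logic are illustrative placeholders rather than part of any official example. A real agent would call an LLM in the planner node and a concrete tool or API in the executor node.

```python
from typing import Annotated
from typing_extensions import TypedDict

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class AgentState(TypedDict):
    # Shared "memory": message history accumulated by the add_messages reducer.
    messages: Annotated[list, add_messages]


def plan(state: AgentState) -> dict:
    # Placeholder planner step; a real agent would call an LLM here.
    last = state["messages"][-1].content
    return {"messages": [("ai", f"plan for: {last}")]}


def act(state: AgentState) -> dict:
    # Placeholder executor step; a real agent would invoke a tool or API here.
    return {"messages": [("ai", "tool result (stub)")]}


builder = StateGraph(AgentState)
builder.add_node("planner", plan)
builder.add_node("executor", act)
builder.add_edge(START, "planner")
builder.add_edge("planner", "executor")
builder.add_edge("executor", END)
graph = builder.compile()

print(graph.invoke({"messages": [("user", "look up the weather")]}))
```

The state schema is what carries context between nodes; swapping the linear planner-to-executor edge for a conditional edge is the usual way to let the agent decide at runtime whether a tool call is needed.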
Amazon Bedrock Custom LangChain Agent is a reference architecture and code example that shows how to build AI agents by combining Amazon Bedrock foundation models with LangChain. You define a set of tools (APIs, databases, RAG retrievers), configure agent policies and memory, and invoke multi-step reasoning flows. It supports streaming outputs for low-latency user experiences, integrates callback handlers for monitoring, and controls access through IAM roles. This approach accelerates deployment of intelligent assistants for customer support, data analysis, and workflow automation, all on scalable AWS infrastructure.
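A minimal sketch of that flow, assuming the langchain-aws package and LangChain's AgentExecutor API; the model ID, region, and lookup_order tool are hypothetical placeholders, not part of the reference architecture itself. Credentials are resolved through the standard boto3 chain, so an IAM role or environment credentials with Bedrock access are assumed.

```python
from langchain_aws import ChatBedrockConverse
from langchain_core.callbacks import StreamingStdOutCallbackHandler
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool
from langchain.agents import AgentExecutor, create_tool_calling_agent


@tool
def lookup_order(order_id: str) -> str:
    """Return the status of an order (stub standing in for a real API or database)."""
    return f"Order {order_id}: shipped"


llm = ChatBedrockConverse(
    model="anthropic.claude-3-sonnet-20240229-v1:0",  # hypothetical model ID
    region_name="us-east-1",
)

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a customer-support assistant."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),  # slot for intermediate tool-call steps
])

tools = [lookup_order]
agent = create_tool_calling_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)

# Callbacks passed via config are propagated to the model and tools for
# monitoring; this handler prints LLM tokens to stdout when the model streams.
result = executor.invoke(
    {"input": "Where is order 42?"},
    config={"callbacks": [StreamingStdOutCallbackHandler()]},
)
print(result["output"])
```

Adding more tools is a matter of extending the tools list; memory, retries, and guardrail policies would be layered on the executor or the underlying model configuration.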
Amazon Bedrock Custom LangChain Agent Core Features