
# MCP Client Using LangChain / TypeScript

MCP client implementation using the LangChain ReAct agent, in TypeScript.
## What is MCP Client Using LangChain / TypeScript?

MCP Client Using LangChain / TypeScript is a client implementation that demonstrates using Model Context Protocol (MCP) server tools through a LangChain ReAct agent.
## How to use MCP Client?

To use the MCP Client, install the necessary dependencies, set up your API keys, configure the MCP servers, and run the application using Node.js.
## Key features of MCP Client?
- Supports multiple MCP servers and converts their tools into LangChain-compatible tools.
- Utilizes utility functions for parallel initialization of MCP servers.
- Compatible with LLMs from Anthropic, OpenAI, and Groq.
## Use cases of MCP Client?
- Integrating various MCP server tools into a single application.
- Facilitating the use of language models in applications requiring context management.
- Enabling developers to build applications that leverage multiple AI models seamlessly.
## FAQ from MCP Client?

- What are the prerequisites for using the MCP Client?
  You need Node.js 16+, npm 7+, and API keys from the respective LLM providers.
- Is there a Python version of this client?
  Yes, a Python version is available on GitHub.
- How do I configure the MCP servers?
  You can configure the MCP servers in the `llm_mcp_config.json5` file, following the specified format.
This simple Model Context Protocol (MCP) client demonstrates the use of MCP server tools by a LangChain ReAct agent.

It leverages the utility function `convertMcpToLangchainTools()` from `@h1deya/langchain-mcp-tools`. This function handles parallel initialization of multiple specified MCP servers and converts their available tools into an array of LangChain-compatible tools (`StructuredTool[]`).

LLMs from Anthropic, OpenAI, and Groq are currently supported.

A Python version of this MCP client is available here.
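The flow described above can be sketched roughly as follows. This is an illustrative sketch, not this repo's actual source: the server entry, model name, and query are placeholders, and it assumes `@h1deya/langchain-mcp-tools`, `@langchain/anthropic`, `@langchain/core`, and `@langchain/langgraph` are installed and `ANTHROPIC_API_KEY` is set in the environment:

```typescript
import { convertMcpToLangchainTools } from "@h1deya/langchain-mcp-tools";
import { ChatAnthropic } from "@langchain/anthropic";
import { createReactAgent } from "@langchain/langgraph/prebuilt";
import { HumanMessage } from "@langchain/core/messages";

async function main() {
  // Placeholder server config; the real app reads it from llm_mcp_config.json5.
  const mcpServers = {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
  };

  // Initialize all MCP servers in parallel and collect their tools
  // as an array of LangChain-compatible StructuredTool instances.
  const { tools, cleanup } = await convertMcpToLangchainTools(mcpServers);

  const llm = new ChatAnthropic({ model: "claude-3-5-haiku-latest" });
  const agent = createReactAgent({ llm, tools });

  const result = await agent.invoke({
    messages: [new HumanMessage("Read ./README.md and summarize it")],
  });
  console.log(result.messages[result.messages.length - 1].content);

  // Close the MCP server sessions when done.
  await cleanup();
}

main().catch(console.error);
```

The `cleanup` callback returned alongside `tools` shuts down the spawned MCP server processes, which is why it is invoked once the agent run finishes.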
## Prerequisites

- Node.js 16+
- npm 7+ (`npx`) to run Node.js-based MCP servers
- [optional] `uv` (`uvx`) installed to run Python-based MCP servers
- API keys from Anthropic, OpenAI, and/or Groq, as needed
## Setup

1. Install dependencies:

   ```bash
   npm install
   ```

2. Set up API keys:

   ```bash
   cp .env.template .env
   ```

   Update `.env` as needed. `.gitignore` is configured to ignore `.env` to prevent accidental commits of the credentials.
3. Configure LLM and MCP servers settings:

   Update `llm_mcp_config.json5` as needed.

   - The configuration file format for the MCP servers follows the same structure as Claude for Desktop, with one difference: the key name `mcpServers` has been changed to `mcp_servers` to follow the snake_case convention commonly used in JSON configuration files.
   - The file format is JSON5, where comments and trailing commas are allowed.
   - The format is further extended to replace `${...}` notations with the values of the corresponding environment variables.
   - Keep all credentials and private info in the `.env` file and refer to them with the `${...}` notation as needed.
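Putting those rules together, a minimal `llm_mcp_config.json5` might look like the fragment below. The server entries, model name, and the exact keys of the `llm` block are illustrative placeholders, not values required by the app:

```json5
{
  llm: {
    provider: "anthropic",            // or "openai" / "groq"
    model: "claude-3-5-haiku-latest",
  },

  // Same structure as Claude for Desktop, but keyed "mcp_servers"
  mcp_servers: {
    filesystem: {
      command: "npx",
      args: ["-y", "@modelcontextprotocol/server-filesystem", "."],
    },
    fetch: {
      command: "uvx",
      args: ["mcp-server-fetch"],
      env: { API_KEY: "${SOME_API_KEY}" },  // expanded from .env
    },
  },  // trailing commas are allowed in JSON5
}
```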
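The `${...}` environment-variable expansion described above can be sketched as a small helper; `expandEnvVars` is a hypothetical name for illustration, not the client's actual implementation:

```typescript
// Replace every ${VAR_NAME} occurrence with the corresponding environment
// variable; leave the notation intact when the variable is unset.
function expandEnvVars(text: string): string {
  return text.replace(/\$\{([A-Za-z_][A-Za-z0-9_]*)\}/g, (match, name) =>
    process.env[name] ?? match
  );
}

// Example: expand a raw config string before parsing it as JSON5.
process.env.SOME_API_KEY = "sk-test-123";
const raw = '{ env: { API_KEY: "${SOME_API_KEY}", OTHER: "${UNSET_VAR}" } }';
console.log(expandEnvVars(raw));
// → { env: { API_KEY: "sk-test-123", OTHER: "${UNSET_VAR}" } }
```

Expanding before the JSON5 parse keeps the substitution format-agnostic: any string value in the file can reference a variable, and unset variables are left visible rather than silently becoming empty strings.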
## Usage

Run the app:

```bash
npm start
```

Run in verbose mode:

```bash
npm run start:v
```

See command-line options:

```bash
npm run start:h
```

At the prompt, you can simply press Enter to use example queries that perform MCP server tool invocations.

Example queries can be configured in `llm_mcp_config.json5`.