
MCP with Langchain Sample Setup
Sample MCP Server & Client setup, compatible with LangChain
What is MCP with Langchain?
MCP with Langchain is a sample setup that demonstrates how to build a server and client architecture with the Model Context Protocol (MCP) in conjunction with LangChain.
How to use MCP with Langchain?
To use MCP with Langchain, start the MCP servers by running the provided Python scripts in the mcp_servers folder, then run the MCP client to interact with the servers through a web interface.
Key features of MCP with Langchain
- Sample implementation of MCP servers with various functions.
- Easy-to-use client interface for interacting with the servers.
- Supports multiple server instances running on different ports.
Use cases of MCP with Langchain
- Building a chatbot that utilizes multiple language models.
- Creating a weather information service that fetches data from different sources.
- Developing a math problem-solving assistant that integrates with various APIs.
FAQ about MCP with Langchain
- What programming language is used in this project?
The project is implemented in Python.
- How do I start the servers?
You can start the servers by navigating to the mcp_servers directory and running the provided Python scripts.
- Is there a web interface for the client?
Yes! The client runs a Streamlit web UI that you can access on port 8501.
Start MCP servers
There are three sample MCP servers in the mcp_servers folder, each exposing one or two functions. They listen for requests on three different ports: 8000, 8001, and 8002.
cd mcp_servers
nohup python math_mcp_server.py > math.log 2>&1 &
nohup python weather_mcp_server.py > weather.log 2>&1 &
nohup python which_llm_to_use_mcp_server.py > which_llm.log 2>&1 &
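For reference, each of these scripts is a small standalone MCP server. Below is a minimal sketch of what one of them, such as math_mcp_server.py, might look like, assuming the official `mcp` Python SDK's FastMCP helper; the tool names, port, and transport here are illustrative assumptions and the actual scripts may differ.

```python
# Hypothetical sketch of a math MCP server, assuming the `mcp` Python SDK (FastMCP).
# The real math_mcp_server.py may expose different tools or use another transport.
from mcp.server.fastmcp import FastMCP

# Listen on port 8000, matching the first server started above (assumption).
mcp = FastMCP("math", host="0.0.0.0", port=8000)

@mcp.tool()
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: float, b: float) -> float:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # Serve over SSE so HTTP clients can reach the tools on the configured port.
    mcp.run(transport="sse")
```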
Start MCP client
The MCP client works as an interface bridging users and the MCP servers.
export OPENAI_API_KEY=sk-svcacct-Tn_rKHd............................your_key_please
streamlit run my_chat_bot_app.py
The streamlit command will start a web UI listening on port 8501. You can play with the bot from there.
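Under the hood, the chat bot app needs to connect to the three servers and expose their functions as LangChain tools. A rough sketch of that wiring, assuming the langchain-mcp-adapters package and a LangGraph ReAct agent (the actual my_chat_bot_app.py may be structured differently), could look like this:

```python
# Hypothetical sketch of the client-side wiring, assuming langchain-mcp-adapters,
# langchain-openai, and langgraph are installed; my_chat_bot_app.py may differ.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

async def main():
    # Point the client at the three SSE servers started above
    # (which server runs on which port is an assumption here).
    client = MultiServerMCPClient({
        "math": {"url": "http://localhost:8000/sse", "transport": "sse"},
        "weather": {"url": "http://localhost:8001/sse", "transport": "sse"},
        "which_llm": {"url": "http://localhost:8002/sse", "transport": "sse"},
    })
    tools = await client.get_tools()  # MCP functions exposed as LangChain tools
    agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)

    result = await agent.ainvoke(
        {"messages": [("user", "What is 2 + 3, and what's the weather in Paris?")]}
    )
    print(result["messages"][-1].content)

asyncio.run(main())
```

In the sample setup, a loop along these lines would sit behind the Streamlit chat UI that the streamlit command serves on port 8501.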