# local_mcp
A local MCP service built on fastmcp. It calls the Qwen3-32b model and uses an MCP tool to query the Xinzhi Weather API, supporting real-time weather lookups and streamed multi-turn dialogue.
# File Introduction
`main.py`: Entry point; runs the multi-turn chat loop and calls the model.
`server.py`: Registers the `get_weather` tool. It can be replaced with other MCP tools.
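For orientation, here is a minimal sketch of what `server.py`'s tool registration might look like. It assumes fastmcp's `FastMCP` API and substitutes a hard-coded placeholder lookup for the real Xinzhi Weather API call (which would need an API key); it is an illustration, not the repository's actual code.

```python
# Hypothetical sketch of server.py. The import guard lets the weather
# function be exercised even where fastmcp is not installed.
try:
    from fastmcp import FastMCP
except ImportError:
    FastMCP = None

def get_weather(city: str) -> str:
    """Return current weather for a city.

    Placeholder data only; a real implementation would call the
    Xinzhi Weather API here.
    """
    fake_data = {"Nanjing": ("light rain", 26), "Beijing": ("moderate rain", 19)}
    condition, temp = fake_data.get(city, ("unknown", 0))
    return f"The current weather in {city} is {condition}, with a temperature of {temp} degrees."

if FastMCP is not None:
    mcp = FastMCP("weather")
    mcp.tool()(get_weather)  # register get_weather as an MCP tool

    if __name__ == "__main__":
        mcp.run()  # serve over stdio by default
```

Swapping in a different tool is just a matter of registering another function with `mcp.tool()`.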
# Usage
```bash
# Start the MCP server first, then run the client in another terminal
python server.py
python main.py
```
# Usage Example
```bash
Please enter:>? Hello, I would like to inquire about the current weather in Nanjing.
Model Output:
==================== Thought Process ====================
Okay, the user wants to check the weather in Nanjing. I need to call the get_weather function, with the parameter being the city name Nanjing. First, I will confirm if the function exists by checking the tool list, and indeed there is a get_weather function, with the parameter city, which is a required string. So I should construct a JSON object where the name is get_weather and the arguments contain city set to "Nanjing". Then, I will ensure the format is correct by wrapping the tool call part with XML tags. This should correctly return the weather information.
==================== Response Content ====================
The current weather in Nanjing is light rain, with a temperature of 26 degrees.
Please enter:>? I would like to inquire about the current weather in Beijing.
Model Output:
==================== Thought Process ====================
Okay, the user previously asked about the weather in Nanjing, and now they want to know about the weather in Beijing. I need to call the get_weather function to get the real-time weather for Beijing. First, I will confirm that the user's request is to check the current weather in Beijing, with the parameter being the city name "Beijing". Then, I will check if the function in the tool is applicable; the get_weather function indeed requires the city name as a parameter. Next, I will construct the correct tool_call, ensuring the parameters are accurate. Finally, I will return the function call result and wait for further information to respond to the user. It is important to maintain timeliness and accuracy in the response.
==================== Response Content ====================
The current weather in Beijing is moderate rain, with a temperature of 19 degrees.
Please enter:>? Can you introduce yourself and tell me what functions you have?
Model Output:
==================== Thought Process ====================
Okay, the user is now asking me to introduce myself and my functions. First, I need to determine the context of the previous conversation. In the earlier dialogue, the user inquired about the weather in Nanjing and Beijing, and I provided the weather information. Now the user may want to know more about my functions or confirm what types of requests I can handle.
The user's question is quite direct, but I may need to introduce myself concisely. I need to ensure that my answer covers the main functions while maintaining a friendly and helpful attitude. Considering the user has previously used the weather query, I should highlight this function but also mention any other possible functions, if applicable.
However, based on the provided tools, I can currently only call the get_weather function. Therefore, my functions are primarily focused on weather queries. I need to clearly state that I can query real-time weather for any city, including temperature, precipitation, etc. Additionally, I should invite the user to ask other questions or requests to confirm if there are any other hidden needs.
Moreover, the user might want to know about my other capabilities, such as whether I support multiple languages or if I can provide other weather-related information (like humidity, wind speed, etc.), but according to the current tool description, this information may not be available. Therefore, I need to accurately respond based on the provided tool information to avoid exaggerating my capabilities.
To summarize the response structure: start with a greeting, then explain that my main function is weather querying, provide examples of the types of information I can query, and finally invite the user to specify their needs. Ensure the response is natural, avoiding technical jargon, and maintaining a conversational tone.
==================== Response Content ====================
Hello! I am a weather query assistant, and my main function is to help you obtain real-time weather information for various locations. For example, I can check the current weather conditions in a city (such as sunny/rainy/snowy), temperature (in Celsius or Fahrenheit), and more. You just need to tell me the name of the city you want to inquire about, and I will return the latest weather data for you. Besides weather queries, I currently do not have any other functions~ If you have any other questions, feel free to let me know!
Please enter:
```
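As the thought process above describes, the model emits its tool call as a JSON object wrapped in XML tags. A client like `main.py` then has to extract and dispatch that call. Below is a hedged sketch of that parsing step, assuming the model uses Qwen-style `<tool_call>` tags (adjust the tag name if your chat template differs):

```python
import json
import re

def parse_tool_call(text: str):
    """Extract (name, arguments) from a <tool_call>-wrapped JSON blob.

    Returns None when the model's reply contains no tool call.
    """
    match = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", text, re.DOTALL)
    if match is None:
        return None
    call = json.loads(match.group(1))
    return call["name"], call.get("arguments", {})

# Example reply shaped like the transcript above (hypothetical text):
sample = (
    'I need to call the get_weather function.\n'
    '<tool_call>\n'
    '{"name": "get_weather", "arguments": {"city": "Nanjing"}}\n'
    '</tool_call>'
)
print(parse_tool_call(sample))  # ('get_weather', {'city': 'Nanjing'})
```

The client would then invoke the matching MCP tool with those arguments and feed the result back to the model for the final response.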