Python developers, want to give your LLM superpowers? Gradio is the fastest way to do it! With Gradio's Model Context Protocol (MCP) integration, your LLM can connect directly to the thousands of AI models and Spaces hosted on the Hugging Face Hub. By pairing the general reasoning abilities of your LLM with the specialized capabilities of models on Hugging Face, your LLM can go beyond simply answering text questions to actually solving problems in your daily life.
For Python developers, Gradio makes it a breeze to implement powerful MCP servers, offering features like:
Automatic conversion of Python functions into LLM tools: Each API endpoint in your Gradio app is automatically converted into an MCP tool with a corresponding name, description, and input schema. The docstring of your function is used to generate the description of the tool and its parameters.
Real-time progress notifications: Gradio streams progress notifications to your MCP client, letting users monitor status in real time without you having to implement this feature yourself.
Automatic file uploads, including support for public URLs and handling of various file types.
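To see what this looks like in practice, here is a minimal sketch of exposing a Gradio app as an MCP server. The letter_counter tool is an illustrative example of the pattern, not part of the shopping assistant we build below:

import gradio as gr

def letter_counter(word: str, letter: str) -> int:
    """Count how many times a letter appears in a word.

    Args:
        word: The word to search.
        letter: The letter to count.
    """
    return word.lower().count(letter.lower())

demo = gr.Interface(fn=letter_counter, inputs=["text", "text"], outputs="number")

if __name__ == "__main__":
    # mcp_server=True exposes the endpoint as an MCP tool;
    # the docstring above becomes the tool's description.
    demo.launch(mcp_server=True)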
Imagine this: you hate shopping because it takes too much time, and trying on clothes yourself is tiring. What if an LLM could handle this for you? In this post, we'll create an LLM-powered AI assistant that can browse online clothing stores, find specific garments, and then use a virtual try-on model to show you what those clothes would look like on you. See the demo below:
Goal: Your Personal AI Stylist
To make our AI shopping assistant possible, we'll combine three key components:
IDM-VTON diffusion model: This AI model is responsible for the virtual try-on functionality. It can edit existing photos to make a person appear to be wearing a different garment. We'll use the Hugging Face Space for IDM-VTON, which you can access here.
Gradio: Gradio is an open-source Python library that makes it easy to build AI-powered web applications and, crucially for our project, to create MCP servers. Gradio acts as the bridge, allowing our LLM to call the IDM-VTON model and other tools.
AI chat feature of Visual Studio Code: We'll use VS Code's built-in AI chat, which supports adding arbitrary MCP servers, to interact with our AI shopping assistant. This provides a user-friendly interface for issuing commands and viewing the virtual try-on results.
Building a Gradio MCP server
The heart of our AI shopping assistant is the Gradio MCP server. This server exposes one main tool:
vton_generation: This function takes a human model image and a garment image as input and uses the IDM-VTON model to generate a new image of the person wearing the garment.
Here’s the Python code for the Gradio MCP server:
from gradio_client import Client, handle_file
import gradio as gr

client = Client("freddyaboulton/IDM-VTON", hf_token="<your-hf-token>")

def vton_generation(human_model_img: str, garment: str):
    """Use the IDM-VTON model to generate a new image of a person wearing a garment.

    Args:
        human_model_img: The human model that is modelling the garment.
        garment: The garment to wear.
    """
    output = client.predict(
        dict={"background": handle_file(human_model_img), "layers": [], "composite": None},
        garm_img=handle_file(garment),
        garment_des="",
        is_checked=True, is_checked_crop=False,
        denoise_steps=30, seed=42,
        api_name="/tryon",
    )
    return output[0]

vton_mcp = gr.Interface(
    vton_generation,
    inputs=[gr.Image(type="filepath", label="Human model image URL"),
            gr.Image(type="filepath", label="Garment image URL or file")],
    outputs=gr.Image(type="filepath", label="Generated image"),
)

if __name__ == "__main__":
    vton_mcp.launch(mcp_server=True)
By setting mcp_server=True in the launch() method, Gradio automatically converts your Python functions into MCP tools that LLMs can understand and use. The docstrings of your functions are used to generate the descriptions of the tools and their parameters.
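If you want to verify what was generated, a running Gradio app also serves the tool schema over HTTP in recent Gradio versions. A minimal sketch, assuming the default local address and the documented /gradio_api/mcp/schema endpoint:

import json
import urllib.request

# Fetch and pretty-print the auto-generated MCP tool schema
# from the locally running Gradio app (default port 7860).
with urllib.request.urlopen("http://127.0.0.1:7860/gradio_api/mcp/schema") as resp:
    print(json.dumps(json.load(resp), indent=2))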
The original IDM-VTON Space was implemented in Gradio 4.x, which predates the automatic MCP functionality. So, for this demo, we build a Gradio interface that queries the original Space through the Gradio API client.
Finally, run this script with Python (e.g. python app.py, assuming you saved it under that name). Gradio will print the URL of your running MCP server to the console.
Configuring VS Code
To connect our Gradio MCP server to VS Code's AI chat, you'll need to edit the mcp.json file. This configuration tells the AI chat where to find your MCP server and how to interact with it.
You can find this file by typing MCP in the Command Palette and selecting MCP: Open User Configuration. Once it's open, make sure the following servers are present:
{
  "servers": {
    "vton": {
      "url": "http://127.0.0.1:7860/gradio_api/mcp/"
    },
    "playwright": {
      "command": "npx",
      "args": [
        "-y",
        "@playwright/mcp@latest"
      ]
    }
  }
}
The Playwright MCP server allows AI assistants to browse the web.
Make sure the vton server URL matches the URL printed to the console in the previous section. To run the Playwright MCP server, you'll need Node.js installed.
Put it all together
Now you can start interacting with your AI shopping assistant. Open a new chat in VS Code and ask your assistant something like: "Browse the Uniqlo website for blue t-shirts and show me what I'd look like in three of them using my photo (your-image-url)."
Check out the video above for an example!
Conclusion
The combination of Gradio, MCP, and powerful AI models like IDM-VTON opens up exciting possibilities for creating intelligent and helpful AI assistants. By following the steps outlined in this blog post, you can build your own assistants to solve the problems that matter most to you!