This open-source project was created by neka-nat and is available on GitHub.
The Model Context Protocol (MCP) is a bidirectional communication bridge between AI assistants and external tools, enabling AI not only to access information but also to take meaningful actions.
It's an open source protocol designed to safely and securely connect AI tools to data sources like your company's CRM, Slack workspace, or dev server. That means your AI assistant can pull in relevant data and trigger actions in those tools—like updating a record, sending a message, or kicking off a deployment. By giving AI assistants the power to both understand and act, MCP enables more useful, context-aware, and proactive AI experiences.
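To make this concrete, here is a minimal sketch of an MCP server exposing a single tool, written with the FastMCP helper from the official MCP Python SDK (pip install mcp). The move_to tool is a hypothetical stand-in for a real robot action; everything else follows the SDK's standard pattern.

# mcp_sketch.py -- minimal MCP server exposing one hypothetical tool.
# Requires the official MCP Python SDK: pip install mcp
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("robot-demo")

@mcp.tool()
def move_to(x: float, y: float, z: float) -> str:
    """Hypothetical tool: move a robot end effector to (x, y, z) in mm."""
    # A real server would send the command to hardware here.
    return f"moved to ({x}, {y}, {z})"

if __name__ == "__main__":
    mcp.run()  # serves over stdio, the transport Claude's local config uses

Once a client like Claude launches this server, the LLM can discover the move_to tool and call it in response to natural-language requests.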
DINO-X is an intelligent, cloud-based computer vision platform designed to streamline and enhance vision tasks. It provides seamless integration with AI models for object detection, image classification, and real-time visual feedback. With a user-friendly interface and API access, DINO-X enables developers and researchers to deploy, test, and manage computer vision pipelines in the cloud, accelerating innovation in robotics, automation, and smart systems.
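As a rough illustration only: a cloud detection service of this kind is typically called over HTTP with an API token and an image, and returns labeled bounding boxes. The endpoint path, field names, and response shape below are assumptions for illustration, not DINO-X's actual API; consult the official documentation for the real interface.

# Hypothetical sketch of calling a cloud object-detection API.
# The URL, request fields, and response keys are placeholders, NOT the real DINO-X API.
import os
import requests

API_URL = "https://example.com/v1/detect"  # placeholder endpoint
token = os.environ["DDS_API_TOKEN"]        # token from the DINO-X website

with open("frame.jpg", "rb") as f:
    resp = requests.post(
        API_URL,
        headers={"Token": token},
        files={"image": f},
        data={"prompt": "red cube"},  # open-vocabulary text prompt
    )
resp.raise_for_status()
for det in resp.json().get("objects", []):  # assumed response shape
    print(det["label"], det["bbox"])        # e.g. "red cube" [x1, y1, x2, y2]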
● Vision-based object detection using DINO-X API
● Automated pick-and-place control with myCobot
● Configurable parameters through a settings.json file
● Live camera feedback for accurate targeting
Required Hardware
● myCobot 280 Series
● myCobot Suction Pump 2.0
● Camera Flange 2.0
The myCobot 280 M5 is a compact and lightweight robotic arm, ideal for low-load application scenarios such as education, research, and light automation tasks. With its simple and beginner-friendly Python API, it is especially suitable for users who are new to robotics development. The device offers a seamless setup experience, making it easy to get started with basic motion control and programming.
When combined with a myCobot Suction Pump 2.0, the myCobot 280 M5 becomes a powerful tool for Pick & Place operations, enabling users to grasp and move small objects with just a few lines of code. This integration allows developers and makers to quickly build automation projects with minimal setup and no need for complex hardware configurations.
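To give a feel for how little code a basic pick-and-place takes, here is a minimal sketch using the pymycobot library. The serial port, coordinates, and suction-pump pin are assumptions that depend on your wiring and workspace; treat them as placeholders.

# pick_place_sketch.py -- minimal suction pick-and-place with pymycobot.
# Port, coordinates, and IO pin are placeholders; adjust for your setup.
import time
from pymycobot.mycobot import MyCobot

mc = MyCobot("/dev/ttyUSB0", 115200)  # serial port and baud rate may differ

# Move above the target (coordinates are illustrative).
mc.send_coords([150.0, -50.0, 180.0, -175.0, 0.0, -90.0], 50, 1)
time.sleep(3)

# Lower onto the object.
mc.send_coords([150.0, -50.0, 110.0, -175.0, 0.0, -90.0], 30, 1)
time.sleep(3)

# Turn on the suction pump via a base IO pin (pin and polarity depend on wiring).
mc.set_basic_output(5, 0)
time.sleep(2)

# Lift and carry to the drop-off position.
mc.send_coords([60.0, 150.0, 180.0, -175.0, 0.0, -90.0], 50, 1)
time.sleep(3)

# Release: turn off suction.
mc.set_basic_output(5, 1)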
In this project, we will use the open-source project provided by neka-nat to replicate and demonstrate the MCP Pick & Place process.
Build the Project
1. Clone the Repository and Install the Tools
git clone https://github.com/neka-nat/mycobot-mcp.git
cd mycobot-mcp
pip install .
curl -Ls https://astral.sh/uv/install.sh | bash
2. Get the DINO-X Token
Get the token from the DINO-X official website: https://deepdataspace.com/
3. Install the Claude Client
We use the Ubuntu 22.04 platform and download the software from the official Claude website. After installation, follow the prompts to complete the local MCP configuration. Once configured, you can start Claude for local model interaction.
4. Configure the MCP Server
Follow the instructions to set up the JSON file on the Claude platform to configure the MCP server. Note the differences in configuration between the Linux and Windows platforms.
https://modelcontextprotocol.io/quickstart/server
{
  "mcpServers": {
    "mycobot": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mycobot-mcp/",
        "run",
        "mycobot-mcp",
        "--settings-path",
        "/path/to/settings.json"
      ],
      "env": {
        "DDS_API_TOKEN": "your-api-token"
      }
    }
  }
}
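A quick note on this configuration: the command and args entries tell Claude to launch the server with uv, pointing --directory at your local clone of mycobot-mcp and --settings-path at your settings.json, while the DDS_API_TOKEN environment variable carries the DINO-X token obtained in step 2. Replace the /path/to/ placeholders with real paths on your machine.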
5. Simple Control
After modifying the configuration file, restart the Claude client to enable the MCP server. Then you can control the myCobot with simple commands.
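For example (the exact wording is up to you, since the LLM parses natural language), a prompt in the Claude chat window might look like: "Use the camera to find the red cube on the table, pick it up with the suction pump, and place it in the box on the left."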
We would like to extend our sincere gratitude to the creative maker behind this project. By building an MCP server that understands input commands, identifies target positions, and controls the myCobot 280 robotic arm accordingly, they have successfully integrated vision models with LLMs through MCP. We hope more people will apply this technology to everyday life, bringing more convenience to the world.
Developers are welcome to participate in our User Case Initiative and showcase your innovative projects: https://www.elephantrobotics.com/en/call-for-user-cases-en/.