The myCobot 280 RDK X5 runs the RDK OS operating system and is equipped with the D-Robotics RDK X5 Robot Developer Kit. It offers up to 10 TOPS of computing power and supports complex, state-of-the-art models and algorithms such as Transformer, RWKV, Occupancy, and Stereo Perception, enabling rapid deployment of intelligent applications. With a focus on intelligent computing and robotics applications, it provides rich interfaces and exceptional ease of use.
In contrast to robotic arms built on M5Stack, Raspberry Pi, or Jetson Nano development boards, the myCobot 280 RDK AI Kit features pre-installed example cases that integrate large language models with computer vision capabilities. Users can easily invoke and explore these applications by opening a terminal and executing simple commands.
1. Run ROS MoveIt to Drag & Teach

The system comes pre-installed with all essential ROS components and robot control code.
By following a few simple steps from the Gitbook tutorial, you can quickly launch MoveIt with just a few terminal commands. Once launched, MoveIt enables real-time robot control through intuitive motion planning.
With a simple click-and-drag operation in the RViz interface, you can interactively move the robot's end effector, and the system will automatically compute and execute the corresponding joint trajectories on the real robot. This provides a seamless experience from simulation to physical execution, with no manual coding or advanced configuration required.
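At its core, drag & teach boils down to sampling joint angles while the arm is hand-guided, then replaying them. Below is a minimal sketch of that record-and-replay loop, not MoveIt's actual planner. Hardware I/O is abstracted behind two callables; on a real arm these could wrap a robot API such as pymycobot's `get_angles`/`send_angles` (an assumption about the setup, not something this kit's documentation is quoted as saying):

```python
import time

class DragTeachRecorder:
    """Record joint angles while the arm is moved by hand, then replay them.

    Hardware I/O is injected as callables so the logic runs without a robot:
    read_angles() -> list of joint angles; send_angles(angles) commands the arm.
    """

    def __init__(self, read_angles, send_angles, sample_hz=20):
        self.read_angles = read_angles
        self.send_angles = send_angles
        self.period = 1.0 / sample_hz
        self.trajectory = []

    def record(self, n_samples):
        # Sample joint angles at a fixed rate while the user drags the arm.
        for _ in range(n_samples):
            self.trajectory.append(list(self.read_angles()))
            time.sleep(self.period)
        return self.trajectory

    def replay(self):
        # Send the recorded waypoints back to the arm in order.
        for angles in self.trajectory:
            self.send_angles(angles)
            time.sleep(self.period)

# Example with fake I/O (no hardware required):
fake_poses = iter([[0, 0, 0, 0, 0, 0],
                   [10, 0, 0, 0, 0, 0],
                   [20, 5, 0, 0, 0, 0]])
sent = []
rec = DragTeachRecorder(lambda: next(fake_poses), sent.append, sample_hz=100)
rec.record(3)
rec.replay()
print(sent == rec.trajectory)  # → True
```

On real hardware you would also release the servos before recording so the arm can be moved freely, then re-energize them before replay.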
2. Run YOLOv8 Object Detection

The system comes pre-installed with YOLOv8 detection models and essential computer vision dependencies. Following simple steps from the Gitbook guide, you can launch YOLOv8 object detection with just a few terminal commands. Once running, YOLOv8 delivers real-time object recognition through state-of-the-art neural networks. With live camera input, you can instantly detect and identify objects in your environment — the system automatically processes video frames and displays detection results with labeled bounding boxes and confidence scores. This offers seamless plug-and-play object detection without requiring model configuration or coding expertise.
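Before those labeled boxes are drawn, a detector's post-processing typically runs non-maximum suppression (NMS) to discard overlapping duplicate detections. Here is a minimal, framework-free sketch of that step; the box format and sample detections are made up for illustration, not taken from the kit's actual pipeline:

```python
def iou(a, b):
    # Intersection-over-union of two boxes given as (x1, y1, x2, y2).
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(detections, iou_thresh=0.5):
    # detections: list of (box, score, label). Keep the highest-scoring
    # boxes, dropping any same-label box that overlaps a kept one too much.
    kept = []
    for det in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(det[2] != k[2] or iou(det[0], k[0]) < iou_thresh for k in kept):
            kept.append(det)
    return kept

dets = [((10, 10, 50, 50), 0.9, "cup"),
        ((12, 12, 52, 52), 0.6, "cup"),    # near-duplicate of the first cup
        ((100, 100, 140, 140), 0.8, "cup")]
print([score for _, score, _ in nms(dets)])  # → [0.9, 0.8]
```

The duplicate at score 0.6 overlaps the 0.9 box heavily (IoU ≈ 0.82) and is suppressed, while the distant third cup survives.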
3. Gesture Recognition

Once running, the gesture recognition system delivers real-time hand tracking through advanced computer vision algorithms. With live camera input, you can instantly perform various hand gestures in your environment — the system automatically processes video frames and recognizes predefined gestures such as pointing, grabbing, thumbs up, and open palm signals. This offers seamless plug-and-play gesture control without requiring model training or complex calibration procedures.
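Gesture classifiers of this kind usually work on hand landmarks rather than raw pixels: a finger counts as extended when its tip lies farther from the wrist than the joint below it. The sketch below assumes the common 21-point hand landmark convention (as used by trackers like MediaPipe); the index numbers, rules, and synthetic test pose are illustrative assumptions, not the kit's actual implementation:

```python
from math import dist

# Landmark indices in the common 21-point hand model: 0 = wrist,
# 4/8/12/16/20 = fingertips, 2/6/10/14/18 = joints below them.
FINGERS = {"thumb": (4, 2), "index": (8, 6), "middle": (12, 10),
           "ring": (16, 14), "pinky": (20, 18)}

def extended_fingers(landmarks):
    # A finger is "extended" when its tip is farther from the wrist
    # than the joint below it.
    wrist = landmarks[0]
    return {name for name, (tip, joint) in FINGERS.items()
            if dist(landmarks[tip], wrist) > dist(landmarks[joint], wrist)}

def classify(landmarks):
    up = extended_fingers(landmarks)
    if len(up) == 5:
        return "open palm"
    if up == {"thumb"}:
        return "thumbs up"
    if up == {"index"}:
        return "pointing"
    if not up:
        return "fist"
    return "unknown"

# Synthetic "pointing" pose: only the index fingertip is far from the wrist.
pointing = [(0.0, 0.0)] * 21
for idx, pos in {2: (1, 1), 4: (0.5, 0.5),    # thumb curled
                 6: (0, 2), 8: (0, 4),         # index extended
                 10: (0, 2), 12: (0, 1),       # middle curled
                 14: (0, 2), 16: (0, 1),       # ring curled
                 18: (0, 2), 20: (0, 1)}.items():  # pinky curled
    pointing[idx] = pos
print(classify(pointing))  # → pointing
```

A real pipeline would feed per-frame landmarks from the tracker into `classify` and debounce the result over a few frames before triggering a robot action.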
Join this live stream to learn about the cutting-edge integration of Large Language Model (LLM) and Computer Vision (CV) technologies on the RDK X5 collaborative robot arm platform. This interactive session will feature real-time demonstrations of:
● YOLOv8 Object Detection in action with live camera feeds
● Gesture Recognition controlling robotic movements
Whether you're an educator, robotics enthusiast, or industry professional, this event offers valuable insights into the future of educational robotics.
Here are the details:
● Date: June 26, 2025
● Time: 1 PM EST / 10 AM PDT / 7 PM CET
● Duration: 1 hour
● Where: Zoom Webinar & YouTube Live (1.6M followers)
The myCobot 280 RDK X5 represents a breakthrough in educational robotics by combining advanced AI capabilities with user-friendly operation. Unlike traditional development board-based robotic arms, this desktop system comes pre-loaded with integrated Large Language Model (LLM) and Computer Vision (CV) applications that can be accessed through simple terminal commands. Pairing powerful computing capabilities with intuitive operation, it serves as an ideal tool for researchers developing "AI×Robotics" applications.