Desktop Dual-Arm Cobot myBuddy 280 Focuses on Education and Research with Versatile Functions



  • Background

    As modern industry develops and technology advances, the demands placed on industrial, medical, and service applications keep growing. Single-arm robots can no longer meet these demands; dual-arm robots are needed to handle tasks that call for complexity, intelligence, and flexibility. A dual-arm robot is not simply two robotic arms bolted together: beyond their individual control goals, the two arms must also coordinate with each other and adapt to the environment. This high complexity makes dual-arm robots more demanding to operate, requiring advanced integrated systems, high-level planning and reasoning, and adjustable control methods.
    Dual-arm collaborative robots are the inevitable trend in the future of robotics.

    Introduction

    myBuddy 280 is Elephant Robotics' first dual-arm collaborative robot: a Raspberry Pi-powered, dual-arm, 13-axis humanoid collaborative service robot. Each arm has a working radius of 280 mm and a maximum payload of 250 g. It features a 7-inch interactive display and two 2-megapixel HD cameras, and it can be adapted to the needs of different applications.

    Functions

    Excellent algorithm control

    Dual-arm robots have clear advantages over single-arm robots. A dual-arm robot can do the work of two single arms at once for higher total throughput, reach two different positions simultaneously for separate operations, or even pass objects from one arm to the other. Each arm still follows a single trajectory, however, and designing algorithms that compute optimal trajectories takes careful human work: redundant kinematics, collision avoidance, ambiguity in how a task can be performed, and complex objective functions all make the problem difficult.
    With superior algorithms, myBuddy 280 can respond to commands in as little as 30 ms, and with collision detection it can work safely alongside people.
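    As a quick way to check responsiveness on your own setup, the short sketch below times a single query command round trip using the pymycobot interface described later in this post. The serial port and baud rate are assumptions; adjust them for your machine.

    import time
    from pymycobot.mybuddy import MyBuddy

    # Assumed port and baud rate; adjust for your setup
    mc = MyBuddy("/dev/ttyACM0", 115200)

    # Time one query command round trip (send request, wait for reply)
    start = time.perf_counter()
    angle = mc.get_angle(1, 1)  # joint 1 of the left arm
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"Joint 1 angle: {angle}, round trip: {elapsed_ms:.1f} ms")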

    A more complete secondary development environment

    • Comprehensive Python control interface

      ■ Provides 100+ control interfaces for secondary application development or research on your own algorithms.
      ■ Open interfaces for joint angle, speed, and coordinate control make the robot more accessible and user-friendly.
      ■ Supports separate control of the left arm, right arm, and waist, putting more control at your fingertips.
      ■ Programming examples are provided to enable rapid deployment of scenario applications.

           # Send a single joint angle to the robot arm
           send_angle(id, joint, angle, speed)
           id - 1/2/3 (left arm/right arm/waist)
           joint - 1 ~ 6 (corresponding to each joint)
           angle - -180 ~ 180 (each joint has its own limits; see the product parameters for details)
           speed - 1 ~ 100 (the higher the value, the faster the arm moves)

           # Get the angle of a single joint
           get_angle(id, joint_id)
           id - 1/2/3 (left arm/right arm/waist)
           joint_id - 1 ~ 7 (7 is the gripper)

           # Send the radian values of all joints of the specified arm
           send_radians(id, radians, speed)
           id - 1/2 (left arm/right arm)
           radians - radian values stored as a list (List[float]) of length 6
           speed - 0 ~ 100 (the higher the value, the faster the arm moves)

           # There are many more functions; here is an example of their use
           from pymycobot.mybuddy import MyBuddy
           import time

           # MyBuddy('port', baud)
           mc = MyBuddy("/dev/ttyACM0", 115200)
           # Send angles to the six joints of the left arm
           mc.send_angles(1, [0, 0, 0, 0, 0, 0], 50)
           time.sleep(3)
           # Send an angle to the first joint of the right arm
           mc.send_angle(2, 1, 90, 50)
           time.sleep(2)
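
           # A follow-up sketch (not part of the original example) showing the
           # get_angle and send_radians interfaces described above; the port
           # and baud rate are the same assumptions as in the example above.
           import math
           import time
           from pymycobot.mybuddy import MyBuddy

           mc = MyBuddy("/dev/ttyACM0", 115200)

           # Read back the current angle of the right arm's first joint
           print("Right arm joint 1:", mc.get_angle(2, 1))

           # Move the left arm to a small pose given in radians
           target_deg = [0, 10, -10, 0, 0, 0]
           mc.send_radians(1, [math.radians(a) for a in target_deg], 50)
           time.sleep(3)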
    

    Code on GitHub

    • ROS robot control system support

      ■ With RViz, which can display images, models, paths, and other information with visual rendering, developers can more easily understand what the data means (a minimal sketch follows after this list).
      ■ With MoveIt, which provides motion planning, collision detection, kinematics, 3D perception, and manipulation control. When users plan paths and run into situations that require constraints, MoveIt's functions come in handy.
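
      As a minimal illustration of the RViz side, the sketch below publishes sensor_msgs/JointState messages on the standard /joint_states topic so that RViz's RobotModel display can animate a loaded URDF. The joint names here are placeholders, not the actual names used by the Elephant Robotics ROS package.

      #!/usr/bin/env python
      import rospy
      from sensor_msgs.msg import JointState

      rospy.init_node("mybuddy_joint_state_demo")
      pub = rospy.Publisher("/joint_states", JointState, queue_size=10)
      rate = rospy.Rate(10)

      # Placeholder joint names; replace them with the names from your URDF
      joint_names = ["left_joint%d" % i for i in range(1, 7)]

      while not rospy.is_shutdown():
          msg = JointState()
          msg.header.stamp = rospy.Time.now()
          msg.name = joint_names
          msg.position = [0.0] * len(joint_names)  # hold a zero pose
          pub.publish(msg)
          rate.sleep()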

    • Self-developed software support
      ■ myBlockly: a visual, block-based programming tool in the family of graphical programming languages. Like Scratch, it is an excellent way to get started with myBuddy 280 quickly.
      ■ myStudio: a one-stop platform for working with the robotic arms. It offers firmware updates, driver installation, and tutorials on how to use the robot arm.

    • Configuration
      ■ Equipped with 13 high-performance brushless DC servos and a 7-inch interactive display that can be used for image display and touch control.
      ■ Two built-in 2-megapixel cameras and a pre-installed OpenCV environment enable rapid machine-vision development (see the sketch after this list).
      ■ The LEGO-compatible end-effector interface lets users attach 3D-printed accessories for various scenarios.
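      As a minimal example of using the built-in cameras with the pre-installed OpenCV environment, the sketch below grabs one frame and converts it to grayscale. The camera index 0 is an assumption; adjust it for your setup.

      import cv2

      cap = cv2.VideoCapture(0)  # assumed camera index
      ok, frame = cap.read()
      if ok:
          # Convert to grayscale as a simple first step for vision processing
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          cv2.imwrite("frame_gray.png", gray)
          print("Saved a frame of shape", frame.shape)
      else:
          print("Could not read from the camera")
      cap.release()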

    Summary

    Dual-arm collaborative robots are the future of robotics, and you can design even more creative projects with myBuddy 280! Please leave your comments below and share your ideas with us to start the journey of dual-arm collaborative robots!
    Learn more about us:
    Home | Elephant Robotics
    GitHub | Elephant Robotics
    Shop | Elephant Robotics



  • It looks great. I see you guys say it allows for VR linkage. Is this possible?



  • Thank you for your support. Yes, we are developing VR communication, and the idea is to be able to control myBuddy from a remote location via VR to implement some projects.