Unity has emerged as a powerful platform for building simulation workflows, particularly in robotics and automation. The Robot Operating System (ROS), on the other hand, is a widely adopted framework for robotic software development and is best supported on Linux.
To bridge this gap, Windows Subsystem for Linux (WSL) provides a seamless way to run a full Linux environment directly within Windows, without the need for dual-boot or a virtual machine. Communication between the two systems (ROS in WSL and Unity on Windows) runs over standard TCP networking, enabling a hybrid development workflow that combines the strengths of both platforms.
In this case, we use the myCobot 280 to verify the effectiveness of the hybrid system, visualizing the robot model and task environment in Unity in real time.
Step 1: Install WSL and Ubuntu
1.1 Install Ubuntu 20.04
Open PowerShell on Windows and run:
wsl --install -d Ubuntu-20.04
After running the command, you can create an account by entering your username and password to complete the initial setup of Ubuntu.
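To confirm the distribution is installed and running under WSL 2, you can check from PowerShell:
wsl -l -v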
1.2 Update System Packages
sudo apt update && sudo apt upgrade -y
sudo apt install curl wget git build-essential -y
Step 2: Install ROS Noetic in WSL
2.1 Add the ROS Package Repository
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
curl -s https://raw.githubusercontent.com/ros/rosdistro/master/ros.asc | sudo apt-key add -
sudo apt update
2.2 Install ROS Noetic
sudo apt install ros-noetic-desktop-full -y
sudo apt install python3-rosdep python3-rosinstall python3-rosinstall-generator python3-wstool build-essential -y
2.3 Initialize Rosdep
sudo rosdep init
rosdep update
2.4 Setup ROS Environment
Add to ~/.bashrc:
echo "source /opt/ros/noetic/setup.bash" >> ~/.bashrc
source ~/.bashrc
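A quick way to verify that the ROS environment is active in the current shell:
echo $ROS_DISTRO   # should print "noetic"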
Step 3: Set Up the Catkin Workspace
3.1 Create the Workspace
mkdir -p /mnt/d/catkin_ws/src
cd /mnt/d/catkin_ws/src
3.2 Clone myCobot Packages
# Clone the official myCobot ROS package
git clone https://github.com/elephantrobotics/mycobot_ros.git
3.3 Install Dependencies
cd /mnt/d/catkin_ws
rosdep install --from-paths src --ignore-src -r -y
3.4 Build Workspace
catkin_make
source devel/setup.bash
echo "source /mnt/d/catkin_ws/devel/setup.bash" >> ~/.bas
Step 4: Unity Setup and ROS Integration
4.1 Install Unity Hub and Unity Editor
Download Unity Hub from the official website and install Unity 2022.3 LTS.
4.2 Create New Unity Project
1. Open Unity Hub
2. Create a new 3D project named "myCobot_Control"
4.3 Install Unity Robotics Packages
Open the project and go to Window > Package Manager in the top menu bar.
In the Unity Package Manager, choose "Add package from git URL" and add the following packages:
https://github.com/Unity-Technologies/ROS-TCP-Connector.git?path=/com.unity.robotics.ros-tcp-connector
https://github.com/Unity-Technologies/URDF-Importer.git?path=/com.unity.robotics.urdf-importer
4.4 Import myCobot URDF
1. Create folder Assets/URDF/mycobot_280/
2. Copy the URDF files from the cloned repository:
# In WSL, copy files to Windows accessible location
mkdir -p /mnt/c/Users/[YourUsername]/Desktop/mycobot_urdf/
cp -r /mnt/d/catkin_ws/src/mycobot_ros/mycobot_description/* /mnt/c/Users/[YourUsername]/Desktop/mycobot_urdf/
3. In Unity, import the URDF using Robotics > Import Robot from URDF
Step 5: Creating ROS-Unity Communication Bridge
5.1 Install ROS TCP Endpoint (WSL)
cd /mnt/d/catkin_ws/src
git clone https://github.com/Unity-Technologies/ROS-TCP-Endpoint.git
cd /mnt/d/catkin_ws
catkin_make
5.2 Configure Network Settings
In WSL, find your IP address:
hostname -I
Note the IP address (e.g., 172.x.x.x).
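This WSL IP is what Unity's ROS Settings will point at in the next subsection. For the Unity IP Address field you will also need the Windows host address; under WSL 2's default NAT networking, it typically appears as the nameserver entry in /etc/resolv.conf inside WSL:
grep nameserver /etc/resolv.conf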
5.3 Unity ROS Settings
In Unity:
1. Go to Robotics > ROS Settings
2. Set ROS IP Address to your WSL IP
3. Set ROS Port to 10000
4. Set Unity IP Address to your Windows IP
Step 6: Control Scripts
6.1 ROS Bridge Node (Python)
Create /mnt/d/catkin_ws/src/mycobot_unity_bridge/scripts/unity_control_node.py:
#!/usr/bin/env python3
import rospy
from sensor_msgs.msg import JointState
from std_msgs.msg import Float64MultiArray


class MyCobotUnityBridge:
    def __init__(self):
        rospy.init_node('mycobot_unity_bridge', anonymous=True)
        # Publishers
        self.joint_pub = rospy.Publisher('/mycobot/joint_states', JointState, queue_size=10)
        self.cmd_pub = rospy.Publisher('/mycobot/joint_commands', Float64MultiArray, queue_size=10)
        # Subscribers
        self.unity_sub = rospy.Subscriber('/unity/joint_commands', Float64MultiArray, self.unity_callback)
        self.rate = rospy.Rate(30)  # 30 Hz

    def unity_callback(self, data):
        """Handle joint commands from Unity"""
        joint_msg = Float64MultiArray()
        joint_msg.data = data.data
        self.cmd_pub.publish(joint_msg)

    def publish_joint_states(self, positions):
        """Publish current joint states"""
        joint_msg = JointState()
        joint_msg.header.stamp = rospy.Time.now()
        joint_msg.name = ['joint1', 'joint2', 'joint3', 'joint4', 'joint5', 'joint6']
        joint_msg.position = positions
        joint_msg.velocity = [0.0] * 6
        joint_msg.effort = [0.0] * 6
        self.joint_pub.publish(joint_msg)

    def run(self):
        while not rospy.is_shutdown():
            # Simulate joint positions (replace with actual robot feedback)
            positions = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
            self.publish_joint_states(positions)
            self.rate.sleep()


if __name__ == '__main__':
    try:
        bridge = MyCobotUnityBridge()
        bridge.run()
    except rospy.ROSInterruptException:
        pass
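For rosrun to locate this node in Step 7, the script has to live inside a catkin package and be marked executable. A minimal sketch, assuming the package name mycobot_unity_bridge used in the path above:
cd /mnt/d/catkin_ws/src
catkin_create_pkg mycobot_unity_bridge rospy std_msgs sensor_msgs
mkdir -p mycobot_unity_bridge/scripts
# place unity_control_node.py in the scripts folder, then:
chmod +x mycobot_unity_bridge/scripts/unity_control_node.py
cd /mnt/d/catkin_ws && catkin_make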
6.2 Unity Control Script (C#)
Create Assets/Scripts/MyCobotController.cs:
using UnityEngine;
using Unity.Robotics.ROSTCPConnector;
using RosMessageTypes.Std;
using RosMessageTypes.Sensor;

public class MyCobotController : MonoBehaviour
{
    [Header("ROS Configuration")]
    public string jointStateTopic = "/mycobot/joint_states";
    public string jointCommandTopic = "/unity/joint_commands";

    [Header("Robot Joints")]
    public ArticulationBody[] joints;

    [Header("Control Parameters")]
    public float[] targetJointAngles = new float[6];
    public float controlSpeed = 1.0f;

    private ROSConnection ros;

    void Start()
    {
        // Initialize ROS connection
        ros = ROSConnection.GetOrCreateInstance();
        ros.RegisterPublisher<Float64MultiArrayMsg>(jointCommandTopic);
        ros.Subscribe<JointStateMsg>(jointStateTopic, JointStateCallback);

        // Find joint articulation bodies
        FindJoints();
    }

    void FindJoints()
    {
        // Auto-find joints if not assigned
        if (joints == null || joints.Length == 0)
        {
            joints = GetComponentsInChildren<ArticulationBody>();
        }
    }

    void Update()
    {
        // Handle keyboard input for manual control
        HandleInput();
        // Send joint commands to ROS
        SendJointCommands();
        // Update Unity joint positions
        UpdateUnityJoints();
    }

    void HandleInput()
    {
        if (Input.GetKey(KeyCode.Q)) targetJointAngles[0] += Time.deltaTime * controlSpeed;
        if (Input.GetKey(KeyCode.A)) targetJointAngles[0] -= Time.deltaTime * controlSpeed;
        if (Input.GetKey(KeyCode.W)) targetJointAngles[1] += Time.deltaTime * controlSpeed;
        if (Input.GetKey(KeyCode.S)) targetJointAngles[1] -= Time.deltaTime * controlSpeed;
        // Add more controls for other joints...
    }

    void SendJointCommands()
    {
        Float64MultiArrayMsg msg = new Float64MultiArrayMsg();
        msg.data = new double[targetJointAngles.Length];
        for (int i = 0; i < targetJointAngles.Length; i++)
        {
            msg.data[i] = targetJointAngles[i];
        }
        ros.Publish(jointCommandTopic, msg);
    }

    void UpdateUnityJoints()
    {
        for (int i = 0; i < joints.Length && i < targetJointAngles.Length; i++)
        {
            if (joints[i] != null)
            {
                var drive = joints[i].xDrive;
                drive.target = targetJointAngles[i] * Mathf.Rad2Deg;
                joints[i].xDrive = drive;
            }
        }
    }

    void JointStateCallback(JointStateMsg msg)
    {
        // Handle feedback from ROS (real robot positions)
        for (int i = 0; i < msg.position.Length && i < joints.Length; i++)
        {
            // Update Unity visualization based on real robot feedback
            if (joints[i] != null)
            {
                var drive = joints[i].xDrive;
                drive.target = (float)msg.position[i] * Mathf.Rad2Deg;
                joints[i].xDrive = drive;
            }
        }
    }
}
Step 7: Launch and Testing
7.1 Start ROS Core
Open a WSL terminal:
roscore
7.2 Launch ROS TCP Endpoint
Open a new WSL terminal:
roslaunch ros_tcp_endpoint endpoint.launch tcp_ip:=0.0.0.0 tcp_port:=10000
7.3 Run Unity Bridge Node
cd /mnt/d/catkin_ws
source devel/setup.bash
rosrun mycobot_unity_bridge unity_control_node.py
7.4 Start Unity Application
1. Open the Unity project
2. Attach the MyCobotController script to the imported robot's root GameObject
3. Configure joint references in inspector
4. Play the scene
7.5 Connect Real Robot
For physical robot connection:
# Install pymycobot library
pip3 install pymycobot
# Launch myCobot driver
roslaunch mycobot_280 mycobot_280_bringup.launch port:=/dev/ttyUSB0
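Before launching the driver, the serial link can be tested on its own with a short pymycobot check (a sketch; the port and baud rate below are common defaults for the myCobot 280 and may need adjusting):
# quick_check.py - read the current joint angles over the serial port
from pymycobot.mycobot import MyCobot

mc = MyCobot("/dev/ttyUSB0", 115200)  # adjust port/baud for your setup
print(mc.get_angles())                # six joint angles in degrees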
Step 8: Advanced Features
8.1 Inverse Kinematics
Implement an IK solver in Unity for end-effector control. A simple cyclic coordinate descent (CCD) loop is sketched below as one possible approach; FABRIK or an analytical solver for the 6-DOF arm would work equally well:
using UnityEngine;

public class IKSolver : MonoBehaviour
{
    public Transform target;
    public Transform endEffector;
    public Transform[] joints;
    public int iterations = 10;

    void Update()
    {
        SolveIK();
    }

    void SolveIK()
    {
        // Simple CCD sketch: rotate each joint so the end effector swings
        // toward the target; repeat a few iterations per frame.
        for (int it = 0; it < iterations; it++)
        {
            for (int i = joints.Length - 1; i >= 0; i--)
            {
                Vector3 toEnd = endEffector.position - joints[i].position;
                Vector3 toTarget = target.position - joints[i].position;
                joints[i].rotation = Quaternion.FromToRotation(toEnd, toTarget) * joints[i].rotation;
            }
        }
    }
}
8.2 Trajectory Planning
Add trajectory planning for smooth motion:
# In the ROS bridge node
import rospy
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

def plan_trajectory(start_pos, end_pos, duration):
    trajectory = JointTrajectory()
    trajectory.joint_names = ['joint1', 'joint2', 'joint3', 'joint4', 'joint5', 'joint6']
    # Generate waypoints by linear interpolation between start and end positions
    for i in range(10):
        point = JointTrajectoryPoint()
        t = i / 9.0
        point.positions = [start_pos[j] + t * (end_pos[j] - start_pos[j]) for j in range(6)]
        point.time_from_start = rospy.Duration(duration * t)
        trajectory.points.append(point)
    return trajectory
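As a usage sketch inside the bridge node, the planned trajectory could be published to a trajectory topic; the topic name /mycobot/arm_trajectory below is a hypothetical example, not one defined earlier in this tutorial:
# hypothetical usage inside the bridge node (rospy is already initialized there)
traj_pub = rospy.Publisher('/mycobot/arm_trajectory', JointTrajectory, queue_size=1)
trajectory = plan_trajectory([0.0] * 6, [0.5, -0.3, 0.2, 0.0, 0.1, 0.0], duration=3.0)
traj_pub.publish(trajectory)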
Troubleshooting
Common Issues and Solutions
1. WSL Network Connection Issues
- Check Windows Firewall settings
- Verify WSL IP address with hostname -I
- Ensure ports 10000-10001 are open
2. Unity URDF Import Problems
- Verify URDF file paths are correct
- Check joint hierarchy matches expected structure
- Ensure materials and meshes are properly referenced
3. ROS Communication Failures
- Confirm ROS_MASTER_URI is correctly set
- Check topic names match between ROS and Unity (see the quick checks after this list)
- Verify message types are compatible
4. Robot Connection Issues
- Check USB port permissions: sudo chmod 666 /dev/ttyUSB0
- Verify correct baud rate and port settings
- Test robot connection independently
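For the communication failures above, a few quick checks in a WSL terminal confirm whether Unity's messages are actually reaching ROS (topic names as used in this tutorial):
rostopic list
rostopic echo -n 1 /mycobot/joint_states
rostopic hz /unity/joint_commands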
Conclusion
This tutorial demonstrates a complete integration between WSL, ROS, and Unity for controlling the 6-axis collaborative robot arm, myCobot 280. This setup offers a powerful and flexible development environment tailored for robotics applications.
It enables real-time robot simulation and control, seamless development across Windows and Linux environments, and provides an extensible architecture for implementing advanced robotics features. Additionally, it supports visual feedback and debugging capabilities to streamline the development process.