Chapter 8: Unity Robotics and Visualization
Introduction
Unity has emerged as a powerful platform for robotics visualization, offering photorealistic rendering, advanced physics simulation, and extensive customization capabilities. While Gazebo dominates the research simulation space, Unity excels in visualization, human-robot interaction, and synthetic data generation. This chapter explores how to leverage Unity's capabilities for robotics applications, from real-time visualization to interactive training environments.
Unity's strength lies in its rendering pipeline and cross-platform deployment, making it ideal for robotics applications requiring high-fidelity visualization, user interaction, and mobile deployment.
8.1 Unity for Robotics Overview
8.1.1 Why Unity for Robotics?
Unity offers unique advantages for robotics applications:
Diagram: Unity Robotics Ecosystem
Unity Engine
├── Rendering Pipeline
│   ├── Photorealistic Graphics
│   ├── Real-time Lighting
│   ├── Post-processing Effects
│   └── VR/AR Support
├── Physics System
│   ├── NVIDIA PhysX
│   ├── Custom Physics Materials
│   ├── Collision Detection
│   └── Vehicle Physics
├── Scripting System
│   ├── C# Scripts
│   ├── Visual Scripting
│   ├── Animation Controller
│   └── Timeline System
├── Cross-Platform Support
│   ├── Windows/Linux/macOS
│   ├── iOS/Android
│   ├── VR Platforms
│   └── Web Browsers
└── Asset Pipeline
    ├── 3D Model Import
    ├── Material System
    ├── Animation System
    └── Audio Integration
8.1.2 Unity vs Gazebo Comparison
Diagram: Unity vs Gazebo Comparison
Gazebo
┌─────────────────────────────────────────┐
│ Strengths: │
│ • Accurate physics simulation │
│ • Realistic sensor modeling │
│ • ROS 2 integration │
│ • Open source │
│ • Robotics-focused │
│ │
│ Best for: │
│ • Algorithm testing │
│ • Sensor simulation │
│ • Autonomous navigation │
│ • Multi-robot systems │
└─────────────────────────────────────────┘
Unity
┌─────────────────────────────────────────┐
│ Strengths: │
│ • Photorealistic rendering │
│ • User interaction │
│ • Mobile deployment │
│ • Asset ecosystem │
│ • Visual scripting │
│ │
│ Best for: │
│ • Human-robot interaction │
│ • Training and education │
│ • Synthetic data generation │
│ • Commercial applications │
└─────────────────────────────────────────┘
8.2 Unity-ROS Integration
8.2.1 ROS Bridge Connection
Unity offers two common routes to ROS: the official ROS-TCP-Connector package, which talks to a ros_tcp_endpoint node over plain TCP, and ROS# (RosSharp), which communicates with a rosbridge_server over WebSocket. The example below uses the ROS# RosBridgeClient API; the overall integration pattern is the same for both.
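Before diving into the C# connector, it helps to see what travels over the wire. rosbridge exchanges small JSON envelopes over a WebSocket; the Python sketch below builds the three operations the connector relies on (topic names follow the listing below; `ws://localhost:9090` is rosbridge_server's default endpoint, and the helper functions themselves are illustrative):

```python
import json

ROSBRIDGE_URL = "ws://localhost:9090"  # rosbridge_server default endpoint

def advertise(topic, msg_type):
    # Tell rosbridge we intend to publish on this topic
    return json.dumps({"op": "advertise", "topic": topic, "type": msg_type})

def publish(topic, msg):
    # Send one message on a previously advertised topic
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

def subscribe(topic, msg_type):
    # Ask rosbridge to forward messages from this topic to us
    return json.dumps({"op": "subscribe", "topic": topic, "type": msg_type})

if __name__ == "__main__":
    frames = [
        advertise("/unity/robot_pose", "geometry_msgs/PoseStamped"),
        publish("/unity/robot_pose", {
            "header": {"frame_id": "base_link"},
            "pose": {"position": {"x": 1.0, "y": 0.0, "z": 0.2},
                     "orientation": {"x": 0, "y": 0, "z": 0, "w": 1}},
        }),
        subscribe("/odom", "nav_msgs/Odometry"),
    ]
    for frame in frames:
        print(frame)
```

Every Advertise, Publish, and Subscribe call made by the C# connector reduces to one of these JSON frames; ROS# handles the serialization and the WebSocket transport.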
using UnityEngine;
using System.Collections.Generic;
using RosSharp.RosBridgeClient;
using std_msgs = RosSharp.RosBridgeClient.std_msgs;
using geometry_msgs = RosSharp.RosBridgeClient.geometry_msgs;
using sensor_msgs = RosSharp.RosBridgeClient.sensor_msgs;
using nav_msgs = RosSharp.RosBridgeClient.nav_msgs;
public class UnityROSConnector : MonoBehaviour
{
[Header("ROS Connection Settings")]
public string rosBridgeURL = "ws://localhost:9090";
public bool autoConnect = true;
[Header("Publishing Settings")]
public float publishRate = 30.0f;
public string robotFrameId = "base_link";
[Header("Subscribing Settings")]
public bool subscribeOdometry = true;
public bool subscribeLaserScan = true;
public bool subscribeCameraImage = true;
private RosSocket rosSocket;
private float lastPublishTime;
private bool isConnected = false;
// Publishers
private string robotPoseTopic = "/unity/robot_pose";
private string interactionTopic = "/unity/interaction";
// Subscribers
private string odomTopic = "/odom";
private string scanTopic = "/laser/scan";
private string cameraTopic = "/camera/image_raw";
// Robot state
public Vector3 robotPosition = Vector3.zero;
public Quaternion robotRotation = Quaternion.identity;
// Object references
[SerializeField] private Transform robotTransform;
[SerializeField] private GameObject laserScanner;
[SerializeField] private Material laserMaterial;
void Start()
{
if (autoConnect)
{
ConnectToROS();
}
}
void Update()
{
if (!isConnected) return;
// Update robot position
if (robotTransform != null)
{
robotPosition = robotTransform.position;
robotRotation = robotTransform.rotation;
}
// Publish robot pose
if (Time.time - lastPublishTime > 1.0f / publishRate)
{
PublishRobotPose();
lastPublishTime = Time.time;
}
}
public void ConnectToROS()
{
// ROS# connects through a protocol object, and the RosSocket constructor
// opens the connection, so event handlers are attached to the protocol
// first. The exact protocol class (WebSocketNetProtocol here) depends on
// which ROS# protocol implementation is installed.
var protocol = new RosSharp.RosBridgeClient.Protocols.WebSocketNetProtocol(rosBridgeURL);
protocol.OnConnected += (sender, e) => {
isConnected = true;
Debug.Log("Connected to ROS Bridge");
// Subscribe to topics
if (subscribeOdometry)
SubscribeToOdometry();
if (subscribeLaserScan)
SubscribeToLaserScan();
if (subscribeCameraImage)
SubscribeToCameraImage();
};
protocol.OnClosed += (sender, e) => {
isConnected = false;
Debug.Log("Disconnected from ROS Bridge");
};
rosSocket = new RosSocket(protocol);
}
public void DisconnectFromROS()
{
if (rosSocket != null)
{
rosSocket.Close();
isConnected = false;
}
}
#region Publishers
// ROS# requires a topic to be advertised before publishing; the returned
// publication IDs are cached here.
private string robotPosePublicationId;
private string interactionPublicationId;
private void PublishRobotPose()
{
if (robotPosePublicationId == null)
robotPosePublicationId = rosSocket.Advertise<geometry_msgs.PoseStamped>(robotPoseTopic);
var poseMsg = new geometry_msgs.PoseStamped
{
header = new std_msgs.Header
{
frame_id = robotFrameId
// Note: the header stamp should carry the current ROS time; the
// exact helper for filling it depends on the ROS# version in use.
}
};
// Convert Unity's left-handed, Y-up frame to ROS's right-handed, Z-up frame
poseMsg.pose.position = new geometry_msgs.Point
{
x = robotPosition.z,
y = -robotPosition.x,
z = robotPosition.y
};
poseMsg.pose.orientation = new geometry_msgs.Quaternion
{
x = -robotRotation.z,
y = robotRotation.x,
z = -robotRotation.y,
w = robotRotation.w
};
rosSocket.Publish(robotPosePublicationId, poseMsg);
}
public void PublishInteractionEvent(string eventType, object data)
{
if (interactionPublicationId == null)
interactionPublicationId = rosSocket.Advertise<UnityInteractionMessage>(interactionTopic);
var interactionMsg = new UnityInteractionMessage
{
timestamp = Time.time,
event_type = eventType,
data = data.ToString()
};
rosSocket.Publish(interactionPublicationId, interactionMsg);
}
#endregion
#region Subscribers
private void SubscribeToOdometry()
{
rosSocket.Subscribe<nav_msgs.Odometry>(odomTopic, HandleOdometry);
}
private void SubscribeToLaserScan()
{
rosSocket.Subscribe<sensor_msgs.LaserScan>(scanTopic, HandleLaserScan);
}
private void SubscribeToCameraImage()
{
rosSocket.Subscribe<sensor_msgs.Image>(cameraTopic, HandleCameraImage);
}
#endregion
#region Message Handlers
private void HandleOdometry(nav_msgs.Odometry odom)
{
// Convert the ROS right-handed, Z-up frame to Unity's left-handed, Y-up frame
Vector3 newPosition = new Vector3(
-(float)odom.pose.pose.position.y,
(float)odom.pose.pose.position.z,
(float)odom.pose.pose.position.x
);
Quaternion newRotation = new Quaternion(
(float)odom.pose.pose.orientation.y,
-(float)odom.pose.pose.orientation.z,
-(float)odom.pose.pose.orientation.x,
(float)odom.pose.pose.orientation.w
);
// Update robot transform
if (robotTransform != null)
{
robotTransform.position = newPosition;
robotTransform.rotation = newRotation;
}
}
private void HandleLaserScan(sensor_msgs.LaserScan scan)
{
if (laserScanner != null && laserMaterial != null)
{
// Create laser visualization
VisualizeLaserScan(scan);
}
}
private void HandleCameraImage(sensor_msgs.Image image)
{
// Process camera image
ProcessCameraImage(image);
}
#endregion
#region Visualization
private void VisualizeLaserScan(sensor_msgs.LaserScan scan)
{
// Create line renderer for laser scan
LineRenderer lineRenderer = laserScanner.GetComponent<LineRenderer>();
if (lineRenderer == null)
{
lineRenderer = laserScanner.AddComponent<LineRenderer>();
}
lineRenderer.positionCount = scan.ranges.Length + 1;
lineRenderer.material = laserMaterial;
lineRenderer.startWidth = 0.02f;
lineRenderer.endWidth = 0.02f;
lineRenderer.useWorldSpace = true;
Vector3 laserOrigin = laserScanner.transform.position;
lineRenderer.SetPosition(0, laserOrigin);
for (int i = 0; i < scan.ranges.Length; i++)
{
float angle = scan.angle_min + i * scan.angle_increment;
float range = scan.ranges[i];
if (range > scan.range_min && range < scan.range_max)
{
// The scan sweeps the sensor's horizontal plane, which in Unity is
// the X-Z plane (ROS x maps to Unity z, ROS y to Unity -x). The local
// direction is then rotated into world space by the scanner's transform.
Vector3 localDirection = new Vector3(
-range * Mathf.Sin(angle),
0f,
range * Mathf.Cos(angle)
);
Vector3 point = laserOrigin + laserScanner.transform.TransformDirection(localDirection);
lineRenderer.SetPosition(i + 1, point);
}
else
{
lineRenderer.SetPosition(i + 1, laserOrigin);
}
}
}
private void ProcessCameraImage(sensor_msgs.Image image)
{
// Convert ROS image to Unity texture
StartCoroutine(ProcessImageCoroutine(image));
}
private System.Collections.IEnumerator ProcessImageCoroutine(sensor_msgs.Image image)
{
// Convert image data to Unity texture
Texture2D texture = new Texture2D((int)image.width, (int)image.height, TextureFormat.RGB24, false);
// Load image data (LoadRawTextureData takes only the raw byte buffer;
// dimensions and format come from the texture itself)
byte[] imageData = ConvertRosImageToUnity(image);
texture.LoadRawTextureData(imageData);
texture.Apply();
// Apply texture to appropriate renderer
// This depends on your Unity scene setup
yield return null;
}
private byte[] ConvertRosImageToUnity(sensor_msgs.Image image)
{
// This is a simplified conversion
// In practice, you'd handle different encodings and color formats
byte[] convertedData = new byte[(int)(image.width * image.height) * 3];
// Copy RGB8 data directly; a full implementation would honor image.step
// (row stride/padding), flip rows (ROS images are stored top-to-bottom),
// and convert encodings such as bgr8
int length = Mathf.Min(image.data.Length, convertedData.Length);
for (int i = 0; i < length; i++)
{
convertedData[i] = image.data[i];
}
return convertedData;
}
#endregion
#region Custom Message Classes
[System.Serializable]
public class UnityInteractionMessage
{
public float timestamp;
public string event_type;
public string data;
}
#endregion
void OnDestroy()
{
DisconnectFromROS();
}
}
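The simplified copy in ConvertRosImageToUnity glosses over two details that matter with real sensor_msgs/Image data: the row stride (`step` may include padding beyond `width * 3` bytes) and the encoding (many cameras publish `bgr8` rather than `rgb8`). A language-neutral sketch of the same conversion in Python, with both details handled (field names follow sensor_msgs/Image; the function itself is illustrative):

```python
def ros_image_to_rgb(width, height, step, encoding, data):
    """Repack a sensor_msgs/Image buffer into tightly packed RGB24 bytes."""
    if encoding not in ("rgb8", "bgr8"):
        raise ValueError(f"unsupported encoding: {encoding}")
    out = bytearray(width * height * 3)
    for row in range(height):
        src = row * step           # source rows may be padded to `step` bytes
        dst = row * width * 3      # output rows are tightly packed
        row_bytes = data[src:src + width * 3]
        if encoding == "bgr8":
            # Swap the blue and red channels pixel by pixel
            for px in range(width):
                b, g, r = row_bytes[px * 3:px * 3 + 3]
                out[dst + px * 3:dst + px * 3 + 3] = bytes((r, g, b))
        else:
            out[dst:dst + width * 3] = row_bytes
    return bytes(out)
```

A faithful Unity port would additionally flip the row order before calling LoadRawTextureData, since Unity stores texture rows bottom-to-top while ROS images are top-to-bottom.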
8.2.2 Advanced ROS Integration
using UnityEngine;
using RosSharp.RosBridgeClient;
using System.Collections.Generic;
using System.Threading.Tasks;
public class AdvancedROSHandler : MonoBehaviour
{
[Header("Advanced ROS Settings")]
public bool enableTFMonitoring = true;
public bool enableParameterSynchronization = true;
public bool enableActionServer = true;
private RosSocket rosSocket;
private Dictionary<string, Transform> tfFrames = new Dictionary<string, Transform>();
// TF monitoring
private string tfTopic = "/tf";
private string tfStaticTopic = "/tf_static";
// Parameter synchronization
private string paramGetService = "/get_parameters";
private string paramSetService = "/set_parameters";
// Action server
private string graspAction = "/grasp_execution";
void Start()
{
// Reuse the connection owned by UnityROSConnector. This assumes the
// connector exposes its socket through a public RosSocket property
// (a one-line addition to the previous listing).
rosSocket = FindObjectOfType<UnityROSConnector>()?.RosSocket;
// Initialize advanced features
InitializeTFMonitoring();
InitializeParameterSync();
InitializeActionServer();
}
#region TF Monitoring
private void InitializeTFMonitoring()
{
if (!enableTFMonitoring) return;
// Subscribe to TF and static TF
rosSocket.Subscribe<tf2_msgs.TFMessage>(tfTopic, HandleTFMessage);
rosSocket.Subscribe<tf2_msgs.TFMessage>(tfStaticTopic, HandleTFMessage);
}
private void HandleTFMessage(tf2_msgs.TFMessage tfMsg)
{
foreach (var transform in tfMsg.transforms)
{
UpdateTFFrame(transform);
}
}
private void UpdateTFFrame(geometry_msgs.TransformStamped transformMsg)
{
// Note: tf2_msgs/TFMessage carries geometry_msgs/TransformStamped entries
string frameId = transformMsg.child_frame_id;
if (tfFrames.ContainsKey(frameId))
{
Transform tfTransform = tfFrames[frameId];
// Update transform based on ROS TF data
UpdateUnityTransform(tfTransform, transformMsg);
}
else
{
// Create new TF frame
CreateTFFrame(frameId, transformMsg);
}
}
private void UpdateUnityTransform(Transform unityTransform, geometry_msgs.TransformStamped tfTransform)
{
// Convert the ROS right-handed, Z-up transform to Unity's left-handed, Y-up frame
Vector3 position = new Vector3(
-(float)tfTransform.transform.translation.y,
(float)tfTransform.transform.translation.z,
(float)tfTransform.transform.translation.x
);
Quaternion rotation = new Quaternion(
(float)tfTransform.transform.rotation.y,
-(float)tfTransform.transform.rotation.z,
-(float)tfTransform.transform.rotation.x,
(float)tfTransform.transform.rotation.w
);
unityTransform.localPosition = position;
unityTransform.localRotation = rotation;
}
private void CreateTFFrame(string frameId, geometry_msgs.TransformStamped transformMsg)
{
// Create new GameObject for TF frame
GameObject tfObject = new GameObject($"TF_{frameId}");
tfObject.transform.SetParent(this.transform);
// Store reference
tfFrames[frameId] = tfObject.transform;
// Apply initial transform
UpdateUnityTransform(tfObject.transform, transformMsg);
}
#endregion
#region Parameter Synchronization
private void InitializeParameterSync()
{
if (!enableParameterSynchronization) return;
// Get initial parameters
GetROSParameters();
}
private void GetROSParameters()
{
if (rosSocket == null) return;
var request = new rcl_interfaces.GetParameters.Request
{
names = new string[] { "/unity/simulation_time", "/unity/robot_scale", "/unity/environment" }
};
// ROS# service calls are callback-based rather than awaitable; the
// response handler is invoked when the service replies
rosSocket.CallService<rcl_interfaces.GetParameters.Request, rcl_interfaces.GetParameters.Response>(
paramGetService, HandleParameterResponse, request);
}
private void HandleParameterResponse(rcl_interfaces.GetParameters.Response response)
{
foreach (var param in response.values)
{
UpdateUnityParameter(param);
}
}
private void UpdateUnityParameter(rcl_interfaces.Parameter param)
{
switch (param.name)
{
case "/unity/simulation_time":
if (param.double_value.HasValue)
{
Time.timeScale = (float)param.double_value.Value;
}
break;
case "/unity/robot_scale":
if (param.double_value.HasValue)
{
transform.localScale = Vector3.one * (float)param.double_value.Value;
}
break;
case "/unity/environment":
if (param.string_value != null)
{
LoadUnityEnvironment(param.string_value);
}
break;
}
}
private void LoadUnityEnvironment(string environmentName)
{
// Load Unity environment based on parameter
Debug.Log($"Loading environment: {environmentName}");
// Implementation depends on your Unity setup
}
#endregion
#region Action Server
private void InitializeActionServer()
{
if (!enableActionServer) return;
// Implement action server functionality
Debug.Log("Unity Action Server initialized");
}
public async Task<bool> ExecuteGraspAction(geometry_msgs.PoseStamped targetPose)
{
// Illustrative only: ROS# has no built-in action client, and the
// GraspGoal/GraspResult types and CallAction helper used here are
// placeholders. In practice an action is bridged through its underlying
// goal, feedback, and result topics.
var goal = new example_interfaces.GraspGoal
{
target_pose = targetPose,
grasp_type = "pinch",
max_force = 10.0
};
var response = await rosSocket.CallAction<example_interfaces.GraspGoal, example_interfaces.GraspResult>(
graspAction, goal);
return response?.success ?? false;
}
#endregion
#region Utility Functions
public Vector3 ConvertRosPointToUnity(geometry_msgs.Point rosPoint)
{
return new Vector3(
-(float)rosPoint.y,
(float)rosPoint.z,
(float)rosPoint.x
);
}
public Quaternion ConvertRosQuaternionToUnity(geometry_msgs.Quaternion rosQuat)
{
// The quaternion's vector part maps like an axis between the two frames;
// the handedness change flips its sign (equivalently, conjugation)
return new Quaternion(
(float)rosQuat.y,
-(float)rosQuat.z,
-(float)rosQuat.x,
(float)rosQuat.w
);
}
public geometry_msgs.Point ConvertUnityPointToRos(Vector3 unityPoint)
{
return new geometry_msgs.Point
{
x = unityPoint.z,
y = -unityPoint.x,
z = unityPoint.y
};
}
public geometry_msgs.Quaternion ConvertUnityQuaternionToRos(Quaternion unityQuat)
{
return new geometry_msgs.Quaternion
{
x = -unityQuat.z,
y = unityQuat.x,
z = -unityQuat.y,
w = unityQuat.w
};
}
#endregion
}
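Coordinate conversions between Unity's left-handed, Y-up frame and ROS's right-handed, Z-up frame are notoriously easy to get wrong, so it pays to check them numerically. The Python sketch below mirrors the position mapping used throughout this chapter, derives the quaternion mapping consistent with it, and can verify that converting a rotated vector equals rotating the converted vector (plain quaternion algebra, (x, y, z, w) component order; the function names are illustrative):

```python
def unity_to_ros_vec(v):
    x, y, z = v
    return (z, -x, y)          # Unity (LH, Y-up) -> ROS (RH, Z-up)

def unity_to_ros_quat(q):
    x, y, z, w = q
    return (-z, x, -y, w)      # vector part maps like an axis; handedness flips its sign

def ros_to_unity_quat(q):
    x, y, z, w = q
    return (y, -z, -x, w)      # inverse mapping

def quat_rotate(q, v):
    # v' = q * (v, 0) * conj(q), standard Hamilton product
    qx, qy, qz, qw = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v)
    tx = 2 * (qy * vz - qz * vy)
    ty = 2 * (qz * vx - qx * vz)
    tz = 2 * (qx * vy - qy * vx)
    # v' = v + w * t + cross(q.xyz, t)
    return (vx + qw * tx + qy * tz - qz * ty,
            vy + qw * ty + qz * tx - qx * tz,
            vz + qw * tz + qx * ty - qy * tx)
```

The consistency condition unity_to_ros_vec(rotate(q, v)) == rotate(unity_to_ros_quat(q), unity_to_ros_vec(v)), checked for a few rotations, catches most sign and component-order mistakes before they reach the robot.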
8.3 Photorealistic Rendering
8.3.1 High Definition Render Pipeline (HDRP)
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;
[ExecuteInEditMode]
public class RoboticsHDRPSetup : MonoBehaviour
{
[Header("Volume Settings")]
public VolumeProfile globalVolume;
[Header("Lighting Settings")]
public LightingSetup lightingSetup;
[Header("Post-Processing")]
public bool enableBloom = true;
public bool enableMotionBlur = true;
public bool enableSSR = true; // Screen Space Reflections
void Start()
{
SetupHDRP();
ConfigureVolume();
SetupLighting();
ConfigureCamera();
}
private void SetupHDRP()
{
// Ensure HDRP asset is assigned
var hdrpAsset = GraphicsSettings.renderPipelineAsset as HDRenderPipelineAsset;
if (hdrpAsset == null)
{
Debug.LogWarning("HDRP Asset not found. Please assign HDRP Asset in Graphics Settings.");
return;
}
// Configure HDRP settings
ConfigureHDRPSettings(hdrpAsset);
}
private void ConfigureHDRPSettings(HDRenderPipelineAsset asset)
{
// Most pipeline-level HDRP settings (XR support, ray tracing support) are
// configured on the asset in the Inspector and are effectively read-only
// at runtime; here we only verify what the active asset supports
var settings = asset.currentPlatformRenderPipelineSettings;
if (settings.supportRayTracing)
{
Debug.Log("Active HDRP asset supports ray tracing.");
}
}
private void ConfigureVolume()
{
if (globalVolume == null) return;
VolumeProfile profile = globalVolume.sharedProfile;
if (profile == null)
{
profile = ScriptableObject.CreateInstance<VolumeProfile>();
globalVolume.sharedProfile = profile;
}
// Clear existing settings
profile.components.Clear();
// Add bloom (passing true to Add marks all parameters as overridden,
// so the values set below actually take effect)
if (enableBloom)
{
Bloom bloom = profile.Add<Bloom>(true);
bloom.threshold.value = 1f;
bloom.intensity.value = 0.5f;
bloom.tint.value = Color.white;
bloom.scatter.value = 0.5f;
}
// Add motion blur
if (enableMotionBlur)
{
MotionBlur motionBlur = profile.Add<MotionBlur>(true);
motionBlur.intensity.value = 0.5f;
}
// Add screen space reflections (the exact override properties vary across
// HDRP versions; adjust the quality settings in the Inspector as needed)
if (enableSSR)
{
ScreenSpaceReflection ssr = profile.Add<ScreenSpaceReflection>(true);
ssr.minSmoothness.value = 0.5f;
}
// Add ambient occlusion
AmbientOcclusion ao = profile.Add<AmbientOcclusion>(true);
ao.intensity.value = 0.5f;
ao.radius.value = 0.25f;
}
private void SetupLighting()
{
// Configure directional light (sun)
Light directionalLight = GameObject.FindObjectOfType<Light>();
if (directionalLight != null && directionalLight.type == LightType.Directional)
{
directionalLight.color = Color.white;
directionalLight.intensity = 1.2f;
directionalLight.shadowStrength = 0.8f;
// In HDRP the sky comes from a Volume override (e.g. HDRI Sky or
// Physically Based Sky) rather than RenderSettings.skybox
RenderSettings.ambientSkyColor = Color.white;
}
// Create lighting setup
CreateLightingSetup();
}
private void CreateLightingSetup()
{
if (lightingSetup != null)
{
lightingSetup.SetupForRobotics();
}
else
{
CreateDefaultLighting();
}
}
private void CreateDefaultLighting()
{
// Create additional lights for realistic rendering
CreateFillLight();
CreateRimLight();
}
private void CreateFillLight()
{
GameObject fillLightObj = new GameObject("Fill Light");
Light fillLight = fillLightObj.AddComponent<Light>();
fillLight.type = LightType.Directional;
fillLight.color = new Color(0.8f, 0.9f, 1f, 1f);
fillLight.intensity = 0.3f;
fillLight.shadows = LightShadows.None;
fillLightObj.transform.rotation = Quaternion.Euler(50f, -30f, 0f);
}
private void CreateRimLight()
{
GameObject rimLightObj = new GameObject("Rim Light");
Light rimLight = rimLightObj.AddComponent<Light>();
rimLight.type = LightType.Directional;
rimLight.color = new Color(1f, 0.9f, 0.8f, 1f);
rimLight.intensity = 0.4f;
rimLight.shadows = LightShadows.None;
rimLightObj.transform.rotation = Quaternion.Euler(10f, 120f, 0f);
}
private void ConfigureCamera()
{
Camera mainCamera = Camera.main;
if (mainCamera == null)
{
Debug.LogWarning("Main camera not found.");
return;
}
// Configure camera for robotics visualization
ConfigureMainCamera(mainCamera);
}
private void ConfigureMainCamera(Camera camera)
{
// Set camera parameters
camera.fieldOfView = 60f;
camera.nearClipPlane = 0.1f;
camera.farClipPlane = 100f;
// Add camera components for robotics (RoboticsCameraController is assumed
// to be a project-specific orbit/follow controller, not shown here)
if (camera.GetComponent<RoboticsCameraController>() == null)
{
camera.gameObject.AddComponent<RoboticsCameraController>();
}
// Configure post-processing stack
ConfigureCameraPostProcessing(camera);
}
private void ConfigureCameraPostProcessing(Camera camera)
{
var additionalCameraData = camera.GetComponent<HDAdditionalCameraData>();
if (additionalCameraData == null)
{
additionalCameraData = camera.gameObject.AddComponent<HDAdditionalCameraData>();
}
// Enable post-processing
additionalCameraData.customRenderingSettings = true;
// Configure antialiasing
additionalCameraData.antialiasing = HDAdditionalCameraData.AntialiasingMode.FastApproximateAntialiasing;
// Depth of field for cinematic shots is configured through a DepthOfField
// volume override in HDRP rather than on the camera data
}
}
[System.Serializable]
public class LightingSetup
{
public Color ambientColor = Color.white;
public float ambientIntensity = 0.3f;
public Color sunColor = Color.white;
public float sunIntensity = 1.2f;
public Color skyTint = Color.white;
public float exposure = 1.0f;
public void SetupForRobotics()
{
RenderSettings.ambientLight = ambientColor;
RenderSettings.ambientIntensity = ambientIntensity;
// Configure sky for realistic outdoor rendering
DynamicGI.UpdateEnvironment();
}
}
8.3.2 Material System for Robotics
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;
public class RoboticsMaterialSystem : MonoBehaviour
{
[Header("Material Settings")]
public Material metalMaterial;
public Material plasticMaterial;
public Material glassMaterial;
public Material carbonFiberMaterial;
[Header("Real-time Settings")]
public bool enableDynamicWear = true;
public bool enableEnvironmentalEffects = true;
void Start()
{
CreateAdvancedMaterials();
SetupMaterialProperties();
}
private void CreateAdvancedMaterials()
{
// Create metallic material with PBR properties
metalMaterial = CreateMetallicMaterial();
// Create plastic material with subsurface scattering
plasticMaterial = CreatePlasticMaterial();
// Create glass material with transparency
glassMaterial = CreateGlassMaterial();
// Create carbon fiber material with anisotropic properties
carbonFiberMaterial = CreateCarbonFiberMaterial();
}
private Material CreateMetallicMaterial()
{
Material mat = new Material(Shader.Find("HDRP/Lit"));
// PBR properties
mat.SetFloat("_Metallic", 1.0f);
mat.SetFloat("_Smoothness", 0.8f);
mat.SetColor("_BaseColor", new Color(0.7f, 0.7f, 0.8f, 1.0f));
// Anisotropic highlights for brushed metal (HDRP/Lit enables this through
// a material-feature keyword; names vary slightly across HDRP versions)
mat.EnableKeyword("_MATERIAL_FEATURE_ANISOTROPY");
mat.SetFloat("_Anisotropy", 0.5f);
return mat;
}
private Material CreatePlasticMaterial()
{
Material mat = new Material(Shader.Find("HDRP/Lit"));
// PBR properties for plastic
mat.SetFloat("_Metallic", 0.0f);
mat.SetFloat("_Smoothness", 0.4f);
mat.SetColor("_BaseColor", new Color(0.2f, 0.3f, 0.8f, 1.0f));
// Subsurface scattering (HDRP/Lit exposes this as a material feature;
// a diffusion profile must also be assigned for it to render)
mat.EnableKeyword("_MATERIAL_FEATURE_SUBSURFACE_SCATTERING");
return mat;
}
private Material CreateGlassMaterial()
{
Material mat = new Material(Shader.Find("HDRP/Lit"));
// Transparency (HDRP also requires the render-queue and depth settings
// that the Material Inspector normally applies; a complete setup would
// validate the material after changing the surface type)
mat.SetFloat("_SurfaceType", 1.0f); // Transparent
mat.SetFloat("_BlendMode", 0.0f); // Alpha blending
// Refraction
mat.SetFloat("_IOR", 1.5f);
mat.SetFloat("_Transmission", 0.9f);
// Color
mat.SetColor("_BaseColor", new Color(0.8f, 0.9f, 1.0f, 0.2f));
return mat;
}
private Material CreateCarbonFiberMaterial()
{
Material mat = new Material(Shader.Find("HDRP/Lit"));
// Dark base with metallic highlights
mat.SetFloat("_Metallic", 0.8f);
mat.SetFloat("_Smoothness", 0.6f);
mat.SetColor("_BaseColor", new Color(0.05f, 0.05f, 0.05f, 1.0f));
// Anisotropic weave pattern
mat.EnableKeyword("_MATERIAL_FEATURE_ANISOTROPY");
mat.SetFloat("_Anisotropy", 0.8f);
return mat;
}
private void SetupMaterialProperties()
{
// Enable material animations and effects
if (enableDynamicWear)
{
SetupWearSystem();
}
if (enableEnvironmentalEffects)
{
SetupEnvironmentalEffects();
}
}
private void SetupWearSystem()
{
// Create wear and tear materials
CreateWearMaterials();
// Initialize wear tracking
StartCoroutine(UpdateWearOverTime());
}
private void CreateWearMaterials()
{
// Create materials with procedural wear
// This would involve texture generation and material property updates
}
private System.Collections.IEnumerator UpdateWearOverTime()
{
while (true)
{
// Update material wear based on usage
UpdateMaterialWear();
yield return new WaitForSeconds(1.0f);
}
}
private void UpdateMaterialWear()
{
// Modulate smoothness around a fixed base value; subtracting from the
// current value every tick would drift the material toward zero
const float baseSmoothness = 0.8f;
if (metalMaterial != null)
{
float wearFactor = Mathf.PingPong(Time.time * 0.1f, 0.2f);
metalMaterial.SetFloat("_Smoothness", Mathf.Clamp01(baseSmoothness - wearFactor));
}
}
private void SetupEnvironmentalEffects()
{
// Add water droplets, dust, rust, etc.
// These effects would be applied dynamically based on environment
}
public Material GetMaterialForComponent(string componentType)
{
switch (componentType.ToLower())
{
case "metal":
case "aluminum":
case "steel":
return metalMaterial;
case "plastic":
case "polymer":
return plasticMaterial;
case "glass":
case "transparent":
return glassMaterial;
case "carbon":
case "carbon_fiber":
return carbonFiberMaterial;
default:
return metalMaterial;
}
}
public void ApplyMaterial(GameObject target, string materialType)
{
Material material = GetMaterialForComponent(materialType);
if (material != null)
{
Renderer[] renderers = target.GetComponentsInChildren<Renderer>();
foreach (var renderer in renderers)
{
renderer.material = material;
}
}
}
}
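The wear effect in the class above comes down to a small model: surface smoothness should be computed from accumulated usage relative to a fixed base value and clamped to a physical range, not repeatedly subtracted from the current value (which drifts toward zero). A minimal sketch of such a model, with illustrative parameter values:

```python
def worn_smoothness(base, usage_hours, wear_rate=0.01, floor=0.1):
    """Smoothness after `usage_hours` of wear, never below `floor`."""
    worn = base - wear_rate * usage_hours
    # Clamp between the wear floor and the pristine base value
    return max(floor, min(base, worn))
```

Because the result is a pure function of accumulated usage, it can be evaluated every frame without history-dependent drift, and the same value can be replayed deterministically for synthetic-data generation.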
8.4 Interactive Robotics Applications
8.4.1 Virtual Robot Control Interface
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;
public class RobotControlInterface : MonoBehaviour
{
[Header("Control References")]
public Joystick movementJoystick;
public Button graspingButton;
public Slider jointSliders;
public Text statusText;
[Header("Robot Information")]
public float maxLinearVelocity = 2.0f;
public float maxAngularVelocity = 1.0f;
private Vector2 movementInput = Vector2.zero;
private bool isGrasping = false;
private float[] jointPositions = new float[6];
// ROS communication
private UnityROSConnector rosConnector;
void Start()
{
// Initialize UI elements
InitializeUI();
// Find ROS connector
rosConnector = FindObjectOfType<UnityROSConnector>();
// Setup event handlers
SetupEventHandlers();
}
private void InitializeUI()
{
// Setup joystick (assumes a joystick component in the style of the
// common "Joystick Pack" asset, which exposes DeadZone and AxisOptions)
if (movementJoystick != null)
{
movementJoystick.DeadZone = 0.1f;
movementJoystick.AxisOptions = AxisOptions.Both;
}
// Setup sliders
if (jointSliders != null)
{
SetupJointSliders();
}
// Setup buttons
if (graspingButton != null)
{
graspingButton.onClick.AddListener(ToggleGrasping);
}
}
private void SetupJointSliders()
{
// The listing uses a single slider for brevity; a real interface would
// declare Slider[] jointSliders, one per joint, and wire each listener
// with a captured copy of the loop index (int index = i;) to avoid the
// classic loop-variable closure pitfall
jointSliders.minValue = -180f;
jointSliders.maxValue = 180f;
jointSliders.value = 0f;
jointSliders.onValueChanged.AddListener((value) => UpdateJointPosition(0, value));
}
private void SetupEventHandlers()
{
// Movement joystick: most joystick assets expose a polled Direction
// property rather than an event, so input is often read in Update();
// OnValueUpdated here stands in for whichever mechanism your joystick
// component provides
if (movementJoystick != null)
{
movementJoystick.OnValueUpdated += HandleMovementInput;
}
// Virtual buttons for other controls
SetupVirtualButtons();
}
private void SetupVirtualButtons()
{
// Create virtual buttons for additional controls
CreateVirtualButton("Home", new Vector2(100, 100), HomeRobot);
CreateVirtualButton("Stop", new Vector2(200, 100), StopRobot);
CreateVirtualButton("Emergency Stop", new Vector2(300, 100), EmergencyStop);
}
private void CreateVirtualButton(string label, Vector2 position, UnityEngine.Events.UnityAction callback)
{
GameObject buttonObj = new GameObject($"Button_{label}");
buttonObj.transform.SetParent(this.transform);
RectTransform rectTransform = buttonObj.AddComponent<RectTransform>();
rectTransform.anchoredPosition = position;
rectTransform.sizeDelta = new Vector2(80, 40);
// Style the button (the Image is added first so it can serve as the
// button's target graphic)
Image buttonImage = buttonObj.AddComponent<Image>();
buttonImage.color = new Color(0.2f, 0.2f, 0.2f, 0.8f);
Button button = buttonObj.AddComponent<Button>();
button.targetGraphic = buttonImage;
button.onClick.AddListener(callback);
// The label goes on a child object: Unity allows only one Graphic
// component (Image or Text) per GameObject
GameObject textObj = new GameObject("Label");
textObj.transform.SetParent(buttonObj.transform, false);
RectTransform textRect = textObj.AddComponent<RectTransform>();
textRect.anchorMin = Vector2.zero;
textRect.anchorMax = Vector2.one;
textRect.sizeDelta = Vector2.zero;
Text buttonText = textObj.AddComponent<Text>();
buttonText.text = label;
buttonText.font = Resources.GetBuiltinResource<Font>("Arial.ttf"); // "LegacyRuntime.ttf" on Unity 2022.2+
buttonText.fontSize = 12;
buttonText.color = Color.white;
buttonText.alignment = TextAnchor.MiddleCenter;
ButtonStateHelper stateHelper = buttonObj.AddComponent<ButtonStateHelper>();
stateHelper.SetupButton(button);
}
#region Control Methods
private void HandleMovementInput(Vector2 input)
{
movementInput = input;
// Send movement commands to robot
SendMovementCommand(input);
}
private void ToggleGrasping()
{
isGrasping = !isGrasping;
if (graspingButton != null)
{
Text buttonText = graspingButton.GetComponentInChildren<Text>();
if (buttonText != null)
{
buttonText.text = isGrasping ? "Release" : "Grasp";
}
Color buttonColor = isGrasping ? Color.red : Color.green;
Image buttonImage = graspingButton.GetComponent<Image>();
if (buttonImage != null)
{
buttonImage.color = buttonColor;
}
}
// Send grasping command
SendGraspingCommand(isGrasping);
}
private void UpdateJointPosition(int jointIndex, float angle)
{
jointPositions[jointIndex] = angle;
// Send joint position command
SendJointPositionCommand(jointIndex, angle);
}
private void HomeRobot()
{
// Move all joints to home position
for (int i = 0; i < jointPositions.Length; i++)
{
jointPositions[i] = 0f;
if (jointSliders != null)
{
jointSliders.value = 0f;
}
}
SendHomeCommand();
}
private void StopRobot()
{
// Stop all movement
movementInput = Vector2.zero;
isGrasping = false;
SendStopCommand();
}
private void EmergencyStop()
{
// Immediately stop all robot operations
EmergencyStopRobot();
// Show emergency stop status
if (statusText != null)
{
statusText.text = "EMERGENCY STOP ACTIVATED";
statusText.color = Color.red;
}
}
#endregion
#region ROS Communication
// The methods below assume UnityROSConnector exposes a small public
// Publish(topic, message) wrapper around its RosSocket (a straightforward
// addition to the connector listing earlier in this chapter).
private void SendMovementCommand(Vector2 input)
{
if (rosConnector == null) return;
// Convert joystick input to velocity commands
float linearVelocity = input.y * maxLinearVelocity;
float angularVelocity = input.x * maxAngularVelocity;
// Create Twist message
var twistMsg = new geometry_msgs.Twist();
twistMsg.linear.x = linearVelocity;
twistMsg.angular.z = angularVelocity;
// Publish to ROS
rosConnector.Publish("/cmd_vel", twistMsg);
// Update status
UpdateStatus($"Velocity: linear={linearVelocity:F2}, angular={angularVelocity:F2}");
}
private void SendGraspingCommand(bool grasp)
{
if (rosConnector == null) return;
// Create gripper command
var gripperMsg = new GripperCommand();
gripperMsg.position = grasp ? 1.0f : 0.0f;
gripperMsg.max_effort = 10.0f;
// Publish to ROS
rosConnector.Publish("/gripper/command", gripperMsg);
UpdateStatus($"Grasping: {(grasp ? "Engaged" : "Released")}");
}
private void SendJointPositionCommand(int jointIndex, float angle)
{
if (rosConnector == null) return;
// Create joint command (published on a command topic; /joint_states is
// conventionally the robot's reported state, not a command channel.
// The topic name here is illustrative.)
var jointMsg = new sensor_msgs.JointState();
jointMsg.name = new string[] { $"joint_{jointIndex + 1}" };
jointMsg.position = new double[] { angle * Mathf.Deg2Rad };
jointMsg.velocity = new double[] { 0.0 };
jointMsg.effort = new double[] { 0.0 };
// Publish to ROS
rosConnector.Publish("/joint_commands", jointMsg);
}
private void SendHomeCommand()
{
if (rosConnector == null) return;
// Create home command
var commandMsg = new std_msgs.String();
commandMsg.data = "home";
rosConnector.Publish("/robot/command", commandMsg);
UpdateStatus("Moving to home position");
}
private void SendStopCommand()
{
if (rosConnector == null) return;
// Create stop command
var commandMsg = new std_msgs.String();
commandMsg.data = "stop";
rosConnector.Publish("/robot/command", commandMsg);
UpdateStatus("Robot stopped");
}
private void EmergencyStopRobot()
{
if (rosConnector == null) return;
// Create emergency stop command
var eStopMsg = new std_msgs.Bool();
eStopMsg.data = true;
rosConnector.Publish("/emergency_stop", eStopMsg);
UpdateStatus("EMERGENCY STOP ACTIVATED");
}
#endregion
#region UI Updates
private void UpdateStatus(string message)
{
if (statusText != null)
{
statusText.text = message;
statusText.color = Color.white;
}
}
private void UpdateRobotStatus(float deltaTime)
{
// Update status based on robot state
if (movementInput.magnitude > 0.1f)
{
UpdateStatus($"Robot moving: v={movementInput.magnitude:F2}");
}
else if (isGrasping)
{
UpdateStatus("Robot grasping");
}
else
{
UpdateStatus("Robot idle");
}
}
#endregion
void Update()
{
// Update status display
UpdateRobotStatus(Time.deltaTime);
}
#region Utility Classes
public class ButtonStateHelper : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler, IPointerDownHandler, IPointerUpHandler
{
private Button button;
private Image image;
private Text text;
void Start()
{
button = GetComponent<Button>();
image = GetComponent<Image>();
text = GetComponentInChildren<Text>();
}
public void SetupButton(Button targetButton)
{
button = targetButton;
image = targetButton.GetComponent<Image>();
text = targetButton.GetComponentInChildren<Text>();
}
public void OnPointerEnter(PointerEventData eventData)
{
if (image != null)
{
image.color = Color.gray;
}
}
public void OnPointerExit(PointerEventData eventData)
{
if (image != null)
{
image.color = Color.white;
}
}
public void OnPointerDown(PointerEventData eventData)
{
if (image != null)
{
image.color = new Color(0.7f, 0.7f, 0.7f, 1f);
}
}
public void OnPointerUp(PointerEventData eventData)
{
if (image != null)
{
image.color = Color.white;
}
}
}
#endregion
}
// Custom message definitions for ROS communication
[System.Serializable]
public class GripperCommand
{
public float position;
public float max_effort;
public bool relative;
}
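The mapping from joystick deflection to velocity in SendMovementCommand is simple scaling, but two refinements matter on real hardware: a radial dead zone (so a slightly off-center stick does not make the robot creep) and clamping to the velocity limits. A language-neutral sketch of that mapping in Python (the 0.1 dead zone mirrors the DeadZone used in the interface; the limits follow maxLinearVelocity and maxAngularVelocity, and the function name is illustrative):

```python
import math

def joystick_to_twist(x, y, max_linear=2.0, max_angular=1.0, dead_zone=0.1):
    """Map a joystick deflection in [-1, 1]^2 to (linear, angular) velocity."""
    magnitude = math.hypot(x, y)
    if magnitude < dead_zone:
        return (0.0, 0.0)          # inside the dead zone: no motion
    # Rescale so output ramps from 0 at the dead-zone edge to 1 at full deflection
    scale = (magnitude - dead_zone) / (1.0 - dead_zone) / magnitude
    linear = max(-max_linear, min(max_linear, y * scale * max_linear))
    angular = max(-max_angular, min(max_angular, x * scale * max_angular))
    return (linear, angular)
```

Ramping from the dead-zone edge rather than jumping avoids a velocity discontinuity the moment the stick leaves the dead zone, which reads as a jerk on the physical robot.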
Summary
This chapter explored Unity's capabilities for robotics visualization, covering integration with ROS 2, photorealistic rendering, and interactive applications. Unity's strength lies in its rendering pipeline and cross-platform deployment, making it ideal for human-robot interaction, training, and commercial applications.
Key takeaways:
- Unity offers superior rendering quality compared to traditional simulators
- ROS-Unity integration enables bidirectional communication
- HDRP provides photorealistic rendering for realistic visualization
- Material systems enable realistic appearance of robot components
- Interactive interfaces enable natural human-robot interaction
- Cross-platform deployment enables mobile and web applications
Exercises
Exercise 8.1: Unity-ROS Integration
Implement a complete Unity-ROS integration:
- Set up ROS-TCP-connector
- Implement bidirectional communication
- Handle multiple message types
- Test with real robot data
- Validate coordinate transformations
Exercise 8.2: Photorealistic Rendering
Create a photorealistic robot visualization:
- Configure HDRP settings
- Create advanced PBR materials
- Implement environmental effects
- Add proper lighting setup
- Measure rendering performance
Exercise 8.3: Interactive Control Interface
Build an interactive robot control interface:
- Design intuitive UI controls
- Implement virtual joysticks
- Add visual feedback
- Handle emergency stops
- Validate usability
Exercise 8.4: Material System
Develop an advanced material system:
- Create realistic metal, plastic, glass materials
- Implement wear and tear effects
- Add environmental interactions
- Optimize for performance
- Test with different lighting
Exercise 8.5: Cross-Platform Deployment
Deploy Unity robotics application:
- Build for different platforms
- Optimize for mobile devices
- Implement WebGL deployment
- Test performance across devices
- Document deployment process
Glossary Terms
- HDRP (High Definition Render Pipeline): Unity's high-end rendering system for photorealistic graphics
- ROS-TCP-Connector: Unity package for ROS communication over a dedicated TCP endpoint (ros_tcp_endpoint); the related ROS# asset instead uses the rosbridge WebSocket protocol
- PBR (Physically Based Rendering): Material system that simulates real-world light behavior
- Material System: Unity system for defining surface properties and appearance
- Volume System: Unity's post-processing and visual effects system
- Anisotropic Materials: Materials with direction-dependent surface properties
- Subsurface Scattering: Light transport through translucent materials
- Motion Blur: Visual effect simulating camera motion
- Screen Space Reflections: Real-time reflection technique
- Virtual Reality (VR): Immersive visualization technology
- Augmented Reality (AR): Overlaying digital information on real world