Signaling Protocol Specification
This document describes the signaling protocol that the IROC Bridge package uses to communicate with clients, i.e. between the web client and ROS.
HTTP API
You can use the HTTP API to send requests that interact with the robots and to receive status information. Requests and responses are in JSON format.
Robot control
Endpoints for controlling the robots.
- GET /robots: List available robots.
- POST /robots/{robot_name}/takeoff: Command takeoff (single robot).
- POST /robots/takeoff: Command takeoff for all robots.
- POST /robots/{robot_name}/hover: Command hover (single robot).
- POST /robots/hover: Command hover for all robots.
- POST /robots/{robot_name}/land: Command land (single robot).
- POST /robots/land: Command land for all robots.
- POST /robots/{robot_name}/home: Command land home (single robot).
- POST /robots/home: Command land home for all robots.
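The single-robot and fleet-wide variants above share a common URL pattern, so a client helper can build the endpoint URL for either case. This is an illustrative sketch; the base URL is a placeholder assumption, and only the paths come from this spec:

```python
from typing import Optional

# Actions shared by the single-robot and fleet-wide control endpoints above.
ACTIONS = {"takeoff", "hover", "land", "home"}

def control_url(base: str, action: str, robot_name: Optional[str] = None) -> str:
    """Build the control endpoint URL; fleet-wide when robot_name is None."""
    if action not in ACTIONS:
        raise ValueError(f"unknown action: {action}")
    if robot_name is None:
        return f"{base}/robots/{action}"            # e.g. POST /robots/takeoff
    return f"{base}/robots/{robot_name}/{action}"   # e.g. POST /robots/uav1/takeoff
```

For example, control_url("http://localhost:8080", "takeoff", "uav1") yields http://localhost:8080/robots/uav1/takeoff, which the client would then POST to.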
Environment setup
Endpoints for controlling the robot's environment.
NOTE
Each step in this sequence depends on the successful completion of the previous step. Please ensure that you first initialize the origin, then the borders, and finally the obstacles, in that exact order.
- POST /safety-area/origin: Set the world origin.

  We currently only support frame_id in LATLON (id: 0).

  Body raw (json):
  { "frame_id": 0, "x": 47.397978, "y": 8.545299 }
- POST /safety-area/borders: Set the safety area borders.

  Body raw (json):
{ "points": [ { "x": 47.39776, "y": 8.545254 }, { "x": 47.397719, "y": 8.545436 }, { "x": 47.397601, "y": 8.545367 }, { "x": 47.397657, "y": 8.545191 } ], "height_id": 1, "max_z": 347, "min_z": 343 }
- POST /safety-area/obstacles: Set the safety area obstacles.

  Body raw (json):
{ "points": [ { "x": 47.39776, "y": 8.545254 }, { "x": 47.397719, "y": 8.545436 }, { "x": 47.397601, "y": 8.545367 }, { "x": 47.397657, "y": 8.545191 } ], "height_id": 1, "max_z": 347, "min_z": 343 }
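Since each safety-area step depends on the previous one, a client can make the required order explicit in code. A minimal sketch: the endpoint paths and the frame_id restriction come from this spec, while the function itself and the (path, body) representation are illustrative:

```python
def safety_area_steps(origin: dict, borders: dict, obstacles: dict) -> list:
    """Return (path, body) pairs in the order they must be POSTed:
    origin first, then borders, then obstacles."""
    if origin.get("frame_id") != 0:
        # Only frame_id 0 (LATLON) is currently supported.
        raise ValueError("only frame_id 0 (LATLON) is supported")
    return [
        ("/safety-area/origin", origin),
        ("/safety-area/borders", borders),
        ("/safety-area/obstacles", obstacles),
    ]
```

A client would POST each pair in turn and stop on the first failure, since later steps depend on the earlier ones succeeding.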
Missions
Missions are handled by the IROC Fleet Manager: a node responsible for sending missions to the robots, monitoring their progress, and forwarding the aggregated information to the IROC Bridge.
- POST /mission/waypoints: Set the waypoints for the mission.

  Body raw (json):
{ "mission": [ { "robot_name": "uav1", "frame_id": 0, "height_id": 0, "points": [ { "x": 10, "y": 10, "z": 2, "heading": 1 }, { "x": -10, "y": 10, "z": 2, "heading": 3 } ], "terminal_action": 0 }, { "robot_name": "uav2", "frame_id": 0, "height_id": 0, "points": [ { "x": 20, "y": 5, "z": 3, "heading": 0 } ], "terminal_action": 0 } ] }
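The mission array above can be assembled programmatically. A sketch of a payload builder using the field names from the example body; the helper names and the waypoint completeness check are illustrative:

```python
def robot_waypoints(robot_name: str, points: list, frame_id: int = 0,
                    height_id: int = 0, terminal_action: int = 0) -> dict:
    """One entry of the "mission" array: a robot and its ordered waypoints."""
    for p in points:
        missing = {"x", "y", "z", "heading"} - p.keys()
        if missing:
            raise ValueError(f"waypoint missing fields: {sorted(missing)}")
    return {
        "robot_name": robot_name,
        "frame_id": frame_id,
        "height_id": height_id,
        "points": list(points),
        "terminal_action": terminal_action,
    }

def waypoint_mission(*robots: dict) -> dict:
    """Body for POST /mission/waypoints."""
    return {"mission": list(robots)}
```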
- POST /mission/coverage: Set a coverage mission.

  Body raw (json):
{ "robots": [ "uav1", "uav2" ], "search_area": [ {"x": 47.397978, "y": 8.545299}, {"x": 47.397848, "y": 8.545872}, {"x": 47.397551, "y": 8.545720}, {"x": 47.397699, "y": 8.545129} ], "height_id": 0, "height": 10, "terminal_action": 0 }
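The coverage body can likewise be built with a small helper. A sketch using the field names from the example above; the minimum-polygon check is an illustrative client-side guard, not part of the spec:

```python
def coverage_mission(robots: list, search_area: list, height: float,
                     height_id: int = 0, terminal_action: int = 0) -> dict:
    """Body for POST /mission/coverage: cover a polygonal search area."""
    if len(search_area) < 3:
        raise ValueError("search_area must have at least 3 points")
    return {
        "robots": list(robots),
        "search_area": list(search_area),
        "height_id": height_id,
        "height": height,
        "terminal_action": terminal_action,
    }
```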
- POST /mission/autonomy-test: Set the autonomy test mission.

  Body raw (json):
{ "robot_name": "uav1", "segment_length": 2 }
Mission Control Endpoints
We support both fleet-wide and individual robot mission control.
Fleet Mission Control:
These endpoints control the mission status for all assigned robots at once:

- POST /mission/start: Start the mission for all robots.
- POST /mission/pause: Pause the mission for all robots.
- POST /mission/stop: Stop the mission for all robots.
Robot Mission Control:
You can also control the mission of an individual robot using these endpoints:

- POST /robots/{robot_name}/mission/start: Start the mission for a specific robot.

  NOTE
  Starting a mission for a single robot will activate that robot while the others remain in a waiting state. You can later use the /mission/start endpoint to activate the remaining robots and continue the mission.

- POST /robots/{robot_name}/mission/pause: Pause the mission for a specific robot.

- POST /robots/{robot_name}/mission/stop: Stop the mission for a specific robot.

  NOTE
  Stopping the mission for a single robot will also abort the overall mission and stop all other robots. This behavior is intentional, as the mission assumes the participation of all assigned robots.
Feedback
During an active mission, a feedback message is broadcast to the connected clients through a WebSocket on the /telemetry path.
- onmessage: Waypoint Mission and Autonomy Test Feedback.

  Message raw (json):
{ "type": "WaypointMissionFeedback", "progress": 0.75, "mission_state": "IN_PROGRESS", "message": "EXECUTING", "robots": [ { "robot_name": "uav1", "message": "EXECUTING", "mission_progress": 0.6, "current_goal": 2, "distance_to_goal": 15.3, "goal_estimated_arrival_time": 30, "goal_progress": 0.8, "distance_to_finish": 50.2, "finish_estimated_arrival_time": 50 }, { "robot_name": "uav2", "message": "EXECUTING", "mission_progress": 0.45, "current_goal": 1, "distance_to_goal": 5.7, "goal_estimated_arrival_time": 30, "goal_progress": 0.95, "distance_to_finish": 75.8, "finish_estimated_arrival_time": 50 } ] }
NOTE
Autonomy test follows the same structure as the waypoint mission feedback, but it will always contain only one robot.
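A client can turn a feedback message into a short progress report. An illustrative parser over the field names from the example message; the output format is an assumption:

```python
def summarize_feedback(msg: dict) -> list:
    """Return one text line for the mission plus one per robot."""
    lines = [f"mission {msg['mission_state']}: {msg['progress']:.0%}"]
    for r in msg.get("robots", []):
        lines.append(
            f"  {r['robot_name']}: goal {r['current_goal']} "
            f"({r['goal_progress']:.0%}), {r['distance_to_goal']} m to goal"
        )
    return lines
```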
Result
When a mission finishes, the result message is sent to:

POST http://server:8000/api/missions/result
Send the result of the mission.
Body raw (json)
{
"success": true,
"message": "All robots finished successfully",
"robot_results": [
{
"robot_name": "uav1",
"success": true,
"message": "Robot finished successfully"
},
{
"robot_name": "uav2",
"success": true,
"message": "Robot finished successfully"
}
]
}
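On the receiving side, the server can inspect robot_results to see which robots failed. A minimal sketch over the field names shown above; the helper itself is illustrative:

```python
def failed_robots(result: dict) -> list:
    """Names of robots whose result was not successful."""
    return [r["robot_name"]
            for r in result.get("robot_results", [])
            if not r["success"]]
```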
WebSocket API
You can use the WebSocket API to receive robot telemetry and to send requests that control the robots.
Telemetry
Robot data and status are received periodically on the /telemetry path.
- onmessage: General Robot Info.

  Message raw (json):
{ "errors": [], "type": "GeneralRobotInfo", "ready_to_start": 1, "problems_preventing_start": [], "battery_state": { "wh_drained": -1, "percentage": -1, "voltage": -1 }, "robot_type": 0, "robot_name": "uav2" }
- onmessage: State Estimation Info.

  Message raw (json):
{ "type": "StateEstimationInfo", "switchable_estimators": [ "gps_baro", "gps_garmin" ], "velocity": { "angular": { "z": 0, "y": 0, "x": 0 }, "linear": { "z": 4.6765261112091244e-21, "y": 0, "x": 0 } }, "global_pose": { "heading": 1.02729905983773, "altitude": 340, "longitude": 8.545800727209587, "latitude": 47.39776586900617 }, "local_pose": { "z": 0.059999996605801006, "heading": 1.02729905983773, "y": 2.4504742256806935, "x": 15.614331170562465 }, "current_estimator": "gps_baro", "above_ground_level_height": 0.059999996605801, "running_estimators": [ "gps_baro", "gps_garmin" ], "acceleration": { "angular": { "z": 0, "y": 0, "x": 0 }, "linear": { "z": 1.0095692646347513e-18, "y": 0, "x": 0 } }, "estimation_frame": "uav2/gps_garmin_origin", "robot_name": "uav2" }
- onmessage: Control Info.

  Message raw (json):
{ "type": "ControlInfo", "thrust": null, "available_trackers": [], "active_tracker": "unknown", "available_controllers": [], "active_controller": "unknown", "robot_name": "uav2" }
- onmessage: Collision Avoidance Info.

  Message raw (json):
{ "type": "CollisionAvoidanceInfo", "other_robots_visible": [ "uav1" ], "collision_avoidance_enabled": 1, "avoiding_collision": 0, "robot_name": "uav2" }
- onmessage: UAV Info.

  Message raw (json):
{ "mass_nominal": null, "type": "UavInfo", "flight_duration": 0, "flight_state": "OFFBOARD", "offboard": 1, "armed": 1, "robot_name": "uav2" }
- onmessage: System Health Info.

  Message raw (json):
{ "free_ram": 22.789223, "robot_name": "uav2", "cpu_load": 10.102389, "mag_strength": null, "total_ram": 30.061069, "type": "SystemHealthInfo", "mag_uncertainty": null, "free_hdd": 1393, "state_estimation_rate": 20.080807, "hw_api_rate": 99.019608, "control_manager_rate": 0.990196, "gnss_uncertainty": 0, "node_cpu_loads": [ ["/uav2/hw_api", 1.09215], ["/uav2/constraint_manager", 1.09215], ["/uav2/control_manager", 1.09215], ["/uav2/estimation_manager", 0] ], "available_sensors": [ { "name": "pixhawk", "status": "NOT_IMPLEMENTED", "ready": 1, "rate": -1 }, { "rate": -1, "ready": 1, "status": "NOT_IMPLEMENTED", "name": "garmin_down" } ] }
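Each telemetry message carries a type field, so a client can dispatch on it. An illustrative sketch: the type strings come from the examples above, while the handler registry is an assumption:

```python
import json

def dispatch(raw: str, handlers: dict):
    """Parse one WebSocket telemetry message and route it by its "type"."""
    msg = json.loads(raw)
    handler = handlers.get(msg.get("type"))
    if handler is None:
        return None  # unknown or unhandled message type
    return handler(msg)
```

For example, registering {"GeneralRobotInfo": lambda m: m["robot_name"]} and feeding it a GeneralRobotInfo message returns that robot's name.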
Robot remote control
You can use the WebSocket API to control the robots on the /rc path.
- onmessage: Message. Similar to a ping WebSocket message.

  Message raw (json):
{ "command": "message", "data": "Hello, World!" }
- onmessage: Movement. The UAV is controlled with normalized linear (x, y, z) and angular yaw (heading) velocities.

  Message raw (json):
{ "command": "move", "robot_name": "uav1", "data": { "x": 1.0, "y": -0.5, "z": 0, "heading": 1.0 } }
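Since the velocities are normalized, a client may want to clamp its inputs to [-1, 1] before sending. An illustrative builder for the move command; the clamping is a client-side safeguard assumed here, not a stated requirement of the protocol:

```python
def move_command(robot_name: str, x: float, y: float, z: float, heading: float) -> dict:
    """Build a "move" message with each velocity clamped to [-1, 1]."""
    clamp = lambda v: max(-1.0, min(1.0, float(v)))
    return {
        "command": "move",
        "robot_name": robot_name,
        "data": {"x": clamp(x), "y": clamp(y), "z": clamp(z), "heading": clamp(heading)},
    }
```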
Camera stream using WebRTC
Camera streaming is available. The setup can be tested by starting the Gazebo simulator with the camera argument:

./start --camera

This will start the WebRTC server and make the camera stream available for visualization on port 9090 of the server.
NOTE
Please follow the dependency installation instructions in the webrtc_ros repository. A detailed example of how the integration can be done is here.