Soccer Simulation - Sensor usage

Hello Everyone :smiley:

This topic was created to collect ideas / suggestions / questions about using sensors in Soccer Simulation.

Did you know you can add sensors to the soccer robot? Have you always wanted to shorten the latency the robot suffers while waiting for position data from the referee? What about adding GPS and Compass sensors to the robots? To achieve this, only the world file (worlds/soccer.wbt) and the robot controller (controllers/<team>/rcj_soccer_robot.py) need to be changed.
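As a sketch of the world-file side of the change, the two device nodes could be added to each robot's list of devices, roughly like this (the field layout and `name` values are my assumptions; the exact slot depends on how the robot node is defined in soccer.wbt):

```
GPS {
  name "gps"
}
Compass {
  name "compass"
}
```

The `name` strings are what the controller passes to `getDevice()` to look the sensors up.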

As a starting point, here is a modified version of the master branch:
https://orkabot.com/rcj-soccer-sim-master-sens.zip (try to copy/paste it to a new tab if it’s not working)

All the robots are now enhanced with a GPS and a Compass sensor, and the robot's own position (data[self.name]) is now calculated from these sensors. You gain about 3-4 world time steps (1-2 robot time steps), so you will have your exact position sooner than by relying on the position data received from the referee. Any code descended from the RCJSoccerRobot class should still work, so this change should not have any impact on your robot's abilities. All the original source files are also included as *_orig.*, so you can compare them for the changes.
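As a rough illustration of what "calculated from these sensors" can mean, here is a minimal sketch that turns raw GPS/Compass readings into an (x, y, heading) tuple. The axis convention and the helper name are my own assumptions for illustration, not the actual code from the zip, so verify them against your world's coordinate system:

```python
import math

def pose_from_sensors(gps_values, compass_values):
    """Derive (x, y, heading) from raw Webots sensor readings.

    gps_values:     3-vector as returned by gps.getValues()
    compass_values: 3-vector as returned by compass.getValues()

    The axis mapping below is an assumption; check it against the
    coordinate system used in your soccer.wbt.
    """
    x, y = gps_values[0], gps_values[1]
    # The compass vector points toward world north; the heading is the
    # angle of that vector projected onto the ground plane.
    heading = math.atan2(compass_values[0], compass_values[1])
    return x, y, heading

# Synthetic readings: compass aligned with north -> heading 0.
print(pose_from_sensors([0.1, -0.2, 0.0], [0.0, 1.0, 0.0]))  # -> (0.1, -0.2, 0.0)
```

In the actual controller these values would come from `self.gps.getValues()` and `self.compass.getValues()` once both devices have been enabled with the robot's time step.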

Technical Challenge #2 in RoboCup Junior 2021 Soccer Simulation: if you are interested in trying out these sensors in the challenge, you might want to consider using the following modified C2 branch:
https://orkabot.com/rcj-soccer-sim-C2-sens.zip (try to copy/paste it to a new tab if it’s not working)

Have fun!

Cheers,
Robert

PS.: Running the simulation with six robots using the sensors on a slow machine may have a negative impact on the simulation performance. Before using any sample in a competition, please run your own experiments in your own environment.


Here is the link for the official GitHub repository:


Hello Everyone :smiley:

Did you know you can add a Lidar to your soccer robot?

Here is a modified version of the master branch, where the robots use a Lidar for collision avoidance:
https://orkabot.com/rcj-soccer-sim-master-sens2.zip (try to copy/paste it to a new tab if it’s not working)

Here are a few tips for using the Lidars:

  1. You will surely want to enable Lidar endpoint visualization via the View | Optional Rendering | Show Lidar Point Cloud menu. For this to work, you also have to enable the point cloud from the Python source code by calling self.lidar.enablePointCloud(). This can be resource intensive, so for production use you may want to disable it if your program does not use the point cloud data.
  2. In the Soccer Simulation worlds the Lidar gives you back an array with 360 values. The value at index 0 points forward, index 90 to the right, index 180 backward, and index 270 to the left.
  3. If a range value from a Lidar is 0, you can treat it as an error. Also, if the returned value is inf, the distance in that direction exceeded the Lidar’s maximum measuring range.
  4. If you are not familiar with Lidars yet, take your time for some experiments. In the simulation the Lidar has a predefined refresh rate (~10 Hz -> 10 * 360 = 3600 values/sec -> 3.6 values per 0.001 sec). Therefore, during one world time step (0.032 sec) only a part of the range data is refreshed (32 * 3.6 ≈ 115 values); the remaining values come from previous measurements, if there are any. So be prepared that most of the values carry some latency; check the pictures in the Appendix below. You may need to stand still for a few time steps to get all of your range data refreshed. During experiments, try your Lidar with the robot standing still at first, then rotate slowly while checking the Lidar visualization with the menu above, then move forward slowly, and so on.
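The index convention and the error values from tips 2-3 can be wrapped in a small helper. This is a sketch with an assumed function name, not code from the zip:

```python
import math

def range_at(ranges, angle_deg):
    """Look up a Lidar distance by bearing, using the convention that
    index 0 is forward, 90 right, 180 back, 270 left (360 values total).

    Returns None for a 0.0 reading (treated as an error) and for inf
    (the obstacle is farther than the Lidar's maximum range).
    """
    value = ranges[int(angle_deg) % 360]
    if value == 0.0 or math.isinf(value):
        return None
    return value

# Synthetic 360-value scan: everything out of range except an obstacle
# 0.3 m to the right, plus a faulty 0.0 reading behind the robot.
scan = [float("inf")] * 360
scan[90] = 0.3   # obstacle to the right
scan[180] = 0.0  # error reading
print(range_at(scan, 90))    # -> 0.3
print(range_at(scan, 180))   # -> None (error)
print(range_at(scan, 450))   # -> 0.3 (wraps around to index 90)
```

In a real controller the `ranges` list would come from the Lidar's range image; keep tip 4 in mind, since some of those 360 values may still be from earlier time steps.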

Have fun!

Cheers,
Robert

Appendix
These world time steps show how the range array initializes and how turning the robot quickly affects the latency of the data it sees:

Here the range array has not even initialized yet, but the robot has already turned a bit. Check the corners of the Lidar endpoint rectangle and the shadow of the middle defending blue robot:

Here the robot has already turned a lot, but the range array still holds previously measured values:
