[Open] FESB Open from Croatia

How many years have you been competing in RCJ Soccer?

Four or more years

What’s the most interesting thing you’ve prepared for this year’s RoboCupJunior Soccer competition?

Lidar localization (more info in Discussion part)

Where can we find your poster?

It should be available at https://drive.google.com/file/d/1xRnhDvyzx9PhASEGJ26kzaMuY9DtlJwR/view?usp=sharing

If you’d like to know more, please feel free to ask @Ardi123 by replying to this thread!

Hi Guys,

the localization is impressive. Did I understand correctly that you had to drop the particle filter because it did not run quickly enough, and that you now use the estimate from the wall lines directly?

Also: What do you use the ESP32 for? It seems like the RPi is doing the heavy lifting on the compute.

Best Regards,

Very nice work on the localization algorithm!
I wonder… why not add line sensors as well? Maybe you could use the white line as a fail-safe, and also to “calibrate” the localization algorithm. Or is it not necessary?

Yes, we dropped the particle filter because it was not fast enough for the Raspberry Pi to compute. To be fair, our code was in Python and we did not try to optimize it much. We got the idea to move the lidar higher and fit lines to the walls soon after the first implementation of the particle filter, and when we tried the line fitting it worked great, in real time, so we never really went back to optimize.
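The wall-fitting idea above can be sketched as a simple least-squares line fit over one wall’s lidar returns. This is only a sketch, not the team’s actual code: the function names are made up, and a real implementation would also segment the scan into walls and handle near-vertical lines (e.g. with a total-least-squares fit).

```python
import math

def fit_line(points):
    """Least-squares fit of y = a*x + b to 2-D points from one wall.
    Assumes the wall is not near-vertical in the robot frame."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def wall_distance_and_angle(points):
    """Perpendicular distance from the robot (at the origin) to the
    fitted wall, and the wall's angle in the robot frame."""
    a, b = fit_line(points)
    distance = abs(b) / math.sqrt(a * a + 1.0)  # point-to-line distance from (0, 0)
    angle = math.atan2(a, 1.0)                  # wall slope as an angle
    return distance, angle
```

With returns lying on a wall one metre ahead, this gives a distance of 1.0 m and an angle of 0; repeating the fit for each visible wall pins down the robot’s pose without a particle filter.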

The ESP32 was mainly there to run the PID loops, but it also acts as a safeguard that stops the motors if the Raspberry Pi freezes and reboots.
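That freeze-protection role amounts to a heartbeat watchdog. A minimal sketch of the logic, assuming a made-up timeout value and hypothetical class and method names (on a real ESP32 this would live in the firmware’s control loop):

```python
import time

HEARTBEAT_TIMEOUT = 0.5  # seconds without a heartbeat before cutting motors (assumed value)

class MotorWatchdog:
    """Disable the motors when the main computer stops sending heartbeats."""

    def __init__(self, timeout=HEARTBEAT_TIMEOUT):
        self.timeout = timeout
        self.last_beat = time.monotonic()
        self.motors_enabled = True

    def heartbeat(self):
        # Called whenever a message from the Raspberry Pi arrives.
        self.last_beat = time.monotonic()
        self.motors_enabled = True

    def tick(self):
        # Called from the control loop; on real hardware this is where
        # the PWM duty cycle would be forced to zero.
        if time.monotonic() - self.last_beat > self.timeout:
            self.motors_enabled = False
        return self.motors_enabled
```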

We decided to go without line sensors to show that robots can be built without them and still avoid the line, and to greatly reduce complexity: one big sensor replaces a whole array of line and ultrasonic sensors. We still had a backup plan to put one sensor in the middle if our localization did not work, but in the end it worked so well that we left it as it was.
Relying on these “virtual” lines (lines precoded into the robot, which it uses to decide whether it has crossed the real line) has enabled many different things. For example, we solved double defense by simply drawing an imaginary line around 20 cm in front of the penalty box.
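The double-defense trick above reduces to a coordinate check against a precoded virtual line. A sketch under assumed conventions; the coordinate frame, the penalty-box coordinate, and the function name are all illustrative, not the team’s actual values:

```python
# Assumed frame: origin at field centre, x axis pointing from our goal
# toward the opponent's goal, units in metres.
PENALTY_FRONT_X = -0.85                   # hypothetical x of our penalty-box front edge
DEFENSE_LINE_X = PENALTY_FRONT_X + 0.20   # virtual line ~20 cm in front of the box

def violates_double_defense(x, teammate_in_box):
    """True if this robot is behind the virtual line while a teammate
    is already defending inside the penalty area."""
    return teammate_in_box and x < DEFENSE_LINE_X
```

The robot simply refuses to drive past `DEFENSE_LINE_X` while its teammate is in the box, so the rule can never be violated.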

We could add line sensors, but with the Kalman filter we currently have no straightforward way of incorporating them, since their measurement is not Gaussian at all. We could try a particle filter with line sensors, but as I said, they are just a hassle: you need to calibrate them, they take space, and they add complexity to the robot.
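For contrast, a particle filter handles such a binary, non-Gaussian measurement naturally: instead of assuming a Gaussian likelihood, it reweights particles by how well they agree with the sensor. A minimal sketch, where the field half-width and the 0.9/0.1 likelihood values are assumptions for illustration:

```python
FIELD_HALF_WIDTH = 0.91  # hypothetical: |x| beyond this puts the robot over the white line

def line_likelihood(particle_x, saw_line):
    """Weight for one particle given a binary line-sensor reading.
    0.9 / 0.1 are assumed hit/miss probabilities, not measured values."""
    over_line = abs(particle_x) > FIELD_HALF_WIDTH
    return 0.9 if over_line == saw_line else 0.1

def reweight(particle_xs, saw_line):
    """Particle-filter measurement update for the binary line sensor."""
    weights = [line_likelihood(x, saw_line) for x in particle_xs]
    total = sum(weights)
    return [w / total for w in weights]
```

When the sensor reports a line, particles outside the field gain weight and particles inside lose it; a Kalman filter has no equivalent of this step, which is why the sensors do not fit that pipeline.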


Thanks for giving us a good new idea about robot localization, well done Team!

Can you share with us the results you have with this new localization algorithm? Do you use both the omnidirectional mirror and the camera for it, or do they work for independent functions?

Do you mean results from the competitions?
We only managed to test it in one competition before corona: our national competition, where we won first place. The robot did go out over the line about 0.6 times per game, but that was because of the referee’s straight hands (the robot thought they were a wall), and it was without the Kalman filter, so the robot was relying on the raw current estimate alone.

We only use the lidar; it is really precise on its own. Of course it would improve if we added the camera, but the gain would be marginal. Our idea for the future was actually to remove the lidar and use only the camera. That robot would be even simpler, and we could fit more stuff into it.


Thanks for the explanation. Very nice that you were able to solve the double defense also.

Very nice work indeed! I especially like your aim at robustness: the fewer sensors your robot has, the less chance there is for it to break down.

I was wondering to what extent the lidar is affected by the other robots’ construction, and what effect that has on the localization’s precision. I can imagine a situation in which the construction of an opponent’s robot would obstruct the lidar’s signal significantly, but it may be that it does not cause all that much trouble in the grand scheme of things.


– Marek

In general, the other robots on the field are not that big of a problem. There are four walls and only three other robots, which means our robot should always be able to see at least one wall. Robots in Open have a small cross section at 15 cm height, so they should not corrupt the measurement much, and the longest straight edge they can present is 22 cm, which can easily be excluded from “wall status”.
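Excluding robot-sized segments from “wall status” can be sketched as a simple length filter over the fitted segments. The 22 cm threshold comes from the answer above; the segment representation and function names are assumptions:

```python
import math

MAX_ROBOT_EDGE = 0.22  # metres: longest straight edge a robot can present (from the rules)

def segment_length(segment):
    """End-to-end length of a fitted segment given as a list of (x, y) points."""
    (x0, y0), (x1, y1) = segment[0], segment[-1]
    return math.hypot(x1 - x0, y1 - y0)

def wall_candidates(segments, min_length=MAX_ROBOT_EDGE):
    """Keep only segments too long to belong to a robot; these are walls."""
    return [s for s in segments if segment_length(s) > min_length]
```

Anything longer than 22 cm cannot be a robot edge, so only genuine wall segments survive the filter and feed the localization.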

It may seem that other robots blocking the lidar are a problem, but that is actually a good thing, because we can then locate and track them. In the early stages we wanted to use this kind of blocking (which is why the lidar was mounted much lower in the first generation) to create special strategies: the robots could share a Kalman filter whose state holds the positions of the robots on both our team and the opposing team. Sadly, we found out that many robots in Open have a smaller cross section the higher you go (because of the omnidirectional mirror), so in the end we could not do that.

The scan with another robot shown in the poster (the scan on the right) was taken with a robot from LWL, and even there you can see that the walls are easily visible. And as I said in my previous answer, the out-of-bounds rate is really low, and it is not caused by other robots.
