[Open] XLC-Innovation from Slovakia

How many years have you been competing in RCJ Soccer?

Four or more years

What’s the most interesting thing you’ve prepared for this year’s RoboCupJunior Soccer competition?

We used a Tiny YOLOv3 neural network, accelerated on a Coral USB TPU accelerator, to detect the ball, the goals, and our teammate. More information on:



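As a rough illustration (not our exact code), running an Edge-TPU-compiled .tflite model with tflite_runtime on a Raspberry Pi could look like the sketch below; the model filename is a placeholder, and the YOLO output decoding is omitted:

```python
# Minimal sketch: run an Edge-TPU-compiled Tiny YOLOv3 model via tflite_runtime.
# "tiny_yolov3_edgetpu.tflite" is a placeholder filename, not the team's model.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="tiny_yolov3_edgetpu.tflite",
    # Offload supported ops to the Coral USB accelerator.
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy frame with the model's expected shape; a real robot would feed
# CSI camera frames here (resized and quantized to match `inp`).
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
raw = interpreter.get_tensor(out["index"])  # raw YOLO tensor; decoding omitted
```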
Where can we find your poster?

It should be available at https://drive.google.com/file/d/1mmYTOxWSNOsuLN-UX9sywHZmTKlaXygj/view?usp=sharing

Also, the team has prepared this video: https://www.youtube.com/watch?v=fuTrGaSgjog


If you’d like to know more, please feel free to ask @bukajlag by replying to this thread or check the team’s website/repository at https://github.com/xlcteam/OpenBot !

Hi there! Great work on your robot, especially on your neural networks.
I would just like to know more about the latency you experience on your ball detection: have you run into any issues so far? Did you measure the latency of your system?
I'm asking because our team has tried programming the RPi in Python and using USB cameras, and we found that both the USB bus and Python's interpreter were major bottlenecks.

Hello! First of all, thank you for checking out my work.
This is a good question. I used CSI to communicate with the camera, and since I am using a Raspberry Pi 4, there are two USB 3.0 ports available, so USB is not a bottleneck. I also have a USB Coral Accelerator (https://coral.ai/products/accelerator/), which communicates with the Raspberry Pi through one of these USB 3.0 ports to accelerate the neural network. With this setup I achieve a speed of 27 FPS, which is good enough. There have been some compatibility issues because of problems with compiling (https://coral.ai/docs/edgetpu/compiler/#system-requirements), but, fortunately, I was able to overcome them.
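For reference, a simple way to measure the inference-only latency with the interpreter from the earlier sketch (camera capture and preprocessing time would come on top) is something like:

```python
import time

# Assumes `interpreter` is already set up as in the earlier sketch.
N = 100
start = time.perf_counter()
for _ in range(N):
    interpreter.invoke()  # inference only; excludes camera capture
elapsed = time.perf_counter() - start
print(f"avg latency: {elapsed / N * 1000:.1f} ms -> {N / elapsed:.1f} FPS")
```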

Have you tried to detect objects with an omnidirectional mirror?
Does it generalize well to other fields? Have you tried it in a competition?

Excellent question!
I haven't used an omnidirectional mirror this year, but it will definitely come up next year.
Unfortunately, I don't have other playing fields to test on, and there are no competitions I could go to because of the COVID-19 situation, but I will test it as soon as possible. So far, I have varied the environment by moving to different rooms in the house, and it looked good. If a problem appeared, I could enlarge the dataset, which should help. I have also started working on my own architecture specialized for RoboCup, which could also help.

Hi! I was mostly asking about latency over USB due to your Coral Accelerator, but that FPS seems to be good enough. Thanks for the reply!

Excellent work! Congrats!


Good presentation, team! Thanks for showing us your improvements.

My questions are:

  1. How many team members are in the XLC-Innovation team? I only saw Jakub listed.

  2. Can you provide more details about the vision system, such as the camera used and its capabilities?

  3. What are the ultrasonic sensors used for in the line system?

1.
Unfortunately, I was not able to find a teammate before the COVID-19 lockdown in my country, but I did not give up and did all the work myself. I do not consider this an advantage over the other teams. However, I am aware that this is supposed to be a team competition, and I will try to find a teammate in future years; due to this year's special circumstances, I sadly was not able to do that.
2.
As written here https://www.kaggle.com/jakubgal/robocup-junior-open-xlc, we use this cheap camera, which has a large viewing angle that is still below the limit. We are able to detect our teammate, the ball, and the goal (distance and angle relative to the camera; see the first sketch after this list). We detect our teammate using a simple but not ordinary black-and-white pattern that is easy for our neural network to learn and, at the same time, makes our robot easy to distinguish from the opponent. Moreover, because we use a neural network for detection, neither lighting conditions nor most interfering colors are a problem for us. It also reduces the robot's setup time.
3.
We use the ultrasonic sensors to verify that we did not accidentally cross to the other side of the line when the robot hits it. For example, when the robot hits the line at 90 degrees to its right, it checks whether the distance measured to the right is smaller than the one to the left; when it is not, it further checks whether the closer distance is below a set constant (whichever one is before the goals). Basically, it works as a checking mechanism for the light sensors; see the second sketch below.
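Regarding the distance and angle in point 2: I won't spell out our exact math here, but a common single-camera approach is a pinhole-model estimate from the bounding box, sketched below. HFOV_DEG, FOCAL_PX, REAL_SIZE_M, and angle_and_distance are hypothetical names, and the constants need per-camera calibration.

```python
# Hypothetical sketch: bearing/range of a detected object from its bounding
# box, assuming a pinhole camera model. Constants below are placeholders.

HFOV_DEG = 120.0     # assumed horizontal field of view of the wide-angle camera
FOCAL_PX = 320.0     # assumed focal length in pixels; calibrate for your camera
REAL_SIZE_M = 0.074  # assumed real-world width of the detected object

def angle_and_distance(cx, box_w, frame_w):
    """Estimate bearing (degrees) and range (metres) from one detection.

    cx      -- x coordinate of the bounding-box centre, in pixels
    box_w   -- bounding-box width, in pixels
    frame_w -- frame width, in pixels
    """
    # Bearing: map the horizontal pixel offset linearly onto the camera FOV
    # (a rough approximation for a wide-angle lens).
    angle = (cx / frame_w - 0.5) * HFOV_DEG
    # Range: pinhole model, distance = real_size * focal_length / pixel_size.
    distance = REAL_SIZE_M * FOCAL_PX / box_w
    return angle, distance
```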
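And for point 3, one way to read the described cross-check as code is sketched below; the function name, its inputs, and the LINE_DIST_LIMIT_M constant are all hypothetical, and the goal-related special case is left out since it depends on the field geometry.

```python
# Hypothetical sketch of the ultrasonic sanity check described above.
LINE_DIST_LIMIT_M = 0.30  # assumed constant; tune to the field dimensions

def line_hit_plausible(hit_side_dist, other_side_dist):
    """Cross-check a line hit reported by the light sensors.

    If the robot really hit the line on one side, the wall on that side
    should be the nearer of the two ultrasonic readings; failing that,
    the nearer reading must at least be below a fixed limit.
    """
    if hit_side_dist < other_side_dist:
        return True
    return min(hit_side_dist, other_side_dist) < LINE_DIST_LIMIT_M
```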