I was wondering whether it’s normal for the robots to slow down or show reaction delays later in the game. For example, in some simulations, after about 5 minutes have passed, the robots no longer run at the same speed and the motors don’t seem to be synchronized. My teammates and I have all experienced very similar behavior, so I wanted to ask whether this is normal or whether we have done something wrong?
We did not experience such behavior in the tests we ran. Is it still happening?
To see whether this is related to your code, you could create simple versions of your team’s controllers and add code step by step, running matches at each step.
I am sorry I can’t help you much.
If you solve this issue, please share your solution here.
According to the documentation on Code Submission, the .ZIP has to have exactly three top-level subfolders. Our code has an extra folder for our shared class code, because the folder rcj_soccer_player_y1 contains a file with the exact same name, rcj_soccer_player_y1.py. As a result, instead of importing rcj_soccer_player_y1.europa, the file rcj_soccer_player_y1.py gets imported instead, which causes a module import cycle and fails. Are there any recommendations on how to fix this?
Also, will there be any time to fix our programs if it turns out that our submission doesn’t run? There doesn’t seem to be any way to test the ‘production’ version of the simulator locally…
You are free to rename your files; you don’t have to use the names from the example code. What is important is that you follow the structure described in the instructions. So I recommend moving the extra files into the folder of robot1 (for example) so that you end up with exactly 3 top-level subfolders.
Currently, there is no option to check your code against our simulator. This is something we intend to provide for the competition in June. For now, you can use our example code from GitHub as an opponent and test your code against it.
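To illustrate why renaming (or moving) the clashing file helps, here is a minimal, self-contained sketch of the shadowing described in the question. The folder and module names (rcj_soccer_player_y1, europa) come from the question itself; player_impl.py is just an illustrative replacement name, and the exact sys.path handling of the simulator may differ:

```python
import importlib
import os
import sys
import tempfile

# Recreate the clashing layout in a temp directory (names are illustrative):
#   root/rcj_soccer_player_y1/__init__.py
#   root/rcj_soccer_player_y1/europa.py
#   root/rcj_soccer_player_y1/rcj_soccer_player_y1.py  <- clashes with the package
root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "rcj_soccer_player_y1")
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()
with open(os.path.join(pkg_dir, "europa.py"), "w") as f:
    f.write("WHO = 'package submodule'\n")
with open(os.path.join(pkg_dir, "rcj_soccer_player_y1.py"), "w") as f:
    f.write("WHO = 'clashing file'\n")

# If the robot's own folder comes before the top-level folder on sys.path
# (as it can in the simulator), the file shadows the package:
sys.path[:0] = [pkg_dir, root]
mod = importlib.import_module("rcj_soccer_player_y1")
print(mod.WHO)                 # 'clashing file'
print(hasattr(mod, "europa"))  # False -- rcj_soccer_player_y1.europa is unreachable

# Renaming the file removes the ambiguity, so the real package is found:
os.rename(os.path.join(pkg_dir, "rcj_soccer_player_y1.py"),
          os.path.join(pkg_dir, "player_impl.py"))
sys.modules.pop("rcj_soccer_player_y1", None)
importlib.invalidate_caches()

sub = importlib.import_module("rcj_soccer_player_y1.europa")
print(sub.WHO)                 # 'package submodule'
```

Moving the shared files into one robot’s folder works for the same reason: once no file or folder name collides on sys.path, the import resolves unambiguously.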
**Edit: The team ZZ was missing from the list and has now been added. The name of the team Robotics School Makers was corrected to SESI Robotics School Makers.**
In total, we received uploads from 41 teams! The list below shows the names of the teams that uploaded code.
It looks like we are going to have an exciting competition!
The next steps: we will divide the teams into groups, run the simulated games, and publish results and videos from 17 February onwards. The semifinals will take place on Saturday, 20 February, and the finals on Sunday, 21 February.
In the next few days we are going to post more information here in the RCJ forum. Stay tuned!
And thanks for participating!!
On behalf of the RCJ Soccer TC and Soccer Simulation OC
List of teams that uploaded code (alphabetical order):
First of all, I wanted to thank you and all the organizers for your kind efforts in putting together this amazing competition!
I’m a true enthusiast of how the RoboCupJunior competitions encourage kids to share knowledge and learn from each other. We participated in Sydney in 2019 in LightWeight (with LEGO robots) and learned a lot from posters created by other teams. We went there to learn, and I can confirm it was the best source of knowledge we could get.
I know that sharing code was mentioned from the first steps, and it was recently mentioned in the Issues and questions topic as well:
As the virtual competition is something really new, may I share some of my concerns about the “how to” questions I have been thinking about recently. While releasing each team’s full source code is the most obvious and easiest way to share knowledge, it may have some didactic and practical drawbacks.
With hardware requirements in the background, releasing source code in traditional competitions gives other teams a good hint, and since hardware evolves dramatically from year to year, it does not really disadvantage any team. The next year, teams can come up with completely new methods and ideas based on the new hardware and techniques that appear.
In contrast, in a simulated virtual environment where software is everything, releasing all the source code could easily mean that any team can start from the last champion’s result and evolve from there. Teams could copy and paste slices of code, even from different teams, to assemble a new codebase. From the organizers’ perspective, it would be hard, if not impossible, to check the originality of any source code. Within a few seasons, most teams would be playing merged versions of the champions’ code, hard to differentiate from each other, leading to ties in goals and giving luck a much bigger role in match results.
By sharing code, the human connection between teams could be lost. If the code is there, there is no need to chat, discuss, etc.
Coaching my team, I would not like them to learn by reading other teams’ ready-made source code. I would rather they learn by analyzing and experimenting, and by designing and developing code through failure. It was really heartwarming for me to see them replaying the simulated matches and trying to figure out which team did what and why. They had about a dozen new ideas just from watching the matches. I know that coding can be hard, it is for my team as well, but they can look up snippets of code if they get stuck with Python.
So I would not recommend releasing full source code; instead, I suggest opening this question to the community and, with their help, working out a more sophisticated way of knowledge sharing.
A. Teams could earn points not only by winning matches but by other means as well.
B. Posters are an important part of the traditional competitions, so why not transfer this to the simulation environment? Simulation posters could be a requirement, perhaps with slightly different sections, e.g. instead of hardware, a more detailed description of strategy and techniques.
C. On posters, each team could share some of their code, e.g. the snippets they are most proud of. I admit the importance of learning from snippets of source code as well.
D. Maybe teams could earn points for getting in contact with other teams before the competition and helping each other with questions the other team has.
E. It would be nice to have an online evaluation environment that could serve trial code uploads before the final deadline, as well as autonomous play of the uploaded code against selected teams’ code from earlier seasons. This way, anyone could try their current code against, for example, any earlier champion’s code. For practical reasons, the resource may have to be restricted to a limited number of trials per day, but it would still provide a great challenge opportunity. Any code teams still want to share could be built into the example code of the installation files.
F. Etc. I’m pretty sure other teams would also have really cool ideas.
What do you think?
Could these questions be opened to the teams and the community?
If you plan an online meeting to discuss technical questions, do you think it could serve as an opportunity to discuss also knowledge sharing with the community?
I’ve discussed your question with the kids on my team and collected their feedback and suggestions.
We would suggest the following for human connections between teams in RCJ virtual competitions:
Registration of a team should require the team to have an RCJ Forum registration (e.g. a mandatory field in the registration form for the RCJ Forum username representing the team).
After registrations are closed, the OC should create pairs of teams.
Let’s say a pair consists of Team A and Team B; these teams should preferably originate from different countries, but be in the same or nearby time zones.
In the registration feedback on the forum, the OC should list the team pairs with their RCJ Forum usernames so that they can get in contact with each other via RCJ Forum private messages.
It would be up to the teams to choose a common online tool and organize the target and schedule of their collaboration. Their common target could be, for example, to develop a method both teams aim to achieve, like smoothing the robots’ movement or writing code that can be uploaded onto any robot without dependencies. It’s up to the teams’ needs and imagination; the aim is for both teams to benefit from the common work.
Let’s open a dedicated RCJ Forum topic for teams to share collaboration results, say “Soccer Simulation 2021 - Collaboration”.
Each pair of teams should post a joint article into this topic presenting their collaboration results.
Let’s open a dedicated RCJ Forum topic for teams to help each other, say “Soccer Simulation 2021 - Coopetition Q&A”.
In this topic, either a question or an answer to a question could be posted.
Each team would be required to make at least 2 posts in this RCJ Forum topic.
For veteran teams, at least one of their posts should be an answer to another team’s question.
Integration to the competition
The code upload form should have dedicated fields where teams can copy and paste the share links of their RCJ Forum posts.
If posters are implemented in Soccer Simulation, the hardware section should be replaced by a presentation of the team’s collaboration / coopetition results.
Teams should be awarded points by judges for their respective work.
The most popular / most innovative / etc. posts could earn extra points given by the community or judges.
These points could earn a dedicated separate award (replacing the robot design award?) as well as count toward match progression / final results in some way. For example, this kind of cooperation could be a requirement for a team to be eligible for the final 16. Or these points could decide ties (equal goals at the end of a match), etc.
We considered the following design principles while composing the suggestions above:
Keep the spirit: “It is not whether you win or lose, but how much you learn that counts!”
The suggestions should require the least extra work from the organizers (OC), constrain teams the least, and provide the maximum benefit for both the teams and the community.
The coopetition / collaboration results should be well documented, shared, and measurable by the OC.
Teams should be motivated toward coopetition and collaboration, and we count their desire to learn as the primary motivation.
The RCJ Forum is a ready-made tool, easy to use and safe, and satisfies the previous requirements. The organizers (OC) are familiar with the RCJ Forum and have complete control over its content, e.g. they can provide admin support, moderate, etc.
Boosting RCJ Forum traffic would boost knowledge sharing as well; the RCJ community would benefit immediately and directly.
Other collaboration platforms are more than welcome, but they should be optional, not mandatory, and chosen by the collaborating teams on their own.
What do you think?
Do these suggestions make sense?
If anyone has any feedback / ideas / suggestions, please feel free to share.
Thanks a lot for your feedback and suggestions! I am sorry for my late reply… I am glad we could talk during the meeting earlier today to discuss some of the above points. We will consider your suggestions.