Jrobots Info
What are Jrobots?
This section describes the purpose and the main characteristics of the game.
Introduction
Jrobots is a clone of Crobots, an old DOS game written by Tom Poindexter in 1985. Robots fight in an arena, firing missiles and dodging enemy projectiles. The game is quite engrossing, but it's not interactive: first you develop the algorithms of your own robots in a programming language, then you upload them to the arena and follow their fights against other robots online. You can't help them; the robots are on their own, and you must teach them the combat techniques they need to survive. To see some robots fighting, go to the Match Applet page.
Index
The Robots and the Battlefield
The following paragraphs describe the characteristics of the robots and the battlefield. To develop your own Jrobots you need to know the following data.
The battlefield is a square with a side of 1 km (1000 meters). When a robot hits the walls of this square, it takes 2 damage points out of a total of 100 and its engine stops. This is not the worst thing that can happen, but it's not good either, so avoid the walls of the battlefield. When a robot reaches 100 damage points, it is disabled and loses the game.
There are three types of play:
- Single Match: Two robots fight one against the other.
- Double Match: Two pairs of robots fight one against the other. This type of play is harder than the previous one, because it's not easy to tell friends from enemies.
- Team Match: Four teams of eight robots each fight one against all the others.
All matches last 180 seconds. The robot or team that disables all the other robots in the battlefield wins.
Robots have an engine and can drive anywhere in the battlefield. The maximum speed of the robots is 30 m/s (meters per second), i.e. 108 km/h, and the acceleration is 5 m/s². This means that a robot needs six seconds to reach the maximum speed.
When the engine is set to 0% power, the robot brakes with a deceleration of 5 m/s², while 100% power drives it toward the maximum speed.
When a robot hits a wall, the engine drops to 0% power and the speed falls instantly to 0 m/s.
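These figures can be double-checked with elementary kinematics (an illustrative sketch, not part of the game API):

```java
// Sketch of the engine kinematics described above (not part of the JRobots API).
public class EngineMath {
    static final double ACCEL = 5.0;      // m/s^2
    static final double MAX_SPEED = 30.0; // m/s

    // Seconds needed to go from 0 to the maximum speed: t = v / a.
    static double timeToMaxSpeed() {
        return MAX_SPEED / ACCEL; // 30 / 5 = 6 seconds
    }

    // Distance covered while braking from full speed: d = v^2 / (2a).
    static double brakingDistance() {
        return MAX_SPEED * MAX_SPEED / (2.0 * ACCEL); // 900 / 10 = 90 meters
    }

    public static void main(String[] args) {
        System.out.println(timeToMaxSpeed());  // 6.0
        System.out.println(brakingDistance()); // 90.0
    }
}
```

The 90-meter braking distance is worth remembering: a robot at full speed must cut the engine well before it reaches a wall.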
Robots have a cannon that fires missiles. A robot can point the cannon in any direction and fire as many missiles as it wants, but there is a reload time of 1 second.
Missiles have a range of 700 meters and a speed of 300 m/s, so they need 2⅓ seconds to cover the maximum range. The speed of a missile is independent of the speed of the robot that fires it, so it is always 300 m/s. When a missile explodes, it deals damage points to every robot nearby (remember that 100 damage points disable a robot). Damage depends on the distance between the robot and the explosion:
- within 5 meters: 10 damage points
- within 20 meters: 5 damage points
- within 40 meters: 3 damage points
If a robot fires a missile that explodes within a circle of 5 meters radius around itself, it takes 10 damage points, so it's better to fire at targets far away.
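The damage table can be sketched as a simple step function (an illustrative model, not an official API call):

```java
// Illustrative sketch of the damage table above: an explosion deals
// 10/5/3 points within 5/20/40 meters of a robot, and nothing beyond.
public class DamageModel {
    static int damageAt(double meters) {
        if (meters <= 5)  return 10;
        if (meters <= 20) return 5;
        if (meters <= 40) return 3;
        return 0;
    }
    public static void main(String[] args) {
        System.out.println(damageAt(3));  // 10
        System.out.println(damageAt(15)); // 5
        System.out.println(damageAt(35)); // 3
        System.out.println(damageAt(60)); // 0
    }
}
```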
Robots use a scanner to find other robots. It scans the battlefield with a resolution of 1 to 20 degrees. With each scan, the robot receives the distance of the nearest robot (friend or foe) in the scanned sector, or zero if there is nobody there.
How can I develop a Jrobot?
This section describes the procedure you can follow to program your own robot.
The Programming Language (Java)
Crobots uses C as its programming language; Jrobots uses Java instead.
To program your robots you must write Java classes, but that's not so difficult: you only need a Java compiler (it's free), and Java has a syntax similar to C's. If you have already developed a Crobot, the conversion to a Jrobot is straightforward.
To get some examples of coding go to the Downloads page.
Rules and Limitations
These are the rules you need to know:
Available Functions
You can manage the cannon, the engine and the scanner using these methods:
int cannon(int degrees, int range)
- degrees is the direction of the shot in degrees (angles start from 3 o'clock and increase clockwise).
- range is the distance at which the missile explodes.
- returns 1 if the missile was fired, 0 if not (due to reload time).
int damage()
- returns the damage points of the robot: 0 to 99 means alive, 100 means dead (a robot will never read this value, since it is disabled first).
void drive(int degrees, int speed)
- degrees is the direction of movement of the robot (angles start from 3 o'clock and increase clockwise). Remember that robots can change direction only if their speed is lower than 50% (i.e. 15 m/s).
- speed is the speed the robot must reach, in percent: 0 means 0 m/s, 100 means 30 m/s.
- returns nothing.
int scan(int degrees, int resolution)
- degrees is the direction of the scan in degrees (angles start from 3 o'clock and increase clockwise).
- resolution is the width of the scan in degrees (the scan covers from degrees - resolution/2 to degrees + resolution/2). Its value must be between 1 and 20 degrees.
- returns the distance of the nearest robot found in that sector, or zero if no robot was found.
int loc_x()
- returns the X coordinate of the robot in the battlefield (the origin is in the upper-left corner and X coordinates increase to the right).
int loc_y()
- returns the Y coordinate of the robot in the battlefield (the origin is in the upper-left corner and Y coordinates increase downward).
int speed()
- returns the speed of the robot in percent: 0 means 0 m/s, 100 means 30 m/s.
int time()
- returns the seconds elapsed since the beginning of the fight (this function wasn't included in the original Crobots game; it's available to allow better timing control).
int id()
- returns the identification number of the robot in the team. If there are n robots in the team, this number goes from 0, for the first robot created, to n-1, for the last one (this function wasn't included in the original Crobots game; it's available to distinguish between robots in team and double matches).
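As a sketch of how the scan parameters combine (angle arithmetic only — the scan function itself is provided by the game engine, and sweepAngles is an illustrative helper, not an SDK method), a full-circle sweep at the widest resolution of 20 degrees takes 18 scans:

```java
// Illustrative sketch: center angles for a full-circle sweep of the scanner.
// The scan function itself belongs to the game engine; this is only the math.
public class SweepMath {
    static int[] sweepAngles(int resolution) {
        int sectors = 360 / resolution;      // 18 sectors at resolution 20
        int[] centers = new int[sectors];
        for (int i = 0; i < sectors; i++) {
            // Each scan is centered in its sector: 10, 30, 50, ... degrees.
            centers[i] = i * resolution + resolution / 2;
        }
        return centers;
    }
    public static void main(String[] args) {
        int[] a = sweepAngles(20);
        System.out.println(a.length); // 18
        System.out.println(a[0]);     // 10
        System.out.println(a[17]);    // 350
    }
}
```

A common tactic is to sweep at resolution 20 until something is found, then narrow the resolution around that bearing for a precise fix.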
You can't use the java.lang.Math class, but there are several built-in functions to perform math calculations: the original Crobots functions (kept for compatibility only) and advanced versions, too.
The original functions are:
- int rand(int limit) to get random int values.
- int sqrt(int value) square root.
- int sin(int degrees) sine function (the result is the sine times 100000).
- int cos(int degrees) cosine function (the result is the cosine times 100000).
- int tan(int degrees) tangent function (the result is the tangent times 100000).
- int atan(int value) arctangent function (value must be 100000 times the real value and the result is in degrees).
The advanced ones are:
- double d_sqrt(double value) square root.
- double d_sin(double radians) sine function.
- double d_cos(double radians) cosine function.
- double d_tan(double radians) tangent function.
- double d_atan(double value) arctangent function.
- double deg2rad(double degrees) to convert from degrees to radians.
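Outside the game, the fixed-point convention of the original functions, and the typical use of the arctangent for aiming, can be sketched in plain Java. Here java.lang.Math stands in for the built-ins (robot code itself cannot use it), and aimAt is an illustrative helper, not an SDK method:

```java
// Illustration only: java.lang.Math reproduces the built-ins' behavior here,
// but inside a robot you must use the game's own math functions.
public class FixedPointTrig {
    // Equivalent of the built-in sin: the result is the sine times 100000.
    static int sin(int degrees) {
        return (int) Math.round(Math.sin(Math.toRadians(degrees)) * 100000);
    }
    // Hypothetical aiming helper: angle in degrees from (x1,y1) toward (x2,y2),
    // with angles starting at 3 o'clock and increasing clockwise. Since the
    // Y axis points down in the battlefield, plain atan2 already matches.
    static double aimAt(double x1, double y1, double x2, double y2) {
        double deg = Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
        return (deg + 360.0) % 360.0; // normalize into [0, 360)
    }
    public static void main(String[] args) {
        System.out.println(sin(30));             // 50000
        System.out.println(aimAt(0, 0, 100, 0)); // 0.0  (3 o'clock)
        System.out.println(aimAt(0, 0, 0, 100)); // 90.0 (6 o'clock)
    }
}
```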
Extensions to the Basic API
Those listed above are the basic methods available. The following is a list of extensions already included in the SDK.
EXTENSIONS
----------
SUMMARY
=======
1. JJVector class
2. JJRobot class (additions)
2.1 additions by Christo Fogelberg (November 2001)
2.2 additions by Alan Lund (February 2002)
2.3 additions by Tim Strazny (February 2002)
2.4 additions by Leonardo Boselli (March 2002)
-----------------
1. JJVector class
=================
by Christo Fogelberg (doubtme@hotmail.com) (November 2001)
--------------------
additions by Alan Lund (alan.lund@acm.org) (February 2002)
----------------------
Function Reference:
# Constructors:
JJVector()
- Initializes to x=0, y=0, t=1
JJVector(double x, double y)
- Initializes to x, y, t=1
JJVector(double x, double y, double t)
- Initializes to x, y, t
JJVector(JJVector v)
- Copy constructor
# Factory methods:
static JJVector Polar(double r, double a)
static JJVector Polar(double r, double a, double t)
- Construct a vector using polar coordinates instead of cartesian.
# Basic Mutators:
void set_x(double x)
void set_y(double y)
void set_t(double t)
void set(double x, double y)
void set(double x, double y, double t)
void set(JJVector v)
Using set(JJVector) is _not_ the same as using "=". For JJVectors a and b,
a = b;
a.set(b);
do different things - in the first case a points to the same JJVector as b.
In the second, the value of the JJVector pointed to by a is the same as that
pointed to by b - so they can be changed independently.
# Basic Maths Functions:
None of these affect the internal state of the vector that calls them.
public double x() - returns the x component of the vector
public double y() - returns the y component of the vector
public double t() - returns the t component of the vector
public double mag() - returns the magnitude of the vector
public double angle() - returns the angle of the vector
public double r() - returns the r (polar) coordinate of the vector
(same as mag())
public double a() - returns the a (polar) coordinate of the vector
(same as angle())
public double dot(JJVector v)
- returns this . v, i.e. the dot product of this and v.
public double speed()
- returns the speed (distance / time) represented by the vector;
that is, mag() / t().
public JJVector velocity()
public JJVector velocity(JJVector result)
- Computes the velocity vector, which is just the vector pointing
in the same direction as "this", but with both time and distance
scaled, such that result.t() == 1. (See below for the
usefulness of the second form.)
# Vector arithmetic:
There are two versions of each of these functions, the convenient
version and the potentially faster version.
First the convenient ones...
public JJVector plus(JJVector v)
- Adds "this" and "v" and returns the result
public JJVector minus(JJVector v)
- Subtracts "v" from "this" and returns the result
public JJVector mult(double d)
- Multiplies "this" by "d" and returns the result
public JJVector rotate(double a)
- Rotates this by "a" degrees and returns the result
Note that the time component _is_ affected by all but rotate().
The following maths functions are designed slightly differently.
The returned JJVector is a reference to the result argument.
This avoids doing lots of ugly construction outside the control
of the caller - instead, they can construct temporaries as and
when necessary, while still chaining ops together.
public JJVector plus(JJVector v, JJVector result)
- sums "this" and "v" and returns the result.
public JJVector minus(JJVector v, JJVector result)
- returns "this" - "v"
public JJVector mult(double d, JJVector result)
- multiplies "this" by the scalar "d" and returns the result
public JJVector rotate(double a, JJVector result)
- Rotates this by "a" degrees and returns the result
# Helpful Functions:
Pretty much the ones I found I needed when I didn't want to do a
whole bunch of calculating in my code.
Feel free to request new ones.
public double dist(JJVector v)
- returns the distance between the two vectors. Will always be
positive. Same as minus(v).mag().
public JJVector unit()
public JJVector unit(JJVector result)
- returns the "unit-ised" vector of this. This function works in
the same way as plus, minus and mult.
Sample Use:
Here is a function that calculates the estimated future position of a
target using a prior location (and time) and a current location (and
time), as well as the amount of time to predict into the future,
assuming simple target leading.
private JJVector lead(JJVector lastPosition,
                      JJVector newPosition,
                      double timeFromNow)
{
    JJVector velocity = newPosition.minus(lastPosition).velocity();
    return newPosition.plus(velocity.mult(timeFromNow));
}
The estimated missile flight time might be
JJVector myLocation = ...
JJVector targetLocation = ...
double flightTime = myLocation.dist(targetLocation) / 300.0;
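For readers without the JJVector class at hand, the same leading computation can be sketched with plain doubles (the class and method names here are illustrative, not part of the SDK):

```java
// JJVector-free sketch of target leading: predict where a target will be
// timeFromNow seconds after its latest observed position, assuming it keeps
// the velocity observed between the last two scans.
public class Lead {
    static double[] lead(double lastX, double lastY, double lastT,
                         double newX, double newY, double newT,
                         double timeFromNow) {
        double vx = (newX - lastX) / (newT - lastT);
        double vy = (newY - lastY) / (newT - lastT);
        return new double[] { newX + vx * timeFromNow, newY + vy * timeFromNow };
    }
    public static void main(String[] args) {
        // Target moved 30 m in x over 1 s; predict 2 s past the newest fix.
        double[] p = lead(100, 200, 4.0, 130, 200, 5.0, 2.0);
        System.out.println(p[0]); // 190.0
        System.out.println(p[1]); // 200.0
    }
}
```

Combining this with the missile flight time (distance / 300 m/s) gives the classic iterate-until-stable intercept solution.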
-----------------
2. JJRobot class (additions)
=================
2.1 by Christo Fogelberg (doubtme @ hotmail.com) (November 2001)
------------------------
double rad2deg(double val) - converts val radians to an equivalent value
between 0 and 360, measured in degrees.
double d_abs(double val) - returns the absolute value of val.
double d_rnd(double val) - returns val, as a double.
int i_rnd(double val) - returns val, as an integer.
2.2 by Alan Lund (alan.lund @ acm.org) (February 2002)
----------------
double d_loc_x() - returns the x location of the robot as a double
double d_loc_y() - returns the y location of the robot as a double
- more precise than the older integer methods loc_x and loc_y
int cannon(JJVector v)
- v gives direction and distance
- returns 1 if the cannon shot, 0 otherwise
JJVector vscan(int degree, int resolution)
- arguments work as in usual scan method
- returns the relative position as a JJVector
void drive(JJVector v)
- v gives direction and REAL speed (0 to 30 in virtual m/s)
double actual_speed()
- returns the real speed (0 to 30 in virtual m/s) as double
double heading()
- returns the robot's current direction in radians
JJVector velocity()
- returns the robot's current direction and REAL speed
JJVector location()
- returns the robot's current location and the current time
2.3 by Tim Strazny (timstrazny @ gmx.de) (February 2002)
------------------
double exp(double value) - returns e raised to the power value
double log(double value) - returns the natural logarithm of value
2.4 by Leonardo Boselli (boselli @ uno.it) (March 2002)
-----------------------
int getFriendsCount() - returns the number of team mates
- gives 1 for SINGLE, 2 for DOUBLE, 8 for TEAM
- doesn't report the number of mates still active (!)
How can I send my Jrobot to challenge other robots?
This section describes the Jrobots submission process.
Choose an Account
First of all you must get an account. Follow the instructions you find in the Accounts page.
Upload
Then you can upload the file. Follow the instructions you find in the Uploads page.
How can I convert my old Crobot to a new Jrobot?
This section describes how to convert an existing Crobot into a new Jrobot.
Convert
The conversion is quite straightforward, since the syntax of C is similar to the syntax of Java. To see the required changes, you can find some sample robots in the ZIP file Jrobots Java Classes & Docs: the original robots Sniper, Counter, Rook and Rabbit, converted.
The changes are simple. However, some settings may differ, because Jrobots uses a different simulation engine than Crobots. Even timings that were once performed with busy loops must now use the time() function, because the CPU clocks of the machines where the fights run may differ considerably.
Send me an email for more information about the Java language or the conversion of C code.
Appendices
These two sections describe aspects of the game not directly involved in the development of Jrobots: the selection rule used to choose the opponents during a tournament and the algorithm used to generate the virtual time of the fights.
Appendix A - The Weighted Selection Rule
The new selection strategy for Jrobots challenges
--------------------------------------------------
Excerpts from three emails by Alan Lund < alan.lund @ acm.org >
(aka "Zeke" - author of "IonStorm")
[First Post]
I hate to say it, but I'm pretty sure there's a problem with the
"Near" [or "Bubble-sorting"] selection algorithm...
I've been thinking about how it should work out, and finally wrote
a little simulation, with five robots, A through E.
Here is a table with the simulated winning percentages of each pair.
vs A B C D E
A - 70 75 80 90
B 30 - 60 65 80
C 25 40 - 52 80
D 20 35 48 - 80
E 10 20 20 20 -
So A is the strongest (by far), B next, C next (barely), D next,
and E last (by far).
I ran a simulated 4000 combats several times, which is
proportionately much higher than what the real arena would see.
Here are some results:
a 0.7302371542
c 0.511662531
b 0.5114541833
d 0.5111336032
e 0.1951466127
a 0.7329490874
d 0.51017294
b 0.5099403579
c 0.5094243505
e 0.2043222004
a 0.7064128257
b 0.517611026
c 0.5165923725
d 0.5160811479
e 0.1944167498
a 0.7388781431
c 0.5084409136
b 0.5077772203
d 0.505758638
e 0.1975051975
Basically, the presence of a very strong A and a very weak E act as
mixers. Whoever is in second (or second to last) very quickly wins
(or loses) enough to move back toward the middle. A ends up with a
winning percentage between its percentage vs B and C... running the
simulation longer gets it very close. Similarly for E: it ends up
with a percentage approximately equal to the best it does vs C and D.
B, C and D end up in the middle, all very close together (usually
within 0.1%!), and their order is not particularly related to their
relative strengths. In the four samples above we see CBD, DBC, BCD,
and CBD (again).
Now, the real mix of winning percentages is not going to be so nice,
but I'm pretty sure that what is going to happen is that if there is a
robot that beats each opponent more than 50% of the time, its
winning percentage will tend toward the average of the winning
percentages it has versus the top few remaining robots. Which one of
those robots gets second place is a question of when the tournament
ends. The second best robot (B) has a better probability of ending up
second, but it's really a matter of luck. And it doesn't matter if
you run the tournament longer. In some ways, that just guarantees
that the middle few will converge to the middle and that the last few
combats will decide the final placement.
I am sure there are lots of other things that could be examined,
including things like A beats B, B beats C, C beats A, or that no
robot beats all others more than 50% of the time. But as it happens,
IonStorm (in my tests) does beat each of the other robots at least 70%
of the time (in Single play). So, I expect that a true bubble sort
selection algorithm would clearly (and fairly) show IonStorm on top,
but the remaining places would be somewhat questionable, and the
situation would get worse as time went on...
Now, I started this message by mentioning that I don't think the
selection algorithm is working quite right. Given the above, that
might not be so bad. Mixing up the opponents, but giving preference
to those near, should help maintain a more representative ordering.
In any case, I do agree that there should be some way to distinguish
among the top few robots better, but I'm not sure yet what that should
be. I do think that a good starting point would be to use random
selection and keep a record of each individual match. Then you could
do some more detailed reporting.
For instance, maybe you could do the rating in two passes. First, you
rank all the robots using their overall win percentages. Then, rank
them again using their win percentages vs. the top five other robots,
or the top 25%, or whatever. (Team mode would be more complicated.)
That's just one idea. But recording the results of individual matches
would allow you to play with some of those kinds of things without
having to re-run the entire tournament. It would also provide some
helpful debugging information when the selection algorithm is not
working as planned (so we could find out, for instance, who IonStorm
is fighting that isn't KillerBees).
[Second Post]
I've been working on evaluating different selection strategies, and I
have a proposal.
The method I used was to write a simulation in which seven robots did
combat. Each robot had a particular probability of winning against
each other robot; different opponents had different probabilities.
In general, there was not an "obvious" order... for instance, A beats
B, B beats C, C beats A, and that sort of thing.
I took the ideal ranking of the robots to be what would be obtained
after a large number of combats in which each pair was chosen an equal
number of times. This just reduces to taking the average of each
robot's per-opponent winning percentage.
So given a defined "best" ranking, I evaluated different selection
strategies based on how quickly and reliably they reproduced the
correct ordering. (The simulated combats are just random numbers
evaluated against the winning percentage of one robot over the other.)
I evaluated purely random pairings as JRobots does now in Random mode,
which I'll call RANDOM and five other selection strategies, all of
which were variations on a weighted selection.
Four of the others were really the same algorithm, just with different
values for one parameter. The algorithm is
1. Pick one robot (Ri) at random (0 <= i < N)
2. Calculate the difference in overall winning percentage between Ri
and for each other robot Rj, 0 <= j < N, j <> i. Call these
differences Dj. 0 <= Dj <= 1 for each j <> i;
3. Calculate "weights" for each robot. Wj = 1 - Dj^P, where ^ is
exponentiation and P is the parameter I mentioned earlier.
0 < P <= 1. (Actually P could be greater than 1, but I didn't
try any such values.) Wj is greater when Dj is smaller, so Wj
increases as the winning percentage of Rj gets closer to that
of Ri. Due to the way that Dj^P looks for 0 <= Dj <= 1, smaller
values of P correspond to "narrower" selections... that is nearer
robots get chosen more often.
4. Choose an opponent Rk with probability Wk / sum(Wj). (Or in team
mode, choose three opponents.)
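Steps 1-4 above can be sketched in Java as follows (a minimal sketch under the stated assumptions; chooseOpponent and its parameters are illustrative names, not the actual JJArena code):

```java
import java.util.Random;

// Sketch of the weighted opponent selection described in steps 1-4 above,
// with the weight formula Wj = 1 - Dj^P.
public class WeightedSelection {
    // winPct[i] is robot i's overall winning percentage in [0, 1];
    // first is the robot already picked in step 1; p is the parameter P.
    static int chooseOpponent(double[] winPct, int first, double p, Random rnd) {
        int n = winPct.length;
        double[] w = new double[n];
        double total = 0.0;
        for (int j = 0; j < n; j++) {
            if (j == first) continue;                        // skip Ri itself
            double dj = Math.abs(winPct[first] - winPct[j]); // step 2: Dj
            w[j] = 1.0 - Math.pow(dj, p);                    // step 3: Wj
            total += w[j];
        }
        // Step 4: pick Rk with probability Wk / sum(Wj) by roulette-wheel.
        double r = rnd.nextDouble() * total;
        for (int j = 0; j < n; j++) {
            if (j == first) continue;
            r -= w[j];
            if (r <= 0) return j;
        }
        return (first + 1) % n; // numerical safety net
    }
    public static void main(String[] args) {
        double[] pct = { 0.73, 0.51, 0.51, 0.51, 0.20 };
        int opp = chooseOpponent(pct, 0, 1.0, new Random());
        System.out.println(opp); // some index from 1 to 4, never 0
    }
}
```

With P = 1.0 the weights fall off linearly with the gap in winning percentage, so robot 0 (at 0.73) is paired with the 0.51 cluster far more often than with the 0.20 straggler.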
As I said, I tried four values for P: 0.25, 0.3333, 0.5 and 1.0. Call
these WEIGHTED(0.25), ... WEIGHTED(1.0).
The last strategy I tried was to increase the probability of a robot
being chosen (in step 1 above) based on how close it was to its
nearest neighbor in overall winning percentage. The idea there was
that allocating more combats to robots that are close together would
disambiguate the ranking more quickly. Call this one GOOFY. (Sorry,
I can't think of a good name... it won't matter.)
In the end, WEIGHTED(1.0) performed the best. In simulated tournaments
of about 6000 simulated combats, that selection strategy yielded the
correct ordering about 70% of the time. RANDOM produced about 45%
correct values. WEIGHTED(0.5) was also about 70% accurate, but was
less accurate for smaller numbers of combats than WEIGHTED(1.0) was.
WEIGHTED(0.3333) and WEIGHTED(0.25) finished at about 68% and 61%
respectively. GOOFY finished at a terrible 44%, about the same as
RANDOM, but looking at the graph of success rate vs. number of
combats, it was barely improving with more combats, and was never
competitive with any of the WEIGHTED strategies.
Assuming the results aren't too specific to the test data I used, it
appears that using WEIGHTED(1.0) or even WEIGHTED(0.5) would result in
approximately a 50% increase in "efficiency" (70 / 45 ≈ 1.56).
It's also depressing to see how _many_ combats are required: 6000 just
to get a 70% chance of being right, and further combats improve the
percentage more and more slowly. I'd guess that the number of
required combats goes up at least linearly and quite possibly as the
square of the number of robots.
Of course, that is all evaluating the selection strategies on getting
the entire ranking correct. An alternative would be to put more
emphasis on just getting the top 6, or the top 25% or top 50% correct.
This would correspond to modifying Step 1 of the algorithm above to
give the chosen few a better chance of getting selected.
[Third and last post]
I modified the second half of initArena(), replacing the old "near"
selection with the weighted selection. I also added two private helper
methods, magnifyUserSelections() and weightedSelection(). I tried to
comment rather thoroughly.
As you read through it, you will see three parameters that you could
adjust: "floor", "threshhold" and "p". "floor" and "threshhold" are
used to compute a value for "c", which was one of two parameters I
mentioned previously.
"floor" is basically the lower limit for "c". I set it to 0.05.
"threshhold" provides some control over how quickly the selection of
the first robot goes from mostly random to biased. Higher values of
"threshhold" result in a slower transition. I made this 5 times the
number of robots in the tournament.
"p" is used to control how heavily the selection of additional robots
is biased towards those with a winning percentage near that of the
first robot. Smaller values of "p" produce a higher bias. Larger
values produce a more even selection. I set this to 1.0, which was
the value that worked best in my simulations.
The algorithm does correctly handle the cases where the user has
chosen one or more robots in the list. Robots are picked
preferentially from the selected items, and then from the unselected
items if necessary.
I ran several thousand single and team matches, and the match
distribution seems reasonable.
It may shift percentages downward, since the best robots get a
disproportionate number of matches (meaning that others will be beaten
more than usual). You may want to consider whether or not to adjust
the 50/50/25 rule based on this effect.
For instance, I got the following results for fourteen robots in
Singles matches (run at 4X, if it matters):
Robot Wins Matches Win %
------------------------------------------
IonStorm 620 660 93.9
KillerBees 570 707 80.6
Myst 356 601 59.2
IlTristoSmorzatore 339 584 58.0
Fish 237 524 45.2
NeoMonty 230 525 43.8
LvRDumber 206 473 43.6
Pulse1 215 530 40.6
MontyZ 168 434 38.7
Marvin3 175 481 36.4
Epa1 137 446 30.7
Firetron 124 412 30.1
Jimbo 58 305 19.0
bdj1 19 226 8.4
You can see that the best robots got about three times as many matches
as the worst one, and that 10 of 14 ended up below 50%, but only 6
ended up below 40%.
[To read the code, go to the downloads section
http://jrobots.sourceforge.net/downloads.shtml ,
download the source code of the simulator of
Jrobots and open the file JJArena.java]
Appendix B - The Virtual Clock Generator
JJRobots Virtual Clock Generator (March 2002)
--------------------------------
This latest version of the game actually brings a little revolution with it: in
order to obtain faithful results of robot performance across a vast multitude
of environments, OSes, Virtual Machines, CPUs, it was necessary to separate the
time as seen by the robots (virtual time) from the time seen by the observer
(real time), in order to provide constant CPU power per second to the robot
threads.
With this version, the simulation time (or virtual time) is calibrated around
the speed of the machine it's running on by a special "benchmark" function; as
a result, the *real* speed of the simulation now varies according to the speed
of the CPU, JVM and OS. So, you may notice an increase or decrease of the
number of matches per hour of real time that can be played on your machine, but
you can be (almost!) guaranteed that the performance of the robots tested and
developed on your machine will be consistent across all the possible
environments they could be executed on in the real tournament. CPU contention
due to other running programs will be gracefully accommodated, causing the
simulation to slow down, rather than becoming "jumpy" as with earlier versions.
A new option has been provided in the user interface: "Display", which can be
set to "Smooth" (60 fps of real time), "Fast" (30 fps of *virtual* time) and
"None", where the match is not shown but the clock counter is updated and the
results are displayed; this setting can be easily changed even within a match.
"None" corresponds most closely with the way actual tournament matches are
executed, as well as being a bit faster.
When watching a match, choose between Smooth and Fast, whichever looks best for
you. While running extended tests, you might consider opting for "None" in
order to get a higher number of matches played per hour. Whatever you choose,
unless you have a very fast machine you will notice a considerable slowdown in
Team play. That is actually one of the biggest benefits of virtual time, as
it's now granting a lot more CPU time to the robots, which can now make use of
more advanced targeting techniques even in this previously CPU-constrained mode
of play.
As a way to compensate for the possible slowdown / speedup of
visualization, two new simulation speed settings have been provided: 8X and
1/8X; however, keep in mind that changing the simulation speed actually changes
the CPU time allocated to the robots, so in order to obtain the most faithful
picture of your robot's performance, do intensive testings at the official
speed of 1X.
Finally, you should expect to see some changes to win rates, especially in Team
mode where robots are now given more CPU time than they were previously.
Alan Lund (alan.lund @ acm.org)
Walter Nisticò (walnist @ infinito.it)