10655
179
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:25 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
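This failure (and the identical one below) comes from the solution requesting `Duckietown-Lf-Lfv-Navv-Silent-v1` while the evaluator only registers `-v0`. A minimal sketch of defensive version resolution on the solution side; `resolve_env_id` is a hypothetical helper, not part of duckietown-challenges:

```python
import re

def resolve_env_id(requested, registered):
    """Return `requested` if it is registered; otherwise fall back to the
    newest registered version of the same base name (illustrative helper)."""
    if requested in registered:
        return requested
    base = re.sub(r"-v\d+$", "", requested)
    versions = sorted(
        (env_id for env_id in registered
         if re.fullmatch(re.escape(base) + r"-v\d+", env_id)),
        key=lambda env_id: int(env_id.rsplit("-v", 1)[1]),
    )
    if not versions:
        raise LookupError("Env %s not found (valid versions include %s)"
                          % (requested, registered))
    return versions[-1]

# The submissions above requested -v1 while only -v0 was registered:
print(resolve_env_id("Duckietown-Lf-Lfv-Navv-Silent-v1",
                     ["Duckietown-Lf-Lfv-Navv-Silent-v0"]))
# → Duckietown-Lf-Lfv-Navv-Silent-v0
```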
No reset possible
10644
217
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:32 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Env Duckietown-Lf-Lfv-Navv-Silent-v1 not found (valid versions include ['Duckietown-Lf-Lfv-Navv-Silent-v0'])
No reset possible
10630
224
Julian Zilly Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:44 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 581, in wrap_solution
solution.run(cis)
File "solution.py", line 88, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Can't connect to the gym-duckietown-server
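Several failures in this listing are the same transient networking problem ("Can't connect to the gym-duckietown-server" / "Giving up to connect ... at host: evaluator"). A hedged sketch of client-side retry with exponential backoff; `connect` stands in for whatever socket setup the agent actually performs (duckietown-slimremote's real API differs):

```python
import time

def connect_with_retries(connect, attempts=5, base_delay=0.1, sleep=time.sleep):
    """Call `connect()` until it succeeds, backing off exponentially.
    `connect` is a placeholder for the agent's real connection setup."""
    for attempt in range(attempts):
        try:
            return connect()
        except ConnectionError:
            if attempt == attempts - 1:
                raise ConnectionError(
                    "Giving up to connect to the gym duckietown server")
            sleep(base_delay * 2 ** attempt)  # 0.1s, 0.2s, 0.4s, ...
```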
No reset possible
10553
347
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:20:00 Timeout:
Waited 1140.60789084 for container to finish. Giving up.
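The evaluator's "Waited ... for container to finish. Giving up." messages come from a watchdog that polls the container under a deadline. A minimal sketch of that pattern (the real duckietown-challenges watchdog is more involved):

```python
import time

def wait_for(poll, timeout, interval=0.05):
    """Poll `poll()` until it returns truthy or `timeout` seconds elapse;
    returns the elapsed time on success, raises TimeoutError otherwise."""
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        if poll():
            return time.monotonic() - start
        time.sleep(interval)
    waited = time.monotonic() - start
    raise TimeoutError("Waited %s for container to finish. Giving up." % waited)
```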
No reset possible
10552
348
Bhairav Mehta PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:00:55 InvalidEnvironment:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 482, in wrap_evaluator
raise InvalidEnvironment(out[SPECIAL_INVALID_ENVIRONMENT])
InvalidEnvironment: InvalidEnvironment:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 117, in run
solve(params, cis)
File "solution.py", line 74, in solve
check_valid_observations(observation)
File "solution.py", line 21, in check_valid_observations
raise InvalidEnvironment(msg)
InvalidEnvironment: I expected size (3, 480, 640), while I got size (3, 120, 160)
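This InvalidEnvironment is a deliberate sanity check in the submission: the simulator served 120×160 camera images while the policy network was built for 480×640. A sketch of such a check, assuming the `(channels, height, width)` convention; the template's actual `check_valid_observations` in `solution.py` may differ:

```python
EXPECTED_SHAPE = (3, 480, 640)  # resolution the DDPG template assumed

def check_valid_observation(obs_shape, expected=EXPECTED_SHAPE):
    """Fail fast when the simulator's image resolution does not match
    the input size the policy network was built for."""
    if tuple(obs_shape) != tuple(expected):
        raise ValueError("I expected size %s, while I got size %s"
                         % (tuple(expected), tuple(obs_shape)))
```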
No reset possible
10506
354
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:09 Timeout:
Waited 601.24152422 for container to finish. Giving up.
No reset possible
10503
356
Bhairav Mehta PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:00:34 Error while running [...]
Pulling solution ... error
stderr | ERROR: for solution Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible
10474
389
Liam Paull 🇨🇦Random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:03
driven_lanedir_median 0.828310674996585 deviation-center-line_median 0.39675702124407014 in-drivable-lane_median 0
other stats
deviation-center-line_max 0.7179354734005684 deviation-center-line_mean 0.4064549886813335 deviation-center-line_min 0.214033161323768
deviation-heading_max 3.82953016565333 deviation-heading_mean 1.496615913836184 deviation-heading_median 1.0018811217976218 deviation-heading_min 0.18888049888053884
driven_any_max 1.6148765125383513 driven_any_mean 0.8644251786016122 driven_any_median 0.831568440938805 driven_any_min 0.3219592324979507
driven_lanedir_max 1.255171198733825 driven_lanedir_mean 0.7417908945600653 driven_lanedir_min 0.16277084737591307
in-drivable-lane_max 0.9999999999999964 in-drivable-lane_mean 0.32666666666666594 in-drivable-lane_min 0
per-episodes details {"ep000": {"driven_any": 0.37570161473959574, "driven_lanedir": 0.16277084737591307, "in-drivable-lane": 0.6333333333333332, "deviation-heading": 1.9914188553009555, "deviation-center-line": 0.214033161323768}, "ep001": {"driven_any": 0.3219592324979507, "driven_lanedir": 0.3207934956529499, "in-drivable-lane": 0, "deviation-heading": 0.18888049888053884, "deviation-center-line": 0.2509394613684371}, "ep002": {"driven_any": 1.1780200922933584, "driven_lanedir": 1.1419082560410534, "in-drivable-lane": 0, "deviation-heading": 1.0018811217976218, "deviation-center-line": 0.45260982606982386}, "ep003": {"driven_any": 0.831568440938805, "driven_lanedir": 0.828310674996585, "in-drivable-lane": 0, "deviation-heading": 0.4713689275484741, "deviation-center-line": 0.39675702124407014}, "ep004": {"driven_any": 1.6148765125383513, "driven_lanedir": 1.255171198733825, "in-drivable-lane": 0.9999999999999964, "deviation-heading": 3.82953016565333, "deviation-center-line": 0.7179354734005684}}
No reset possible
10462
394
Maxim Kuzmin 🇷🇺Random execution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:58
No reset possible
10447
394
Maxim Kuzmin 🇷🇺Random execution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:12
survival_time_median 8.49999999999998
other stats
episodes details {"ep000": {"nsteps": 78, "reward": -13.073740402221656, "good_angle": 1.2822185210774788, "survival_time": 2.6000000000000005, "traveled_tiles": 1, "valid_direction": 2.2}, "ep001": {"nsteps": 71, "reward": -14.64982845078052, "good_angle": 0.01315761006529569, "survival_time": 2.366666666666668, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 255, "reward": -4.574526128406618, "good_angle": 1.054991355859468, "survival_time": 8.49999999999998, "traveled_tiles": 3, "valid_direction": 2.466666666666659}, "ep003": {"nsteps": 456, "reward": -2.163129653090676, "good_angle": 1.4434967110165102, "survival_time": 15.199999999999957, "traveled_tiles": 4, "valid_direction": 2.4333333333333247}, "ep004": {"nsteps": 345, "reward": -3.663357601692711, "good_angle": 16.137431452383584, "survival_time": 11.49999999999997, "traveled_tiles": 3, "valid_direction": 3.93333333333332}}
good_angle_max 16.137431452383584 good_angle_mean 3.9862591300804673 good_angle_median 1.2822185210774788 good_angle_min 0.01315761006529569
reward_max -2.163129653090676 reward_mean -7.624916447238436 reward_median -4.574526128406618 reward_min -14.64982845078052
survival_time_max 15.199999999999957 survival_time_mean 8.033333333333315 survival_time_min 2.366666666666668
traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1
valid_direction_max 3.93333333333332 valid_direction_mean 2.206666666666661 valid_direction_median 2.4333333333333247 valid_direction_min 0
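The aggregate statistics in these scoring rows are plain reductions over the five per-episode values; for instance, the survival_time_median above can be reproduced with the standard library:

```python
from statistics import mean, median

# survival_time of ep000..ep004 from the scoring row above
survival_times = [2.6000000000000005, 2.366666666666668, 8.49999999999998,
                  15.199999999999957, 11.49999999999997]

print(median(survival_times))  # → 8.49999999999998 (the reported median)
print(mean(survival_times))    # ≈ 8.033333333333315 (the reported mean)
```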
No reset possible
10437
393
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:58 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 89, in run
raise InvalidSubmission(str(e))
InvalidSubmission: error loading <rosparam> tag:
'param' attribute must be set for non-dictionary values
XML is <rosparam command="load" file="$(find duckietown)/config/$(arg config)/line_detector/$(arg node_name)/$(arg param_file_name).yaml"/>
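The roslaunch error above is not about the tag syntax itself: `<rosparam command="load" file="..."/>` without a `param=` attribute is only valid when the YAML file's top level is a mapping, because roslaunch otherwise has no parameter name to bind the value to. The likely cause here is a YAML file whose top level is a scalar or list, or a `$(arg ...)` expansion pointing at the wrong file. The two valid forms, sketched with hypothetical file names:

```xml
<!-- Valid without param=: the YAML's top level is a mapping -->
<rosparam command="load"
          file="$(find duckietown)/config/baseline/line_detector/line_detector_node/default.yaml"/>

<!-- A scalar- or list-valued YAML must name the parameter explicitly -->
<rosparam command="load" param="hough_threshold"
          file="$(find duckietown)/config/baseline/line_detector/hough_threshold.yaml"/>
```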
No reset possible
10422
408
Liam Paull 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:02
No reset possible
10418
408
Liam Paull 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:44
survival_time_median 3.133333333333332
other stats
episodes details {"ep000": {"nsteps": 73, "reward": -13.913316035566671, "good_angle": 1.045392562444471, "survival_time": 2.4333333333333345, "traveled_tiles": 1, "valid_direction": 1.966666666666668}, "ep001": {"nsteps": 86, "reward": -11.956789544220408, "good_angle": 3.02166692046445, "survival_time": 2.8666666666666663, "traveled_tiles": 2, "valid_direction": 2.033333333333333}, "ep002": {"nsteps": 156, "reward": -7.691832452564715, "good_angle": 1.562632668279302, "survival_time": 5.199999999999991, "traveled_tiles": 4, "valid_direction": 3.2999999999999936}, "ep003": {"nsteps": 119, "reward": -9.212272953297887, "good_angle": 1.330101325560742, "survival_time": 3.9666666666666623, "traveled_tiles": 2, "valid_direction": 3.099999999999996}, "ep004": {"nsteps": 94, "reward": -11.344474870533226, "good_angle": 1.3763510691387104, "survival_time": 3.133333333333332, "traveled_tiles": 2, "valid_direction": 2.633333333333332}}
good_angle_max 3.02166692046445 good_angle_mean 1.6672289091775354 good_angle_median 1.3763510691387104 good_angle_min 1.045392562444471
reward_max -7.691832452564715 reward_mean -10.823737171236584 reward_median -11.344474870533226 reward_min -13.913316035566671
survival_time_max 5.199999999999991 survival_time_mean 3.519999999999998 survival_time_min 2.4333333333333345
traveled_tiles_max 4 traveled_tiles_mean 2.2 traveled_tiles_median 2 traveled_tiles_min 1
valid_direction_max 3.2999999999999936 valid_direction_mean 2.6066666666666647 valid_direction_median 2.633333333333332 valid_direction_min 1.966666666666668
No reset possible
10412
403
Liam Paull 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:00 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 89, in run
raise InvalidSubmission(str(e))
InvalidSubmission: error loading <rosparam> tag:
'param' attribute must be set for non-dictionary values
XML is <rosparam command="load" file="$(find duckietown)/config/$(arg config)/line_detector/$(arg node_name)/$(arg param_file_name).yaml"/>
No reset possible
10403
410
Liam Paull 🇨🇦Template for ROS Submission aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:22
survival_time_median 4.599999999999993
other stats
episodes details {"ep000": {"nsteps": 57, "reward": -17.750922405185474, "good_angle": 0.8141817641246087, "survival_time": 1.9000000000000024, "traveled_tiles": 1, "valid_direction": 1.433333333333336}, "ep001": {"nsteps": 178, "reward": -7.768181461301957, "good_angle": 17.859116692064553, "survival_time": 5.933333333333322, "traveled_tiles": 2, "valid_direction": 4.999999999999989}, "ep002": {"nsteps": 89, "reward": -11.491952836764662, "good_angle": 0.17297111778732624, "survival_time": 2.966666666666666, "traveled_tiles": 2, "valid_direction": 0.6999999999999975}, "ep003": {"nsteps": 138, "reward": -8.431312846950975, "good_angle": 1.5809971585343194, "survival_time": 4.599999999999993, "traveled_tiles": 2, "valid_direction": 3.4333333333333256}, "ep004": {"nsteps": 316, "reward": -4.749980154913656, "good_angle": 29.774166774820433, "survival_time": 10.533333333333308, "traveled_tiles": 4, "valid_direction": 6.0999999999999845}}
good_angle_max 29.774166774820433 good_angle_mean 10.040286701466249 good_angle_median 1.5809971585343194 good_angle_min 0.17297111778732624
reward_max -4.749980154913656 reward_mean -10.038469941023346 reward_median -8.431312846950975 reward_min -17.750922405185474
survival_time_max 10.533333333333308 survival_time_mean 5.1866666666666585 survival_time_min 1.9000000000000024
traveled_tiles_max 4 traveled_tiles_mean 2.2 traveled_tiles_median 2 traveled_tiles_min 1
valid_direction_max 6.0999999999999845 valid_direction_mean 3.3333333333333264 valid_direction_median 3.4333333333333256 valid_direction_min 0.6999999999999975
No reset possible
10376
416
Benjamin Ramtoula 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:09:44 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 88, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible
10363
424
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:38
No reset possible
10349
424
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:25
survival_time_median 3.0333333333333323
other stats
episodes details {"ep000": {"nsteps": 78, "reward": -13.046566619245892, "good_angle": 1.245016835404162, "survival_time": 2.6000000000000005, "traveled_tiles": 1, "valid_direction": 2.0666666666666673}, "ep001": {"nsteps": 91, "reward": -11.262670951506513, "good_angle": 3.4341853646336196, "survival_time": 3.0333333333333323, "traveled_tiles": 2, "valid_direction": 2.466666666666666}, "ep002": {"nsteps": 180, "reward": -6.989241703059007, "good_angle": 2.101766745926625, "survival_time": 5.9999999999999885, "traveled_tiles": 4, "valid_direction": 4.466666666666658}, "ep003": {"nsteps": 117, "reward": -9.302637351330809, "good_angle": 1.289583186306075, "survival_time": 3.899999999999996, "traveled_tiles": 2, "valid_direction": 2.733333333333329}, "ep004": {"nsteps": 76, "reward": -13.879091851730328, "good_angle": 1.4151845278591406, "survival_time": 2.533333333333334, "traveled_tiles": 2, "valid_direction": 2.033333333333334}}
good_angle_max 3.4341853646336196 good_angle_mean 1.897147332025925 good_angle_median 1.4151845278591406 good_angle_min 1.245016835404162
reward_max -6.989241703059007 reward_mean -10.89604169537451 reward_median -11.262670951506513 reward_min -13.879091851730328
survival_time_max 5.9999999999999885 survival_time_mean 3.6133333333333306 survival_time_min 2.533333333333334
traveled_tiles_max 4 traveled_tiles_mean 2.2 traveled_tiles_median 2 traveled_tiles_min 1
valid_direction_max 4.466666666666658 valid_direction_mean 2.753333333333331 valid_direction_median 2.466666666666666 valid_direction_min 2.033333333333334
No reset possible
10314
459
Dzenan Lapandic Tensorflow template aido1_LF1_r3-v3
step3-videos error yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:16 Timeout:
Waited 601.96602416 for container to finish. Giving up.
No reset possible
10310
459
Dzenan Lapandic Tensorflow template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:47
survival_time_median 16.666666666666654
other stats
episodes details {"ep000": {"nsteps": 500, "reward": -0.031212838899889905, "good_angle": 8.88899949058751, "survival_time": 16.666666666666654, "traveled_tiles": 10, "valid_direction": 3.033333333333331}, "ep001": {"nsteps": 52, "reward": -19.64357185062881, "good_angle": 0.29079419509175913, "survival_time": 1.7333333333333354, "traveled_tiles": 1, "valid_direction": 0.6666666666666681}, "ep002": {"nsteps": 500, "reward": 0.12084204444650094, "good_angle": 0.3711774951313858, "survival_time": 16.666666666666654, "traveled_tiles": 11, "valid_direction": 0.6999999999999975}, "ep003": {"nsteps": 500, "reward": 0.058611713459176824, "good_angle": 0.23460770646770485, "survival_time": 16.666666666666654, "traveled_tiles": 10, "valid_direction": 0.29999999999999893}, "ep004": {"nsteps": 111, "reward": -9.028757644372549, "good_angle": 0.6850287522463006, "survival_time": 3.6999999999999966, "traveled_tiles": 2, "valid_direction": 1.1666666666666643}}
good_angle_max 8.88899949058751 good_angle_mean 2.094121527904932 good_angle_median 0.3711774951313858 good_angle_min 0.23460770646770485
reward_max 0.12084204444650094 reward_mean -5.7048177151991135 reward_median -0.031212838899889905 reward_min -19.64357185062881
survival_time_max 16.666666666666654 survival_time_mean 11.086666666666655 survival_time_min 1.7333333333333354
traveled_tiles_max 11 traveled_tiles_mean 6.8 traveled_tiles_median 10 traveled_tiles_min 1
valid_direction_max 3.033333333333331 valid_direction_mean 1.173333333333332 valid_direction_median 0.6999999999999975 valid_direction_min 0.29999999999999893
No reset possible
10257
459
Dzenan Lapandic Tensorflow template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:37
No reset possible
10251
463
Orlando Marquez 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:28 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 105, in run
solve(params, cis)
File "solution.py", line 70, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible
10241
466
Ruixiang Zhang 🇨🇦Random execution aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:13 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 64, in run
solve(gym_environment, cis) # let's try to solve the challenge, exciting ah?
File "solution.py", line 39, in solve
observation, reward, done, info = env.step(action)
File "/usr/local/lib/python2.7/dist-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/notebooks/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/notebooks/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/notebooks/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible
10240
467
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:56 The result file is not found. This usually means that the evaluator did not finish
and sometimes that there was an import error.
Check the evaluator log to see what happened.
No reset possible
10225
473
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:51 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 59, in run
raise InvalidSubmission(str(e))
InvalidSubmission: local variable 'a' referenced before assignment
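The "local variable 'a' referenced before assignment" above is the classic pattern where the action variable is only bound inside a conditional branch. A minimal sketch of the bug and its fix; the function and variable names are hypothetical, chosen only to mirror the error:

```python
def choose_action(confidence, threshold=0.5):
    """Always bind a default action before any branch, so every code
    path returns a bound value."""
    a = (0.0, 0.0)        # default: stop. Without this line, the
                          # low-confidence path raises UnboundLocalError
    if confidence > threshold:
        a = (0.5, 0.0)    # drive forward
    return a

print(choose_action(0.9))  # → (0.5, 0.0)
print(choose_action(0.1))  # → (0.0, 0.0)
```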
No reset possible
10212
490
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:38
driven_lanedir_median 0.21787974222140205 deviation-center-line_median 0.20204868373001375 in-drivable-lane_median 0
other stats
deviation-center-line_max 0.389070689680848 deviation-center-line_mean 0.20406881355983664 deviation-center-line_min 0.07061055779700529
deviation-heading_max 1.8694245663191784 deviation-heading_mean 0.851116106665668 deviation-heading_median 0.6129558246408027 deviation-heading_min 0.419494114396449
driven_any_max 0.5571586675755388 driven_any_mean 0.29928835217431277 driven_any_median 0.23572244180842228 driven_any_min 0.06427777830553057
driven_lanedir_max 0.4289623127780047 driven_lanedir_mean 0.23175050592350727 driven_lanedir_min 0.04690391464083277
in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0
per-episodes details {"ep000": {"driven_any": 0.5571586675755388, "driven_lanedir": 0.2870574349158814, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.8694245663191784, "deviation-center-line": 0.2212133074217656}, "ep001": {"driven_any": 0.06427777830553057, "driven_lanedir": 0.04690391464083277, "in-drivable-lane": 0, "deviation-heading": 0.419494114396449, "deviation-center-line": 0.07061055779700529}, "ep002": {"driven_any": 0.4393004122047971, "driven_lanedir": 0.4289623127780047, "in-drivable-lane": 0, "deviation-heading": 0.6129558246408027, "deviation-center-line": 0.389070689680848}, "ep003": {"driven_any": 0.19998246097727512, "driven_lanedir": 0.17794912506141536, "in-drivable-lane": 0, "deviation-heading": 0.7518048933288591, "deviation-center-line": 0.1374008291695507}, "ep004": {"driven_any": 0.23572244180842228, "driven_lanedir": 0.21787974222140205, "in-drivable-lane": 0, "deviation-heading": 0.6019011346430503, "deviation-center-line": 0.20204868373001375}}
No reset possible
10204
483
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:48 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 63, in run
raise InvalidSubmission(str(e))
InvalidSubmission: unbound method __init__() must be called with ObservationWrapper instance as first argument (got undistort_JP instance instead)
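The "unbound method __init__() must be called with ObservationWrapper instance" failure is a Python 2 class pitfall: the `undistort_JP` wrapper called `ObservationWrapper.__init__` without actually inheriting from it, so the explicit base-class call rejected the instance. A self-contained sketch of the correct shape, using a stand-in base class (the real base is `gym.ObservationWrapper`):

```python
class ObservationWrapper(object):
    """Stand-in for gym.ObservationWrapper (sketch)."""
    def __init__(self, env):
        self.env = env

    def observation(self, obs):
        raise NotImplementedError

class UndistortJP(ObservationWrapper):   # must inherit from the base, or the
    def __init__(self, env):             # explicit base __init__ call rejects
        # the instance on Python 2 with exactly the error seen above.
        super(UndistortJP, self).__init__(env)
        # Python 2 spelling: ObservationWrapper.__init__(self, env)

    def observation(self, obs):
        return obs  # camera undistortion would go here
```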
No reset possible
10165
513
Dzenan Lapandic Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:27
driven_lanedir_median 0.36515912995221256 deviation-center-line_median 0.27711992167464045 in-drivable-lane_median 0
other stats
deviation-center-line_max 0.3117033235146992 deviation-center-line_mean 0.2623622272834181 deviation-center-line_min 0.1665246338038381
deviation-heading_max 2.574347252804103 deviation-heading_mean 0.8892329512459147 deviation-heading_median 0.4770064183773758 deviation-heading_min 0.24848767262080632
driven_any_max 0.6483408761308959 driven_any_mean 0.40453323349081344 driven_any_median 0.38822633693573416 driven_any_min 0.1747027989975962
driven_lanedir_max 0.6375019301596847 driven_lanedir_mean 0.3542640235155541 driven_lanedir_min 0.16513136021295438
in-drivable-lane_max 0.7333333333333333 in-drivable-lane_mean 0.14666666666666667 in-drivable-lane_min 0
per-episodes details {"ep000": {"driven_any": 0.38822633693573416, "driven_lanedir": 0.16513136021295438, "in-drivable-lane": 0.7333333333333333, "deviation-heading": 2.574347252804103, "deviation-center-line": 0.27711992167464045}, "ep001": {"driven_any": 0.1747027989975962, "driven_lanedir": 0.17209583435129394, "in-drivable-lane": 0, "deviation-heading": 0.24848767262080632, "deviation-center-line": 0.1665246338038381}, "ep002": {"driven_any": 0.37269882609342025, "driven_lanedir": 0.36515912995221256, "in-drivable-lane": 0, "deviation-heading": 0.4770064183773758, "deviation-center-line": 0.26623431759254207}, "ep003": {"driven_any": 0.6483408761308959, "driven_lanedir": 0.6375019301596847, "in-drivable-lane": 0, "deviation-heading": 0.6865299431479928, "deviation-center-line": 0.2902289398313708}, "ep004": {"driven_any": 0.43869732929642086, "driven_lanedir": 0.4314318629016247, "in-drivable-lane": 0, "deviation-heading": 0.4597934692792956, "deviation-center-line": 0.3117033235146992}}
No reset possible
10141
513
Dzenan Lapandic Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:05:20
No reset possible
10136
513
Dzenan Lapandic Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:44
survival_time_median 3.3999999999999977
other stats
episodes details {"ep000": {"nsteps": 102, "reward": -10.132622478844818, "good_angle": 1.4670832287292752, "survival_time": 3.3999999999999977, "traveled_tiles": 1, "valid_direction": 2.799999999999998}, "ep001": {"nsteps": 47, "reward": -21.83779888267213, "good_angle": 0.04671686943416164, "survival_time": 1.5666666666666682, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 98, "reward": -10.615272532008133, "good_angle": 0.1375469760389463, "survival_time": 3.266666666666665, "traveled_tiles": 2, "valid_direction": 0.6999999999999975}, "ep003": {"nsteps": 169, "reward": -6.143681455668084, "good_angle": 0.19516475627043273, "survival_time": 5.633333333333323, "traveled_tiles": 2, "valid_direction": 0.8666666666666636}, "ep004": {"nsteps": 115, "reward": -9.104113079153974, "good_angle": 0.1322048698500476, "survival_time": 3.8333333333333295, "traveled_tiles": 1, "valid_direction": 0.6999999999999975}}
good_angle_max 1.4670832287292752 good_angle_mean 0.3957433400645727 good_angle_median 0.1375469760389463 good_angle_min 0.04671686943416164
reward_max -6.143681455668084 reward_mean -11.566697685669428 reward_median -10.132622478844818 reward_min -21.83779888267213
survival_time_max 5.633333333333323 survival_time_mean 3.5399999999999965 survival_time_min 1.5666666666666682
traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1
valid_direction_max 2.799999999999998 valid_direction_mean 1.0133333333333314 valid_direction_median 0.6999999999999975 valid_direction_min 0
No reset possible 10118
513
Dzenan Lapandic Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:14
No reset possible 10109
522
Anton Mashikhin 🇷🇺 AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step2-scoring aborted yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:18 Error while running [...]
Pulling scorer ... error
stderr | ERROR: for scorer Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible 10075
522
Anton Mashikhin 🇷🇺 AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:43
No reset possible 10029
538
Jonathan Plante 🇨🇦 Random execution aido1_LF1_r3-v3
step4-viz error yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:14:37 Timeout:
Waited 600.801052094 for container to finish. Giving up.
No reset possible 9960
560
Dzenan Lapandic Baseline solution using imitation learning from logs aido1_LFV_r1-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:23
No reset possible 9942
566
Jonathan Plante 🇨🇦 JP pipeline aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:10 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 63, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
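"Giving up to connect to the gym duckietown server at host: evaluator" recurs across several submissions in this log: the solution container starts before the simulator container is accepting connections. One generic mitigation is to poll the server's TCP port against a deadline before constructing the client; a stdlib-only sketch (host, port, and timeouts are illustrative, not the challenge's actual configuration):

```python
import socket
import time

def wait_for_server(host, port, timeout_s=60.0, poll_s=2.0):
    """Poll host:port until a TCP connect succeeds; return False on deadline."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            # A successful connect means the server is up; close immediately.
            socket.create_connection((host, port), timeout=poll_s).close()
            return True
        except OSError:
            time.sleep(poll_s)
    return False

# e.g. wait_for_server("evaluator", 8902) before creating the gym client
# (port number here is a placeholder)
```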
No reset possible 9903
583
Jonathan Plante 🇨🇦 JP pipeline aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:01
driven_lanedir_median 0.9974379100572336 deviation-center-line_median 0.6560590207850371 in-drivable-lane_median 3.699999999999987
other stats deviation-center-line_max 1.8788618819421845 deviation-center-line_mean 0.7440539779834248 deviation-center-line_min 0 deviation-heading_max 3.6541289260109022 deviation-heading_mean 1.7614495500973806 deviation-heading_median 1.3203671878871144 deviation-heading_min 0 driven_any_max 2.444226372077783 driven_any_mean 1.6843065428712958 driven_any_median 1.6572132165268052 driven_any_min 0.9085628560098704 driven_lanedir_max 2.2570891926315286 driven_lanedir_mean 0.9682329297041452 driven_lanedir_min 0 in-drivable-lane_max 11.099999999999971 in-drivable-lane_mean 4.733333333333325 in-drivable-lane_min 0.6333333333333648 per-episodes details {"ep000": {"driven_any": 1.6572132165268052, "driven_lanedir": 0, "in-drivable-lane": 11.099999999999971, "deviation-heading": 0, "deviation-center-line": 0}, "ep001": {"driven_any": 1.6544189234489706, "driven_lanedir": 1.1038938191780283, "in-drivable-lane": 3.699999999999987, "deviation-heading": 2.7077627268438604, "deviation-center-line": 0.6560590207850371}, "ep002": {"driven_any": 0.9085628560098704, "driven_lanedir": 0.48274372665393583, "in-drivable-lane": 2.79999999999999, "deviation-heading": 1.3203671878871144, "deviation-center-line": 0.30032301321125654}, "ep003": {"driven_any": 1.75711134629305, "driven_lanedir": 0.9974379100572336, "in-drivable-lane": 5.433333333333315, "deviation-heading": 1.1249889097450263, "deviation-center-line": 0.885025973978646}, "ep004": {"driven_any": 2.444226372077783, "driven_lanedir": 2.2570891926315286, "in-drivable-lane": 0.6333333333333648, "deviation-heading": 3.6541289260109022, "deviation-center-line": 1.8788618819421845}}
No reset possible 9866
582
Jonathan Plante 🇨🇦 JP pipeline aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:12
No reset possible 9848
608
David Abraham Random execution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:41
No reset possible 9824
608
David Abraham Random execution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:26
survival_time_median 8.466666666666647
other stats episodes details {"ep000": {"nsteps": 76, "reward": -13.42108862062818, "good_angle": 1.27845074693678, "survival_time": 2.533333333333334, "traveled_tiles": 1, "valid_direction": 2.1666666666666674}, "ep001": {"nsteps": 61, "reward": -16.952884606650617, "good_angle": 0.01694829582997042, "survival_time": 2.033333333333336, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 254, "reward": -4.308219343446302, "good_angle": 0.9057175912932236, "survival_time": 8.466666666666647, "traveled_tiles": 3, "valid_direction": 1.899999999999994}, "ep003": {"nsteps": 455, "reward": -2.1754017377785244, "good_angle": 1.4324059816508266, "survival_time": 15.166666666666623, "traveled_tiles": 4, "valid_direction": 2.3999999999999915}, "ep004": {"nsteps": 345, "reward": -3.5136348463038987, "good_angle": 18.324189158401456, "survival_time": 11.49999999999997, "traveled_tiles": 3, "valid_direction": 3.899999999999987}}
good_angle_max 18.324189158401456 good_angle_mean 4.391542354822451 good_angle_median 1.27845074693678 good_angle_min 0.01694829582997042 reward_max -2.1754017377785244 reward_mean -8.074245830961505 reward_median -4.308219343446302 reward_min -16.952884606650617 survival_time_max 15.166666666666623 survival_time_mean 7.939999999999982 survival_time_min 2.033333333333336 traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 3.899999999999987 valid_direction_mean 2.073333333333328 valid_direction_median 2.1666666666666674 valid_direction_min 0
No reset possible 9699
1264
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:38
survival_time_median 2.4666666666666677
other stats episodes details {"ep000": {"nsteps": 244, "reward": -4.374022879227462, "good_angle": 0.6697246300368211, "survival_time": 8.133333333333315, "traveled_tiles": 2, "valid_direction": 1.5999999999999943}, "ep001": {"nsteps": 70, "reward": -14.862278394613949, "good_angle": 0.02721982411650538, "survival_time": 2.333333333333335, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 46, "reward": -22.19249565705009, "good_angle": 0.5394539707593838, "survival_time": 1.5333333333333348, "traveled_tiles": 2, "valid_direction": 0.6333333333333349}, "ep003": {"nsteps": 127, "reward": -8.308228122143765, "good_angle": 0.31993870085218107, "survival_time": 4.233333333333328, "traveled_tiles": 2, "valid_direction": 0.4999999999999988}, "ep004": {"nsteps": 74, "reward": -14.009329262617472, "good_angle": 0.30200732283866505, "survival_time": 2.4666666666666677, "traveled_tiles": 1, "valid_direction": 0.7}}
good_angle_max 0.6697246300368211 good_angle_mean 0.37166888972071127 good_angle_median 0.31993870085218107 good_angle_min 0.02721982411650538 reward_max -4.374022879227462 reward_mean -12.749270863130548 reward_median -14.009329262617472 reward_min -22.19249565705009 survival_time_max 8.133333333333315 survival_time_mean 3.739999999999996 survival_time_min 1.5333333333333348 traveled_tiles_max 2 traveled_tiles_mean 1.6 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.5999999999999943 valid_direction_mean 0.6866666666666656 valid_direction_median 0.6333333333333349 valid_direction_min 0
No reset possible 9695
1263
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:46
No reset possible 9693
1263
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:13
No reset possible 9692
1262
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:14
driven_lanedir_median 0.22288207636524104 deviation-center-line_median 0.2057823206070708 in-drivable-lane_median 0
other stats deviation-center-line_max 0.36384835245355496 deviation-center-line_mean 0.24842903696303711 deviation-center-line_min 0.1456580635185044 deviation-heading_max 1.690087737774333 deviation-heading_mean 0.7746338834775621 deviation-heading_median 0.5984170646104535 deviation-heading_min 0.2159015006763972 driven_any_max 0.8036065767750349 driven_any_mean 0.3728660027172318 driven_any_median 0.2392848088637356 driven_any_min 0.1821502344889684 driven_lanedir_max 0.5653270922139106 driven_lanedir_mean 0.31517512442374074 driven_lanedir_min 0.1803846029682585 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.8036065767750349, "driven_lanedir": 0.5653270922139106, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.690087737774333, "deviation-center-line": 0.33762577592741005}, "ep001": {"driven_any": 0.1821502344889684, "driven_lanedir": 0.1803846029682585, "in-drivable-lane": 0, "deviation-heading": 0.2159015006763972, "deviation-center-line": 0.18923067230864535}, "ep002": {"driven_any": 0.42859120366913617, "driven_lanedir": 0.4174283755075572, "in-drivable-lane": 0, "deviation-heading": 0.5984170646104535, "deviation-center-line": 0.36384835245355496}, "ep003": {"driven_any": 0.21069718978928384, "driven_lanedir": 0.18985347506373615, "in-drivable-lane": 0, "deviation-heading": 0.7820498043055376, "deviation-center-line": 0.1456580635185044}, "ep004": {"driven_any": 0.2392848088637356, "driven_lanedir": 0.22288207636524104, "in-drivable-lane": 0, "deviation-heading": 0.5867133100210893, "deviation-center-line": 0.2057823206070708}}
No reset possible 9690
1262
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:31
No reset possible 9689
1262
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:12
survival_time_median 2.3000000000000016
other stats episodes details {"ep000": {"nsteps": 227, "reward": -4.550470444116852, "good_angle": 0.7578883000350387, "survival_time": 7.5666666666666496, "traveled_tiles": 2, "valid_direction": 1.0666666666666635}, "ep001": {"nsteps": 53, "reward": -19.43188279192403, "good_angle": 0.03328980550980436, "survival_time": 1.7666666666666688, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 122, "reward": -8.659334487113796, "good_angle": 0.2732274925487942, "survival_time": 4.066666666666662, "traveled_tiles": 2, "valid_direction": 0.36666666666666536}, "ep003": {"nsteps": 61, "reward": -16.777097997118215, "good_angle": 0.4679190169094889, "survival_time": 2.033333333333336, "traveled_tiles": 1, "valid_direction": 0.9666666666666692}, "ep004": {"nsteps": 69, "reward": -14.972821211469348, "good_angle": 0.37034570333692146, "survival_time": 2.3000000000000016, "traveled_tiles": 1, "valid_direction": 0.5999999999999996}}
good_angle_max 0.7578883000350387 good_angle_mean 0.3805340636680095 good_angle_median 0.37034570333692146 good_angle_min 0.03328980550980436 reward_max -4.550470444116852 reward_mean -12.87832138634845 reward_median -14.972821211469348 reward_min -19.43188279192403 survival_time_max 7.5666666666666496 survival_time_mean 3.5466666666666633 survival_time_min 1.7666666666666688 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.0666666666666635 valid_direction_mean 0.5999999999999995 valid_direction_median 0.5999999999999996 valid_direction_min 0
No reset possible 9687
1262
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:16
No reset possible 9685
1259
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:09:19 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9684
1260
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:03 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 97, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9681
1257
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:19
survival_time_median 2.4000000000000012
other stats episodes details {"ep000": {"nsteps": 218, "reward": -4.733045888046837, "good_angle": 0.8730727947660979, "survival_time": 7.266666666666651, "traveled_tiles": 2, "valid_direction": 0.8333333333333304}, "ep001": {"nsteps": 96, "reward": -11.001215070486069, "good_angle": 0.02613195916675251, "survival_time": 3.1999999999999984, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 44, "reward": -23.19450922649015, "good_angle": 0.4947327293520531, "survival_time": 1.466666666666668, "traveled_tiles": 2, "valid_direction": 0.666666666666668}, "ep003": {"nsteps": 62, "reward": -16.48422414929636, "good_angle": 0.6180961657508399, "survival_time": 2.066666666666669, "traveled_tiles": 1, "valid_direction": 0.7000000000000011}, "ep004": {"nsteps": 72, "reward": -14.341735198679896, "good_angle": 0.33473161191708783, "survival_time": 2.4000000000000012, "traveled_tiles": 1, "valid_direction": 0.6666666666666659}}
good_angle_max 0.8730727947660979 good_angle_mean 0.46935305219056617 good_angle_median 0.4947327293520531 good_angle_min 0.02613195916675251 reward_max -4.733045888046837 reward_mean -13.95094590659986 reward_median -14.341735198679896 reward_min -23.19450922649015 survival_time_max 7.266666666666651 survival_time_mean 3.2799999999999976 survival_time_min 1.466666666666668 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 0.8333333333333304 valid_direction_mean 0.573333333333333 valid_direction_median 0.666666666666668 valid_direction_min 0
No reset possible 9678
1256
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:40
driven_lanedir_median 0.329163691620562 deviation-center-line_median 0.35827461205411315 in-drivable-lane_median 0
other stats deviation-center-line_max 0.3937814135164912 deviation-center-line_mean 0.291213522879301 deviation-center-line_min 0.13450398548614903 deviation-heading_max 1.7665900011038165 deviation-heading_mean 0.8255861889551109 deviation-heading_median 0.6007002310416163 deviation-heading_min 0.4156529767389977 driven_any_max 0.7928861093818448 driven_any_mean 0.3971517271419362 driven_any_median 0.33572577130684705 driven_any_min 0.16428089881466457 driven_lanedir_max 0.5553370718712161 driven_lanedir_mean 0.3382078713840022 driven_lanedir_min 0.14698925329475143 in-drivable-lane_max 1.8666666666666691 in-drivable-lane_mean 0.37333333333333385 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7928861093818448, "driven_lanedir": 0.5553370718712161, "in-drivable-lane": 1.8666666666666691, "deviation-heading": 1.7665900011038165, "deviation-center-line": 0.35827461205411315}, "ep001": {"driven_any": 0.33572577130684705, "driven_lanedir": 0.329163691620562, "in-drivable-lane": 0, "deviation-heading": 0.4156529767389977, "deviation-center-line": 0.36163658004759613}, "ep002": {"driven_any": 0.16428089881466457, "driven_lanedir": 0.14698925329475143, "in-drivable-lane": 0, "deviation-heading": 0.5629682504671758, "deviation-center-line": 0.13450398548614903}, "ep003": {"driven_any": 0.44286019832024265, "driven_lanedir": 0.4279002685926354, "in-drivable-lane": 0, "deviation-heading": 0.7820194854239486, "deviation-center-line": 0.3937814135164912}, "ep004": {"driven_any": 0.250005657886082, "driven_lanedir": 0.23164907154084613, "in-drivable-lane": 0, "deviation-heading": 0.6007002310416163, "deviation-center-line": 0.2078710232921555}}
No reset possible 9677
1256
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:14
No reset possible 9674
1256
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:00
survival_time_median 3.1999999999999984
other stats episodes details {"ep000": {"nsteps": 224, "reward": -4.6342155636666575, "good_angle": 0.7731504622031996, "survival_time": 7.46666666666665, "traveled_tiles": 2, "valid_direction": 1.4999999999999951}, "ep001": {"nsteps": 96, "reward": -11.020615682937205, "good_angle": 0.1791175555044248, "survival_time": 3.1999999999999984, "traveled_tiles": 1, "valid_direction": 0.2666666666666657}, "ep002": {"nsteps": 48, "reward": -21.29593548985819, "good_angle": 0.40772748053506686, "survival_time": 1.6000000000000016, "traveled_tiles": 2, "valid_direction": 0.5666666666666684}, "ep003": {"nsteps": 126, "reward": -8.428223190622198, "good_angle": 0.3376643428838403, "survival_time": 4.199999999999995, "traveled_tiles": 2, "valid_direction": 0.5999999999999996}, "ep004": {"nsteps": 72, "reward": -14.353482292758097, "good_angle": 0.4188168584270678, "survival_time": 2.4000000000000012, "traveled_tiles": 1, "valid_direction": 0.5333333333333321}}
good_angle_max 0.7731504622031996 good_angle_mean 0.4232953399107198 good_angle_median 0.40772748053506686 good_angle_min 0.1791175555044248 reward_max -4.6342155636666575 reward_mean -11.94649444396847 reward_median -11.020615682937205 reward_min -21.29593548985819 survival_time_max 7.46666666666665 survival_time_mean 3.773333333333329 survival_time_min 1.6000000000000016 traveled_tiles_max 2 traveled_tiles_mean 1.6 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.4999999999999951 valid_direction_mean 0.6933333333333322 valid_direction_median 0.5666666666666684 valid_direction_min 0.2666666666666657
No reset possible 9672
1255
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:32 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9671
1254
Gunshi Gupta 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:50 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9670
1252
Anton Mashikhin 🇷🇺 SAIC MOSCOW MML aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:08
driven_lanedir_median 1.2237568345062462 deviation-center-line_median 0.5255505066574591 in-drivable-lane_median 0.7666666666666639
other stats deviation-center-line_max 1.1284977526080535 deviation-center-line_mean 0.623917788943267 deviation-center-line_min 0.4096956396980083 deviation-heading_max 3.57311994111298 deviation-heading_mean 2.254088297294393 deviation-heading_median 2.585362095649338 deviation-heading_min 0.9691292266181551 driven_any_max 4.136167333607006 driven_any_mean 1.9020135611935427 driven_any_median 1.4810907703652432 driven_any_min 1.0948001182085714 driven_lanedir_max 3.61134816285692 driven_lanedir_mean 1.5652151416562976 driven_lanedir_min 0.7006939707855828 in-drivable-lane_max 1.6000000000000163 in-drivable-lane_mean 0.8133333333333358 in-drivable-lane_min 0.33333333333333215 per-episodes details {"ep000": {"driven_any": 1.2012008767727027, "driven_lanedir": 1.0196871948302384, "in-drivable-lane": 0.9333333333333348, "deviation-heading": 0.9691292266181551, "deviation-center-line": 0.4096956396980083}, "ep001": {"driven_any": 1.0948001182085714, "driven_lanedir": 0.7006939707855828, "in-drivable-lane": 0.4333333333333318, "deviation-heading": 2.585362095649338, "deviation-center-line": 0.5255505066574591}, "ep002": {"driven_any": 1.4810907703652432, "driven_lanedir": 1.2705895453025002, "in-drivable-lane": 0.33333333333333215, "deviation-heading": 2.688349100642005, "deviation-center-line": 0.6380697975902526}, "ep003": {"driven_any": 4.136167333607006, "driven_lanedir": 3.61134816285692, "in-drivable-lane": 1.6000000000000163, "deviation-heading": 3.57311994111298, "deviation-center-line": 1.1284977526080535}, "ep004": {"driven_any": 1.59680870701419, "driven_lanedir": 1.2237568345062462, "in-drivable-lane": 0.7666666666666639, "deviation-heading": 1.454481122449486, "deviation-center-line": 0.4177752481625615}}
No reset possible 9668
1252
Anton Mashikhin 🇷🇺 SAIC MOSCOW MML aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:00
survival_time_median 6.733333333333319
other stats episodes details {"ep000": {"nsteps": 154, "reward": -6.756369775678213, "good_angle": 0.26924966321218124, "survival_time": 5.133333333333325, "traveled_tiles": 3, "valid_direction": 0.7666666666666675}, "ep001": {"nsteps": 185, "reward": -6.319986207199258, "good_angle": 11.358373278490214, "survival_time": 6.1666666666666545, "traveled_tiles": 2, "valid_direction": 2.8333333333333233}, "ep002": {"nsteps": 223, "reward": -4.676453666757341, "good_angle": 0.9692714802305116, "survival_time": 7.433333333333317, "traveled_tiles": 4, "valid_direction": 2.1333333333333258}, "ep003": {"nsteps": 500, "reward": -0.17902887826110236, "good_angle": 0.8052548061128889, "survival_time": 16.666666666666654, "traveled_tiles": 7, "valid_direction": 2.9333333333333425}, "ep004": {"nsteps": 202, "reward": -5.6194859258461705, "good_angle": 11.385585117165173, "survival_time": 6.733333333333319, "traveled_tiles": 3, "valid_direction": 2.333333333333325}}
good_angle_max 11.385585117165173 good_angle_mean 4.9575468690421935 good_angle_median 0.9692714802305116 good_angle_min 0.26924966321218124 reward_max -0.17902887826110236 reward_mean -4.710264890748418 reward_median -5.6194859258461705 reward_min -6.756369775678213 survival_time_max 16.666666666666654 survival_time_mean 8.426666666666653 survival_time_min 5.133333333333325 traveled_tiles_max 7 traveled_tiles_mean 3.8 traveled_tiles_median 3 traveled_tiles_min 2 valid_direction_max 2.9333333333333425 valid_direction_mean 2.1999999999999966 valid_direction_median 2.333333333333325 valid_direction_min 0.7666666666666675
No reset possible 9667
1250
Anton Mashikhin 🇷🇺 SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:19 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 119, in run
solve(params, cis)
File "solution.py", line 67, in solve
env = make_env(config, debug=debug)
File "solution.py", line 37, in make_env
env = FrameStackWrappper(env=env, k=3, config=config)
File "/workspace/wrappers.py", line 88, in __init__
self.seg_model = linknet.LinkNet()
File "/workspace/segmentation/models/linknet.py", line 60, in __init__
super().__init__()
TypeError: super() takes at least 1 argument (0 given)
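This TypeError is a Python 2/3 incompatibility: zero-argument super() is Python-3-only syntax, and the tracebacks in this log show the evaluation stack running under /usr/local/lib/python2.7. A minimal reproduction with the portable two-argument form (the Base class and attribute here are illustrative, not the actual LinkNet code):

```python
class Base(object):
    def __init__(self):
        self.initialized = True

class LinkNet(Base):
    def __init__(self):
        # `super().__init__()` raises "TypeError: super() takes at least
        # 1 argument (0 given)" under Python 2; the explicit two-argument
        # form below works under both Python 2 and Python 3.
        super(LinkNet, self).__init__()

model = LinkNet()
print(model.initialized)
```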
No reset possible 9666
1251
Anton Mashikhin 🇷🇺 SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:13 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.amazonaws.com/v3/frankfurt/by-value/sha256/75b230e67e0123c2aa0e99857e953be23dd0120eeca2fa4c16911c6613b77285"
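The EndpointConnectionError above is a transient network failure: the evaluator could not reach the S3 endpoint while fetching the artefact. A common mitigation is to wrap the fetch in a retry loop with exponential backoff; a minimal sketch, where `flaky_fetch` is a hypothetical stand-in for the S3 download, not the actual evaluator code:

```python
import time

def retry(fn, attempts=4, base_delay=0.5, retry_on=(IOError,)):
    """Call fn(), retrying on transient errors with exponential backoff.

    attempts=4 with base_delay=0.5 sleeps 0.5s, 1s, 2s between tries.
    """
    for i in range(attempts):
        try:
            return fn()
        except retry_on:
            if i == attempts - 1:
                raise  # out of attempts: re-raise the last error
            time.sleep(base_delay * (2 ** i))

# Demo with a flaky callable that fails twice, then succeeds:
calls = {"n": 0}

def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("Could not connect to the endpoint URL")
    return "artifact-bytes"

result = retry(flaky_fetch, base_delay=0.01)
```

The third attempt succeeds, so `result` holds the fetched value and two backoff sleeps were taken.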
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
No reset possible 9662
1248
Yun Chen 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:07
driven_lanedir_median -0.09634093160376268
deviation-center-line_median 0.248635564017718
in-drivable-lane_median 8.099999999999993
other stats:
deviation-center-line: max 0.4518704036228205, mean 0.2556494300836503, min 0.1162971123732259
deviation-heading: max 6.789023413990527, mean 3.7061564353510863, median 2.50474661382244, min 1.03058851155584
driven_any: max 1.1756196406474568, mean 0.7656277009866187, median 1.0455417133799771, min 0.1608946405898283
driven_lanedir: max -0.03651771060084963, mean -0.19827609256797435, min -0.40302944350976366
in-drivable-lane: max 14.699999999999974, mean 7.1533333333333236, min 1.466666666666666
per-episode details:
ep000: driven_any 1.0455417133799771, driven_lanedir -0.07352062251664337, in-drivable-lane 14.699999999999974, deviation-heading 1.6184468947045891, deviation-center-line 0.2760737542983069
ep001: driven_any 0.1608946405898283, driven_lanedir -0.03651771060084963, in-drivable-lane 1.466666666666666, deviation-heading 1.03058851155584, deviation-center-line 0.1162971123732259
ep002: driven_any 1.0563205493851964, driven_lanedir -0.40302944350976366, in-drivable-lane 8.13333333333332, deviation-heading 6.587976742682035, deviation-center-line 0.4518704036228205
ep003: driven_any 1.1756196406474568, driven_lanedir -0.3819717546088523, in-drivable-lane 8.099999999999993, deviation-heading 6.789023413990527, deviation-center-line 0.1853703161061803
ep004: driven_any 0.3897619609306344, driven_lanedir -0.09634093160376268, in-drivable-lane 3.366666666666659, deviation-heading 2.50474661382244, deviation-center-line 0.248635564017718
No reset possible 9661
1248
Yun Chen 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:08
No reset possible 9660
1248
Yun Chen 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:16
survival_time_median 16.666666666666654
other stats:
episodes details:
ep000: nsteps 500, reward -1.221439646458253, good_angle 53.427938021183294, survival_time 16.666666666666654, traveled_tiles 1, valid_direction 14.499999999999988
ep001: nsteps 83, reward -13.900101248040256, good_angle 2.3379234031911302, survival_time 2.7666666666666666, traveled_tiles 1, valid_direction 2.1666666666666665
ep002: nsteps 500, reward -1.1177792339541484, good_angle 13.504118279440878, survival_time 16.666666666666654, traveled_tiles 1, valid_direction 12.833333333333323
ep003: nsteps 500, reward -1.0688708793438273, good_angle 13.909208766492162, survival_time 16.666666666666654, traveled_tiles 1, valid_direction 12.83333333333332
ep004: nsteps 195, reward -6.8847574649712975, good_angle 5.411281289004235, survival_time 6.499999999999987, traveled_tiles 1, valid_direction 5.066666666666656
good_angle: max 53.427938021183294, mean 17.718093951862336, median 13.504118279440878, min 2.3379234031911302
reward: max -1.0688708793438273, mean -4.838589694553557, median -1.221439646458253, min -13.900101248040256
survival_time: max 16.666666666666654, mean 11.85333333333332, min 2.7666666666666666
traveled_tiles: max 1, mean 1, median 1, min 1
valid_direction: max 14.499999999999988, mean 9.479999999999992, median 12.83333333333332, min 2.1666666666666665
No reset possible 9659
1248
Yun Chen 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:05:39
No reset possible 9657
1247
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:43 Timeout:
Waited 620.966861963 for container to finish. Giving up.
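The evaluator enforces a hard cap on how long it waits for the evaluation container before giving up (here about 620 s). The same bounded-wait pattern can be sketched with a child process standing in for the container; `subprocess.run`'s `timeout` kills the child and raises `TimeoutExpired` once the deadline passes. This is an illustrative sketch, not the evaluator's actual Docker-wait code:

```python
import subprocess
import sys

# A deliberately slow child stands in for the evaluation container.
slow_child = [sys.executable, "-c", "import time; time.sleep(5)"]

try:
    # Give up after 0.5 s instead of the evaluator's ~600 s budget.
    subprocess.run(slow_child, timeout=0.5)
    outcome = "finished"
except subprocess.TimeoutExpired:
    outcome = "gave up"  # analogous to "Giving up." in the log
```

After the timeout fires, `outcome` is "gave up" and the child has been terminated.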
No reset possible 9655
1245
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 1:10:09 Timeout:
Waited 606.311518192 for container to finish. Giving up.
No reset possible 9654
1243
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:00:45 Error while running [...]
Pulling solution ... error
stderr | ERROR: for solution Get https://registry-1.docker.io/v2/heyt0ny/aido1_lf1_r3-v3-submission/manifests/2018_11_15_22_01_16: Get https://auth.docker.io/token?scope=repository%3Aheyt0ny%2Faido1_lf1_r3-v3-submission%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/heyt0ny/aido1_lf1_r3-v3-submission/manifests/2018_11_15_22_01_16: Get https://auth.docker.io/token?scope=repository%3Aheyt0ny%2Faido1_lf1_r3-v3-submission%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
No reset possible 9652
1240
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:59
driven_lanedir_median 2.0738231022989133
deviation-center-line_median 0.7116612408743404
in-drivable-lane_median 1.399999999999995
other stats:
deviation-center-line: max 0.925901111475584, mean 0.5945266797558918, min 0.2435136923747967
deviation-heading: max 3.6382145944438986, mean 2.5110810486834083, median 2.906035526925262, min 1.2159711938569278
driven_any: max 4.263875075671843, mean 2.596787927777529, median 2.413689698065759, min 1.0845154267163306
driven_lanedir: max 3.831121332492839, mean 2.0357980883279714, min 0.7032177206667558
in-drivable-lane: max 5.033333333333352, mean 2.073333333333334, min 1.1000000000000008
per-episode details:
ep000: driven_any 2.413689698065759, driven_lanedir 2.0738231022989133, in-drivable-lane 1.1000000000000008, deviation-heading 2.906035526925262, deviation-center-line 0.7793493935560738
ep001: driven_any 1.0845154267163306, driven_lanedir 0.755643838236794, in-drivable-lane 1.399999999999995, deviation-heading 1.2159711938569278, deviation-center-line 0.2435136923747967
ep002: driven_any 1.3782746870934162, driven_lanedir 0.7032177206667558, in-drivable-lane 1.6333333333333275, deviation-heading 1.8733527073683736, deviation-center-line 0.31220796049866406
ep003: driven_any 4.263875075671843, driven_lanedir 3.831121332492839, in-drivable-lane 1.1999999999999955, deviation-heading 3.6382145944438986, deviation-center-line 0.925901111475584
ep004: driven_any 3.843584751340294, driven_lanedir 2.8151844479445547, in-drivable-lane 5.033333333333352, deviation-heading 2.9218312208225816, deviation-center-line 0.7116612408743404
No reset possible 9649
1240
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:39
No reset possible 9648
1240
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:55
survival_time_median 9.833333333333307
other stats:
episodes details:
ep000: nsteps 295, reward -3.495740544256484, good_angle 0.8437524014731178, survival_time 9.833333333333307, traveled_tiles 5, valid_direction 2.066666666666663
ep001: nsteps 141, reward -8.276325672230822, good_angle 15.996851752106846, survival_time 4.699999999999993, traveled_tiles 3, valid_direction 3.166666666666658
ep002: nsteps 157, reward -7.045858895764642, good_angle 2.3884615379621152, survival_time 5.2333333333333245, traveled_tiles 5, valid_direction 3.866666666666659
ep003: nsteps 455, reward -2.249894424716003, good_angle 1.0133499739127358, survival_time 15.166666666666623, traveled_tiles 8, valid_direction 1.566666666666661
ep004: nsteps 500, reward -0.7398527064763475, good_angle 35.164651620368794, survival_time 16.666666666666654, traveled_tiles 8, valid_direction 7.233333333333343
good_angle: max 35.164651620368794, mean 11.081413457164722, median 2.3884615379621152, min 0.8437524014731178
reward: max -0.7398527064763475, mean -4.36153444868886, median -3.495740544256484, min -8.276325672230822
survival_time: max 16.666666666666654, mean 10.319999999999984, min 4.699999999999993
traveled_tiles: max 8, mean 5.8, median 5, min 3
valid_direction: max 7.233333333333343, mean 3.579999999999997, median 3.166666666666658, min 1.566666666666661
No reset possible 9646
1240
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:17:50
No reset possible 9644
1241
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:06 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 95, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
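This failure means solution.py exhausted its connection attempts to the simulation server (host `evaluator`) before the server came up. The usual shape of such logic is a connect loop with a bounded number of retries; a hedged sketch, where the helper name and parameters are illustrative rather than the actual solution.py code:

```python
import socket
import time

def connect_with_retries(host, port, attempts=5, delay=0.2):
    """Open a TCP connection, retrying a few times before giving up."""
    for i in range(attempts):
        try:
            return socket.create_connection((host, port), timeout=5.0)
        except OSError:
            if i == attempts - 1:
                raise RuntimeError(
                    "Giving up to connect to the gym duckietown server "
                    "at host: %s" % host)
            time.sleep(delay)

# Usage against a local listener standing in for the evaluator host:
server = socket.socket()
server.bind(("127.0.0.1", 0))  # let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

conn = connect_with_retries("127.0.0.1", port)
conn.close()
server.close()
```

If the listener is absent for all attempts, the RuntimeError carries the same "Giving up to connect" message seen in the log.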
No reset possible 9642
1239
Bhairav Mehta ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:14:30 Timeout:
Waited 611.285361052 for container to finish. Giving up.
No reset possible 9640
1238
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:14:53 Timeout:
Waited 607.260257959 for container to finish. Giving up.
No reset possible 9638
1237
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:54 Timeout:
Waited 602.67279911 for container to finish. Giving up.
No reset possible 9637
1236
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:00:44 Error while running [...]
Pulling solution ... error
stderr | ERROR: for solution Get https://registry-1.docker.io/v2/gunshi/aido1_lf1_r3-v3-submission/manifests/2018_11_15_10_45_41: Get https://auth.docker.io/token?scope=repository%3Agunshi%2Faido1_lf1_r3-v3-submission%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/gunshi/aido1_lf1_r3-v3-submission/manifests/2018_11_15_10_45_41: Get https://auth.docker.io/token?scope=repository%3Agunshi%2Faido1_lf1_r3-v3-submission%3Apull&service=registry.docker.io: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
No reset possible 9634
1234
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:31 Timeout:
Waited 610.635639906 for container to finish. Giving up.
No reset possible 9632
1232
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:08 Timeout:
Waited 601.875596046 for container to finish. Giving up.
No reset possible 9631
1230
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:44
driven_lanedir_median 5.233205580663916
deviation-center-line_median 0.9622528676347796
in-drivable-lane_median 1.86666666666666
other stats:
deviation-center-line: max 1.4887727823522934, mean 1.0165775523241307, min 0.6787543098843081
deviation-heading: max 2.9759401772443645, mean 2.5017191355213586, median 2.7190723206994707, min 1.5376994510379216
driven_any: max 6.340440794264879, mean 5.482326668108364, median 6.323719113434832, min 2.1357289801566743
driven_lanedir: max 6.012629298880372, mean 4.674452797642727, min 1.6482613967380175
in-drivable-lane: max 3.0999999999999908, mean 1.7466666666666608, min 0.33333333333333215
per-episode details:
ep000: driven_any 6.278494215108127, driven_lanedir 4.988778536550464, in-drivable-lane 3.0999999999999908, deviation-heading 2.9759401772443645, deviation-center-line 0.9622528676347796
ep001: driven_any 2.1357289801566743, driven_lanedir 1.6482613967380175, in-drivable-lane 0.93333333333333, deviation-heading 1.5376994510379216, deviation-center-line 0.6787543098843081
ep002: driven_any 6.340440794264879, driven_lanedir 5.489389175380863, in-drivable-lane 1.86666666666666, deviation-heading 2.4944146864875445, deviation-center-line 1.016623969752082
ep003: driven_any 6.333250237577308, driven_lanedir 5.233205580663916, in-drivable-lane 2.499999999999991, deviation-heading 2.7190723206994707, deviation-center-line 0.9364838319971908
ep004: driven_any 6.323719113434832, driven_lanedir 6.012629298880372, in-drivable-lane 0.33333333333333215, deviation-heading 2.781469042137491, deviation-center-line 1.4887727823522934
No reset possible 9627
1231
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:55 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
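Here the evaluator polled for the solution's output file and gave up after 600 s. That wait can be sketched as a simple poll-until-deadline loop; this is an illustrative sketch, not the actual `wait_for_solution` implementation, and the timeout values below are shrunk so the example runs quickly:

```python
import os
import tempfile
import time

def wait_for_file(path, timeout=600.0, poll=0.05):
    """Poll until `path` exists or `timeout` seconds elapse."""
    deadline = time.time() + timeout
    while True:
        if os.path.exists(path):
            return True
        if time.time() >= deadline:
            return False  # timed out: caller raises InvalidSubmission
        time.sleep(poll)

# A file that already exists is found immediately; a missing one
# times out (tiny timeout here instead of the evaluator's 600 s).
existing = tempfile.NamedTemporaryFile(delete=False)
existing.close()
found = wait_for_file(existing.name, timeout=1.0)
missing = wait_for_file("/no/such/output-solution.yaml", timeout=0.2)
os.unlink(existing.name)
```

`found` is True and `missing` is False; in the evaluator, the False branch is what surfaces as the InvalidSubmission timeout above.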
No reset possible 9625
1229
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:55 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9624
1228
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:51 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 95, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9622
1227
Gunshi Gupta 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:45 InvalidSubmission:
T [...] InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9621
1226
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:27 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 110, in run
solve(params, cis)
File "solution.py", line 58, in solve
env = make_env(config, debug=debug)
File "solution.py", line 43, in make_env
env = ActionWrapper(env=env, config=config, submit=True)
File "/workspace/wrappers.py", line 12, in __init__
super().__init__(env)
TypeError: super() takes at least 1 argument (0 given)
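The zero-argument form `super().__init__(env)` is Python 3 syntax; the evaluator runs Python 2.7 (note the /usr/local/lib/python2.7 paths in the other tracebacks), where `super()` requires explicit class and instance arguments. A minimal sketch of the portable form, using stand-in classes (`Env` here is hypothetical, not the real gym environment):

```python
class Env(object):
    """Stand-in for the gym environment being wrapped."""
    def __init__(self, name):
        self.name = name

class ActionWrapper(Env):
    def __init__(self, env, config=None, submit=False):
        # Python 3 only: super().__init__(...)  -> TypeError on 2.7.
        # Portable form, valid on both Python 2 and 3:
        super(ActionWrapper, self).__init__(env.name)
        self.config = config
        self.submit = submit

wrapped = ActionWrapper(Env("Duckietown"), config={}, submit=True)
```

With the explicit `super(ActionWrapper, self)` call, construction succeeds under either interpreter.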
No reset possible 9620
1225
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:26 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9612
1224
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:18
No reset possible 9606
1223
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:24 Timeout:
Waited 606.33307004 for container to finish. Giving up.
No reset possible 9595
1222
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:38 Timeout:
Waited 626.038208961 for container to finish. Giving up.
No reset possible 9577
1220
Mandana Samiei 🇨🇦Improved ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:10:57
No reset possible 9557
1218
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:10:47
No reset possible 9530
1216
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:09:23
survival_time_median 16.666666666666654
other stats:
episodes details:
ep000: nsteps 500, reward -0.47290116201527416, good_angle 0.620527758689177, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 1.4666666666666688
ep001: nsteps 500, reward -0.3814245066232979, good_angle 20.941140955183226, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 3.2333333333333236
ep002: nsteps 500, reward -0.4611001099124551, good_angle 0.6792900770956242, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 1.699999999999997
ep003: nsteps 500, reward -0.49903926293135736, good_angle 0.5362987411971321, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 1.4333333333333318
ep004: nsteps 500, reward -0.264232877929011, good_angle 6.46899920987801, survival_time 16.666666666666654, traveled_tiles 10, valid_direction 1.433333333333334
good_angle: max 20.941140955183226, mean 5.849251348408634, median 0.6792900770956242, min 0.5362987411971321
reward: max -0.264232877929011, mean -0.4157395838822791, median -0.4611001099124551, min -0.49903926293135736
survival_time: max 16.666666666666654, mean 16.666666666666654, min 16.666666666666654
traveled_tiles: max 18, mean 16.4, median 18, min 10
valid_direction: max 3.2333333333333236, mean 1.853333333333331, median 1.4666666666666688, min 1.4333333333333318
No reset possible 9518
1215
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:10:19
survival_time_median 16.666666666666654
other stats:
episodes details:
ep000: nsteps 500, reward -0.5497224612673745, good_angle 0.5439216145491801, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 1.4666666666666714
ep001: nsteps 500, reward -0.3210465839970857, good_angle 21.22155437720925, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 3.666666666666668
ep002: nsteps 500, reward -0.6283954581220169, good_angle 0.5428892749021961, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 1.266666666666663
ep003: nsteps 500, reward -0.3240200620025862, good_angle 0.6178064113300014, survival_time 16.666666666666654, traveled_tiles 18, valid_direction 1.8333333333333268
ep004: nsteps 500, reward -0.2921708645388717, good_angle 13.532035740296866, survival_time 16.666666666666654, traveled_tiles 15, valid_direction 2.566666666666661
good_angle: max 21.22155437720925, mean 7.2916414836575, median 0.6178064113300014, min 0.5428892749021961
reward: max -0.2921708645388717, mean -0.42307108598558696, median -0.3240200620025862, min -0.6283954581220169
survival_time: max 16.666666666666654, mean 16.666666666666654, min 16.666666666666654
traveled_tiles: max 18, mean 17.4, median 18, min 15
valid_direction: max 3.666666666666668, mean 2.1599999999999984, median 1.8333333333333268, min 1.266666666666663
No reset possible 9501
1214
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:27 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9485
1212
Iban Harlouchet 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:08
No reset possible 9454
1208
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:59 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9444
1203
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:51
survival_time_median 1.7666666666666688
other stats:
episodes details:
ep000: nsteps 94, reward -10.880140648332445, good_angle 0.2389941919765413, survival_time 3.133333333333332, traveled_tiles 3, valid_direction 0.2333333333333325
ep001: nsteps 13, reward -77.54556220540634, good_angle 0.12722664664767305, survival_time 0.4333333333333333, traveled_tiles 1, valid_direction 0.2666666666666666
ep002: nsteps 22, reward -45.918432533741, good_angle 0.04667275508301345, survival_time 0.7333333333333333, traveled_tiles 2, valid_direction 0.1333333333333333
ep003: nsteps 500, reward -0.7811867533139885, good_angle 1.41219851160597, survival_time 16.666666666666654, traveled_tiles 12, valid_direction 3.3666666666666556
ep004: nsteps 53, reward -19.349999496959292, good_angle 0.09429089177989708, survival_time 1.7666666666666688, traveled_tiles 2, valid_direction 0.2333333333333341
good_angle: max 1.41219851160597, mean 0.38387659941861896, median 0.12722664664767305, min 0.04667275508301345
reward: max -0.7811867533139885, mean -30.895064327550617, median -19.349999496959292, min -77.54556220540634
survival_time: max 16.666666666666654, mean 4.546666666666664, min 0.4333333333333333
traveled_tiles: max 12, mean 4, median 2, min 1
valid_direction: max 3.3666666666666556, mean 0.8466666666666643, median 0.2333333333333341, min 0.1333333333333333
No reset possible 9435
1203
Benjamin Ramtoula 🇨🇦My ROS solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:28
No reset possible 9409
1200
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:09
No reset possible 9394
1197
Iban Harlouchet 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:10
survival_time_median 2.6000000000000005
other stats episodes details {"ep000": {"nsteps": 224, "reward": -4.623365467477145, "good_angle": 0.7319553509950669, "survival_time": 7.46666666666665, "traveled_tiles": 2, "valid_direction": 1.2999999999999965}, "ep001": {"nsteps": 22, "reward": -46.07200496440584, "good_angle": 0.32625887867237263, "survival_time": 0.7333333333333333, "traveled_tiles": 1, "valid_direction": 0.49999999999999994}, "ep002": {"nsteps": 52, "reward": -19.717869508724945, "good_angle": 0.32697238196849376, "survival_time": 1.7333333333333354, "traveled_tiles": 2, "valid_direction": 0.7333333333333348}, "ep003": {"nsteps": 131, "reward": -8.08410547377954, "good_angle": 0.2988844426919217, "survival_time": 4.366666666666661, "traveled_tiles": 2, "valid_direction": 0.5666666666666653}, "ep004": {"nsteps": 78, "reward": -13.277005767593018, "good_angle": 0.3977287577520742, "survival_time": 2.6000000000000005, "traveled_tiles": 1, "valid_direction": 0.6333333333333311}}
good_angle_max 0.7319553509950669 good_angle_mean 0.41635996241598583 good_angle_median 0.32697238196849376 good_angle_min 0.2988844426919217 reward_max -4.623365467477145 reward_mean -18.354870236396096 reward_median -13.277005767593018 reward_min -46.07200496440584 survival_time_max 7.46666666666665 survival_time_mean 3.3799999999999963 survival_time_min 0.7333333333333333 traveled_tiles_max 2 traveled_tiles_mean 1.6 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 1.2999999999999965 valid_direction_mean 0.7466666666666655 valid_direction_median 0.6333333333333311 valid_direction_min 0.49999999999999994
No reset possible 9377
1195
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:56
No reset possible 9364
1193
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:51
No reset possible 9304
1180
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step4-viz error yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:41:39 Timeout:
Waited 1798.72978711 for container to finish. Giving up.
No reset possible 9276
1178
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:28
driven_lanedir_median 0.28060777012340976 deviation-center-line_median 0.22721196650330475 in-drivable-lane_median 0
other stats deviation-center-line_max 0.397387598231267 deviation-center-line_mean 0.25293935350159086 deviation-center-line_min 0.13210976538295569 deviation-heading_max 1.7444564974638874 deviation-heading_mean 0.7954867340184384 deviation-heading_median 0.6045070411462525 deviation-heading_min 0.2618915420632834 driven_any_max 0.5678510646493321 driven_any_mean 0.3450032324283415 driven_any_median 0.2821608403404374 driven_any_min 0.19998585583219508 driven_lanedir_max 0.42669405448571496 driven_lanedir_mean 0.2838617218300153 driven_lanedir_min 0.1775901034723475 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.5678510646493321, "driven_lanedir": 0.3130090110394963, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.7444564974638874, "deviation-center-line": 0.22721196650330475}, "ep001": {"driven_any": 0.2821608403404374, "driven_lanedir": 0.28060777012340976, "in-drivable-lane": 0, "deviation-heading": 0.2618915420632834, "deviation-center-line": 0.30221720153425513}, "ep002": {"driven_any": 0.43572704749065816, "driven_lanedir": 0.42669405448571496, "in-drivable-lane": 0, "deviation-heading": 0.5989330467566464, "deviation-center-line": 0.397387598231267}, "ep003": {"driven_any": 0.19998585583219508, "driven_lanedir": 0.1775901034723475, "in-drivable-lane": 0, "deviation-heading": 0.7676455426621223, "deviation-center-line": 0.13210976538295569}, "ep004": {"driven_any": 0.2392913538290846, "driven_lanedir": 0.22140767002910788, "in-drivable-lane": 0, "deviation-heading": 0.6045070411462525, "deviation-center-line": 0.20577023585617177}}
No reset possible 9260
1177
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:23
driven_lanedir_median 0.3140615527405992 deviation-center-line_median 0.2120344929627202 in-drivable-lane_median 0
other stats deviation-center-line_max 0.3825504647991373 deviation-center-line_mean 0.2555782149297515 deviation-center-line_min 0.13299211806624614 deviation-heading_max 1.8124373524520645 deviation-heading_mean 0.8371617918782187 deviation-heading_median 0.6068979230123304 deviation-heading_min 0.43067669951320775 driven_any_max 0.5857327986545419 driven_any_mean 0.3564370675813312 driven_any_median 0.3214398113505129 driven_any_min 0.20355954308580224 driven_lanedir_max 0.41144029184237896 driven_lanedir_mean 0.29196390143672385 driven_lanedir_min 0.1819474087964288 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.5857327986545419, "driven_lanedir": 0.32097209404557536, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.8124373524520645, "deviation-center-line": 0.2120344929627202}, "ep001": {"driven_any": 0.3214398113505129, "driven_lanedir": 0.3140615527405992, "in-drivable-lane": 0, "deviation-heading": 0.43067669951320775, "deviation-center-line": 0.34423621628418904}, "ep002": {"driven_any": 0.4214477505013758, "driven_lanedir": 0.41144029184237896, "in-drivable-lane": 0, "deviation-heading": 0.6068979230123304, "deviation-center-line": 0.3825504647991373}, "ep003": {"driven_any": 0.20355954308580224, "driven_lanedir": 0.1819474087964288, "in-drivable-lane": 0, "deviation-heading": 0.739481192030715, "deviation-center-line": 0.13299211806624614}, "ep004": {"driven_any": 0.25000543431442346, "driven_lanedir": 0.23139815975863695, "in-drivable-lane": 0, "deviation-heading": 0.5963157923827757, "deviation-center-line": 0.2060777825364651}}
No reset possible 9213
1175
Iban Harlouchet 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:30:48 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.us-east-2.amazonaws.com/v3/frankfurt/by-value/sha256/d324607b930393f3447b0956126aa69d8b01523d5aa68d65d129ccc48036f3e5"
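The upload failures in records like this one surface as `EndpointConnectionError` after botocore's built-in retry handler exhausts its attempts. As a hedged sketch (not what the Duckietown runner actually did; the retry values are illustrative and the `"standard"` retry mode only exists in modern botocore), a boto3 client can be given a wider retry budget and explicit timeouts via `botocore.config.Config`:

```python
import boto3
from botocore.config import Config

# Illustrative configuration for flaky S3 connectivity. The region matches
# the failing endpoint URL above; all numbers here are assumptions.
cfg = Config(
    region_name="us-east-2",
    retries={"max_attempts": 10, "mode": "standard"},  # default budget is smaller
    connect_timeout=10,   # seconds to establish the TCP connection
    read_timeout=60,      # seconds to wait for a response body
)
s3 = boto3.client("s3", config=cfg)
```

Even with a larger budget, a hard connectivity loss still ends in the same `EndpointConnectionError`; retries only paper over transient failures.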
No reset possible 9208
1172
Iban Harlouchet 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:28
No reset possible 9202
1172
Iban Harlouchet 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:15
survival_time_median 2.4000000000000012
other stats episodes details {"ep000": {"nsteps": 221, "reward": -4.680065435220433, "good_angle": 0.8143340540122375, "survival_time": 7.36666666666665, "traveled_tiles": 2, "valid_direction": 1.03333333333333}, "ep001": {"nsteps": 88, "reward": -11.970444464548068, "good_angle": 0.09934511537248952, "survival_time": 2.9333333333333327, "traveled_tiles": 1, "valid_direction": 0.29999999999999893}, "ep002": {"nsteps": 48, "reward": -21.25994161920001, "good_angle": 0.5257756234168345, "survival_time": 1.6000000000000016, "traveled_tiles": 2, "valid_direction": 0.7000000000000017}, "ep003": {"nsteps": 61, "reward": -16.74233506925282, "good_angle": 0.4446579903733168, "survival_time": 2.033333333333336, "traveled_tiles": 1, "valid_direction": 0.8333333333333355}, "ep004": {"nsteps": 72, "reward": -14.358987438182035, "good_angle": 0.3486535214226296, "survival_time": 2.4000000000000012, "traveled_tiles": 1, "valid_direction": 0.599999999999999}}
good_angle_max 0.8143340540122375 good_angle_mean 0.4465532609195016 good_angle_median 0.4446579903733168 good_angle_min 0.09934511537248952 reward_max -4.680065435220433 reward_mean -13.802354805280675 reward_median -14.358987438182035 reward_min -21.25994161920001 survival_time_max 7.36666666666665 survival_time_mean 3.2666666666666644 survival_time_min 1.6000000000000016 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.03333333333333 valid_direction_mean 0.693333333333333 valid_direction_median 0.7000000000000017 valid_direction_min 0.29999999999999893
No reset possible 9188
1171
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:57 InvalidEvaluator:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
evaluator.score(cie)
File "eval.py", line 97, in score
raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
No reset possible 9176
1170
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:02 InvalidEvaluator:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
evaluator.score(cie)
File "eval.py", line 97, in score
raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
No reset possible 9159
1166
Tien Nguyen ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:14:35 Timeout:
Waited 607.414505959 for container to finish. Giving up.
No reset possible 9153
1164
Mandana Samiei 🇨🇦 My Improved ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:21
driven_lanedir_median 0.20552955460650055 deviation-center-line_median 0.147201300169161 in-drivable-lane_median 0
other stats deviation-center-line_max 0.35753048570357787 deviation-center-line_mean 0.1907781397534624 deviation-center-line_min 0.09263278175679672 deviation-heading_max 1.7183292422352954 deviation-heading_mean 0.7984177229793936 deviation-heading_median 0.5858313015455301 deviation-heading_min 0.38730041512189434 driven_any_max 0.7928873277300456 driven_any_mean 0.3064261243806684 driven_any_median 0.22498124254441865 driven_any_min 0.08927605174825964 driven_lanedir_max 0.5510805099921994 driven_lanedir_mean 0.24507277155126625 driven_lanedir_min 0.07586780200910015 in-drivable-lane_max 1.9000000000000024 in-drivable-lane_mean 0.3800000000000005 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.7928873277300456, "driven_lanedir": 0.5510805099921994, "in-drivable-lane": 1.9000000000000024, "deviation-heading": 1.7183292422352954, "deviation-center-line": 0.35753048570357787}, "ep001": {"driven_any": 0.08927605174825964, "driven_lanedir": 0.07586780200910015, "in-drivable-lane": 0, "deviation-heading": 0.38730041512189434, "deviation-center-line": 0.09263278175679672}, "ep002": {"driven_any": 0.17141768242842462, "driven_lanedir": 0.15359552259448783, "in-drivable-lane": 0, "deviation-heading": 0.5858313015455301, "deviation-center-line": 0.1407064486409552}, "ep003": {"driven_any": 0.22498124254441865, "driven_lanedir": 0.20552955460650055, "in-drivable-lane": 0, "deviation-heading": 0.7449872273121362, "deviation-center-line": 0.147201300169161}, "ep004": {"driven_any": 0.2535683174521935, "driven_lanedir": 0.23929046855404312, "in-drivable-lane": 0, "deviation-heading": 0.5556404286821123, "deviation-center-line": 0.2158196824968214}}
No reset possible 9115
1159
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:14:29
driven_lanedir_median 2.9645880551955353 deviation-center-line_median 0.7225207950313806 in-drivable-lane_median 0.2
other stats deviation-center-line_max 2.5192439549059884 deviation-center-line_mean 0.9522369835871388 deviation-center-line_min 0.07012102503315373 deviation-heading_max 1.961878196598454 deviation-heading_mean 1.0905478743744097 deviation-heading_median 1.1917587236710998 deviation-heading_min 0.06263261537654082 driven_any_max 6.367591009197094 driven_any_mean 3.386638963448918 driven_any_median 3.3854397950746216 driven_any_min 0.33973165334795075 driven_lanedir_max 6.303948340587919 driven_lanedir_mean 3.162113937155253 driven_lanedir_min 0.133546879125491 in-drivable-lane_max 0.93333333333333 in-drivable-lane_mean 0.4066666666666654 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.33973165334795075, "driven_lanedir": 0.133546879125491, "in-drivable-lane": 0.2, "deviation-heading": 0.7239127722490644, "deviation-center-line": 0.07012102503315373}, "ep001": {"driven_any": 0.5214051897237224, "driven_lanedir": 0.5205213017203099, "in-drivable-lane": 0, "deviation-heading": 0.06263261537654082, "deviation-center-line": 0.1594339097661895}, "ep002": {"driven_any": 3.3854397950746216, "driven_lanedir": 2.9645880551955353, "in-drivable-lane": 0.93333333333333, "deviation-heading": 1.512557063976889, "deviation-center-line": 0.7225207950313806}, "ep003": {"driven_any": 6.319027169901199, "driven_lanedir": 5.887965109147009, "in-drivable-lane": 0.8999999999999968, "deviation-heading": 1.961878196598454, "deviation-center-line": 1.2898652331989815}, "ep004": {"driven_any": 6.367591009197094, "driven_lanedir": 6.303948340587919, "in-drivable-lane": 0, "deviation-heading": 1.1917587236710998, "deviation-center-line": 2.5192439549059884}}
No reset possible 9081
1157
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:52 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 9057
1153
David Abraham Pytorch IL aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:34
No reset possible 9037
1150
Iban Harlouchet 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:05:25
No reset possible 9009
1147
Krishna Murthy Jatavallabhula 🇨🇦 gym_duckietown + opencv aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:52 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 556, in run
solve(params, cis)
File "solution.py", line 521, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 9006
1146
Martin Weiss 🇨🇦 PyTorch template aido1_LF1_r3-v3
step4-viz aborted yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:38 Error while running [...]
Pulling visualization ... error
stderr | ERROR: for visualization Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible 8847
1121
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step4-viz timeout yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:31:00
No reset possible 8821
1117
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:58
driven_lanedir_median 3.398552203462148 deviation-center-line_median 1.1629913185907466 in-drivable-lane_median 7.4333333333333425
other stats deviation-center-line_max 1.624202371989478 deviation-center-line_mean 1.0726177383196294 deviation-center-line_min 0.09897289157693905 deviation-heading_max 1.6411382726446502 deviation-heading_mean 1.072766539005956 deviation-heading_median 1.2405699143350992 deviation-heading_min 0.06092138286581164 driven_any_max 6.341423765977694 driven_any_mean 5.025218720795255 driven_any_median 6.18869350629341 driven_any_min 0.34666666666667245 driven_lanedir_max 4.279411544180419 driven_lanedir_mean 2.7799088404744516 driven_lanedir_min 0.3458718774424743 in-drivable-lane_max 9.56666666666667 in-drivable-lane_mean 5.966666666666661 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 6.18869350629341, "driven_lanedir": 3.398552203462148, "in-drivable-lane": 7.699999999999975, "deviation-heading": 0.874562261434017, "deviation-center-line": 1.624202371989478}, "ep001": {"driven_any": 0.34666666666667245, "driven_lanedir": 0.3458718774424743, "in-drivable-lane": 0, "deviation-heading": 0.06092138286581164, "deviation-center-line": 0.09897289157693905}, "ep002": {"driven_any": 6.268925578379452, "driven_lanedir": 3.402540543997432, "in-drivable-lane": 7.4333333333333425, "deviation-heading": 1.6411382726446502, "deviation-center-line": 1.1629913185907466}, "ep003": {"driven_any": 6.341423765977694, "driven_lanedir": 4.279411544180419, "in-drivable-lane": 5.133333333333315, "deviation-heading": 1.2405699143350992, "deviation-center-line": 1.4140966651281546}, "ep004": {"driven_any": 5.980384086659046, "driven_lanedir": 2.4731680332897836, "in-drivable-lane": 9.56666666666667, "deviation-heading": 1.5466408637502016, "deviation-center-line": 1.0628254443128289}}
No reset possible 8805
1116
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring aborted yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:15:05 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.amazonaws.com/v3/frankfurt/by-value/sha256/be6c892ed4210246dbd123f76738d5eea52fa288c9c12ab870908ceb65609c11"
No reset possible 8794
1114
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:09:19
No reset possible 8783
1113
Krishna Murthy Jatavallabhula 🇨🇦 gym_duckietown + opencv aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:25
driven_lanedir_median -0.15741338591251397 deviation-center-line_median 0.4650944282204903 in-drivable-lane_median 8.299999999999994
other stats deviation-center-line_max 0.7159918628482274 deviation-center-line_mean 0.3821124886057812 deviation-center-line_min 0 deviation-heading_max 6.540131967240777 deviation-heading_mean 5.221082427757779 deviation-heading_median 6.537669075338772 deviation-heading_min 0 driven_any_max 0.49735405772776664 driven_any_mean 0.496556394292239 driven_any_median 0.4973540577277655 driven_any_min 0.4933657405501446 driven_lanedir_max 0 driven_lanedir_mean -0.125929382017318 driven_lanedir_min -0.15748677407321887 in-drivable-lane_max 16.63333333333332 in-drivable-lane_mean 9.973333333333326 in-drivable-lane_min 8.299999999999994 per-episodes details {"ep000": {"driven_any": 0.4933657405501446, "driven_lanedir": 0, "in-drivable-lane": 16.63333333333332, "deviation-heading": 0, "deviation-center-line": 0}, "ep001": {"driven_any": 0.49735405772775193, "driven_lanedir": -0.15748677407321887, "in-drivable-lane": 8.299999999999994, "deviation-heading": 6.540131967240777, "deviation-center-line": 0.7159918628482274}, "ep002": {"driven_any": 0.49735405772776664, "driven_lanedir": -0.15729812685794475, "in-drivable-lane": 8.333333333333329, "deviation-heading": 6.487894280561919, "deviation-center-line": 0.4650944282204903}, "ep003": {"driven_any": 0.4973540577277655, "driven_lanedir": -0.15744862324291242, "in-drivable-lane": 8.299999999999994, "deviation-heading": 6.539716815647427, "deviation-center-line": 0.2023017828020126}, "ep004": {"driven_any": 0.4973540577277666, "driven_lanedir": -0.15741338591251397, "in-drivable-lane": 8.299999999999994, "deviation-heading": 6.537669075338772, "deviation-center-line": 0.527174369158176}}
No reset possible 8771
1113
Krishna Murthy Jatavallabhula 🇨🇦 gym_duckietown + opencv aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:53
No reset possible 8752
1110
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:12
driven_lanedir_median 3.4671741334926467 deviation-center-line_median 1.2958101170355834 in-drivable-lane_median 6.300000000000013
other stats deviation-center-line_max 1.6351280540821935 deviation-center-line_mean 0.986355175052001 deviation-center-line_min 0.09897289157693905 deviation-heading_max 2.013916250570275 deviation-heading_mean 1.2276586971395471 deviation-heading_median 1.496818149580234 deviation-heading_min 0.06092138286581164 driven_any_max 6.300532347308807 driven_any_mean 4.383603780695079 driven_any_median 6.015722154991967 driven_any_min 0.34666666666667245 driven_lanedir_max 3.856705062202037 driven_lanedir_mean 2.8568503518682036 driven_lanedir_min 0.3458718774424743 in-drivable-lane_max 6.9666666666666455 in-drivable-lane_mean 3.9400000000000017 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 6.214405296996363, "driven_lanedir": 3.661117948394939, "in-drivable-lane": 6.4000000000000155, "deviation-heading": 2.013916250570275, "deviation-center-line": 1.4114794115316016}, "ep001": {"driven_any": 0.34666666666667245, "driven_lanedir": 0.3458718774424743, "in-drivable-lane": 0, "deviation-heading": 0.06092138286581164, "deviation-center-line": 0.09897289157693905}, "ep002": {"driven_any": 6.300532347308807, "driven_lanedir": 3.856705062202037, "in-drivable-lane": 6.300000000000013, "deviation-heading": 1.6193895682632025, "deviation-center-line": 1.2958101170355834}, "ep003": {"driven_any": 3.040692437511586, "driven_lanedir": 2.953382737808921, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 0.9472481344182136, "deviation-center-line": 0.4903854010336875}, "ep004": {"driven_any": 6.015722154991967, "driven_lanedir": 3.4671741334926467, "in-drivable-lane": 6.9666666666666455, "deviation-heading": 1.496818149580234, "deviation-center-line": 1.6351280540821935}}
No reset possible 8743
1109
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:31
survival_time_median 4.266666666666661
other stats:
per-episode details:
ep000: nsteps 40, reward -25.25177037610556, good_angle 0.6402653252392663, survival_time 1.333333333333334, traveled_tiles 1, valid_direction 1.1333333333333342
ep001: nsteps 40, reward -25.562104041129352, good_angle 0.005956591799219283, survival_time 1.333333333333334, traveled_tiles 1, valid_direction 0
ep002: nsteps 128, reward -8.312191555043682, good_angle 0.4929748188962192, survival_time 4.266666666666661, traveled_tiles 3, valid_direction 1.0666666666666629
ep003: nsteps 228, reward -4.689726662446271, good_angle 0.5025668708841372, survival_time 7.599999999999983, traveled_tiles 4, valid_direction 1.099999999999996
ep004: nsteps 173, reward -6.3794289034955485, good_angle 10.402549052319154, survival_time 5.766666666666656, traveled_tiles 3, valid_direction 1.999999999999993
good_angle: max 10.402549052319154, mean 2.408862531827599, median 0.5025668708841372, min 0.005956591799219283
reward: max -4.689726662446271, mean -14.03904430764408, median -8.312191555043682, min -25.562104041129352
survival_time: max 7.599999999999983, mean 4.059999999999993, min 1.333333333333334
traveled_tiles: max 4, mean 2.4, median 3, min 1
valid_direction: max 1.999999999999993, mean 1.0599999999999974, median 1.099999999999996, min 0
No reset possible 8647
1090
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step3-videos timeout yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:31:00
No reset possible 8635
1087
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:05:36
survival_time_median 16.666666666666654
other stats:
per-episode details:
ep000: nsteps 500, reward -0.1385571958462242, good_angle 1.0827976173705394, survival_time 16.666666666666654, traveled_tiles 10, valid_direction 3.066666666666658
ep001: nsteps 500, reward -0.4588272203770466, good_angle 26.36631660973232, survival_time 16.666666666666654, traveled_tiles 11, valid_direction 3.899999999999987
ep002: nsteps 302, reward -3.5442560831083307, good_angle 0.8967085704316418, survival_time 10.066666666666642, traveled_tiles 6, valid_direction 1.7999999999999945
ep003: nsteps 500, reward -0.24455104245571416, good_angle 1.170968986704092, survival_time 16.666666666666654, traveled_tiles 10, valid_direction 2.5333333333333243
ep004: nsteps 500, reward -0.47790012961020695, good_angle 22.573560307157873, survival_time 16.666666666666654, traveled_tiles 11, valid_direction 2.599999999999991
good_angle: max 26.36631660973232, mean 10.418070418279294, median 1.170968986704092, min 0.8967085704316418
reward: max -0.1385571958462242, mean -0.9728183342795044, median -0.4588272203770466, min -3.5442560831083307
survival_time: max 16.666666666666654, mean 15.346666666666652, min 10.066666666666642
traveled_tiles: max 11, mean 9.6, median 10, min 6
valid_direction: max 3.899999999999987, mean 2.7799999999999914, median 2.599999999999991, min 1.7999999999999945
No reset possible 8569
1077
David Abraham Pytorch IL aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:39
survival_time_median 8.566666666666647
other stats:
per-episode details:
ep000: nsteps 257, reward -3.7163317949839065, good_angle 0.4040754514380267, survival_time 8.566666666666647, traveled_tiles 5, valid_direction 0.7333333333333316
ep001: nsteps 234, reward -4.4394585757176035, good_angle 7.730404185747497, survival_time 7.799999999999982, traveled_tiles 5, valid_direction 1.6999999999999946
ep002: nsteps 500, reward 0.18717753134015944, good_angle 0.470133995447931, survival_time 16.666666666666654, traveled_tiles 9, valid_direction 0.9999999999999964
ep003: nsteps 459, reward -1.9749117352453684, good_angle 0.4213946639061889, survival_time 15.299999999999956, traveled_tiles 8, valid_direction 0.6999999999999975
ep004: nsteps 124, reward -8.247120564695516, good_angle 2.1871857981951406, survival_time 4.133333333333328, traveled_tiles 3, valid_direction 0.5999999999999979
good_angle: max 7.730404185747497, mean 2.242638818946957, median 0.470133995447931, min 0.4040754514380267
reward: max 0.18717753134015944, mean -3.6381290278604466, median -3.7163317949839065, min -8.247120564695516
survival_time: max 16.666666666666654, mean 10.493333333333313, min 4.133333333333328
traveled_tiles: max 9, mean 6, median 5, min 3
valid_direction: max 1.6999999999999946, mean 0.9466666666666635, median 0.7333333333333316, min 0.5999999999999979
No reset possible 8550
1071
Krishna Murthy Jatavallabhula 🇨🇦 PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:27 The result file is not found. This usually means that the evaluator did not finish
and some times that there was an import error.
Check the evaluator log to see what happened.
No reset possible 8545
1070
Claudio Ruch Java template aido1_amod_service_quality_r1-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:16:15 Timeout:
Waited 601.578508854 for container to finish. Giving up.
No reset possible 8533
1067
Claudio Ruch Webinar Algorithm aido1_amod_service_quality_r1-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:49 Timeout:
Waited 603.854336977 for container to finish. Giving up.
No reset possible 8512
1064
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:37
No reset possible 8506
1063
Benjamin Ramtoula 🇨🇦 My ROS solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:09:35
survival_time_median 16.666666666666654
other stats:
per-episode details:
ep000: nsteps 500, reward -0.25416273124999134, good_angle 0.7444893646124573, survival_time 16.666666666666654, traveled_tiles 10, valid_direction 1.4666666666666617
ep001: nsteps 500, reward -0.44939830574346706, good_angle 23.524621673805804, survival_time 16.666666666666654, traveled_tiles 11, valid_direction 4.166666666666655
ep002: nsteps 500, reward -0.3045769970673136, good_angle 0.7257529187394196, survival_time 16.666666666666654, traveled_tiles 10, valid_direction 1.5666666666666649
ep003: nsteps 500, reward -0.2888979723546654, good_angle 0.4591799947974504, survival_time 16.666666666666654, traveled_tiles 9, valid_direction 0.6999999999999975
ep004: nsteps 500, reward -0.39088739768415687, good_angle 17.242621474406523, survival_time 16.666666666666654, traveled_tiles 11, valid_direction 2.166666666666659
good_angle: max 23.524621673805804, mean 8.539333085272329, median 0.7444893646124573, min 0.4591799947974504
reward: max -0.25416273124999134, mean -0.3375846808199189, median -0.3045769970673136, min -0.44939830574346706
survival_time: max 16.666666666666654, mean 16.666666666666654, min 16.666666666666654
traveled_tiles: max 11, mean 10.2, median 10, min 9
valid_direction: max 4.166666666666655, mean 2.013333333333327, median 1.5666666666666649, min 0.6999999999999975
No reset possible 8501
1062
Mandana Samiei 🇨🇦 bazinga!!! aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:22 Timeout:
Waited 601.105962038 for container to finish. Giving up.
No reset possible 8495
1061
Mandana Samiei 🇨🇦 bazinga!!! aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:44
survival_time_median 1.9666666666666697
other stats:
per-episode details:
ep000: nsteps 220, reward -4.685009788676557, good_angle 0.8076957338944025, survival_time 7.333333333333317, traveled_tiles 2, valid_direction 0.9999999999999964
ep001: nsteps 53, reward -19.414937066582013, good_angle 0.03280049822537973, survival_time 1.7666666666666688, traveled_tiles 1, valid_direction 0
ep002: nsteps 48, reward -21.27974657838543, good_angle 0.4468633402845622, survival_time 1.6000000000000016, traveled_tiles 2, valid_direction 0.7000000000000014
ep003: nsteps 59, reward -17.301283067692136, good_angle 0.5392666859174008, survival_time 1.9666666666666697, traveled_tiles 1, valid_direction 0.8000000000000019
ep004: nsteps 68, reward -15.174569403862252, good_angle 0.3832377575607332, survival_time 2.2666666666666684, traveled_tiles 1, valid_direction 0.6333333333333333
good_angle: max 0.8076957338944025, mean 0.4419728031764957, median 0.4468633402845622, min 0.03280049822537973
reward: max -4.685009788676557, mean -15.571109181039676, median -17.301283067692136, min -21.27974657838543
survival_time: max 7.333333333333317, mean 2.9866666666666655, min 1.6000000000000016
traveled_tiles: max 2, mean 1.4, median 1, min 1
valid_direction: max 0.9999999999999964, mean 0.6266666666666667, median 0.7000000000000014, min 0
No reset possible 8480
1059
Mandana Samiei 🇨🇦 bazinga!!! aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:12:16 Timeout:
Waited 600.105436802 for container to finish. Giving up.
No reset possible 8461
1056
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:44
No reset possible 8446
1055
Benjamin Ramtoula 🇨🇦 My ROS solution - param 3 aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:48 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
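The timeout above is the evaluator giving up after polling 600 s for the solution container to write its output file. A minimal sketch of that wait loop (function name and parameters are illustrative, not the duckietown-challenges API):

```python
import os
import time

def wait_for_file(path, timeout_s=600.0, poll_s=1.0):
    # Poll for the solution's output file; return True as soon as it
    # appears, False once the deadline passes (the "Time out" case above).
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll_s)
    return False
```

On the submitter's side, the usual cause is a solution that crashes or hangs before writing `output-solution.yaml`, so the file never appears.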
No reset possible 8442
1054
David Abraham Pytorch IL aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:20
survival_time_median 8.566666666666647
other stats:
per-episode details:
ep000: nsteps 257, reward -3.7163317949839065, good_angle 0.4040754514380267, survival_time 8.566666666666647, traveled_tiles 5, valid_direction 0.7333333333333316
ep001: nsteps 234, reward -4.4394585757176035, good_angle 7.730404185747497, survival_time 7.799999999999982, traveled_tiles 5, valid_direction 1.6999999999999946
ep002: nsteps 500, reward 0.18717753134015944, good_angle 0.470133995447931, survival_time 16.666666666666654, traveled_tiles 9, valid_direction 0.9999999999999964
ep003: nsteps 459, reward -1.9749117352453684, good_angle 0.4213946639061889, survival_time 15.299999999999956, traveled_tiles 8, valid_direction 0.6999999999999975
ep004: nsteps 124, reward -8.247120564695516, good_angle 2.1871857981951406, survival_time 4.133333333333328, traveled_tiles 3, valid_direction 0.5999999999999979
good_angle: max 7.730404185747497, mean 2.242638818946957, median 0.470133995447931, min 0.4040754514380267
reward: max 0.18717753134015944, mean -3.6381290278604466, median -3.7163317949839065, min -8.247120564695516
survival_time: max 16.666666666666654, mean 10.493333333333313, min 4.133333333333328
traveled_tiles: max 9, mean 6, median 5, min 3
valid_direction: max 1.6999999999999946, mean 0.9466666666666635, median 0.7333333333333316, min 0.5999999999999979
No reset possible 8434
1053
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:17
No reset possible 8402
1048
Mandana Samiei 🇨🇦 Solution template aido1_luck-v3
step1 success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:31:58
No reset possible 8361
1042
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step3-videos error yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:43:16 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 623, in _make_api_call
raise error_class(parsed_response, operation_name)
ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden
No reset possible 8327
1036
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:00 Error while running [...]
Pulling evaluator ... error
stderr | ERROR: for solution Get https://registry-1.docker.io/v2/maivincent/aido1_lf1_r3-v3-submission/manifests/2018_11_14_00_16_27: proxyconnect tcp: dial tcp 192.168.65.1:3129: i/o timeout
stderr |
stderr | ERROR: for evaluator Get https://registry-1.docker.io/v2/andreacensi/aido1_lf1_r3-v3-step1-simulation-evaluator/manifests/2018_11_08_16_21_06: proxyconnect tcp: dial tcp 192.168.65.1:3129: i/o timeout
stderr | Get https://registry-1.docker.io/v2/maivincent/aido1_lf1_r3-v3-submission/manifests/2018_11_14_00_16_27: proxyconnect tcp: dial tcp 192.168.65.1:3129: i/o timeout
stderr | Get https://registry-1.docker.io/v2/andreacensi/aido1_lf1_r3-v3-step1-simulation-evaluator/manifests/2018_11_08_16_21_06: proxyconnect tcp: dial tcp 192.168.65.1:3129: i/o timeout
stderr |
No reset possible 8288
1027
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:20:34
survival_time_median 7.933333333333315
other stats:
per-episode details:
ep000: nsteps 500, reward -0.3701488013258204, good_angle 0.6058817932506364, survival_time 16.666666666666654, traveled_tiles 12, valid_direction 1.2333333333333292
ep001: nsteps 28, reward -36.270419410296846, good_angle 0.004123794322536422, survival_time 0.9333333333333332, traveled_tiles 1, valid_direction 0
ep002: nsteps 500, reward -0.4321476400345564, good_angle 0.6623218333699774, survival_time 16.666666666666654, traveled_tiles 12, valid_direction 1.1333333333333293
ep003: nsteps 224, reward -4.642331055026651, good_angle 0.1130436329615555, survival_time 7.46666666666665, traveled_tiles 5, valid_direction 0.16666666666666607
ep004: nsteps 238, reward -4.661016504452446, good_angle 10.885551075653066, survival_time 7.933333333333315, traveled_tiles 6, valid_direction 2.1333333333333258
good_angle: max 10.885551075653066, mean 2.454184425911554, median 0.6058817932506364, min 0.004123794322536422
reward: max -0.3701488013258204, mean -9.275212682227265, median -4.642331055026651, min -36.270419410296846
survival_time: max 16.666666666666654, mean 9.93333333333332, min 0.9333333333333332
traveled_tiles: max 12, mean 7.2, median 6, min 1
valid_direction: max 2.1333333333333258, mean 0.93333333333333, median 1.1333333333333293, min 0
No reset possible 8281
1027
Vincent Mai 🇨🇦 ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:11 Error while running [...]
Pulling evaluator ... error
stderr | ERROR: for evaluator Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr | Get https://registry-1.docker.io/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
stderr |
No reset possible 8279
1026
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:05 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 69, in solve
num_stack=1)
File "/workspace/model.py", line 246, in __init__
self._current_obs = self._current_obs.cuda()
RuntimeError: cuda runtime error (35) : CUDA driver version is insufficient for CUDA runtime version at /tmp/pip-req-build-vRRdPa/aten/src/THC/THCGeneral.cpp:74
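The `cuda runtime error (35)` above means the submission called `.cuda()` on a machine whose CUDA driver cannot support the runtime PyTorch was built against; the evaluator here (a MacBook) effectively has no usable GPU. A minimal sketch of the usual defence is to select the device once and fall back to CPU; `torch` is deliberately not imported here, and the availability flag is passed in explicitly:

```python
def pick_device(cuda_available: bool) -> str:
    # In real code the flag would come from torch.cuda.is_available(),
    # which returns False when the driver/runtime pair is unusable;
    # tensors are then moved with .to(device) instead of .cuda().
    return "cuda" if cuda_available else "cpu"
```

With this pattern the same submission image runs on CPU-only evaluators instead of crashing at startup.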
No reset possible 8274
1025
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:35 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 89, in solve
observation, reward, done, info = env.step(action)
File "/workspace/wrappers.py", line 252, in step
return self.step_wait()
File "/workspace/wrappers.py", line 271, in step_wait
results = [env.step(a) for (a,env) in zip(self.actions, self.envs)]
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 8264
1020
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:09 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 81, in solve
model.update_current_obs(observation)
File "/workspace/model.py", line 257, in update_current_obs
obs = torch.from_numpy(obs).float()
RuntimeError: PyTorch was compiled without NumPy support
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 81, in solve
model.update_current_obs(observation)
File "/workspace/model.py", line 257, in update_current_obs
obs = torch.from_numpy(obs).float()
RuntimeError: PyTorch was compiled without NumPy support
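Here `torch.from_numpy(obs)` fails because the installed PyTorch build lacks the NumPy bridge. A hedged sketch of a workaround is to route the observation through plain Python lists, which `torch.tensor()` accepts directly; `torch` itself is not imported below, only the conversion pattern is shown:

```python
def as_nested_lists(obs):
    # NumPy arrays expose .tolist(); fall back to list() for plain
    # sequences. The result can then be passed to torch.tensor(...)
    # without going through the from_numpy bridge.
    tolist = getattr(obs, "tolist", None)
    return tolist() if callable(tolist) else list(obs)
```

The cleaner fix, of course, is to install a PyTorch wheel built with NumPy support in the submission image.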
No reset possible 8254
1017
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:14 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 69, in solve
num_stack=1)
File "/workspace/model.py", line 239, in __init__
self._obs_shape = (obs_shape[0] * args.num_stack, *obs_shape[1:])
NameError: name 'args' is not defined
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 69, in solve
num_stack=1)
File "/workspace/model.py", line 239, in __init__
self._obs_shape = (obs_shape[0] * args.num_stack, *obs_shape[1:])
NameError: name 'args' is not defined
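This `NameError` is self-inflicted: `model.py` reads a module-global `args` inside `__init__` even though the stack count was already passed in as the `num_stack` argument. A hypothetical reconstruction of the fix (class name and shapes are illustrative, not the author's actual code):

```python
class ObsStacker:
    """Stacks `num_stack` observations along the channel axis."""

    def __init__(self, obs_shape, num_stack=1):
        # Use the constructor argument rather than a global `args`
        # object, which does not exist inside the evaluation container.
        self._obs_shape = (obs_shape[0] * num_stack, *obs_shape[1:])
```

For a (3, 84, 84) observation and num_stack=4, this yields a (12, 84, 84) stacked shape.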
No reset possible 8238
1016
Orlando Marquez 🇨🇦 Imitating aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:00
survival_time_median 7.533333333333316
other stats:
per-episode details:
ep000: nsteps 76, reward -13.416412785493058, good_angle 0.9723576108709706, survival_time 2.533333333333334, traveled_tiles 1, valid_direction 2.333333333333334
ep001: nsteps 108, reward -9.848042460503402, good_angle 2.041641545621004, survival_time 3.599999999999997, traveled_tiles 2, valid_direction 0.5666666666666647
ep002: nsteps 226, reward -4.574839447271614, good_angle 1.5760425170941503, survival_time 7.533333333333316, traveled_tiles 5, valid_direction 2.5999999999999908
ep003: nsteps 500, reward -0.13448444758018013, good_angle 0.6536031283623902, survival_time 16.666666666666654, traveled_tiles 6, valid_direction 2.06666666666666
ep004: nsteps 282, reward -4.769903202343019, good_angle 16.060156931528677, survival_time 9.399999999999975, traveled_tiles 3, valid_direction 4.166666666666653
good_angle: max 16.060156931528677, mean 4.260760346695439, median 1.5760425170941503, min 0.6536031283623902
reward: max -0.13448444758018013, mean -6.548736468638255, median -4.769903202343019, min -13.416412785493058
survival_time: max 16.666666666666654, mean 7.946666666666656, min 2.533333333333334
traveled_tiles: max 6, mean 3.4, median 3, min 1
valid_direction: max 4.166666666666653, mean 2.3466666666666605, median 2.333333333333334, min 0.5666666666666647
No reset possible 8231
1014
David Abraham Pytorch IL aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:05:16
No reset possible 8218
1015
David Abraham Pytorch IL aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:10
No reset possible 8205
1013
David Abraham Pytorch IL aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:23
survival_time_median 3.8666666666666623
other stats:
per-episode details:
ep000: nsteps 190, reward -5.205830141776977, good_angle 1.396838191834734, survival_time 6.333333333333321, traveled_tiles 4, valid_direction 0.7333333333333308
ep001: nsteps 46, reward -22.44177474923756, good_angle 0.6096318451873292, survival_time 1.5333333333333348, traveled_tiles 1, valid_direction 0.33333333333333437
ep002: nsteps 116, reward -8.61695077078987, good_angle 1.3546548994691416, survival_time 3.8666666666666623, traveled_tiles 3, valid_direction 0.6666666666666643
ep003: nsteps 339, reward -2.86965735018072, good_angle 1.3466075416605623, survival_time 11.29999999999997, traveled_tiles 7, valid_direction 0.7999999999999983
ep004: nsteps 87, reward -11.927641337501932, good_angle 0.7192143058080895, survival_time 2.8999999999999995, traveled_tiles 2, valid_direction 0.466666666666665
good_angle: max 1.396838191834734, mean 1.0853893567919712, median 1.3466075416605623, min 0.6096318451873292
reward: max -2.86965735018072, mean -10.212370869897413, median -8.61695077078987, min -22.44177474923756
survival_time: max 11.29999999999997, mean 5.186666666666658, min 1.5333333333333348
traveled_tiles: max 7, mean 3.4, median 3, min 1
valid_direction: max 0.7999999999999983, mean 0.5999999999999985, median 0.6666666666666643, min 0.33333333333333437
No reset possible 8193
1011
Ruixiang Zhang 🇨🇦 stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:28 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 67, in solve
from model import A2CPG
File "/workspace/model.py", line 5, in <module>
import torch
ModuleNotFoundError: No module named 'torch'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 127, in run
solve(params, cis)
File "solution.py", line 67, in solve
from model import A2CPG
File "/workspace/model.py", line 5, in <module>
import torch
ModuleNotFoundError: No module named 'torch'
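The failure above is a packaging problem: `torch` was never installed in the submission image, so the import in /workspace/model.py aborts the run at evaluation time. A hedged sketch (the helper name and package list are illustrative, not part of the original solution) of a pre-flight check that fails fast with an actionable message:

```python
import importlib.util
import sys

# Hypothetical pre-flight check: verify heavy dependencies are importable
# before the evaluator starts an episode, so a missing package produces a
# clear message instead of a mid-run ModuleNotFoundError.
REQUIRED = ("torch", "numpy")

def check_dependencies(required=REQUIRED):
    missing = [name for name in required
               if importlib.util.find_spec(name) is None]
    if missing:
        sys.exit("missing packages (add them to the submission image): "
                 + ", ".join(missing))

# check_dependencies()  # call once at solution start-up
```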
No reset possible 8186
1009
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:59 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 124, in run
solve(params, cis)
File "solution.py", line 45, in solve
env = DummyVecEnv([make_env(params)])
TypeError: make_env() takes 0 positional arguments but 1 was given
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 124, in run
solve(params, cis)
File "solution.py", line 45, in solve
env = DummyVecEnv([make_env(params)])
TypeError: make_env() takes 0 positional arguments but 1 was given
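This TypeError combines two misuses: `make_env` was declared with no parameters but called with `params`, and a DummyVecEnv-style constructor expects a list of zero-argument factories it can call later. A minimal sketch of the factory pattern, with a placeholder dict standing in for the real environment constructor:

```python
def make_env(params):
    """Return a zero-argument factory, as a DummyVecEnv-style wrapper requires."""
    def _thunk():
        # stand-in for the submission's real environment constructor
        return {"params": params}
    return _thunk

factory = make_env({"seed": 123})
env = factory()  # a DummyVecEnv-style wrapper does exactly this internally
```

With this shape, `DummyVecEnv([make_env(params)])` receives a callable rather than the result of calling `make_env` with an argument it does not accept.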
No reset possible 8178
1008
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:11 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 122, in run
solve(params, cis)
File "solution.py", line 43, in solve
env = DummyVecEnv([env])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'ScaleObservations' object is not callable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 122, in run
solve(params, cis)
File "solution.py", line 43, in solve
env = DummyVecEnv([env])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'ScaleObservations' object is not callable
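Here the list passed to DummyVecEnv contains an already-constructed environment; as the traceback shows, wrappers.py calls each list element (`fn()`), so the elements must be callables. A minimal sketch (the class name and placeholder env are illustrative) of the failing and the working call:

```python
class DummyVecEnvSketch:
    """Mirrors the relevant behaviour of wrappers.py line 261:
    each list element is *called* to construct an environment."""
    def __init__(self, env_fns):
        self.envs = [fn() for fn in env_fns]

env = object()                          # placeholder for ScaleObservations(...)
vec = DummyVecEnvSketch([lambda: env])  # OK: the lambda is callable
# DummyVecEnvSketch([env])              # TypeError: not callable, as in the log
```

Note that for more than one parallel environment, each factory should construct a fresh instance rather than closing over a single shared one.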
No reset possible 8176
1007
Ruixiang Zhang 🇨🇦stay young aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:23 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 122, in run
solve(params, cis)
File "solution.py", line 43, in solve
env = DummyVecEnv([env])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'ScaleObservations' object is not callable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
duckietown_challenges.exceptions.InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 122, in run
solve(params, cis)
File "solution.py", line 43, in solve
env = DummyVecEnv([env])
File "/workspace/wrappers.py", line 261, in __init__
self.envs = [fn() for fn in env_fns]
File "/workspace/wrappers.py", line 261, in <listcomp>
self.envs = [fn() for fn in env_fns]
TypeError: 'ScaleObservations' object is not callable
No reset possible 8168
1004
Laurent Mandrile Tensorflow template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:12
driven_lanedir_median 0.13312858444658682 deviation-center-line_median 0.08744010848352336 in-drivable-lane_median 8.366666666666639
other stats deviation-center-line_max 0.2951251593343002 deviation-center-line_mean 0.12729673166499883 deviation-center-line_min 0.047860468716545944
deviation-heading_max 1.2058790583016687 deviation-heading_mean 0.7178061254891885 deviation-heading_median 0.7905924332634966 deviation-heading_min 0.1745193388357215
driven_any_max 4.454184189442348 driven_any_mean 2.5257450968093726 driven_any_median 2.3537329967677016 driven_any_min 1.0662869877338133
driven_lanedir_max 0.4531051551321528 driven_lanedir_mean 0.193667880939661 driven_lanedir_min 0.07969926819667117
in-drivable-lane_max 15.999999999999988 in-drivable-lane_mean 9.086666666666648 in-drivable-lane_min 3.766666666666661
per-episodes details {"ep000": {"driven_any": 1.995026722965619, "driven_lanedir": 0.07969926819667117, "in-drivable-lane": 7.866666666666648, "deviation-heading": 0.1745193388357215, "deviation-center-line": 0.047860468716545944}, "ep001": {"driven_any": 2.3537329967677016, "driven_lanedir": 0.13312858444658682, "in-drivable-lane": 9.433333333333303, "deviation-heading": 1.2058790583016687, "deviation-center-line": 0.1298299749331926}, "ep002": {"driven_any": 4.454184189442348, "driven_lanedir": 0.18029873261228624, "in-drivable-lane": 15.999999999999988, "deviation-heading": 0.47847911565363, "deviation-center-line": 0.07622794685743216}, "ep003": {"driven_any": 1.0662869877338133, "driven_lanedir": 0.12210766431060804, "in-drivable-lane": 3.766666666666661, "deviation-heading": 0.7905924332634966, "deviation-center-line": 0.08744010848352336}, "ep004": {"driven_any": 2.7594945871373833, "driven_lanedir": 0.4531051551321528, "in-drivable-lane": 8.366666666666639, "deviation-heading": 0.9395606813914256, "deviation-center-line": 0.2951251593343002}}
No reset possible 8164
1004
Laurent Mandrile Tensorflow template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:42
survival_time_median 10.533333333333308
other stats episodes details {"ep000": {"nsteps": 244, "reward": -6.2869903129815565, "good_angle": 1.776589935962943, "survival_time": 8.133333333333315, "traveled_tiles": 4, "valid_direction": 2.6999999999999997}, "ep001": {"nsteps": 316, "reward": -5.700926186605297, "good_angle": 40.99984006033056, "survival_time": 10.533333333333308, "traveled_tiles": 3, "valid_direction": 7.099999999999983}, "ep002": {"nsteps": 500, "reward": -2.864125187820755, "good_angle": 41.203790317142946, "survival_time": 16.666666666666654, "traveled_tiles": 6, "valid_direction": 7.699999999999982}, "ep003": {"nsteps": 138, "reward": -9.236848439602856, "good_angle": 1.853637092777763, "survival_time": 4.599999999999993, "traveled_tiles": 2, "valid_direction": 2.5}, "ep004": {"nsteps": 316, "reward": -4.6504965261512, "good_angle": 1.752446115197312, "survival_time": 10.533333333333308, "traveled_tiles": 4, "valid_direction": 3.166666666666659}}
good_angle_max 41.203790317142946 good_angle_mean 17.517260704282304 good_angle_median 1.853637092777763 good_angle_min 1.752446115197312
reward_max -2.864125187820755 reward_mean -5.747877330632333 reward_median -5.700926186605297 reward_min -9.236848439602856
survival_time_max 16.666666666666654 survival_time_mean 10.093333333333314 survival_time_min 4.599999999999993
traveled_tiles_max 6 traveled_tiles_mean 3.8 traveled_tiles_median 4 traveled_tiles_min 2
valid_direction_max 7.699999999999982 valid_direction_mean 4.633333333333324 valid_direction_median 3.166666666666659 valid_direction_min 2.5
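The per-episode numbers across these records are consistent with a fixed control step of 1/30 s (an inference from the data; the log itself does not state the step length): survival_time equals nsteps/30, and the 500-step cap produces the recurring 16.666… s maximum. A quick check against four episodes from the record above:

```python
# Inferred, not stated in the log: one simulation step appears to be 1/30 s,
# so survival_time == nsteps / 30 and 500 steps caps out at ~16.67 s.
STEP = 1.0 / 30.0

for nsteps, reported in [(244, 8.133333333333315),
                         (316, 10.533333333333308),
                         (500, 16.666666666666654),
                         (138, 4.599999999999993)]:
    assert abs(nsteps * STEP - reported) < 1e-9
```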
No reset possible 8149
1002
Laurent Mandrile Tensorflow template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:53
No reset possible 8148
1002
Laurent Mandrile Tensorflow template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:19
survival_time_median 6.733333333333319
other stats episodes details {"ep000": {"nsteps": 216, "reward": -6.392362615874003, "good_angle": 22.74045157653506, "survival_time": 7.199999999999984, "traveled_tiles": 2, "valid_direction": 5.499999999999989}, "ep001": {"nsteps": 202, "reward": -6.445137121382695, "good_angle": 4.28775818782871, "survival_time": 6.733333333333319, "traveled_tiles": 1, "valid_direction": 5.233333333333322}, "ep002": {"nsteps": 43, "reward": -25.32363423219947, "good_angle": 0.6317146216669308, "survival_time": 1.4333333333333345, "traveled_tiles": 2, "valid_direction": 1.2666666666666673}, "ep003": {"nsteps": 500, "reward": -1.0371820561804344, "good_angle": 9.409300943573458, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 12.533333333333331}, "ep004": {"nsteps": 32, "reward": -32.28967246040702, "good_angle": 0.848158727035083, "survival_time": 1.0666666666666669, "traveled_tiles": 1, "valid_direction": 0.6666666666666666}}
good_angle_max 22.74045157653506 good_angle_mean 7.583476811327849 good_angle_median 4.28775818782871 good_angle_min 0.6317146216669308
reward_max -1.0371820561804344 reward_mean -14.297597697208724 reward_median -6.445137121382695 reward_min -32.28967246040702
survival_time_max 16.666666666666654 survival_time_mean 6.619999999999992 survival_time_min 1.0666666666666669
traveled_tiles_max 3 traveled_tiles_mean 1.8 traveled_tiles_median 2 traveled_tiles_min 1
valid_direction_max 12.533333333333331 valid_direction_mean 5.039999999999996 valid_direction_median 5.233333333333322 valid_direction_min 0.6666666666666666
No reset possible 8140
1001
Ruixiang Zhang 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:08 The result file is not found. This usually means that the evaluator did not finish
and sometimes that there was an import error.
Check the evaluator log to see what happened.
No reset possible 8127
997
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:09:49
driven_lanedir_median 3.475756387262958 deviation-center-line_median 0.49560335850959286 in-drivable-lane_median 0.4999999999999982
other stats deviation-center-line_max 0.6065490419207544 deviation-center-line_mean 0.4244847081266988 deviation-center-line_min 0.14133001541463242
deviation-heading_max 2.0975060124105807 deviation-heading_mean 1.3631473642597385 deviation-heading_median 1.3679922168602072 deviation-heading_min 0.489578171429585
driven_any_max 4.919781794687056 driven_any_mean 3.5985282374288516 driven_any_median 3.7340983940482646 driven_any_min 1.7385862129427283
driven_lanedir_max 4.139898140690089 driven_lanedir_mean 3.043896426628955 driven_lanedir_min 1.3724302605954322
in-drivable-lane_max 1.0666666666666638 in-drivable-lane_mean 0.6666666666666645 in-drivable-lane_min 0.2666666666666657
per-episodes details {"ep000": {"driven_any": 4.454081912764259, "driven_lanedir": 3.5609636276616254, "in-drivable-lane": 1.066666666666663, "deviation-heading": 2.0975060124105807, "deviation-center-line": 0.49560335850959286}, "ep001": {"driven_any": 3.1460928727019506, "driven_lanedir": 2.6704337169346704, "in-drivable-lane": 0.4999999999999982, "deviation-heading": 1.187324127325582, "deviation-center-line": 0.36788207121595706}, "ep002": {"driven_any": 3.7340983940482646, "driven_lanedir": 3.475756387262958, "in-drivable-lane": 0.2666666666666657, "deviation-heading": 1.3679922168602072, "deviation-center-line": 0.5110590535725572}, "ep003": {"driven_any": 4.919781794687056, "driven_lanedir": 4.139898140690089, "in-drivable-lane": 1.0666666666666638, "deviation-heading": 1.6733362932727374, "deviation-center-line": 0.6065490419207544}, "ep004": {"driven_any": 1.7385862129427283, "driven_lanedir": 1.3724302605954322, "in-drivable-lane": 0.4333333333333318, "deviation-heading": 0.489578171429585, "deviation-center-line": 0.14133001541463242}}
No reset possible 8105
995
Manfred Diaz DAgger aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:10:22
No reset possible 8099
993
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:41
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.732358459893614, "good_angle": 0.7240931219756537, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.06666666666666}, "ep001": {"nsteps": 500, "reward": -0.8166965480893851, "good_angle": 4.595518930008325, "survival_time": 16.666666666666654, "traveled_tiles": 10, "valid_direction": 1.633333333333332}, "ep002": {"nsteps": 500, "reward": -0.5397966891722754, "good_angle": 0.6194322536008012, "survival_time": 16.666666666666654, "traveled_tiles": 13, "valid_direction": 1.6333333333333275}, "ep003": {"nsteps": 500, "reward": -0.31135352698806673, "good_angle": 0.5891132255178639, "survival_time": 16.666666666666654, "traveled_tiles": 11, "valid_direction": 1.4666666666666617}, "ep004": {"nsteps": 500, "reward": -0.5582869492536411, "good_angle": 24.80906323546956, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 3.333333333333359}}
good_angle_max 24.80906323546956 good_angle_mean 6.267444153314441 good_angle_median 0.7240931219756537 good_angle_min 0.5891132255178639
reward_max -0.31135352698806673 reward_mean -0.5916984346793965 reward_median -0.5582869492536411 reward_min -0.8166965480893851
survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654
traveled_tiles_max 13 traveled_tiles_mean 11.6 traveled_tiles_median 12 traveled_tiles_min 10
valid_direction_max 3.333333333333359 valid_direction_mean 2.026666666666668 valid_direction_median 1.633333333333332 valid_direction_min 1.4666666666666617
No reset possible 8072
989
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:13:39
No reset possible 8005
978
Vincent Mai 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 1:12:47 Uncaught exception:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 375, in go_
uploaded = upload_files(wd, aws_config)
File "/project/src/duckietown_challenges_runner/runner.py", line 903, in upload_files
uploaded = upload(aws_config, toupload)
File "/project/src/duckietown_challenges_runner/runner.py", line 1067, in upload
aws_object.load()
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/factory.py", line 505, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 320, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/client.py", line 610, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 102, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 136, in _send_request
success_response, exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/endpoint.py", line 210, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 356, in emit
return self._emitter.emit(aliased_event_name, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 228, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/hooks.py", line 211, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 277, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python2.7/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
EndpointConnectionError: Could not connect to the endpoint URL: "https://duckietown-ai-driving-olympics-1.s3.amazonaws.com/v3/frankfurt/by-value/sha256/89549b1a771f44fcc8e35b9c4e5f08cc1b574724cdf8c1b3bcacdb71461f2162"
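The run above aborted because botocore exhausted its built-in retries against the S3 endpoint during artifact upload. A hedged, stdlib-only sketch of the kind of outer retry-with-backoff a runner could wrap around the upload (the function and parameters are illustrative, not the runner's actual code):

```python
import time

def with_backoff(fn, attempts=5, base_delay=1.0, retry_on=(ConnectionError,)):
    """Call fn(), retrying with exponential backoff on the given exceptions;
    re-raise the last exception once the attempt budget is spent."""
    for i in range(attempts):
        try:
            return fn()
        except retry_on:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 1 s, 2 s, 4 s, ...
```

With boto3 one would pass `retry_on=(botocore.exceptions.EndpointConnectionError,)` instead of the generic ConnectionError.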
No reset possible 7988
976
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:12
survival_time_median 4.533333333333327
other stats episodes details {"ep000": {"nsteps": 82, "reward": -12.52588034088996, "good_angle": 0.17952429330929992, "survival_time": 2.7333333333333334, "traveled_tiles": 3, "valid_direction": 0.6666666666666665}, "ep001": {"nsteps": 172, "reward": -6.511317504494083, "good_angle": 5.578101579727321, "survival_time": 5.733333333333323, "traveled_tiles": 3, "valid_direction": 1.7666666666666644}, "ep002": {"nsteps": 107, "reward": -9.778149597277151, "good_angle": 0.03286934148681326, "survival_time": 3.566666666666664, "traveled_tiles": 3, "valid_direction": 0}, "ep003": {"nsteps": 136, "reward": -7.620555359009065, "good_angle": 0.4826043130813186, "survival_time": 4.533333333333327, "traveled_tiles": 3, "valid_direction": 1.1333333333333313}, "ep004": {"nsteps": 162, "reward": -6.687916881631899, "good_angle": 5.260282393500154, "survival_time": 5.399999999999991, "traveled_tiles": 3, "valid_direction": 2.5333333333333243}}
good_angle_max 5.578101579727321 good_angle_mean 2.306676384220981 good_angle_median 0.4826043130813186 good_angle_min 0.03286934148681326
reward_max -6.511317504494083 reward_mean -8.624763936660433 reward_median -7.620555359009065 reward_min -12.52588034088996
survival_time_max 5.733333333333323 survival_time_mean 4.393333333333328 survival_time_min 2.7333333333333334
traveled_tiles_max 3 traveled_tiles_mean 3 traveled_tiles_median 3 traveled_tiles_min 3
valid_direction_max 2.5333333333333243 valid_direction_mean 1.2199999999999973 valid_direction_median 1.1333333333333313 valid_direction_min 0
No reset possible 7962
973
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:43
driven_lanedir_median 0.6505766153109622 deviation-center-line_median 0.16014963851655162 in-drivable-lane_median 0
other stats deviation-center-line_max 0.7603053620165483 deviation-center-line_mean 0.2564637341640809 deviation-center-line_min 0.09210699726460604
deviation-heading_max 0.9642614812629324 deviation-heading_mean 0.3996068763062016 deviation-heading_median 0.1951626949882956 deviation-heading_min 0.08068462255521815
driven_any_max 2.9665349579780176 driven_any_mean 1.03460971155806 driven_any_median 0.6566328495709304 driven_any_min 0.3366328495709288
driven_lanedir_max 2.923355747095626 driven_lanedir_mean 0.9885477295561832 driven_lanedir_min 0.3321162244136633
in-drivable-lane_max 0.3666666666666667 in-drivable-lane_mean 0.07333333333333333 in-drivable-lane_min 0
per-episodes details {"ep000": {"driven_any": 0.8532550200589977, "driven_lanedir": 0.6787448401236269, "in-drivable-lane": 0.3666666666666667, "deviation-heading": 0.6046620566532065, "deviation-center-line": 0.1704259240979422}, "ep001": {"driven_any": 0.3599928806114263, "driven_lanedir": 0.3579452208370377, "in-drivable-lane": 0, "deviation-heading": 0.08068462255521815, "deviation-center-line": 0.09933074892475642}, "ep002": {"driven_any": 0.3366328495709288, "driven_lanedir": 0.3321162244136633, "in-drivable-lane": 0, "deviation-heading": 0.15326352607135502, "deviation-center-line": 0.09210699726460604}, "ep003": {"driven_any": 0.6566328495709304, "driven_lanedir": 0.6505766153109622, "in-drivable-lane": 0, "deviation-heading": 0.1951626949882956, "deviation-center-line": 0.16014963851655162}, "ep004": {"driven_any": 2.9665349579780176, "driven_lanedir": 2.923355747095626, "in-drivable-lane": 0, "deviation-heading": 0.9642614812629324, "deviation-center-line": 0.7603053620165483}}
No reset possible 7929
970
Martin Weiss 🇨🇦PyTorch template aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:17:25 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 105, in run
solve(params, cis)
File "solution.py", line 51, in solve
import model
File "/workspace/model.py", line 107
SyntaxError: Non-ASCII character '\xce' in file /workspace/model.py on line 107, but no encoding declared; see http://python.org/dev/peps/pep-0263/ for details
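Under Python 2, a source file containing non-ASCII bytes must declare its encoding on the first or second line per PEP 263, e.g. `# -*- coding: utf-8 -*-`, which is the fix for the SyntaxError above. A small sketch of the check Python performs, using the recognition regex from PEP 263:

```python
import re

# PEP 263: the encoding declaration must match this pattern on line 1 or 2.
CODING_RE = re.compile(r"^[ \t\f]*#.*?coding[:=][ \t]*([-_.a-zA-Z0-9]+)")

def declared_encoding(line):
    """Return the source encoding declared on this line, or None."""
    m = CODING_RE.match(line)
    return m.group(1) if m else None

print(declared_encoding("# -*- coding: utf-8 -*-"))  # utf-8
```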
No reset possible 7873
962
David Abraham Pytorch IL aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:51:26
No reset possible 7852
956
Jason Chun Lok Li 🇭🇰PyTorch template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:08
driven_lanedir_median -1.05985247177248 deviation-center-line_median 0.32463013910329175 in-drivable-lane_median 8.23333333333332
other stats deviation-center-line_max 1.2229848197846576 deviation-center-line_mean 0.5013585011463242 deviation-center-line_min 0.18421462158296809
deviation-heading_max 6.639256027638261 deviation-heading_mean 6.130316773672168 deviation-heading_median 6.587348625790274 deviation-heading_min 4.253182461720226
driven_any_max 3.317636741607105 driven_any_mean 3.31763674160709 driven_any_median 3.317636741607098 driven_any_min 3.3176367416070653
driven_lanedir_max -0.8727849850798894 driven_lanedir_mean -1.0237415377242507 driven_lanedir_min -1.0652501105207834
in-drivable-lane_max 10.466666666666663 in-drivable-lane_mean 8.686666666666657 in-drivable-lane_min 8.233333333333318
per-episodes details {"ep000": {"driven_any": 3.317636741607105, "driven_lanedir": -0.8727849850798894, "in-drivable-lane": 10.466666666666663, "deviation-heading": 4.253182461720226, "deviation-center-line": 1.2229848197846576}, "ep001": {"driven_any": 3.3176367416070653, "driven_lanedir": -1.0585760658647092, "in-drivable-lane": 8.23333333333332, "deviation-heading": 6.639256027638261, "deviation-center-line": 0.4939995088117161}, "ep002": {"driven_any": 3.317636741607098, "driven_lanedir": -1.0622440553833916, "in-drivable-lane": 8.23333333333332, "deviation-heading": 6.60795448532735, "deviation-center-line": 0.28096341644898776}, "ep003": {"driven_any": 3.317636741607083, "driven_lanedir": -1.05985247177248, "in-drivable-lane": 8.266666666666653, "deviation-heading": 6.563842267884727, "deviation-center-line": 0.18421462158296809}, "ep004": {"driven_any": 3.3176367416071, "driven_lanedir": -1.0652501105207834, "in-drivable-lane": 8.233333333333318, "deviation-heading": 6.587348625790274, "deviation-center-line": 0.32463013910329175}}
No reset possible 7847
956
Jason Chun Lok Li 🇭🇰PyTorch template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:17:43
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.966810409963131, "good_angle": 47.08330822850749, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 14.233333333333322}, "ep001": {"nsteps": 500, "reward": -1.300191262960434, "good_angle": 13.8317262841996, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 13.099999999999993}, "ep002": {"nsteps": 500, "reward": -1.117795492440462, "good_angle": 13.808909540839275, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 12.933333333333326}, "ep003": {"nsteps": 500, "reward": -0.9294596527849426, "good_angle": 13.819737231108473, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 12.96666666666666}, "ep004": {"nsteps": 500, "reward": -1.163820459574461, "good_angle": 13.824014957850544, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 13.09999999999999}}
good_angle_max 47.08330822850749 good_angle_mean 20.473539248501076 good_angle_median 13.824014957850544 good_angle_min 13.808909540839275
reward_max -0.9294596527849426 reward_mean -1.0956154555446862 reward_median -1.117795492440462 reward_min -1.300191262960434
survival_time_max 16.666666666666654 survival_time_mean 16.666666666666654 survival_time_min 16.666666666666654
traveled_tiles_max 2 traveled_tiles_mean 1.2 traveled_tiles_median 1 traveled_tiles_min 1
valid_direction_max 14.233333333333322 valid_direction_mean 13.266666666666657 valid_direction_median 13.09999999999999 valid_direction_min 12.933333333333326
No reset possible 7840
955
Pravish Sainath 🇨🇦PyTorch template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:29:26
driven_lanedir_median -1.05985247177248 deviation-center-line_median 0.32463013910329175 in-drivable-lane_median 8.23333333333332
other stats deviation-center-line_max 1.2229848197846576 deviation-center-line_mean 0.5013585011463242 deviation-center-line_min 0.18421462158296809
deviation-heading_max 6.639256027638261 deviation-heading_mean 6.130316773672168 deviation-heading_median 6.587348625790274 deviation-heading_min 4.253182461720226
driven_any_max 3.317636741607105 driven_any_mean 3.31763674160709 driven_any_median 3.317636741607098 driven_any_min 3.3176367416070653
driven_lanedir_max -0.8727849850798894 driven_lanedir_mean -1.0237415377242507 driven_lanedir_min -1.0652501105207834
in-drivable-lane_max 10.466666666666663 in-drivable-lane_mean 8.686666666666657 in-drivable-lane_min 8.233333333333318
per-episodes details {"ep000": {"driven_any": 3.317636741607105, "driven_lanedir": -0.8727849850798894, "in-drivable-lane": 10.466666666666663, "deviation-heading": 4.253182461720226, "deviation-center-line": 1.2229848197846576}, "ep001": {"driven_any": 3.3176367416070653, "driven_lanedir": -1.0585760658647092, "in-drivable-lane": 8.23333333333332, "deviation-heading": 6.639256027638261, "deviation-center-line": 0.4939995088117161}, "ep002": {"driven_any": 3.317636741607098, "driven_lanedir": -1.0622440553833916, "in-drivable-lane": 8.23333333333332, "deviation-heading": 6.60795448532735, "deviation-center-line": 0.28096341644898776}, "ep003": {"driven_any": 3.317636741607083, "driven_lanedir": -1.05985247177248, "in-drivable-lane": 8.266666666666653, "deviation-heading": 6.563842267884727, "deviation-center-line": 0.18421462158296809}, "ep004": {"driven_any": 3.3176367416071, "driven_lanedir": -1.0652501105207834, "in-drivable-lane": 8.233333333333318, "deviation-heading": 6.587348625790274, "deviation-center-line": 0.32463013910329175}}
No reset possible 7826
952
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:23 Artefacts hidden.
driven_lanedir_median 2.0672675368220776 deviation-center-line_median 0.6566656999765133 in-drivable-lane_median 0.09999999999999964
other stats deviation-center-line_max 1.4643514655555545 deviation-center-line_mean 0.8334349074486465 deviation-center-line_min 0.24338888302510395 deviation-heading_max 3.895204271440406 deviation-heading_mean 2.4961505054632283 deviation-heading_median 2.7219860107949923 deviation-heading_min 1.2930067462546844 driven_any_max 4.506138192977895 driven_any_mean 2.2237786474344547 driven_any_median 2.2046019076161603 driven_any_min 0.5841864345325466 driven_lanedir_max 4.447833623411079 driven_lanedir_mean 2.114176640245193 driven_lanedir_min 0.4026670018218528 in-drivable-lane_max 0.433333333333334 in-drivable-lane_mean 0.14666666666666658 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.5841864345325466, "driven_lanedir": 0.4026670018218528, "in-drivable-lane": 0.433333333333334, "deviation-heading": 1.2930067462546844, "deviation-center-line": 0.24338888302510395}, "ep001": {"driven_any": 2.2046019076161603, "driven_lanedir": 2.0672675368220776, "in-drivable-lane": 0.1999999999999993, "deviation-heading": 3.895204271440406, "deviation-center-line": 1.3143242925087115}, "ep002": {"driven_any": 1.3807368787922434, "driven_lanedir": 1.3045249361196944, "in-drivable-lane": 0, "deviation-heading": 2.7219860107949923, "deviation-center-line": 0.48844419617734985}, "ep003": {"driven_any": 2.4432298232534286, "driven_lanedir": 2.3485901030512597, "in-drivable-lane": 0.09999999999999964, "deviation-heading": 1.8472393554659785, "deviation-center-line": 0.6566656999765133}, "ep004": {"driven_any": 4.506138192977895, "driven_lanedir": 4.447833623411079, "in-drivable-lane": 0, "deviation-heading": 2.723316143360079, "deviation-center-line": 1.4643514655555545}}
No reset possible 7824
953
Claudio Ruch Funky Controller 1 aido1_amod_fleet_size_r1-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:00:49 Artefacts hidden.
other stats efficiency -63.89231136588219 service_quality -32.640257841471076
No reset possible 7817
952
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:10:46 Artefacts hidden.
survival_time_median 8.833333333333313
other stats episodes details {"ep000": {"nsteps": 86, "reward": -11.732459801606574, "good_angle": 0.4010022285327249, "survival_time": 2.8666666666666663, "traveled_tiles": 2, "valid_direction": 1.7333333333333347}, "ep001": {"nsteps": 500, "reward": -0.28499583853548394, "good_angle": 37.66553914189548, "survival_time": 16.666666666666654, "traveled_tiles": 5, "valid_direction": 7.733333333333341}, "ep002": {"nsteps": 215, "reward": -5.1582206549339515, "good_angle": 0.6614523307733964, "survival_time": 7.166666666666651, "traveled_tiles": 3, "valid_direction": 1.7999999999999936}, "ep003": {"nsteps": 265, "reward": -3.694234299309049, "good_angle": 0.18738695412101675, "survival_time": 8.833333333333313, "traveled_tiles": 5, "valid_direction": 0.06666666666666643}, "ep004": {"nsteps": 500, "reward": -0.16612535384879448, "good_angle": 34.84228326204095, "survival_time": 16.666666666666654, "traveled_tiles": 9, "valid_direction": 4.899999999999983}}good_angle_max 37.66553914189548 good_angle_mean 14.751532783472715 good_angle_median 0.6614523307733964 good_angle_min 0.18738695412101675 reward_max -0.16612535384879448 reward_mean -4.207207189646771 reward_median -3.694234299309049 reward_min -11.732459801606574 survival_time_max 16.666666666666654 survival_time_mean 10.439999999999989 survival_time_min 2.8666666666666663 traveled_tiles_max 9 traveled_tiles_mean 4.8 traveled_tiles_median 5 traveled_tiles_min 2 valid_direction_max 7.733333333333341 valid_direction_mean 3.246666666666664 valid_direction_median 1.7999999999999936 valid_direction_min 0.06666666666666643
No reset possible 7803
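The aggregate fields in these records (`*_min`, `*_max`, `*_mean`, `*_median`) are simple reductions over the five entries in the `episodes details` JSON. A minimal sketch of that aggregation, assuming only the episode-dict shape shown above (the values below are abbreviated stand-ins, not the actual record):

```python
import statistics

# Per-episode scores in the shape of the "episodes details" blobs above
# (abbreviated stand-in values, not the actual record data).
episodes = {
    "ep000": {"survival_time": 2.87, "traveled_tiles": 2},
    "ep001": {"survival_time": 16.67, "traveled_tiles": 5},
    "ep002": {"survival_time": 7.17, "traveled_tiles": 3},
    "ep003": {"survival_time": 8.83, "traveled_tiles": 5},
    "ep004": {"survival_time": 16.67, "traveled_tiles": 9},
}

def aggregate(episodes, key):
    """Collapse one per-episode metric into the min/max/mean/median fields."""
    values = [ep[key] for ep in episodes.values()]
    return {
        f"{key}_min": min(values),
        f"{key}_max": max(values),
        f"{key}_mean": statistics.mean(values),
        f"{key}_median": statistics.median(values),
    }

print(aggregate(episodes, "survival_time"))
```

With five episodes the median is always the third-smallest value, which is why each `*_median` in the records equals one of the per-episode entries exactly.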
950
Claudio Ruch Funky Controller 1 aido1_amod_efficiency_r1-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:28 Artefacts hidden.
efficiency -63.228242690318865
other stats fleet_size -1000000000 service_quality -32.553660672580165
No reset possible 7800
949
Claudio Ruch Funky Controller 1 aido1_amod_service_quality_r1-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:45 Artefacts hidden.
No reset possible 7799
948
Claudio Ruch Java template aido1_amod_service_quality_r1-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:25 Artefacts hidden.
service_quality -31.648553085062535
other stats efficiency -61.44853234024771 fleet_size -1000000000
No reset possible 7790
945
Claudio Ruch Python template aido1_amod_fleet_size_r1-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:00:52 Artefacts hidden.
other stats efficiency -63.39357988341456 service_quality -32.53305497085425
No reset possible 7777
942
Claudio Ruch Python template aido1_amod_service_quality_r1-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:20:04 Artefacts hidden.
service_quality -32.316005441692916
other stats efficiency -63.08398176676834 fleet_size -1000000000
No reset possible 7774
941
David Abraham Pytorch IL aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:57 Artefacts hidden.
driven_lanedir_median 0.4524595137794645 deviation-center-line_median 0.12839941939762656 in-drivable-lane_median 0
other stats deviation-center-line_max 0.7983372443672196 deviation-center-line_mean 0.3380569545926784 deviation-center-line_min 0.07408682132034992 deviation-heading_max 5.279562013364543 deviation-heading_mean 2.036813170556792 deviation-heading_median 0.3199694854199965 deviation-heading_min 0.1808021473176744 driven_any_max 9.64812532793395 driven_any_mean 3.716725938501999 driven_any_median 0.7942371126606618 driven_any_min 0.4879643179665967 driven_lanedir_max 4.224767646159132 driven_lanedir_mean 1.4928239510861143 driven_lanedir_min 0.2364698935657361 in-drivable-lane_max 6.1000000000000085 in-drivable-lane_mean 1.4600000000000009 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 6.869135875499666, "driven_lanedir": 2.2877849291825276, "in-drivable-lane": 6.1000000000000085, "deviation-heading": 5.279562013364543, "deviation-center-line": 0.6012225514085052}, "ep001": {"driven_any": 0.7942371126606618, "driven_lanedir": 0.26263777274371103, "in-drivable-lane": 0, "deviation-heading": 0.3199694854199965, "deviation-center-line": 0.08823873646969087}, "ep002": {"driven_any": 0.7841670584491253, "driven_lanedir": 0.4524595137794645, "in-drivable-lane": 0, "deviation-heading": 0.29175076592833826, "deviation-center-line": 0.12839941939762656}, "ep003": {"driven_any": 9.64812532793395, "driven_lanedir": 4.224767646159132, "in-drivable-lane": 1.1999999999999955, "deviation-heading": 4.111981440753408, "deviation-center-line": 0.7983372443672196}, "ep004": {"driven_any": 0.4879643179665967, "driven_lanedir": 0.2364698935657361, "in-drivable-lane": 0, "deviation-heading": 0.1808021473176744, "deviation-center-line": 0.07408682132034992}}
No reset possible 7770
941
David Abraham Pytorch IL aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:43 Artefacts hidden.
survival_time_median 1.5666666666666682
other stats episodes details {"ep000": {"nsteps": 500, "reward": -1.4576184816499007, "good_angle": 40.9939386169918, "survival_time": 16.666666666666654, "traveled_tiles": 4, "valid_direction": 9.933333333333335}, "ep001": {"nsteps": 30, "reward": -33.80661088923613, "good_angle": 0.13794416402598422, "survival_time": 1, "traveled_tiles": 1, "valid_direction": 0.33333333333333326}, "ep002": {"nsteps": 47, "reward": -21.693168340528263, "good_angle": 0.06418811201799242, "survival_time": 1.5666666666666682, "traveled_tiles": 2, "valid_direction": 0.1000000000000002}, "ep003": {"nsteps": 433, "reward": -2.8357827126970148, "good_angle": 8.859400916232572, "survival_time": 14.433333333333293, "traveled_tiles": 7, "valid_direction": 4.466666666666655}, "ep004": {"nsteps": 24, "reward": -42.15544073283672, "good_angle": 0.05680776574138376, "survival_time": 0.7999999999999999, "traveled_tiles": 1, "valid_direction": 0.1333333333333333}}good_angle_max 40.9939386169918 good_angle_mean 10.022455915001943 good_angle_median 0.13794416402598422 good_angle_min 0.05680776574138376 reward_max -1.4576184816499007 reward_mean -20.389724231389607 reward_median -21.693168340528263 reward_min -42.15544073283672 survival_time_max 16.666666666666654 survival_time_mean 6.893333333333322 survival_time_min 0.7999999999999999 traveled_tiles_max 7 traveled_tiles_mean 3 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 9.933333333333335 valid_direction_mean 2.9933333333333314 valid_direction_median 0.33333333333333326 valid_direction_min 0.1000000000000002
No reset possible 7767
940
David Abraham Pytorch IL aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:11:23 Artefacts hidden.
driven_lanedir_median 0.4524595137794645 deviation-center-line_median 0.12839941939762656 in-drivable-lane_median 0
other stats deviation-center-line_max 0.7983372443672196 deviation-center-line_mean 0.3380569545926784 deviation-center-line_min 0.07408682132034992 deviation-heading_max 5.279562013364543 deviation-heading_mean 2.036813170556792 deviation-heading_median 0.3199694854199965 deviation-heading_min 0.1808021473176744 driven_any_max 9.64812532793395 driven_any_mean 3.716725938501999 driven_any_median 0.7942371126606618 driven_any_min 0.4879643179665967 driven_lanedir_max 4.224767646159132 driven_lanedir_mean 1.4928239510861143 driven_lanedir_min 0.2364698935657361 in-drivable-lane_max 6.1000000000000085 in-drivable-lane_mean 1.4600000000000009 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 6.869135875499666, "driven_lanedir": 2.2877849291825276, "in-drivable-lane": 6.1000000000000085, "deviation-heading": 5.279562013364543, "deviation-center-line": 0.6012225514085052}, "ep001": {"driven_any": 0.7942371126606618, "driven_lanedir": 0.26263777274371103, "in-drivable-lane": 0, "deviation-heading": 0.3199694854199965, "deviation-center-line": 0.08823873646969087}, "ep002": {"driven_any": 0.7841670584491253, "driven_lanedir": 0.4524595137794645, "in-drivable-lane": 0, "deviation-heading": 0.29175076592833826, "deviation-center-line": 0.12839941939762656}, "ep003": {"driven_any": 9.64812532793395, "driven_lanedir": 4.224767646159132, "in-drivable-lane": 1.1999999999999955, "deviation-heading": 4.111981440753408, "deviation-center-line": 0.7983372443672196}, "ep004": {"driven_any": 0.4879643179665967, "driven_lanedir": 0.2364698935657361, "in-drivable-lane": 0, "deviation-heading": 0.1808021473176744, "deviation-center-line": 0.07408682132034992}}
No reset possible 7746
938
Hristo Vrigazov 🇧🇬ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:49 Artefacts hidden.
survival_time_median 2.333333333333335
other stats episodes details {"ep000": {"nsteps": 224, "reward": -4.621182094348375, "good_angle": 0.7706784765932334, "survival_time": 7.46666666666665, "traveled_tiles": 2, "valid_direction": 0.8999999999999968}, "ep001": {"nsteps": 92, "reward": -11.484535440478636, "good_angle": 0.19582252450820103, "survival_time": 3.0666666666666655, "traveled_tiles": 1, "valid_direction": 0.3999999999999986}, "ep002": {"nsteps": 53, "reward": -19.31181934138514, "good_angle": 0.45340019768764384, "survival_time": 1.7666666666666688, "traveled_tiles": 2, "valid_direction": 0.4666666666666681}, "ep003": {"nsteps": 57, "reward": -17.88823687298256, "good_angle": 0.6787207341248404, "survival_time": 1.9000000000000024, "traveled_tiles": 1, "valid_direction": 0.7000000000000022}, "ep004": {"nsteps": 70, "reward": -14.757806219799178, "good_angle": 0.4106352195159575, "survival_time": 2.333333333333335, "traveled_tiles": 1, "valid_direction": 0.6333333333333329}}good_angle_max 0.7706784765932334 good_angle_mean 0.5018514304859752 good_angle_median 0.45340019768764384 good_angle_min 0.19582252450820103 reward_max -4.621182094348375 reward_mean -13.612715993798778 reward_median -14.757806219799178 reward_min -19.31181934138514 survival_time_max 7.46666666666665 survival_time_mean 3.306666666666664 survival_time_min 1.7666666666666688 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 0.8999999999999968 valid_direction_mean 0.6199999999999998 valid_direction_median 0.6333333333333329 valid_direction_min 0.3999999999999986
No reset possible 7715
929
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:53 Artefacts hidden.
driven_lanedir_median 0.15452735404053652 deviation-center-line_median 0.2678767736384 in-drivable-lane_median 0.4000000000000008
other stats deviation-center-line_max 0.31082345672843653 deviation-center-line_mean 0.23635870772679252 deviation-center-line_min 0.11648018064560534 deviation-heading_max 2.5372186668491503 deviation-heading_mean 1.6780024160940694 deviation-heading_median 1.729976996227318 deviation-heading_min 0.8733144887328745 driven_any_max 1.9561262793008625 driven_any_mean 1.1234495227446186 driven_any_median 0.9265852125600964 driven_any_min 0.26993116944546164 driven_lanedir_max 0.3939344103988932 driven_lanedir_mean 0.15505554608366856 driven_lanedir_min -0.006124570623144088 in-drivable-lane_max 4.19999999999999 in-drivable-lane_mean 1.5199999999999982 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 1.9561262793008625, "driven_lanedir": 0.3939344103988932, "in-drivable-lane": 2.999999999999999, "deviation-heading": 2.0063800547895867, "deviation-center-line": 0.31082345672843653}, "ep001": {"driven_any": 0.26993116944546164, "driven_lanedir": -0.006124570623144088, "in-drivable-lane": 0, "deviation-heading": 0.8733144887328745, "deviation-center-line": 0.11648018064560534}, "ep002": {"driven_any": 1.8539052586165647, "driven_lanedir": 0.15452735404053652, "in-drivable-lane": 4.19999999999999, "deviation-heading": 2.5372186668491503, "deviation-center-line": 0.2678767736384}, "ep003": {"driven_any": 0.9265852125600964, "driven_lanedir": 0.20110015817026072, "in-drivable-lane": 0, "deviation-heading": 1.243121873871418, "deviation-center-line": 0.2956628607939545}, "ep004": {"driven_any": 0.6106996938001087, "driven_lanedir": 0.031840378431796434, "in-drivable-lane": 0.4000000000000008, "deviation-heading": 1.729976996227318, "deviation-center-line": 0.19095026682756633}}
No reset possible 7706
929
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:30 Artefacts hidden.
No reset possible 7700
928
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:31 Artefacts hidden.
No reset possible 7690
926
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:02:14 Artefacts hidden.
driven_lanedir_median 0.08995210883203586 deviation-center-line_median 0.13999283732729784 in-drivable-lane_median 0
other stats deviation-center-line_max 0.5967702926810171 deviation-center-line_mean 0.22107473955257056 deviation-center-line_min 0.09553931818728711 deviation-heading_max 5.561174620118044 deviation-heading_mean 2.052529886458908 deviation-heading_median 1.2003638965457502 deviation-heading_min 0.8821821364432593 driven_any_max 2.72233971937701 driven_any_mean 0.7572215383759043 driven_any_median 0.25726582444831586 driven_any_min 0.1327951737101693 driven_lanedir_max 0.2016102188459316 driven_lanedir_mean 0.08463395806118029 driven_lanedir_min -0.015412842586705278 in-drivable-lane_max 5.9999999999999805 in-drivable-lane_mean 1.1999999999999962 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 2.72233971937701, "driven_lanedir": 0.2016102188459316, "in-drivable-lane": 5.9999999999999805, "deviation-heading": 5.561174620118044, "deviation-center-line": 0.5967702926810171}, "ep001": {"driven_any": 0.25726582444831586, "driven_lanedir": 0.03969031674007928, "in-drivable-lane": 0, "deviation-heading": 0.8821821364432593, "deviation-center-line": 0.13999283732729784}, "ep002": {"driven_any": 0.2353800698745608, "driven_lanedir": 0.08995210883203586, "in-drivable-lane": 0, "deviation-heading": 1.0206839946950763, "deviation-center-line": 0.15601821040973946}, "ep003": {"driven_any": 0.4383269044694661, "driven_lanedir": 0.10732998847456, "in-drivable-lane": 0, "deviation-heading": 1.5982447844924086, "deviation-center-line": 0.117053039157511}, "ep004": {"driven_any": 0.1327951737101693, "driven_lanedir": -0.015412842586705278, "in-drivable-lane": 0, "deviation-heading": 1.2003638965457502, "deviation-center-line": 0.09553931818728711}}
No reset possible 7609
915
Ruixiang Zhang 🇨🇦stay simple aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:08:15 Artefacts hidden.
No reset possible 7601
914
Manfred Diaz Tensorflow template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:06:11 Artefacts hidden.
No reset possible 7590
912
Manfred Diaz Tensorflow template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:58 Artefacts hidden.
No reset possible 7581
910
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:27 Artefacts hidden.
No reset possible 7571
909
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:01:53 Artefacts hidden.
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.10746971065748948, "good_angle": 1.0001678522849908, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.3999999999999915}, "ep001": {"nsteps": 19, "reward": -53.18326093491755, "good_angle": 0.007962716061123577, "survival_time": 0.6333333333333333, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 500, "reward": -0.1759225153435109, "good_angle": 1.0382389856173804, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 2.966666666666658}, "ep003": {"nsteps": 500, "reward": -0.5756291651261272, "good_angle": 1.7821133008320895, "survival_time": 16.666666666666654, "traveled_tiles": 12, "valid_direction": 4.599999999999984}, "ep004": {"nsteps": 84, "reward": -12.164676775002764, "good_angle": 0.1817097120242578, "survival_time": 2.8, "traveled_tiles": 2, "valid_direction": 0.36666666666666536}}good_angle_max 1.7821133008320895 good_angle_mean 0.8020385133639685 good_angle_median 1.0001678522849908 good_angle_min 0.007962716061123577 reward_max -0.10746971065748948 reward_mean -13.241391820209486 reward_median -0.5756291651261272 reward_min -53.18326093491755 survival_time_max 16.666666666666654 survival_time_mean 10.68666666666666 survival_time_min 0.6333333333333333 traveled_tiles_max 12 traveled_tiles_mean 7.8 traveled_tiles_median 12 traveled_tiles_min 1 valid_direction_max 4.599999999999984 valid_direction_mean 2.06666666666666 valid_direction_median 2.3999999999999915 valid_direction_min 0
No reset possible 7561
908
Pravish Sainath 🇨🇦PyTorch template aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:51 Artefacts hidden.
survival_time_median 6.233333333333321
other stats episodes details {"ep000": {"nsteps": 77, "reward": -13.25760446571104, "good_angle": 1.2102786325198769, "survival_time": 2.5666666666666673, "traveled_tiles": 1, "valid_direction": 2.1333333333333337}, "ep001": {"nsteps": 54, "reward": -19.086375530119295, "good_angle": 0.06435254244870461, "survival_time": 1.8000000000000025, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 226, "reward": -4.901912556033977, "good_angle": 2.5406340061787915, "survival_time": 7.533333333333316, "traveled_tiles": 3, "valid_direction": 4.966666666666653}, "ep003": {"nsteps": 221, "reward": -5.707385117474657, "good_angle": 1.0355929673408153, "survival_time": 7.36666666666665, "traveled_tiles": 3, "valid_direction": 4.999999999999986}, "ep004": {"nsteps": 187, "reward": -6.5791364537432475, "good_angle": 1.5165908226878455, "survival_time": 6.233333333333321, "traveled_tiles": 2, "valid_direction": 5.099999999999987}}good_angle_max 2.5406340061787915 good_angle_mean 1.2734897942352066 good_angle_median 1.2102786325198769 good_angle_min 0.06435254244870461 reward_max -4.901912556033977 reward_mean -9.906482824616443 reward_median -6.5791364537432475 reward_min -19.086375530119295 survival_time_max 7.533333333333316 survival_time_mean 5.099999999999992 survival_time_min 1.8000000000000025 traveled_tiles_max 3 traveled_tiles_mean 2 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 5.099999999999987 valid_direction_mean 3.439999999999992 valid_direction_median 4.966666666666653 valid_direction_min 0
No reset possible 7550
906
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:07:35 Artefacts hidden.
No reset possible 7502
894
Martin Weiss 🇨🇦PyTorch template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:04:44 Artefacts hidden.
driven_lanedir_median -1.05985247177248 deviation-center-line_median 0.32463013910329175 in-drivable-lane_median 8.23333333333332
other stats deviation-center-line_max 1.2229848197846576 deviation-center-line_mean 0.5013585011463242 deviation-center-line_min 0.18421462158296809 deviation-heading_max 6.639256027638261 deviation-heading_mean 6.130316773672168 deviation-heading_median 6.587348625790274 deviation-heading_min 4.253182461720226 driven_any_max 3.317636741607105 driven_any_mean 3.31763674160709 driven_any_median 3.317636741607098 driven_any_min 3.3176367416070653 driven_lanedir_max -0.8727849850798894 driven_lanedir_mean -1.0237415377242507 driven_lanedir_min -1.0652501105207834 in-drivable-lane_max 10.466666666666663 in-drivable-lane_mean 8.686666666666657 in-drivable-lane_min 8.233333333333318 per-episodes details {"ep000": {"driven_any": 3.317636741607105, "driven_lanedir": -0.8727849850798894, "in-drivable-lane": 10.466666666666663, "deviation-heading": 4.253182461720226, "deviation-center-line": 1.2229848197846576}, "ep001": {"driven_any": 3.3176367416070653, "driven_lanedir": -1.0585760658647092, "in-drivable-lane": 8.23333333333332, "deviation-heading": 6.639256027638261, "deviation-center-line": 0.4939995088117161}, "ep002": {"driven_any": 3.317636741607098, "driven_lanedir": -1.0622440553833916, "in-drivable-lane": 8.23333333333332, "deviation-heading": 6.60795448532735, "deviation-center-line": 0.28096341644898776}, "ep003": {"driven_any": 3.317636741607083, "driven_lanedir": -1.05985247177248, "in-drivable-lane": 8.266666666666653, "deviation-heading": 6.563842267884727, "deviation-center-line": 0.18421462158296809}, "ep004": {"driven_any": 3.3176367416071, "driven_lanedir": -1.0652501105207834, "in-drivable-lane": 8.233333333333318, "deviation-heading": 6.587348625790274, "deviation-center-line": 0.32463013910329175}}
No reset possible 7495
895
Martin Weiss 🇨🇦PyTorch template aido1_LFV_r1-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:03:33 InvalidSubmission:
T [...] InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 106, in run
solve(params, cis)
File "solution.py", line 72, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
Artefacts hidden.
No reset possible 7489
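The failure above comes from `_failsafe_observe` in duckietown-slimremote exhausting its connection attempts to the simulation server and re-raising. A generic retry-with-backoff sketch of the pattern that log line implies (the constants and the `send_once` callable are illustrative assumptions, not the actual duckietown-slimremote API):

```python
import time

MAX_RETRIES = 10       # illustrative retry budget, not the real
BACKOFF_SECONDS = 0.1  # duckietown-slimremote configuration

def observe_with_retry(send_once):
    """Call send_once() until it succeeds, sleeping a growing interval
    between attempts; once the budget is spent, give up and raise,
    which is the point the traceback above reached."""
    last_error = None
    for attempt in range(MAX_RETRIES):
        try:
            return send_once()
        except ConnectionError as exc:
            last_error = exc
            time.sleep(BACKOFF_SECONDS * (attempt + 1))
    raise Exception(
        "Giving up to connect to the gym duckietown server"
    ) from last_error
```

In the evaluation setup this exhaustion typically means the `evaluator` container never came up or was unreachable from the solution container, so no amount of client-side retrying can succeed.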
894
Martin Weiss 🇨🇦PyTorch template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 4 months 6 years, 4 months 0:05:26 Artefacts hidden.
No reset possible 7480
893
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:49 Artefacts hidden.
survival_time_median 0.49999999999999994
other stats episodes details {"ep000": {"nsteps": 48, "reward": -22.10871991511279, "good_angle": 3.150220155346793, "survival_time": 1.6000000000000016, "traveled_tiles": 2, "valid_direction": 1.266666666666668}, "ep001": {"nsteps": 14, "reward": -72.85214451168265, "good_angle": 0.4903066366810786, "survival_time": 0.4666666666666666, "traveled_tiles": 1, "valid_direction": 0.4}, "ep002": {"nsteps": 14, "reward": -72.56261728703976, "good_angle": 0.38122320781889096, "survival_time": 0.4666666666666666, "traveled_tiles": 1, "valid_direction": 0.3666666666666666}, "ep003": {"nsteps": 15, "reward": -67.71164356426647, "good_angle": 0.4429239307234649, "survival_time": 0.49999999999999994, "traveled_tiles": 1, "valid_direction": 0.4333333333333333}, "ep004": {"nsteps": 15, "reward": -68.05443244576455, "good_angle": 0.44260828931489266, "survival_time": 0.49999999999999994, "traveled_tiles": 1, "valid_direction": 0.3999999999999999}}good_angle_max 3.150220155346793 good_angle_mean 0.981456443977024 good_angle_median 0.4429239307234649 good_angle_min 0.38122320781889096 reward_max -22.10871991511279 reward_mean -60.65791154477324 reward_median -68.05443244576455 reward_min -72.85214451168265 survival_time_max 1.6000000000000016 survival_time_mean 0.706666666666667 survival_time_min 0.4666666666666666 traveled_tiles_max 2 traveled_tiles_mean 1.2 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.266666666666668 valid_direction_mean 0.5733333333333335 valid_direction_median 0.4 valid_direction_min 0.3666666666666666
No reset possible 7469
890
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:03:15 Artefacts hidden.
No reset possible 7425
881
Jonathan Plante 🇨🇦JP pipeline aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:06:38 Artefacts hidden.
No reset possible 7421
878
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:04:14 Artefacts hidden.
driven_lanedir_median 0.42583954081123343 deviation-center-line_median 0.17320688210284996 in-drivable-lane_median 0.8999999999999968
other stats deviation-center-line_max 0.2228829251353732 deviation-center-line_mean 0.16434571981296003 deviation-center-line_min 0.09093832662212982 deviation-heading_max 1.4075803299755223 deviation-heading_mean 0.9176843383697352 deviation-heading_median 0.8078204297304372 deviation-heading_min 0.5613042426861848 driven_any_max 0.9219548432356148 driven_any_mean 0.7299005046004915 driven_any_median 0.793164018584243 driven_any_min 0.4054401413209636 driven_lanedir_max 0.5721425621400456 driven_lanedir_mean 0.39567261648125274 driven_lanedir_min 0.19465232407579425 in-drivable-lane_max 1.699999999999994 in-drivable-lane_mean 1.0266666666666644 in-drivable-lane_min 0.7333333333333356 per-episodes details {"ep000": {"driven_any": 0.4054401413209636, "driven_lanedir": 0.19465232407579425, "in-drivable-lane": 0.7999999999999999, "deviation-heading": 1.4075803299755223, "deviation-center-line": 0.17320688210284996}, "ep001": {"driven_any": 0.9219548432356148, "driven_lanedir": 0.5721425621400456, "in-drivable-lane": 0.8999999999999968, "deviation-heading": 1.0430727836679292, "deviation-center-line": 0.2228829251353732}, "ep002": {"driven_any": 0.8734300505158462, "driven_lanedir": 0.42583954081123343, "in-drivable-lane": 1.699999999999994, "deviation-heading": 0.7686439057886024, "deviation-center-line": 0.14653797754600437}, "ep003": {"driven_any": 0.793164018584243, "driven_lanedir": 0.4434648915398909, "in-drivable-lane": 0.9999999999999964, "deviation-heading": 0.8078204297304372, "deviation-center-line": 0.18816248765844276}, "ep004": {"driven_any": 0.6555134693457899, "driven_lanedir": 0.3422637638392996, "in-drivable-lane": 0.7333333333333356, "deviation-heading": 0.5613042426861848, "deviation-center-line": 0.09093832662212982}}
No reset possible 7420
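The `_median`/`_mean`/`_min`/`_max` fields in rows like the one above are plain aggregates over the five per-episode values; a minimal sketch reproducing the `driven_lanedir` aggregates from the per-episodes details (values copied from the log above):

```python
import statistics

# driven_lanedir per episode, copied from the per-episodes details above
driven_lanedir = {
    "ep000": 0.19465232407579425,
    "ep001": 0.5721425621400456,
    "ep002": 0.42583954081123343,
    "ep003": 0.4434648915398909,
    "ep004": 0.3422637638392996,
}

values = list(driven_lanedir.values())

# These reproduce driven_lanedir_median / _min / _max as reported in the row
print(statistics.median(values))  # 0.42583954081123343
print(min(values))                # 0.19465232407579425
print(max(values))                # 0.5721425621400456
```

The mean matches the reported `driven_lanedir_mean` (0.39567261648125274) up to floating-point rounding in the last digit.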
878
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:46
No reset possible 7411
878
Mandana Samiei 🇨🇦PyTorch DDPG template aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:05:16
No reset possible 7403
875
Philippe Lacaille Dolores' Awakening aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:02:46
survival_time_median 2.3000000000000016
other stats episodes details {"ep000": {"nsteps": 227, "reward": -4.5744455482416875, "good_angle": 0.7937623130473065, "survival_time": 7.5666666666666496, "traveled_tiles": 2, "valid_direction": 1.2333333333333298}, "ep001": {"nsteps": 99, "reward": -10.711396231193737, "good_angle": 0.19533482846247435, "survival_time": 3.299999999999998, "traveled_tiles": 1, "valid_direction": 0.2666666666666657}, "ep002": {"nsteps": 54, "reward": -18.98695876918457, "good_angle": 0.34666212011073927, "survival_time": 1.8000000000000025, "traveled_tiles": 2, "valid_direction": 0.6666666666666685}, "ep003": {"nsteps": 53, "reward": -19.22507929984691, "good_angle": 0.5550129967406443, "survival_time": 1.7666666666666688, "traveled_tiles": 1, "valid_direction": 0.9000000000000022}, "ep004": {"nsteps": 69, "reward": -14.967612463063087, "good_angle": 0.3386407174472176, "survival_time": 2.3000000000000016, "traveled_tiles": 1, "valid_direction": 0.7}}good_angle_max 0.7937623130473065 good_angle_mean 0.4458825951616764 good_angle_median 0.34666212011073927 good_angle_min 0.19533482846247435 reward_max -4.5744455482416875 reward_mean -13.693098462306 reward_median -14.967612463063087 reward_min -19.22507929984691 survival_time_max 7.5666666666666496 survival_time_mean 3.346666666666664 survival_time_min 1.7666666666666688 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.2333333333333298 valid_direction_mean 0.7533333333333333 valid_direction_median 0.7 valid_direction_min 0.2666666666666657
No reset possible 7388
872
Patrick Pfreundschuh 🇨🇭AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:04:55
No reset possible 7263
844
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:04:26
driven_lanedir_median 1.1314269717058074 deviation-center-line_median 0.5813102795653945 in-drivable-lane_median 0.16666666666666607
other stats deviation-center-line_max 0.8318073362698775 deviation-center-line_mean 0.5282089568497466 deviation-center-line_min 0.25983970914502263 deviation-heading_max 4.16826922233794 deviation-heading_mean 1.6450824421508792 deviation-heading_median 1.0426196322852967 deviation-heading_min 0.2217039640085142 driven_any_max 1.93337991823903 driven_any_mean 1.07539403334284 driven_any_median 1.1802158531236375 driven_any_min 0.2872893822401655 driven_lanedir_max 1.931771955318411 driven_lanedir_mean 0.9542381720624638 driven_lanedir_min 0.15913915801793432 in-drivable-lane_max 0.9999999999999964 in-drivable-lane_mean 0.39999999999999913 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3765819473110307, "driven_lanedir": 0.15913915801793432, "in-drivable-lane": 0.8333333333333329, "deviation-heading": 2.348508487116988, "deviation-center-line": 0.25983970914502263}, "ep001": {"driven_any": 0.2872893822401655, "driven_lanedir": 0.28613778999603623, "in-drivable-lane": 0, "deviation-heading": 0.2217039640085142, "deviation-center-line": 0.2729106554978187}, "ep002": {"driven_any": 1.1802158531236375, "driven_lanedir": 1.1314269717058074, "in-drivable-lane": 0.16666666666666607, "deviation-heading": 1.0426196322852967, "deviation-center-line": 0.69517680377062}, "ep003": {"driven_any": 1.93337991823903, "driven_lanedir": 1.931771955318411, "in-drivable-lane": 0, "deviation-heading": 0.44431090500565745, "deviation-center-line": 0.8318073362698775}, "ep004": {"driven_any": 1.599503065800336, "driven_lanedir": 1.26271498527413, "in-drivable-lane": 0.9999999999999964, "deviation-heading": 4.16826922233794, "deviation-center-line": 0.5813102795653945}}
No reset possible 7254
843
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:02:37
driven_lanedir_median 1.1334002070910218 deviation-center-line_median 0.4769969513039974 in-drivable-lane_median 0.13333333333333286
other stats deviation-center-line_max 1.0445636675840984 deviation-center-line_mean 0.5559782024940929 deviation-center-line_min 0.24270590326204855 deviation-heading_max 3.6560128066760536 deviation-heading_mean 1.578869415829056 deviation-heading_median 1.055493987447958 deviation-heading_min 0.21554163794861217 driven_any_max 1.9333799184818008 driven_any_mean 1.082382171288414 driven_any_median 1.1802158539779026 driven_any_min 0.34164143148619897 driven_lanedir_max 1.9317723933220692 driven_lanedir_mean 0.9620222872803948 driven_lanedir_min 0.145213403619873 in-drivable-lane_max 1.233333333333329 in-drivable-lane_mean 0.41999999999999904 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3610528764251814, "driven_lanedir": 0.145213403619873, "in-drivable-lane": 0.7333333333333333, "deviation-heading": 2.4383237313164408, "deviation-center-line": 0.24270590326204855}, "ep001": {"driven_any": 0.34164143148619897, "driven_lanedir": 0.3407319473159087, "in-drivable-lane": 0, "deviation-heading": 0.21554163794861217, "deviation-center-line": 0.32516568274283286}, "ep002": {"driven_any": 1.1802158539779026, "driven_lanedir": 1.1334002070910218, "in-drivable-lane": 0.13333333333333286, "deviation-heading": 1.055493987447958, "deviation-center-line": 0.6904588075774868}, "ep003": {"driven_any": 1.9333799184818008, "driven_lanedir": 1.9317723933220692, "in-drivable-lane": 0, "deviation-heading": 0.5289749157562148, "deviation-center-line": 1.0445636675840984}, "ep004": {"driven_any": 1.5956207760709862, "driven_lanedir": 1.2589934850531015, "in-drivable-lane": 1.233333333333329, "deviation-heading": 3.6560128066760536, "deviation-center-line": 0.4769969513039974}}
No reset possible 7247
843
Orlando Marquez 🇨🇦Imitating aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:35
survival_time_median 10.199999999999974
other stats episodes details {"ep000": {"nsteps": 95, "reward": -10.781536498239362, "good_angle": 1.49666587060377, "survival_time": 3.166666666666665, "traveled_tiles": 1, "valid_direction": 2.666666666666665}, "ep001": {"nsteps": 90, "reward": -11.679380984107654, "good_angle": 0.01572243865648299, "survival_time": 2.999999999999999, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 306, "reward": -3.838494162982195, "good_angle": 1.1705377735998244, "survival_time": 10.199999999999974, "traveled_tiles": 3, "valid_direction": 2.5999999999999917}, "ep003": {"nsteps": 500, "reward": -0.33937937901914117, "good_angle": 0.3143868799971964, "survival_time": 16.666666666666654, "traveled_tiles": 4, "valid_direction": 1.0333333333333652}, "ep004": {"nsteps": 413, "reward": -3.0206985883762862, "good_angle": 23.68327713673608, "survival_time": 13.766666666666628, "traveled_tiles": 3, "valid_direction": 4.699999999999983}}good_angle_max 23.68327713673608 good_angle_mean 5.33611801991867 good_angle_median 1.1705377735998244 good_angle_min 0.01572243865648299 reward_max -0.33937937901914117 reward_mean -5.931897922544928 reward_median -3.838494162982195 reward_min -11.679380984107654 survival_time_max 16.666666666666654 survival_time_mean 9.359999999999983 survival_time_min 2.999999999999999 traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 4.699999999999983 valid_direction_mean 2.200000000000001 valid_direction_median 2.5999999999999917 valid_direction_min 0
No reset possible 7242
645
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:05:58
driven_lanedir_median 0.9667184269351564 deviation-center-line_median 1.2944816163430288 in-drivable-lane_median 0.4666666666666668
other stats deviation-center-line_max 1.332540552664937 deviation-center-line_mean 0.9557224490437596 deviation-center-line_min 0.3160574297309683 deviation-heading_max 12.275421338609751 deviation-heading_mean 8.530644290489814 deviation-heading_median 12.103540642741873 deviation-heading_min 1.351018203217863 driven_any_max 1.546688395233118 driven_any_mean 1.169644276395163 driven_any_median 1.5466883952331036 driven_any_min 0.5869961981908871 driven_lanedir_max 1.0061254000059208 driven_lanedir_mean 0.694241123541594 driven_lanedir_min 0.11832612271112986 in-drivable-lane_max 4.366666666666661 in-drivable-lane_mean 1.2666666666666655 in-drivable-lane_min 0.3999999999999986 per-episodes details {"ep000": {"driven_any": 0.5869961981908871, "driven_lanedir": 0.11832612271112986, "in-drivable-lane": 4.366666666666661, "deviation-heading": 1.351018203217863, "deviation-center-line": 0.3160574297309683}, "ep001": {"driven_any": 1.5466883952331036, "driven_lanedir": 1.0061254000059208, "in-drivable-lane": 0.3999999999999986, "deviation-heading": 12.103540642741873, "deviation-center-line": 1.31446479721917}, "ep002": {"driven_any": 1.546688395233118, "driven_lanedir": 0.9909278972008222, "in-drivable-lane": 0.4666666666666668, "deviation-heading": 12.26450711077056, "deviation-center-line": 1.332540552664937}, "ep003": {"driven_any": 0.6211599980856, "driven_lanedir": 0.38910777085494086, "in-drivable-lane": 0.4333333333333318, "deviation-heading": 4.6587341571090235, "deviation-center-line": 0.5210678492606929}, "ep004": {"driven_any": 1.5466883952331063, "driven_lanedir": 0.9667184269351564, "in-drivable-lane": 0.6666666666666696, "deviation-heading": 12.275421338609751, "deviation-center-line": 1.2944816163430288}}
No reset possible 7232
644
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:03:29 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 105, in run
solve(params, cis)
File "solution.py", line 70, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/wrappers.py", line 92, in step
ob, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 7207
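The exception above comes from duckietown-slimremote exhausting its connection attempts to the simulation server ("host: evaluator") and re-raising once it gives up. The underlying pattern is a retry loop with backoff; a generic sketch (the function names, attempt counts, and delays are illustrative, not the actual slimremote internals):

```python
import time

def retry(fn, attempts=5, delay=0.5, backoff=2.0):
    """Call fn() until it succeeds, sleeping between failures.

    Re-raises the last exception once all attempts are exhausted,
    which is the point at which _failsafe_observe-style code gives up.
    """
    last_exc = None
    for i in range(attempts):
        try:
            return fn()
        except Exception as exc:  # narrow this to connection errors in real code
            last_exc = exc
            time.sleep(delay * (backoff ** i))
    raise last_exc

# Demo with a flaky function that fails twice, then succeeds.
calls = {"n": 0}
def flaky_connect():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("evaluator not reachable yet")
    return "connected"

print(retry(flaky_connect, attempts=5, delay=0.0))  # connected
```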
656
Aleksandar Petrov 🇨🇭Tuned lane controller - ETHZ baseline extension aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:23 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 7202
661
Gunshi Gupta 🇨🇦Template for ROS Submission aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:02:33
driven_lanedir_median 0.6143632659258544 deviation-center-line_median 0.1696530888106755 in-drivable-lane_median 0.06666666666666643
other stats deviation-center-line_max 0.32128234918088944 deviation-center-line_mean 0.22045201408874524 deviation-center-line_min 0.14256661845141258 deviation-heading_max 1.5036971187541308 deviation-heading_mean 1.1087963362638873 deviation-heading_median 1.219921889431829 deviation-heading_min 0.5535777621631963 driven_any_max 1.1977098976844505 driven_any_mean 0.8004711372814892 driven_any_median 0.7962085036983177 driven_any_min 0.3605925885187385 driven_lanedir_max 1.1165942432319294 driven_lanedir_mean 0.6261714411279169 driven_lanedir_min 0.14502682897285568 in-drivable-lane_max 2.4333333333333247 in-drivable-lane_mean 0.5733333333333316 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3605925885187385, "driven_lanedir": 0.14502682897285568, "in-drivable-lane": 0.36666666666666664, "deviation-heading": 1.5036971187541308, "deviation-center-line": 0.14256661845141258}, "ep001": {"driven_any": 1.14948586418305, "driven_lanedir": 0.6143632659258544, "in-drivable-lane": 2.4333333333333247, "deviation-heading": 1.219921889431829, "deviation-center-line": 0.30535797205584114}, "ep002": {"driven_any": 1.1977098976844505, "driven_lanedir": 1.1165942432319294, "in-drivable-lane": 0.06666666666666643, "deviation-heading": 1.5021668594719586, "deviation-center-line": 0.32128234918088944}, "ep003": {"driven_any": 0.4983588323228892, "driven_lanedir": 0.4824290887298672, "in-drivable-lane": 0, "deviation-heading": 0.5535777621631963, "deviation-center-line": 0.1696530888106755}, "ep004": {"driven_any": 0.7962085036983177, "driven_lanedir": 0.7724437787790781, "in-drivable-lane": 0, "deviation-heading": 0.7646180514983221, "deviation-center-line": 0.16340004194490756}}
No reset possible 7178
680
Vadim Volodin 🇷🇺Random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:07
driven_lanedir_median 1.1155040627244637 deviation-center-line_median 0.2883461198494968 in-drivable-lane_median 0.33333333333333215
other stats deviation-center-line_max 0.7218678022193264 deviation-center-line_mean 0.3875631456377959 deviation-center-line_min 0.2059316683984905 deviation-heading_max 3.30619748437322 deviation-heading_mean 1.9255160897240635 deviation-heading_median 2.070149899739153 deviation-heading_min 0.18509804461505103 driven_any_max 2.1264195915065853 driven_any_mean 1.1142573292677826 driven_any_median 1.1789506284030202 driven_any_min 0.3081167020120333 driven_lanedir_max 1.9545574115518585 driven_lanedir_mean 0.956702392065244 driven_lanedir_min 0.14145282038560492 in-drivable-lane_max 0.8999999999999968 in-drivable-lane_mean 0.36666666666666575 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.35842785223892915, "driven_lanedir": 0.14145282038560492, "in-drivable-lane": 0.5666666666666667, "deviation-heading": 2.070149899739153, "deviation-center-line": 0.2059316683984905}, "ep001": {"driven_any": 0.3081167020120333, "driven_lanedir": 0.3070081902941335, "in-drivable-lane": 0, "deviation-heading": 0.18509804461505103, "deviation-center-line": 0.2423240970049194}, "ep002": {"driven_any": 1.1789506284030202, "driven_lanedir": 1.1155040627244637, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 1.4982707078030315, "deviation-center-line": 0.2883461198494968}, "ep003": {"driven_any": 2.1264195915065853, "driven_lanedir": 1.9545574115518585, "in-drivable-lane": 0.33333333333333215, "deviation-heading": 2.5678643120898617, "deviation-center-line": 0.7218678022193264}, "ep004": {"driven_any": 1.5993718721783456, "driven_lanedir": 1.2649894753701587, "in-drivable-lane": 0.8999999999999968, "deviation-heading": 3.30619748437322, "deviation-center-line": 0.4793460407167462}}
No reset possible 7170
680
Vadim Volodin 🇷🇺Random execution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:38
No reset possible 7161
680
Vadim Volodin 🇷🇺Random execution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:45
survival_time_median 8.433333333333314
other stats episodes details {"ep000": {"nsteps": 78, "reward": -13.084314118139446, "good_angle": 1.2804002968450447, "survival_time": 2.6000000000000005, "traveled_tiles": 1, "valid_direction": 2.2}, "ep001": {"nsteps": 68, "reward": -15.26583971302299, "good_angle": 0.016335064193729276, "survival_time": 2.2666666666666684, "traveled_tiles": 1, "valid_direction": 0}, "ep002": {"nsteps": 253, "reward": -4.2296746153273554, "good_angle": 1.0508279689712432, "survival_time": 8.433333333333314, "traveled_tiles": 3, "valid_direction": 1.9999999999999936}, "ep003": {"nsteps": 458, "reward": -2.1569371363595455, "good_angle": 1.1979596767246576, "survival_time": 15.266666666666625, "traveled_tiles": 4, "valid_direction": 2.033333333333326}, "ep004": {"nsteps": 344, "reward": -3.569009223936138, "good_angle": 19.327802267629785, "survival_time": 11.466666666666637, "traveled_tiles": 3, "valid_direction": 3.9666666666666535}}good_angle_max 19.327802267629785 good_angle_mean 4.574665054872892 good_angle_median 1.1979596767246576 good_angle_min 0.016335064193729276 reward_max -2.1569371363595455 reward_mean -7.661154961357094 reward_median -4.2296746153273554 reward_min -15.26583971302299 survival_time_max 15.266666666666625 survival_time_mean 8.00666666666665 survival_time_min 2.2666666666666684 traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 3.9666666666666535 valid_direction_mean 2.0399999999999947 valid_direction_median 2.033333333333326 valid_direction_min 0
No reset possible 7128
686
Anna Tsalapova 🇷🇺ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:08:05 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "./solution.py", line 93, in run
raise InvalidSubmission(str(e))
InvalidSubmission: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 7122
691
Mandana Samiei 🇨🇦ROS-based Lane Following aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:07
survival_time_median 2.333333333333335
other stats episodes details {"ep000": {"nsteps": 223, "reward": -4.66185992484549, "good_angle": 0.8040903052381936, "survival_time": 7.433333333333317, "traveled_tiles": 2, "valid_direction": 1.7333333333333272}, "ep001": {"nsteps": 90, "reward": -11.73474597831567, "good_angle": 0.16196488977725604, "survival_time": 2.999999999999999, "traveled_tiles": 1, "valid_direction": 0.2666666666666657}, "ep002": {"nsteps": 46, "reward": -22.231552162896033, "good_angle": 0.4525401484482057, "survival_time": 1.5333333333333348, "traveled_tiles": 2, "valid_direction": 0.6333333333333349}, "ep003": {"nsteps": 54, "reward": -18.891673464466024, "good_angle": 0.6012986772790297, "survival_time": 1.8000000000000025, "traveled_tiles": 1, "valid_direction": 0.866666666666669}, "ep004": {"nsteps": 70, "reward": -14.749368751474789, "good_angle": 0.3567665824374326, "survival_time": 2.333333333333335, "traveled_tiles": 1, "valid_direction": 0.6333333333333329}}good_angle_max 0.8040903052381936 good_angle_mean 0.47533212063602354 good_angle_median 0.4525401484482057 good_angle_min 0.16196488977725604 reward_max -4.66185992484549 reward_mean -14.453840056399605 reward_median -14.749368751474789 reward_min -22.231552162896033 survival_time_max 7.433333333333317 survival_time_mean 3.2199999999999975 survival_time_min 1.5333333333333348 traveled_tiles_max 2 traveled_tiles_mean 1.4 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.7333333333333272 valid_direction_mean 0.8266666666666659 valid_direction_median 0.6333333333333349 valid_direction_min 0.2666666666666657
No reset possible 7103
690
Yun Chen 🇨🇦Random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:03:45
driven_lanedir_median 1.1274329756269736 deviation-center-line_median 0.5379419385170994 in-drivable-lane_median 0.1999999999999993
other stats deviation-center-line_max 0.7866919501416939 deviation-center-line_mean 0.5221867754294346 deviation-center-line_min 0.20030306300319184 deviation-heading_max 4.575522499322844 deviation-heading_mean 2.1502891150598824 deviation-heading_median 2.0252916274164585 deviation-heading_min 0.5009392272420599 driven_any_max 1.5935100272528435 driven_any_mean 1.1054062457692493 driven_any_median 1.1792163148608197 driven_any_min 0.3569087829948569 driven_lanedir_max 1.328316271004079 driven_lanedir_mean 0.9100333412535176 driven_lanedir_min 0.1405065035073686 in-drivable-lane_max 1.1333333333333293 in-drivable-lane_mean 0.4266666666666656 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3569087829948569, "driven_lanedir": 0.1405065035073686, "in-drivable-lane": 0.6, "deviation-heading": 2.0252916274164585, "deviation-center-line": 0.20030306300319184}, "ep001": {"driven_any": 1.0663142904278304, "driven_lanedir": 0.6965180617680535, "in-drivable-lane": 0.1999999999999993, "deviation-heading": 4.575522499322844, "deviation-center-line": 0.7866919501416939}, "ep002": {"driven_any": 1.1792163148608197, "driven_lanedir": 1.1274329756269736, "in-drivable-lane": 0.1999999999999993, "deviation-heading": 0.8866296549330918, "deviation-center-line": 0.6320493719595913}, "ep003": {"driven_any": 1.3310818133098965, "driven_lanedir": 1.328316271004079, "in-drivable-lane": 0, "deviation-heading": 0.5009392272420599, "deviation-center-line": 0.5379419385170994}, "ep004": {"driven_any": 1.5935100272528435, "driven_lanedir": 1.2573928943611123, "in-drivable-lane": 1.1333333333333293, "deviation-heading": 2.7630625663849586, "deviation-center-line": 0.45394755352559646}}
No reset possible 7091
690
Yun Chen 🇨🇦Random execution aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:02:47
No reset possible 7088
690
Yun Chen 🇨🇦Random execution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:34
survival_time_median 8.533333333333314
other stats episodes details {"ep000": {"nsteps": 78, "reward": -13.0784481679364, "good_angle": 1.261470067709927, "survival_time": 2.6000000000000005, "traveled_tiles": 1, "valid_direction": 2.1333333333333337}, "ep001": {"nsteps": 230, "reward": -5.458967429075552, "good_angle": 15.733903129358277, "survival_time": 7.666666666666649, "traveled_tiles": 2, "valid_direction": 3.9333333333333194}, "ep002": {"nsteps": 256, "reward": -4.529157752578612, "good_angle": 0.9684816396646848, "survival_time": 8.533333333333314, "traveled_tiles": 3, "valid_direction": 2.2333333333333263}, "ep003": {"nsteps": 286, "reward": -3.7468779909756633, "good_angle": 0.04049942237003909, "survival_time": 9.53333333333331, "traveled_tiles": 3, "valid_direction": 0}, "ep004": {"nsteps": 343, "reward": -3.525950401962946, "good_angle": 19.13283315301782, "survival_time": 11.433333333333303, "traveled_tiles": 3, "valid_direction": 3.899999999999987}}good_angle_max 19.13283315301782 good_angle_mean 7.427437482424151 good_angle_median 1.261470067709927 good_angle_min 0.04049942237003909 reward_max -3.525950401962946 reward_mean -6.067880348505834 reward_median -4.529157752578612 reward_min -13.0784481679364 survival_time_max 11.433333333333303 survival_time_mean 7.953333333333315 survival_time_min 2.6000000000000005 traveled_tiles_max 3 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 3.9333333333333194 valid_direction_mean 2.4399999999999933 valid_direction_median 2.2333333333333263 valid_direction_min 0
No reset possible 7079
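Across these step2-scoring rows, survival_time tracks nsteps at what looks like a 30 Hz simulation step (78 steps → 2.6 s, 256 steps → 8.53 s, 500 steps → the 16.67 s cap seen elsewhere). A quick consistency check under that assumption (the 30 Hz rate is inferred from the numbers, not stated in the log):

```python
# (nsteps, survival_time) pairs copied from the episodes above
episodes = [
    (78, 2.6000000000000005),
    (230, 7.666666666666649),
    (256, 8.533333333333314),
    (286, 9.53333333333331),
    (343, 11.433333333333303),
]

STEP_HZ = 30  # inferred simulation step rate

for nsteps, survival in episodes:
    # tiny drift is expected: survival_time accumulates 1/30 per step
    assert abs(nsteps / STEP_HZ - survival) < 1e-9
print("all episodes consistent with a 30 Hz step")
```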
690
Yun Chen 🇨🇦Random execution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:03:06
No reset possible 7044
710
Benjamin Ramtoula 🇨🇦My modified ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:07:40 Uncaught exception w [...] Uncaught exception while running Docker Compose:
Traceback (most recent call last):
File "/project/src/duckietown_challenges_runner/runner.py", line 580, in run_one
container = client.containers.get(container_id)
File "/usr/local/lib/python2.7/dist-packages/docker/models/containers.py", line 843, in get
resp = self.client.api.inspect_container(container_id)
File "/usr/local/lib/python2.7/dist-packages/docker/utils/decorators.py", line 19, in wrapped
return f(self, resource_id, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/docker/api/container.py", line 730, in inspect_container
self._get(self._url("/containers/{0}/json", container)), True
File "/usr/local/lib/python2.7/dist-packages/docker/utils/decorators.py", line 46, in inner
return f(self, *args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/docker/api/client.py", line 198, in _get
return self.get(url, **self._set_request_timeout(kwargs))
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 537, in get
return self.request('GET', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 524, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 637, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 529, in send
raise ReadTimeout(e, request=request)
ReadTimeout: UnixHTTPConnectionPool(host='localhost', port=None): Read timed out. (read timeout=60)
No reset possible 7019
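The ReadTimeout above is the Docker SDK's default 60-second HTTP timeout (visible as `read timeout=60`) expiring while the runner inspected a container during a Docker Compose run. With the compose-v1-era tooling in these logs, one common mitigation is raising the timeout via environment variables before the runner spawns compose; a sketch (the 300-second value is an arbitrary example, and whether this runner honors these variables is an assumption):

```python
import os

# Raise the HTTP timeout (in seconds) read by docker-compose v1 and
# docker-py before the evaluation runner launches them as subprocesses.
os.environ["COMPOSE_HTTP_TIMEOUT"] = "300"
os.environ["DOCKER_CLIENT_TIMEOUT"] = "300"
```

The equivalent from a shell would be exporting the same variables before invoking the runner.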
731
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:07:49
driven_lanedir_median 1.9305905924147009 deviation-center-line_median 1.3166245673767798 in-drivable-lane_median 5.83333333333332
other stats deviation-center-line_max 2.190052040727978 deviation-center-line_mean 1.537467072921491 deviation-center-line_min 1.1972677840087966 deviation-heading_max 3.085153380789145 deviation-heading_mean 2.0259398616459228 deviation-heading_median 2.1311029620614073 deviation-heading_min 0.875548107183915 driven_any_max 2.8759858176062045 driven_any_mean 2.639942350082019 driven_any_median 2.5716139779039615 driven_any_min 2.410110552755829 driven_lanedir_max 2.7963784567630405 driven_lanedir_mean 1.9202196748323077 driven_lanedir_min 1.429315746745775 in-drivable-lane_max 9.200000000000005 in-drivable-lane_mean 5.126666666666665 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 2.8759858176062045, "driven_lanedir": 2.0120182351915763, "in-drivable-lane": 5.83333333333332, "deviation-heading": 1.669274201530968, "deviation-center-line": 1.7208338073879272}, "ep001": {"driven_any": 2.5716139779039615, "driven_lanedir": 1.9305905924147009, "in-drivable-lane": 3.7666666666666537, "deviation-heading": 3.085153380789145, "deviation-center-line": 1.2625571651059724}, "ep002": {"driven_any": 2.410110552755829, "driven_lanedir": 1.4327953430464448, "in-drivable-lane": 6.8333333333333455, "deviation-heading": 2.368620656664179, "deviation-center-line": 1.3166245673767798}, "ep003": {"driven_any": 2.5094972759239234, "driven_lanedir": 1.429315746745775, "in-drivable-lane": 9.200000000000005, "deviation-heading": 0.875548107183915, "deviation-center-line": 1.1972677840087966}, "ep004": {"driven_any": 2.8325041262201736, "driven_lanedir": 2.7963784567630405, "in-drivable-lane": 0, "deviation-heading": 2.1311029620614073, "deviation-center-line": 2.190052040727978}}
No reset possible 7010
724
Benjamin Ramtoula 🇨🇦My modified ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:59
driven_lanedir_median 0.26327185463323577 deviation-center-line_median 0.09407302276624312 in-drivable-lane_median 0
other stats deviation-center-line_max 0.11264272347850494 deviation-center-line_mean 0.09356708894403554 deviation-center-line_min 0.062411895189137014 deviation-heading_max 0.656215869753844 deviation-heading_mean 0.38930965367368103 deviation-heading_median 0.4302528527612178 deviation-heading_min 0.07207946314592673 driven_any_max 0.9551132045854264 driven_any_mean 0.3953301005147894 driven_any_median 0.27315525904639104 driven_any_min 0.16425806757193345 driven_lanedir_max 0.7212299505938224 driven_lanedir_mean 0.338215253928598 driven_lanedir_min 0.14595676933386947 in-drivable-lane_max 0.5 in-drivable-lane_mean 0.1 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.9551132045854264, "driven_lanedir": 0.7212299505938224, "in-drivable-lane": 0.5, "deviation-heading": 0.4302528527612178, "deviation-center-line": 0.11214971718523511}, "ep001": {"driven_any": 0.3813080450684302, "driven_lanedir": 0.3780795107095576, "in-drivable-lane": 0, "deviation-heading": 0.07207946314592673, "deviation-center-line": 0.062411895189137014}, "ep002": {"driven_any": 0.16425806757193345, "driven_lanedir": 0.14595676933386947, "in-drivable-lane": 0, "deviation-heading": 0.48955418212725754, "deviation-center-line": 0.08655808610105742}, "ep003": {"driven_any": 0.20281592630176587, "driven_lanedir": 0.1825381843725049, "in-drivable-lane": 0, "deviation-heading": 0.656215869753844, "deviation-center-line": 0.11264272347850494}, "ep004": {"driven_any": 0.27315525904639104, "driven_lanedir": 0.26327185463323577, "in-drivable-lane": 0, "deviation-heading": 0.29844590058015913, "deviation-center-line": 0.09407302276624312}}
No reset possible 7000
724
Benjamin Ramtoula 🇨🇦My modified ROS-based Lane Following aido1_LF1_r3-v3
step3-videos success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:41
No reset possible 6998
727
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:44
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 191, "reward": -6.196011624255106, "good_angle": 4.428420066160578, "survival_time": 6.366666666666654, "traveled_tiles": 2, "valid_direction": 3.63333333333332}, "ep001": {"nsteps": 500, "reward": -0.2818664828536566, "good_angle": 13.147421445314093, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 12.8}, "ep002": {"nsteps": 500, "reward": -0.28251015127880963, "good_angle": 13.53247134063488, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 12.966666666666663}, "ep003": {"nsteps": 202, "reward": -5.274311137265793, "good_angle": 5.546111295205837, "survival_time": 6.733333333333319, "traveled_tiles": 1, "valid_direction": 5.266666666666655}, "ep004": {"nsteps": 500, "reward": -0.4299322907009628, "good_angle": 17.066966744151905, "survival_time": 16.666666666666654, "traveled_tiles": 2, "valid_direction": 13.066666666666665}}good_angle_max 17.066966744151905 good_angle_mean 10.74427817829346 good_angle_median 13.147421445314093 good_angle_min 4.428420066160578 reward_max -0.2818664828536566 reward_mean -2.4929263372708657 reward_median -0.4299322907009628 reward_min -6.196011624255106 survival_time_max 16.666666666666654 survival_time_mean 12.619999999999989 survival_time_min 6.366666666666654 traveled_tiles_max 2 traveled_tiles_mean 1.8 traveled_tiles_median 2 traveled_tiles_min 1 valid_direction_max 13.066666666666665 valid_direction_mean 9.546666666666662 valid_direction_median 12.8 valid_direction_min 3.63333333333332
No reset possible 6988
727
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:05:45
No reset possible 6987
728
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:32 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 116, in run
solve(params, cis)
File "solution.py", line 75, in solve
cis.info("OBS", str(observation.shape), str(observation.min()), str(observation.max()))
TypeError: info() takes exactly 2 arguments (5 given)
No reset possible 6956
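The TypeError in the record above ("info() takes exactly 2 arguments (5 given)") indicates that the evaluator's `cis.info()` accepts a single message string (self plus one argument), while the solution passed four. A minimal sketch of the fix, assuming only what the traceback shows; `describe_observation` is an illustrative helper, not part of the challenge API:

```python
import numpy as np

def describe_observation(observation):
    """Collapse the stats into the single string that cis.info() accepts."""
    return "OBS shape=%s min=%s max=%s" % (
        observation.shape, observation.min(), observation.max())

# Stand-in camera frame; the real one comes from the simulator.
observation = np.zeros((120, 160, 3), dtype=np.uint8)
msg = describe_observation(observation)
# cis.info(msg)  # one message argument, matching "takes exactly 2 arguments"
```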
749
Bhairav Mehta Tuned lane controller - ETHZ baseline extension aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:36 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 477, in wrap_evaluator
cie.wait_for_solution()
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 270, in wait_for_solution
raise InvalidSubmission(msg)
InvalidSubmission: Time out: Timeout of 600 while waiting for /challenge-solution-output/output-solution.yaml.
No reset possible 6938
756
Martin Shin PyTorch template aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:04:53
driven_lanedir_median 0.5757069480630213 deviation-center-line_median 0.3322557836406373 in-drivable-lane_median 3.2999999999999883
other stats deviation-center-line_max 0.5786941905407607 deviation-center-line_mean 0.3377442083615078 deviation-center-line_min 0.18855547367358297 deviation-heading_max 3.758632683038244 deviation-heading_mean 1.9811875966600605 deviation-heading_median 2.097307918082255 deviation-heading_min 0.9623147226295942 driven_any_max 1.2584590019330784 driven_any_mean 1.004524145951728 driven_any_median 1.11910045538496 driven_any_min 0.3397751108123201 driven_lanedir_max 0.7521477226218158 driven_lanedir_mean 0.5272163479401082 driven_lanedir_min 0.10656125392172688 in-drivable-lane_max 4.366666666666651 in-drivable-lane_mean 2.4799999999999915 in-drivable-lane_min 0.5333333333333314 per-episodes details {"ep000": {"driven_any": 0.3397751108123201, "driven_lanedir": 0.10656125392172688, "in-drivable-lane": 0.7666666666666666, "deviation-heading": 2.097307918082255, "deviation-center-line": 0.18855547367358297}, "ep001": {"driven_any": 1.11910045538496, "driven_lanedir": 0.7521477226218158, "in-drivable-lane": 0.5333333333333314, "deviation-heading": 3.758632683038244, "deviation-center-line": 0.5786941905407607}, "ep002": {"driven_any": 1.2584590019330784, "driven_lanedir": 0.6742449772647152, "in-drivable-lane": 3.2999999999999883, "deviation-heading": 2.119646741028135, "deviation-center-line": 0.35989700221751453}, "ep003": {"driven_any": 1.253692855103408, "driven_lanedir": 0.5757069480630213, "in-drivable-lane": 4.366666666666651, "deviation-heading": 0.9680359185220728, "deviation-center-line": 0.3322557836406373}, "ep004": {"driven_any": 1.0515933065248737, "driven_lanedir": 0.5274208378292615, "in-drivable-lane": 3.433333333333321, "deviation-heading": 0.9623147226295942, "deviation-center-line": 0.22931859173504385}}
No reset possible 6922
757
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:05:14
No reset possible 6917
762
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:45
survival_time_median 16.666666666666654
other stats episodes details {"ep000": {"nsteps": 500, "reward": -0.21208607520758233, "good_angle": 1.6015674010805028, "survival_time": 16.666666666666654, "traveled_tiles": 4, "valid_direction": 2.0333333333333616}, "ep001": {"nsteps": 292, "reward": -5.508039669529216, "good_angle": 7.368203871756997, "survival_time": 9.73333333333331, "traveled_tiles": 1, "valid_direction": 6.499999999999983}, "ep002": {"nsteps": 500, "reward": -1.4502935718223453, "good_angle": 44.240695230710614, "survival_time": 16.666666666666654, "traveled_tiles": 3, "valid_direction": 8.600000000000005}, "ep003": {"nsteps": 491, "reward": -2.5160094610967363, "good_angle": 0.08127785185071255, "survival_time": 16.36666666666664, "traveled_tiles": 3, "valid_direction": 0}, "ep004": {"nsteps": 500, "reward": -0.45074015378952026, "good_angle": 0.08216956005044482, "survival_time": 16.666666666666654, "traveled_tiles": 1, "valid_direction": 0}}good_angle_max 44.240695230710614 good_angle_mean 10.674782783089857 good_angle_median 1.6015674010805028 good_angle_min 0.08127785185071255 reward_max -0.21208607520758233 reward_mean -2.02743378628908 reward_median -1.4502935718223453 reward_min -5.508039669529216 survival_time_max 16.666666666666654 survival_time_mean 15.21999999999998 survival_time_min 9.73333333333331 traveled_tiles_max 4 traveled_tiles_mean 2.4 traveled_tiles_median 3 traveled_tiles_min 1 valid_direction_max 8.600000000000005 valid_direction_mean 3.4266666666666703 valid_direction_median 2.0333333333333616 valid_direction_min 0
No reset possible 6900
762
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:04:40
No reset possible 6892
771
Maximilien Picquet AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:24 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/notebooks/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 64, in run
solve(gym_environment, cis) # let's try to solve the challenge, exciting ah?
File "solution.py", line 39, in solve
observation, reward, done, info = env.step(action)
File "/usr/local/lib/python2.7/dist-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/notebooks/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 105, in step
assert len(action) == 2
AssertionError
No reset possible 6869
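The AssertionError in the record above comes from the simulator's check that `len(action) == 2`, i.e. `env.step()` expects a two-element action vector. A minimal sketch of a guard that builds such an action, assuming a wheel-command pair clipped to [-1, 1]; `make_action` and the limit are illustrative, not the challenge API:

```python
def _clamp(value, limit):
    """Clip a single command into [-limit, limit]."""
    return max(-limit, min(limit, value))

def make_action(left, right, limit=1.0):
    """Return a length-2 wheel-command tuple the env's assertion accepts."""
    action = (_clamp(left, limit), _clamp(right, limit))
    assert len(action) == 2  # mirrors the check in simplesimagent_env.step()
    return action

action = make_action(0.5, 1.7)  # -> (0.5, 1.0)
# env.step(action) would now pass the length-2 assertion
```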
777
Yannick Berdou AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:03:40 Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_median 1.1244940250770317 deviation-center-line_median 0.1702985030340966 in-drivable-lane_median 0.033333333333333215
other stats deviation-center-line_max 0.2291035934160819 deviation-center-line_mean 0.15104091265992872 deviation-center-line_min 0.07084844041760953 deviation-heading_max 1.2760765640740437 deviation-heading_mean 0.5466497354728592 deviation-heading_median 0.35171107009370006 deviation-heading_min 0.06092138286581164 driven_any_max 2.1066666666666567 driven_any_mean 1.1093333333333295 driven_any_median 1.1599999999999824 driven_any_min 0.3466666666666718 driven_lanedir_max 2.066698089009994 driven_lanedir_mean 0.9871748479623343 driven_lanedir_min 0.14089729353622693 in-drivable-lane_max 0.2666666666666657 in-drivable-lane_mean 0.09999999999999978 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.3466666666666718, "driven_lanedir": 0.14089729353622693, "in-drivable-lane": 0.2, "deviation-heading": 0.7098822712947942, "deviation-center-line": 0.07084844041760953}, "ep001": {"driven_any": 0.34666666666667245, "driven_lanedir": 0.3458718774424743, "in-drivable-lane": 0, "deviation-heading": 0.06092138286581164, "deviation-center-line": 0.09897289157693905}, "ep002": {"driven_any": 1.1599999999999824, "driven_lanedir": 1.1244940250770317, "in-drivable-lane": 0, "deviation-heading": 0.35171107009370006, "deviation-center-line": 0.1702985030340966}, "ep003": {"driven_any": 2.1066666666666567, "driven_lanedir": 2.066698089009994, "in-drivable-lane": 0.033333333333333215, "deviation-heading": 0.33465738903594666, "deviation-center-line": 0.2291035934160819}, "ep004": {"driven_any": 1.5866666666666656, "driven_lanedir": 1.2579129547459444, "in-drivable-lane": 0.2666666666666657, "deviation-heading": 1.2760765640740437, "deviation-center-line": 0.1859811348549165}}
No reset possible 6825
783
Cliff Li AMOD18-AIDO not that random execution aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:04 InvalidEvaluator:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 488, in wrap_evaluator
evaluator.score(cie)
File "eval.py", line 97, in score
raise dc.InvalidEvaluator(msg)
InvalidEvaluator: Gym exited with code 2
No reset possible 6794
803
Maxim Kuzmin 🇷🇺ROS-based Lane Following aido1_LF1_r3-v3
step1-simulation error no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:51 Timeout:
Waited 602.598948002 for container to finish. Giving up.
No reset possible 6771
813
Benjamin Ramtoula 🇨🇦My modified ROS-based Lane Following aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:04:54
driven_lanedir_median 0.4581700716704997 deviation-center-line_median 0.0558069921312707 in-drivable-lane_median 1.866666666666664
other stats deviation-center-line_max 1.345711244430005 deviation-center-line_mean 0.4528271042007091 deviation-center-line_min 0.03414661293687591 deviation-heading_max 7.004635498246675 deviation-heading_mean 2.969049697810688 deviation-heading_median 1.8526107172410204 deviation-heading_min 0.25605774154674665 driven_any_max 22.43435645963331 driven_any_mean 10.49236217375566 driven_any_median 5.111829606041431 driven_any_min 1.2348786521426087 driven_lanedir_max 0.869426827593196 driven_lanedir_mean 0.4450534095475008 driven_lanedir_min 0.05303343826643836 in-drivable-lane_max 9.633333333333326 in-drivable-lane_mean 3.97333333333333 in-drivable-lane_min 0.43333333333333335 per-episodes details {"ep000": {"driven_any": 22.43435645963331, "driven_lanedir": 0.869426827593196, "in-drivable-lane": 9.633333333333326, "deviation-heading": 5.242330972673235, "deviation-center-line": 0.7839712298863096}, "ep001": {"driven_any": 1.2348786521426087, "driven_lanedir": 0.25505850807078234, "in-drivable-lane": 0.43333333333333335, "deviation-heading": 0.48961355934576184, "deviation-center-line": 0.044499441619084826}, "ep002": {"driven_any": 22.34503732737323, "driven_lanedir": 0.5895782021365881, "in-drivable-lane": 7.333333333333325, "deviation-heading": 7.004635498246675, "deviation-center-line": 1.345711244430005}, "ep003": {"driven_any": 5.111829606041431, "driven_lanedir": 0.4581700716704997, "in-drivable-lane": 1.866666666666664, "deviation-heading": 1.8526107172410204, "deviation-center-line": 0.0558069921312707}, "ep004": {"driven_any": 1.335708823587712, "driven_lanedir": 0.05303343826643836, "in-drivable-lane": 0.5999999999999999, "deviation-heading": 0.25605774154674665, "deviation-center-line": 0.03414661293687591}}
No reset possible 6755
821
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step4-viz success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:03
driven_lanedir_median 1.1677331265841218 deviation-center-line_median 0.21369449920758316 in-drivable-lane_median 0
other stats deviation-center-line_max 1.464411830780586 deviation-center-line_mean 0.4563088753418883 deviation-center-line_min 0.14607759722589408 deviation-heading_max 3.4927926401690663 deviation-heading_mean 1.2063914448630155 deviation-heading_median 1.058876716298759 deviation-heading_min 0.09639253705307378 driven_any_max 4.602452585926608 driven_any_mean 1.6442723429191433 driven_any_median 1.1704445826380951 driven_any_min 0.40884157584142633 driven_lanedir_max 4.490264233223098 driven_lanedir_mean 1.5726566807857558 driven_lanedir_min 0.4075578251135321 in-drivable-lane_max 0.633333333333334 in-drivable-lane_mean 0.1266666666666668 in-drivable-lane_min 0 per-episodes details {"ep000": {"driven_any": 0.6526231909238541, "driven_lanedir": 0.4609044611207218, "in-drivable-lane": 0.633333333333334, "deviation-heading": 1.058876716298759, "deviation-center-line": 0.21369449920758316}, "ep001": {"driven_any": 0.40884157584142633, "driven_lanedir": 0.4075578251135321, "in-drivable-lane": 0, "deviation-heading": 0.09639253705307378, "deviation-center-line": 0.14607759722589408}, "ep002": {"driven_any": 1.3869997792657325, "driven_lanedir": 1.3368237578873048, "in-drivable-lane": 0, "deviation-heading": 1.212248936259134, "deviation-center-line": 0.2647632241426425}, "ep003": {"driven_any": 1.1704445826380951, "driven_lanedir": 1.1677331265841218, "in-drivable-lane": 0, "deviation-heading": 0.17164639453504657, "deviation-center-line": 0.19259722535273616}, "ep004": {"driven_any": 4.602452585926608, "driven_lanedir": 4.490264233223098, "in-drivable-lane": 0, "deviation-heading": 3.4927926401690663, "deviation-center-line": 1.464411830780586}}
No reset possible 6754
822
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:01:04 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 117, in run
solve(params, cis)
File "solution.py", line 64, in solve
model = Model(config_name=config_name, config=config)
File "/workspace/model.py", line 15, in __init__
self.init_model(config_name)
File "/workspace/model.py", line 45, in init_model
map_location=self.device))
File "/opt/conda/lib/python2.7/site-packages/torch/serialization.py", line 356, in load
f = open(f, 'rb')
IOError: [Errno 2] No such file or directory: '/workspace/models/weights/mml2_d_28_178000.pth.pth'
No reset possible 6752
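The missing path in the record above ends in a doubled extension (`.pth.pth`), which suggests the loader appended `".pth"` to a weights name that already carried it. A hedged sketch of one way to normalize the name before joining; `weights_path` is an illustrative helper, not the submission's actual code:

```python
import os

def weights_path(base_dir, name, ext=".pth"):
    """Append ext only when the name does not already end with it."""
    if not name.endswith(ext):
        name += ext
    return os.path.join(base_dir, name)

# The config value already carries ".pth"; no doubled extension results.
p = weights_path("/workspace/models/weights", "mml2_d_28_178000.pth")
```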
824
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:56 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 117, in run
solve(params, cis)
File "solution.py", line 64, in solve
model = Model(config_name=config_name, config=config)
File "/workspace/model.py", line 15, in __init__
self.init_model(config_name)
File "/workspace/model.py", line 31, in init_model
use_lstm=self.config.get("use_lstm", False)).to(self.device)
TypeError: __init__() got an unexpected keyword argument 'use_lstm'
No reset possible 6749
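The TypeError in the record above means the model class was constructed with a keyword (`use_lstm`) that its `__init__` does not declare. One defensive pattern is to filter the config against the constructor's signature before calling it. The sketch below uses Python 3's `inspect.signature` (the original ran under Python 2.7, where `inspect.getargspec` would be the analogue); `PolicyNet` and `build` are illustrative names, not the submission's code:

```python
import inspect

class PolicyNet(object):
    """Illustrative model whose __init__ lacks a use_lstm parameter."""
    def __init__(self, hidden=64):
        self.hidden = hidden

def build(cls, config):
    """Drop config keys the constructor does not accept, then construct."""
    allowed = set(inspect.signature(cls.__init__).parameters) - {"self"}
    kwargs = {k: v for k, v in config.items() if k in allowed}
    return cls(**kwargs)

model = build(PolicyNet, {"hidden": 128, "use_lstm": False})  # no TypeError
```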
839
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step2-scoring success yes s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:00:34
survival_time_median 2.1333333333333355
other stats episodes details {"ep000": {"nsteps": 68, "reward": -15.091665333082132, "good_angle": 0.6981166724766602, "survival_time": 2.2666666666666684, "traveled_tiles": 1, "valid_direction": 1.6000000000000016}, "ep001": {"nsteps": 41, "reward": -24.910202139034503, "good_angle": 0.12060718551653264, "survival_time": 1.3666666666666676, "traveled_tiles": 1, "valid_direction": 0.4000000000000003}, "ep002": {"nsteps": 64, "reward": -15.972955069504678, "good_angle": 0.2113207229158426, "survival_time": 2.1333333333333355, "traveled_tiles": 2, "valid_direction": 0.9333333333333356}, "ep003": {"nsteps": 71, "reward": -14.35928999485684, "good_angle": 0.39706338058496066, "survival_time": 2.366666666666668, "traveled_tiles": 1, "valid_direction": 1.466666666666668}, "ep004": {"nsteps": 58, "reward": -17.632628266153663, "good_angle": 0.18148187305231292, "survival_time": 1.933333333333336, "traveled_tiles": 1, "valid_direction": 0.7000000000000022}}good_angle_max 0.6981166724766602 good_angle_mean 0.3217179669092618 good_angle_median 0.2113207229158426 good_angle_min 0.12060718551653264 reward_max -14.35928999485684 reward_mean -17.593348160526364 reward_median -15.972955069504678 reward_min -24.910202139034503 survival_time_max 2.366666666666668 survival_time_mean 2.013333333333335 survival_time_min 1.3666666666666676 traveled_tiles_max 2 traveled_tiles_mean 1.2 traveled_tiles_median 1 traveled_tiles_min 1 valid_direction_max 1.6000000000000016 valid_direction_mean 1.0200000000000016 valid_direction_median 0.9333333333333356 valid_direction_min 0.4000000000000003
No reset possible 6730
839
Mikita Sazanovich 🇷🇺RL solution aido1_LF1_r3-v3
step1-simulation success no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:06:49
No reset possible 6632
729
Anton Mashikhin 🇷🇺AI DL RL MML XXXL 2k18 yoo aido1_LF1_r3-v3
step1-simulation failed no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:06:01 InvalidSubmission:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 486, in wrap_evaluator
raise InvalidSubmission(out[SPECIAL_INVALID_SUBMISSION])
InvalidSubmission: Invalid solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 590, in wrap_solution
raise InvalidSubmission(msg)
InvalidSubmission: Uncaught exception in solution:
Traceback (most recent call last):
File "/workspace/src/duckietown-challenges/src/duckietown_challenges/cie_concrete.py", line 585, in wrap_solution
solution.run(cis)
File "solution.py", line 116, in run
solve(params, cis)
File "solution.py", line 81, in solve
observation, reward, done, info = env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 332, in step
return self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/core.py", line 304, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/wrappers.py", line 89, in step
ob, reward, done, info = self.env.step(action)
File "/opt/conda/lib/python2.7/site-packages/gym/wrappers/time_limit.py", line 31, in step
observation, reward, done, info = self.env.step(action)
File "/workspace/src/gym-duckietown-agent/gym_duckietown_agent/envs/simplesimagent_env.py", line 109, in step
obs, rew, done, misc = self.sim.step(action, with_observation=True)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 55, in step
return self._failsafe_observe(msg)
File "/workspace/src/duckietown-slimremote/duckietown_slimremote/pc/robot.py", line 86, in _failsafe_observe
raise Exception(msg)
Exception: Giving up to connect to the gym duckietown server at host: evaluator
No reset possible 6203
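The exception in the record above is raised after duckietown-slimremote's `_failsafe_observe` exhausts its connection attempts to the evaluator host. The usual shape of such a guard is a bounded retry loop with backoff; the sketch below is illustrative only, and `connect_with_retry` is not the library's actual API:

```python
import time

def connect_with_retry(connect, attempts=5, delay=0.1):
    """Call connect() up to `attempts` times with exponential backoff,
    then give up with a summary exception, as the slimremote client does."""
    last_err = None
    for i in range(attempts):
        try:
            return connect()
        except OSError as err:  # covers ConnectionRefusedError, timeouts, etc.
            last_err = err
            time.sleep(delay * (2 ** i))
    raise Exception("Giving up to connect after %d attempts: %s"
                    % (attempts, last_err))
```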
831
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation timeout no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:31:29
No reset possible 6196
834
Samuel Lavoie Young Duke aido1_LF1_r3-v3
step1-simulation timeout no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:45:02
No reset possible 6111
833
Orlando Marquez 🇨🇦Baseline solution using imitation learning from logs aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 2:12:24 Error while running [...]
Pulling evaluator ... done
stderr | ERROR: for solution Service Unavailable
stderr | Service Unavailable
No reset possible 6107
829
Anton Mashikhin 🇷🇺SAIC MOSCOW MML aido1_LF1_r3-v3
step1-simulation aborted no s-MacBook-puro.local-13934
6 years, 5 months 6 years, 5 months 0:10:42 Timeout:
Waited 602.329112053 for container to finish. Giving up.
No reset possible