
Submission 3190

Submission: 3190
Competing: yes
Challenge: aido2-LF-sim-validation
User: Anastasiya Nikolskaya 🇷🇺
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 22545
Next:
User label: NN Solution
Admin priority: 50
Blessing: n/a
User priority: 50

Job 22545

Episodes:
ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
22545 | step1-simulation | success | yes | | | 0:28:21 |
driven_lanedir_consec_median: 2.565454441076636
survival_time_median: 14.950000000000076
deviation-center-line_median: 1.1048541736364808
in-drivable-lane_median: 1.7499999999999938


other stats
agent_compute-ego_max: 0.11431202173233032
agent_compute-ego_mean: 0.1111501953268564
agent_compute-ego_median: 0.1122921794973394
agent_compute-ego_min: 0.106088125705719
deviation-center-line_max: 1.2795665520291992
deviation-center-line_mean: 1.0784654919112424
deviation-center-line_min: 0.6489066981260487
deviation-heading_max: 3.24259962279599
deviation-heading_mean: 2.555693089139582
deviation-heading_median: 2.848507363300843
deviation-heading_min: 1.3470834296994465
driven_any_max: 3.064133709422124
driven_any_mean: 2.8251956235131335
driven_any_median: 3.0640542624367324
driven_any_min: 1.879636541915353
driven_lanedir_consec_max: 2.9551951891344515
driven_lanedir_consec_mean: 2.4565892741769777
driven_lanedir_consec_min: 1.8039808852920165
driven_lanedir_max: 2.9551951891344515
driven_lanedir_mean: 2.4565892741769777
driven_lanedir_median: 2.565454441076636
driven_lanedir_min: 1.8039808852920165
in-drivable-lane_max: 2.750000000000039
in-drivable-lane_mean: 1.2700000000000062
in-drivable-lane_min: 0
per-episodes details:
{"ETHZ_autolab_technical_track-0-0": {"driven_any": 3.054059899981893, "sim_physics": 0.1342835791905721, "survival_time": 14.950000000000076, "driven_lanedir": 2.3140011021734, "sim_render-ego": 0.07029918114344279, "in-drivable-lane": 2.750000000000039, "agent_compute-ego": 0.11431202173233032, "deviation-heading": 3.1280961506916416, "set_robot_commands": 0.11399154424667358, "deviation-center-line": 1.1048541736364808, "driven_lanedir_consec": 2.3140011021734, "sim_compute_sim_state": 0.04298319657643636, "sim_compute_performance-ego": 0.07865835905075073, "sim_compute_robot_state-ego": 0.0889142362276713}, "ETHZ_autolab_technical_track-1-0": {"driven_any": 3.0640937038095637, "sim_physics": 0.13397605975468954, "survival_time": 14.950000000000076, "driven_lanedir": 2.565454441076636, "sim_render-ego": 0.07155969699223837, "in-drivable-lane": 1.7499999999999938, "agent_compute-ego": 0.11231665372848512, "deviation-heading": 3.24259962279599, "set_robot_commands": 0.1130498433113098, "deviation-center-line": 1.0824156929037418, "driven_lanedir_consec": 2.565454441076636, "sim_compute_sim_state": 0.04327025254567464, "sim_compute_performance-ego": 0.08004132986068725, "sim_compute_robot_state-ego": 0.09201850573221844}, "ETHZ_autolab_technical_track-2-0": {"driven_any": 3.064133709422124, "sim_physics": 0.12865578571955363, "survival_time": 14.950000000000076, "driven_lanedir": 2.644314753208384, "sim_render-ego": 0.07078912734985351, "in-drivable-lane": 1.849999999999998, "agent_compute-ego": 0.11074199597040812, "deviation-heading": 1.3470834296994465, "set_robot_commands": 0.1100933559735616, "deviation-center-line": 1.2765843428607413, "driven_lanedir_consec": 2.644314753208384, "sim_compute_sim_state": 0.0431714129447937, "sim_compute_performance-ego": 0.07867499828338623, "sim_compute_robot_state-ego": 0.0871767020225525}, "ETHZ_autolab_technical_track-3-0": {"driven_any": 1.879636541915353, "sim_physics": 0.13316760909172795, "survival_time": 9.299999999999995, "driven_lanedir": 1.8039808852920165, "sim_render-ego": 0.07188243378875075, "in-drivable-lane": 0, "agent_compute-ego": 0.1122921794973394, "deviation-heading": 2.212178879209989, "set_robot_commands": 0.1102669636408488, "deviation-center-line": 0.6489066981260487, "driven_lanedir_consec": 1.8039808852920165, "sim_compute_sim_state": 0.042934594615813226, "sim_compute_performance-ego": 0.07942879071799658, "sim_compute_robot_state-ego": 0.08814696599078435}, "ETHZ_autolab_technical_track-4-0": {"driven_any": 3.0640542624367324, "sim_physics": 0.1158755866686503, "survival_time": 14.950000000000076, "driven_lanedir": 2.9551951891344515, "sim_render-ego": 0.06413028955459595, "in-drivable-lane": 0, "agent_compute-ego": 0.106088125705719, "deviation-heading": 2.848507363300843, "set_robot_commands": 0.1024988357226054, "deviation-center-line": 1.2795665520291992, "driven_lanedir_consec": 2.9551951891344515, "sim_compute_sim_state": 0.039449278513590494, "sim_compute_performance-ego": 0.07311878283818563, "sim_compute_robot_state-ego": 0.08317259152730307}}
set_robot_commands_max: 0.11399154424667358
set_robot_commands_mean: 0.10998010857899984
set_robot_commands_median: 0.1102669636408488
set_robot_commands_min: 0.1024988357226054
sim_compute_performance-ego_max: 0.08004132986068725
sim_compute_performance-ego_mean: 0.07798445215020129
sim_compute_performance-ego_median: 0.07867499828338623
sim_compute_performance-ego_min: 0.07311878283818563
sim_compute_robot_state-ego_max: 0.09201850573221844
sim_compute_robot_state-ego_mean: 0.08788580030010593
sim_compute_robot_state-ego_median: 0.08814696599078435
sim_compute_robot_state-ego_min: 0.08317259152730307
sim_compute_sim_state_max: 0.04327025254567464
sim_compute_sim_state_mean: 0.04236174703926169
sim_compute_sim_state_median: 0.04298319657643636
sim_compute_sim_state_min: 0.039449278513590494
sim_physics_max: 0.1342835791905721
sim_physics_mean: 0.1291917240850387
sim_physics_median: 0.13316760909172795
sim_physics_min: 0.1158755866686503
sim_render-ego_max: 0.07188243378875075
sim_render-ego_mean: 0.06973214576577627
sim_render-ego_median: 0.07078912734985351
sim_render-ego_min: 0.06413028955459595
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 13.82000000000006
survival_time_min: 9.299999999999995
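
The aggregate statistics above (min / mean / median / max) are the per-episode values pooled across the five episodes. A minimal sketch of recomputing them, assuming the per-episodes details JSON has been saved locally as per_episode_details.json (a hypothetical filename, not part of the evaluation output):

import json
from statistics import mean, median

# Load the per-episode details: {episode_name: {metric_name: value, ...}, ...}
with open("per_episode_details.json") as f:
    episodes = json.load(f)

# Pool each metric across episodes, then aggregate.
pooled = {}
for metrics in episodes.values():
    for name, value in metrics.items():
        pooled.setdefault(name, []).append(value)

for name, values in sorted(pooled.items()):
    print(f"{name}_min: {min(values)}")
    print(f"{name}_mean: {mean(values)}")
    print(f"{name}_median: {median(values)}")
    print(f"{name}_max: {max(values)}")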
21499 | step1-simulation | success | no | | | 0:10:33 |
21498 | step1-simulation | host-error | no | | | 0:05:45 | Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 483, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 332, in convert
    data = open(log).read().strip()
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3190/step1-simulation-ip-172-31-40-253-32059-job21498/logs/challenges-runner/stdout.log'