Job 22474
step: step1-simulation | status: success | up to date: yes | started: 2019-05-16 21:00:39+00:00 | completed: 2019-05-16 21:14:05+00:00 | duration: 0:13:26
scores:
  driven_lanedir_consec_median: 1.1183690550459433
  survival_time_median: 6.149999999999986
  deviation-center-line_median: 0.366938549913891
  in-drivable-lane_median: 2.9999999999999893
other stats (per-metric aggregates over the five episodes; medians for the four scores above are not repeated):
  agent_compute-ego: max 0.11543785413106282, mean 0.11296957712781494, median 0.11239315334119294, min 0.11145819028218588
  deviation-center-line: max 0.6688136302855523, mean 0.3714835811300531, min 0.12096518584440498
  deviation-heading: max 4.070604095614383, mean 2.1882662597704563, median 1.539952975595033, min 0.39014049555056457
  driven_any: max 2.0329618356493615, mean 1.4596210361442328, median 1.6218736669495035, min 0.2096019554885327
  driven_lanedir_consec: max 1.3610976499958565, mean 0.8751469051809091, min 0.14154279637636158
  driven_lanedir: max 1.6452155391364036, mean 0.9747277801461818, median 1.1183690550459433, min 0.14154279637636158
  in-drivable-lane: max 9.950000000000088, mean 3.81000000000002, min 0
  set_robot_commands: max 0.05198625405629475, mean 0.05130962392151286, median 0.05134680157616025, min 0.050583626196636415
  sim_compute_performance-ego: max 0.05658262411753336, mean 0.05579483385684766, median 0.05580055086236251, min 0.05439813023521787
  sim_compute_robot_state-ego: max 0.05573338588078817, mean 0.05500475131332222, median 0.05525364875793457, min 0.053447423662458145
  sim_compute_sim_state: max 0.03925826675013492, mean 0.03818751115231723, median 0.038115348815917965, min 0.03735513415763048
  sim_physics: max 0.051689599689684416, mean 0.04873458152079948, median 0.04849233278414098, min 0.04681465943654378
  sim_render-ego: max 0.053767392509862, mean 0.05206894834569616, median 0.052317147254943845, min 0.05017966088794527
  simulation-passed: 1
  survival_time: max 14.950000000000076, mean 8.640000000000025, min 1.900000000000001

per-episodes details:
  "ETHZ_autolab_technical_track-0-0": {"driven_any": 1.8944439149078731, "sim_physics": 0.0486532711982727, "survival_time": 14.950000000000076, "driven_lanedir": 0.6074138601763441, "sim_render-ego": 0.05329350312550862, "in-drivable-lane": 9.950000000000088, "agent_compute-ego": 0.11543785413106282, "deviation-heading": 3.8680595368139272, "set_robot_commands": 0.05198625405629475, "deviation-center-line": 0.39390118795369367, "driven_lanedir_consec": 0.5966486823239703, "sim_compute_sim_state": 0.03861078421274821, "sim_compute_performance-ego": 0.05658262411753336, "sim_compute_robot_state-ego": 0.05573338588078817}
  "ETHZ_autolab_technical_track-1-0": {"driven_any": 2.0329618356493615, "sim_physics": 0.04681465943654378, "survival_time": 14.950000000000076, "driven_lanedir": 1.6452155391364036, "sim_render-ego": 0.052317147254943845, "in-drivable-lane": 5.800000000000021, "agent_compute-ego": 0.1120314073562622, "deviation-heading": 4.070604095614383, "set_robot_commands": 0.05163191556930542, "deviation-center-line": 0.6688136302855523, "driven_lanedir_consec": 1.1580763421624134, "sim_compute_sim_state": 0.038115348815917965, "sim_compute_performance-ego": 0.056542154947916666, "sim_compute_robot_state-ego": 0.05525364875793457}
  "ETHZ_autolab_technical_track-2-0": {"driven_any": 1.6218736669495035, "sim_physics": 0.048023044495355514, "survival_time": 5.249999999999989, "driven_lanedir": 1.3610976499958565, "sim_render-ego": 0.05017966088794527, "in-drivable-lane": 0.29999999999999893, "agent_compute-ego": 0.11145819028218588, "deviation-heading": 1.539952975595033, "set_robot_commands": 0.05134680157616025, "deviation-center-line": 0.3067993516527235, "driven_lanedir_consec": 1.3610976499958565, "sim_compute_sim_state": 0.03759802182515462, "sim_compute_performance-ego": 0.05439813023521787, "sim_compute_robot_state-ego": 0.053447423662458145}
  "ETHZ_autolab_technical_track-3-0": {"driven_any": 0.2096019554885327, "sim_physics": 0.051689599689684416, "survival_time": 1.900000000000001, "driven_lanedir": 0.14154279637636158, "sim_render-ego": 0.053767392509862, "in-drivable-lane": 0, "agent_compute-ego": 0.11239315334119294, "deviation-heading": 1.0725741952783727, "set_robot_commands": 0.05099952220916748, "deviation-center-line": 0.12096518584440498, "driven_lanedir_consec": 0.14154279637636158, "sim_compute_sim_state": 0.03925826675013492, "sim_compute_performance-ego": 0.05580055086236251, "sim_compute_robot_state-ego": 0.055026656702945105}
  "ETHZ_autolab_technical_track-4-0": {"driven_any": 1.5392238077258924, "sim_physics": 0.04849233278414098, "survival_time": 6.149999999999986, "driven_lanedir": 1.1183690550459433, "sim_render-ego": 0.05078703795022112, "in-drivable-lane": 2.9999999999999893, "agent_compute-ego": 0.11352728052837092, "deviation-heading": 0.39014049555056457, "set_robot_commands": 0.050583626196636415, "deviation-center-line": 0.366938549913891, "driven_lanedir_consec": 1.1183690550459433, "sim_compute_sim_state": 0.03735513415763048, "sim_compute_performance-ego": 0.05565070912120789, "sim_compute_robot_state-ego": 0.05556264156248511}
Job 21792
step: step1-simulation | status: host-error | up to date: no | started: 2019-05-08 02:56:41+00:00 | completed: 2019-05-08 03:01:49+00:00 | duration: 0:05:08

Uncaught exception:
Traceback (most recent call last):
  File "/project/src/duckietown_challenges_runner/runner.py", line 488, in get_cr
    submission_id=submission_id, timeout_sec=timeout_sec)
  File "/usr/lib/python3.6/contextlib.py", line 88, in __exit__
    next(self.gen)
  File "/project/src/duckietown_challenges_runner/runner.py", line 347, in setup_logging
    convert(f_stdout)
  File "/project/src/duckietown_challenges_runner/runner.py", line 343, in convert
    with open(fn, 'w') as f:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/duckietown/DT18/evaluator/executions/aido2-LF-sim-validation/submission3273/step1-simulation-ip-172-31-38-104-5376-job21792/logs/challenges-runner/stdout.html'
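The traceback shows convert() failing because the parent directory of stdout.html does not exist by the time the runner tries to write the file, so open(fn, 'w') raises FileNotFoundError. A defensive pattern for this class of error is to create the missing directory first; the helper below is only an illustrative sketch, not the actual duckietown_challenges_runner code:

import os

def write_html_log(fn, html):
    # Hypothetical helper: ensure the parent directory exists before writing,
    # which avoids the FileNotFoundError seen above.
    os.makedirs(os.path.dirname(fn), exist_ok=True)
    with open(fn, 'w') as f:
        f.write(html)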