
Job 38432

Job ID: 38432
submission: 6806
user: Anthony Courchesne 🇨🇦
user label: baseline-duckietown
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no (this job is not up to date; the challenge has been changed)
evaluator: mont03-416387c3ca35-1
date started:
date completed:
duration: 0:04:53
message:
driven_lanedir_consec_median: 0.4366244480654584
survival_time_median: 4.674999999999992
deviation-center-line_median: 0.30235620843499067
in-drivable-lane_median: 0.9249999999999968
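
For reference, these headline scores are aggregates over the four evaluated episodes. A minimal sketch of how the medians can be reproduced, assuming they are the plain statistical median of the per-episode values listed under per-episodes details further down (the episode values below are copied from that block):

from statistics import median

# driven_lanedir_consec per episode (sc0-0, sc1-0, sc2-0, sc3-0),
# copied from the per-episodes details block below.
driven_lanedir_consec = [0.3781075339199065, 0.2735872872679206,
                         0.6802395853288719, 0.4951413622110103]
# survival_time per episode, same order.
survival_time = [4.249999999999993, 3.399999999999996,
                 14.950000000000076, 5.09999999999999]

# With four episodes the median is the mean of the two middle values.
print(median(driven_lanedir_consec))  # ~0.43662, matches driven_lanedir_consec_median
print(median(survival_time))          # ~4.675, matches survival_time_median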


other stats
agent_compute-ego_max: 0.018044950167338057
agent_compute-ego_mean: 0.017416838793193595
agent_compute-ego_median: 0.017259986377229877
agent_compute-ego_min: 0.017102432250976563
complete-iteration_max: 0.2124339412240421
complete-iteration_mean: 0.20773257108295667
complete-iteration_median: 0.20991051940356983
complete-iteration_min: 0.1986753043006448
deviation-center-line_max: 0.4359954898526317
deviation-center-line_mean: 0.3095824874105573
deviation-center-line_min: 0.1976220429196164
deviation-heading_max: 2.286581164642681
deviation-heading_mean: 1.1645862618115042
deviation-heading_median: 0.8548827100318721
deviation-heading_min: 0.6619984625395918
driven_any_max: 1.9399972610486385
driven_any_mean: 0.8729055952054486
driven_any_median: 0.5725844520211549
driven_any_min: 0.40645621573084656
driven_lanedir_consec_max: 0.6802395853288719
driven_lanedir_consec_mean: 0.4567689421819273
driven_lanedir_consec_min: 0.2735872872679206
driven_lanedir_max: 0.6802395853288719
driven_lanedir_mean: 0.4567689421819273
driven_lanedir_median: 0.4366244480654584
driven_lanedir_min: 0.2735872872679206
get_duckie_state_max: 2.7362505594889323e-06
get_duckie_state_mean: 2.6724619023940142e-06
get_duckie_state_median: 2.716919955085306e-06
get_duckie_state_min: 2.5197571399165134e-06
get_robot_state_max: 0.01656500030966366
get_robot_state_mean: 0.015823247374272816
get_robot_state_median: 0.015965848553414437
get_robot_state_min: 0.01479629208059872
get_state_dump_max: 0.014166281503789563
get_state_dump_mean: 0.014003736890998542
get_state_dump_median: 0.013998706434287276
get_state_dump_min: 0.013851253191630046
get_ui_image_max: 0.03767780346028945
get_ui_image_mean: 0.0365659709888346
get_ui_image_median: 0.03630313714345296
get_ui_image_min: 0.03597980620814305
in-drivable-lane_max: 8.700000000000092
in-drivable-lane_mean: 2.85000000000002
in-drivable-lane_min: 0.849999999999997
per-episodes details:

ETHZ_autolab_technical_track-sc0-0-ego:
  driven_any: 0.512996691585636
  get_ui_image: 0.03609213829040527
  step_physics: 0.07792760624605066
  survival_time: 4.249999999999993
  driven_lanedir: 0.3781075339199065
  get_state_dump: 0.01396649585050695
  sim_render-ego: 0.006348248089061064
  get_robot_state: 0.015832850512336284
  get_duckie_state: 2.7235816506778493e-06
  in-drivable-lane: 0.8999999999999968
  agent_compute-ego: 0.017102432250976563
  deviation-heading: 0.8218240116725859
  complete-iteration: 0.1986753043006448
  set_robot_commands: 0.004600278068991269
  deviation-center-line: 0.25178734328466934
  driven_lanedir_consec: 0.3781075339199065
  sim_compute_sim_state: 0.022389622295604032
  sim_compute_performance-ego: 0.004259950974408318

ETHZ_autolab_technical_track-sc1-0-ego:
  driven_any: 0.40645621573084656
  get_ui_image: 0.03767780346028945
  step_physics: 0.08730576318853042
  survival_time: 3.399999999999996
  driven_lanedir: 0.2735872872679206
  get_state_dump: 0.014166281503789563
  sim_render-ego: 0.006535000660840203
  get_robot_state: 0.01656500030966366
  get_duckie_state: 2.710258259492762e-06
  in-drivable-lane: 0.849999999999997
  agent_compute-ego: 0.017187605885898367
  deviation-heading: 0.8879414083911582
  complete-iteration: 0.2124339412240421
  set_robot_commands: 0.004981093546923469
  deviation-center-line: 0.1976220429196164
  driven_lanedir_consec: 0.2735872872679206
  sim_compute_sim_state: 0.02350016201243681
  sim_compute_performance-ego: 0.0043486567104564

ETHZ_autolab_technical_track-sc2-0-ego:
  driven_any: 1.9399972610486385
  get_ui_image: 0.03651413599650065
  step_physics: 0.08404998620351156
  survival_time: 14.950000000000076
  driven_lanedir: 0.6802395853288719
  get_state_dump: 0.013851253191630046
  sim_render-ego: 0.00647461732228597
  get_robot_state: 0.016098846594492594
  get_duckie_state: 2.7362505594889323e-06
  in-drivable-lane: 8.700000000000092
  agent_compute-ego: 0.018044950167338057
  deviation-heading: 2.286581164642681
  complete-iteration: 0.20850547472635905
  set_robot_commands: 0.004898836612701416
  deviation-center-line: 0.4359954898526317
  driven_lanedir_consec: 0.6802395853288719
  sim_compute_sim_state: 0.02412683884302775
  sim_compute_performance-ego: 0.004283359050750732

ETHZ_autolab_technical_track-sc3-0-ego:
  driven_any: 0.6321722124566739
  get_ui_image: 0.03597980620814305
  step_physics: 0.08806843851126876
  survival_time: 5.09999999999999
  driven_lanedir: 0.4951413622110103
  get_state_dump: 0.014030917018067603
  sim_render-ego: 0.0061178254146201936
  get_robot_state: 0.01479629208059872
  get_duckie_state: 2.5197571399165134e-06
  in-drivable-lane: 0.9499999999999966
  agent_compute-ego: 0.017332366868561388
  deviation-heading: 0.6619984625395918
  complete-iteration: 0.21131556408078064
  set_robot_commands: 0.004687080196305818
  deviation-center-line: 0.35292507358531194
  driven_lanedir_consec: 0.4951413622110103
  sim_compute_sim_state: 0.026002051783543007
  sim_compute_performance-ego: 0.004131165205263624
set_robot_commands_max: 0.004981093546923469
set_robot_commands_mean: 0.0047918221062304926
set_robot_commands_median: 0.004792958404503616
set_robot_commands_min: 0.004600278068991269
sim_compute_performance-ego_max: 0.0043486567104564
sim_compute_performance-ego_mean: 0.004255782985219768
sim_compute_performance-ego_median: 0.004271655012579525
sim_compute_performance-ego_min: 0.004131165205263624
sim_compute_sim_state_max: 0.026002051783543007
sim_compute_sim_state_mean: 0.024004668733652906
sim_compute_sim_state_median: 0.02381350042773228
sim_compute_sim_state_min: 0.022389622295604032
sim_render-ego_max: 0.006535000660840203
sim_render-ego_mean: 0.006368922871701857
sim_render-ego_median: 0.006411432705673517
sim_render-ego_min: 0.0061178254146201936
simulation-passed: 1
step_physics_max: 0.08806843851126876
step_physics_mean: 0.08433794853734035
step_physics_median: 0.08567787469602099
step_physics_min: 0.07792760624605066
survival_time_max: 14.950000000000076
survival_time_mean: 6.925000000000014
survival_time_min: 3.399999999999996
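
All of the *_min / *_mean / *_median / *_max rows above can be recomputed from the per-episodes details block. A minimal sketch, assuming that block has been saved to a JSON file (the file name episodes.json is hypothetical):

import json
from statistics import mean, median

# Hypothetical file holding the per-episodes details object shown above,
# i.e. {"ETHZ_autolab_technical_track-sc0-0-ego": {...}, ...}.
with open("episodes.json") as f:
    episodes = json.load(f)

# Gather each metric across the four episodes.
per_metric = {}
for stats in episodes.values():
    for name, value in stats.items():
        per_metric.setdefault(name, []).append(value)

# Recompute the aggregate rows reported under "other stats".
for name, values in sorted(per_metric.items()):
    print(f"{name}_min: {min(values)}")
    print(f"{name}_mean: {mean(values)}")
    print(f"{name}_median: {median(values)}")
    print(f"{name}_max: {max(values)}")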

Highlights

ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.