
Job 38431

Job ID: 38431
submission: 6807
user: Anthony Courchesne 🇨🇦
user label: baseline-duckietown
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no; the challenge has been changed since this job ran.
evaluator: reg01-c51804ca78e4-1
date started:
date completed:
duration: 0:07:40
message:
driven_lanedir_consec_median: 0.4366244480654584
survival_time_median: 4.674999999999992
deviation-center-line_median: 0.30235620843499067
in-drivable-lane_median: 0.9249999999999968


Other stats

agent_compute-ego_max: 0.04671088036368875
agent_compute-ego_mean: 0.04070141178720137
agent_compute-ego_median: 0.04081323913499421
agent_compute-ego_min: 0.03446828851512834
complete-iteration_max: 0.44447536389033
complete-iteration_mean: 0.3958612404383865
complete-iteration_median: 0.41655952965511994
complete-iteration_min: 0.3058505385529761
deviation-center-line_max: 0.4359954898526317
deviation-center-line_mean: 0.3095824874105573
deviation-center-line_min: 0.1976220429196164
deviation-heading_max: 2.286581164642681
deviation-heading_mean: 1.1645862618115042
deviation-heading_median: 0.8548827100318721
deviation-heading_min: 0.6619984625395918
driven_any_max: 1.939997261048641
driven_any_mean: 0.87290559520545
driven_any_median: 0.5725844520211546
driven_any_min: 0.4064562157308501
driven_lanedir_consec_max: 0.6802395853288715
driven_lanedir_consec_mean: 0.4567689421819272
driven_lanedir_consec_min: 0.2735872872679206
driven_lanedir_max: 0.6802395853288715
driven_lanedir_mean: 0.4567689421819272
driven_lanedir_median: 0.4366244480654584
driven_lanedir_min: 0.2735872872679206
get_duckie_state_max: 4.0648030299766395e-06
get_duckie_state_mean: 3.876931527081658e-06
get_duckie_state_median: 3.898260640163048e-06
get_duckie_state_min: 3.6464017980238977e-06
get_robot_state_max: 0.046557182470957435
get_robot_state_mean: 0.03407904986072989
get_robot_state_median: 0.032436396093929516
get_robot_state_min: 0.024886224784103093
get_state_dump_max: 0.039798824226155
get_state_dump_mean: 0.03294313193536272
get_state_dump_median: 0.03402689227870866
get_state_dump_min: 0.023919918957878563
get_ui_image_max: 0.08659632766948026
get_ui_image_mean: 0.07189940902532316
get_ui_image_median: 0.07506264819818384
get_ui_image_min: 0.05087601203544467
in-drivable-lane_max: 8.700000000000092
in-drivable-lane_mean: 2.85000000000002
in-drivable-lane_min: 0.849999999999997
per-episodes details:
ETHZ_autolab_technical_track-sc0-0-ego: {"driven_any": 0.5129966915856362, "get_ui_image": 0.07116039781009449, "step_physics": 0.12234441813300638, "survival_time": 4.249999999999993, "driven_lanedir": 0.3781075339199065, "get_state_dump": 0.030229686288272634, "sim_render-ego": 0.01944440392886891, "get_robot_state": 0.034357429953182445, "get_duckie_state": 3.7950627944048714e-06, "in-drivable-lane": 0.8999999999999968, "agent_compute-ego": 0.03859894696403952, "deviation-heading": 0.8218240116725859, "complete-iteration": 0.4064548716825597, "set_robot_commands": 0.012402823392082664, "deviation-center-line": 0.25178734328466934, "driven_lanedir_consec": 0.3781075339199065, "sim_compute_sim_state": 0.05803668358746697, "sim_compute_performance-ego": 0.019641466701731964}
ETHZ_autolab_technical_track-sc1-0-ego: {"driven_any": 0.4064562157308501, "get_ui_image": 0.08659632766948026, "step_physics": 0.12151635394376868, "survival_time": 3.399999999999996, "driven_lanedir": 0.2735872872679206, "get_state_dump": 0.039798824226155, "sim_render-ego": 0.013934124918544994, "get_robot_state": 0.030515362234676584, "get_duckie_state": 3.6464017980238977e-06, "in-drivable-lane": 0.849999999999997, "agent_compute-ego": 0.04671088036368875, "deviation-heading": 0.8879414083911582, "complete-iteration": 0.42666418762768016, "set_robot_commands": 0.014196329257067512, "deviation-center-line": 0.1976220429196164, "driven_lanedir_consec": 0.2735872872679206, "sim_compute_sim_state": 0.05841010458329145, "sim_compute_performance-ego": 0.014751735855551326}
ETHZ_autolab_technical_track-sc2-0-ego: {"driven_any": 1.939997261048641, "get_ui_image": 0.07896489858627319, "step_physics": 0.1233012557029724, "survival_time": 14.950000000000076, "driven_lanedir": 0.6802395853288715, "get_state_dump": 0.03782409826914469, "sim_render-ego": 0.020025694370269777, "get_robot_state": 0.046557182470957435, "get_duckie_state": 4.001458485921224e-06, "in-drivable-lane": 8.700000000000092, "agent_compute-ego": 0.04302753130594889, "deviation-heading": 2.286581164642681, "complete-iteration": 0.44447536389033, "set_robot_commands": 0.02018722693125407, "deviation-center-line": 0.4359954898526317, "driven_lanedir_consec": 0.6802395853288715, "sim_compute_sim_state": 0.054609607060750326, "sim_compute_performance-ego": 0.019738221168518068}
ETHZ_autolab_technical_track-sc3-0-ego: {"driven_any": 0.632172212456673, "get_ui_image": 0.05087601203544467, "step_physics": 0.111441595881593, "survival_time": 5.09999999999999, "driven_lanedir": 0.4951413622110103, "get_state_dump": 0.023919918957878563, "sim_render-ego": 0.009893658114414589, "get_robot_state": 0.024886224784103093, "get_duckie_state": 4.0648030299766395e-06, "in-drivable-lane": 0.9499999999999966, "agent_compute-ego": 0.03446828851512834, "deviation-heading": 0.6619984625395918, "complete-iteration": 0.3058505385529761, "set_robot_commands": 0.007674778209013098, "deviation-center-line": 0.35292507358531194, "driven_lanedir_consec": 0.4951413622110103, "sim_compute_sim_state": 0.03572136981814515, "sim_compute_performance-ego": 0.006704108387816186}
set_robot_commands_max: 0.02018722693125407
set_robot_commands_mean: 0.013615289447354337
set_robot_commands_median: 0.013299576324575088
set_robot_commands_min: 0.007674778209013098
sim_compute_performance-ego_max: 0.019738221168518068
sim_compute_performance-ego_mean: 0.015208883028404386
sim_compute_performance-ego_median: 0.017196601278641643
sim_compute_performance-ego_min: 0.006704108387816186
sim_compute_sim_state_max: 0.05841010458329145
sim_compute_sim_state_mean: 0.051694441262413474
sim_compute_sim_state_median: 0.05632314532410865
sim_compute_sim_state_min: 0.03572136981814515
sim_render-ego_max: 0.020025694370269777
sim_render-ego_mean: 0.015824470333024566
sim_render-ego_median: 0.016689264423706954
sim_render-ego_min: 0.009893658114414589
simulation-passed: 1
step_physics_max: 0.1233012557029724
step_physics_mean: 0.11965090591533512
step_physics_median: 0.12193038603838752
step_physics_min: 0.111441595881593
survival_time_max: 14.950000000000076
survival_time_mean: 6.925000000000014
survival_time_min: 3.399999999999996
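Each *_max / *_mean / *_median / *_min entry above is the corresponding statistic of one metric taken over the four evaluated episodes; for example, driven_lanedir_median is the median of the four per-episode driven_lanedir values. The following is a minimal sketch, not the official duckietown-challenges evaluator code, that recomputes these aggregates from the per-episodes details; the file name details.json is a hypothetical local copy of that object.

import json
from statistics import mean, median

# Hypothetical local copy of the per-episodes "details" object shown above.
with open("details.json") as f:
    episodes = json.load(f)  # {episode_name: {metric_name: value, ...}, ...}

# Every metric name that appears in at least one episode.
metric_names = sorted({name for ep in episodes.values() for name in ep})

for name in metric_names:
    values = [ep[name] for ep in episodes.values() if name in ep]
    # statistics.median averages the two middle values when the number of
    # episodes is even, which is consistent with driven_lanedir_median above.
    print(f"{name}_max: {max(values)}")
    print(f"{name}_mean: {mean(values)}")
    print(f"{name}_median: {median(values)}")
    print(f"{name}_min: {min(values)}")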

Highlights

Detailed statistics were recorded for each of the following episodes:

ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.