Duckietown Challenges

Job 41729

Job ID: 41729
submission: 11298
user: Moustafa Elarabi
user label: challenge-aido_LF-template-pytorch
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no (the challenge has been changed since this job ran)
evaluator: mont05-b47700d9a51a-1
date started:
date completed:
duration: 0:09:58
message:
Scores (medians over the four evaluation episodes)

driven_lanedir_consec_median: 2.815241329743563
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.6038070155851062
in-drivable-lane_median: 0.5500000000000052


other stats
agent_compute-ego_max: 0.026828479766845704
agent_compute-ego_mean: 0.025638203422228497
agent_compute-ego_median: 0.02563000559806824
agent_compute-ego_min: 0.024464322725931804
complete-iteration_max: 0.23744104703267416
complete-iteration_mean: 0.2255510618289312
complete-iteration_median: 0.22375662525494897
complete-iteration_min: 0.21724994977315268
deviation-center-line_max: 0.7601086277278756
deviation-center-line_mean: 0.5491059812672215
deviation-center-line_min: 0.2287012661707984
deviation-heading_max: 3.328193341081821
deviation-heading_mean: 2.536581234135362
deviation-heading_median: 2.4512300382994625
deviation-heading_min: 1.915671518860704
driven_any_max: 3.062275081844615
driven_any_mean: 3.038913825467301
driven_any_median: 3.04218331608615
driven_any_min: 3.0090135878522903
driven_lanedir_consec_max: 2.94935344632492
driven_lanedir_consec_mean: 2.3101315035294814
driven_lanedir_consec_min: 0.6606899083058804
driven_lanedir_max: 2.94935344632492
driven_lanedir_mean: 2.313324466335845
driven_lanedir_median: 2.815241329743563
driven_lanedir_min: 0.6734617595313339
get_duckie_state_max: 2.857844034830729e-06
get_duckie_state_mean: 2.7014811833699547e-06
get_duckie_state_median: 2.7171770731608073e-06
get_duckie_state_min: 2.513726552327474e-06
get_robot_state_max: 0.017198851903279622
get_robot_state_mean: 0.01692614952723185
get_robot_state_median: 0.016893627643585204
get_robot_state_min: 0.016718490918477377
get_state_dump_max: 0.014193073113759358
get_state_dump_mean: 0.014128341873486838
get_state_dump_median: 0.014138259092966716
get_state_dump_min: 0.014043776194254558
get_ui_image_max: 0.03932247241338094
get_ui_image_mean: 0.03824600319067637
get_ui_image_median: 0.03825753649075826
get_ui_image_min: 0.03714646736780802
in-drivable-lane_max: 10.500000000000083
in-drivable-lane_mean: 2.900000000000024
in-drivable-lane_min: 0.0
per-episodes details:

ETHZ_autolab_technical_track-sc0-0-ego:
  driven_any: 3.0090135878522903
  get_ui_image: 0.03758231321970622
  step_physics: 0.08613525946935018
  survival_time: 14.950000000000076
  driven_lanedir: 2.840468797629866
  get_state_dump: 0.014193073113759358
  sim_render-ego: 0.006735505263010661
  get_robot_state: 0.017198851903279622
  get_duckie_state: 2.659956614176432e-06
  in-drivable-lane: 0.29999999999999893
  agent_compute-ego: 0.024656657377878824
  deviation-heading: 2.7999801961878164
  complete-iteration: 0.22237940390904745
  set_robot_commands: 0.004922160307566325
  deviation-center-line: 0.7293394937266842
  driven_lanedir_consec: 2.840468797629866
  sim_compute_sim_state: 0.026198118527730307
  sim_compute_performance-ego: 0.004597251415252686

ETHZ_autolab_technical_track-sc1-0-ego:
  driven_any: 3.0587324614298295
  get_ui_image: 0.03714646736780802
  step_physics: 0.08308473666508992
  survival_time: 14.950000000000076
  driven_lanedir: 2.94935344632492
  get_state_dump: 0.014134002526601156
  sim_render-ego: 0.0066297761599222816
  get_robot_state: 0.016732318401336668
  get_duckie_state: 2.7743975321451823e-06
  in-drivable-lane: 0.0
  agent_compute-ego: 0.024464322725931804
  deviation-heading: 3.328193341081821
  complete-iteration: 0.21724994977315268
  set_robot_commands: 0.004975700378417968
  deviation-center-line: 0.7601086277278756
  driven_lanedir_consec: 2.94935344632492
  sim_compute_sim_state: 0.025411612192789715
  sim_compute_performance-ego: 0.0045065553983052575

ETHZ_autolab_technical_track-sc2-0-ego:
  driven_any: 3.062275081844615
  get_ui_image: 0.0389327597618103
  step_physics: 0.0863597575823466
  survival_time: 14.950000000000076
  driven_lanedir: 0.6734617595313339
  get_state_dump: 0.014043776194254558
  sim_render-ego: 0.007010291417439779
  get_robot_state: 0.01705493688583374
  get_duckie_state: 2.857844034830729e-06
  in-drivable-lane: 10.500000000000083
  agent_compute-ego: 0.026828479766845704
  deviation-heading: 1.915671518860704
  complete-iteration: 0.2251338466008504
  set_robot_commands: 0.0050428120295206704
  deviation-center-line: 0.2287012661707984
  driven_lanedir_consec: 0.6606899083058804
  sim_compute_sim_state: 0.024913909435272216
  sim_compute_performance-ego: 0.004777177174886068

ETHZ_autolab_technical_track-sc3-0-ego:
  driven_any: 3.0256341707424697
  get_ui_image: 0.03932247241338094
  step_physics: 0.09430939594904582
  survival_time: 14.950000000000076
  driven_lanedir: 2.7900138618572603
  get_state_dump: 0.014142515659332276
  sim_render-ego: 0.0068085575103759765
  get_robot_state: 0.016718490918477377
  get_duckie_state: 2.513726552327474e-06
  in-drivable-lane: 0.8000000000000114
  agent_compute-ego: 0.02660335381825765
  deviation-heading: 2.1024798804111087
  complete-iteration: 0.23744104703267416
  set_robot_commands: 0.005064357916514078
  deviation-center-line: 0.47827453744352816
  driven_lanedir_consec: 2.7900138618572603
  sim_compute_sim_state: 0.029761544068654375
  sim_compute_performance-ego: 0.004540184338887533
set_robot_commands_max: 0.005064357916514078
set_robot_commands_mean: 0.00500125765800476
set_robot_commands_median: 0.005009256203969319
set_robot_commands_min: 0.004922160307566325
sim_compute_performance-ego_max: 0.004777177174886068
sim_compute_performance-ego_mean: 0.0046052920818328855
sim_compute_performance-ego_median: 0.004568717877070109
sim_compute_performance-ego_min: 0.0045065553983052575
sim_compute_sim_state_max: 0.029761544068654375
sim_compute_sim_state_mean: 0.026571296056111655
sim_compute_sim_state_median: 0.025804865360260013
sim_compute_sim_state_min: 0.024913909435272216
sim_render-ego_max: 0.007010291417439779
sim_render-ego_mean: 0.0067960325876871746
sim_render-ego_median: 0.006772031386693319
sim_render-ego_min: 0.0066297761599222816
simulation-passed: 1
step_physics_max: 0.09430939594904582
step_physics_mean: 0.08747228741645813
step_physics_median: 0.0862475085258484
step_physics_min: 0.08308473666508992
survival_time_max: 14.950000000000076
survival_time_mean: 14.950000000000076
survival_time_min: 14.950000000000076
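
The _min/_mean/_median/_max rows above are plain summaries of the four per-episode values listed under per-episodes. As an illustration only (not part of the evaluation pipeline), the following minimal Python sketch recomputes those aggregates for driven_lanedir_consec, with the per-episode values copied from the details above:

    import statistics

    # Per-episode driven_lanedir_consec values, copied from the
    # per-episode details above (episode name -> value).
    driven_lanedir_consec = {
        "ETHZ_autolab_technical_track-sc0-0-ego": 2.840468797629866,
        "ETHZ_autolab_technical_track-sc1-0-ego": 2.94935344632492,
        "ETHZ_autolab_technical_track-sc2-0-ego": 0.6606899083058804,
        "ETHZ_autolab_technical_track-sc3-0-ego": 2.7900138618572603,
    }

    values = sorted(driven_lanedir_consec.values())

    # With four episodes the median is the mean of the two middle values.
    print("min:   ", min(values))                # cf. driven_lanedir_consec_min
    print("mean:  ", statistics.mean(values))    # cf. driven_lanedir_consec_mean
    print("median:", statistics.median(values))  # cf. driven_lanedir_consec_median
    print("max:   ", max(values))                # cf. driven_lanedir_consec_max

The same per-episode aggregation yields every other _min/_mean/_median/_max row in this section; survival_time has identical min, mean, and max because all four episodes report the same survival_time of about 14.95 s.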

Highlights

Detailed per-episode statistics are available for the following episodes.

ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.