Duckietown Challenges

Job 41877

Job ID: 41877
submission: 11301
user: Moustafa Elarabi
user label: template-pytorch
challenge: aido5-LF-sim-validation
step: LFv-sim
status: success
up to date: no (the challenge has been changed since this job was evaluated)
evaluator: mont04-ef63c33f6e8a-1
date started:
date completed:
duration: 0:03:17
message: Artefacts hidden.
driven_lanedir_consec_median: 0.6264936771784531
survival_time_median: 2.9999999999999973
deviation-center-line_median: 0.12363280414828232
in-drivable-lane_median: 0.574999999999998


Other stats
agent_compute-ego_max: 0.020562246867588587
agent_compute-ego_mean: 0.01969043795925035
agent_compute-ego_median: 0.02003865438595153
agent_compute-ego_min: 0.01812219619750977
complete-iteration_max: 0.205487745148795
complete-iteration_mean: 0.2009447704221969
complete-iteration_median: 0.2003199346083447
complete-iteration_min: 0.1976514673233032
deviation-center-line_max: 0.23012091970493093
deviation-center-line_mean: 0.13983937329452956
deviation-center-line_min: 0.08197096517662264
deviation-heading_max: 0.46427910504776165
deviation-heading_mean: 0.356706398693832
deviation-heading_median: 0.35401982375854124
deviation-heading_min: 0.254506842210484
driven_any_max: 2.9923912406053272
driven_any_mean: 1.359204210149228
driven_any_median: 0.9659127070421872
driven_any_min: 0.5126001859072107
driven_lanedir_consec_max: 0.9222487571109196
driven_lanedir_consec_mean: 0.6315619556030624
driven_lanedir_consec_min: 0.35101171094442396
driven_lanedir_max: 0.9222487571109196
driven_lanedir_mean: 0.6315619556030624
driven_lanedir_median: 0.6264936771784531
driven_lanedir_min: 0.35101171094442396
get_duckie_state_max: 2.70775386265346e-06
get_duckie_state_mean: 2.6365240713707487e-06
get_duckie_state_median: 2.6266915457589287e-06
get_duckie_state_min: 2.584959331311678e-06
get_robot_state_max: 0.01576311685885602
get_robot_state_mean: 0.01515528430815229
get_robot_state_median: 0.01525116341454642
get_robot_state_min: 0.014355693544660295
get_state_dump_max: 0.013743668271784196
get_state_dump_mean: 0.013622431241181262
get_state_dump_median: 0.013636838368007114
get_state_dump_min: 0.01347237995692662
get_ui_image_max: 0.0360541752406529
get_ui_image_mean: 0.03493837436339013
get_ui_image_median: 0.034819898838387396
get_ui_image_min: 0.03405952453613281
in-drivable-lane_max: 6.199999999999987
in-drivable-lane_mean: 1.937499999999996
in-drivable-lane_min: 0.4000000000000003
per-episodes details (episodes ETHZ_autolab_technical_track-sc0-0-ego through -sc3-0-ego):

metric | sc0-0-ego | sc1-0-ego | sc2-0-ego | sc3-0-ego
driven_any | 0.784912245304705 | 0.5126001859072107 | 2.9923912406053272 | 1.1469131687796692
survival_time | 2.499999999999999 | 1.7500000000000009 | 8.549999999999986 | 3.4999999999999956
driven_lanedir | 0.5752560599295724 | 0.35101171094442396 | 0.6777312944273337 | 0.9222487571109196
driven_lanedir_consec | 0.5752560599295724 | 0.35101171094442396 | 0.6777312944273337 | 0.9222487571109196
in-drivable-lane | 0.549999999999998 | 0.4000000000000003 | 6.199999999999987 | 0.5999999999999979
deviation-center-line | 0.1481842895150645 | 0.09908131878150014 | 0.08197096517662264 | 0.23012091970493093
deviation-heading | 0.3229018151212695 | 0.38513783239581306 | 0.46427910504776165 | 0.254506842210484
complete-iteration | 0.1976514673233032 | 0.2006328446524484 | 0.2000070245642411 | 0.205487745148795
step_physics | 0.07772002696990966 | 0.07775983129228864 | 0.07708674704122265 | 0.08534866741725376
agent_compute-ego | 0.020053582191467287 | 0.020562246867588587 | 0.02002372658043577 | 0.01812219619750977
get_robot_state | 0.014976229667663574 | 0.015526097161429268 | 0.01576311685885602 | 0.014355693544660295
get_state_dump | 0.013692717552185058 | 0.01347237995692662 | 0.013743668271784196 | 0.01358095918382917
get_duckie_state | 2.651214599609375e-06 | 2.6021684919084822e-06 | 2.584959331311678e-06 | 2.70775386265346e-06
get_ui_image | 0.03405952453613281 | 0.0360541752406529 | 0.035150578147486636 | 0.034489219529288156
sim_render-ego | 0.005968790054321289 | 0.006063761029924665 | 0.006116807112219738 | 0.00570589474269322
sim_compute_sim_state | 0.02235061168670654 | 0.022290447780064174 | 0.023010853438349497 | 0.025330400466918944
sim_compute_performance-ego | 0.003996515274047851 | 0.0042283058166503905 | 0.0041441833763791805 | 0.003854550634111677
set_robot_commands | 0.004687232971191406 | 0.004527146475655692 | 0.004813824480737162 | 0.004548995835440499

set_robot_commands_max: 0.004813824480737162
set_robot_commands_mean: 0.00464429994075619
set_robot_commands_median: 0.004618114403315953
set_robot_commands_min: 0.004527146475655692
sim_compute_performance-ego_max: 0.0042283058166503905
sim_compute_performance-ego_mean: 0.004055888775297275
sim_compute_performance-ego_median: 0.004070349325213516
sim_compute_performance-ego_min: 0.003854550634111677
sim_compute_sim_state_max: 0.025330400466918944
sim_compute_sim_state_mean: 0.02324557834300979
sim_compute_sim_state_median: 0.02268073256252802
sim_compute_sim_state_min: 0.022290447780064174
sim_render-ego_max: 0.006116807112219738
sim_render-ego_mean: 0.005963813234789728
sim_render-ego_median: 0.0060162755421229765
sim_render-ego_min: 0.00570589474269322
simulation-passed: 1
step_physics_max: 0.08534866741725376
step_physics_mean: 0.07947881818016868
step_physics_median: 0.07773992913109914
step_physics_min: 0.07708674704122265
survival_time_max: 8.549999999999986
survival_time_mean: 4.074999999999996
survival_time_min: 1.7500000000000009
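The *_min, *_median, *_mean, and *_max rows above are per-metric statistics taken over the four episodes in the per-episodes table. A minimal sketch of how they can be reproduced (plain Python with the standard statistics module; illustrative only, not the evaluator's actual code), using survival_time as an example:

```python
from statistics import mean, median

# survival_time per episode, copied from the per-episodes details table above.
survival_time = {
    "ETHZ_autolab_technical_track-sc0-0-ego": 2.499999999999999,
    "ETHZ_autolab_technical_track-sc1-0-ego": 1.7500000000000009,
    "ETHZ_autolab_technical_track-sc2-0-ego": 8.549999999999986,
    "ETHZ_autolab_technical_track-sc3-0-ego": 3.4999999999999956,
}

values = sorted(survival_time.values())
print("survival_time_min   ", min(values))     # ~1.75, cf. survival_time_min above
print("survival_time_median", median(values))  # ~3.0, cf. survival_time_median above
print("survival_time_mean  ", mean(values))    # ~4.075, cf. survival_time_mean above
print("survival_time_max   ", max(values))     # ~8.55, cf. survival_time_max above
```

With four episodes, the median is the average of the two middle episode values, which is why survival_time_median (2.9999999999999973) is not one of the raw episode times.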

Highlights


Detailed statistics are available for each of the following episodes:

ETHZ_autolab_technical_track-sc0-0

ETHZ_autolab_technical_track-sc1-0

ETHZ_autolab_technical_track-sc2-0

ETHZ_autolab_technical_track-sc3-0

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.