Duckietown Challenges

Job 81635

Job ID: 81635
Submission: 16490
User: Arwa Alabdulkarim
User label: base-image-ml
Challenge: mooc-visservoing
Step: sim
Status: success
Up to date: no (this job is not up to date; the challenge has been changed since it ran)
Evaluator: nogpu-production-b-spot-0-03
Date started:
Date completed:
Duration: 0:06:41
Message:
in-drivable-lane_median: 7.400000000000105
deviation-center-line_median: 0.3408594071792538
driven_lanedir_consec_median: 0.8886187184762426
survival_time_median: 16.525000000000098


other stats
agent_compute-ego0_max: 0.005772723901740328
agent_compute-ego0_mean: 0.005609098004073272
agent_compute-ego0_median: 0.005609098004073272
agent_compute-ego0_min: 0.005445472106406216
complete-iteration_max: 0.12803565796619187
complete-iteration_mean: 0.12334048288264274
complete-iteration_median: 0.12334048288264274
complete-iteration_min: 0.1186453077990936
deviation-center-line_max: 0.4218926349516454
deviation-center-line_mean: 0.3408594071792538
deviation-center-line_min: 0.25982617940686215
deviation-heading_max: 1.1202942372354092
deviation-heading_mean: 1.0554673697747907
deviation-heading_median: 1.0554673697747907
deviation-heading_min: 0.9906405023141722
distance-from-start_max: 1.7523312672730105
distance-from-start_mean: 1.3623030036361023
distance-from-start_median: 1.3623030036361023
distance-from-start_min: 0.9722747399991944
driven_any_max: 2.1551292065033154
driven_any_mean: 1.6152902590396063
driven_any_median: 1.6152902590396063
driven_any_min: 1.0754513115758977
driven_lanedir_consec_max: 0.9116813643369236
driven_lanedir_consec_mean: 0.8886187184762426
driven_lanedir_consec_min: 0.8655560726155618
driven_lanedir_max: 0.9116813643369236
driven_lanedir_mean: 0.8886187184762426
driven_lanedir_median: 0.8886187184762426
driven_lanedir_min: 0.8655560726155618
get_duckie_state_max: 1.3222340413056086e-06
get_duckie_state_mean: 1.2939700232449702e-06
get_duckie_state_median: 1.2939700232449702e-06
get_duckie_state_min: 1.2657060051843318e-06
get_robot_state_max: 0.0035178494765768925
get_robot_state_mean: 0.00342905127970741
get_robot_state_median: 0.00342905127970741
get_robot_state_min: 0.0033402530828379267
get_state_dump_max: 0.004597662838265365
get_state_dump_mean: 0.004521960515801119
get_state_dump_median: 0.004521960515801119
get_state_dump_min: 0.004446258193336873
get_ui_image_max: 0.039684381902492544
get_ui_image_mean: 0.03886533399909543
get_ui_image_median: 0.03886533399909543
get_ui_image_min: 0.03804628609569833
in-drivable-lane_max: 12.400000000000176
in-drivable-lane_mean: 7.400000000000105
in-drivable-lane_min: 2.400000000000034
per-episodes details:

{
  "LF-small-loop-000-ego0": {
    "driven_any": 2.1551292065033154,
    "get_ui_image": 0.039684381902492544,
    "step_physics": 0.06443721358127857,
    "survival_time": 21.650000000000173,
    "driven_lanedir": 0.9116813643369236,
    "get_state_dump": 0.004446258193336873,
    "get_robot_state": 0.0033402530828379267,
    "sim_render-ego0": 0.0033237835229267174,
    "get_duckie_state": 1.2657060051843318e-06,
    "in-drivable-lane": 12.400000000000176,
    "deviation-heading": 0.9906405023141722,
    "agent_compute-ego0": 0.005445472106406216,
    "complete-iteration": 0.12803565796619187,
    "set_robot_commands": 0.0019360533507738248,
    "distance-from-start": 1.7523312672730105,
    "deviation-center-line": 0.4218926349516454,
    "driven_lanedir_consec": 0.9116813643369236,
    "sim_compute_sim_state": 0.0035998381777293123,
    "sim_compute_performance-ego0": 0.0017398002510246593
  },
  "LF-small-loop-001-ego0": {
    "driven_any": 1.0754513115758977,
    "get_ui_image": 0.03804628609569833,
    "step_physics": 0.05535456707383868,
    "survival_time": 11.400000000000029,
    "driven_lanedir": 0.8655560726155618,
    "get_state_dump": 0.004597662838265365,
    "get_robot_state": 0.0035178494765768925,
    "sim_render-ego0": 0.003420982818936677,
    "get_duckie_state": 1.3222340413056086e-06,
    "in-drivable-lane": 2.400000000000034,
    "deviation-heading": 1.1202942372354092,
    "agent_compute-ego0": 0.005772723901740328,
    "complete-iteration": 0.1186453077990936,
    "set_robot_commands": 0.0020557032922469895,
    "distance-from-start": 0.9722747399991944,
    "deviation-center-line": 0.25982617940686215,
    "driven_lanedir_consec": 0.8655560726155618,
    "sim_compute_sim_state": 0.003957667205010959,
    "sim_compute_performance-ego0": 0.0018363196776943955
  }
}
set_robot_commands_max: 0.0020557032922469895
set_robot_commands_mean: 0.0019958783215104072
set_robot_commands_median: 0.0019958783215104072
set_robot_commands_min: 0.0019360533507738248
sim_compute_performance-ego0_max: 0.0018363196776943955
sim_compute_performance-ego0_mean: 0.0017880599643595274
sim_compute_performance-ego0_median: 0.0017880599643595274
sim_compute_performance-ego0_min: 0.0017398002510246593
sim_compute_sim_state_max: 0.003957667205010959
sim_compute_sim_state_mean: 0.003778752691370136
sim_compute_sim_state_median: 0.003778752691370136
sim_compute_sim_state_min: 0.0035998381777293123
sim_render-ego0_max: 0.003420982818936677
sim_render-ego0_mean: 0.0033723831709316972
sim_render-ego0_median: 0.0033723831709316972
sim_render-ego0_min: 0.0033237835229267174
simulation-passed: 1
step_physics_max: 0.06443721358127857
step_physics_mean: 0.059895890327558626
step_physics_median: 0.059895890327558626
step_physics_min: 0.05535456707383868
survival_time_max: 21.650000000000173
survival_time_mean: 16.525000000000098
survival_time_min: 11.400000000000029
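Each aggregate above is simply the min/mean/median/max of the corresponding metric across the two episodes in the per-episodes details. A minimal Python sketch of that reduction (the JSON here is truncated to two of the roughly nineteen metrics per episode for brevity; the real page carries the full blob):

```python
import json
from statistics import mean, median

# Per-episode details as reported on the job page (truncated for brevity).
per_episodes = json.loads("""
{"LF-small-loop-000-ego0": {"driven_any": 2.1551292065033154, "survival_time": 21.650000000000173},
 "LF-small-loop-001-ego0": {"driven_any": 1.0754513115758977, "survival_time": 11.400000000000029}}
""")

def aggregate(metric):
    """Collect one metric across all episodes; return (min, mean, median, max)."""
    values = [ep[metric] for ep in per_episodes.values()]
    return min(values), mean(values), median(values), max(values)

for m in ("driven_any", "survival_time"):
    lo, mu, med, hi = aggregate(m)
    print(f"{m}: min={lo} mean={mu} median={med} max={hi}")
```

With only two episodes the median coincides with the mean, which is why the `_mean` and `_median` rows above are identical throughout.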

Highlights



LF-small-loop-000

LF-small-loop-001

Artifacts

The artifacts are hidden.

Container logs

The logs are hidden.