
Submission 13062

Submission: 13062
Competing: yes
Challenge: aido5-LF-sim-validation
User: Ayman Shams 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 60326
Next:
User label: real-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 60326

Episodes:
LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID: 60326, step: LFv-sim, status: success, up to date: yes, duration: 0:10:18
driven_lanedir_consec_median: 0.48138864793340486
survival_time_median: 12.025000000000036
deviation-center-line_median: 0.33340971044598794
in-drivable-lane_median: 7.375000000000044
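Each of these headline values is the median of the corresponding per-episode figure over the four evaluation episodes; with four samples, the median is the mean of the two middle values. A quick sanity check in Python (an illustrative sketch, not part of the evaluation output), using values copied from the per-episodes details further down:

    from statistics import median

    # Per-episode values copied from the "per-episodes" details below.
    survival_time = [23.850000000000204, 10.600000000000016,
                     9.049999999999994, 13.450000000000056]
    driven_lanedir_consec = [0.2683314819352347, 0.3545609713149198,
                             0.6434158544002174, 0.6082163245518899]

    print(median(survival_time))          # ~12.025  -> survival_time_median
    print(median(driven_lanedir_consec))  # ~0.4814  -> driven_lanedir_consec_median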


other stats
agent_compute-ego0_max: 0.013317091364256094
agent_compute-ego0_mean: 0.012560846577754818
agent_compute-ego0_median: 0.01247870956619798
agent_compute-ego0_min: 0.011968875814367223
complete-iteration_max: 0.3278542337283282
complete-iteration_mean: 0.27148186953656794
complete-iteration_median: 0.2732529610024085
complete-iteration_min: 0.21156732241312665
deviation-center-line_max: 0.38366955954388027
deviation-center-line_mean: 0.30706086826524315
deviation-center-line_min: 0.17775449262511658
deviation-heading_max: 3.1613493203262264
deviation-heading_mean: 1.978432599876594
deviation-heading_median: 1.9455523454572887
deviation-heading_min: 0.8612763882655732
driven_any_max: 3.0219426563149216
driven_any_mean: 1.7506289085365447
driven_any_median: 1.458525489042465
driven_any_min: 1.0635219997463274
driven_lanedir_consec_max: 0.6434158544002174
driven_lanedir_consec_mean: 0.46863115805056543
driven_lanedir_consec_min: 0.2683314819352347
driven_lanedir_max: 0.6434158544002174
driven_lanedir_mean: 0.46863115805056543
driven_lanedir_median: 0.48138864793340486
driven_lanedir_min: 0.2683314819352347
get_duckie_state_max: 1.3916580765335648e-06
get_duckie_state_mean: 1.2756746799634115e-06
get_duckie_state_median: 1.2633915500885546e-06
get_duckie_state_min: 1.1842575431429725e-06
get_robot_state_max: 0.003590092998169456
get_robot_state_mean: 0.0035339875394699233
get_robot_state_median: 0.0035262288746264183
get_robot_state_min: 0.0034933994104574015
get_state_dump_max: 0.004428378828279265
get_state_dump_mean: 0.004378098887306751
get_state_dump_median: 0.0043863560050244
get_state_dump_min: 0.004311304710898938
get_ui_image_max: 0.03616424233700748
get_ui_image_mean: 0.03058396528051388
get_ui_image_median: 0.03020570085944184
get_ui_image_min: 0.02576021706616437
in-drivable-lane_max: 20.200000000000177
in-drivable-lane_mean: 9.637500000000069
in-drivable-lane_min: 3.600000000000005
per-episodes details (parsed in the sketch after this stats list):
{"LF-norm-loop-000-ego0": {"driven_any": 3.0219426563149216, "get_ui_image": 0.02828991762265002, "step_physics": 0.18886903738875768, "survival_time": 23.850000000000204, "driven_lanedir": 0.2683314819352347, "get_state_dump": 0.004311304710898938, "get_robot_state": 0.003590092998169456, "sim_render-ego0": 0.003664470117960016, "get_duckie_state": 1.2010710009969926e-06, "in-drivable-lane": 20.200000000000177, "deviation-heading": 2.846313382719597, "agent_compute-ego0": 0.01215540514830266, "complete-iteration": 0.25509735829660585, "set_robot_commands": 0.002195656050199245, "deviation-center-line": 0.38366955954388027, "driven_lanedir_consec": 0.2683314819352347, "sim_compute_sim_state": 0.010048082184093268, "sim_compute_performance-ego0": 0.0018981545540079413},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.2693221368468264, "get_ui_image": 0.03616424233700748, "step_physics": 0.2525425606490301, "survival_time": 10.600000000000016, "driven_lanedir": 0.3545609713149198, "get_state_dump": 0.004379777281497006, "get_robot_state": 0.003522362507564921, "sim_render-ego0": 0.003664578630330977, "get_duckie_state": 1.1842575431429725e-06, "in-drivable-lane": 6.350000000000023, "deviation-heading": 3.1613493203262264, "agent_compute-ego0": 0.013317091364256094, "complete-iteration": 0.3278542337283282, "set_robot_commands": 0.002091910357766308, "deviation-center-line": 0.3573171815375451, "driven_lanedir_consec": 0.3545609713149198, "sim_compute_sim_state": 0.010202667522878154, "sim_compute_performance-ego0": 0.001893992715038604},
 "LF-norm-techtrack-000-ego0": {"driven_any": 1.0635219997463274, "get_ui_image": 0.03212148409623366, "step_physics": 0.2227287619978517, "survival_time": 9.049999999999994, "driven_lanedir": 0.6434158544002174, "get_state_dump": 0.004428378828279265, "get_robot_state": 0.0034933994104574015, "sim_render-ego0": 0.0036170181337293688, "get_duckie_state": 1.3257120991801167e-06, "in-drivable-lane": 3.600000000000005, "deviation-heading": 1.0447913081949805, "agent_compute-ego0": 0.0128020139840933, "complete-iteration": 0.29140856370821105, "set_robot_commands": 0.002061632963327261, "deviation-center-line": 0.17775449262511658, "driven_lanedir_consec": 0.6434158544002174, "sim_compute_sim_state": 0.008230236860421987, "sim_compute_performance-ego0": 0.0018501858134846111},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.647728841238104, "get_ui_image": 0.02576021706616437, "step_physics": 0.15306203012113218, "survival_time": 13.450000000000056, "driven_lanedir": 0.6082163245518899, "get_state_dump": 0.004392934728551794, "get_robot_state": 0.003530095241687916, "sim_render-ego0": 0.003656315803527832, "get_duckie_state": 1.3916580765335648e-06, "in-drivable-lane": 8.400000000000066, "deviation-heading": 0.8612763882655732, "agent_compute-ego0": 0.011968875814367223, "complete-iteration": 0.21156732241312665, "set_robot_commands": 0.002185533664844654, "deviation-center-line": 0.3095022393544308, "driven_lanedir_consec": 0.6082163245518899, "sim_compute_sim_state": 0.005033351756908276, "sim_compute_performance-ego0": 0.0018963010222823532}}
set_robot_commands_max: 0.002195656050199245
set_robot_commands_mean: 0.0021336832590343672
set_robot_commands_median: 0.002138722011305481
set_robot_commands_min: 0.002061632963327261
sim_compute_performance-ego0_max: 0.0018981545540079413
sim_compute_performance-ego0_mean: 0.0018846585262033773
sim_compute_performance-ego0_median: 0.0018951468686604783
sim_compute_performance-ego0_min: 0.0018501858134846111
sim_compute_sim_state_max: 0.010202667522878154
sim_compute_sim_state_mean: 0.008378584581075422
sim_compute_sim_state_median: 0.009139159522257629
sim_compute_sim_state_min: 0.005033351756908276
sim_render-ego0_max: 0.003664578630330977
sim_render-ego0_mean: 0.003650595671387049
sim_render-ego0_median: 0.003660392960743924
sim_render-ego0_min: 0.0036170181337293688
simulation-passed: 1
step_physics_max: 0.2525425606490301
step_physics_mean: 0.20430059753919289
step_physics_median: 0.20579889969330467
step_physics_min: 0.15306203012113218
survival_time_max: 23.850000000000204
survival_time_mean: 14.237500000000066
survival_time_min: 9.049999999999994
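The per-episodes details above form a JSON object keyed by episode name, with one entry per episode holding the raw metrics that the _max/_mean/_median/_min rows aggregate. A minimal parsing sketch, assuming the JSON blob has been saved locally as per_episodes.json (a hypothetical filename; the episode keys and metric names are exactly those listed above):

    import json
    from statistics import mean, median

    # Load the per-episodes details (saved locally as per_episodes.json,
    # a hypothetical filename; keys are the episode names shown above).
    with open("per_episodes.json") as f:
        details = json.load(f)

    # Recompute the aggregate rows for one metric across the four episodes.
    metric = "deviation-center-line"
    values = [episode[metric] for episode in details.values()]
    print(max(values), mean(values), median(values), min(values))
    # Matches the deviation-center-line max/mean/median/min reported above.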
Job ID: 60323, step: LFv-sim, status: success, up to date: yes, duration: 0:10:31