Duckietown Challenges

Submission 10961

Submission: 10961
Competing: yes
Challenge: aido5-LF-sim-validation
User: Himanshu Arora 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57330
Next:
User label: sim-exercise-1
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57330

Episodes (detailed statistics for each episode are available via its image on the dashboard):

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

See previous jobs for previous versions of challenges
Job ID | step | status | up to date | date started | date completed | duration | message
57330 | LFv-sim | success | yes | | | 0:23:02 |
Artefacts hidden.
driven_lanedir_consec_median: 3.6734182579057113
survival_time_median: 39.22499999999991
deviation-center-line_median: 2.0533139284109625
in-drivable-lane_median: 9.9499999999998


other stats
agent_compute-ego0_max: 0.0128386719898352
agent_compute-ego0_mean: 0.012490039729884706
agent_compute-ego0_median: 0.012580158027518657
agent_compute-ego0_min: 0.011961170874666312
complete-iteration_max: 0.20464594519656637
complete-iteration_mean: 0.18060403435897177
complete-iteration_median: 0.1810445393111061
complete-iteration_min: 0.1556811136171085
deviation-center-line_max: 3.4706306967106206
deviation-center-line_mean: 2.0166528014582026
deviation-center-line_min: 0.489352652300264
deviation-heading_max: 9.203229560304344
deviation-heading_mean: 6.801334169378091
deviation-heading_median: 7.501577868215174
deviation-heading_min: 2.9989513807776738
driven_any_max: 11.60299741755845
driven_any_mean: 7.312835318237953
driven_any_median: 6.871746501871767
driven_any_min: 3.9048508516498255
driven_lanedir_consec_max: 10.26685199940852
driven_lanedir_consec_mean: 4.772755962647
driven_lanedir_consec_min: 1.4773353353680598
driven_lanedir_max: 10.26685199940852
driven_lanedir_mean: 5.4155019365427375
driven_lanedir_median: 4.958910205697185
driven_lanedir_min: 1.4773353353680598
get_duckie_state_max: 1.2999049144813871e-06
get_duckie_state_mean: 1.2586058058787845e-06
get_duckie_state_median: 1.2687312477798134e-06
get_duckie_state_min: 1.1970558134741232e-06
get_robot_state_max: 0.00365117477200136
get_robot_state_mean: 0.003612015198336653
get_robot_state_median: 0.003622197796608967
get_robot_state_min: 0.0035524904281273176
get_state_dump_max: 0.004672646214487632
get_state_dump_mean: 0.00452475284551768
get_state_dump_median: 0.004513539125535526
get_state_dump_min: 0.004399286916512038
get_ui_image_max: 0.03544419433759607
get_ui_image_mean: 0.0303949129358903
get_ui_image_median: 0.029464929941659123
get_ui_image_min: 0.02720559752264688
in-drivable-lane_max: 13.500000000000192
in-drivable-lane_mean: 9.937499999999924
in-drivable-lane_min: 6.3499999999999055
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 11.60299741755845, "get_ui_image": 0.027539098391822732, "step_physics": 0.09152509012786078, "survival_time": 59.99999999999873, "driven_lanedir": 10.26685199940852, "get_state_dump": 0.004399286916512038, "get_robot_state": 0.0035524904281273176, "sim_render-ego0": 0.003764761179114857, "get_duckie_state": 1.1970558134741232e-06, "in-drivable-lane": 6.3499999999999055, "deviation-heading": 9.203229560304344, "agent_compute-ego0": 0.011961170874666312, "complete-iteration": 0.1556811136171085, "set_robot_commands": 0.002105313077953634, "deviation-center-line": 3.4706306967106206, "driven_lanedir_consec": 10.26685199940852, "sim_compute_sim_state": 0.008791569964673299, "sim_compute_performance-ego0": 0.001958421227536928},
 "LF-norm-zigzag-000-ego0": {"driven_any": 3.9048508516498255, "get_ui_image": 0.03544419433759607, "step_physics": 0.1298724096754323, "survival_time": 22.95000000000019, "driven_lanedir": 1.4773353353680598, "get_state_dump": 0.004549930924954621, "get_robot_state": 0.00362267079560653, "sim_render-ego0": 0.003862675894861636, "get_duckie_state": 1.299899557362432e-06, "in-drivable-lane": 13.500000000000192, "deviation-heading": 2.9989513807776738, "agent_compute-ego0": 0.0128295608188795, "complete-iteration": 0.20464594519656637, "set_robot_commands": 0.002163201311360235, "deviation-center-line": 0.489352652300264, "driven_lanedir_consec": 1.4773353353680598, "sim_compute_sim_state": 0.01021573491718458, "sim_compute_performance-ego0": 0.0019997244295866592},
 "LF-norm-techtrack-000-ego0": {"driven_any": 6.812876519151299, "get_ui_image": 0.031390761491495514, "step_physics": 0.1233534726175191, "survival_time": 39.79999999999987, "driven_lanedir": 5.179571849669614, "get_state_dump": 0.004477147326116431, "get_robot_state": 0.003621724797611404, "sim_render-ego0": 0.0037779140951044138, "get_duckie_state": 1.2375629381971946e-06, "in-drivable-lane": 8.249999999999904, "deviation-heading": 8.940815312667949, "agent_compute-ego0": 0.012330755236157807, "complete-iteration": 0.19528807540759535, "set_robot_commands": 0.002131372951952696, "deviation-center-line": 2.321334538229407, "driven_lanedir_consec": 5.179571849669614, "sim_compute_sim_state": 0.012163222359593868, "sim_compute_performance-ego0": 0.0019575577308721796},
 "LF-norm-small_loop-000-ego0": {"driven_any": 6.9306164845922344, "get_ui_image": 0.02720559752264688, "step_physics": 0.10441372191259105, "survival_time": 38.64999999999994, "driven_lanedir": 4.738248561724756, "get_state_dump": 0.004672646214487632, "get_robot_state": 0.00365117477200136, "sim_render-ego0": 0.003910608993944272, "get_duckie_state": 1.2999049144813871e-06, "in-drivable-lane": 11.649999999999691, "deviation-heading": 6.0623404237623975, "agent_compute-ego0": 0.0128386719898352, "complete-iteration": 0.1668010032146168, "set_robot_commands": 0.0022391561389893525, "deviation-center-line": 1.7852933185925184, "driven_lanedir_consec": 2.1672646661418087, "sim_compute_sim_state": 0.005814802739047265, "sim_compute_performance-ego0": 0.001969497333201327}}
set_robot_commands_max: 0.0022391561389893525
set_robot_commands_mean: 0.0021597608700639793
set_robot_commands_median: 0.0021472871316564656
set_robot_commands_min: 0.002105313077953634
sim_compute_performance-ego0_max: 0.0019997244295866592
sim_compute_performance-ego0_mean: 0.0019713001802992734
sim_compute_performance-ego0_median: 0.0019639592803691276
sim_compute_performance-ego0_min: 0.0019575577308721796
sim_compute_sim_state_max: 0.012163222359593868
sim_compute_sim_state_mean: 0.009246332495124751
sim_compute_sim_state_median: 0.00950365244092894
sim_compute_sim_state_min: 0.005814802739047265
sim_render-ego0_max: 0.003910608993944272
sim_render-ego0_mean: 0.0038289900407562946
sim_render-ego0_median: 0.003820294994983025
sim_render-ego0_min: 0.003764761179114857
simulation-passed: 1
step_physics_max: 0.1298724096754323
step_physics_mean: 0.1122911735833508
step_physics_median: 0.11388359726505506
step_physics_min: 0.09152509012786078
survival_time_max: 59.99999999999873
survival_time_mean: 40.34999999999968
survival_time_min: 22.95000000000019
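
The *_max, *_mean, *_median, and *_min entries above are per-metric aggregates over the four episodes listed in the per-episodes details. A minimal sketch of how they can be recomputed from that JSON, assuming Python 3 with only the standard library (the input file name is hypothetical):

import json
import statistics

# Load the per-episodes details JSON shown above (hypothetical file name).
with open("per_episodes_details.json") as f:
    episodes = json.load(f)  # {"LF-norm-loop-000-ego0": {...}, ...}

# Collect each metric's values across the four episodes.
by_metric = {}
for metrics in episodes.values():
    for name, value in metrics.items():
        by_metric.setdefault(name, []).append(value)

# Print the same max/mean/median/min aggregates as the report.
for name, values in sorted(by_metric.items()):
    print(f"{name}_max: {max(values)}")
    print(f"{name}_mean: {statistics.mean(values)}")
    print(f"{name}_median: {statistics.median(values)}")
    print(f"{name}_min: {min(values)}")

Run against the JSON above, this reproduces, for example, survival_time_median ≈ 39.225 (the median of 60.0, 22.95, 39.8, and 38.65 s) and driven_lanedir_consec_median ≈ 3.673.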
57327 | LFv-sim | success | yes | | | 0:27:01 |
Artefacts hidden.