
Submission 10724

Submission: 10724
Competing: yes
Challenge: aido5-LF-sim-validation
User: Fernanda Custodio Pereira do Carmo 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57957
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57957

Episodes evaluated in this job (each links to detailed per-episode statistics):

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID: 57957
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:19:03
driven_lanedir_consec_median: 3.741676607658992
survival_time_median: 32.875000000000135
deviation-center-line_median: 1.4052947395690265
in-drivable-lane_median: 4.275000000000061


Other stats:
agent_compute-ego0_max: 0.01301194711624523
agent_compute-ego0_mean: 0.012183799799380349
agent_compute-ego0_median: 0.012178345196660351
agent_compute-ego0_min: 0.011366561687955452
complete-iteration_max: 0.19176697899511763
complete-iteration_mean: 0.167036238435772
complete-iteration_median: 0.16848496047203493
complete-iteration_min: 0.1394080538039005
deviation-center-line_max: 3.6138188582382935
deviation-center-line_mean: 1.706216796063991
deviation-center-line_min: 0.4004588468796167
deviation-heading_max: 6.927643293965403
deviation-heading_mean: 4.066494441758313
deviation-heading_median: 3.414294141702441
deviation-heading_min: 2.509746189662967
driven_any_max: 10.171668284973768
driven_any_mean: 5.834262337766609
driven_any_median: 5.431876293139489
driven_any_min: 2.301628479813692
driven_lanedir_consec_max: 10.067562291372044
driven_lanedir_consec_mean: 4.8871218480699055
driven_lanedir_consec_min: 1.997571885589594
driven_lanedir_max: 10.067562291372044
driven_lanedir_mean: 4.8871218480699055
driven_lanedir_median: 3.741676607658992
driven_lanedir_min: 1.997571885589594
get_duckie_state_max: 1.3774359605337622e-06
get_duckie_state_mean: 1.2237241946102503e-06
get_duckie_state_median: 1.207151706177857e-06
get_duckie_state_min: 1.103157405551526e-06
get_robot_state_max: 0.003765075872306689
get_robot_state_mean: 0.0035063536708169305
get_robot_state_median: 0.0034674824047961203
get_robot_state_min: 0.0033253740013687934
get_state_dump_max: 0.004653805557493607
get_state_dump_mean: 0.004386599121718067
get_state_dump_median: 0.0043442741909643905
get_state_dump_min: 0.004204042547449879
get_ui_image_max: 0.03276343437636397
get_ui_image_mean: 0.02914623787826755
get_ui_image_median: 0.029946003077298097
get_ui_image_min: 0.02392951098211004
in-drivable-lane_max: 14.449999999999816
in-drivable-lane_mean: 5.749999999999984
in-drivable-lane_min: 0.0
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 6.341306792405848, "get_ui_image": 0.02713316448836447, "step_physics": 0.08866710700937974, "survival_time": 37.50000000000001, "driven_lanedir": 4.065747278231012, "get_state_dump": 0.004464862826026073, "get_robot_state": 0.003605380356708633, "sim_render-ego0": 0.0038263962843446695, "get_duckie_state": 1.2705075280485394e-06, "in-drivable-lane": 14.449999999999816, "deviation-heading": 2.781162827275049, "agent_compute-ego0": 0.012538140369318771, "complete-iteration": 0.1542469813884019, "set_robot_commands": 0.0021730196301375185, "deviation-center-line": 1.2587603791527238, "driven_lanedir_consec": 4.065747278231012, "sim_compute_sim_state": 0.009817363736473926, "sim_compute_performance-ego0": 0.0019403637328573296},
 "LF-norm-zigzag-000-ego0": {"driven_any": 2.301628479813692, "get_ui_image": 0.03276343437636397, "step_physics": 0.11328990436443562, "survival_time": 15.500000000000083, "driven_lanedir": 1.997571885589594, "get_state_dump": 0.004223685555902708, "get_robot_state": 0.003329584452883607, "sim_render-ego0": 0.003575938307587357, "get_duckie_state": 1.1437958843071744e-06, "in-drivable-lane": 2.050000000000029, "deviation-heading": 2.509746189662967, "agent_compute-ego0": 0.011818550024001928, "complete-iteration": 0.182722939555668, "set_robot_commands": 0.001959053266470072, "deviation-center-line": 0.4004588468796167, "driven_lanedir_consec": 1.997571885589594, "sim_compute_sim_state": 0.009891703964429652, "sim_compute_performance-ego0": 0.0017948464948648042},
 "LF-norm-techtrack-000-ego0": {"driven_any": 4.522445793873128, "get_ui_image": 0.032758841666231726, "step_physics": 0.11725010678119456, "survival_time": 28.250000000000263, "driven_lanedir": 3.417605937086972, "get_state_dump": 0.004653805557493607, "get_robot_state": 0.003765075872306689, "sim_render-ego0": 0.00395984615959464, "get_duckie_state": 1.3774359605337622e-06, "in-drivable-lane": 6.500000000000092, "deviation-heading": 4.047425456129833, "agent_compute-ego0": 0.01301194711624523, "complete-iteration": 0.19176697899511763, "set_robot_commands": 0.0022497349830061302, "deviation-center-line": 1.551829099985329, "driven_lanedir_consec": 3.417605937086972, "sim_compute_sim_state": 0.011959466833107883, "sim_compute_performance-ego0": 0.002068839309072326},
 "LF-norm-small_loop-000-ego0": {"driven_any": 10.171668284973768, "get_ui_image": 0.02392951098211004, "step_physics": 0.08358661856480581, "survival_time": 59.99999999999873, "driven_lanedir": 10.067562291372044, "get_state_dump": 0.004204042547449879, "get_robot_state": 0.0033253740013687934, "sim_render-ego0": 0.0035026764293991457, "get_duckie_state": 1.103157405551526e-06, "in-drivable-lane": 0.0, "deviation-heading": 6.927643293965403, "agent_compute-ego0": 0.011366561687955452, "complete-iteration": 0.1394080538039005, "set_robot_commands": 0.0019691032136508963, "deviation-center-line": 3.6138188582382935, "driven_lanedir_consec": 10.067562291372044, "sim_compute_sim_state": 0.00569718962009503, "sim_compute_performance-ego0": 0.0017530888343830092}}
set_robot_commands_max: 0.0022497349830061302
set_robot_commands_mean: 0.0020877277733161543
set_robot_commands_median: 0.0020710614218942074
set_robot_commands_min: 0.001959053266470072
sim_compute_performance-ego0_max: 0.002068839309072326
sim_compute_performance-ego0_mean: 0.0018892845927943672
sim_compute_performance-ego0_median: 0.0018676051138610669
sim_compute_performance-ego0_min: 0.0017530888343830092
sim_compute_sim_state_max: 0.011959466833107883
sim_compute_sim_state_mean: 0.009341431038526622
sim_compute_sim_state_median: 0.00985453385045179
sim_compute_sim_state_min: 0.00569718962009503
sim_render-ego0_max: 0.00395984615959464
sim_render-ego0_mean: 0.003716214295231453
sim_render-ego0_median: 0.003701167295966013
sim_render-ego0_min: 0.0035026764293991457
simulation-passed: 1
step_physics_max: 0.11725010678119456
step_physics_mean: 0.10069843417995392
step_physics_median: 0.10097850568690768
step_physics_min: 0.08358661856480581
survival_time_max: 59.99999999999873
survival_time_mean: 35.31249999999977
survival_time_min: 15.500000000000083
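
The aggregate rows above are simple summaries (min, mean, median, max) of the per-episode values across the four episodes of this job. The following Python snippet is a minimal sketch, not part of the official Duckietown evaluation tooling: it uses values copied from the per-episodes details above for two of the metrics and recomputes the reported aggregates.

# Minimal sketch: recompute the *_min/_mean/_median/_max rows for two metrics
# from the per-episode values of job 57957 (values copied from the details above).
from statistics import mean, median

per_episode = {
    "LF-norm-loop-000-ego0":       {"survival_time": 37.50000000000001,  "driven_lanedir_consec": 4.065747278231012},
    "LF-norm-zigzag-000-ego0":     {"survival_time": 15.500000000000083, "driven_lanedir_consec": 1.997571885589594},
    "LF-norm-techtrack-000-ego0":  {"survival_time": 28.250000000000263, "driven_lanedir_consec": 3.417605937086972},
    "LF-norm-small_loop-000-ego0": {"survival_time": 59.99999999999873,  "driven_lanedir_consec": 10.067562291372044},
}

for metric in ("survival_time", "driven_lanedir_consec"):
    values = [episode[metric] for episode in per_episode.values()]
    print(f"{metric}_min    {min(values)}")
    print(f"{metric}_mean   {mean(values)}")
    print(f"{metric}_median {median(values)}")  # median of 4 values = mean of the two middle ones
    print(f"{metric}_max    {max(values)}")

For survival_time, for example, this prints a median of about 32.875, matching the survival_time_median row reported above.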
Job ID: 57954
Step: LFv-sim
Status: success
Up to date: yes
Duration: 0:33:13