
Submission 6542

Submission: 6542
Competing: yes
Challenge: aido3-off-LF-sim-validation
User: Gianmarco Bernasconi
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: step1-simulation: 32240
Next:
User label: challenge-aido_LF-baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 32240

Episodes evaluated in this job:

ETHZ_autolab_technical_track-0-0
ETHZ_autolab_technical_track-1-0
ETHZ_autolab_technical_track-2-0
ETHZ_autolab_technical_track-3-0
ETHZ_autolab_technical_track-4-0

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
32240 | step1-simulation | success | yes | | | 0:10:56 |
driven_lanedir_consec_median: 2.0901701423277403
survival_time_median: 14.950000000000076
deviation-center-line_median: 0.8689840908631571
in-drivable-lane_median: 0


Other stats:
agent_compute-ego_max: 0.026592395305633544
agent_compute-ego_mean: 0.025986172299326205
agent_compute-ego_median: 0.0258101224899292
agent_compute-ego_min: 0.02540169527501236
deviation-center-line_max: 0.9558077877149972
deviation-center-line_mean: 0.8321238029699576
deviation-center-line_min: 0.6759390407895018
deviation-heading_max: 3.218427934442498
deviation-heading_mean: 2.0606425680302936
deviation-heading_median: 1.926802722277247
deviation-heading_min: 1.15591004140265
driven_any_max: 2.1142688268115943
driven_any_mean: 2.0318422234000426
driven_any_median: 2.114228281297217
driven_any_min: 1.709192553660213
driven_lanedir_consec_max: 2.1053110614346062
driven_lanedir_consec_mean: 1.922514816494063
driven_lanedir_consec_min: 1.3084165015130806
driven_lanedir_max: 2.1053110614346062
driven_lanedir_mean: 1.923061413662145
driven_lanedir_median: 2.0901701423277403
driven_lanedir_min: 1.3111494873534897
in-drivable-lane_max: 2.0000000000000284
in-drivable-lane_mean: 0.4000000000000057
in-drivable-lane_min: 0
per-episodes details:
{
  "ETHZ_autolab_technical_track-0-0": {"driven_any": 2.10728926191125, "sim_physics": 0.08329973220825196, "survival_time": 14.950000000000076, "driven_lanedir": 2.0129687403059093, "sim_render-ego": 0.013655415376027423, "in-drivable-lane": 0, "agent_compute-ego": 0.026592395305633544, "deviation-heading": 3.218427934442498, "set_robot_commands": 0.008822384675343832, "deviation-center-line": 0.8689840908631571, "driven_lanedir_consec": 2.0129687403059093, "sim_compute_sim_state": 0.006187586784362793, "sim_compute_performance-ego": 0.008753949801127116, "sim_compute_robot_state-ego": 0.009692291418711344},
  "ETHZ_autolab_technical_track-1-0": {"driven_any": 2.114232193319939, "sim_physics": 0.08443276723225912, "survival_time": 14.950000000000076, "driven_lanedir": 2.095707636888978, "sim_render-ego": 0.014014909267425536, "in-drivable-lane": 0, "agent_compute-ego": 0.025726430416107175, "deviation-heading": 1.3825690764697056, "set_robot_commands": 0.008819704055786132, "deviation-center-line": 0.9558077877149972, "driven_lanedir_consec": 2.095707636888978, "sim_compute_sim_state": 0.00630681037902832, "sim_compute_performance-ego": 0.008500998814900716, "sim_compute_robot_state-ego": 0.010185492833455405},
  "ETHZ_autolab_technical_track-2-0": {"driven_any": 1.709192553660213, "sim_physics": 0.08184265894163784, "survival_time": 12.150000000000038, "driven_lanedir": 1.3111494873534897, "sim_render-ego": 0.013748878314171308, "in-drivable-lane": 2.0000000000000284, "agent_compute-ego": 0.02540169527501236, "deviation-heading": 2.6195030655593667, "set_robot_commands": 0.008843367972982272, "deviation-center-line": 0.7331209880553442, "driven_lanedir_consec": 1.3084165015130806, "sim_compute_sim_state": 0.006567665578897108, "sim_compute_performance-ego": 0.008475273234363445, "sim_compute_robot_state-ego": 0.010287785235746407},
  "ETHZ_autolab_technical_track-3-0": {"driven_any": 2.114228281297217, "sim_physics": 0.08283886830012004, "survival_time": 14.950000000000076, "driven_lanedir": 2.0901701423277403, "sim_render-ego": 0.01406118631362915, "in-drivable-lane": 0, "agent_compute-ego": 0.02640021800994873, "deviation-heading": 1.926802722277247, "set_robot_commands": 0.009178114732106528, "deviation-center-line": 0.9267671074267873, "driven_lanedir_consec": 2.0901701423277403, "sim_compute_sim_state": 0.006483519872029622, "sim_compute_performance-ego": 0.008657769362131754, "sim_compute_robot_state-ego": 0.010316004753112793},
  "ETHZ_autolab_technical_track-4-0": {"driven_any": 2.1142688268115943, "sim_physics": 0.08461891571680705, "survival_time": 14.950000000000076, "driven_lanedir": 2.1053110614346062, "sim_render-ego": 0.013824103673299152, "in-drivable-lane": 0, "agent_compute-ego": 0.0258101224899292, "deviation-heading": 1.15591004140265, "set_robot_commands": 0.008748786449432373, "deviation-center-line": 0.6759390407895018, "driven_lanedir_consec": 2.1053110614346062, "sim_compute_sim_state": 0.006365305582682292, "sim_compute_performance-ego": 0.00886820395787557, "sim_compute_robot_state-ego": 0.009967883427937826}
}
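The min / mean / median / max entries listed above and below are per-metric aggregates over these five episodes. A minimal Python sketch of how they can be recomputed from the per-episode details, assuming the JSON object above has been saved to a local file named per_episodes.json (a hypothetical filename, not part of the challenge output):

import json
import statistics

# Load the per-episode details (assumes the JSON object shown above
# was saved locally as "per_episodes.json").
with open("per_episodes.json") as f:
    episodes = json.load(f)

# Gather each metric's values across all episodes.
values_per_metric = {}
for episode_name, episode_stats in episodes.items():
    for metric, value in episode_stats.items():
        values_per_metric.setdefault(metric, []).append(value)

# Recompute the aggregate statistics reported on this page.
for metric, values in sorted(values_per_metric.items()):
    print(f"{metric}_min: {min(values)}")
    print(f"{metric}_mean: {statistics.mean(values)}")
    print(f"{metric}_median: {statistics.median(values)}")
    print(f"{metric}_max: {max(values)}")

Computed this way, driven_lanedir_consec_median comes out to 2.0901701423277403 (the value from episode ETHZ_autolab_technical_track-3-0), matching the headline score above; means may differ from the listed values in the last digits depending on the floating-point summation used.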
set_robot_commands_max: 0.009178114732106528
set_robot_commands_mean: 0.008882471577130227
set_robot_commands_median: 0.008822384675343832
set_robot_commands_min: 0.008748786449432373
sim_compute_performance-ego_max: 0.00886820395787557
sim_compute_performance-ego_mean: 0.00865123903407972
sim_compute_performance-ego_median: 0.008657769362131754
sim_compute_performance-ego_min: 0.008475273234363445
sim_compute_robot_state-ego_max: 0.010316004753112793
sim_compute_robot_state-ego_mean: 0.010089891533792757
sim_compute_robot_state-ego_median: 0.010185492833455405
sim_compute_robot_state-ego_min: 0.009692291418711344
sim_compute_sim_state_max: 0.006567665578897108
sim_compute_sim_state_mean: 0.006382177639400027
sim_compute_sim_state_median: 0.006365305582682292
sim_compute_sim_state_min: 0.006187586784362793
sim_physics_max: 0.08461891571680705
sim_physics_mean: 0.08340658847981519
sim_physics_median: 0.08329973220825196
sim_physics_min: 0.08184265894163784
sim_render-ego_max: 0.01406118631362915
sim_render-ego_mean: 0.013860898588910514
sim_render-ego_median: 0.013824103673299152
sim_render-ego_min: 0.013655415376027423
simulation-passed: 1
survival_time_max: 14.950000000000076
survival_time_mean: 14.390000000000068
survival_time_min: 12.150000000000038
32239 | step1-simulation | success | yes | | | 0:07:26 |