
Submission 11535

Submission: 11535
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54458
Next:
User label: exercise_state_estimation
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54458

Episodes: LF-norm-loop-000, LF-norm-small_loop-000, LF-norm-techtrack-000, LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID: 54458 | step: LFv-sim | status: success | up to date: yes | duration: 0:27:13
driven_lanedir_consec_median: 1.846053662187193
survival_time_median: 48.39999999999939
deviation-center-line_median: 1.510310347804407
in-drivable-lane_median: 8.049999999999926


other stats
agent_compute-ego0_max: 0.01316228913427094
agent_compute-ego0_mean: 0.012680470833522004
agent_compute-ego0_median: 0.012736817018241758
agent_compute-ego0_min: 0.012085960163333551
complete-iteration_max: 0.26003541874945113
complete-iteration_mean: 0.2318124033543545
complete-iteration_median: 0.2321282702857943
complete-iteration_min: 0.20295765409637825
deviation-center-line_max: 4.089691315907791
deviation-center-line_mean: 1.96738083928153
deviation-center-line_min: 0.7592113456095149
deviation-heading_max: 19.008274459362138
deviation-heading_mean: 9.285952859319323
deviation-heading_median: 7.328559013983469
deviation-heading_min: 3.4784189499482245
driven_any_max: 7.918292117162034
driven_any_mean: 5.516960884255573
driven_any_median: 6.330632888969679
driven_any_min: 1.488285641920899
driven_lanedir_consec_max: 2.949087122696561
driven_lanedir_consec_mean: 1.9259287285940068
driven_lanedir_consec_min: 1.06252046730508
driven_lanedir_max: 5.3133824726235845
driven_lanedir_mean: 2.983702453289953
driven_lanedir_median: 2.7794534366155745
driven_lanedir_min: 1.06252046730508
get_duckie_state_max: 1.342519814479626e-06
get_duckie_state_mean: 1.3021606484944944e-06
get_duckie_state_median: 1.3166621364622093e-06
get_duckie_state_min: 1.232798506573933e-06
get_robot_state_max: 0.00391977196629101
get_robot_state_mean: 0.0037501077453330695
get_robot_state_median: 0.0037520486638014
get_robot_state_min: 0.003576561687438469
get_state_dump_max: 0.004951953093872578
get_state_dump_mean: 0.004720553044212211
get_state_dump_median: 0.004688132769993585
get_state_dump_min: 0.004553993542989095
get_ui_image_max: 0.03531806255743756
get_ui_image_mean: 0.03138264370828812
get_ui_image_median: 0.03187535962562024
get_ui_image_min: 0.026461793024474437
in-drivable-lane_max: 46.84999999999863
in-drivable-lane_mean: 16.437499999999627
in-drivable-lane_min: 2.8000000000000247
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 4.747675066374642, "get_ui_image": 0.02942390843032982, "step_physics": 0.13415224548919555, "survival_time": 36.80000000000005, "driven_lanedir": 4.267028545156069, "get_state_dump": 0.004755582058931141, "get_robot_state": 0.0038052216637247775, "sim_render-ego0": 0.003852853943243764, "get_duckie_state": 1.342519814479626e-06, "in-drivable-lane": 2.949999999999932, "deviation-heading": 7.256224409980328, "agent_compute-ego0": 0.013120939159005113, "complete-iteration": 0.20295765409637825, "set_robot_commands": 0.00231972931198058, "deviation-center-line": 1.8986719110653436, "driven_lanedir_consec": 2.4002289962993064, "sim_compute_sim_state": 0.009358022073877715, "sim_compute_performance-ego0": 0.002085178483939721},
 "LF-norm-zigzag-000-ego0": {"driven_any": 1.488285641920899, "get_ui_image": 0.03531806255743756, "step_physics": 0.1695615673452858, "survival_time": 12.25000000000004, "driven_lanedir": 1.06252046730508, "get_state_dump": 0.004553993542989095, "get_robot_state": 0.003576561687438469, "sim_render-ego0": 0.003664905462807756, "get_duckie_state": 1.232798506573933e-06, "in-drivable-lane": 2.8000000000000247, "deviation-heading": 3.4784189499482245, "agent_compute-ego0": 0.012085960163333551, "complete-iteration": 0.2429804617796487, "set_robot_commands": 0.0021304812857775184, "deviation-center-line": 0.7592113456095149, "driven_lanedir_consec": 1.06252046730508, "sim_compute_sim_state": 0.01004921905393523, "sim_compute_performance-ego0": 0.001958552414808816},
 "LF-norm-techtrack-000-ego0": {"driven_any": 7.913590711564716, "get_ui_image": 0.03432681082091066, "step_physics": 0.18348301082328397, "survival_time": 59.99999999999873, "driven_lanedir": 5.3133824726235845, "get_state_dump": 0.004951953093872578, "get_robot_state": 0.00391977196629101, "sim_render-ego0": 0.003977158980802334, "get_duckie_state": 1.3227168963811082e-06, "in-drivable-lane": 13.149999999999922, "deviation-heading": 19.008274459362138, "agent_compute-ego0": 0.01316228913427094, "complete-iteration": 0.26003541874945113, "set_robot_commands": 0.0023402566218951857, "deviation-center-line": 4.089691315907791, "driven_lanedir_consec": 2.949087122696561, "sim_compute_sim_state": 0.011629642197531924, "sim_compute_performance-ego0": 0.0021571503590783114},
 "LF-norm-small_loop-000-ego0": {"driven_any": 7.918292117162034, "get_ui_image": 0.026461793024474437, "step_physics": 0.15983144806187716, "survival_time": 59.99999999999873, "driven_lanedir": 1.2918783280750796, "get_state_dump": 0.004620683481056029, "get_robot_state": 0.0036988756638780223, "sim_render-ego0": 0.00375233184884331, "get_duckie_state": 1.3106073765433103e-06, "in-drivable-lane": 46.84999999999863, "deviation-heading": 7.400893617986608, "agent_compute-ego0": 0.012352694877478403, "complete-iteration": 0.22127607879193997, "set_robot_commands": 0.0022063044882336823, "deviation-center-line": 1.1219487845434706, "driven_lanedir_consec": 1.2918783280750796, "sim_compute_sim_state": 0.006318124902933265, "sim_compute_performance-ego0": 0.001952069486606925}}
set_robot_commands_max: 0.0023402566218951857
set_robot_commands_mean: 0.0022491929269717417
set_robot_commands_median: 0.0022630169001071313
set_robot_commands_min: 0.0021304812857775184
sim_compute_performance-ego0_max: 0.0021571503590783114
sim_compute_performance-ego0_mean: 0.0020382376861084435
sim_compute_performance-ego0_median: 0.002021865449374268
sim_compute_performance-ego0_min: 0.001952069486606925
sim_compute_sim_state_max: 0.011629642197531924
sim_compute_sim_state_mean: 0.009338752057069534
sim_compute_sim_state_median: 0.009703620563906471
sim_compute_sim_state_min: 0.006318124902933265
sim_render-ego0_max: 0.003977158980802334
sim_render-ego0_mean: 0.003811812558924291
sim_render-ego0_median: 0.003802592896043537
sim_render-ego0_min: 0.003664905462807756
simulation-passed: 1
step_physics_max: 0.18348301082328397
step_physics_mean: 0.16175706792991065
step_physics_median: 0.16469650770358146
step_physics_min: 0.13415224548919555
survival_time_max: 59.99999999999873
survival_time_mean: 42.262499999999385
survival_time_min: 12.25000000000004
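
Each aggregate above is simply the minimum, mean, median, or maximum of the corresponding metric over the four episodes in the per-episodes details. A minimal sketch that recomputes a couple of the reported values, assuming the per-episodes details JSON has been saved locally as episodes.json (a hypothetical filename):

import json
import statistics

# Load the per-episodes details mapping shown above
# (assumed saved locally as episodes.json; the filename is hypothetical).
with open("episodes.json") as f:
    episodes = json.load(f)

def aggregate(metric):
    # Collect one value per episode and summarize it.
    values = [ep[metric] for ep in episodes.values()]
    return {
        "min": min(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        "max": max(values),
    }

# With the four episodes above this reproduces the reported statistics,
# e.g. driven_lanedir_consec median 1.8460... and survival_time median 48.4.
print(aggregate("driven_lanedir_consec"))
print(aggregate("survival_time"))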
Job ID: 49626 | step: LFv-sim | status: error | up to date: no | duration: 0:09:48
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140517179615360
- M:video_aido:cmdline(in:/;out:/) 140517034120288
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
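
The failure above occurred in the evaluator's video-generation step (make_video2 via procgraph): Pillow could not decode banner1.png, the file loaded by the static_image block, and the InvalidEvaluator exception type indicates the evaluator rather than the submission was flagged as the cause. A minimal sketch, assuming Pillow is available locally, of how one might check that such a banner file is a readable image (the filename is taken from the traceback; everything else is illustrative):

from PIL import Image, UnidentifiedImageError

# Try to decode the banner image that the video pipeline expects.
# A missing, empty, or non-image file fails here, mirroring the error
# raised inside the procgraph 'static_image' block above.
try:
    with Image.open("banner1.png") as im:
        im.verify()  # lightweight integrity check; does not decode full pixel data
    print("banner1.png can be identified as an image")
except (FileNotFoundError, UnidentifiedImageError) as exc:
    print(f"banner1.png is missing or not a valid image: {exc}")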
Job ID: 49614 | step: LFv-sim | status: error | up to date: no | duration: 0:05:36
InvalidEvaluator: same failure and traceback as job 49626 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png', raised in the 'static_image' block while rendering the episode video).
Job ID: 42779 | step: LFv-sim | status: success | up to date: no | duration: 0:11:08
Job ID: 42778 | step: LFv-sim | status: success | up to date: no | duration: 0:08:23
Job ID: 42777 | step: LFv-sim | status: success | up to date: no | duration: 0:08:09
Job ID: 42776 | step: LFv-sim | status: success | up to date: no | duration: 0:06:40