
Submission 10852

Submission: 10852
Competing: yes
Challenge: aido5-LF-sim-validation
User: Ayman Shams 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57742
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

57742

Episodes evaluated in job 57742:

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job 57742: step LFv-sim, status success, up to date: yes, duration 0:13:56
driven_lanedir_consec_median: 0.5638766224282804
survival_time_median: 19.325000000000134
deviation-center-line_median: 0.504024735645444
in-drivable-lane_median: 10.650000000000103


other stats
agent_compute-ego0_max: 0.01417162490356879
agent_compute-ego0_mean: 0.013499026604915844
agent_compute-ego0_median: 0.013547709574400397
agent_compute-ego0_min: 0.012729062367293794
complete-iteration_max: 0.24654733959366293
complete-iteration_mean: 0.20914599987900465
complete-iteration_median: 0.21473526398747567
complete-iteration_min: 0.16056613194740424
deviation-center-line_max: 1.2496081008919622
deviation-center-line_mean: 0.6480307920783603
deviation-center-line_min: 0.33446559613059124
deviation-heading_max: 2.9766692190668325
deviation-heading_mean: 1.813095218132075
deviation-heading_median: 1.5400784602145092
deviation-heading_min: 1.1955547330324487
driven_any_max: 3.774956603958053
driven_any_mean: 2.4210196442928105
driven_any_median: 2.5555435398890887
driven_any_min: 0.7980348934350114
driven_lanedir_consec_max: 1.4547606282114538
driven_lanedir_consec_mean: 0.734337002408433
driven_lanedir_consec_min: 0.3548341365657174
driven_lanedir_max: 1.4547606282114538
driven_lanedir_mean: 0.734337002408433
driven_lanedir_median: 0.5638766224282804
driven_lanedir_min: 0.3548341365657174
get_duckie_state_max: 2.4807389633693663e-06
get_duckie_state_mean: 2.380184106426576e-06
get_duckie_state_median: 2.443579256453576e-06
get_duckie_state_min: 2.1528389494297867e-06
get_robot_state_max: 0.0045174480966734
get_robot_state_mean: 0.004263628614397402
get_robot_state_median: 0.004309453777469864
get_robot_state_min: 0.00391815880597648
get_state_dump_max: 0.005675504703332882
get_state_dump_mean: 0.0053867919517851335
get_state_dump_median: 0.005555915827537427
get_state_dump_min: 0.0047598314487327965
get_ui_image_max: 0.03938406705856323
get_ui_image_mean: 0.03382447154766502
get_ui_image_median: 0.034571255312573684
get_ui_image_min: 0.026771308506949473
in-drivable-lane_max: 24.750000000000263
in-drivable-lane_mean: 12.200000000000117
in-drivable-lane_min: 2.7499999999999902
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 1.965056330097135, "get_ui_image": 0.03225037052293028, "step_physics": 0.13428861788003751, "survival_time": 15.10000000000008, "driven_lanedir": 0.7437935981762278, "get_state_dump": 0.005675504703332882, "get_robot_state": 0.004351372766022635, "sim_render-ego0": 0.004569938867399008, "get_duckie_state": 2.480182710653878e-06, "in-drivable-lane": 9.150000000000093, "deviation-heading": 1.7785672553630585, "agent_compute-ego0": 0.013599914292691173, "complete-iteration": 0.2107352281954422, "set_robot_commands": 0.002637982761899237, "deviation-center-line": 0.555409644741715, "driven_lanedir_consec": 0.7437935981762278, "sim_compute_sim_state": 0.010807496092893895, "sim_compute_performance-ego0": 0.002443259305293017},
 "LF-norm-zigzag-000-ego0": {"driven_any": 0.7980348934350114, "get_ui_image": 0.03938406705856323, "step_physics": 0.16265451206880455, "survival_time": 6.749999999999984, "driven_lanedir": 0.383959646680333, "get_state_dump": 0.005463501986335306, "get_robot_state": 0.004267534788917093, "sim_render-ego0": 0.0044459823299856744, "get_duckie_state": 2.4069758022532744e-06, "in-drivable-lane": 2.7499999999999902, "deviation-heading": 1.30158966506596, "agent_compute-ego0": 0.01349550485610962, "complete-iteration": 0.24654733959366293, "set_robot_commands": 0.002633741673301248, "deviation-center-line": 0.4526398265491731, "driven_lanedir_consec": 0.383959646680333, "sim_compute_sim_state": 0.011712074279785156, "sim_compute_performance-ego0": 0.0023753274889553293},
 "LF-norm-techtrack-000-ego0": {"driven_any": 3.774956603958053, "get_ui_image": 0.036892140102217086, "step_physics": 0.13340079974958782, "survival_time": 28.100000000000264, "driven_lanedir": 0.3548341365657174, "get_state_dump": 0.0056483296687395495, "get_robot_state": 0.0045174480966734, "sim_render-ego0": 0.004868332911977548, "get_duckie_state": 2.4807389633693663e-06, "in-drivable-lane": 24.750000000000263, "deviation-heading": 1.1955547330324487, "agent_compute-ego0": 0.01417162490356879, "complete-iteration": 0.21873529977950917, "set_robot_commands": 0.002689398838402323, "deviation-center-line": 0.33446559613059124, "driven_lanedir_consec": 0.3548341365657174, "sim_compute_sim_state": 0.013881563928580412, "sim_compute_performance-ego0": 0.002547144254613304},
 "LF-norm-small_loop-000-ego0": {"driven_any": 3.1460307496810422, "get_ui_image": 0.026771308506949473, "step_physics": 0.0970494979518955, "survival_time": 23.5500000000002, "driven_lanedir": 1.4547606282114538, "get_state_dump": 0.0047598314487327965, "get_robot_state": 0.00391815880597648, "sim_render-ego0": 0.004040541790299497, "get_duckie_state": 2.1528389494297867e-06, "in-drivable-lane": 12.15000000000012, "deviation-heading": 2.9766692190668325, "agent_compute-ego0": 0.012729062367293794, "complete-iteration": 0.16056613194740424, "set_robot_commands": 0.002403515880390749, "deviation-center-line": 1.2496081008919622, "driven_lanedir_consec": 1.4547606282114538, "sim_compute_sim_state": 0.006707902682029595, "sim_compute_performance-ego0": 0.002088145179263616}}
set_robot_commands_max: 0.002689398838402323
set_robot_commands_mean: 0.002591159788498389
set_robot_commands_median: 0.0026358622176002426
set_robot_commands_min: 0.002403515880390749
sim_compute_performance-ego0_max: 0.002547144254613304
sim_compute_performance-ego0_mean: 0.0023634690570313167
sim_compute_performance-ego0_median: 0.0024092933971241732
sim_compute_performance-ego0_min: 0.002088145179263616
sim_compute_sim_state_max: 0.013881563928580412
sim_compute_sim_state_mean: 0.010777259245822264
sim_compute_sim_state_median: 0.011259785186339523
sim_compute_sim_state_min: 0.006707902682029595
sim_render-ego0_max: 0.004868332911977548
sim_render-ego0_mean: 0.004481198974915432
sim_render-ego0_median: 0.004507960598692341
sim_render-ego0_min: 0.004040541790299497
simulation-passed: 1
step_physics_max: 0.16265451206880455
step_physics_mean: 0.13184835691258137
step_physics_median: 0.13384470881481267
step_physics_min: 0.0970494979518955
survival_time_max: 28.100000000000264
survival_time_mean: 18.37500000000013
survival_time_min: 6.749999999999984
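The min/max/mean/median aggregates above are simply per-episode values pooled across the four episodes listed in the per-episodes details. A minimal sketch (not the evaluator's code) that reproduces the survival_time aggregates with the standard library, using values copied from the per-episodes details:

```python
import statistics

# survival_time per episode, copied from the per-episodes details above
survival_time = {
    "LF-norm-loop-000-ego0": 15.10000000000008,
    "LF-norm-zigzag-000-ego0": 6.749999999999984,
    "LF-norm-techtrack-000-ego0": 28.100000000000264,
    "LF-norm-small_loop-000-ego0": 23.5500000000002,
}

values = list(survival_time.values())
aggregates = {
    "survival_time_min": min(values),
    "survival_time_max": max(values),
    "survival_time_mean": statistics.mean(values),
    "survival_time_median": statistics.median(values),
}
# The results match the reported stats up to floating-point rounding
# (e.g. survival_time_median is approximately 19.325).
```

The same pooling applies to every other `_min`/`_max`/`_mean`/`_median` metric in this report.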
Job 57737: step LFv-sim, status success, up to date: yes, duration 0:11:01
Job 51548: step LFv-sim, status error, up to date: no, duration 0:02:11
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140328284502912
- M:video_aido:cmdline(in:/;out:/) 140328284158128
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
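The root cause in the traceback above is `PIL.UnidentifiedImageError`: the file `banner1.png` exists but its contents are not a recognizable image (typically a zero-byte or truncated file), so the video-rendering pipeline aborts. A minimal stdlib sketch of a pre-check that would catch this before the loader runs; `looks_like_png` is a hypothetical helper, not part of the evaluator:

```python
# Valid PNG files begin with this fixed 8-byte signature.
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"


def looks_like_png(path: str) -> bool:
    """Return True if the file exists and starts with the PNG signature.

    A zero-byte or truncated banner1.png fails this check, which is the
    condition that made PIL raise UnidentifiedImageError above.
    """
    try:
        with open(path, "rb") as f:
            return f.read(8) == PNG_SIGNATURE
    except OSError:
        return False
```

Running such a check on the evaluator's static assets at startup would surface the broken banner before any episodes are spent on video rendering.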
Job 51546: step LFv-sim, status error, up to date: no, duration 0:02:45
InvalidEvaluator: same traceback as job 51548 (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png', raised while rendering the episode video).
Job 51481: step LFv-sim, status host-error, up to date: no, duration 0:04:33
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 40850: step LFv-sim, status success, up to date: no, duration 0:09:06
Job 40828: step LFv-sim, status timeout, up to date: no, duration 0:14:29
Timeout because evaluator contacted us.
Job 40827: step LFv-sim, status timeout, up to date: no, duration 0:15:32
Timeout because evaluator contacted us.
Job 38184: step LFv-sim, status success, up to date: no, duration 0:11:41
Job 38181: step LFv-sim, status error, up to date: no, duration 0:00:41
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission10852/LFv-sim-mont03-cfb9f976bc49-1-job38181-a-wd/challenge-results/challenge_results.yaml' does not exist.
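The `NoResultsFound` error above follows directly from the container crash: the evaluator exited with code 1 before writing `challenge-results/challenge_results.yaml` into the job working directory, so the runner has nothing to score. A minimal stdlib sketch of that failure mode, with hypothetical names (the real logic lives in `duckietown_challenges_runner`):

```python
import os


class NoResultsFound(Exception):
    """Raised when a job working dir has no challenge_results.yaml (sketch)."""


def read_results_path(job_wd: str) -> str:
    # The runner expects the evaluator container to leave its verdict in
    # challenge-results/challenge_results.yaml inside the job working dir.
    # If the container dies first (as in job 38181), the file never exists
    # and the runner can only report NoResultsFound.
    path = os.path.join(job_wd, "challenge-results", "challenge_results.yaml")
    if not os.path.exists(path):
        raise NoResultsFound(f"File '{path}' does not exist.")
    return path
```

So for jobs like 38181, the fix is not in the results parsing but in whatever made the evaluator container exit, which is why the message points at the container logs.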
Job 38177: step LFv-sim, status success, up to date: no, duration 0:11:08