
Submission 10855

Submission: 10855
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57734
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57734

Episodes evaluated (per-episode statistics linked from images, omitted here):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job 57734 | step: LFv-sim | status: success | up to date: yes | duration: 0:12:27
driven_lanedir_consec_median: 0.6550379629243321
survival_time_median: 9.825000000000005
deviation-center-line_median: 0.17222886498678372
in-drivable-lane_median: 6.17500000000001


Other stats:

agent_compute-ego0_max: 0.013916911500872988
agent_compute-ego0_mean: 0.013145854383817017
agent_compute-ego0_median: 0.012968962920885063
agent_compute-ego0_min: 0.012728580192624949
complete-iteration_max: 0.2239793288080316
complete-iteration_mean: 0.1946024520762961
complete-iteration_median: 0.19213592638744992
complete-iteration_min: 0.17015862672225288
deviation-center-line_max: 1.6152854346837118
deviation-center-line_mean: 0.5245322332267665
deviation-center-line_min: 0.13838576824978657
deviation-heading_max: 7.717503213582665
deviation-heading_mean: 2.5835246125486218
deviation-heading_median: 0.9812516091959124
deviation-heading_min: 0.6540920182199959
driven_any_max: 10.39175551790986
driven_any_mean: 3.810662904089197
driven_any_median: 1.7768768536352513
driven_any_min: 1.2971423911764275
driven_lanedir_consec_max: 5.723968196080483
driven_lanedir_consec_mean: 1.9144691921898649
driven_lanedir_consec_min: 0.6238326468303113
driven_lanedir_max: 5.729943241461132
driven_lanedir_mean: 1.915962953535027
driven_lanedir_median: 0.6550379629243321
driven_lanedir_min: 0.6238326468303113
get_duckie_state_max: 1.4637455795750474e-06
get_duckie_state_mean: 1.4157426742323734e-06
get_duckie_state_median: 1.415023814622543e-06
get_duckie_state_min: 1.3691774881093598e-06
get_robot_state_max: 0.003872420571067117
get_robot_state_mean: 0.003762871573166666
get_robot_state_median: 0.003749602269253414
get_robot_state_min: 0.0036798611830927185
get_state_dump_max: 0.005104365493312026
get_state_dump_mean: 0.0048023001733291226
get_state_dump_median: 0.004757605803367475
get_state_dump_min: 0.00458962359326951
get_ui_image_max: 0.036602404556776345
get_ui_image_mean: 0.0315289420511569
get_ui_image_median: 0.03110069120260764
get_ui_image_min: 0.027311981242635976
in-drivable-lane_max: 22.099999999999
in-drivable-lane_mean: 9.612499999999752
in-drivable-lane_min: 3.999999999999986
per-episodes: {"LF-norm-loop-000-ego0": {"driven_any": 10.39175551790986, "get_ui_image": 0.028105356707102555, "step_physics": 0.10521080553877515, "survival_time": 51.6499999999992, "driven_lanedir": 5.729943241461132, "get_state_dump": 0.00458962359326951, "get_robot_state": 0.0036798611830927185, "sim_render-ego0": 0.003956145198017762, "get_duckie_state": 1.3691774881093598e-06, "in-drivable-lane": 22.099999999999, "deviation-heading": 7.717503213582665, "agent_compute-ego0": 0.012728580192624949, "complete-iteration": 0.17185378351340672, "set_robot_commands": 0.0022483236323934, "deviation-center-line": 1.6152854346837118, "driven_lanedir_consec": 5.723968196080483, "sim_compute_sim_state": 0.009208118892499973, "sim_compute_performance-ego0": 0.002043698465801069}, "LF-norm-zigzag-000-ego0": {"driven_any": 1.2971423911764275, "get_ui_image": 0.036602404556776345, "step_physics": 0.14720886004598518, "survival_time": 7.549999999999981, "driven_lanedir": 0.6238326468303113, "get_state_dump": 0.0047281055073989066, "get_robot_state": 0.003712061204408344, "sim_render-ego0": 0.00393692600099664, "get_duckie_state": 1.3850237193860506e-06, "in-drivable-lane": 3.999999999999986, "deviation-heading": 1.2205305486905136, "agent_compute-ego0": 0.012953234346289384, "complete-iteration": 0.2239793288080316, "set_robot_commands": 0.002259954025870875, "deviation-center-line": 0.19791563280452307, "driven_lanedir_consec": 0.6238326468303113, "sim_compute_sim_state": 0.010491400957107544, "sim_compute_performance-ego0": 0.001999157039742721}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.4237774567388572, "get_ui_image": 0.03409602569811272, "step_physics": 0.13815297357963793, "survival_time": 8.199999999999982, "driven_lanedir": 0.6525316772003017, "get_state_dump": 0.005104365493312026, "get_robot_state": 0.003872420571067117, "sim_render-ego0": 0.004125654336177942, "get_duckie_state": 1.4637455795750474e-06, "in-drivable-lane": 4.5499999999999865, "deviation-heading": 0.6540920182199959, "agent_compute-ego0": 0.013916911500872988, "complete-iteration": 0.2124180692614931, "set_robot_commands": 0.0023075436100815283, "deviation-center-line": 0.13838576824978657, "driven_lanedir_consec": 0.6525316772003017, "sim_compute_sim_state": 0.008690558057842832, "sim_compute_performance-ego0": 0.0020601865017052853}, "LF-norm-small_loop-000-ego0": {"driven_any": 2.1299762505316453, "get_ui_image": 0.027311981242635976, "step_physics": 0.10767504235972528, "survival_time": 11.450000000000028, "driven_lanedir": 0.6575442486483627, "get_state_dump": 0.004787106099336044, "get_robot_state": 0.003787143334098484, "sim_render-ego0": 0.004018297402755074, "get_duckie_state": 1.4450239098590351e-06, "in-drivable-lane": 7.800000000000033, "deviation-heading": 0.7419726697013113, "agent_compute-ego0": 0.012984691495480745, "complete-iteration": 0.17015862672225288, "set_robot_commands": 0.0022508662679921027, "deviation-center-line": 0.14654209716904437, "driven_lanedir_consec": 0.6575442486483627, "sim_compute_sim_state": 0.005266211343848187, "sim_compute_performance-ego0": 0.0019919592401255732}}
set_robot_commands_max: 0.0023075436100815283
set_robot_commands_mean: 0.0022666718840844767
set_robot_commands_median: 0.002255410146931489
set_robot_commands_min: 0.0022483236323934
sim_compute_performance-ego0_max: 0.0020601865017052853
sim_compute_performance-ego0_mean: 0.0020237503118436624
sim_compute_performance-ego0_median: 0.0020214277527718953
sim_compute_performance-ego0_min: 0.0019919592401255732
sim_compute_sim_state_max: 0.010491400957107544
sim_compute_sim_state_mean: 0.008414072312824634
sim_compute_sim_state_median: 0.008949338475171402
sim_compute_sim_state_min: 0.005266211343848187
sim_render-ego0_max: 0.004125654336177942
sim_render-ego0_mean: 0.004009255734486854
sim_render-ego0_median: 0.003987221300386418
sim_render-ego0_min: 0.00393692600099664
simulation-passed: 1
step_physics_max: 0.14720886004598518
step_physics_mean: 0.1245619203810309
step_physics_median: 0.1229140079696816
step_physics_min: 0.10521080553877515
survival_time_max: 51.6499999999992
survival_time_mean: 19.7124999999998
survival_time_min: 7.549999999999981
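For reference, the aggregate rows above can be reproduced from the per-episodes values. A minimal sketch using one metric (values copied from the per-episodes JSON; only the standard-library `statistics` module is assumed):

```python
import statistics

# driven_lanedir_consec per episode, copied from the "per-episodes" JSON above.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 5.723968196080483,
    "LF-norm-zigzag-000-ego0": 0.6238326468303113,
    "LF-norm-techtrack-000-ego0": 0.6525316772003017,
    "LF-norm-small_loop-000-ego0": 0.6575442486483627,
}

values = list(driven_lanedir_consec.values())
summary = {
    "max": max(values),
    "mean": statistics.mean(values),
    # With four episodes, the median is the mean of the two middle values,
    # which is why driven_lanedir_consec_median matches no single episode.
    "median": statistics.median(values),
    "min": min(values),
}
```

The computed summary agrees with the driven_lanedir_consec_* rows in the table (median 0.655..., mean 1.914..., max 5.723..., min 0.623...).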
Job 51487 | step: LFv-sim | status: error | up to date: no | duration: 0:02:10
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139778329113264
- M:video_aido:cmdline(in:/;out:/) 139778329112784
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 51470 | step: LFv-sim | status: host-error | up to date: no | duration: 0:04:12
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 51458 | step: LFv-sim | status: error | up to date: no | duration: 0:03:16
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140704723397936
- M:video_aido:cmdline(in:/;out:/) 140704723397024
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 40843 | step: LFv-sim | status: success | up to date: no | duration: 0:09:45
Job 40840 | step: LFv-sim | status: success | up to date: no | duration: 0:08:10
Job 40836 | step: LFv-sim | status: success | up to date: no | duration: 0:08:28
Job 40835 | step: LFv-sim | status: success | up to date: no | duration: 0:10:40
Job 40834 | step: LFv-sim | status: success | up to date: no | duration: 0:09:07
Job 40824 | step: LFv-sim | status: success | up to date: no | duration: 0:09:17
Job 40822 | step: LFv-sim | status: success | up to date: no | duration: 0:10:23
Job 38194 | step: LFv-sim | status: error | up to date: no | duration: 0:00:40
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission10855/LFv-sim-mont01-6ef51bb8a9d6-1-job38194-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 38192 | step: LFv-sim | status: error | up to date: no | duration: 0:00:36
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission10855/LFv-sim-mont03-cfb9f976bc49-1-job38192-a-wd/challenge-results/challenge_results.yaml' does not exist.