
Submission 9330

Submission: 9330
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jerome Labonte 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58169
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58169

Episodes (the dashboard links each to detailed per-episode statistics):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
58169 | LFv-sim | success | yes | | | 0:33:38 |
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 7.240000475245312
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.295658608132273
in-drivable-lane_median: 9.074999999999878


Other stats

agent_compute-ego0_max: 0.01323130679372078
agent_compute-ego0_mean: 0.01266429221477072
agent_compute-ego0_median: 0.01270118720525508
agent_compute-ego0_min: 0.012023487654851934
complete-iteration_max: 0.2269928816653211
complete-iteration_mean: 0.19018394713096745
complete-iteration_median: 0.1874611467107482
complete-iteration_min: 0.15882061343705228
deviation-center-line_max: 4.506522157613055
deviation-center-line_mean: 3.154738010895569
deviation-center-line_min: 1.521112669704674
deviation-heading_max: 22.405076254836832
deviation-heading_mean: 13.664860306421932
deviation-heading_median: 12.936283768703262
deviation-heading_min: 6.381797433444374
driven_any_max: 13.40178001023378
driven_any_mean: 11.184165691969168
driven_any_median: 11.699461682736269
driven_any_min: 7.93595939217036
driven_lanedir_consec_max: 11.11703319727334
driven_lanedir_consec_mean: 7.412704438929911
driven_lanedir_consec_min: 4.053783607955676
driven_lanedir_max: 12.79831681711014
driven_lanedir_mean: 9.505154801106055
driven_lanedir_median: 9.432862930324546
driven_lanedir_min: 6.3565765266649965
get_duckie_state_max: 1.512629595920767e-06
get_duckie_state_mean: 1.386572700554848e-06
get_duckie_state_median: 1.3760186353392843e-06
get_duckie_state_min: 1.281623935620056e-06
get_robot_state_max: 0.0040451400359356625
get_robot_state_mean: 0.0037881247599742
get_robot_state_median: 0.0037053306335017248
get_robot_state_min: 0.003696697736957687
get_state_dump_max: 0.005244412746512252
get_state_dump_mean: 0.004875681446641665
get_state_dump_median: 0.004831029016906078
get_state_dump_min: 0.004596255006242255
get_ui_image_max: 0.03445473122259262
get_ui_image_mean: 0.030326278142509485
get_ui_image_median: 0.0305705047831374
get_ui_image_min: 0.025709371781170515
in-drivable-lane_max: 13.749999999999666
in-drivable-lane_mean: 8.47499999999986
in-drivable-lane_min: 2.0000000000000115
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 13.26862661233472, "get_ui_image": 0.027555458750157035, "step_physics": 0.10883221777154443, "survival_time": 59.99999999999873, "driven_lanedir": 11.458058384810982, "get_state_dump": 0.004925216465171032, "get_robot_state": 0.0036968309416759025, "sim_render-ego0": 0.003826554470713391, "get_duckie_state": 1.458105298502062e-06, "in-drivable-lane": 9.499999999999854, "deviation-heading": 13.140203245877949, "agent_compute-ego0": 0.012511244026647815, "complete-iteration": 0.17470277953802993, "set_robot_commands": 0.002222487372621509, "deviation-center-line": 3.4525823472137533, "driven_lanedir_consec": 11.11703319727334, "sim_compute_sim_state": 0.009009065080145614, "sim_compute_performance-ego0": 0.002028402936746437},
 "LF-norm-zigzag-000-ego0": {"driven_any": 10.130296753137811, "get_ui_image": 0.03445473122259262, "step_physics": 0.15051567822471448, "survival_time": 59.99999999999873, "driven_lanedir": 7.407667475838112, "get_state_dump": 0.004736841568641122, "get_robot_state": 0.003696697736957687, "sim_render-ego0": 0.003824951448210272, "get_duckie_state": 1.2939319721765066e-06, "in-drivable-lane": 13.749999999999666, "deviation-heading": 22.405076254836832, "agent_compute-ego0": 0.012891130383862344, "complete-iteration": 0.2269928816653211, "set_robot_commands": 0.0022431742043221227, "deviation-center-line": 3.138734869050793, "driven_lanedir_consec": 4.053783607955676, "sim_compute_sim_state": 0.012428998947143556, "sim_compute_performance-ego0": 0.002108482000333483},
 "LF-norm-techtrack-000-ego0": {"driven_any": 7.93595939217036, "get_ui_image": 0.03358555081611777, "step_physics": 0.12340116569861316, "survival_time": 34.50000000000018, "driven_lanedir": 6.3565765266649965, "get_state_dump": 0.005244412746512252, "get_robot_state": 0.0040451400359356625, "sim_render-ego0": 0.00410810496803648, "get_duckie_state": 1.512629595920767e-06, "in-drivable-lane": 8.649999999999903, "deviation-heading": 6.381797433444374, "agent_compute-ego0": 0.01323130679372078, "complete-iteration": 0.20021951388346648, "set_robot_commands": 0.0024486150479351215, "deviation-center-line": 1.521112669704674, "driven_lanedir_consec": 5.388160847152429, "sim_compute_sim_state": 0.011847787586548912, "sim_compute_performance-ego0": 0.0022040862595465696},
 "LF-norm-small_loop-000-ego0": {"driven_any": 13.40178001023378, "get_ui_image": 0.025709371781170515, "step_physics": 0.09857410654040996, "survival_time": 59.99999999999873, "driven_lanedir": 12.79831681711014, "get_state_dump": 0.004596255006242255, "get_robot_state": 0.003713830325327547, "sim_render-ego0": 0.0037640709364841025, "get_duckie_state": 1.281623935620056e-06, "in-drivable-lane": 2.0000000000000115, "deviation-heading": 12.732364291528576, "agent_compute-ego0": 0.012023487654851934, "complete-iteration": 0.15882061343705228, "set_robot_commands": 0.0022246643069582517, "deviation-center-line": 4.506522157613055, "driven_lanedir_consec": 9.091840103338194, "sim_compute_sim_state": 0.006126326982623631, "sim_compute_performance-ego0": 0.002004812798035532}}
set_robot_commands_max: 0.0024486150479351215
set_robot_commands_mean: 0.0022847352329592513
set_robot_commands_median: 0.002233919255640187
set_robot_commands_min: 0.002222487372621509
sim_compute_performance-ego0_max: 0.0022040862595465696
sim_compute_performance-ego0_mean: 0.0020864459986655055
sim_compute_performance-ego0_median: 0.0020684424685399596
sim_compute_performance-ego0_min: 0.002004812798035532
sim_compute_sim_state_max: 0.012428998947143556
sim_compute_sim_state_mean: 0.009853044649115428
sim_compute_sim_state_median: 0.010428426333347264
sim_compute_sim_state_min: 0.006126326982623631
sim_render-ego0_max: 0.00410810496803648
sim_render-ego0_mean: 0.003880920455861062
sim_render-ego0_median: 0.003825752959461832
sim_render-ego0_min: 0.0037640709364841025
simulation-passed: 1
step_physics_max: 0.15051567822471448
step_physics_mean: 0.12033079205882052
step_physics_median: 0.1161166917350788
step_physics_min: 0.09857410654040996
survival_time_max: 59.99999999999873
survival_time_mean: 53.62499999999909
survival_time_min: 34.50000000000018
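The min/mean/median/max values above are aggregates over the four episodes listed in the per-episodes details. As a sanity check, the survival_time aggregates can be recomputed from the per-episode values (a sketch using only the numbers shown above):

```python
import statistics

# survival_time per episode, copied from the per-episodes details above
survival_time = {
    "LF-norm-loop-000-ego0": 59.99999999999873,
    "LF-norm-zigzag-000-ego0": 59.99999999999873,
    "LF-norm-techtrack-000-ego0": 34.50000000000018,
    "LF-norm-small_loop-000-ego0": 59.99999999999873,
}

values = list(survival_time.values())
aggregates = {
    "survival_time_min": min(values),
    "survival_time_max": max(values),
    "survival_time_mean": statistics.mean(values),
    "survival_time_median": statistics.median(values),
}
```

These reproduce the survival_time_min/mean/median/max rows above (three episodes ran the full 60 s; the techtrack episode ended at 34.5 s, which pulls the mean down to about 53.6 s).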
52298 | LFv-sim | error | no | | | 0:08:32 |
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140155659870272
- M:video_aido:cmdline(in:/;out:/) 140155659871232
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
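The root cause in this traceback is that PIL could not recognize 'banner1.png' as an image (for example, a truncated download or a non-image file saved with a .png name). PIL identifies formats by the file's leading magic bytes, so a quick pre-check before handing a suspect file to the video pipeline is to compare its header against the 8-byte PNG signature. A minimal sketch (the file names here are hypothetical stand-ins, not files from this job):

```python
# The fixed 8-byte signature that every valid PNG file starts with
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_png(path: str) -> bool:
    """Return True if the file begins with the PNG magic bytes."""
    with open(path, "rb") as f:
        return f.read(8) == PNG_SIGNATURE

# A file with a .png name but non-image contents, like the failing banner:
with open("banner_check.png", "wb") as f:
    f.write(b"this is not image data")

print(looks_like_png("banner_check.png"))  # prints False
```

A file that fails this check is exactly the kind of input on which PIL's `Image.open` raises `UnidentifiedImageError`, as seen above.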
52297 | LFv-sim | error | no | | | 0:05:50 |
InvalidEvaluator: same traceback as job 52298 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png', raised while make_video2 loaded the static banner image).
41709 | LFv-sim | success | no | | | 0:08:41 |
38155 | LFv-sim | error | no | | | 0:00:40 |
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9330/LFv-sim-mont03-cfb9f976bc49-1-job38155-a-wd/challenge-results/challenge_results.yaml' does not exist.
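NoResultsFound here means the evaluator container died before writing challenge-results/challenge_results.yaml into the job's working directory, so the runner found nothing to score. A hedged sketch of that kind of existence check (the exception name and layout follow the traceback above; this helper is illustrative, not the runner's actual code):

```python
import os

class NoResultsFound(Exception):
    """Raised when a job's results file is missing (mirrors the runner's error)."""

def read_challenge_results(wd: str) -> str:
    """Read challenge_results.yaml from a job working dir, or raise NoResultsFound."""
    results = os.path.join(wd, "challenge-results", "challenge_results.yaml")
    if not os.path.exists(results):
        raise NoResultsFound(f"File '{results}' does not exist.")
    with open(results) as f:
        return f.read()

# An empty or missing working dir reproduces the error seen in this job:
try:
    read_challenge_results("/tmp/hypothetical-empty-job-wd")
except NoResultsFound as e:
    print(e)
```

As the message below notes, the fix is not in the results check itself: the evaluator's own container logs show why it exited before producing the file.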
38151 | LFv-sim | error | no | | | 0:00:44 |
The container "evaluator" exited with code 1, with the same NoResultsFound traceback as job 38155; the missing file was '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9330/LFv-sim-mont01-6ef51bb8a9d6-1-job38151-a-wd/challenge-results/challenge_results.yaml'.
36308 | LFv-sim | error | no | | | 0:00:47 |
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-Sandy2-sandy-1-job36308-a-wd/challenge-results/challenge_results.yaml' does not exist.
35746 | LFv-sim | success | no | | | 0:01:04 |
35743 | LFv-sim | error | no | | | 0:00:48 |
The container "solution" exited with code 1, with the same NoResultsFound traceback as job 36308; the missing file was '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-noname-sandy-1-job35743-a-wd/challenge-results/challenge_results.yaml'.
35351 | LFv-sim | error | no | | | 0:21:34 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-reg02-1b92df2e7e91-1-job35351:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-reg02-1b92df2e7e91-1-job35351/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-reg02-1b92df2e7e91-1-job35351/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-reg02-1b92df2e7e91-1-job35351/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-reg02-1b92df2e7e91-1-job35351/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9330/LFv-sim-reg02-1b92df2e7e91-1-job35351/logs/challenges-runner/stderr.log
34986 | LFv-sim | success | no | | | 0:21:55 |
34692 | LFv-sim | success | no | | | 0:24:08 |
34691 | LFv-sim | success | no | | | 0:24:43 |