
Submission 9274

Submission: 9274
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58409
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58409

Detailed statistics for each episode are available from the episode images (not reproduced here):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Each job is listed as: job ID, step, status, up-to-date flag, duration.

Job 58409: LFv-sim, success, up to date, duration 0:38:51
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 6.216403422199971
survival_time_median: 59.99999999999873
deviation-center-line_median: 2.548860047808028
in-drivable-lane_median: 0.0


Other stats

agent_compute-ego0_max: 0.03738900783357771
agent_compute-ego0_mean: 0.035591982236809774
agent_compute-ego0_median: 0.0366055055224429
agent_compute-ego0_min: 0.03176791006877559
complete-iteration_max: 0.2460164694265958
complete-iteration_mean: 0.22096578410821197
complete-iteration_median: 0.22132916255954105
complete-iteration_min: 0.1951883418871699
deviation-center-line_max: 2.6405515185540955
deviation-center-line_mean: 2.449436537616176
deviation-center-line_min: 2.059474536294552
deviation-heading_max: 5.73696203848988
deviation-heading_mean: 4.949941058204
deviation-heading_median: 4.853778225618798
deviation-heading_min: 4.355245743088523
driven_any_max: 6.253640466358732
driven_any_mean: 6.2523044998718325
driven_any_median: 6.253608160164667
driven_any_min: 6.248361212799264
driven_lanedir_consec_max: 6.227667911734675
driven_lanedir_consec_mean: 6.21736487910894
driven_lanedir_consec_min: 6.208984760301146
driven_lanedir_max: 6.227667911734675
driven_lanedir_mean: 6.21736487910894
driven_lanedir_median: 6.216403422199971
driven_lanedir_min: 6.208984760301146
get_duckie_state_max: 1.416019754147748e-06
get_duckie_state_mean: 1.3624698692912562e-06
get_duckie_state_median: 1.37403346815276e-06
get_duckie_state_min: 1.285792786711757e-06
get_robot_state_max: 0.004179650599712337
get_robot_state_mean: 0.003981504859177894
get_robot_state_median: 0.004089467630696038
get_robot_state_min: 0.00356743357560716
get_state_dump_max: 0.005079289459368272
get_state_dump_mean: 0.004904176918890554
get_state_dump_median: 0.004985986601601632
get_state_dump_min: 0.00456544501299068
get_ui_image_max: 0.04166440860516424
get_ui_image_mean: 0.034652942771419296
get_ui_image_median: 0.03325780851457835
get_ui_image_min: 0.03043174545135625
in-drivable-lane_max: 0.0
in-drivable-lane_mean: 0.0
in-drivable-lane_min: 0.0
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 6.253640466358732, "get_ui_image": 0.033620874252446385, "step_physics": 0.1302887257886469, "survival_time": 59.99999999999873, "driven_lanedir": 6.219727152976221, "get_state_dump": 0.005022521419985705, "get_robot_state": 0.004089652449761104, "sim_render-ego0": 0.004163883806366805, "get_duckie_state": 1.416019754147748e-06, "in-drivable-lane": 0.0, "deviation-heading": 4.355245743088523, "agent_compute-ego0": 0.036243711681191275, "complete-iteration": 0.2297440796073132, "set_robot_commands": 0.002643206038145499, "deviation-center-line": 2.6405515185540955, "driven_lanedir_consec": 6.219727152976221, "sim_compute_sim_state": 0.011293651261595664, "sim_compute_performance-ego0": 0.002271844981413499},
 "LF-norm-zigzag-000-ego0": {"driven_any": 6.253606758378004, "get_ui_image": 0.04166440860516424, "step_physics": 0.136551231071415, "survival_time": 59.99999999999873, "driven_lanedir": 6.208984760301146, "get_state_dump": 0.004949451783217558, "get_robot_state": 0.004089282811630973, "sim_render-ego0": 0.004154643845697128, "get_duckie_state": 1.3578543555825874e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.73696203848988, "agent_compute-ego0": 0.03696729936369452, "complete-iteration": 0.2460164694265958, "set_robot_commands": 0.0025963594673277437, "deviation-center-line": 2.4620900013463656, "driven_lanedir_consec": 6.208984760301146, "sim_compute_sim_state": 0.012608314334701042, "sim_compute_performance-ego0": 0.0023314895280493386},
 "LF-norm-techtrack-000-ego0": {"driven_any": 6.248361212799264, "get_ui_image": 0.03043174545135625, "step_physics": 0.10456240167228704, "survival_time": 59.99999999999873, "driven_lanedir": 6.21307969142372, "get_state_dump": 0.00456544501299068, "get_robot_state": 0.00356743357560716, "sim_render-ego0": 0.003722713154420368, "get_duckie_state": 1.285792786711757e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.033196043525689, "agent_compute-ego0": 0.03176791006877559, "complete-iteration": 0.1951883418871699, "set_robot_commands": 0.0021881096369023127, "deviation-center-line": 2.059474536294552, "driven_lanedir_consec": 6.21307969142372, "sim_compute_sim_state": 0.01236551488865226, "sim_compute_performance-ego0": 0.0019276618560486096},
 "LF-norm-small_loop-000-ego0": {"driven_any": 6.25360956195133, "get_ui_image": 0.032894742776710326, "step_physics": 0.11730242291656164, "survival_time": 59.99999999999873, "driven_lanedir": 6.227667911734675, "get_state_dump": 0.005079289459368272, "get_robot_state": 0.004179650599712337, "sim_render-ego0": 0.004176390359641908, "get_duckie_state": 1.3902125807229326e-06, "in-drivable-lane": 0.0, "deviation-heading": 4.674360407711906, "agent_compute-ego0": 0.03738900783357771, "complete-iteration": 0.21291424551176885, "set_robot_commands": 0.0026785624612876515, "deviation-center-line": 2.63563009426969, "driven_lanedir_consec": 6.227667911734675, "sim_compute_sim_state": 0.006803453018226591, "sim_compute_performance-ego0": 0.0023107695440567106}}
set_robot_commands_max: 0.0026785624612876515
set_robot_commands_mean: 0.0025265594009158017
set_robot_commands_median: 0.0026197827527366215
set_robot_commands_min: 0.0021881096369023127
sim_compute_performance-ego0_max: 0.0023314895280493386
sim_compute_performance-ego0_mean: 0.0022104414773920396
sim_compute_performance-ego0_median: 0.0022913072627351047
sim_compute_performance-ego0_min: 0.0019276618560486096
sim_compute_sim_state_max: 0.012608314334701042
sim_compute_sim_state_mean: 0.010767733375793887
sim_compute_sim_state_median: 0.01182958307512396
sim_compute_sim_state_min: 0.006803453018226591
sim_render-ego0_max: 0.004176390359641908
sim_render-ego0_mean: 0.004054407791531552
sim_render-ego0_median: 0.004159263826031967
sim_render-ego0_min: 0.003722713154420368
simulation-passed: 1
step_physics_max: 0.136551231071415
step_physics_mean: 0.12217619536222764
step_physics_median: 0.12379557435260428
step_physics_min: 0.10456240167228704
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
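The min/mean/median/max rows above are plain aggregates of the four per-episode values in the per-episodes details. A minimal sketch of that aggregation (the `aggregate` helper name is ours, not the evaluator's; the input values are the deviation-center-line entries from the details block):

```python
import statistics

# deviation-center-line per episode, copied from the per-episodes details
episodes = {
    "LF-norm-loop-000-ego0": 2.6405515185540955,
    "LF-norm-zigzag-000-ego0": 2.4620900013463656,
    "LF-norm-techtrack-000-ego0": 2.059474536294552,
    "LF-norm-small_loop-000-ego0": 2.63563009426969,
}

def aggregate(values):
    """Summarize a metric the way the stats table does: min/mean/median/max."""
    vals = list(values)
    return {
        "min": min(vals),
        "mean": statistics.mean(vals),
        "median": statistics.median(vals),
        "max": max(vals),
    }

summary = aggregate(episodes.values())
```

With four episodes the median is the average of the two middle values, which is why deviation-center-line_median (2.548860047808028) matches no single episode.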
Job 52509: LFv-sim, error, not up to date, duration 0:13:57
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139625409356080
- M:video_aido:cmdline(in:/;out:/) 139625409357568
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
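The error jobs all fail the same way: the video step cannot read the banner1.png asset. A pre-flight check of the file's PNG signature would catch a corrupt or zero-byte banner before any episodes run. This is a hedged sketch, not part of the evaluator; only the filename comes from the traceback above:

```python
from pathlib import Path

PNG_MAGIC = b"\x89PNG\r\n\x1a\n"  # the 8-byte signature every valid PNG starts with

def looks_like_png(path):
    """Return True if the file exists and starts with the PNG signature."""
    p = Path(path)
    if not p.is_file():
        return False
    with p.open("rb") as f:
        return f.read(8) == PNG_MAGIC

# looks_like_png("banner1.png") would return False for the corrupt
# banner that triggered PIL.UnidentifiedImageError in these jobs.
```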
Job 52506: LFv-sim, host-error, not up to date, duration 0:11:24
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 52501: LFv-sim, error, not up to date, duration 0:11:08

InvalidEvaluator: identical banner1.png traceback as job 52509 (only the block and model object addresses differ).
Job 52376: LFv-sim, timeout, not up to date, duration 0:14:14
Timeout because evaluator contacted us
Job 41764: LFv-sim, success, not up to date, duration 0:09:58
Job 41762: LFv-sim, success, not up to date, duration 0:09:24
Job 38260: LFv-sim, success, not up to date, duration 0:11:54
Job 38258: LFv-sim, success, not up to date, duration 0:09:10
Job 36372: LFv-sim, error, not up to date, duration 0:00:49
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-Sandy2-sandy-1-job36372-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35804: LFv-sim, success, not up to date, duration 0:01:16
Job 35397: LFv-sim, error, not up to date, duration 0:22:36
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-reg02-1b92df2e7e91-1-job35397:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-reg02-1b92df2e7e91-1-job35397/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-reg02-1b92df2e7e91-1-job35397/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-reg02-1b92df2e7e91-1-job35397/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-reg02-1b92df2e7e91-1-job35397/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9274/LFv-sim-reg02-1b92df2e7e91-1-job35397/logs/challenges-runner/stderr.log
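The runner's check here is simply a lookup for challenge-results/challenge_results.yaml inside the job's working directory; the path layout is taken from the message above, and read_challenge_results in duckietown_challenges is the real implementation. A minimal sketch of the same existence check:

```python
from pathlib import Path

def find_challenge_results(wd):
    """Return the results file path if the evaluator produced one, else None."""
    results = Path(wd) / "challenge-results" / "challenge_results.yaml"
    return results if results.is_file() else None
```

When this returns None, the logs under logs/challenges-runner/ in the same working directory are the next place to look, as the message suggests.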
Job 35038: LFv-sim, success, not up to date, duration 0:23:56
Job 34583: LFv-sim, success, not up to date, duration 0:24:43