
Submission 9239

Submission: 9239
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58485
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58485

Episodes:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job 58485 (LFv-sim): success, up to date: yes, duration 0:20:12
Artefacts hidden.
driven_lanedir_consec_median: 4.116890139031213
survival_time_median: 36.39999999999998
deviation-center-line_median: 1.885844520898889
in-drivable-lane_median: 11.349999999999886


Other stats

agent_compute-ego0_max: 0.01261872144845816
agent_compute-ego0_mean: 0.012410724988442152
agent_compute-ego0_median: 0.01245846281687336
agent_compute-ego0_min: 0.012107252871563732
complete-iteration_max: 0.20313743566855405
complete-iteration_mean: 0.17988388846270095
complete-iteration_median: 0.17801681435513397
complete-iteration_min: 0.16036448947198187
deviation-center-line_max: 3.160312638762253
deviation-center-line_mean: 1.7979356409979377
deviation-center-line_min: 0.25974088343171975
deviation-heading_max: 11.728057928981128
deviation-heading_mean: 6.290167721813913
deviation-heading_median: 5.968541387465835
deviation-heading_min: 1.495530183342855
driven_any_max: 10.422607366469997
driven_any_mean: 6.070407145742276
driven_any_median: 6.17305486441774
driven_any_min: 1.5129114876636331
driven_lanedir_consec_max: 6.185608756717224
driven_lanedir_consec_mean: 3.7719929280841393
driven_lanedir_consec_min: 0.6685826775569077
driven_lanedir_max: 6.185608756717224
driven_lanedir_mean: 3.7719929280841393
driven_lanedir_median: 4.116890139031213
driven_lanedir_min: 0.6685826775569077
get_duckie_state_max: 2.30715825007512e-06
get_duckie_state_mean: 2.216070439849369e-06
get_duckie_state_median: 2.2259341473878615e-06
get_duckie_state_min: 2.1052552145466327e-06
get_robot_state_max: 0.0037278383206098505
get_robot_state_mean: 0.003695647489614904
get_robot_state_median: 0.003700975167691723
get_robot_state_min: 0.00365280130246632
get_state_dump_max: 0.0047594192700508315
get_state_dump_mean: 0.004688052825915386
get_state_dump_median: 0.004682982161234814
get_state_dump_min: 0.004626827711141089
get_ui_image_max: 0.03646492713536972
get_ui_image_mean: 0.03111126649791833
get_ui_image_median: 0.030546549400718
get_ui_image_min: 0.026887040054867608
in-drivable-lane_max: 22.149999999999455
in-drivable-lane_mean: 12.52499999999981
in-drivable-lane_min: 5.250000000000011
per-episodes:

LF-norm-loop-000-ego0: {"driven_any": 4.9370691587765, "get_ui_image": 0.02842205509010328, "step_physics": 0.09667305930103516, "survival_time": 29.30000000000028, "driven_lanedir": 3.4800334831893256, "get_state_dump": 0.004626827711141089, "get_robot_state": 0.00365280130246632, "sim_render-ego0": 0.003731161406742856, "get_duckie_state": 2.231061763211044e-06, "in-drivable-lane": 8.650000000000123, "deviation-heading": 3.1223185627392227, "agent_compute-ego0": 0.012107252871563732, "complete-iteration": 0.16390772085173572, "set_robot_commands": 0.00219529166538427, "deviation-center-line": 1.236329741273805, "driven_lanedir_consec": 3.4800334831893256, "sim_compute_sim_state": 0.010427839711333865, "sim_compute_performance-ego0": 0.0019819911131460782}

LF-norm-zigzag-000-ego0: {"driven_any": 1.5129114876636331, "get_ui_image": 0.03646492713536972, "step_physics": 0.1273919362288255, "survival_time": 9.700000000000005, "driven_lanedir": 0.6685826775569077, "get_state_dump": 0.0047594192700508315, "get_robot_state": 0.0037278383206098505, "sim_render-ego0": 0.0038381124154115336, "get_duckie_state": 2.30715825007512e-06, "in-drivable-lane": 5.250000000000011, "deviation-heading": 1.495530183342855, "agent_compute-ego0": 0.01261872144845816, "complete-iteration": 0.20313743566855405, "set_robot_commands": 0.002287686176789113, "deviation-center-line": 0.25974088343171975, "driven_lanedir_consec": 0.6685826775569077, "sim_compute_sim_state": 0.009884384350898938, "sim_compute_performance-ego0": 0.00206911135942508}

LF-norm-techtrack-000-ego0: {"driven_any": 7.409040570058979, "get_ui_image": 0.03267104371133272, "step_physics": 0.11739616306449462, "survival_time": 43.499999999999666, "driven_lanedir": 4.7537467948731, "get_state_dump": 0.004682003835859857, "get_robot_state": 0.003704930008757949, "sim_render-ego0": 0.0038180148697337384, "get_duckie_state": 2.1052552145466327e-06, "in-drivable-lane": 14.049999999999647, "deviation-heading": 8.814764212192447, "agent_compute-ego0": 0.012571868885535189, "complete-iteration": 0.19212590785853215, "set_robot_commands": 0.002183494556921905, "deviation-center-line": 2.535359300523973, "driven_lanedir_consec": 4.7537467948731, "sim_compute_sim_state": 0.013002551929548332, "sim_compute_performance-ego0": 0.0020058248127916515}

LF-norm-small_loop-000-ego0: {"driven_any": 10.422607366469997, "get_ui_image": 0.026887040054867608, "step_physics": 0.09844392959918706, "survival_time": 59.99999999999873, "driven_lanedir": 6.185608756717224, "get_state_dump": 0.004683960486609771, "get_robot_state": 0.0036970203266254967, "sim_render-ego0": 0.0037751501545520943, "get_duckie_state": 2.2208065315646792e-06, "in-drivable-lane": 22.149999999999455, "deviation-heading": 11.728057928981128, "agent_compute-ego0": 0.012345056748211533, "complete-iteration": 0.16036448947198187, "set_robot_commands": 0.002207558915378847, "deviation-center-line": 3.160312638762253, "driven_lanedir_consec": 6.185608756717224, "sim_compute_sim_state": 0.006257557650589129, "sim_compute_performance-ego0": 0.0019761362639593143}
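For reference, the aggregate `*_max` / `*_mean` / `*_median` / `*_min` lines above are plain summary statistics over the four episodes. A minimal sketch reproducing the `survival_time` aggregates from the per-episode values (values copied from the per-episodes data above; how the evaluator actually computes them is an assumption on our part):

```python
import statistics

# Per-episode survival_time values, taken from the per-episodes data above.
survival_time = {
    "LF-norm-loop-000-ego0": 29.30000000000028,
    "LF-norm-zigzag-000-ego0": 9.700000000000005,
    "LF-norm-techtrack-000-ego0": 43.499999999999666,
    "LF-norm-small_loop-000-ego0": 59.99999999999873,
}

values = list(survival_time.values())
aggregates = {
    "max": max(values),
    "mean": statistics.mean(values),
    "median": statistics.median(values),  # even count: mean of middle two
    "min": min(values),
}
for name, value in aggregates.items():
    print(f"survival_time_{name}: {value}")
```

The results match the reported lines (e.g. `survival_time_mean: 35.624…`, `survival_time_median: 36.399…`) up to floating-point noise.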
set_robot_commands_max: 0.002287686176789113
set_robot_commands_mean: 0.002218507828618534
set_robot_commands_median: 0.0022014252903815586
set_robot_commands_min: 0.002183494556921905
sim_compute_performance-ego0_max: 0.00206911135942508
sim_compute_performance-ego0_mean: 0.002008265887330531
sim_compute_performance-ego0_median: 0.001993907962968865
sim_compute_performance-ego0_min: 0.0019761362639593143
sim_compute_sim_state_max: 0.013002551929548332
sim_compute_sim_state_mean: 0.009893083410592566
sim_compute_sim_state_median: 0.0101561120311164
sim_compute_sim_state_min: 0.006257557650589129
sim_render-ego0_max: 0.0038381124154115336
sim_render-ego0_mean: 0.003790609711610056
sim_render-ego0_median: 0.003796582512142917
sim_render-ego0_min: 0.003731161406742856
simulation-passed: 1
step_physics_max: 0.1273919362288255
step_physics_mean: 0.1099762720483856
step_physics_median: 0.10792004633184084
step_physics_min: 0.09667305930103516
survival_time_max: 59.99999999999873
survival_time_mean: 35.62499999999967
survival_time_min: 9.700000000000005
Job 58481 (LFv-sim): success, up to date: yes, duration 0:11:22
Job 52443 (LFv-sim): error, up to date: no, duration 0:07:28
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140302566527040
- M:video_aido:cmdline(in:/;out:/) 140303894610368
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
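The root cause of this traceback (and of the two identical error jobs below) is that `banner1.png` exists in the video pipeline's working directory but is not a decodable image, so PIL raises `UnidentifiedImageError` inside the `static_image` block. A quick standalone way to pre-check such an asset with Pillow; the helper name is ours, not part of the evaluator:

```python
from PIL import Image, UnidentifiedImageError


def is_valid_image(path: str) -> bool:
    """Return True if Pillow can identify the file and its structure is intact."""
    try:
        with Image.open(path) as im:
            im.verify()  # checks integrity without decoding all pixel data
        return True
    except (FileNotFoundError, UnidentifiedImageError, OSError):
        return False
```

Running `is_valid_image("banner1.png")` before invoking the video pipeline would have turned this evaluator crash into an actionable "bad banner asset" message.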
Job 52431 (LFv-sim): error, up to date: no, duration 0:05:09
InvalidEvaluator: same traceback as job 52443 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').
Job 52423 (LFv-sim): error, up to date: no, duration 0:03:57
InvalidEvaluator: same traceback as job 52443 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png').
Job 41779 (LFv-sim): success, up to date: no, duration 0:08:28
Job 41778 (LFv-sim): success, up to date: no, duration 0:08:12
Job 38320 (LFv-sim): success, up to date: no, duration 0:10:40
Job 38318 (LFv-sim): success, up to date: no, duration 0:11:19
Job 36410 (LFv-sim): error, up to date: no, duration 0:00:47
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-Sandy2-sandy-1-job36410-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35835 (LFv-sim): success, up to date: no, duration 0:01:02
Job 35422 (LFv-sim): error, up to date: no, duration 0:22:36
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-reg05-b2dee9d94ee0-1-job35422:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-reg05-b2dee9d94ee0-1-job35422/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-reg05-b2dee9d94ee0-1-job35422/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-reg05-b2dee9d94ee0-1-job35422/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-reg05-b2dee9d94ee0-1-job35422/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9239/LFv-sim-reg05-b2dee9d94ee0-1-job35422/logs/challenges-runner/stderr.log
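Jobs 36410 and 35422 fail the same way: the evaluator never wrote `challenge-results/challenge_results.yaml` into the job working directory, so the runner raises `NoResultsFound`. A minimal sketch of that existence check, with the directory layout taken from the paths above (the function name is hypothetical, not the runner's API):

```python
from pathlib import Path
from typing import Optional


def find_challenge_results(working_dir: str) -> Optional[Path]:
    """Return the results file if the evaluator produced one, else None."""
    results = Path(working_dir) / "challenge-results" / "challenge_results.yaml"
    return results if results.is_file() else None
```

Probing for the file this way, before attempting to parse it, reproduces the runner's decision point: a `None` result corresponds to the "result file is not found" error shown above.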
Job 35067 (LFv-sim): success, up to date: no, duration 0:22:26
Job 34518 (LFv-sim): success, up to date: no, duration 0:25:53
Job 34517 (LFv-sim): success, up to date: no, duration 0:25:11