
Submission 9238

Submission: 9238
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58498
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58498

Episodes evaluated in this job (detailed statistics are recorded per episode):

LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Fields per job: job ID, step, status, up to date, date started, date completed, duration, message.
Job 58498 (step LFv-sim): success, up to date, duration 0:07:43
driven_lanedir_consec_median: 0.7170315445576467
survival_time_median: 11.850000000000032
deviation-center-line_median: 0.20667517845645303
in-drivable-lane_median: 4.75000000000003


other stats
agent_compute-ego0_max: 0.037014259015276135
agent_compute-ego0_mean: 0.020861678852141484
agent_compute-ego0_median: 0.0169650330116313
agent_compute-ego0_min: 0.012502390370027197
complete-iteration_max: 0.24549580827544007
complete-iteration_mean: 0.19282262361801075
complete-iteration_median: 0.1806176727377493
complete-iteration_min: 0.16455934072110434
deviation-center-line_max: 0.5501822899732269
deviation-center-line_mean: 0.26306079568846596
deviation-center-line_min: 0.08871053586773069
deviation-heading_max: 1.6324824282171502
deviation-heading_mean: 1.1272176994862315
deviation-heading_median: 1.0913600991085717
deviation-heading_min: 0.6936681715106332
driven_any_max: 3.807100032993421
driven_any_mean: 2.2347852637536874
driven_any_median: 2.266234432369532
driven_any_min: 0.5995721572822645
driven_lanedir_consec_max: 2.0494051565195983
driven_lanedir_consec_mean: 0.9167485980134332
driven_lanedir_consec_min: 0.18352614641884157
driven_lanedir_max: 2.0494051565195983
driven_lanedir_mean: 0.9167485980134332
driven_lanedir_median: 0.7170315445576467
driven_lanedir_min: 0.18352614641884157
get_duckie_state_max: 1.5723554393913172e-06
get_duckie_state_mean: 1.3740491735656262e-06
get_duckie_state_median: 1.355451509557029e-06
get_duckie_state_min: 1.2129382357571297e-06
get_robot_state_max: 0.0038108372990089128
get_robot_state_mean: 0.003716208976368351
get_robot_state_median: 0.0037310240662874338
get_robot_state_min: 0.003591950473889627
get_state_dump_max: 0.004747551957560327
get_state_dump_mean: 0.004609557420379938
get_state_dump_median: 0.004600479416689224
get_state_dump_min: 0.004489718890580975
get_ui_image_max: 0.0353971493395069
get_ui_image_mean: 0.03044638797272786
get_ui_image_median: 0.030216417478485073
get_ui_image_min: 0.025955567594434393
in-drivable-lane_max: 14.750000000000137
in-drivable-lane_mean: 6.687500000000047
in-drivable-lane_min: 2.499999999999994
per-episode details

LF-norm-loop-000-ego0:
  driven_any: 2.8427528573522265
  get_ui_image: 0.02847836611620802
  step_physics: 0.09713831289636396
  survival_time: 14.600000000000072
  driven_lanedir: 2.0494051565195983
  get_state_dump: 0.004578771037860128
  get_robot_state: 0.0036975849203689103
  sim_render-ego0: 0.0038150911038239256
  get_duckie_state: 1.381688557387212e-06
  in-drivable-lane: 4.300000000000061
  deviation-heading: 1.399891344524762
  agent_compute-ego0: 0.012502390370027197
  complete-iteration: 0.16455934072110434
  set_robot_commands: 0.0022328192870364662
  deviation-center-line: 0.5501822899732269
  driven_lanedir_consec: 2.0494051565195983
  sim_compute_sim_state: 0.010012912262014968
  sim_compute_performance-ego0: 0.002014386369099796

LF-norm-zigzag-000-ego0:
  driven_any: 0.5995721572822645
  get_ui_image: 0.0353971493395069
  step_physics: 0.16297507889663118
  survival_time: 3.899999999999994
  driven_lanedir: 0.18352614641884157
  get_state_dump: 0.0046221877955183194
  get_robot_state: 0.0038108372990089128
  sim_render-ego0: 0.0039598971982545495
  get_duckie_state: 1.5723554393913172e-06
  in-drivable-lane: 2.499999999999994
  deviation-heading: 0.7828288536923811
  agent_compute-ego0: 0.02026564561868016
  complete-iteration: 0.24549580827544007
  set_robot_commands: 0.002177232428442074
  deviation-center-line: 0.08871053586773069
  driven_lanedir_consec: 0.18352614641884157
  sim_compute_sim_state: 0.01011781451068347
  sim_compute_performance-ego0: 0.0020772384691841997

LF-norm-techtrack-000-ego0:
  driven_any: 3.807100032993421
  get_ui_image: 0.031954468840762126
  step_physics: 0.11006092469309278
  survival_time: 19.25000000000014
  driven_lanedir: 0.7175869876403856
  get_state_dump: 0.004747551957560327
  get_robot_state: 0.003764463212205956
  sim_render-ego0: 0.003923265427505414
  get_duckie_state: 1.329214461726846e-06
  in-drivable-lane: 14.750000000000137
  deviation-heading: 1.6324824282171502
  agent_compute-ego0: 0.013664420404582444
  complete-iteration: 0.18346945362387543
  set_robot_commands: 0.0023138177209567528
  deviation-center-line: 0.22442338157528335
  driven_lanedir_consec: 0.7175869876403856
  sim_compute_sim_state: 0.010919974257908955
  sim_compute_performance-ego0: 0.002029677129162408

LF-norm-small_loop-000-ego0:
  driven_any: 1.6897160073868376
  get_ui_image: 0.025955567594434393
  step_physics: 0.09394577552712027
  survival_time: 9.099999999999994
  driven_lanedir: 0.7164761014749079
  get_state_dump: 0.004489718890580975
  get_robot_state: 0.003591950473889627
  sim_render-ego0: 0.003702068589424175
  get_duckie_state: 1.2129382357571297e-06
  in-drivable-lane: 5.2
  deviation-heading: 0.6936681715106332
  agent_compute-ego0: 0.037014259015276135
  complete-iteration: 0.1777658918516232
  set_robot_commands: 0.0021893939033883515
  deviation-center-line: 0.18892697533762276
  driven_lanedir_consec: 0.7164761014749079
  sim_compute_sim_state: 0.004936093189677254
  sim_compute_performance-ego0: 0.0018536888185094616
set_robot_commands_max: 0.0023138177209567528
set_robot_commands_mean: 0.002228315834955911
set_robot_commands_median: 0.002211106595212409
set_robot_commands_min: 0.002177232428442074
sim_compute_performance-ego0_max: 0.0020772384691841997
sim_compute_performance-ego0_mean: 0.001993747696488966
sim_compute_performance-ego0_median: 0.002022031749131102
sim_compute_performance-ego0_min: 0.0018536888185094616
sim_compute_sim_state_max: 0.010919974257908955
sim_compute_sim_state_mean: 0.008996698555071162
sim_compute_sim_state_median: 0.01006536338634922
sim_compute_sim_state_min: 0.004936093189677254
sim_render-ego0_max: 0.0039598971982545495
sim_render-ego0_mean: 0.003850080579752016
sim_render-ego0_median: 0.00386917826566467
sim_render-ego0_min: 0.003702068589424175
simulation-passed: 1
step_physics_max: 0.16297507889663118
step_physics_mean: 0.11603002300330204
step_physics_median: 0.10359961879472836
step_physics_min: 0.09394577552712027
survival_time_max: 19.25000000000014
survival_time_mean: 11.712500000000048
survival_time_min: 3.899999999999994
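
The aggregate entries above (min, max, mean and median per metric) are summaries over the four episodes listed in the per-episode details. A minimal sketch in Python of how such aggregates can be reproduced, assuming the per-episode values have been saved locally as per_episodes.json (a hypothetical file, not something the evaluator produces):

import json
import statistics

# Hypothetical local copy of the per-episode details shown above,
# shaped as {episode_name: {metric_name: value, ...}, ...}.
with open("per_episodes.json") as f:
    episodes = json.load(f)

# Group each metric's values across the four episodes.
per_metric = {}
for stats in episodes.values():
    for name, value in stats.items():
        per_metric.setdefault(name, []).append(value)

# Summarize each metric with min, max, mean and median, which is how the
# aggregate rows above appear to be formed (e.g. survival_time_median is
# the median of the four per-episode survival_time values, 11.85 here).
summary = {}
for name, values in per_metric.items():
    summary[name + "_min"] = min(values)
    summary[name + "_max"] = max(values)
    summary[name + "_mean"] = statistics.mean(values)
    summary[name + "_median"] = statistics.median(values)

print(summary["survival_time_median"])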
Job 58496 (step LFv-sim): success, up to date, duration 0:07:25
Job 58494 (step LFv-sim): success, up to date, duration 0:06:56
Job 58493 (step LFv-sim): success, up to date, duration 0:08:01
Job 52436 (step LFv-sim): error, not up to date, duration 0:03:57
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139654066405680
- M:video_aido:cmdline(in:/;out:/) 139654068709456
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
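
The root cause of this failure (and of the identical failures in jobs 52432 and 52425 below) is that PIL cannot decode banner1.png while make_video2 assembles the episode video, so the evaluator aborts and the job is marked InvalidEvaluator. As a minimal sketch, not part of the evaluator, one way to check locally whether a banner file is a readable image using Pillow:

from PIL import Image, UnidentifiedImageError

def check_banner(path):
    """Return True if Pillow can identify and decode the image at path."""
    try:
        with Image.open(path) as im:
            im.verify()  # integrity check without rendering the full image
        return True
    except (UnidentifiedImageError, OSError) as exc:
        # UnidentifiedImageError is exactly what the traceback above reports
        # for banner1.png; OSError also covers a missing or truncated file.
        print(path, "is not a usable image:", exc)
        return False

check_banner("banner1.png")  # illustrative call with the file name from the traceback

If the check fails, banner1.png itself is corrupt or not a real image, which would explain why make_video2 aborts; where that file comes from is not visible in this log.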
Job 52432 (step LFv-sim): error, not up to date, duration 0:05:39
InvalidEvaluator: the same banner1.png failure as in job 52436; the traceback is identical to the one shown above and is omitted here.
Job 52425 (step LFv-sim): error, not up to date, duration 0:03:35
InvalidEvaluator: again the same banner1.png failure as in job 52436 (identical traceback, omitted).
Job 41780 (step LFv-sim): success, not up to date, duration 0:07:28
Job 38319 (step LFv-sim): success, not up to date, duration 0:07:22
Job 36414 (step LFv-sim): success, not up to date, duration 0:10:23
Job 36413 (step LFv-sim): error, not up to date, duration 0:00:53
The container "solut [...]
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-Sandy2-sandy-1-job36413-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35837 (step LFv-sim): success, not up to date, duration 0:00:59
Job 35423 (step LFv-sim): error, not up to date, duration 0:22:12
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423/logs/challenges-runner/stderr.log
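
The file the runner is checking for here is challenge-results/challenge_results.yaml inside the job's working directory; it only exists if the evaluation ran to completion, which is why jobs that crash early (such as 36413 above) end with the same NoResultsFound error. A minimal sketch of that check and load, assuming only the standard library and PyYAML; load_challenge_results below is an illustrative stand-in, not the actual duckietown_challenges.read_challenge_results implementation:

import os
import yaml  # PyYAML

def load_challenge_results(working_dir):
    """Illustrative helper: load challenge_results.yaml from a job working dir."""
    path = os.path.join(working_dir, "challenge-results", "challenge_results.yaml")
    if not os.path.exists(path):
        # This is the situation reported above: the evaluator never wrote
        # its results file, so there is nothing to score.
        raise FileNotFoundError("File '%s' does not exist." % path)
    with open(path) as f:
        return yaml.safe_load(f)

# Hypothetical usage with the working directory named in the message above:
# results = load_challenge_results(
#     "/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
#     "submission9238/LFv-sim-reg03-0c28c9d61367-1-job35423")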
Job 35068 (step LFv-sim): success, not up to date, duration 0:22:47
Job 34485 (step LFv-sim): success, not up to date, duration 0:23:08
Job 34484 (step LFv-sim): success, not up to date, duration 0:24:04