Submission 6762

Submission: 6762
Competing: yes
Challenge: aido5-LF-sim-validation
User: Charlie Gauthier 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58664
Next:
User label: template-ros
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58664

Detailed per-episode statistics are available on the dashboard for each episode:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Each job is listed with its ID, step, status, up-to-date flag, date started, date completed, duration, and message.

Job 58664, step LFv-sim: success, up to date: yes, duration 0:19:05
driven_lanedir_consec_median: 0.8330802569545445
survival_time_median: 27.4500000000001
deviation-center-line_median: 0.4962559748025668
in-drivable-lane_median: 7.400000000000058


other stats
agent_compute-ego0_max: 0.032955262217560226
agent_compute-ego0_mean: 0.028595880137022713
agent_compute-ego0_median: 0.031966440803688684
agent_compute-ego0_min: 0.01749537672315325
complete-iteration_max: 0.2588001010874211
complete-iteration_mean: 0.22812666208356688
complete-iteration_median: 0.22344141271671267
complete-iteration_min: 0.20682372181342107
deviation-center-line_max: 2.2829169308793316
deviation-center-line_mean: 0.8822162663426927
deviation-center-line_min: 0.2534361848863056
deviation-heading_max: 7.7313213382583665
deviation-heading_mean: 4.543267492751355
deviation-heading_median: 3.95020214249486
deviation-heading_min: 2.541344347757335
driven_any_max: 7.916282182083668
driven_any_mean: 4.158965442998144
driven_any_median: 3.5062338257079224
driven_any_min: 1.7071119384930646
driven_lanedir_consec_max: 2.438368898728998
driven_lanedir_consec_mean: 1.150861920906783
driven_lanedir_consec_min: 0.49891827098904473
driven_lanedir_max: 4.031347896612052
driven_lanedir_mean: 1.63742466556544
driven_lanedir_median: 0.9873608480967404
driven_lanedir_min: 0.5436290694562256
get_duckie_state_max: 1.4041151319231306e-06
get_duckie_state_mean: 1.274642588981249e-06
get_duckie_state_median: 1.2965422344591055e-06
get_duckie_state_min: 1.1013707550836543e-06
get_robot_state_max: 0.003835421800613403
get_robot_state_mean: 0.003598550408183813
get_robot_state_median: 0.0036284557999292665
get_robot_state_min: 0.0033018682322633157
get_state_dump_max: 0.005394996915544782
get_state_dump_mean: 0.004826664464788192
get_state_dump_median: 0.004882144184140828
get_state_dump_min: 0.004147372575326327
get_ui_image_max: 0.03330909016961357
get_ui_image_mean: 0.028927918873385017
get_ui_image_median: 0.029678813660235913
get_ui_image_min: 0.02304495800345466
in-drivable-lane_max: 53.59999999999872
in-drivable-lane_mean: 18.099999999999707
in-drivable-lane_min: 3.999999999999986
per-episodes details:

LF-norm-loop-000-ego0: {"driven_any": 4.694571944560155, "get_ui_image": 0.027062670193581915, "step_physics": 0.12288474349818602, "survival_time": 36.40000000000007, "driven_lanedir": 4.031347896612052, "get_state_dump": 0.004698389678662042, "get_robot_state": 0.003649206645531255, "sim_render-ego0": 0.0037178904921920217, "get_duckie_state": 1.3778566169477131e-06, "in-drivable-lane": 3.999999999999986, "deviation-heading": 7.7313213382583665, "agent_compute-ego0": 0.03252051200395749, "complete-iteration": 0.20780711278666847, "set_robot_commands": 0.002230080393933792, "deviation-center-line": 2.2829169308793316, "driven_lanedir_consec": 2.438368898728998, "sim_compute_sim_state": 0.008968567161403075, "sim_compute_performance-ego0": 0.0019843938092308933}

LF-norm-zigzag-000-ego0: {"driven_any": 2.317895706855689, "get_ui_image": 0.03330909016961357, "step_physics": 0.1643296367717239, "survival_time": 18.500000000000128, "driven_lanedir": 1.3104965987573989, "get_state_dump": 0.005065898689619614, "get_robot_state": 0.0036077049543272776, "sim_render-ego0": 0.003684727650768352, "get_duckie_state": 1.215227851970498e-06, "in-drivable-lane": 6.450000000000044, "deviation-heading": 5.137911297996097, "agent_compute-ego0": 0.032955262217560226, "complete-iteration": 0.2588001010874211, "set_robot_commands": 0.002117201324421762, "deviation-center-line": 0.728101257170824, "driven_lanedir_consec": 1.0019354164730068, "sim_compute_sim_state": 0.011649014814844672, "sim_compute_performance-ego0": 0.002001905055701572}

LF-norm-techtrack-000-ego0: {"driven_any": 1.7071119384930646, "get_ui_image": 0.03229495712688991, "step_physics": 0.1626109778881073, "survival_time": 13.950000000000063, "driven_lanedir": 0.5436290694562256, "get_state_dump": 0.005394996915544782, "get_robot_state": 0.003835421800613403, "sim_render-ego0": 0.003970428875514439, "get_duckie_state": 1.4041151319231306e-06, "in-drivable-lane": 8.350000000000072, "deviation-heading": 2.541344347757335, "agent_compute-ego0": 0.01749537672315325, "complete-iteration": 0.2390757126467569, "set_robot_commands": 0.0022583459104810444, "deviation-center-line": 0.2534361848863056, "driven_lanedir_consec": 0.49891827098904473, "sim_compute_sim_state": 0.009010707480566845, "sim_compute_performance-ego0": 0.002115171296255929}

LF-norm-small_loop-000-ego0: {"driven_any": 7.916282182083668, "get_ui_image": 0.02304495800345466, "step_physics": 0.13220339551952656, "survival_time": 59.99999999999873, "driven_lanedir": 0.6642250974360822, "get_state_dump": 0.004147372575326327, "get_robot_state": 0.0033018682322633157, "sim_render-ego0": 0.0033984883043986377, "get_duckie_state": 1.1013707550836543e-06, "in-drivable-lane": 53.59999999999872, "deviation-heading": 2.762492986993624, "agent_compute-ego0": 0.03141236960341988, "complete-iteration": 0.20682372181342107, "set_robot_commands": 0.0019487983281169703, "deviation-center-line": 0.2644106924343097, "driven_lanedir_consec": 0.6642250974360822, "sim_compute_sim_state": 0.00558008520331212, "sim_compute_performance-ego0": 0.0017131157064318755}
set_robot_commands_max: 0.0022583459104810444
set_robot_commands_mean: 0.0021386064892383923
set_robot_commands_median: 0.002173640859177777
set_robot_commands_min: 0.0019487983281169703
sim_compute_performance-ego0_max: 0.002115171296255929
sim_compute_performance-ego0_mean: 0.0019536464669050674
sim_compute_performance-ego0_median: 0.001993149432466233
sim_compute_performance-ego0_min: 0.0017131157064318755
sim_compute_sim_state_max: 0.011649014814844672
sim_compute_sim_state_mean: 0.008802093665031678
sim_compute_sim_state_median: 0.008989637320984959
sim_compute_sim_state_min: 0.00558008520331212
sim_render-ego0_max: 0.003970428875514439
sim_render-ego0_mean: 0.0036928838307183623
sim_render-ego0_median: 0.003701309071480186
sim_render-ego0_min: 0.0033984883043986377
simulation-passed: 1
step_physics_max: 0.1643296367717239
step_physics_mean: 0.14550718841938595
step_physics_median: 0.14740718670381692
step_physics_min: 0.12288474349818602
survival_time_max: 59.99999999999873
survival_time_mean: 32.21249999999975
survival_time_min: 13.950000000000063
No reset possible
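The aggregate rows above (min, mean, median, max) are plain statistics over the four per-episode values in the per-episodes details block. A minimal sketch of that reduction, assuming the per-episodes dict has been saved locally as episodes.json (a hypothetical filename, not an artefact of this job):

import json
from statistics import mean, median

# Load the per-episodes dict shown above (hypothetical local copy of the data).
with open("episodes.json") as f:
    episodes = json.load(f)

def aggregate(metric: str) -> dict:
    """Reduce one metric across all episodes to the *_min/mean/median/max rows."""
    values = [per_episode[metric] for per_episode in episodes.values()]
    return {
        f"{metric}_min": min(values),
        f"{metric}_mean": mean(values),
        f"{metric}_median": median(values),
        f"{metric}_max": max(values),
    }

print(aggregate("survival_time"))
# For the four episodes above: median(36.4, 18.5, 13.95, 60.0) = (18.5 + 36.4) / 2
# = 27.45, matching the survival_time_median row (up to floating-point noise).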
Job 58659, step LFv-sim: success, up to date: yes, duration 0:23:12
No reset possible
Job 58655, step LFv-sim: success, up to date: yes, duration 0:29:38
No reset possible
Job 58653, step LFv-sim: success, up to date: yes, duration 0:26:54
No reset possible
Job 58649, step LFv-sim: success, up to date: yes, duration 0:28:48
No reset possible
Job 52606, step LFv-sim: error, up to date: no, duration 0:09:40
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139984941394288
- M:video_aido:cmdline(in:/;out:/) 139984943641024
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
No reset possible
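This job failed inside the evaluator, not in the submission: the video step could not decode the static banner image 'banner1.png'. A minimal sketch, assuming only that Pillow is installed, of the same check PIL performs, which can be used to validate an image asset before the evaluator tries to render it:

from PIL import Image, UnidentifiedImageError

def can_open_image(path: str) -> bool:
    """Return True if PIL can identify and parse the image header at `path`."""
    try:
        with Image.open(path) as im:
            im.verify()  # parses the header without decoding the pixel data
        return True
    except (FileNotFoundError, UnidentifiedImageError) as e:
        # This is the condition the evaluator hit for 'banner1.png'.
        print(f"{path}: {e}")
        return False

print(can_open_image("banner1.png"))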
Job 52603, step LFv-sim: error, up to date: no, duration 0:10:29
InvalidEvaluator: same traceback as job 52606; the video step failed with PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' in block 'static_image'.
No reset possible
Job 52592, step LFv-sim: error, up to date: no, duration 0:05:45
InvalidEvaluator: same traceback as job 52606; the video step failed with PIL.UnidentifiedImageError: cannot identify image file 'banner1.png' in block 'static_image'.
No reset possible
Job 41857, step LFv-sim: success, up to date: no, duration 0:08:51
No reset possible
Job 38451, step LFv-sim: success, up to date: no, duration 0:08:55
No reset possible
Job 38448, step LFv-sim: success, up to date: no, duration 0:08:55
No reset possible
Job 36502, step LFv-sim: success, up to date: no, duration 0:10:23
No reset possible
Job 35920, step LFv-sim: success, up to date: no, duration 0:00:55
No reset possible
Job 35919, step LFv-sim: success, up to date: no, duration 0:00:58
No reset possible
Job 35916, step LFv-sim: success, up to date: no, duration 0:01:03
No reset possible
Job 35567, step LFv-sim: timeout, up to date: no, duration 1:05:23
Job 35567 timed out: it ran for 3923 seconds, exceeding the 3600.0-second limit.
No reset possible
Job 35533, step LFv-sim: aborted, up to date: no, duration 0:25:13
The result file was not found in the working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533/logs/challenges-runner/stderr.log
No reset possible
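For aborted jobs of this kind, the runner reports a failure because challenge-results/challenge_results.yaml never appeared in the job's working directory. A minimal sketch, assuming shell access to the evaluator host, of that existence check using the paths from the message above:

import os

# Working directory reported by the runner for job 35533 (from the message above).
wd = ("/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
      "submission6762/LFv-sim-reg01-e2eca1b23f29-1-job35533")
results = os.path.join(wd, "challenge-results", "challenge_results.yaml")

if os.path.exists(results):
    print("results file present:", results)
else:
    # The aborted jobs ended here: the evaluator never wrote its results,
    # so the next place to look is the challenges-runner logs listed above.
    print("results file missing; inspect", os.path.join(wd, "logs", "challenges-runner"))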
Job 35492, step LFv-sim: aborted, up to date: no, duration 0:20:56
The result file was not found in the working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-b2dee9d94ee0-1-job35492:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-b2dee9d94ee0-1-job35492/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-b2dee9d94ee0-1-job35492/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-b2dee9d94ee0-1-job35492/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-b2dee9d94ee0-1-job35492/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-b2dee9d94ee0-1-job35492/logs/challenges-runner/stderr.log
No reset possible
Job 35179, step LFv-sim: aborted, up to date: no, duration 0:22:07
No reset possible
Job 34450, step LFv-sim: aborted, up to date: no, duration 0:23:50
No reset possible
Job 34449, step LFv-sim: aborted, up to date: no, duration 0:24:05
No reset possible
Job 34300, step LFv-sim: aborted, up to date: no, duration 0:23:02
The result file was not found in the working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg11-cc3acf431491-1-job34300:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg11-cc3acf431491-1-job34300/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.
No reset possible
Job 34299, step LFv-sim: aborted, up to date: no, duration 0:22:38
The result file was not found in the working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg09-5e69f1a0aa29-1-job34299:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg09-5e69f1a0aa29-1-job34299/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.
No reset possible
Job 34131, step LFv-sim: aborted, up to date: no, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg04-bf35e9d68df4-1-job34131'
No reset possible
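The remaining aborted jobs below failed the same way: the runner could not even create the working directory because the evaluator host had no free disk space. A minimal sketch, not part of duckietown-challenges-runner, of a pre-flight free-space check one could run on such a host (the 1 GiB threshold and the path are illustrative assumptions):

import shutil

def has_free_space(path: str, required_bytes: int = 1 << 30) -> bool:
    """Return True if the filesystem holding `path` has at least `required_bytes` free."""
    usage = shutil.disk_usage(path)  # named tuple: total, used, free
    return usage.free >= required_bytes

# The runner's working area, as seen in the tracebacks for these jobs.
root = "/tmp/duckietown/DT18/evaluator/executions"
if not has_free_space(root):
    print(f"Less than 1 GiB free under {root}; new jobs would abort with ENOSPC.")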
Job 34105, step LFv-sim: aborted, up to date: no, duration 0:00:01
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-53440c9394b5-1-job34105'
No reset possible
Job 34097, step LFv-sim: aborted, up to date: no, duration 0:00:00
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg01-53440c9394b5-1-job34097'
No reset possible
Job 34093, step LFv-sim: aborted, up to date: no, duration 0:00:00
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg11-951de1eeccca-1-job34093'
No reset possible
Job 34085, step LFv-sim: aborted, up to date: no, duration 0:00:00
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg11-951de1eeccca-1-job34085'
No reset possible
Job 34081, step LFv-sim: aborted, up to date: no, duration 0:00:00
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg03-c2bc3037870e-1-job34081'
No reset possible
Job 34074, step LFv-sim: aborted, up to date: no, duration 0:00:01
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg07-c4e193407567-1-job34074'
No reset possible
Job 34067, step LFv-sim: aborted, up to date: no, duration 0:00:00
Uncaught exception, same traceback as job 34131: OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6762/LFv-sim-reg05-5ca0d35e6d82-1-job34067'
No reset possible
Job 33275, step LFv-sim: aborted, up to date: no, duration 0:08:36
No reset possible