
Submission 6834

Submission: 6834
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58564
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58564


Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
58564 | LFv-sim | success | yes | 0:45:00
driven_lanedir_consec_median: 0.0
survival_time_median: 59.99999999999873
deviation-center-line_median: 1.2422730096440104
in-drivable-lane_median: 0.0


other stats
agent_compute-ego0_max: 0.014011656811195646
agent_compute-ego0_mean: 0.012750620647433595
agent_compute-ego0_median: 0.012561020307199446
agent_compute-ego0_min: 0.011868785164139847
complete-iteration_max: 0.37218273132667257
complete-iteration_mean: 0.3143631987131009
complete-iteration_median: 0.31480509543994584
complete-iteration_min: 0.25565987264583945
deviation-center-line_max: 4.053503393024394
deviation-center-line_mean: 1.731069028434027
deviation-center-line_min: 0.386226701423694
deviation-heading_max: 27.859809596736422
deviation-heading_mean: 14.925250437835436
deviation-heading_median: 14.309269950178932
deviation-heading_min: 3.22265225424745
driven_any_max: 2.6645352591003757e-13
driven_any_mean: 1.9984014443252818e-13
driven_any_median: 2.6645352591003757e-13
driven_any_min: 0.0
driven_lanedir_consec_max: 0.000286102294921875
driven_lanedir_consec_mean: 7.152557373046875e-05
driven_lanedir_consec_min: 0.0
driven_lanedir_max: 0.000286102294921875
driven_lanedir_mean: 7.152557373046875e-05
driven_lanedir_median: 0.0
driven_lanedir_min: 0.0
get_duckie_state_max: 2.105865351464925e-06
get_duckie_state_mean: 2.064176840547916e-06
get_duckie_state_median: 2.0865099713963137e-06
get_duckie_state_min: 1.977822067934111e-06
get_robot_state_max: 0.003931004836299239
get_robot_state_mean: 0.003846876428685121
get_robot_state_median: 0.003835785696647447
get_robot_state_min: 0.0037849294851463504
get_state_dump_max: 0.0048542814786785545
get_state_dump_mean: 0.004798086805208636
get_state_dump_median: 0.004780678923779185
get_state_dump_min: 0.0047767078945976215
get_ui_image_max: 0.03720857817167843
get_ui_image_mean: 0.03207851667189777
get_ui_image_median: 0.03182396007318679
get_ui_image_min: 0.027457568369539057
in-drivable-lane_max: 0.0
in-drivable-lane_mean: 0.0
in-drivable-lane_min: 0.0
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.02969254284079724, "step_physics": 0.2301959576555136, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.0047767078945976215, "get_robot_state": 0.003802752316146965, "sim_render-ego0": 0.003818766659840656, "get_duckie_state": 2.072316026012665e-06, "in-drivable-lane": 0.0, "deviation-heading": 22.66279310353771, "agent_compute-ego0": 0.011873465592815515, "complete-iteration": 0.29742493299917816, "set_robot_commands": 0.002275632481888669, "deviation-center-line": 4.053503393024394, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.008877491176773568, "sim_compute_performance-ego0": 0.002023745138976695},
 "LF-norm-zigzag-000-ego0": {"driven_any": 0.0, "get_ui_image": 0.03720857817167843, "step_physics": 0.291443513890885, "survival_time": 59.99999999999873, "driven_lanedir": 0.000286102294921875, "get_state_dump": 0.0048542814786785545, "get_robot_state": 0.0038688190771479294, "sim_render-ego0": 0.004027390658706551, "get_duckie_state": 2.105865351464925e-06, "in-drivable-lane": 0.0, "deviation-heading": 27.859809596736422, "agent_compute-ego0": 0.014011656811195646, "complete-iteration": 0.37218273132667257, "set_robot_commands": 0.0023654583193281112, "deviation-center-line": 1.0457540566888746, "driven_lanedir_consec": 0.000286102294921875, "sim_compute_sim_state": 0.01216140793920258, "sim_compute_performance-ego0": 0.002152283920237266},
 "LF-norm-techtrack-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.033955377305576344, "step_physics": 0.2584479291869838, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004780564875924319, "get_robot_state": 0.003931004836299239, "sim_render-ego0": 0.003959498933511015, "get_duckie_state": 1.977822067934111e-06, "in-drivable-lane": 0.0, "deviation-heading": 3.22265225424745, "agent_compute-ego0": 0.013248575021583373, "complete-iteration": 0.3321852578807135, "set_robot_commands": 0.0023862765690964723, "deviation-center-line": 1.4387919625991463, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.009259310888311051, "sim_compute_performance-ego0": 0.002129233747001095},
 "LF-norm-small_loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.027457568369539057, "step_physics": 0.19309287384884444, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004780792971634051, "get_robot_state": 0.0037849294851463504, "sim_render-ego0": 0.0038942623694274542, "get_duckie_state": 2.100703916779962e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.955746796820156, "agent_compute-ego0": 0.011868785164139847, "complete-iteration": 0.25565987264583945, "set_robot_commands": 0.0023024626119646997, "deviation-center-line": 0.386226701423694, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.0063431249073800395, "sim_compute_performance-ego0": 0.00204898078276851}}
set_robot_commands_max: 0.0023862765690964723
set_robot_commands_mean: 0.002332457495569488
set_robot_commands_median: 0.0023339604656464055
set_robot_commands_min: 0.002275632481888669
sim_compute_performance-ego0_max: 0.002152283920237266
sim_compute_performance-ego0_mean: 0.002088560897245892
sim_compute_performance-ego0_median: 0.0020891072648848027
sim_compute_performance-ego0_min: 0.002023745138976695
sim_compute_sim_state_max: 0.01216140793920258
sim_compute_sim_state_mean: 0.00916033372791681
sim_compute_sim_state_median: 0.00906840103254231
sim_compute_sim_state_min: 0.0063431249073800395
sim_render-ego0_max: 0.004027390658706551
sim_render-ego0_mean: 0.003924979655371419
sim_render-ego0_median: 0.003926880651469235
sim_render-ego0_min: 0.003818766659840656
simulation-passed: 1
step_physics_max: 0.291443513890885
step_physics_mean: 0.24329506864555672
step_physics_median: 0.2443219434212487
step_physics_min: 0.19309287384884444
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
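The aggregate rows above (min/mean/median/max per metric) appear to be plain statistics computed across the four episodes in the per-episodes details. A minimal sketch, assuming that derivation, using the deviation-center-line values excerpted from the blob above:

```python
import statistics

# deviation-center-line per episode, excerpted from the
# "per-episodes details" blob above.
per_episode = {
    "LF-norm-loop-000-ego0": 4.053503393024394,
    "LF-norm-zigzag-000-ego0": 1.0457540566888746,
    "LF-norm-techtrack-000-ego0": 1.4387919625991463,
    "LF-norm-small_loop-000-ego0": 0.386226701423694,
}

values = list(per_episode.values())
# With four episodes the median is the mean of the two middle values,
# which reproduces the reported deviation-center-line_median.
print("median:", statistics.median(values))
print("mean:  ", statistics.mean(values))
print("min:   ", min(values))
print("max:   ", max(values))
```

Running this reproduces the deviation-center-line_median, _mean, _min, and _max rows of the table.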
58562 | LFv-sim | success | yes | 0:46:55
52494 | LFv-sim | error | no | 0:11:46
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140637363468608
- M:video_aido:cmdline(in:/;out:/) 140637363493616
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
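The root cause of jobs 52494 and 52484 is PIL raising UnidentifiedImageError on 'banner1.png': the file exists but is empty, truncated, or not actually a PNG. A minimal stdlib-only sketch of a pre-flight check (the helper name is mine, not part of the Duckietown or procgraph tooling) that would catch such a file before the video pipeline touches it:

```python
from pathlib import Path

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"  # the first 8 bytes of every valid PNG

def looks_like_png(path: str) -> bool:
    """Cheap sanity check: does the file start with the PNG magic bytes?

    PIL's UnidentifiedImageError ("cannot identify image file") usually
    means the file is empty, truncated, or not an image at all; this
    catches those cases without importing PIL.
    """
    p = Path(path)
    if not p.is_file() or p.stat().st_size < len(PNG_SIGNATURE):
        return False
    with p.open("rb") as f:
        return f.read(len(PNG_SIGNATURE)) == PNG_SIGNATURE

# A zero-byte banner1.png, as in the traceback above, fails this check.
```

This only validates the header, not the whole file, but it distinguishes the "empty or wrong-format asset" failure seen here from a genuinely corrupt image body.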
52484 | LFv-sim | error | no | 0:12:50
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140463473547296
- M:video_aido:cmdline(in:/;out:/) 140463473474960
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
41813 | LFv-sim | success | no | 0:09:29
38370 | LFv-sim | success | no | 0:09:05
36455 | LFv-sim | success | no | 0:10:18
35876 | LFv-sim | success | no | 0:01:04
35456 | LFv-sim | error | no | 0:22:19
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-0c28c9d61367-1-job35456:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-0c28c9d61367-1-job35456/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-0c28c9d61367-1-job35456/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-0c28c9d61367-1-job35456/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-0c28c9d61367-1-job35456/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-0c28c9d61367-1-job35456/logs/challenges-runner/stderr.log
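As the message says, a missing challenge_results.yaml means the evaluator stopped before writing results, and the runner logs are the place to look. A small sketch of that check (the helper name is mine; the directory layout is taken from the paths listed above):

```python
from pathlib import Path
from typing import Optional

def find_challenge_results(working_dir: str) -> Optional[Path]:
    """Return the results file if the evaluator produced one, else None.

    The layout (challenge-results/challenge_results.yaml, logs under
    logs/challenges-runner/) mirrors the paths in the message above;
    the function itself is illustrative, not part of the Duckietown tooling.
    """
    wd = Path(working_dir)
    results = wd / "challenge-results" / "challenge_results.yaml"
    if results.is_file():
        return results
    # No results file: the evaluator likely died early. Point at the logs
    # that the error message says to check.
    for name in ("stdout.log", "stderr.log"):
        log = wd / "logs" / "challenges-runner" / name
        if log.is_file():
            print(f"inspect {log}")
    return None
```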
35455 | LFv-sim | error | no | 0:22:14
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-94a6fab21ac9-1-job35455:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-94a6fab21ac9-1-job35455/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-94a6fab21ac9-1-job35455/docker-compose.original.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-94a6fab21ac9-1-job35455/docker-compose.yaml
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-94a6fab21ac9-1-job35455/logs/challenges-runner/stdout.log
 - /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-94a6fab21ac9-1-job35455/logs/challenges-runner/stderr.log
35141 | LFv-sim | success | no | 0:23:01
35140 | LFv-sim | success | no | 0:23:13
34415 | LFv-sim | success | no | 0:26:19
34253 | LFv-sim | aborted | no | 0:27:46
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg08-4aa5552eb623-1-job34253:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg08-4aa5552eb623-1-job34253/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
33851 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg05-5ca0d35e6d82-1-job33851'
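The host-error jobs below all died the same way: the runner hit ENOSPC (Errno 28) while creating its working directory. A hedged sketch of a guard a runner could apply first, checking free space with the stdlib before calling os.makedirs (the function name and the 10 GiB threshold are illustrative, not values from the Duckietown runner):

```python
import os
import shutil

def makedirs_if_space(wd: str, min_free_bytes: int = 10 * 1024**3) -> bool:
    """Create the working directory only if the filesystem has headroom.

    min_free_bytes is an illustrative threshold (10 GiB by default);
    tune it to what an evaluation job actually writes.
    """
    parent = os.path.dirname(wd) or "."
    os.makedirs(parent, exist_ok=True)
    free = shutil.disk_usage(parent).free
    if free < min_free_bytes:
        # Refusing early gives a clearer error than ENOSPC mid-evaluation.
        return False
    os.makedirs(wd, exist_ok=True)
    return True
```

This turns a hard OSError at an arbitrary point into an explicit, retryable "host is full" condition before the job starts.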
33846 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-53440c9394b5-1-job33846'
33843 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg01-53440c9394b5-1-job33843'
33833 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg03-c2bc3037870e-1-job33833'
33830 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg07-c4e193407567-1-job33830'
33824 | LFv-sim | host-error | no | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg07-c4e193407567-1-job33824'
33820 | LFv-sim | host-error | no | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg11-951de1eeccca-1-job33820'
33813 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg11-951de1eeccca-1-job33813'
33721 | LFv-sim | host-error | no | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6834/LFv-sim-reg04-bf35e9d68df4-1-job33721'
33414 | LFv-sim | success | no | 0:23:34
33413 | LFv-sim | success | no | 0:23:24