
Submission 6829

Submission: 6829
Competing: yes
Challenge: aido5-LF-sim-validation
User: Himanshu Arora 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58582
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

58582

Click the images to see detailed statistics about each episode.

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Columns: Job ID, step, status, up to date, date started, date completed, duration, message
58582 | step: LFv-sim | status: success | up to date: yes | duration: 0:34:42
driven_lanedir_consec_median: 3.8286461697708942
survival_time_median: 59.27499999999877
deviation-center-line_median: 2.912378769463648
in-drivable-lane_median: 12.649999999999492


Other stats:
agent_compute-ego0_max: 0.013019842499352152
agent_compute-ego0_mean: 0.01279498185717372
agent_compute-ego0_median: 0.01275313635055072
agent_compute-ego0_min: 0.01265381222824129
complete-iteration_max: 0.22816343580089735
complete-iteration_mean: 0.20329643577801992
complete-iteration_median: 0.20261255795285368
complete-iteration_min: 0.17979719140547498
deviation-center-line_max: 4.1659454824995645
deviation-center-line_mean: 2.9576781201616886
deviation-center-line_min: 1.8400094592198943
deviation-heading_max: 12.87439175576276
deviation-heading_mean: 9.350797814146452
deviation-heading_median: 9.151664055497218
deviation-heading_min: 6.2254713898286145
driven_any_max: 7.921170431642517
driven_any_mean: 7.394794003971707
driven_any_median: 7.774686056288765
driven_any_min: 6.108633471666776
driven_lanedir_consec_max: 5.159369007947497
driven_lanedir_consec_mean: 3.8681769334683582
driven_lanedir_consec_min: 2.6560463863841486
driven_lanedir_max: 7.393812852353614
driven_lanedir_mean: 5.267659729258569
driven_lanedir_median: 4.909789453613135
driven_lanedir_min: 3.857247157454392
get_duckie_state_max: 1.445400228508307e-06
get_duckie_state_mean: 1.4219523601606257e-06
get_duckie_state_median: 1.4196161758446658e-06
get_duckie_state_min: 1.4031768604448643e-06
get_robot_state_max: 0.004120947557911531
get_robot_state_mean: 0.003975323843335649
get_robot_state_median: 0.003976069322533651
get_robot_state_min: 0.003828209170363764
get_state_dump_max: 0.004988222098370377
get_state_dump_mean: 0.004936197258655076
get_state_dump_median: 0.004937675110450462
get_state_dump_min: 0.004881216715349007
get_ui_image_max: 0.03716002629314512
get_ui_image_mean: 0.032596223238936954
get_ui_image_median: 0.03258343370253079
get_ui_image_min: 0.02805799925754112
in-drivable-lane_max: 27.099999999998687
in-drivable-lane_mean: 13.799999999999429
in-drivable-lane_min: 2.80000000000004
per-episodes
details{"LF-norm-loop-000-ego0": {"driven_any": 7.921170431642517, "get_ui_image": 0.02990666853200387, "step_physics": 0.10918780310167855, "survival_time": 59.99999999999873, "driven_lanedir": 7.393812852353614, "get_state_dump": 0.004906441150954324, "get_robot_state": 0.003980962561131715, "sim_render-ego0": 0.004113594955647617, "get_duckie_state": 1.415027170554486e-06, "in-drivable-lane": 2.80000000000004, "deviation-heading": 6.2254713898286145, "agent_compute-ego0": 0.012728390149728742, "complete-iteration": 0.18038412712694307, "set_robot_commands": 0.0024024455573139936, "deviation-center-line": 2.642568562206722, "driven_lanedir_consec": 4.141236104827591, "sim_compute_sim_state": 0.01086744559396812, "sim_compute_performance-ego0": 0.0022013707522250134}, "LF-norm-zigzag-000-ego0": {"driven_any": 6.108633471666776, "get_ui_image": 0.03716002629314512, "step_physics": 0.1453177921331612, "survival_time": 47.049999999999464, "driven_lanedir": 5.159369007947497, "get_state_dump": 0.004881216715349007, "get_robot_state": 0.003828209170363764, "sim_render-ego0": 0.003976549312567255, "get_duckie_state": 1.4031768604448643e-06, "in-drivable-lane": 6.4999999999996305, "deviation-heading": 7.947299260544233, "agent_compute-ego0": 0.01265381222824129, "complete-iteration": 0.22484098877876427, "set_robot_commands": 0.002341732857333627, "deviation-center-line": 4.1659454824995645, "driven_lanedir_consec": 5.159369007947497, "sim_compute_sim_state": 0.012479684914752936, "sim_compute_performance-ego0": 0.0021120018766690716}, "LF-norm-techtrack-000-ego0": {"driven_any": 7.628625527599891, "get_ui_image": 0.03526019887305771, "step_physics": 0.1488354708554395, "survival_time": 58.54999999999881, "driven_lanedir": 4.660209899278772, "get_state_dump": 0.004968909069946601, "get_robot_state": 0.004120947557911531, "sim_render-ego0": 0.004188982293060615, "get_duckie_state": 1.4242051811348456e-06, "in-drivable-lane": 18.79999999999935, "deviation-heading": 12.87439175576276, "agent_compute-ego0": 0.013019842499352152, "complete-iteration": 0.22816343580089735, "set_robot_commands": 0.002537318260596474, "deviation-center-line": 3.182188976720574, "driven_lanedir_consec": 3.516056234714197, "sim_compute_sim_state": 0.012845727770808614, "sim_compute_performance-ego0": 0.0022942558083517968}, "LF-norm-small_loop-000-ego0": {"driven_any": 7.920746584977638, "get_ui_image": 0.02805799925754112, "step_physics": 0.1150620593119422, "survival_time": 59.99999999999873, "driven_lanedir": 3.857247157454392, "get_state_dump": 0.004988222098370377, "get_robot_state": 0.0039711760839355875, "sim_render-ego0": 0.004083751341782442, "get_duckie_state": 1.445400228508307e-06, "in-drivable-lane": 27.099999999998687, "deviation-heading": 10.356028850450205, "agent_compute-ego0": 0.012777882551372697, "complete-iteration": 0.17979719140547498, "set_robot_commands": 0.002414160028881674, "deviation-center-line": 1.8400094592198943, "driven_lanedir_consec": 2.6560463863841486, "sim_compute_sim_state": 0.00621630508238628, "sim_compute_performance-ego0": 0.002136694203804772}}
set_robot_commands_max: 0.002537318260596474
set_robot_commands_mean: 0.002423914176031442
set_robot_commands_median: 0.0024083027930978333
set_robot_commands_min: 0.002341732857333627
sim_compute_performance-ego0_max: 0.0022942558083517968
sim_compute_performance-ego0_mean: 0.002186080660262663
sim_compute_performance-ego0_median: 0.002169032478014893
sim_compute_performance-ego0_min: 0.0021120018766690716
sim_compute_sim_state_max: 0.012845727770808614
sim_compute_sim_state_mean: 0.010602290840478989
sim_compute_sim_state_median: 0.01167356525436053
sim_compute_sim_state_min: 0.00621630508238628
sim_render-ego0_max: 0.004188982293060615
sim_render-ego0_mean: 0.004090719475764483
sim_render-ego0_median: 0.00409867314871503
sim_render-ego0_min: 0.003976549312567255
simulation-passed: 1
step_physics_max: 0.1488354708554395
step_physics_mean: 0.12960078135055536
step_physics_median: 0.1301899257225517
step_physics_min: 0.10918780310167855
survival_time_max: 59.99999999999873
survival_time_mean: 56.39999999999893
survival_time_min: 47.049999999999464
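The aggregate _min, _max, _mean, and _median values above are statistics taken over the four episodes in the per-episodes details block. A minimal sketch of how they can be reproduced (Python standard library only; the dictionary below copies just the survival_time values from the per-episode data, and printed results may differ from the page in the last floating-point digits):

import statistics

# survival_time values copied from the per-episodes details block above
per_episode_survival_time = {
    "LF-norm-loop-000-ego0": 59.99999999999873,
    "LF-norm-zigzag-000-ego0": 47.049999999999464,
    "LF-norm-techtrack-000-ego0": 58.54999999999881,
    "LF-norm-small_loop-000-ego0": 59.99999999999873,
}

values = list(per_episode_survival_time.values())
print("survival_time_min:   ", min(values))                # 47.049999999999464
print("survival_time_max:   ", max(values))                # 59.99999999999873
print("survival_time_mean:  ", statistics.mean(values))    # about 56.4
print("survival_time_median:", statistics.median(values))  # about 59.275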
58581 | step: LFv-sim | status: success | up to date: yes | duration: 0:37:16
58580 | step: LFv-sim | status: success | up to date: yes | duration: 0:29:57
52533 | step: LFv-sim | status: error | up to date: no | duration: 0:09:43
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140108403471504
- M:video_aido:cmdline(in:/;out:/) 140108403457568
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
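This job (52533) and job 52513 below both failed at the video step because Pillow could not parse 'banner1.png'. As a hedged sketch, not part of the evaluator code, this is one way to check up front whether a banner image is actually readable by Pillow; the filename comes from the traceback above:

from PIL import Image, UnidentifiedImageError

def is_readable_image(path: str) -> bool:
    """Return True if Pillow can identify and fully parse the image file."""
    try:
        with Image.open(path) as im:
            im.verify()  # raises if the file is truncated or corrupted
        return True
    except (FileNotFoundError, UnidentifiedImageError, OSError):
        return False

# For this job the check would have returned False, matching the
# "cannot identify image file 'banner1.png'" error above.
print(is_readable_image("banner1.png"))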
52530 | step: LFv-sim | status: timeout | up to date: no
52513 | step: LFv-sim | status: error | up to date: no | duration: 0:11:17
InvalidEvaluator: same traceback as job 52533 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png', raised while make_video2 rendered the 'static_image' block).
41819 | step: LFv-sim | status: success | up to date: no | duration: 0:08:39
38386 | step: LFv-sim | status: success | up to date: no | duration: 0:08:58
36462 | step: LFv-sim | status: success | up to date: no | duration: 0:09:23
36461 | step: LFv-sim | status: success | up to date: no | duration: 0:09:47
35883 | step: LFv-sim | status: success | up to date: no | duration: 0:00:54
35879 | step: LFv-sim | status: success | up to date: no | duration: 0:01:03
35462 | step: LFv-sim | status: error | up to date: no | duration: 0:21:50
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462/logs/challenges-runner/stderr.log
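When challenge_results.yaml is missing, as reported above, the advice is to inspect the runner logs. A minimal sketch of that check, using only the paths from the job 35462 listing (illustrative, not part of the duckietown tooling):

import pathlib

# Working directory and log locations taken from the listing above.
wd = pathlib.Path(
    "/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
    "submission6829/LFv-sim-reg02-1b92df2e7e91-1-job35462"
)

results = wd / "challenge-results" / "challenge_results.yaml"
if not results.exists():
    # The evaluator did not write its results; look at the runner logs instead.
    for name in ("stdout.log", "stderr.log"):
        log_path = wd / "logs" / "challenges-runner" / name
        if log_path.exists():
            tail = log_path.read_text(errors="replace").splitlines()[-20:]
            print(f"--- last lines of {name} ---")
            print("\n".join(tail))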
35147 | step: LFv-sim | status: success | up to date: no | duration: 0:25:25
34420 | step: LFv-sim | status: success | up to date: no | duration: 0:28:01
34252 | step: LFv-sim | status: aborted | up to date: no | duration: 0:28:01
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg05-58d82dc1badb-1-job34252:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg05-58d82dc1badb-1-job34252/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
33887 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg11-951de1eeccca-1-job33887'
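Job 33887 and the host-error jobs that follow (33884 down to 33821) all failed the same way: the evaluator host had no free disk space left when it tried to create the job's working directory. A pre-flight free-space check on the host would reject the job earlier; a minimal sketch, with an illustrative path and threshold that are not part of duckietown_challenges_runner:

import shutil

# Illustrative values; adjust to the evaluator host's actual layout and needs.
EXECUTIONS_ROOT = "/tmp/duckietown/DT18/evaluator/executions"
MIN_FREE_BYTES = 2 * 1024**3  # require at least 2 GiB free

usage = shutil.disk_usage(EXECUTIONS_ROOT)
if usage.free < MIN_FREE_BYTES:
    raise RuntimeError(
        f"Only {usage.free / 1024**3:.1f} GiB free under {EXECUTIONS_ROOT}; "
        "starting a job now would fail with ENOSPC, as in jobs 33821-33887."
    )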
33884 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg11-951de1eeccca-1-job33884' (same traceback as job 33887)
33880 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg04-bf35e9d68df4-1-job33880' (same traceback as job 33887)
33869 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:01
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg04-bf35e9d68df4-1-job33869' (same traceback as job 33887)
33865 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg07-c4e193407567-1-job33865' (same traceback as job 33887)
33861 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg01-53440c9394b5-1-job33861' (same traceback as job 33887)
33852 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg01-53440c9394b5-1-job33852' (same traceback as job 33887)
33845 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:00
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg05-5ca0d35e6d82-1-job33845' (same traceback as job 33887)
33844 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:01
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg05-5ca0d35e6d82-1-job33844' (same traceback as job 33887)
33821 | step: LFv-sim | status: host-error | up to date: no | duration: 0:00:01
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6829/LFv-sim-reg03-c2bc3037870e-1-job33821' (same traceback as job 33887)
33404 | step: LFv-sim | status: success | up to date: no | duration: 0:12:43