
Submission 6816

Submission: 6816
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58616
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58616

Episodes evaluated in this job (detailed per-episode statistics below):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration
58616 | LFv-sim | success | yes | 0:43:26
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 0.0
survival_time_median: 59.99999999999873
deviation-center-line_median: 1.2422730096440104
in-drivable-lane_median: 0.0


Other stats:
agent_compute-ego0_max: 0.013263540005902267
agent_compute-ego0_mean: 0.012683751730001736
agent_compute-ego0_median: 0.012578969097058045
agent_compute-ego0_min: 0.012313528719988592
complete-iteration_max: 0.34842031305775256
complete-iteration_mean: 0.29974807440291634
complete-iteration_median: 0.31015686161015854
complete-iteration_min: 0.2302582613335958
deviation-center-line_max: 4.053503393024394
deviation-center-line_mean: 1.731069028434027
deviation-center-line_min: 0.386226701423694
deviation-heading_max: 27.859809596736422
deviation-heading_mean: 14.925250437835436
deviation-heading_median: 14.309269950178932
deviation-heading_min: 3.22265225424745
driven_any_max: 2.6645352591003757e-13
driven_any_mean: 1.9984014443252818e-13
driven_any_median: 2.6645352591003757e-13
driven_any_min: 0.0
driven_lanedir_consec_max: 0.000286102294921875
driven_lanedir_consec_mean: 7.152557373046875e-05
driven_lanedir_consec_min: 0.0
driven_lanedir_max: 0.000286102294921875
driven_lanedir_mean: 7.152557373046875e-05
driven_lanedir_median: 0.0
driven_lanedir_min: 0.0
get_duckie_state_max: 1.5520037064246591e-06
get_duckie_state_mean: 1.4896694567677975e-06
get_duckie_state_median: 1.5020767516835745e-06
get_duckie_state_min: 1.402520617279383e-06
get_robot_state_max: 0.004060012712565985
get_robot_state_mean: 0.003911097182719336
get_robot_state_median: 0.003952635317221967
get_robot_state_min: 0.003679105383867427
get_state_dump_max: 0.0051633978167938055
get_state_dump_mean: 0.005017103650587783
get_state_dump_median: 0.005111874291342959
get_state_dump_min: 0.004681268202871407
get_ui_image_max: 0.03565240025421066
get_ui_image_mean: 0.030799966321003423
get_ui_image_median: 0.03140358523861951
get_ui_image_min: 0.024740294552564025
in-drivable-lane_max: 0.0
in-drivable-lane_mean: 0.0
in-drivable-lane_min: 0.0
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.029326375179941907, "step_physics": 0.22260036952886653, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.005146661269277657, "get_robot_state": 0.004060012712565985, "sim_render-ego0": 0.004126406033569133, "get_duckie_state": 1.5520037064246591e-06, "in-drivable-lane": 0.0, "deviation-heading": 22.66279310353771, "agent_compute-ego0": 0.012506320017958362, "complete-iteration": 0.29175652870032115, "set_robot_commands": 0.002459444074606915, "deviation-center-line": 4.053503393024394, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.00918123009401396, "sim_compute_performance-ego0": 0.0022538555948859347},
"LF-norm-zigzag-000-ego0": {"driven_any": 0.0, "get_ui_image": 0.03565240025421066, "step_physics": 0.269596189186039, "survival_time": 59.99999999999873, "driven_lanedir": 0.000286102294921875, "get_state_dump": 0.0051633978167938055, "get_robot_state": 0.003976237664711069, "sim_render-ego0": 0.00408349247598132, "get_duckie_state": 1.514087013162046e-06, "in-drivable-lane": 0.0, "deviation-heading": 27.859809596736422, "agent_compute-ego0": 0.013263540005902267, "complete-iteration": 0.34842031305775256, "set_robot_commands": 0.0024129947357431837, "deviation-center-line": 1.0457540566888746, "driven_lanedir_consec": 0.000286102294921875, "sim_compute_sim_state": 0.011935592789534825, "sim_compute_performance-ego0": 0.0022446831299005997},
"LF-norm-techtrack-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.03348079529729711, "step_physics": 0.2556220425058662, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.00507708731340826, "get_robot_state": 0.003929032969732864, "sim_render-ego0": 0.00403413486718933, "get_duckie_state": 1.4900664902051026e-06, "in-drivable-lane": 0.0, "deviation-heading": 3.22265225424745, "agent_compute-ego0": 0.012313528719988592, "complete-iteration": 0.3285571945199959, "set_robot_commands": 0.0023490757271213197, "deviation-center-line": 1.4387919625991463, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.009469426144767463, "sim_compute_performance-ego0": 0.002185133871289713},
"LF-norm-small_loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.024740294552564025, "step_physics": 0.170500155988085, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004681268202871407, "get_robot_state": 0.003679105383867427, "sim_render-ego0": 0.003731854253764157, "get_duckie_state": 1.402520617279383e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.955746796820156, "agent_compute-ego0": 0.012651618176157727, "complete-iteration": 0.2302582613335958, "set_robot_commands": 0.0021827741030550915, "deviation-center-line": 0.386226701423694, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.006060148258193347, "sim_compute_performance-ego0": 0.0019489962492854668}}
set_robot_commands_max: 0.002459444074606915
set_robot_commands_mean: 0.0023510721601316276
set_robot_commands_median: 0.002381035231432252
set_robot_commands_min: 0.0021827741030550915
sim_compute_performance-ego0_max: 0.0022538555948859347
sim_compute_performance-ego0_mean: 0.002158167211340429
sim_compute_performance-ego0_median: 0.0022149085005951563
sim_compute_performance-ego0_min: 0.0019489962492854668
sim_compute_sim_state_max: 0.011935592789534825
sim_compute_sim_state_mean: 0.0091615993216274
sim_compute_sim_state_median: 0.009325328119390712
sim_compute_sim_state_min: 0.006060148258193347
sim_render-ego0_max: 0.004126406033569133
sim_render-ego0_mean: 0.003993971907625985
sim_render-ego0_median: 0.004058813671585325
sim_render-ego0_min: 0.003731854253764157
simulation-passed: 1
step_physics_max: 0.269596189186039
step_physics_mean: 0.22957968930221417
step_physics_median: 0.23911120601736635
step_physics_min: 0.170500155988085
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
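The aggregate rows above are plain reductions over the four per-episode values in the details blob. As a sanity check, the deviation-center-line aggregates can be reproduced with Python's statistics module (values copied from the per-episode details):

```python
import statistics

# deviation-center-line per episode, copied from the per-episodes details
values = [
    4.053503393024394,   # LF-norm-loop-000
    1.0457540566888746,  # LF-norm-zigzag-000
    1.4387919625991463,  # LF-norm-techtrack-000
    0.386226701423694,   # LF-norm-small_loop-000
]

# With four samples, the median is the mean of the two middle values.
print(statistics.median(values))  # 1.2422730096440104 (the deviation-center-line_median above)
print(statistics.mean(values))   # ~1.731069028434027 (the deviation-center-line_mean above)
print(max(values), min(values))  # the _max and _min rows
```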
58615 | LFv-sim | success | yes | 0:42:30
58614 | LFv-sim | success | yes | 0:42:30
58613 | LFv-sim | success | yes | 0:42:02
58611 | LFv-sim | success | yes | 0:41:37
52550 | LFv-sim | error | no | 0:09:48
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140064356730864
- M:video_aido:cmdline(in:/;out:/) 140064356731776
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
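The repeated "The above exception was the direct cause of the following exception" sections come from Python's exception chaining (`raise ... from e`), the same pattern `imread` uses at procgraph_pil/imread_imp.py line 54. A minimal self-contained illustration (the names below are stand-ins, not the real PIL/procgraph code):

```python
# Stand-in for the chain seen in the log: a low-level decode error is wrapped
# in a ValueError via "raise ... from e", preserving the original as __cause__.
def imread(filename):
    try:
        # Plays the role of PIL.UnidentifiedImageError in the real traceback.
        raise OSError(f"cannot identify image file {filename!r}")
    except OSError as e:
        raise ValueError(f'Could not open filename "{filename}".') from e

try:
    imread("banner1.png")
except ValueError as err:
    print(err)                           # Could not open filename "banner1.png".
    print(type(err.__cause__).__name__)  # OSError: the "direct cause" in the traceback
```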
52547 | LFv-sim | timeout | no | --
52544 | LFv-sim | error | no | 0:10:55
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139815434586864
- M:video_aido:cmdline(in:/;out:/) 139815434633952
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
41836 | LFv-sim | success | no | 0:09:21
41835 | LFv-sim | success | no | 0:09:19
38421 | LFv-sim | success | no | 0:09:05
38420 | LFv-sim | success | no | 0:09:20
36481 | LFv-sim | success | no | 0:10:03
35891 | LFv-sim | success | no | 0:01:10
35551 | LFv-sim | error | no | 0:23:21
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-8ce556a6457c-1-job35551:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-8ce556a6457c-1-job35551/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-8ce556a6457c-1-job35551/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-8ce556a6457c-1-job35551/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-8ce556a6457c-1-job35551/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-8ce556a6457c-1-job35551/logs/challenges-runner/stderr.log
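The message above presumably comes from a simple existence check on the expected results file in the job's working directory. A hypothetical sketch of that check (the function name and message shape are illustrative, not the runner's actual code):

```python
from pathlib import Path

def find_challenge_results(wd: str) -> Path:
    # Hypothetical sketch: the evaluator is expected to leave
    # challenge-results/challenge_results.yaml inside the working dir.
    results = Path(wd) / "challenge-results" / "challenge_results.yaml"
    if not results.exists():
        raise FileNotFoundError(
            f"The result file is not found in working dir {wd}: "
            f"File '{results}' does not exist."
        )
    return results
```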
35546 | LFv-sim | aborted | no | 0:23:55
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg02-c6a6bfc3ddd2-1-job35546:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg02-c6a6bfc3ddd2-1-job35546/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg02-c6a6bfc3ddd2-1-job35546/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg02-c6a6bfc3ddd2-1-job35546/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg02-c6a6bfc3ddd2-1-job35546/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg02-c6a6bfc3ddd2-1-job35546/logs/challenges-runner/stderr.log
35519 | LFv-sim | timeout | no | 1:05:18
Job 35519 timed out: it ran for 3918 seconds, exceeding the 3600.0-second limit.
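The 3918-second figure is consistent with the displayed duration of 1:05:18, and the timeout decision is a plain elapsed-vs-limit comparison:

```python
# Parse the displayed duration and compare against the 3600.0 s limit from the message.
h, m, s = map(int, "1:05:18".split(":"))
elapsed = h * 3600 + m * 60 + s
print(elapsed)           # 3918
print(elapsed > 3600.0)  # True -> the job is marked "timeout"
```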
35477 | LFv-sim | aborted | no | 0:22:09
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-b2dee9d94ee0-1-job35477:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-b2dee9d94ee0-1-job35477/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-b2dee9d94ee0-1-job35477/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-b2dee9d94ee0-1-job35477/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-b2dee9d94ee0-1-job35477/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-b2dee9d94ee0-1-job35477/logs/challenges-runner/stderr.log
35164 | LFv-sim | success | no | 0:27:47
34439 | LFv-sim | success | no | 0:25:58
34269 | LFv-sim | aborted | no | 0:24:39
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg10-e57b0b7c7a6d-1-job34269:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg10-e57b0b7c7a6d-1-job34269/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
34009 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg04-bf35e9d68df4-1-job34009'
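Errno 28 means the evaluator host's filesystem was full before the job's working directory could even be created. One defensive option (a hypothetical guard, not something the runner actually does) is to check free space before creating the directory:

```python
import errno
import os
import shutil
import tempfile

def makedirs_with_space_check(path: str, min_free_bytes: int = 1 << 30) -> None:
    # Hypothetical guard: fail early with a clear ENOSPC if the filesystem
    # holding `path` has less than `min_free_bytes` available.
    parent = os.path.dirname(path) or "."
    free = shutil.disk_usage(parent).free
    if free < min_free_bytes:
        raise OSError(errno.ENOSPC, "No space left on device", path)
    os.makedirs(path, exist_ok=True)

# Example: with the threshold at 0 bytes the directory is simply created.
target = os.path.join(tempfile.mkdtemp(), "job-workdir")
makedirs_with_space_check(target, min_free_bytes=0)
print(os.path.isdir(target))  # True
```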
34003 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg04-bf35e9d68df4-1-job34003'
33973 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg03-c2bc3037870e-1-job33973'
33968 | LFv-sim | host-error | no | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg05-5ca0d35e6d82-1-job33968'
33959 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg07-c4e193407567-1-job33959'
33953 | LFv-sim | host-error | no | 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg11-951de1eeccca-1-job33953'
33952 | LFv-sim | host-error | no | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6816/LFv-sim-reg01-53440c9394b5-1-job33952'
33384 | LFv-sim | success | no | 0:24:22