
Submission 6825

Submission: 6825
Competing: yes
Challenge: aido5-LF-sim-validation
User: Liam Paull 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58588
Next:
User label: baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

58588

Episodes evaluated:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration
58588 | LFv-sim | success | yes | 0:35:15

Artefacts hidden. If you are the author, please log in using the top-right link or use the dashboard.
driven_lanedir_consec_median: 6.441165075028019
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.4339299550653126
in-drivable-lane_median: 9.624999999999725


Other stats

agent_compute-ego0_max: 0.012642591422443882
agent_compute-ego0_mean: 0.012567497858099894
agent_compute-ego0_median: 0.01257639632038431
agent_compute-ego0_min: 0.012474607369187075
complete-iteration_max: 0.2042974574083492
complete-iteration_mean: 0.18824245421515215
complete-iteration_median: 0.19211826783036512
complete-iteration_min: 0.16443582379152932
deviation-center-line_max: 3.8971751912760695
deviation-center-line_mean: 3.091791309887432
deviation-center-line_min: 1.6021301381430322
deviation-heading_max: 12.172244673138344
deviation-heading_mean: 8.526010708179639
deviation-heading_median: 7.991055814788366
deviation-heading_min: 5.949686530003481
driven_any_max: 7.921186404396646
driven_any_mean: 7.918853145681213
driven_any_median: 7.919890473543528
driven_any_min: 7.91444523124115
driven_lanedir_consec_max: 7.416658845546216
driven_lanedir_consec_mean: 5.615866307239143
driven_lanedir_consec_min: 2.1644762333543177
driven_lanedir_max: 7.416658845546216
driven_lanedir_mean: 5.675081847121664
driven_lanedir_median: 6.441165075028019
driven_lanedir_min: 2.4013383928843997
get_duckie_state_max: 1.3554721549587585e-06
get_duckie_state_mean: 1.3310049693848468e-06
get_duckie_state_median: 1.3433626351209603e-06
get_duckie_state_min: 1.2818224523387086e-06
get_robot_state_max: 0.003723298778740393
get_robot_state_mean: 0.0036674106150840742
get_robot_state_median: 0.003659859187994075
get_robot_state_min: 0.0036266253056077536
get_state_dump_max: 0.0047271811495613394
get_state_dump_mean: 0.004667775617054758
get_state_dump_median: 0.004676346576382576
get_state_dump_min: 0.004591228165892539
get_ui_image_max: 0.03496470776922399
get_ui_image_mean: 0.03009296595107308
get_ui_image_median: 0.029746147019976285
get_ui_image_min: 0.025914861995115765
in-drivable-lane_max: 39.64999999999855
in-drivable-lane_mean: 15.137499999999488
in-drivable-lane_min: 1.6499999999999533
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 7.921186404396646, "get_ui_image": 0.02778262341647819, "step_physics": 0.09842225236757708, "survival_time": 59.99999999999873, "driven_lanedir": 5.953216823600106, "get_state_dump": 0.004704233609468713, "get_robot_state": 0.003628786160090286, "sim_render-ego0": 0.003762711295478052, "get_duckie_state": 1.2818224523387086e-06, "in-drivable-lane": 13.399999999999691, "deviation-heading": 7.015338110583086, "agent_compute-ego0": 0.012474607369187075, "complete-iteration": 0.16443582379152932, "set_robot_commands": 0.002199564051568558, "deviation-center-line": 3.787756183656077, "driven_lanedir_consec": 5.953216823600106, "sim_compute_sim_state": 0.009412496115742476, "sim_compute_performance-ego0": 0.001957265860234371},
 "LF-norm-zigzag-000-ego0": {"driven_any": 7.92101511907974, "get_ui_image": 0.03496470776922399, "step_physics": 0.1278166582344176, "survival_time": 59.99999999999873, "driven_lanedir": 7.416658845546216, "get_state_dump": 0.004648459543296439, "get_robot_state": 0.003690932215897864, "sim_render-ego0": 0.0038041137438034832, "get_duckie_state": 1.3554721549587585e-06, "in-drivable-lane": 1.6499999999999533, "deviation-heading": 12.172244673138344, "agent_compute-ego0": 0.012642591422443882, "complete-iteration": 0.2042974574083492, "set_robot_commands": 0.002212009858727753, "deviation-center-line": 3.8971751912760695, "driven_lanedir_consec": 7.416658845546216, "sim_compute_sim_state": 0.012393064046283249, "sim_compute_performance-ego0": 0.002032572184871575},
 "LF-norm-techtrack-000-ego0": {"driven_any": 7.91444523124115, "get_ui_image": 0.03170967062347438, "step_physics": 0.11471567662133464, "survival_time": 59.99999999999873, "driven_lanedir": 6.929113326455932, "get_state_dump": 0.0047271811495613394, "get_robot_state": 0.003723298778740393, "sim_render-ego0": 0.003862695233410939, "get_duckie_state": 1.3364145499681256e-06, "in-drivable-lane": 5.849999999999756, "deviation-heading": 8.966773518993646, "agent_compute-ego0": 0.01254645454000176, "complete-iteration": 0.1887142555004949, "set_robot_commands": 0.002244634890337967, "deviation-center-line": 3.0801037264745483, "driven_lanedir_consec": 6.929113326455932, "sim_compute_sim_state": 0.013045624233503129, "sim_compute_performance-ego0": 0.0020423884395755},
 "LF-norm-small_loop-000-ego0": {"driven_any": 7.9187658280073165, "get_ui_image": 0.025914861995115765, "step_physics": 0.13479938455465731, "survival_time": 59.99999999999873, "driven_lanedir": 2.4013383928843997, "get_state_dump": 0.004591228165892539, "get_robot_state": 0.0036266253056077536, "sim_render-ego0": 0.0037610217196856017, "get_duckie_state": 1.3503107202737953e-06, "in-drivable-lane": 39.64999999999855, "deviation-heading": 5.949686530003481, "agent_compute-ego0": 0.012606338100766858, "complete-iteration": 0.19552228016023532, "set_robot_commands": 0.0021691405703681992, "deviation-center-line": 1.6021301381430322, "driven_lanedir_consec": 2.1644762333543177, "sim_compute_sim_state": 0.006034914202535282, "sim_compute_performance-ego0": 0.0019290445249940235}}
set_robot_commands_max: 0.002244634890337967
set_robot_commands_mean: 0.002206337342750619
set_robot_commands_median: 0.0022057869551481554
set_robot_commands_min: 0.0021691405703681992
sim_compute_performance-ego0_max: 0.0020423884395755
sim_compute_performance-ego0_mean: 0.0019903177524188674
sim_compute_performance-ego0_median: 0.001994919022552973
sim_compute_performance-ego0_min: 0.0019290445249940235
sim_compute_sim_state_max: 0.013045624233503129
sim_compute_sim_state_mean: 0.010221524649516034
sim_compute_sim_state_median: 0.010902780081012862
sim_compute_sim_state_min: 0.006034914202535282
sim_render-ego0_max: 0.003862695233410939
sim_render-ego0_mean: 0.003797635498094519
sim_render-ego0_median: 0.003783412519640768
sim_render-ego0_min: 0.0037610217196856017
simulation-passed: 1
step_physics_max: 0.13479938455465731
step_physics_mean: 0.11893849294449664
step_physics_median: 0.12126616742787612
step_physics_min: 0.09842225236757708
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
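The aggregate statistics above appear to be simple reductions (min/mean/median/max) over the four per-episode values in the per-episodes details. A minimal sketch of that reduction for one metric (episode values copied from this page; treating the reduce step this way is an assumption about how the scorer aggregates):

```python
from statistics import mean, median

# driven_lanedir_consec per episode, copied from the per-episodes details.
driven_lanedir_consec = {
    "LF-norm-loop-000-ego0": 5.953216823600106,
    "LF-norm-zigzag-000-ego0": 7.416658845546216,
    "LF-norm-techtrack-000-ego0": 6.929113326455932,
    "LF-norm-small_loop-000-ego0": 2.1644762333543177,
}

values = sorted(driven_lanedir_consec.values())
stats = {
    "min": values[0],
    "max": values[-1],
    "mean": mean(values),
    # With four episodes, the median is the average of the two middle values.
    "median": median(values),
}
print(stats)
```

Running this reproduces the aggregate values reported above (e.g. the median, 6.441165075028019, matches driven_lanedir_consec_median).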
58587 | LFv-sim | success | yes | 0:36:35
58586 | LFv-sim | success | yes | 0:37:49
52541 | LFv-sim | error | no | 0:07:50

InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140304377024672
- M:video_aido:cmdline(in:/;out:/) 140304376987264
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
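This job failed while composing the episode video because the static banner asset 'banner1.png' could not be decoded by PIL. A cheap stdlib-only guard (illustrative only; the real evaluator loads the file through procgraph/PIL, and this helper is not part of it) would be to check the standard 8-byte PNG signature before handing the file to the renderer:

```python
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"  # standard PNG file header


def looks_like_png(path: str) -> bool:
    """Return True if the file starts with the PNG signature.

    A truncated, zero-byte, or mislabeled 'banner1.png' would fail this
    check early, instead of raising UnidentifiedImageError deep inside
    the video pipeline as in the traceback above.
    """
    try:
        with open(path, "rb") as f:
            return f.read(8) == PNG_SIGNATURE
    except OSError:
        return False
```

This only validates the header, not the full image; PIL's `Image.open` plus `verify()` performs a more thorough integrity check when PIL is available.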
52535 | LFv-sim | error | no | 0:09:07

InvalidEvaluator, with the same traceback as job 52541 (PIL cannot identify image file 'banner1.png').
52531 | LFv-sim | error | no | 0:09:44

InvalidEvaluator, with the same traceback as job 52541 (PIL cannot identify image file 'banner1.png').
41826 | LFv-sim | success | no | 0:08:49
41825 | LFv-sim | success | no | 0:08:37
41824 | LFv-sim | success | no | 0:08:51
38405 | LFv-sim | success | no | 0:17:03
38404 | LFv-sim | error | no | 0:01:27

The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-2e50246280f2-1-job38404-a-wd/challenge-results/challenge_results.yaml' does not exist.
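The NoResultsFound failure above is just a missing-file condition: the evaluator died before writing challenge_results.yaml into the job working directory. The check can be sketched as follows (an illustrative helper, not the real duckietown_challenges.read_challenge_results API):

```python
from pathlib import Path


def locate_challenge_results(wd: str) -> Path:
    """Locate challenge-results/challenge_results.yaml under a job working dir.

    Mirrors the existence check whose failure raised NoResultsFound above:
    if the evaluator did not finish, the results file simply does not exist.
    """
    results = Path(wd) / "challenge-results" / "challenge_results.yaml"
    if not results.is_file():
        raise FileNotFoundError(f"File '{results}' does not exist.")
    return results
```

In the real runner the caught condition is reported as NoResultsFound together with a listing of whatever files the job did produce, which is usually enough to tell an evaluator crash from an import error.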
36467 | LFv-sim | success | no | 0:10:14
35882 | LFv-sim | success | no | 0:01:14
35467 | LFv-sim | error | no | 0:21:54

The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-1b92df2e7e91-1-job35467:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-1b92df2e7e91-1-job35467/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-1b92df2e7e91-1-job35467/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-1b92df2e7e91-1-job35467/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-1b92df2e7e91-1-job35467/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg02-1b92df2e7e91-1-job35467/logs/challenges-runner/stderr.log
35466 | LFv-sim | error | no | 0:21:16

The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-94a6fab21ac9-1-job35466:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-94a6fab21ac9-1-job35466/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-94a6fab21ac9-1-job35466/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-94a6fab21ac9-1-job35466/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-94a6fab21ac9-1-job35466/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-94a6fab21ac9-1-job35466/logs/challenges-runner/stderr.log
35152 | LFv-sim | success | no | 0:24:38
34428 | LFv-sim | success | no | 0:26:28
34427 | LFv-sim | success | no | 0:28:13
34257 | LFv-sim | aborted | no | 0:24:37

The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg12-dacebf82dd85-1-job34257:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg12-dacebf82dd85-1-job34257/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
33932 | LFv-sim | host-error | no | 0:00:01

Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg04-bf35e9d68df4-1-job33932'
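This host-error and the run of identical ones below are all ENOSPC: the evaluation host ran out of disk before the runner could even create the job working directory. A fail-fast guard can be sketched as follows (a hypothetical helper, not part of duckietown-challenges-runner):

```python
import os
import shutil


def makedirs_with_space_check(wd: str, min_free_bytes: int = 1 << 30) -> None:
    """Create the job working directory, failing early with a clear message
    when the filesystem is nearly full (the ENOSPC mode seen above).
    """
    # Walk up to the nearest existing ancestor so disk_usage has a real path.
    probe = os.path.dirname(wd) or "."
    while not os.path.exists(probe):
        probe = os.path.dirname(probe) or "."
    free = shutil.disk_usage(probe).free
    if free < min_free_bytes:
        raise OSError(
            28, f"only {free} bytes free under {probe}; refusing to create {wd}"
        )
    os.makedirs(wd)
```

Checking free space up front turns a cascade of dead jobs (seven of them below, each lasting 0:00:00) into one explicit refusal that an operator can act on.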
33924 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg04-bf35e9d68df4-1-job33924'
33909 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg11-951de1eeccca-1-job33909'
33901 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg11-951de1eeccca-1-job33901'
33894 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg07-c4e193407567-1-job33894'
33890 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg05-5ca0d35e6d82-1-job33890'
33883 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg01-53440c9394b5-1-job33883'
33877 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg05-5ca0d35e6d82-1-job33877'
33870 | LFv-sim | host-error | no | 0:00:00
Uncaught exception, same traceback as job 33932:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6825/LFv-sim-reg03-c2bc3037870e-1-job33870'
33399 | LFv-sim | success | no | 0:14:19
33398 | LFv-sim | success | no | 0:14:43