
Submission 6809

Submission: 6809
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58617
Next:
User label: baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58617

Episodes: LF-norm-loop-000, LF-norm-small_loop-000, LF-norm-techtrack-000, LF-norm-zigzag-000

Evaluation jobs for this submission

Columns: Job ID, step, status, up to date, date started, date completed, duration, message.
Job 58617: step LFv-sim, status success, up to date, duration 0:08:34
driven_lanedir_consec_median: 0.7353243854971382
survival_time_median: 12.050000000000036
deviation-center-line_median: 0.4762827256198623
in-drivable-lane_median: 5.675000000000036


other stats
agent_compute-ego0_max: 0.01398425813644163
agent_compute-ego0_mean: 0.012889606055826529
agent_compute-ego0_median: 0.012744287663475674
agent_compute-ego0_min: 0.012085590759913127
complete-iteration_max: 0.19650157960523076
complete-iteration_mean: 0.17343227380640816
complete-iteration_median: 0.17524710935474008
complete-iteration_min: 0.14673329691092174
deviation-center-line_max: 0.7680223932146105
deviation-center-line_mean: 0.4674260038436652
deviation-center-line_min: 0.14911617092032564
deviation-heading_max: 4.034660462811229
deviation-heading_mean: 2.3442682960365904
deviation-heading_median: 2.1044918068071503
deviation-heading_min: 1.1334291077208307
driven_any_max: 3.141435636556551
driven_any_mean: 1.677601000949518
driven_any_median: 1.4618430614608804
driven_any_min: 0.645282244319761
driven_lanedir_consec_max: 1.0745206495074684
driven_lanedir_consec_mean: 0.6858464419950135
driven_lanedir_consec_min: 0.1982163474783092
driven_lanedir_max: 1.0745206495074684
driven_lanedir_mean: 0.6858464419950135
driven_lanedir_median: 0.7353243854971382
driven_lanedir_min: 0.1982163474783092
get_duckie_state_max: 2.264976501464844e-06
get_duckie_state_mean: 2.1287226371027497e-06
get_duckie_state_median: 2.0928032581125633e-06
get_duckie_state_min: 2.0643075307210286e-06
get_robot_state_max: 0.003956848575222877
get_robot_state_mean: 0.00384999142852089
get_robot_state_median: 0.003850314953485161
get_robot_state_min: 0.0037424872318903607
get_state_dump_max: 0.004925901772545986
get_state_dump_mean: 0.004841084604776859
get_state_dump_median: 0.0048427950438363464
get_state_dump_min: 0.004752846558888754
get_ui_image_max: 0.036372928058399874
get_ui_image_mean: 0.032138169807696466
get_ui_image_median: 0.03271630693213263
get_ui_image_min: 0.026747137308120728
in-drivable-lane_max: 14.95000000000018
in-drivable-lane_mean: 7.52500000000006
in-drivable-lane_min: 3.7999999999999865
per-episodes:
{"LF-norm-loop-000-ego0": {"driven_any": 1.4751204731217136, "get_ui_image": 0.030804751349277185, "step_physics": 0.10203508861729356, "survival_time": 12.150000000000038, "driven_lanedir": 0.7635629765155627, "get_state_dump": 0.004925901772545986, "get_robot_state": 0.003888992012524214, "sim_render-ego0": 0.003969024439327052, "get_duckie_state": 2.264976501464844e-06, "in-drivable-lane": 5.600000000000024, "deviation-heading": 2.38877406005591, "agent_compute-ego0": 0.012760783805221807, "complete-iteration": 0.17278276506017465, "set_robot_commands": 0.002438308762722328, "deviation-center-line": 0.7155083847574778, "driven_lanedir_consec": 0.7635629765155627, "sim_compute_sim_state": 0.009744877697991544, "sim_compute_performance-ego0": 0.0021198900019536253},
"LF-norm-zigzag-000-ego0": {"driven_any": 0.645282244319761, "get_ui_image": 0.036372928058399874, "step_physics": 0.11978851246232744, "survival_time": 5.899999999999987, "driven_lanedir": 0.1982163474783092, "get_state_dump": 0.004771308738644384, "get_robot_state": 0.003811637894446109, "sim_render-ego0": 0.003899868796853458, "get_duckie_state": 2.0936757576565784e-06, "in-drivable-lane": 3.7999999999999865, "deviation-heading": 1.1334291077208307, "agent_compute-ego0": 0.012727791521729542, "complete-iteration": 0.19650157960523076, "set_robot_commands": 0.0024220202149463303, "deviation-center-line": 0.14911617092032564, "driven_lanedir_consec": 0.1982163474783092, "sim_compute_sim_state": 0.01058656828744071, "sim_compute_performance-ego0": 0.002028701686057724},
"LF-norm-techtrack-000-ego0": {"driven_any": 3.141435636556551, "get_ui_image": 0.03462786251498807, "step_physics": 0.10000250704826848, "survival_time": 24.750000000000217, "driven_lanedir": 1.0745206495074684, "get_state_dump": 0.00491428134902831, "get_robot_state": 0.003956848575222877, "sim_render-ego0": 0.004027239737972137, "get_duckie_state": 2.091930758568548e-06, "in-drivable-lane": 14.95000000000018, "deviation-heading": 4.034660462811229, "agent_compute-ego0": 0.01398425813644163, "complete-iteration": 0.17771145364930552, "set_robot_commands": 0.002515344850478634, "deviation-center-line": 0.7680223932146105, "driven_lanedir_consec": 1.0745206495074684, "sim_compute_sim_state": 0.011480806335326164, "sim_compute_performance-ego0": 0.0021063627735260996},
"LF-norm-small_loop-000-ego0": {"driven_any": 1.448565649800047, "get_ui_image": 0.026747137308120728, "step_physics": 0.08565658827622731, "survival_time": 11.950000000000037, "driven_lanedir": 0.7070857944787137, "get_state_dump": 0.004752846558888754, "get_robot_state": 0.0037424872318903607, "sim_render-ego0": 0.003833461801211039, "get_duckie_state": 2.0643075307210286e-06, "in-drivable-lane": 5.750000000000049, "deviation-heading": 1.82020955355839, "agent_compute-ego0": 0.012085590759913127, "complete-iteration": 0.14673329691092174, "set_robot_commands": 0.0024623324473698935, "deviation-center-line": 0.23705706648224684, "driven_lanedir_consec": 0.7070857944787137, "sim_compute_sim_state": 0.005387364824612936, "sim_compute_performance-ego0": 0.0019771993160247804}}
set_robot_commands_max: 0.002515344850478634
set_robot_commands_mean: 0.0024595015688792963
set_robot_commands_median: 0.002450320605046111
set_robot_commands_min: 0.0024220202149463303
sim_compute_performance-ego0_max: 0.0021198900019536253
sim_compute_performance-ego0_mean: 0.0020580384443905572
sim_compute_performance-ego0_median: 0.0020675322297919116
sim_compute_performance-ego0_min: 0.0019771993160247804
sim_compute_sim_state_max: 0.011480806335326164
sim_compute_sim_state_mean: 0.00929990428634284
sim_compute_sim_state_median: 0.010165722992716128
sim_compute_sim_state_min: 0.005387364824612936
sim_render-ego0_max: 0.004027239737972137
sim_render-ego0_mean: 0.0039323986938409215
sim_render-ego0_median: 0.0039344466180902555
sim_render-ego0_min: 0.003833461801211039
simulation-passed: 1
step_physics_max: 0.11978851246232744
step_physics_mean: 0.1018706741010292
step_physics_median: 0.10101879783278102
step_physics_min: 0.08565658827622731
survival_time_max: 24.750000000000217
survival_time_mean: 13.687500000000068
survival_time_min: 5.899999999999987
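
The aggregate statistics above are simply the per-episode values from the per-episodes block reduced to min / mean / median / max over the four episodes. A minimal sketch of that reduction, assuming the per-episodes dict has been loaded into Python (the file name and function are illustrative, not the evaluator's actual code):

import json
import statistics

# Per-episode metrics as listed in the "per-episodes" block above;
# the path is hypothetical.
with open("per_episodes.json") as f:
    per_episodes = json.load(f)

def aggregate(metric):
    # Reduce one metric across all episodes to min / mean / median / max.
    values = [ep[metric] for ep in per_episodes.values()]
    return {
        f"{metric}_min": min(values),
        f"{metric}_mean": statistics.mean(values),
        f"{metric}_median": statistics.median(values),
        f"{metric}_max": max(values),
    }

# Example: reproduces survival_time_median = 12.05 from the four episodes.
print(aggregate("survival_time"))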
Job 52563: step LFv-sim, status error, not up to date, duration 0:04:40
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140485708848048
- M:video_aido:cmdline(in:/;out:/) 140485708847328
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
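
This job and the three error jobs that follow (52560, 52558, 52551) all fail in the same place: when the experiment manager renders the episode video, PIL cannot decode 'banner1.png' (most likely an empty or corrupt file), Image.open raises UnidentifiedImageError, and the failure is surfaced as InvalidEvaluator. A minimal, hypothetical guard around such an image load (not the actual procgraph or evaluator code) would be:

from PIL import Image, UnidentifiedImageError

def load_image_or_fail_clearly(path):
    # Open an image and force decoding immediately, so an unreadable file
    # fails with a clear message instead of surfacing later in the video pipeline.
    try:
        with Image.open(path) as im:
            im.load()
            return im.copy()
    except (FileNotFoundError, UnidentifiedImageError) as e:
        raise ValueError(f"Could not open image file {path!r}: {e}") from e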
Job 52560: step LFv-sim, status error, not up to date, duration 0:03:30
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139738209893968
- M:video_aido:cmdline(in:/;out:/) 139738209890704
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52558: step LFv-sim, status error, not up to date, duration 0:04:27
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139627994044496
- M:video_aido:cmdline(in:/;out:/) 139627994066080
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52551: step LFv-sim, status error, not up to date, duration 0:04:27
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139910601846848
- M:video_aido:cmdline(in:/;out:/) 139910601849488
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 41841: step LFv-sim, status success, not up to date, duration 0:05:00
Job 38427: step LFv-sim, status success, not up to date, duration 0:05:01
Job 38426: step LFv-sim, status success, not up to date, duration 0:07:57
Job 36484: step LFv-sim, status success, not up to date, duration 0:05:30
Job 36483: step LFv-sim, status success, not up to date, duration 0:05:28
Job 35901: step LFv-sim, status success, not up to date, duration 0:01:00
Job 35897: step LFv-sim, status success, not up to date, duration 0:01:02
Job 35555: step LFv-sim, status error, not up to date, duration 0:11:30
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-92589757aa5b-1-job35555:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-92589757aa5b-1-job35555/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 -/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-92589757aa5b-1-job35555/docker-compose.original.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-92589757aa5b-1-job35555/docker-compose.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-92589757aa5b-1-job35555/logs/challenges-runner/stdout.log
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-92589757aa5b-1-job35555/logs/challenges-runner/stderr.log
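
The message above means the evaluator exited before writing challenge-results/challenge_results.yaml inside the job's working directory, so the runner has no scores to report. A hypothetical sketch of the kind of check the runner performs (not the actual duckietown-challenges code):

import os

def locate_challenge_results(wd):
    # The evaluator is expected to write this file when it finishes cleanly.
    results = os.path.join(wd, "challenge-results", "challenge_results.yaml")
    if not os.path.exists(results):
        raise FileNotFoundError(
            f"The result file is not found in working dir {wd!r}: "
            f"{results!r} does not exist. The evaluator probably did not "
            "finish; check logs/challenges-runner/stderr.log for the cause."
        )
    return results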
Job 35526: step LFv-sim, status aborted, not up to date, duration 0:12:56
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0cae09ccdd4d-1-job35526:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0cae09ccdd4d-1-job35526/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 -/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0cae09ccdd4d-1-job35526/docker-compose.original.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0cae09ccdd4d-1-job35526/docker-compose.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0cae09ccdd4d-1-job35526/logs/challenges-runner/stdout.log
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0cae09ccdd4d-1-job35526/logs/challenges-runner/stderr.log
Job 35481: step LFv-sim, status aborted, not up to date, duration 0:10:24
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0c28c9d61367-1-job35481:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0c28c9d61367-1-job35481/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 -/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0c28c9d61367-1-job35481/docker-compose.original.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0c28c9d61367-1-job35481/docker-compose.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0c28c9d61367-1-job35481/logs/challenges-runner/stdout.log
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-0c28c9d61367-1-job35481/logs/challenges-runner/stderr.log
Job 35168: step LFv-sim, status success, not up to date, duration 0:12:00
Job 34276: step LFv-sim, status error, not up to date, duration 0:11:19
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg06-4ab2f0cbdf5a-1-job34276:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg06-4ab2f0cbdf5a-1-job34276/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.
Job 34274: step LFv-sim, status error, not up to date, duration 0:11:38
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-4882a976d5dc-1-job34274:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-4882a976d5dc-1-job34274/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.
Job 34041: step LFv-sim, status host-error, not up to date, duration 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg04-bf35e9d68df4-1-job34041'
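
This job and the host-error jobs that follow (34033, 34021, 34017, 34007, 34005, 33998, 33992, 33988, 33980) all fail identically: the runner cannot even create the job's working directory because the evaluation host's /tmp partition is full (ENOSPC). A small, hypothetical pre-flight check that a runner could perform before accepting a job (not part of duckietown-challenges-runner):

import os
import shutil

MIN_FREE_GB = 5  # illustrative threshold, not a Duckietown default

def make_working_dir(wd):
    # Refuse to start a job if the parent filesystem is nearly full,
    # so the failure is reported before any evaluation work begins.
    parent = os.path.dirname(wd) or "."
    os.makedirs(parent, exist_ok=True)
    free_gb = shutil.disk_usage(parent).free / 1e9
    if free_gb < MIN_FREE_GB:
        raise RuntimeError(
            f"Only {free_gb:.1f} GB free under {parent!r}; refusing to start "
            "the job (os.makedirs would fail with ENOSPC)."
        )
    os.makedirs(wd, exist_ok=True)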
Job 34033: step LFv-sim, status host-error, not up to date, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg04-bf35e9d68df4-1-job34033'
Job 34021: step LFv-sim, status host-error, not up to date, duration 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg03-c2bc3037870e-1-job34021'
Job 34017: step LFv-sim, status host-error, not up to date, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg07-c4e193407567-1-job34017'
Job 34007: step LFv-sim, status host-error, not up to date, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg07-c4e193407567-1-job34007'
Job 34005: step LFv-sim, status host-error, not up to date, duration 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg05-5ca0d35e6d82-1-job34005'
Job 33998: step LFv-sim, status host-error, not up to date, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg11-951de1eeccca-1-job33998'
Job 33992: step LFv-sim, status host-error, not up to date, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg11-951de1eeccca-1-job33992'
Job 33988: step LFv-sim, status host-error, not up to date, duration 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg11-951de1eeccca-1-job33988'
Job 33980: step LFv-sim, status host-error, not up to date, duration 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6809/LFv-sim-reg01-53440c9394b5-1-job33980'
Job 33371: step LFv-sim, status success, not up to date, duration 0:08:01
Job 33370: step LFv-sim, status success, not up to date, duration 0:08:05