
Submission 6807

Submission: 6807
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58619
Next:
User label: baseline-duckietown
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58619

Detailed statistics were available for each episode:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
58619 | LFv-sim | success | yes | | | 0:08:01 |
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 0.7353243854972329
survival_time_median: 12.075000000000037
deviation-center-line_median: 0.477971685366957
in-drivable-lane_median: 5.675000000000036


Other stats:

agent_compute-ego0_max: 0.012273419804933692
agent_compute-ego0_mean: 0.012241260237478731
agent_compute-ego0_median: 0.012263971666679313
agent_compute-ego0_min: 0.01216367781162262
complete-iteration_max: 0.18912007428016983
complete-iteration_mean: 0.16203962459640583
complete-iteration_median: 0.1569183769876798
complete-iteration_min: 0.14520167013009389
deviation-center-line_max: 0.7668233999124398
deviation-center-line_mean: 0.4679707353916699
deviation-center-line_min: 0.14911617092032564
deviation-heading_max: 4.031974919266127
deviation-heading_mean: 2.34831832538021
deviation-heading_median: 2.11393463726694
deviation-heading_min: 1.1334291077208307
driven_any_max: 3.1414356365565625
driven_any_mean: 1.677601000949521
driven_any_median: 1.4618430614608806
driven_any_min: 0.645282244319761
driven_lanedir_consec_max: 1.0745206495081625
driven_lanedir_consec_mean: 0.6858464419952344
driven_lanedir_consec_min: 0.1982163474783092
driven_lanedir_max: 1.0745206495081625
driven_lanedir_mean: 0.6858464419952344
driven_lanedir_median: 0.7353243854972329
driven_lanedir_min: 0.1982163474783092
get_duckie_state_max: 1.2942722865513391e-06
get_duckie_state_mean: 1.203403309973656e-06
get_duckie_state_median: 1.1821136330113266e-06
get_duckie_state_min: 1.1551136873206314e-06
get_robot_state_max: 0.0037144875039859695
get_robot_state_mean: 0.0036033924840639946
get_robot_state_median: 0.0035708634841321693
get_robot_state_min: 0.0035573554640056706
get_state_dump_max: 0.004541575178808096
get_state_dump_mean: 0.00448501687491254
get_state_dump_median: 0.004484026013600706
get_state_dump_min: 0.00443044029364065
get_ui_image_max: 0.03622312706057765
get_ui_image_mean: 0.03088719970068675
get_ui_image_median: 0.030733137046202168
get_ui_image_min: 0.025859397649765015
in-drivable-lane_max: 14.95000000000018
in-drivable-lane_mean: 7.52500000000006
in-drivable-lane_min: 3.7999999999999865
per-episodes
details{"LF-norm-loop-000-ego0": {"driven_any": 1.4751204731217142, "get_ui_image": 0.028850719880084603, "step_physics": 0.08550942284720285, "survival_time": 12.200000000000038, "driven_lanedir": 0.7635629765157521, "get_state_dump": 0.004541575178808096, "get_robot_state": 0.0037144875039859695, "sim_render-ego0": 0.0038665907723563057, "get_duckie_state": 1.1551136873206314e-06, "in-drivable-lane": 5.600000000000024, "deviation-heading": 2.40765972097549, "agent_compute-ego0": 0.012263830340638451, "complete-iteration": 0.15215857077618034, "set_robot_commands": 0.002367322298945213, "deviation-center-line": 0.7188863042516672, "driven_lanedir_consec": 0.7635629765157521, "sim_compute_sim_state": 0.008971676534535934, "sim_compute_performance-ego0": 0.0019960841354058714}, "LF-norm-zigzag-000-ego0": {"driven_any": 0.645282244319761, "get_ui_image": 0.03622312706057765, "step_physics": 0.11424429276410272, "survival_time": 5.899999999999987, "driven_lanedir": 0.1982163474783092, "get_state_dump": 0.00443044029364065, "get_robot_state": 0.0035573554640056706, "sim_render-ego0": 0.0036893171422621783, "get_duckie_state": 1.2942722865513391e-06, "in-drivable-lane": 3.7999999999999865, "deviation-heading": 1.1334291077208307, "agent_compute-ego0": 0.012273419804933692, "complete-iteration": 0.18912007428016983, "set_robot_commands": 0.0022609153715502316, "deviation-center-line": 0.14911617092032564, "driven_lanedir_consec": 0.1982163474783092, "sim_compute_sim_state": 0.010500889866291977, "sim_compute_performance-ego0": 0.0018628525132892513}, "LF-norm-techtrack-000-ego0": {"driven_any": 3.1414356365565625, "get_ui_image": 0.03261555421231973, "step_physics": 0.0897966818376021, "survival_time": 24.700000000000216, "driven_lanedir": 1.0745206495081625, "get_state_dump": 0.004448596395627417, "get_robot_state": 0.0035643336748836015, "sim_render-ego0": 0.003783522711859809, "get_duckie_state": 1.1631936737985322e-06, "in-drivable-lane": 14.95000000000018, 
"deviation-heading": 4.031974919266127, "agent_compute-ego0": 0.01226411299272017, "complete-iteration": 0.1616781831991793, "set_robot_commands": 0.002266951281614978, "deviation-center-line": 0.7668233999124398, "driven_lanedir_consec": 1.0745206495081625, "sim_compute_sim_state": 0.01094321674770779, "sim_compute_performance-ego0": 0.0019214909486096317}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.448565649800047, "get_ui_image": 0.025859397649765015, "step_physics": 0.08576569159825644, "survival_time": 11.950000000000037, "driven_lanedir": 0.7070857944787137, "get_state_dump": 0.004519455631573995, "get_robot_state": 0.003577393293380737, "sim_render-ego0": 0.0036810408035914104, "get_duckie_state": 1.201033592224121e-06, "in-drivable-lane": 5.750000000000049, "deviation-heading": 1.82020955355839, "agent_compute-ego0": 0.01216367781162262, "complete-iteration": 0.14520167013009389, "set_robot_commands": 0.0022684733072916665, "deviation-center-line": 0.23705706648224684, "driven_lanedir_consec": 0.7070857944787137, "sim_compute_sim_state": 0.005395150184631348, "sim_compute_performance-ego0": 0.0018937577803929647}}
set_robot_commands_max: 0.002367322298945213
set_robot_commands_mean: 0.0022909155648505224
set_robot_commands_median: 0.0022677122944533223
set_robot_commands_min: 0.0022609153715502316
sim_compute_performance-ego0_max: 0.0019960841354058714
sim_compute_performance-ego0_mean: 0.0019185463444244297
sim_compute_performance-ego0_median: 0.0019076243645012984
sim_compute_performance-ego0_min: 0.0018628525132892513
sim_compute_sim_state_max: 0.01094321674770779
sim_compute_sim_state_mean: 0.008952733333291761
sim_compute_sim_state_median: 0.009736283200413956
sim_compute_sim_state_min: 0.005395150184631348
sim_render-ego0_max: 0.0038665907723563057
sim_render-ego0_mean: 0.003755117857517426
sim_render-ego0_median: 0.0037364199270609934
sim_render-ego0_min: 0.0036810408035914104
simulation-passed: 1
step_physics_max: 0.11424429276410272
step_physics_mean: 0.09382902226179102
step_physics_median: 0.08778118671792927
step_physics_min: 0.08550942284720285
survival_time_max: 24.700000000000216
survival_time_mean: 13.68750000000007
survival_time_min: 5.899999999999987
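The aggregate rows above are computed across the four episodes listed in the per-episodes details. A minimal sketch (episode values copied from that JSON) reproducing survival_time_median and survival_time_mean with the standard library:

```python
import statistics

# Per-episode survival times, copied from the per-episodes details above.
survival_times = {
    "LF-norm-loop-000-ego0": 12.200000000000038,
    "LF-norm-zigzag-000-ego0": 5.899999999999987,
    "LF-norm-techtrack-000-ego0": 24.700000000000216,
    "LF-norm-small_loop-000-ego0": 11.950000000000037,
}

values = list(survival_times.values())
median = statistics.median(values)  # matches survival_time_median above
mean = statistics.mean(values)      # matches survival_time_mean above
print(median, mean)
```

With an even number of episodes, `statistics.median` averages the two middle values, which is why the reported median (12.075…) is not any single episode's survival time.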
No reset possible
58618 | LFv-sim | success | yes | | | 0:08:48 |
52585 | LFv-sim | error | no | | | 0:03:41 |
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140255424508832
- M:video_aido:cmdline(in:/;out:/) 140255424439488
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
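The PIL.UnidentifiedImageError above means banner1.png exists but is not a readable image (for example a zero-byte file or an HTML error page saved under that name). A quick stdlib check of the PNG signature can catch that before the video pipeline runs; this helper is illustrative, not part of the evaluator:

```python
# First eight bytes of every valid PNG file (per the PNG specification).
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_png(path: str) -> bool:
    """Return True if the file exists and starts with the PNG signature."""
    try:
        with open(path, "rb") as f:
            return f.read(8) == PNG_SIGNATURE
    except OSError:
        return False
```

A file that fails this check would also fail `PIL.Image.open` with the same UnidentifiedImageError seen in the traceback.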
52573 | LFv-sim | error | no | | | 0:04:11 |
InvalidEvaluator: identical traceback to job 52585 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png', raised from the 'static_image' block while make_video2 rendered the episode video).
52570 | LFv-sim | error | no | | | 0:06:15 |
InvalidEvaluator: identical traceback to job 52585 above (PIL.UnidentifiedImageError: cannot identify image file 'banner1.png', raised from the 'static_image' block while make_video2 rendered the episode video).
52569 | LFv-sim | host-error | no | | | 0:04:20 |
The container "evaluator" exited with code 1.

Look at the logs for the container to know more about the error.
41843 | LFv-sim | success | no | | | 0:04:55 |
41842 | LFv-sim | success | no | | | 0:04:50 |
38431 | LFv-sim | success | no | | | 0:07:40 |
38430 | LFv-sim | success | no | | | 0:07:18 |
36488 | LFv-sim | success | no | | | 0:05:34 |
36486 | LFv-sim | success | no | | | 0:05:08 |
35900 | LFv-sim | success | no | | | 0:01:16 |
35899 | LFv-sim | success | no | | | 0:01:19 |
35524 | LFv-sim | error | no | | | 0:12:49 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-117cf9595558-1-job35524:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-117cf9595558-1-job35524/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-117cf9595558-1-job35524/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-117cf9595558-1-job35524/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-117cf9595558-1-job35524/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-117cf9595558-1-job35524/logs/challenges-runner/stderr.log
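The message above says the runner looked for challenge-results/challenge_results.yaml inside the job's working directory and gave up when it was absent. That check can be sketched as follows (the directory layout is taken from the message; the helper names are mine):

```python
import os

def results_file(workdir: str) -> str:
    """Path where the runner expects the evaluator to write its results."""
    return os.path.join(workdir, "challenge-results", "challenge_results.yaml")

def evaluation_finished(workdir: str) -> bool:
    # Absence of the results file usually means the evaluator did not
    # finish, and sometimes that there was an import error.
    return os.path.isfile(results_file(workdir))
```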
35482 | LFv-sim | aborted | no | | | 0:10:23 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-b2dee9d94ee0-1-job35482:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-b2dee9d94ee0-1-job35482/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-b2dee9d94ee0-1-job35482/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-b2dee9d94ee0-1-job35482/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-b2dee9d94ee0-1-job35482/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-b2dee9d94ee0-1-job35482/logs/challenges-runner/stderr.log
35169 | LFv-sim | success | no | | | 0:11:17 |
34278 | LFv-sim | error | no | | | 0:11:26 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg08-4aa5552eb623-1-job34278:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg08-4aa5552eb623-1-job34278/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
34275 | LFv-sim | error | no | | | 0:11:27 |
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg09-5e69f1a0aa29-1-job34275:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg09-5e69f1a0aa29-1-job34275/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
34050 | LFv-sim | host-error | no | | | 0:00:01 |
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg04-bf35e9d68df4-1-job34050'
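The ENOSPC failures above (this job and the seven below) all died inside os.makedirs before the evaluation even started. A pre-flight free-space check on the executions volume would fail faster with a clearer message; a minimal sketch, where the 2 GiB threshold is an assumption and not a value from the runner:

```python
import shutil

def check_free_space(path: str, min_free_bytes: int = 2 * 1024**3) -> None:
    """Raise early if the filesystem holding `path` is nearly full.

    `path` must already exist (e.g. the evaluator's executions root);
    the default 2 GiB threshold is illustrative.
    """
    free = shutil.disk_usage(path).free
    if free < min_free_bytes:
        raise OSError(
            f"only {free} bytes free under {path!r}; "
            f"need at least {min_free_bytes} before starting a job"
        )
```

Running this against the executions root before creating the per-job working directory would turn the opaque "No space left on device" host-error into an immediate, self-describing failure.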
34043 | LFv-sim | host-error | no | | | 0:00:01 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg04-bf35e9d68df4-1-job34043'
34015 | LFv-sim | host-error | no | | | 0:00:00 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg03-c2bc3037870e-1-job34015'
34012 | LFv-sim | host-error | no | | | 0:00:00 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg11-951de1eeccca-1-job34012'
34006 | LFv-sim | host-error | no | | | 0:00:01 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg11-951de1eeccca-1-job34006'
34000 | LFv-sim | host-error | no | | | 0:00:01 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg05-5ca0d35e6d82-1-job34000'
33986 | LFv-sim | host-error | no | | | 0:00:00 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg01-53440c9394b5-1-job33986'
33982 | LFv-sim | host-error | no | | | 0:00:00 |
Uncaught exception: identical traceback to job 34050 above, ending in:
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6807/LFv-sim-reg07-c4e193407567-1-job33982'
33367 | LFv-sim | success | no | | | 0:08:09 |
33366 | LFv-sim | success | no | | | 0:08:01 |