
Submission 6783

Submission: 6783
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jiaxu Xing
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58648
Next:
User label: template-random
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58648

Detailed per-episode statistics are available for these episodes:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Each job below is listed as: job ID, step, status, up to date, duration, followed by its message (if any).

Job 58648: step LFv-sim, status success, up to date: yes, duration 0:04:36
driven_lanedir_consec_median: 0.5040495128145347
survival_time_median: 4.6499999999999915
deviation-center-line_median: 0.10620228737371736
in-drivable-lane_median: 2.674999999999992


other stats
agent_compute-ego0_max: 0.011768211921056112
agent_compute-ego0_mean: 0.011274862837663508
agent_compute-ego0_median: 0.011338799798232747
agent_compute-ego0_min: 0.010653639833132424
complete-iteration_max: 0.23448538002760513
complete-iteration_mean: 0.2061892490857234
complete-iteration_median: 0.2079591586368497
complete-iteration_min: 0.1743532990415891
deviation-center-line_max: 0.16455763165774434
deviation-center-line_mean: 0.1092777488823554
deviation-center-line_min: 0.06014878912424249
deviation-heading_max: 1.819139548495243
deviation-heading_mean: 0.860079214882594
deviation-heading_median: 0.6517793617099584
deviation-heading_min: 0.31761858761521594
driven_any_max: 3.078535752374254
driven_any_mean: 1.8526223552312917
driven_any_median: 1.6219947547238958
driven_any_min: 1.0879641591031215
driven_lanedir_consec_max: 0.6643282900394009
driven_lanedir_consec_mean: 0.4884432809313369
driven_lanedir_consec_min: 0.28134580805687714
driven_lanedir_max: 0.6643282900394009
driven_lanedir_mean: 0.4924956463812869
driven_lanedir_median: 0.5121542437144349
driven_lanedir_min: 0.28134580805687714
get_duckie_state_max: 1.5099843343098958e-06
get_duckie_state_mean: 1.426538104989992e-06
get_duckie_state_median: 1.407663574176761e-06
get_duckie_state_min: 1.3808409372965495e-06
get_robot_state_max: 0.004158525363258694
get_robot_state_mean: 0.004028459550720694
get_robot_state_median: 0.004006152079242862
get_robot_state_min: 0.003943008681138356
get_state_dump_max: 0.005267646001732867
get_state_dump_mean: 0.005208524661946791
get_state_dump_median: 0.005230955433744921
get_state_dump_min: 0.005104541778564453
get_ui_image_max: 0.03778324438178021
get_ui_image_mean: 0.03303060574151905
get_ui_image_median: 0.03306240170313839
get_ui_image_min: 0.028214375178019207
in-drivable-lane_max: 6.549999999999982
in-drivable-lane_mean: 3.3499999999999894
in-drivable-lane_min: 1.499999999999995
per-episode details:

LF-norm-loop-000-ego0:
  driven_any: 3.078535752374254
  get_ui_image: 0.030631509008286876
  step_physics: 0.1212272432786
  survival_time: 7.84999999999998
  driven_lanedir: 0.28134580805687714
  get_state_dump: 0.005198502842384049
  get_robot_state: 0.00396861909311029
  sim_render-ego0: 0.0039972202687323845
  get_duckie_state: 1.3822241674495648e-06
  in-drivable-lane: 6.549999999999982
  deviation-heading: 0.891006057417327
  agent_compute-ego0: 0.010946350761606724
  complete-iteration: 0.1909579702570469
  set_robot_commands: 0.0022888726825955547
  deviation-center-line: 0.13094886067496936
  driven_lanedir_consec: 0.28134580805687714
  sim_compute_sim_state: 0.010451772544957414
  sim_compute_performance-ego0: 0.002158039732824398

LF-norm-zigzag-000-ego0:
  driven_any: 1.5806581102909054
  get_ui_image: 0.03778324438178021
  step_physics: 0.1565898553184841
  survival_time: 4.549999999999992
  driven_lanedir: 0.3891269085945981
  get_state_dump: 0.005267646001732867
  get_robot_state: 0.004158525363258694
  sim_render-ego0: 0.004185775051946225
  get_duckie_state: 1.4331029809039573e-06
  in-drivable-lane: 2.449999999999993
  deviation-heading: 1.819139548495243
  agent_compute-ego0: 0.01173124883485877
  complete-iteration: 0.23448538002760513
  set_robot_commands: 0.002411412156146506
  deviation-center-line: 0.16455763165774434
  driven_lanedir_consec: 0.37291744679479777
  sim_compute_sim_state: 0.009998578092326288
  sim_compute_performance-ego0: 0.0022640850232995076

LF-norm-techtrack-000-ego0:
  driven_any: 1.0879641591031215
  get_ui_image: 0.03549329439798991
  step_physics: 0.15117631355921426
  survival_time: 3.5499999999999954
  driven_lanedir: 0.6643282900394009
  get_state_dump: 0.005263408025105794
  get_robot_state: 0.004043685065375434
  sim_render-ego0: 0.004144906997680664
  get_duckie_state: 1.5099843343098958e-06
  in-drivable-lane: 1.499999999999995
  deviation-heading: 0.41255266600258983
  agent_compute-ego0: 0.011768211921056112
  complete-iteration: 0.22496034701665243
  set_robot_commands: 0.002373264895545112
  deviation-center-line: 0.08145571407246535
  driven_lanedir_consec: 0.6643282900394009
  sim_compute_sim_state: 0.008390267690022787
  sim_compute_performance-ego0: 0.002210858795377943

LF-norm-small_loop-000-ego0:
  driven_any: 1.6633313991568863
  get_ui_image: 0.028214375178019207
  step_physics: 0.11268057425816852
  survival_time: 4.749999999999991
  driven_lanedir: 0.6351815788342716
  get_state_dump: 0.005104541778564453
  get_robot_state: 0.003943008681138356
  sim_render-ego0: 0.0039466073115666704
  get_duckie_state: 1.3808409372965495e-06
  in-drivable-lane: 2.89999999999999
  deviation-heading: 0.31761858761521594
  agent_compute-ego0: 0.010653639833132424
  complete-iteration: 0.1743532990415891
  set_robot_commands: 0.00236646831035614
  deviation-center-line: 0.06014878912424249
  driven_lanedir_consec: 0.6351815788342716
  sim_compute_sim_state: 0.0052808721860249834
  sim_compute_performance-ego0: 0.002073789636294047
set_robot_commands_max: 0.002411412156146506
set_robot_commands_mean: 0.0023600045111608283
set_robot_commands_median: 0.002369866602950626
set_robot_commands_min: 0.0022888726825955547
sim_compute_performance-ego0_max: 0.0022640850232995076
sim_compute_performance-ego0_mean: 0.0021766932969489736
sim_compute_performance-ego0_median: 0.0021844492641011704
sim_compute_performance-ego0_min: 0.002073789636294047
sim_compute_sim_state_max: 0.010451772544957414
sim_compute_sim_state_mean: 0.008530372628332868
sim_compute_sim_state_median: 0.009194422891174538
sim_compute_sim_state_min: 0.0052808721860249834
sim_render-ego0_max: 0.004185775051946225
sim_render-ego0_mean: 0.004068627407481486
sim_render-ego0_median: 0.004071063633206524
sim_render-ego0_min: 0.0039466073115666704
simulation-passed: 1
step_physics_max: 0.1565898553184841
step_physics_mean: 0.13541849660361674
step_physics_median: 0.13620177841890713
step_physics_min: 0.11268057425816852
survival_time_max: 7.84999999999998
survival_time_mean: 5.17499999999999
survival_time_min: 3.5499999999999954
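
The aggregate statistics above (the _min, _mean, _median and _max entries) are simply the per-metric statistics taken over the four episodes in the per-episode details. A minimal sketch of that relationship, using two metrics copied from the details above; the aggregate() helper is illustrative, not part of the evaluator:

import statistics

# Per-episode values copied from the details above (only two metrics shown).
per_episode = {
    "LF-norm-loop-000-ego0": {"survival_time": 7.84999999999998, "in-drivable-lane": 6.549999999999982},
    "LF-norm-zigzag-000-ego0": {"survival_time": 4.549999999999992, "in-drivable-lane": 2.449999999999993},
    "LF-norm-techtrack-000-ego0": {"survival_time": 3.5499999999999954, "in-drivable-lane": 1.499999999999995},
    "LF-norm-small_loop-000-ego0": {"survival_time": 4.749999999999991, "in-drivable-lane": 2.89999999999999},
}

def aggregate(metric: str) -> dict:
    """Min/mean/median/max of one metric over the four episodes."""
    values = [episode[metric] for episode in per_episode.values()]
    return {
        f"{metric}_min": min(values),
        f"{metric}_mean": statistics.mean(values),
        f"{metric}_median": statistics.median(values),
        f"{metric}_max": max(values),
    }

print(aggregate("survival_time"))     # median 4.65 and mean 5.175, matching the entries above
print(aggregate("in-drivable-lane"))  # median 2.675, matching in-drivable-lane_median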
Job 58640: step LFv-sim, status success, up to date: yes, duration 0:04:01
Job 58628: step LFv-sim, status success, up to date: yes, duration 0:04:07
Job 52660: step LFv-sim, status error, up to date: no, duration 0:01:54, message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139914362211440
- M:video_aido:cmdline(in:/;out:/) 139914362212112
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
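
Jobs 52660 (above) and 52657, 52653 and 52599 (below) all failed at the same point: while rendering the episode video, procgraph's static_image block could not open 'banner1.png'. A minimal local check, assuming only that Pillow is installed (this is not part of the official tooling), reproduces the same failure mode on a missing or corrupt banner file:

from PIL import Image

def check_banner(path: str = "banner1.png") -> bool:
    """Return True if Pillow can identify and fully parse the image file."""
    try:
        with Image.open(path) as im:  # raises UnidentifiedImageError for non-images
            im.verify()               # raises if the file is corrupt or truncated
        return True
    except Exception as e:            # Pillow raises different exception types per format
        print(f"{path}: not a usable image ({e!r})")
        return False

print(check_banner())

Run next to the evaluator's working files, this either prints why the banner is unreadable or confirms the image itself is fine.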
Job 52657: step LFv-sim, status error, up to date: no, duration 0:03:10, message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139792651505376
- M:video_aido:cmdline(in:/;out:/) 139792651503888
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52653: step LFv-sim, status error, up to date: no, duration 0:01:11, message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139979399449712
- M:video_aido:cmdline(in:/;out:/) 139979399449472
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52614: step LFv-sim, status host-error, up to date: no, duration 0:01:44, message:
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 52599: step LFv-sim, status error, up to date: no, duration 0:03:20, message:
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140269295272240
- M:video_aido:cmdline(in:/;out:/) 140264392549808
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 41858: step LFv-sim, status success, up to date: no, duration 0:03:32
Job 41856: step LFv-sim, status success, up to date: no, duration 0:03:31
Job 38452: step LFv-sim, status success, up to date: no, duration 0:03:34
Job 38450: step LFv-sim, status success, up to date: no, duration 0:04:22
Job 36499: step LFv-sim, status success, up to date: no, duration 0:03:59
Job 35915: step LFv-sim, status success, up to date: no, duration 0:00:56
Job 35914: step LFv-sim, status success, up to date: no, duration 0:00:58
Job 35582: step LFv-sim, status success, up to date: no, duration 0:07:51
Job 35566: step LFv-sim, status timeout, up to date: no, duration 0:07:44, message:
Timeout because evaluator contacted us
Job 35565: step LFv-sim, status timeout, up to date: no, duration 0:00:06, message:
Timeout because evaluator contacted us
Job 35564: step LFv-sim, status timeout, up to date: no, duration 0:00:03, message:
Timeout because evaluator contacted us
Job 35534: step LFv-sim, status aborted, up to date: no, duration 0:07:54, message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-0cae09ccdd4d-1-job35534:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-0cae09ccdd4d-1-job35534/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 -/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-0cae09ccdd4d-1-job35534/docker-compose.original.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-0cae09ccdd4d-1-job35534/docker-compose.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-0cae09ccdd4d-1-job35534/logs/challenges-runner/stdout.log
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-0cae09ccdd4d-1-job35534/logs/challenges-runner/stderr.log
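
Job 35534 (above) and jobs 35493 and 34298 (below) abort with the same symptom: the evaluator never wrote challenge-results/challenge_results.yaml into its working directory. A hedged sketch of the kind of check a runner-side script could perform before scoring; the paths follow the layout shown above, but the helper and the PyYAML dependency are assumptions, not the actual duckietown_challenges_runner code:

import os
import yaml  # PyYAML, assumed to be available

def load_challenge_results(workdir: str) -> dict:
    """Load the scores a finished evaluation writes, or explain why they are missing."""
    path = os.path.join(workdir, "challenge-results", "challenge_results.yaml")
    if not os.path.exists(path):
        # Same situation as the aborted jobs here: the evaluator never finished,
        # so the results file was never written.
        raise FileNotFoundError(f"{path} does not exist; check the evaluator logs in {workdir}/logs")
    with open(path) as f:
        return yaml.safe_load(f)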
Job 35493: step LFv-sim, status aborted, up to date: no, duration 0:07:08, message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-94a6fab21ac9-1-job35493:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-94a6fab21ac9-1-job35493/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

 -/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-94a6fab21ac9-1-job35493/docker-compose.original.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-94a6fab21ac9-1-job35493/docker-compose.yaml
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-94a6fab21ac9-1-job35493/logs/challenges-runner/stdout.log
-/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-94a6fab21ac9-1-job35493/logs/challenges-runner/stderr.log
Job 35177: step LFv-sim, status aborted, up to date: no, duration 0:08:31
Job 34448: step LFv-sim, status aborted, up to date: no, duration 0:08:51
Job 34447: step LFv-sim, status aborted, up to date: no, duration 0:09:01
Job 34298: step LFv-sim, status aborted, up to date: no, duration 0:07:50, message:
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-d4ceb20fdede-1-job34298:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-d4ceb20fdede-1-job34298/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
Job 34126: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg04-bf35e9d68df4-1-job34126'
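
Job 34126 (above) and jobs 34094 through 34056 (below) all aborted immediately with ENOSPC while creating their working directories under /tmp. A small pre-flight check along these lines (the threshold, helper name, and placeholder path are illustrative, not the runner's actual API) would surface a full disk before a job is accepted:

import os
import shutil

MIN_FREE_BYTES = 5 * 1024 ** 3  # illustrative threshold: 5 GiB

def enough_space(base: str, min_free: int = MIN_FREE_BYTES) -> bool:
    """True if the filesystem holding `base` still has at least `min_free` bytes available."""
    return shutil.disk_usage(base).free >= min_free

# Hypothetical usage before a runner creates its job working directory:
workdir = "/tmp/duckietown/example-job"  # placeholder path, not a real job directory
if not enough_space("/tmp"):
    raise OSError("less than 5 GiB free under /tmp; refusing to start the job")
os.makedirs(workdir, exist_ok=True)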
Job 34094: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-53440c9394b5-1-job34094'
Job 34088: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg01-53440c9394b5-1-job34088'
Job 34083: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg11-951de1eeccca-1-job34083'
Job 34075: step LFv-sim, status aborted, up to date: no, duration 0:00:01, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg03-c2bc3037870e-1-job34075'
Job 34068: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg07-c4e193407567-1-job34068'
Job 34062: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg05-5ca0d35e6d82-1-job34062'
Job 34056: step LFv-sim, status aborted, up to date: no, duration 0:00:00, message:
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6783/LFv-sim-reg05-5ca0d35e6d82-1-job34056'
Job 33324: step LFv-sim, status aborted, up to date: no, duration 0:05:51