
Submission 9315

Submission: 9315
Competing: yes
Challenge: aido5-LF-sim-validation
User: Jerome Labonte 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58238
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

58238


LF-norm-loop-000

LF-norm-small_loop-000

LF-norm-techtrack-000

LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
58238 | LFv-sim | success | up to date: yes | duration: 0:36:42
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 5.141291064492529
survival_time_median: 59.99999999999873
deviation-center-line_median: 3.4646899476492825
in-drivable-lane_median: 4.6249999999999005


other stats
agent_compute-ego0_max: 0.04299204970875151
agent_compute-ego0_mean: 0.035449628131177205
agent_compute-ego0_median: 0.042466204926731387
agent_compute-ego0_min: 0.013874052962494532
complete-iteration_max: 0.23635767778687236
complete-iteration_mean: 0.20561771691589925
complete-iteration_median: 0.19895473030782757
complete-iteration_min: 0.18820372926106957
deviation-center-line_max: 3.9936821142594496
deviation-center-line_mean: 3.5125970022135875
deviation-center-line_min: 3.1273259992963354
deviation-heading_max: 19.283312540424134
deviation-heading_mean: 14.484392580221837
deviation-heading_median: 13.679018711576042
deviation-heading_min: 11.296220357311128
driven_any_max: 11.70773560957308
driven_any_mean: 10.482914376375016
driven_any_median: 10.421838640462472
driven_any_min: 9.380244615002049
driven_lanedir_consec_max: 7.336132288497249
driven_lanedir_consec_mean: 5.315312503835923
driven_lanedir_consec_min: 3.642535597861385
driven_lanedir_max: 11.151810458263258
driven_lanedir_mean: 9.331249925617591
driven_lanedir_median: 9.573707015436147
driven_lanedir_min: 7.025775213334811
get_duckie_state_max: 2.2035355770419183e-06
get_duckie_state_mean: 2.0067062504980382e-06
get_duckie_state_median: 1.9800057518392876e-06
get_duckie_state_min: 1.8632779212716615e-06
get_robot_state_max: 0.003864644469071387
get_robot_state_mean: 0.0036475016711455
get_robot_state_median: 0.003619597416734021
get_robot_state_min: 0.003486167382042573
get_state_dump_max: 0.004743354505146671
get_state_dump_mean: 0.004609667589821287
get_state_dump_median: 0.004629859717859019
get_state_dump_min: 0.004435596418420441
get_ui_image_max: 0.03432568840738339
get_ui_image_mean: 0.030174765608689865
get_ui_image_median: 0.030144552505582106
get_ui_image_min: 0.026084269016211872
in-drivable-lane_max: 13.1999999999995
in-drivable-lane_mean: 6.1999999999997994
in-drivable-lane_min: 2.3499999999998984
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 11.70773560957308, "get_ui_image": 0.02715444425857633, "step_physics": 0.09727749240884773, "survival_time": 59.99999999999873, "driven_lanedir": 11.151810458263258, "get_state_dump": 0.004547727197334232, "get_robot_state": 0.003536217615665941, "sim_render-ego0": 0.003626990774886793, "get_duckie_state": 1.916877435307816e-06, "in-drivable-lane": 2.3499999999998984, "deviation-heading": 11.296220357311128, "agent_compute-ego0": 0.04227301997010853, "complete-iteration": 0.19216135379972307, "set_robot_commands": 0.002143150563045505, "deviation-center-line": 3.1273259992963354, "driven_lanedir_consec": 7.336132288497249, "sim_compute_sim_state": 0.009604085593497524, "sim_compute_performance-ego0": 0.0019057521613610972},
 "LF-norm-zigzag-000-ego0": {"driven_any": 9.380244615002049, "get_ui_image": 0.03432568840738339, "step_physics": 0.13225884520938058, "survival_time": 59.99999999999873, "driven_lanedir": 7.025775213334811, "get_state_dump": 0.004435596418420441, "get_robot_state": 0.003486167382042573, "sim_render-ego0": 0.003595767866860421, "get_duckie_state": 1.8632779212716615e-06, "in-drivable-lane": 13.1999999999995, "deviation-heading": 19.283312540424134, "agent_compute-ego0": 0.04265938988335424, "complete-iteration": 0.23635767778687236, "set_robot_commands": 0.002115791385914265, "deviation-center-line": 3.159891835193267, "driven_lanedir_consec": 3.642535597861385, "sim_compute_sim_state": 0.011489468351391134, "sim_compute_performance-ego0": 0.001901129302533838},
 "LF-norm-techtrack-000-ego0": {"driven_any": 9.727862762775157, "get_ui_image": 0.03313466075258787, "step_physics": 0.12840398920267249, "survival_time": 59.99999999999873, "driven_lanedir": 8.603382234210972, "get_state_dump": 0.004743354505146671, "get_robot_state": 0.003864644469071387, "sim_render-ego0": 0.003915857415115903, "get_duckie_state": 2.2035355770419183e-06, "in-drivable-lane": 6.499999999999762, "deviation-heading": 15.439891929270482, "agent_compute-ego0": 0.013874052962494532, "complete-iteration": 0.20574810681593209, "set_robot_commands": 0.0023057381378224647, "deviation-center-line": 3.9936821142594496, "driven_lanedir_consec": 5.1091742398889215, "sim_compute_sim_state": 0.013235024865918314, "sim_compute_performance-ego0": 0.0021746307487392506},
 "LF-norm-small_loop-000-ego0": {"driven_any": 11.11581451814979, "get_ui_image": 0.026084269016211872, "step_physics": 0.09645477738805258, "survival_time": 59.99999999999873, "driven_lanedir": 10.544031796661324, "get_state_dump": 0.004711992238383806, "get_robot_state": 0.0037029772178021, "sim_render-ego0": 0.0038002589461606905, "get_duckie_state": 2.043134068370759e-06, "in-drivable-lane": 2.750000000000039, "deviation-heading": 11.9181454938816, "agent_compute-ego0": 0.04299204970875151, "complete-iteration": 0.18820372926106957, "set_robot_commands": 0.002277150936269641, "deviation-center-line": 3.7694880601052976, "driven_lanedir_consec": 5.173407889096136, "sim_compute_sim_state": 0.006094370952355276, "sim_compute_performance-ego0": 0.0019935785384102727}}
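The summary rows elsewhere on this page are just per-episode aggregates of this details blob. A minimal sketch of recomputing one of them, assuming the blob is parsed as ordinary JSON (the trimmed excerpt below copies the four deviation-center-line values from above):

```python
import json
import statistics

# Trimmed excerpt of the per-episodes details blob, values copied from above.
details_json = '''
{"LF-norm-loop-000-ego0":       {"deviation-center-line": 3.1273259992963354},
 "LF-norm-zigzag-000-ego0":     {"deviation-center-line": 3.159891835193267},
 "LF-norm-techtrack-000-ego0":  {"deviation-center-line": 3.9936821142594496},
 "LF-norm-small_loop-000-ego0": {"deviation-center-line": 3.7694880601052976}}
'''

details = json.loads(details_json)
values = [ep["deviation-center-line"] for ep in details.values()]

# With four episodes, statistics.median averages the two middle values,
# which reproduces the deviation-center-line_median row above.
print(statistics.median(values))
print(min(values), max(values))
```

The same recipe applies to every `*_min`/`*_max`/`*_mean`/`*_median` row: they are element-wise aggregates over the four episode dictionaries.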
set_robot_commands_max: 0.0023057381378224647
set_robot_commands_mean: 0.002210457755762969
set_robot_commands_median: 0.002210150749657573
set_robot_commands_min: 0.002115791385914265
sim_compute_performance-ego0_max: 0.0021746307487392506
sim_compute_performance-ego0_mean: 0.0019937726877611145
sim_compute_performance-ego0_median: 0.0019496653498856848
sim_compute_performance-ego0_min: 0.001901129302533838
sim_compute_sim_state_max: 0.013235024865918314
sim_compute_sim_state_mean: 0.010105737440790564
sim_compute_sim_state_median: 0.01054677697244433
sim_compute_sim_state_min: 0.006094370952355276
sim_render-ego0_max: 0.003915857415115903
sim_render-ego0_mean: 0.003734718750755951
sim_render-ego0_median: 0.0037136248605237417
sim_render-ego0_min: 0.003595767866860421
simulation-passed: 1
step_physics_max: 0.13225884520938058
step_physics_mean: 0.11359877605223836
step_physics_median: 0.1128407408057601
step_physics_min: 0.09645477738805258
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
58237 | LFv-sim | success | up to date: yes | duration: 0:31:43
58231 | LFv-sim | success | up to date: yes | duration: 0:25:26
58224 | LFv-sim | success | up to date: yes | duration: 0:33:16
58222 | LFv-sim | success | up to date: yes | duration: 0:35:38
58219 | LFv-sim | success | up to date: yes | duration: 0:34:14
52419 | LFv-sim | error | up to date: no | duration: 0:10:05
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140407002982576
- M:video_aido:cmdline(in:/;out:/) 140407002983536
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
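The stacked tracebacks above ("The above exception was the direct cause of the following exception") are Python's exception chaining: procgraph's `imread` catches the PIL decode failure and re-raises with `raise ... from e`. A minimal stdlib sketch of that pattern, with a hypothetical `imread_sketch` standing in for the real code:

```python
def imread_sketch(filename: str):
    # Stand-in for PIL failing to decode the file; in the real job this was
    # PIL.UnidentifiedImageError for 'banner1.png'.
    try:
        raise OSError(f"cannot identify image file {filename!r}")
    except OSError as e:
        # "raise ... from e" sets __cause__ on the new exception, which the
        # interpreter prints as "The above exception was the direct cause of
        # the following exception" -- exactly the layering seen in this log.
        raise ValueError(f'Could not open filename "{filename}".') from e
```

Note that the error is about `banner1.png` being undecodable, not missing: the file existed but was not a valid image, so `Image.open` could not identify its format.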
52413 | LFv-sim | error | up to date: no | duration: 0:08:17
InvalidEvaluator: same `banner1.png` traceback as job 52419 above.
52408 | LFv-sim | error | up to date: no | duration: 0:08:53
InvalidEvaluator: same `banner1.png` traceback as job 52419 above.
52323 | LFv-sim | host-error | up to date: no | duration: 0:09:20
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
41730 | LFv-sim | success | up to date: no | duration: 0:09:59
38182 | LFv-sim | error | up to date: no | duration: 0:00:38
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9315/LFv-sim-mont04-e828c68b6a88-1-job38182-a-wd/challenge-results/challenge_results.yaml' does not exist.
38178 | LFv-sim | error | up to date: no | duration: 0:00:40
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9315/LFv-sim-mont01-6ef51bb8a9d6-1-job38178-a-wd/challenge-results/challenge_results.yaml' does not exist.
36328 | LFv-sim | success | up to date: no | duration: 0:11:29
36325 | LFv-sim | success | up to date: no | duration: 0:10:42
36324 | LFv-sim | error | up to date: no | duration: 0:00:47
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-Sandy1-sandy-1-job36324-a-wd/challenge-results/challenge_results.yaml' does not exist.
35758 | LFv-sim | success | up to date: no | duration: 0:01:00
35363 | LFv-sim | error | up to date: no | duration: 0:23:10
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-reg01-94a6fab21ac9-1-job35363:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-reg01-94a6fab21ac9-1-job35363/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-reg01-94a6fab21ac9-1-job35363/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-reg01-94a6fab21ac9-1-job35363/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-reg01-94a6fab21ac9-1-job35363/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9315/LFv-sim-reg01-94a6fab21ac9-1-job35363/logs/challenges-runner/stderr.log
34997 | LFv-sim | success | up to date: no | duration: 0:23:28
34665 | LFv-sim | success | up to date: no | duration: 0:25:13