
Submission 9331

Submission: 9331
Competing: yes
Challenge: aido5-LF-sim-validation
User: Yishu Malhotra 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58171
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58171

Episodes (images with detailed per-episode statistics are available on the original page):
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job 58171 | step: LFv-sim | status: success | up to date: yes | duration: 0:07:00
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 0.483347440463484
survival_time_median: 6.274999999999986
deviation-center-line_median: 0.1613876658352726
in-drivable-lane_median: 3.874999999999987


Other stats:
agent_compute-ego0_max: 0.013182440189400103
agent_compute-ego0_mean: 0.01259390594056668
agent_compute-ego0_median: 0.012699179656622428
agent_compute-ego0_min: 0.011794824259621755
complete-iteration_max: 0.21458939350012576
complete-iteration_mean: 0.18683969132430672
complete-iteration_median: 0.18969278080876345
complete-iteration_min: 0.15338381017957414
deviation-center-line_max: 0.19590540594081715
deviation-center-line_mean: 0.1506933110229952
deviation-center-line_min: 0.08409250648061849
deviation-heading_max: 1.599448758093649
deviation-heading_mean: 0.981835311929708
deviation-heading_median: 0.9465614366685352
deviation-heading_min: 0.43476961628811217
driven_any_max: 3.0216631221771526
driven_any_mean: 1.7533193278529846
driven_any_median: 1.4633123445520355
driven_any_min: 1.064989500130715
driven_lanedir_consec_max: 0.6416825625224958
driven_lanedir_consec_mean: 0.4702482444539515
driven_lanedir_consec_min: 0.27261553436634234
driven_lanedir_max: 0.6416825625224958
driven_lanedir_mean: 0.4702482444539515
driven_lanedir_median: 0.483347440463484
driven_lanedir_min: 0.27261553436634234
get_duckie_state_max: 1.343358464601661e-06
get_duckie_state_mean: 1.239415592684798e-06
get_duckie_state_median: 1.2279846967556776e-06
get_duckie_state_min: 1.1583345126261753e-06
get_robot_state_max: 0.0039315079197739106
get_robot_state_mean: 0.003727002607684366
get_robot_state_median: 0.0037215363760371096
get_robot_state_min: 0.0035334297588893344
get_state_dump_max: 0.004927435306587604
get_state_dump_mean: 0.0046840071813466656
get_state_dump_median: 0.004704689473354782
get_state_dump_min: 0.004399214472089495
get_ui_image_max: 0.03526779191683879
get_ui_image_mean: 0.03104400979983085
get_ui_image_median: 0.03141345927492891
get_ui_image_min: 0.02608132873262678
in-drivable-lane_max: 9.950000000000024
in-drivable-lane_mean: 4.9499999999999975
in-drivable-lane_min: 2.0999999999999925
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 3.0216631221771526, "get_ui_image": 0.029252101393306956, "step_physics": 0.10049438877265994, "survival_time": 11.850000000000032, "driven_lanedir": 0.27261553436634234, "get_state_dump": 0.004748844298995843, "get_robot_state": 0.0038183486762167025, "sim_render-ego0": 0.003925297440600996, "get_duckie_state": 1.343358464601661e-06, "in-drivable-lane": 9.950000000000024, "deviation-heading": 1.4107195214541324, "agent_compute-ego0": 0.01297949843046044, "complete-iteration": 0.16988725321633474, "set_robot_commands": 0.0022831113398575982, "deviation-center-line": 0.19590540594081715, "driven_lanedir_consec": 0.27261553436634234, "sim_compute_sim_state": 0.010257267150558344, "sim_compute_performance-ego0": 0.002038185336008794},
"LF-norm-zigzag-000-ego0": {"driven_any": 1.2746331024189033, "get_ui_image": 0.03526779191683879, "step_physics": 0.1358169770873753, "survival_time": 5.599999999999988, "driven_lanedir": 0.3527375269616988, "get_state_dump": 0.00466053464771372, "get_robot_state": 0.003624724075857517, "sim_render-ego0": 0.003774632394841287, "get_duckie_state": 1.1583345126261753e-06, "in-drivable-lane": 3.3999999999999906, "deviation-heading": 1.599448758093649, "agent_compute-ego0": 0.012418860882784414, "complete-iteration": 0.20949830840119216, "set_robot_commands": 0.002137903618601571, "deviation-center-line": 0.17202345944862188, "driven_lanedir_consec": 0.3527375269616988, "sim_compute_sim_state": 0.009773302922206642, "sim_compute_performance-ego0": 0.001940822179338573},
"LF-norm-techtrack-000-ego0": {"driven_any": 1.064989500130715, "get_ui_image": 0.03357481715655086, "step_physics": 0.14213212571962916, "survival_time": 4.899999999999991, "driven_lanedir": 0.6416825625224958, "get_state_dump": 0.004927435306587604, "get_robot_state": 0.0039315079197739106, "sim_render-ego0": 0.003974054798935399, "get_duckie_state": 1.2860153660629734e-06, "in-drivable-lane": 2.0999999999999925, "deviation-heading": 0.48240335188293815, "agent_compute-ego0": 0.013182440189400103, "complete-iteration": 0.21458939350012576, "set_robot_commands": 0.0022902681369974154, "deviation-center-line": 0.08409250648061849, "driven_lanedir_consec": 0.6416825625224958, "sim_compute_sim_state": 0.00837383848248106, "sim_compute_performance-ego0": 0.002113585520272303},
"LF-norm-small_loop-000-ego0": {"driven_any": 1.651991586685168, "get_ui_image": 0.02608132873262678, "step_physics": 0.09488168954849244, "survival_time": 6.949999999999983, "driven_lanedir": 0.6139573539652692, "get_state_dump": 0.004399214472089495, "get_robot_state": 0.0035334297588893344, "sim_render-ego0": 0.003622145312173026, "get_duckie_state": 1.1699540274483818e-06, "in-drivable-lane": 4.3499999999999845, "deviation-heading": 0.43476961628811217, "agent_compute-ego0": 0.011794824259621755, "complete-iteration": 0.15338381017957414, "set_robot_commands": 0.00224682092666626, "deviation-center-line": 0.15075187222192327, "driven_lanedir_consec": 0.6139573539652692, "sim_compute_sim_state": 0.00489722490310669, "sim_compute_performance-ego0": 0.0018431408064705985}}
set_robot_commands_max: 0.0022902681369974154
set_robot_commands_mean: 0.0022395260055307112
set_robot_commands_median: 0.002264966133261929
set_robot_commands_min: 0.002137903618601571
sim_compute_performance-ego0_max: 0.002113585520272303
sim_compute_performance-ego0_mean: 0.001983933460522567
sim_compute_performance-ego0_median: 0.001989503757673683
sim_compute_performance-ego0_min: 0.0018431408064705985
sim_compute_sim_state_max: 0.010257267150558344
sim_compute_sim_state_mean: 0.008325408364588184
sim_compute_sim_state_median: 0.009073570702343853
sim_compute_sim_state_min: 0.00489722490310669
sim_render-ego0_max: 0.003974054798935399
sim_render-ego0_mean: 0.003824032486637677
sim_render-ego0_median: 0.0038499649177211417
sim_render-ego0_min: 0.003622145312173026
simulation-passed: 1
step_physics_max: 0.14213212571962916
step_physics_mean: 0.11833129528203924
step_physics_median: 0.11815568293001764
step_physics_min: 0.09488168954849244
survival_time_max: 11.850000000000032
survival_time_mean: 7.324999999999999
survival_time_min: 4.899999999999991
Job 52296 | step: LFv-sim | status: error | up to date: no | duration: 0:01:47
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140532352630848
- M:video_aido:cmdline(in:/;out:/) 140532352312368
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
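The root cause of this failure is that PIL cannot parse `banner1.png` (a corrupt, empty, or non-image file), which then aborts the whole video step and invalidates the evaluation. A pre-flight check of such an asset can be sketched as follows; `is_readable_image` is an illustrative helper, not part of the Duckietown or procgraph tooling:

```python
from PIL import Image

def is_readable_image(path):
    """Return True if PIL can identify and parse `path` as an image."""
    try:
        with Image.open(path) as im:
            im.verify()  # integrity check without fully decoding the pixels
        return True
    except OSError:  # UnidentifiedImageError and FileNotFoundError are subclasses
        return False
```

Running a check like this against the banner asset before invoking the video pipeline would surface the bad file up front instead of deep inside the `video_aido` model update.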
Job 52286 | step: LFv-sim | status: error | up to date: no | duration: 0:04:46
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139999176921040
- M:video_aido:cmdline(in:/;out:/) 139999176920992
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 52282 | step: LFv-sim | status: host-error | up to date: no | duration: 0:05:31
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.
Job 52272 | step: LFv-sim | status: error | up to date: no | duration: 0:03:28
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140411876365888
- M:video_aido:cmdline(in:/;out:/) 140411876366272
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 41707 | step: LFv-sim | status: success | up to date: no | duration: 0:05:29
Job 38157 | step: LFv-sim | status: error | up to date: no | duration: 0:00:39
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9331/LFv-sim-mont05-227ea22a5fff-1-job38157-a-wd/challenge-results/challenge_results.yaml' does not exist.
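The NoResultsFound errors in these older jobs mean the runner looked for `challenge-results/challenge_results.yaml` under the job's working directory and the evaluator container had exited without writing it. The check can be sketched as follows; `find_challenge_results` is an illustrative helper, not the actual `duckietown_challenges` implementation:

```python
import os

def find_challenge_results(wd):
    """Return the path of challenge_results.yaml under the working dir `wd`,
    or None when the evaluator never produced it (the NoResultsFound case)."""
    path = os.path.join(wd, "challenge-results", "challenge_results.yaml")
    return path if os.path.isfile(path) else None
```

When a sketch like this returns None, the container logs (evaluator or solution) are the place to look, as the messages above suggest.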
Job 38147 | step: LFv-sim | status: error | up to date: no | duration: 0:00:46
The container "evaluator" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.49-py3.8.egg/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission9331/LFv-sim-mont04-e828c68b6a88-1-job38147-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 36306 | step: LFv-sim | status: error | up to date: no | duration: 0:00:47
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1088, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-Sandy1-sandy-1-job36306-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35736 | step: LFv-sim | status: error | up to date: no | duration: 0:00:50
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.

No results found! Something very wrong. 
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1063, in run_one
    cr = read_challenge_results(wd)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges/challenge_results.py", line 90, in read_challenge_results
    raise NoResultsFound(msg)
duckietown_challenges.challenge_results.NoResultsFound: File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-noname-sandy-1-job35736-a-wd/challenge-results/challenge_results.yaml' does not exist.
Job 35350 | step: LFv-sim | status: error | up to date: no | duration: 0:10:58
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-reg03-0c28c9d61367-1-job35350:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-reg03-0c28c9d61367-1-job35350/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish and some times that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-reg03-0c28c9d61367-1-job35350/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-reg03-0c28c9d61367-1-job35350/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-reg03-0c28c9d61367-1-job35350/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission9331/LFv-sim-reg03-0c28c9d61367-1-job35350/logs/challenges-runner/stderr.log
Job 34985 | step: LFv-sim | status: success | up to date: no | duration: 0:11:47
Job 34694 | step: LFv-sim | status: success | up to date: no | duration: 0:12:34
Job 34693 | step: LFv-sim | status: success | up to date: no | duration: 0:13:35