
Submission 11049

Submission: 11049
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 56973
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 56973

Click the images to see detailed statistics about each episode:

LF-norm-loop-000
LF-norm-small_loop-000
LF-norm-techtrack-000
LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
56973 | LFv-sim | success | yes | | | 0:29:14 |
driven_lanedir_consec_median: 3.86446349643067
survival_time_median: 55.07499999999901
deviation-center-line_median: 1.4287181973351706
in-drivable-lane_median: 10.075000000000014


other stats (see the sketch after this list for how these summaries are computed from the per-episode details)
agent_compute-ego0_max: 0.04344900301224338
agent_compute-ego0_mean: 0.020724035409159857
agent_compute-ego0_median: 0.013487739430478372
agent_compute-ego0_min: 0.012471659763439282
complete-iteration_max: 0.1932903362452041
complete-iteration_mean: 0.18297176150864003
complete-iteration_median: 0.1823180014126104
complete-iteration_min: 0.17396070696413518
deviation-center-line_max: 2.3793357482449293
deviation-center-line_mean: 1.4308990811849005
deviation-center-line_min: 0.48682418182433074
deviation-heading_max: 11.501199858487457
deviation-heading_mean: 7.672409016608505
deviation-heading_median: 8.025283940708723
deviation-heading_min: 3.1378683265291194
driven_any_max: 7.513931591884645
driven_any_mean: 6.259979287338326
driven_any_median: 6.8039206501857015
driven_any_min: 3.918144257097255
driven_lanedir_consec_max: 7.270695903958918
driven_lanedir_consec_mean: 4.181811026539042
driven_lanedir_consec_min: 1.7276212093359082
driven_lanedir_max: 7.270695903958918
driven_lanedir_mean: 4.438004455923018
driven_lanedir_median: 4.376850355198623
driven_lanedir_min: 1.7276212093359082
get_duckie_state_max: 1.4290213584899903e-06
get_duckie_state_mean: 1.2563472205510978e-06
get_duckie_state_median: 1.2160334844089638e-06
get_duckie_state_min: 1.164300554896473e-06
get_robot_state_max: 0.003718084469437599
get_robot_state_mean: 0.0036101241316090073
get_robot_state_median: 0.003643690794548564
get_robot_state_min: 0.0034350304679013014
get_state_dump_max: 0.00469866693019867
get_state_dump_mean: 0.004562099818133029
get_state_dump_median: 0.004533321689846378
get_state_dump_min: 0.004483088962640691
get_ui_image_max: 0.033158961978184985
get_ui_image_mean: 0.029482982912010028
get_ui_image_median: 0.029778857831213926
get_ui_image_min: 0.02521525400742728
in-drivable-lane_max: 35.04999999999913
in-drivable-lane_mean: 13.79999999999979
in-drivable-lane_min: 0.0
per-episodes details: {"LF-norm-loop-000-ego0": {"driven_any": 3.918144257097255, "get_ui_image": 0.0281915083527565, "step_physics": 0.10606791190803053, "survival_time": 31.95000000000032, "driven_lanedir": 1.9832137902910192, "get_state_dump": 0.00469866693019867, "get_robot_state": 0.003718084469437599, "sim_render-ego0": 0.003927405551075935, "get_duckie_state": 1.4290213584899903e-06, "in-drivable-lane": 16.15000000000023, "deviation-heading": 3.1378683265291194, "agent_compute-ego0": 0.012859588861465454, "complete-iteration": 0.17396070696413518, "set_robot_commands": 0.0021839506924152376, "deviation-center-line": 0.48682418182433074, "driven_lanedir_consec": 1.9832137902910192, "sim_compute_sim_state": 0.010232184827327727, "sim_compute_performance-ego0": 0.0019974425435066224}, "LF-norm-zigzag-000-ego0": {"driven_any": 7.4362995088845585, "get_ui_image": 0.033158961978184985, "step_physics": 0.11892826392390549, "survival_time": 59.99999999999873, "driven_lanedir": 7.270695903958918, "get_state_dump": 0.004483088962640691, "get_robot_state": 0.0034350304679013014, "sim_render-ego0": 0.0037292320464274767, "get_duckie_state": 1.164300554896473e-06, "in-drivable-lane": 0.0, "deviation-heading": 10.166113930529468, "agent_compute-ego0": 0.01411588999949129, "complete-iteration": 0.1932903362452041, "set_robot_commands": 0.0020902180651840223, "deviation-center-line": 2.3793357482449293, "driven_lanedir_consec": 7.270695903958918, "sim_compute_sim_state": 0.011404965541245638, "sim_compute_performance-ego0": 0.0018673514843384096}, "LF-norm-techtrack-000-ego0": {"driven_any": 7.513931591884645, "get_ui_image": 0.03136620730967135, "step_physics": 0.1117663566727896, "survival_time": 59.149999999998776, "driven_lanedir": 6.770486920106226, "get_state_dump": 0.004546687611051508, "get_robot_state": 0.003677749754609288, "sim_render-ego0": 0.003840198992071925, "get_duckie_state": 1.2271307610176703e-06, "in-drivable-lane": 3.9999999999997966, "deviation-heading": 11.501199858487457, "agent_compute-ego0": 0.012471659763439282, "complete-iteration": 0.1832434321577485, "set_robot_commands": 0.0021694741539053015, "deviation-center-line": 1.9264746579376124, "driven_lanedir_consec": 5.745713202570321, "sim_compute_sim_state": 0.011354403922686706, "sim_compute_performance-ego0": 0.0019696556233070994}, "LF-norm-small_loop-000-ego0": {"driven_any": 6.1715417914868445, "get_ui_image": 0.02521525400742728, "step_physics": 0.08985205612033167, "survival_time": 50.99999999999924, "driven_lanedir": 1.7276212093359082, "get_state_dump": 0.004519955768641249, "get_robot_state": 0.00360963183448784, "sim_render-ego0": 0.003891979445439711, "get_duckie_state": 1.204936207800257e-06, "in-drivable-lane": 35.04999999999913, "deviation-heading": 5.884453950887979, "agent_compute-ego0": 0.04344900301224338, "complete-iteration": 0.18139257066747233, "set_robot_commands": 0.0022441285364074408, "deviation-center-line": 0.9309617367327288, "driven_lanedir_consec": 1.7276212093359082, "sim_compute_sim_state": 0.006571699192895245, "sim_compute_performance-ego0": 0.001953893964134164}}
set_robot_commands_max: 0.0022441285364074408
set_robot_commands_mean: 0.0021719428619780008
set_robot_commands_median: 0.0021767124231602698
set_robot_commands_min: 0.0020902180651840223
sim_compute_performance-ego0_max: 0.0019974425435066224
sim_compute_performance-ego0_mean: 0.0019470859038215737
sim_compute_performance-ego0_median: 0.0019617747937206317
sim_compute_performance-ego0_min: 0.0018673514843384096
sim_compute_sim_state_max: 0.011404965541245638
sim_compute_sim_state_mean: 0.00989081337103883
sim_compute_sim_state_median: 0.010793294375007218
sim_compute_sim_state_min: 0.006571699192895245
sim_render-ego0_max: 0.003927405551075935
sim_render-ego0_mean: 0.003847204008753762
sim_render-ego0_median: 0.0038660892187558182
sim_render-ego0_min: 0.0037292320464274767
simulation-passed: 1
step_physics_max: 0.11892826392390549
step_physics_mean: 0.10665364715626433
step_physics_median: 0.10891713429041006
step_physics_min: 0.08985205612033167
survival_time_max: 59.99999999999873
survival_time_mean: 50.52499999999927
survival_time_min: 31.95000000000032
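
The summary values above are simple per-metric aggregates (min, mean, median, max) over the four episodes listed in the per-episodes details. A minimal sketch that recomputes them, assuming the JSON value above has been saved locally as per_episodes.json (a hypothetical filename, not an artefact of this job):

    import json
    from statistics import mean, median

    # Load the per-episodes details shown above (assumed saved as per_episodes.json).
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Aggregate every per-episode metric across the four episodes, which
    # reproduces the min/mean/median/max values listed on this page.
    metrics = sorted({key for ep in episodes.values() for key in ep})
    for metric in metrics:
        values = [ep[metric] for ep in episodes.values()]
        print(f"{metric}_min: {min(values)}")
        print(f"{metric}_max: {max(values)}")
        print(f"{metric}_mean: {mean(values)}")
        print(f"{metric}_median: {median(values)}")

For example, driven_lanedir_consec has per-episode values 1.983, 7.271, 5.746 and 1.728, and the median of four values is the average of the two middle ones, giving the 3.864 reported above.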
56962 | LFv-sim | success | yes | | | 0:18:23 |
56960 | LFv-sim | success | yes | | | 0:26:08 |
56954 | LFv-sim | success | yes | | | 0:27:59 |
50824 | LFv-sim | error | no | | | 0:08:27 | InvalidEvaluator: Tr [...]
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140665505701024
- M:video_aido:cmdline(in:/;out:/) 140665505625088
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
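
This job, like the other InvalidEvaluator jobs below, failed while rendering the episode video: the procgraph static_image block asked PIL to open banner1.png, PIL could not identify the file as an image, and the experiment manager wrapped the failure in InvalidEvaluator. A standalone sketch (not the evaluator's own code) of how PIL.UnidentifiedImageError arises and how such an asset could be checked up front:

    from PIL import Image, UnidentifiedImageError

    def is_readable_image(path: str) -> bool:
        # True only if PIL can identify and parse the file as an image.
        try:
            with Image.open(path) as im:   # raises UnidentifiedImageError for non-image content
                im.verify()                # integrity check without decoding all pixel data
            return True
        except (FileNotFoundError, UnidentifiedImageError) as exc:
            print(f"{path}: {exc}")
            return False

    # 'banner1.png' is the asset named in the traceback above.
    print(is_readable_image("banner1.png"))

Since the exception is UnidentifiedImageError rather than FileNotFoundError, the file was present in the evaluator but did not contain valid image data (for example, an empty or truncated file).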
50812 | LFv-sim | error | no | | | 0:14:47 | InvalidEvaluator: Tr [...]
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140643015041232
- M:video_aido:cmdline(in:/;out:/) 140643015044640
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
50810 | LFv-sim | error | no | | | 0:08:56 | InvalidEvaluator: Tr [...]
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139932281181808
- M:video_aido:cmdline(in:/;out:/) 139932281182864
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
50795 | LFv-sim | error | no | | | 0:03:31 | InvalidEvaluator: Tr [...]
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139989960653264
- M:video_aido:cmdline(in:/;out:/) 139989960653792
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
40390 | LFv-sim | success | no | | | 0:09:17 |
40388 | LFv-sim | success | no | | | 0:09:14 |
40387 | LFv-sim | success | no | | | 0:09:20 |
40386 | LFv-sim | success | no | | | 0:09:40 |
40370 | LFv-sim | host-error | no | | | 0:12:27 | Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 687, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 888, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1388, in write_logs
    services2id: Dict[str, str] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 943, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [b1ab41c3f6ca2a9458bd228adf8528178c777eea61301b35f4338202e951a9a8,
│                 a366ed85daab005d26af95222be1b119513d86bd86dfec53223b0d20f7c7ff5e,
│                 7d40b38e05ab15ce69eac7434225fbd4b046a0a6bf49a10a73772883283c62cc]
│      services: dict[4]
│                │ solution:
│                │ dict[5]
│                │ │ image: docker.io/melisande/aido-submissions@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ evaluator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-experiment_manager@sha256:48f0b2f34825d15848c3060740ab95a99254d233959586b75aeedf03f1ed5832
│                │ │ environment:
│                │ │ dict[8]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 15.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 20200922
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |
│                │ │ │ |port: 10123 # visualization port
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ ports: [10123]
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:953ba9f1437e3f267db1f4ff2e58340de33f6c207283f26def16fc8612b9506e
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ scenario_maker:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-scenario_maker@sha256:1562e17ba46090cdedbe68b8584827f1b3730f2786f6abd52e8790b4f0f1e63c
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |maps:
│                │ │ │ |- ETHZ_autolab_technical_track
│                │ │ │ |scenarios_per_map: 4
│                │ │ │ |robots_npcs: []
│                │ │ │ |robots_pcs: [ego]
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/scenario_maker-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/scenario_maker-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg05-221dc42400b8-1-job40370-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: dict[3]
│                │ evaluator: b1ab41c3f6ca2a9458bd228adf8528178c777eea61301b35f4338202e951a9a8
│                │ simulator: a366ed85daab005d26af95222be1b119513d86bd86dfec53223b0d20f7c7ff5e
│                │ solution: 7d40b38e05ab15ce69eac7434225fbd4b046a0a6bf49a10a73772883283c62cc
│         names: dict[3]
│                │ b1ab41c3f6ca2a9458bd228adf8528178c777eea61301b35f4338202e951a9a8: reg05-221dc42400b8-1-job40370-668327_evaluator_1
│                │ a366ed85daab005d26af95222be1b119513d86bd86dfec53223b0d20f7c7ff5e: reg05-221dc42400b8-1-job40370-668327_simulator_1
│                │ 7d40b38e05ab15ce69eac7434225fbd4b046a0a6bf49a10a73772883283c62cc: reg05-221dc42400b8-1-job40370-668327_solution_1
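
The dump above shows the mismatch that triggered the ZValueError: four services are declared (solution, evaluator, simulator, scenario_maker) but only three container IDs, three res entries and three names are reported, i.e. no container was found for scenario_maker when the runner tried to collect per-service logs. A rough sketch of that kind of consistency check, reconstructed from the dump rather than taken from the actual duckietown_challenges_runner code:

    from typing import Dict, List

    def match_services_to_containers(services: Dict[str, dict],
                                     names: Dict[str, str]) -> Dict[str, str]:
        # `names` maps container ID -> compose container name, e.g.
        # "reg05-221dc42400b8-1-job40370-668327_simulator_1".
        service_to_id: Dict[str, str] = {}
        for service in services:
            matches: List[str] = [cid for cid, cname in names.items()
                                  if f"_{service}_" in cname]
            if matches:
                service_to_id[service] = matches[0]
        missing = [s for s in services if s not in service_to_id]
        if missing:
            # Roughly the condition behind the ZValueError above: a declared
            # service has no running container, so its logs cannot be written.
            raise ValueError(f"no container found for services: {missing}")
        return service_to_id

Fed the services and names from this dump, such a check would flag scenario_maker, which matches the three container IDs listed against four declared services.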
40368 | LFv-sim | success | no | | | 0:09:21 |
40367 | LFv-sim | host-error | no | | | 0:12:21 | Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 687, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 888, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1388, in write_logs
    services2id: Dict[str, str] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 943, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [357f743284d6656cc0653dfa1034655b6c454c6d1bab4891bcd98b875b2405f1,
│                 0958d722982f687c4f28a244ff8b98be5db46f11a6ed7eaea9c4dbf728590d6d,
│                 0ade1bebb496bd830ade89fb3a48d4d31b8d6d4e50fd05328add589c7dda6205]
│      services: dict[4]
│                │ solution:
│                │ dict[5]
│                │ │ image: docker.io/melisande/aido-submissions@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ evaluator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-experiment_manager@sha256:48f0b2f34825d15848c3060740ab95a99254d233959586b75aeedf03f1ed5832
│                │ │ environment:
│                │ │ dict[8]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 15.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 20200922
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |
│                │ │ │ |port: 10123 # visualization port
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ ports: [10123]
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:953ba9f1437e3f267db1f4ff2e58340de33f6c207283f26def16fc8612b9506e
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ scenario_maker:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-scenario_maker@sha256:1562e17ba46090cdedbe68b8584827f1b3730f2786f6abd52e8790b4f0f1e63c
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |maps:
│                │ │ │ |- ETHZ_autolab_technical_track
│                │ │ │ |scenarios_per_map: 4
│                │ │ │ |robots_npcs: []
│                │ │ │ |robots_pcs: [ego]
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/scenario_maker-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/scenario_maker-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg01-bf67cb02b1e0-1-job40367-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: dict[3]
│                │ evaluator: 357f743284d6656cc0653dfa1034655b6c454c6d1bab4891bcd98b875b2405f1
│                │ simulator: 0958d722982f687c4f28a244ff8b98be5db46f11a6ed7eaea9c4dbf728590d6d
│                │ solution: 0ade1bebb496bd830ade89fb3a48d4d31b8d6d4e50fd05328add589c7dda6205
│         names: dict[3]
│                │ 357f743284d6656cc0653dfa1034655b6c454c6d1bab4891bcd98b875b2405f1: reg01-bf67cb02b1e0-1-job40367-277764_evaluator_1
│                │ 0958d722982f687c4f28a244ff8b98be5db46f11a6ed7eaea9c4dbf728590d6d: reg01-bf67cb02b1e0-1-job40367-277764_simulator_1
│                │ 0ade1bebb496bd830ade89fb3a48d4d31b8d6d4e50fd05328add589c7dda6205: reg01-bf67cb02b1e0-1-job40367-277764_solution_1
40366 | LFv-sim | host-error | no | | | 0:12:19 | Uncaught exception: [...]
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 687, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 888, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1388, in write_logs
    services2id: Dict[str, str] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 943, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [2b57b2ced8c55de5535033f62bf3f3ac037b07df9613c47ac7b020553e795d1c,
│                 6727b4681f04f6fe87207512207b3ef1f4ef795a1016910fde6bb00ac02d35b1,
│                 78b9b993aa2a79892be9c23b2af90681f23d517607c0a919c10b55261526d705]
│      services: dict[4]
│                │ solution:
│                │ dict[5]
│                │ │ image: docker.io/melisande/aido-submissions@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ evaluator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-experiment_manager@sha256:48f0b2f34825d15848c3060740ab95a99254d233959586b75aeedf03f1ed5832
│                │ │ environment:
│                │ │ dict[8]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 15.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 20200922
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |
│                │ │ │ |port: 10123 # visualization port
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ ports: [10123]
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:953ba9f1437e3f267db1f4ff2e58340de33f6c207283f26def16fc8612b9506e
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ scenario_maker:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-scenario_maker@sha256:1562e17ba46090cdedbe68b8584827f1b3730f2786f6abd52e8790b4f0f1e63c
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |maps:
│                │ │ │ |- ETHZ_autolab_technical_track
│                │ │ │ |scenarios_per_map: 4
│                │ │ │ |robots_npcs: []
│                │ │ │ |robots_pcs: [ego]
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/scenario_maker-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/scenario_maker-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_04_22_11_18@sha256:bb5bf7f61d3ffe823b861d092df1eff8e64ad4cc7ae8ffdc98739f059b6891ff
│                │ │ │ username: andrea
│                │ │ │ uid: 0
│                │ │ │ USER: andrea
│                │ │ │ HOME: /fake-home/andrea
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-wd:/challenges:rw,
│                │ │  /media/sdb/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11049/LFv-sim-reg04-cb6499190cf5-1-job40366-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-andrea-home:/fake-home/andrea:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: dict[3]
│                │ evaluator: 2b57b2ced8c55de5535033f62bf3f3ac037b07df9613c47ac7b020553e795d1c
│                │ simulator: 6727b4681f04f6fe87207512207b3ef1f4ef795a1016910fde6bb00ac02d35b1
│                │ solution: 78b9b993aa2a79892be9c23b2af90681f23d517607c0a919c10b55261526d705
│         names: dict[3]
│                │ 2b57b2ced8c55de5535033f62bf3f3ac037b07df9613c47ac7b020553e795d1c: reg04-cb6499190cf5-1-job40366-342118_evaluator_1
│                │ 6727b4681f04f6fe87207512207b3ef1f4ef795a1016910fde6bb00ac02d35b1: reg04-cb6499190cf5-1-job40366-342118_simulator_1
│                │ 78b9b993aa2a79892be9c23b2af90681f23d517607c0a919c10b55261526d705: reg04-cb6499190cf5-1-job40366-342118_solution_1
40365 | LFv-sim | success | no | | | 0:10:07 |
39712 | LFv-sim | success | no | | | 0:11:44 |
38809 | LFv-sim | success | no | | | 0:08:34 |
38808 | LFv-sim | success | no | | | 0:11:29 |