Submission 11625

Submission: 11625
Competing: yes
Challenge: aido5-LF-sim-validation
User: Philippe Reddy 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 54243
Next:
User label: sim-exercise-2
Admin priority: 50
Blessing: n/a
User priority: 50

Job 54243

Click the images to see detailed statistics about each episode.

Episodes:
- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job 54243: step LFv-sim, status success, up to date: yes, duration 0:23:43
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 2.5678164030227726
survival_time_median: 39.17499999999991
deviation-center-line_median: 2.6830079340266533
in-drivable-lane_median: 6.649999999999871
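The four median values above are aggregated across the four episodes of this job. As a sanity check, the sketch below recomputes `survival_time_median` from the per-episode survival times reported further down this page (with four episodes, the median is the mean of the two middle values):

```python
import statistics

# Survival times (s) for the four episodes, copied from the
# "per-episodes details" section of this job.
survival_times = [
    36.75000000000005,   # LF-norm-loop-000
    8.74999999999999,    # LF-norm-zigzag-000
    41.599999999999774,  # LF-norm-techtrack-000
    59.99999999999873,   # LF-norm-small_loop-000
]

# For an even number of samples, statistics.median averages the
# two middle values: (36.75 + 41.6) / 2.
median = statistics.median(survival_times)
print(median)  # ≈ 39.175, matching survival_time_median above
```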


Other stats:
agent_compute-ego0_max: 0.012871761425059526
agent_compute-ego0_mean: 0.012627699501808663
agent_compute-ego0_median: 0.012629617123382366
agent_compute-ego0_min: 0.012379802335410393
complete-iteration_max: 0.2810884429649873
complete-iteration_mean: 0.24430952069296197
complete-iteration_median: 0.2453081469678174
complete-iteration_min: 0.2055333458712258
deviation-center-line_max: 3.2407290615259243
deviation-center-line_mean: 2.220331535896406
deviation-center-line_min: 0.2745812140063952
deviation-heading_max: 15.075714100064092
deviation-heading_mean: 9.777929491770642
deviation-heading_median: 11.14824059801056
deviation-heading_min: 1.7395226709973557
driven_any_max: 7.9197676902302145
driven_any_mean: 4.765496093490761
driven_any_median: 5.059332899178727
driven_any_min: 1.0235508853753712
driven_lanedir_consec_max: 3.5364109523922407
driven_lanedir_consec_mean: 2.3167499197361803
driven_lanedir_consec_min: 0.5949559205069339
driven_lanedir_max: 4.14308340955313
driven_lanedir_mean: 2.7717641199343364
driven_lanedir_median: 3.174508574838641
driven_lanedir_min: 0.5949559205069339
get_duckie_state_max: 2.277934032937755e-06
get_duckie_state_mean: 2.246861441665906e-06
get_duckie_state_median: 2.256828292512883e-06
get_duckie_state_min: 2.195855148700105e-06
get_robot_state_max: 0.003928605805743824
get_robot_state_mean: 0.003873895064597555
get_robot_state_median: 0.003887155675845751
get_robot_state_min: 0.003792663100954893
get_state_dump_max: 0.004890295798363893
get_state_dump_mean: 0.00482997135145829
get_state_dump_median: 0.004874560316453578
get_state_dump_min: 0.0046804689745621126
get_ui_image_max: 0.03836642747575587
get_ui_image_mean: 0.0324053045412817
get_ui_image_median: 0.032071798733402074
get_ui_image_min: 0.027111193222566807
in-drivable-lane_max: 35.04999999999925
in-drivable-lane_mean: 12.64999999999974
in-drivable-lane_min: 2.249999999999961
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 4.740802131604438, "get_ui_image": 0.02956474345663319, "step_physics": 0.15758408476477084, "survival_time": 36.75000000000005, "driven_lanedir": 4.14308340955313, "get_state_dump": 0.004890295798363893, "get_robot_state": 0.003870839657990829, "sim_render-ego0": 0.003910747235235961, "get_duckie_state": 2.277934032937755e-06, "in-drivable-lane": 2.249999999999961, "deviation-heading": 11.60192114655155, "agent_compute-ego0": 0.012715282323567764, "complete-iteration": 0.2267210052713104, "set_robot_commands": 0.002300956650920536, "deviation-center-line": 2.585075098549351, "driven_lanedir_consec": 2.323026608760504, "sim_compute_sim_state": 0.00969582893278288, "sim_compute_performance-ego0": 0.002092382830122243}, "LF-norm-zigzag-000-ego0": {"driven_any": 1.0235508853753712, "get_ui_image": 0.03836642747575587, "step_physics": 0.20151173120195215, "survival_time": 8.74999999999999, "driven_lanedir": 0.5949559205069339, "get_state_dump": 0.004875431006604975, "get_robot_state": 0.003928605805743824, "sim_render-ego0": 0.003995082595131614, "get_duckie_state": 2.254139293323864e-06, "in-drivable-lane": 3.5, "deviation-heading": 1.7395226709973557, "agent_compute-ego0": 0.012543951923196966, "complete-iteration": 0.2810884429649873, "set_robot_commands": 0.002374076030471108, "deviation-center-line": 0.2745812140063952, "driven_lanedir_consec": 0.5949559205069339, "sim_compute_sim_state": 0.011241552504626188, "sim_compute_performance-ego0": 0.002154766158624129}, "LF-norm-techtrack-000-ego0": {"driven_any": 5.377863666753017, "get_ui_image": 0.03457885401017096, "step_physics": 0.18639799068812707, "survival_time": 41.599999999999774, "driven_lanedir": 3.5364109523922407, "get_state_dump": 0.004873689626302181, "get_robot_state": 0.0039034716937006737, "sim_render-ego0": 0.00391846372872269, "get_duckie_state": 2.195855148700105e-06, "in-drivable-lane": 9.79999999999974, "deviation-heading": 15.075714100064092,
"agent_compute-ego0": 0.012871761425059526, "complete-iteration": 0.2638952886643244, "set_robot_commands": 0.002360876868752872, "deviation-center-line": 3.2407290615259243, "driven_lanedir_consec": 3.5364109523922407, "sim_compute_sim_state": 0.012730576983448408, "sim_compute_performance-ego0": 0.0021641002554280988}, "LF-norm-small_loop-000-ego0": {"driven_any": 7.9197676902302145, "get_ui_image": 0.027111193222566807, "step_physics": 0.14306487687719951, "survival_time": 59.99999999999873, "driven_lanedir": 2.812606197285042, "get_state_dump": 0.0046804689745621126, "get_robot_state": 0.003792663100954893, "sim_render-ego0": 0.003795382184450275, "get_duckie_state": 2.259517291701902e-06, "in-drivable-lane": 35.04999999999925, "deviation-heading": 10.69456004946957, "agent_compute-ego0": 0.012379802335410393, "complete-iteration": 0.2055333458712258, "set_robot_commands": 0.002359379141852818, "deviation-center-line": 2.7809407695039554, "driven_lanedir_consec": 2.812606197285042, "sim_compute_sim_state": 0.006240264660710598, "sim_compute_performance-ego0": 0.00201763677954376}}
set_robot_commands_max: 0.002374076030471108
set_robot_commands_mean: 0.0023488221729993336
set_robot_commands_median: 0.002360128005302846
set_robot_commands_min: 0.002300956650920536
sim_compute_performance-ego0_max: 0.0021641002554280988
sim_compute_performance-ego0_mean: 0.002107221505929557
sim_compute_performance-ego0_median: 0.002123574494373186
sim_compute_performance-ego0_min: 0.00201763677954376
sim_compute_sim_state_max: 0.012730576983448408
sim_compute_sim_state_mean: 0.00997705577039202
sim_compute_sim_state_median: 0.010468690718704533
sim_compute_sim_state_min: 0.006240264660710598
sim_render-ego0_max: 0.003995082595131614
sim_render-ego0_mean: 0.0039049189358851353
sim_render-ego0_median: 0.003914605481979326
sim_render-ego0_min: 0.003795382184450275
simulation-passed: 1
step_physics_max: 0.20151173120195215
step_physics_mean: 0.17213967088301238
step_physics_median: 0.17199103772644897
step_physics_min: 0.14306487687719951
survival_time_max: 59.99999999999873
survival_time_mean: 36.77499999999963
survival_time_min: 8.74999999999999
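The `*_min`/`*_mean`/`*_max` rows above are derived from the "per-episodes details" JSON. A minimal sketch, using a trimmed copy of that blob (one metric kept per episode), shows how the aggregates for `driven_any` fall out:

```python
import json
import statistics

# Trimmed copy of the "per-episodes details" JSON above, keeping only
# the driven_any metric for each of the four episodes.
details = json.loads("""
{
  "LF-norm-loop-000-ego0":       {"driven_any": 4.740802131604438},
  "LF-norm-zigzag-000-ego0":     {"driven_any": 1.0235508853753712},
  "LF-norm-techtrack-000-ego0":  {"driven_any": 5.377863666753017},
  "LF-norm-small_loop-000-ego0": {"driven_any": 7.9197676902302145}
}
""")

values = [ep["driven_any"] for ep in details.values()]
print("driven_any_min ", min(values))              # 1.0235508853753712
print("driven_any_mean", statistics.mean(values))  # ≈ 4.765496093490761
print("driven_any_max ", max(values))              # 7.9197676902302145
```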
Job 54241: step LFv-sim, status success, up to date: yes, duration 0:29:21
Job 54240: step LFv-sim, status host-error, up to date: yes, duration 0:01:05
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 53, in get_services_id
    container = client.containers.get(container_id)
  File "/usr/local/lib/python3.8/dist-packages/docker/models/containers.py", line 880, in get
    resp = self.client.api.inspect_container(container_id)
  File "/usr/local/lib/python3.8/dist-packages/docker/utils/decorators.py", line 16, in wrapped
    raise errors.NullResource(
docker.errors.NullResource: Resource ID was not provided

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 777, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 991, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 120, in write_logs
    services2id: Dict[ServiceName, ContainerID] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 63, in get_services_id
    raise DockerComposeFail(msg, output=output.decode(), names=names) from e
duckietown_challenges_runner.docker_compose.DockerComposeFail: Cannot get process ids
│ output: ''
│  names: {}
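The `DockerComposeFail` above is triggered because docker-compose returned no container ids (`output: ''`, `names: {}`), and the Docker SDK raises `NullResource` ("Resource ID was not provided") when `client.containers.get()` is called with an empty id. A hedged sketch of the kind of guard that avoids this (an illustrative helper, not part of `duckietown-challenges-runner`, and no Docker daemon needed):

```python
def filter_valid_ids(container_ids):
    """Drop empty/None entries before calling client.containers.get(),
    which raises docker.errors.NullResource on a missing resource ID.
    (Illustrative helper, not the runner's actual code.)"""
    return [cid for cid in container_ids if cid]

# docker-compose can emit blank lines for services that never started;
# passing those straight to the Docker SDK reproduces the
# "Resource ID was not provided" error seen above.
raw_output = ["61272762b732", "", None]
print(filter_valid_ids(raw_output))  # ['61272762b732']
```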
Job 54237: step LFv-sim, status success, up to date: yes, duration 0:25:00
Job 49331: step LFv-sim, status error, up to date: no, duration 0:05:31
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139982905337792
- M:video_aido:cmdline(in:/;out:/) 139982903771200
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
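The root cause here is PIL's `UnidentifiedImageError`: the file 'banner1.png' existed but was not a recognizable image (a zero-byte or corrupted download produces exactly this failure). A minimal sketch of a pre-flight check that catches such files before they reach an image pipeline, using only the fixed 8-byte PNG signature (pure stdlib, no PIL required; the helper name is illustrative):

```python
import os
import tempfile

# Every valid PNG file starts with this fixed 8-byte signature (PNG spec).
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_png(path: str) -> bool:
    """Cheap sanity check before PIL.Image.open(); a file failing this
    would raise UnidentifiedImageError as in the traceback above."""
    try:
        with open(path, "rb") as f:
            return f.read(8) == PNG_SIGNATURE
    except OSError:
        return False

# Example: a zero-byte "banner.png" (as a truncated download would
# produce) fails the check.
fd, path = tempfile.mkstemp(suffix=".png")
os.close(fd)
print(looks_like_png(path))  # False: an empty file has no PNG signature
os.remove(path)
```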
Job 49323: step LFv-sim, status error, up to date: no, duration 0:09:32
InvalidEvaluator: the traceback is identical to job 49331 above (PIL cannot identify image file 'banner1.png', causing BadMethodCall in block 'static_image' and "InvalidEvaluator: Anomalous error while running episodes"; only the block object ids differ).
Job 43174: step LFv-sim, status success, up to date: no, duration 0:08:38
Job 43173: step LFv-sim, status host-error, up to date: no, duration 0:07:53
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 687, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 888, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 1388, in write_logs
    services2id: Dict[str, str] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 943, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [61272762b7328b8217066162bc15620e4a95c61129c0c4c5f511721581003e42,
│                 a8592e7c554c8b04d907a7e2b06d411bbf11d12dd214fc57787f7f3a1ff6c117,
│                 09e778c7fae56e9c6999aeb754e7d2709f306301513c4408cf87cb9cc658705f]
│      services: dict[4]
│                │ solution:
│                │ dict[5]
│                │ │ image: docker.io/phred123/aido-submissions@sha256:40032f700cbacb663060f4e4c6c6e0c3ce2c32754bfcf57470eb5aaa60c2caec
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/phred123/aido-submissions:2020_11_25_22_54_44@sha256:40032f700cbacb663060f4e4c6c6e0c3ce2c32754bfcf57470eb5aaa60c2caec
│                │ │ │ username: aido
│                │ │ │ uid: 0
│                │ │ │ USER: aido
│                │ │ │ HOME: /fake-home/aido
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-wd:/challenges:rw,
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-aido-home:/fake-home/aido:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ evaluator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-experiment_manager@sha256:48f0b2f34825d15848c3060740ab95a99254d233959586b75aeedf03f1ed5832
│                │ │ environment:
│                │ │ dict[8]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 15.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 20200922
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |
│                │ │ │ |port: 10123 # visualization port
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/phred123/aido-submissions:2020_11_25_22_54_44@sha256:40032f700cbacb663060f4e4c6c6e0c3ce2c32754bfcf57470eb5aaa60c2caec
│                │ │ │ username: aido
│                │ │ │ uid: 0
│                │ │ │ USER: aido
│                │ │ │ HOME: /fake-home/aido
│                │ │ ports: [10123]
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-wd:/challenges:rw,
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-aido-home:/fake-home/aido:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:953ba9f1437e3f267db1f4ff2e58340de33f6c207283f26def16fc8612b9506e
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/phred123/aido-submissions:2020_11_25_22_54_44@sha256:40032f700cbacb663060f4e4c6c6e0c3ce2c32754bfcf57470eb5aaa60c2caec
│                │ │ │ username: aido
│                │ │ │ uid: 0
│                │ │ │ USER: aido
│                │ │ │ HOME: /fake-home/aido
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-wd:/challenges:rw,
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-aido-home:/fake-home/aido:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ scenario_maker:
│                │ dict[5]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-scenario_maker@sha256:1562e17ba46090cdedbe68b8584827f1b3730f2786f6abd52e8790b4f0f1e63c
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |maps:
│                │ │ │ |- ETHZ_autolab_technical_track
│                │ │ │ |scenarios_per_map: 4
│                │ │ │ |robots_npcs: []
│                │ │ │ |robots_pcs: [ego]
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/scenario_maker-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/scenario_maker-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ SUBMISSION_CONTAINER: docker.io/phred123/aido-submissions:2020_11_25_22_54_44@sha256:40032f700cbacb663060f4e4c6c6e0c3ce2c32754bfcf57470eb5aaa60c2caec
│                │ │ │ username: aido
│                │ │ │ uid: 0
│                │ │ │ USER: aido
│                │ │ │ HOME: /fake-home/aido
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-wd:/challenges:rw,
│                │ │  /mnt/Data/duckietown-tmp-evals/production/aido5-LF-sim-validation/submission11625/LFv-sim-mont01-ee9dcacd535e-1-job43173-a-fifos:/fifos:rw,
│                │ │  /tmp/fake-aido-home:/fake-home/aido:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: dict[3]
│                │ evaluator: 61272762b7328b8217066162bc15620e4a95c61129c0c4c5f511721581003e42
│                │ simulator: a8592e7c554c8b04d907a7e2b06d411bbf11d12dd214fc57787f7f3a1ff6c117
│                │ solution: 09e778c7fae56e9c6999aeb754e7d2709f306301513c4408cf87cb9cc658705f
│         names: dict[3]
│                │ 61272762b7328b8217066162bc15620e4a95c61129c0c4c5f511721581003e42: mont01-ee9dcacd535e-1-job43173-663263_evaluator_1
│                │ a8592e7c554c8b04d907a7e2b06d411bbf11d12dd214fc57787f7f3a1ff6c117: mont01-ee9dcacd535e-1-job43173-663263_simulator_1
│                │ 09e778c7fae56e9c6999aeb754e7d2709f306301513c4408cf87cb9cc658705f: mont01-ee9dcacd535e-1-job43173-663263_solution_1
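Reading the `ZValueError` dump above: the compose file defines four services (solution, evaluator, simulator, scenario_maker), but `res` contains only three containers; the scenario_maker container apparently never started, which is what the runner's consistency check rejects. A hedged sketch reproducing that mismatch check (service names are from the dump; the truncated ids and the check itself are illustrative, not the runner's actual code):

```python
# Four services declared in the compose config above...
services = {"solution", "evaluator", "simulator", "scenario_maker"}

# ...but only three containers resolved (ids truncated for illustration).
res = {
    "evaluator": "61272762b732...",
    "simulator": "a8592e7c554c...",
    "solution":  "09e778c7fae5...",
}

# The mismatch that triggers the ZValueError: a declared service
# with no corresponding container.
missing = services - set(res)
if missing:
    print(f"no container found for services: {sorted(missing)}")
    # prints: no container found for services: ['scenario_maker']
```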
Job 43171: step LFv-sim, status success, up to date: no, duration 0:11:57