
Submission 10819

Submission: 10819
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 57810
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 57810

Episodes (detailed per-episode statistics are available on the challenge site):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job 57810: step LFv-sim, status success, up to date: yes, duration 0:18:37
driven_lanedir_consec_median: 1.410047398830135
survival_time_median: 26.724999999999987
deviation-center-line_median: 0.6828498071857874
in-drivable-lane_median: 10.524999999999864


Other stats:
agent_compute-ego0_max: 0.013858462451995164
agent_compute-ego0_mean: 0.01294704547787648
agent_compute-ego0_median: 0.012822806028686646
agent_compute-ego0_min: 0.012284107402137456
complete-iteration_max: 0.20682934837277783
complete-iteration_mean: 0.1780386762585653
complete-iteration_median: 0.17352578202941835
complete-iteration_min: 0.1582737926026465
deviation-center-line_max: 1.3305865841317828
deviation-center-line_mean: 0.7309218908980757
deviation-center-line_min: 0.22740136508894512
deviation-heading_max: 10.369406616383426
deviation-heading_mean: 4.196299425866151
deviation-heading_median: 2.786013595111145
deviation-heading_min: 0.8437638968588876
driven_any_max: 7.95029727349804
driven_any_mean: 4.029561563518762
driven_any_median: 3.5451947351679687
driven_any_min: 1.0775595102410684
driven_lanedir_consec_max: 3.642207656136032
driven_lanedir_consec_mean: 1.7718617215864851
driven_lanedir_consec_min: 0.6251444325496383
driven_lanedir_max: 3.642207656136032
driven_lanedir_mean: 1.7992330172117286
driven_lanedir_median: 1.4647899900806218
driven_lanedir_min: 0.6251444325496383
get_duckie_state_max: 1.4426293185281329e-06
get_duckie_state_mean: 1.3347795592080524e-06
get_duckie_state_median: 1.3655923135948493e-06
get_duckie_state_min: 1.1653042911143786e-06
get_robot_state_max: 0.0037575822190182295
get_robot_state_mean: 0.003633200150268941
get_robot_state_median: 0.003649387063478663
get_robot_state_min: 0.003476444255100207
get_state_dump_max: 0.0047325905316279
get_state_dump_mean: 0.0045881210018274155
get_state_dump_median: 0.004641107170677864
get_state_dump_min: 0.004337679134326034
get_ui_image_max: 0.036107929223383795
get_ui_image_mean: 0.029995793046506906
get_ui_image_median: 0.02879639655973274
get_ui_image_min: 0.026282449843178332
in-drivable-lane_max: 39.19999999999904
in-drivable-lane_mean: 15.999999999999693
in-drivable-lane_min: 3.750000000000001
per-episodes
details{"LF-norm-loop-000-ego0": {"driven_any": 5.31402948592366, "get_ui_image": 0.027588243557204135, "step_physics": 0.10122472994512342, "survival_time": 39.299999999999905, "driven_lanedir": 3.642207656136032, "get_state_dump": 0.0045510889462927855, "get_robot_state": 0.003644148912635813, "sim_render-ego0": 0.003876205623680027, "get_duckie_state": 1.4426293185281329e-06, "in-drivable-lane": 12.149999999999654, "deviation-heading": 4.584902583015695, "agent_compute-ego0": 0.012284107402137456, "complete-iteration": 0.16821207206464328, "set_robot_commands": 0.0021797095956766015, "deviation-center-line": 1.079422039390939, "driven_lanedir_consec": 3.642207656136032, "sim_compute_sim_state": 0.01074556444015406, "sim_compute_performance-ego0": 0.0020224278582096707}, "LF-norm-zigzag-000-ego0": {"driven_any": 7.95029727349804, "get_ui_image": 0.036107929223383795, "step_physics": 0.12721292263860012, "survival_time": 59.99999999999873, "driven_lanedir": 2.272077434231623, "get_state_dump": 0.0047325905316279, "get_robot_state": 0.0037575822190182295, "sim_render-ego0": 0.004081545821038214, "get_duckie_state": 1.4182034380529247e-06, "in-drivable-lane": 39.19999999999904, "deviation-heading": 10.369406616383426, "agent_compute-ego0": 0.013858462451995164, "complete-iteration": 0.20682934837277783, "set_robot_commands": 0.002233567781789813, "deviation-center-line": 1.3305865841317828, "driven_lanedir_consec": 2.1625922517306497, "sim_compute_sim_state": 0.012688053537665755, "sim_compute_performance-ego0": 0.0020600659563380615}, "LF-norm-techtrack-000-ego0": {"driven_any": 1.0775595102410684, "get_ui_image": 0.030004549562261347, "step_physics": 0.11221298608887062, "survival_time": 8.84999999999999, "driven_lanedir": 0.6251444325496383, "get_state_dump": 0.004337679134326034, "get_robot_state": 0.003476444255100207, "sim_render-ego0": 0.0038382926683747367, "get_duckie_state": 1.1653042911143786e-06, "in-drivable-lane": 3.750000000000001, 
"deviation-heading": 0.9871246072065948, "agent_compute-ego0": 0.012822783395145716, "complete-iteration": 0.17883949199419344, "set_robot_commands": 0.0022097933158445895, "deviation-center-line": 0.22740136508894512, "driven_lanedir_consec": 0.6251444325496383, "sim_compute_sim_state": 0.007976538679572973, "sim_compute_performance-ego0": 0.001876129193252392}, "LF-norm-small_loop-000-ego0": {"driven_any": 1.7763599844122773, "get_ui_image": 0.026282449843178332, "step_physics": 0.0972611711058818, "survival_time": 14.150000000000066, "driven_lanedir": 0.6575025459296202, "get_state_dump": 0.004731125395062943, "get_robot_state": 0.0036546252143215127, "sim_render-ego0": 0.004001227902694487, "get_duckie_state": 1.3129811891367738e-06, "in-drivable-lane": 8.900000000000077, "deviation-heading": 0.8437638968588876, "agent_compute-ego0": 0.012822828662227576, "complete-iteration": 0.1582737926026465, "set_robot_commands": 0.0022517299987900425, "deviation-center-line": 0.28627757498063555, "driven_lanedir_consec": 0.6575025459296202, "sim_compute_sim_state": 0.005229643532927607, "sim_compute_performance-ego0": 0.001948537960858412}}
set_robot_commands_max: 0.0022517299987900425
set_robot_commands_mean: 0.0022187001730252618
set_robot_commands_median: 0.002221680548817201
set_robot_commands_min: 0.0021797095956766015
sim_compute_performance-ego0_max: 0.0020600659563380615
sim_compute_performance-ego0_mean: 0.001976790242164634
sim_compute_performance-ego0_median: 0.001985482909534041
sim_compute_performance-ego0_min: 0.001876129193252392
sim_compute_sim_state_max: 0.012688053537665755
sim_compute_sim_state_mean: 0.0091599500475801
sim_compute_sim_state_median: 0.009361051559863515
sim_compute_sim_state_min: 0.005229643532927607
sim_render-ego0_max: 0.004081545821038214
sim_render-ego0_mean: 0.0039493180039468665
sim_render-ego0_median: 0.003938716763187257
sim_render-ego0_min: 0.0038382926683747367
simulation-passed: 1
step_physics_max: 0.12721292263860012
step_physics_mean: 0.109477952444619
step_physics_median: 0.106718858016997
step_physics_min: 0.0972611711058818
survival_time_max: 59.99999999999873
survival_time_mean: 30.57499999999967
survival_time_min: 8.84999999999999
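The aggregate statistics above can be reproduced from the per-episode details: each `*_min`/`*_mean`/`*_median`/`*_max` entry is just that reduction over the four episodes. A minimal sketch, assuming the per-episode JSON has been parsed into a dict keyed by episode name (`aggregate` is an illustrative helper, not part of the Duckietown tooling; only the `survival_time` values from the JSON are reproduced here):

```python
import statistics

# Subset of the per-episode details shown above (survival_time only).
per_episode = {
    "LF-norm-loop-000-ego0": {"survival_time": 39.299999999999905},
    "LF-norm-zigzag-000-ego0": {"survival_time": 59.99999999999873},
    "LF-norm-techtrack-000-ego0": {"survival_time": 8.84999999999999},
    "LF-norm-small_loop-000-ego0": {"survival_time": 14.150000000000066},
}

def aggregate(per_episode, key):
    """Compute the min/mean/median/max aggregates reported for one metric."""
    values = [ep[key] for ep in per_episode.values()]
    return {
        f"{key}_min": min(values),
        f"{key}_mean": statistics.mean(values),
        f"{key}_median": statistics.median(values),
        f"{key}_max": max(values),
    }

stats = aggregate(per_episode, "survival_time")
```

With an even number of episodes, `statistics.median` averages the two middle values, which is why `survival_time_median` (≈26.725) matches no single episode.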
Job 57807: step LFv-sim, status success, up to date: yes, duration 0:19:07
Job 57805: step LFv-sim, status success, up to date: yes, duration 0:19:10
Job 51615: step LFv-sim, status error, up to date: no, duration 0:04:37
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140149398678688
- M:video_aido:cmdline(in:/;out:/) 140149397619184
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
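The root cause in job 51615 is PIL failing to identify `banner1.png` (typically a zero-byte or corrupt file), which procgraph wraps in a `ValueError` via exception chaining (`raise ... from e`), and the experiment manager in turn wraps as `InvalidEvaluator`. A minimal sketch of the same guard pattern, assuming Pillow is installed; `safe_imread` is an illustrative name, not the actual procgraph function:

```python
from PIL import Image, UnidentifiedImageError

def safe_imread(filename):
    # Mirrors the chained-exception pattern in the trace above: the
    # low-level PIL error becomes the __cause__ of a clearer ValueError.
    try:
        im = Image.open(filename)
        im.load()  # force decoding so truncated data fails here, not later
        return im
    except (UnidentifiedImageError, OSError) as e:
        raise ValueError(f'Could not open filename "{filename}".') from e
```

A zero-byte `banner1.png` reproduces exactly this failure: `Image.open` raises `UnidentifiedImageError`, and the caller sees the wrapped `ValueError` with the original error preserved as its cause.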
Job 51607: step LFv-sim, status host-error, up to date: no, duration 0:07:33
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 59, in get_services_id
    raise ZValueError(container_ids=container_ids, services=services, res=res, names=names)
zuper_commons.types.exceptions.ZValueError: 

│ container_ids: [432862db1ea0c7cfa20ab28ebf63e61c658127c20f4d32f88976457d41e073a7]
│      services: dict[3]
│                │ evaluator:
│                │ dict[7]
│                │ │ image: docker.io/andreacensi/aido5-lf-sim-validation-lfv-sim-evaluator@sha256:2bc9fe8514d570141f87b0626353cbc6aebad89d220fecc8876a008efd430515
│                │ │ environment:
│                │ │ dict[10]
│                │ │ │ experiment_manager_parameters:
│                │ │ │ |episodes_per_scenario: 1
│                │ │ │ |episode_length_s: 60.0
│                │ │ │ |min_episode_length_s: 0.0
│                │ │ │ |seed: 888
│                │ │ │ |physics_dt: 0.05
│                │ │ │ |max_failures: 2
│                │ │ │ |fifo_dir: /fifos
│                │ │ │ |sim_in: /fifos/simulator-in
│                │ │ │ |sim_out: /fifos/simulator-out
│                │ │ │ |sm_in: /fifos/scenario_maker-in
│                │ │ │ |sm_out: /fifos/scenario_maker-out
│                │ │ │ |timeout_initialization: 120
│                │ │ │ |timeout_regular: 120
│                │ │ │ |port: 10123
│                │ │ │ |scenarios:
│                │ │ │ |- /scenarios
│                │ │ │ |
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 10819
│                │ │ │ submitter_name: melisande
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_03_21_58_30@sha256:4a8db05667fc3fb319059ac7aae3716e138eb276b5b035628ce187002b3d221a
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ ports: [10123]
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_a4c6b6c14e99}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission10819/LFv-sim-gpu-prod-01_a4c6b6c14e99-job51607-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission10819/LFv-sim-gpu-prod-01_a4c6b6c14e99-job51607-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_02_09_52_39-35352/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ simulator:
│                │ dict[6]
│                │ │ image: docker.io/duckietown/challenge-aido_lf-simulator-gym@sha256:4848042f4d088b99b480cb1fc276e32f956d6b9dee27d70fcbaba500d8a8768c
│                │ │ environment:
│                │ │ dict[12]
│                │ │ │ AIDONODE_CONFIG:
│                │ │ │ |env_constructor: Simulator
│                │ │ │ |env_parameters:
│                │ │ │ |  max_steps: 500001 # we don't want the gym to reset itself
│                │ │ │ |  domain_rand: 0
│                │ │ │ |  camera_width: 640
│                │ │ │ |  camera_height: 480
│                │ │ │ |  distortion: true
│                │ │ │ |  num_tris_distractors: 0
│                │ │ │ |  color_ground: [0, 0.3, 0] # green
│                │ │ │ |  enable_leds: true
│                │ │ │ |
│                │ │ │ AIDONODE_DATA_IN: /fifos/simulator-in
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/simulator-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 10819
│                │ │ │ submitter_name: melisande
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_03_21_58_30@sha256:4a8db05667fc3fb319059ac7aae3716e138eb276b5b035628ce187002b3d221a
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_a4c6b6c14e99}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission10819/LFv-sim-gpu-prod-01_a4c6b6c14e99-job51607-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission10819/LFv-sim-gpu-prod-01_a4c6b6c14e99-job51607-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_02_09_52_39-35352/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│                │ solution-ego0:
│                │ dict[6]
│                │ │ image: docker.io/melisande/aido-submissions@sha256:4a8db05667fc3fb319059ac7aae3716e138eb276b5b035628ce187002b3d221a
│                │ │ environment:
│                │ │ dict[13]
│                │ │ │ AIDONODE_NAME: ego0
│                │ │ │ AIDONODE_DATA_IN: /fifos/ego0-in
│                │ │ │ AIDO_REQUIRE_GPU: 1
│                │ │ │ AIDONODE_DATA_OUT: fifo:/fifos/ego0-out
│                │ │ │ challenge_name: aido5-LF-sim-validation
│                │ │ │ challenge_step_name: LFv-sim
│                │ │ │ submission_id: 10819
│                │ │ │ submitter_name: melisande
│                │ │ │ SUBMISSION_CONTAINER: docker.io/melisande/aido-submissions:2020_11_03_21_58_30@sha256:4a8db05667fc3fb319059ac7aae3716e138eb276b5b035628ce187002b3d221a
│                │ │ │ username: ubuntu
│                │ │ │ uid: 0
│                │ │ │ USER: ubuntu
│                │ │ │ HOME: /fake-home/ubuntu
│                │ │ labels: {org.duckietown.created_by_runner: true, org.duckietown.runner_name: gpu-prod-01_a4c6b6c14e99}
│                │ │ user: 0:0
│                │ │ volumes:
│                │ │ [
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission10819/LFv-sim-gpu-prod-01_a4c6b6c14e99-job51607-a-wd:/challenges:rw,
│                │ │  /tmp/duckietown/aido5-LF-sim-validation/submission10819/LFv-sim-gpu-prod-01_a4c6b6c14e99-job51607-a-fifos:/fifos:rw,
│                │ │  /tmp/duckietown/dt-challenges-runner/20_12_02_09_52_39-35352/fake-ubuntu-home:/fake-home/ubuntu:rw]
│                │ │ networks: {evaluation: {aliases: [evaluation]} }
│           res: {solution-ego0: 432862db1ea0c7cfa20ab28ebf63e61c658127c20f4d32f88976457d41e073a7}
│         names: dict[1]
│                │ 432862db1ea0c7cfa20ab28ebf63e61c658127c20f4d32f88976457d41e073a7: gpu-prod-01_a4c6b6c14e99-job51607-870269_solution-ego0_1

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 777, in get_cr
    cr = run_single(
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/runner.py", line 991, in run_single
    write_logs(wd, project, services=config["services"])
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 120, in write_logs
    services2id: Dict[ServiceName, ContainerID] = get_services_id(wd, project, services)
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner/docker_compose.py", line 63, in get_services_id
    raise DockerComposeFail(msg, output=output.decode(), names=names) from e
duckietown_challenges_runner.docker_compose.DockerComposeFail: Cannot get process ids
│ output: |432862db1ea0c7cfa20ab28ebf63e61c658127c20f4d32f88976457d41e073a7
│         |
│  names: dict[1]
│         │ 432862db1ea0c7cfa20ab28ebf63e61c658127c20f4d32f88976457d41e073a7: gpu-prod-01_a4c6b6c14e99-job51607-870269_solution-ego0_1
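The `ZValueError` above records the mismatch directly: the compose project defines three services (`evaluator`, `simulator`, `solution-ego0`), but only one container ID was found, and `res` maps it to `solution-ego0` alone, so the runner could not collect logs for the other two. A minimal sketch of that matching step, assuming compose's conventional `<project>_<service>_<index>` container naming; `map_services_to_ids` is an illustrative stand-in, not the runner's actual code:

```python
def map_services_to_ids(services, container_ids, names):
    """Match compose services to container IDs via the container name;
    raise if any service has no running container, as in the trace above."""
    res = {}
    for cid in container_ids:
        name = names[cid]
        for svc in services:
            if f"_{svc}_" in name:
                res[svc] = cid
    missing = set(services) - set(res)
    if missing:
        raise ValueError(f"Cannot get container ids for services: {sorted(missing)}")
    return res
```

Run against the data in the dump (one container, named `..._solution-ego0_1`), this reproduces the failure: `evaluator` and `simulator` have no matching container, likely because those containers had already exited and been removed before the runner queried them.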
Job 51600: step LFv-sim, status host-error, up to date: no, duration 0:10:50
The container "evaluator" exited with code 1.

Look at the logs for the container to learn more about the error.
Job 40904: step LFv-sim, status success, up to date: no, duration 0:11:00
Job 38029: step LFv-sim, status success, up to date: no, duration 0:08:33