Duckietown Challenges

Submission 6817

Submission: 6817
Competing: yes
Challenge: aido5-LF-sim-validation
User: Anthony Courchesne 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58608
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58608

Episodes:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration
58608 | LFv-sim | success | yes | | | 0:20:07
Artefacts hidden. If you are the author, please login using the top-right link or use the dashboard.
driven_lanedir_consec_median: 2.1069994852979335
survival_time_median: 32.15000000000004
deviation-center-line_median: 1.4181389612590551
in-drivable-lane_median: 7.0500000000000735


Other stats:

agent_compute-ego0_max: 0.012814802508200367
agent_compute-ego0_mean: 0.012407074184303975
agent_compute-ego0_median: 0.012394351755211183
agent_compute-ego0_min: 0.012024790718593168
complete-iteration_max: 0.234248528557439
complete-iteration_mean: 0.1892088270579365
complete-iteration_median: 0.17807515445149236
complete-iteration_min: 0.16643647077132245
deviation-center-line_max: 3.281597209017729
deviation-center-line_mean: 1.6480767139539505
deviation-center-line_min: 0.47443172427996255
deviation-heading_max: 7.303409713730261
deviation-heading_mean: 4.038930044702247
deviation-heading_median: 3.3853092005261294
deviation-heading_min: 2.081692064026468
driven_any_max: 7.921176457117004
driven_any_mean: 4.419285506913345
driven_any_median: 4.127197321963932
driven_any_min: 1.5015709266085155
driven_lanedir_consec_max: 4.182696994793213
driven_lanedir_consec_mean: 2.4341788149141457
driven_lanedir_consec_min: 1.340019294267502
driven_lanedir_max: 7.244848545278506
driven_lanedir_mean: 3.199716702535469
driven_lanedir_median: 2.1069994852979335
driven_lanedir_min: 1.340019294267502
get_duckie_state_max: 1.3677801915152087e-06
get_duckie_state_mean: 1.3338164566664544e-06
get_duckie_state_median: 1.3353019643333535e-06
get_duckie_state_min: 1.2968817064839025e-06
get_robot_state_max: 0.0037997172803294903
get_robot_state_mean: 0.003721692238070424
get_robot_state_median: 0.003713792575013812
get_robot_state_min: 0.00365946652192458
get_state_dump_max: 0.004837973750367456
get_state_dump_mean: 0.004794823126657764
get_state_dump_median: 0.0048108433718800575
get_state_dump_min: 0.0047196320125034875
get_ui_image_max: 0.03649406663833126
get_ui_image_mean: 0.03121621929351901
get_ui_image_median: 0.03048579172589951
get_ui_image_min: 0.02739922708394576
in-drivable-lane_max: 19.44999999999972
in-drivable-lane_mean: 8.787499999999973
in-drivable-lane_min: 1.6000000000000227
per-episodes details:
{
"LF-norm-loop-000-ego0": {"driven_any": 7.921176457117004, "get_ui_image": 0.028237637234766418, "step_physics": 0.09988107212774958, "survival_time": 59.99999999999873, "driven_lanedir": 7.244848545278506, "get_state_dump": 0.0047918050910511385, "get_robot_state": 0.00365946652192458, "sim_render-ego0": 0.003730654021683184, "get_duckie_state": 1.3677801915152087e-06, "in-drivable-lane": 3.549999999999997, "deviation-heading": 7.303409713730261, "agent_compute-ego0": 0.012024790718593168, "complete-iteration": 0.16679481959759845, "set_robot_commands": 0.0022148035845093485, "deviation-center-line": 3.281597209017729, "driven_lanedir_consec": 4.182696994793213, "sim_compute_sim_state": 0.010182026125410018, "sim_compute_performance-ego0": 0.0019839193104308015},
"LF-norm-zigzag-000-ego0": {"driven_any": 1.5015709266085155, "get_ui_image": 0.03649406663833126, "step_physics": 0.15739567337497587, "survival_time": 12.35000000000004, "driven_lanedir": 1.340019294267502, "get_state_dump": 0.0048298816527089766, "get_robot_state": 0.003693536404640444, "sim_render-ego0": 0.0038541747677710743, "get_duckie_state": 1.2968817064839025e-06, "in-drivable-lane": 1.6000000000000227, "deviation-heading": 2.081692064026468, "agent_compute-ego0": 0.012814802508200367, "complete-iteration": 0.234248528557439, "set_robot_commands": 0.00224608471316676, "deviation-center-line": 1.0805742463991364, "driven_lanedir_consec": 1.340019294267502, "sim_compute_sim_state": 0.010759677617780624, "sim_compute_performance-ego0": 0.002069487687080137},
"LF-norm-techtrack-000-ego0": {"driven_any": 5.1462172315570704, "get_ui_image": 0.0327339462170326, "step_physics": 0.1162266525110804, "survival_time": 39.84999999999987, "driven_lanedir": 2.539687090505629, "get_state_dump": 0.0047196320125034875, "get_robot_state": 0.00373404874538718, "sim_render-ego0": 0.00387824627391079, "get_duckie_state": 1.3378927283418509e-06, "in-drivable-lane": 19.44999999999972, "deviation-heading": 3.903260095382132, "agent_compute-ego0": 0.012280714541748353, "complete-iteration": 0.18935548930538623, "set_robot_commands": 0.002272178355912517, "deviation-center-line": 1.755703676118974, "driven_lanedir_consec": 2.539687090505629, "sim_compute_sim_state": 0.011365832541520734, "sim_compute_performance-ego0": 0.002051691961168944},
"LF-norm-small_loop-000-ego0": {"driven_any": 3.108177412370794, "get_ui_image": 0.02739922708394576, "step_physics": 0.10309290399356764, "survival_time": 24.450000000000212, "driven_lanedir": 1.6743118800902386, "get_state_dump": 0.004837973750367456, "get_robot_state": 0.0037997172803294903, "sim_render-ego0": 0.003885751354451082, "get_duckie_state": 1.3327112003248566e-06, "in-drivable-lane": 10.55000000000015, "deviation-heading": 2.8673583056701273, "agent_compute-ego0": 0.012507988968674015, "complete-iteration": 0.16643647077132245, "set_robot_commands": 0.0024179259125067265, "deviation-center-line": 0.47443172427996255, "driven_lanedir_consec": 1.6743118800902386, "sim_compute_sim_state": 0.006374239921569824, "sim_compute_performance-ego0": 0.002027959239726164}
}
set_robot_commands_max: 0.0024179259125067265
set_robot_commands_mean: 0.002287748141523838
set_robot_commands_median: 0.0022591315345396384
set_robot_commands_min: 0.0022148035845093485
sim_compute_performance-ego0_max: 0.002069487687080137
sim_compute_performance-ego0_mean: 0.002033264549601512
sim_compute_performance-ego0_median: 0.002039825600447554
sim_compute_performance-ego0_min: 0.0019839193104308015
sim_compute_sim_state_max: 0.011365832541520734
sim_compute_sim_state_mean: 0.0096704440515703
sim_compute_sim_state_median: 0.01047085187159532
sim_compute_sim_state_min: 0.006374239921569824
sim_render-ego0_max: 0.003885751354451082
sim_render-ego0_mean: 0.003837206604454033
sim_render-ego0_median: 0.003866210520840932
sim_render-ego0_min: 0.003730654021683184
simulation-passed: 1
step_physics_max: 0.15739567337497587
step_physics_mean: 0.11914907550184337
step_physics_median: 0.10965977825232402
step_physics_min: 0.09988107212774958
survival_time_max: 59.99999999999873
survival_time_mean: 34.16249999999971
survival_time_min: 12.35000000000004
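The aggregate rows above are plain order statistics over the four episodes. As a sketch (hypothetical variable names; the values are copied verbatim from the per-episodes details), the deviation-center-line aggregates can be recomputed with Python's standard-library statistics module:

```python
import statistics

# Per-episode deviation-center-line values from the per-episodes details.
deviation_center_line = {
    "LF-norm-loop-000-ego0": 3.281597209017729,
    "LF-norm-zigzag-000-ego0": 1.0805742463991364,
    "LF-norm-techtrack-000-ego0": 1.755703676118974,
    "LF-norm-small_loop-000-ego0": 0.47443172427996255,
}

values = list(deviation_center_line.values())
summary = {
    "min": min(values),
    "max": max(values),
    "mean": statistics.mean(values),
    # With an even number of episodes, the median is the mean
    # of the two middle values.
    "median": statistics.median(values),
}
print(summary)
```

Note that with four episodes the reported median (1.4181...) matches none of the individual episodes; it is the average of the two middle values.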
No reset possible
58604 | LFv-sim | success | yes | | | 0:30:15
52542 | LFv-sim | error | no | | | 0:08:56
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139760136903600
- M:video_aido:cmdline(in:/;out:/) 139760136926208
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
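The root cause of the traceback above is that 'banner1.png' exists but is not a readable image, so PIL raises UnidentifiedImageError when the video pipeline loads it. A minimal stdlib pre-check could catch this before the pipeline runs; this helper is an assumption for illustration (it is not part of the evaluator, and it only detects files whose header is not a PNG signature, not deeper corruption):

```python
# PNG files begin with a fixed 8-byte signature.
PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def looks_like_png(path):
    """Hypothetical helper: True if the file starts with the PNG
    signature, False if it is missing, unreadable, or not a PNG."""
    try:
        with open(path, "rb") as f:
            return f.read(8) == PNG_SIGNATURE
    except OSError:
        return False
```

A validation step like this in the evaluator setup would have turned the late BadMethodCall failure into an early, clearer error about the banner asset.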
41834 | LFv-sim | success | no | | | 0:09:33
38418 | LFv-sim | success | no | | | 0:16:03
38417 | LFv-sim | success | no | | | 0:16:21
38415 | LFv-sim | success | no | | | 0:09:05
36478 | LFv-sim | success | no | | | 0:09:06
35893 | LFv-sim | success | no | | | 0:01:04
35558 | LFv-sim | error | no | | | 0:24:25
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-fe0070bfd2f4-1-job35558:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-fe0070bfd2f4-1-job35558/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-fe0070bfd2f4-1-job35558/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-fe0070bfd2f4-1-job35558/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-fe0070bfd2f4-1-job35558/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-fe0070bfd2f4-1-job35558/logs/challenges-runner/stderr.log
35545 | LFv-sim | aborted | no | | | 0:23:39
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-5753c726a5d0-1-job35545:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-5753c726a5d0-1-job35545/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-5753c726a5d0-1-job35545/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-5753c726a5d0-1-job35545/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-5753c726a5d0-1-job35545/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-5753c726a5d0-1-job35545/logs/challenges-runner/stderr.log
35520 | LFv-sim | timeout | no | | | 1:05:14
Job 35520 timed out: it ran for 3914 seconds, exceeding the 3600.0-second limit.
35476 | LFv-sim | aborted | no | | | 0:22:14
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-0c28c9d61367-1-job35476:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-0c28c9d61367-1-job35476/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-0c28c9d61367-1-job35476/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-0c28c9d61367-1-job35476/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-0c28c9d61367-1-job35476/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-0c28c9d61367-1-job35476/logs/challenges-runner/stderr.log
35163 | LFv-sim | success | no | | | 0:23:54
35162 | LFv-sim | success | no | | | 0:23:56
34438 | LFv-sim | success | no | | | 0:26:06
34437 | LFv-sim | success | no | | | 0:26:39
34268 | LFv-sim | aborted | no | | | 0:24:52
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg07-1f09cddcc73e-1-job34268:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg07-1f09cddcc73e-1-job34268/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish, and sometimes that there was an import error.
Check the evaluator log to see what happened.
33999 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-bf35e9d68df4-1-job33999'
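This and the following host-errors are all ENOSPC failures: the evaluator host ran out of disk space before the working directory could even be created. Such failures can be detected up front; a minimal sketch (a hypothetical pre-flight helper, not part of duckietown-challenges-runner, which simply calls os.makedirs and surfaces the OSError):

```python
import shutil

def has_free_space(path, required_bytes):
    """Return True if the filesystem holding `path` has at least
    `required_bytes` free, using shutil.disk_usage. Hypothetical
    pre-flight check before creating an execution working dir."""
    return shutil.disk_usage(path).free >= required_bytes
```

A runner could call this on the executions directory and refuse the job with a clear message instead of failing mid-setup with "No space left on device".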
33993 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg04-bf35e9d68df4-1-job33993'
33972 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg01-53440c9394b5-1-job33972'
33963 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg05-5ca0d35e6d82-1-job33963'
33955 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-c2bc3037870e-1-job33955'
33951 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg03-c2bc3037870e-1-job33951'
33942 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg07-c4e193407567-1-job33942'
33935 | LFv-sim | host-error | no | | | 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6817/LFv-sim-reg11-951de1eeccca-1-job33935'
33385 | LFv-sim | success | no | | | 0:14:37