
Submission 6853

Submission: 6853
Competing: yes
Challenge: aido5-LF-sim-validation
User: Melisande Teng
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58508
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58508

Episodes (detailed per-episode statistics are available on the job page):

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | duration (messages follow each row)
58508 | LFv-sim | success | yes | 0:07:57
driven_lanedir_consec_median: 0.6984616085042363
survival_time_median: 12.650000000000045
deviation-center-line_median: 0.27538841854550866
in-drivable-lane_median: 7.775000000000052


other stats
agent_compute-ego0_max: 0.01323173249640116
agent_compute-ego0_mean: 0.012683165895449756
agent_compute-ego0_median: 0.012640950805460852
agent_compute-ego0_min: 0.012219029474476156
complete-iteration_max: 0.20510634831491223
complete-iteration_mean: 0.1792979692592438
complete-iteration_median: 0.1815318519883901
complete-iteration_min: 0.14902182474528272
deviation-center-line_max: 0.6912304162880201
deviation-center-line_mean: 0.37610055486747535
deviation-center-line_min: 0.26239496609086405
deviation-heading_max: 3.6796462369235488
deviation-heading_mean: 2.09566927377692
deviation-heading_median: 2.0255397763353145
deviation-heading_min: 0.6519513055135033
driven_any_max: 3.1361934036709584
driven_any_mean: 2.3055126422506347
driven_any_median: 2.2976554691395195
driven_any_min: 1.4905462270525416
driven_lanedir_consec_max: 1.1048671533265175
driven_lanedir_consec_mean: 0.7728086032238082
driven_lanedir_consec_min: 0.5894440425602423
driven_lanedir_max: 1.1048671533265175
driven_lanedir_mean: 0.7939955610427359
driven_lanedir_median: 0.7230784183999477
driven_lanedir_min: 0.6249582540445311
get_duckie_state_max: 1.3510386149088542e-06
get_duckie_state_mean: 1.2957468311379697e-06
get_duckie_state_median: 1.3306526182382643e-06
get_duckie_state_min: 1.1706434731664954e-06
get_robot_state_max: 0.0037511035248085304
get_robot_state_mean: 0.0035908189413803053
get_robot_state_median: 0.0035563763046413347
get_robot_state_min: 0.003499419631430022
get_state_dump_max: 0.004825359509315019
get_state_dump_mean: 0.004592511658086602
get_state_dump_median: 0.004543392602339767
get_state_dump_min: 0.004457901918351857
get_ui_image_max: 0.036103185072902166
get_ui_image_mean: 0.030872198257116366
get_ui_image_median: 0.030632581500108256
get_ui_image_min: 0.02612044495534679
in-drivable-lane_max: 10.200000000000076
in-drivable-lane_mean: 7.512500000000042
in-drivable-lane_min: 4.2999999999999865
per-episodes details:
{"LF-norm-loop-000-ego0": {"driven_any": 3.1361934036709584, "get_ui_image": 0.028698836579734897, "step_physics": 0.10790909514015104, "survival_time": 16.150000000000095, "driven_lanedir": 1.1048671533265175, "get_state_dump": 0.004825359509315019, "get_robot_state": 0.0037511035248085304, "sim_render-ego0": 0.003799304550076709, "get_duckie_state": 1.3510386149088542e-06, "in-drivable-lane": 9.300000000000075, "deviation-heading": 3.6796462369235488, "agent_compute-ego0": 0.01265058841234372, "complete-iteration": 0.17602002841454964, "set_robot_commands": 0.002242076544114101, "deviation-center-line": 0.6912304162880201, "driven_lanedir_consec": 1.1048671533265175, "sim_compute_sim_state": 0.010072070875285584, "sim_compute_performance-ego0": 0.0019861459732055664},
 "LF-norm-zigzag-000-ego0": {"driven_any": 2.668006649631925, "get_ui_image": 0.036103185072902166, "step_physics": 0.13001617263345158, "survival_time": 14.40000000000007, "driven_lanedir": 0.6249582540445311, "get_state_dump": 0.004457901918351857, "get_robot_state": 0.003499419631430022, "sim_render-ego0": 0.0036915404573856342, "get_duckie_state": 1.1706434731664954e-06, "in-drivable-lane": 10.200000000000076, "deviation-heading": 2.303062935448712, "agent_compute-ego0": 0.012631313198577987, "complete-iteration": 0.20510634831491223, "set_robot_commands": 0.002046795452342314, "deviation-center-line": 0.26239496609086405, "driven_lanedir_consec": 0.5894440425602423, "sim_compute_sim_state": 0.010737179059883304, "sim_compute_performance-ego0": 0.0018521444195282088},
 "LF-norm-techtrack-000-ego0": {"driven_any": 1.4905462270525416, "get_ui_image": 0.032566326420481614, "step_physics": 0.11551327821685048, "survival_time": 8.14999999999998, "driven_lanedir": 0.7189822446332333, "get_state_dump": 0.004577844608120802, "get_robot_state": 0.0035742986492994355, "sim_render-ego0": 0.004037428192976044, "get_duckie_state": 1.346192708829554e-06, "in-drivable-lane": 4.2999999999999865, "deviation-heading": 0.6519513055135033, "agent_compute-ego0": 0.01323173249640116, "complete-iteration": 0.18704367556223056, "set_robot_commands": 0.002438469630915944, "deviation-center-line": 0.2642574045086618, "driven_lanedir_consec": 0.7189822446332333, "sim_compute_sim_state": 0.008979541499440262, "sim_compute_performance-ego0": 0.002051981484017721},
 "LF-norm-small_loop-000-ego0": {"driven_any": 1.927304288647114, "get_ui_image": 0.02612044495534679, "step_physics": 0.08976003459599464, "survival_time": 10.90000000000002, "driven_lanedir": 0.7271745921666619, "get_state_dump": 0.004508940596558732, "get_robot_state": 0.0035384539599832335, "sim_render-ego0": 0.003672272103017868, "get_duckie_state": 1.315112527646975e-06, "in-drivable-lane": 6.250000000000028, "deviation-heading": 1.7480166172219167, "agent_compute-ego0": 0.012219029474476156, "complete-iteration": 0.14902182474528272, "set_robot_commands": 0.002090620667967078, "deviation-center-line": 0.2865194325823555, "driven_lanedir_consec": 0.6779409723752394, "sim_compute_sim_state": 0.005145677148479305, "sim_compute_performance-ego0": 0.0018943227045067912}}
set_robot_commands_max: 0.002438469630915944
set_robot_commands_mean: 0.0022044905738348593
set_robot_commands_median: 0.0021663486060405895
set_robot_commands_min: 0.002046795452342314
sim_compute_performance-ego0_max: 0.002051981484017721
sim_compute_performance-ego0_mean: 0.001946148645314572
sim_compute_performance-ego0_median: 0.001940234338856179
sim_compute_performance-ego0_min: 0.0018521444195282088
sim_compute_sim_state_max: 0.010737179059883304
sim_compute_sim_state_mean: 0.008733617145772112
sim_compute_sim_state_median: 0.009525806187362923
sim_compute_sim_state_min: 0.005145677148479305
sim_render-ego0_max: 0.004037428192976044
sim_render-ego0_mean: 0.003800136325864064
sim_render-ego0_median: 0.0037454225037311713
sim_render-ego0_min: 0.003672272103017868
simulation-passed: 1
step_physics_max: 0.13001617263345158
step_physics_mean: 0.11079964514661192
step_physics_median: 0.11171118667850076
step_physics_min: 0.08976003459599464
survival_time_max: 16.150000000000095
survival_time_mean: 12.40000000000004
survival_time_min: 8.14999999999998
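
The aggregate rows above are computed over the four episodes listed in the per-episodes details. As a sanity check, here is a minimal Python sketch; it is not part of the evaluator and assumes the per-episodes JSON above has been saved locally under the hypothetical filename per_episodes.json:

    import json
    from statistics import mean, median

    # Load the per-episode dictionary shown in "per-episodes details" above.
    with open("per_episodes.json") as f:
        episodes = json.load(f)

    # Collect one metric across the four episodes and recompute its aggregates.
    survival_times = [ep["survival_time"] for ep in episodes.values()]

    print("median:", median(survival_times))  # ~12.65, matches survival_time_median
    print("mean:  ", mean(survival_times))    # ~12.40, matches survival_time_mean
    print("min:   ", min(survival_times))     # ~8.15,  matches survival_time_min
    print("max:   ", max(survival_times))     # ~16.15, matches survival_time_max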
58507 | LFv-sim | success | yes | 0:07:25
52438 | LFv-sim | error | no | 0:02:31
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 139891658097136
- M:video_aido:cmdline(in:/;out:/) 139891657906256
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
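
The root cause in this job is that PIL could not identify 'banner1.png' as an image while the experiment manager was assembling the episode video. The snippet below is a minimal local check, not part of the evaluator, that reproduces the distinction PIL makes between a missing file and an unreadable one (assumes Pillow 7 or newer, which exposes UnidentifiedImageError):

    from PIL import Image, UnidentifiedImageError

    # Image.open() raises UnidentifiedImageError when the file exists but is
    # empty or not in a recognizable image format.
    try:
        with Image.open("banner1.png") as im:
            im.verify()  # lightweight integrity check, no full decode
        print("banner1.png looks like a valid image")
    except FileNotFoundError:
        print("banner1.png is missing")
    except UnidentifiedImageError:
        print("banner1.png exists but is not a recognizable image (empty or corrupted?)")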
52428 | LFv-sim | error | no | 0:05:21
InvalidEvaluator: cannot identify image file 'banner1.png' (same traceback as job 52438 above).
52427 | LFv-sim | error | no | 0:04:25
InvalidEvaluator: cannot identify image file 'banner1.png' (same traceback as job 52438 above).
41786 | LFv-sim | success | no | 0:07:50
41785 | LFv-sim | success | no | 0:08:23
38334 | LFv-sim | success | no | 0:07:58
36435 | LFv-sim | success | no | 0:09:20
35847 | LFv-sim | success | no | 0:01:02
35436 | LFv-sim | error | no | 0:19:38
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6853/LFv-sim-reg04-c054faef3177-1-job35436:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6853/LFv-sim-reg04-c054faef3177-1-job35436/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6853/LFv-sim-reg04-c054faef3177-1-job35436/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6853/LFv-sim-reg04-c054faef3177-1-job35436/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6853/LFv-sim-reg04-c054faef3177-1-job35436/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6853/LFv-sim-reg04-c054faef3177-1-job35436/logs/challenges-runner/stderr.log
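
The runner is only checking for the presence of challenge-results/challenge_results.yaml in the job's working directory. A rough local equivalent of that check, using a hypothetical path and assuming PyYAML is installed:

    import os
    import yaml  # PyYAML, assumed available

    workdir = "/path/to/job-working-dir"  # hypothetical: a copy of the job execution directory
    results_file = os.path.join(workdir, "challenge-results", "challenge_results.yaml")

    if not os.path.exists(results_file):
        # Mirrors the error above: the evaluator never wrote its results.
        print("No challenge_results.yaml; check logs/challenges-runner/stderr.log")
    else:
        with open(results_file) as f:
            print(yaml.safe_load(f))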
35119 | LFv-sim | success | no | 0:20:49
33461 | LFv-sim | success | no | 0:24:43
33453 | LFv-sim | success | no | 0:10:58
33452 | LFv-sim | error | no | 0:00:58
The container "solut [...]
The container "solution" exited with code 1.


Look at the logs for the container to know more about the error.