
Submission 6826

Submission: 6826
Competing: yes
Challenge: aido5-LF-sim-validation
User: Liam Paull 🇨🇦
Date submitted:
Last status update:
Complete: complete
Details: Evaluation is complete.
Sisters:
Result: 💚
Jobs: LFv-sim: 58585
Next:
User label: exercise_ros_template
Admin priority: 50
Blessing: n/a
User priority: 50

Job 58585

Detailed per-episode statistics are available from the episode images on the original page. Episodes in this job:

- LF-norm-loop-000
- LF-norm-small_loop-000
- LF-norm-techtrack-000
- LF-norm-zigzag-000

Evaluation jobs for this submission

Job ID | step | status | up to date | date started | date completed | duration | message
Job 58585 | LFv-sim | success | up to date: yes | duration: 0:39:59
driven_lanedir_consec_median: 0.0
survival_time_median: 59.99999999999873
deviation-center-line_median: 1.2422730096440104
in-drivable-lane_median: 0.0


other stats
agent_compute-ego0_max: 0.012871899275259613
agent_compute-ego0_mean: 0.012033888839067367
agent_compute-ego0_median: 0.01177477509056301
agent_compute-ego0_min: 0.011714105899883843
complete-iteration_max: 0.325363203250399
complete-iteration_mean: 0.27015282867552337
complete-iteration_median: 0.2718610233510166
complete-iteration_min: 0.21152606474966135
deviation-center-line_max: 4.053503393024394
deviation-center-line_mean: 1.731069028434027
deviation-center-line_min: 0.386226701423694
deviation-heading_max: 27.859809596736422
deviation-heading_mean: 14.925250437835436
deviation-heading_median: 14.309269950178932
deviation-heading_min: 3.22265225424745
driven_any_max: 2.6645352591003757e-13
driven_any_mean: 1.9984014443252818e-13
driven_any_median: 2.6645352591003757e-13
driven_any_min: 0.0
driven_lanedir_consec_max: 0.000286102294921875
driven_lanedir_consec_mean: 7.152557373046875e-05
driven_lanedir_consec_min: 0.0
driven_lanedir_max: 0.000286102294921875
driven_lanedir_mean: 7.152557373046875e-05
driven_lanedir_median: 0.0
driven_lanedir_min: 0.0
get_duckie_state_max: 1.2135327011222743e-06
get_duckie_state_mean: 1.1444985122108935e-06
get_duckie_state_median: 1.1336297218646734e-06
get_duckie_state_min: 1.0972019039919534e-06
get_robot_state_max: 0.0035569052414334288
get_robot_state_mean: 0.003504283223719918
get_robot_state_median: 0.0035003739332378552
get_robot_state_min: 0.003459479786970534
get_state_dump_max: 0.004336709086841389
get_state_dump_mean: 0.004290294736549321
get_state_dump_median: 0.004297425506712495
get_state_dump_min: 0.004229618845930902
get_ui_image_max: 0.03507457883232936
get_ui_image_mean: 0.02991936273320728
get_ui_image_median: 0.02967556211771715
get_ui_image_min: 0.025251747865065448
in-drivable-lane_max: 0.0
in-drivable-lane_mean: 0.0
in-drivable-lane_min: 0.0
per-episodes details: {"LF-norm-loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.02760403301197722, "step_physics": 0.19024633984085323, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004229618845930902, "get_robot_state": 0.0035230178420093037, "sim_render-ego0": 0.003566476923539974, "get_duckie_state": 1.1015692718023066e-06, "in-drivable-lane": 0.0, "deviation-heading": 22.66279310353771, "agent_compute-ego0": 0.011742573991405478, "complete-iteration": 0.25320368027508405, "set_robot_commands": 0.002047683674528835, "deviation-center-line": 4.053503393024394, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.008333489063081892, "sim_compute_performance-ego0": 0.001834138843240984}, "LF-norm-zigzag-000-ego0": {"driven_any": 0.0, "get_ui_image": 0.03507457883232936, "step_physics": 0.25067231875474405, "survival_time": 59.99999999999873, "driven_lanedir": 0.000286102294921875, "get_state_dump": 0.004336709086841389, "get_robot_state": 0.0034777300244664073, "sim_render-ego0": 0.0036082309449741385, "get_duckie_state": 1.2135327011222743e-06, "in-drivable-lane": 0.0, "deviation-heading": 27.859809596736422, "agent_compute-ego0": 0.012871899275259613, "complete-iteration": 0.325363203250399, "set_robot_commands": 0.002067329484556835, "deviation-center-line": 1.0457540566888746, "driven_lanedir_consec": 0.000286102294921875, "sim_compute_sim_state": 0.011337207417801755, "sim_compute_performance-ego0": 0.001837739936517339}, "LF-norm-techtrack-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.03174709122345708, "step_physics": 0.22288956511129843, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.0042989895206803986, "get_robot_state": 0.0035569052414334288, "sim_render-ego0": 0.0035669603117498925, "get_duckie_state": 1.16569017192704e-06, "in-drivable-lane": 0.0, "deviation-heading": 3.22265225424745, "agent_compute-ego0": 0.011806976189720542, "complete-iteration": 0.2905183664269491, "set_robot_commands": 0.002067502392618782, "deviation-center-line": 1.4387919625991463, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.008664986970124891, "sim_compute_performance-ego0": 0.0018409833026666823}, "LF-norm-small_loop-000-ego0": {"driven_any": 2.6645352591003757e-13, "get_ui_image": 0.025251747865065448, "step_physics": 0.15347971685919337, "survival_time": 59.99999999999873, "driven_lanedir": 0.0, "get_state_dump": 0.004295861492744592, "get_robot_state": 0.003459479786970534, "sim_render-ego0": 0.003527635340885159, "get_duckie_state": 1.0972019039919534e-06, "in-drivable-lane": 0.0, "deviation-heading": 5.955746796820156, "agent_compute-ego0": 0.011714105899883843, "complete-iteration": 0.21152606474966135, "set_robot_commands": 0.0020183343672931045, "deviation-center-line": 0.386226701423694, "driven_lanedir_consec": 0.0, "sim_compute_sim_state": 0.005886746088134359, "sim_compute_performance-ego0": 0.0018150441156239631}}
set_robot_commands_max: 0.002067502392618782
set_robot_commands_mean: 0.002050212479749389
set_robot_commands_median: 0.002057506579542835
set_robot_commands_min: 0.0020183343672931045
sim_compute_performance-ego0_max: 0.0018409833026666823
sim_compute_performance-ego0_mean: 0.0018319765495122425
sim_compute_performance-ego0_median: 0.0018359393898791617
sim_compute_performance-ego0_min: 0.0018150441156239631
sim_compute_sim_state_max: 0.011337207417801755
sim_compute_sim_state_mean: 0.008555607384785724
sim_compute_sim_state_median: 0.008499238016603392
sim_compute_sim_state_min: 0.005886746088134359
sim_render-ego0_max: 0.0036082309449741385
sim_render-ego0_mean: 0.00356732588028729
sim_render-ego0_median: 0.0035667186176449334
sim_render-ego0_min: 0.003527635340885159
simulation-passed: 1
step_physics_max: 0.25067231875474405
step_physics_mean: 0.20432198514152228
step_physics_median: 0.20656795247607584
step_physics_min: 0.15347971685919337
survival_time_max: 59.99999999999873
survival_time_mean: 59.99999999999873
survival_time_min: 59.99999999999873
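
The aggregates above are consistent with the per-episode values. Below is a minimal sketch of how such min/median/mean/max summaries can be recomputed from the per-episodes JSON, assuming it has been saved locally as per_episodes.json (a hypothetical file name); this is illustrative only and not the scorer used by the evaluator.

import json
import statistics

# Hypothetical local copy of the "per-episodes details" JSON shown above.
with open("per_episodes.json") as f:
    per_episodes = json.load(f)

# Every metric name that appears in at least one episode.
metrics = sorted({key for episode in per_episodes.values() for key in episode})

for metric in metrics:
    values = [episode[metric] for episode in per_episodes.values() if metric in episode]
    print(
        f"{metric}_min: {min(values)}  "
        f"{metric}_median: {statistics.median(values)}  "
        f"{metric}_mean: {statistics.mean(values)}  "
        f"{metric}_max: {max(values)}"
    )

For the four episodes of job 58585 this should reproduce, for example, survival_time_median = 59.99999999999873 and driven_lanedir_median = 0.0.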
Job 52526 | LFv-sim | error | up to date: no | duration: 0:12:06
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140701683424368
- M:video_aido:cmdline(in:/;out:/) 140701683423360
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
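
The root cause of this job is PIL failing to identify banner1.png when the experiment manager assembles the episode video. A minimal sketch of a pre-flight check for such an asset, assuming only that Pillow is available; the helper name is made up and this is not part of the evaluator code:

from PIL import Image, UnidentifiedImageError

def is_readable_image(path: str) -> bool:
    """Return True if Pillow can identify and decode the image at `path`."""
    try:
        with Image.open(path) as im:
            im.verify()  # structural integrity check only
        # verify() leaves the file object unusable, so reopen to make sure
        # the pixel data can actually be decoded.
        with Image.open(path) as im:
            im.load()
        return True
    except (UnidentifiedImageError, OSError):
        return False

if not is_readable_image("banner1.png"):
    print("banner1.png is missing, truncated, or not an image; regenerate it before rendering the video.")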
Job 52514 | LFv-sim | error | up to date: no | duration: 0:10:02
InvalidEvaluator:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
    im = Image.open(filename)
  File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
    raise UnidentifiedImageError(
PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
    result = block.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
    image = imread(self.config.file)
  File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
    raise ValueError(msg) from e
ValueError: Could not open filename "banner1.png".

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 329, in main
    make_video2(
  File "/usr/local/lib/python3.8/site-packages/aido_analyze/utils_video.py", line 149, in make_video2
    pg("video_aido", params)
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 280, in pg
    raise e
  File "/usr/local/lib/python3.8/site-packages/procgraph/scripts/pgmain.py", line 277, in pg
    model.update()
  File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 321, in update
    raise BadMethodCall("update", block, traceback.format_exc())
procgraph.core.exceptions.BadMethodCall: User-thrown exception while calling update() in block 'static_image'.
- B:StaticImage:static_image(in:/;out:rgb) 140377954452288
- M:video_aido:cmdline(in:/;out:/) 140377954453296
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 51, in imread
>     im = Image.open(filename)
>   File "/usr/local/lib/python3.8/site-packages/PIL/Image.py", line 2943, in open
>     raise UnidentifiedImageError(
> PIL.UnidentifiedImageError: cannot identify image file 'banner1.png'
> 
> The above exception was the direct cause of the following exception:
> 
> Traceback (most recent call last):
>   File "/usr/local/lib/python3.8/site-packages/procgraph/core/model.py", line 316, in update
>     result = block.update()
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 31, in update
>     image = imread(self.config.file)
>   File "/usr/local/lib/python3.8/site-packages/procgraph_pil/imread_imp.py", line 54, in imread
>     raise ValueError(msg) from e
> ValueError: Could not open filename "banner1.png".

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/duckietown_challenges/cie_concrete.py", line 681, in scoring_context
    yield cie
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 68, in go
    wrap(cie)
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/experiment_manager.py", line 34, in wrap
    asyncio.run(main(cie, logdir, attempts), debug=True)
  File "/usr/local/lib/python3.8/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/usr/local/lib/python3.8/site-packages/duckietown_experiment_manager/code.py", line 383, in main
    raise dc.InvalidEvaluator(msg) from e
duckietown_challenges.exceptions.InvalidEvaluator: Anomalous error while running episodes:
Job 41823 | LFv-sim | success | up to date: no | duration: 0:09:17
Job 38395 | LFv-sim | success | up to date: no | duration: 0:14:56
Job 36464 | LFv-sim | success | up to date: no | duration: 0:09:23
Job 35878 | LFv-sim | success | up to date: no | duration: 0:01:01
Job 35465 | LFv-sim | error | up to date: no | duration: 0:22:13
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.

List of all files:

- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465/docker-compose.original.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465/docker-compose.yaml
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465/logs/challenges-runner/stdout.log
- /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465/logs/challenges-runner/stderr.log
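
As the message says, a missing challenge_results.yaml usually means the evaluator crashed before writing its results, and the runner logs are the place to look. A small sketch of that check, using the working directory reported above (illustrative only, not an official duckietown-challenges tool):

import os

# Working directory reported in the error message for job 35465.
wd = (
    "/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/"
    "submission6826/LFv-sim-reg05-b2dee9d94ee0-1-job35465"
)
results = os.path.join(wd, "challenge-results", "challenge_results.yaml")
stderr_log = os.path.join(wd, "logs", "challenges-runner", "stderr.log")

if not os.path.exists(results):
    print(f"{results} was never written.")
    if os.path.exists(stderr_log):
        # The tail of the runner's stderr usually shows the import error or
        # crash that stopped the evaluation.
        with open(stderr_log, errors="replace") as f:
            print("".join(f.readlines()[-40:]))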
Job 35153 | LFv-sim | success | up to date: no | duration: 0:24:06
Job 35151 | LFv-sim | success | up to date: no | duration: 0:23:54
Job 34426 | LFv-sim | success | up to date: no | duration: 0:26:37
Job 34425 | LFv-sim | success | up to date: no | duration: 0:26:38
Job 34256 | LFv-sim | aborted | up to date: no | duration: 0:25:11
The result file is not found in working dir /tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg02-796a7d9adcef-1-job34256:

File '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg02-796a7d9adcef-1-job34256/challenge-results/challenge_results.yaml' does not exist.

This usually means that the evaluator did not finish; sometimes it indicates an import error.
Check the evaluator log to see what happened.
Job 33921 | LFv-sim | host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg04-bf35e9d68df4-1-job33921'
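
This and the following host-error jobs (33916, 33892, 33881, 33879, 33872, 33863, 33862) are unrelated to the submission itself: the evaluation hosts had no free disk space left when they tried to create the job directories. A sketch of the kind of free-space guard a runner host could apply before accepting work; the threshold is arbitrary and this is not part of duckietown-challenges-runner:

import shutil

# Partition that holds the evaluator working directories on these hosts.
EXECUTIONS_ROOT = "/tmp/duckietown/DT18/evaluator/executions"
MIN_FREE_GIB = 5  # assumed safety margin for a single LFv-sim job

usage = shutil.disk_usage(EXECUTIONS_ROOT)
free_gib = usage.free / 1024**3
if free_gib < MIN_FREE_GIB:
    raise RuntimeError(
        f"Only {free_gib:.1f} GiB free under {EXECUTIONS_ROOT}; "
        "clean up old executions or Docker images before taking new jobs."
    )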
Job 33916 | LFv-sim | host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg04-bf35e9d68df4-1-job33916'
Job 33892 | LFv-sim | host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg01-53440c9394b5-1-job33892'
Job 33881 | LFv-sim | host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg07-c4e193407567-1-job33881'
Job 33879 | LFv-sim | host-error | up to date: no | duration: 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg01-53440c9394b5-1-job33879'
Job 33872 | LFv-sim | host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg11-951de1eeccca-1-job33872'
Job 33863 | LFv-sim | host-error | up to date: no | duration: 0:00:01
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg03-c2bc3037870e-1-job33863'
Job 33862 | LFv-sim | host-error | up to date: no | duration: 0:00:00
Uncaught exception:
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/duckietown_challenges_runner_daffy-6.0.29-py3.8.egg/duckietown_challenges_runner/runner.py", line 628, in get_cr
    os.makedirs(wd)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
OSError: [Errno 28] No space left on device: '/tmp/duckietown/DT18/evaluator/executions/aido5-LF-sim-validation/submission6826/LFv-sim-reg05-5ca0d35e6d82-1-job33862'
Job 33400 | LFv-sim | success | up to date: no | duration: 0:24:03