CPC2 Submission

Registration

Teams are required to register to help us organise the challenge. Registered teams will be assigned a unique team ID.

What evaluation data is provided?

The evaluation data consists of audio signals processed by hearing aid systems, clean reference signals, listener metadata, and a mapping of which listeners listened to which scenes/hearing aid systems.

The evaluation data will be made available when the submission period opens. See the download page for more details.

There will be three evaluation sets (eval1, eval2 and eval3), corresponding to the three training data partitions, i.e., predictions for the eval1 set should be made with systems trained on the train1 partition; eval2 with train2; and eval3 with train3.
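To make the pairing concrete, here is a minimal sketch of the intended train/evaluate loop. The train_model and predict helpers are hypothetical placeholders, not part of any official challenge toolkit:

import pathlib

def train_model(partition: str):
    ...  # train an intelligibility prediction model on this training partition

def predict(model, eval_set: str):
    ...  # produce predictions for the matching evaluation set

# Each evaluation set is paired with the same-numbered training partition.
for n in (1, 2, 3):
    model = train_model(f"train{n}")  # e.g. train1
    predict(model, f"eval{n}")        # e.g. eval1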

Note that the evaluation data does not contain the listener responses. We will score your submission for you and return your score (we aim to do this within 24 hours of submission). We will then release the true listener responses the day after the submission deadline to allow teams to analyse their results.

What do I need to submit?

All teams must submit:

  • Their predicted intelligibility scores
  • A two page technical report

The predicted intelligibility scores

Scores for each evaluation set should be stored in a separate CSV file named as follows: CPC2_<TEAM_ID>.<SET>.csv, where <TEAM_ID> is your unique team ID, e.g. 'E001', and <SET> is the evaluation set number, either 1, 2, or 3.

The CSV files should have two columns:

signal_ID, intelligibility_score

where signal_ID is the unique signal identifier used for the wav file name (e.g., S08510_L0239_E001) and intelligibility_score is the predicted intelligibility, given as the percentage of words recognised correctly for the signal (i.e., from 0 to 100).
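The following is a minimal sketch of writing one submission file in this format. The predictions dictionary, its example values, and the hard-coded team ID and set number are illustrative assumptions, not real data:

import csv

TEAM_ID = "E001"  # replace with your assigned team ID
SET = 1           # evaluation set number: 1, 2, or 3
predictions = {"S08510_L0239_E001": 73.5}  # hypothetical signal_ID -> score

with open(f"CPC2_{TEAM_ID}.{SET}.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["signal_ID", "intelligibility_score"])
    for signal_id, score in predictions.items():
        # Scores are percentages of words recognised correctly, so clamp to 0-100.
        writer.writerow([signal_id, max(0.0, min(100.0, score))])

With the values above, this produces CPC2_E001.1.csv with a header row followed by one prediction per line.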

The three CSV files should be sent as email attachments to the email address: claritychallengecontact@gmail.com

Please use "CPC2 Submission <TEAM_ID>" as the subject line.

We also encourage you to make your prediction model code available via an open-source license, but this is not a pre-requisite for entry (see challenge rules).

Info: All registered teams will be emailed a reminder of their unique team ID shortly before the submission deadline. If you plan to submit, please register before the submission deadline.

The technical report

The two page technical report must be submitted in the format required for the Clarity-2023 Workshop. The author kit and link for submission can be found on the workshop website.

The report needs to be sufficiently complete for us to judge whether your system(s)/model(s) is compliant with the challenge rules. You can find a list of key challenge dates here.

Your report should include an abstract, an introduction, and sections covering the experimental setup/methodology (including system/model information and model/network architecture), evaluation/results, discussion, conclusions and references. Please provide an estimate of the computational resources needed. You must describe any external data and pre-existing tools, software and models used. Please make it clear how your system(s)/model(s) meet the challenge rules.

Note that you will not have your final evaluation set scores when you submit your report. We will score your submission for you and return your score (we aim to do this within 24 hours of submission). We will then release the ground-truth listener responses the day after the submission deadline (i.e., 1st August) to allow teams to perform further analysis of their results. This extra information can then be included in a revised version of your report, which will be published on the workshop website in time for the workshop itself on 19th August.

How will intellectual property be handled?

See here under Intellectual Property.