The PALM challenge focuses on the investigation and development of algorithms associated with the diagnosis of Pathological Myopia (PM) and segmentation of lesions in fundus photos from PM patients. The goal of the challenge is to evaluate and compare automated algorithms for the detection of pathological myopia on a common dataset of retinal fundus images. We invite the medical image analysis community to participate by developing and testing existing and novel automated fundus classification and segmentation methods.
Thank you for your interest and patience. The PALM challenge, part of the iChallenge series, opens to the public on Dec 28, 2018. Training data for the 1st classification task will be released on a new platform in early Jan 2019 for better management, including: 1) automatic performance evaluation; 2) better communication between organizers and participants; 3) higher download speeds for participants around the world; 4) other benefits.
Please do not worry if your participation request has not yet been approved on this platform; we will batch-process requests before the data release.
The related RULES are copied below for your reference.
- Anonymous registration is NOT allowed. All information entered when registering a team, including the (TRUE) name of the contact person, the affiliation (department, full name of university/institute/company, and country), and the e-mail address, must be COMPLETE and CORRECT. Abbreviations are not allowed.
- Incomplete registrations will be removed without notice.
- Redundant registrations will be removed without notice.
How to Participate?
Please contact the organizers if you have any questions.
Key Notes for your Participation @ Venice
- Any team that meets all of the following requirements before the deadline (Mar 10) is accepted to attend the onsite challenge @ Venice on April 8:
a) register your team via the CMT paper submission system (https://cmt3.research.microsoft.com/PALM2019/); a team may have only one member (if you have already registered via email, you still need to submit your technical report through the CMT system with your team information);
b) obtain a non-zero score on the leaderboards (opened on Feb 28);
c) submit a self-contained and reasonable technical report via the CMT system (opened on Feb 21);
d) confirm your onsite attendance in the CMT system, either together with the (pre-)camera-ready submission or as early as your team registration;
e) VISA and INVITATION LETTER: please pay the registration fee to ISBI for the challenge day of Apr 8, then send an invitation letter request to the ISBI registration service agent if you need one for your VISA application.
- The final score = 0.3 * offsite_score + 0.7 * onsite_score;
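The weighting above can be expressed as a one-line helper. This is only an illustrative sketch: the function name and the example score values are assumptions, since the rules do not state what scale the off-site and onsite scores use.

```python
def final_score(offsite_score: float, onsite_score: float) -> float:
    """Weighted combination per the challenge rules: 30% off-site, 70% onsite."""
    return 0.3 * offsite_score + 0.7 * onsite_score

# Example with hypothetical scores on a 0-1 scale:
print(round(final_score(0.80, 0.90), 4))  # -> 0.87
```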
- We apologize for the delays in sharing the data; please understand that we are doing our best to guarantee data quality.
- During the pre-onsite challenge stage, result submission, automatic evaluation, and the leaderboards are hosted on the Baidu BROAD platform (under the challenge named iChallenge-PM); the final on-site stage evaluation entrance, together with the test images, will be distributed through the CMT system.
A total of ~4000 USD in awards will be provided by Baidu for the onsite PALM challenge.
Please do log in to the CMT system and register your team before the Mar 15 deadline to make your results effective. Thank you for your support, and good luck.
Important Dates (PST time)
| Date | Event |
| --- | --- |
| Oct 01 2018 | Individual registration opens. |
| Dec 28 2018 | PALM website opens. (In early Jan 2019 we moved to another platform for better organization.) |
| Jan 01 2019 | Training images for the first classification task are released. |
| Jan 10 2019 | |
| Feb 15 2019 | The off-site validation set is released. |
| Feb 16 2019 | Annotations (for the other 3 tasks) of the training set are released. |
| Feb 21 2019 | Dropbox links for the training data annotations are released. |
| Feb 21 2019 | Team registration and technical report submission open on the CMT system. |
| Feb 25 2019 | Submission of results on the off-site validation set opens on the BROAD platform. A team must be registered through the CMT system and approved before its submitted results can receive an effective score on the leaderboards. |
| Feb 28 2019 | Submission and evaluation open (Entrance Link). |
| | Off-site validation set results submission deadline, extended to April 1 due to the evaluation delay and server issues. Because the validation results are not used to select onsite participants, all teams can take part in the onsite challenge; the validation results contribute only 30% to the final scores. However, please do register your team by Mar 15 and submit an (even incomplete) technical report by Mar 18 in the CMT system to be eligible for an award. |
| Mar 18 2019 | Paper (technical report) SUBMISSION deadline (a draft manuscript, which can be updated by Apr 01). This allows invitation letters to be issued to those who need one for travel approval. |
| Mar 19 2019 | Team registration and onsite participation confirmation through the CMT system close. |
| Apr 04 2019 | Final camera-ready technical report SUBMISSION deadline. |
| Apr 08 2019 (AM) | On-site PALM challenge, in conjunction with ISBI 2019. |