CLEF e-Health 2013

Here is the link to the proceedings, where you can find the task overviews as well as the individual papers, all freely available.

Task #1 : Annotation of disorder mentions in clinical reports, by (1a) identifying spans of text as disorder mentions, and (1b) [optional] mapping each span to a UMLS CUI (restricted to the SNOMED-CT subset of UMLS).
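To make the two subtasks concrete, here is a minimal sketch of what a single annotation might look like. The class, field names, and the example report are my own assumptions for illustration, not the task's official format (the shared task distributed annotations in its own standoff format):

```python
# Hypothetical representation of one Task 1 annotation:
# a disorder span (Task 1a) optionally normalized to a UMLS CUI (Task 1b).
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisorderMention:
    start: int               # character offset where the span begins (Task 1a)
    end: int                 # character offset where the span ends (exclusive)
    text: str                # the surface text of the disorder mention
    cui: Optional[str] = None  # UMLS CUI, or None for "CUI-less" (Task 1b)

# Toy example sentence; "atrial fibrillation" occupies offsets 22..41.
report = "Patient presents with atrial fibrillation."
mention = DisorderMention(22, 41, report[22:41], cui="C0004238")
print(mention.text)  # → atrial fibrillation
```

Task 1a is scored on how well the predicted (start, end) spans match the gold spans; Task 1b on whether the predicted CUI for each span is correct.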

Dataset : Clinical notes from different encounter types, including radiology reports, discharge summaries, and ECG/ECHO reports, totaling about 181K words annotated by the organizers. 200 documents were provided for training (5,811 annotated disorder entities, mapped to 1,007 unique CUIs or marked CUI-less), and 100 documents were held out for testing (5,340 disorder entities, with 795 unique CUIs or CUI-less).

Competition Results : The best system in Task 1a achieved an F1 score of 0.75 (0.80 precision, 0.71 recall), and the best Task 1b system reached an accuracy of 0.59. The top three teams in Task 1a were UTHealthCCB.A, NCBI, and CLEAR.
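As a sanity check on the reported numbers, F1 is the harmonic mean of precision and recall, and the figures above are consistent with each other:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# The reported 0.80 precision and 0.71 recall indeed give F1 ≈ 0.75.
print(round(f1(0.80, 0.71), 2))  # → 0.75
```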
