Collaborative Annotation for Reliable Natural Language Processing – Technical and Sociological Aspects
Author: Karën Fort
Language: English
Hardback, 6 June 2016
Price: 920.91 lei
Old price: 1151.15 lei
-20% New
Express Points: 1381
Estimated price in other currencies:
176.26€ • 181.65$ • 148.81£
Print-on-demand book
Economy delivery 04-18 March
Specifications
ISBN-13: 9781848219045
ISBN-10: 1848219040
Pages: 192
Dimensions: 167 x 244 x 17 mm
Weight: 0.45 kg
Publisher: ISTE Ltd.
Place of publication: Hoboken, United States
Target audience
Scientists, researchers and engineers interested in this subject area
Contents
Preface ix
List of Acronyms xi
Introduction xiii
Chapter 1. Annotating Collaboratively 1
1.1. The annotation process (re)visited 1
1.1.1. Building consensus 1
1.1.2. Existing methodologies 3
1.1.3. Preparatory work 7
1.1.4. Pre-campaign 13
1.1.5. Annotation 17
1.1.6. Finalization 21
1.2. Annotation complexity 24
1.2.1. Example overview 25
1.2.2. What to annotate? 28
1.2.3. How to annotate? 30
1.2.4. The weight of the context 36
1.2.5. Visualization 38
1.2.6. Elementary annotation tasks 40
1.3. Annotation tools 43
1.3.1. To be or not to be an annotation tool 43
1.3.2. Much more than prototypes 46
1.3.3. Addressing the new annotation challenges 49
1.3.4. The impossible dream tool 54
1.4. Evaluating the annotation quality 55
1.4.1. What is annotation quality? 55
1.4.2. Understanding the basics 56
1.4.3. Beyond kappas 63
1.4.4. Giving meaning to the metrics 67
1.5. Conclusion 75
Chapter 2. Crowdsourcing Annotation 77
2.1. What is crowdsourcing and why should we be interested in it? 77
2.1.1. A moving target 77
2.1.2. A massive success 80
2.2. Deconstructing the myths 81
2.2.1. Crowdsourcing is a recent phenomenon 81
2.2.2. Crowdsourcing involves a crowd (of non-experts) 83
2.2.3. Crowdsourcing involves (a crowd of) non-experts 87
2.3. Playing with a purpose 93
2.3.1. Using the players' innate capabilities and world knowledge 94
2.3.2. Using the players' school knowledge 96
2.3.3. Using the players' learning capacities 97
2.4. Acknowledging crowdsourcing specifics 101
2.4.1. Motivating the participants 101
2.4.2. Producing quality data 107
2.5. Ethical issues 109
2.5.1. Game ethics 109
2.5.2. What's wrong with Amazon Mechanical Turk? 111
2.5.3. A charter to rule them all 113
Conclusion 115
Appendix 117
Glossary 141
Bibliography 143
Index 163