A Supervised Learning Framework in the Context of Multiple Annotators

dc.contributor.author: Gil González, Julián
dc.contributor.author: Álvarez Meza, Andrés Marino
dc.contributor.corporatename: Vicedecanatura de Investigación y Extensión - Facultad de Ingeniería y Arquitectura - Sede Manizales - Editorial Universidad Nacional de Colombia
dc.date.accessioned: 2023-09-11T13:36:20Z
dc.date.available: 2023-09-11T13:36:20Z
dc.date.issued: 2023
dc.description.abstract: The increasing popularity of crowdsourcing platforms, e.g., Amazon Mechanical Turk, is changing how datasets for supervised learning are built. In these cases, instead of having datasets labeled by a single source (assumed to be an expert who provides the absolute gold standard), we have datasets labeled by multiple annotators with different and unknown expertise. Hence, we face a multi-labeler scenario, which typical supervised learning models cannot tackle. For this reason, much attention has recently been given to approaches that capture the wisdom of multiple annotators. However, such methods rest on two key assumptions: that a labeler's performance does not depend on the input space, and that the annotators are independent of each other; both assumptions are hardly feasible in real-world settings. This book explores several models, based on both frequentist and Bayesian perspectives, aimed at facing multi-labeler scenarios. Our approaches model the annotators' behavior by considering the relationship between the input space and the labelers' performance, and by coding the interdependencies among them.
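The multi-labeler setting described in the abstract can be illustrated with a minimal, hypothetical sketch (plain NumPy, not the book's KAAR, LKAAR, RCDNN, or CCGPMA models): several annotators of different, unknown reliability label the same samples, and a naive majority vote is used as the aggregation baseline that multi-annotator methods aim to improve on. The sample sizes and reliabilities below are illustrative assumptions only.

import numpy as np

rng = np.random.default_rng(0)

n_samples, n_annotators = 200, 5
true_labels = rng.integers(0, 2, size=n_samples)        # hidden gold standard (unknown in practice)
reliability = np.array([0.95, 0.85, 0.70, 0.60, 0.55])  # assumed per-annotator accuracy, also unknown in practice

# Each annotator flips the true label with probability (1 - reliability),
# independently of the input, i.e., the simplifying assumptions the book questions.
flips = rng.random((n_samples, n_annotators)) > reliability
noisy_labels = np.where(flips, 1 - true_labels[:, None], true_labels[:, None])

# Naive aggregation baseline: per-sample majority vote, ignoring who labeled what.
majority = (noisy_labels.mean(axis=1) >= 0.5).astype(int)
print("Majority-vote accuracy:", (majority == true_labels).mean())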
dc.description.tableofcontents:
1 Preliminaries
1.1 Motivation
1.2 Problem Statement
1.3 Mathematical Preliminaries
1.3.1 Methods for Supervised Learning
1.3.2 Learning from Multiple Annotators
1.4 Literature Review on Supervised Learning from Multiple Annotators
1.5 Objectives
1.5.1 General Objective
1.5.2 Specific Objectives
1.6 Outline and Contributions
1.6.1 Kernel Alignment-Based Annotator Relevance Analysis (KAAR)
1.6.2 Localized Kernel Alignment-Based Annotator Relevance Analysis (LKAAR)
1.6.3 Regularized Chained Deep Neural Network for Multiple Annotators (RCDNN)
1.6.4 Chained Gaussian Processes for Multiple Annotators (CGPMA) and Correlated Chained Gaussian Processes for Multiple Annotators (CCGPMA)
1.6.5 Book Structure
2 Kernel Alignment-Based Annotator Relevance Analysis
2.1 Centered Kernel Alignment Fundamentals
2.2 Kernel Alignment-Based Annotator Relevance Analysis
2.2.1 KAAR for Classification and Regression
2.3 Experimental Set-Up
2.3.1 Classification
2.3.2 Regression
2.4 Results and Discussion
2.4.1 Classification
2.4.2 Regression
2.5 Summary
3 Localized Kernel Alignment-Based Annotator Relevance Analysis
3.1 Localized Kernel Alignment Fundamentals
3.2 Localized Kernel Alignment-Based Annotator Relevance Analysis
3.2.1 LKAAR for Classification and Regression
3.3 Experimental Set-Up
3.3.1 Classification
3.3.2 Regression
3.4 Results and Discussion
3.4.1 Classification
3.4.2 Regression
3.5 Summary
4 Regularized Chained Deep Neural Network for Multiple Annotators
4.1 Chained Deep Neural Network
4.2 Regularized Chained Deep Neural Network for Classification with Multiple Annotators
4.3 Experimental Set-Up
4.3.1 Tested Datasets
4.3.2 Provided and Simulated Annotations
4.3.3 Method Comparison and Quality Assessment
4.3.4 RCDNN Detailed Architecture and Training
4.4 Results and Discussion
4.5 Summary
5 Correlated Chained Gaussian Processes for Multiple Annotators
5.1 Chained Gaussian Processes
5.1.1 Correlated Chained Gaussian Processes
5.2 Correlated Chained GP for Multiple Annotators - CCGPMA
5.2.1 Classification
5.2.2 Regression
5.3 Experimental Set-Up
5.3.1 Classification
5.3.2 Regression
5.4 Results and Discussion
5.4.1 Classification
5.4.2 Regression
5.5 Summary
6 Final Remarks
6.1 Conclusions
6.2 Future Work
6.3 Repositories
Bibliography
Appendices
Appendix A CCGPMA Supplementary Material
A.1 Derivation of CCGPMA Lower Bounds
A.1.1 Gradients w.r.t. the Variational Parameters
A.2 Likelihood Functions
A.2.1 Multiclass Classification with Multiple Annotators
A.2.2 Gaussian Distribution for Regression with Multiple Annotators
Alphabetical Index
dc.format.mimetype: application/pdf
dc.identifier.eisbn: 9789585053694
dc.identifier.instname: Universidad Nacional de Colombia
dc.identifier.reponame: Repositorio Institucional Universidad Nacional de Colombia
dc.identifier.repourl: https://repositorio.unal.edu.co/
dc.identifier.uri: https://repositorio.unal.edu.co/handle/unal/84685
dc.language.iso: eng
dc.publisher.place: Bogotá, Colombia
dc.rights.accessrights: info:eu-repo/semantics/openAccess
dc.rights.license: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.ddc: 620 - Engineering and allied operations
dc.subject.proposal: Supervised learning
dc.subject.proposal: Artificial intelligence
dc.subject.proposal: Machine learning
dc.subject.proposal: Neural networks
dc.subject.proposal: Computers
dc.subject.proposal: Gaussian processes
dc.title: A Supervised Learning Framework in the Context of Multiple Annotators
dc.type: Book
dc.type.coar: http://purl.org/coar/resource_type/c_2f33
dc.type.coarversion: http://purl.org/coar/version/c_970fb48d4fbd8a85
dc.type.driver: info:eu-repo/semantics/book
dc.type.redcol: http://purl.org/redcol/resource_type/LIB
dc.type.version: info:eu-repo/semantics/publishedVersion
dcterms.audience.professionaldevelopment: Students
oaire.accessrights: http://purl.org/coar/access_right/c_abf2

Files

Original bundle

Name: 9789585053694.pdf
Size: 6.53 MB
Format: Adobe Portable Document Format

License bundle

Name: license.txt
Size: 5.74 KB
Format: Item-specific license agreed to upon submission

Collections