Please use this identifier to cite or link to this item: http://repositorio.inesctec.pt/handle/123456789/7125
Full metadata record
DC Field | Value | Language
dc.contributor.author | Pinto, F | en
dc.contributor.author | Vítor Manuel Cerqueira | en
dc.contributor.author | Carlos Manuel Soares | en
dc.contributor.author | João Mendes Moreira | en
dc.date.accessioned | 2018-01-19T17:17:22Z | -
dc.date.available | 2018-01-19T17:17:22Z | -
dc.date.issued | 2017 | en
dc.identifier.uri | http://repositorio.inesctec.pt/handle/123456789/7125 | -
dc.description.abstract | Machine Learning (ML) has been successfully applied to a wide range of domains and applications. One of the techniques behind most of these successful applications is Ensemble Learning (EL), the field of ML that gave birth to methods such as Random Forests or Boosting. The complexity of applying these techniques, together with the market scarcity of ML experts, has created the need for systems that enable a fast and easy drop-in replacement for ML libraries. Automated machine learning (autoML) is the field of ML that attempts to answer these needs. We propose autoBagging, an autoML system that automatically ranks 63 bagging workflows by exploiting past performance and metalearning. Results on 140 classification datasets from the OpenML platform show that autoBagging can yield better performance than the Average Rank method and achieve results that are not statistically different from an ideal model that systematically selects the best workflow for each dataset. For the purpose of reproducibility and generalizability, autoBagging is publicly available as an R package on CRAN. | en
dc.language | eng | en
dc.relation | 6211 | en
dc.relation | 5450 | en
dc.relation | 5001 | en
dc.rights | info:eu-repo/semantics/openAccess | en
dc.title | autoBagging: Learning to Rank Bagging Workflows with Metalearning | en
dc.type | conferenceObject | en
dc.type | Publication | en
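The abstract describes autoBagging ranking candidate bagging workflows, each a particular configuration of a bootstrap-aggregated ensemble. As a rough illustration only, the sketch below builds one such workflow in Python using scikit-learn's `BaggingClassifier` as a stand-in (autoBagging itself is an R package, and its actual workflow space and API are not shown here); the dataset and all parameter choices are assumptions for the example.

```python
# Illustrative sketch of a single bagging workflow of the kind autoBagging
# ranks. scikit-learn stands in for the R implementation; the parameter
# choices (50 estimators, 70/30 split) are hypothetical, not the paper's.
from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# One point in the workflow space: 50 decision trees (the default base
# estimator) trained on bootstrap samples of the training data.
workflow = BaggingClassifier(n_estimators=50, bootstrap=True, random_state=0)
workflow.fit(X_train, y_train)

accuracy = workflow.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

A system like autoBagging would score many such configurations across datasets and use metalearned knowledge to rank them for a new dataset, rather than evaluating each one from scratch.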
Appears in Collections:CESE - Articles in International Conferences

Files in This Item:
File | Description | Size | Format
P-00M-YFM.pdf | | 370.09 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.