Flexible model composition in machine learning and its implementation in MLJ (2012.15505v1)
Published 31 Dec 2020 in cs.LG
Abstract: A graph-based protocol called "learning networks", which combines assorted machine learning models into meta-models, is described. Learning networks are shown to overcome several limitations of model composition as implemented in the dominant machine learning platforms. After illustrating the protocol in simple examples, a concise syntax for specifying a learning network, implemented in the MLJ framework, is presented. Using the syntax, it is shown that learning networks are sufficiently flexible to include Wolpert's model stacking, with out-of-sample predictions for the base learners.
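As a rough illustration of the kind of composition the abstract refers to, the sketch below builds a small two-node learning network (standardization followed by ridge regression) using MLJ's exported learning-network primitives. The synthetic dataset, the choice of RidgeRegressor from MLJLinearModels, and the assumption of a recent MLJ release are illustrative only and are not taken from the paper.

```julia
using MLJ

# Illustrative synthetic data (not from the paper)
X, y = make_regression(100, 3)

# Wrap the data in source nodes -- the entry points of the network graph
Xs = source(X)
ys = source(y)

# Node 1: standardize the features
stand = Standardizer()
mach1 = machine(stand, Xs)
W = transform(mach1, Xs)

# Node 2: a regressor trained on the standardized features
# (assumes MLJLinearModels is installed)
Ridge = @load RidgeRegressor pkg=MLJLinearModels verbosity=0
mach2 = machine(Ridge(), W, ys)
yhat = predict(mach2, W)

# Fitting the terminal node fits every machine upstream of it, in order
fit!(yhat)

yhat()                           # predictions on the training data
Xnew, _ = make_regression(5, 3)
yhat(Xnew)                       # predictions on new data
```

Calling `fit!` on the terminal node rather than on individual machines is what gives the graph its meta-model character: the whole network trains and predicts as a single composite model, which the paper's concise syntax then lets you export as a stand-alone model type.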