A New Modeling for Knowledge Transfer in Machine Learning
Author: Fan Liu
Made to order, delivery in 2-4 weeks
45.36 €
regular price: 50.40 €
About the book
Multi-Task Learning (MTL), as opposed to Single Task Learning (STL), has become a hot topic in machine learning research. MTL has shown a significant advantage over STL because of its ability to facilitate knowledge sharing between tasks. This thesis presents my recent studies on Knowledge Transfer (KT) - the process of transferring knowledge from one task to another, which is at the core of MTL. The newly proposed KT algorithm for correlated MTL adopts learner independence, thus enabling any ordinary classifier to be used for MTL. The proposed MEB-based KT rests on the premise that, in the feature space, two correlated tasks share some common input data lying in the overlapping regions between their feature spaces. The main idea is to find the correlating knowledge - the overlapping regions of the two tasks - and transfer the related data regardless of the learner employed. KT is performed by building a correlation space via MEBs and transferring the enclosed instances from the primary task to the secondary task. The extent of KT depends on the number of overlapping instances between the two tasks. This book is required reading for postgraduates and researchers in MTL.
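To make the idea concrete, here is a minimal Python sketch of MEB-style, learner-independent instance transfer, assuming MEB refers to a Minimum Enclosing Ball. The book does not specify its construction, so the ball here is a hypothetical stand-in (centroid plus farthest-point radius) rather than an exact MEB solver, and the function names `approximate_meb` and `transfer_instances` are illustrative only.

```python
import numpy as np

def approximate_meb(X):
    """Approximate an enclosing ball by the centroid and the distance to the
    farthest point (an upper bound on the true MEB radius)."""
    center = X.mean(axis=0)
    radius = np.linalg.norm(X - center, axis=1).max()
    return center, radius

def transfer_instances(X_primary, y_primary, X_secondary):
    """Select primary-task instances that fall inside the ball enclosing the
    secondary task's data, i.e. the assumed overlap of the two feature spaces."""
    center, radius = approximate_meb(X_secondary)
    inside = np.linalg.norm(X_primary - center, axis=1) <= radius
    return X_primary[inside], y_primary[inside]

# Usage: augment the secondary task's training set with the transferred
# instances; any ordinary classifier can then be trained on the result,
# which is the learner-independence property described above.
rng = np.random.default_rng(0)
X_p, y_p = rng.normal(0.0, 1.0, (200, 5)), rng.integers(0, 2, 200)
X_s = rng.normal(0.5, 1.0, (50, 5))
X_t, y_t = transfer_instances(X_p, y_p, X_s)
print(f"Transferred {len(X_t)} of {len(X_p)} primary instances")
```

The amount transferred grows with the overlap between the two tasks' feature-space regions, mirroring the claim that the extent of KT depends on the number of overlapping instances.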
- Publisher: LAP LAMBERT Academic Publishing
- Year of publication: 2011
- Format: Paperback
- Dimensions: 220 x 150 mm
- Language: English
- ISBN: 9783844397321