Transferred correlation learning: An incremental scheme for neural network ensembles
Document Type
Conference Proceeding
Publication Date
1-1-2010
Abstract
Transfer learning is a learning paradigm in which, besides the training data for the target learning task, data related to that task (often drawn from a different distribution) are also employed to help train a better learner. Outdated data are one example of such related data. In this paper, we propose a new transfer learning framework for training neural network (NN) ensembles. The framework has two key features: 1) it uses the well-known negative correlation learning algorithm to train an ensemble of diverse neural networks on the related data, so as to fully exploit the knowledge in those data; and 2) a penalized incremental learning scheme adapts the neural networks obtained from negative correlation learning to the training data for the target task. The adaptation is guided by reference neural networks that measure the relatedness between the training data and the related data. Experiments on benchmark data sets show that our framework achieves classification accuracy competitive with existing ensemble transfer learning methods such as TrAdaBoost [1] and TrBagg [2]. We discuss characteristics of our framework observed in the experiments and the scenarios under which it may offer superior performance. © 2010 IEEE.
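The negative correlation learning step mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a toy NumPy version of standard negative correlation learning on synthetic regression data, where each member network's gradient combines its own error `(f_i - y)` with a penalty term `-lam * (f_i - f_bar)` that pushes members away from the ensemble mean to encourage diversity. All names (`nets`, `lam`, network sizes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data, standing in for the "related" data in the paper.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)

M, H, lam, lr = 5, 10, 0.5, 0.05  # ensemble size, hidden units, NCL penalty, step size

# One single-hidden-layer network per ensemble member.
nets = [{"W1": rng.normal(0, 0.5, (1, H)), "b1": np.zeros(H),
         "W2": rng.normal(0, 0.5, H),      "b2": 0.0} for _ in range(M)]

def forward(net, X):
    h = np.tanh(X @ net["W1"] + net["b1"])
    return h @ net["W2"] + net["b2"], h

def ensemble_out(X):
    return np.mean([forward(n, X)[0] for n in nets], axis=0)

mse0 = np.mean((ensemble_out(X) - y) ** 2)  # ensemble error before training

for _ in range(300):
    outs = [forward(n, X) for n in nets]
    f_bar = np.mean([o for o, _ in outs], axis=0)  # ensemble mean output
    for net, (f_i, h) in zip(nets, outs):
        # NCL output-layer gradient: own error minus the diversity penalty.
        delta = (f_i - y) - lam * (f_i - f_bar)
        net["W2"] -= lr * h.T @ delta / len(X)
        net["b2"] -= lr * delta.mean()
        dh = np.outer(delta, net["W2"]) * (1 - h ** 2)  # backprop through tanh
        net["W1"] -= lr * X.T @ dh / len(X)
        net["b1"] -= lr * dh.mean(axis=0)

mse1 = np.mean((ensemble_out(X) - y) ** 2)  # ensemble error after training
```

The paper's framework would then adapt each trained member to the target-task data under the guidance of reference networks, a step not shown here.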
Publication Source (Journal or Book title)
Proceedings of the International Joint Conference on Neural Networks
Recommended Citation
Jiang, L., Zhang, J., & Allen, G. (2010). Transferred correlation learning: An incremental scheme for neural network ensembles. Proceedings of the International Joint Conference on Neural Networks. https://doi.org/10.1109/IJCNN.2010.5596617