Neural Data Server: A Large-Scale Search Engine for Transfer Learning Data

University of Toronto
Vector Institute

Transfer learning has proven to be a successful technique for training deep learning models in domains where little training data is available. The dominant approach is to pretrain a model on a large generic dataset such as ImageNet and finetune its weights on the target domain. However, in the new era of an ever-increasing number of massive datasets, selecting the relevant data for pretraining is a critical issue. We introduce Neural Data Server (NDS), a large-scale search engine for finding the transfer learning data most useful to the target domain. NDS consists of a dataserver that indexes several large, popular image datasets and aims to recommend data to a client: an end-user with a target application and its own small labeled dataset. As in any search engine that serves information to possibly numerous users, we want the online computation performed by the dataserver to be minimal. The dataserver therefore represents its large datasets with a much more compact mixture-of-experts model and employs it to perform the data search through a series of dataserver-client transactions at low computational cost. We show the effectiveness of NDS in a variety of transfer learning scenarios, demonstrating state-of-the-art performance on several target datasets and on tasks such as image classification, object detection, and instance segmentation. We obtain significant improvements over ImageNet pretraining by downloading only 26 GB of the server's data, in cases where training on the entire dataserver (538 GB) would take weeks. Our Neural Data Server is implemented as a web service, and we invite you to try it.
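To make the dataserver-client exchange concrete, the following is a minimal sketch, not the paper's actual implementation: we assume the server holds K "experts" (one per data partition), the client evaluates each expert on its own small dataset and returns only K scores, and the server turns those scores into sampling weights to recommend partitions. All function names and the toy accuracy values are hypothetical.

```python
# Hypothetical sketch of an NDS-style transaction. The dataserver summarizes
# each partition of its indexed data with one compact "expert"; the client
# never uploads its data, only per-expert performance scores.

def client_score_experts(experts, client_data):
    """Client side: evaluate each server expert on the client's small dataset.
    Here each 'expert' is a stand-in callable returning a proxy accuracy."""
    return [expert(client_data) for expert in experts]

def server_recommend(scores, budget):
    """Server side: normalize scores into per-partition weights and return
    the indices of the top-scoring partitions within the download budget."""
    total = sum(scores)
    weights = [s / total for s in scores]
    ranked = sorted(range(len(weights)), key=weights.__getitem__, reverse=True)
    return ranked[:budget], weights

# Toy example: 4 experts whose simulated accuracies stand in for real
# proxy-task evaluations on the client's labeled data.
experts = [lambda data, acc=acc: acc for acc in (0.42, 0.81, 0.10, 0.55)]
scores = client_score_experts(experts, client_data=None)
top, weights = server_recommend(scores, budget=2)
print(top)  # -> [1, 3]: partitions 1 and 3 are recommended for download
```

The key design point the sketch mirrors is that the online cost per client is tiny: the transaction exchanges only K scalar scores rather than touching the full 538 GB of indexed data.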


  • [January 2020] Paper released on arXiv


Xi Yan* , David Acuna* , Sanja Fidler
(* denotes equal contribution)



