Abstract
Current approaches to feature selection over multiple data sources must join all the data before features can be evaluated against the class label; they are therefore not scalable and entail unnecessary information leakage. In this paper, we present a way of performing feature selection through class propagation, eliminating the need to join before feature selection. We propagate to each data source a very compact data structure that provides enough information for selecting features, allowing features to be evaluated locally without looking at any other information. Our experiments confirm that our algorithm is highly scalable while effectively preserving data privacy. Copyright © by SIAM.
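As a minimal sketch of the idea, the snippet below assumes the propagated structure is simply a table of class counts keyed by the join key (the paper's actual structure may differ); each source then scores its own features, here with a standard information-gain criterion, without ever seeing another source's attributes. The names `fact`, `local_source`, and `info_gain` are illustrative, not from the paper.

```python
from collections import defaultdict
from math import log2

def entropy(counts):
    """Shannon entropy of a {label: count} distribution."""
    total = sum(counts.values())
    return -sum(c / total * log2(c / total) for c in counts.values() if c > 0)

# Hypothetical propagated structure: class counts per join key, built once
# at the source that holds the class label and then sent to each data source.
fact = [(1, "+"), (2, "-"), (3, "+"), (4, "-"), (5, "+")]
class_counts = defaultdict(lambda: defaultdict(int))
for key, label in fact:
    class_counts[key][label] += 1

# A local data source: one feature's values keyed by the join key.
local_source = {1: "a", 2: "b", 3: "a", 4: "b", 5: "a"}

def info_gain(source, class_counts):
    """Score a local feature using only the propagated class counts."""
    # Global class distribution, recovered from the propagated counts.
    overall = defaultdict(int)
    for key in source:
        for label, c in class_counts[key].items():
            overall[label] += c
    total = sum(overall.values())
    # Partition the class counts by the local feature value.
    by_value = defaultdict(lambda: defaultdict(int))
    for key, value in source.items():
        for label, c in class_counts[key].items():
            by_value[value][label] += c
    cond = sum(sum(cnts.values()) / total * entropy(cnts)
               for cnts in by_value.values())
    return entropy(overall) - cond

print(round(info_gain(local_source, class_counts), 4))  # → 0.971
```

In this toy example the feature perfectly separates the classes, so its information gain equals the class entropy; no row-level join is ever materialized, which is what makes the evaluation local and privacy-preserving.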