We propose InfoBoost, a novel boosting algorithm. Although AdaBoost has been widely used for feature selection and classifier learning, many of the features it selects are redundant. By incorporating mutual information into AdaBoost, InfoBoost explicitly examines the redundancy between each candidate classifier and the classifiers already selected, so the classifiers it chooses are both accurate and non-redundant. Experimental results show that the strong classifier learned by InfoBoost achieves lower training error than that of AdaBoost. We also apply InfoBoost to selecting discriminative Gabor features for face recognition. Even with a simple correlation distance measure and a 1-NN classifier, the selected Gabor features achieve high recognition accuracy on the FERET database, in which both expression and illumination variations are present. With only 140 features, InfoBoost-selected features reach 95.5% accuracy, about 2.5% higher than that achieved by AdaBoost.
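The core idea of balancing accuracy against redundancy can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual formulation: the penalty weight `lam`, the use of the maximum mutual information over selected classifiers, and the toy weak classifiers are all assumptions introduced here for clarity. Each boosting round scores a candidate by its weighted error plus a mutual-information penalty against the classifiers already chosen.

```python
import numpy as np

def mutual_info(a, b):
    """Empirical mutual information (in nats) between two binary output vectors."""
    mi = 0.0
    for u in (0, 1):
        for v in (0, 1):
            p_uv = np.mean((a == u) & (b == v))
            p_u, p_v = np.mean(a == u), np.mean(b == v)
            if p_uv > 0:
                mi += p_uv * np.log(p_uv / (p_u * p_v))
    return mi

def select_round(preds, y, w, selected, lam=1.0):
    """One boosting round (hypothetical criterion): pick the candidate that
    minimizes weighted error plus lam times its maximum mutual information
    with the already-selected classifiers."""
    best, best_score = None, np.inf
    for j in range(preds.shape[0]):
        if j in selected:
            continue
        err = np.sum(w * (preds[j] != y))
        redund = max((mutual_info(preds[j], preds[k]) for k in selected),
                     default=0.0)
        score = err + lam * redund
        if score < best_score:
            best, best_score = j, score
    return best

# Toy example: h0 and h1 are identical (fully redundant); h2 errs more often
# but is statistically independent of h0 on this sample.
y = np.array([1, 1, 1, 0, 0, 0])
preds = np.array([
    [1, 1, 0, 0, 0, 0],   # h0: one error
    [1, 1, 0, 0, 0, 0],   # h1: duplicate of h0
    [1, 0, 1, 0, 1, 0],   # h2: two errors, non-redundant with h0
])
w = np.full(6, 1 / 6)
first = select_round(preds, y, w, selected=set())     # picks h0 (index 0)
second = select_round(preds, y, w, selected={first})  # picks h2, not the redundant h1
```

Plain AdaBoost would pick h1 in the second round because of its lower error; the redundancy penalty steers the selection to the less accurate but complementary h2, which is the behavior the abstract attributes to InfoBoost.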