Abstract
Conformal predictors are machine learning algorithms that output prediction sets with a finite-sample guarantee of marginal validity under minimal distributional assumptions. This property makes conformal predictors useful for machine learning tasks that require reliable predictions. It would also be desirable to achieve conditional validity in the same setting, in the sense that the validity of the prediction intervals holds even when conditioning on any particular property of the object being predicted. Unfortunately, it has been shown that such conditional validity is impossible to guarantee for non-trivial prediction problems with finite samples. In this article, instead of trying to achieve a strong conditional validity guarantee, an approximation to conditional validity is considered and measured empirically. A new algorithm is introduced for this purpose; it iteratively adjusts a conformity measure to deviations from object-conditional validity measured in the training data. Experimental results are provided for three data sets, demonstrating (1) that in real-world machine learning tasks, lack of conditional validity is a measurable problem, and (2) that the proposed algorithm is effective at alleviating it.
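For context, the sketch below illustrates the marginal validity guarantee the abstract refers to, using standard split conformal regression. It is not the algorithm proposed in the paper; the synthetic data, the linear model, and the absolute-residual conformity score are illustrative assumptions. The heteroscedastic noise shows why marginal validity alone can be unsatisfying: a constant-width interval over-covers easy objects and under-covers hard ones, which is the gap the paper's object-conditional approximation targets.

```python
# Minimal sketch of split conformal regression (not the paper's method).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic data: y = x + noise whose scale grows with x, so marginal
# validity holds but object-conditional validity does not.
n = 2000
X = rng.uniform(0, 10, size=(n, 1))
y = X[:, 0] + rng.normal(0, 0.1 + 0.3 * X[:, 0])

# Split into a proper training set and a calibration set.
X_train, X_cal = X[:1000], X[1000:]
y_train, y_cal = y[:1000], y[1000:]

model = LinearRegression().fit(X_train, y_train)

# Conformity scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - model.predict(X_cal))

# Finite-sample quantile giving marginal coverage >= 1 - alpha.
alpha = 0.1
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
q = np.sort(scores)[k - 1]

# Prediction interval for a new object x: [f(x) - q, f(x) + q].
# The width q is the same for every x, regardless of the local noise level.
x_new = np.array([[5.0]])
pred = model.predict(x_new)[0]
print(f"90% prediction interval at x=5: [{pred - q:.2f}, {pred + q:.2f}]")
```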
| Original language | English |
|---|---|
| Pages (from-to) | 4-23 |
| Number of pages | 20 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 152 |
| Publication status | Published - 2021 |
| Event | 10th Symposium on Conformal and Probabilistic Prediction and Applications (COPA 2021), Virtual/Online, 8-10 Sept 2021 |
Keywords
- Conformal prediction
- approximation
- conditional validity
ASJC Scopus subject areas
- Artificial Intelligence
- Software
- Control and Systems Engineering
- Statistics and Probability