PyCP: An Open-Source Conformal Predictions Toolkit
Abstract
The Conformal Predictions framework is a new game-theoretic approach to reliable machine learning, which provides a methodology to obtain error calibration in both classification and regression settings. The framework combines principles of transductive inference, algorithmic randomness, and hypothesis testing to provide guaranteed error calibration in the online setting (and calibration in offline settings, supported by empirical studies). As the framework is increasingly used in a variety of machine learning settings such as active learning, anomaly detection, feature selection, and change detection, there is a need for algorithmic implementations that researchers and practitioners can use and further improve. In this paper, we introduce PyCP, an open-source implementation of the Conformal Predictions framework that currently supports classification problems in the transductive and Mondrian settings. PyCP is modular, extensible, and intended for community sharing and development.
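To make the transductive setting mentioned above concrete, the sketch below shows a generic transductive conformal classifier: for each candidate label, the test point is provisionally added to the training set, nonconformity scores are computed for all examples, and the label is retained if its p-value exceeds the significance level. This is not PyCP's API; the nearest-neighbour nonconformity measure and all function names here are illustrative assumptions.

```python
# A minimal sketch of transductive conformal classification (not PyCP's API).
# The nonconformity measure and function names are illustrative assumptions.
import numpy as np


def nn_nonconformity(X, y, i):
    """Nonconformity of example i: distance to the nearest same-class
    neighbour divided by distance to the nearest other-class neighbour."""
    d = np.linalg.norm(X - X[i], axis=1)
    d[i] = np.inf  # exclude the example itself
    same = d[y == y[i]].min()
    other = d[y != y[i]].min()
    return same / other


def conformal_predict(X_train, y_train, x_test, epsilon=0.1):
    """Return the set of labels whose transductive p-value exceeds epsilon."""
    prediction_set = []
    for label in np.unique(y_train):
        # Augment the training set with the test point under the candidate label.
        X = np.vstack([X_train, x_test])
        y = np.append(y_train, label)
        scores = np.array([nn_nonconformity(X, y, i) for i in range(len(y))])
        # p-value: fraction of examples at least as nonconforming as the test point.
        p_value = np.mean(scores >= scores[-1])
        if p_value > epsilon:
            prediction_set.append(label)
    return prediction_set


# Toy usage: two well-separated Gaussian classes.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 2)) + np.repeat([[0, 0], [3, 3]], 20, axis=0)
y_train = np.repeat([0, 1], 20)
print(conformal_predict(X_train, y_train, np.array([3.1, 2.9]), epsilon=0.1))
```

Under the online exchangeability assumption, prediction sets built this way err with frequency at most epsilon, which is the error-calibration guarantee the abstract refers to.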
Domains
Computer Science [cs]