In “How to Use Expert Advice”, Cesa-Bianchi, Freund, Haussler, Helmbold, Schapire, and Warmuth (1997) describe boosting-type algorithms for combining the predictions of multiple classifiers. They prove a bound of order
$$ O\!\left(\sqrt{n \log M}\right) $$
on the total regret, where $n$ is the number of data points and $M$ is the number of classifiers. They relate their bounds to the $L_1$ distance between binary strings and apply their results to pattern-recognition theory and PAC learning.