
Workshop: Theory and Practice of Differential Privacy

Private Boosted Decision Trees via Smooth Re-Weighting: Simplicity is Useful

Marco Carmosino · Vahid Reza Asadi · Mohammad Mahdi Jahanara · Akbar Rafiey · Bahar Salamatian


Protecting the privacy of people whose data is used by machine learning algorithms is important. Differential privacy is the appropriate mathematical framework for formal privacy guarantees, and boosted decision trees are a popular machine learning technique. We therefore propose and evaluate a practical algorithm for boosting decision trees that guarantees differential privacy. Privacy is enforced because our booster never puts too much weight on any one example; this ensures that no individual's data influences a single tree "too much." Experiments show that this boosting algorithm can produce better model sparsity and accuracy than other differentially private ensemble classifiers.
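The key idea above, capping per-example weights so that no individual dominates any boosting round, can be illustrated with a minimal sketch. This is not the paper's actual algorithm; the exponential re-weighting rule, the `cap` parameter, and the `smooth_weights` helper are illustrative assumptions, showing only the smoothness property (a bounded weight distribution) that limits any one example's influence.

```python
import numpy as np

def smooth_weights(margins, cap=2.0):
    """Illustrative smooth re-weighting (not the paper's algorithm).

    Each example's raw weight grows as its margin shrinks, as in
    standard boosting, but is clipped at `cap` before normalizing,
    so no single example can dominate the distribution.
    """
    raw = np.exp(-np.asarray(margins, dtype=float))  # standard exponential re-weighting
    raw = np.minimum(raw, cap)                       # cap each weight: the "smoothness" step
    return raw / raw.sum()                           # normalize to a probability distribution
```

For example, an example with a very negative margin (badly misclassified) would receive an enormous weight under plain exponential re-weighting; with the cap in place its normalized weight stays bounded, which is the property that makes the per-tree influence of any individual's data small.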
