At Apple, we believe privacy is a fundamental human right, and privacy-preserving ML is a key research area for us. In this talk we will share how we applied Differential Privacy to learn which kinds of photos people most often take at frequently visited locations, in order to improve the selection of cover images for Photo Memories, without personally identifiable data leaving the user's device. We will cover how photos are labeled with an on-device model and how these label-location pairs are encoded into one-hot vectors so that random noise can be added. We will then explain the secure aggregation protocol that combines the noised vectors from thousands of devices under a strong Differential Privacy guarantee.
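As a rough illustration of the encoding and aggregation steps mentioned above, the sketch below shows how a label-location pair might be one-hot encoded on device, perturbed with noise calibrated for local Differential Privacy, and then summed across many devices so that only aggregate popularity is recoverable. This is not Apple's actual pipeline: the bucket vocabulary, the Laplace mechanism, the epsilon value, and all function names are illustrative assumptions, and the in-the-clear summation stands in for the secure aggregation protocol, which ensures the server only ever sees the sum.

```python
import numpy as np

# Hypothetical illustration, not Apple's actual pipeline or parameters.
VOCAB_SIZE = 1000     # assumed number of (label, location) buckets
EPSILON = 2.0         # assumed per-contribution privacy budget
SENSITIVITY = 2.0     # L1 sensitivity of swapping one one-hot vector for another

def encode_one_hot(bucket_index: int, size: int = VOCAB_SIZE) -> np.ndarray:
    """Encode a (label, location) pair, already mapped to a bucket, as a one-hot vector."""
    vec = np.zeros(size)
    vec[bucket_index] = 1.0
    return vec

def add_laplace_noise(vec: np.ndarray, epsilon: float = EPSILON) -> np.ndarray:
    """Perturb the vector with Laplace noise scaled to its L1 sensitivity."""
    scale = SENSITIVITY / epsilon
    return vec + np.random.laplace(loc=0.0, scale=scale, size=vec.shape)

# Server side: sum the noised vectors from many devices. Any single
# contribution is masked by noise, but the noise cancels in aggregate,
# so the largest entries point to the most popular buckets.
rng = np.random.default_rng(0)
device_buckets = rng.integers(0, VOCAB_SIZE, size=10_000)  # simulated devices
aggregate = np.zeros(VOCAB_SIZE)
for bucket in device_buckets:
    aggregate += add_laplace_noise(encode_one_hot(int(bucket)))

top_buckets = np.argsort(aggregate)[-5:][::-1]
print("Estimated most popular (label, location) buckets:", top_buckets)
```

In the simulation the per-device noise is large relative to any single vector, yet summing ten thousand contributions still surfaces the frequent buckets; this is the intuition behind learning popular photo categories without observing any individual's data.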