

Poster in Workshop: Humans, Algorithmic Decision-Making and Society: Modeling Interactions and Impact

Bayesian Collaborative Bandits for Time Slot Inference in Maternal Health Programs

Arpan Dasgupta · Arun Sai Suggala · Karthikeyan Shanmugam · Aparna Taneja · Milind Tambe


Abstract:

Mobile health programs have gained popularity recently due to the widespread use of mobile phones, particularly in underserved communities. However, call records from one such maternal mHealth program in India indicate that beneficiaries have different time preferences, owing to their availability during the day as well as limited access to a phone. This makes selecting the best time slot to call each beneficiary an important problem for the program. Prior work has formalized this as a collaborative bandit problem, where the assumption of a low-rank call-pickup matrix allows for more efficient exploration across arms. We propose a novel Bayesian solution to the collaborative bandit problem that uses Stochastic Gradient Langevin Dynamics (SGLD) and Thompson Sampling to select time slots. We show that this method performs better in data-scarce settings where only a limited number of time steps are available for exploration, and that it can exploit prior knowledge about arms. We also propose a faster variant of the algorithm using alternative sampling, which can scale to a very large number of users and is therefore potentially deployable in the real world. We evaluate the algorithm against existing methods on simulated data inspired by real-world data.
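To make the approach concrete, the sketch below illustrates one plausible way to combine a low-rank pickup-probability model with SGLD posterior sampling and Thompson Sampling. It is not the authors' implementation: the problem sizes, priors, step size, minibatch size, and all variable names (e.g. `U`, `V`, `prior_prec`) are illustrative assumptions.

```python
# Hypothetical sketch of SGLD-based Thompson Sampling for a low-rank
# collaborative bandit (all names and hyperparameters are assumptions).
import numpy as np

rng = np.random.default_rng(0)

n_users, n_slots, rank = 50, 7, 3        # assumed problem size
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Simulated ground truth: pickup probabilities follow a low-rank structure.
true_U = rng.normal(0, 0.5, (n_users, rank))
true_V = rng.normal(0, 0.5, (n_slots, rank))
true_p = sigmoid(true_U @ true_V.T)

# Current posterior sample of the latent factors, evolved by SGLD.
U = rng.normal(0, 0.1, (n_users, rank))
V = rng.normal(0, 0.1, (n_slots, rank))
history = []                              # observed (user, slot, reward) triples
step, prior_prec = 1e-2, 1.0              # SGLD step size, Gaussian prior precision

for t in range(2000):
    # Thompson Sampling: act greedily w.r.t. the current posterior sample.
    user = rng.integers(n_users)
    slot = int(np.argmax(sigmoid(U[user] @ V.T)))
    reward = rng.binomial(1, true_p[user, slot])   # simulated call pickup
    history.append((user, slot, reward))

    # One SGLD step: gradient of log-likelihood (rescaled minibatch) plus
    # log-prior, with injected Gaussian noise of variance equal to the step size.
    idx = rng.integers(len(history), size=min(32, len(history)))
    batch = [history[i] for i in idx]
    scale = len(history) / len(batch)
    gU, gV = -prior_prec * U, -prior_prec * V
    for (i, j, r) in batch:
        err = r - sigmoid(U[i] @ V[j])             # Bernoulli residual
        gU[i] += scale * err * V[j]
        gV[j] += scale * err * U[i]
    U += 0.5 * step * gU + np.sqrt(step) * rng.normal(size=U.shape)
    V += 0.5 * step * gV + np.sqrt(step) * rng.normal(size=V.shape)
```

Under these assumptions, sampling the factors via SGLD rather than maintaining a full posterior is what would let the method trade off exploration and exploitation cheaply per call, which is the property the abstract highlights for scaling to many beneficiaries.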
