Multi-Armed Bandit Problem: Finding an Optimal Solution When We Don’t Have Enough Time and Resources by Gyung Hyun Je ’20, Wednesday, December 4, 1 – 1:45 pm, Mathematics Colloquium
“Two roads diverged in a yellow wood” (Robert Frost, “The Road Not Taken”).
And which one should you choose? Every day, multiple roads diverge in front of us, and we face the dilemma of having to choose one. We can’t explore every single option given our limited resources and time, but we don’t have enough data to be sure which choice is best. This is a Multi-Armed Bandit problem. Our goal is to maximize the expected gain as we make a sequence of decisions allocating our resources among a number of choices we don’t know much about. In this colloquium, we will discuss known mathematical solutions to the Multi-Armed Bandit problem. We will also examine how we can approximate an optimal solution computationally and apply it to real-life cases.
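As a taste of the computational approximations the talk mentions, here is a minimal sketch of one classic heuristic, the epsilon-greedy strategy: with a small probability we explore a random arm, and otherwise we exploit the arm with the best reward estimate so far. The function name, the Bernoulli reward model, and all parameter values are illustrative assumptions, not material from the talk itself.

```python
import random

def epsilon_greedy(true_means, n_rounds=10000, epsilon=0.1, seed=0):
    """Epsilon-greedy on a Bernoulli bandit (illustrative sketch).

    true_means: unknown-to-the-player success probability of each arm.
    Returns the per-arm reward estimates and the total reward collected.
    """
    rng = random.Random(seed)
    n_arms = len(true_means)
    counts = [0] * n_arms    # how many times each arm was pulled
    values = [0.0] * n_arms  # running mean reward per arm
    total_reward = 0.0
    for _ in range(n_rounds):
        # Explore with probability epsilon; otherwise exploit the best estimate.
        if rng.random() < epsilon:
            arm = rng.randrange(n_arms)
        else:
            arm = max(range(n_arms), key=lambda a: values[a])
        reward = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # Incremental update of the running mean for the pulled arm.
        values[arm] += (reward - values[arm]) / counts[arm]
        total_reward += reward
    return values, total_reward
```

With enough rounds, the estimate for the best arm dominates and most pulls go to it, which is exactly the explore-versus-exploit trade-off the abstract describes.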