
Owen Levin

Credentials: Ph.D. Student, Computer Science

Machine Learning Theory

Personal website: Owen Levin

Advisor: Jerry Zhu

I’m interested in wrangling problems related to machine learning using mathematical theory. One example of this is machine teaching. The goal here is to train a learner/classifier to do some task with as small a data set as possible. For example, if the task is related to driving, we might need to train a model to classify street signs, e.g. as a stop sign, yield sign, or speed limit sign.

The typical way to do this is to first collect thousands or millions of labelled images of each kind of sign, then train by cramming all those images in some random order down the gullet of the model you’re training. By contrast, in one variant of machine teaching, a teaching agent picks a small data set, maybe just tens or hundreds of images, presented in some particular order. The teacher’s goal is to pick these images such that the performance of the model trained on this smaller data set is as good as or better than that of a model trained on the full data set of millions.
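
As a rough illustration of this idea (not any particular algorithm from my research), the sketch below greedily assembles a small teaching set: at each step it adds the pool image whose inclusion most improves a simple learner's accuracy on a held-out evaluation set, then compares the result to training on the full pool. The logistic-regression learner, the greedy selection rule, and the synthetic data are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def greedy_teacher(X_pool, y_pool, X_eval, y_eval, budget):
    """Greedily select a small teaching set from a large labelled pool.

    At each step, add the pool example whose inclusion most improves the
    learner's held-out accuracy. This is only a toy teacher, not an
    optimal one.
    """
    # Seed with one example per class so the learner can be fit at all.
    chosen = [int(np.where(y_pool == c)[0][0]) for c in np.unique(y_pool)]

    while len(chosen) < budget:
        best_idx, best_acc = None, -1.0
        for i in range(len(X_pool)):
            if i in chosen:
                continue
            # Retrain the learner with candidate example i added.
            learner = LogisticRegression(max_iter=1000)
            learner.fit(X_pool[chosen + [i]], y_pool[chosen + [i]])
            acc = learner.score(X_eval, y_eval)
            if acc > best_acc:
                best_idx, best_acc = i, acc
        chosen.append(best_idx)

    small_model = LogisticRegression(max_iter=1000)
    small_model.fit(X_pool[chosen], y_pool[chosen])
    return chosen, small_model

# Toy comparison: a 20-example teaching set vs. training on the full pool.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
X_pool, y_pool, X_eval, y_eval = X[:300], y[:300], X[300:], y[300:]

chosen, small_model = greedy_teacher(X_pool, y_pool, X_eval, y_eval, budget=20)
full_model = LogisticRegression(max_iter=1000).fit(X_pool, y_pool)
print("teaching set of", len(chosen), "examples:", small_model.score(X_eval, y_eval))
print("full pool of", len(X_pool), "examples:", full_model.score(X_eval, y_eval))
```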

This example and many like it can be analyzed through the lens of optimal control theory. Control theory is a branch of mathematics in which one has a set of possible states (in this case, the model learned from a sequence of input images) and a set of possible actions an agent can take to change the state (in this case, the teacher’s choice of the next training image). Control theory then chooses actions to maximize a reward function after T actions: here, the teacher’s sequence of T training images is chosen to maximize the performance of the learned model. Posed this way, the machine teaching problem can be solved with well-understood principles of classical control theory. There is a rich body of literature in control theory, and in other areas of mathematics, that can be applied to machine learning. My goal is to frame machine learning problems in these settings to glean and exploit new insights.
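
Concretely, this framing can be written as a finite-horizon control problem (the notation below is my own shorthand, not a standard from any particular paper):

```latex
\max_{a_1, \dots, a_T} \; R(x_T)
\quad \text{subject to} \quad
x_t = f(x_{t-1}, a_t), \qquad t = 1, \dots, T,
```

where x_t is the learner’s model after seeing the first t teaching examples, a_t is the t-th training image the teacher selects, f is one step of the learner’s training update, and R measures the performance of the final trained model.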