Graduation date: 2008

Abstract
The goal of many machine learning problems can be formalized as the creation of a function that correctly classifies an input vector, given a set of labeled examples of that function. While this formalism has produced a number of success stories, there are notable situations in which it fails. One such situation arises when the class labels are composed of multiple variables, each of which may be correlated with all or part of the input or output vectors. Such problems, known as structured prediction problems, are common in the fields of information retrieval, computational linguistics, and computer vision, among others. In this dissertation, I will discuss structured prediction problems and some of the previous approaches to solving them. I will then present a new algorithm, structured gradient boosting, that combines the strengths of previous approaches while retaining their generality. More specifically, the algorithm combines some of the notions of margin maximization present in support vector methods with the speed and flexibility of the structured perceptron algorithm. Finally, I will show a number of novel ways in which this algorithm can be applied effectively, highlighting applications in learning by demonstration and music information retrieval.
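To make the baseline concrete, here is a minimal sketch of the structured perceptron update that the abstract contrasts with structured gradient boosting. All names (`predict`, `perceptron_update`, the feature map) are illustrative assumptions, not the dissertation's actual implementation; real structured predictors replace the exhaustive candidate search with efficient inference.

```python
# Hypothetical sketch of a structured perceptron step (names are
# illustrative, not taken from the dissertation).

def dot(w, f):
    """Sparse dot product between weight dict w and feature dict f."""
    return sum(w.get(k, 0.0) * v for k, v in f.items())

def predict(w, x, candidates, features):
    """Inference: return the candidate output scoring highest under w.
    Here a brute-force max over an explicit candidate list stands in
    for structured inference (e.g. Viterbi decoding)."""
    return max(candidates, key=lambda y: dot(w, features(x, y)))

def perceptron_update(w, x, y_true, candidates, features, lr=1.0):
    """One structured perceptron step: if the highest-scoring output is
    wrong, move the weights toward the true output's features and away
    from the predicted output's features."""
    y_hat = predict(w, x, candidates, features)
    if y_hat != y_true:
        for k, v in features(x, y_true).items():
            w[k] = w.get(k, 0.0) + lr * v
        for k, v in features(x, y_hat).items():
            w[k] = w.get(k, 0.0) - lr * v
    return w
```

Unlike a max-margin (support vector) formulation, this update stops as soon as the prediction is correct; it does not push the score of the true output a margin above the alternatives, which is the gap the abstract's combined approach aims to close.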