Computational Learning Theory and Beyond
In this T-shaped course you will be introduced to computational learning theory and get a glimpse of other research towards a theory of artificial intelligence. "T-shaped" means that, on the one hand, we concentrate on selected learning models in depth, while on the other hand we give a broad overview and invite experts from other AI projects to show what else can be done in AI and in theoretical computer science research.
The focus is on learning from informant, a formal model for binary classification as performed, for example, by a support vector machine or a perceptron. Illustrative examples are linear separators and other uniformly decidable sets of formal languages. Thanks to Gold's learning-by-enumeration technique, the learning process can be assumed to be consistent whenever full information is available.
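To give a first impression, the following is a minimal sketch of Gold's learning by enumeration, assuming the hypothesis space is given as a finite list of membership predicates; the family of languages, the informant stream, and all identifiers are illustrative assumptions, not part of the course material.

```python
from typing import Callable, Iterable, Iterator, Optional, Tuple

# A hypothesis is a membership predicate for a language over the naturals.
Hypothesis = Callable[[int], bool]


def learn_by_enumeration(
    hypotheses: list[Hypothesis],
    informant: Iterable[Tuple[int, bool]],
) -> Iterator[Optional[int]]:
    """After each labelled example from the informant, conjecture the least
    index of a hypothesis consistent with all data seen so far
    (None if no hypothesis fits)."""
    data: list[Tuple[int, bool]] = []
    for example in informant:
        data.append(example)
        yield next(
            (i for i, h in enumerate(hypotheses)
             if all(h(x) == label for x, label in data)),
            None,
        )


# Example: the uniformly decidable family {multiples of k : 1 <= k <= 9}.
multiples = [lambda x, k=k: x % k == 0 for k in range(1, 10)]
stream = [(6, True), (4, False), (9, True)]  # informant data for multiples of 3
print(list(learn_by_enumeration(multiples, stream)))  # [0, 2, 2], converging to k = 3
```

Since this learner always outputs the least hypothesis consistent with all data seen so far, its conjectures are consistent by construction, which illustrates the consistency property available in the full-information setting.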
After proving these observations, we adjust the model towards the setting of deep learning by requiring the learner to be incremental, that is, to process one example at a time instead of the full data history. We first investigate the learnability of the set of half-spaces by such incremental learners and then show that they have strictly less learning power than the full-information variant, using a fundamental proof technique due to Blum and Blum. Finally, we apply this technique to separate consistent learning as well.
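To make the contrast with full-information learners concrete, here is a minimal sketch of an incremental learner for half-spaces in the style of the perceptron: each update sees only the previous hypothesis and the current labelled point, never the full history. The target half-space and the sample distribution are illustrative assumptions.

```python
import random


def perceptron_step(w: list[float], x: list[float], label: bool) -> list[float]:
    """Incremental update: revise the weight vector using only the current
    hypothesis w and the single labelled example (x, label)."""
    predicted = sum(wi * xi for wi, xi in zip(w, x)) >= 0.0
    if predicted == label:
        return w  # keep the hypothesis on correctly classified points
    sign = 1.0 if label else -1.0
    return [wi + sign * xi for wi, xi in zip(w, x)]


# Target half-space {x in R^2 : 2*x1 - x2 >= 0}; unlike the enumeration
# learner above, this learner never stores past examples.
random.seed(0)
w = [0.0, 0.0]
for _ in range(1000):
    x = [random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)]
    w = perceptron_step(w, x, label=2.0 * x[0] - x[1] >= 0.0)
print(w)  # roughly proportional to the target normal (2, -1)
```

It is exactly this restriction of the learner's memory to its current hypothesis that the separation from full-information learning turns on.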
Beyond these models, we present and visualize applied binary classifiers, and you will get condensed insights into other approaches towards a theory of AI as well as important topics in theoretical computer science. These include evolutionary algorithms, fair clustering, game theory, low-dimensional embeddings, stable matchings, 3-satisfiability and submodular optimization.
Furthermore, additional models in Gold-style computational learning theory will be discussed by experts in the second week.