The traditional theory of algorithms and their complexity generally gives runtime bounds in terms of the size of an instance. With the need to process ever larger data sets, these analyses have lost much of their predictive power, since even many polynomial-time algorithms cannot be used to process huge amounts of data. Consequently, frameworks have been developed to more closely model the data encountered in practical settings. These allow us to design algorithms with much better runtime guarantees on some practical data, as well as to develop a much more detailed complexity theory of such algorithms. This seminar will give an overview of some of the modern algorithmic approaches that allow us to more accurately analyse real-world problems, such as parameterized and fine-grained analysis, randomization, and streaming and online algorithms.
The seminar will cover up to 10 topics, depending on the number of participants. Each topic will be covered by one student, who will work through a paper or problem, give a 30-minute presentation on it, and write a 2-3 page summary.
We are looking for students with an interest in theoretical computer science and algorithm design. Some of the topics to be discussed are fairly technical, and experience with graph theory, algorithm analysis, probability theory, and complexity theory will be beneficial. However, the range of topics is quite wide, so a lack of background in some of these areas should not be an obstacle. The seminar is open to master's students in CS, DS, IIW, and TM.