Diploma / Doctoral Seminar: Theory Meets Application



Our research seminar "Theory meets Application" provides a forum for exchange between the different groups within the Institute of Computer Science. It takes place three times during the 2015 summer break, giving attendees the opportunity to learn about the most recent research being done in Bonn, and to discover and discuss potential joint interests and areas of overlap.

Slides are available for download at the bottom of this page. If you did not receive the login information, please contact Thomas Schultz.

5.8. 16:00-17:30 A207 (Römerstr.): Heiko Röglin and Jürgen Gall

Heiko Röglin: Smoothed Analysis of Algorithms

The complexity of many classical optimization problems and algorithms seems very well understood. However, theoretical results very often contradict observations made in practice. Many NP-hard optimization problems, like the knapsack problem, can be solved efficiently in practice, and for many problems, like linear programming, algorithms with exponential worst-case running time outperform polynomial-time algorithms. The reason for this discrepancy is the pessimistic worst-case perspective adopted by theoreticians, in which the performance of an algorithm is solely measured by its behavior on the worst possible input.

In order to provide a more realistic measure that can explain the practical performance of algorithms, Spielman and Teng introduced the framework of smoothed analysis in 2001, in which the performance of an algorithm is measured on inputs that are subject to random noise. In this talk, we will present this framework and survey some results on the smoothed analysis of algorithms.
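To make the framework concrete, here is a small illustrative sketch (not from the talk itself): a deterministic quicksort that always picks the first element as pivot takes quadratic time on a sorted input, but once Gaussian noise is added to that same input, its cost drops sharply, which is the kind of effect smoothed analysis captures.

```python
import random

def quicksort_comparisons(a):
    """Count the comparisons made by a deterministic quicksort that
    always uses the first element of each segment as the pivot."""
    a = list(a)
    comparisons = 0
    stack = [(0, len(a) - 1)]          # explicit stack avoids recursion limits
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[lo]
        i = lo
        for j in range(lo + 1, hi + 1):   # Lomuto-style partition
            comparisons += 1
            if a[j] < pivot:
                i += 1
                a[i], a[j] = a[j], a[i]
        a[lo], a[i] = a[i], a[lo]
        stack.append((lo, i - 1))
        stack.append((i + 1, hi))
    return comparisons

random.seed(0)
n, sigma = 500, 50.0
worst = list(range(n))                                  # sorted input: worst case here
smoothed = [x + random.gauss(0, sigma) for x in worst]  # the same input, perturbed

print(quicksort_comparisons(worst))     # 124750 = n(n-1)/2, quadratic
print(quicksort_comparisons(smoothed))  # far fewer comparisons after smoothing
```

The sorted input forces every pivot to be a minimum, while the noise shuffles the local order enough that partitions become balanced on average.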

Jürgen Gall: Large-scale Problems in Computer Vision

Analyzing visual data, e.g., segmenting an object or estimating the human pose in an image, requires the minimization of an energy function, either for parameter learning or for inference. The optimization, however, is often intractable, in particular for large-scale data with millions of images and videos. We therefore have to give up some desirable properties like generality, optimality, good worst-case complexity, integrality, or determinism. In this talk, I will give a brief introduction to a few computer vision problems and approaches for solving them. Depending on time, the talk will include object discovery, video classification, human pose estimation, neural networks, branch-and-bound, and random fields.

19.8. 16:00-17:30 HS III.03a (LBH): Stefan Kratsch and Thomas Schultz

Stefan Kratsch: Parameterized complexity and treewidth

Parameterized complexity is a framework for studying the complexity of NP-hard problems in finer detail. This is achieved by relating the runtime not only to the input size but also to one or more problem parameters. For example, many NP-hard problems admit competitive algorithms for finding small solutions, even though time exponential in the input size must be expected in general.
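A standard textbook illustration of this idea (not necessarily covered in the talk) is Vertex Cover parameterized by solution size k: a bounded search tree picks an arbitrary edge, branches on which endpoint joins the cover, and runs in O(2^k · m) time, which is fast whenever k is small even if the graph is large.

```python
def vertex_cover_at_most_k(edges, k):
    """Bounded search tree for Vertex Cover: any edge (u, v) must have
    an endpoint in the cover, so branch on both choices. The search
    tree has depth at most k, giving O(2^k * m) time overall."""
    if not edges:
        return True       # all edges covered
    if k == 0:
        return False      # edges remain but no budget left
    u, v = edges[0]
    rest_u = [e for e in edges if u not in e]   # take u into the cover
    rest_v = [e for e in edges if v not in e]   # take v into the cover
    return vertex_cover_at_most_k(rest_u, k - 1) or vertex_cover_at_most_k(rest_v, k - 1)

# a triangle needs two vertices to cover all three edges
triangle = [(0, 1), (1, 2), (0, 2)]
print(vertex_cover_at_most_k(triangle, 1))  # False
print(vertex_cover_at_most_k(triangle, 2))  # True
```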

The notion of treewidth and the algorithmic paradigm of dynamic programming on tree decompositions play an important role in parameterized complexity. Treewidth is probably the most widely studied parameter, and a wealth of problems admit efficient algorithms on graphs or structures of bounded treewidth.

The talk will begin with a brief introduction to parameterized complexity. The main part will introduce the notion of treewidth and discuss the intuition behind dynamic programming on graphs of bounded treewidth.
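To convey the flavor of such dynamic programs with a minimal sketch (illustrative only): trees are exactly the connected graphs of treewidth 1, and maximum independent set on a tree is solved by keeping, for each node, one table entry per way the solution may intersect the current bag, here simply "node taken" versus "node not taken".

```python
def max_independent_set(tree, root):
    """Dynamic programming on a rooted tree (the treewidth-1 case):
    for each node keep two table entries, the largest independent set
    in its subtree with and without the node itself."""
    def solve(v):
        with_v, without_v = 1, 0
        for child in tree.get(v, []):
            w_c, wo_c = solve(child)
            with_v += wo_c                 # if v is taken, children are excluded
            without_v += max(w_c, wo_c)    # otherwise children choose freely
        return with_v, without_v
    return max(solve(root))

# a path on 4 vertices 0 - 1 - 2 - 3; an optimal independent set has size 2
path = {0: [1], 1: [2], 2: [3]}
print(max_independent_set(path, 0))  # 2
```

For treewidth w, the same idea keeps up to 2^(w+1) entries per bag of a tree decomposition, giving a runtime exponential in w but linear in the graph size.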

Thomas Schultz: Mathematics and Visualization in Neuroscience

Modern medical imaging opens a window into the living human brain, and enables new insights into its structure, function, and disease. At the same time, it generates complex data that poses interesting challenges to data analysis and visualization. Within this context, this talk will illustrate how computer science can act as a bridge between mathematical insight and a specific application.

In particular, we explain how classic concepts from linear algebra, such as low-rank approximation or positive definiteness, can be transferred to tensors, which can be considered as multi-way extensions of matrices. In theory, this turns efficiently computable problems into NP-hard ones. We will discuss an example where, in practice, it still leads to an algorithm that produces state-of-the-art results at a competitive computational cost.
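As a toy sketch of the tensor setting (illustrative only, not the speaker's method): for matrices, the best rank-1 approximation is found in polynomial time via the SVD, while for 3-way tensors the analogous problem is NP-hard in general, and alternating least squares (ALS) is a common heuristic that fixes two factors and solves for the third in turn.

```python
import math

def rank1_als(T, iters=50):
    """Alternating least squares for a rank-1 approximation T ~ a (x) b (x) c
    of a 3-way tensor, stored as nested lists. A heuristic: unlike the
    matrix case, no polynomial-time optimality guarantee is known."""
    I, J, K = len(T), len(T[0]), len(T[0][0])
    a, b, c = [1.0] * I, [1.0] * J, [1.0] * K

    def normalize(v):
        n = math.sqrt(sum(x * x for x in v)) or 1.0
        return [x / n for x in v]

    for _ in range(iters):
        a = normalize([sum(T[i][j][k] * b[j] * c[k]
                           for j in range(J) for k in range(K)) for i in range(I)])
        b = normalize([sum(T[i][j][k] * a[i] * c[k]
                           for i in range(I) for k in range(K)) for j in range(J)])
        c = [sum(T[i][j][k] * a[i] * b[j]              # c carries the overall scale
                 for i in range(I) for j in range(J)) for k in range(K)]
    return a, b, c

# an exactly rank-1 tensor T[i][j][k] = u[i] * v[j] * w[k]
u, v, w = [1.0, 2.0], [3.0, 1.0], [1.0, 4.0]
T = [[[ui * vj * wk for wk in w] for vj in v] for ui in u]
a, b, c = rank1_als(T)
err = sum((T[i][j][k] - a[i] * b[j] * c[k]) ** 2
          for i in range(2) for j in range(2) for k in range(2))
print(err)  # near zero, since the input is exactly rank 1
```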

2.9. 16:00-17:30 A207 (Römerstr.): Matthias Mnich and Satya Samal Swarup

Matthias Mnich: A Fine-Grained Complexity Analysis of Cut Problems in Graphs

Cut problems in graphs are among the most extensively studied optimization problems: separating a graph into several pieces according to certain criteria while optimizing some objective function. Cut problems have many practical applications, such as image segmentation or clustering. Their theoretical appeal comes in part from the fact that many of these problems lie just on the boundary of tractability, meaning that they can either be solved provably efficiently, or fast algorithms are unlikely to exist. Although graph cut problems have been studied for more than half a century, the past few years have seen tremendous progress in understanding the conditions that make a cut problem hard or easy. To this end, I will survey a fine-grained complexity analysis of fundamental cut problems that highlights the crucial aspects contributing to making a problem easy or hard.

Satya Samal Swarup: A geometric method for model reduction of biochemical networks with polynomial rate functions

We discuss a novel analysis method for reaction network systems with polynomial or rational rate functions. The method is based on computing tropical equilibrations, defined by the equality of at least two dominant monomials of opposite signs in the differential equation of each dynamic variable. In algebraic geometry, the tropical equilibration problem amounts to finding tropical prevarieties, which are finite intersections of tropical hypersurfaces. Tropical equilibrations with the same set of dominant monomials define a branch, or equivalence class. Minimal branches are particularly interesting, as they describe the simplest states of the reaction network. We provide a method to compute the number of minimal branches and to find representative tropical equilibrations for each branch.
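A minimal sketch of the equilibration condition (illustrative, with a made-up two-variable right-hand side, not the authors' implementation): writing each concentration as a power of 10, x_i = 10^(-mu_i), each monomial's order of magnitude is linear in the exponents mu, and an equilibration requires the maximal order to be attained by at least two monomials of opposite signs, so that the leading terms can cancel.

```python
import math

def dominant_monomials(monomials, mu, tol=1e-9):
    """Each monomial is (coefficient, exponent_vector). With concentrations
    x_i = 10**(-mu_i), the monomial c * x**a has order of magnitude
    log10|c| - sum(a_i * mu_i); return the monomials attaining the maximum."""
    def order(c, a):
        return math.log10(abs(c)) - sum(ai * mi for ai, mi in zip(a, mu))
    best = max(order(c, a) for c, a in monomials)
    return [(c, a) for c, a in monomials if order(c, a) > best - tol]

def is_tropical_equilibration(monomials, mu):
    """Equilibration at mu: at least two dominant monomials whose
    coefficients have opposite signs."""
    dom = dominant_monomials(monomials, mu)
    signs = {c > 0 for c, _ in dom}
    return len(dom) >= 2 and len(signs) == 2

# hypothetical rate: dx1/dt = 1 * x1  -  10 * x1 * x2 (production vs. degradation)
rhs = [(1.0, (1, 0)), (-10.0, (1, 1))]
print(is_tropical_equilibration(rhs, (0.0, 1.0)))  # True: both monomials have order 0
print(is_tropical_equilibration(rhs, (0.0, 0.0)))  # False: the negative term dominates
```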

Joint work with Dima Grigoriev (CNRS, Mathematiques, Universite de Lille), Holger Fröhlich (Algorithmic Bioinformatics, Bonn-Aachen International Center for IT), Andreas Weber (Institut für Informatik II, University of Bonn), Ovidiu Radulescu (DIMNP UMR CNRS 5235, University of Montpellier)