Computer Science Colloquium, 2004-2005

Michael Langberg
Computer Science Department, California Institute of Technology
December 8th, 2004

Worst Case Analysis and Combinatorial Optimization in Information Theory

A central theme in the theory of information is the stochastic study of information flow. The messages sent, the errors imposed on the network, and the request patterns of users are typically modelled as random entities and analyzed as such. In this talk, I suggest applying notions from Theoretical Computer Science to the study of Information Theory. Bringing efficiency of computation, worst-case analysis, approximation algorithms, and randomization (all studied extensively in the theory of computation) to bear on Information Theory raises many research directions of both theoretical and practical interest.

In this talk I will analyze three concrete scenarios within this theme. In the area of multi-party communication, I address the design of "cost-efficient" codes that allow communication at capacity; namely, I consider combinatorial optimization in the network coding framework, discussing both worst-case bounds on optimal costs and approximate (per-instance) bounds. In the design of error-correcting codes for two-party communication over a noisy channel, I address the well-known gap between the rates achievable under the random error model (the binary symmetric channel) and the Hamming error model (the adversarial channel). I show that enhancing the model of two-party communication by assuming that the parties share an extremely short secret random string allows one to close this gap. Finally, I will present relations between average-case and worst-case analysis in algorithmic aspects of the distribution of dynamic information.
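To give a flavor of the optimization problems arising in the first scenario, the following is a minimal sketch of the standard minimum-cost multicast formulation under network coding; the notation is illustrative and not taken from the talk, and the exact cost measure treated in the talk may differ. Suppose a source s multicasts at rate k to a set of terminals T over a directed graph G = (V, E) with edge costs c_e and capacity bounds u_e, and let z_e denote the capacity purchased on edge e:

\[
\begin{aligned}
\min\; & \sum_{e\in E} c_e\, z_e\\
\text{s.t.}\; & f^{(t)} \text{ is an } s\text{--}t \text{ flow of value } k && \text{for each terminal } t\in T,\\
& 0 \le f^{(t)}_e \le z_e \le u_e && \text{for each } e\in E \text{ and } t\in T.
\end{aligned}
\]

The point of the formulation is that network coding lets the flows to different terminals share edge capacity, so z_e need only cover the maximum (rather than the sum) of the individual flows. The program is a polynomial-size linear program, and by the multicast network coding theorem a code of rate k exists over any feasible capacity vector.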
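The rate gap in the second scenario can also be made quantitative; the following is a standard sketch, and the exact parameters in the talk may differ. For the binary symmetric channel with crossover probability p < 1/2, Shannon's theorem gives capacity

\[
C_{\mathrm{BSC}}(p) \;=\; 1 - H(p), \qquad H(p) \;=\; -p\log_2 p \;-\; (1-p)\log_2(1-p),
\]

whereas against an adversary who may flip any pn of the n transmitted bits, no positive rate is achievable once p \ge 1/4 (by the Plotkin bound), and for p < 1/4 the best known constructions (e.g., the Gilbert--Varshamov bound, rate 1 - H(2p)) remain strictly below 1 - H(p). The claim in the abstract is that a shared secret random string suffices to restore the rate 1 - H(p) against the adversarial channel with vanishing probability of decoding error; reading "extremely short" as logarithmic in the block length is our assumption, consistent with the literature on private codes.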

Partially based on joint work with Alexander Sprintson and Jehoshua Bruck.

