February 16, Wednesday 14:15, Room 303, Jacobs Building
Title:
Lecturer: Ofer Neiman
Lecturer homepage: http://www.cs.princeton.edu/~oneiman/
Affiliation: Computer Science Department, Princeton University
Embedding of metric spaces has been recognized as a powerful tool for algorithm design, and it plays a role in a wide range of application areas. Its usefulness stems from the fact that an embedding allows us to reduce problems defined over arbitrary "difficult" metrics to problems over "simpler" metrics.
In the first part of the talk I will review several results and applications of metric embedding, and in the second part I will focus on a recent result on dimension reduction in L_1:
Given a set of n points in L_1, how many dimensions are needed to
represent all pairwise distances within a given distortion?
This dimension-distortion tradeoff question is well understood for the
L_2 norm, where O((\log n)/\epsilon^{2}) dimensions suffice to achieve
1+\epsilon distortion. In sharp contrast, there is a significant gap between
upper and lower bounds for dimension reduction in L_1.
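To make the L_2 side of this tradeoff concrete, the following is a minimal random-projection sketch in the spirit of the Johnson-Lindenstrauss lemma (an illustration only, not part of the announced result); the constant in the target dimension and the helper names are assumptions chosen for readability.

import numpy as np

def pairwise_dists(X):
    # Euclidean distances between all pairs of rows of X.
    diffs = X[:, None, :] - X[None, :, :]
    return np.sqrt((diffs ** 2).sum(axis=-1))

def jl_project(X, eps=0.5, seed=0):
    # Johnson-Lindenstrauss-style projection: map n points in R^d down to
    # k = O(log n / eps^2) dimensions via a scaled random Gaussian matrix.
    # The constant 4 below is illustrative, not the tight constant.
    n, d = X.shape
    k = int(np.ceil(4 * np.log(n) / eps ** 2))
    rng = np.random.default_rng(seed)
    R = rng.normal(size=(d, k)) / np.sqrt(k)
    return X @ R

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 500))          # 100 points in R^500
    Y = jl_project(X, eps=0.5)
    D_orig, D_proj = pairwise_dists(X), pairwise_dists(Y)
    mask = ~np.eye(len(X), dtype=bool)       # ignore zero self-distances
    ratios = D_proj[mask] / D_orig[mask]
    print("projected dimension:", Y.shape[1])
    print("distance ratios in [%.3f, %.3f]" % (ratios.min(), ratios.max()))

On typical runs the distance ratios stay close to 1, illustrating why O((\log n)/\epsilon^{2}) dimensions suffice in L_2; no analogous general-purpose projection is known for L_1, which is exactly the gap the result below addresses.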
In this work, we show the first near-linear lower bounds for dimension
reduction in L_1. In particular, we show that achieving 1+\epsilon distortion
requires at least n^{1-O(1/\log(1/\epsilon))} dimensions.