We present bounds between different widths of convex subsets of Banach spaces, including Gelfand, Kolmogorov and Bernstein widths. Using this, and some relations between widths and minimal errors, we obtain bounds on the maximal gain of adaptive and randomized algorithms over non-adaptive, deterministic ones for approximating linear operators on convex sets.
We conclude with an overview of the new state of the art and a list of open problems.