Professor Anders C. Hansen
Anders C. Hansen leads the Applied Functional and Harmonic Analysis group in the Department of Applied Mathematics and Theoretical Physics (DAMTP), Faculty of Mathematics, University of Cambridge. He is a Professor of Mathematics at the University of Cambridge, Professor of Mathematics at the University of Oslo, a Royal Society University Research Fellow, and a Fellow of Peterhouse. For further information, see Wikipedia.
Email: ach70@cam.ac.uk
Tel: +44 1223 760403
Office: F2.01
Research Interests
Functional Analysis, Artificial Intelligence, Foundations of Computational Mathematics, Solvability Complexity Index hierarchy, Generalised Hardness of Approximation, Optimisation, Inverse Problems, Medical Imaging, Operator/Spectral Theory, Numerical Analysis, Computational Harmonic Analysis, PDEs, Compressed Sensing, Mathematical Signal Processing, Sampling Theory, Geometric Integration, Operator Algebras
Selected Talks and Events
- Plenary speaker at International Conference on Mathematical Theory of Deep Learning (August 5-9, 2024).
- Plenary speaker at 10th International Conference on Mathematical Methods for Curves and Surfaces (June 26-28, 2024).
- Plenary speaker at Nordic Perspectives on Artificial Intelligence (Oct. 12-13, 2023).
- Organizing the workshop Computational mathematics in computer assisted proofs (Sept 12-16, 2022) together with Charles Fefferman and Svetlana Jitomirskaya.
- Plenary speaker at Thirty years of Acta Numerica (26 June - 02 July 2022).
- Speaking at King's College London Mathematics Colloquium (12 May, 2022).
- Organizing the workshop Interpretability, safety, and security in AI (Dec 13-15, 2021) together with Rich Baraniuk , Miguel Rodrigues and Adrian Weller.
- Speaking (online) at the University of Chicago Mathematics Colloquium (April 7, 2021).
- Speaking (online) at the Cambridge Science Festival (March 29, 2021).
- Speaking at the University of Minnesota, Applied and Computational Math Colloquium (Feb. 3, 2020).
- Plenary speaker at the National Academy of Sciences, Arthur M. Sackler Colloquium: The Science of Deep Learning, Washington D.C. (March 2019).
- Plenary speaker at SPARS (2017).
- Plenary speaker at Structured Regularization for High-Dimensional Data Analysis, Institut Henri Poincaré (2017).
- Plenary speaker at Strobl16: Time-Frequency Analysis and Related Topics (2016).
- Plenary speaker at UCL-Duke Workshop on Sensing and Analysis of High-Dimensional Data (2014).
Prizes and Awards
1. PROSE Award Finalist 2022 - Computing & Information Science.
2. 2018 IMA Prize in Mathematics and Applications.
3. Leverhulme Prize in Mathematics and Statistics 2017.
4. Royal Society University Research Fellow 2012.
News
1. SIAM News reports (front page) on our work in the recent May edition: Proving existence is not enough: Mathematical paradoxes unravel the limits of neural networks in AI.
2. IEEE Spectrum Magazine reports on our paper "The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem."
3. Proc. Natl. Acad. Sci. published our paper "The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem." Here is the announcement from Cambridge University News. Further press coverage here.
4. PROSE Award Finalist 2022 - Computing & Information Science, for our book Compressive Imaging: Structure, Sampling, Learning (with B. Adcock), published by Cambridge University Press.
5. SIAM News reports (front page) on our work from the paper "The mathematics of adversarial attacks in AI -- Why deep learning is unstable despite the existence of stable neural networks" in the recent October edition: Deep Learning: What Could Go Wrong?
6. SIAM News reports on our work on deep learning in scientific computing in the recent March edition: Deep Learning in Scientific Computing: Understanding the Instability Mystery.
7. Proc. Natl. Acad. Sci. published our paper On instabilities of deep learning in image reconstruction and the potential costs of AI. Here is some of the press coverage: Cambridge University News, Physics World, EurekAlert, The Register, Health Care Business, Radiology Business, Science Daily, Psychology Today, Government Computing, Diagnostic Imaging, News Medical, Press Release Point, Tech Xplore, Aunt Minnie, My Science, Digit, The Talking Machines, MC.AI, Rama on Healthcare, News8PLus, Genethique, Healthcare in Europe, AuntminnieEurope, Newsbreak, AI Development Hub, FirstWord MedTech, AI Daily.
8. Our paper How to compute spectra with error control is featured on the cover of the June edition of Physical Review Letters.
9. The Sackler Colloquium at the US National Academy of Sciences: "The Science of deep learning". Watch the presentation "On instabilities in deep learning - Does AI come at a cost?"
10. SIAM News has our work on the Restricted Isometry Property in Levels in compressed sensing on the front page of the October edition: From Global to Local: Getting More from Compressed Sensing.
11. Siemens validated in practice, using a modified MRI machine, the asymptotic sparsity, asymptotic incoherence and high resolution concepts introduced by our work (see Breaking the coherence barrier: A new theory for compressed sensing and also On asymptotic structure in compressed sensing). From their conclusion: “[...] The image resolution has been greatly improved [...]. Current results practically demonstrated that it is possible to break the coherence barrier by increasing the spatial resolution in MR acquisitions. This likewise implies that the full potential of the compressed sensing is unleashed only if asymptotic sparsity and asymptotic incoherence is achieved.”
Their work Novel Sampling Strategies for Sparse MR Image Reconstruction was published in May 2014 in the Proceedings of the International Society for Magnetic Resonance in Medicine.
Students and Post-Docs
PhD Students: 1. Clarice Poon (graduated 2015), 2. Milana Gataric (graduated 2016), 3. Alexander Jones (graduated 2016), 4. Alexander Bastounis (graduated 2018), 5. Vegard Antun (graduated 2020), 6. Matt Colbrook (graduated 2020), 7. Laura Thesing (graduated 2022), 8. Simon Becker (graduated 2022), 9. Nina Gottschling (graduated 2023), 10. Paolo Campodonico (graduating 2024), 11. David Liu (graduating 2024), 12. Luca Gazdag (graduating 2024), 13. Johan Wind (graduating 2025), 14. Emil Haugen (graduating 2027), 15. George Coote (graduating 2027).
Post-docs: 1. Jonathan Ben-Artzi (2011-2014, PhD: Brown University), 2. Bogdan Roman (2013-2016, 2016-2019, PhD: University of Cambridge), 3. Priscilla Canizares (2015-2016, PhD: Autonomous University of Barcelona), 4. Milana Gataric (2015-2016, PhD: University of Cambridge), 5. Francesco Renna (2016-2018, PhD: University of Padova), 6. Alexander Bastounis (2019-2021, PhD: University of Cambridge), 7. Vegard Antun (2020-, PhD: University of Oslo), 8. Alexei Stepanenko (2022-2023, PhD: Cardiff University).
Teaching
NST Part IA Mathematical Methods I - Course A.
Part III course on Compressed Sensing.
Editor
Proceedings of the Royal Society Series A (2014-2020)
Networks & Heterogeneous Media (2021- )
SIAM Journal on Imaging Sciences (2022- )
BIT Numerical Mathematics (2023- )
Books
- B. Adcock, A. C. Hansen, Compressive Imaging: Structure, Sampling, Learning,
Cambridge University Press (2021)
Selected Papers
- A. C. Hansen, On the Solvability Complexity Index, the n-Pseudospectrum and Approximations of Spectra of Operators, J. Amer. Math. Soc. 24, no. 1, 81-124
- J. Ben-Artzi, M. Colbrook, A. C. Hansen, O. Nevanlinna, M. Seidel, Computing spectra - On the Solvability Complexity Index hierarchy and towers of algorithms.
- A. Bastounis, A. C. Hansen, V. Vlacic, The extended Smale's 9th problem.
- V. Antun, F. Renna, C. Poon, B. Adcock, A. C. Hansen, On instabilities of deep learning in image reconstruction and the potential costs of AI, Proc. Natl. Acad. Sci. 2020, no. 5, 201907377
- M. Colbrook, V. Antun, A. C. Hansen, The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem. Proc. Natl. Acad. Sci. 2022 no. 119 (12) e2107151119
- B. Adcock, A. C. Hansen, C. Poon, B. Roman, Breaking the coherence barrier: A new theory for compressed sensing, Forum of Mathematics, Sigma 5(4):1-84
- M. Colbrook, A. C. Hansen, The foundations of spectral computations via the Solvability Complexity Index hierarchy, J. Eur. Math. Soc. 2022
- B. Adcock, A. C. Hansen, Generalized Sampling and Infinite Dimensional Compressed Sensing, Found. Comp. Math. 16, no. 5, 1263-1323
- M. Colbrook, B. Roman, A. C. Hansen, How to compute spectra with error control, Phys. Rev. Lett. 122, 250201 (front cover)
SIAM News
- V. Antun, M. Colbrook, A. C. Hansen, Proving existence is not enough: Mathematical paradoxes unravel the limits of neural networks in AI. SIAM News, 50, no. 8 May 2022 (front cover)
- A. Bastounis, A. C. Hansen, D. Higham, I. Tyukin, V. Vlacic, Deep Learning: What Could Go Wrong? SIAM News, 54, no. 8 October 2021 (front cover)
- V. Antun, N. Gottschling, A. C. Hansen, B. Adcock, Deep Learning in Scientific Computing: Understanding the Instability Mystery. SIAM News, 54, no. 2 March 2021
- A. Bastounis, B. Adcock, A. C. Hansen, From Global to Local: Getting More from Compressed Sensing, SIAM News, 50, no. 8 October 2017 (front cover)
Papers in Chronological Order
- A. Bastounis, P. Campodonico, M. van der Schaar, B. Adcock, A. C. Hansen, On the consistent reasoning paradox of intelligence and optimal trust in AI: The power of 'I don't know'.
- A. Bastounis, F. Cucker, A. C. Hansen, When can you trust feature selection? -- I: A condition-based analysis of LASSO and generalised hardness of approximation.
- A. Bastounis, F. Cucker, A. C. Hansen, When can you trust feature selection? -- II: On the effects of random data on condition in statistics and optimisation.
- N. Gottschling, P. Campodonico, V. Antun, A. C. Hansen, On the existence of optimal multi-valued decoders and their accuracy bounds for undersampled inverse problems.
- J. S. Wind, V. Antun, A. C. Hansen, Implicit regularization in AI meets generalized hardness of approximation in optimization -- Sharp results for diagonal linear networks.
- J. Ben-Artzi, M. Colbrook, A. C. Hansen, O. Nevanlinna, M. Seidel, Computing spectra - On the Solvability Complexity Index hierarchy and towers of algorithms.
- A. Bastounis, A. C. Hansen, V. Vlacic, The extended Smale's 9th problem -- On computational barriers and paradoxes in estimation, regularisation, computer-assisted proofs and learning.
- S. Becker, A. C. Hansen, Computing solutions of Schrodinger equations on unbounded domains - On the brink of numerical algorithms.
- A. Bastounis, A. C. Hansen, V. Vlacic, The mathematics of adversarial attacks in AI -- Why deep learning is unstable despite the existence of stable neural networks. Eur. Jour. Appl. Math.
- N. Gottschling, V. Antun, B. Adcock, A. C. Hansen, The troublesome kernel -- On hallucinations, no free lunches and the accuracy-stability trade-off in inverse problems. SIAM Review (To appear).
- Z. Liu, A. C. Hansen, Do stable neural networks exist for classification problems? -- A new view on stability in AI. Eur. Jour. Appl. Math.
- L. Gazdag, A. C. Hansen, Generalised hardness of approximation and the SCI hierarchy - On determining the boundaries of training algorithms in AI. Found. Comp. Math.
- A. Bastounis, A. N. Gorban, A. C. Hansen, D. J. Higham, D. Prokhorov, O. Sutton, I. Y. Tyukin, Q. Zhou, The Boundaries of Verifiable Accuracy, Robustness, and Generalisation in Deep Learning. Artificial Neural Networks and Machine Learning -- ICANN 2023 (Springer)
- L. Thesing, A. C. Hansen, Which neural networks can be computed by an algorithm? -- Generalised hardness of approximation meets Deep Learning. Proc. Appl. Math. Mech. 2022;22:1 e202200174.
- V. Antun, M. Colbrook, A. C. Hansen, Proving existence is not enough: Mathematical paradoxes unravel the limits of neural networks in AI. SIAM News, 50, no. 8 May 2022
- V. Antun, M. Colbrook, A. C. Hansen, The difficulty of computing stable and accurate neural networks: On the barriers of deep learning and Smale's 18th problem. Proc. Natl. Acad. Sci. 2022 no. 119 (12) e2107151119
- T. Loss, M. Colbrook, A. C. Hansen, Stratified Sampling Based Compressed Sensing for Structured Signals. IEEE Trans. Signal Process, vol. 70, pp. 3530-3539, (2022)
- M. Colbrook, A. C. Hansen, The foundations of spectral computations via the Solvability Complexity Index hierarchy. J. Eur. Math. Soc. (2022)
- A. Bastounis, A. C. Hansen, D. Higham, I. Tyukin, V. Vlacic, Deep Learning: What Could Go Wrong? SIAM News, 54, no. 8 October 2021
- L. Thesing, V. Antun, A. C. Hansen, What do AI algorithms actually learn - On false structures in deep learning.
- V. Antun, N. Gottschling, A. C. Hansen, B. Adcock, Deep Learning in Scientific Computing: Understanding the Instability Mystery. SIAM News, 54, no. 2 March 2021
- L. Thesing, A. C. Hansen, Non-uniform recovery guarantees for binary measurements and infinite-dimensional compressed sensing. J. Fourier Anal. Appl. 27, 14 (2021)
- B. Adcock, V. Antun, A. C. Hansen, Uniform recovery in infinite-dimensional compressed sensing and applications to structured binary sampling. Appl. Comput. Harmon. Anal. Volume 55:1-40 (2021)
- V. Antun, F. Renna, C. Poon, B. Adcock, A. C. Hansen, On instabilities of deep learning in image reconstruction and the potential costs of AI, Proc. Natl. Acad. Sci. 2020, no. 5, 201907377
- J. Schoormans, G. J. Strijkers, A. C. Hansen, A. J. Nederveen, B. F. Coolen, Compressed Sensing MRI with Variable Density Averaging (CS-VDA) Outperforms Full Sampling at Low SNR. Phys. Med. Biol. 65 045004 (2020)
- M. Colbrook, B. Roman, A. C. Hansen, How to compute spectra with error control, Phys. Rev. Lett. 122, 250201
- A. C. Hansen, B. Roman, Structure and Optimisation in Computational Harmonic Analysis: On Key Aspects in Sparse Regularisation, Springer Optimization and Its Applications vol. 168: 125-172 (2021)
- M. Colbrook, A. C. Hansen, On the Infinite-dimensional QR Algorithm, Numerische Mathematik 143:17-83 (2019)
- R. Calderbank, A. C. Hansen, L. Thesing, B. Roman, On reconstructions from measurements with binary functions, Applied and Numerical Harmonic Analysis, Birkhauser 97-128 (2019)
- L. Thesing, A. C. Hansen, Linear reconstructions and the analysis of the stable sampling rate, Sampl. Theory Signal Image Process. 17:103-126 (2018)
- A. C. Hansen, L. Thesing, On the Stable Sampling rate for binary measurements and wavelet reconstruction, Appl. Comput. Harmon. Anal. 48(2): 630-654 (2020)
- A. Bastounis, B. Adcock, A. C. Hansen, From Global to Local: Getting More from Compressed Sensing, SIAM News, 50, no. 8 October 2017
- A. C. Hansen, L. Thesing, Sampling from binary measurements - On Reconstructions from Walsh coefficients, IEEE 2017 Int. Conf. on Samp. Theory and Appl. 256-260 (2017)
- A. Bastounis, A. C. Hansen, On the absence of uniform recovery in many real-world applications of compressed sensing and the RIP & nullspace property in levels. SIAM Jour. Imag. Scienc. 10(1):335-371
- B. Adcock, A. C. Hansen, C. Poon, B. Roman, Breaking the coherence barrier: A new theory for compressed sensing, Forum of Mathematics, Sigma 5(4):1-84
- A. C. Hansen, O. Nevanlinna, Complexity Issues in Computing Spectra, Pseudospectra and Resolvents, Banach Centre Pub. 112:171-194
- B. Adcock, M. Gataric, A. C. Hansen, Density theorems for nonuniform sampling of bandlimited functions using derivatives or bunched measurements, J. Fourier Anal. Appl. 23(6):1311-1347
- B. Adcock, A. C. Hansen, B. Roman, A note on compressed sensing of structured sparse wavelet coefficients from subsampled Fourier measurements, IEEE Signal Process. Lett. 23(5):732-736
- A. Jones, A. Tamtogl, I. Calvo-Almazan, A. C. Hansen, Continuous compressed sensing of inelastic and quasielastic Helium Atom Scattering spectra, Nature, Sci. Rep. 6, Art. num.: 27776
- A. Jones, A. Tamtogl, I. Calvo-Almazan, A. C. Hansen, Continuous compressed sensing of inelastic and quasielastic Helium Atom Scattering spectra (supplementary material), Nature, Sci. Rep. 6, Art. num.: 27776
- J. Ben-Artzi, A. C. Hansen, O. Nevanlinna, M. Seidel, New barriers in complexity theory: On the Solvability Complexity Index and towers of algorithms, C. R. Acad. Sci. Paris Sér. I Math. 353, no. 10, 931-936
- J. Ben-Artzi, A. C. Hansen, O. Nevanlinna, M. Seidel, The Solvability Complexity Index - Computer science and logic meet scientific computing.
- B. Adcock, M. Gataric, A. C. Hansen, Recovering piecewise smooth functions from nonuniform Fourier measurements, Springer Lect. Notes in Comp. Sci. and Eng. 2015
- A. Bastounis, A. C. Hansen, On random and deterministic compressed sensing and the Restricted Isometry Property in Levels, IEEE 2015 Int. Conf. on Samp. Theory and Appl.
- B. Adcock, A. C. Hansen, M. Gataric, Weighted frames of exponentials and stable recovery of multidimensional functions from nonuniform Fourier samples, Appl. Comput. Harmon. Anal. 42(3):508-535
- B. Adcock, M. Gataric, A. C. Hansen, Stable nonuniform sampling with weighted Fourier frames and recovery in arbitrary spaces, IEEE 2015 Int. Conf. on Samp. Theory and Appl.
- B. Adcock, A. C. Hansen, A. Jones, On asymptotic incoherence and its implications for compressed sensing for inverse problems, IEEE Trans. Inf. Theory, 62, no. 2, 1020-1032
- B. Roman, B. Adcock, A. C. Hansen, On asymptotic structure in compressed sensing.
- B. Adcock, G. Kutyniok, A. C. Hansen, J. Ma, Linear Stable Sampling Rate: Optimality of 2D Wavelet Reconstructions from Fourier Measurements, SIAM J. Math. Anal. 47(2), 1196-1233
- B. Adcock, A. C. Hansen, Generalized Sampling and Infinite Dimensional Compressed Sensing, Found. Comp. Math. 16, no. 5, 1263-1323
- B. Adcock, A. C. Hansen, B. Roman, The quest for optimal sampling: computationally efficient, structure-exploiting measurements for compressed sensing, Springer, 2015
- B. Roman, A. Bastounis, B. Adcock, A. C. Hansen, On fundamentals of models and sampling in compressed sensing.
- A. Jones, B. Adcock, A. C. Hansen, Analyzing the structure of multidimensional compressed sensing problems through coherence.
- B. Adcock, M. Gataric, A. C. Hansen, On stable reconstructions from univariate nonuniform Fourier measurements, SIAM Jour. Imag. Scienc. 7(3):1690-1723
- B. Adcock, A. C. Hansen, B. Roman, G. Teschke, Generalized sampling: stable reconstructions, inverse problems and compressed sensing over the continuum, Adv. in Imag. and Electr. Phys. vol 182, 187-279, Elsevier, 2014
- B. Adcock, A. C. Hansen, A. Shadrin, A stability barrier for reconstructions from Fourier samples, SIAM Jour. on Num. Anal. 52, no. 1, 125-139
- B. Adcock, A. C. Hansen, C. Poon, B. Roman, Breaking the coherence barrier: asymptotic incoherence and asymptotic sparsity in compressed sensing, Proc. of the 10th Int. Conf. on Samp. Theory and Appl., 2013
- B. Adcock, A. C. Hansen, C. Poon, Optimal wavelet reconstructions from Fourier samples via generalized sampling, Proc. of the 10th Int. Conf. on Samp. Theory and Appl., 2013
- B. Adcock, A. C. Hansen, C. Poon, Beyond Consistent Reconstructions: Optimality and Sharp Bounds for Generalized Sampling, and Application to the Uniform Resampling Problem, SIAM J. Math. Anal. 45, no. 5, 3132-3167
- B. Adcock, A. C. Hansen, C. Poon, On optimal wavelet reconstructions from Fourier samples: linearity and universality of the stable sampling rate, Appl. Comput. Harmon. Anal. 36, no. 3, 387-415
- B. Adcock, A. C. Hansen, Generalized sampling and the stable and accurate reconstruction of piecewise analytic functions from their Fourier coefficients, Math. Comp. 84, 237-270
- B. Adcock, A. C. Hansen, E. Herrholz, G. Teschke, Generalized Sampling: Extensions to Frames and Inverse and Ill-Posed Problems, Inverse Prob. 29, no 1, 015008
- B. Adcock, A. C. Hansen, Reduced Consistency Sampling in Hilbert Spaces, Proc. of the 9th Int. Conf. on Samp. Theory and Appl., 2011
- B. Adcock, A. C. Hansen, Stable reconstructions in Hilbert spaces and the resolution of the Gibbs phenomenon, Appl. Comput. Harmon. Anal. 32, no. 3, 357-388
- B. Adcock, A. C. Hansen, A Generalized Sampling Theorem for Stable Reconstructions in Arbitrary Bases, J. Fourier Anal. Appl. 18, no. 4, 685-716
- A. C. Hansen, A theoretical framework for backward error analysis on manifolds, J. Geom. Mech. 3, no. 1, 81-111
- A. C. Hansen, On the Solvability Complexity Index, the n-Pseudospectrum and Approximations of Spectra of Operators, J. Amer. Math. Soc. 24, no. 1, 81-124
- A. C. Hansen, J. Strain, On the order of deferred correction, Appl. Numer. Math. 61, no. 8, 961-973
- A. C. Hansen, Infinite dimensional numerical linear algebra; theory and applications, Proc. R. Soc. Lond. Ser. A. 466, no. 2124, 3539-3559
- A. C. Hansen, On the approximation of spectra of linear operators on Hilbert spaces, J. Funct. Anal. 254, no. 8, 2092-2126
- A. C. Hansen, J. Strain, Convergence theory for spectral deferred correction, Preprint, UC Berkeley
Previous Events
- Organizing the workshop Computational mathematics in computer assisted proofs (Sept 12-16, 2022) together with Charles Fefferman and Svetlana Jitomirskaya.
- Plenary speaker at Thirty years of Acta Numerica (26 June - 02 July 2022).
- Speaking at King's College London Mathematics Colloquium (12 May, 2022).
- Speaking (online) at University of Oxford Data Science Seminar (Feb. 21, 2022).
- Speaking (online) at IST (Jan. 20, 2022).
- Organizing the workshop Interpretability, safety, and security in AI (Dec 13-15, 2021) together with Rich Baraniuk , Miguel Rodrigues and Adrian Weller.
- Speaking (online) at the University of Leicester (Oct. 14, 2021).
- Speaking at EPFL (Sept. 21, 2021).
- Speaking (online) at the University of Chicago Mathematics Colloquium (April 7, 2021).
- Speaking (online) at the Cambridge Science Festival (March 29, 2021).
- Speaking (online) at the One World IMAGing and INvErse problems (IMAGINE) seminar (Feb 17, 2021).
- Speaking (online) at the Gran Sasso Science Institute Mathematics Colloquium (Jan 28, 2021).
- Speaking (online) at XAI: Explaining what goes on inside DNN/AI (Oct 20, 2020).
- Speaking (online) at the Max Planck Institute of Molecular Cell Biology and Genetics (Sept 24, 2020).
- Speaking (online) at the Mathematics of Machine Learning, LMS-Bath Symposium (Aug 6, 2020). Watch the talk.
- Speaking (online) at the One World Seminar Series on the Mathematics of Machine Learning (July 5, 2020). Watch the talk.
- Speaking at the University of Minnesota, Applied and Computational Math Colloquium (Feb. 3 2020)
- Invited speaker at Computational Harmonic Analysis and Data Science, Banff International Research Station (Nov 2019).
- Speaking at EPFL, Imaging in the Age of Machine Learning (Oct 25, 2019)
- Speaking at the University of Pittsburgh, Algebra-combinatorics-geometry seminar (Sept 26, 2019)
- Invited speaker at Workshop on Harmonic analysis and Machine Learning (Sept 2019).
- Invited speaker at Algorithms and Complexity for Continuous Problems, Dagstuhl (Aug 2019).
- Plenary speaker at National Academy of Sciences, Arthur M. Sackler Colloquium: The Science of Deep Learning, Washington D.C. (March 2019).
- Speaking at Imperial College/University College London, Numerical Analysis Seminar (Feb. 20 2019)
- Invited speaker at Variational methods and optimization in imaging, Institut Henri Poincaré (Feb. 2019).
- Speaking at Imperial College, Pure Analysis Seminar (Jan. 10 2019).
- Invited speaker at Analysis and Computation in High Dimensions, Hausdorff Institute (Oct. 2018).
- Invited speaker at Measuring the Complexity of Computational Content: From Combinatorial Problems to Analysis, Dagstuhl (Sept. 2018).
- Invited speaker at the Algebraic and geometric aspects of numerical methods for differential equations, Mittag-Leffler Institute (July 5 2018)
- Invited speaker at Isaac Newton Institute (May 24 2018)
- Speaking at the University of Oslo (May 14-16 2018, slides).
- Speaking at the University of Manchester (May 4 2018).
- Invited speaker at Banff Research Station (April 25 2018).
- Invited speaker at the Institut Henri Poincaré (Feb 12 2018).
- Organizing the program Approximation, sampling and compression in data science, Isaac Newton Institute (Jan-June 2019).
- Organizing the workshop Mathematics of data: Structured representations for sensing, approximation and learning, Alan Turing Institute (May 27-May 31, 2019).
- Speaking at LMU Munich (Jan 31, 2018).
- Organizing the workshop Inverse Problems Network Meeting 2, Isaac Newton Institute (Nov 23-Nov 24, 2017).
- Speaking at the University of Warwick (Nov 15, 2017).
- Invited speaker at Generative models, parameter learning and sparsity, Isaac Newton Institute (2017).
- Plenary speaker at the Fourteenth International Conference on Computability and Complexity in Analysis (2017).
- Plenary speaker at SPARS (2017).
- Plenary speaker at Structured Regularization for High-Dimensional Data Analysis, Institut Henri Poincaré (2017).
- Keynote speaker at FoCM: Approximation Theory Workshop (2017).
- Invited speaker at FoCM: Information-Based Complexity Workshop (2017).
- Invited speaker at Multiscale and High-Dimensional Problems, Oberwolfach (2017).
- Plenary speaker at The 14th International workshop on Quantum Chromodynamics (QCD) in extreme conditions (2016).
- Plenary speaker at Strobl16: Time-Frequency Analysis and Related Topics (2016).
- Plenary speaker at Computational and Analytic Problems in Spectral Theory (2016).
- Invited speaker at Low Complexity Models in Signal Processing, Hausdorff Institute (2016).
- Plenary speaker at The Bath/RAL Numerical Analysis Day (2015).
- Plenary speaker at UCL-Duke Workshop on Sensing and Analysis of High-Dimensional Data (2014).
- Plenary speaker at Pseudospectra of operators: spectral singularities, semiclassics, pencils and random matrices (2014).
- Invited speaker at FoCM: Real Number Complexity Workshop (2014).
- Plenary speaker at iTWIST'14 (2014).
- Plenary speaker at French-German Conference on Mathematical Image Analysis, Institut Henri Poincaré (2014).
- Invited speaker at The 5th International Conference on Computational Harmonic Analysis (2014).
- Invited speaker at Compressed sensing and its Applications (2013).
- Plenary speaker at Sparse Representation of Functions: Analytic and Computational Aspects (2012).
- Plenary speaker at Sparsity, Localization and Dictionary Learning (2012).
Thesis
A. C. Hansen, On the approximation of spectra of linear Hilbert space operators, PhD Thesis.
Student Awards
- Smith-Knight/Rayleigh-Knight Prize 2007, On the approximation of spectra and pseudospectra of linear operators on Hilbert spaces
- John Butcher Award 2007 (joint with T. Schmelzer (Oxford)), A theoretical framework for backward error analysis on manifolds.