The Computable Mathematics of Intelligence
Tomaso Poggio

Date: Wednesday, 17.04.2024 15:20-17:00 CET

Location: Building S1|15 Room 133

Abstract:

Is mathematics invented or discovered? Why does it have such unreasonable effectiveness in describing aspects of our physical world? Recent progress in formulating fundamental principles underlying the stunning success of deep learning (DL) sheds new light on these age-old questions. I will discuss in particular why deep networks seem to escape the curse of dimensionality. The answer lies in a key property of all functions that are efficiently Turing computable: they are compositionally sparse. This property enables the use of deep (and sparse) networks, the architecture powering deep learning and more recent systems such as LLMs.
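The compositional-sparsity idea can be illustrated with a toy example (a sketch of my own, not code from the talk or paper): a function of d = 8 variables built as a binary-tree composition of 2-ary constituent functions. Each constituent depends on only two inputs, so a deep, sparse network mirroring this tree needs to approximate only low-arity pieces, with complexity governed by the arity rather than by d. The constituent function `g` below is hypothetical, chosen only to make the structure concrete.

```python
def g(a, b):
    # Hypothetical 2-ary constituent function; any smooth
    # low-dimensional function would serve the same illustrative role.
    return a * b + a

def f(x):
    # f: R^8 -> R, a depth-3 binary-tree composition of 2-ary functions.
    # Every node of the tree depends on only 2 of its inputs, which is
    # the "compositionally sparse" structure discussed in the abstract.
    assert len(x) == 8
    level1 = [g(x[i], x[i + 1]) for i in range(0, 8, 2)]            # 4 values
    level2 = [g(level1[i], level1[i + 1]) for i in range(0, 4, 2)]  # 2 values
    return g(level2[0], level2[1])                                  # 1 value

print(f([1.0] * 8))  # -> 42.0
```

A dense function of 8 variables would in general require resources growing exponentially in 8 to approximate; here, the approximation problem decomposes into seven 2-dimensional subproblems, one per tree node.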

Relevant work:

Poggio, T., & Fraser, M. (2024, February 8). Compositional sparsity of learnable functions. MIT DSpace. https://dspace.mit.edu/handle/1721.1/153475