MScAC
by Yonatan Kahn (University of Toronto)
Neural networks are the backbone of artificial intelligence and machine learning systems. Despite the immense success of neural networks on a variety of real-world problems, the theory of deep (multi-layer) neural networks is still in its infancy. There are many tantalizing analogies between neural networks and situations we encounter in all branches of physics: the interactions of many entities that give rise to simple collective behaviour are strongly reminiscent of statistical mechanics and condensed matter physics, and the data structures encountered in physics may provide tractable models for how neural networks learn from complex real-world data. This talk will explore the perspective that physics can bring to understanding neural network architectures and algorithms.
This is a hybrid talk. To attend in person at the OPG building, or to attend online, please register. The registration links are in the talk announcement here: https://mscac.utoronto.ca/mscac-talks/