Machine Learning



“Everything you wanted to know (and more) about PyTorch tensors”

Webinar (2022-Jan-19) by Marie-Hélène Burle

Python already has several multidimensional array structures – the most popular being NumPy’s ndarray – but the particularities of deep learning call for special capabilities: the ability to run operations on GPUs and/or in a distributed fashion, as well as the ability to keep track of computation graphs for automatic differentiation.

PyTorch tensors provide these capabilities and much more; they can easily be converted to/from NumPy’s ndarray and integrate well with other Python libraries such as Pandas.
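A minimal sketch of the three features mentioned above (NumPy interoperability, device placement, and autograd), assuming only that PyTorch and NumPy are installed:

```python
import numpy as np
import torch

# Create a tensor from a NumPy ndarray (on CPU the two share memory)
a = np.array([[1.0, 2.0], [3.0, 4.0]])
t = torch.from_numpy(a)

# Convert back to a NumPy ndarray
b = t.numpy()

# Move the tensor to a GPU if one is available, otherwise stay on CPU
device = "cuda" if torch.cuda.is_available() else "cpu"
t = t.to(device)

# Track the computation graph for automatic differentiation
x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()  # y = x1^2 + x2^2
y.backward()        # populates x.grad with dy/dx = 2x
print(x.grad)       # tensor([4., 6.])
```

Note that `torch.from_numpy` does not copy the data: mutating the ndarray also mutates the tensor, which is part of what makes round-tripping with NumPy cheap.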

“Upscaling with PyTorch”

Webinar (2021-Nov-24) by Marie-Hélène Burle

Super-resolution (the process of recreating high-resolution images from low-resolution ones) is an old field, but deep neural networks have driven a surge of new and very impressive methods over the past ten years, from SRCNN to SRGAN to Transformers. This webinar provides a quick overview of these methods and shows how the latest state-of-the-art model, SwinIR, performs on a few test images, using PyTorch as our framework.
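As an illustrative sketch (not the webinar's SwinIR code), the snippet below shows the classical bicubic baseline that super-resolution networks are judged against, followed by an SRCNN-style network in the spirit of the original three-layer design; the 9-1-5 kernel sizes and 64/32 channel widths follow the SRCNN paper, while the input here is just random data standing in for a real image:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-in low-resolution RGB image: batch of 1, 3 channels, 32x32 pixels
lr = torch.rand(1, 3, 32, 32)

# Classical bicubic 4x upscaling, the baseline SR models are compared against
upscaled = F.interpolate(lr, scale_factor=4, mode="bicubic", align_corners=False)

# SRCNN-style network: three conv layers applied to the bicubic-upscaled input
srcnn = nn.Sequential(
    nn.Conv2d(3, 64, kernel_size=9, padding=4),   # patch extraction
    nn.ReLU(),
    nn.Conv2d(64, 32, kernel_size=1),             # non-linear mapping
    nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=5, padding=2),   # reconstruction
)
sr = srcnn(upscaled)
print(sr.shape)  # torch.Size([1, 3, 128, 128]) -- 4x the input resolution
```

Later models such as SRGAN and SwinIR replace this small convolutional stack with adversarial training and Transformer blocks respectively, but the input/output contract (low-resolution tensor in, higher-resolution tensor out) stays the same.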

“Introduction to deep learning with fastai”

Webinar (2021-Apr-14) by Marie-Hélène Burle

fastai is a deep learning library with high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains. It also provides researchers with low-level components that can be mixed and matched to build new approaches.

fastai aims to do both things without substantial compromises in ease of use, flexibility, or performance. This is possible thanks to a carefully layered architecture, which expresses common underlying patterns of many deep learning and data processing techniques in terms of decoupled abstractions.

This webinar takes a closer look at the features and functionality of fastai.

“Machine learning in Julia with Flux”

Webinar (2020-May-13) by Marie-Hélène Burle