Machine Learning Workshop: go to https://meetings.aip.de/event/6/

Europe/Berlin
SH/Lecture Hall (AIP)
An der Sternwarte 16, 14482 Potsdam
Description

2-day Machine Learning course

Machine Learning on GPUs is currently powering AI's leap into real-life applications: autopilots, intelligent automated
assistants, real-time translation, image recognition, and data sequencing and clustering. With the unprecedented computing power
of GPUs, many automotive, robotics, and big-data companies are building products and services around a new class of intelligent
machines. This training course is intended for developers and scientists who want to rapidly integrate open-source AI technology
into new and existing software.


Hands-on Sessions:

All topics discussed will be accompanied by practical sessions, run either on the provided remote GPU server or on the
participant's local system.
All corresponding presentations will be available to attendees as printed handouts.

=============================

Costs will be shared by the participants.

Depending on the number of participants, the fee will be about 230 Euro per participant.

Timetable

Day 1

    • 10:00–11:15
      Introduction to Machine Learning: Lecture 1

      • The concepts of Neural Networks and Machine Learning. Weak and Strong AI. The artificial neuron model (see the sketch below).

      • Typical neural-network workflow: features, layers, tensors, gradients, loss function, backpropagation
      • Types of problems: Regression, Classification, Object detection, Segmentation, Super Resolution, Image generation
      • The feasibility of Machine Learning for science and research
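
      A minimal sketch of the artificial neuron model named above: the
      output is an activation function applied to a weighted sum of the
      inputs plus a bias. The input and weight values are purely illustrative.

        import numpy as np

        def neuron(x, w, b):
            z = np.dot(w, x) + b             # weighted sum of inputs plus bias
            return 1.0 / (1.0 + np.exp(-z))  # sigmoid activation

        x = np.array([0.5, -1.0, 2.0])       # input features
        w = np.array([0.8, 0.2, -0.5])       # weights (arbitrary here)
        print(neuron(x, w, b=0.1))           # a single activation in (0, 1)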

      • Advantages of Python and PyTorch for neural-network development
      • Tensor objects in Python and PyTorch, and interoperation with NumPy
      • Obtaining tensor gradients in PyTorch with Autograd (see the sketch below)
      • The structure of a neural network implementation in PyTorch: definition, loss function, backprop, weights update
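
      A short sketch of the tensor and Autograd basics listed above:
      zero-copy interoperation with NumPy, and obtaining gradients by
      calling backward() on a scalar expression.

        import numpy as np
        import torch

        a = np.arange(4.0)        # a NumPy array
        t = torch.from_numpy(a)   # viewed as a torch.Tensor without copying
        print((t * 2).numpy())    # and back to NumPy

        # Autograd: mark a tensor as requiring gradients, build a scalar,
        # and call backward() to populate .grad.
        x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
        y = (x ** 2).sum()        # y = x1^2 + x2^2 + x3^2
        y.backward()
        print(x.grad)             # dy/dx = 2x -> tensor([2., 4., 6.])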

      • Our first neural network in PyTorch, by the example of binary classification (see the sketch below)
      • Python environments: Linux and Windows, pip, Anaconda, Jupyter Notebook
      • Training and evaluation of a binary classifier in PyTorch: “Cat or Dog?”
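
      A minimal binary-classification training loop, following the
      definition / loss / backprop / update structure above. The course
      works with a "Cat or Dog?" image set; synthetic 2-D points stand in
      here so the sketch stays self-contained.

        import torch
        import torch.nn as nn

        torch.manual_seed(0)
        X = torch.randn(200, 2)                            # 200 samples, 2 features
        y = (X[:, 0] + X[:, 1] > 0).float().unsqueeze(1)   # separable labels

        model = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 1))
        loss_fn = nn.BCEWithLogitsLoss()   # sigmoid + binary cross entropy
        opt = torch.optim.SGD(model.parameters(), lr=0.1)

        for epoch in range(100):
            opt.zero_grad()                # clear old gradients
            loss = loss_fn(model(X), y)    # forward pass + loss
            loss.backward()                # backpropagation
            opt.step()                     # weight update

        acc = ((model(X) > 0).float() == y).float().mean()
        print(f"training accuracy: {acc:.2f}")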

      • PyTorch optimization algorithms in the context of neural network training
      • The generic structure of optimization algorithms: parameters, optimization step, function evaluation closure (see the sketch below)
      • The SGD algorithm explained
      • Convolutional neural networks for classification problems
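
      A sketch of the generic optimizer interface: a parameter list, a
      step, and an optional function-evaluation closure (required by
      optimizers such as LBFGS and accepted by SGD).

        import torch

        w = torch.tensor([3.0], requires_grad=True)
        opt = torch.optim.SGD([w], lr=0.1)

        def closure():
            opt.zero_grad()
            loss = (w - 1.0).pow(2).sum()   # a simple quadratic objective
            loss.backward()
            return loss

        for _ in range(50):
            opt.step(closure)               # re-evaluates the loss via the closure

        print(w)                            # converges toward 1.0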

      • Applying LeNet to handwritten digit recognition for ZIP codes
      • Deploying MobileNetV2 for ImageNet classification (see the sketch below)
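
      A sketch of loading a pretrained MobileNetV2 for ImageNet
      classification with torchvision (the weights download on first use;
      older torchvision versions use pretrained=True instead of the
      weights argument).

        import torch
        from torchvision import models

        model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
        model.eval()                        # inference mode

        x = torch.randn(1, 3, 224, 224)     # stand-in for a preprocessed image
        with torch.no_grad():
            logits = model(x)
        print(logits.argmax(dim=1))         # predicted ImageNet class index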

    • 11:15–11:30
      Coffee Break 15m
    • 11:30–12:30
      Introduction to Machine Learning: Lecture 2
    • 12:30–13:30
      Introduction to Machine Learning: Hands On Session
    • 13:30–14:30
      Lunch Break 1h
    • 14:30–15:30
      Introduction to Machine Learning: Lecture 3
    • 15:30–15:50
      Coffee Break 20m
    • 15:50–18:00
      Introduction to Machine Learning: Hands On Session

Day 2

    • 09:00–10:00
      Machine Learning on GPUs: Lecture 1

      • An overview of activation function types and their applications: ReLU, Sigmoid, Tanh
      • Cross entropy: an efficient loss beyond MSE
      • Softmax: the multiclass logistic function
      • Intersection-over-Union (IoU) metric (see the sketch below)
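
      A short sketch of the loss and metric pieces above: softmax over
      raw scores, cross entropy computed directly from logits, and
      Intersection-over-Union for two binary masks.

        import torch
        import torch.nn.functional as F

        logits = torch.tensor([[2.0, 0.5, -1.0]])   # raw scores for 3 classes
        print(F.softmax(logits, dim=1))             # multiclass logistic function
        target = torch.tensor([0])
        print(F.cross_entropy(logits, target))      # applies log-softmax internally

        # IoU = |A ∩ B| / |A ∪ B| for two binary masks
        a = torch.tensor([[1, 1, 0, 0]], dtype=torch.bool)
        b = torch.tensor([[0, 1, 1, 0]], dtype=torch.bool)
        print((a & b).sum() / (a | b).sum())        # 1/3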

      • Using PyTorch datasets to organize data for batch training (see the sketch below)
      • The U-Net architecture; semantic image segmentation
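
      A minimal custom torch.utils.data.Dataset plus a DataLoader for
      batch training; random tensors stand in for real images and
      segmentation masks.

        import torch
        from torch.utils.data import Dataset, DataLoader

        class ToyDataset(Dataset):
            def __init__(self, n=32):
                self.x = torch.randn(n, 3, 64, 64)           # fake images
                self.y = torch.randint(0, 2, (n, 64, 64))    # fake masks

            def __len__(self):
                return len(self.x)

            def __getitem__(self, i):
                return self.x[i], self.y[i]

        loader = DataLoader(ToyDataset(), batch_size=8, shuffle=True)
        for images, masks in loader:                 # batches of 8
            print(images.shape, masks.shape)
            break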

      • An overview of GPU performance in various applications
      • A brief comparison of different types of accelerators
      • Key programming principles for achieving high GPU performance in Machine Learning (see the sketch below)
      • Benchmarking CNNs on different GPUs: the cnn-benchmarks test suite
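
      A sketch of the basic device-handling pattern behind those
      principles: move the model and data to the GPU once, keep the work
      there, and synchronize before timing or reading results.

        import torch
        import torch.nn as nn

        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

        model = nn.Linear(1024, 1024).to(device)    # parameters live on the device
        x = torch.randn(256, 1024, device=device)   # allocate input there directly

        y = model(x)                                # runs on the GPU if available
        if device.type == "cuda":
            torch.cuda.synchronize()                # GPU work is asynchronous
        print(y.device)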

      • An overview of GPU-enabled libraries for feature extraction: torchvision.transforms, Pandas, OpenCV, cuBLAS, cuFFT (see the sketch below)
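
      A typical torchvision.transforms preprocessing pipeline, as used
      for the classification models above; the normalization constants
      are the standard ImageNet statistics.

        from torchvision import transforms

        preprocess = transforms.Compose([
            transforms.Resize(256),
            transforms.CenterCrop(224),
            transforms.ToTensor(),          # PIL image -> float tensor in [0, 1]
            transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                 std=[0.229, 0.224, 0.225]),
        ])
        # usage: tensor = preprocess(pil_image)   # pil_image: a PIL.Image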

      • Performance limitations of pure Python code
      • Implementing high-performance C++/Fortran/OpenCL extensions to PyTorch with pybind11 (see the sketch below)
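
      A minimal sketch of a C++ extension built from Python with
      torch.utils.cpp_extension.load_inline, which generates the pybind11
      bindings automatically. It assumes a C++ toolchain is installed; the
      function name scale_add is hypothetical.

        import torch
        from torch.utils.cpp_extension import load_inline

        cpp_source = """
        torch::Tensor scale_add(torch::Tensor a, torch::Tensor b, double alpha) {
            return a + alpha * b;   // runs as native code
        }
        """

        ext = load_inline(name="scale_add_ext",
                          cpp_sources=cpp_source,
                          functions=["scale_add"])

        x, y = torch.ones(3), torch.arange(3.0)
        print(ext.scale_add(x, y, 0.5))   # tensor([1.0, 1.5, 2.0])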

      • Analyzing GPU efficiency for training and inference, using the example of VoiceLoop, a neural text-to-speech system

    • 10:00–10:30
      Coffee Break 30m
    • 10:30–11:30
      Machine Learning on GPUs: Hands On Session
    • 11:30–12:30
      Machine Learning on GPUs: Lecture 2
    • 12:30–13:30
      Lunch Break 1h
    • 13:30–14:30
      Machine Learning on GPUs: Lecture 3
    • 14:30–15:30
      Machine Learning on GPUs: Lecture 4
    • 15:30–15:50
      Coffee Break 20m
    • 15:50–17:00
      Machine Learning on GPUs: Hands On Session