
Machine Learning, Deep Learning and Bayesian Learning updated 7/2022

Machine Learning, Deep Learning And Bayesian Learning

Last updated 7/2022
MP4 | Video: H.264, 1280×720 | Audio: AAC, 44.1 kHz
Language: English | Size: 5.84 GB | Duration: 17h 32m

Learn Machine Learning, Deep Learning, Bayesian Learning and Model Deployment in Python.

What you’ll learn
Deep Learning with Tensorflow!!!
Deep Learning with PyTorch!!! Yes both Tensorflow + PyTorch!
Bayesian learning with PyMC3
Data Analysis with Pandas
Algorithms from scratch using Numpy
Using Scikit-learn to its full effect
Model Deployment
Model Diagnostics
Natural Language Processing
Unsupervised Learning
Natural Language Processing with Spacy
Time series modelling with FB Prophet
Python
Requirements
Willingness to learn
Description
This is a course on Machine Learning, Deep Learning (TensorFlow + PyTorch) and Bayesian Learning (yes, all three topics in one place!). Yes, BOTH PyTorch and TensorFlow for Deep Learning.

We start off by analysing data with Pandas and implementing some algorithms from scratch using NumPy. These algorithms include linear regression, Classification and Regression Trees (CART), Random Forest and Gradient Boosted Trees.

The Deep Learning lessons begin with TensorFlow, covering Feed Forward Networks, Convolutional Neural Nets (CNNs) and Recurrent Neural Nets (RNNs). For the more advanced Deep Learning lessons we use PyTorch with PyTorch Lightning.

We focus on both the programming and the mathematical/statistical aspects of the course. This is to ensure that you are ready for theoretical questions at interviews, while also being able to put Machine Learning into solid practice.

Other key areas of Machine Learning that we discuss include unsupervised learning, time series analysis and Natural Language Processing. Scikit-learn is an essential tool used throughout the entire course.

We spend quite a bit of time on feature engineering and on making sure our models don't overfit. Diagnosing Machine Learning (and Deep Learning) models, by splitting data into training and testing sets and looking at the correct metric, can make a world of difference.

I would also like to highlight that we cover Machine Learning deployment, a topic that is rarely talked about: the key to being a good data scientist is having a model that doesn't decay in production.

I hope you enjoy this course, and please don't hesitate to contact me for further information.

Overview

Section 1: Introduction

Lecture 1 Introduction

Lecture 2 How to tackle this course

Lecture 3 Installations and sign ups

Lecture 4 Jupyter Notebooks

Lecture 5 Course Material

Section 2: Basic python + Pandas + Plotting

Lecture 6 Intro

Lecture 7 Basic Data Structures

Lecture 8 Dictionaries

Lecture 9 Python functions (methods)

Lecture 10 Numpy functions

Lecture 11 Conditional statements

Lecture 12 For loops

Lecture 13 Dictionaries again

Lecture 14 ——————————– Pandas ——————————–

Lecture 15 Intro

Lecture 16 Pandas simple functions

Lecture 17 Pandas: Subsetting

Lecture 18 Pandas: loc and iloc

Lecture 19 Pandas: loc and iloc 2

Lecture 20 Pandas: map and apply

Lecture 21 Pandas: groupby

Lecture 22 —– Plotting ——–

Lecture 23 Plotting resources (notebooks)

Lecture 24 Line plot

Lecture 25 Plot multiple lines

Lecture 26 Histograms

Lecture 27 Scatter Plots

Lecture 28 Subplots

Lecture 29 Seaborn + pair plots
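
The Pandas lectures above build toward subsetting with loc/iloc, map/apply and groupby, followed by basic plots. A minimal sketch of that workflow on a made-up DataFrame (column names and values are illustrative, not from the course notebooks):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Illustrative data: daily sales per store (not from the course material)
df = pd.DataFrame({
    "store": ["A", "A", "B", "B", "A", "B"],
    "day":   [1, 2, 1, 2, 3, 3],
    "sales": [100, 120, 90, 95, 130, 110],
})

# loc / iloc: positional vs label/boolean indexing
first_row = df.iloc[0]
store_a = df.loc[df["store"] == "A", ["day", "sales"]]

# groupby + aggregation
totals = df.groupby("store")["sales"].sum()
print(totals)

# One line per store
df.pivot(index="day", columns="store", values="sales").plot()
plt.xlabel("day")
plt.ylabel("sales")
plt.show()
```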

Section 3: Machine Learning: Numpy + Scikit Learn

Lecture 30 Your reviews are important to me!

Lecture 31 ———– Numpy ————-

Lecture 32 Gradient Descent

Lecture 33 Kmeans part 1

Lecture 34 Kmeans part 2

Lecture 35 Broadcasting

Lecture 36 —————- Scikit Learn ————————————-

Lecture 37 Intro

Lecture 38 Linear Regression Part 1

Lecture 39 Linear Regression Part 2

Lecture 40 Classification and Regression Trees

Lecture 41 CART part 2

Lecture 42 Random Forest theory

Lecture 43 Random Forest Code

Lecture 44 Gradient Boosted Machines
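
Section 3 starts by coding gradient descent from scratch with NumPy before moving on to scikit-learn models. A minimal sketch of batch gradient descent for linear regression on synthetic data (the learning rate and iteration count are arbitrary choices, not the course's):

```python
import numpy as np

# Synthetic data: y = 3x + 2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=200)

# Add a bias column so the intercept is learned as a weight
Xb = np.c_[np.ones(len(X)), X]
w = np.zeros(2)
lr = 0.5

for _ in range(2000):
    preds = Xb @ w
    grad = 2 * Xb.T @ (preds - y) / len(y)   # gradient of the mean squared error
    w -= lr * grad

print(w)  # should end up close to [2, 3]
```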

Section 4: Machine Learning: Classification + Time Series + Model Diagnostics

Lecture 45 Kaggle part 1

Lecture 46 Kaggle part 2

Lecture 47 Theory part 1

Lecture 48 Theory part 2 + code

Lecture 49 Titanic dataset

Lecture 50 Sklearn classification prelude

Lecture 51 Sklearn classification

Lecture 52 Dealing with missing values

Lecture 53 ——— Time Series ——————-

Lecture 54 Intro

Lecture 55 Loss functions

Lecture 56 FB Prophet part 1

Lecture 57 FB Prophet part 2

Lecture 58 Theory behind FB Prophet

Lecture 59 ———— Model Diagnostics —–

Lecture 60 Overfitting

Lecture 61 Cross Validation

Lecture 62 Stratified K Fold

Lecture 63 Area Under Curve (AUC) Part 1

Lecture 64 Area Under Curve (AUC) Part 2
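
The Model Diagnostics lectures cover overfitting, cross validation, Stratified K Fold and AUC. A short scikit-learn sketch of stratified k-fold evaluation with ROC AUC on a toy imbalanced dataset (the dataset and fold count are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score

# Imbalanced toy problem: roughly 80/20 class split
X, y = make_classification(n_samples=500, weights=[0.8, 0.2], random_state=0)

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for fold, (train_idx, test_idx) in enumerate(skf.split(X, y)):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    # AUC is computed from predicted probabilities of the positive class
    probs = model.predict_proba(X[test_idx])[:, 1]
    print(f"fold {fold}: AUC = {roc_auc_score(y[test_idx], probs):.3f}")
```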

Section 5: Unsupervised Learning

Lecture 65 Principal Component Analysis (PCA) theory

Lecture 66 Fashion MNIST PCA

Lecture 67 K-means

Lecture 68 Other clustering methods

Lecture 69 DBSCAN theory

Lecture 70 Gaussian Mixture Models (GMM) theory
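
A minimal sketch of the dimensionality-reduction-then-clustering pattern covered here, using scikit-learn's bundled digits dataset as a stand-in for Fashion MNIST:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X, _ = load_digits(return_X_y=True)

# Project the 64-dimensional images down to 2 principal components
pca = PCA(n_components=2)
X2 = pca.fit_transform(X)
print("explained variance ratio:", pca.explained_variance_ratio_)

# Cluster in the reduced space
labels = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X2)
print(labels[:20])
```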

Section 6: Natural Language Processing + Regularization

Lecture 71 Intro

Lecture 72 Stop words and Term Frequency

Lecture 73 Term Frequency – Inverse Document Frequency (Tf – Idf) theory

Lecture 74 Financial News Sentiment Classifier

Lecture 75 NLTK + Stemming

Lecture 76 N-grams

Lecture 77 Word (feature) importance

Lecture 78 Spacy intro

Lecture 79 Feature Extraction with Spacy (using Pandas)

Lecture 80 Classification Example

Lecture 81 Over-sampling

Lecture 82 ——– Regularization ————

Lecture 83 Introduction

Lecture 84 MSE recap

Lecture 85 L2 Loss / Ridge Regression intro

Lecture 86 Ridge regression (L2 penalised regression)

Lecture 87 S&P500 data preparation for L1 loss

Lecture 88 L1 Penalised Regression (Lasso)

Lecture 89 L1/ L2 Penalty theory: why it works
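
A compact sketch of the Tf-Idf plus linear classifier pipeline this section builds toward, on toy sentences rather than the financial news dataset used in the course; note that the C parameter of logistic regression is the inverse of an L2 penalty strength, which ties into the regularization lectures:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy sentiment data (illustrative, not the course's financial-news dataset)
texts = ["profits surged this quarter", "revenue beat expectations",
         "the company reported heavy losses", "shares plunged after the warning"]
labels = [1, 1, 0, 0]

# Tf-Idf features (with stop-word removal and bigrams) feeding a logistic regression
clf = make_pipeline(TfidfVectorizer(stop_words="english", ngram_range=(1, 2)),
                    LogisticRegression(C=1.0))
clf.fit(texts, labels)
print(clf.predict(["the company beat revenue expectations"]))
```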

Section 7: Deep Learning

Lecture 90 Intro

Lecture 91 DL theory part 1

Lecture 92 DL theory part 2

Lecture 93 Tensorflow + Keras demo problem 1

Lecture 94 Activation functions

Lecture 95 First example with Relu

Lecture 96 MNIST and Softmax

Lecture 97 Deep Learning Input Normalisation

Lecture 98 Softmax theory

Lecture 99 Batch Norm

Lecture 100 Batch Norm Theory
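
A minimal TensorFlow/Keras sketch of the kind of feed-forward network with a ReLU hidden layer, batch normalisation and a softmax output that these lectures build on MNIST (layer sizes and epoch count are arbitrary):

```python
import tensorflow as tf

# MNIST digits, scaled to [0, 1] as a simple input normalisation
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))
```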

Section 8: Deep Learning (TensorFlow) – Convolutional Neural Nets

Lecture 101 Intro

Lecture 102 Fashion MNIST feed forward net for benchmarking

Lecture 103 Keras Conv2D layer

Lecture 104 Model fitting and discussion of results

Lecture 105 Dropout theory and code

Lecture 106 MaxPool (and comparison to stride)

Lecture 107 Cifar-10

Lecture 108 Nose Tip detection with CNNs
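
A small Keras sketch of a Conv2D / MaxPool / Dropout stack of the sort used for Fashion MNIST and CIFAR-10 in this section (filter counts and the 28×28×1 input are illustrative):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),                       # single-channel images
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),                    # downsample feature maps
    tf.keras.layers.Conv2D(64, (3, 3), activation="relu"),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dropout(0.3),                            # regularisation
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```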

Section 9: Deep Learning: Recurrent Neural Nets

Lecture 109 Word2vec and Embeddings

Lecture 110 Kaggle + Word2Vec

Lecture 111 Word2Vec: keras Model API

Lecture 112 Recurrent Neural Nets – Theory

Lecture 113 Deep Learning – Long Short Term Memory (LSTM) Nets

Lecture 114 Deep Learning – Stacking LSTMs + GRUs

Lecture 115 Transfer Learning – GLOVE vectors

Lecture 116 Sequence to Sequence Introduction + Data Prep

Lecture 117 Sequence to Sequence model + Keras Model API

Lecture 118 Sequence to Sequence models: Prediction step
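
A minimal Keras sketch of an embedding layer feeding stacked recurrent layers (an LSTM followed by a GRU), as in the stacking lectures; the vocabulary size and sequence length are placeholders:

```python
import tensorflow as tf

vocab_size, seq_len = 10_000, 100   # illustrative sizes

model = tf.keras.Sequential([
    tf.keras.Input(shape=(seq_len,)),
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.LSTM(64, return_sequences=True),  # stacked RNNs need full sequences
    tf.keras.layers.GRU(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # binary classification head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```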

Section 10: Deep Learning: PyTorch Introduction

Lecture 119 Notebooks

Lecture 120 Introduction

Lecture 121 Pytorch: TensorDataset

Lecture 122 Pytorch: Dataset and DataLoaders

Lecture 123 Deep Learning with PyTorch: nn.Sequential models

Lecture 124 Deep Learning with Pytorch: Loss functions

Lecture 125 Deep Learning with Pytorch: Stochastic Gradient Descent

Lecture 126 Deep Learning with Pytorch: Optimizers

Lecture 127 Pytorch Model API

Lecture 128 Pytorch in GPUs

Lecture 129 Deep Learning: Intro to Pytorch Lightning
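
A minimal PyTorch sketch tying together TensorDataset, DataLoader, an nn.Sequential model, an MSE loss and SGD, the pieces this section introduces (the synthetic data and hyperparameters are illustrative):

```python
import torch
from torch import nn
from torch.utils.data import TensorDataset, DataLoader

# Synthetic regression data (illustrative)
X = torch.randn(256, 3)
y = X @ torch.tensor([1.0, -2.0, 0.5]) + 0.1 * torch.randn(256)

loader = DataLoader(TensorDataset(X, y.unsqueeze(1)), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Linear(3, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)

for epoch in range(20):
    for xb, yb in loader:
        optimizer.zero_grad()          # reset gradients from the previous step
        loss = loss_fn(model(xb), yb)
        loss.backward()                # backpropagate
        optimizer.step()               # update weights
print(loss.item())
```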

Section 11: Deep Learning: Transfer Learning with PyTorch Lightning

Lecture 130 Notebooks

Lecture 131 Transfer Learning Introduction

Lecture 132 Kaggle problem description

Lecture 133 PyTorch datasets + Torchvision

Lecture 134 PyTorch transfer learning with ResNet

Lecture 135 PyTorch Lightning Model

Lecture 136 PyTorch Lightning Trainer + Model evaluation

Lecture 137 Deep Learning for Cassava Leaf Classification

Lecture 138 Cassava Leaf Dataset

Lecture 139 Data Augmentation with Torchvision Transforms

Lecture 140 Train vs Test Augmentations + DataLoader parameters

Lecture 141 Deep Learning: Transfer Learning Model with ResNet

Lecture 142 Setting up PyTorch Lightning for training

Lecture 143 Cross Entropy Loss for Imbalanced Classes

Lecture 144 PyTorch Test dataset setup and evaluation

Lecture 145 WandB for logging experiments
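
A sketch of the transfer-learning pattern this section walks through: a pretrained torchvision ResNet with a new classification head, wrapped in a PyTorch Lightning module. The class count, learning rate and the (undefined) dataloader are placeholders, and older torchvision versions use pretrained=True instead of the weights argument:

```python
import torch
from torch import nn
from torchvision import models
import pytorch_lightning as pl

class TransferClassifier(pl.LightningModule):
    """Hypothetical example: frozen ResNet backbone with a new head."""
    def __init__(self, n_classes: int = 5, lr: float = 1e-3):
        super().__init__()
        self.backbone = models.resnet18(weights="DEFAULT")  # ImageNet weights (torchvision >= 0.13)
        for p in self.backbone.parameters():                 # freeze the backbone
            p.requires_grad = False
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, n_classes)
        self.loss_fn = nn.CrossEntropyLoss()
        self.lr = lr

    def forward(self, x):
        return self.backbone(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = self.loss_fn(self(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        # Only the new head is trained
        return torch.optim.Adam(self.backbone.fc.parameters(), lr=self.lr)

# trainer = pl.Trainer(max_epochs=3)
# trainer.fit(TransferClassifier(), train_dataloaders=some_dataloader)  # some_dataloader is a placeholder
```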

Section 12: Pixel Level Segmentation (Semantic Segmentation) with PyTorch

Lecture 146 Notebooks

Lecture 147 Introduction

Lecture 148 Coco Dataset + Augmentations for Segmentation with Torchvision

Lecture 149 Unet Architecture overview

Lecture 150 PyTorch Model Architecture

Lecture 151 PyTorch Hooks

Lecture 152 PyTorch Hooks: Step through with breakpoints

Lecture 153 PyTorch Weighted CrossEntropy Loss

Lecture 154 Weights and Biases: Logging images.

Lecture 155 Semantic Segmentation training with PyTorch Lightning
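
PyTorch forward hooks, covered in Lectures 151-152, let you capture intermediate activations without changing a model's forward pass, which is how U-Net style architectures can expose encoder features. A tiny illustration on a stand-in convolutional stack:

```python
import torch
from torch import nn

# Stand-in model, not the course's U-Net
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(8, 16, 3, padding=1),
)

activations = {}

def save_activation(name):
    def hook(module, inputs, output):
        activations[name] = output.detach()   # store the layer's output
    return hook

# Attach the hook to the first conv layer
model[0].register_forward_hook(save_activation("conv1"))

_ = model(torch.randn(1, 3, 64, 64))
print(activations["conv1"].shape)  # torch.Size([1, 8, 64, 64])
```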

Section 13: Deep Learning: Transformers and BERT

Lecture 156 Resources

Lecture 157 Introduction to Transformers

Lecture 158 The illustrated Transformer (blogpost by Jay Alammar)

Lecture 159 Encoder Transformer Models: The Maths

Lecture 160 BERT – The theory

Lecture 161 Kaggle Multi-lingual Toxic Comment Classification Challenge

Lecture 162 Tokenizers and data prep for BERT models

Lecture 163 Distilbert (Smaller BERT) model

Lecture 164 Pytorch Lightning + DistilBERT for classification
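
A minimal Hugging Face transformers sketch of tokenizing text and scoring it with a DistilBERT classification head, as in the data-prep and DistilBERT lectures (the checkpoint is the standard distilbert-base-uncased; the two-label setup and example sentences are illustrative, and the head is untrained here):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

texts = ["this comment is perfectly polite", "this comment is toxic"]
# Pad/truncate to a fixed length and return PyTorch tensors
batch = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")

with torch.no_grad():
    logits = model(**batch).logits
print(logits.softmax(dim=-1))
```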

Section 14: Bayesian Learning and probabilistic programming

Lecture 165 Introduction and Terminology

Lecture 166 Bayesian Learning: Distributions

Lecture 167 Bayes rule for population mean estimation

Lecture 168 Bayesian learning: Population estimation pymc3 way

Lecture 169 Coin Toss Example with Pymc3

Lecture 170 Data Setup for Bayesian Linear Regression

Lecture 171 Bayesian Linear Regression with pymc3

Lecture 172 Bayesian Rolling Regression – Problem setup

Lecture 173 Bayesian Rolling regression – pymc3 way

Lecture 174 Bayesian Rolling Regression – forecasting

Lecture 175 Variational Bayes Intro

Lecture 176 Variational Bayes: Linear Classification

Lecture 177 Variational Bayesian Inference: Result Analysis

Lecture 178 Minibatch Variational Bayes

Lecture 179 Deep Bayesian Networks

Lecture 180 Deep Bayesian Networks – analysis
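
A minimal PyMC3 sketch of the coin-toss example described in Lecture 169: a Beta prior on the coin's bias, a Bernoulli likelihood and MCMC sampling (the data and sample counts are illustrative, assuming PyMC3 3.x as used in the course):

```python
import numpy as np
import pymc3 as pm

# 100 flips of a biased coin (illustrative data)
data = np.random.binomial(n=1, p=0.7, size=100)

with pm.Model():
    p = pm.Beta("p", alpha=1, beta=1)            # uniform prior on the bias
    pm.Bernoulli("obs", p=p, observed=data)      # likelihood
    trace = pm.sample(1000, tune=1000, return_inferencedata=True)

print(trace.posterior["p"].mean())               # posterior mean of the bias
```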

Section 15: Model Deployment

Lecture 181 Intro

Lecture 182 Saving Models

Lecture 183 FastAPI intro

Lecture 184 FastAPI serving model

Lecture 185 Streamlit Intro

Lecture 186 Streamlit functions

Lecture 187 CLIP model
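
A minimal FastAPI sketch of serving a saved model behind a /predict endpoint, the pattern Lectures 183-184 describe; the file name and model path are hypothetical:

```python
# serve.py -- hypothetical file name; run with: uvicorn serve:app --reload
from typing import List

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")   # a previously saved scikit-learn model (assumed path)

class Features(BaseModel):
    values: List[float]

@app.post("/predict")
def predict(features: Features):
    # Wrap the single row in a list so scikit-learn sees a 2D input
    pred = model.predict([features.values])
    return {"prediction": pred.tolist()}
```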

Section 16: AWS Sagemaker (for Model Deployment)

Lecture 188 Resources

Lecture 189 Introduction and WARNING (Must watch!)

Lecture 190 Setting up AWS

Lecture 191 awscli + IAM setup

Lecture 192 AWS s3 introduction + bash scripting

Lecture 193 AWS IAM roles

Lecture 194 AWS Sagemaker – Processing jobs Part 1

Lecture 195 Sagemaker Processing – Part 2

Lecture 196 Sagemaker Training – Part 1

Lecture 197 Sagemaker Training – Part 2

Lecture 198 AWS Cloudwatch

Lecture 199 AWS Sagemaker inference (model deployment) – Part 1

Lecture 200 AWS Sagemaker Inference – Part 2

Lecture 201 AWS Sagemaker Inference – Part 3

Lecture 202 AWS Billing
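
A small boto3 sketch of the S3 upload-and-list step that the Sagemaker workflow depends on; the bucket and key names are hypothetical and must exist in your own account, with credentials already configured via awscli:

```python
import boto3

# Hypothetical bucket and key names -- replace with your own resources
s3 = boto3.client("s3")
s3.upload_file("model.joblib", "my-sagemaker-bucket", "models/model.joblib")

# List what was uploaded under the prefix
response = s3.list_objects_v2(Bucket="my-sagemaker-bucket", Prefix="models/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```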

Section 17: Final Thoughts

Lecture 203 Some advice on your journey

Who this course is for: Anyone interested in Machine Learning.


Password / extraction password: www.tbtos.com

Resource download: this resource is available to VIP members only. Please log in first.
