Train and deploy deep learning models using TensorFlow 2 and Google AI Platform
Introduction and software setup
Introduction (In depth overview of the course content) (3:33)
OS, Python IDE, Docker (4:21)
Setting up Google Cloud Platform (2:36)
Setting up a VS code folder and creating a virtual environment using virtualenv (6:09)
Python packages we will be using and how to install them (2:04)
Testing your installation and setup (1:51)
Necessary code
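The last lessons of this section test the installation and setup; a minimal sketch of such a check using only the standard library (the package list here is an assumption, not the course's exact list):

```python
import importlib.util

# Packages this course likely relies on (assumed list; adjust to your own setup).
REQUIRED = ["tensorflow", "flask", "google.cloud.storage", "numpy"]

def missing_packages(names):
    """Return the subset of `names` that cannot be found by the import system."""
    missing = []
    for name in names:
        try:
            found = importlib.util.find_spec(name) is not None
        except ModuleNotFoundError:  # a parent package is absent
            found = False
        if not found:
            missing.append(name)
    return missing

if __name__ == "__main__":
    gaps = missing_packages(REQUIRED)
    print("All packages found" if not gaps else f"Missing: {gaps}")
```

Running this inside the virtual environment quickly confirms whether the installs from the previous lesson succeeded.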
Building your deep learning model
How do machine learning or deep learning projects usually work? (2:56)
What is our end goal? (4:42)
Downloading the dataset (7:29)
Data exploration: splitting data into category folders (14:07)
Data exploration: visualizing random samples from the dataset (14:17)
Data exploration: getting insights about widths and heights of images (8:06)
What should we consider when building a neural network for our task? (5:26)
Building the neural network architecture using Keras and TensorFlow (13:04)
Creating data pipelines using generators (15:16)
Putting everything together inside a train function (15:59)
Improving and cleaning the code for robustness and automation (10:41)
Launching training locally on a subset of our data (4:30)
Adding evaluation at the end of training (14:26)
Summary (2:11)
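The data-pipeline lessons in this section feed training through Python generators; a minimal sketch of the pattern, with hypothetical file paths and labels (the course's actual generator would also load and preprocess the images, not just hand back paths):

```python
import random

def batch_generator(file_paths, labels, batch_size=32, shuffle=True):
    """Yield (paths, labels) batches forever, the style of generator a training loop consumes."""
    indices = list(range(len(file_paths)))
    while True:  # loop endlessly so training can run any number of epochs
        if shuffle:
            random.shuffle(indices)
        for start in range(0, len(indices), batch_size):
            batch = indices[start:start + batch_size]
            yield ([file_paths[i] for i in batch],
                   [labels[i] for i in batch])
```

In the real pipeline each yielded batch would hold decoded, resized image arrays and label vectors rather than raw paths.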
Introduction to Google Cloud Storage
Our different setups for reading data during training (4:21)
What are buckets and how to create them? (8:37)
Uploading our data to the bucket (3:55)
Creating a credentials JSON file to allow access to our bucket (5:50)
Problem with our credentials file and how to fix it (16:32)
Adding code for downloading data from the bucket (17:34)
Verifying that our training pipeline is working properly with the new modifications (5:53)
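The lessons in this section point the training code at a bucket through a credentials file; a rough sketch of that flow, assuming the `google-cloud-storage` client library and hypothetical bucket, prefix, and file names:

```python
import os

# Tell the client library where the service-account key lives (path is hypothetical).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "credentials.json"

def download_from_bucket(bucket_name, prefix, dest_dir="data"):
    """Download every blob under `prefix` in `bucket_name` into `dest_dir`."""
    from google.cloud import storage  # requires the google-cloud-storage package

    client = storage.Client()
    os.makedirs(dest_dir, exist_ok=True)
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        if blob.name.endswith("/"):  # skip folder placeholder objects
            continue
        blob.download_to_filename(os.path.join(dest_dir, os.path.basename(blob.name)))
```

Setting the `GOOGLE_APPLICATION_CREDENTIALS` environment variable is the standard way to hand the key file to the client; the exact download logic in the course may differ.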
Dockerizing our code
What is Docker and how to use it for our project? (optional) (3:33)
Small modifications to our files (2:21)
Building a Docker image using Dockerfiles (9:18)
Running a Docker container using our Docker image (9:51)
Adding arguments to our training application using argparse (16:12)
Necessary steps to use Docker with a GPU (5:28)
Building our Docker image with GPU support (9:07)
Section summary (2:01)
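One lesson above adds command-line arguments with argparse so the same container can be launched with different settings; a minimal sketch of the idea (flag names and defaults are assumptions, not the course's exact interface):

```python
import argparse

def get_args(argv=None):
    """Parse training flags; `argv=None` falls back to sys.argv inside the container."""
    parser = argparse.ArgumentParser(description="Training application")
    parser.add_argument("--epochs", type=int, default=10)
    parser.add_argument("--batch-size", type=int, default=32)
    parser.add_argument("--learning-rate", type=float, default=1e-3)
    parser.add_argument("--data-dir", default="data/")
    return parser.parse_args(argv)
```

With the training script set as the image's ENTRYPOINT, something like `docker run my-image --epochs 20 --batch-size 64` would reach this parser.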
AI Platform on Google Cloud Platform
What is cloud computing and what is Google AI Platform? (optional) (5:48)
What other APIs do we need? (9:15)
Pushing our image to Google Container Registry (9:47)
Setting things up for our training job (7:32)
Launching a training job on AI Platform and checking the logs (5:54)
What is hyperparameter tuning? (5:42)
Configuring hyperparameter tuning (10:48)
Building a new docker image with the new setup (1:28)
Launching a training job with the new setup (8:27)
Saving our trained model (but there is a problem) (8:52)
Adding a function to upload trained models to a Google Cloud Storage bucket (4:30)
Zipping and uploading trained models to Google Cloud Storage (13:26)
Running the final training job (11:00)
Section summary (5:34)
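The model-saving lessons above zip the trained model directory before uploading it to storage; a standard-library sketch of the zipping step (directory and archive names are hypothetical; the upload itself would go through the google-cloud-storage client):

```python
import os
import shutil

def zip_model(model_dir, out_dir="artifacts", name="model"):
    """Archive a saved-model directory into <out_dir>/<name>.zip and return the path."""
    os.makedirs(out_dir, exist_ok=True)
    # shutil.make_archive appends the ".zip" suffix itself.
    return shutil.make_archive(os.path.join(out_dir, name), "zip", model_dir)
```

Shipping a single zip file keeps the upload to one blob instead of one request per file in the SavedModel directory.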
Serving our trained model using Cloud Run and Flask
What is Cloud Run and what is Flask? (optional) (2:12)
Creating the skeleton of our Flask web app (11:17)
Adding a helper function to accept only certain images (5:59)
Creating a view function to show our main web page (13:59)
Quick test to verify that everything is working properly (4:46)
Finishing the main web page (6:12)
Adding a web page for viewing the uploaded image (11:03)
Finishing the web app and testing our code locally (22:01)
Using gunicorn to serve the web app instead of the Flask development server (4:51)
Dockerizing our code (12:59)
Deploying our web app to Cloud Run (11:14)
Summary (5:56)
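One lesson in this section adds a helper so the Flask app accepts only image uploads; a minimal sketch of that check (the extension set is an assumption):

```python
# Upload extensions the app will accept (assumed set; the course may use another).
ALLOWED_EXTENSIONS = {"png", "jpg", "jpeg"}

def allowed_file(filename):
    """True if `filename` has an extension listed in ALLOWED_EXTENSIONS."""
    return ("." in filename
            and filename.rsplit(".", 1)[1].lower() in ALLOWED_EXTENSIONS)
```

Inside the upload view, this guards the file pulled from `request.files` before it is saved and passed to the model.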