The TensorFlow Workshop: A Hands-On Guide to Building Deep Learning Models from Scratch Using Real-World Datasets

This workshop teaches you how to build deep learning models from scratch on real-world datasets using the TensorFlow framework. You will gain the knowledge you need to process a variety of data types, perform tensor computations, and understand the roles of the different layers in a deep learning model.
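To give a flavor of the tensor computations and model layers covered in the chapters listed below, here is a minimal sketch in TensorFlow 2.x. It is illustrative only and not taken from the book; the tensor values and layer sizes are arbitrary assumptions.

    # Minimal illustrative sketch (not from the book): basic tensor
    # operations and a single Keras Dense layer in TensorFlow 2.x.
    import tensorflow as tf

    # A matrix (rank-2 tensor) and a column vector (shape (2, 1)).
    matrix = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # shape (2, 2)
    vector = tf.constant([[0.5], [0.25]])            # shape (2, 1)

    # Tensor addition and matrix multiplication.
    summed = matrix + matrix                         # element-wise addition
    product = tf.matmul(matrix, vector)              # shape (2, 1)

    # One fully connected (Dense) layer with a ReLU activation.
    layer = tf.keras.layers.Dense(units=3, activation="relu")
    output = layer(tf.transpose(product))            # input (1, 2) -> output (1, 3)

    print(summed.numpy())
    print(product.numpy())
    print(output.numpy())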

Bibliographic Details
Other Authors: Moocarme, Matthew, author; So, Anthony Veasna, 1992-2020, author; Maddalone, Anthony, author
Format: eBook
Language: English
Published: Birmingham, England; Mumbai: Packt, [2021]
Edition: 1st edition
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009644320406719
Table of Contents:
  • Cover
  • FM
  • Copyright
  • Table of Contents
  • Preface
  • Chapter 1: Introduction to Machine Learning with TensorFlow
  • Introduction
  • Implementing Artificial Neural Networks in TensorFlow
  • Advantages of TensorFlow
  • Disadvantages of TensorFlow
  • The TensorFlow Library in Python
  • Exercise 1.01: Verifying Your Version of TensorFlow
  • Introduction to Tensors
  • Scalars, Vectors, Matrices, and Tensors
  • Exercise 1.02: Creating Scalars, Vectors, Matrices, and Tensors in TensorFlow
  • Tensor Addition
  • Exercise 1.03: Performing Tensor Addition in TensorFlow
  • Activity 1.01: Performing Tensor Addition in TensorFlow
  • Reshaping
  • Tensor Transposition
  • Exercise 1.04: Performing Tensor Reshaping and Transposition in TensorFlow
  • Activity 1.02: Performing Tensor Reshaping and Transposition in TensorFlow
  • Tensor Multiplication
  • Exercise 1.05: Performing Tensor Multiplication in TensorFlow
  • Optimization
  • Forward Propagation
  • Backpropagation
  • Learning Optimal Parameters
  • Optimizers in TensorFlow
  • Activation Functions
  • Activity 1.03: Applying Activation Functions
  • Summary
  • Chapter 2: Loading and Processing Data
  • Introduction
  • Exploring Data Types
  • Data Preprocessing
  • Processing Tabular Data
  • Exercise 2.01: Loading Tabular Data and Rescaling Numerical Fields
  • Activity 2.01: Loading Tabular Data and Rescaling Numerical Fields with a MinMax Scaler
  • Exercise 2.02: Preprocessing Non-Numerical Data
  • Processing Image Data
  • Exercise 2.03: Loading Image Data for Batch Processing
  • Image Augmentation
  • Activity 2.02: Loading Image Data for Batch Processing
  • Text Processing
  • Exercise 2.04: Loading Text Data for TensorFlow Models
  • Audio Processing
  • Exercise 2.05: Loading Audio Data for TensorFlow Models
  • Activity 2.03: Loading Audio Data for Batch Processing
  • Summary
  • Chapter 3: TensorFlow Development
  • Introduction
  • TensorBoard
  • Exercise 3.01: Using TensorBoard to Visualize Matrix Multiplication
  • Activity 3.01: Using TensorBoard to Visualize Tensor Transformations
  • Exercise 3.02: Using TensorBoard to Visualize Image Batches
  • TensorFlow Hub
  • Exercise 3.03: Downloading a Model from TensorFlow Hub
  • Google Colab
  • Advantages of Google Colab
  • Disadvantages of Google Colab
  • Development on Google Colab
  • Exercise 3.04: Using Google Colab to Visualize Data
  • Activity 3.02: Performing Word Embedding from a Pre-Trained Model from TensorFlow Hub
  • Summary
  • Chapter 4: Regression and Classification Models
  • Introduction
  • Sequential Models
  • Keras Layers
  • Exercise 4.01: Creating an ANN with TensorFlow
  • Model Fitting
  • The Loss Function
  • Model Evaluation
  • Exercise 4.02: Creating a Linear Regression Model as an ANN with TensorFlow
  • Exercise 4.03: Creating a Multi-Layer ANN with TensorFlow
  • Activity 4.01: Creating a Multi-Layer ANN with TensorFlow
  • Classification Models
  • Exercise 4.04: Creating a Logistic Regression Model as an ANN with TensorFlow
  • Activity 4.02: Creating a Multi-Layer Classification ANN with TensorFlow
  • Summary
  • Chapter 5: Classification Models
  • Introduction
  • Binary Classification
  • Logistic Regression
  • Binary Cross-Entropy
  • Binary Classification Architecture
  • Exercise 5.01: Building a Logistic Regression Model
  • Metrics for Classifiers
  • Accuracy and Null Accuracy
  • Precision, Recall, and the F1 Score
  • Confusion Matrices
  • Exercise 5.02: Classification Evaluation Metrics
  • Multi-Class Classification
  • The Softmax Function
  • Categorical Cross-Entropy
  • Multi-Class Classification Architecture
  • Exercise 5.03: Building a Multi-Class Model
  • Activity 5.01: Building a Character Recognition Model with TensorFlow
  • Multi-Label Classification
  • Activity 5.02: Building a Movie Genre Tagging Model with TensorFlow
  • Summary
  • Chapter 6: Regularization and Hyperparameter Tuning
  • Introduction
  • Regularization Techniques
  • L1 Regularization
  • L2 Regularization
  • Exercise 6.01: Predicting a Connect-4 Game Outcome Using the L2 Regularizer
  • Dropout Regularization
  • Exercise 6.02: Predicting a Connect-4 Game Outcome Using Dropout
  • Early Stopping
  • Activity 6.01: Predicting Income with L1 and L2 Regularizers
  • Hyperparameter Tuning
  • Keras Tuner
  • Random Search
  • Exercise 6.03: Predicting a Connect-4 Game Outcome Using Random Search from Keras Tuner
  • Hyperband
  • Exercise 6.04: Predicting a Connect-4 Game Outcome Using Hyperband from Keras Tuner
  • Bayesian Optimization
  • Activity 6.02: Predicting Income with Bayesian Optimization from Keras Tuner
  • Summary
  • Chapter 7: Convolutional Neural Networks
  • Introduction
  • CNNs
  • Image Representation
  • The Convolutional Layer
  • Creating the Model
  • Exercise 7.01: Creating the First Layer to Build a CNN
  • Pooling Layer
  • Max Pooling
  • Average Pooling
  • Exercise 7.02: Creating a Pooling Layer for a CNN
  • Flattening Layer
  • Exercise 7.03: Building a CNN
  • Image Augmentation
  • Batch Normalization
  • Exercise 7.04: Building a CNN with Additional Convolutional Layers
  • Binary Image Classification
  • Object Classification
  • Exercise 7.05: Building a CNN
  • Activity 7.01: Building a CNN with More ANN Layers
  • Summary
  • Chapter 8: Pre-Trained Networks
  • Introduction
  • ImageNet
  • Transfer Learning
  • Exercise 8.01: Classifying Cats and Dogs with Transfer Learning
  • Fine-Tuning
  • Activity 8.01: Fruit Classification with Fine-Tuning
  • TensorFlow Hub
  • Feature Extraction
  • Activity 8.02: Transfer Learning with TensorFlow Hub
  • Summary
  • Chapter 9: Recurrent Neural Networks
  • Introduction
  • Sequential Data
  • Examples of Sequential Data
  • Exercise 9.01: Training an ANN for Sequential Data - Nvidia Stock Prediction
  • Recurrent Neural Networks
  • RNN Architecture
  • Vanishing Gradient Problem
  • Long Short-Term Memory Network
  • Exercise 9.02: Building an RNN with an LSTM Layer - Nvidia Stock Prediction
  • Activity 9.01: Building an RNN with Multiple LSTM Layers to Predict Power Consumption
  • Natural Language Processing
  • Data Preprocessing
  • Dataset Cleaning
  • Generating a Sequence and Tokenization
  • Padding Sequences
  • Back Propagation Through Time (BPTT)
  • Exercise 9.03: Building an RNN with an LSTM Layer for Natural Language Processing
  • Activity 9.02: Building an RNN for Predicting Tweets' Sentiment
  • Summary
  • Chapter 10: Custom TensorFlow Components
  • Introduction
  • TensorFlow APIs
  • Implementing Custom Loss Functions
  • Building a Custom Loss Function with the Functional API
  • Building a Custom Loss Function with the Subclassing API
  • Exercise 10.01: Building a Custom Loss Function
  • Implementing Custom Layers
  • Introduction to ResNet Blocks
  • Building Custom Layers with the Functional API
  • Building Custom Layers with Subclassing
  • Exercise 10.02: Building a Custom Layer
  • Activity 10.01: Building a Model with Custom Layers and a Custom Loss Function
  • Summary
  • Chapter 11: Generative Models
  • Introduction
  • Text Generation
  • Extending NLP Sequence Models to Generate Text
  • Dataset Cleaning
  • Generating a Sequence and Tokenization
  • Generating a Sequence of n-gram Tokens
  • Padding Sequences
  • Exercise 11.01: Generating Text
  • Generative Adversarial Networks
  • The Generator Network
  • The Discriminator Network
  • The Adversarial Network
  • Combining the Generative and Discriminative Models
  • Generating Real Samples with Class Labels
  • Creating Latent Points for the Generator
  • Using the Generator to Generate Fake Samples and Class Labels
  • Evaluating the Discriminator Model
  • Training the Generator and Discriminator
  • Creating the Latent Space, Generator, Discriminator, GAN, and Training Data
  • Exercise 11.02: Generating Sequences with GANs
  • Deep Convolutional Generative Adversarial Networks (DCGANs)
  • Training a DCGAN
  • Exercise 11.03: Generating Images with DCGAN
  • Activity 11.01: Generating Images Using GANs
  • Summary
  • Appendix
  • Index