Applied Deep Learning with Python: Use scikit-learn, TensorFlow, and Keras to create intelligent systems and machine learning solutions
A hands-on guide to deep learning that's filled with intuitive explanations and engaging practical examples. Key Features: Designed to iteratively develop the skills of Python users who don't have a data science background; covers the key foundational concepts you'll need to know when building deep learning...
Other Authors:
Format: eBook
Language: English
Published: Birmingham ; Mumbai : Packt, 2018
Edition: 1st edition
Subjects:
View at Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009630711806719
Table of Contents:
- Cover
- Title Page
- Copyright and Credits
- Packt Upsell
- Contributors
- Table of Contents
- Preface
- Chapter 1: Jupyter Fundamentals
- Basic Functionality and Features
- What is a Jupyter Notebook and Why is it Useful?
- Navigating the Platform
- Introducing Jupyter Notebooks
- Jupyter Features
- Exploring some of Jupyter's most useful features
- Converting a Jupyter Notebook to a Python Script
- Python Libraries
- Import the external libraries and set up the plotting environment
- Our First Analysis - The Boston Housing Dataset
- Loading the Data into Jupyter Using a Pandas DataFrame
- Load the Boston housing dataset
- Data Exploration
- Explore the Boston housing dataset
- Introduction to Predictive Analytics with Jupyter Notebooks
- Linear models with Seaborn and scikit-learn
- Activity: Building a Third-Order Polynomial Model
- Linear models with Seaborn and scikit-learn
- Using Categorical Features for Segmentation Analysis
- Create categorical fields from continuous variables and make segmented visualizations
- Summary
- Chapter 2: Data Cleaning and Advanced Machine Learning
- Preparing to Train a Predictive Model
- Determining a Plan for Predictive Analytics
- Preprocessing Data for Machine Learning
- Exploring data preprocessing tools and methods
- Activity: Preparing to Train a Predictive Model for the Employee-Retention Problem
- Training Classification Models
- Introduction to Classification Algorithms
- Training two-feature classification models with scikit-learn
- The plot_decision_regions Function
- Training k-nearest neighbors for our model
- Training a Random Forest
- Assessing Models with k-Fold Cross-Validation and Validation Curves
- Using k-fold cross-validation and validation curves in Python with scikit-learn
- Dimensionality Reduction Techniques
- Training a predictive model for the employee retention problem
- Summary
- Chapter 3: Web Scraping and Interactive Visualizations
- Scraping Web Page Data
- Introduction to HTTP Requests
- Making HTTP Requests in the Jupyter Notebook
- Handling HTTP requests with Python in a Jupyter Notebook
- Parsing HTML in the Jupyter Notebook
- Parsing HTML with Python in a Jupyter Notebook
- Activity: Web Scraping with Jupyter Notebooks
- Interactive Visualizations
- Building a DataFrame to Store and Organize Data
- Building and merging Pandas DataFrames
- Introduction to Bokeh
- Introduction to interactive visualizations with Bokeh
- Activity: Exploring Data with Interactive Visualizations
- Summary
- Chapter 4: Introduction to Neural Networks and Deep Learning
- What are Neural Networks?
- Successful Applications
- Why Do Neural Networks Work So Well?
- Representation Learning
- Function Approximation
- Limitations of Deep Learning
- Inherent Bias and Ethical Considerations
- Common Components and Operations of Neural Networks
- Configuring a Deep Learning Environment
- Software Components for Deep Learning
- Python 3
- TensorFlow
- Keras
- TensorBoard
- Jupyter Notebooks, Pandas, and NumPy
- Activity: Verifying Software Components
- Exploring a Trained Neural Network
- MNIST Dataset
- Training a Neural Network with TensorFlow
- Training a Neural Network
- Testing Network Performance with Unseen Data
- Activity: Exploring a Trained Neural Network
- Summary
- Chapter 5: Model Architecture
- Choosing the Right Model Architecture
- Common Architectures
- Convolutional Neural Networks
- Recurrent Neural Networks
- Generative Adversarial Networks
- Deep Reinforcement Learning
- Data Normalization
- Z-score
- Point-Relative Normalization
- Maximum and Minimum Normalization
- Structuring Your Problem
- Activity: Exploring the Bitcoin Dataset and Preparing Data for Model
- Using Keras as a TensorFlow Interface
- Model Components
- Activity: Creating a TensorFlow Model Using Keras
- From Data Preparation to Modeling
- Training a Neural Network
- Reshaping Time-Series Data
- Making Predictions
- Overfitting
- Activity: Assembling a Deep Learning System
- Summary
- Chapter 6: Model Evaluation and Optimization
- Model Evaluation
- Problem Categories
- Loss Functions, Accuracy, and Error Rates
- Different Loss Functions, Same Architecture
- Using TensorBoard
- Implementing Model Evaluation Metrics
- Evaluating the Bitcoin Model
- Overfitting
- Model Predictions
- Interpreting Predictions
- Activity: Creating an Active Training Environment
- Hyperparameter Optimization
- Layers and Nodes - Adding More Layers
- Adding More Nodes
- Layers and Nodes - Implementation
- Epochs
- Epochs - Implementation
- Activation Functions
- Linear (Identity)
- Hyperbolic Tangent (Tanh)
- Rectified Linear Unit
- Activation Functions - Implementation
- Regularization Strategies
- L2 Regularization
- Dropout
- Regularization Strategies - Implementation
- Optimization Results
- Activity: Optimizing a Deep Learning Model
- Summary
- Chapter 7: Productization
- Handling New Data
- Separating Data and Model
- Data Component
- Model Component
- Dealing with New Data
- Re-Training an Old Model
- Training a New Model
- Activity: Dealing with New Data
- Deploying a Model as a Web Application
- Application Architecture and Technologies
- Deploying and Using Cryptonic
- Activity: Deploying a Deep Learning Application
- Summary
- Other Books You May Enjoy
- Index