Mastering Predictive Analytics with scikit-learn and TensorFlow: Implement machine learning techniques to build advanced predictive models using Python
Learn advanced techniques to improve the performance and quality of your predictive models.
Key Features:
- Use ensemble methods to improve the performance of predictive analytics models
- Implement feature selection, dimensionality reduction, and cross-validation techniques
- Develop neural network models...
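As a minimal illustrative sketch (not taken from the book) of the workflow these key features describe, the snippet below tunes an ensemble regressor with k-fold cross-validation in scikit-learn; the dataset and hyperparameter grid are assumptions chosen only for the example.

```python
# Sketch: ensemble model + exhaustive grid search with 5-fold cross-validation.
# The diabetes dataset and the parameter grid are illustrative assumptions.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Small grid of random forest hyperparameters to search exhaustively
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
print("Held-out R^2:", search.best_estimator_.score(X_test, y_test))
```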
Other authors:
Format: Electronic book
Language: English
Published: Birmingham, UK : Packt Publishing Ltd, 2018
Edition: 1st edition
Subjects:
View in Biblioteca Universitat Ramon Llull: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009630501406719
Table of Contents:
- Cover
- Title Page
- Copyright and Credits
- Packt Upsell
- Contributors
- Table of Contents
- Preface
- Chapter 1: Ensemble Methods for Regression and Classification
- Ensemble methods and their working
- Bootstrap sampling
- Bagging
- Random forests
- Boosting
- Ensemble methods for regression
- The diamond dataset
- Training different regression models
- KNN model
- Bagging model
- Random forests model
- Boosting model
- Using ensemble methods for classification
- Predicting a credit card dataset
- Training different classification models
- Logistic regression model
- Bagging model
- Random forest model
- Boosting model
- Summary
- Chapter 2: Cross-validation and Parameter Tuning
- Holdout cross-validation
- K-fold cross-validation
- Implementing k-fold cross-validation
- Comparing models with k-fold cross-validation
- Introduction to hyperparameter tuning
- Exhaustive grid search
- Hyperparameter tuning in scikit-learn
- Comparing tuned and untuned models
- Summary
- Chapter 3: Working with Features
- Feature selection methods
- Removing dummy features with low variance
- Identifying important features statistically
- Recursive feature elimination
- Dimensionality reduction and PCA
- Feature engineering
- Creating new features
- Improving models with feature engineering
- Training your model
- Reducible and irreducible error
- Summary
- Chapter 4: Introduction to Artificial Neural Networks and TensorFlow
- Introduction to ANNs
- Perceptrons
- Multilayer perceptron
- Elements of a deep neural network model
- Deep learning
- Elements of an MLP model
- Introduction to TensorFlow
- TensorFlow installation
- Core concepts in TensorFlow
- Tensors
- Computational graph
- Summary
- Chapter 5: Predictive Analytics with TensorFlow and Deep Neural Networks
- Predictions with TensorFlow
- Introduction to the MNIST dataset
- Building classification models using MNIST dataset
- Elements of the DNN model
- Building the DNN
- Reading the data
- Defining the architecture
- Placeholders for inputs and labels
- Building the neural network
- The loss function
- Defining optimizer and training operations
- Training strategy and evaluation of classification accuracy
- Running the computational graph
- Regression with Deep Neural Networks (DNN)
- Elements of the DNN model
- Building the DNN
- Reading the data
- Objects for modeling
- Training strategy
- Input pipeline for the DNN
- Defining the architecture
- Placeholders for input values and labels
- Building the DNN
- The loss function
- Defining optimizer and training operations
- Running the computational graph
- Classification with DNNs
- Exponential linear unit activation function
- Classification with DNNs
- Elements of the DNN model
- Building the DNN
- Reading the data
- Producing the objects for modeling
- Training strategy
- Input pipeline for DNN
- Defining the architecture
- Placeholders for inputs and labels
- Building the neural network
- The loss function
- Evaluation nodes
- Optimizer and the training operation
- Run the computational graph
- Evaluating the model with a set threshold
- Summary
- Other Books You May Enjoy
- Leave a review - let other readers know what you think
- Index