Artificial Intelligence By Example: Develop machine intelligence from scratch using real artificial intelligence use cases
Be an adaptive thinker that leads the way to Artificial Intelligence.

About This Book:
- AI-based examples to guide you in designing and implementing machine intelligence
- Develop your own method for future AI solutions
- Acquire advanced AI, machine learning, and deep learning design skills

Who This Book...
Format: eBook
Language: English
Published: Birmingham: Packt, 2018
Edition: 1st edition
View at the Universitat Ramon Llull Library: https://discovery.url.edu/permalink/34CSUC_URL/1im36ta/alma991009631831906719
Table of Contents:
- Cover
- Title Page
- Copyright and Credits
- Dedication
- Packt Upsell
- Contributors
- Table of Contents
- Preface
- Chapter 1: Become an Adaptive Thinker
- Technical requirements
- How to be an adaptive thinker
- Addressing real-life issues before coding a solution
- Step 1 - MDP in natural language
- Step 2 - the mathematical representation of the Bellman equation and MDP
- From MDP to the Bellman equation
- Step 3 - implementing the solution in Python
- The lessons of reinforcement learning
- How to use the outputs
- Machine learning versus traditional applications
- Summary
- Questions
- Further reading
- Chapter 2: Think like a Machine
- Technical requirements
- Designing datasets - where the dream stops and the hard work begins
- Designing datasets in natural language meetings
- Using the McCulloch-Pitts neuron
- The McCulloch-Pitts neuron
- The architecture of Python TensorFlow
- Logistic activation functions and classifiers
- Overall architecture
- Logistic classifier
- Logistic function
- Softmax
- Summary
- Questions
- Further reading
- Chapter 3: Apply Machine Thinking to a Human Problem
- Technical requirements
- Determining what and how to measure
- Convergence
- Implicit convergence
- Numerically controlled convergence
- Applying machine thinking to a human problem
- Evaluating a position in a chess game
- Applying the evaluation and convergence process to a business problem
- Using supervised learning to evaluate result quality
- Summary
- Questions
- Further reading
- Chapter 4: Become an Unconventional Innovator
- Technical requirements
- The XOR limit of the original perceptron
- XOR and linearly separable models
- Linearly separable models
- The XOR limit of a linear model, such as the original perceptron
- Building a feedforward neural network from scratch
- Step 1 - Defining a feedforward neural network
- Step 2 - how two children solve the XOR problem every day
- Implementing a vintage XOR solution in Python with an FNN and backpropagation
- A simplified version of a cost function and gradient descent
- Linear separability was achieved
- Applying the FNN XOR solution to a case study to optimize subsets of data
- Summary
- Questions
- Further reading
- Chapter 5: Manage the Power of Machine Learning and Deep Learning
- Technical requirements
- Building the architecture of an FNN with TensorFlow
- Writing code using the data flow graph as an architectural roadmap
- A data flow graph translated into source code
- The input data layer
- The hidden layer
- The output layer
- The cost or loss function
- Gradient descent and backpropagation
- Running the session
- Checking linear separability
- Using TensorBoard to design the architecture of your machine learning and deep learning solutions
- Designing the architecture of the data flow graph
- Displaying the data flow graph in TensorBoard
- The final source code with TensorFlow and TensorBoard
- Using TensorBoard in a corporate environment
- Using TensorBoard to explain the concept of classifying customer products to a CEO
- Will your views on the project survive this meeting?
- Summary
- Questions
- Further reading
- References
- Chapter 6: Don't Get Lost in Techniques - Focus on Optimizing Your Solutions
- Technical requirements
- Dataset optimization and control
- Designing a dataset and choosing an ML/DL model
- Approval of the design matrix
- Agreeing on the format of the design matrix
- Dimensionality reduction
- The volume of a training dataset
- Implementing a k-means clustering solution
- The vision
- The data
- Conditioning management
- The strategy
- The k-means clustering program
- The mathematical definition of k-means clustering
- Lloyd's algorithm
- The goal of k-means clustering in this case study
- The Python program
- 1 - The training dataset
- 2 - Hyperparameters
- 3 - The k-means clustering algorithm
- 4 - Defining the result labels
- 5 - Displaying the results - data points and clusters
- Test dataset and prediction
- Analyzing and presenting the results
- AGV virtual clusters as a solution
- Summary
- Questions
- Further reading
- Chapter 7: When and How to Use Artificial Intelligence
- Technical requirements
- Checking whether AI can be avoided
- Data volume and applying k-means clustering
- Proving your point
- NP-hard - the meaning of P
- NP-hard - the meaning of non-deterministic
- The meaning of hard
- Random sampling
- The law of large numbers - LLN
- The central limit theorem
- Using a Monte Carlo estimator
- Random sampling applications
- Cloud solutions - AWS
- Preparing your baseline model
- Training the full sample training dataset
- Training a random sample of the training dataset
- Shuffling as an alternative to random sampling
- AWS - data management
- Buckets
- Uploading files
- Access to output results
- SageMaker notebook
- Creating a job
- Running a job
- Reading the results
- Recommended strategy
- Summary
- Questions
- Further reading
- Chapter 8: Revolutions Designed for Some Corporations and Disruptive Innovations for Small to Large Companies
- Technical requirements
- Is AI disruptive?
- What is new and what isn't in AI
- AI is based on mathematical theories that are not new
- Neural networks are not new
- Cloud server power, data volumes, and web sharing of the early 21st century started to make AI disruptive
- Public awareness contributed to making AI disruptive
- Inventions versus innovations
- Revolutionary versus disruptive solutions
- Where to start?
- Discover a world of opportunities with Google Translate
- Getting started
- The program
- The header
- Implementing Google's translation service
- Google Translate from a linguist's perspective
- Playing with the tool
- Linguistic assessment of Google Translate
- Lexical field theory
- Jargon
- Translating is not just translating but interpreting
- How to check a translation
- AI as a new frontier
- Lexical field and polysemy
- Exploring the frontier - the program
- k-nearest neighbor algorithm
- The KNN algorithm
- The knn_polysemy.py program
- Implementing the KNN compressed function in Google_Translate_Customized.py
- Conclusions on the Google Translate customized experiment
- The disruptive revolutionary loop
- Summary
- Questions
- Further reading
- Chapter 9: Getting Your Neurons to Work
- Technical requirements
- Defining a CNN
- Defining a CNN
- Initializing the CNN
- Adding a 2D convolution
- Kernel
- Intuitive approach
- Developers' approach
- Mathematical approach
- Shape
- ReLU
- Pooling
- Next convolution and pooling layer
- Flattening
- Dense layers
- Dense activation functions
- Training a CNN model
- The goal
- Compiling the model
- Loss function
- Quadratic loss function
- Binary cross-entropy
- Adam optimizer
- Metrics
- Training dataset
- Data augmentation
- Loading the data
- Testing dataset
- Data augmentation
- Loading the data
- Training with the classifier
- Saving the model
- Next steps
- Summary
- Questions
- Further reading and references
- Chapter 10: Applying Biomimicking to Artificial Intelligence
- Technical requirements
- Human biomimicking
- TensorFlow, an open source machine learning framework
- Does deep learning represent our brain or our mind?
- A TensorBoard representation of our mind
- Input data
- Layer 1 - managing the inputs to the network
- Weights, biases, and preactivation
- Displaying the details of the activation function through the preactivation process
- The activation function of Layer 1
- Dropout and Layer 2
- Layer 2
- Measuring the precision of prediction of a network through accuracy values
- Correct prediction
- accuracy
- Cross-entropy
- Training
- Optimizing speed with Google's Tensor Processing Unit
- Summary
- Questions
- Further reading
- Chapter 11: Conceptual Representation Learning
- Technical requirements
- Generate profit with transfer learning
- The motivation of transfer learning
- Inductive thinking
- Inductive abstraction
- The problem AI needs to solve
- The Γ gap concept
- Loading the Keras model after training
- Loading the model to optimize training
- Loading the model to use it
- Using transfer learning to be profitable or see a project stopped
- Defining the strategy
- Applying the model
- Making the model profitable by using it for another problem
- Where transfer learning ends and domain learning begins
- Domain learning
- How to use the programs
- The trained models used in this section
- The training model program
- GAP - loaded or unloaded
- GAP - jammed or open lanes
- The gap dataset
- Generalizing the Γ (gap conceptual dataset)
- Generative adversarial networks
- Generating conceptual representations
- The use of autoencoders
- The motivation of conceptual representation learning meta-models
- The curse of dimensionality
- The blessing of dimensionality
- Scheduling and blockchains
- Chatbots
- Self-driving cars
- Summary
- Questions
- Further reading
- Chapter 12: Automated Planning and Scheduling
- Technical requirements
- Planning and scheduling today and tomorrow
- A real-time manufacturing process
- Amazon must expand its services to face competition