Subjects within your search.
- Machine learning 25
- Artificial intelligence 19
- Python (Computer program language) 14
- Neural networks (Computer science) 8
- Cloud computing 6
- Data processing 5
- Computer programs 4
- Data mining 4
- Natural language processing (Computer science) 4
- Big data 3
- Computer vision 3
- TensorFlow 3
- Algorithms 2
- Application software 2
- Computer algorithms 2
- Development 2
- Industrial applications 2
- Mathematics 2
- Matrices 2
- Quantitative research 2
- R (Computer program language) 2
- Web services 2
- complexity 2
- compressed sensing 2
- deep learning 2
- machine learning 2
- neural networks 2
- 1D pooling 1
- 3D-CALIC 1
- ADMM 1
41 Published 2024. Table of Contents: “…Evaluating classification performance -- Tuning models with cross-validation -- Summary -- Exercises -- References -- Chapter 3: Predicting Online Ad Click-Through with Tree-Based Algorithms -- A brief overview of ad click-through prediction -- Getting started with two types of data - numerical and categorical -- Exploring a decision tree from the root to the leaves -- Constructing a decision tree -- The metrics for measuring a split -- Gini Impurity -- Information gain -- Implementing a decision tree from scratch -- Implementing a decision tree with scikit-learn -- Predicting ad click-through with a decision tree -- Ensembling decision trees - random forests -- Ensembling decision trees - gradient-boosted trees -- Summary -- Exercises -- Chapter 4: Predicting Online Ad Click-Through with Logistic Regression -- Converting categorical features to numerical - one-hot encoding and ordinal encoding -- Classifying data with logistic regression -- Getting started with the logistic function -- Jumping from the logistic function to logistic regression -- Training a logistic regression model -- Training a logistic regression model using gradient descent -- Predicting ad click-through with logistic regression using gradient descent -- Training a logistic regression model using stochastic gradient descent (SGD) -- Training a logistic regression model with regularization -- Feature selection using L1 regularization -- Feature selection using random forest -- Training on large datasets with online learning -- Handling multiclass classification -- Implementing logistic regression using TensorFlow -- Summary -- Exercises -- Chapter 5: Predicting Stock Prices with Regression Algorithms -- What is regression? …”
Electronic book -
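Record 41 above lists Gini Impurity and information gain as the metrics for measuring a decision-tree split. As a rough, generic illustration (not code from the book), a minimal Python sketch of Gini impurity and the weighted impurity of a candidate binary split:

from collections import Counter

def gini_impurity(labels):
    """Gini impurity: 1 - sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = Counter(labels)
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def weighted_gini(left_labels, right_labels):
    """Weighted Gini impurity of a binary split, used to compare candidate splits."""
    n = len(left_labels) + len(right_labels)
    return (len(left_labels) / n) * gini_impurity(left_labels) + \
           (len(right_labels) / n) * gini_impurity(right_labels)

# A split that separates the classes fairly well has lower weighted impurity.
print(gini_impurity([1, 1, 0, 0]))                 # 0.5, the worst case for two balanced classes
print(weighted_gini([1, 1, 1, 0], [0, 0, 0, 1]))   # 0.375, better than no split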
42 by Webber, Emily. Table of Contents: “…References -- Chapter 9: Advanced Training Concepts -- Evaluating and improving throughput -- Calculating model TFLOPS -- Using Flash Attention to speed up your training runs -- Speeding up your jobs with compilation -- Integrating compilation into your PyTorch scripts -- Amazon SageMaker Training Compiler and Neo -- Best practices for compilation -- Running compiled models on Amazon's Trainium and Inferentia custom hardware -- Solving for an optimal training time -- Summary -- References -- Part 4: Evaluate Your Model -- Chapter 10: Fine-Tuning and Evaluating -- Fine-tuning for language, text, and everything in between -- Fine-tuning a language-only model -- Fine-tuning vision-only models -- Fine-tuning vision-language models -- Evaluating foundation models -- Model evaluation metrics for vision -- Model evaluation metrics in language -- Model evaluation metrics in joint vision-language tasks -- Incorporating the human perspective with labeling through SageMaker Ground Truth -- Reinforcement learning from human feedback -- Summary -- References -- Chapter 11: Detecting, Mitigating, and Monitoring Bias -- Detecting bias in ML models -- Detecting bias in large vision and language models -- Mitigating bias in vision and language models -- Bias mitigation in language - counterfactual data augmentation and fair loss functions -- Bias mitigation in vision - reducing correlation dependencies and solving sampling issues -- Monitoring bias in ML models -- Detecting, mitigating, and monitoring bias with SageMaker Clarify -- Summary -- References -- Chapter 12: How to Deploy Your Model -- What is model deployment? …”
Published 2023
Electronic book -
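Record 42 mentions evaluating throughput and calculating model TFLOPS. One common back-of-the-envelope approach, assumed here rather than taken from the book, estimates training FLOPs as roughly 6 * parameters * tokens processed and divides by wall-clock time; the numbers below are hypothetical:

def estimate_training_tflops(num_params, tokens_processed, wall_clock_seconds):
    """Rough throughput estimate using the ~6 * N * D FLOPs-per-token heuristic
    for dense transformer training (forward plus backward pass)."""
    total_flops = 6.0 * num_params * tokens_processed
    return total_flops / wall_clock_seconds / 1e12  # TFLOP/s

# Hypothetical numbers: a 1.3B-parameter model processing 2M tokens in 60 seconds.
print(f"{estimate_training_tflops(1.3e9, 2e6, 60):.1f} TFLOP/s")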
43 Published 2017. Table of Contents: “…Alternating least squares with Apache Spark MLlib -- References -- Summary -- Chapter 12: Introduction to Natural Language Processing -- NLTK and built-in corpora -- Corpora examples -- The bag-of-words strategy -- Tokenizing -- Sentence tokenizing -- Word tokenizing -- Stopword removal -- Language detection -- Stemming -- Vectorizing -- Count vectorizing -- N-grams -- Tf-idf vectorizing -- A sample text classifier based on the Reuters corpus -- References -- Summary -- Chapter 13: Topic Modeling and Sentiment Analysis in NLP -- Topic modeling -- Latent semantic analysis -- Probabilistic latent semantic analysis -- Latent Dirichlet Allocation -- Sentiment analysis -- VADER sentiment analysis with NLTK -- References -- Summary -- Chapter 14: A Brief Introduction to Deep Learning and TensorFlow -- Deep learning at a glance -- Artificial neural networks -- Deep architectures -- Fully connected layers -- Convolutional layers -- Dropout layers -- Recurrent neural networks -- A brief introduction to TensorFlow -- Computing gradients -- Logistic regression -- Classification with a multi-layer perceptron -- Image convolution -- A quick glimpse inside Keras -- References -- Summary -- Chapter 15: Creating a Machine Learning Architecture -- Machine learning architectures -- Data collection -- Normalization -- Dimensionality reduction -- Data augmentation -- Data conversion -- Modeling/Grid search/Cross-validation -- Visualization -- scikit-learn tools for machine learning architectures -- Pipelines -- Feature unions -- References -- Summary -- Index…”
Electronic book -
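Record 43 covers count and tf-idf vectorizing within the bag-of-words strategy. A minimal scikit-learn sketch of turning a few toy documents into a tf-idf matrix (illustrative only, not the book's code):

from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "the cat sat on the mat",
    "the dog chased the cat",
    "dogs and cats are pets",
]

# Build a tf-idf matrix; each row is a document, each column a vocabulary term.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

print(vectorizer.get_feature_names_out())
print(X.toarray().round(2))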
44 Published 2018. Table of Contents: “…† Reprinted from: Mathematics 2018, 6, 146, doi: 10.3390/math6090146 1 -- Krassimir Atanassov On the Most Extended Modal Operator of First Type over Interval-Valued Intuitionistic Fuzzy Sets Reprinted from: Mathematics 2018, 6, 123, doi: 10.3390/math6070123 25 -- Young Bae Jun, Seok-Zun Song and Seon Jeong Kim N -Hyper Sets Reprinted from: Mathematics 2018, 6, 87, doi: 10.3390/math6060087 35 -- Muhammad Akram and Gulfam Shahzadi Hypergraphs in m-Polar Fuzzy Environment Reprinted from: Mathematics 2018, 6, 28, doi: 10.3390/math6020028 47 -- Noor Rehman, Choonkil Park, Syed Inayat Ali Shah and Abbas Ali On Generalized Roughness in LA-Semigroups Reprinted from: Mathematics 2018, 6, 112, doi: 10.3390/math6070112 65 -- Hsien-Chung Wu Fuzzy Semi-Metric Spaces Reprinted from: Mathematics 2018, 6, 106, doi: 10.3390/math6070106 73 -- E. …”
Electronic book -
45 Published 2024. Table of Contents: “…5.4 Instruction optimization -- 5.4.1 Device intrinsics -- 5.4.1.1 Directed rounding -- 5.4.1.2 C intrinsics -- 5.4.1.3 Fast math intrinsics -- 5.4.1.4 Compiler options -- 5.4.2 Divergent warps -- 6 Porting tips and techniques -- 6.1 CUF kernels -- 6.2 Conditional inclusion of code -- 6.3 Renaming variables -- 6.3.1 Renaming via use statements -- 6.3.2 Renaming via the associate construct -- 6.4 Minimizing memory footprint for work arrays -- 6.5 Array compaction -- 7 Interfacing with CUDA C code and CUDA libraries -- 7.1 Calling user-written CUDA C code -- 7.1.1 The ignore_tkr directive -- 7.2 cuBLAS -- 7.2.1 Legacy cuBLAS API -- 7.2.2 New cuBLAS API -- 7.2.3 Batched cuBLAS routines -- 7.2.4 GEMM with tensor cores -- 7.3 cuSPARSE -- 7.4 cuSOLVER -- 7.5 cuTENSOR -- 7.5.1 Low-level cuTENSOR interfaces -- 7.6 Thrust -- 8 Multi-GPU programming -- 8.1 CUDA multi-GPU features -- 8.1.1 Peer-to-peer communication -- 8.1.1.1 Requirements for peer-to-peer communication -- 8.1.2 Peer-to-peer direct transfers -- 8.1.3 Peer-to-peer transpose -- 8.2 Multi-GPU programming with MPI -- 8.2.1 Assigning devices to MPI ranks -- 8.2.2 MPI transpose -- 8.2.3 GPU-aware MPI transpose -- 2 Case studies -- 9 Monte Carlo method -- 9.1 CURAND -- 9.2 Computing π with CUF kernels -- 9.2.1 IEEE-754 precision -- 9.3 Computing π with reduction kernels -- 9.3.1 Reductions with SHFL instructions -- 9.3.2 Reductions with atomic locks -- 9.3.3 Reductions using the grid_group cooperative group -- 9.4 Accuracy of summation -- 9.5 Option pricing -- 10 Finite difference method -- 10.1 Nine-point 1D finite difference stencil -- 10.1.1 Data reuse and shared memory -- 10.1.2 The x-derivative kernel -- 10.1.2.1 Performance of the x-derivative kernel -- 10.1.3 Derivatives in y and z -- 10.1.4 Nonuniform grids -- 10.2 2D Laplace equation…”
Electronic book -
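Record 45 includes a Monte Carlo case study on computing π with CUF kernels and reduction kernels. The same estimator is sketched below in plain NumPy rather than the book's CUDA Fortran: random points are drawn in the unit square and the fraction falling inside the quarter circle is scaled by 4:

import numpy as np

def estimate_pi(num_samples=1_000_000, seed=0):
    """Monte Carlo estimate of pi: 4 times the fraction of points inside the quarter circle."""
    rng = np.random.default_rng(seed)
    x = rng.random(num_samples)
    y = rng.random(num_samples)
    inside = np.count_nonzero(x * x + y * y <= 1.0)
    return 4.0 * inside / num_samples

print(estimate_pi())  # approaches 3.14159... as num_samples grows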
46 Published 2022. “…There is also an introductory lesson included on Deep Neural Networks with a worked-out example on image classification using TensorFlow and Keras. By the end of the course, you will learn some basic foundations of data science using Python. …”
Video -
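The course in record 46 ends with an image-classification example using TensorFlow and Keras. A minimal Keras sketch of the kind of dense classifier such a lesson typically builds, trained here on synthetic stand-in data rather than the course's dataset:

import numpy as np
import tensorflow as tf

# Synthetic stand-in for flattened 28x28 grayscale images with 10 classes.
rng = np.random.default_rng(0)
x_train = rng.random((256, 784)).astype("float32")
y_train = rng.integers(0, 10, size=256)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
model.summary()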
47 Published 2024. Table of Contents: “…Evolving language models - the AR Transformer and its role in GenAI -- Implementing the original Transformer -- Data loading and preparation -- Tokenization -- Data tensorization -- Dataset creation -- Embeddings layer -- Positional encoding -- Multi-head self-attention -- FFN -- Encoder layer -- Encoder -- Decoder layer -- Decoder -- Complete transformer -- Training function -- Translation function -- Main execution -- Summary -- References -- Chapter 4: Applying Pretrained Generative Models: From Prototype to Production -- Prototyping environments -- Transitioning to production -- Mapping features to production setup -- Setting up a production-ready environment -- Local development setup -- Visual Studio Code -- Project initialization -- Docker setup -- Requirements file -- Application code -- Creating a code repository -- CI/CD setup -- Model selection - choosing the right pretrained generative model -- Meeting project objectives -- Model size and computational complexity -- Benchmarking -- Updating the prototyping environment -- GPU configuration -- Loading pretrained models with LangChain -- Setting up testing data -- Quantitative metrics evaluation -- Alignment with CLIP -- Interpreting outcomes -- Responsible AI considerations -- Addressing and mitigating biases -- Transparency and explainability -- Final deployment -- Testing and monitoring -- Maintenance and reliability -- Summary -- Part 2: Practical Applications of Generative AI -- Chapter 5: Fine-Tuning Generative Models for Specific Tasks -- Foundation and relevance - an introduction to fine-tuning -- PEFT -- LoRA -- AdaLoRA -- In-context learning -- Fine-tuning versus in-context learning -- Practice project: Fine-tuning for Q&A using PEFT -- Background regarding question-answering fine-tuning -- Implementation in Python -- Evaluation of results -- Summary -- References…”
Electronic book -
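Record 47 walks through implementing the original Transformer, including a positional encoding layer. A short NumPy sketch of the standard sinusoidal encoding from the original Transformer paper (an illustration, not the book's implementation):

import numpy as np

def sinusoidal_positional_encoding(max_len, d_model):
    """PE[pos, 2i] = sin(pos / 10000^(2i/d_model)), PE[pos, 2i+1] = cos(...)."""
    positions = np.arange(max_len)[:, np.newaxis]                      # shape (max_len, 1)
    div_terms = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)
    pe[:, 1::2] = np.cos(positions * div_terms)
    return pe

pe = sinusoidal_positional_encoding(max_len=50, d_model=16)
print(pe.shape)  # (50, 16)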
48 Published 2017. Table of Contents: “…-- Supervised learning -- Unsupervised learning -- Reinforcement learning -- Training and testing the model -- The data cycle -- Evaluation metrics -- Confusion matrix -- True Positive Rate -- True Negative Rate -- Accuracy -- Precision and recall -- F-score -- Receiver Operating Characteristic curve -- Learning in neural networks -- Back to backpropagation -- Neural network learning algorithm optimization -- Supervised learning in neural networks -- Boston dataset -- Neural network regression with the Boston dataset -- Unsupervised learning in neural networks…”
Electronic book -
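Record 48 reviews the confusion matrix, precision, recall, and the F-score. A small sketch computing these with scikit-learn on made-up labels and predictions:

from sklearn.metrics import confusion_matrix, precision_score, recall_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

print(confusion_matrix(y_true, y_pred))   # rows: true class, columns: predicted class
print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("F1:       ", f1_score(y_true, y_pred))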
49 Published 2017. Table of Contents: “…-- Loading and preparing the dataset -- Implementing the OneR algorithm -- Testing the algorithm -- Summary -- Chapter 2: Classifying with scikit-learn Estimators -- scikit-learn estimators -- Nearest neighbors -- Distance metrics -- Loading the dataset -- Moving towards a standard workflow -- Running the algorithm -- Setting parameters -- Preprocessing -- Standard pre-processing -- Putting it all together -- Pipelines -- Summary -- Chapter 3: Predicting Sports Winners with Decision Trees -- Loading the dataset -- Collecting the data -- Using pandas to load the dataset -- Cleaning up the dataset -- Extracting new features -- Decision trees -- Parameters in decision trees -- Using decision trees -- Sports outcome prediction -- Putting it all together -- Random forests -- How do ensembles work? …”
Electronic book -
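Record 49 moves from nearest-neighbor classification through preprocessing to pipelines. A compact scikit-learn sketch of that workflow, using the bundled iris data rather than the book's dataset:

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Chain standardization and a k-nearest-neighbors classifier into one estimator.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("knn", KNeighborsClassifier(n_neighbors=5)),
])

scores = cross_val_score(pipeline, X, y, cv=5)
print("mean accuracy:", scores.mean().round(3))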
50 Published 2018. “…You'll practice the ML workflow from model design, loss metric definition, and parameter tuning to performance evaluation in a time series context. …”
Electronic book -
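Record 50 describes practicing the ML workflow in a time series context, where shuffled cross-validation would leak future information. A generic sketch of time-ordered evaluation with scikit-learn's TimeSeriesSplit on a synthetic series (not the course's code):

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import TimeSeriesSplit

# Synthetic random-walk series: two lagged values predict the next value.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=300))
X = np.column_stack([series[:-2], series[1:-1]])
y = series[2:]

for fold, (train_idx, test_idx) in enumerate(TimeSeriesSplit(n_splits=4).split(X)):
    model = Ridge().fit(X[train_idx], y[train_idx])
    mae = mean_absolute_error(y[test_idx], model.predict(X[test_idx]))
    print(f"fold {fold}: train ends at index {train_idx[-1]}, MAE = {mae:.3f}")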
51 Published 2021. “…You’ll also learn to identify and measure the metrics that tell how well your classifier is doing. …”
Electronic book -
52 Published 2018. Table of Contents: “…Chapter 15: Introducing Neural Networks -- Deep learning at a glance -- Artificial neural networks -- MLPs with Keras -- Interfacing Keras to scikit-learn -- Summary -- Chapter 16: Advanced Deep Learning Models -- Deep model layers -- Fully connected layers -- Convolutional layers -- Dropout layers -- Batch normalization layers -- Recurrent Neural Networks -- An example of a deep convolutional network with Keras -- An example of an LSTM network with Keras -- A brief introduction to TensorFlow -- Computing gradients -- Logistic regression -- Classification with a multilayer perceptron -- Image convolution -- Summary -- Chapter 17: Creating a Machine Learning Architecture -- Machine learning architectures -- Data collection -- Normalization and regularization -- Dimensionality reduction -- Data augmentation -- Data conversion -- Modeling/grid search/cross-validation -- Visualization -- GPU support -- A brief introduction to distributed architectures -- Scikit-learn tools for machine learning architectures -- Pipelines -- Feature unions -- Summary -- Other Books You May Enjoy -- Index…”
Electronic book -
53 Published 2024. Table of Contents: “…Working with DistilBERT for knowledge distillation -- Pruning transformers -- Quantization -- Working with efficient self-attention -- Sparse attention with fixed patterns -- Learnable patterns -- Low-rank factorization, kernel methods, and other approaches -- Easier quantization using bitsandbytes -- Summary -- References -- Chapter 13: Cross-Lingual and Multilingual Language Modeling -- Technical requirements -- Translation language modeling and cross-lingual knowledge sharing -- XLM and mBERT -- mBERT -- XLM -- Cross-lingual similarity tasks -- Cross-lingual text similarity -- Visualizing cross-lingual textual similarity -- Cross-lingual classification -- Cross-lingual zero-shot learning -- Massive multilingual translation -- Fine-tuning the performance of multilingual models -- Summary -- References -- Chapter 14: Serving Transformer Models -- Technical requirements -- FastAPI Transformer model serving -- Dockerizing APIs -- Faster Transformer model serving using TFX -- Load testing using Locust -- Faster inference using ONNX -- SageMaker inference -- Summary -- Further reading -- Chapter 15: Model Tracking and Monitoring -- Technical requirements -- Tracking model metrics -- Tracking model training with TensorBoard -- Tracking model training live with W&B -- Summary -- Further reading -- Part 4: Transformers beyond NLP -- Chapter 16: Vision Transformers -- Technical requirements -- Vision transformers -- Image classification using transformers -- Semantic segmentation and object detection using transformers -- Visual prompt models -- Summary -- Chapter 17: Multimodal Generative Transformers -- Technical requirements -- Multimodal learning -- Generative multimodal AI -- Stable Diffusion for text-to-image generation -- Stable Diffusion in action -- Music generation using MusicGen -- Text-to-speech generation using transformers -- Summary…”
Electronic book -
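Record 53 works with DistilBERT for knowledge distillation and with serving transformer models. A minimal Hugging Face transformers sketch that loads a distilled sentiment checkpoint through the pipeline API; the model name is a commonly used public checkpoint chosen here for illustration:

from transformers import pipeline

# Load a distilled BERT checkpoint fine-tuned for sentiment classification.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Serving this model behind FastAPI keeps latency low."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]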
54 Published 2018. Table of Contents: “…-- Step-by-step installation -- Installing the necessary packages -- Package upgrades -- Scientific distributions -- Anaconda -- Leveraging conda to install packages -- Enthought Canopy -- WinPython -- Explaining virtual environments -- Conda for managing environments -- A glance at the essential packages -- NumPy -- SciPy -- pandas -- pandas-profiling -- Scikit-learn -- Jupyter -- JupyterLab -- Matplotlib -- Seaborn -- Statsmodels -- Beautiful Soup -- NetworkX -- NLTK -- Gensim -- PyPy -- XGBoost -- LightGBM -- CatBoost -- TensorFlow -- Keras -- Introducing Jupyter -- Fast installation and first test usage -- Jupyter magic commands -- Installing packages directly from Jupyter Notebooks -- Checking the new JupyterLab environment -- How Jupyter Notebooks can help data scientists -- Alternatives to Jupyter -- Datasets and code used in this book -- Scikit-learn toy datasets -- The MLdata.org and other public repositories for open source data -- LIBSVM data examples -- Loading data directly from CSV or text files -- Scikit-learn sample generators -- Summary -- Chapter 2: Data Munging -- The data science process -- Data loading and preprocessing with pandas -- Fast and easy data loading -- Dealing with problematic data -- Dealing with big datasets -- Accessing other data formats -- Putting data together -- Data preprocessing -- Data selection -- Working with categorical and textual data -- A special type of data - text -- Scraping the web with Beautiful Soup -- Data processing with NumPy -- NumPy's n-dimensional array -- The basics of NumPy ndarray objects -- Creating NumPy arrays -- From lists to unidimensional arrays…”
Electronic book -
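Record 54 spends much of its data-munging material on loading CSV data with pandas and on NumPy's ndarray. A short sketch of that round trip; the file name and the 'target' column are hypothetical:

import numpy as np
import pandas as pd

# Hypothetical CSV with numeric feature columns and a 'target' column.
df = pd.read_csv("data.csv")

print(df.head())
print(df.describe())

# Move from a DataFrame to NumPy arrays for model training.
X = df.drop(columns=["target"]).to_numpy(dtype=np.float64)
y = df["target"].to_numpy()
print(X.shape, y.shape)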
55 Published 2024. Table of Contents: “…Monitoring -- Logs -- Metrics -- System metrics -- Model metrics -- Drifts -- Monitoring vs. observability -- Alerts -- 6. …”
Electronic book -
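Record 55 distinguishes system metrics, model metrics, and drift. One generic way to flag input drift, assumed here rather than taken from the book, is a two-sample Kolmogorov-Smirnov test comparing a feature's training distribution with recent production data:

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
training_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)
production_feature = rng.normal(loc=0.3, scale=1.0, size=5_000)  # shifted mean: simulated drift

statistic, p_value = ks_2samp(training_feature, production_feature)
if p_value < 0.01:
    print(f"Possible drift detected (KS={statistic:.3f}, p={p_value:.2e})")
else:
    print("No significant drift detected")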
56 Published 2024. Table of Contents: “…-- Costs and trade-offs of interoperability -- ESP32-H2 -- Interoperability concept, approaches, and principles for building with IoT -- Concepts, approaches, and principles -- Types of interoperability -- Layers of IoT -- Architecting for interoperability -- Projects working toward greater interoperability -- Global interoperability -- Interoperability within the cloud -- E-health platform case study -- Advancing the interoperability of IoT platforms -- Practical - Creating a Telegram household motion detector -- Creating a chatbot -- Getting a Telegram user ID -- Working with the Arduino IDE -- Hardware setup -- Coding it up -- Outcome -- Summary -- Further reading -- Part 3: Operating, Maintaining, and Securing IoT Networks -- Chapter 9: Operating and Monitoring IoT Networks -- Technical requirements -- Continuous operation of IoT systems -- Challenges and benefits of maintaining continuous operation -- Strategies for achieving continuous operation -- Automation and machine learning in monitoring -- Exercise on simulating monitoring networks -- Setting KPIs and the metrics for success -- Setting clear objectives and goals for monitoring -- Different types of KPIs -- Selecting, analyzing, and monitoring KPIs -- Monitoring capabilities on-premises and on the cloud -- Monitoring for security purposes -- Creating a unified monitoring solution -- Practical - operating and monitoring a joke creator with IoT Greengrass -- Setting up your OpenAI account -- Spinning up an Amazon EC2 instance -- Configure AWS Greengrass on Amazon EC2 -- Monitoring the EC2 Thing when publishing messages -- Summary -- Further reading -- Chapter 10: Working with Data and Analytics -- Technical requirements…”
Electronic book -
57 Published 2019. Table of Contents: “…Describing the Context of Data Science Programming Languages -- Unfolding Open Source Frameworks for AI/ML Models -- TensorFlow -- Theano -- Torch -- Caffe and Caffe2 -- The Microsoft Cognitive Toolkit (previously known as Microsoft CNTK) -- Keras -- Scikit-learn -- Spark MLlib -- Azure ML Studio -- Amazon Machine Learning -- Choosing Open Source or Not? …”
Electronic book -
58 Published 2023. Table of Contents: “…Part 2: Developing Custom Object Detection Models -- Chapter 3: Data Preparation for Object Detection Applications -- Technical requirements -- Common data sources -- Getting images -- Selecting an image labeling tool -- Annotation formats -- Labeling the images -- Annotation format conversions -- Converting YOLO datasets to COCO datasets -- Converting Pascal VOC datasets to COCO datasets -- Summary -- Chapter 4: The Architecture of the Object Detection Model in Detectron2 -- Technical requirements -- Introduction to the application architecture -- The backbone network -- Region Proposal Network -- The anchor generator -- The RPN head -- The RPN loss calculation -- Proposal predictions -- Region of Interest Heads -- The pooler -- The box predictor -- Summary -- Chapter 5: Training Custom Object Detection Models -- Technical requirements -- Processing data -- The dataset -- Downloading and performing initial explorations -- Data format conversion -- Displaying samples -- Using the default trainer -- Selecting the best model -- Evaluation metrics for object detection models -- Selecting the best model -- Inferencing thresholds -- Sample predictions -- Developing a custom trainer -- Utilizing the hook system -- Summary -- Chapter 6: Inspecting Training Results and Fine-Tuning Detectron2's Solvers -- Technical requirements -- Inspecting training histories with TensorBoard -- Understanding Detectron2's solvers -- Gradient descent -- Stochastic gradient descent -- Momentum -- Variable learning rates -- Fine-tuning the learning rate and batch size -- Summary -- Chapter 7: Fine-Tuning Object Detection Models -- Technical requirements -- Setting anchor sizes and anchor ratios -- Preprocessing input images -- Sampling training data and generating the default anchors -- Generating sizes and ratios hyperparameters -- Setting pixel means and standard deviations…”
Electronic book -
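Record 58 discusses evaluation metrics for object detection models, which rest on intersection over union (IoU) between predicted and ground-truth boxes. A small generic sketch of the IoU computation itself (not Detectron2 code):

def iou(box_a, box_b):
    """Intersection over union for axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175, roughly 0.143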
59 Published 2017. Table of Contents: “…Merging SparkR DataFrames -- Using User Defined Functions (UDFs) -- Using SparkR for computing summary statistics -- Using SparkR for data visualization -- Visualizing data on a map -- Visualizing graph nodes and edges -- Using SparkR for machine learning -- Summary -- Chapter 9: Developing Applications with Spark SQL -- Introducing Spark SQL applications -- Understanding text analysis applications -- Using Spark SQL for textual analysis -- Preprocessing textual data -- Computing readability -- Using word lists -- Creating data preprocessing pipelines -- Understanding themes in document corpuses -- Using Naive Bayes classifiers -- Developing a machine learning application -- Summary -- Chapter 10: Using Spark SQL in Deep Learning Applications -- Introducing neural networks -- Understanding deep learning -- Understanding representation learning -- Understanding stochastic gradient descent -- Introducing deep learning in Spark -- Introducing CaffeOnSpark -- Introducing DL4J -- Introducing TensorFrames -- Working with BigDL -- Tuning hyperparameters of deep learning models -- Introducing deep learning pipelines -- Understanding Supervised learning -- Understanding convolutional neural networks -- Using neural networks for text classification -- Using deep neural networks for language processing -- Understanding Recurrent Neural Networks -- Introducing autoencoders -- Summary -- Chapter 11: Tuning Spark SQL Components for Performance -- Introducing performance tuning in Spark SQL -- Understanding DataFrame/Dataset APIs -- Optimizing data serialization -- Understanding Catalyst optimizations -- Understanding the Dataset/DataFrame API -- Understanding Catalyst transformations -- Visualizing Spark application execution -- Exploring Spark application execution metrics -- Using external tools for performance tuning -- Cost-based optimizer in Apache Spark 2.2.…”
Electronic book -
by Prakash, Kolla Bhanu. Table of Contents: “…-- 1.6.20 Online Video Streaming (Netflix) -- 1.7 Challenges in Machine Learning -- 1.8 Limitations of Machine Learning -- 1.9 Projects in Machine Learning -- References -- Chapter 2 Machine Learning Building Blocks -- 2.1 Data Collection -- 2.1.1 Importing the Data from CSV Files -- 2.2 Data Preparation -- 2.2.1 Data Exploration -- 2.2.2 Data Pre-Processing -- 2.3 Data Wrangling -- 2.4 Data Analysis -- 2.5 Model Selection -- 2.6 Model Building -- 2.7 Model Evaluation -- 2.7.1 Classification Metrics -- 2.7.1.1 Accuracy -- 2.7.1.2 Precision -- 2.7.1.3 Recall -- 2.7.2 Regression Metrics -- 2.7.2.1 Mean Squared Error -- 2.7.2.2 Root Mean Squared Error -- 2.7.2.3 Mean Absolute Error -- 2.8 Deployment…”
Published 2024
Electronic book
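The last record lists regression metrics: mean squared error, root mean squared error, and mean absolute error. A short sketch computing all three with scikit-learn and NumPy on made-up predictions:

import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error

y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
mae = mean_absolute_error(y_true, y_pred)
print(f"MSE={mse:.3f}  RMSE={rmse:.3f}  MAE={mae:.3f}")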