Subjects within your search.
- Ciencias sociales (269)
- Metodología (164)
- metodología (140)
- Investigación (137)
- Research (119)
- Sociología (119)
- investigaciones (112)
- Medios de comunicación social (106)
- Historia (92)
- Educación (90)
- Aspectos sociales (89)
- aspectos sociales (88)
- Psychology (69)
- Investigació (65)
- Zeitschrift (64)
- Psicología (56)
- Filosofía (55)
- Diseases (54)
- Metodologia (52)
- History (50)
- Management (50)
- Education (49)
- Study and teaching (49)
- Ciències socials (47)
- Marketing (47)
- Sociology (46)
- Social aspects (44)
- Social sciences (44)
- Familia (41)
- Política (41)
- 5781
- 5782
- 5783
- 5784
- 5785
- 5786
- 5787: Published 2001. Digital journal.
- 5788
- 5789
- 5790: by Levy, Frank. Published 2004. Biblioteca Universitat Ramon Llull (Other sources: Biblioteca de la Universidad de Navarra, Biblioteca Universidad de Deusto). Book.
- 5791
- 5792: Published 2024. Table of contents: "…Quantization with GPTQ and EXL2 -- Other quantization techniques -- Summary -- References -- Chapter 9: RAG Inference Pipeline -- Understanding the LLM twin's RAG inference pipeline -- Exploring the LLM twin's advanced RAG techniques -- Advanced RAG pre-retrieval optimizations: query expansion and self-querying -- Query expansion -- Self-querying -- Advanced RAG retrieval optimization: filtered vector search -- Advanced RAG post-retrieval optimization: reranking -- Implementing the LLM twin's RAG inference pipeline -- Implementing the retrieval module -- Bringing everything together into the RAG inference pipeline -- Summary -- References -- Chapter 10: Inference Pipeline Deployment -- Criteria for choosing deployment types -- Throughput and latency -- Data -- Understanding inference deployment types -- Online real-time inference -- Asynchronous inference -- Offline batch transform -- Monolithic versus microservices architecture in model serving -- Monolithic architecture -- Microservices architecture -- Choosing between monolithic and microservices architectures -- Exploring the LLM Twin's inference pipeline deployment strategy -- The training versus the inference pipeline -- Deploying the LLM Twin service -- Implementing the LLM microservice using AWS SageMaker -- What are Hugging Face's DLCs? -- Configuring SageMaker roles -- Deploying the LLM Twin model to AWS SageMaker -- Calling the AWS SageMaker Inference endpoint -- Building the business microservice using FastAPI -- Autoscaling capabilities to handle spikes in usage -- Registering a scalable target -- Creating a scalable policy -- Minimum and maximum scaling limits -- Cooldown period -- Summary -- References -- Chapter 11: MLOps and LLMOps -- The path to LLMOps: Understanding its roots in DevOps and MLOps -- DevOps -- The DevOps lifecycle -- The core DevOps concepts -- MLOps…" E-book.
- 5793
- 5794
- 5795
- 5796
- 5797
- 5798
- 5799: Published 1997. Book.
- 5800: by Klein, David M., 1943-. Published 1996. Biblioteca Universidad de Deusto (Other sources: Biblioteca de la Universidad Pontificia de Salamanca). Book.