Summary: Welcome to the captivating world of Large Language Models (LLMs)! This course guides you through the LLM landscape using both no-code and low-code solutions powered by Python. We begin by setting up the software needed to run these models locally, so you can explore open-source models such as Google's Gemma:7b and Llama 3 without an internet connection. In the advanced section, you will use Python to stream responses from open-source LLMs and to integrate with the official OpenAI API. The course also covers low-code solutions for streaming ChatGPT responses and introduces function calling with ChatGPT models. You'll learn how to interface the Llama 3 model with Python for web applications and how to fetch responses in both batch and streaming modes, and together we will build a small Streamlit web frontend for our streaming LLM API. By the end of the course, you'll be able to run open-source LLMs locally and leverage their capabilities to the fullest.
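To give a flavor of the hands-on material, the sketch below shows one common way to stream tokens from a locally running open-source model with Python, using the Ollama client library. The model tag `llama3` and the prompt are illustrative, and the example assumes Ollama is running locally with that model already pulled.

```python
# Minimal sketch: stream a response from a local open-source model via Ollama.
# Assumes `pip install ollama` and that `ollama pull llama3` has been run.
import ollama

stream = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Explain what a large language model is."}],
    stream=True,  # yield the answer chunk by chunk instead of one batch response
)

for chunk in stream:
    # each chunk carries a small piece of the generated text
    print(chunk["message"]["content"], end="", flush=True)
print()
```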
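Streaming from the official OpenAI API follows a similar pattern. The sketch below uses the OpenAI Python SDK (v1+); the model name and prompt are chosen purely as examples, and an `OPENAI_API_KEY` is assumed to be set in the environment.

```python
# Minimal sketch: stream a ChatGPT-style response through the official OpenAI SDK.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

stream = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Say hello in three languages."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:  # some chunks carry no text (e.g. the final one)
        print(delta, end="", flush=True)
print()
```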
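Finally, the kind of small Streamlit frontend built toward the end of the course can be sketched roughly as follows; the model tag and widget labels are illustrative, and `st.write_stream` assumes a reasonably recent Streamlit release.

```python
# Minimal sketch: a Streamlit page that streams tokens from a local model via Ollama.
import ollama
import streamlit as st

st.title("Local LLM chat")

prompt = st.text_input("Ask the model something")

if prompt:
    def token_stream():
        # forward each streamed chunk's text to Streamlit as it arrives
        for chunk in ollama.chat(
            model="llama3",
            messages=[{"role": "user", "content": prompt}],
            stream=True,
        ):
            yield chunk["message"]["content"]

    st.write_stream(token_stream())
```

Saved as, say, `app.py`, this would be launched with `streamlit run app.py`.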