AI-110: Intro to Large Language Models (LLMs) and LLM-Based Apps

Course Description: Artificial intelligence has become a key area for IT professionals and engineers over the past two decades, driven by scientific breakthroughs and practical applications of deep learning and, more recently, of generative AI systems, especially Large Language Models (LLMs) such as those powering ChatGPT. Understanding the concepts and practical usage of AI systems in general, and of LLMs in particular, is therefore becoming essential for IT and other technical professionals, as well as for managers with a technical background.

This training focuses on Large Language Models (LLMs) and the applications they make possible, offering
insight into their theory and practice, namely:

  • Introduction to LLM-based applications
  • The Transformer architecture, the foundation of all LLMs
  • The 3-phase training process of LLMs (Pre-training, Fine-tuning, RLHF)
  • Prompt engineering
  • Using your own data sources with Retrieval-Augmented Generation (RAG)
  • Creating LLM chains with LangChain
  • Web interfaces for LLMs (Gradio or Streamlit)

Besides gaining a basic understanding of the theory of Large Language Models (LLMs) and of the other technologies used in LLM-based applications, students will be able to examine their features and experiment with them during the instructor's demonstrations and the hands-on lab exercises.
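To give a flavor of the RAG topic covered in the labs, the sketch below illustrates the core idea in plain Python: retrieve the document most relevant to a question, then prepend it as context to the prompt sent to an LLM. This is an illustrative toy (the documents, bag-of-words retrieval, and prompt template are simplified stand-ins, not course material); real applications use embedding models and vector stores, e.g. via LangChain.

```python
# Toy Retrieval-Augmented Generation (RAG) sketch: bag-of-words retrieval
# plus prompt assembly. Illustrative only; real systems use embeddings.
from collections import Counter
import math

# A tiny stand-in "knowledge base" of documents.
docs = [
    "LangChain composes LLM calls into chains.",
    "Gradio builds quick web interfaces for ML demos.",
    "RLHF fine-tunes a model from human preference rankings.",
]

def bow(text):
    """Bag-of-words vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(question):
    """Return the document most similar to the question."""
    q = bow(question)
    return max(docs, key=lambda d: cosine(q, bow(d)))

def build_prompt(question):
    """Assemble the augmented prompt an LLM would receive."""
    context = retrieve(question)
    return f"Context: {context}\nQuestion: {question}\nAnswer:"

print(build_prompt("What does Gradio do?"))
```

In a real application, the assembled prompt would then be sent to an LLM; the retrieval step is what lets the model answer from your own data rather than only from its training set.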

Course Length: 8 training hours

Structure: 50% lecture, 25% demonstration by the instructor, 25% hands on lab exercises

Target audience: Technical managers as well as IT and telco professionals who want to familiarize themselves with LLMs and LLM-based applications

Prerequisites: General understanding of and experience in IT systems and/or IT development

This training is part of our AI portfolio, which explores essential AI topics, such as: