In this hands-on workshop, participants will learn how to leverage Hugging Face's tools to build and deploy Natural Language Processing (NLP) models using foundation models such as BERT and Phi-3.5. We will cover the key concepts of pre-trained transformers, tokenization, and fine-tuning models for tasks such as sequence classification.
Whether you're new to NLP or looking to deepen your understanding of transformer-based architectures, this workshop offers an interactive introduction to Hugging Face's ecosystem. Participants will explore model inference, customization, and practical techniques for efficient deployment.
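As a taste of the topics above, here is a minimal sketch of tokenization and sequence classification with Hugging Face's `transformers` library. The model choices (`bert-base-uncased` for the tokenizer and the pipeline's default sentiment model) are illustrative assumptions, not the workshop's exact materials:

```python
from transformers import AutoTokenizer, pipeline

# Load a pre-trained BERT tokenizer (downloads the vocabulary on first use).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Tokenization: text is split into subword units the model understands.
tokens = tokenizer.tokenize("Transformers make NLP easier.")
print(tokens)

# Inference: a ready-made sequence-classification pipeline backed by a
# model already fine-tuned for sentiment analysis.
classifier = pipeline("sentiment-analysis")
result = classifier("I loved this workshop!")[0]
print(result["label"], round(result["score"], 3))
```

Fine-tuning these pre-trained models on your own labeled data, rather than using a ready-made pipeline, is one of the hands-on exercises the workshop covers.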
| Workshop Information | |
| --- | --- |
| Prerequisites | Proficiency in Python and PyTorch; some machine learning and deep learning background; some knowledge of transformers and fine-tuning |
Note: Workshops may only be attended by students currently enrolled at UW-Madison. Faculty, staff, and former students are not eligible to attend STS sessions.