Spring 2026, University of Virginia. This graduate seminar explores the security of large language models and AI systems. Topics include adversarial attacks, jailbreaking, prompt injection, data poisoning, safety alignment, guardrails, and secure deployment architectures. Students will critically analyze recent research from top-tier venues (IEEE S&P, USENIX Security, CCS, NDSS, NSDI, ASPLOS) and leading industry labs, present papers, and conduct original research projects.
The course features guest lectures and hands-on workshops led by industry practitioners at Google, Cisco, IBM, Dreadnode, and other leading organizations working on AI security.
Course Topics
Background (Weeks 1–3)
- Week 1 (Jan 13–15): Course Overview and Introduction to LLM Security
- Week 2 (Jan 20–22): Transformer Architecture and LLM Interfaces
- Week 3 (Jan 27–29): Security Foundations (Adversarial Machine Learning and LLM Threat Landscape)
Attacks & Threats (Weeks 4–7)
- Week 4 (Feb 3–5): Jailbreaking Techniques and Advanced Prompt Evasion
- Week 5 (Feb 10–12): Data Poisoning, Training-Time Attacks, and Privacy/Model Extraction
- Week 6 (Feb 17–19): Red-Teaming, Adversarial Benchmarking
- Week 7 (Feb 24–26): Attacking AI Agents
Spring Break
- Week 8 (Feb 28 – Mar 8)
Defenses & Evaluation (Weeks 9–11)
- Week 9 (Mar 10–12): Introduction to LLM Defense, Safety Alignment, and Secure Fine-Tuning
- Week 10 (Mar 17–19): Guardrails, Runtime Protection, and Secure-by-Design Architectures
- Week 11 (Mar 24–26): Multi-Agent Security, Watermarking, and Security Benchmarking
Security Applications (Weeks 12–14)
- Week 12 (Mar 31 – Apr 2): AI Agents for Security and Autonomous Penetration Testing
- Week 13 (Apr 7–9): Vulnerability Discovery, Fuzzing, and Threat Detection
- Week 14 (Apr 14–16): Threat Hunting, Forensics, and Supply Chain Security
Project Presentations
- Week 15 (Apr 21–23)
Course Information
Instructor
- 📧 hassan@virginia.edu
- 🏢 Rice Hall 522
- 🕐 Office Hours: Thursdays 4:00 PM - 5:00 PM EST or by appointment
Teaching Assistant
- 📧 mshoaib@virginia.edu
- 🏢 442
- 🕐 Office Hours: Wednesdays 3:30 PM - 5:00 PM EST
Logistics
- Class Time: Tuesday & Thursday, 5:00 PM - 6:15 PM EST
- Location: Rice Hall 340
- Prerequisites: No formal prerequisites. A background in computer security and operating systems (CS4630, CS4414) is helpful.
Communication Channels
Grading
We will calculate your course grade based on these components:
| Component | Weight | Description |
|---|---|---|
| Participation | 10% | Attend class and contribute to discussion. |
| Reading Responses | 10% | Short written reflections on assigned papers, submitted before class. |
| Paper Presentations | 15% | Present research papers and lead class discussion. Expect at most two presentations per student over the semester. |
| Red Teaming Homework | 15% | Hands-on assignments where you will attempt to jailbreak or bypass safety mechanisms in LLMs, document your methodology, and analyze the results. |
| Course Project | 50% | Conduct original research in AI security, culminating in a conference-style paper. Work individually or in pairs; larger groups require proportionately more work. The grade is based on milestone artifacts and the final report. |
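The components above combine as a simple weighted sum. A minimal sketch of the calculation (the component scores below are hypothetical examples, not real grades):

```python
# Weights from the grading table; each component score is on a 0-100 scale.
WEIGHTS = {
    "participation": 0.10,
    "reading_responses": 0.10,
    "paper_presentations": 0.15,
    "red_teaming_homework": 0.15,
    "course_project": 0.50,
}

def course_grade(scores: dict[str, float]) -> float:
    """Weighted average of component scores."""
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Hypothetical example scores for illustration only.
example = {
    "participation": 95,
    "reading_responses": 90,
    "paper_presentations": 88,
    "red_teaming_homework": 92,
    "course_project": 85,
}
print(course_grade(example))  # 88.0
```

As the 50% weight suggests, the course project dominates the final grade.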