Large Language Model Operations (LLMOps) — Duke University Coursera Specialization
2026-04-17
Six courses covering LLMOps end-to-end — generative AI foundations, Azure LLM operations, advanced data engineering, AWS GenAI, Databricks-to-local deployment, and the open-source LLMOps stack. The Duke University curriculum for engineers shipping LLMs to production.
What You Will Build
A production LLMOps pipeline: fine-tuned and prompt-engineered LLMs on Azure and AWS, advanced data engineering for retrieval and training pipelines, Databricks-hosted LLM workloads, and open-source deployment patterns for local and edge inference.
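One building block of that pipeline is vector retrieval: embedding documents and ranking them by similarity to a query. Here is a minimal sketch in plain Python; the `embed` function is a toy character-frequency stand-in for a real embedding model, and everything else (function names, corpus) is illustrative rather than taken from the courses:

```python
import math

def embed(text: str) -> list[float]:
    # Toy stand-in for a real embedding model: a normalized
    # character-frequency vector. A production pipeline would call
    # an embedding model and persist vectors in a vector store.
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    counts = [text.lower().count(ch) for ch in alphabet]
    norm = math.sqrt(sum(c * c for c in counts)) or 1.0
    return [c / norm for c in counts]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-normalized, so the dot product
    # is the cosine similarity.
    return sum(x * y for x, y in zip(a, b))

def top_k(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "fine-tuning large language models",
    "kubernetes cluster networking",
    "prompt engineering for LLMs",
]
print(top_k("language model prompts", docs))
```

The same shape (embed, score, take top-k) underlies the retrieval and RAG layers the data engineering course covers; real systems swap in a learned embedding model and an indexed vector store.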
Courses in This Specialization
- Introduction to Generative AI — Foundation models, transformer architecture, and the generative AI landscape.
- Operationalizing LLMs on Azure — Azure OpenAI, Azure ML, and production LLM deployment patterns.
- Advanced Data Engineering — Retrieval pipelines, vector stores, and the data layer that LLMs need.
- GenAI and LLMs on AWS — Bedrock, SageMaker, and AWS-native LLM operations.
- Databricks to Local LLMs — From lakehouse-hosted LLM training to local and edge inference.
- Open Source LLMOps Solutions — The open-source stack: vLLM, Ollama, llama.cpp, and friends.
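Both vLLM and Ollama can expose an OpenAI-compatible HTTP API, so one pattern the last two courses point toward is swapping a hosted endpoint for a local one by changing only the base URL. A sketch of building such a request, assuming a local vLLM-style server; the URL, model name, and `chat_payload` helper are illustrative, not from the courses:

```python
import json

# Assumption: a local OpenAI-compatible server (vLLM defaults to
# port 8000; Ollama uses a different port). Adjust for your setup.
LOCAL_BASE_URL = "http://localhost:8000/v1"

def chat_payload(model: str, system: str, user: str,
                 temperature: float = 0.2) -> dict:
    """Build an OpenAI-style chat-completions request body."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    }

payload = chat_payload(
    model="llama-3.1-8b-instruct",  # illustrative model name
    system="You are a concise assistant.",
    user="Summarize LLMOps in one sentence.",
)
# POST json.dumps(payload) to f"{LOCAL_BASE_URL}/chat/completions"
# with an HTTP client of your choice; the response shape mirrors
# the hosted OpenAI API.
print(json.dumps(payload, indent=2))
```

Because the request and response shapes match the hosted API, application code written against a cloud endpoint can often be repointed at local or edge inference with minimal change.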
Who This Is For
- ML and MLOps engineers moving from classical ML to LLMs
- Platform engineers building internal LLM platforms
- Data engineers owning the retrieval and RAG layer
Related Specializations
- MLOps | Machine Learning Operations — the MLOps foundations this specialization extends
- Enterprise AI and Data Engineering with Databricks — lakehouse-native LLMOps
- AI Tooling — production AI tooling around LLMs