Course Overview:

This course provides a comprehensive introduction to developing applications with Large Language Models (LLMs). Learners will explore how LLMs transform enterprise workflows, improve productivity, and power innovative solutions. Through hands-on labs, participants will work with HuggingFace, Transformers, LangChain, and multimodal architectures to build end-to-end applications. The course equips learners with practical skills to integrate LLMs into data science workflows and achieve efficiency at scale. 

After completing this course, participants will be able to:

  • Find, pull, and experiment with models from the HuggingFace model repository using the Transformers API.
  • Use encoder models for semantic analysis, embeddings, classification, and zero-shot learning.
  • Apply GPT-style decoder architectures for text generation and translation tasks.
  • Build multimodal applications combining text, audio, and visual data.
  • Scale text generation and deploy LLM-based applications effectively.
  • Orchestrate workflows with LangChain and agentic frameworks to build enterprise-ready AI systems.


Key Features:

  • Hands-On Learning with Real Models 

    • Practice with HuggingFace, Transformers, LangChain, and multimodal architectures. 
    • Guided labs for building and deploying LLM applications. 
  • End-to-End Workflow Coverage 

    • Covers fundamentals, model training, inference, scaling, and orchestration. 
    • Application-driven learning with enterprise use cases. 
  • Enterprise-Oriented Outcomes 

    • Learn how LLMs can enhance customer engagement, automate workflows, and improve analytics. 
    • Explore safe, effective, and scalable approaches to generative AI. 
  • Instructor-Led Immersive Training 

    • Structured, interactive, live sessions with expert instructors. 
    • Combines theory, practice, and assessments for effective learning. 

Who Should Attend?

  • Data Scientists and ML Engineers
  • AI/ML Developers exploring LLMs for enterprise use cases
  • Solution Architects designing AI-powered workflows
  • Business Analysts and Technical Leaders seeking applied AI knowledge

Prerequisites:

  • Basic understanding of Python programming
  • Familiarity with machine learning concepts (preferred)
  • Prior exposure to NLP is helpful but not mandatory

Why Choose CloudThat as Your Training Partner?

  • Authorized NVIDIA Training Partner with proven expertise in cloud and AI.
  • Industry-Recognized Trainers certified by NVIDIA and experienced in enterprise AI.
  • Hands-On Learning with practical labs and real-world case studies.
  • Customized Learning Paths designed for beginners to advanced AI professionals.
  • Interactive Learning Experience with live sessions, group discussions, and mentorship.
  • Career Support with guidance on AI/ML roles, certification paths, and resume building.
  • Regularly Updated Content aligned with the latest NVIDIA and open-source ecosystem.

Course Outline:

  • Review the workshop topics and schedule.
  • Introduce HuggingFace and the Transformers library.
  • Discuss how LLMs can enhance enterprise applications.
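
  As a taste of the hands-on work, here is a minimal sketch of pulling a model from the HuggingFace Hub with the Transformers pipeline API; the checkpoint name is an illustrative choice, not a course requirement:

      from transformers import pipeline

      # Download an illustrative sentiment model from the HuggingFace Hub
      # and wrap it in a ready-to-use pipeline.
      classifier = pipeline(
          "sentiment-analysis",
          model="distilbert-base-uncased-finetuned-sst-2-english",
      )

      print(classifier("LLMs can streamline enterprise workflows."))
      # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]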

  • Introduce and motivate the transformer-style architecture from deep learning first principles.
  • Understand input-output processing with tokenizers, embeddings, and attention mechanisms.
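
  A brief sketch of how tokenizers, embeddings, and attention surface in code, assuming a small encoder checkpoint such as bert-base-uncased:

      import torch
      from transformers import AutoTokenizer, AutoModel

      tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
      model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

      # Tokenize a sentence into model-ready input IDs.
      inputs = tokenizer("Attention relates every token to every other token.",
                         return_tensors="pt")
      print(inputs["input_ids"])

      # Run the encoder to get contextual embeddings and attention maps.
      with torch.no_grad():
          outputs = model(**inputs)
      print(outputs.last_hidden_state.shape)   # (1, seq_len, hidden_size)
      print(outputs.attentions[0].shape)       # (1, num_heads, seq_len, seq_len)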

  • Profile encoder models and the NLP tasks where they are most useful.
  • Investigate the use of lightweight models for natural language embedding, classification, subsetting, and zero-shot prediction.
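
  For example, zero-shot classification with an off-the-shelf NLI-style checkpoint (facebook/bart-large-mnli is one commonly used option, shown here purely for illustration):

      from transformers import pipeline

      zero_shot = pipeline("zero-shot-classification",
                           model="facebook/bart-large-mnli")

      # Classify a sentence against labels the model was never trained on.
      result = zero_shot(
          "The invoice total does not match the purchase order.",
          candidate_labels=["finance", "logistics", "customer support"],
      )
      print(result["labels"][0], result["scores"][0])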

  • Introduce GPT-style decoder models for sequence generation and autoregressive tasks.
  • Apply encoder-decoder architectures for applications like machine translation and few-shot task completion.
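
  A minimal sketch of both patterns with illustrative checkpoints (gpt2 for autoregressive generation, t5-small for translation):

      from transformers import pipeline

      # GPT-style decoder: autoregressive text generation.
      generator = pipeline("text-generation", model="gpt2")
      print(generator("Large language models can", max_new_tokens=20)[0]["generated_text"])

      # Encoder-decoder model: English-to-French machine translation.
      translator = pipeline("translation_en_to_fr", model="t5-small")
      print(translator("The meeting is scheduled for Monday.")[0]["translation_text"])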

  • Integrate different data modalities (text, images, audio) into LLM workflows.
  • Explore multimodal models like CLIP for cross-modal learning, visual language models for image question-answering, and diffusion models for text-guided image generation.
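
  A short sketch of cross-modal matching with CLIP; the image path and candidate labels are placeholders:

      import torch
      from PIL import Image
      from transformers import CLIPModel, CLIPProcessor

      model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
      processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

      # Score an image against free-form text labels.
      image = Image.open("example.jpg")            # placeholder image file
      labels = ["an invoice", "a product photo", "a floor plan"]
      inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)

      with torch.no_grad():
          logits = model(**inputs).logits_per_image
      print(dict(zip(labels, logits.softmax(dim=-1)[0].tolist())))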

  • Explore LLM inference challenges and deployment strategies, including optimized server deployments.
  • Incorporate LLMs into applications that can scale to larger repositories and user bases.
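
  One simple way to put a generation model behind an HTTP endpoint, sketched here with FastAPI; the module covers how production deployments layer on batching, optimized inference servers, and GPU-aware scaling:

      from fastapi import FastAPI
      from pydantic import BaseModel
      from transformers import pipeline

      app = FastAPI()
      generator = pipeline("text-generation", model="gpt2")   # illustrative checkpoint

      class Prompt(BaseModel):
          text: str
          max_new_tokens: int = 50

      @app.post("/generate")
      def generate(prompt: Prompt):
          # Generate a completion for the submitted prompt.
          out = generator(prompt.text, max_new_tokens=prompt.max_new_tokens)
          return {"completion": out[0]["generated_text"]}

      # Run with: uvicorn app:app --host 0.0.0.0 --port 8000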

  • Introduce LangChain for LLM orchestration and agentic workflows.
  • Investigate the use of agents and tool calling to integrate natural language with standard applications and data.
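
  A minimal LangChain sketch (prompt, model, output parser composed into a chain); the ChatOpenAI wrapper and model name are assumptions for illustration, and the same pattern extends to tool-calling agents:

      from langchain_core.prompts import ChatPromptTemplate
      from langchain_core.output_parsers import StrOutputParser
      from langchain_openai import ChatOpenAI

      prompt = ChatPromptTemplate.from_template(
          "Summarize the following support ticket in one sentence:\n{ticket}"
      )
      llm = ChatOpenAI(model="gpt-4o-mini")    # assumed provider/model for illustration
      chain = prompt | llm | StrOutputParser()

      print(chain.invoke(
          {"ticket": "Customer cannot reset their password after the latest update."}
      ))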

  • Build an LLM-based application integrating text generation, multimodal learning, and agentic orchestration.

  • Review key learnings and answer final questions.
  • Earn a certificate upon successful completion.
  • Complete the workshop survey.

Certification Details:

Learners will receive a CloudThat–NVIDIA Certificate of Completion after successfully completing the course and final project.
