[Build with AI] Running Gemini locally using Docker

NAC, 160 Convent Avenue, New York, 10031

GDG on Campus The City College of New York - New York, United States

Join us for an interactive workshop where you’ll learn how to containerize applications, manage Docker images, and orchestrate containers for scalable development. This session covers both foundational and advanced Docker concepts, including running Large Language Models (LLMs) like Google Gemma 2 locally using Docker and Ollama.

Feb 27, 5:30 – 6:45 PM (UTC)

35 RSVP'd

Key Themes

Cloud, DevOps, Gemini, Web

About this event

Join us for an exciting workshop on Docker with Mofi Rahman, a Senior Developer Relations Engineer at Google and a City College of New York alumnus.

In this workshop, you'll gain hands-on experience with Docker, learning how to containerize applications, manage images, and orchestrate containers efficiently. We'll start with the fundamentals of Docker, covering how containers work, why they are essential for modern development, and how they compare to traditional virtualization.
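To give a feel for the kind of exercise involved, here is a minimal sketch of containerizing a tiny script. The file names and image tag are placeholders for illustration, not the workshop's actual materials.

```bash
# Minimal illustrative Dockerfile for a tiny Python script
# (app.py and the "hello-docker" tag are placeholder names).
cat > Dockerfile <<'EOF'
FROM python:3.12-slim
WORKDIR /app
COPY app.py .
CMD ["python", "app.py"]
EOF

echo 'print("Hello from inside a container!")' > app.py

docker build -t hello-docker .   # build the image from the Dockerfile
docker run --rm hello-docker     # run it, removing the container on exit
```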

Additionally, we will explore how to run Large Language Models (LLMs) locally, including Google Gemma 2 using Docker and Ollama. This will enable you to leverage powerful AI models without relying on cloud services, enhancing privacy and performance for AI-driven applications.
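As a rough preview (not necessarily the exact steps used in the workshop), one common approach is to start the official ollama/ollama image and pull Gemma 2 inside the running container:

```bash
# Start Ollama in a container; the named volume keeps downloaded models
# between runs, and 11434 is Ollama's default API port.
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama

# Download Gemma 2 and chat with it inside the running container.
docker exec -it ollama ollama run gemma2

# The same model is also reachable over Ollama's local HTTP API.
curl http://localhost:11434/api/generate \
  -d '{"model": "gemma2", "prompt": "Why use containers?", "stream": false}'
```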

Through interactive exercises, you'll learn how to:

✅ Set up and run containers using Docker

✅ Build and manage Docker images

✅ Use Docker Compose to manage multi-container applications (see the sketch after this list)

✅ Optimize workflows for local development and production deployments

✅ Install and run Google Gemma 2 LLM locally with Docker & Ollama

✅ Leverage Ollama for efficient AI model execution on your machine
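For the Compose item above, the sketch below shows what a minimal multi-container setup might look like: an Ollama service alongside a stand-in application service. The service names and the placeholder app image are assumptions for illustration, not files from the workshop.

```bash
# Illustrative compose.yaml: an Ollama service plus a stand-in "app"
# service that could reach it at http://ollama:11434 over the Compose
# network (both service names are hypothetical).
cat > compose.yaml <<'EOF'
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama:/root/.ollama
  app:
    image: alpine:3.19
    command: sleep 86400   # placeholder for your own application image
    depends_on:
      - ollama
volumes:
  ollama:
EOF

docker compose up -d   # start both services in the background
docker compose down    # stop and remove them when you're done
```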

By the end of the workshop, you'll have a solid understanding of containerized application development and practical experience running LLMs locally. Whether you're new to Docker or looking to deepen your knowledge, this session is perfect for developers interested in cloud-native technologies, AI applications, and scalable deployments.

Don't miss this opportunity to learn from an industry expert and enhance your skills in both Docker and local AI deployment! 🚀

When

Thursday, February 27, 2025
5:30 PM – 6:45 PM (UTC)

Organizer

  • Daniel Chen

    Organizer
