Data Engineering: Apache Airflow

GDG Cloud Twin Cities
Mon, May 2, 8:00 PM (CDT)

55 RSVP'ed

About this event

Ever have the right data, but in the wrong place?  Get it where it needs to be by using Apache Airflow workflows in Cloud Composer.

Your data scientists will thank you for having the right data where they need it to be!


You might not have ever worked with Apache Airflow, but that's perfectly fine — this is a great chance to learn more about it and how it works in Google Cloud Platform.

At this event you will get to practice working with data using Apache Airflow workflows in Cloud Composer to help you prepare for the Google Cloud Certified Professional Data Engineer Certification.


This week's hands-on lab:

Cloud Composer: Copying BigQuery Tables Across Different Locations

In this advanced lab you will create and run an Apache Airflow workflow in Cloud Composer that exports tables from a BigQuery dataset in the US to Cloud Storage buckets in Europe, then imports those tables into a BigQuery dataset in Europe.


We'll work through the lab to get hands-on experience, and when you complete a whole quest, you earn a badge for your LinkedIn profile!


Learn data engineering skills one week at a time.

All upcoming events:

https://gdg.community.dev/gdg-cloud-twin-cities/#upcoming-events

#googlecloudplatform #gcp #event #data #dataengineering


Host

  • Jeff Williams

    UC Berkeley

    Google Tech Lead

Organizers