DevOps Pro Europe 2022

May 30 - June 3

Workshops

Online

May 24 - 26

Conference

Online

Adam Dewberry

Position: Data Engineer

Company: Infinity Works

Country: UK

Biography

Adam Dewberry is a consultant data engineer, specialising in building cloud data platforms. Along with engineering, he is responsible for up-skilling customers’ staff, working alongside them to help them own and expand their new data infrastructure.

In a pre-Covid world, he would often be found hitch-hiking across the Balkans and eastern Europe; in a peri-Covid world, he’s learning to skateboard.

Workshop

How to rapidly deploy an AWS & Snowflake cloud data platform at scale

This hands-on workshop will teach you how to create and roll out a scalable data platform in AWS and Snowflake using Terraform. The session will cover how to build, through code, all the cloud infrastructure required to automate data ingestion and exports between AWS and Snowflake; set up deployment pipelines; and finally connect an analytics dashboard to derive insights and instant value.

Agenda

  • Part 1 Snowflake in the console – creating a simple data warehouse with imports and exports
    • Setting up a Snowflake account
    • Creating resources:
      • Users & roles
      • Databases, schemas & tables
    • Getting local data into Snowflake
    • Querying data in Snowflake
    • Getting data out of Snowflake to your local machine
    • Automating data ingestion from S3
      • Account integrations: connecting Snowflake to AWS S3
      • IAM Roles & Policies
      • Stages: where the cloud source data lives
      • Landing data with Snowpipe
        • Pushing data to AWS from the CLI
    • Automating data exports from Snowflake to S3
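
The console flow in Part 1 can be sketched in Snowflake SQL roughly as follows. All object names, the S3 bucket and the IAM role ARN are illustrative placeholders, not the workshop's actual values:

```sql
-- Illustrative sketch only: names, bucket and role ARN are placeholders.
CREATE ROLE IF NOT EXISTS WORKSHOP_ROLE;
CREATE USER IF NOT EXISTS WORKSHOP_USER DEFAULT_ROLE = WORKSHOP_ROLE;
CREATE DATABASE IF NOT EXISTS WORKSHOP_DB;
CREATE SCHEMA IF NOT EXISTS WORKSHOP_DB.RAW;
CREATE TABLE IF NOT EXISTS WORKSHOP_DB.RAW.EVENTS (ID INT, NAME STRING);

-- Get local data into Snowflake via the table's internal stage (run from SnowSQL):
PUT file:///tmp/events.csv @WORKSHOP_DB.RAW.%EVENTS;
COPY INTO WORKSHOP_DB.RAW.EVENTS FROM @WORKSHOP_DB.RAW.%EVENTS
  FILE_FORMAT = (TYPE = 'CSV');

-- Query, then pull results back down to the local machine:
SELECT * FROM WORKSHOP_DB.RAW.EVENTS;
GET @WORKSHOP_DB.RAW.%EVENTS file:///tmp/exports/;

-- Automate ingestion from S3: integration -> external stage -> auto-ingest pipe.
CREATE STORAGE INTEGRATION S3_INT
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/raw/');

CREATE STAGE WORKSHOP_DB.RAW.RAW_STAGE
  URL = 's3://my-bucket/raw/'
  STORAGE_INTEGRATION = S3_INT;

CREATE PIPE WORKSHOP_DB.RAW.EVENTS_PIPE AUTO_INGEST = TRUE AS
  COPY INTO WORKSHOP_DB.RAW.EVENTS FROM @WORKSHOP_DB.RAW.RAW_STAGE;
```

The storage integration is what ties the IAM role and policy work to Snowflake: the role must trust the integration's generated AWS principal, and the pipe's auto-ingest relies on S3 event notifications.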

    Part 2 Deploying Snowflake with Terraform

    • The power of infrastructure as code
      • What is Terraform and how does it work?
    • Deploying Snowflake resources through Terraform
    • Setting up the codebase
      • State resources
    • Terraforming resources to automate data ingestion from S3 to Snowflake:
      • Users and roles
      • Databases and schemas
      • Integrations & IAM
      • Stages
      • Pipes & Notifications
    • Automating data exports from Snowflake to S3 with Terraform
    • Terraform modules – have the heavy lifting done for you
    • Snow Cannon
    • Data warehouse deployment speed record
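
The same ingestion chain from Part 1 can be expressed as code. This is a sketch using the community Terraform provider for Snowflake (published under the chanzuckerberg namespace around the time of this workshop); the object names, bucket and role ARN are placeholders:

```hcl
# Illustrative sketch; names, bucket and role ARN are placeholders.
terraform {
  required_providers {
    snowflake = {
      source = "chanzuckerberg/snowflake"
    }
  }
}

resource "snowflake_database" "workshop" {
  name = "WORKSHOP_DB"
}

resource "snowflake_schema" "raw" {
  database = snowflake_database.workshop.name
  name     = "RAW"
}

resource "snowflake_storage_integration" "s3" {
  name                      = "S3_INT"
  type                      = "EXTERNAL_STAGE"
  storage_provider          = "S3"
  storage_aws_role_arn      = "arn:aws:iam::123456789012:role/snowflake-access" # placeholder
  storage_allowed_locations = ["s3://my-bucket/raw/"]                           # placeholder
}

resource "snowflake_stage" "raw" {
  name                = "RAW_STAGE"
  database            = snowflake_database.workshop.name
  schema              = snowflake_schema.raw.name
  url                 = "s3://my-bucket/raw/"
  storage_integration = snowflake_storage_integration.s3.name
}

resource "snowflake_pipe" "events" {
  name           = "EVENTS_PIPE"
  database       = snowflake_database.workshop.name
  schema         = snowflake_schema.raw.name
  auto_ingest    = true
  copy_statement = "COPY INTO WORKSHOP_DB.RAW.EVENTS FROM @WORKSHOP_DB.RAW.RAW_STAGE"
}
```

With the resources in code, a deployment pipeline can run `terraform plan` and `terraform apply` on every merge, which is what makes the "deployment speed record" at the end of Part 2 possible.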

    Part 3 Data Analytics Dashboards with Docker

    • Running Metabase
    • Connecting Snowflake to your dashboard
    • Servicing analytics
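
As a sketch of Part 3, Metabase can be run locally with Docker and then pointed at Snowflake from its admin UI. This compose file is illustrative, not the workshop's exact configuration:

```yaml
# Illustrative sketch: runs the official Metabase image,
# which serves its web UI on port 3000.
version: "3"
services:
  metabase:
    image: metabase/metabase
    ports:
      - "3000:3000"
```

Once the container is up, opening http://localhost:3000 and adding a Snowflake connection (account identifier, warehouse, database, user and role) lets the dashboard query the tables populated in Parts 1 and 2.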

Objectives

The main goal of this workshop is to learn how to use infrastructure as code and automated deployments to build a cloud data platform with AWS and Snowflake. It is a hands-on session in which every participant will ultimately deploy cloud resources that automate data imports and exports from Snowflake.

Target audience

The target audience is anyone interested in building cloud data platforms, particularly with AWS and Snowflake, through the use of infrastructure as code and automated deployment pipelines. The session is open to all who enjoy cloud platform and data engineering. The program requires some familiarity with platform engineering and, more importantly, a good working knowledge of either AWS or Snowflake.

Technical requirements

Accounts to create before the workshop:

  • An AWS account with unrestricted access/deployment privileges that you can use in a local session with the AWS CLI.
  • A Snowflake account with the ACCOUNTADMIN role (create a free trial account and choose the Enterprise edition).

OS:

  • Ubuntu or macOS (or another UNIX-based OS) strongly preferred
  • Windows not recommended

Installations:

  • Minimum (not recommended but acceptable):
    • VS Code + the Remote – Containers extension
    • Docker v20.10
  • Preferred:
    • AWS Command Line Interface v2.0 (test connectivity to your AWS account via CLI)
    • Docker v20.10
    • Python >= 3.6
      • botocore>=1.17
      • boto3>=1.12
    • Terraform v0.13
    • SnowSQL CLI v1.2 (test connectivity to your Snowflake account via CLI)

Technical knowledge:

  • AWS: working knowledge of S3, IAM roles and policies. DynamoDB is a bonus.
  • Previous use of Snowflake is preferred but not essential.
  • Working knowledge of Terraform is strongly preferred.
  • Working knowledge of Docker is preferred but not essential.

Trainer

Adam Dewberry is a Consultant Data Engineer at Infinity Works, a digital transformation consultancy. He holds a Master’s degree in physics from the University of Sussex and specialises in building cloud data platforms. Adam began his career as a Data Analyst in the wine and automotive industries and quickly took to engineering, automation and later platform engineering, at which point he transitioned to Data Engineering, working primarily with AWS & Snowflake.

He has a passion for sprinkling software and DevOps tooling and best practices on data platforms, working with clients to design and implement cloud migrations, model and design data warehouses, create ELT processes and automate them end to end. Adam is passionate about teaching and code reusability, and is a GitOps evangelist.

Following the successful reception of his talk at the Snowflake Cloud Data Summit 2020, he wishes to share the lessons learnt and the practical skills gained in building scalable data platforms with a GitOps & DevOps approach.

Outside of work, Adam teaches a data engineering course for career changers.