
Overview

Project Scope

The scope of this project is to build a data warehouse on Google Cloud Platform that helps answer common business questions and power dashboards. To do that, a conceptual data model and a data pipeline will be defined.

Architecture

Data is uploaded to a Google Cloud Storage bucket. GCS acts as the data lake where all raw files are stored. Data is then loaded into staging tables on BigQuery. The ETL process takes data from those staging tables and creates data mart tables. An Airflow instance can be deployed on Google Compute Engine or locally to orchestrate the pipeline.
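
As a concrete illustration of this flow, the sketch below uploads a raw file to the lake and loads it into a staging table. It is a minimal sketch, assuming the google-cloud-storage and google-cloud-bigquery client libraries; the staging table name (AIRPORTS) is hypothetical, while the bucket and dataset names mirror those used later in this README.

# Minimal sketch of the GCS -> BigQuery staging flow (illustrative names).
from google.cloud import bigquery, storage

# Upload a raw CSV file to the data lake bucket
storage_client = storage.Client()
bucket = storage_client.bucket("cloud-data-lake-gcp")
bucket.blob("airports/airport-codes_csv.csv").upload_from_filename("airport-codes_csv.csv")

# Load the raw file from GCS into a BigQuery staging table
bq_client = bigquery.Client()
job = bq_client.load_table_from_uri(
    "gs://cloud-data-lake-gcp/airports/airport-codes_csv.csv",
    "cloud-data-lake.IMMIGRATION_DWH_STAGING.AIRPORTS",  # project.dataset.table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    ),
)
job.result()  # wait for the load to finish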

Here are the justifications for the technologies used:

  • Google Cloud Storage: acts as the data lake; massively scalable object storage for all the raw files.
  • Google BigQuery: acts as the database engine for the data warehouse, data marts, and ETL processes. BigQuery is a serverless solution that can easily and effectively process petabyte-scale datasets.
  • Apache Airflow: orchestrates the workflow by issuing command-line calls to load data into BigQuery and SQL queries for the ETL process. Airflow does not process any data itself, which allows the architecture to scale.

Data Model

The database is designed following star-schema principles, with 1 fact table and 7 dimension tables. A sample query against this schema is shown after the table list below.

[Image: star-schema data model]

  • F_IMMIGRATION_DATA: contains immigration information such as arrival date, departure date, visa type, gender, country of origin, etc.
  • D_TIME: contains dimensions for the date columns
  • D_PORT: contains port_id and port_name
  • D_AIRPORT: contains airports within a state
  • D_STATE: contains state_id and state_name
  • D_COUNTRY: contains country_id and country_name
  • D_WEATHER: contains average weather for a state
  • D_CITY_DEMO: contains demographic information for a city
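
To make the schema concrete, here is a sample analytical query joining the fact table to a dimension, run through the BigQuery Python client. The join key (country_id on the fact table) is an assumption based on the dimension descriptions above, not a verified column name.

from google.cloud import bigquery

client = bigquery.Client()
query = """
    SELECT c.country_name, COUNT(*) AS arrivals
    FROM `cloud-data-lake.IMMIGRATION_DWH.F_IMMIGRATION_DATA` f
    JOIN `cloud-data-lake.IMMIGRATION_DWH.D_COUNTRY` c
      ON f.country_id = c.country_id
    GROUP BY c.country_name
    ORDER BY arrivals DESC
    LIMIT 10
"""
# Print the top 10 countries of origin by number of arrivals
for row in client.query(query).result():
    print(row.country_name, row.arrivals)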

Data pipeline

This project uses Airflow for orchestration.

[Image: Airflow DAG graph view]

A DummyOperator named start_pipeline kicks off the pipeline, followed by 4 load operations. Those operations load data from the GCS bucket into BigQuery staging tables. The immigration_data is loaded as Parquet files while the others are CSV formatted. Row-count check operations run after each load to BigQuery.

Next, the pipeline loads 3 master data objects from the I94 data dictionary. Then the F_IMMIGRATION_DATA table is created and checked to make sure there are no duplicates. The other dimension tables are also created, and the pipeline finishes.
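
The sketch below condenses this structure into a runnable DAG skeleton, assuming the apache-airflow-providers-google package. It shows one of the four load operations and its row-count check; task ids and the check query are illustrative rather than the project's exact code.

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryCheckOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    "cloud-data-lake-pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    start_pipeline = DummyOperator(task_id="start_pipeline")

    # One of the four load operations: Parquet immigration data, GCS -> BigQuery
    load_immigration_data = GCSToBigQueryOperator(
        task_id="load_immigration_data",
        bucket="cloud-data-lake-gcp",
        source_objects=["immigration_data/*.parquet"],
        destination_project_dataset_table="IMMIGRATION_DWH_STAGING.IMMIGRATION_DATA",
        source_format="PARQUET",
        write_disposition="WRITE_TRUNCATE",
    )

    # Row-count check: the task fails if the staging table is empty
    check_immigration_data = BigQueryCheckOperator(
        task_id="check_immigration_data",
        sql="SELECT COUNT(*) FROM `IMMIGRATION_DWH_STAGING.IMMIGRATION_DATA`",
        use_legacy_sql=False,
    )

    start_pipeline >> load_immigration_data >> check_immigration_data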

Scenarios

Data increase by 100x

The current infrastructure can easily support a 100x increase in data size. GCS and BigQuery can handle petabyte-scale data. Airflow is not a bottleneck since it only issues commands to other services.

Pipelines run at 7 AM daily. How would dashboards be updated? Would it still work?

Schedule the DAG to run daily at 7 AM, and set up DAG retries plus email/Slack notifications on failures. Dashboards that query the warehouse tables directly will pick up fresh data after each successful run.
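
A minimal sketch of that configuration, using Airflow's standard default_args mechanism; the alert address and retry counts are placeholders. Slack notifications could be wired in similarly through an on_failure_callback using the Slack provider package.

from datetime import datetime, timedelta

from airflow import DAG

default_args = {
    "retries": 3,                        # re-run failed tasks before alerting
    "retry_delay": timedelta(minutes=5),
    "email": ["alerts@example.com"],     # placeholder address
    "email_on_failure": True,
}

dag = DAG(
    "cloud-data-lake-pipeline",
    default_args=default_args,
    start_date=datetime(2021, 1, 1),
    schedule_interval="0 7 * * *",       # every day at 7 AM
    catchup=False,
)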

Make it available to 100+ people

BigQuery auto-scales, so it can easily handle 100+ people accessing it. If even more people or services need access to the database, we can add steps to write to a NoSQL database that supports horizontal scaling, such as Datastore, Cassandra, or Bigtable.

Project Instructions

GCP setup

Follow these steps:

  • Create a project on GCP
  • Enable billing by adding a credit card (you have free credits worth $300)
  • Navigate to IAM and create a service account
  • Grant the account the Project Owner role and create a key for it. This is convenient for this project but not recommended for a production system. Keep your key somewhere safe.

Create a bucket on your project and upload the data with the following structure:

gs://cloud-data-lake-gcp/airports/:
gs://cloud-data-lake-gcp/airports/airport-codes_csv.csv
gs://cloud-data-lake-gcp/airports/airport_codes.json

gs://cloud-data-lake-gcp/cities/:
gs://cloud-data-lake-gcp/cities/us-cities-demographics.csv
gs://cloud-data-lake-gcp/cities/us_cities_demo.json

gs://cloud-data-lake-gcp/immigration_data/:
gs://cloud-data-lake-gcp/immigration_data/part-00000-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00001-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00002-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00003-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00004-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00005-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00006-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00007-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00008-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00009-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00010-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00011-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00012-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet
gs://cloud-data-lake-gcp/immigration_data/part-00013-b9542815-7a8d-45fc-9c67-c9c5007ad0d4-c000.snappy.parquet

gs://cloud-data-lake-gcp/master_data/:
gs://cloud-data-lake-gcp/master_data/I94ADDR.csv
gs://cloud-data-lake-gcp/master_data/I94CIT_I94RES.csv
gs://cloud-data-lake-gcp/master_data/I94PORT.csv

gs://cloud-data-lake-gcp/weather/:
gs://cloud-data-lake-gcp/weather/GlobalLandTemperaturesByCity.csv
gs://cloud-data-lake-gcp/weather/temperature_by_city.json

You can copy the data to your own bucket by running the following:

gsutil cp -r gs://cloud-data-lake-gcp/ gs://{your_bucket_name}

Local setup

Clone the project, then set up the local environment as follows:

Install Docker if it's not already installed. You can find the resources to do that here.

Install the Astronomer CLI following the instructions here.

Run the following command to bring up the Airflow instance:

astro dev start

You can look at the logs by running make logs if you need to debug something. You can access and manage the pipeline by entering the following address in a browser:

localhost:8080/admin/

If everything is set up correctly, you will see the following screen:

[Image: Airflow UI home screen]

Navigate to Admin -> Connections and paste in the credentials for the following two connections: bigquery_default and google_cloud_default.

[Image: Airflow connection configuration]

Navigate to the main DAG at dags/cloud-data-lake-pipeline.py and change the following parameters to match your own setup:

project_id = 'cloud-data-lake'
staging_dataset = 'IMMIGRATION_DWH_STAGING'
dwh_dataset = 'IMMIGRATION_DWH'
gs_bucket = 'cloud-data-lake-gcp'
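
These values are composed into fully qualified table ids and GCS URIs inside the DAG, along the lines of the following (a hypothetical illustration, not the DAG's exact code):

# Hypothetical illustration of how the parameters combine
destination = f"{project_id}.{staging_dataset}.AIRPORTS"
source_uri = f"gs://{gs_bucket}/airports/airport-codes_csv.csv"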

You can then trigger the DAG and the pipeline will run.

The data warehouse

The final data warehouse looks like this: [Image: data warehouse tables in BigQuery]
