DAT-016 Cloud Data Architect

Canada-wide Remote

Our Client is a recognized and certified Managed Service Provider (MSP) for AWS, Azure, and GCP. They were positioned as a Leader in the 2020 Gartner Magic Quadrant report for Public Cloud Infrastructure Professional and Managed Services. They have more than 1,000 customers worldwide and help them gain a competitive edge through the cloud.

Job Description:

  • Design, build, and implement complex data solutions in the cloud (AWS, GCP, and/or Azure).
  • Understanding of on-premises and cloud data platforms: databases, data marts, data hubs, data warehouses, and data lakes.
  • Ability to consult on data governance, data management, data life cycle and ETL performance improvements.
  • Overall architecture design, performance tuning, cost management, and implementation of Big Data analytical solutions within GCP, AWS, and/or Azure cloud environments.
  • Provide thought leadership in defining architecture, data, and analytics strategies on any of the three major clouds (GCP, AWS, and Azure; GCP and AWS experience preferred).
  • Implement end-to-end data analytics solutions, from data ingestion, processing, quality, and aggregation through semantic views and visualization, for large-scale, complex, and diverse client environments.
  • Build a modern data analytics platform with self-service BI and data discovery capabilities, leveraging frameworks and solutions built for business analytics in the cloud.
  • Lead a team of data engineers in designing, developing, testing, and deploying high-performance data analytics solutions.
  • Communicate complex technical topics to non-technical business stakeholders and senior executives, and assist with scoping, architecting, and selling cloud data solutions.
  • Provide thought leadership on data management through collateral and whitepapers.

Requirements

  • B.A./B.S. degree required
  • Google Cloud Data Engineer Certification preferred
  • 5+ years of relevant architecture and delivery experience as a data warehouse architect
  • 5+ years of experience leading a development team: mentoring, reviewing code, and assigning work
  • 3+ years delivering successful data analytics projects on cloud data/analytics platforms
  • Experience estimating, planning, configuring, and implementing big data solutions in a cloud environment (AWS, Azure, or GCP)
  • Ability to translate business challenges and use cases into Data Analytics solutions
  • 3+ years of ETL/ELT/Python programming experience, including pipeline orchestration tools, logging, monitoring and alerting, deep troubleshooting, and automated testing and deployments (CI/CD)
  • Demonstrated experience working in all phases of the full data analytics development life cycle: requirements, architecture, design, testing, and deployment
  • Ability to collaborate with client technical executives on strategy, project portfolio management, best practices
  • Strong presentation skills: able to author slides and sell a technical solution to clients
  • Experience with, or exposure to, one or more of the following: Apache Beam, GCP Cloud Dataflow, GCP Dataprep, Trifacta, GCP Data Fusion, CDAP, GCP Dataproc, AWS Data Pipeline, AWS Glue
  • Prior experience with Python and/or Java, developing complex SQL queries, and working with relational database technologies
  • Experience using cloud data warehouse and computing technologies such as BigQuery, Snowflake, Redshift, or Synapse
  • Experience with modern BI/analytics platforms such as Looker, Tableau, and Power BI
  • Experience developing complex technical and ETL programs within a Hadoop ecosystem.
  • Strong understanding of cloud technology (especially GCP and AWS), with broad knowledge of infrastructure and of GCP and AWS ecosystem technologies
  • Understanding of, and some experience with, professional services and consulting engagements