
Principal Cloud Architect (GCP)

 

As Google Cloud's premier partner in data and analytics, we provide world-class businesses with cutting-edge data solutions in the cloud.

We help clients push leading technology to its limits by combining our expertise in machine learning, data engineering, and analytics. With Google Cloud as our foundation, we help businesses future-proof their solutions, deepen their understanding of consumers, increase competitive advantage, and unlock operational efficiencies.

Our team consists of experts in machine learning, data science, software engineering, mathematics, and design. We share a passion for data & analysis, operate at the cutting edge, and believe in a pragmatic approach to solving hard problems.

 

The role:

As Cloud Architect at Datatonic, you will shape the architecture for key projects. You'll engage with our customers and prospects to provide pre- and post-sales architectural advice and thought leadership for machine learning, analytics, and data migration projects. In addition, you will help us build out the architectural side of our next-generation machine learning products.

This is an excellent opportunity for an experienced cloud technology professional who would like to work as a subject matter expert in solution/cloud architecture alongside a team of experts in AI and data. The role is customer-facing, working closely with business and technical influencers, and requires a background in Computer Science. Successful candidates will bring multiple years' experience with cloud platforms such as Google Cloud or AWS. Ideally, you'll have served as an SME designing and migrating applications to the cloud, have worked with complex cloud deployments, and be adept at designing and implementing complex platforms.

 

As Cloud Architect at Datatonic, you will:

  • Work with the most innovative and scalable data processing and cloud technologies
  • Build innovative state-of-the-art solutions with our customers
  • Lead large-scale enterprise design and implementation for data and AI projects
  • Be a thought leader and bring in new innovative ideas and technical approaches for Datatonic to adopt
  • Play a key role in shaping the architecture team at Datatonic
  • Support our sales teams when engaging with new customers/projects, providing technical input to help shape and scope solutions for our clients' most challenging needs
  • Support our GTM teams in defining new solutions to take to the market by providing technical leadership and governance
  • Execute architecture reviews for some of our key customers. Identify and share recommendations with senior client stakeholders
  • Provide technical governance and define best practices to be embedded into project delivery
  • Contribute to our internal knowledge base, helping us develop collateral and thought leadership
  • Work in an agile and dynamic environment alongside a small team of data scientists, machine learning experts, data analysts, and data engineers
  • Be a leader and mentor to other team members
  • Work closely with our tech partners: Google Cloud, Tableau, Looker

 

Requirements:

  • BSc or MSc degree in Computer Science or a related technical field
  • Proven experience building big data cloud architecture
  • The ability to take ownership from end-to-end, finding creative solutions
  • Demonstrated strong analytical and technical capabilities, with an innovative edge
  • Exceptional written and verbal communication skills, with great attention to detail
  • Able to present concepts in an authoritative and clear manner to customers through white-boarding, presentations, and proposals
  • Ability to develop and maintain relationships with key external stakeholders at various business levels
  • Multiple years' experience in technical pre-sales, able to evangelise disruptive proposals
  • Holistic experience designing and using end-to-end, production-grade cloud technologies across all areas (e.g. data, security, networking)
  • Multiple years of hands-on experience with container technologies
  • Multiple years of programming experience, ideally in Python, Java and SQL
  • Experience building scalable, high-performance code

Bonus Points:

  • Google Cloud or AWS Certified Solutions Architect certification
  • Experience with ETL tools, Hadoop-based technologies (e.g. Spark) and/or batch/streaming data pipelines (e.g. Beam, Flink)
  • Experience designing and building data lake and data warehouse solutions using technology such as BigQuery, Azure Synapse, Redshift, Oracle, or Teradata
  • Experience designing and building analytical products using technology such as Looker, Tableau, Data Studio, Power BI, or Qlik
  • Experience with Agile methodologies such as Scrum
  • Basic knowledge of and ideally some experience with data science topics like machine learning, data mining, statistics, and visualisation
  • Contributions to open-source projects
