As a GCP/Cloud Engineer, you will work with the Data Analytics and Technology Advancement Center (TAC) to make things easier, faster and more reliable for thousands of employees and customers. You will help create the infrastructure that powers the analytical insight and data science efforts. You will work alongside the Senior Data Architect to ensure that the data analytics and data science mission is being achieved through tool setup, process creation, training and advocacy.

You will also help drive the strategy of Tredence’s cloud engineering team as we continue to grow and scale. You will work on cutting-edge technology with one of the most recognizable consumer goods brands in the country, with the opportunity to expand this service channel to other new and existing Tredence clients.



We are looking for an analytical, big-picture thinker driven to further the mission of Tredence by delivering technology to internal business and functional stakeholders. You will serve as a leader driving IT strategy to create value across the organization. This Cloud Engineer will be empowered to lead the organization in implementing both top-level, strategic, innovative solutions and the day-to-day tactics that drive efficiency, effectiveness, and value across the entire organization.

You will play a critical role in creating and analyzing deliverables that provide the content needed for fact-based decision making and successful collaboration with business stakeholders. You will analyze, design, and develop best-practice business changes through technology solutions.


  • Minimum 5 years’ experience in IT or professional services, including IT delivery or large-scale IT analytics projects.
  • BS in Computer Science or Engineering.
  • Experience connecting and integrating with at least one of the following platforms: Google Cloud Platform, Microsoft Azure, or Amazon Web Services (AWS), and/or with data-provider APIs such as Facebook or Twitter.
  • Expert knowledge in SQL development.
  • Expertise in building data integration and preparation pipelines using cloud technologies (e.g., SnapLogic, Google Cloud Dataflow, Cloud Dataprep, Python).
  • Experience designing and building medium- to large-scale data-centric applications.



  • Experience creating end-to-end data pipelines with Apache Beam, Google Cloud Dataflow, or Apache Spark.
  • Experience with some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
  • Experience programming in Java, Python, etc.
  • Expertise in at least two of the following: relational databases, analytical databases, and NoSQL databases.