Data Engineer - High Performance Solutions - Job Opening ID: 563818

We are looking for a data engineer to be our data analytics advocate, fostering data analytics-driven insight and acting as an active participant in the Saint-Gobain worldwide data analytics community.

 

In this role, the data engineer supports data analytics efforts related to business intelligence improvement and the transition of manufacturing processes toward Industry 4.0. She/he works in collaboration with data scientists to scope data analytics projects that support the various businesses of Saint-Gobain, and leads the data engineering efforts of larger-scale projects, i.e., she/he designs the data architecture and carries out the data consolidation, integration, cleaning and structuring.


Primary responsibilities:

  • Integrating, consolidating, cleaning and structuring data for use in analytics applications by data scientists.
  • Conducting meetings with internal customers to identify data analytics needs and scope projects, including big data discovery projects and production deployments.
  • Working with data scientists to define requirements for data format and structure, for both big data discovery and production modes.
  • Autonomously defining the data architecture, setting up data pipelines and data processing, and ensuring the sustainability of the solution.
  • Collaborating with central IT services and/or external vendors to ensure successful and sustainable developments.
  • Summarizing project approach and results in concise and accurate technical memos.
  • Communicating project strategy and results to internal customers and management through formal presentations.
  • Being an active member of the Saint-Gobain worldwide data analytics community through participation in tech days and sharing of best practices.

COMPETENCIES:

  • Ability to develop relationships with others and to explain potentially complex technical concepts to non-technical colleagues.
  • Ability to communicate efficiently, orally and by email, with people from a wide variety of backgrounds.
  • Demonstrated ability to write concise and accurate technical documents and to give strong technical presentations to a wide variety of audiences.
  • Ability and desire to self-learn and pick up emerging technologies.
  • Strong team player with creative thinking skills, and a desire to “make a difference”.

REQUIREMENTS:

  • Bachelor’s degree with 7+ years of experience, MS with 3+ years of experience, or entry-level Ph.D. in Computer Science, Applied Mathematics or Information Technology.
  • Data engineering experience with significant business exposure: 5+ years required.
  • Enterprise data warehousing: 3+ years required.
  • Hadoop and other big data frameworks, leveraging any one of the Hadoop distributions: 3+ years required.
  • Underlying infrastructure (e.g., cloud, Hadoop, NAS, MPP, SAN): 3+ years required.
  • Demonstrated experience in developing and implementing proof-of-concept and production-ready data pipelines and Hadoop processing solutions required.
  • Demonstrated experience with the following applications/languages required: Spark, Hive, Impala, Solr, HDFS, NoSQL, Kafka, Sqoop, Oozie, Python, R, MATLAB, and at least one ETL/ELT tool.
  • Familiarity with smart manufacturing concepts and related data architecture preferred.
  • Familiarity with modern SCADA systems such as Ignition, Docker containers, and Microsoft Azure technologies preferred.
  • Ability to travel 25% of the time required.

 

 
