Data Engineer


@ Stuller


  • The Data Engineer develops and maintains data pipelines from our various data sources to our cloud data warehouse and models that data in a variety of formats to support Stuller’s business intelligence and data science objectives. 
  • This role provides the opportunity to perform end-to-end development of data solutions using a modern data engineering technology stack, incorporating best practices from software development including automated testing, continuous integration, and continuous deployment. 

Essential Duties & Responsibilities:  

  • Define, design, and implement data pipelines from ingestion to consumption using a varied toolset that might include Fivetran, AWS Glue, Python, Scala, and dbt 
  • Architect data models optimized for business intelligence consumption 
  • Identify and expose data pipeline risks, issues, and dependencies 
  • Build and follow modern data security and data governance principles 
  • Evaluate and identify new sources of data located throughout the organization 
  • Collaborate with and support the analytics and data science teams to streamline how analytics are organized and delivered 
  • Integrate disparate datasets into common data models 
  • Troubleshoot poorly performing data workflows and queries inside and outside the team 
  • Detect data quality issues, identify their root causes, implement fixes, and design data audits to capture issues 
  • Support business decisions with ad hoc analyses as needed 



Required Skills:

  • Strong technical background 
  • Strong problem-solving skills 
  • Ability to multi-task  



Personal Attributes:

  • Inherently curious, a fast learner, and motivated to have a deep understanding of the projects being worked on 
  • Integrity in business practices and when working with others 
  • Hunger for continuous improvement personally and professionally 



Qualifications:

  • Bachelor’s degree in Computer Science or related technical field 
  • Strong foundational knowledge of SQL 
  • Knowledge of data modeling in analytic environments (Kimball methodology; star, snowflake, and galaxy schemas) a plus 
  • Familiarity with cloud data warehouses like Snowflake a plus 
  • Familiarity with Amazon Web Services, especially its data migration services 
  • Familiarity with modern ETL tools such as Fivetran or Stitch, and with dbt 
  • Experience with Agile development practices
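
To illustrate the star-schema modeling mentioned above, here is a minimal sketch: one fact table joined to one dimension table and aggregated for a typical BI query. All table and column names are hypothetical examples, not Stuller's actual model, and SQLite stands in for a cloud warehouse.

```python
import sqlite3

# Hypothetical star schema: a sales fact table keyed to a product dimension.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT,
        category     TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity    INTEGER,
        revenue     REAL
    );
    INSERT INTO dim_product VALUES (1, 'Ring', 'Jewelry'), (2, 'Clasp', 'Findings');
    INSERT INTO fact_sales VALUES (10, 1, 2, 500.0), (11, 2, 10, 30.0), (12, 1, 1, 250.0);
""")

# Typical BI consumption pattern: aggregate facts by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product d USING (product_key)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
print(rows)  # [('Findings', 30.0), ('Jewelry', 750.0)]
```

The design choice this sketch demonstrates is the core of dimensional modeling: measures live in a narrow fact table, while descriptive attributes live in dimension tables, so BI queries reduce to simple join-and-aggregate patterns.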

How to Apply:

Apply online.

Location: Lafayette, LA
Date Posted: January 27, 2023
Application Deadline: April 27, 2023
Job Type: Full-time