AWS Data Engineer


@ CGI

Position Description:

  • Want to leverage your experience and development skills in the Judicial Sector as an AWS Data Engineer?
  • CGI seeks a data engineer for a high-volume data ingestion and search modernization effort.
  • The data engineer will deliver key features of the modern system, including: creating high-volume data ingestion pipelines for artifacts from legacy systems, automating the analysis of the ingested artifacts, and creating knowledge bases using the latest artificial intelligence models.
  • The data engineer will be involved in the modern system's data architecture, pipeline version control, data lineage tracking, and automated testing.
  • Data quality management and traceability will be integrated into the automated pipelines.
  • The data engineer will apply software engineering best practices throughout their work and leverage the latest cloud technologies to innovate and provide value to our customers.
  • This position is located in our Lafayette, LA office; however, a hybrid working model is acceptable.


Your future duties and responsibilities:

Our AWS Data Engineer will be a key contributor with the following responsibilities:

  • Work with the technical development team and team leader to understand desired application capabilities.
  • Work with clients and legacy systems to set up ingestion pipelines that pull new and updated artifacts into knowledge bases.
  • Continuously improve machine learning models.
  • Design and apply data architectures that enable both field-based search and semantic search.
  • Understand conflicting artifacts and prioritize accurate or more recent artifacts within the knowledge bases.
  • Work within and across Agile teams to test and support technical solutions across a full stack of development tools and technologies. 
  • Develop applications in AWS data and analytics technologies including, but not limited to: OpenSearch, RDS, S3, Athena; Lambda, Step Functions, Glue; SageMaker, Textract, Comprehend, Bedrock, AI Chatbot/Lex; SQS, SNS; CloudTrail, CloudWatch; VPC, EC2, ECS and IAM.
  • Apply application development lifecycles and continuous integration/deployment practices.
  • Integrate open-source components into data-analytic solutions.
  • Work with vendors to enhance tool capabilities to meet enterprise needs.
  • Continuously learn and share learnings with others.
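The conflicting-artifact prioritization described above can be sketched in plain Python. This is a minimal, hypothetical illustration (the class and field names are invented and not part of any CGI system), assuming recency is tracked by a simple revision number:

```python
# Hypothetical sketch: keep only the most recent revision of each artifact
# before loading it into a knowledge base. Names are illustrative only.
import json
from dataclasses import dataclass, asdict

@dataclass
class ArtifactRecord:
    """A toy stand-in for an artifact pulled from a legacy system."""
    artifact_id: str
    source_system: str
    revision: int

def latest_revisions(records):
    """Resolve conflicts by keeping the highest revision per artifact ID."""
    latest = {}
    for rec in records:
        current = latest.get(rec.artifact_id)
        if current is None or rec.revision > current.revision:
            latest[rec.artifact_id] = rec
    return list(latest.values())

records = [
    ArtifactRecord("A-1", "legacy-docket", 1),
    ArtifactRecord("A-1", "legacy-docket", 3),   # newer copy of A-1 wins
    ArtifactRecord("A-2", "legacy-filings", 2),
]
deduped = latest_revisions(records)
serialized = json.dumps([asdict(r) for r in deduped], indent=2)
print(serialized)
```

In a real pipeline this logic would typically run inside an ingestion step (e.g., a Lambda or Glue job) rather than in-memory, but the prioritization rule is the same.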



Qualifications:

Required qualifications to be successful in this role:

  • Proficient in navigating the AWS console and programmatic interaction with AWS through the AWS Python SDKs and AWS CLI.
  • Hands-on experience with AWS services such as RDS, S3, Lambda, Step Functions, Glue, SQS, SNS, CloudTrail, CloudWatch, VPC, EC2, and IAM.
  • Proficiency in Python: Data structures, writing custom classes/modules, object-oriented code organization, data extraction/transformation/serialization, database/API interaction, creating virtual environments, and the AWS Python SDK.
  • Deep experience troubleshooting complex end-to-end data processing issues whose causes may stem from library code, workflow logic, API inconsistencies, network issues, corrupt data, out-of-order updates, or AI hallucinations.
  • Hands-on experience with high-volume data application development and version control systems such as Git.
  • Hands-on experience implementing data ingestion processes incorporating ETL.
  • Hands-on experience in data modeling and relational database design for large datasets.
  • Knowledge of application development lifecycles and continuous integration/deployment practices.
  • 7-10 years of experience delivering and operating large-scale, highly visible distributed systems.
  • Knowledge of IaC using Terraform is preferred.
  • Agile development experience.


Desired qualifications/non-essential skills:

  • DevOps practices: IaC for pipelines, pipeline monitoring and logging, code versioning, data versioning, container writing
  • Experience working with the Atlassian toolset (Jira, Confluence)
  • Hands-on experience reading from and writing to DynamoDB or other NoSQL databases; Redshift
  • API design; API Gateway experience
  • Elasticsearch/OpenSearch experience
  • AWS Certifications
  • Agile or SAFe certification


Skills:

  • Git
  • Python


How to Apply:

Apply online at https://cgi.jobs/locations/lafayette-la/jobs/ 


Location: Lafayette, LA
Date Posted: October 30, 2024
Application Deadline: January 02, 2025
Job Type: Full-time