Job Offer Description
Company Description
AGSI was incorporated in April 2016. We are committed to supporting the goals of Arch divisions through exceptional service delivery. We pride ourselves on maintaining flexibility and responsiveness to adapt to business unit and industry demands while focusing on sound project management. We are dedicated to growing and developing our employees as we build strong teams with strategic leadership.

Job Description
Schedule: Mid Shift

The Position
This position develops, implements, and maintains software solutions that enable business operations to realize company goals and objectives. The incumbent performs analysis, design, coding, debugging, testing, and support of software application systems. He/she may be assigned to develop new applications, enhance existing applications, and/or provide production support. The incumbent works independently on projects of moderate scope or complexity and receives detailed instructions on new and/or more complex assignments.

Job Responsibilities
- Design and develop data pipelines using Apache Airflow to orchestrate complex workflows and ensure reliable data delivery
- Build and maintain transformation logic using dbt Core, supporting the infrastructure it requires and implementing best practices for modular, tested, and documented analytics code
- Develop and optimize data models in Snowflake, leveraging cloud data warehouse capabilities for performance and cost efficiency
- Write complex SQL for data transformation, quality validation, and business logic implementation
- Collaborate closely with the infrastructure team to ensure the data platform remains modern, well-monitored, and fully optimized, with industry best practices consistently applied
- Collaborate with analytics and business teams to understand requirements and translate them into scalable data solutions
- Implement data quality checks, monitoring, and alerting to ensure data reliability
- Document data pipelines, models, and processes for knowledge sharing
- Optimize query performance and manage Snowflake resource utilization
- Participate in code reviews and contribute to data engineering best practices

Qualifications

Required Skills
- 3+ years of experience in data engineering or a related role
- Strong proficiency in SQL, with experience writing complex queries, CTEs, and window functions
- Proficiency in Python for data engineering tasks, scripting, and automation
- Hands-on experience with dbt (Core or Cloud) for data transformation and modeling
- Experience orchestrating workflows with Apache Airflow or similar tools
- Working knowledge of Snowflake or similar cloud data warehouses (Redshift, BigQuery)
- Understanding of infrastructure requirements for data engineering, including deployment strategies, environment configuration, and resource management
- Understanding of dimensional modeling and data warehouse design patterns
- Experience with version control (Git) and CI/CD practices
- Strong problem-solving skills and attention to data quality

Desired Skills
- Experience with data replication tools such as Qlik Replicate, Fivetran, AWS DMS, or similar CDC solutions
- Experience setting up and managing infrastructure for dbt Core, including deployment automation, testing frameworks, and orchestration integration
- Knowledge of real-time data streaming and event-driven architectures
- Knowledge of containerization (Docker) and infrastructure as code (Terraform, CloudFormation)
- Experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of data governance and security best practices
- Familiarity with DataOps practices and testing frameworks
- Understanding of software engineering principles and agile methodologies

Additional Information
The required knowledge and skills would typically be acquired through a Bachelor's degree in computer science, business, or a related field.
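To illustrate the data quality checks mentioned in the responsibilities, here is a minimal stdlib-only Python sketch. The helper names (`check_not_null`, `check_unique`) and the sample rows are hypothetical, not part of the role description:

```python
# Illustrative row-level data quality checks of the kind a pipeline might
# run before loading data downstream. All names and data are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where the given column is missing."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values that appear more than once in the given column."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},
]

null_rows = check_not_null(orders, "customer_id")
dup_ids = check_unique(orders, "order_id")
print(null_rows)  # [1]
print(dup_ids)    # [2]
```

In a real pipeline these checks would typically live in dbt tests or an Airflow task that alerts on failure rather than in ad-hoc scripts.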
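The SQL skills listed above (CTEs and window functions) can be sketched with Python's bundled sqlite3 module; the `sales` table and query are hypothetical examples, and window functions require SQLite 3.25 or newer:

```python
import sqlite3

# Hypothetical example of a CTE feeding a window function, run against an
# in-memory SQLite database (SQLite >= 3.25 needed for window functions).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('east', 100), ('east', 300), ('west', 200);
""")

query = """
WITH regional AS (                 -- CTE: select the rows to aggregate
    SELECT region, amount FROM sales
)
SELECT region,
       amount,
       SUM(amount) OVER (PARTITION BY region) AS region_total  -- window fn
FROM regional
ORDER BY region, amount;
"""
rows = conn.execute(query).fetchall()
print(rows)  # [('east', 100, 400), ('east', 300, 400), ('west', 200, 200)]
```

The same CTE-plus-window pattern carries over directly to Snowflake SQL, where it is a common building block in dbt models.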