Posted on 2025/12/06
AWS Data Engineer - (Python/PySpark/AWS Services/Unit Testing/CI/CD/GitLab/Banking)
GIOS Technology
Glasgow, United Kingdom
Full Description
I am hiring for an AWS Data Engineer.
Location: Glasgow (onsite 2–3 days per week)
Job Description
We are looking for an experienced AWS Data Engineer with strong hands-on coding skills and expertise in designing scalable cloud-based data solutions.
The ideal candidate will be proficient in Python, PySpark, and core AWS services, with a strong background in building robust data pipelines and cloud-native architectures.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines and ETL workflows using AWS services.
• Implement data processing solutions using PySpark and AWS Glue (a minimal illustrative sketch follows this list).
• Build and manage infrastructure as code using CloudFormation.
• Develop serverless applications using Lambda, Step Functions, and S3.
• Perform data querying and analysis using Athena.
• Support Data Scientists in model operationalization using SageMaker.
• Ensure secure data handling using IAM, KMS, and VPC configurations.
• Containerize applications and deploy them using ECS.
• Write clean, testable Python code with strong unit testing practices (a pytest sketch follows this list).
• Use GitLab for version control and CI/CD.
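To illustrate the kind of pipeline work this role involves, here is a minimal PySpark ETL sketch. The bucket names, paths, and column names are hypothetical placeholders, not details of the actual engagement.

# Minimal, illustrative PySpark ETL sketch: read raw data from S3,
# cleanse and aggregate it, and write curated Parquet output.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def run_job():
    spark = SparkSession.builder.appName("example-etl").getOrCreate()

    # Extract: read raw CSV data from an S3 landing zone (hypothetical path).
    raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/transactions/")

    # Transform: basic cleansing and a per-account daily aggregation.
    cleaned = (
        raw.dropna(subset=["account_id", "amount"])
           .withColumn("amount", F.col("amount").cast("double"))
    )
    daily_totals = cleaned.groupBy("account_id", "txn_date").agg(
        F.sum("amount").alias("daily_total")
    )

    # Load: write curated output as Parquet, partitioned by date (hypothetical path).
    daily_totals.write.mode("overwrite").partitionBy("txn_date").parquet(
        "s3://example-curated-bucket/daily_totals/"
    )

    spark.stop()

if __name__ == "__main__":
    run_job()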
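Along the same lines, a minimal pytest sketch of how such a PySpark transform might be unit tested; the function and column names are again hypothetical, and it assumes pyspark and pytest are installed locally.

# Minimal pytest sketch for unit testing a PySpark transform.
# Runs entirely on a local Spark session; no AWS resources are needed.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def daily_totals(df):
    """Aggregate transaction amounts per account per day (hypothetical transform)."""
    return df.groupBy("account_id", "txn_date").agg(
        F.sum("amount").alias("daily_total")
    )

@pytest.fixture(scope="module")
def spark():
    # Local Spark session shared by the tests in this module.
    session = SparkSession.builder.master("local[1]").appName("unit-tests").getOrCreate()
    yield session
    session.stop()

def test_daily_totals_sums_per_account_and_day(spark):
    rows = [
        ("a1", "2025-01-01", 10.0),
        ("a1", "2025-01-01", 5.0),
        ("a2", "2025-01-01", 7.5),
    ]
    df = spark.createDataFrame(rows, ["account_id", "txn_date", "amount"])

    result = {(r.account_id, r.txn_date): r.daily_total
              for r in daily_totals(df).collect()}

    assert result[("a1", "2025-01-01")] == 15.0
    assert result[("a2", "2025-01-01")] == 7.5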
Key Skills
Python, PySpark, S3, Lambda, Glue, Step Functions, Athena, SageMaker, VPC, ECS, IAM, KMS, CloudFormation, GitLab