Reference: JHB001660-GuguN-1
Join our client as an Expert AWS Data Engineer in South Africa. Design, implement, and optimize Big Data Pipelines using AWS services. Ensure data integrity, security, and compliance to integrate solutions. Bring your expertise in AWS technologies and data engineering to drive impactful …

Qualifications:
- Relevant IT/Business/Engineering Degree.
- AWS Certified Cloud Practitioner or equivalent.
- Strong …
Reference: JHB009406-BG-1
AWS Data Engineer

ESSENTIAL SKILLS REQUIREMENTS:
- Exceptional experience/understanding using AWS Glue or Data Pipeline, or similar platforms.
- Familiarity with data stores such as AWS S3, and AWS RDS or DynamoDB.
- Experience and solid understanding of various software design patterns.
- Organizational skills.

Experience/understanding of AWS Components (in order of importance):
- Glue
- CloudWatch
- Data modelling and schema design ("not drag and drop")
- Kafka
- AWS EMR
- Redshift

QUALIFICATIONS/EXPERIENCE NEEDED:
- Relevant …
A company based in Pretoria is looking for a Python & AWS Software Engineer to join their team on a contract basis.
- … (… injection etc.) (essential).
- 3-5 years' experience in AWS (API Gateway, Lambda, DynamoDB, Fargate, EMR, Glue) development.
- 3-5 years' experience in SQL (advantageous).
- AWS Certified Developer Associate / Solutions Architect …
An AWS Data Engineer with …-12 years' hands-on data engineering experience is required to join a team of …
- Data modelling and schema design ("not drag and drop")
- Kafka
- AWS certified developer / architect
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
- Familiarity with data stores such as AWS S3, and AWS RDS or DynamoDB.
- Experience and solid understanding of various software design patterns.
BASIC EXPERIENCE/UNDERSTANDING OF AWS COMPONENTS (in order of importance):
- Kafka
- AWS EMR
- Redshift
- Basic experience in Networking
Certifications:
- AWS Certified Cloud Practitioner
- AWS Certified SysOps Associate
- AWS Certified
Reference: JHB001716-Laka-1
Join our team as an AWS Data Engineer (Expert) and leverage your skills in … We are looking for problem-solvers with expertise in Python, SQL, and AWS services to drive our data initiatives forward.
- Experience building data pipelines using AWS Glue or Data Pipeline, or similar platforms.
- Familiarity with data stores such as AWS S3, and AWS RDS or DynamoDB.
- Understanding of various software design patterns.
- Experience preparing …
- Organizational skills.

BASIC EXPERIENCE/UNDERSTANDING OF AWS COMPONENTS (in order of importance):
- Glue
- CloudWatch
… existing cloud infrastructure from a set of source AWS accounts to a new set of target accounts. They need Terraform. The person should also have some Python and AWS data engineering experience, as everything will be …