Requirements: Sound knowledge of Python and Java. Data Science concepts and principles. Experience in tools and applications. Architecture and interface design. Data modeling and database technologies (relational,
and global reach. We are seeking a talented AWS Data Engineer to become a crucial part of their dynamic team and to guarantee the accuracy of data transformations. Build resilient data pipelines, leveraging platforms like AWS Glue or Data Pipeline. Craft precise specifications to direct development and testing. Harness expertise in Oracle SQL, showcasing prowess in data modeling. Create technical documentation and artifacts comprehensively. Employ data quality tools like Great Expectations to maintain data integrity at all levels.
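By way of illustration, a minimal data-quality check of this kind could look as follows in Python, using the pandas-style Great Expectations API from the pre-1.0 releases (the DataFrame and column names here are hypothetical, not taken from the role description):

    import great_expectations as ge
    import pandas as pd

    # Hypothetical output of a transformation step to be validated.
    df = pd.DataFrame({"customer_id": [1, 2, 3],
                       "amount": [100.0, 250.5, 80.0]})

    # Wrap the frame so expectation methods become available.
    gdf = ge.from_pandas(df)

    # Declare expectations about the transformed data.
    gdf.expect_column_values_to_not_be_null("customer_id")
    gdf.expect_column_values_to_be_between("amount", min_value=0, max_value=10000)

    # Evaluate all declared expectations in one pass.
    result = gdf.validate()
    print(result.success)  # True only if every expectation held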
organisational capabilities with agile, modern, data-driven solutions and services built to and beyond expectations. Recruiting for an AWS Data Engineer to join an environment with cutting-edge expertise in data modelling. Develop technical documentation and artefacts. Knowledge of data formats such as Parquet, AVRO, JSON, XML, CSV etc. Working with data quality tools such as Great Expectations. Building data pipelines using AWS Glue or Data Pipeline, or similar platforms. Familiar with data stores
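To illustrate a few of the formats mentioned, a short Python sketch converting the same records between CSV, JSON and Parquet (file names are hypothetical; Parquet output requires pyarrow or fastparquet to be installed):

    import pandas as pd

    # Read row-oriented text data (hypothetical file).
    df = pd.read_csv("events.csv")

    # Write the same records as newline-delimited JSON.
    df.to_json("events.json", orient="records", lines=True)

    # Write a columnar copy for analytical workloads.
    df.to_parquet("events.parquet")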
every company, globally, by harnessing the value of data using high-performance, interoperable and simplified solutions. Currently looking for an AWS Data Engineer to join their fast-paced and dynamic team building enterprise data solutions and applications. Analyse, re-architect and re-platform on-premise data warehouses to data platforms on the AWS cloud using AWS or 3rd-party services and Kafka CC. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, PySpark
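As a rough sketch of such an ingestion-to-consumption step in PySpark (bucket paths and column names are hypothetical; the same pattern can run inside an AWS Glue job):

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

    # Ingest: raw CSV landed in an S3 raw zone (hypothetical path).
    raw = spark.read.option("header", True).csv("s3://raw-zone/orders/")

    # Transform: type the amount column and drop invalid rows.
    clean = (raw.withColumn("amount", F.col("amount").cast("double"))
                .filter(F.col("amount") > 0))

    # Consume: publish partitioned Parquet to the curated zone.
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://curated-zone/orders/")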
and IT methodology processes. Recruiting for a Data Scientist – AI Platform to offer deep insight into the sustainability of the group. Responsibilities: Cloud management. Coordination between development and support. Simplifying requirements. DevOps experience. Managing projects / processes. Ability to develop within
focus on health administration has a role for a Data Development Engineer. The purpose of the ETL Developer is to populate the Data Warehouse and to develop and enhance the back end of the data warehouse. Diploma/Degree in Information Technology, Computer Science or data-related qualifications. Minimum of 5-10 years' Data Warehousing experience. SQL/PLSQL knowledge essential. Data modelling tool experience and implementation of ETL pipelines. Cloud-based data storage solutions like AWS and Azure. Technical
An AWS Data Engineer with -12 years' hands-on data engineering experience is required to join a team, assisting with the identification and management of risks, and to be part of this brand. Stack: Oracle/PostgreSQL, PySpark, Boto3, ETL, Docker, Linux/Unix, Big Data, PowerShell/Bash, Glue, CloudWatch, SNS, Athena, S3, Lambda, DynamoDB, Step Functions, Parameter Store, Secrets Manager, CodeBuild/CodePipeline, CloudFormation. Nice to have: Business Intelligence (BI) experience; technical data modelling and schema design (“not drag and drop”).
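For example, pulling pipeline configuration from the Parameter Store and Secrets Manager services in this stack might look like the following Boto3 sketch (the parameter and secret names are hypothetical; credentials come from the usual AWS credential chain):

    import boto3

    # Read a plain config value from SSM Parameter Store.
    ssm = boto3.client("ssm")
    db_host = ssm.get_parameter(Name="/etl/db_host")["Parameter"]["Value"]

    # Read a credential from Secrets Manager.
    secrets = boto3.client("secretsmanager")
    db_password = secrets.get_secret_value(SecretId="etl/db_password")["SecretString"]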
solutions for a Swiss client specializing in geospatial data. The ideal candidate should have a diverse coding background; be detail-oriented and results-driven; and be capable of managing multiple projects simultaneously while maintaining attention to detail. The capacity to work independently and manage multiple priorities in a fast-paced environment
Access Management. Service Management: incident monitoring, managing SLAs, problem management reporting. Tools: Logstash, InfluxDB, Dynatrace, Oracle, Postgres. Management of keys, certificates and secrets. Dev: Java, Confluence, Python. Architecture: cloud, on-prem, hybrid, data modelling, SW architecture. Design architectures
performant applications. Implementation of security and data protection. Reference Number for this position is