Intermediate/Senior Data Scientist who can contribute to collecting, analysing, reporting, and interpreting data for the development of business strategies. Responsibilities: Use data management systems to deliver prescribed outcomes. Collect, analyse, report, and interpret data for use in the development of business strategies. Employ mining, modelling, and testing techniques to enable data analysis. Communicate the project status back to 5 years working experience within an analytical, data science or computer science environment SAS SQL
scouting for a Senior Data Engineer. You will be responsible for developing high-quality data warehouse solutions in an Azure environment using Synapse, Data Lake, Azure Data Factory, WhereScape and Power BI. The groups similar fields like Information Systems, Big Data Azure Data Engineer Certification would be advantageous ADF and Data Lake Experience in developing data warehouses and data marts Experience in Data Vault and DataOps environment Experience working with automated data warehousing solutions would be advantageous Minimum
management system (DBMS) Set up and maintain reporting and data integration processes using SSRS, SSIS and SSAS relational database structure / architecture Oversee all data generated, monitor server transaction speeds, and
and institutions worldwide is looking for an Azure Data Engineer. The unique position gives a deep insight maintaining Big Data Pipelines using Data Platforms Custodians of data and must ensure that data is shared on a need-to-know basis Experience using programming skills in data-related programming languages and frameworks, such as Kusto Experience with Azure Data Solutions: Azure Data Factory, Azure Data Explorer, Azure Databricks Profound technical understanding of Data Engineering and Data Warehouse Design Familiar with modern
telecommunications company in Johannesburg as a Data Engineer. They are seeking a highly skilled and pivotal role in transforming their data infrastructure and driving data-driven decision-making across the volumes of data from various sources. Utilize AWS or other cloud technologies to build and manage data pipelines reliability. Implement data processing solutions using PySpark to process and analyze big data sets efficiently requirements, develop data models, and deliver high-quality data solutions. Ensure data quality and integrity
collecting, analysing, reporting, and interpreting data for use in the development of business strategies. Responsibilities Defining and capturing metadata and rules associated with ETL and ELT processes analysis, development and testing of our specialized data and analytical "recipes". Working with clients to identify and understand their source data systems and data requirements. Perform support activities maintain and enhance data tools and analytical services. Design and develop data models for analytics
retail customer base. Currently searching for a Data Engineer who will be responsible for driving, designing ETL systems for a big data warehouse to implement robust and trustworthy data to support high-performing algorithms, predictive models and support real-time data visualisation requirements across the organisation and knowledge of emerging trends across Data/Analytics (Big Data, Machine Learning, Deep Learning, AI) Experience and understanding in designing and developing data warehouses according to the Kimball methodology
requirements for data, reports, analyses, metadata, training, service levels, data quality, and performance potential data sources • Recommends appropriate scope of requirements • Validates that data warehouse BA/System Analysis experience • 4 years BA experience in data warehousing and/or BI projects Experience in all
with Modern Activities within UiPath: Extract Table Data, Get Text, Go to URL, and Highlight Must have relevant Reference Number for this position is
broadcast companies is on the lookout for a Senior Data Engineer to join their team. The chosen candidate Solid understanding of GoldenGate Plug-ins (Big Data / Kafka), GG Directors and Cloud Console Cloud Services