Nationality: Any Nationality
Gender: Not Mentioned
Vacancy: 1 Vacancy
Job Description
Job Purpose
Drive the design, deployment, and optimization of data pipelines to handle a wide variety of structured and unstructured data sources, with exposure to the latest data platform technologies in the AWS ecosystem. Implement data ingestion, continuous integration, monitoring, and orchestration on the cloud for the Mondia business entity.
Roles and Responsibilities
Collaborate with a cross-functional team of data scientists, data engineers, software developers, and other key stakeholders in an agile environment to create data products and enrich Mondia's data ecosystem.
Assist the team with the successful execution and performance optimization of the cloud data warehouse, as well as cost estimation of serverless cloud components.
Design, construct, install, and maintain data management systems using Spark/PySpark, AWS Glue, Dataflow, or similar cloud ETL tools.
Implement data orchestration, workflows, and ETL scheduling with tools such as Apache Airflow, Luigi, and Step Functions.
Recommend ways to continually improve data reliability and quality.
Employ a range of programming languages and tools to integrate systems.
Communicate results and ideas clearly within the team.
Communicate effectively with all levels of the organization.
Comply with Mondia's policies and procedures and support Mondia's mission and vision.
Perform other job-related duties as assigned by direct manager.
Desired Candidate Profile
Behavioral Skills
Accountability and Ownership
Communication
Analytical Thinking
Attention to Detail
Result Focus (Delivering Results)
Problem Solving
Relationship Building
Organizational Commitment
Technical Competencies/Skills
Hands-on experience with Glue, Lambda, Step Functions, Redshift, DynamoDB, CloudWatch, and IAM; strong understanding of data lakes, warehouses, and cloud-native architecture on AWS
Proficient in building and managing ETL pipelines using Airflow, deploying scalable data services on EC2 and ECS, and leveraging serverless architectures
Advanced proficiency in Python, with solid skills in SQL and Shell scripting for automation, data transformation, and workflow management
Fluent in English with excellent reporting skills; proven ability to track analytics, monitor KPIs, and translate data into actionable insights
Job Requirements
Education
Bachelor's degree in Computer Science, Engineering, or Statistics.
Experience
3+ years of professional experience in Data Engineering or Data Warehousing, with an integrative perspective spanning management to operations, and hands-on experience with cloud architecture and technologies such as AWS, Azure, or Google Cloud Platform (GCP).
- Opportunity to work for a dynamic international company with a flat hierarchical structure, where your voice matters and your impact is seen.
Company Industry
- Media
- Publishing
- TV
- Radio
- Outdoor
- Digital
Department / Functional Area
- Engineering
Keywords
- Senior Data Engineer