- ...using tools like Terraform or Ansible. Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka. Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and...
- ...Job Title: Senior Data Engineer - Big Data/Hadoop Ecosystem Job Type: Full-time Location: On-site Dubai, Dubai, United Arab Emirates Overview Join our team as a Senior Data Engineer - Big Data/Hadoop Ecosystem, where you will take the technical lead on pioneering...
- A leading IT solutions provider based in Dubai is looking for a professional who specializes in DevOps and big data solutions. The role demands expertise in CI/CD pipeline management using tools like Jenkins, deploying containerized applications with Kubernetes, and developing...
- A leading tech company is seeking a DevOps Data Hadoop professional in Dubai to design and manage CI/CD pipelines, deploy containerized applications, and implement big data solutions. The ideal candidate should have experience with tools such as Jenkins, Kubernetes, and...
- ...Equivalent work experience. Experience (years and type): industry, regional, functional. At least 4 years of production Hadoop experience; 8 years of hands-on technology experience; experience with large and complex global enterprises defined by high availability...
AED 30,000 per month
- ...influence decision-making within the organization. 17. Big Data Technologies: Working with distributed computing frameworks like Hadoop and Spark for processing large-scale data. 18. Database Management: Managing and querying databases to extract, transform, and...
- ...automated geophysical analysis, and AI-driven reservoir modelling. Familiarity with data processing frameworks (e.g., Apache Spark, Hadoop). Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, or a related field (PhD preferred). 8+...
- ...Configuration. Hands-on knowledge of Informatica BDM and PowerCenter for troubleshooting production issues. Hands-on knowledge of Hadoop components for troubleshooting production issues, viz. Oozie, Ambari, Sqoop scripting, and HDFS commands. Proficient in writing...
- ...delivering a multi-year data and AI transformation for a large, complex enterprise. The programme includes: Migration from legacy Hadoop, Oracle, SAP, and bespoke ETL systems A modern cloud lakehouse architecture supporting batch and real-time workloads...
- ...Java, Scala, and R. Data management & databases: Oracle, SAP, SQL, NoSQL, Data Warehousing. Big data technologies: Apache Hadoop, Spark, Kafka, etc. Cloud platforms: Experience with Microsoft Fabric, Azure (Synapse Analytics, Databricks, Machine Learning,...
- ...Strong proficiency in Python and/or R; familiarity with SQL for data querying. Ability to build data pipelines (Spark, Airflow, Hadoop) and work with big data tools. Understanding of model serving, API development (FastAPI, Flask), and optimizing model performance...
- ...background in developing and maintaining CI/CD pipelines. Data Tech Awareness: Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) and data warehousing concepts. Scripting Prowess: Proficiency in scripting and programming languages commonly used...
- ...Low level blockchain data understanding (traces, events) Cloud platform experience (AWS, GCP) Big data technology experience (Hadoop, Spark, Snowflake) Experience in crafting dashboards and reports Experience in uncovering business-driving data insights Why...
- ...OTHER FACTORS: Master’s degree or PhD in Data Science, Statistics, or a related field. Experience with Big Data tools (Spark, Hadoop). Knowledge of data governance, data ethics, and explainability. Relevant certifications in Data Science or Analytics.
- ...technologies, especially those in the Enterprise Data & Analytics space. Technology domains and key technologies/tools: Big Data: Hadoop, Spark, Scala, Hive, HBase, Sqoop, Oozie, Apache NiFi, Airflow, HDFS, ADLS (Gen 2), Azure Data Factory (ADF), Databricks, Kafka,...
- ...Key Responsibilities: Design, build, and maintain scalable data pipelines using big data technologies such as Apache Spark, Hadoop, and Kafka. Collaborate with data scientists, analysts, and IT teams to identify data requirements and deliver solutions that meet...
- ...within views. Utilize different Denodo cache types and ensure efficient query performance. (Agile Performance) Connect to Hadoop ecosystem components like Hive, use Sqoop for data ingestion, and access metadata via HCatalog. Translate business...
- ...tooling and DevOps technologies (Docker, Kubernetes/EKS, Helm, Terraform). • Proficiency in Python, Scala, Spark, PySpark, Java, or Hadoop. • 4+ years developing and optimizing dbt models with testing, version control, and documentation best practices. • Hands-on...
- ...Supermicro® is a Top Tier provider of advanced server, storage, and networking solutions for Data Center, Cloud Computing, Enterprise IT, Hadoop/Big Data, Hyperscale, HPC and IoT/Embedded customers worldwide. We are among the fastest-growing companies in the Silicon...
- ...Engineering. Strong hands-on experience in PySpark; experience working on Cloudera Data Platform (CDP); strong knowledge of the Hadoop ecosystem (HDFS, Hive, Impala, YARN); proficiency in SQL and data modelling concepts; experience with workflow orchestration...
- ...Experience in developing and maintaining applications on big data technologies: Spark Framework, NoSQL, Azure Databricks, and the Hadoop ecosystem (Hive, Impala, HDFS, YARN, Pig, Oozie, etc.). Experience with Azure SQL DB & Synapse Analytics, Power BI, SSIS, SSRS, T-SQL...
- ...Data technologies such as Cloudera Data Engineering and NiFi (optional). Solid experience working with core components of the Hadoop Ecosystem, including HDFS, Hive, Impala, Sqoop, Spark, and YARN. Proficient in programming and scripting languages such as...
- ...Proficiency and demonstrated experience in Python, SQL, Spark, and Hive. Demonstrated experience with database technologies (e.g., Hadoop, BigQuery, Amazon EMR, Hive, Oracle, SAP, DB2, Teradata, MS SQL Server, MySQL). Demonstrated experience with business intelligence...
- ...image processing, text analysis). Experience and knowledge in big data analysis, big data management, and distributed computing tools (e.g., Hadoop, Hive, Spark). Experience and knowledge in one or more programming languages (C, C#, Java). Experience and knowledge in web...
- ...or analytics industry Hands-on experience with big data technologies such as Apache Spark and distributed storage systems (e.g., Hadoop ecosystem) Expertise in ClickHouse databases for managing analytical workloads at scale Hands-on experience with Java, Spring...
- ...of ETL processes and tools, particularly for data integration into Vertica. Familiarity with other big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is advantageous. Understanding of data warehousing concepts and best practices....
- ...more recent big data tools (Cloudera Data Engineering, NiFi, etc.; NiFi is optional). Hands-on experience with major components of the Hadoop ecosystem, such as HDFS, Hive, Impala, Sqoop, Spark, and YARN. Experience in programming languages and tools (Python, PySpark, and...
- ...Overview An Informatica BDM Engineer who builds and optimizes big data integrations using Informatica, Hadoop, and SQL-based platforms. Experience and Location Experience: 6-9 years (overall 7+ years of experience in Informatica BDM platform). Location: Dubai...

