Job Archives

Posted 3 days ago

Responsibilities

  • Assessment: Analyze current GCP pipelines (Cloud Functions, Cloud Pub/Sub, Cloud Scheduler, Cloud Run, BigQuery) to understand their structure, dependencies, and performance requirements.
  • Design: Create a detailed migration plan, mapping GCP services to equivalent Azure resources (Azure Functions, Azure Service Bus, Azure Scheduler, Azure Container Instances/App Service, Azure Synapse Analytics); a handler-level sketch of this mapping follows this list.
  • Implementation: Execute the migration plan, scripting or using infrastructure-as-code tools to automate the process wherever possible.
  • Testing: Develop a comprehensive testing strategy to validate the functionality and performance of the migrated Azure pipelines.
  • Optimization: Identify opportunities to enhance pipeline efficiency and cost-effectiveness on Azure.
  • Documentation: Thoroughly document the migration process, including design decisions, troubleshooting guides, and best practices.
  • Support: Provide ongoing support during and after the migration to ensure smooth operation and address any issues that arise.
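
To make the Cloud Functions-to-Azure Functions mapping above concrete, here is a minimal handler-level sketch, assuming a first-generation GCP Python runtime and the Azure Functions v1 Python programming model; process_payload is a hypothetical stand-in for the pipeline's real logic.

    import base64

    import azure.functions as func

    def process_payload(payload: str) -> None:
        """Hypothetical stand-in for the pipeline's real processing logic."""
        print(payload)

    # Before: GCP Cloud Function triggered by a Pub/Sub message
    # (Pub/Sub delivers the payload base64-encoded in event["data"]).
    def gcp_handler(event, context):
        payload = base64.b64decode(event["data"]).decode("utf-8")
        process_payload(payload)

    # After: Azure Function triggered by an Azure Service Bus message
    # (the Service Bus binding itself lives in function.json).
    def main(msg: func.ServiceBusMessage):
        payload = msg.get_body().decode("utf-8")
        process_payload(payload)

Only the trigger plumbing changes here; the bulk of the migration effort usually sits in the bindings, IAM, and deployment automation around the handler.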

Required Skills

  • Cloud Platforms: Expert knowledge of both GCP and Azure, including their core services, data processing capabilities, and pipeline orchestration tools.
  • Pipeline Migration: Proven experience migrating pipelines between cloud platforms, ideally from GCP to Azure.
  • Programming: Proficiency in Python, Bash, or other scripting languages for automation and data manipulation.
  • Infrastructure as Code (IaC): Familiarity with Terraform or ARM templates for managing cloud infrastructure.
  • Data Processing: Experience with data warehousing (BigQuery, Azure Synapse Analytics) and ETL/ELT processes; a row-count parity check between the two warehouses is sketched after this list.
  • Problem-Solving: Strong analytical and troubleshooting skills to address any unexpected issues during migration.
  • Communication: Good written and verbal communication to collaborate with teams and document the process clearly.

Nice to Have:

  • Certifications: GCP and/or Azure certifications (e.g., GCP Professional Cloud Architect, Azure Solutions Architect Expert).
  • DevOps: Experience with CI/CD pipelines and DevOps practices for automated testing and deployment.
  • Monitoring: Familiarity with cloud monitoring tools (e.g., Azure Monitor, Google Cloud Monitoring, formerly Stackdriver) to track pipeline performance and identify bottlenecks.
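
As one concrete example of the validation work described under Testing, the sketch below compares row counts for a table migrated from BigQuery to Azure Synapse. This is a minimal illustration, assuming the google-cloud-bigquery and pyodbc packages with ambient credentials; the table names and Synapse connection string are hypothetical placeholders.

    from google.cloud import bigquery
    import pyodbc

    BQ_TABLE = "my-project.my_dataset.orders"  # hypothetical source table
    SYNAPSE_TABLE = "dbo.orders"               # hypothetical migrated table
    SYNAPSE_CONN = (  # placeholder connection string
        "Driver={ODBC Driver 18 for SQL Server};"
        "Server=<server>.sql.azuresynapse.net;Database=<db>;"
        "Authentication=ActiveDirectoryDefault"
    )

    def bigquery_row_count(table: str) -> int:
        client = bigquery.Client()  # uses application-default credentials
        rows = client.query(f"SELECT COUNT(*) AS n FROM `{table}`").result()
        return next(iter(rows)).n

    def synapse_row_count(table: str) -> int:
        with pyodbc.connect(SYNAPSE_CONN) as conn:
            return conn.cursor().execute(
                f"SELECT COUNT(*) FROM {table}").fetchone()[0]

    if __name__ == "__main__":
        src, dst = bigquery_row_count(BQ_TABLE), synapse_row_count(SYNAPSE_TABLE)
        assert src == dst, f"Row count mismatch: BigQuery={src}, Synapse={dst}"
        print(f"OK: {src} rows in both warehouses")

Row counts are only a smoke test; column-level aggregates or checksums catch subtler drift between the two warehouses.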

Full-Time
Dubai
Posted 5 days ago

What you will do

  • Design end-to-end data solutions, including solution architecture, data modeling, and data flow diagrams, to address complex business requirements.
  • Design, develop, and maintain scalable data pipeline architecture to support the company’s and clients’ data processing needs.
  • Implement best practices for data ingestion, storage, and processing to ensure data integrity, reliability, and security; a minimal batch ingest-process-store example follows this list.
  • Architect and implement Big Data processing workflows to handle large volumes of data with high throughput and low latency, ensuring timely and accurate data delivery.
  • Drive the adoption of data engineering best practices and standards across the organization, including data governance, data security, and data quality.
  • Lead and mentor junior team members, providing guidance on best practices and technical solutions.
  • Maintain consistent communication with the Project Team Lead and Project Manager, providing clear updates on task status and promptly escalating any issues or potential delays.
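
As a toy illustration of the ingest-process-store pattern referenced above, the sketch below reads newline-delimited JSON from a landing zone, enforces a key, and writes columnar Parquet. The paths and column name are hypothetical, and it assumes pandas with the pyarrow Parquet engine installed.

    import pandas as pd

    RAW_PATH = "raw/events.json"             # hypothetical landing-zone file (NDJSON)
    CURATED_PATH = "curated/events.parquet"  # hypothetical curated-zone output

    def run_batch() -> None:
        # Ingest: read newline-delimited JSON from the landing zone.
        df = pd.read_json(RAW_PATH, lines=True)
        # Process: enforce a primary key and deduplicate for integrity.
        df = df.dropna(subset=["event_id"]).drop_duplicates(subset="event_id")
        # Store: write Parquet (requires pyarrow or fastparquet) for analytics.
        df.to_parquet(CURATED_PATH, index=False)

    if __name__ == "__main__":
        run_batch()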

Good to have:

  • Design and develop real-time streaming data pipelines using technologies such as Apache Kafka, Apache Pulsar, or cloud-native streaming services (e.g., AWS Kinesis, Google Cloud Pub/Sub); a minimal Kafka sketch follows this list.
  • Lead data architecture reviews and provide recommendations for optimizing performance, scalability, and cost efficiency.
  • Collaborate with data scientists and analysts to understand data requirements and provide technical expertise in data modeling and analytics.
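
For the streaming item above, here is a minimal consume-transform-produce loop, assuming the kafka-python client and a reachable broker; the broker address, topic names, and transformation are hypothetical placeholders.

    from kafka import KafkaConsumer, KafkaProducer

    BROKER = "localhost:9092"  # hypothetical broker address

    consumer = KafkaConsumer("raw-events", bootstrap_servers=BROKER)
    producer = KafkaProducer(bootstrap_servers=BROKER)

    for record in consumer:             # blocks, yielding messages as they arrive
        cleaned = record.value.strip()  # placeholder transformation on raw bytes
        producer.send("clean-events", cleaned)

A production pipeline would add consumer groups, explicit offset commits, and delivery callbacks to get at-least-once guarantees.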

What we are looking for

  • Proficiency in designing, building, and maintaining scalable ETL data processing systems using Python, Java, Scala, or similar programming languages.
  • Expertise in SQL: a strong command of SQL to process and analyze various forms of data.
  • Experience working with Cloud Deployment Environments, preferably GCP, Azure or AWS.
  • Experience in creating and managing CI/CD pipelines using Azure DevOps, Jenkins, GoCD, or similar technologies.
  • Proficiency in orchestration tools like Apache Airflow, Azure Data Factory, AWS Data Pipeline, or similar; a minimal Airflow DAG is sketched after this list.
  • Familiarity with version control systems like Git (e.g., GitHub) and CI/CD concepts.
  • Understanding of Data Modeling and Data Warehousing techniques, including star schema or Data Vault.
  • Experience working with large-scale data sets, including structured and unstructured data formats like Parquet, Avro, JSON, etc.
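
For the orchestration requirement, here is a minimal Airflow DAG wiring two tasks into a daily run, assuming Airflow 2.x (the schedule parameter; older releases use schedule_interval); the DAG name and task bodies are hypothetical placeholders.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract() -> None:
        pass  # placeholder: pull data from a source system

    def load() -> None:
        pass  # placeholder: write data into the warehouse

    with DAG(
        dag_id="daily_etl",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # extract must finish before load runs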

Good to have:

  • Knowledge of big data processing technologies such as Hadoop, Apache HBase, Apache Hive, or similar.
  • Experience designing and implementing Data Governance policies covering data accessibility, security, management, quality, etc.

Job Features

Job Category

Business Intelligence
