

BUILD YOUR SKILLS AND

YOUR CAREER

Join A Team That Celebrates You Daily!
Our people are not only our greatest asset but also our biggest competitive advantage. We don't call our employees "employees" — we call them associates.

    Cloud Data Architect

    Experience - 5+ years

    Location - Noida (WFH)

    Immediate joiner or 30 days’ notice period.


    Role and Responsibilities:

    Ability to showcase strong data architecture design using Azure and AWS data engineering capabilities. This is a client-facing role, so strong communication and presentation skills are essential.

    Required Technical and Professional Expertise

    • Minimum of 5 years of experience with Azure and AWS cloud data engineering frameworks such as AWS Lambda, AWS Glue, Azure Functions, Azure Data Factory, and AWS CloudWatch
    • Expertise and hands-on experience with Python, Java, Big Data, Apache Spark, and Hadoop
    • Considerable experience in developing data pipelines and data querying (SQL and NoSQL) using native AWS, Azure, and marketplace tools and services, including AWS and Azure data migration tools and services
    • Provide technical expertise in design and development on Azure and AWS Cloud, such as data feed logging and monitoring
    • Collaborate with business stakeholders to identify data requirements and implement the required transformations in data pipelines
    • Experience using Azure services and tools to ingest, egress, and transform data from multiple sources
    • In-depth technical knowledge of tools such as Azure Data Factory, Databricks, Azure Synapse, Azure SQL Database, ADLS, etc.
    • Deliver ETL solutions covering data extraction, transformation, cleansing, data integration, and data management
    • Design ingestion layers for structured and unstructured data, and implement data models tailored to business and analytics use
    • Implement batch and near-real-time data ingestion pipelines
    • Experience working with event-driven cloud platforms for cloud services and apps, data integration for building and managing pipelines, data warehouses running on serverless cloud-native infrastructure, and workflow orchestration using Azure and AWS cloud data engineering components, with CI/CD pipelines to automate deployment
    • Working knowledge of different file formats such as Avro, Parquet, text, etc.