Prior experience in application data migration activities such as ETL, data pipelines, and data sets.
Experience in data engineering activities such as developing/modifying a Python-based data migration framework, and orchestrating and scheduling migration workloads.
Exceptionally strong hands-on experience with Python, Apache Airflow, SQL, Microsoft SQL Server, Sybase, and Snowflake.
Good understanding of application data models and data modelling activities.
Must have low-level design and development skills, and should be able to design a solution for a given use case.
Must be able to present design and code on a daily basis.
Must be an experienced PySpark developer with Python and Scala coding skills; the primary skills are PySpark and Azure Databricks.
Must have good knowledge of Azure, Databricks, Delta Lake, and notebook development.
Must have experience in designing job orchestration and sequencing, metadata design, audit trails, dynamic parameter passing, and error/exception handling (an illustrative sketch follows this list).
Must have good knowledge of data warehouse concepts and data warehouse modelling.
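Purely as an illustration of the orchestration pattern referenced above (not an additional requirement): a minimal Apache Airflow sketch showing a two-step migration sequence with dynamic parameter passing via dag_run.conf, retries, and a failure callback. All names (example_migration_dag, run_migration, notify_failure, table_name) are hypothetical, and the snippet assumes Airflow 2.4+ with the real migration logic stubbed out.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_failure(context):
    # Hypothetical error-handling hook: in practice this might write to an
    # audit table or raise an alert.
    print(f"Task {context['task_instance'].task_id} failed for run {context['run_id']}")


def run_migration(table_name, **_):
    # Placeholder for the actual PySpark / Databricks migration step.
    print(f"Migrating table: {table_name}")


default_args = {
    "retries": 2,                              # simple retry policy
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_failure,     # error/exception handling hook
}

with DAG(
    dag_id="example_migration_dag",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule=None,                             # triggered manually or by an upstream process
    catchup=False,
    default_args=default_args,
) as dag:
    # Dynamic parameter passing: the table name comes from the trigger payload
    # (dag_run.conf) with a fallback default.
    table_param = "{{ dag_run.conf.get('table_name', 'customers') }}"

    extract = PythonOperator(
        task_id="extract_source",
        python_callable=run_migration,
        op_kwargs={"table_name": table_param},
    )
    load = PythonOperator(
        task_id="load_target",
        python_callable=run_migration,
        op_kwargs={"table_name": table_param},
    )

    # Orchestration sequence: extract must succeed before load runs.
    extract >> load
```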
Apply for this role