A leading State Government client based in Western Sydney is looking for a Senior Azure Data Engineer for an initial 6-month contract role. APPLY NOW!
About the role:
The Data Engineer will be responsible for developing and maintaining the data architecture, data models and standards for various Data Integration and Data Warehouse projects in Azure.
They will ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. In addition, they will develop and maintain documentation of the data architecture, data flows and data warehouse data models appropriate for both technical and business audiences.
- Provide technical expertise in Azure Data Platform
- Deliver Data Warehouse / Data Lake / Lakehouse implementations
- Develop ELT pipelines into and out of the data lake using Azure Data Factory, Databricks and Delta Lake
- Promote changes between environments using Azure DevOps Pipelines
- Integrate on-premises infrastructure with Azure public cloud infrastructure
- Partner with the Data Architects, Product Managers, Data Modellers, Business Users and Scrum Masters to deliver data integrations and BI solutions required for various projects.
- Review work delivered by vendors and ensure it complies with guidelines.
- Proactively identify improvements to ways of working, within the team and with stakeholders, to achieve better outcomes.
To be successful in the role, you must have:
- A minimum of 5 years in IT, including 4+ years of hands-on experience as an Azure Data Engineer and 4+ years in Data Warehousing, ETL/ELT, BI, and Analytics projects.
- Extensive design and implementation experience in the Azure cloud data platform and modern data warehousing, including data security, Azure Logging and Monitoring, Azure Databricks, Azure Blob Storage, Azure Data Lake, Azure Data Factory, Azure SQL Database, and Azure DevOps.
- Expertise in data modelling and ELT using ADF, including implementing complex views and stored procedures, and a solid grasp of standard DWH and ETL/ELT concepts.
- Proven experience working with Spark on large data volumes, preferably in Databricks.
- Proven understanding of the Lakehouse concept and how Delta Lake is used to realise it.
- Deep understanding of relational and NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modelling).
- Experience with data security and data access controls and design.
- Proficiency in Python (PySpark), SQL (ANSI and Spark SQL) and Scala, including performance tuning and troubleshooting.
- Proficiency in building and optimising Azure DevOps Pipelines is desirable.
- Experience working in a DataOps model is preferred.
- Experience with Agile development methodologies.
If this sounds like you, please APPLY NOW!!!
To be considered for the role, click the 'apply' button. For more information about this and other opportunities, please
contact Shweta Sharma on (02) 9464 5855 or email firstname.lastname@example.org and quote the above job reference number.
Paxus values diversity and welcomes applications from Indigenous Australians, people from diverse cultural and linguistic
backgrounds, and people living with a disability. If you require an adjustment to the recruitment process, please contact me on the above contact details.