Are you a seasoned Data Engineer with a strong background in Databricks? This forward-thinking government department is on the lookout for a talented individual to join its Enterprise Intelligence team. As a Senior Data Engineer, you will play a crucial role in developing enterprise data pipelines to fulfill the reporting, analytical, and big data requirements of the business.
Evolve the Enterprise Data Platform (EDP): Contribute to the enhancement of their cloud-based, large-scale data and intelligence solution, ensuring it remains customer-focused, easy to consume, and impactful for the business.
Lead Data Engineering Initiatives: Drive end-to-end analytical solutions that are highly available, scalable, stable, secure, and cost-effective.
Mentorship and Collaboration: Lead, mentor, and conduct code reviews for other data engineers and modelers. Foster a collaborative environment within the Enterprise Intelligence team, contributing to knowledge sharing and up-skilling.
Compliance and Security: Ensure adherence to standards, security/privacy obligations, quality rules, and the Information Management Framework.
Design Data Pipelines: Create robust data extraction and transformation pipelines to support unstructured, semi-structured, and structured data.
Commitment to Safety and Diversity: Demonstrate a personal commitment to health, safety, and the environment. Support a diverse and inclusive workplace.
AWS Infrastructure Expertise:
- Expert in secure, reliable, and scalable AWS infrastructure.
Data Solutions Architect:
- Expertise in large-scale data solutions for analytics.
Data Ingestion Mastery:
- Proven experience in real-time and batch data ingestion using AWS, Azure, Databricks, and Power BI.
Mentorship and Feedback:
- Proven experience in mentoring and providing constructive feedback.
Project Management Skills:
- Demonstrated ability to multitask and manage conflicting priorities.
- Experience in an agile project setting.
Technical and Analytical Skills:
- Advanced problem-solving, relational database, and analytical skills, with the ability to develop queries and use querying tools to build reports and models.
Data Management Principles:
- Knowledge and application of data management principles and best practices.
Infrastructure as Code:
- Experience building Terraform templates.
AWS Cloud Certification:
- Additional AWS cloud certification in architecture, development, and data analytics.
Cross-Functional Collaboration:
- Comfortable working in cross-functional teams.
Additional Coding Proficiency:
- Further expertise in Python, Java, or C#, including experience building full CI/CD pipelines.
- Additional experience with information management tools, such as Alation.
To be considered for the role, click the 'Apply' button. For more information about this and other opportunities, please contact Jeffrey Bullen on 07 3339 5624 or email [email protected], quoting the above job reference number.
Paxus values diversity and welcomes applications from Indigenous Australians, people from diverse cultural and linguistic backgrounds, and people living with a disability. If you require an adjustment to the recruitment process, please contact me using the above details.