Urgent requirement of Data Engineer - Permanent - Sydney/Melbourne

Requirements:
- Experience in Data Lake, Azure Databricks and Spark SQL
- Experience designing and deploying Databricks platforms on any of AWS, Azure or Google Cloud
- Experience building and deploying data engineering pipelines into production, including automation best practices for CI/CD
- Experience guiding clients through transformational big data projects, including end-to-end development and deployment of industry-leading big data and AI applications
- Experience working in Scrum/Agile
- Experience streamlining the customer machine learning lifecycle and integrating it with the rest of the data platform
- Experience in big data technologies
- Apache Spark and Databricks certifications required
- Very good communication skills

Eligibility: Australian/NZ citizens and PR holders only
Email: jobs@hasthasolutions.com
Experience: 5+ years

Requirements:
- Demonstrated experience managing ICT projects using formal project management methodologies (e.g. Waterfall or Agile), and/or demonstrated experience managing large and complex IT infrastructure and cloud projects
- Proven experience managing multiple tasks to challenging deadlines and adapting to changing priorities
- Ability to manage budget, resources and risk to ensure successful project outcomes
- Demonstrated experience and skills in engaging and managing ICT vendors, including approaches to market, evaluation and delivery management
- Very good communication skills

Desirable:
- Project management qualifications, with sound knowledge of techniques for establishing, planning, monitoring and controlling major ICT projects
- A sound understanding of both project management methodologies (Waterfall, PMBOK) and agile frameworks (Scrum) for delivering ICT projects

Duration: 8 months, with possible extension
Eligibility: Baseline clearance holders, or ability to obtain Baseline clearance
Email: jobs@hasthasolutions.com