Data Engineer Jobs in Australia

Search Term: Data Engineer
Search Results: 96
Last Updated: Tue, 05 Mar
Data Engineer » Port Melbourne, VIC - We have a unique opportunity for a passionate Data Engineer to join our Technology department on a 12-month fixed term contract.... Reporting into our Lead Data Engineer, you will work with the broader local data team and partner closely with our Marketing... • Mon, 04 MarSpecsavers
Data Engineer » Australia - Atlassian is looking for a Data Engineer to join our Data Engineering team and build world-class data solutions and applications that power crucial decisions throughout the organisation. We are looking for an open-minded, structured thinker who is passionate about building systems at scale. You will enable a world-class data engineering practice, drive the approach with which we use data, develop backend systems and data models to serve the needs of insights and play an active role in building Atlassian’s data-driven culture. Data is a BIG deal at Atlassian. We ingest over 180 billion events each month into our analytics platform and we have dozens of teams across the company driving their decisions and guiding their operations based on the data and services we provide. The data engineering team manages several data models and data pipelines across Atlassian, including finance, growth, product analysis, customer support, sales, and marketing. You'll join a team that is smart and very direct. We ask hard questions and challenge each other to constantly improve our work. As a Data Engineer, you will apply your technical expertise to build analytical data models that support a broad range of analytical requirements across the company. You will work with extended teams to evolve solutions as business processes and requirements change. You'll own problems end-to-end and on an ongoing basis, you'll improve the data by adding new sources, coding business rules, and producing new metrics that support the business. Bachelor’s/Master's degree or equivalent in a STEM field with a minimum 2 Years of Experience in Data Engineering or related field. Expertise in Python or other modern programming languages. Working knowledge of relational databases and query authoring via SQL. Experience designing data models for optimal storage, retrieval and dashboarding to meet product and business requirements. Experience building scalable data pipelines using Spark or Spark-SQL with Airflow scheduler/executor framework or similar scheduling tools. Experience building real-time data pipelines using a micro-services architecture. Experience working with AWS data services or similar Apache projects (Spark, Flink, Hive, and Kafka). Understanding of Data Engineering tools/frameworks and standards to improve the productivity and quality of output for Data Engineers across the team. Well-versed in modern software development practices (Agile, TDD, CICD). Don’t see an exact role match? No problem Join our Talent Community and stay up-to-date on company and careers updates relevant to your career. J-18808-Ljbffr • Wed, 06 MarAtlassian
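
The Atlassian listing above asks for Spark or Spark-SQL pipelines run under an Airflow-style scheduler. As a minimal, illustrative sketch only (not Atlassian's actual pipeline: the DAG id, script path and schedule below are invented), a daily Airflow DAG that submits a Spark job could look something like this:

    # Illustrative sketch only: a daily Airflow DAG that submits a Spark job.
    # The DAG id, script path and arguments are hypothetical examples.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_event_rollup",        # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Submit a PySpark/Spark-SQL script that aggregates the previous day's events.
        build_rollup = BashOperator(
            task_id="build_rollup",
            bash_command=(
                "spark-submit --master yarn jobs/build_rollup.py "
                "--run-date {{ ds }}"       # Airflow injects the execution date
            ),
        )

In practice the aggregation logic would live in the Spark script itself, with Airflow handling only scheduling, dependencies and retries.
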
Data Engineer - ETL - Federal Government » New South Wales - Data Engineer - ETL - Federal Government Leading Federal Government Department Initial 12 month contract + 12 month... Engineer to be a part of the department’s internal key data-centric projects. You will be responsible for demonstrating strong data... • Tue, 05 MarHiTech Personnel
Senior Data & Analytics Architect & Engineer » Bedford Park, SA - services and more. About the Role As the Senior Data & Analytics Architect & Engineer, you will be responsible... to engineer and deliver data models aligned with the IDS data and analytics architecture principles, guardrails and design... • Mon, 04 MarFlinders University$106403 - 118281 per year

Data Center Technical Operations Engineer, Infraops DCEO » Sydney, NSW - DESCRIPTION The Facility Engineer will be responsible for Data Center Engineering Operations within an Amazon Data...: Sydney, NSW, AUS BASIC QUALIFICATIONS - 2+ years of electrical or mechanical, or 2+ years of data center or mission... • Mon, 04 MarAmazon
Data Center Technical Operations Engineer, Infraops DCEO » Sydney, NSW - DESCRIPTION The Facility Engineer will be responsible for Data Center Engineering Operations within an Amazon Data... QUALIFICATIONS Ability to participate in a 24 x 7 rotating shift roster 1+ years of Data Center Engineering/Operations Experience... • Mon, 04 MarAmazon
Senior Data Engineer Design » Eveleigh, NSW - Senior Data Engineer - Design You are an experienced solution designer with strong experience in Data modelling, Big... Data platforms. We are one of the critically performing Data Engineering teams in CBA. Together we can lead strategic data... • Mon, 04 MarCommonwealth Bank of Australia
Senior Data & Analytics Architect & Engineer » Bedford Park, SA - architects and technology vendors to engineer and deliver data models aligned with the IDS data and analytics architecture... to support university projects and provide core services such as IDS ServiceDesk, Information Security, Data, Architecture... • Mon, 04 MarFlinders University$106403 - 118281 per year
Data Engineer Lead » Wodonga, VIC - Creating a Better World for our Pets. As our Data Engineer Lead you will be responsible for leading Smart Factory engineering... Job Description: Hands-on development in Python, PyCharm and Spark to create standardised data assets. Mars Petcare... • Mon, 04 MarMars
Big Data/PySpark Senior Engineer » Sydney, NSW - and communication skills; Up to 3 years' experience in financial domain development as a data engineer; Highly motivated, and suitably.... We have an opportunity for a highly driven engineer to join the team. The business is committed to advancing its position within the FX... • Mon, 04 MarWestpac
Data Engineer » The Rocks, Sydney - PRA have partnered with a dynamic and innovative product company that is at the forefront of cutting-edge technologies. Position: Data Engineer Experience: 3-7 years Location: Hybrid work arrangement Responsibilities: Design, build, and maintain scalable and efficient data pipelines to support the product ecosystem. Collaborate with cross-functional teams to understand data requirements and ensure data integrity and availability. Utilise your expertise in Terraform, SQL, and Python to develop and optimize data workflows. Implement best practices in data governance and security to safeguard sensitive information. Work with cloud technologies, particularly Google Cloud Platform (GCP) or other cloud services, to enhance data infrastructure (experience with GCP is a nice-to-have). Troubleshoot and resolve data-related issues, ensuring minimal impact on product operations. Stay abreast of industry trends and emerging technologies, making recommendations for continuous improvement. Qualifications: 3-7 years of experience as a Data Engineer or similar role. Proficiency in Terraform, SQL, and Python is essential. Familiarity with cloud technologies, particularly GCP (or another cloud technology), is a plus. Strong analytical and problem-solving skills. Excellent communication and collaboration skills to work effectively in a cross-functional environment. Ability to thrive in a fast-paced, innovative, and dynamic work environment. How to Apply: If you are a talented and motivated Data Engineer looking to contribute to groundbreaking projects, we want to hear from you. For a confidential conversation, please email emma.liddington@pra.com.au Applicants must have full working rights in Australia. J-18808-Ljbffr • Mon, 04 MarPRA Australia
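
The PRA listing above centres on SQL and Python workflows, with GCP as a nice-to-have. Purely as an illustrative sketch under that assumption (the project, dataset, table and column names are invented and this is not the client's actual code), a parameterised BigQuery query driven from Python might look like this:

    # Illustrative sketch only: run a parameterised SQL query against BigQuery.
    # Requires the google-cloud-bigquery library and GCP credentials; the
    # project, dataset, table and column names are hypothetical.
    from google.cloud import bigquery

    client = bigquery.Client()  # picks up credentials from the environment

    sql = """
        SELECT customer_id, COUNT(*) AS orders
        FROM `example-project.sales.orders`
        WHERE order_date >= @start_date
        GROUP BY customer_id
        ORDER BY orders DESC
        LIMIT 100
    """

    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        ]
    )

    for row in client.query(sql, job_config=job_config).result():
        print(row.customer_id, row.orders)
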
Data Engineer » Australia - Engineering - Software (Information & Communication Technology) Full time $120,000 – $130,000 per annum plus Superannuation Opportunity to work 2 days/week from home once you are fully autonomous in the role A paid day off for your birthday Work in a positive and supportive team environment with a great culture A long-term career with the opportunity to take ownership of the infrastructure and systems A pivotal role as we undergo transformational changes while we significantly scale up our global client base Great opportunities for growth and development Free car parking on-site We have a diverse, inclusive, and safe workplace where our team members care about each other and enjoy building meaningful connections. About The Role We are seeking a highly skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining our data infrastructure and systems. You will collaborate with cross-functional teams to implement data pipelines, optimize data workflows, and ensure data quality and reliability. The ideal candidate will have a strong background in software engineering, database management, and data modelling, along with expertise in big data technologies and cloud platforms. Responsibilities Collaborate with developers and analysts to understand business requirements and translate them into technical solutions. Design, build, and maintain scalable data pipelines and ETL processes to collect, process, and store data from various sources. Develop and implement data models, schemas, and structures to support analytical and operational needs. Optimize data storage and retrieval for performance and cost efficiency. Ensure data quality and integrity through data validation, cleansing, and monitoring processes. Create reports and other outputs for use by internal stakeholders and client-facing applications. Stay updated on industry best practices and emerging technologies in data engineering and analytics. Provide technical guidance and mentorship to the broader team. Required Qualifications, Skills & Experience Bachelor’s degree in Computer Science, Engineering, or related field. 5 years of experience in data engineering or related roles. Strong experience with SQL and NoSQL databases, data warehousing, and data modelling concepts. Hands-on experience with cloud data analytics technologies (Databricks) and cloud platforms (AWS, Azure). Proficiency in creating interactive dashboards and reports using Power BI or similar BI tools. Solid understanding of software engineering principles and best practices, including version control, testing, and CI/CD. Excellent problem-solving and analytical skills, with a keen attention to detail. Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment. About the company Retail Edge Consultants is a jewellery industry-specific software and data company that offers point of sale software, strategic consulting services, and benchmarking and industry-leading data insights. These tools and services allow their clients to operate day-to-day and make better-informed decisions that positively add to their bottom line. We are a progressive, dynamic company, well respected in our marketplace and global leaders in our field. Your application will include the following questions: Which of the following statements best describes your right to work in Australia? 
How many years' experience do you have as a Data Engineer? How much notice are you required to give your current employer? What's your expected annual base salary? How many years' experience do you have as a Business Intelligence Architect? How many years' experience do you have as a Cloud Analyst? J-18808-Ljbffr • Sun, 03 MarRetail Edge Consultants
Data Engineer » Largs North, Port Adelaide Area - Our Company We exist to bring amazing people together to explore the art of possible. Bailey Abbott are a progressive and dynamic IT Consultancy business working across both Public and Private sectors. People are at the core of everything we do. We’re all about nurturing and inspiring people – ours and our clients. We bring a positive attitude and mindset to everything we do. We confidently use our knowledge and skills to solve problems, finding a better and easier way forward. We deliver transformational outcomes to take businesses further, fearlessly. Confidence and trust is fundamental to delivering superior outcomes and we take ownership and responsibility for our commitments. Our People Our specialist IT Consultants are creative, collaborative, and passionate. We’re strong advocates for change, challenging the status quo through new thinking, technology, and practices. Always exploring creative ways to simplify complexity. As a Consultant with Bailey Abbott, you will have the opportunity to build your project portfolio by gaining exposure and experience working across different clients, industries, technologies, and methodologies, all while having the stability and security of a permanent position. Our Career Opportunity Bailey Abbott have an exciting opportunity for a Data Engineer to join our Data Intelligence capability. Our Data Intelligence specialists work within an active community of practice, enabling you to benefit from their years of collective experience and wisdom. Working as a Data Engineer you will get the opportunity to: Represent Bailey Abbott, sharing your skills and knowledge to help deliver projects for our clients Develop and maintain ETL (Extract, Transform, Load) pipelines to ingest, process, and transform data within our Microsoft Azure-based data lake, utilising Azure Data Factory Design and implement efficient data models and schemas in Azure to ensure data quality and reliability, integrating Azure Purview for data governance Utilise your proficiency in SQL and NoSQL databases to optimise data storage and retrieval in Azure environments Leverage big data technologies such as Apache Spark, Hadoop, or Azure HDInsight to process and analyse large datasets Collaborate with data scientists, analysts, and stakeholders to understand business requirements and deliver valuable data solutions tailored for Azure ecosystems Write high-quality, scalable code in languages such as Python, Java, or Scala to support Azure-based data engineering tasks Continuously monitor, troubleshoot, and optimise data pipelines in Azure for performance and reliability Stay up-to-date with industry best practices and emerging technologies in the Azure landscape to drive innovation in our data platform Bachelor's degree in a related field or equivalent work experience Demonstrated experience in data engineering or a related role, with a specific focus on Microsoft Azure Proficiency in SQL and NoSQL databases, with a strong understanding of data manipulation and querying in Azure Demonstrable experience in ETL pipeline development using Azure Data Factory, data modeling, and schema design Familiarity with Azure big data technologies such as Apache Spark, Hadoop, or Azure HDInsight Experience with Azure Purview for data governance and cataloging Application of CI/CD and DevOps methodologies, including source code management Programming skills in languages like Python, Java, or Scala, with a focus on Azure applications Strong problem-solving 
skills and attention to detail Effective communication and collaboration skills in an Agile team setting Azure certification or relevant cloud experience is highly desirable Exposure to emerging approaches such as Azure Fabric Paid Birthday Leave Access to Employee Assistance Program Flexible Working Options Regular team gatherings including monthly team lunch Professional development opportunities and continuous learning Financial benefits through Discount Store Our clients are diverse and so are we. We engage with great talent from all walks of life to bring their extensive and varied experience to help promote innovation. We encourage applications from candidates from all backgrounds to further strengthen Bailey Abbott. Bailey Abbott. Explore Possible. J-18808-Ljbffr • Sun, 03 MarBailey Abbott Pty Ltd
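
Several of the Azure-focused listings here, including the one above, describe ETL into a data lake with Spark-based processing. As a generic, illustrative sketch only (the storage account, containers, paths and columns are hypothetical, not Bailey Abbott's design), a clean-and-curate step run in Databricks or HDInsight might look like this:

    # Illustrative sketch only: a PySpark clean-and-curate step over a data lake.
    # The ADLS account, containers and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("curate_customers").getOrCreate()

    raw_path = "abfss://raw@examplelake.dfs.core.windows.net/crm/customers/"
    curated_path = "abfss://curated@examplelake.dfs.core.windows.net/customers/"

    customers = spark.read.json(raw_path)

    curated = (
        customers
        .dropDuplicates(["customer_id"])                       # de-duplicate on the key
        .withColumn("email", F.lower(F.trim(F.col("email"))))  # normalise emails
        .withColumn("load_date", F.current_date())             # partition column
        .filter(F.col("customer_id").isNotNull())              # basic quality rule
    )

    # Partition by load date so downstream queries can prune efficiently.
    curated.write.mode("overwrite").partitionBy("load_date").parquet(curated_path)
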
Sr. Civil Engineer in Data Center Engineering - Civil, Structural, Architectural Solutions » Sydney, NSW - . As a Civil Engineer, you will be leading projects to ever-evolving customer needs for the new and existing AWS data center fleet... as we continue hyper-scale expansion. A Civil Engineer in Data Center Engineering - Civil, Structural, Architectural Solutions (DCE... • Sat, 02 MarAmazon
SQL Data Administrator / Data Engineer » Sydney, NSW - is all about With us, you'll have an Impact . As a Data Engineer, you will be responsible for designing, implementing and maintaining our products... team fits in The SPACE tribe within the Technology and Data Group is responsible for driving REA's internal tooling... • Sat, 02 MarREA Group
Lead Data Warehouse Engineer » Sydney, NSW - Australian Stock Exchange (ASX: ABB). Job Description We are looking for a Lead Data Warehouse Engineer for our Data... Warehouse Engineer: Establish strong working relationships with internal stakeholders to build data warehousing solutions... • Sat, 02 MarSymbio
Data Engineer » Australia - Use your ETL, cloud, automation skills and business acumen to drive business value and enhance organisation's data platforms and data use. About the Company The Client is a Financial Services organisation based in Melbourne's CBD. They are a Data-driven organisation and are constantly looking at better utilising their data for business use cases. About the Role As a Data Engineer, you will work on Data Engineering solutions and projects end-to-end. You will use your business understanding to solve business problems using the Data Tech stack. You will work with a variety of stakeholders and act as the Subject Matter Expert for Cloud Data Engineering. Duties Design, develop, and maintain scalable data pipelines for ETL processes within cloud environments. Oversee Azure Cloud infrastructure and services including: Data Factory, Data Lake, Synapse, Databricks, Apache Spark (Databricks and Snowflake being explored now). Integrate a wide range of disparate source systems and databases to the centralised Azure data lake. Collaborate with internal business stakeholders to understand different systems (ERP, CRM etc.) and the organisational data uses. Consult with internal colleagues to understand business logic. Support future Advanced Analytics and Machine Learning projects. Promote the utilisation and understanding of Master/Metadata while driving its development. Lead best practice in data handling and storage methods. Skills & Experience Experience working commercially with Cloud Technology (AWS, GCP or Azure) and on-prem data warehouses ETL, data migration and/or integration experience (SSIS, stored procs., ADF etc.) Master data concepts and tool understanding Experience with data architecture concepts and dimensional data modelling concepts Python and SQL experience Ability to seek solutions and passion for learning new technologies How to Apply If you are a forward-thinking Data Engineer with a passion for driving data-driven solutions, please apply and suitable candidates will be contacted for a further conversation. J-18808-Ljbffr • Sat, 02 MarClearCompany
Data Engineer » Dunnstown, Moorabool Area - Opportunity snapshot: An exciting, permanent opportunity has become available for a Data Engineer. As a Data Engineer at Bupa you will bring specialist skills which enable Bupa ANZ to execute on its Data Strategy. Collectively, we are working to build the future state of Bupa’s data platforms on Microsoft Azure services, and to enable the business to work strategically with data to maximise competitive performance. Data Engineers contribute to the evolution of Bupa’s data estate via squad, project, and business partnering approaches. At Bupa ANZ, Data Engineers have a broad skill set, and work across various Data Services; Operational Data and Analytics, Data Warehouse and BI, Advanced Analytics and Self-service, and across different business units including Health Insurance, Health Services, and corporate functions. Data Engineers deliver solutions aligned to our various Target State Architectures across a range of technologies. They are responsible for implementation of data solutions aligned to our standards, and consistent with engineering guidelines. They are also contributors towards the continuous improvement of the engineering practices of the team. How will I help? Identify a variety of solution options and work collaboratively with stakeholders to analyse the pros/cons and implications of each option, decide on the right option to proceed with. Engineer data pipelines from data acquisition and ingestion through the data lake to various means of consumption including reports, analytics, dashboards, applications, models, and APIs. Strong knowledge using SQL to write and tune complex and optimised queries over large datasets. Engineering of CI/CD processes and execution of releases following source code control and DevOps best practices. Ensure solutions use secure development practices that are well tested including identification and implementation of data quality and integrity checks. What do I need? Educated to a minimum of degree level in engineering, computer science, or related technology discipline. 2 to 5 years’ experience as a software or data engineer. Azure certification(s) or equivalent. Strong SQL/T-SQL. Strong Spark. Strong CI/CD experience. About us: Bupa has a strategic goal of being the most customer-centric digital healthcare organisation, with the use of data as an explicit pillar of this strategy. The program pillars include: A Customer-Centric Focus - Bupa aims to grow our data products, the data that people have access to and the utilisation of data as a strategic asset that drives our Connected Care, Core Modernisation, and other strategic change agents Data Access and Democratization - Ensuring everyone has access to and can understand the data Bupa holds; reducing barriers to entry and make the complex simple for people who want to leverage our data assets Support for Access - Help leaders in ensuring easy access to data for people that they work with Empowerment - Enabling business people to do their role as owners, stewards, and leaders. Delivery & Customer Value First - Treat data as an asset to create value for customers and remove barriers to achieving compliance, risk, or other barriers to value What’s in it for you? As well as a competitive salary, a range of Bupa benefits and flexible working / work from home, you’ll be challenged and encouraged to innovate. You will collaborate strongly with colleagues who are committed to delivering exceptional experiences. 
We trust, respect and consider everyone, knowing your difference will make the difference. Other benefits include subsidies on our health insurance, travel, car, home, contents and pet insurance products as well as Bupa services such as Dental and Optical. You can also access a ‘People First’ wellness program which provides a range of services such as health coaches, annual skin checks and flu vaccinations, assistance with nutrition, mental and general well-being guides and product discounts. You’ll feel happier & healthier working at Bupa J-18808-Ljbffr • Sat, 02 MarBupaoptical
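
The Bupa listing above calls out "identification and implementation of data quality and integrity checks". As a generic illustration of that idea only (not Bupa's framework: the table name, key columns and checks are invented, and conn stands for any DB-API connection such as pyodbc), a pipeline step might gate publication on a few SQL checks:

    # Illustrative sketch only: simple SQL-based data quality gates run before a
    # table is published. Table and column names are hypothetical; conn is any
    # DB-API 2.0 connection (for example pyodbc or sqlite3).

    def run_quality_checks(conn, table="curated.claims"):
        checks = {
            "row_count_positive": f"SELECT COUNT(*) FROM {table}",
            "no_null_member_ids": f"SELECT COUNT(*) FROM {table} WHERE member_id IS NULL",
            "no_duplicate_claims": (
                f"SELECT COUNT(*) FROM ("
                f" SELECT claim_id FROM {table} GROUP BY claim_id HAVING COUNT(*) > 1"
                f") dupes"
            ),
        }
        failures = []
        cur = conn.cursor()
        for name, sql in checks.items():
            cur.execute(sql)
            value = cur.fetchone()[0]
            # The row-count check must be positive; the other checks must return zero.
            ok = value > 0 if name == "row_count_positive" else value == 0
            if not ok:
                failures.append(f"{name} failed (value={value})")
        if failures:
            # Fail the pipeline run so bad data never reaches consumers.
            raise ValueError("; ".join(failures))
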
Data Engineer » Australia - Job Description: Data Engineer (Job Number: 24000082). About the role: The Data Engineer will be entrusted to design, develop, and maintain the infrastructure and processes that enable the collection, storage, management, and conversion of raw data from numerous sources into usable data assets in a central storage environment to be utilised by data professionals within Analytics and across RACQ to provide key business insights. Working closely with Analytics and other RACQ teams, the Data Engineer will provide data to support the business in meeting its key performance objectives. The Data Engineer will be responsible for: Building and maintaining data pipelines that extract data from various sources, transforming it into a suitable format, and loading it into data storage systems for analysis. Monitoring data pipelines and platforms to identify and resolve any issues or bottlenecks that may arise. Building and managing data warehouses, data lakes, and other data platforms to store large volumes of structured and unstructured data for analytical purposes. Automating data-related tasks to streamline processes and reduce manual interventions. Staying up to date with advancements in data pipelines, platforms, tools, and related fields. Conceptualising and generating infrastructure that allows data to be accessed and analysed appropriately and ethically. Providing and supporting a development environment with tools to enable data professionals to provide new solutions. About you Excellent understanding of AWS environments and services Expert proficiency in SQL and Python. Proven analytical and problem-solving skills. Demonstrated interpretation, problem-solving and decision-making skills. Well-developed communication and internal consulting skills. Why join RACQ? It is an exciting time to be part of RACQ as we continue our transformation towards better, smarter solutions that deliver more security and prosperity for Queenslanders. We are big enough to make a real difference in our community, yet small enough to nurture your professional and personal development. By joining RACQ, you’ll be part of new sustainability initiatives, including a commitment to being a carbon neutral organisation by 2030. We’ve invested in and partnered with sustainable companies, including the introduction of home solar and batteries, Green Home and Car Loans, and many other products and initiatives. We’ll continue to advocate for, research, and offer green solutions. RACQ is committed to ensuring that every employee has access to flexibility in their role. We encourage all candidates to discuss what flexibility means to you, and how we can help you find your balance. At RACQ we are passionate about building a diverse and inclusive workplace to create better outcomes for our employees and our members. We aim for our teams to be a true reflection of our communities and are committed to inclusion, particularly for people of Aboriginal and Torres Strait Islander heritage, and those who may experience diversity or disability related barriers in securing employment. Applicants from these and other diverse backgrounds are encouraged to apply. If you have any support or access requirements, please advise us at the time of applying. 
What we offer you: Our employee benefits are designed to enrich your work life, your home and the community you live in, some of these include: Financial benefits and discounts on a range of banking, assistance and insurance products Give back to our communities with paid volunteer leave, payroll giving and volunteering for community events such as MotorFest and EKKA A range of flexible working and leave options, parental leave of 13 weeks primary carer leave, purchased leave up to 4 weeks per year, cultural observance leave, substitution of public holidays for cultural/religious reasons, personal leave for wellbeing, gender affirmation leave. Prioritise your wellbeing with our onsite and virtual health hub, discounts on health programs and access to wellbeing initiatives such as yoga, mindfulness and free and confidential Employee Assistance Program (EAP). Invest in your development with in-house development programs and access to 80,000 online courses. Please Note: to objectively and comprehensively assess your potential and suitability for the role, RACQ may require you to complete video interviews, psychometric assessments and/or criminal history check as part of the application process. If you require any adjustments during the application process, please advise the RACQ Careers team. RACQ is a ‘Circle Back Initiative’ employer – we commit to respond to every applicant. J-18808-Ljbffr • Fri, 01 MarRoyal Automobile Club of Queensland
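
The RACQ role above emphasises AWS, SQL and Python proficiency and automating data-related tasks. As an illustrative sketch only (the bucket, prefix, column names and staging layout are invented, not RACQ's platform), a small automated staging job might pull newly arrived CSV files from S3, apply basic cleaning, and write them back as Parquet:

    # Illustrative sketch only: stage raw CSV files from S3 as cleaned Parquet.
    # Bucket, prefix and column names are hypothetical; needs boto3, pandas
    # and pyarrow installed, plus AWS credentials in the environment.
    import io

    import boto3
    import pandas as pd

    s3 = boto3.client("s3")
    BUCKET = "example-raw-bucket"
    PREFIX = "claims/2024/"

    def stage_new_files():
        listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
        for obj in listing.get("Contents", []):
            key = obj["Key"]
            if not key.endswith(".csv"):
                continue
            body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
            df = pd.read_csv(io.BytesIO(body))
            # Basic cleaning before the data reaches the warehouse staging zone.
            df = df.dropna(subset=["policy_id"]).drop_duplicates()
            out_key = key.replace("claims/", "staging/").replace(".csv", ".parquet")
            buffer = io.BytesIO()
            df.to_parquet(buffer, index=False)
            s3.put_object(Bucket=BUCKET, Key=out_key, Body=buffer.getvalue())

    if __name__ == "__main__":
        stage_new_files()
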
Data Engineer » Largs North, Port Adelaide Area - Data Engineers are in high demand, so an Internship in this particular stream of data science is a terrific way to start your career. Through this Data Engineer Internship, you will learn a wide range of skills in data extraction, manipulation, artificial intelligence, and database development to help you pursue a career in Data Engineering. As a Data Engineer Intern, you will be part of a high-performing data science team. You will have the opportunity to implement machine learning and deep learning technologies as well as use Python for data cleaning, wrangling and visualisation projects. Start your career as a Data Engineer with the Data Engineer graduate internship program. Data Engineer Internship Responsibilities Manage and analyse data to find trends for decision making Perform data acquisition from large, complex data sources using Python Support the Senior Data Engineer through different data extraction techniques Develop database schemas Work with Data Scientists to collect and analyse large data sets Work across multiple databases as a Data Engineering Intern What you will learn in your Data Engineer Internship? Develop your skills using OLTP databases, OLAP databases and ORMs and NoSQL databases Gain hands-on experience using extract-load data pipelines Experience using cloud platforms, DevOps, micro-services, and serverless architectures (ideally AWS) and deployments How to apply for the Data Engineer Internship? To be considered for this Data Engineering Internship, you need to be a member of the CSA Internship Academy. To learn more about doing an Internship and to book a free Career Counselling and Internships phone consultation, click the button below. Career Training, Mentoring & Internship Program Management fees apply to become a member of the CSA Internship Academy. J-18808-Ljbffr • Fri, 01 MarCareer Success Australia
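
To give a flavour of the Python data cleaning and extract-load work the internship above describes (a generic illustration only; the source file, column names and output path are invented):

    # Illustrative sketch only: a tiny extract-clean-load step with pandas.
    # The source file, column names and output path are hypothetical.
    import pandas as pd

    def extract_clean_load(src="raw/sales.csv", dest="clean/sales.parquet"):
        df = pd.read_csv(src)

        # Standardise column names and types.
        df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
        df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
        df["amount"] = pd.to_numeric(df["amount"], errors="coerce")

        # Drop records that cannot be used downstream.
        df = df.dropna(subset=["order_id", "order_date"]).drop_duplicates("order_id")

        # Load: write a columnar file that analytics tools can query efficiently.
        df.to_parquet(dest, index=False)
        return len(df)

    if __name__ == "__main__":
        print(f"{extract_clean_load()} rows loaded")
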
Data Engineer » Australia - Engineering - Software (Information & Communication Technology) We have an exciting opportunity for a Data Engineer to join our growing team Open to applications from Melbourne and Brisbane. Nucleus Network is the only global phase 1 clinical trial specialist with facilities in the USA and Australia, fulfilling our purpose of “advancing medicine and improving lives” , by attracting the best talent. Since our establishment in 2004, Nucleus Network has conducted well over 1000 phase 1 clinical trials for biotechnology and pharmaceutical companies from the USA, Europe, Australia, Japan, South Korea, Taiwan, and China. We are defined by our values, great work culture, commitment to staff development and dedication to provide the best possible work environment. Come and join us through this exciting journey of expansion and gain world-class experience servicing some of the most innovative biotechnology and pharmaceutical companies globally. About the role The Data Engineer is responsible for designing, developing, and maintaining data engineering solutions within contemporary Microsoft platforms. The role sits as a member of the Information Technology team and works across functional departments to ensure the delivery of high-quality, scalable, and secure data solutions supporting the business’ digital transformation objectives and compliance requirements. Key responsibilities Design, develop, and deploy data engineering solutions within Microsoft platforms Lead the design and implementation of a data lake solution in Microsoft Fabric, ensuring data is accurate, consistent, accessible, and secure Collaborate with stakeholders to understand data needs and requirements and provide solutions Ensure data engineering solutions comply with relevant regulations, standards, and best practices Troubleshoot and resolve data engineering issues and provide technical support and guidance Document and maintain data engineering processes, procedures, and artefacts, and provide training to other IT staff and users Requirements Tertiary qualification in Computer Science, Information Technology, or related field. At least 4 years proven experience as a Data Engineer, with a focus on Microsoft platforms (Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure Functions, and Power Platform tools) Able to design and implement data lake solutions, preferably in Microsoft Fabric or Snowflake Strong knowledge of data engineering concepts, methodologies, and best practices, such as data modelling, data pipelines, data quality, data governance, data security, and data privacy Familiar with various data sources and formats, such as structured, semi-structured, and unstructured data, SQL, NoSQL, JSON, XML, etc. Experience using various data analysis and visualisation tools, such as Power BI, Tableau, etc. 
Excellent communication and interpersonal skills to collaborate with diverse stakeholders High level analytical and problem-solving skills Knowledge of clinical research processes and regulatory requirements Meaningful Impact - 'advancing medicine, improving lives' of people around the world Professional Growth Opportunities - we believe in investing in our employees' development Employee Recognition - your hard work and achievements will be recognised and rewarded Employee Wellness Initiatives - we prioritise the health and wellbeing of our employees Hybrid work options available following completion of initial training If you are interested in applying for this position, please include a CV and cover letter with your application. At Nucleus Network we are committed to creating a supportive and inclusive space where every individual feels they can be their full selves. Our policy applies to all aspects of employment, including our hiring and selection process. We are an equal opportunity employer, seeking people from all life stages, backgrounds and life perspectives. Our focus is on people who embody our values of drive, teamwork, customer focus and excellence. For further information about Nucleus Network, please go to www.nucleusnetwork.com.au Your application will include the following questions: Which of the following statements best describes your right to work in Australia? How many years' experience do you have as a Data Engineer? What's your expected annual base salary? How much notice are you required to give your current employer? J-18808-Ljbffr • Fri, 01 MarNucleus Network
Data Engineer » Australia - Data Engineer contract opportunity expanding and optimising an FMCG business's Data & Analytics platform through the ingestion, transformation and provisioning of data for their Insights teams. This role will both support and grow capabilities within the platform to deliver value in an iterative way, championing a data-driven culture and adhering to data governance practices. Responsibilities: Design, Build, Test and Deploy highly available & reliable data pipelines, architectures, and data structures (data sets, views, data marts etc) leveraging relevant tools Drive further business value creation by leveraging new and emerging data platform toolsets as part of an innovation pipeline. Assemble large, complex structured and unstructured data sets that meet functional / non-functional business requirements including security and governance. Update existing data models and metadata repositories for new use cases. Design and implement internal process improvements: automating manual processes and reconciliations, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the technology required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Python and AWS data technologies. Work with stakeholders including the Executive, Product, Data and Insights teams to resolve data-related technical issues and support their data needs. Keep our data secure, documented, tagged and accurate Create data sets for analytics and data scientist team members that assist them in building and optimizing our product to drive innovation and competitive advantage. Work with data and analytics experts to strive for greater functionality in our data system Knowledge: Solid understanding of Data & Analytics concepts and technologies including data warehousing, data lakes, ETL and visualisation and reporting, and how business value is created from these technologies. Skills in use of data engineering tools and languages including SQL, Python, Athena, Redshift & Glue. Solid understanding of Data Architecture, Data modelling, Data marts and Metadata management best practice Exposure to Agile delivery including definition of User Stories Emerging Design thinking and Human centred design understanding Degree in Computer Science, Data Science, IT, or similar field Experience: Minimum 1-2 years' experience working with modern data platforms (AWS, Azure etc.) Minimum 1-2 years of data engineering experience Good communication and presentation skills Please apply today. J-18808-Ljbffr • Fri, 01 MarAmbition Group
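
The listing above names Athena, Redshift and Glue alongside SQL and Python. As a rough, illustrative sketch of the Athena piece only (the database, table and results location are invented, not the client's environment), a Python task might submit a SQL query and poll until it completes:

    # Illustrative sketch only: run a SQL query on Athena from Python and wait
    # for it to finish. Database, table and output location are hypothetical.
    import time

    import boto3

    athena = boto3.client("athena")

    def run_athena_query(sql, database="analytics",
                         output="s3://example-athena-results/"):
        qid = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": output},
        )["QueryExecutionId"]

        # Poll until Athena reports a terminal state.
        while True:
            state = athena.get_query_execution(QueryExecutionId=qid)[
                "QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(2)

        if state != "SUCCEEDED":
            raise RuntimeError(f"Athena query ended in state {state}")
        return athena.get_query_results(QueryExecutionId=qid)

    results = run_athena_query(
        "SELECT channel, COUNT(*) AS orders FROM orders GROUP BY channel"
    )
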
Data Engineer » Evandale, Northern Midlands - Permanent position Mid – Senior Level G08: $111,205 - $118,509 plus 17% Super, based on relevant experience Join an organisation that is not just embracing digital transformation – they are leading it. There is more to this role than being one of their powerhouse data engineers who are behind designing and maintaining complex data pipelines. You will collaborate with stakeholders at different levels, translating business requirements and delivering data solutions across various specialist teams – the ability to be flexible and adaptable is essential. If you thrive in the fast lane and love tackling challenges – APPLY NOW. What we’re looking for: 3 years' experience in data engineering tools and techniques (ETL, data modelling, reporting) Experience in data warehousing, integration, and management of large datasets 2 years of experience working in an AWS cloud environment Excellent documentation skills Familiarity with version control – GitHub What is in it for you? WFH 1 day per week and flexible working hours Onsite parking available Competitive package 17% Super Work alongside industry leaders across Data & Analytics This role is open for someone wanting to relocate, however the work is full-time based in Perth. You must already have full-time working rights in Australia. To find out more contact Jaydene d'Offay on +61 8 6146 4468 or Jaydene.doffay@versegroup.com.au J-18808-Ljbffr • Fri, 01 MarVerse Group
Data Engineer » Australia - Data Engineer Join our client who is Australia's largest privately owned transport company and be part of the data-driven journey. We're seeking a Data Engineer to help the organisation make smarter decisions and support a growing business with data solutions. Key Responsibilities: Create data solutions for better decision-making. Build and maintain our data warehouse. Design ETL processes for data storage. Set up efficient data pipelines. Develop data models and processing systems. Improve data quality. Work well with others. You'd bring with you: Experience in data engineering, cloud-based solutions. Knowledge of databases, data warehousing, and data architecture. Strong Python and SQL skills. In-depth knowledge of Power BI or similar tools. Ability to work with data and solve problems. Good communication skills. Attention to detail. If you're excited to help us use data to drive our business forward, apply now. Interested? We'd love to hear from you. If what you have read has piqued your interest, apply now or contact our Technical Consultant, hazel@digitalresources.com.au for further information. We would still want to hear from you if you're enthusiastic about the position but unsure if you match the requirements. When a new opportunity arises, you will be given priority consideration once you join our talent network. At this stage, our client can only consider those with Australian PR/Australian Citizenship in place and who are currently based in Western Australia. J-18808-Ljbffr • Fri, 01 MarDigital Resources
Data Engineer » The Rocks, Sydney - Job Description Since ChatGPT came out, everyone wants a Data Engineer on the team. But in reality, fewer than 5% of data (or AI) projects ever make it to production. This strategy firm has been playing in the space way before it was cool. The majority of their projects are in financial services in Sydney and across APAC (lots on in Singapore if you fancied being based there). Think projects focused on things like Open Banking, Blockchain and AI. They aren't into "body shopping" and most of their work is more concerned with strategy, POCs and architecture rather than pure delivery (there may be some oversight required of partner firms from time to time). You can create a flexible working dynamic best suited to your requirements. People are treated like adults here. There will be plenty of opportunities to advance your career (locally and globally if you wish). The Data Engineer role isn't about winning work; the consultants handle that. However, you need to be able to build relationships with technical and business peers on the client side, and in partner agencies. A commercial background in Data Engineering / Software / Consulting (5 years). Practical application of data technologies such as Spark. An ability to translate technical concepts into language that normal people understand. Financial services domain experience. Please get in touch with ronny@theonset.com.au or on 0448 808 848 to discuss. J-18808-Ljbffr • Fri, 01 MarOnset Group
Data Engineering » Canberra Region, Australian Capital Territory - Please respond to the job if you are an Australian Citizen residing in Australia and have a Baseline Clearance. Contract start 01 March 2023, 12-month term, 3 x 12-month extensions. Australian Citizen, Baseline minimum with ATO PIEP clearance, Canberra role. Send your responses to jobs@softtestpays.com Overview 1 Background The Australian Taxation Office (ATO) is striving to become a more data-driven, client-centric organisation, using data and analytics to improve the administration of tax and superannuation systems. The ATO is collecting increasingly greater volumes of data from a wide variety of sources. To support and inform ATO business decisions, this data must be curated and discoverable, accessible, and usable. The Data Acquisition Prefill and Sharing Program (DAPS) is a complex multi-year program aiming to: Establish a foundational technical solution for a new data acquisition pattern that is scalable and extensible Migrate critical use-cases off the on-premise Cloudera Enterprise Data Hub (CDH6) to a new cloud hosted architecture Migrate and mature the ATO's Advanced Analytics Platform to a cloud platform Uplift data matching and sharing capabilities, and Acquire, ingest, and use new data. This program will establish a grouping of capabilities that will provide curated business-ready data that will underpin ATO decision making, providing the basis for tailored client services and improved compliance activities. 3 Detailed Scope & Services The ATO requires specialised expertise in cloud architecture and security (including multi-cloud architecture and operating models), data migration to cloud (with specific experience migrating from Cloudera CDH), and expertise in data architecture and pattern-based data platform implementation. The Provider will work in a blended ATO/Service Provider team to deliver the services. The provider will be engaged on a Time & Materials basis to: 1) Provide strategic, technical, and architectural guidance and advice on design and implementation of cloud-based data and analytic solutions across capabilities including (but not limited to): a. Cloud Architecture b. Data Architecture c. Data Migration d. Data Engineering e. DevOps practices f. Cloud security (including user, data, and application layer) g. Network capacity & Bandwidth h. Cost Optimisation 2) Provide technical and strategic advice in other specialist areas, including seamless data movement, processing and governance, and pattern-based delivery architecture 3) Assist in the formulation and creation of data integration solutions across the ATO's data and analytic landscape, including data movement across mainframe, midrange and third-party data sources into the Teradata Enterprise Data Warehouse and Cloud hosted Enterprise Lakehouse (ELH) architecture. 4) Contribute to the successful migration of data and applications from Cloudera Enterprise Data Hub (CDH) to the cloud hosted Enterprise Lakehouse (ELH), including supporting contingency planning and mitigation options around CDH approaching end-of-life. 5) Work with teams to contribute to the successful delivery of new cloud hosted enterprise systems and subcomponents across the entire software development lifecycle, including but not limited to architecture, design, build and test activities as required. 
4 Resourcing skillsets The Service Provider engaged by the ATO will be required to provide 1 x resource with demonstrated skills and experience relevant to the above required scope and services, and: Have demonstrated experience and expertise in data migration from the on-premise Cloudera Enterprise Data Hub (CDH) to cloud. Have demonstrated experience and expertise in establishing cloud architecture, including across a multi-cloud, multi-vendor environment. Have demonstrated experience in data architecture, including defining data acquisition patterns and governance. Have demonstrated experience in the implementation of an enterprise data platform. Have proven ability to work with business users and technology stakeholders in solving problems, providing advice, and formulating solutions. 7 Security Clearance The Service Provider resource must have a current Baseline Security Clearance and will also be required to undertake a Pre-Employment Integrity Check prior to commencing work with the ATO. • Fri, 01 MarSofttest pays pty ltd
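As a concrete illustration of the CDH-to-cloud migration work described in the ATO listing above, the hedged PySpark sketch below offloads an on-premise Hive table to partitioned Parquet in cloud object storage. The database, table and storage account names are invented for illustration and are not ATO systems.

```python
# Hypothetical sketch of one step in a CDH-to-cloud migration: read a Hive table
# on the on-premise cluster and land it as Parquet in cloud object storage.
# Assumes the cluster's Hadoop configuration already has a connector for the
# target storage (abfss/s3a); all names below are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("cdh_table_offload")
    .enableHiveSupport()          # access the Hive metastore on the CDH cluster
    .getOrCreate()
)

source = spark.table("prefill.client_income_statements")   # hypothetical Hive table

(
    source
    .repartition("income_year")                             # keep output files a sensible size
    .write
    .mode("overwrite")
    .partitionBy("income_year")
    .parquet("abfss://landing@examplelake.dfs.core.windows.net/prefill/client_income_statements")
)
```

In a real migration the same pattern would be repeated per table, typically orchestrated by a scheduler and validated with row-count and checksum comparisons against the source.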
Data Engineer » Canberra Region, Australian Capital Territory - Australian Citizens residing in Australia with Baseline Clearance only respond. Contract start 14 April 2023, 12-month term, 12-month extensions. Australian Citizen, Baseline Clearance, Canberra or Brisbane, Hybrid role. Send your responses to jobs@softtestpays.com Overview Data Engineer Understand complex information systems and design effective data solutions and models to store and retrieve departmental data in adherence to enterprise standards. Drive stakeholder workshops to identify client requirements and conceptualise and communicate these requirements via visual aids such as data models. Lead and champion designs including Data Integration Designs, Data Structures, Data Flows, Solution Architectures, Design Standards and Data Strategies, to both business executives and technical staff. Configure data stores and data flows across our enterprise platforms (including MS Synapse, IBM Domino, Oracle Database and the Teradata EDW environment). Recommend solutions and patterns for improving processes in existing or new systems/capabilities. Support the iterative design and definition of common entities in the Enterprise Data Model. Work collaboratively in a Scaled Agile Framework team to deliver contemporary data solutions and prototypes in a hybrid hyperscale cloud environment. The Deliverables for the Data Engineer include, but are not limited to, the following: Data Management artefacts including Data Strategies, business glossaries, business rules, data quality, governance principles, stakeholder engagements and presentations/showcases; Data Architecture artefacts including Solution Architecture Documents, Patterns, Design Standards and Options papers; Data Modelling artefacts such as conceptual, logical and physical models, data dictionaries and canonical/enterprise data/information models; Configuration of Microsoft Synapse and related tools to implement relevant data flows and data governance. Required skills and experience Data Engineer Demonstrated ability to lead a team of developers and data engineers in building effective data products. Demonstrated experience implementing and supporting large data integration patterns and complex data migrations. Demonstrated experience in implementing data transfer patterns including event, streaming and batch processes. Demonstrated experience in both ETL and ELT tools and techniques. Familiarity with emerging agile data management approaches including DataOps and MLOps. Demonstrated release management experience in coordinating concurrent releases as part of a hybrid Scaled Agile / Waterfall delivery process within a large organisation. A strong ability to adapt to, and work across, various delivery methodologies (ITIL, Agile, SAFe etc.). Demonstrated ability to work effectively with multi-disciplinary teams. A continuous improvement mindset. A proven ability to communicate with a range of stakeholders. Proven interpersonal (soft) and communication skills. Experience using Azure DevOps, SharePoint and TRIM highly valued. Qualifications in the Microsoft Synapse platform or related Microsoft products, or equivalent experience and the ability to gain this qualification. Every application is required to address the selection criteria as part of submission. Essential Criteria 1. Experience with Azure DevOps, Data Factory, SQL Server, Blob/Gen2 Data Lake storage and Azure Synapse. (60%). 2. Experience working in an agile software development team. (20%). 3.
Bachelor's degree in Computer Science or a relevant qualification in the information technology field. (20%). • Fri, 01 MarSofttest pays pty ltd
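Since the listing above asks for experience in both ETL and ELT tools and techniques, here is a minimal sketch of the difference using pandas and an in-memory SQLite database as a stand-in for a platform such as Synapse or Teradata; the table and column names are hypothetical.

```python
# Minimal ETL vs ELT illustration: same raw extract, two loading patterns.
import sqlite3
import pandas as pd

raw = pd.DataFrame(
    {"client_id": [1, 2, 2], "lodged": ["2024-01-03", "2024-01-05", "2024-01-05"]}
)
conn = sqlite3.connect(":memory:")

# ETL: transform first (deduplicate, normalise the date), then load only the
# curated rows into the target table.
curated = (
    raw.drop_duplicates()
       .assign(lodged=lambda d: pd.to_datetime(d["lodged"]).dt.strftime("%Y-%m-%d"))
)
curated.to_sql("fact_lodgement_etl", conn, index=False)

# ELT: load the raw extract as-is, then transform inside the target with SQL.
raw.to_sql("stg_lodgement", conn, index=False)
conn.execute(
    """
    CREATE TABLE fact_lodgement_elt AS
    SELECT DISTINCT client_id, DATE(lodged) AS lodged
    FROM stg_lodgement
    """
)
print(pd.read_sql("SELECT * FROM fact_lodgement_elt", conn))
```

In the ETL path the transformation happens before the load, so only curated rows reach the target; in the ELT path the raw extract lands first and the transformation is expressed as SQL inside the target platform.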
Data Engineer » Brisbane, Brisbane Region - Australian Citizens residing in Australia with Baseline Clearance only respond. Contract start 14 April 2023, 12-month term, 12-month extensions. Australian Citizen, Baseline Clearance, Canberra or Brisbane, Hybrid role. Send your responses to jobs@softtestpays.com Overview Data Engineer Understand complex information systems and design effective data solutions and models to store and retrieve departmental data in adherence to enterprise standards. Drive stakeholder workshops to identify client requirements and conceptualise and communicate these requirements via visual aids such as data models. Lead and champion designs including Data Integration Designs, Data Structures, Data Flows, Solution Architectures, Design Standards and Data Strategies, to both business executives and technical staff. Configure data stores and data flows across our enterprise platforms (including MS Synapse, IBM Domino, Oracle Database and the Teradata EDW environment). Recommend solutions and patterns for improving processes in existing or new systems/capabilities. Support the iterative design and definition of common entities in the Enterprise Data Model. Work collaboratively in a Scaled Agile Framework team to deliver contemporary data solutions and prototypes in a hybrid hyperscale cloud environment. The Deliverables for the Data Engineer include, but are not limited to, the following: Data Management artefacts including Data Strategies, business glossaries, business rules, data quality, governance principles, stakeholder engagements and presentations/showcases; Data Architecture artefacts including Solution Architecture Documents, Patterns, Design Standards and Options papers; Data Modelling artefacts such as conceptual, logical and physical models, data dictionaries and canonical/enterprise data/information models; Configuration of Microsoft Synapse and related tools to implement relevant data flows and data governance. Required skills and experience Data Engineer Demonstrated ability to lead a team of developers and data engineers in building effective data products. Demonstrated experience implementing and supporting large data integration patterns and complex data migrations. Demonstrated experience in implementing data transfer patterns including event, streaming and batch processes. Demonstrated experience in both ETL and ELT tools and techniques. Familiarity with emerging agile data management approaches including DataOps and MLOps. Demonstrated release management experience in coordinating concurrent releases as part of a hybrid Scaled Agile / Waterfall delivery process within a large organisation. A strong ability to adapt to, and work across, various delivery methodologies (ITIL, Agile, SAFe etc.). Demonstrated ability to work effectively with multi-disciplinary teams. A continuous improvement mindset. A proven ability to communicate with a range of stakeholders. Proven interpersonal (soft) and communication skills. Experience using Azure DevOps, SharePoint and TRIM highly valued. Qualifications in the Microsoft Synapse platform or related Microsoft products, or equivalent experience and the ability to gain this qualification. Every application is required to address the selection criteria as part of submission. Essential Criteria 1. Experience with Azure DevOps, Data Factory, SQL Server, Blob/Gen2 Data Lake storage and Azure Synapse. (60%). 2. Experience working in an agile software development team. (20%). 3.
Bachelor's degree in Computer Science or a relevant qualification in the information technology field. (20%). • Fri, 01 MarSofttest pays pty ltd
Data Engineer » Canberra Region, Australian Capital Territory - Australian Citizens with the ability to obtain NV1 Clearance residing in Australia only respond. Contract start 03 October 2023 to 30 June 2023, 2 x 12-month extensions. Australian Citizen, ability to obtain NV1 Clearance, Canberra role. Send your responses to jobs@softtestpays.com Overview The Data Capability Exchange (DCx) section is a part of the Enterprise Transformation Branch within the Enterprise Transformation Division. We are a collaborative and driven team of professionals focused on supporting business with data analytics, geospatial capability and report visualisation to inform broader organisational governance, policies and operations. It is an exciting time as the team also works closely with the Election Systems Modernisation program to define the new landscape for the AEC systems and processes. As part of the team, you also get to see tangible outcomes from your work: most notably the delivery of a Referendum and other electoral events and services. The Power BI Data Engineer will be primarily focussed on improvements to existing Power BI reports as well as development of new reports utilising both Power BI Reporting Services (on-prem) and Cloud Services platforms. Data Engineers will also be expected to have an understanding of the report-development life cycle, articulate data visualisation techniques with report owners and be able to work with limited supervision. The role will be responsible for, but not limited to: Evaluate business needs and objectives. Conduct complex data analysis and report on results. Explore ways to enhance data quality and reliability. Build algorithms and prototypes. Interpret trends and patterns. Identify opportunities for data acquisition. Prepare data for prescriptive and predictive modelling. Collaborate with relevant stakeholders on projects/outcomes. Analyse and organise data from different sources, including structured and unstructured data. Build data systems and pipelines, including ETL/ELT processes. Every application is required to address the selection criteria as part of submission. Mandatory Criteria Previous experience in a similar role, i.e. Data Engineer, Data Analyst, Developer. Numerical and analytical skills. Hands-on experience with SQL database design, SSAS, SSIS. Knowledge of programming languages and frameworks (e.g. C#, Python, R, SQL, BIML). Technical expertise with data modelling, data mining, and segmentation techniques. Working knowledge of the Microsoft Power Platform, mainly Power BI. Degree and/or related certification. Desirable Criteria Experience in Power BI Cloud Services administration: user access, workspaces, publishing. Understanding of pro-active co-design/collaboration techniques. Experience in developing corporate/strategic reports for senior management. • Fri, 01 MarSofttest pays pty ltd
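To illustrate the kind of preparation step that typically sits behind a Power BI report like those described above, here is a hedged Python sketch that aggregates operational rows into a reporting table with SQL. The server, database, table and column names are placeholders, and it assumes SQLAlchemy with a pyodbc/ODBC driver is available.

```python
# Hedged sketch: pull operational rows from SQL Server, aggregate them, and
# write the result to a reporting table a Power BI dataset refresh can read.
# All object names are placeholders, not real AEC systems.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://reporting_srv/ElectoralEvents?driver=ODBC+Driver+18+for+SQL+Server"
)

daily = pd.read_sql(
    """
    SELECT CAST(issued_at AS date) AS issue_date, division, COUNT(*) AS declaration_votes
    FROM dbo.DeclarationVoteIssues
    GROUP BY CAST(issued_at AS date), division
    """,
    engine,
)

# Land the aggregate where the Power BI dataset (import or DirectQuery) picks it up.
daily.to_sql("rpt_DeclarationVotesDaily", engine, schema="dbo",
             if_exists="replace", index=False)
```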
Data Engineer » Sydney, Sydney Region - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
Data Engineer » Adelaide, Adelaide Region - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
Data Engineers » Canberra Region, Australian Capital Territory - Australian Citizens residing in Australia with the ability to obtain Baseline Clearance only respond. Contract start 18 December 2023 to 30 June 2024, 2 x 12-month extensions. Australian Citizen, ability to obtain Baseline Clearance, Canberra, Offsite role. Send your responses to jobs@softtestpays.com Overview The Department of Agriculture, Fisheries and Forestry (DAFF) is looking for Data Engineers with strong experience developing Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT) processes across several data and analytics platforms to join its Enterprise Analytics section. The role will be responsible for design, development and unit testing activities across several data movement and transformation processes within DAFF. These data movement processes are being developed to focus on the preparation of data for use in decision-making processes across the department, utilising modern cloud technology (Azure) to enable operational analytics use cases. The successful candidate will require experience with the following techniques and technologies: MS Azure Stack - Data Integration: Data Factory, SQL Server Integration Services, Databricks; Data Store: SQL Server, Data Lake Storage; Analytics: Azure Databricks, Azure Machine Learning, ArcGIS Enterprise; Development Tools: DevOps, Visual Studio, VS Code. Data technology solutions: sourcing (Oracle, Ingres, Azure, SQL Server), collecting, ingesting and storing. Data Preparation: transformation of data into formats tailored to analytics use cases (Parquet, Delta). Every application is required to address the selection criteria as part of submission. Essential Criteria (Weighting) 1. Experience developing ETL/ELT processes for data movement and transformation. 25% 2. Experience preparing data optimised for query performance in cloud compute engines, e.g. distributed computing engines (Spark), graph databases, Azure SQL. 25% 3. Experience working with Engineering, Storage and Analytics services in cloud infrastructure. 25% 4. Working with multi-disciplinary teams using an Agile methodology. 25% Desirable Criteria 1. Experience working with Azure Data Factory and Databricks. 2. Experience/knowledge of working with Data Lakes and Lakehouses. • Fri, 01 MarSofttest pays pty ltd
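The DAFF listing's "transformation of data into formats tailored to analytics use cases (Parquet, Delta)" could look something like the hedged Databricks-style PySpark sketch below; the storage paths, dataset and column names are invented, and the Delta write assumes a Delta Lake-enabled runtime.

```python
# Hypothetical curation step: read a raw landing-zone Parquet extract and
# publish it as a partitioned Delta table for downstream analytics.
# All paths and columns are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("landing_to_curated").getOrCreate()

raw = spark.read.parquet(
    "abfss://landing@examplelake.dfs.core.windows.net/import_declarations/"
)

curated = (
    raw.dropDuplicates(["declaration_id"])
       .withColumn("lodged_date", F.to_date("lodged_timestamp"))
)

(
    curated.write
    .format("delta")                 # requires a Databricks / Delta Lake runtime
    .mode("overwrite")
    .partitionBy("lodged_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/import_declarations/")
)
```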
Data Engineer » Canberra Region, Australian Capital Territory - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
Data Engineer » Perth, Perth Region - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
Data Engineer » Hobart, Hobart Region - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
Data Engineer » Darwin, Darwin Region - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
Data Engineer » Melbourne, Melbourne Region - Australian Citizens with the ability to obtain Baseline Clearance residing in Australia only respond. Contract start 05 February 2024, 6-month term, 2 x 6-month extensions. Australian Citizen, Baseline Clearance preferred, Canberra or offsite role. Send your responses to jobs@softtestpays.com Overview About the Branch The role of the Policy, Markets and Data Branch is to provide insights, solutions and advice to decision makers, policy makers and the market to improve scheme operations and position the agency as a modern, efficient and effective regulator. About the Section The Data and Innovation section supports the agency to unlock the potential of our data to drive innovation and compliance for the agency. With a major uplift in data capability for the Agency, planned and delivered through our change program, there are vast opportunities to be a key contributor to drive, design and deliver policies, frameworks, and infrastructure to suit the agency's data operation needs. You will be part of a dedicated team that will clarify accountability to better enable crucial products needed to support program-level transparency and efficiency in data migration, data architecture design and analytics. Role Responsibilities and Duties Harmonising the data migration, data architecture design and analytics across the different schemes needed to effectively manage resources and delivery operations. Collecting data in a uniform and consistent way. Decreasing data migration complexity with well-defined common entity designs across the projects. Defining a common set of semantics to be managed and reflected in the data architecture. Holding workshops to determine common and specific data attribution for each scheme. Implementing data pipelines from D365 into the Data Lake and Synapse Analytics. Qualifications/Experience Minimum 7 years' experience as a data engineer within the data and analytics domain. Strong project delivery toolset experience in data integration tools. Experience with processing large volumes of complex data. Ability to assess the complexity of data (volume, structure, relationships etc.). A strong understanding of data modelling, data structures, databases, and processes. Implemented Azure DevOps CI/CD with custom parameters for build and release pipelines for Data Factory, Synapse, and data lake artifacts. Hands-on technical expertise implementing and delivering data solutions and pipelines in Azure. Hands-on technical expertise working with Data Lake and Synapse Analytics. Hands-on technical expertise working with Dynamics 365 Dataverse and SQL data sources. Hands-on technical expertise setting up DevOps automation using tools such as GitHub. Experience building data profiling, data cleansing and data engineering frameworks. Exceptional communication, documentation, and presentation skills. Will require Australian Citizenship with a Baseline Security Clearance as a minimum. Every application is required to address the selection criteria as part of submission. Essential criteria 1. The extent to which the candidate has the required level of experience and qualifications to deliver against the role description. • Fri, 01 MarSofttest pays pty ltd
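The listings above repeatedly ask for pipelines from D365 into the data lake and Synapse Analytics. One hedged interpretation is sketched below, assuming the Dataverse tables have already been exported to the lake as CSV (for example via Synapse Link); the container, entity and column names are placeholders.

```python
# Hedged D365 -> data lake -> Synapse sketch: read exported Dataverse CSVs,
# keep a few columns, and write curated Parquet that Synapse can query.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("d365_scheme_curation").getOrCreate()

accounts = (
    spark.read
    .option("header", "true")
    .csv("abfss://dataverse@examplelake.dfs.core.windows.net/account/")
)

curated = (
    accounts
    .select("accountid", "name", "createdon", "statecode")
    .withColumn("createdon", F.to_timestamp("createdon"))
    .filter(F.col("statecode") == "0")          # active records only
)

curated.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/scheme_accounts/"
)
```

The curated Parquet output can then be exposed through a Synapse serverless external table or loaded into a dedicated SQL pool for reporting.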
Senior Engineer, Data and Business Intelligence » Chatswood, NSW - Sydney, NSW - Services team for a Senior Engineer, Data and Business Intelligence. Reporting into the Senior Product Owner – Finance... and practices, to enhance security, functionality, supportability and efficiency. More information about the Senior Engineer, Data... • Fri, 01 MarNSW Health$137173 - 163431 per year
Senior Engineer, Data and Business Intelligence » Chatswood, NSW - Sydney, NSW - . Come work with us We have an exciting opportunity in the Data Analytics Services team for a Senior Engineer, Data and Business.... More information about the Senior Engineer, Data and Business Intelligence role, can be found in the . About you The skills... • Thu, 29 FebNSW Health$137173 - 163431 per year
Engineer, Data and Business Intelligence » Chatswood, NSW - Sydney, NSW - . Come work with us We have an exciting opportunity in the Data Analytics Services team for an Engineer, Data and Business... of our Data Analytics Services. What you will do As a Support Engineer, you will play an integral role in maintaining... • Thu, 29 FebNSW Health$122850 - 139559 per year
Associate/Senior Electrical Engineer - Data Centre » Sydney, NSW - focus on Mission Critical Facilities (Data Centres). Our Property and Buildings team is a world-leading management... communities. What you'll do As an Associate/Senior Electrical Engineer you will be part of a wider network of professional... • Thu, 29 FebWSP
Associate Data Engineer » New South Wales - as an Associate Data Engineer. Reporting to the Associate Director – Technology Services, Data Analytics Service (DAS), as a key.... Come work with us We have an exciting opportunity in Technology Services team at Data Analytics Service (DAS), eHealth NSW... • Thu, 29 FebNSW Health$106142 - 125241 per year
Data Engineer » Melbourne, VIC - 20 Months Fixed Term Contract What you’ll do: The Data Engineer will be responsible for implementing processes... and available for downstream products, key stakeholders and business processes. The Data Engineer will be accountable for: Supporting the... • Thu, 29 FebEquifax
Associate Mechanical Engineer - Data Centre » Sydney, NSW - focus on Mission Critical Facilities (Data Centres). Our Property and Buildings team is a world-leading management... communities. What you'll do As a Senior/Associate Mechanical Engineer you will be part of a wider network of professional... • Thu, 29 FebWSP
SQL Data Administrator / Data Engineer » Sydney, NSW - their solutions to life. What the role is all about With us, you’ll have an Impact. As a Data Engineer, you will be responsible... as a Best Workplace for Women. Where the team fits in The SPACE tribe within the Technology and Data Group is responsible for driving... • Thu, 29 FebREA Group
Data Engineer » East Perth, WA - We are seeking applications from enthusiastic individuals for the position of Data Engineer. In this position, you will work... such as Snowflake and Airflow to help our team implement exciting new enterprise data warehouse and system solutions. The role... • Thu, 29 FebGovernment of Western Australia$96307 - 105254 per year
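The WA role above mentions Snowflake and Airflow; purely as a hedged illustration, a minimal Airflow 2.x DAG for a nightly warehouse load might look like the sketch below, with the schedule, task name and warehouse step all placeholders.

```python
# Minimal Airflow 2.x DAG sketch: one nightly task standing in for a warehouse load.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator


def load_staging_to_warehouse() -> None:
    # Placeholder for the real load step (e.g. a Snowflake COPY INTO executed
    # via the Snowflake provider); kept generic here.
    print("loading staging tables into the warehouse")


with DAG(
    dag_id="warehouse_nightly_load",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",   # 2am daily
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_staging_to_warehouse",
        python_callable=load_staging_to_warehouse,
    )
```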
Data Engineer » Sydney, NSW - years experience as a data engineer within the data and analytics domain Strong project delivery toolset experience in... or offsite role. Send your responses to Overview About the Branch The role of the Policy, Markets and Data Branch... • Thu, 29 FebSofttestpays
Data Analyst Data Engineer senior data analyst senior data engineer » Adelaide, SA - and strategy for the solution. Data Engineer Experts design and implement complex data flows to connect operational systems... role in resolving and managing issues that impact on performance and/or service. Data Engineer Experts drive innovation... • Thu, 29 FebSofttestpays
Senior Data Engineer - ANZ Plus » Melbourne, VIC - ' financial wellbeing. What's Your Mission? As an Engineer, supporting the Data Enablement stream within ANZx, your mission... is to build dynamic data solutions to support all modalities of data for a range of use cases. Your key accountabilities... • Thu, 29 FebANZ
Data Analyst Data Engineer senior data analyst senior data engineer » Brisbane, QLD - and strategy for the solution. Data Engineer Experts design and implement complex data flows to connect operational systems... role in resolving and managing issues that impact on performance and/or service. Data Engineer Experts drive innovation... • Thu, 29 FebSofttestpays
Data Engineer » Canberra, ACT - your responses to Overview Data Engineer Understand complex information systems and design effective data solutions and models... data solutions, prototypes in a hybrid hyperscale cloud environment. The Deliverables for the Data Engineer include... • Thu, 29 FebSofttestpays
Principal Data Engineer » New South Wales - as a Principal Data Engineer. Reporting to the Associate Director – Technology Services, Data Analytics Service (DAS), as a key.... More information about the Principal Data Engineer role, can be found in the . About you The skills, knowledge and experience... • Thu, 29 FebNSW Health$161042 - 180143 per year
Enterprise Data Warehouse Engineer » Brisbane, QLD - Purpose The Senior EDW Engineer will advise, design, model and deliver strategic data assets to the data users and customers... support teams; Technical Competencies The Enterprise Data Warehouse Engineer must have 10+ years' experience in similar... • Thu, 29 FebSofttestpays
Enterprise Data Warehouse Engineer » Canberra, ACT - Purpose The Senior EDW Engineer will advise, design, model and deliver strategic data assets to the data users and customers... support teams; Technical Competencies The Enterprise Data Warehouse Engineer must have 10+ years' experience in similar... • Thu, 29 FebSofttestpays
Data Engineer - Multiple » Canberra, ACT - Technology, or a related field (or equivalent experience). Relevant certifications (e.g., Microsoft Azure Data Engineer, AWS... data integration solutions that meet business requirements, utilising Microsoft Azure tools such as Azure Data Factory... • Thu, 29 FebSofttestpays
Enterprise Data Warehouse Engineer » Melbourne, VIC - Purpose The Senior EDW Engineer will advise, design, model and deliver strategic data assets to the data users and customers... support teams; Technical Competencies The Enterprise Data Warehouse Engineer must have 10+ years' experience in similar... • Thu, 29 FebSofttestpays
Enterprise Data Warehouse Engineer » Hobart, TAS - Purpose The Senior EDW Engineer will advise, design, model and deliver strategic data assets to the data users and customers... support teams; Technical Competencies The Enterprise Data Warehouse Engineer must have 10+ years' experience in similar... • Thu, 29 FebSofttestpays
Data Engineer » Hobart, TAS - years experience as a data engineer within the data and analytics domain Strong project delivery toolset experience in... or offsite role. Send your responses to Overview About the Branch The role of the Policy, Markets and Data Branch... • Thu, 29 FebSofttestpays
Data Engineer » Perth, WA - years experience as a data engineer within the data and analytics domain Strong project delivery toolset experience in... or offsite role. Send your responses to Overview About the Branch The role of the Policy, Markets and Data Branch... • Thu, 29 FebSofttestpays
Enterprise Data Warehouse Engineer » Sydney, NSW - Purpose The Senior EDW Engineer will advise, design, model and deliver strategic data assets to the data users and customers... support teams; Technical Competencies The Enterprise Data Warehouse Engineer must have 10+ years' experience in similar... • Thu, 29 FebSofttestpays
ETL Developer / Data Engineer » Canberra, ACT - / Data Engineer to provide technical leadership to carry out development of and provide production support for a series... / Data Engineer will include, but are not limited to: Liaise with internal and external clients to determine requirements... • Thu, 29 FebSofttestpays
Python API Developer and/or Data Engineer » Canberra, ACT - for an experienced Python API Developer and/or Data Engineer to work together with Covert & Technical Operations technical staff in... furthering the integration of its covert collection, analysis and compliance systems for the Data Management & Analysis Platform... • Thu, 29 FebSofttestpays
Enterprise Data Warehouse Engineer » Adelaide, SA - Purpose The Senior EDW Engineer will advise, design, model and deliver strategic data assets to the data users and customers... support teams; Technical Competencies The Enterprise Data Warehouse Engineer must have 10+ years' experience in similar... • Thu, 29 FebSofttestpays
Data Operations Engineer » Canberra, ACT - Engineer contractor, in accordance with position requirements. About the Branch The role of the Policy, Markets and Data... and automation. The Data Operations Engineer is responsible for the integrity of the production data utilized by CER employees... • Thu, 29 FebSofttestpays
Data Engineer » Canberra, ACT - years experience as a data engineer within the data and analytics domain Strong project delivery toolset experience in... or offsite role. Send your responses to Overview About the Branch The role of the Policy, Markets and Data Branch... • Thu, 29 FebSofttestpays
Data Engineer » Brisbane, QLD - your responses to Overview Data Engineer Understand complex information systems and design effective data solutions and models... data solutions, prototypes in a hybrid hyperscale cloud environment. The Deliverables for the Data Engineer include... • Thu, 29 FebSofttestpays
Data Analyst Data Engineer senior data analyst senior data engineer » Melbourne, VIC - and strategy for the solution. Data Engineer Experts design and implement complex data flows to connect operational systems... role in resolving and managing issues that impact on performance and/or service. Data Engineer Experts drive innovation... • Thu, 29 FebSofttestpays
Data Engineer » Canberra, ACT - from your work: most notably the delivery of a Referendum and other electoral events and services. The Power BI Data Engineer... Previous experience in a similar role, i.e. Data Engineer, Data Analyst, Developer Numerical and analytical skills... • Thu, 29 FebSofttestpays
Data Modeller/Engineer/Developer » Brisbane, QLD - a Data Developer/Engineer and a Data Modeller who will be responsible for the implementation of enterprise data pipelines... digital system where their systems and data, and therefore their people, are better connected. The Digital Division... • Thu, 29 FebPaxus
Data Engineer » Melbourne CBD, Melbourne - Use your ETL, cloud, automation skills and business acumen to drive business value and enhance organisation's data platforms and data use. About the Company The Client is a Financial Services organisation based in Melbourne's CBD. They are a Data-driven organisation and are constantly looking at better utilising their data for business use cases. About the Role As a Data Engineer, you will work on Data Engineering solutions and projects end-to-end. You will use your business understanding to solve business problems using the Data Tech stack. You will work with a variety of stakeholders and act as the Subject Matter Expert for Cloud Data Engineering. Duties Design, develop, and maintain scalable data pipelines for ETL processes within cloud environments. Oversee Azure Cloud infrastructure and services including: Data Factory, Data Lake, Synapse, Databricks, Apache Spark (Databricks and Snowflake being explored now). Integrate a wide range of disparate source systems and databases to the centralised Azure data lake. Collaborate with internal business stakeholders to understand different systems (ERP, CRM etc.) and the organisational data uses. Consult with internal colleagues to understand business logic. Support future Advanced Analytics and Machine Learning projects. Promote the utilisation and understanding of Master/Metadata while driving its development. Lead best practice in data handling and storage methods. Skills & Experience Experience working commercially with Cloud Technology (AWS, GCP or Azure) and on-prem data warehouses ETL, data migration and/or integration experience (SSIS, stored procs., ADF etc.) Master data concepts and tool understanding Experience with data architecture concepts and dimensional data modelling concepts Python and SQL experience Ability to seek solutions and passion for learning new technologies How to Apply If you are a forward-thinking Data Engineer with a passion for driving data-driven solutions, please apply and suitable candidates will be contacted for a further conversation. • Wed, 28 FebLaunch Recruitment
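One common pattern behind the "integrate disparate source systems into a centralised Azure data lake" responsibility in the listing above is an incremental upsert into a Delta table. The sketch below is a hedged example using Delta Lake's MERGE on Databricks; the lake paths and table names are hypothetical.

```python
# Hedged incremental upsert: merge a daily CRM extract into an existing Delta
# dimension table. Assumes a Databricks/Delta Lake environment and that
# lakehouse.dim_customer already exists; all names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("crm_customer_upsert").getOrCreate()

daily_extract = spark.read.parquet(
    "abfss://landing@clientlake.dfs.core.windows.net/crm/customers/2024-02-28/"
)
daily_extract.createOrReplaceTempView("crm_customers_delta")

spark.sql(
    """
    MERGE INTO lakehouse.dim_customer AS tgt
    USING crm_customers_delta AS src
      ON tgt.customer_id = src.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
    """
)
```

Running the same MERGE each day keeps the dimension current without full reloads, which is usually the point of moving from batch truncate-and-load to incremental integration.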
Data Engineer » East Melbourne, Melbourne - Design, develop, and maintain data engineering solutions in this newly created role 6-month fixed term contract | Full-time hours | Hybrid work arrangements (3 days WFO, 2 days WFH) Competitive remuneration and excellent NFP salary packaging benefits About the role The Data Engineer is responsible for designing, developing, and maintaining data engineering solutions within contemporary Microsoft platforms. The role sits within the ITSS team and works across functional departments to ensure the delivery of high-quality, scalable, and secure data solutions which support VMCH’s digital transformation objectives and compliance requirements. Your experience To be successful in this role, you will have a tertiary qualification in Computer Science or Information Technology and experience with the following: Microsoft platforms such as Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Azure Functions, and Power Platform tools. Design and implementation of data lake solutions, preferably in Microsoft Fabric. Data engineering concepts, methodologies, and best practices, such as data modelling, data pipelines, data quality, data governance, data security, and data privacy. Various data sources and formats, such as structured, semi-structured, and unstructured data, SQL, JSON, XML, etc. Data analysis and visualisation tools, such as Power BI. About you You have outstanding communication and interpersonal skills, with the ability to engage and form relationships with a variety of stakeholders. A strong team player, you display a collaborative approach when working with cross-functional teams. You are an analytical thinker with strong attention to detail and a commonsense approach to solving problems and finding solutions. Why VMCH? Because the people we employ make it a great place to work. In addition, VMCH will provide you with: A positive and supportive workplace culture Excellent Salary Packaging benefits Ongoing professional development A competitive salary Employee Assistance Program Paid parental leave About us Here at VMCH, our services are as diverse as the people we support. As a for-purpose organisation, compassion, respect, and inclusion aren’t just words to us; they guide us in everything we do. While we’ve been around for over a century, we’re always moving forward. We stay true to who we are by supporting people and families to live their best lives, providing a place to call home and spaces to learn and grow. If you meet the above criteria and are passionate about making a positive difference in people’s lives, we would love to hear from you. To work at VMCH, you will be required to supply proof of work rights, an Australian Criminal History Check, Evidence of Flu and Covid Vaccinations, a Victorian Employee Working with Children Check and comply with Child Safety standards. You may also be required to supply an International Police Check. We do not accept unsolicited resumes, emails and phone calls from recruitment agencies. • Wed, 28 FebVMCH
Data Engineering Consultant / Data Engineer » Western Australia, Australia - We are currently recruiting for Data Engineering Consultants, specialising in Data and AI, to join us in our Digital Transformation and Technology service team. As part of an international and fast-growing community of 250 data professionals located in our eight (8) Centres of Excellence (Paris, Montréal, Bruxelles / Amsterdam, New York, London, Hong Kong, Middle East), you will be an important member of the data and technology team in Australia. We are seeking data engineering consultants with the below delivery and technical experience: Functional requirements gathering, data design, data modelling and development, enhancing the management and value of data across organisations and markets. Technical delivery of scalable, efficient and best-practice data pipelines, transformations and ETLs using Azure or similar tools. Data visualisation skills in translating business requirements and metrics into visual reports, dashboards and analytics that provide insight and value to business functions. Machine learning, AI and statistics, including developing predictive models of decision making, supervised and unsupervised learning algorithms, econometrics, forecasting, quantitative diagnosis. Understanding of key data governance and management concepts including data quality, master data management, metadata management and data lineage. Agile and traditional delivery experience in a data and analytics development environment, participating in scrum ceremonies, sprint planning and retrospectives. Development and delivery experience within a consulting environment, maintaining client expectations and ethics. In addition, we are seeking technical experience across all or some of the below: Strong SQL skills to support data pipelines and data modelling. Azure, Snowflake, AWS or Oracle data warehousing technologies. Data transformation tooling including Azure Data Factory (ADF), Fivetran, Matillion, Oracle Data Integration, AWS Glue, AWS Data Pipeline or other. Azure services including Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks. Analytics tooling including Power BI, Tableau, QuickSight or Oracle Analytics Cloud. Skills related to SSIS, SSAS and SSRS to support traditional data activities. DevOps and CI/CD to support deployment automation, Azure preferred. Programming skills in Python or similar. JIRA / Confluence or similar agile tooling experience. We are looking for a commitment to ensuring projects are delivered beyond expectations. This role is supported through active coaching and mentoring which provides a platform for growth and progression. The fresh perspectives you create will help challenge the status quo and introduce new ways of thinking. You will work with others to develop workable, pragmatic solutions and provide a lasting impact by transferring knowledge and building long-term capability for clients. Bring your ambition and expertise to work with iconic organisations in the resources, energy, transport and logistics, health and life sciences, tourism, financial services, government, utilities and not-for-profit sectors. • Mon, 26 FebSia Partners
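For the machine learning and predictive modelling experience the consultancy role above calls for, a generic supervised-learning baseline in scikit-learn is sketched below on synthetic data; no client data or specific business problem is implied.

```python
# Generic supervised-learning baseline on synthetic data: scale features,
# fit a logistic regression, and report hold-out AUC.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=5_000, n_features=12, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=7
)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
model.fit(X_train, y_train)

print("hold-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```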
Python C Engineer in Data - Prominent Global Financial Market-Making Firm » Australia - Are you interested in working on business-critical data challenges and optimising them towards usability? Our client, a leading global financial market-making firm, wants to hear from you. Responsibilities: Develop applications and tools to support a de... Location: Sydney • Sun, 25 Feb
Data Engineer » Melbourne CBD, Melbourne - Role Requirements Implements, configures and supports solutions in the Enterprise Data Platforms, ensuring availability, performance, capacity and reliability are in line with the agreed SLA. Completes development deliverables in a timely manner and to the appropriate level of quality so that the delivered solutions are fit for purpose, can be efficiently supported and the impact of future change requests can be assessed. Works with IT, the Business and external partners to ensure that the technology solutions are delivered on time, on schedule and on budget. Experienced in all aspects of implementations including initial requirements gathering, program design and build, user training, go-live support, production support and problem resolution activities. Performs peer review of code and designs. Assists and supports testing activities (e.g. System, Integration and User Acceptance). Identifies key issues and risk elements and escalates promptly to the relevant escalation point. Creates and updates application documentation so that support can be efficiently performed and the impact of future change requests can be assessed. Follows all change control procedures and puts in place full version control and configuration management to ensure that the production service is not compromised. Follows the procedures for reporting on the progress of systems development projects, taking action to deal with exceptions, problems and unforeseen events in a timely manner, escalating as appropriate. Maintains up-to-date awareness of system security processes and procedures. Contributes to post-implementation reviews to incorporate learning into future work. Maintains an understanding of relevant industry trends and current knowledge of the technology deployed in the business area. Maintains awareness of current Information Security standards and ensures all activities comply with the standards. Demonstrates sufficient consideration of customer needs and customer experience targets, as driven by business requirements. Excellent analytical and problem-solving skills. Excellent communication and documentation skills. Skills requirements Experience with one or more languages including Python, Java and SQL. Experience in cloud services (GCP preferable). Strong understanding of cloud computing, Big Data and ETL. Experience with cloud-to-cloud integration. Experience and exposure to BI tools such as Tableau or Looker is a plus. Diversity and inclusion are strongly supported at Peoplebank. People of all nationalities, gender identities, and cultural backgrounds, including Aboriginal and Torres Strait Islander Peoples, are encouraged to apply. • Fri, 23 FebPeoplebank Australia Ltd
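Given the listing above prefers GCP, a hedged example of the kind of task involved is materialising a curated BigQuery table for a Looker or Tableau model to sit on; the project, dataset and table names are placeholders and credentials are assumed to come from the environment.

```python
# Hedged BigQuery sketch: run a SQL aggregation over a raw events table and
# write the result to a curated destination table, replacing it each run.
from google.cloud import bigquery

client = bigquery.Client(project="example-analytics-prod")

table_id = "example-analytics-prod.curated.daily_active_customers"
job_config = bigquery.QueryJobConfig(
    destination=bigquery.TableReference.from_string(table_id),
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

client.query(
    """
    SELECT DATE(event_ts) AS activity_date,
           COUNT(DISTINCT customer_id) AS active_customers
    FROM `example-analytics-prod.raw.app_events`
    GROUP BY activity_date
    """,
    job_config=job_config,
).result()   # block until the destination table has been written
```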
Data Engineer » Adelaide CBD, Adelaide - Our Company We exist to bring amazing people together to explore the art of possible. Bailey Abbott are a progressive and dynamic IT Consultancy business working across both Public and Private sectors. People are at the core of everything we do. We’re all about nurturing and inspiring people – ours and our clients. We bring a positive attitude and mindset to everything we do. We confidently use our knowledge and skills to solve problems, finding a better and easier way forward. We deliver transformational outcomes to take businesses further, fearlessly. Confidence and trust is fundamental to delivering superior outcomes and we take ownership and responsibility for our commitments. Our People Our specialist IT Consultants are creative, collaborative, and passionate. We’re strong advocates for change, challenging the status quo through new thinking, technology, and practices. Always exploring creative ways to simplify complexity. As a Consultant with Bailey Abbott, you will have the opportunity to build your project portfolio by gaining exposure and experience working across different clients, industries, technologies, and methodologies, all while having the stability and security of a permanent position. Our Career Opportunity Bailey Abbott have an exciting opportunity for a Data Engineer to join our Data Intelligence capability. Our Data Intelligence specialists work within an active community of practice, enabling you to benefit from their years of collective experience and wisdom. Working as a Data Engineer you will get the opportunity to: Represent Bailey Abbott, sharing your skills and knowledge to help deliver projects for our clients Develop and maintain ETL (Extract, Transform, Load) pipelines to ingest, process, and transform data within our Microsoft Azure-based data lake, utilising Azure Data Factory Design and implement efficient data models and schemas in Azure to ensure data quality and reliability, integrating Azure Purview for data governance Utilise your proficiency in SQL and NoSQL databases to optimise data storage and retrieval in Azure environments Leverage big data technologies such as Apache Spark, Hadoop, or Azure HDInsight to process and analyse large datasets Collaborate with data scientists, analysts, and stakeholders to understand business requirements and deliver valuable data solutions tailored for Azure ecosystems Write high-quality, scalable code in languages such as Python, Java, or Scala to support Azure-based data engineering tasks Continuously monitor, troubleshoot, and optimise data pipelines in Azure for performance and reliability Stay up-to-date with industry best practices and emerging technologies in the Azure landscape to drive innovation in our data platform Requirements Bachelor's degree in a related field or equivalent work experience Demonstrated experience in data engineering or a related role, with a specific focus on Microsoft Azure Proficiency in SQL and NoSQL databases, with a strong understanding of data manipulation and querying in Azure Demonstrable experience in ETL pipeline development using Azure Data Factory, data modeling, and schema design Familiarity with Azure big data technologies such as Apache Spark, Hadoop, or Azure HDInsight Experience with Azure Purview for data governance and cataloging Application of CI/CD and DevOps methodologies, including source code management Programming skills in languages like Python, Java, or Scala, with a focus on Azure applications Strong problem-solving 
skills and attention to detail. Effective communication and collaboration skills in an Agile team setting. Azure certification or relevant cloud experience is highly desirable. Exposure to emerging approaches such as Azure Fabric. Benefits: Paid Birthday Leave. Access to Employee Assistance Program. Flexible Working Options. Regular team gatherings including monthly team lunch. Professional development opportunities and continuous learning. Financial benefits through Discount Store. Our clients are diverse and so are we. We engage with great talent from all walks of life to bring their extensive and varied experience to help promote innovation. We encourage applications from candidates from all backgrounds to further strengthen Bailey Abbott. Bailey Abbott. Explore Possible. • Fri, 23 FebBailey Abbott
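For the Spark and Azure data-lake responsibilities described in the Bailey Abbott listing above, a minimal PySpark job might look like the sketch below. It is illustrative only: the storage account, container paths and column names are hypothetical, and in practice such a job would usually be scheduled by Azure Data Factory and run on Databricks or HDInsight.

# Minimal illustrative PySpark aggregation over a data lake (hypothetical paths/columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("lake-transform").getOrCreate()

# Read raw JSON events landed in the lake.
events = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/events/")

# Derive a daily event count per event type.
daily = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Write a curated, partitioned output for downstream analytics.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("abfss://curated@examplelake.dfs.core.windows.net/daily_events/"))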
Data Engineer » Macquarie Park, Ryde Area - The Client One of Australia's leading research universities, leading the way in ground-breaking discoveries. Their academics are at the forefront of innovation and, as accomplished researchers, they are actively engaged in the pursuit of high-impact research. Benefits Contract until September (with a view to extend). Hybrid Working / Flexible work arrangements. Daily rate up to $1200 including super. Collaborative culture and work environment. The Role They are looking for a Data Engineer with a keen focus on designing, developing, implementing, scaling and maintaining solutions and large-scale processing systems. This includes building data pipelines, logging and monitoring. You will collaborate closely with stakeholders to address data infrastructure requirements and tackle technical challenges. This role will play an important part in supporting the implementation of a new, consolidated, secure CRM that will allow effective connection to successfully support users throughout their journey. The Program will deliver a student CRM (Salesforce) that will realise our ambitions for end-to-end engagement and service delivery. Essential Skills Undergraduate / Postgraduate degree in Computer Science, Information Technology, or a related field. Creation and maintenance of artefacts relating to data models (e.g. DDLs, mapping of data, DMLs, Data Dictionaries, Change Registers etc.). Assisting in the development and continuous improvement of data modelling principles, processes, and standards. Assisting in the implementation of data asset standards, and in the implementation of related data asset governance processes. Experience with database, data pipeline and workflow management tools: Snowflake, JIRA, Wiki, DBT, Airflow, etc. If you have the above skills, please APPLY NOW for early consideration. You MUST have rights to work in Australia. Please forward your CV in Word format. • Fri, 23 FebEnterprise IT Resources
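To illustrate the workflow tooling named in this listing (Airflow scheduling a DBT project), here is a minimal DAG sketch. It assumes Airflow 2.4 or later and a hypothetical dbt project path; it is not drawn from the role itself.

# Minimal illustrative Airflow DAG that runs a dbt build once a day.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # the "schedule" argument requires Airflow 2.4+
    catchup=False,
) as dag:
    dbt_build = BashOperator(
        task_id="dbt_build",
        # Hypothetical project location; dbt reads connection details from its profile.
        bash_command="cd /opt/dbt/analytics_project && dbt build --profiles-dir .",
    )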
Data Engineer » The Rocks, Sydney - About Ofload and our mission: At Ofload, we're at the forefront of digital freight services, empowering businesses to build efficient, flexible, and sustainable supply chains. Our mission is to achieve zero waste in trucking by digitising the transport industry. Every day in Australia, around 30% of trucks travel empty and a further 50% are idle, amounting to mass inefficiency and harmful environmental impacts. Since 2020, we've successfully removed over 5 million empty kms from Australian roads. Our customer base has expanded to over 250 businesses, and we proudly support small business carriers by connecting them with household brands like Asahi, Metcash, and Noumi. As part of our sustainability commitment, we partnered with Trace to measure, reduce and offset our scope 1, 2 and 3 emissions, actively working to minimise our own environmental impact. In 2022, we propelled our growth through the strategic acquisition of a Melbourne-based logistics company, and raised $60m in Series B funding. Notably, we were recognised by the Australian Financial Review as one of Australia’s Most Innovative Companies. Fast forward to 2023, Ofload was named a Deloitte Tech Fast 50 Company in Australia, and secured the 4th spot on the LinkedIn Top Startups List 2023. And here's the exciting part – we're just getting started. We are looking to grow our team and welcome people excited by the opportunity to transform the logistics industry. If you're ready to be part of a dynamic team driving change, join us on this incredible journey. The Role: We’re currently looking for a talented Data Engineer / Machine Learning Engineer to help leverage our accumulated shipment data to build out a best-in-class AWS Data Warehouse for use by our Data Science and BI teams. We are developing some game-changing prediction and recommendation engines to disrupt the freight industry and there is an opportunity for the right candidate to get hands-on with not just the ETL process but to also collaborate with our Data Scientists to assist in feature engineering and optimise models for production deployment. Responsibilities: Design, implement, and maintain scalable and reliable data pipelines from our production databases (PostgreSQL/MongoDB/Timestream) using AWS Glue. Collaborate with software engineering, data science and BI teams to understand data requirements and ensure data availability and quality. Develop and deploy machine learning models using AWS SageMaker. Manage and optimise AWS infrastructure components to ensure performance, scalability, and cost-effectiveness. Implement and maintain monitoring and logging solutions for the data and machine learning pipelines. Requirements: Proven experience as a Data Engineer or Machine Learning Engineer, with a focus on AWS technologies. Hands-on experience with AWS services such as S3, Glue, Redshift, SageMaker, Aurora and others. Proficient in Python and Terraform / CDK. Strong understanding of data modelling, ETL processes, and data warehousing concepts. Experience deploying and managing machine learning models in production. Life at Ofload: At Ofload, our values are in our DNA and underpin the work we do every day. We are proud of the collaborative, diverse and innovative culture we continue to build.
The Ofload team is motivated by a shared ambition to digitise the logistics industry and have a truckload of fun while doing it. As well as the opportunity to join a high-growth organisation with the ability to make a meaningful impact every day, you will enjoy: Working in a hybrid environment with a split of in-office and working from home. Our philosophy on working is one built around connection, inclusion and flexibility. Holistic wellness support for you and your family through our wellbeing partnership with Acacia Connection. Generous parental leave entitlements including 16 weeks primary caregiver leave and continued superannuation contributions. Annual learning and development budget to support your professional growth. Paid volunteer leave to support the causes that matter to you. Plus all the exciting perks of joining a startup - epic Ofload swag, weekly catered breakfasts, monthly team lunch, a reward and recognition program and plenty of social events. Ofload is committed to building diverse and inclusive teams, and a culture where people can bring their whole authentic selves to work each day. Applications are highly encouraged from Aboriginal and Torres Strait Islander people, and equity target groups including women, people from culturally and linguistically diverse backgrounds, people with disabilities, and people who identify as LGBTIQ. We endeavour to create an equitable recruitment process with an even playing field for all candidates. If you require any adjustments to be made, please let a member of our recruitment team know. • Wed, 21 FebOfload
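For the AWS Glue pipeline work described in the Ofload listing above, a Glue PySpark job skeleton might look like the sketch below. It is illustrative only: the catalog database, table and S3 bucket names are hypothetical placeholders, not details of the actual platform.

# Minimal illustrative AWS Glue (PySpark) job: read a catalogued table, curate, write to S3.
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Hypothetical Glue Data Catalog source.
shipments = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="shipments"
).toDF()

# Keep completed shipments and stamp the load date.
curated = (shipments
           .filter(F.col("status") == "COMPLETED")
           .withColumn("load_date", F.current_date()))

# Hypothetical curated bucket for downstream warehouse loads.
curated.write.mode("overwrite").parquet("s3://example-curated-bucket/shipments/")

job.commit()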
Data Engineer » Sydney, Sydney Region - Data Engineer Sydney $160,000 Inc Super Bonus As a Data Engineer within our dynamic Engineering team, you will play a crucial role in developing cutting-edge applications to support our Data Transformation initiatives. This is a permanent position offering the opportunity to work on innovative projects within a supportive team environment. Key Responsibilities: Utilize your expertise in data warehousing and information management to contribute to the development of our future-ready Data Platform. Design and implement data models to support business requirements. Leverage your Linux/Unix skills and prior experience with AWS to build robust and scalable data solutions. Develop and optimize complex SQL queries and data manipulation processes. Utilize Big Data querying tools such as Hive, Spark, and Presto to extract insights from large datasets. Implement API-based integrations and orchestrate data pipelines using tools like Apache Oozie, Airflow, or Argo Workflows. Apply your knowledge of technical solutions, design patterns, and code to develop medium/complex applications deployed in clustered computing environments. Desired Skills and Experience: Proficiency in programming languages such as Scala, Java, or Python, particularly in the context of Big Data technologies like Spark. Experience working with Docker, Kubernetes, and familiarity with containerized environments. Exposure to security concepts and best practices related to data engineering. Knowledge of machine learning concepts and tools such as Spark ML or R. Experience with major Hadoop distributions such as Cloudera, MapR, or Hortonworks HDP. Familiarity with build tools like Maven, Gradle, and Ant. If you feel you possess the relevant skills for the above, apply now with your most up-to-date CV. • Wed, 21 FebTalenza
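As a sketch of the API-based integration work mentioned in the listing above, the snippet below pages through a hypothetical REST endpoint and stages records as newline-delimited JSON. An orchestrator such as Airflow, Oozie or Argo Workflows would typically schedule a step like this; the endpoint, token and field names are all placeholders.

# Minimal illustrative API ingestion step (hypothetical endpoint and fields).
import json
from typing import Optional

import requests

def fetch_page(url: str, token: str, cursor: Optional[str] = None) -> dict:
    params = {"cursor": cursor} if cursor else {}
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {token}"},
        params=params,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

def ingest(url: str, token: str, out_path: str) -> None:
    cursor = None
    with open(out_path, "w") as out:
        while True:
            page = fetch_page(url, token, cursor)
            for record in page.get("records", []):
                out.write(json.dumps(record) + "\n")  # newline-delimited JSON staging file
            cursor = page.get("next_cursor")
            if not cursor:
                break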
Data Engineers » Brisbane, Brisbane Region - ACT, NSW, NT, QLD, SA, TAS, VIC, WA. 3 days WFH. 6 + 6 + 6 + 6 month contract (Baseline security clearance required). Develop code, patterns and frameworks in a data warehouse environment predominantly using SQL. Our client, a large Federal Government entity focused on meteorology, is seeking Data Engineers. About the client The client exists to meet the Nation's commitment to care for our veteran and defence service community. About the role - Data Engineers Understand databases, data latency, data and storage formats, persistence needs, data volatility, business rules and aggregations, and other data engineering considerations. Duties Undertake the back-end development of the Teradata Enterprise Data Warehouse, including data transformation and automation of data flow jobs that produce data products. Build and maintain data definition descriptors & database structures, such as tables, views, indexes, and constraints. Quality assure the transformation and flow process and undertake data quality checks. Skills / Experience Maintaining reference data and business-specific datasets by automating ETL jobs. Undertaking necessary EDW administration activities, such as deployments, production issues, capacity limits, performance tuning, logging & monitoring, scheduling of jobs, and the setup of new environments. Experience in multiple roles that include Business Analysis, Data Analysis, Data Modelling, Technical Writing, and Data Engineering. How to apply To be considered for this role (JOB ID: EF8948), please forward your resume to Eric by clicking the Apply for this Job button. Please note that this role closes on the 23rd of February. Recruitment Hive: (02) 6299 1006 • Wed, 21 FebRecruitment Hive
Data Engineer » Melbourne, Melbourne Region - Data Engineer Hybrid Working - 3 days in Melbourne CBD. Up to $160k plus super plus bonus. Like the autonomy and pace of working in a startup, but not the risk that comes with it? Then you should be talking to this growing energy scaleup. Here, you'd join what is still a relatively small data team. That means touching every part of the data lifecycle, and a key role in championing widespread adoption of data and analytics. The ethos here is still very startup. The role of data, ML and AI in the business is still evolving. There's also a strong culture of autonomy and taking ownership. It's nearly 10 years since the business' inception though. That means a very well-established market position and customer base. As a utilities business, they're also part of a market sector that's famously resilient. If job security is high on your agenda, this is a great place to be. Sounds great, but what will I be doing? You'll be working "end-to-end" here. You'll likely find this role is a little more varied than some of the others you've seen advertised. In the near term, much of your focus will be on enabling the C-suite to make data-driven decisions, through better reporting and insights. That means working closely with stakeholders across the business, developing reporting dashboards, and delivering end-to-end solutions on the Azure platform. There's a longer-term roadmap here though. That includes experimenting with new ways to add business value through predictive analytics, AI and ML. Key tools you'll work with day-to-day include T-SQL, Synapse, Azure, REST and a variety of cloud-based BI tools. Key skills/experience: We're open to talented mid-level people, or established seniors. What's important is that you offer a good blend of data engineering and analytics skills. In terms of tooling, you'll need a working knowledge of: T-SQL, Azure Synapse Analytics or Data Factory (ADF), and Python/PySpark. If that sounds like you, either hit apply or reach out to Jay at Aurec for more details on this and other Data Engineer openings in Melbourne. • Wed, 21 FebAurec
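To give a flavour of the T-SQL and Synapse tooling this listing names, here is a small sketch that runs an aggregation from Python via pyodbc. The server address, authentication mode, table and column names are all hypothetical placeholders.

# Minimal illustrative T-SQL query against an Azure SQL / Synapse endpoint.
import pyodbc

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:example-workspace.sql.azuresynapse.net,1433;"
    "Database=analytics;"
    "Authentication=ActiveDirectoryInteractive;"
)

SQL = """
SELECT CAST(reading_time AS date) AS reading_date,
       meter_id,
       SUM(consumption_kwh) AS total_kwh
FROM dbo.meter_readings
GROUP BY CAST(reading_time AS date), meter_id;
"""

conn = pyodbc.connect(CONN_STR)
try:
    cursor = conn.cursor()
    for reading_date, meter_id, total_kwh in cursor.execute(SQL):
        print(reading_date, meter_id, total_kwh)
finally:
    conn.close()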
Data Engineer » Melbourne CBD, Melbourne - Company Description Discover Your Path at Keypath About Us: At Keypath, we invite you to be part of something transformative. As a global EdTech leader, we collaborate with prestigious universities to create and deliver online education solutions that tackle global social and economic challenges head-on. Join our dynamic, fast-growing international team that thrives on collaboration, innovation, and lifelong learning. With offices in Melbourne, Sydney, Chicago, and SE Asia, we are truly a global company. Today, we're a family of over 700 passionate "Keypathers" worldwide. Work Anywhere: At Keypath Education Australia, flexibility is our strength. We put our people first, allowing you to choose where you work - from home, in the office, or both. The successful candidate will work closely with the Data Engineering team to design & maintain the critical data infrastructure and pipelines that bolster the company's tactical and strategic data-driven initiatives. The Data Engineer will be a technical expert in the development of system-to-system integration solutions, complex data warehouse data models, and highly functional semantic layers that translate raw company data into reliable and actionable information. Additionally, the Data Engineer will ensure data pipelines and systems are efficient and reliable, and support insight and growth across the global Keypath organization. Responsibilities include but are not limited to the following: Design and develop Azure Data Factory data integration pipelines and SQL Server data warehouse solutions that support continuous delivery of data insights to business stakeholders. Develop complex SQL/TSQL data warehouse procedural processes and data models to drive business insights and support organizational decision making. Develop resilient system integration solutions using Azure Functions, Python, and .Net codebases and related development frameworks. Work closely with the IT Product team and business sponsors to gather information and data requirements based on business operations feedback and conveyed reporting needs. Design and manage Azure Analysis Services semantic-layer models to support business user self-service Power BI analysis initiatives. Collaborate with team members on SDLC activities and standards, including code Peer Review and/or Pull Request approvals in Azure DevOps. Qualifications Education: Minimum BS/BA Degree, preferably in a Technology, Computer Science, Mathematics or Statistics related discipline. Experience: 4 years of data engineering experience with a proven history of successfully developing and deploying data warehouse and data integration solutions. 2 years of knowledge and experience developing and implementing data integrations using Microsoft Azure technologies, including Data Factory and SQL Server for data processing, data warehouse, and data modeling solutions. 2 years of software engineering experience using modern data engineering languages and platforms, such as Python, Microsoft .Net, and Azure Functions. Proficiency with semantic layer and data packaging concepts (Cubes, Tabular Models), preferably with Microsoft Azure Analysis Services (AAS). Expertise in the design and maintenance of Data Pipelines, Data Transformations, and Data Flows within Azure Data Factory. Extensive understanding and experience with Agile and scrum engineering practices and principles. 
Technical Skills: Deep knowledge of Microsoft SQL Server database and Azure data engineering technologies required (e.g. Data Factory, SQL/TSQL, Azure Functions, Analysis Services, and Power BI). Strong proficiency with Business Intelligence and reporting tools, such as Microsoft Power BI or Tableau. Advanced proficiency with Microsoft Azure Data Factory and Azure Functions. Understanding of predictive analytics platforms, software, languages, and libraries such as Microsoft Azure ML, SAS, Python, and/or R. Additional Information Benefits and Rewards: Opportunity to choose how you work - in office, hybrid or remotely. Comprehensive online remote training. All the necessary equipment for success. Internet Allowance for remote work. Clear career progression pathways. Paid Mental Health days for your well-being. Toolkits for Digital Health and Mental Health support. Why Keypath? Embrace Your Authentic Self: A supportive culture recognized as a top workplace (Ranked 8 on the 2023 Best Places to Work Australia list). Global EdTech Leader: Join an industry on the rise. Continuous Growth: Upskill through discounted study opportunities. Transform the World: Contribute to upskilling and reskilling globally. Make an Impact: Help solve future economic and social challenges. Join Keypath and make a difference. We welcome applicants from diverse backgrounds and offer inclusive support throughout the recruitment process. Apply now to unlock your potential with Keypath. • Sun, 18 FebKeyPath
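For the Azure Functions integration work described in the Keypath listing above, a minimal HTTP-triggered function (Python v1 programming model) might look like the sketch below. The payload fields and the forwarding behaviour are hypothetical.

# Minimal illustrative Azure Functions HTTP trigger (Python v1 model, __init__.py).
import json
import logging

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Integration webhook received.")
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON body", status_code=400)

    # In a real integration the record would be validated and forwarded to a
    # queue, database or downstream API; here we simply acknowledge it.
    record = {"source": payload.get("source"), "received": True}
    return func.HttpResponse(json.dumps(record), mimetype="application/json")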
Data Engineer » Alexandria, Inner West - About the Company: Join an ASX-listed organization that is seeking an Azure Data Engineer to join their team in Sydney. They hold a leading position in the personal transport sector and are recognized globally for their dispatch technology deployments and expertise in the payments space. They have a significant impact on the taxi industry throughout Australia, with a portfolio that encompasses well-established brands within the sector. Dedicated to supporting professional drivers and passengers, our client provides cutting-edge technology services to a substantial percentage of the taxi network in Australia and extends its influence globally, reaching taxis in multiple cities across North America, Europe, and New Zealand. Their commitment to accessible, dependable, and equitable transport is evident in their active efforts to build a strong team, advance technology, and foster brands that deliver exceptional transport solutions. Key Responsibilities: Collaborate on data solutions using Azure technologies: Data Factory, Databricks, Synapse. Develop and maintain ETL processes with SSIS and T-SQL. Apply Kimball methodologies for effective data models and star schemas. Requirements: Bachelor's degree in Computer Science, Information Technology, or related field. Proven experience in Data Engineering, focusing on Azure technologies. Proficiency in T-SQL, SSIS, Azure Data Factory, Databricks, and Synapse. Familiarity with Kimball methodologies. Key Benefits: Cutting-edge Technology: Work with the latest Azure tech. Hybrid Work: Balance office and remote work. Career Growth: Opportunities for professional development. If this aligns with the position you envision for yourself, take the next step by clicking the apply button or contacting Craig Nel at 02 6058 1085 or • Thu, 15 FebCox Purtell
Data Engineer » Fortitude Valley, Brisbane - Apply DataOps principles with a modern Data Tech stack to drive real business value at Picklebet, a growing Australian-based organisation. About the Company: Picklebet is an Australian owned and operated, next generation esports, sports and racing betting platform and media brand based in Brisbane. After continued growth, they are looking to expand their development capabilities for their in-house Data technology stack. Joining as part of a growing team, they need smart and self-motivated people to help them to become a major player both in Australia and internationally. About the role: They're looking for a skilled Data Engineer to strengthen their data systems and ensure smooth data flow across the organization. As a Data Engineer your focus will be on designing, building, and maintaining their data architecture. You'll work on creating efficient data pipelines, collaborating with teams, and troubleshooting data-related issues. The ideal candidate has a strong background in data engineering, covering: data modelling, ETL processes, and database management. Your role is crucial in establishing a scalable and reliable data ecosystem that supports the organisation's goals. Adaptability and problem-solving skills are key as you navigate the evolving data landscape. Reporting to the Head of Data and Analytics, you'll be a vital member of the data engineering team, contributing to innovation and enhancing the overall data infrastructure. The organisation has a modern approach to Data and Analytics and encourages a holistic DataOps approach to Data Management. While the organisation grows, the Technology Stack will likely evolve and you’ll be given the opportunity to work with modern Cloud and Data Technology. Responsibilities: Maintain and optimize the data warehouse, adhering to DataOps principles. Develop and manage efficient data pipelines from various sources. Create and update documentation for architecture and systems. Write high-quality, efficient code to ensure data reliability. Implement DataOps philosophy throughout data processes. Plan and execute system expansions to meet growth and analytic requirements. Collaborate with analysts to enhance operational efficiency. Address data needs across business functions and ensure data governance. Skills and experience: Design, optimisation and management of data warehouses Commercial Cloud experience SQL and Python experience Snowflake, Postgres and/or DBT experience highly regarded Turnkey ETL tooling (Fivetran, SSIS, Teradata, etc.) Orchestration tooling (Airflow, Dagster, etc.) Cloud service management, preferably IaC tools on AWS or GCP (Terraform etc.) beneficial Experience integrating data from SaaS tools such as advertising platforms (e.g. Google Ads) and CRMs (e.g. Hubspot). Experience ingesting and standardizing data from heterogeneous data vendors. Solid data modelling experience, preferably Kimball and/or Data Vault. A solid grasp of software engineering fundamentals; including Git, CI/CD, and automation. Ability to communicate architecture and process to an engineering audience. Startup mindset: team player, hustle and self-motivated. Qualifications 3 years’ experience in Data Engineering, preferably with a strong software engineering background. Bachelor's or Master's Degree in Computer Science or equivalent experience. An interest in one or more of the following would be beneficial; betting, esports, gaming, financial markets and/or sports. 
About you: A curious mindset and a genuine need to solve problems and have meaningful impact through Data. Solid understanding of system architecture and horizontally-oriented optimisation. Excellent technical skills and the ability to work in a team. A strong desire to do world-class work and own your role. You don’t need to be a marketer or trader, but a working knowledge of marketing and bookmaking principles will serve you well. Working at Picklebet: They are hungry for insights and have a strong culture of data-driven decision-making: your input will be highly valued and acted upon. Their employees make a material impact on the growth of the business within an egalitarian culture. They’re a remote-first business so you can work where you like (they have staff in Brisbane, Sydney, Darwin, Asia and South America). If you prefer to spend some time in the office, you can also work out of the Newstead (Brisbane) office. They also have a presence in Sydney, and expect that to grow as they scale. They make decisions quickly, encourage taking calculated risks and trying new things. They respectfully challenge each other to be better every day, and radical (respectful) candor is one of their core values. Work with a highly effective team of seasoned operators from some of Australia’s best brands including: Kayo Sports, Sportsbet, Dominos, Ladbrokes, Domain, PwC, Virgin Australia and more. In addition to competitive salaries, they offer an employee stock ownership plan to allow team members to share in their success. A relaxed dress code; they aren’t a bank. Sneakers & a tee are fine. They serve the Australian market (for now) but their aspirations are international and their tech has been built to scale across borders. How to Apply: If you are a forward-thinking Data Engineer with a passion for delivering and unearthing value while solving business problems, please apply and suitable candidates will be contacted for a further conversation. • Thu, 08 FebLaunch Recruitment
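As an illustration of the Snowflake and data-quality tooling the Picklebet listing asks for, the sketch below runs a simple duplicate-key check with the official Snowflake Python connector. The account environment variables, warehouse, schema and the FCT_BETS table are hypothetical placeholders, and in a dbt-based stack a check like this would normally live in a test rather than a standalone script.

# Minimal illustrative data-quality check against Snowflake (hypothetical objects).
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)
try:
    cur = conn.cursor()
    # Fail loudly if the fact table contains duplicate business keys.
    cur.execute("""
        SELECT bet_id, COUNT(*) AS n
        FROM FCT_BETS
        GROUP BY bet_id
        HAVING COUNT(*) > 1
    """)
    duplicates = cur.fetchall()
    if duplicates:
        raise RuntimeError(f"{len(duplicates)} duplicate bet_id values found")
finally:
    conn.close()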
Data engineer » Brisbane, Brisbane Region - As a Data Engineer on our team, you'll play a pivotal role in designing, implementing, and optimizing our data pipelines and architectures. Your primary focus will be on Azure-based solutions, leveraging your expertise to ensure efficient data processing, storage, and retrieval. This is an exciting opportunity to contribute to a long-term, cutting-edge data transformation program. Proven experience as a Data Engineer, preferably in a cloud environment with a focus on Azure. Strong proficiency in SQL and experience with data modeling techniques. Hands-on experience with Azure services such as Azure Data Factory, Azure Databricks, Azure SQL Database, and Azure Blob Storage. Excellent problem-solving skills and the ability to troubleshoot complex data issues. Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment. • Thu, 08 FebTalenza
