Data Engineer

gravity9
Realising the next phase of an organisation's digital journey requires more than just great technology. At gravity9, we take a different approach. With deep experience and personality, our team of designers and engineers unites art and science to realise the next chapter in an organisation's digital journey. We are just getting started!
Role Description
We are looking for a smart, enthusiastic Data Engineer with a keen interest in current technology and engineering.
We believe small, multi-disciplinary teams are most effective in delivering complex change. As such, we are looking for someone with a collaborative mindset who is happy to be flexible and take on different responsibilities over the lifetime of our projects.
The Data Engineer will collaborate with the Data Architect to implement data frameworks and processes: developing data models, maintaining the data warehouse and analytics environment, and writing scripts for data integration and analysis. The role works with key stakeholders across the Data Transformation, Data Governance, and Business Intelligence teams to define business requirements and objectives, manages large projects or processes with limited oversight, and mentors and reviews the work of junior ETL developers, reporting teams, and functional teams in their acceptance of data solutions. The problems faced are difficult and often complex.
Responsibilities:
- Designs and implements data management architecture to meet corporate data management needs and business functional requirements. Ensures that solution designs address operational requirements such as scalability, maintainability, extensibility, flexibility, and integrity.
- Designs and builds data provisioning workflows/pipelines, physical data schemas, extracts, data transformations, and data integrations and/or designs using ETL and microservices.
- Designs and develops programs and tools to support the ingestion, curation, and provisioning of complex enterprise data for analytics, reporting, and data science.
- Leads peer development and code reviews with a focus on test-driven development and Continuous Integration/Continuous Delivery (CI/CD).
- Monitors system performance through regular testing, troubleshoots issues, and integrates new features.
- Engages with cross-functional teams on database integration efforts for merging BI platforms with enterprise systems and applications.
Role requirements and experience
· Intermediate experience with methodologies, designs, and processes in the technical areas of ETL, ELT, and data modelling.
· Intermediate experience developing data ingestion, data processing, and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions.
· Intermediate experience with T-SQL and user input controls.
· Intermediate ability to provide practical direction within Azure native services.
· Hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Synapse/DW, Azure SQL DB.
· Intermediate experience with BI tools such as Power BI, SSRS, SSAS, and Tableau.
· Understanding of data security best practices, such as database encryption approaches.
· Understanding of systems integrations and data integrations.
· Critical thinking and problem-solving skills, with the ability to identify root causes and implement workable solutions, along with an aptitude for process improvement.
· Proven ability to produce high-quality technical documentation and presentations.
Nice to have
· Experience with big data technologies such as PowerShell, Python, SQL, ADLS/Blob, Apache Spark/Spark SQL, Databricks, and Hive, and with streaming technologies such as Kafka, Event Hubs, and NiFi.
· Experience with Azure Event Hubs, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Azure Cosmos DB, Azure Functions, and Azure Data Lake.
· Knowledge of HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
· Experience with RESTful APIs, messaging systems, and Microsoft Azure.
Role requirements – soft skills
Key attributes and behaviours needed to succeed in this role:
· Delivery focus – strong analysis and problem-solving skills. Ability to evaluate, design and implement effective solutions. Flexibility to adapt skills to a spectrum of client engagements, use cases and modes of change delivery.
· Exemplary standards of personal integrity and respect for others – a professional approach to all aspects of client engagement and collaborative team working, with the ability to quickly build relationships based on trust and transparency. Strong presentation, written, and verbal communication skills are essential given the client-facing nature of the role.
· Value driven – creative thinking, with an aptitude for innovation and a strong desire to exceed client expectations.
· Energy – positive attitude and determination to learn and succeed.