Data Engineering Consultant

IT and Telecommunications

Job description

We are looking for a Data Engineer with a consultative edge to join a consulting team working with big data platforms.

In this role you’ll work across projects delivering solutions on big data platforms and varied technologies such as Cloudera, Hortonworks, AWS EMR, GCP, and Hadoop. You’ll design and develop data pipelines, manage data workflows, and help organisations begin their journey into cloud-based big data solutions that support analytics, machine learning, and AI.

We’re seeking experience across the following or similar tools: Apache Spark, Hive, HBase, Kafka, Sqoop, or NiFi, along with coding experience in Python or a similar language.

The technical capability we are looking for is to support the big data/cloud-based solutions team (AWS EMR, Google BigQuery, Hadoop, Spark, Hive, Kafka, NiFi, etc.).

The ability to work with new technologies is essential.

Some of your key responsibilities will be to:

• Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, and re-designing infrastructure for greater scalability
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and various ‘big data’ technologies
• Build analytics tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics

We’ll be progressing suitable applications as we receive them, so we strongly advise submitting your application as soon as possible. For more information, contact Chris Kent on chris.kent@beyond.co.nz
