BI Data Engineer
Salary: Please contact us
Struggling to find a Data Engineer role with interesting challenges where you can be involved in Machine Learning and Data Analytics? We are helping our Cardiff-based client scale their Data Team with Big Data Engineers who have a passion for building solutions that process data at scale, in both batch and real-time scenarios, across a wide range of data-driven projects.
About the company…
• They’re at the forefront of modern technology and have a cloud-first environment
• 100% remote working is offered so you can work anywhere in the UK. You will not be asked to return to the office when lockdown restrictions ease (or ever, in fact).
• Progression is something they actively encourage; they invest heavily in learning and development and promote from within
• You have a passion for Big Data, a love of Cloud, a deep understanding of data technologies and strong software engineering skills
• You have an inquisitive nature and want to constantly develop your skills.
• As this is a late-stage start-up, you will thrive in a fast-paced environment and enjoy sharing ideas in the daily stand-ups
About the role…
• Day to day, you will implement workflows to ingest data into the company’s Data Warehouse and Data Lake from a variety of data sources.
• You will work closely with Data Architecture, Business Intelligence, Data Science and IT Delivery Teams to design, develop and maintain highly scalable data solutions and pipelines.
• You will identify and document requirements from the Business Intelligence and Analytics Teams, turning them into data assets that meet those stated requirements.
• Additionally, the role involves developing and refining datasets and processes, recommending ways to improve both the reliability and underlying quality of the company’s data.
The technology / experience needed…
• Data engineering experience ideally using AWS
• Hands on experience with AWS Athena, Redshift and underlying Data Lake concepts.
• Expert knowledge of both SQL and NoSQL database systems and concepts.
• Experienced with optimised storage file formats (Parquet, Avro, etc.).
• Proficient at building big data infrastructure, ingestion and modelling pipelines.
• Strong ETL skills; fluent with AWS Glue, AWS EMR / Spark, Apache NiFi and Airflow.
• Coding experience with Python, Java, Spark-SQL, R.
• Data modelling skills, able to model and visualise data requirements to aid technical design.
• Experience consuming and integrating with third-party APIs.
In addition to a strong skill set in Data and Cloud technologies, we would also welcome candidates with experience of Apache Hudi / Delta Lake concepts and a background in Agile.
Next steps… Apply now to be considered or if you have any questions, get in touch with Kim - firstname.lastname@example.org
IT Recruitment Consultant Team Lead
Get in touch: email@example.com