Data Engineer
Leeds - Partly remote working
Job Number
38616
Posted
21st Jun 2021 : 6:05 pm
Job Status
Live
Job Type
Temporary
Duration
Other
Pay Type
Inside IR35
Pay Rate
Between £450.00 - £600.00
Payment Method
Hourly
Contact
Surita Dadral
Contact details
0203 356 4949, admin@121.uk.com
Job Description
The public sector client is looking to recruit a Data Engineer for a temporary contract, initially for 110 days, with potential for an extension thereafter. The assignment is partly remote and partly office based in Leeds; you must be willing to travel into the office when required.
• Data Vault Proof of Concept – Solution, Platform and Security Architecture and Data flows.
• Build, test, and promote data ingestion pipelines using Databricks
• Build, test, and promote metadata-driven data pipelines using Databricks to load into the Data Vault with the defined model (a minimal illustrative sketch follows below)
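For illustration only, the sketch below shows what a metadata-driven Bronze-zone ingestion in a Databricks notebook might look like. The S3 paths, table names, and the ingestion_metadata list are hypothetical placeholders, not details of the client's actual solution.

```python
# Illustrative sketch, assuming a Databricks notebook where `spark` (the
# SparkSession) is already available. All paths and table names are invented.
ingestion_metadata = [
    {"source_path": "s3://raw-bucket/customers/", "format": "json",
     "target_table": "bronze.customers"},
    {"source_path": "s3://raw-bucket/orders/", "format": "csv",
     "target_table": "bronze.orders"},
]

for entry in ingestion_metadata:
    # Read the raw source files from S3 in the format declared in the metadata.
    df = (spark.read
          .format(entry["format"])
          .option("header", "true")   # only relevant for CSV sources
          .load(entry["source_path"]))

    # Append the records into the Bronze-zone Delta table.
    (df.write
       .format("delta")
       .mode("append")
       .saveAsTable(entry["target_table"]))
```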
The successful candidate will have a valid DBS check or be willing to apply for one upon CV submission or once a job offer has been made and accepted.
Please Note: This assignment sits inside IR35.
About the Rates of Pay: Please note that the higher pay rate advertised in our job advert(s) is the highest Ltd or Umbrella Company rate the client is willing to pay, and the lower pay rate is the highest PAYE rate the client is willing to pay, unless otherwise specified. If the PAYE rate is not indicated in the job advert, please contact us for confirmation of the PAYE daily pay rate.
Essential Skills & Experience
• Experience with AWS S3 storage, Lambda, and DynamoDB
• Experience with Databricks Delta Lake, including ingesting, transforming, and loading Delta tables across the Bronze, Silver, and Gold zones (see the sketch after this list)
• Experience/knowledge of Airflow
• Experience/knowledge of Python
• Experience/knowledge of a metadata catalogue – AWS Glue / Collibra preferred
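As a rough illustration of the Bronze, Silver, and Gold zones referenced above, the following PySpark sketch moves invented example tables through the three zones. The table and column names are assumptions for illustration, not the client's schema.

```python
# Hedged sketch of a Bronze -> Silver -> Gold flow on Databricks Delta Lake.
from pyspark.sql import functions as F

# Silver: cleanse and deduplicate the raw Bronze records.
bronze = spark.read.table("bronze.orders")
silver = (bronze
          .dropDuplicates(["order_id"])
          .filter(F.col("order_amount").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregation ready for reporting.
gold = (silver
        .groupBy("customer_id")
        .agg(F.sum("order_amount").alias("total_spend")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.customer_spend")
```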
Key Tasks & Deliverables
• Build, test, and promote metadata-driven data pipelines using Databricks to read from the Data Vault and load into the Data Mart with defined data aggregations, enrichment, transformations, data quality rules, and data lineage
• Orchestrate data pipelines using Airflow / AWS Lambda (a minimal DAG sketch follows this list)
• Document low-level designs as per the defined standards
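Purely as an illustration of orchestrating such pipelines with Airflow, the sketch below sequences two placeholder tasks. The DAG id, schedule, and task names are assumptions; in practice the Databricks provider operators would likely trigger the actual jobs.

```python
# Minimal Airflow 2.x DAG sketch: run the Data Vault load only after the
# Bronze ingestion succeeds. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_to_bronze():
    # Placeholder: trigger the Bronze-zone ingestion job.
    pass


def load_data_vault():
    # Placeholder: trigger the metadata-driven load into the Data Vault.
    pass


with DAG(
    dag_id="data_vault_poc_pipeline",
    start_date=datetime(2021, 6, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    bronze_task = PythonOperator(task_id="ingest_to_bronze",
                                 python_callable=ingest_to_bronze)
    vault_task = PythonOperator(task_id="load_data_vault",
                                python_callable=load_data_vault)

    # Enforce ordering: Bronze ingestion first, then the Data Vault load.
    bronze_task >> vault_task
```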
Qualifications, Training & Certificates
Interview Process: Two stages, comprising an initial screening call followed by a 90-minute interview that includes a coding exercise and technical questions.