Are you looking for a job as a Data Model Engineer in the Credit Risk Models field? Then this post may be relevant to you.
Job Details:
Company name: NatWest Group
Position Name: Data Model Engineer, Credit Risk Models
Location: Edinburgh
Job ID: d3fc9ffac244d80d
Description : Our people work differently depending on their jobs and needs, from home working to job sharing. Visit the remote and flexible working page on our website to find out more.
This role is based in the United Kingdom and as such all normal working days must be carried out in the United Kingdom.
Join us as a Data Model Engineer, Credit Risk Models
This is an opportunity for a driven Data Model Engineer to take on an exciting new career challenge, specialising in Credit Risk Models in AWS
You’ll have the opportunity to build a wide network of stakeholders of varying levels of seniority
It’s a chance to hone your existing technical skills and advance your career
What you’ll do
In your new role, you’ll engineer and maintain innovative, customer-centric, high-performance, secure and robust solutions.
You’ll be working within a feature team and using extensive experience to engineer software, scripts and tools that are often complex, as well as liaising with other engineers, architects and business analysts across the platform.
You’ll also be:
Producing complex and critical software rapidly and to a high standard that adds value to the business
Working in permanent teams who are responsible for the full life cycle, from initial development, through enhancement and maintenance to replacement or decommissioning
Collaborating to optimise our software engineering capability
Designing, producing, testing and implementing our working code
Working across the life cycle, from requirements analysis and design, through coding to testing, deployment and operations
The skills you’ll need
You’ll need a background in software engineering, software design, architecture, and an understanding of how your area of expertise supports our customers. Alongside hands-on Python or PySpark development experience, you’ll need the ability to write advanced Spark SQL or ANSI SQL, as well as good experience of Spark SQL query optimisation and performance tuning.
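For context, here is a minimal, purely illustrative PySpark sketch of the kind of Spark SQL work described above: an aggregation with a broadcast join, one common performance-tuning technique. The table names, columns and configuration values are assumptions for illustration only, not details from the role.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical application name and a typical shuffle-tuning setting.
spark = (
    SparkSession.builder
    .appName("credit-risk-example")
    .config("spark.sql.shuffle.partitions", "200")
    .getOrCreate()
)

# Hypothetical source tables.
exposures = spark.table("risk.exposures")
ratings = spark.table("risk.customer_ratings")

# Broadcasting the smaller dimension table avoids a shuffle join,
# a common Spark SQL performance-tuning technique.
result = (
    exposures
    .join(F.broadcast(ratings), on="customer_id", how="left")
    .groupBy("rating_grade")
    .agg(F.sum("exposure_amount").alias("total_exposure"))
)

# Inspect the physical plan to confirm a broadcast join was chosen, then view the output.
result.explain()
result.show()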
Additionally, you’ll need good experience of Airflow, as well as Unix or Linux scripting, paired with hands-on experience of Continuous Integration, DevOps, Jenkins, TeamCity, Bitbucket, Git, and Artifactory. You’ll also bring IT experience in Agile, including Test-Driven Development and software delivery best practice.
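As a rough illustration of the Airflow side of that stack, the sketch below defines a minimal daily DAG with a single Python task. The DAG ID, schedule and callable are hypothetical, not taken from the job description.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def run_model_scoring():
    # Placeholder for a scoring step; in practice this might submit a PySpark job.
    print("scoring credit risk models")

with DAG(
    dag_id="credit_risk_model_pipeline",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    score = PythonOperator(
        task_id="score_models",
        python_callable=run_model_scoring,
    )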
You’ll also need:
Experience of working with code repositories, bug tracking tools and wikis
Coding experience in multiple programming languages
Experience of Agile methodology and its associated toolsets
A background in solving highly complex, analytical and numerical problems
Experience of implementing programming best practice, especially around scalability, automation, virtualisation, optimisation, availability and performance
If you need any adjustments to support your application, such as information in alternative formats or special requirements to access our buildings, or if you’re eligible under the Disability Confident Scheme please contact us and we’ll do everything we can to help.
Disclaimer: Applicants must check the company profile before applying. Bizplusapp.com is in no way responsible for any loss.