Senior Data Engineer
- Salary: Gross monthly salary between EUR 4,516 and EUR 6,449 (scale 09).
- Extras: a thirteenth month, 8% holiday allowance, and a 10% Employee Benefit Budget.
- Development budget: EUR 1,400 per year for your growth and development.
- Hybrid working: a balance between home and office work (possible for most roles).
- Pension: you decide the amount of your personal contribution.
Or view all our benefits.
As a Data Engineer, you collaborate with product, engineering, and modelling teams to ensure we deliver business value. You will work on the design, development, and maintenance of our data platform, implementing both ETL pipelines and real-time data ingestion and processing, and developing APIs to enable seamless data access and integration.
In the Business Lending domain, we support business clients with financial solutions such as loans, credit facilities, leasing, and factoring. Our ambition? To stay ahead of the market by enabling 80% of our client requests through fast and seamless digital journeys. Simplification and standardization are crucial to achieving this goal.
Our team plays a pivotal role in this transformation. We firmly believe that data is a key asset in today’s financial world. Our mission is to support data-driven decision making, to ultimately help more business clients.
With our data platform we offer solutions that enable straight-through-processing and fully automated credit origination. We maintain client profiles that are continuously updated with data from client and employee journeys, and we implement risk models used for acceptance decisions.
Our objective is to better understand our data: where it comes from, how it is used, and what its quality is. We collaborate with product, engineering, and risk teams to continuously unlock new data sources and improve our data landscape.
- contribute to the design, scalability and reliability of our data infrastructure;
- build and optimize data pipelines for both batch and streaming data;
- enable data-driven solutions.
- designing and implementing ETL and real-time data pipelines;
- developing and deploying containerized applications and APIs to expose data and services;
- implementing data quality checks and monitoring;
- providing operational support and troubleshooting for data infrastructure;
- collaborating with stakeholders to understand requirements and deliver business value;
- continuously improving our data engineering practices.
We believe in the power of differences. It is precisely by combining people's differences that we become an even better bank. We are curious about what you will add to our team, Data Models Analytics and Calculations.
Collaboration is our approach; as one focused team within Rabobank, we deliver innovative data solutions. Our culture is open, inclusive, and driven by continuous improvement.
For us, your development and that of society go hand in hand. That’s why we want to invest in you and work together for a better world. We summarise this in one sentence: At Rabobank, you work on yourself and the world around you at the same time. You’ll see this reflected in your personal development budget, our hybrid working environment, and a healthy balance between work and home. You can work on banking services for our private and business customers, as well as societal issues such as food and energy transitions.
At Rabobank, we believe we become stronger through people who complement each other. By embracing our differences, we bring out the best in one another. We seek diversity in areas such as knowledge, skills, and experience, but also in gender, background, and culture. Across every department, we strive for variety and the freedom to be yourself – whoever you are. That’s what diversity and inclusion at Rabobank is all about.
- A completed master’s degree in Computer Science, Software Engineering, or a related field
- 5+ years of experience in data engineering, data integration, or software engineering
- Hands-on experience with ETL development (using PySpark) and real-time data processing
- Experience in designing and implementing data architectures (e.g., lakehouse)
- Familiarity with SQL & NoSQL databases
- Strong programming skills in Python
- Experience with API development (preferably FastAPI)
- Experience with cloud platforms (e.g., AWS, Azure or GCP)
- Experience with AWS services (Glue, Lambda, MSK, S3, API Gateway, Fargate, DocumentDB) is a plus
- Experience with Docker, Terraform, and building CI/CD pipelines
- To be considered for this position, you must be located in the Netherlands.
- Questions about the role? Reach out to George Vermeer, Sr Tech Lead: george.vermeer@rabobank.nl
- Questions about working at Rabobank and the procedure? Contact Lisa Boschma, recruiter, via lisa.boschma@rabobank.nl
- After you have sent your CV and motivation letter, you will receive a response from us as soon as possible. If there is an initial match, the application process includes two interviews, which will take place via Microsoft Teams.
- If you are invited for an interview, Bo, our virtual assistant, will contact you via SMS and email to schedule the interview.
- Answers to frequently asked questions can be found at rabobank.jobs/en/faq.
- A reliability assessment is part of the procedure.
- We respect your privacy.
