Senior Data Engineer
About Us 🚀
Bridgit is a leading non-bank lender transforming the way Australians access property equity. Purpose-built to make property transactions faster and easier, we pioneered the Buy Now, Sell Later solution – empowering homeowners to unlock their property equity and move on their terms. With a simple digital application, fast approvals, and flexible loan options, we’re making property finance seamless and stress-free.
We’re now four years old and making huge strides. We have accredited over two-thirds of Australia’s broker network and launched white-label solutions with Australia’s largest aggregators – Connective, Aussie Home Loans, Finsure, and Loan Market Group – and we’re just getting started.
Our momentum has been recognised with awards such as Best Growth Story at the 2025 Fintech Awards and finalist for Excellence in Lending (Business & Consumer) at the 2025 Finnies.
The difference you’ll make
We are looking for a Senior Data Engineer with deep technical expertise in Python, SQL, and modern lakehouse architectures such as Microsoft Fabric or Azure Databricks.
This role involves designing, building, and maintaining highly scalable batch and streaming data pipelines, optimizing performance, and ensuring robust data quality across the enterprise.
The ideal candidate will have a strong engineering mindset, hands-on experience across the full data lifecycle, and a passion for modern cloud and AI-driven data ecosystems.
What you'll do:
- Design, develop, and optimize data pipelines for both batch and streaming workloads using Fabric or Azure Databricks.
- Implement data models, transformations, and ingestion frameworks following best practices (e.g., Medallion architecture: Bronze–Silver–Gold).
- Manage and maintain ETL/ELT processes using Azure Data Factory, Fabric Pipelines, or equivalent tools.
- Develop and maintain high-performance SQL queries and analytical models for reporting and analytics.
- Ensure data quality, reliability, and lineage through testing, monitoring, and documentation.
- Collaborate with data scientists, analysts, and business stakeholders to translate requirements into scalable technical solutions.
- Manage version control, branching, and CI/CD processes using Git.
- Drive adoption of engineering best practices, including modular code design, testing, and automation.
What you’ll bring:
- Extensive experience (7–10 years) in data engineering projects spanning batch and streaming data workloads.
- Advanced Python proficiency, including strong command of NumPy, Pandas, and related data-processing libraries.
- Expert-level SQL skills — capable of building efficient, production-grade transformations and analytical models.
- Proven experience with Fabric or Azure Databricks, particularly in Delta Lake or Lakehouse environments.
- Strong understanding of data modeling (Star/Snowflake), Medallion architecture (Bronze–Silver–Gold), and incremental/CDC design patterns.
- Hands-on experience with ETL/ELT orchestration (Azure Data Factory, Fabric Pipelines, or equivalent).
- Proficient with Git for version control, branching strategies, and CI/CD integration — this is essential.
- Excellent communication, documentation, and stakeholder collaboration skills.
Good-to-Have Skills:
- Familiarity with AI-assisted development tools such as Cursor, GitHub Copilot, or similar productivity-enhancing environments.
- Exposure to Power BI, Fabric Data Warehouse, and Semantic Models (DAX).
- Knowledge of data governance, lineage, and data quality frameworks.
- Experience with Azure Data Lake Storage, Synapse, or SQL Warehouse.
- Understanding of DevOps principles and cloud cost optimization for data workloads.
- Experience integrating machine learning models or LLM-based workflows into data pipelines.
Our Culture and Benefits
Bridgit values its team – they are the heart of how we build this business. Along with competitive remuneration, slick offices, and the chance to be part of an innovative, agile fintech, we also offer:
- Extra Leave – We offer birthday leave + an additional day of paid leave to be used for life events, celebrations, or just a mental health reset.
- Two Weeks from Anywhere – We encourage employees to work remotely from a location of their choice for two weeks each year.
- Learning and Development – All employees are encouraged and empowered to engage in professional development, including a number of learning initiatives run internally.
- Social Events – We have a jam-packed social scene, with events throughout the year to bring the team together!
Ready to Make an Impact?
If you’re excited about reshaping the lending industry and want to be part of a company that values authenticity and innovation, we’d love to chat. Apply now and let’s build the future of finance together! 🚀
- Department: Tech Team
- Locations: Cebu, Manila