Data Architect

Noida, Uttar Pradesh, India

Amplifi Capital

Amplifi Capital UK. We grow credit unions. We work with the boards and members of credit unions to deliver sustainable growth for their organisations and improve the financial well-being of their members.

About Us:

Amplifi Capital (U.K.) Limited is a dedicated consumer lending platform that specializes in offering unsecured personal loans. We primarily focus on serving near-prime consumers who often encounter hurdles accessing credit from traditional high street banks. Our commitment lies in providing accessible financial solutions to bridge the gap in the lending landscape.

At the heart of our operations in Noida, Uttar Pradesh lies Gojoko, a bespoke fintech platform tailored to meet the unique needs of Credit Unions. This comprehensive system encompasses the entire spectrum of savings and loans, integrating best-in-class software suppliers such as Mambu, Modulr, ADP, and AWS, among others.

Building on the success of our ecosystem, we proudly introduced ReevoMoney, a consumer-facing brand operating with its own balance sheet. ReevoMoney caters to a wider customer base, offering innovative financial solutions.

People always come first at Gojoko Technologies, from how we engage with our customers to our thorough recruitment process. Our journey is just getting started: the business has attracted amazing talent so far, and we don’t plan on stopping yet!

The Role:

This is an exciting opportunity for a hands-on Data Architect with strong architecture skills and a comprehensive understanding of data engineering, BI, and reporting to take responsibility for the data platform of a growing FinTech business.

This position is ideal for someone skilled in data modelling (relational, dimensional, and industry-specific models such as FSLDM) who enjoys tackling complex data challenges by developing innovative solutions on cloud technologies. Proficiency in building a data lakehouse on Databricks or Snowflake is required, as is experience in data enrichment, database performance management, data caching, data quality initiatives, data engineering, job orchestration, and CI/CD pipelines.
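
For illustration, here is a minimal sketch of one such modelling step, assuming a PySpark environment such as a Databricks cluster; every table and column name below is hypothetical rather than drawn from our platform:

```python
# Illustrative sketch only: deduplicate raw records and build a small
# conformed dimension, a typical lakehouse modelling step. Assumes
# PySpark is available (e.g. on a Databricks cluster).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

# Hypothetical raw member records as landed in the bronze layer.
raw = spark.createDataFrame(
    [
        (101, "M. Patel", "2024-01-05"),
        (101, "M. Patel", "2024-03-12"),  # same member, later load
        (102, "A. Khan", "2024-02-20"),
    ],
    ["member_id", "member_name", "loaded_at"],
)

# Keep the latest record per business key and assign a surrogate key.
latest_first = Window.partitionBy("member_id").orderBy(F.col("loaded_at").desc())
dim_member = (
    raw.withColumn("rn", F.row_number().over(latest_first))
    .filter("rn = 1")
    .drop("rn")
    .withColumn("member_sk", F.monotonically_increasing_id())
)

dim_member.show()
```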

This is a business-facing role, so successful candidates must possess excellent stakeholder management and communication skills, along with the ability to rapidly acquire new skills and take ownership of a complex, multi-layered data platform. Collaborating closely with business analysts, data engineers, and data scientists, you will play a pivotal role in constructing and sustaining a sophisticated data platform that is essential to the company's success.

Responsibilities

  • Own the data architecture across multiple data platforms (AWS and Databricks)
  • Own the data lakehouse platform and maintain the data models for all layers of FSLDM
  • Lead the data modelling of diverse datasets and hands-on development of the database
  • Build an understanding of business processes and translate them into data models
  • Translate business requirements into technical requirements and architectural diagrams
  • Lead initiatives through the build process with various engineering teams
  • Act as the subject matter expert for all company data, master data, and reference data
  • Be part of the operational support of the data platform to ensure a reliable service
  • Track and communicate issues with the data platform to the technology leadership team
  • Document the delivered solutions at the technical and business level

Requirements

  • Experience performing database modelling and deploying database schema changes
  • A sound knowledge of cloud platforms and serverless computing technologies such as AWS and Databricks
  • Experience writing complex SQL to implement logic, interrogate data, join datasets, and tune performance (see the sketch after this list)
  • Experience conducting data analytics on cloud platforms such as AWS, Azure, or GCP
  • Hands-on experience managing and organizing large sets of messy data efficiently
  • Background with modern BI tools and semantic layers, such as Tableau and Power BI
  • Experience in working effectively in virtual teams and maintaining constant collaboration
  • Ability to work independently and take ownership of key services
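
As a hedged illustration of the SQL work described above, the sketch below joins two datasets and ranks members by outstanding exposure. It runs against an in-memory SQLite database purely so the example is self-contained; the schema and figures are invented:

```python
# Illustrative sketch only: join loan and payment datasets, then use a
# window function to rank members by outstanding balance.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (loan_id INTEGER, member_id INTEGER, amount REAL);
    CREATE TABLE payments (loan_id INTEGER, paid REAL);
    INSERT INTO loans VALUES (1, 101, 5000), (2, 101, 2000), (3, 102, 3000);
    INSERT INTO payments VALUES (1, 1500), (1, 500), (3, 3000);
""")

query = """
    SELECT l.member_id,
           SUM(l.amount) - SUM(COALESCE(p.paid, 0)) AS outstanding,
           RANK() OVER (
               ORDER BY SUM(l.amount) - SUM(COALESCE(p.paid, 0)) DESC
           ) AS exposure_rank
    FROM loans l
    LEFT JOIN (
        SELECT loan_id, SUM(paid) AS paid FROM payments GROUP BY loan_id
    ) p ON p.loan_id = l.loan_id
    GROUP BY l.member_id
"""
for member_id, outstanding, exposure_rank in conn.execute(query):
    print(member_id, outstanding, exposure_rank)
```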

Nice to Have:

  • Experience with ETL architectures and tools, including integration with APIs and coding Python data pipelines (a sketch follows this list)
  • Team leadership skills for managing tasks and conducting stand-ups
  • Knowledge of reference/master data management
  • Understanding of data governance initiatives such as lineage, masking, retention policy, and data quality
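
A minimal, hedged sketch of the Python pipeline work mentioned in the first item above; the API endpoint is hypothetical, and a production pipeline would add retries, schema validation, and orchestration:

```python
# Illustrative sketch only: extract JSON records from an API and stage
# them into a landing table. The endpoint below is hypothetical.
import json
import sqlite3
import urllib.request

API_URL = "https://api.example.com/v1/rates"  # hypothetical endpoint


def extract(url: str) -> list[dict]:
    """Fetch and decode a JSON payload (assumed to be a list of objects)."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)


def load(records: list[dict], conn: sqlite3.Connection) -> None:
    """Stage raw records into a landing table for later transformation."""
    conn.execute("CREATE TABLE IF NOT EXISTS rates_raw (payload TEXT)")
    conn.executemany(
        "INSERT INTO rates_raw VALUES (?)",
        [(json.dumps(r),) for r in records],
    )
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(extract(API_URL), conn)
```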

Benefits

  • Competitive salary
  • 25 days annual leave
  • Gratuity
  • Subsidized transport
  • Discounted shopping
  • Private health insurance
  • Hybrid working (1 day from home)

Commitment:

We are committed to equality of opportunity for all staff, and applications are encouraged from individuals regardless of age, disability, sex, gender, race, or social background.
