Business Intelligence-II-SUPPORT SERVICES-CTO Head
Bengaluru, Karnataka, India
Kotak Mahindra Bank
Kotak Mahindra Bank offers high-interest savings accounts, low-interest personal loans, and credit cards with attractive offers. Experience new-age personal banking and net banking with Kotak Bank.
Data Engineer - 1 (Experience: 0-2 years)
What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank, managing the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and works closely with the Analytics org. DEX is primarily working on a greenfield project to migrate the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics.
The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters.
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to build systems that can be operated by machines using AI technologies.
The data platform org is divided into 3 key verticals:
Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; and automation and observability capabilities for Kotak's data platform. The team will also be the centre of Data Engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-driven, programmatic way and think big to build some of the most leveraged data models among financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.
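To illustrate what "config-driven and programmatic" pipelines mean in practice, here is a minimal Python sketch: each dataset is declared as a config entry, and one generic runner applies its declared transform chain. All names, transforms, and datasets below are hypothetical examples, not Kotak's actual implementation.

```python
from typing import Any, Callable

# Illustrative transform registry (names are assumptions for this sketch).
TRANSFORMS: dict[str, Callable[[dict[str, Any]], dict[str, Any]]] = {
    "uppercase_branch": lambda r: {**r, "branch": r["branch"].upper()},
    "mask_account": lambda r: {**r, "account": "****" + r["account"][-4:]},
}

# One declarative config entry per dataset, instead of one hand-written
# pipeline each — adding a dataset means adding config, not code.
PIPELINE_CONFIG: dict[str, dict[str, Any]] = {
    "branch_transactions": {
        "source": "core_banking",
        "transforms": ["uppercase_branch", "mask_account"],
    },
}

def run_pipeline(dataset: str, rows: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Apply the configured transform chain for one dataset."""
    cfg = PIPELINE_CONFIG[dataset]
    for name in cfg["transforms"]:
        rows = [TRANSFORMS[name](row) for row in rows]
    return rows

rows = [{"branch": "blr-01", "account": "123456789"}]
print(run_pipeline("branch_transactions", rows))
```

In a real lakehouse setting the same idea scales up: the config would live in a catalog, and the transforms would be Spark jobs rather than Python lambdas.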
Data Governance
This will be the central data governance team for Kotak Bank, managing the metadata platform, data privacy, data security, data stewardship, and the data quality platform.
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, this is the team for you.
Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support a data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience in data engineering
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
- Experience with data pipeline tools such as Airflow and Spark
- Experience with data modeling and data quality best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork skills
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
- Strong advanced SQL skills
PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
- Prior experience in the Indian banking segment and/or fintech is desirable.
- Experience with Non-relational databases and data stores
- Building and operating highly available, distributed data processing systems for large datasets
- Professional software engineering and best practices for the full software development life cycle
- Designing, developing, and implementing different types of data warehousing layers
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
- Building scalable data infrastructure and understanding distributed systems concepts
- SQL, ETL, and data modelling
- Ensuring the accuracy and availability of data to customers
- Proficient in at least one scripting or programming language for handling large volume data processing
- Strong presentation and communications skills.
Perks/benefits: Career development