Senior Data Engineer (GCP) | Remote-Friendly
India - Remote
Velotio
Velotio Technologies is a leading product engineering & digital solutions company for innovative startups and enterprises. Velotio has worked with over 90 global customers, including NASDAQ-listed enterprises and unicorn startups. At Velotio, we embrace a remote-friendly work culture where everyone has the flexibility to work remotely or from our office in Pune.
Join us and work from wherever you feel most productive!
About Velotio:
Velotio Technologies is a product engineering company working with innovative startups and enterprises. We are a certified Great Place to Work® and recognised as one of the best companies to work for in India. We have provided full-stack product development for 110+ startups across the globe, building products in the cloud-native, data engineering, B2B SaaS, IoT & machine learning space. Our team of 325+ elite software engineers solves hard technical problems while transforming customer ideas into successful products.
Requirements
Join our team as a Data Engineer to play a crucial role in enhancing our data infrastructure, focusing on leveraging Google Cloud Platform (GCP) to meet the evolving needs of Business Intelligence, Data Analysis, and Data Science. You will develop, maintain, and optimize data pipelines and models, delivering solutions that enhance data utilization across the organization.
Responsibilities:
- Construct and manage data pipelines that clean, transform, and aggregate data from various sources, utilizing GCP services like BigQuery, Dataflow, and Pub/Sub.
- Develop data models to support both operational and analytical reporting needs, ensuring scalability and performance.
- Deliver high-quality code and implement process improvements, including automation of manual tasks and optimization of data delivery.
- Collaborate with various teams to understand data requirements and integrate solutions effectively, promoting a data-driven culture.
- Communicate technical details effectively to the data team and contribute to data governance and documentation efforts.
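To give a flavour of the clean/transform/aggregate work described above, here is a minimal sketch in plain Python. The record shape and field names (`region`, `amount`) are hypothetical examples, not the company's actual schema; in practice this kind of logic would typically run inside a Dataflow job or a Composer/Airflow-managed pipeline writing to BigQuery rather than a standalone script.

```python
from collections import defaultdict

def clean(records):
    """Drop records missing required fields and normalize types."""
    for r in records:
        if r.get("region") and r.get("amount") is not None:
            yield {
                "region": r["region"].strip().lower(),
                "amount": float(r["amount"]),
            }

def aggregate(records):
    """Sum amounts per region — the kind of rollup a BI table needs."""
    totals = defaultdict(float)
    for r in clean(records):
        totals[r["region"]] += r["amount"]
    return dict(totals)

# Hypothetical raw input from an upstream source
raw = [
    {"region": " EMEA ", "amount": "120.5"},
    {"region": "emea", "amount": 30},
    {"region": None, "amount": 99},   # dropped: missing region
    {"region": "apac", "amount": "10"},
]
print(aggregate(raw))  # {'emea': 150.5, 'apac': 10.0}
```

The same clean → transform → aggregate shape maps directly onto a Dataflow (Apache Beam) pipeline or an Airflow DAG, with each function becoming a pipeline stage or task.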
Required Skills:
- Proficient in SQL and programming languages such as Java or Python.
- Strong expertise in data modeling and familiarity with data orchestration tools like Airflow.
- Experience with GCP data engineering services, including BigQuery, Dataflow, Pub/Sub, and Cloud Composer.
- Understanding of both relational and non-relational databases.
- Familiarity with DevOps methodologies.
Desired Skills:
- Experience handling SAP data and integrating it within cloud-based data solutions.
- Exposure to data warehousing, big data technologies, and machine learning frameworks.
Benefits
Our Culture:
- We have an autonomous and empowered work culture encouraging individuals to take ownership and grow quickly
- Flat hierarchy with fast decision making and a startup-oriented “get things done” culture
- A strong, fun & positive environment with regular celebrations of our success. We pride ourselves on creating an inclusive, diverse & authentic environment
Note: Currently, all interviews and onboarding processes at Velotio are being carried out remotely through virtual meetings.