Senior Data Engineer

Athens, Attica, Greece

Dialectica

Get access to unique insights and untapped expert knowledge through interviews with experts, B2B surveys, and customized reports from across industries and locations.


About the Tech Team

Technology powers everything we do at Dialectica – from communicating with clients and finding the most relevant expert profiles in just a few minutes to indexing and categorizing thousands of pieces of information every day. To do so, we have built our own proprietary web application that automates and optimizes the delivery of our market-leading services.

To date, the team consists of 70+ people across Software Engineering, Product & Design, and TechOps, and is expected to grow further in 2024. As part of Dialectica’s Technology Team, you will have the opportunity to help build tech products according to the requirements of our internal and external clients and share the company’s vision to roll out more information services software products in the future.


About the role

We are seeking a Data Engineer to help organize and manage multiple data sources. The successful candidate will primarily be responsible for designing and optimizing data pipelines that improve data accessibility and integrity. The role involves close collaboration with teams that consume or enrich data, such as Data Science, BI, and product development. Data often passes through multi-stage enrichments, where one team's output must be captured effectively and made accessible again for subsequent use by other teams.
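To make the multi-stage enrichment pattern described above concrete, here is a minimal, purely illustrative Python sketch in which each stage's output is published so a downstream team can build on it. The stage names, record fields, placeholder logic, and dataset names are assumptions chosen for illustration only; they do not describe Dialectica's actual pipelines.

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional


@dataclass
class ExpertProfile:
    """One record flowing through the pipeline (fields are illustrative)."""
    expert_id: str
    raw_bio: str
    industry: Optional[str] = None            # filled in by the first enrichment stage
    relevance_score: Optional[float] = None   # filled in by a later stage


def classify_industry(profiles: Iterable[ExpertProfile]) -> List[ExpertProfile]:
    """Stage 1 (e.g. owned by a Data Science team): tag each profile with an industry."""
    return [ExpertProfile(p.expert_id, p.raw_bio, industry="Technology")  # placeholder logic
            for p in profiles]


def score_relevance(profiles: Iterable[ExpertProfile]) -> List[ExpertProfile]:
    """Stage 2 (e.g. owned by a product team): reuse stage-1 output to score each profile."""
    return [ExpertProfile(p.expert_id, p.raw_bio, p.industry,
                          relevance_score=1.0 if p.industry == "Technology" else 0.5)
            for p in profiles]


def publish(profiles: List[ExpertProfile], dataset_name: str) -> None:
    """Stand-in for persisting a stage's output (e.g. to a warehouse or data lake)
    so the next team can pick it up."""
    print(f"published {len(profiles)} rows to {dataset_name}")


if __name__ == "__main__":
    raw = [ExpertProfile("e-001", "15 years in semiconductor supply chains")]
    enriched = classify_industry(raw)
    publish(enriched, "analytics.expert_profiles_enriched")  # hypothetical dataset name
    scored = score_relevance(enriched)
    publish(scored, "analytics.expert_profiles_scored")      # hypothetical dataset name
```

In practice, each publish step would write to a governed table or lake location with an agreed schema, so downstream consumers are insulated from changes in upstream teams' logic.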

As a Data Engineer at Dialectica, you will:

  • Create and maintain effective data pipeline architectures
  • Assemble large, complex data sets
  • Design and implement infrastructure that enables teams to self-serve
  • Design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies
  • Build and/or configure monitoring tools and alerts that ensure the integrity of operations (a minimal sketch of such a check follows this list)
  • Work with stakeholders including the Product, Engineering and Design teams to assist with data-related technical issues and support their data infrastructure needs.
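As referenced in the monitoring bullet above, here is a minimal sketch of the kind of integrity check that could gate downstream steps and trigger an alert. The function name, thresholds, and hard-coded counts are hypothetical and for illustration only; they are not Dialectica's actual tooling.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline.checks")


def check_row_counts(source_count: int, loaded_count: int, tolerance: float = 0.0) -> bool:
    """Compare rows extracted from a source against rows loaded into the target.

    Returns False (and logs an error) when the loss exceeds the allowed tolerance.
    """
    allowed_loss = int(source_count * tolerance)
    if source_count - loaded_count > allowed_loss:
        log.error("Integrity check failed: extracted %d rows but loaded only %d",
                  source_count, loaded_count)
        return False
    log.info("Integrity check passed: %d of %d rows loaded", loaded_count, source_count)
    return True


if __name__ == "__main__":
    # In a real pipeline these counts would come from the source query and the
    # warehouse table; they are hard-coded here purely to demonstrate the check.
    if not check_row_counts(source_count=10_000, loaded_count=9_500, tolerance=0.01):
        raise SystemExit("Aborting downstream steps; an alert would be raised here.")
```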

Requirements

We have seen that people who succeed in this position have:

  • 4+ years of experience in a Data Engineer role
  • Strong knowledge of data management, data warehousing concepts (dimensional modeling, data marts), and database technologies.
  • Strong understanding of modern approaches to data engineering and how best to solve data problems.
  • Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database technologies.
  • Experience with ETL/ELT processes (knowledge of dbt is a plus).
  • Working experience in Python.
  • Hands-on experience with production data lake architectures in the cloud and big data implementations in general (designing, constructing, cataloging, and optimizing a data lake).
  • Experience with Cloud Technologies and Serverless Computing (we use AWS).
  • Working experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Strong analytic skills related to working with unstructured datasets.
  • Exposure to DevOps and CI/CD practices.
  • Familiarity with Change Data Capture (CDC) is a plus.
  • Experience in workflow management tools/job orchestration tools is a plus.

Our Values:

  • Ownership
  • Growth Mindset
  • Teamwork
  • Respect

Our Engineering Principles:

  • Master your Craft
  • Work for the Team
  • Build for the Business
  • Be Pragmatic

Fluency in English is a must!

Benefits

  • Competitive base salary with additional performance incentives
  • Coverage under the company’s collective health insurance plan
  • Learning and development opportunities (e.g. onboarding, on-the-job training, Udemy courses and many others!)
  • Hybrid or remote work model & extra personal/flex days and paid volunteer days a year for your favorite cause
  • Company sponsored team-bonding events
  • Weekly health & wellness activities (e.g. basketball, football, yoga, running), gym discounts, healthy breakfast, snacks and beverages
  • Entrepreneurial culture and amazing coworkers!

About Dialectica

Dialectica is the global leader in insights on-demand. We enable investment and business professionals to access untapped market, competitive, and customer insights powered by the world’s hardest-to-find experts and cutting-edge technology. Our team of 1,000+ professionals in 5 offices spanning 3 continents works with top-tier investment funds, management consulting firms, and Fortune 500 companies around the globe.

Driven by our mission to achieve unparalleled customer recognition, we are developing the most trusted and innovative knowledge-sharing platform in the world. Dialectica has been recognized as one of Europe’s fastest-growing companies by the Financial Times for 4 years in a row, a Top Employer for Recent Graduates by The Career Directory in Canada, and a Best Workplace.
