Senior Data Engineer

Company Description

Dentsu Media is the largest area of specialism within the dentsu network. It is brought to market globally through three award-winning agency brands: Carat, iProspect and dentsu X. All three are underpinned by a scaled network offering of talent, capabilities, and services to support, grow and transform the world’s leading advertisers. Operating across more than 145 countries, dentsu Media is trusted by leading brands across a broad range of sectors, from pioneering technology to cutting-edge fashion. Dentsu leverages technology, creativity, and in-depth data analytics to drive superior results that amplify its clients' brands. Dentsu Media is transforming advertising as a force for growth and a force for good, and has become the destination for employees to cultivate meaningful careers and for brands to accelerate previously unseen, sustainable growth.
Job Description

1. Design, build, test, and maintain highly scalable data platforms, ensuring that these systems meet business requirements and industry best practices.
2. Incorporate new data engineering technologies into existing systems to help move data between them.
3. Create custom data pipelines and solution components that extract, cleanse, transform, move, and aggregate data across various enterprise systems.
4. Identify potential opportunities for data acquisition and explore new uses for existing data.
5. Develop processes and standards for common data models and data mapping across systems.
6. Use a range of ETL tools, data integration platforms, and their underlying languages and tooling to integrate disparate data systems effectively.
7. Leverage data virtualization technologies to aggregate and unify data from multiple, disparate sources without the use of ETL.
8. Recommend ways to improve data reliability, efficiency, and quality.
9. Foster collaboration with data architects, modelers, and IT team members on project goals.
10. Provide technical leadership and consultation on complex projects involving data management, data virtualization, and unification.

Qualifications

11. Bachelor’s/Master’s degree in Computer Science, Data Science, Information Systems, or a related field.
12. Minimum of 5 years of experience in a data engineering role, with a focus on data management and data pipeline construction.
13. Demonstrable experience with data engineering solutions, including data aggregation, data transformation, data movement, and data unification strategies.
14. Proficiency with data lakes built on cloud services such as Azure and AWS – e.g., ADLS (Azure Data Lake Storage Gen2).
15. Proficiency with cloud platforms and cloud services on at least one major enterprise cloud (Azure or AWS).
16. Solid exposure to and experience with data virtualization technologies and platforms, particularly the Denodo platform, is highly preferred.
17. Proficiency with data orchestration services and data workflow platforms – e.g., Apache Airflow – highly preferred (a minimal pipeline sketch follows this list).
18. Proficiency in scripting and data-intensive languages such as SQL and Python.
19. Experience with big data frameworks and their associated languages and toolsets is a nice-to-have.
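
To illustrate the kind of pipeline construction and orchestration work described in responsibility 3 and qualification 17, here is a minimal sketch of an extract-transform-load flow defined with Apache Airflow's TaskFlow API. It assumes a recent Airflow 2.x release and pandas; the DAG name, the source CSV path, the table name, and the SQLite target are illustrative placeholders, not anything specified in this posting.

    # Minimal extract -> transform -> load DAG sketch (Airflow 2.x TaskFlow API).
    from datetime import datetime

    import pandas as pd
    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def example_sales_etl():
        @task
        def extract() -> list[dict]:
            # Pull raw records from a source system; a local CSV stands in
            # for an enterprise source such as an API or database extract.
            return pd.read_csv("/tmp/raw_sales.csv").to_dict("records")

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Cleanse and aggregate: drop incomplete rows, sum revenue per region.
            df = pd.DataFrame(rows).dropna(subset=["region", "revenue"])
            agg = df.groupby("region", as_index=False)["revenue"].sum()
            return agg.to_dict("records")

        @task
        def load(rows: list[dict]) -> None:
            # Land the aggregated data in a target store; SQLite stands in
            # for a cloud data lake or warehouse table.
            import sqlite3

            with sqlite3.connect("/tmp/analytics.db") as conn:
                pd.DataFrame(rows).to_sql(
                    "sales_by_region", conn, if_exists="replace", index=False
                )

        load(transform(extract()))


    example_sales_etl()

In practice, the same task structure would point at enterprise sources and a cloud data lake or warehouse target, with Airflow handling scheduling, retries, and dependency ordering between the steps.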
