Company Description:
We Dream. We Do. We Deliver.
As a full-service, data-driven customer experience transformation company, we partner with Top 500 companies in the DACH region and in Eastern Europe.
Originally from Switzerland, Merkle DACH was created out of the merger of Namics and Isobar, two leading full-service digital agencies.
Our 1,200+ digital enthusiasts are innovating the way brands are built by providing expertise in Digital Transformation strategy, MarTech platforms, Creativity, UX, CRM, Data, Commerce, Mobile, Social Media, Intranet and CMS.
We are part of the global Merkle brand, the largest brand within the dentsu group, which shares with us a network of over 66,000 passionate individuals in 146 countries.
Job Description:
Provide thought leadership and drive architecture of Enterprise Analytics / BI and Data Science solutions to deliver scalable implementations using the latest cloud technologies in AWS / Azure and 3rd party BI platforms
Translate business goals and use cases into technical solutions that deliver actionable insights for various business processes, especially in the areas of sales, finance, e-commerce, manufacturing / logistics, banking and CRM.
Become the technical SPoC for all stakeholders.
Set architectural vision and direction across a matrix of teams. Propose a conceptual / logical design for various data components and large volumes of data integrations.
Apply effective governance methods for long-term scalability, high availability, fault tolerance, and solution elasticity in alignment with enterprise architecture, security and infrastructure guidelines and standards.
Provide supervision and guidance to the implementation team (data engineers, data scientists, BI consultants, etc.) to ensure the project is implemented according to business requirements.
Support pre-sales by proposing technical solutions and effort estimates.
Maintain an internal framework to ensure consistent & optimal delivery across projects.
Ensure full DevOps / DataOps automation of continuous development / test / deployment processes
Qualifications:
Required Skills
Bachelor’s or Master’s Degree in a technology-related field (Engineering, Computer Science, IT, etc.)
Large-scale Data Lake / Data Warehouse (DWH) solutions
Big data pipelines / ETL / APIs (data ingestion, processing, transformation and activation) development, preferably in cloud-based tools / infrastructure
Business Intelligence projects, using BI tools like PowerBI, Tableau, Qlik Sense, Keboola, TIBCO Spotfire
Data Science (Machine Learning) oriented solutions
Understanding of data concepts and patterns (Data Lake, Master Data Management, data quality, lambda architectures, stream processing)
Good business acumen, especially in the sales, finance, e-commerce, manufacturing / logistics, banking or CRM areas
Proficiency in architecture methodologies, diagramming tools (Sparx EA, Confluence) and standards (ArchiMate, TOGAF)
Experience with full software development lifecycle (CI / CD), including development skills using programming languages like Python / Java / Scala / C++ and relational / NoSQL databases
Strong analytical and complex problem-solving skills
Strong technical leadership, mentorship and collaboration
Strong project management and organizational skills
Proven experience with Data Lake / Data Lakehouse implementations in AWS or Azure, using MS Azure (DataBricks, Data Factory, Data Lake, Cosmos DB, Event Hub, PowerBI, etc.) or AWS (Glue, EC2, EMR, RDS, Redshift, SageMaker, etc.) cloud services
Understanding of Data Science (ML) concepts from both business and technical perspectives
Experience with ERP solutions like SAP, NetSuite
Experience with Marketing, E-commerce and CRM solutions like Salesforce (Marketing Cloud, CDP, Commerce, CRM), Adobe (AEM, Analytics) or Google (GA, BigQuery)