Full-Time Senior Retail Data Engineer
NielsenIQ is hiring a remote Full-Time Senior Retail Data Engineer. The career level for this opening is Manager, and applications are accepted from candidates based in Canada working remotely. Please read the complete job description before applying.
Job Details
As a Senior Data Engineer, you’ll be part of a team of smart, highly skilled engineers who are proud to partner with some of the world’s leading retailers on challenging, cutting-edge, data-driven solutions, all powered by our exceptional technology and people.
Our core technologies currently include Python, Databricks, and Azure, and we continue to adopt best-of-breed modern data stack technologies. Our team is co-located and agile, with a central hub in Toronto.
Responsibilities:
- Lead data onboarding workstreams with new customers, pilots, and POCs.
- Work directly with customers’ data teams on data interfaces, specifications and integration projects.
- Design, implement, and maintain reliable, performant data pipelines and interfaces that feed various analytics applications.
- Design, develop, and maintain ETL processes that ingest data from various streams via multiple consumption methods (e.g., Blob storage, Kafka, SFTP, SQL connectors, REST APIs).
- Design, develop, and implement new and existing data processing solutions, enabling teams to analyze data more efficiently and reach insights faster.
- Create, implement, and maintain data transformation and data validation processes.
- Perform data validation activities, normalize and explore datasets, and investigate and solve data-related problems.
- Support data architecture efforts for key analytics innovations that involve multiple internal and/or external (client) stakeholders.
Qualifications:
- 5+ years of experience as a data engineer working in ETL and data integration.
- 5+ years in a client-facing consulting role, with project management expertise, experience managing customer expectations and deadlines, and excellent communication skills.
- Strong Python and PySpark development experience.
- Strong SQL skills, with experience querying large, complex datasets.
- Databricks, Spark, Airflow, and Scala development experience.
- Experience working with retailer and supplier analytics data.
- Experience with cloud platforms; Azure strongly preferred.
- Experience in data integration design, development, and maintenance, including data lakes/hubs.
- Results-driven, pragmatic, and innovative.
- Experience with developing REST APIs and microservices.
- Degree in Computer Science, Engineering, Information Systems, Statistics, or Math (or an equivalent combination of skills and experience).
- Experience with SingleStore, Snowflake, and BI tools (e.g., Power BI) is a nice-to-have.