Full-Time Data Engineer
NBCUniversal is hiring a remote Full-Time Data Engineer. The career level for this opening is Experienced, and applications are accepted from USA-based candidates working remotely. Read the complete job description before applying.
Job Details
We are seeking a Data Engineer who is eager to contribute to building the next generation of data pipelines and applications to support our generative AI initiatives. This role is a perfect match for those who are early in their data engineering career, have a strong interest in leveraging generative AI technologies, and take a hands-on approach to coding, building, and cleansing datasets to derive actionable insights.
Responsibilities:
- Engage with business leaders, engineers, and product managers to define and meet data requirements.
- Work closely with technology teams to execute ETL/ELT processes, leveraging cloud-native principles to manage data from diverse sources.
- Participate in the design, construction, and scaling of data pipelines, integrating data from various sources, including internal systems, third-party platforms, and cloud environments.
- Support internal process optimizations by automating workflows, enhancing data delivery, and redesigning infrastructure to boost scalability.
- Apply appropriate design patterns to ensure performance, cost-efficiency, security, scalability, and a positive end-user experience.
- Be actively involved in development sprints, demonstrations, and retrospectives, contributing to the deployment and release processes.
- Cultivate relationships with IT support teams to ensure the smooth deployment of work products.
Qualifications:
- 3+ years of experience in data engineering, demonstrating a foundational understanding of data modeling, ETL/ELT principles, and data warehousing.
- Experience with data management fundamentals, data storage principles, cloud object storage (AWS S3, GCP Cloud Storage, Azure Blob Storage), and cloud-based data warehouses such as GCP BigQuery, Snowflake, or similar platforms.
- Proficiency in building data pipelines using Python/SQL.
- Demonstrated experience with workflow orchestration tools such as Airflow, or a willingness to learn them.
- Experience in applying CI/CD principles and processes to data engineering solutions.
- General understanding of cloud data engineering design patterns and use cases.
Desired Characteristics:
- A Bachelor’s degree in Computer Science, Data Science, Statistics, Informatics, Information Systems, Mathematics, Computer Engineering, or a related quantitative discipline is preferred.
- Effective communication skills, capable of working collaboratively across diverse teams and navigating a large, matrixed organization efficiently.
- Action-oriented: you tackle new problems head-on, regularly deliver results with a positive attitude, and consistently display ethical behavior and integrity, building trust across the organization.