Full-Time Senior Data Engineer
Casebook PBC is hiring a remote Full-Time Senior Data Engineer. The career level for this opening is Experienced, and applicants based in Raleigh, NC are accepted remotely. Read the complete job description before applying.
Job Details
About the Engineering Team: At Casebook, our Engineering team drives innovation and efficiency within our tech infrastructure. They are experts in development, operations, and automation, ensuring seamless deployment pipelines and championing best practices in infrastructure as code. They collaborate closely with cross-functional teams, fostering a culture of shared responsibility. They focus on optimization and remain at the forefront of emerging technologies.
Job Summary: We seek a Senior Data Engineer to lead PostgreSQL and Redshift maintenance, updates, and optimization for a large, live, cloud-native microservices system. Responsibilities include data modeling, designing data pipelines for reporting and analytics, data quality initiatives, data analysis, and contributing to product and human service outcomes. The role also involves serving as a data domain expert in client engagements.
Responsibilities:
- Conceptualize and guide data processing, warehouse, and analysis projects.
- Design and oversee database models, schemas, data processing scripts, and data validation.
- Implement data governance policies that meet client needs.
- Develop a data management framework ensuring users can find information quickly and precisely.
- Ensure data quality in Casebook Platform deployments and guide broader data integrity and quality efforts.
- Collaborate with data engineers, developers, and product teams to ensure understanding of application behavior and data requirements.
- Lead the building of automated data transformation frameworks for legacy data integration.
- Partner with system integration and client data engineering teams.
- Consult with engineers on data models to support functionality additions.
- Develop and track measures for feature effectiveness, usability, and data quality.
- Evaluate and recommend data products, tools, and services for variable reporting needs.
- Review and approve production data fixes.
- Model data growth to meet performance standards and lead strategy for current and future reporting performance.
- Lead planning and strategy for customer access to data warehouse datasets and reports.
Skills and Requirements:
- Strong ability to analyze highly normalized, distributed microservices data and deliver denormalized warehouse solutions optimized for reporting.
- Experience working with technical and operational management and with process owners.
- Ability to balance short and long-term goals, communicating trade-offs clearly.
- Strong experience translating business requirements into data models (conceptual, logical, physical).
- Extensive ETL design and implementation experience with multiple databases.
- Expertise in metadata definition, implementation, and maintenance.
- Ability to organize and prioritize multiple assignments.
- Initiative, judgment, and ability to work under pressure to deliver accurate, timely, and professional results.
- Strong presentation, interpersonal, and communication skills.
- Understanding of system development lifecycle, project management, and requirements/design/test techniques.
- Experience with PostgreSQL and related technologies is desired.
- Experience with cloud services (AWS preferred).
- Experience with distributed platforms and information management systems.
- Data analysis using Python, R, or Scala.
- Expertise in Business Intelligence tooling and solutions.
- Bachelor's degree in computer science, information management, or related field, or equivalent experience.