Full-Time Lead Data Engineer
Logic20/20 Inc. is hiring a remote full-time Lead Data Engineer. The career level for this opening is Manager, and Seattle, WA-based applicants are accepted to work remotely. Read the complete job description before applying.
Job Details
Lead Data Engineer
As a Lead Data Engineer joining Logic20/20’s Advanced Analytics practice, you’ll help clients scale their data solutions and make data-driven decisions while driving enterprise-level innovation. You’ll work closely with clients to understand their business processes and analytics needs, then design and build data pipelines and cloud data solutions.
You will guide your client through best practices in data lake, data processing, and data pipeline design to help them achieve their business goals.
You will work closely with your team, including analysts, architects, and technical project managers, to design and deliver world-class solutions.
You’ll strike a balance of technical skills and business acumen to help clients better understand their core needs and what is possible for a future state.
What you’ll do:
- Design and develop cloud ETL and data pipelines
- Develop CI/CD pipelines and other DataOps fundamentals
- Communicate effectively to both business and technology teams to understand business needs and pain points
- Find creative ways to meet the client’s core needs with their existing technology
- Explain technical benefits and deficits to non-technical audiences
- Learn new data tools and best practices
Required:
- 8+ years of cloud data engineering experience
- 2+ years of formal leadership experience
- Experience designing and developing cloud ETL and data pipelines with various technologies such as Python, SQL, Airflow, Talend, Matillion, DBT, Fivetran
- Experience with AWS: Glue, S3, Athena
- Experience working with the business to understand the appropriate data model (relational, tabular, transactional) for their data solution
- Deep experience designing and building performant ETL jobs to move and transform data from various source types, as well as performing exploratory data analysis, data cleansing, and aggregation
- Experience applying scaling and automation to data preparation techniques
- Experience with developing and operating CI/CD pipelines and other DataOps fundamentals
- Experience developing client-facing core design documents: data flows, source-to-target mappings, requirements, data lineage, and data dictionaries
- Understanding of data modeling (such as Kimball, Inmon, and Data Vault design approaches)
- A strong foundation of consulting skills: analytical thinking, written and verbal communication, and presentation skills
- Demonstrated ability to identify business and technical impacts of user requirements and incorporate them into the project schedule
Preferred:
- An undergraduate degree in technology or business
- Microsoft, AWS, or Snowflake certifications (Azure Fundamentals, Azure Data Engineer Associate, MCSA [any], Power Platform, AWS Data Analytics Specialty, AWS ML, SnowPro Core, etc.)
- Experience with Big Data Technologies (Hadoop, PySpark, MongoDB)
- Experience with Power BI
- Experience and certifications with Agile, Scrum, and/or SAFe
Salary Range: $141,000 - $175,000