Full-Time Senior Palantir Developer
Logic20/20 Inc. is hiring a remote, full-time Senior Palantir Developer. The career level for this opening is Senior Manager, and applications are accepted remotely from candidates based in Seattle, WA. Please read the complete job description before applying.
Job Details
As a Data Engineer joining Logic20/20’s Advanced Analytics practice, you’ll help clients scale their data solutions and make data-driven decisions while driving enterprise-level innovation.
You’ll work closely with the client to understand their business processes and analytics needs to design and build data pipelines and cloud data solutions.
You will have the opportunity to guide your client through best practices in data lake, data processing, and data pipeline design to help them achieve their business goals.
You will work closely with your team, including analysts, dashboard developers, and technical project managers, to design and deliver a world-class solution.
You’ll strike a balance of technical skills and business acumen to help clients better understand their core needs and what is possible for a future state.
- Design and develop cloud ELT and data pipelines (a minimal PySpark sketch follows this list)
- Develop CI/CD pipelines and other DataOps fundamentals
- Communicate at the appropriate level to both business and technology teams to understand business needs and pain points
- Be creative in meeting the client’s core needs with their technology
- Explain technical benefits and deficits to non-technical audiences
- Learn new data tools and best practices
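To make the first responsibility concrete, here is a minimal sketch of a cloud ELT job in PySpark. The bucket paths, dataset, and column names (raw_orders, order_ts, amount, order_id) are hypothetical placeholders for illustration, not part of the posting.

```python
# A minimal ELT sketch: extract raw orders, cleanse them, and load daily revenue.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue_elt").getOrCreate()

# Extract: load the raw source dataset (path is an assumption)
raw = spark.read.parquet("s3://example-bucket/raw_orders/")

# Transform: drop null/invalid records, then aggregate to a daily grain
daily_revenue = (
    raw.filter(F.col("amount").isNotNull() & (F.col("amount") > 0))
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("total_revenue"),
            F.countDistinct("order_id").alias("order_count"))
)

# Load: write the curated output, partitioned for downstream consumers
daily_revenue.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3://example-bucket/curated/daily_revenue/")
```

The extract/transform/load split here mirrors the pipeline-design work described in the responsibilities above.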
Required:
- 5+ years of data engineering experience in a production environment
- 5+ years of experience with PySpark and/or Python
- 2+ years of experience working within Palantir Foundry or Foundry Pipeline Builder (see the Foundry transform sketch after this list)
- 2+ years of experience working with Palantir Foundry Streams or a comparable real-time data streaming technology, e.g., Apache Kafka, AWS Kinesis, Google Pub/Sub (see the streaming consumer sketch after this list)
- Experience designing and developing cloud ELT and data pipelines with technologies such as Python, SQL, Airflow, Talend, Matillion, dbt, and Fivetran
- Experience working with the business to identify the appropriate data model (relational, tabular, transactional) for their data solution
- Deep experience designing and building performant ELT jobs that move and transform data from various source types, as well as performing exploratory data analysis, data cleansing, and aggregation
- Experience applying scaling and automation to data preparation techniques
- Experience with developing and operating CI/CD pipelines and other DataOps fundamentals
- Experience developing client-facing, core design documents: data flows, source-to-target mappings, requirements, data lineage, and data dictionaries
- Understanding of data modeling approaches (such as Kimball, Inmon, and Data Vault)
- An excellent foundation in consulting skills: analytical thinking, written and verbal communication, and presentation
- Demonstrated ability to identify business and technical impacts of user requirements and incorporate them into the project schedule
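For candidates new to Foundry, the following sketch shows the general shape of a Python transform built on Foundry's transforms API. The dataset paths and column names are hypothetical, and the exact API surface available in a given Foundry environment may differ.

```python
from transforms.api import transform_df, Input, Output
from pyspark.sql import functions as F

# Dataset paths and column names below are hypothetical placeholders.
@transform_df(
    Output("/Company/analytics/clean_orders"),
    raw_orders=Input("/Company/raw/orders"),
)
def clean_orders(raw_orders):
    # Cleanse and standardize the raw dataset before it feeds downstream pipelines
    return (
        raw_orders
        .dropDuplicates(["order_id"])
        .filter(F.col("amount").isNotNull())
        .withColumn("order_date", F.to_date("order_ts"))
    )
```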
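As a rough illustration of the streaming requirement, here is a minimal consumer loop using confluent-kafka, one of the comparable technologies named above; the broker address, topic, and group id are hypothetical.

```python
from confluent_kafka import Consumer

# Broker address, topic, and group id are hypothetical placeholders.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-etl",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for a record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # In a real pipeline this record would be validated and landed in the lake
        print(msg.value().decode("utf-8"))
finally:
    consumer.close()
```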