Full-Time Senior Software Engineer, Data Platform
Temporal Technologies is hiring a remote Full-Time Senior Software Engineer, Data Platform. The career level for this opening is Experienced, and it is accepting USA-based applicants working remotely. Read the complete job description before applying.
Job Details
Summary
The Senior Software Engineer, Data Platform at Temporal Technologies will play a key role in transforming and analyzing data within the data platform. This position requires an experienced data engineering professional with deep expertise across the entire data stack and the ability to collaborate effectively with stakeholders. Our work centers on providing value to Temporal stakeholders by helping them use data to make business decisions. We stay laser-focused on prioritizing work that contributes to tangible, positive business outcomes.
What You’ll Do
- Build out data pipelines to support ingestion of both event-driven and batch-driven workloads from internal tooling as well as third-party software and APIs
- Harden existing pipelines with data quality checks
- Model data for OLAP purposes
- Create actionable dashboards and reports that drive behavior in our business
- Perform and present analyses (exploratory, causal, predictive, diagnostic)
- Take end-to-end project ownership by facilitating working sessions with engineers and business stakeholders to design and produce data solutions that appropriately meet business needs
- Monitor operations to ensure optimal performance of data ingest, processing, and querying
- Contribute to the data platform roadmap by evaluating opportunities and recommending projects tied to delivering business needs
- Train and enable stakeholders on new data products as they are released to ensure optimal utilization and outcomes
What You'll Bring
- Experience across the data stack (ingest, storage, compute, data modeling, visualization, analysis)
- Strong interest in the data and what it represents to the business
- Experience owning key components of an SLA-bearing production data platform that relies on distributed storage and compute components
- Experience building on data lake architectures
- Expertise within at least one of the major cloud providers (AWS, GCP, Azure)
- Experience working with a wide range of data sources (APIs, logs, event stores, etc.)
- Strong proficiency in Python and SQL
- Experience with multiple data processing and query engines (Spark, Presto/Trino, Athena, BigQuery, etc.)
- Significant experience with both object stores (e.g., S3) and relational databases (e.g., Redshift)
- Ability to quickly gain proficiency in new tools and technologies
- Strong desire to continue to learn and experiment
- Strong communicator and collaborator with the ability to connect technical projects to business impact