Full-Time Staff Architect
Astronomer is hiring a remote Full-Time Staff Architect. The career level for this opening is Expert, and applications are accepted from USA-based candidates working remotely. Please read the complete job description before applying.
Staff Architect
About this role: As a Staff Architect, you will be a key member of our professional services team and work directly with Astronomer’s most important customers, assisting them in a technical leadership capacity with their data ecosystem modernization and DataOps transformation initiatives.
In this role, you will be exposed to a wide variety of data processing use cases, primarily orchestrated by Apache Airflow.
What you get to do:
- Work directly with Astronomer customers, acting as technical lead for high-stakes professional services engagements
- Participate in pre-sales motions where a specialized technical solution is required
- Build architecture, data flow, and operational diagrams and documents
- Provide reference implementations of various activities, including composing data pipelines in Airflow, implementing new Airflow features, or integrating Airflow with third-party solutions (a minimal sketch of such a pipeline follows this list)
- Collaborate to build reusable assets, automation tools, and documented best practices
- Interact with Product and Engineering stakeholders to channel product feedback and requirements discussions
- Work with Global Service Delivery team members to ensure clients are realizing value in their Airflow and Astronomer journeys
- Establish strong relationships with key customer stakeholders
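To illustrate the kind of reference implementation described above, here is a minimal sketch of an Airflow pipeline using the TaskFlow API (assuming Airflow 2.4+). The DAG and task names (example_etl, extract, load) and the stubbed logic are hypothetical and only indicative of the work involved:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[int]:
        # Stand-in for pulling records from a source system.
        return [1, 2, 3]

    @task
    def load(records: list[int]) -> None:
        # Stand-in for writing records to a target system.
        print(f"Loaded {len(records)} records")

    # Wire the tasks: extract feeds load via XCom.
    load(extract())


example_etl()
```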
What you bring to the role:
- Experience with Apache Airflow in production environments
- Experience in designing and implementing ETL, Data Warehousing, and ML/AI/analytics use cases
- Experience with OpenLineage and with data quality and observability tools
- Proficiency in Python and ideally other programming languages
- Knowledge of cloud-native data architecture
- Demonstrated technical leadership on team projects
- Strong oral and written communication skills
- Customer empathy
- Willingness to learn new technologies and build reference implementations
Bonus points if you have:
- 7+ years of experience in data engineering or a similar role
- 2+ years in a customer-facing role
- Consulting experience
- Experience in migrating workflows from legacy schedulers (Control-M, Autosys, Oozie, Cron, etc.) to Apache Airflow
- Snowflake experience
- Databricks or Spark experience
- Kubernetes experience, either on-premises or in the cloud
- Enterprise data experience in regulated environments