Full-Time System Architect
Addepto is hiring a remote Full-Time System Architect at the Experienced career level, accepting applicants based in Poland. Read the complete job description before applying.
Addepto is a leading consulting and technology company specializing in AI and Big Data, helping clients deliver innovative data projects. We partner with top-tier global enterprises and pioneering startups, including Rolls Royce, Continental, Porsche, ABB, and WGU. Our exclusive focus on AI and Big Data has earned us recognition by Forbes as one of the top 10 AI companies.
As a System Architect, you will have the exciting opportunity to work with a team of technology experts on challenging projects across various industries, leveraging cutting-edge technologies.
Here are some of the projects we are seeking talented individuals to join:
- Design and development of a platform for managing vehicle data for a global automotive company. This project builds a shared platform for processing massive car data streams: it ingests terabytes of data daily, using both streaming and batch pipelines for near real-time insights. The platform transforms raw data for data analysis and Machine Learning, empowering teams to build real-world applications such as digital support and smart infotainment, and unlocking data-driven solutions for car maintenance and anomaly detection across the organization.
- Design and development of a universal data platform for global aerospace companies. This Azure- and Databricks-powered initiative combines diverse enterprise and public data sources. The platform is in the early stages of development, covering the design of architecture and processes, with freedom in technology selection.
Your main responsibilities:
- Design and develop scalable data management architectures, infrastructure, and platform solutions for streaming and batch processing using Big Data technologies such as Apache Spark, Hadoop, and Iceberg.
- Design and implement data management and data governance processes and best practices.
- Contribute to the development of CI/CD and MLOps processes.
- Develop applications to aggregate, process, and analyze data from diverse sources.
- Collaborate with the Data Science team on data analysis and Machine Learning projects, including text/image analysis and predictive model building.
- Develop and organize data transformations using DBT and Apache Airflow.
- Translate business requirements into technical solutions and ensure optimal performance and quality.
What you'll need to succeed in this role:
- 5+ years of proven commercial experience in implementing, developing, or maintaining Big Data systems.
- Strong programming skills in Python or Java/Scala: writing clean code and applying OOP design.
- Experience in designing and implementing data governance and data management processes.
- Familiarity with Big Data technologies such as Spark, Cloudera, Airflow, NiFi, Docker, Kubernetes, Iceberg, Trino, or Hudi.
- Proven expertise in implementing and deploying solutions in cloud environments (with a preference for AWS).
- Excellent understanding of dimensional data and data modeling techniques.
- Excellent communication skills and consulting experience with direct interaction with clients.
- Ability to work independently and take ownership of project deliverables.
- Master’s or Ph.D. in Computer Science, Data Science, Mathematics, Physics, or a related field.
- Fluent English (C1 level) is a must.