Full-Time Cloud Data Architect - AWS
DoiT International is hiring a remote Full-Time Cloud Data Architect - AWS. The career level for this opening is Experienced, and applications are accepted remotely from candidates based in Colombia. Read the complete job description before applying.
Cloud Data Architect - AWS Data & Analytics
Location: Colombia (remote). Our Cloud Data Architect will be an integral part of our Cloud Reliability Engineering team.
Who We Are
DoiT is a global technology company. We work with cloud-driven organizations to leverage the cloud for business growth and innovation, combining data, technology, and human expertise to ensure our customers operate in a well-architected, scalable state from planning to production. Delivering DoiT Cloud Intelligence, we help our customers solve complex multicloud problems and drive efficiency.
The Opportunity
As a Cloud Data Architect, you will be part of our global CRE team, working with rapidly growing companies around the world. This role offers you the chance to:
- Apply your hands-on experience and skills in a consultative manner to address our customers' strategic and tactical needs around cloud technologies
- Grow your technical and interpersonal skills by addressing customer challenges and leveraging dedicated learning time
- Strengthen your personal brand through thought leadership activities
Qualifications
- Good verbal and written communication skills
- Expertise in architecting, developing, and troubleshooting large production-grade distributed systems on AWS, and in selecting the appropriate tools
- Experience working with cloud data storage, data warehousing, and/or data lake architectures, ELT/ETL, and reporting/analytics frameworks and technologies such as Redshift, Athena, Glue, and EMR
- Experience designing highly available systems that serve transactional, web-scale, low-latency traffic using both RDBMS and NoSQL technologies, such as AWS RDS and DynamoDB
- Programming background with shell scripts and at least one of JavaScript, Java, Python, Go, or Rust
- Familiarity with debugging, refactoring, and optimizing code, and knowing when and how to automate tasks
- Experience building data pipelines and wrangling data; you enjoy optimizing data systems and building them from the ground up
- Hands-on experience deploying production data pipelines with orchestration tools such as AWS MWAA/Airflow (see the sketch after this list)
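For illustration only, here is a minimal sketch of the kind of orchestrated pipeline this role deploys. It assumes Airflow 2.4+ (as on recent MWAA environments); every name in it (example_elt_pipeline, extract_orders, load_warehouse) is a hypothetical placeholder, not part of DoiT's or any customer's actual stack.

```python
# Minimal Airflow DAG sketch (assumes Airflow 2.4+, as on recent MWAA).
# All names (example_elt_pipeline, extract_orders, load_warehouse) are
# hypothetical placeholders used purely for illustration.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders() -> None:
    # Hypothetical extract step, e.g. landing raw source data in S3.
    pass


def load_warehouse() -> None:
    # Hypothetical load step, e.g. copying staged files into Redshift.
    pass


with DAG(
    dag_id="example_elt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` replaced `schedule_interval` in Airflow 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_warehouse)
    extract >> load  # load runs only after extract succeeds
```

The `extract >> load` line is Airflow's dependency operator: it encodes the ordering the scheduler enforces, which is the core of the pipeline-orchestration work this qualification describes.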
Bonus Points
- Experience with AI/ML technologies like SageMaker
- Experience with QuickSight or other BI/visualization tools
- Openness to learning and working with GCP data products