Contractor Cloud (AWS) / Databricks Technical Architect

Kaizen Analytix is hiring a remote Cloud (AWS) / Databricks Technical Architect on a contract basis. The career level for this opening is Expert, and it accepts USA-based applicants working remotely. Please read the complete job description before applying.

Kaizen Analytix

Job Title

Cloud (AWS) / Databricks Technical Architect

Job Type

Contractor

Career Level

Expert

Locations Accepted

USA

Job Details

Job Summary: We are seeking an experienced and dynamic Cloud (AWS) / Databricks Technical Architect to join our team. The ideal candidate will have extensive expertise in designing, building, and deploying cloud-based solutions using AWS and Databricks, with a focus on data engineering, analytics, and machine learning. In this role, you will be responsible for driving the architecture and implementation of scalable, secure, and efficient cloud-based data solutions that support the organization's data-driven initiatives.

Key Responsibilities:

  • Lead the design and implementation of cloud-based data architectures on AWS, utilizing a wide range of AWS services (e.g., S3, EC2, Lambda, RDS, Redshift, Athena, Glue).
  • Architect and deploy scalable, secure, and cost-effective Databricks environments to process large volumes of data for analytics, data engineering, and machine learning.
  • Provide leadership in designing modern data architectures, including real-time data pipelines, ETL/ELT workflows, and big data processing systems using Databricks and AWS technologies.
  • Define and implement best practices for managing and optimizing data lakes, data warehouses, and data pipelines.
  • Ensure architecture decisions align with business requirements, security policies, and compliance standards.

Requirements

  • AWS Certification: Maintain and leverage AWS certification to design and implement cloud solutions.
  • Cloud Architecture: Design and optimize cloud architecture for scalability and efficiency.
  • Containers & Orchestration: Implement containers and orchestration tools for streamlined application deployment.
  • Microservices Architecture: Design and manage microservices architecture for flexible and scalable systems.
  • Cloud Environment Setup & Configuration: Set up and configure cloud environments to meet project requirements.
  • Security & Access Management: Ensure secure access management and compliance within cloud environments.
  • SQL, Python, Visualization & Analytical Tools: Use SQL, Python, and analytical tools for data processing and visualization.
  • API Development & Management: Develop and manage APIs for seamless data integration and functionality.

Education & Experience:

  • Bachelor's or Master's degree in Computer Science, Information Technology, Data Engineering, or a related field.
  • 8+ years of experience in cloud architecture, data engineering, or a similar technical role, with at least 5 years of hands-on experience with AWS and Databricks.
  • Proven track record of architecting and deploying large-scale data engineering solutions on AWS and Databricks.
  • Experience with data processing frameworks (e.g., Apache Spark, Apache Kafka, Airflow) and cloud-based data storage solutions.

Technical Skills & Competencies:

  • Deep expertise in AWS services, including but not limited to S3, EC2, Lambda, Glue, Redshift, Athena, and RDS.
  • Strong experience with Databricks, including notebook creation, Spark-based processing, and cluster management.
  • Expertise in data engineering concepts, including ETL/ELT, data lakes, data pipelines, and real-time streaming architectures.
  • Proficiency in programming languages such as Python, Scala, SQL, or Java for data processing and solution development.
  • Experience with DevOps practices, CI/CD pipelines, containerization (e.g., Docker, Kubernetes), and infrastructure as code (e.g., Terraform, CloudFormation).
  • Familiarity with machine learning workflows and tools, particularly those that integrate with Databricks (e.g., MLflow, Spark MLlib).
  • Strong understanding of cloud security best practices, including IAM, encryption, and network security.

FAQs

What is the last date for applying to the job?

The deadline to apply for the Contractor Cloud (AWS) / Databricks Technical Architect role at Kaizen Analytix is 28 January 2025. We consider jobs older than one month to have expired.

Which countries are accepted for this remote job?

This job accepts applicants based in the USA.
