Full-Time Data Science AI Platform Engineer
SAIC is hiring a remote Full-Time Data Science AI Platform Engineer. The career level for this opening is Experienced, and applications are accepted from USA-based candidates working remotely. Read the complete job description before applying.
SAIC
Job Details
We are seeking an experienced Data Science & AI Platform Engineer to build and maintain a secure, monitored, cloud-hosted production laboratory. The laboratory will be used by data science teams to create analytics for insights and AI/ML models for deployment in a high-stakes production environment.
Key Responsibilities:
- Develop and maintain a data exploration and feature engineering pipeline to enable data readiness for modeling within a secure and compliant infrastructure.
- Implement and manage Databricks, JupyterHub, and other modern tools for exploratory data analysis, AI design, model development, and training.
- Integrate tools with TCloud Bitbucket for version control and collaboration.
- Configure AWS S3 buckets for Feature Store, ensuring compliance with IRS UNAX rules and separate security contexts for different teams.
- Establish and maintain a model development pipeline using MLflow for experiment tracking and model registry; see the sketch after this list.
- Create AWS Lambda functions and implement AWS Flow for efficient model artifact storage and metadata management.
- Integrate with AWS RDS for robust data management and retrieval.
- Configure SNS notifications for model promotion workflows.
- Support data science teams in adhering to Responsible AI Principles by implementing the Responsible AI Toolbox.
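
As a rough illustration of the model development pipeline described above, the minimal sketch below logs an experiment run to MLflow, registers the resulting model, and publishes an SNS notification for a promotion workflow. This is a sketch only, assuming MLflow and boto3 are available; the tracking URI, experiment name, registered model name, and SNS topic ARN are hypothetical placeholders, and a toy scikit-learn model stands in for real training code.

```python
"""Sketch: track a run in MLflow, register the model, notify the promotion workflow."""

import boto3
import mlflow
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

TRACKING_URI = "https://mlflow.example.internal"  # hypothetical MLflow server
EXPERIMENT_NAME = "feature-store-demo"            # hypothetical experiment
MODEL_NAME = "demo-classifier"                    # hypothetical registry name
SNS_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:model-promotion"  # hypothetical

mlflow.set_tracking_uri(TRACKING_URI)
mlflow.set_experiment(EXPERIMENT_NAME)

# Toy data standing in for features retrieved from the Feature Store.
X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run() as run:
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Experiment tracking: parameters and metrics for this run.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)

    # Model registry: logging with registered_model_name creates a new version.
    mlflow.sklearn.log_model(
        model, artifact_path="model", registered_model_name=MODEL_NAME
    )

# Notify the promotion workflow that a new candidate version is available.
sns = boto3.client("sns")
sns.publish(
    TopicArn=SNS_TOPIC_ARN,
    Subject="New model version registered",
    Message=f"Run {run.info.run_id} registered '{MODEL_NAME}' with accuracy={acc:.3f}.",
)
```

Registering the model via `registered_model_name` creates a new version in the MLflow Model Registry, which an SNS-driven promotion workflow can then review and advance; the Lambda, RDS, and S3 integrations listed above would sit around this core flow.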
Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field and 5 years of experience.
- Must be able to obtain a Public Trust clearance.
- Proven experience with cloud services, particularly AWS (S3, Lambda, RDS, SNS, etc.).
- Expertise in setting up and managing data science environments with tools such as Databricks, JupyterHub, MLflow, and Bitbucket.
- Strong proficiency in programming languages and tools such as SQL, Python, R, and Spark.
- Familiarity with machine learning frameworks and libraries.
- Deep understanding of data security and privacy, especially in compliance with IRS standards.
- Experience with CI/CD pipelines and infrastructure as code.
- Excellent communication skills and the ability to collaborate with cross-functional teams.