Full-Time AI/ML Engineer
Burq is hiring a remote, full-time AI/ML Engineer. This is an experienced-level role open to applicants based in the USA.
About Burq
Burq started with an ambitious mission: turn the complex process of offering delivery into a simple, turnkey solution.
We started by building the largest network of delivery networks, partnering with some of the biggest delivery companies. We then made it extremely easy for businesses to plug into our network and start offering delivery to their customers. Now we power deliveries for some of the fastest-growing companies, from retailers to startups.
It’s a big mission and now we want you to join us to make it even bigger! 🚀
We’re already backed by some of the Valley's leading venture capitalists, including Village Global, the fund whose investors include Bill Gates, Jeff Bezos, Mark Zuckerberg, Reid Hoffman, and Sara Blakely. We have assembled a world-class team all over the U.S.
We operate at scale, but we're still a small team relative to the opportunity. We have a staggering amount of work ahead. That means you have an unprecedented opportunity to grow while doing the most important work of your career.
We want people who are unafraid to be wrong and who support decisions with numbers and narrative. Here’s a quick overview of what you will be doing:
AI/ML Engineer
As an AI/ML Engineer at Burq, you will play a crucial role in designing, developing, and deploying machine learning models and data pipelines. You will take full ownership of projects, from problem definition through to production-grade delivery, which requires strong initiative and self-direction. You will work closely with data engineers, data scientists, and business stakeholders to build scalable solutions that drive impactful business decisions. Your expertise across a range of data and machine learning tools will be essential in managing our data infrastructure and delivering high-quality AI/ML projects.
Basic Qualifications (Required Skills/Experience)
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field
- Proficiency in SQL and Python
- Experience with data integration tools like Airbyte
- Strong knowledge of data warehousing concepts and hands-on experience with Snowflake
- Experience with data transformation tools such as dbt
- Proficiency in using Databricks and Apache Spark for big data processing and machine learning
- Familiarity with Delta Lake for data lake management
- Experience with data visualization tools like Tableau
- Experience with relational databases like MySQL and PostgreSQL
- Strong analytical and problem-solving skills
- Excellent communication and teamwork abilities
- Ability to work in a fast-paced and dynamic environment
Preferred Qualifications:
- A strong individual contributor mindset, demonstrating high autonomy
- Knowledge of machine learning libraries and frameworks such as TensorFlow, PyTorch, or Scikit-Learn
- Experience with cloud platforms such as AWS, GCP, or Azure
- Relevant certifications in data engineering, machine learning, or cloud technologies
Responsibilities:
- Design, develop, and deploy machine learning models using Databricks and Apache Spark (a sketch follows this list)
- Implement data preprocessing, feature engineering, and model training pipelines
- Utilize dbt (data build tool) to transform and model data in Snowflake, preparing datasets for machine learning
- Use SQL and Python to analyze large datasets, derive meaningful insights, and build training datasets
- Conduct exploratory data analysis to identify trends, patterns, and anomalies
- Develop and maintain ETL pipelines using Airbyte to ingest data from various sources into Snowflake
- Manage and optimize data storage and retrieval using Delta Lake on Databricks to ensure efficient access for ML models
- Create and maintain interactive dashboards and visualizations in Tableau to communicate model results and insights to stakeholders
- Collaborate with data scientists to refine and improve machine learning models
- Monitor and evaluate the performance of deployed models, ensuring they meet accuracy and performance standards
- Work with relational databases such as MySQL and PostgreSQL for data storage and management as needed
- Communicate complex technical concepts and results to non-technical stakeholders effectively
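To give a concrete flavor of the Databricks work in the first two bullets, here is a minimal, illustrative sketch of a Spark ML training pipeline that reads a Delta feature table, runs basic feature engineering and model training, evaluates on a held-out split, and writes scored rows back to Delta. The table name `features.orders`, the label `delivered_on_time`, the input columns, and the output path are all hypothetical placeholders, not a description of Burq's actual stack.

```python
# A minimal sketch of a Spark ML training pipeline on Databricks/Delta Lake.
# Assumptions (hypothetical): a Delta table `features.orders` with a binary
# 0/1 label column `delivered_on_time` and columns `provider`, `distance_km`,
# `pickup_hour`, and `order_id`.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import StringIndexer, VectorAssembler
from pyspark.ml.classification import GBTClassifier
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("delivery-model-training").getOrCreate()

# Load the prepared feature table (managed as a Delta table on Databricks).
df = spark.read.table("features.orders")

train, test = df.randomSplit([0.8, 0.2], seed=42)

# Feature engineering: index a categorical column, then assemble the feature vector.
indexer = StringIndexer(
    inputCol="provider", outputCol="provider_idx", handleInvalid="keep"
)
assembler = VectorAssembler(
    inputCols=["provider_idx", "distance_km", "pickup_hour"],
    outputCol="features",
)
classifier = GBTClassifier(labelCol="delivered_on_time", featuresCol="features")

pipeline = Pipeline(stages=[indexer, assembler, classifier])
fitted = pipeline.fit(train)

# Evaluate on the held-out split (area under the ROC curve).
auc = BinaryClassificationEvaluator(
    labelCol="delivered_on_time", metricName="areaUnderROC"
).evaluate(fitted.transform(test))
print(f"test AUC = {auc:.3f}")

# Persist scored rows back to Delta for downstream dashboards (path illustrative).
(
    fitted.transform(test)
    .select("order_id", "prediction")
    .write.format("delta")
    .mode("overwrite")
    .save("/mnt/delta/predictions/orders")
)
```

In practice, the feature table would come out of the dbt-prepared Snowflake models and Airbyte-fed pipelines described above, and the scored output would feed the Tableau dashboards; the specific names and paths here are purely for illustration.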
We make it easy for businesses to offer delivery! 🚀