Full-Time Analytics Engineer
Acquia is hiring a remote Full-Time Analytics Engineer. The career level for this opening is Expert, and applications are accepted from India-based candidates working remotely. Read the complete job description before applying.
Job Details
About the role…
As a Senior Data Engineer with expertise in building Enterprise Data Platforms (EDP), you will play a pivotal role in creating a robust and scalable data infrastructure. You will be responsible for designing, building, and maintaining data pipelines that will power various analytics and AI-driven products across the company.
Job Responsibilities:
- Data Pipeline Design & Development: Lead the design, development, and optimization of data pipelines that enable the collection, transformation, and storage of data from multiple sources, including OLTP databases, operational applications (Salesforce, Jira), event collectors, logs, third-party APIs, and file storage (e.g., S3).
- Ingestion & Integration: Build and manage data ingestion processes using tools like Meltano and orchestrate workflows with Dagster, ensuring scalability and real-time data processing.
- Data Storage & Management: Implement data storage solutions leveraging Snowflake for data warehousing and S3 for data lakes, optimizing for cost, performance, and security.
- Data Transformation: Work with the data transformation layer using dbt and ensure seamless integration with orchestration layers to maintain real-time and batch processing efficiency.
- Data Governance: Establish data governance frameworks and data catalogs (e.g., DataHub) to ensure data quality, discoverability, and security across the platform.
- Collaboration: Collaborate with stakeholders across the organization, including data scientists, product managers, and business analysts, to translate business needs into technical solutions.
- Real-Time Analytics: Implement real-time analytics solutions by building and maintaining a vector database and real-time analytics database to support machine learning models and predictive analytics.
- Semantic Layer & AI Integration: Design and implement a semantic layer (e.g., cube.js) to facilitate user-facing analytics while integrating AI services and vector databases for advanced analytics.
Skills:
- 5+ years of experience in data engineering or data platform development.
- Experience with data engineering tools and frameworks (e.g., Meltano, dbt, Dagster).
- Strong hands-on experience with data lakes (e.g., S3) and data warehouses, specifically Snowflake.
- Strong hands-on experience with cloud platforms (AWS, GCP, or Azure).
- Proficiency in SQL, Python, and ETL/ELT design patterns.
- Understanding of real-time analytics and vector databases.
- Strong knowledge of data governance practices and cataloging tools like DataHub.
- Experience working with cross-functional teams and translating business needs into technical requirements.
- Proven expertise in building enterprise-level data platforms.
- Experience with real-time data processing and analytics tools.
Preferred Qualifications:
- Excellent communication skills with the ability to explain technical concepts to non-technical stakeholders.
- Familiarity with dashboarding and reporting tools like Domo.
- Experience with AI/ML-driven platforms and integrating AI services for advanced analytics.