Full-Time ETL/ELT Data Integration Engineer
McFadyen Digital is hiring a remote Full-Time ETL/ELT Data Integration Engineer. The career level for this opening is Experienced, and applications are accepted remotely from candidates based in Kochi, India. Please read the complete job description before applying.
Job Details
McFadyen Digital is seeking an experienced ETL Integration Engineer to design, build, and optimize data pipelines for eCommerce marketplace solutions. This role integrates multi-source data (ERP, CRM, PIM, eCommerce) into cloud data warehouses and analytics tools (AWS, Azure, Google Cloud, Snowflake).
You will collaborate with architects, engineers, and analysts to ensure data integrity, security, and efficiency in large-scale commerce environments.
Top 5 Responsibilities:
- Data Pipeline Development & Maintenance
- System & API Integration
- Data Synchronization & Orchestration
- Data Consistency & Integrity Assurance
- Performance Monitoring & Optimization
Additional Responsibilities:
- Develop and optimize ETL/ELT workflows.
- Implement real-time and batch processing pipelines.
- Integrate APIs and vendor systems with TSN data services.
- Support ERP (SAP, Oracle) and 3rd-party application data exchange.
- Implement bi-directional data synchronization.
- Develop workflow automation.
- Enforce data validation, error handling, and logging.
- Ensure data governance and quality compliance.
- Monitor pipelines for performance bottlenecks.
- Improve system scalability and efficiency.
Requirements:
- Proficiency in ETL/ELT development, data integration, pipeline monitoring, APIs, databases, and data transformation tools.
- Strong knowledge of ERP systems (SAP, Oracle) and vendor platforms (e.g., AeroXchange).
- Expertise in handling large datasets while ensuring integrity and consistency.
- Ability to troubleshoot and optimize integration workflows.
- 6+ years of experience with ETL/ELT tools (dbt, Apache Airflow, Informatica, Talend, Fivetran) in cloud or hybrid environments.
- Expertise in SQL and database technologies (relational, NoSQL, and graph databases).
- Experience in data modeling, schema design, and query optimization.
- Strong cloud experience with platforms like AWS Redshift, Snowflake, Google BigQuery, and Azure Synapse.
- Understanding of data security measures such as role-based access control (RBAC), row-level security (RLS), column-level security (CLS), and dynamic data masking (DDM).
- Experience integrating enterprise applications (ERP, CRM, OMS, PIM).
- Knowledge of big data and streaming technologies (Apache Kafka, Spark, Hadoop) is a plus.
- Familiarity with BI and analytics tools (Power BI, Tableau, Looker).
- Understanding of AI/ML-driven data pipelines.
- Experience with data governance, metadata management, and MDM frameworks.
- Excellent problem-solving, communication, and stakeholder management skills.
- Bachelor's or Master's degree in a relevant field.
- Certifications (AWS Certified Data Analytics, Google Professional Data Engineer, Snowflake Data Architect) are a plus.