Full-Time Data and Integration Engineer
Attain is hiring a remote Full-Time Data and Integration Engineer. The career level for this opening is Entry Level, and the role is open to Bengaluru, India-based applicants working remotely. Read the complete job description before applying.
Job Details
- Assist in configuring and maintaining integration workflows using middleware tools to support data exchange between Tango's API architecture and third-party systems.
- Support ETL processes, including data extraction, transformation, and loading.
- Participate in testing and validating integrations, ensuring data quality, asynchronous processing, and polling-based synchronization meet client requirements.
- Collaborate on low-touch implementations by leveraging standard API endpoints and flat file transfers, deploying standard integrations, and providing daily operational support.
- Provide Level 3 production support for integration-related issues, including root-cause analysis and remediation within defined SLAs.
- Contribute to documentation updates for integration playbooks, Swagger files, user guides, test procedures, performance specifications, and product manuals.
- Assist in estimating Level of Effort (LOE) for custom integrations during SOW/CO development and client engagements; prepare data templates based on Source-to-Target Mapping (STM) documents.
- Perform data transformations, including merging, ordering, aggregation, and resolving data cleansing/quality issues using various data processing tools.
- Run lookup queries, update lookups, execute data quality scripts, format and validate data quality reports, and run scripts against data in staging databases while updating data load checklists.
- Conduct internal smoke testing of loaded data in the Tango application and prepare, add, or remove columns in sample business data validation trackers using the STM.
- Integrate information from multiple data sources, solving common transformation problems.
- Engage in agile iterations to refine transformation routines and business rules, prioritizing critical-path data elements while applying business-domain context to clean and transform data for analysis.
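For candidates unfamiliar with the day-to-day work described above, the merging, lookup, and aggregation duties can be sketched with a minimal standard-library example. All identifiers, feeds, and values here are hypothetical illustrations, not Tango's actual data model:

```python
# Hypothetical illustration: merge a flat-file extract with API-style
# records, cleanse a minor data-quality issue, and aggregate the result.
import csv
import io
from collections import defaultdict

# Two hypothetical sources: a CSV flat file and an API-style record set.
flat_file = io.StringIO("site_id,sq_ft\nS1,1200\nS2, 800\nS1,300\n")
api_records = [{"site_id": "S1", "region": "South"},
               {"site_id": "S2", "region": "North"}]

# Build a lookup table from the API records.
region_by_site = {r["site_id"]: r["region"] for r in api_records}

totals = defaultdict(int)
for row in csv.DictReader(flat_file):
    sq_ft = int(row["sq_ft"].strip())                       # cleanse stray whitespace
    region = region_by_site.get(row["site_id"], "UNKNOWN")  # merge via lookup
    totals[region] += sq_ft                                 # aggregate by region

print(dict(totals))  # {'South': 1500, 'North': 800}
```

Real engagements would use an ETL tool such as Kettle (Pentaho) or SQL for this, but the shape of the work (extract, look up, cleanse, aggregate) is the same.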
Required Skills:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or related field.
- 0-2 years of experience in software development, integrations, or data wrangling.
- 2+ years of SQL proficiency.
- Proficiency in JavaScript.
- 2+ years of experience with Kettle (Pentaho) or similar ETL/data processing tools.
- Hands-on experience with any ETL or reporting tool.
- Basic knowledge of RESTful APIs and data formats (e.g., JSON, CSV).