Full-Time Data Architect
Derex Technologies Inc is hiring a remote Full-Time Data Architect. The career level for this opening is Experienced, and Erie, PA-based applicants are accepted remotely. Read the complete job description before applying.
Job Details
Role: Data Architect
Location: Remote
Duration: Long term
About the Role: We are seeking an experienced Data Architect to lead and design modern data solutions for a Property & Casualty (P&C) customer undergoing a major data modernization initiative involving Guidewire Claim Data Access (CDA). The ideal candidate will possess strong technical expertise, hands-on experience, and excellent communication skills to successfully deliver enterprise-grade data solutions in Azure/Informatica.
This role requires a proactive problem solver who can troubleshoot and optimize complex data pipelines and workflows for maximum efficiency and reliability.
Key Responsibilities:
- Architect and implement enterprise metadata-driven data pipelines using ETL tools like Azure Data Factory (ADF) and Informatica.
- Design and develop an Operational Data Store (ODS) sourced from Azure Data Lake, ensuring robust, scalable, and high-performing architecture.
- Collaborate with stakeholders to integrate and optimize Guidewire Data (CDA) into the data lake architecture, enabling advanced analytics and reporting.
- Troubleshoot and resolve issues in data pipelines, workflows, and related processes to ensure reliability and data accuracy.
- Continuously monitor and optimize current workflows for performance, scalability, and cost-efficiency, adhering to best practices.
- Develop and maintain custom processes using Python, T-SQL, and Spark, tailored to business requirements.
- Leverage Azure Functions to design serverless compute solutions for event-driven and scheduled data workflows.
- Optimize data workflows and resource usage to ensure cost-efficiency in Azure Cloud environments.
- Provide leadership and guidance for implementing Hadoop-based big data solutions where applicable.
- Develop a comprehensive understanding of P&C domain data, ensuring alignment with business objectives and compliance requirements.
- Communicate technical solutions effectively with cross-functional teams, stakeholders, and non-technical audiences.
Required Qualifications:
- 13+ years of experience in data architecture, data engineering, and/or ETL development roles, including at least 3 years in the P&C insurance domain.
- Proven experience with Azure Cloud Services, including Azure Data Lake, Azure Data Factory, and SQL Server.
- Hands-on experience with Informatica for robust ETL workflows, data integration, and metadata-driven pipeline automation to streamline data processing.
- Proven ability to build end-to-end metadata-driven frameworks and to continuously optimize existing workflows for improved performance, scalability, and efficiency.
- Strong knowledge of Guidewire Claim Data Access (CDA) or similar insurance domain data.
- Expertise in troubleshooting and optimizing data pipelines and workflows for enhanced reliability and performance.
- Proficiency in scripting and programming with Python, T-SQL, and Spark for custom data workflows.
- Hands-on expertise in building and managing ODS systems from data lakes.
- Experience with Azure Functions for serverless architecture.
- Familiarity with Hadoop ecosystems (preferred but not mandatory).
- Demonstrated ability to design solutions for Azure Cloud Cost Optimization.
- Excellent communication skills to engage with technical and business stakeholders effectively.
- Experience with metadata management and data cataloging for large-scale data ecosystems.
Preferred Skills:
- Familiarity with Guidewire systems and their integration patterns.
- Experience in implementing Data Governance frameworks.
- Certification in Azure (e.g., Azure Data Engineer Associate or Azure Solutions Architect).
- Experience with other data platforms/tools such as Hadoop, Databricks, etc.