Full-Time Senior Splunk / Cribl Engineer - Cybersecurity Engineering
AbbVie is hiring a remote Full-Time Senior Splunk / Cribl Engineer - Cybersecurity Engineering. The career level for this opening is Experienced, and applications are accepted from USA-based candidates working remotely. Read the complete job description before applying.
Job Details
This position can be remote anywhere in the U.S.
AbbVie Information Security is looking for a highly motivated and talented individual to join the Cyber Security Engineering (CSE) Team. The CSE team installs, manages, optimizes, and automates the tools in use by the broader Information Security and Risk Management teams.
Data Management services are a foundation of the CSE team's portfolio, including data transformations and pipelining to downstream systems. CSE team members act as subject matter experts and assist with training and development of their user base, which extends beyond Information Security and Risk Management.
This is an expansion of capabilities within the Cyber Security Engineering Team, focusing on data pipelines, data models, and adherence to standards across our datasets. Grow with us as a Data Engineer on the Cyber Security Engineering (CSE) team, raising up our tools and skilled analysts so they can help our business continue to have a remarkable impact on people's lives.
This role is responsible for delivering the value of data management toolsets, including the data pipelines and the SIEM platform. The Data Engineer will assist with data onboarding, normalization/harmonization, pipelining, data modeling, and documentation while striving for automation and quality delivery.
Our team focuses on leveraging CI/CD pipelines for automated builds and deployments across all of our supported toolsets, while implementing a mix of legacy and cloud-native infrastructure and services. The ideal candidate must be comfortable adapting to and learning the terminology, processes, and techniques in use within Information Security teams, and comfortable working in Scrum and Agile/DevOps methodologies.
In this role you will be responsible for:
- Implement and develop data pipelines that feed the SIEM and other analytics engines using existing toolsets
- Create structured data sets from unstructured data (see the sketch after this list)
- Build data models and enhance standard schemas across different technologies
- Normalize/harmonize data across various platforms
- Verify data integrity and translations against multiple systems
- Create and support analytic toolsets outside the SIEM
- Assist in analyzing and defining data requirements and specifications
- Assist in analysis and planning for anticipated changes in data capacity requirements
- Assist in developing and documenting data standards, policies, and procedures
- Perform compilation, cataloging, caching, distribution, and retrieval of data within the SIEM and other platforms
- Analyze data sources to provide actionable recommendations
- Develop standards and automations for metrics aggregation and dissemination
- Manage data lineage across various systems
- Design enhancements, updates, and programming changes for portions and subsystems of data pipelines, repositories, or models for structured/unstructured data
- Analyze designs and determine the coding, programming, and integration activities required based on specific objectives and established project guidelines
- Execute and write portions of testing plans, protocols, and documentation for the assigned portion of an application; identify and debug issues with code and suggest changes or improvements
- Participate as a member of a project team to develop reliable, cost-effective, and high-quality solutions for data systems, models, or components
- Significant Work Activities: Continuous sitting for prolonged periods (more than 2 consecutive hours in an 8-hour day)
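To make the data-onboarding bullets above concrete, here is a minimal, hypothetical Python sketch of extracting fields from an unstructured log line with a regular expression and mapping them onto a common schema before they reach the SIEM. The sample log line, field names, and target schema are illustrative assumptions only, not AbbVie's actual data sources or standards.

```python
import re

# Hypothetical sketch: the sample log line, field names, and target schema are
# assumptions for illustration, not actual AbbVie data sources or standards.
RAW_EVENT = '2024-05-01T12:34:56Z host=web01 user="jdoe" action=login result=failure src=10.0.0.5'

KV_PATTERN = re.compile(r'(\w+)=(?:"([^"]*)"|(\S+))')  # key="quoted value" or key=bare_value
TS_PATTERN = re.compile(r'^(\S+)\s')                    # leading timestamp token

# Source-specific keys mapped onto an assumed common schema for harmonization.
FIELD_MAP = {"src": "src_ip", "user": "user_name", "result": "outcome"}

def parse_event(raw: str) -> dict:
    """Turn one unstructured log line into a structured, normalized record."""
    record = {}
    ts = TS_PATTERN.match(raw)
    if ts:
        record["timestamp"] = ts.group(1)
    for match in KV_PATTERN.finditer(raw):
        key = match.group(1)
        value = match.group(2) if match.group(2) is not None else match.group(3)
        record[FIELD_MAP.get(key, key)] = value
    return record

if __name__ == "__main__":
    print(parse_event(RAW_EVENT))
    # {'timestamp': '2024-05-01T12:34:56Z', 'host': 'web01', 'user_name': 'jdoe',
    #  'action': 'login', 'outcome': 'failure', 'src_ip': '10.0.0.5'}
```

In practice this kind of extraction and field mapping would typically be configured in the pipeline tooling itself (for example Cribl or Splunk props/transforms) rather than in standalone scripts; the sketch only illustrates the pattern.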
Qualifications
Tools and skills you will use in this role:
- Strong team collaboration skills
- Splunk
- Cribl
- Scrum & Agile development
- GitOps
Experiences that make you a strong fit for this role:
Required:
- Bachelor's Degree with 6 years' experience; Master's Degree with 5 years' experience; PhD with 0 years' experience; OR relevant work experience
- Skills in developing data models, dictionaries, and reports within a SIEM platform
- Experience building and configuring data pipelines and architectures
- Experience with regular expressions and parsing unstructured data
- Deep understanding of data administration and data standardization policies
- Knowledge of database management systems, query languages, table relationships, and views
- Experience in validating data sets and calculations (see the sketch after this list)
- Ability to work both independently without direction and within a group for day-to-day activities
- Capable of learning new concepts and processes quickly, and adapting to a constantly changing environment
- Experience with CI/CD Pipelines and Git
- Experience with database & system integration technologies
- Prior experience working with ETL in a SIEM environment (ELK, Splunk, Exabeam, etc)
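Several of the items above (verifying data integrity across systems, validating data sets and calculations) come down to comparing what a source system sent against what landed downstream. Below is a minimal, hypothetical Python sketch of such a check; the record shapes, field names, and comparison approach are assumptions for illustration, not a specific AbbVie system or dataset.

```python
import hashlib
import json

# Hypothetical sketch: record shapes and field names are illustrative assumptions.
def fingerprint(record: dict) -> str:
    """Deterministic hash of a record, used to compare rows across systems."""
    canonical = json.dumps(record, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def validate_transfer(source_rows: list[dict], dest_rows: list[dict]) -> dict:
    """Compare a source extract against what arrived in the downstream system."""
    src_hashes = {fingerprint(r) for r in source_rows}
    dst_hashes = {fingerprint(r) for r in dest_rows}
    return {
        "source_count": len(source_rows),
        "dest_count": len(dest_rows),
        "missing_downstream": len(src_hashes - dst_hashes),
        "unexpected_downstream": len(dst_hashes - src_hashes),
    }

if __name__ == "__main__":
    src = [{"id": 1, "bytes": 100}, {"id": 2, "bytes": 250}]
    dst = [{"id": 1, "bytes": 100}]
    print(validate_transfer(src, dst))
    # {'source_count': 2, 'dest_count': 1, 'missing_downstream': 1, 'unexpected_downstream': 0}
```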
Beneficial:
- Prior experience working in an Agile team
- Familiarity with cybersecurity, privacy principles, cyber threats, and vulnerabilities
- Demonstrated experience in implementing regular expressions
- Experience working with development tools and scripting languages (Python / PowerShell / Go)
- Experience analyzing and pivoting on large sets of data, with the ability to identify patterns, anomalies, and outliers
- Ability to identify basic common coding flaws
- Demonstrated experience in log analysis and parsing of unstructured data (ETL)
- AWS Solutions Architect / Azure Data Engineer Associate / Google Cloud Professional Data Engineer certification