Full-Time Data Engineer
Boost-IT is hiring a remote, full-time Data Engineer. The career level for this opening is Expert, and applications are accepted remotely from Portugal-based candidates. Read the complete job description before applying.
Job Details
Tasks
As a Data Engineer specializing in Google Cloud Platform (GCP), you will play a key role in designing, building, and maintaining our cloud-based data solutions. You will collaborate closely with our cross-functional teams to develop data pipelines, implement data ingestion processes, and ensure the reliability and scalability of our data infrastructure on GCP. The ideal candidate will have a strong background in data engineering, experience with GCP services, and a passion for leveraging cloud technologies to solve complex data challenges.
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes on Google Cloud Platform (GCP) using services such as Cloud Dataflow (Apache Beam) and Cloud Composer (a minimal sketch follows this list).
- Implement scalable and reliable data ingestion mechanisms to collect, process, and store large volumes of structured and unstructured data.
- Optimize data storage and retrieval processes using GCP storage solutions such as BigQuery, Cloud Storage, and Cloud Bigtable.
- Collaborate with data scientists and analysts to support their data needs and ensure data accessibility, accuracy, and integrity.
- Monitor and troubleshoot data pipelines, identifying and resolving performance bottlenecks, data quality issues, and system failures.
- Implement data security and compliance measures to protect sensitive information and ensure regulatory compliance.
- Stay up-to-date with the latest developments in GCP services and cloud computing technologies, evaluating new tools and techniques to improve our data infrastructure.
- Provide technical guidance and mentorship to junior data engineers, fostering a culture of continuous learning and growth.
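To give a concrete flavor of the pipeline work listed above, here is a minimal, illustrative sketch of an Apache Beam batch pipeline that could run on Cloud Dataflow and load data into BigQuery. All project, bucket, dataset, and table names below are placeholders for illustration only, not part of this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Placeholder project, region, bucket, and table names; adjust for a real environment.
    options = PipelineOptions(
        runner="DataflowRunner",          # use "DirectRunner" to test locally
        project="example-gcp-project",
        region="europe-west1",
        temp_location="gs://example-bucket/tmp",
    )

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            # Ingest newline-delimited JSON events from Cloud Storage.
            | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "ParseJson" >> beam.Map(json.loads)
            # Simple data-quality gate: drop records without a user_id.
            | "KeepValid" >> beam.Filter(lambda record: record.get("user_id"))
            # Append the cleaned records to an existing BigQuery table.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-gcp-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()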
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- At least 7 years of proven experience working as a Data Engineer, with a focus on Google Cloud Platform (GCP).
- In-depth knowledge of GCP services and products, including BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Data Studio, as well as dbt (Data Build Tool).
- Proficiency in programming languages such as Python, Java, or Scala for developing data pipelines and ETL processes.
- Strong understanding of data modeling, database design, and SQL query optimization.
- Experience with version control systems such as Git and CI/CD pipelines.
- Excellent problem-solving skills and attention to detail.
- Ability to work effectively in a fast-paced, collaborative environment with cross-functional teams.
- Google Cloud certification (e.g., Professional Data Engineer) is a plus.
Preferred Skills:
- Experience with other cloud platforms such as AWS or Azure.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Familiarity with streaming data processing frameworks such as Apache Kafka or Apache Flink.
- Understanding of machine learning concepts and frameworks.