Data Engineer (Cyber Security)
Progressive Edge
Engineering & Technology
Job Summary
As a data engineer with AWS experience and a focus on data lake construction and Elasticsearch integration, you will be at the forefront of our data-driven initiatives. You will work with a range of AWS services, including Lambda, S3, CodePipeline, and CodeCommit, alongside Elasticsearch, Kibana, Logstash, Beats, Elastic Common Schema (ECS), and Elastic Security. Your role will involve deploying, configuring, testing, and troubleshooting AWS services, and ensuring efficient data indexing, querying, aggregation, and mapping. You will also serve as the subject matter expert for ELK (Elasticsearch, Logstash, Kibana) implementation across our shared service platform.
- Minimum Qualification: Degree
- Experience Level: Mid level
- Experience Length: 3 years
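To make the Elasticsearch side of the role concrete, here is a minimal sketch of the indexing, querying, and aggregation work described in the summary. It assumes the 8.x-style elasticsearch Python client; the cluster URL, index name, and field names are illustrative assumptions, not the team's actual configuration.

```python
# Minimal sketch: create an index with an explicit mapping, index a document,
# and run a terms/avg aggregation. All names and the endpoint are assumptions.
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200")  # assumed cluster endpoint

# Explicit mapping keeps field types predictable for queries and aggregations.
es.indices.create(
    index="market-events",  # hypothetical index name
    mappings={
        "properties": {
            "symbol": {"type": "keyword"},
            "price": {"type": "double"},
            "@timestamp": {"type": "date"},
        }
    },
)

# Index a single document.
es.index(
    index="market-events",
    document={"symbol": "ABC", "price": 101.5, "@timestamp": "2024-01-01T00:00:00Z"},
)

# Aggregation: average price per symbol across the indexed data.
resp = es.search(
    index="market-events",
    size=0,
    aggs={
        "per_symbol": {
            "terms": {"field": "symbol"},
            "aggs": {"avg_price": {"avg": {"field": "price"}}},
        }
    },
)
print(resp["aggregations"]["per_symbol"]["buckets"])
```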
Job Description/Requirements
Responsibilities:
- Collaborate with cross-functional teams to construct and manage a data lake on AWS for real-time and local market data feeds
- Implement and maintain Elasticsearch, Logstash, Beats, Kibana, Elastic Common Schema (ECS) and Elastic Security components
- Utilise AWS services such as Lambda, S3, CodePipeline, and CodeCommit to automate data lake processes (an illustrative sketch follows this list)
- Develop and maintain data ingestion pipelines using Apache Flink and Apache Beam (see the Beam sketch after this list)
- Deploy, configure, and optimise AWS services to support data storage, processing, and analysis
- Index data, design complex queries, create aggregations, and manage mappings within Elasticsearch
- Serve as the go-to expert for ELK implementation and configuration
- Collaborate with teams to integrate Elasticsearch with other operational data platforms and tools, including Kafka, SIEM, and more
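The following is a minimal sketch of the Lambda/S3 automation mentioned above: an S3-triggered Lambda that bulk-indexes newline-delimited JSON into Elasticsearch. The index name, bucket layout, and the ES_URL environment variable are hypothetical, and the code assumes the boto3 and elasticsearch Python libraries are packaged with the function.

```python
# Sketch of an S3 ObjectCreated-triggered Lambda handler; names are assumptions.
import json
import os

import boto3
from elasticsearch import Elasticsearch, helpers

s3 = boto3.client("s3")
es = Elasticsearch(os.environ["ES_URL"])  # assumed endpoint injected via env var


def handler(event, context):
    """Read each new S3 object and bulk-index its JSON lines into Elasticsearch."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        actions = (
            {"_index": "security-logs", "_source": json.loads(line)}  # hypothetical index
            for line in body.splitlines()
            if line.strip()
        )
        ok, errors = helpers.bulk(es, actions, raise_on_error=False)
        print(f"indexed {ok} documents from s3://{bucket}/{key}, {len(errors)} errors")
```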
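And here is a minimal Apache Beam sketch of the ingestion-pipeline responsibility: read raw events, parse and filter them, and write the cleaned records out. File paths and field names are assumptions; reading s3:// paths assumes the apache-beam[aws] extra, and a production pipeline would typically run on a managed runner and land data in the data lake or Elasticsearch rather than text files.

```python
# Sketch of a Beam batch pipeline: read, parse, filter, write. Paths are assumptions.
import json

import apache_beam as beam


def parse_event(line: str) -> dict:
    """Parse one raw JSON line into a record the rest of the pipeline understands."""
    return json.loads(line)


with beam.Pipeline() as pipeline:
    (
        pipeline
        | "ReadRaw" >> beam.io.ReadFromText("s3://example-bucket/raw/*.json")  # assumed path
        | "Parse" >> beam.Map(parse_event)
        | "KeepSecurityEvents" >> beam.Filter(lambda r: r.get("category") == "security")
        | "Serialize" >> beam.Map(json.dumps)
        | "WriteCleaned" >> beam.io.WriteToText("s3://example-bucket/cleaned/events")
    )
```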
Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- Extensive experience as a developer with a strong focus on AWS services, the Elastic Stack, and Python
- In-depth knowledge of Elasticsearch, Logstash, Kibana, Beats, Elastic Common Schema (ECS) and Elastic Security
- Familiarity with AWS services, including Lambda, S3, CodePipeline, and CodeCommit
- Proven expertise in data pipeline development using Apache Flink and Apache Beam
- Proficiency in Python for scripting and automation
- Background in cyber security or data engineering highly advantageous
- Strong problem-solving and troubleshooting skills
- Excellent communication and collaboration abilities
- Ability to excel in a fast-paced, team-oriented environment