Join our growing Information Protection team as an IT Data Security Engineer. You will drive and support the evolution of our Secure Infrastructure Portfolio, specifically in Data Security services. Your role will involve collaborating with various teams to implement data security solutions, improving telemetry services, and ensuring compliance with data privacy laws. Additionally, you will conduct risk assessments, train stakeholders, and partner with IT team members to secure data architectures.
WHAT YOU'LL DO
Welcome to BCG Worldwide IT! We are seeking an IT Data Security Engineer to join our growing Information Protection team. You will work in a Security Engineering, Architecture, and Operations capacity to drive and support the continued evolution of our Secure Infrastructure Portfolio, notably in Data Security services, providing security and observability telemetry capabilities that help detect and prevent threats. You will play a key role in developing and implementing our next generation of detection capabilities.
YOU WILL
- Work collaboratively with application development, data protection, information security, and risk management teams to understand and implement data security and management solutions.
- Continuously improve security & observability telemetry services based on input from a diverse network of internal and external stakeholders, technology teams, and the IT industry at large.
- Data Management: Define and manage data models, schemas, metadata, and security rules. Design, create, deploy, and manage databases and data structures on premises and in the cloud to fulfill business requirements (a minimal schema sketch follows this list).
- Threat Analysis: Identify and mitigate potential security risks in the organization's data architecture.
- Compliance: Ensure compliance with data privacy laws and regulations.
- Risk Management: Conduct risk assessments and take appropriate actions to mitigate the risks associated with data security.
- Training and Development: Train and educate stakeholders on our data security practices.
- Collaboration: Collaborate with other IT team members, stakeholders, and executives to ensure the security of data architectures.
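For illustration only, the sketch below shows the kind of schema definition the Data Management responsibility implies, using Python's standard sqlite3 module; the table, column names, and constraints are hypothetical assumptions, not a prescribed design.

```python
import sqlite3

# Hypothetical schema for a small security-event store; names and
# constraints are illustrative, not a prescribed design.
DDL = """
CREATE TABLE IF NOT EXISTS security_event (
    event_id    INTEGER PRIMARY KEY,
    occurred_at TEXT NOT NULL,              -- ISO-8601 timestamp
    source_ip   TEXT NOT NULL,
    user_name   TEXT,
    outcome     TEXT NOT NULL CHECK (outcome IN ('success', 'failure'))
);
CREATE INDEX IF NOT EXISTS idx_event_source
    ON security_event (source_ip, occurred_at);
"""

# Create the schema locally; in practice the same DDL would target an
# on-premises or cloud database rather than a local SQLite file.
with sqlite3.connect("events.db") as conn:
    conn.executescript(DDL)
```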
YOU'RE GOOD AT
You have experience in data warehousing, data modelling, and building data integration pipelines. You are well versed in data ingestion methods such as extract-and-load techniques through scripting and/or tooling, streaming, API consumption, and replication. You are skilled at analyzing performance bottlenecks and recommending enhancements, and you have a passion for customer service and a desire to learn and grow as a professional and a technologist.
- Viewed as a subject matter expert by stakeholders, possessing in-depth knowledge and a specialized technical skillset.
- Able to work independently with minimal supervision.
- Proactively identify and independently solve non-routine problems by applying expertise.
- Perform research into viable technical and non-technical solutions.
- Develop internal network with senior leaders within the chapter and key stakeholders in the product portfolio.
- Develop strategies for data ingestion into Splunk, Snowflake, or similar products using Python, PowerShell, AWS Lambda, or similar products and technologies (see the ingestion sketch after this list).
- Design and implement data pipelines to feed data models for subsequent consumption.
- Actively monitor and resolve user support issues, working closely with your functional squad and other squads as part of the chapter.
- Develop and maintain architectural standards, best practices, and measure compliance.
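To make the ingestion strategies above concrete, here is a minimal sketch of one such pipeline: an AWS Lambda handler forwarding records to a Splunk HTTP Event Collector (HEC) endpoint. The environment variable names, index name, and event shape are illustrative assumptions, not a prescribed implementation.

```python
import json
import os
import urllib.request

# Illustrative assumptions: the HEC endpoint and token are supplied via
# the Lambda environment; the incoming event carries a "Records" list.
SPLUNK_HEC_URL = os.environ["SPLUNK_HEC_URL"]     # e.g. https://splunk.example.com:8088/services/collector/event
SPLUNK_HEC_TOKEN = os.environ["SPLUNK_HEC_TOKEN"]

def lambda_handler(event, context):
    """Forward each incoming record to Splunk via the HTTP Event Collector."""
    records = event.get("Records", [])
    for record in records:
        payload = json.dumps({
            "event": record,                  # the raw record becomes the Splunk event
            "sourcetype": "_json",
            "index": "security_telemetry",    # hypothetical index name
        }).encode("utf-8")
        request = urllib.request.Request(
            SPLUNK_HEC_URL,
            data=payload,
            headers={
                "Authorization": f"Splunk {SPLUNK_HEC_TOKEN}",
                "Content-Type": "application/json",
            },
        )
        # Non-2xx responses raise urllib.error.HTTPError, surfacing failures.
        with urllib.request.urlopen(request) as response:
            response.read()
    return {"statusCode": 200, "forwarded": len(records)}
```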
Communication, interpersonal and teaming skills
- Works positively and collaboratively with others and within the team; builds strong and lasting relationships.
- Works and partners with remote team members across different time zones.
- Adapts style to changing situations and audiences with tact, poise, and patience. Demonstrates persistence to drive change. Contributes to a positive and productive work environment.
- Leverages networks effectively across functions, offices, regions, and/or externally (an existing network within BCG will be helpful).
- Experience with data visualization and/or the ability to present data analytics insightfully to leadership is a plus.
Work Management, organization, and planning
- Must be able to perform successfully in a fast-paced, intellectually intense, service-oriented environment.
- Familiarity with, and willingness to work in, Agile methodology is a must.
- Strong organizational and process management skills.
- Ability to contribute to multiple work streams at once and prioritize efforts accordingly. Demonstrated ability to drive projects to scheduled conclusion.
YOU BRING (EXPERIENCE & QUALIFICATIONS)
- 8+ years’ experience as a Product Owner in a Data Engineering or Information Security related field.
- Experience working closely with Information Security and Risk Management Stakeholders
- Data Engineering / Management subject matter expertise preferred, or related experience.
- Strong understanding of Data Management or Data Engineering
- Strong grounding in data analysis and related processes
- Experience with Agile methods and the Atlassian stack (e.g., JIRA) or related tools.
- Knowledge of globally distributed environments such as AWS and Azure
- Ability to develop roadmaps and the underlying strategies for data-centric products and services.
- Management of ongoing feature improvements, backlog grooming, triage and prioritization, and cross-functional coordination to ensure completion within timelines, budget, and scope.
- Data-driven mindset
- Ability to collaborate with stakeholders on requirements and communicate project goals to squad members and dependency-related stakeholders.
- Ability to track progress, assess risks, coordinate delivery, and actively communicate contingency and mitigation plans.
- Awareness of when to involve key leadership team members.
- Experience in data ingestion, integration, ETL, or security engineering with large-scale, globally distributed implementations.
- Extensive knowledge of a globally distributed environment across multiple platforms such as AWS, Azure and GCP.
- Experience with standard monitoring frameworks and observability products
- Experience with hybrid environment data sources, data collectors and instrumentation
- Expertise in the use of SIEM solutions for basic and advanced detection methods, including cloud-based data sources (a brief detection sketch follows this list).
- Experience with security monitoring & observability solutions such as Splunk, Sumo Logic, Datadog, New Relic, and AppDynamics.
- Experience working with cloud and data security in DevSecOps/IRE and agile working environments.
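As one concrete illustration of the basic detection methods referenced above, the sketch below flags possible brute-force activity by counting failed logins per source IP within a sliding time window; the field names, window, and threshold are assumptions for illustration only.

```python
from collections import defaultdict, deque
from datetime import timedelta

WINDOW = timedelta(minutes=5)   # assumed detection window
THRESHOLD = 10                  # assumed failed-login threshold

def detect_brute_force(events):
    """Yield (source_ip, count) whenever failed logins from one IP reach
    THRESHOLD within WINDOW. Each event is assumed to be a dict with
    'timestamp' (datetime), 'source_ip', and 'outcome' fields."""
    recent = defaultdict(deque)  # source_ip -> timestamps of recent failures
    for event in sorted(events, key=lambda e: e["timestamp"]):
        if event["outcome"] != "failure":
            continue
        window = recent[event["source_ip"]]
        window.append(event["timestamp"])
        # Drop failures that have aged out of the sliding window.
        while window and event["timestamp"] - window[0] > WINDOW:
            window.popleft()
        if len(window) >= THRESHOLD:
            yield event["source_ip"], len(window)
```

In a production SIEM this logic would typically live in a correlation search or streaming rule; the pure-Python version here just makes the windowed-counting technique explicit.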
DESIRABLE
- Expertise in at least one scripting language (PowerShell, Python, Bash)
- Experience with containers and container orchestration technologies such as Docker and Kubernetes
- Experience with systems configuration and orchestration tools such as Ansible or Terraform
- Understanding of infrastructure-as-code concepts
- Related security certifications (e.g., CISSP, CCSP, SABSA, ITIL).
- 3+ years of familiarity and experience with Linux/Ubuntu/Mac systems
- Experience creating dashboards, queries, and alerts in Splunk, Datadog, Sumo Logic, or similar products.
- Intellectual curiosity and an ability to execute projects.
- Data system performance and process tuning: automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability
- Implementation of predictive analytics and machine learning models (MLlib, scikit-learn, etc.); see the sketch at the end of this list.
- Experience with operational technology (OT) telemetry sources such as MES, PLC, HMI, and sensors
- Experience working with Big Data streaming technologies such as Spark, Hive, Impala, Druid, or Presto
- A solid foundation in data structures, algorithms, and object-oriented or functional design, with fundamentally strong programming skills suited to the problem at hand.
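To illustrate the predictive-analytics item above, here is a minimal sketch using scikit-learn's IsolationForest to flag anomalous host activity; the synthetic feature layout and contamination rate are illustrative assumptions, not a prescribed model.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Illustrative feature matrix: one row per host-hour, with assumed features
# [events_per_hour, distinct_users, failed_logins].
rng = np.random.default_rng(42)
baseline = rng.normal(loc=[1000, 50, 5], scale=[100, 5, 2], size=(500, 3))
spikes = np.array([[5000, 45, 90], [950, 48, 200]])  # injected anomalies
X = np.vstack([baseline, spikes])

# Fit an unsupervised anomaly detector; contamination is the assumed
# fraction of anomalous rows in the data.
model = IsolationForest(contamination=0.01, random_state=42)
model.fit(X)
labels = model.predict(X)          # -1 marks anomalies, 1 marks normal points
anomalies = X[labels == -1]
print(f"flagged {len(anomalies)} anomalous host-hours")
```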