Control 8.15 - Logs that record activities, exceptions, faults and other relevant events should be produced, stored, protected and analysed
In cybersecurity, a log acts as a digital diary, meticulously recording steps, interactions and events within information systems and networks. Logs have traditionally been produced by software, but we also see a growth in “hardware” logs coming from the digitalization of hardware systems and Industry 4.0 trends. Log files capture user actions, system functions, program behaviors and much more, with time stamps and source details. Logs are a primary data source for observability, enabling use cases ranging from cybersecurity monitoring to operational excellence and compliance.
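To make this concrete, here is a minimal sketch, in Python, of what a structured log entry can capture; the field names and values are illustrative, not a prescribed schema.

```python
import json
from datetime import datetime, timezone

# Illustrative structured log entry: field names are examples, not a required schema.
log_entry = {
    "timestamp": datetime.now(timezone.utc).isoformat(),  # when the event occurred
    "source": "app-server-01",                            # which system produced it
    "user": "jdoe",                                        # who triggered the event
    "action": "file_access",                               # what happened
    "resource": "/var/reports/q3.pdf",                     # what was touched
    "outcome": "success",                                  # result of the action
}

# Emit the entry as one JSON line, a common format for downstream analysis.
print(json.dumps(log_entry))
```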
The logging requirement is given by the standard in Annex A, more precisely by control 8.15:
“Logs that record activities, exceptions, faults and other relevant events should be produced, stored, protected and analysed.”
“Logging in information security refers to the process of collecting and storing information about system and network activity, in order to track and analyze the actions of users, systems, and other entities. Logs can include information such as user logins and logouts, file access, network connections, and system events.” ISO 27001
Given the exponential digitization of businesses and the multiplication of systems, it's unrealistic to want to record every event produced on your company's network.
ISO 27001 has identified 10 events as being particularly important to record. These ten events were selected because they can meaningfully influence risk assessments and because they are essential to maintaining a strong information security posture 👇
Logs serve as an indelible chronicle, documenting each interaction, event and transaction within your systems. Preserving their integrity is paramount, for the worth of this data hinges on its inviolable nature. It is therefore imperative that no user, regardless of privileged access, is able to tamper with or manipulate logs: prevention measures must be in place so that log records cannot be altered, deleted or otherwise manipulated by anyone.
Safeguarding the integrity of logs protects against misrepresentation and preserves an unaltered record of the truth.
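One common way to make tampering detectable, shown here only as an illustrative sketch, is to chain each log record to the previous one with a keyed hash; the key handling and record format below are assumptions, not a prescribed mechanism.

```python
import hashlib
import hmac

# Assumption: the signing key is held outside the system being logged.
SECRET = b"replace-with-a-key-held-outside-the-logged-system"

def chain_digest(previous_digest: str, record: str) -> str:
    """Compute an HMAC over the previous digest plus the new record.

    Any later edit or deletion of a record breaks every digest that follows it.
    """
    return hmac.new(SECRET, (previous_digest + record).encode(), hashlib.sha256).hexdigest()

# Build a tamper-evident chain over a small batch of log lines.
records = [
    "2024-05-01T10:00:00Z admin login from 10.0.0.5",
    "2024-05-01T10:02:13Z config change on firewall-01",
]
digest = "genesis"
for record in records:
    digest = chain_digest(digest, record)
    print(digest, record)
```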
To protect all your logs, the ISO 27001 standard suggests four methods:
Moreover, vendors may request logs in order to troubleshoot systems within your corporate environment. Once again, safeguarding the logs you share with vendors is an absolute necessity.
As previously emphasized, logs contain sensitive details, including individuals' personal information. Before transmitting logs to your vendors, a pivotal step is to anonymize this data.
This safeguarding technique is commonly referred to as data masking, a concept covered by Control 8.11 - Data Masking, which was introduced with the 2022 update of the standard.
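As an illustration of what masking can look like in practice, here is a minimal sketch that redacts a few typical identifiers (usernames, email addresses, IP addresses); the patterns and placeholders are assumptions to adapt to the personal data actually present in your logs.

```python
import re

# Assumed examples of fields to mask before sharing logs with a vendor.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "user": re.compile(r"user=\S+"),
}

def mask(line: str) -> str:
    """Replace personal identifiers with fixed placeholders (irreversible masking)."""
    line = PATTERNS["email"].sub("<email>", line)
    line = PATTERNS["ipv4"].sub("<ip>", line)
    line = PATTERNS["user"].sub("user=<masked>", line)
    return line

print(mask("2024-05-01 user=jdoe login from 192.168.1.24, contact jdoe@example.com"))
# -> 2024-05-01 user=<masked> login from <ip>, contact <email>
```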
Here's a comprehensive list of the information that needs to be anonymized:
In 2021, CIS unveiled an updated version of its controls, which now comprise a simplified set of 18 key security practices designed to strengthen an organization's cybersecurity framework. In particular, Control 8, entitled "Audit Log Management", focuses on the complex aspects of log management.
Far from being a generic guideline, Control 8 is meticulously broken down into actionable sub-controls. Each of these develops best practice with operational precision. While ISO 27001 already sets out essential requirements, incorporating the CIS recommendations increases the depth of adherence, giving it universal appeal across diverse industries. By adopting these measures, organizations can not only align with ISO 27001 criteria, but also exceed them, thereby strengthening their security position.
Below is the detail of the sub-controls described in the revised CIS Control 8, highlighting best practices in audit log management.
As an integral part of your ISO 27001 compliance strategy, meticulous log analysis plays an imperative strategic role, and is based on managing the triptych of people, process and tools.
Fine-grained log analysis requires a coordinated effort involving experts skilled in operations, compliance, IT and security. Their collective vision enables a comprehensive understanding of the real landscape, discerning “last mile” uniqueness from general patterns and potential impacts that might otherwise escape detection.
As part of ISO 27001 compliance, implementing a robust log analysis process is a requirement.
The process must take into account the requirements of the ISO 27001 standard in terms of log analysis, such as Control 8.17 - Clock synchronization. This control mandates that time and date settings be synchronized across all systems, so that events from different sources can be accurately correlated during analysis.
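To illustrate how such a check could be automated, here is a hedged sketch that compares the local clock with an NTP reference; it assumes the third-party ntplib package, and the server and tolerance values are illustrative choices, not values prescribed by the standard.

```python
# Requires the third-party ntplib package (pip install ntplib).
import ntplib

TOLERANCE_SECONDS = 1.0          # assumed acceptable drift for log correlation
REFERENCE_SERVER = "pool.ntp.org"

def check_clock_drift() -> float:
    """Return the offset between the local clock and the reference time source."""
    response = ntplib.NTPClient().request(REFERENCE_SERVER, version=3)
    return response.offset

if __name__ == "__main__":
    offset = check_clock_drift()
    if abs(offset) > TOLERANCE_SECONDS:
        print(f"ALERT: local clock drifts {offset:.3f}s from {REFERENCE_SERVER}")
    else:
        print(f"Clock within tolerance ({offset:.3f}s)")
```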
In addition, responsibilities and objectives need to be defined, with the emphasis on compliance, risk management and uptime. A structured timetable, punctuated by management reviews and input from stakeholders, will enable progress to be made. Agility remains essential, as continuous optimisation through data-driven iterations ensures adaptability to changing regulations and emerging risks. This process cultivates a culture of continuous improvement, strengthening operational resilience and cyber security posture.
Harnessing the potential of log analysis requires the use of specific tools, which balance accessibility and security. Secure access and complete dissection of logs are the foundation of informed decision-making. The tools chosen must not only have solid analysis capabilities, but also facilitate transparent collaboration between stakeholders. Enabling operational, IT, security and compliance teams to interact with data promotes responsiveness to emerging threats and compliance adjustments.
To adopt a multi-dimensional approach and improve a company's security and compliance posture, it is essential to combine analysis and log monitoring.
Monitoring logs will enable companies to ensure that their confidentiality, integrity and accessibility remain aligned with their security policies. We all know the “garbage in, garbage out” principle, and Log Monitoring aims to solve that issue. Log monitoring should be applied continuously and connected to alerting systems so that a deviation, whether in specific behaviors or in trends, triggers an alert. Log Analysis, on the other hand, tackles the insights and controls that can be derived from log signals. Ranging from operational excellence to compliance and cybersecurity monitoring, companies can unlock a large number of use cases with Log Analysis.
Log monitoring control examples: monitor access to log files, monitor actions on log files, monitor volumes of log files,…
Log analysis control examples: monitor user behavior in your systems or apps, monitor network logs for potentially risky connections, monitor authentication configuration to ensure alignment with your security policy,…
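As a simple illustration of the first category, here is a sketch of a log monitoring check over the log files themselves; the paths and thresholds are assumptions to tune to your own environment.

```python
import os
import time

# Illustrative watchlist and thresholds; adapt the paths and limits to your estate.
WATCHED_LOGS = ["/var/log/auth.log", "/var/log/syslog"]
MAX_SILENCE_SECONDS = 600        # alert if a log has not been written to for 10 minutes
MAX_GROWTH_BYTES = 50_000_000    # alert if a log grows abnormally fast between checks

def check(path, previous_size):
    """Compare the current state of a log file against the last observation."""
    alerts = []
    stat = os.stat(path)
    if time.time() - stat.st_mtime > MAX_SILENCE_SECONDS:
        alerts.append(f"{path}: no new entries for {MAX_SILENCE_SECONDS}s, collection may have stopped")
    if stat.st_size - previous_size > MAX_GROWTH_BYTES:
        alerts.append(f"{path}: abnormal growth, possible flood or misconfiguration")
    if stat.st_size < previous_size:
        alerts.append(f"{path}: file shrank, possible truncation or tampering")
    return stat.st_size, alerts

# One pass over the watchlist; in practice this would run on a schedule.
sizes = {path: 0 for path in WATCHED_LOGS}
for path in WATCHED_LOGS:
    if os.path.exists(path):
        sizes[path], alerts = check(path, sizes[path])
        for alert in alerts:
            print(alert)
```

Run on a schedule, a check like this flags both silent log sources and suspicious truncation, feeding the alerting described above.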
The emergence of edge systems has reshaped industries, expanding the horizons of computing and data processing. However, these technologies also make log monitoring more complex.
Firstly, networks are increasingly segmented, which poses a monitoring problem: coordinating data collection, orchestrating aggregation and analyzing information within isolated segments requires a complex network architecture. Across all these edge systems, logs are also heterogeneous. Logs from different edge systems arrive in different formats, structures and contexts, so unifying and analyzing them requires log management solutions capable of deciphering, normalizing and correlating multi-faceted data streams. In addition, enterprises are increasingly distributed, which raises the importance of rapid detection and of a local response to alerts and problems. However, uninterrupted monitoring of dispersed devices is technically complex to achieve.
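To show what normalization across heterogeneous sources can involve, here is a minimal sketch that maps two assumed formats, a JSON application log and a syslog-style line, onto one common schema; the field names are illustrative.

```python
import json
import re
from typing import Optional

# Syslog-style pattern: "<timestamp> <host> <process>: <message>" (assumed layout).
SYSLOG_RE = re.compile(r"^(?P<ts>\S+) (?P<host>\S+) (?P<proc>[\w\-]+): (?P<msg>.*)$")

def normalize(raw: str) -> Optional[dict]:
    """Map differently shaped records onto one common schema (timestamp, source, message)."""
    raw = raw.strip()
    if raw.startswith("{"):
        record = json.loads(raw)
        return {"timestamp": record.get("time"), "source": record.get("device"), "message": record.get("event")}
    match = SYSLOG_RE.match(raw)
    if match:
        return {"timestamp": match["ts"], "source": match["host"], "message": match["msg"]}
    return None  # unknown format: route to a quarantine queue for inspection

print(normalize('{"time": "2024-05-01T10:00:00Z", "device": "plc-7", "event": "setpoint changed"}'))
print(normalize("2024-05-01T10:00:02Z gateway-3 sshd: Accepted publickey for operator"))
```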
Many enterprises rely on large-scale engineering projects to solve ad hoc issues: everything starts to look like a centralized logging problem when you have a Splunk hammer in your hands.
Following an update of the ISO 27001 standard in 2022, the controls to be implemented for logging have been modified.
In the ISO 27001:2013 version, three different controls covered all the topics related to log management:
A.12.4.1 Event Logging
“Event logs should be produced, retained, and regularly reviewed to record user activities, exceptions, defects, and information security events.”
A.12.4.2 Protection of Log Information
“Logging and log information should be secure from intrusion and unauthorized access.”
A.12.4.3 Administrator and Operator Logs
“Administrator and operator logs should be produced, regularly reviewed, and protected.”
The October 2022 revision of the standard therefore merged all of the above controls into a single control covering every logging requirement.
The logging control in ISO 27001:2022 is now control 8.15 called Logging.
With a single command line, a developer can deploy a robust monitoring solution, eliminating the complexities often associated with implementing data pipelines, data lakes and their analytics capabilities.
Trout Software's platform leverages the power of WebAssembly, a technology renowned for its lightweight footprint and resource efficiency. This architectural choice ensures that Security Hub integrates with a company's existing infrastructure (it does not dictate an architecture, but allows you to monitor an existing one), accelerating compliance efforts without impeding operational productivity.
A hallmark of Trout is its capacity to correlate logs from heterogeneous data sources, all without the need for preliminary parsing. This capability empowers cybersecurity professionals to glean meaningful insights rapidly, fostering agile decision-making in line with ISO 27001 requirements.
Trout Software's Operation Hub allows companies to monitor not only their logs but also operating system files. By creating a notebook and monitoring access to log files and actions against these logs, a company can achieve ISO compliance quickly, and maintain it continuously.
For simplified log analysis, Trout Software has developed a no-code interface that lets users run intricate analyses swiftly. All that is required is to connect their machine or software to the Trout Software platform through pre-built connectors.
This software empowers users to query their various data sources directly, eliminating the need for preliminary ingestion pipelines and conserving valuable resources.
Once the software or machine is successfully linked to the platform, users can initiate their investigation by crafting a playbook.
Here's the process:
Upon gaining an overview of the data, the no-code interface becomes instrumental in performing rapid analyses of log information. This interface encompasses all the functions detailed in the image below.
To add extra context about any environment, users can also drag and drop another data source (CSV, text, or log files) into their analysis.
Trout Software's platform facilitates the monitoring of logs through the implementation of automated controls.
Automating log controls gives you the ability to receive prompt alerts whenever a deviation from predefined rules is detected. This approach enables a proactive enhancement of security measures.
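Independently of any specific platform, here is a minimal sketch of what such predefined rules could look like; the event names and thresholds are assumptions for illustration.

```python
from collections import Counter

# Hypothetical predefined rules: maximum allowed counts per observation window.
RULES = {
    "failed_login": 5,      # more than five failed logins raises an alert
    "log_file_delete": 0,   # any deletion of a log file raises an alert
}

def evaluate(events):
    """Compare observed event counts against the predefined rules and return alerts."""
    counts = Counter(events)
    return [
        f"ALERT: {name} occurred {counts[name]} times (threshold {limit})"
        for name, limit in RULES.items()
        if counts[name] > limit
    ]

# Example window of observed events.
window = ["failed_login"] * 7 + ["log_file_delete"]
for alert in evaluate(window):
    print(alert)
```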
Here's how it works:
The automation of checks is a streamlined procedure encompassing just three steps: