What are the ISO 27001 Logging Requirements?

Control 8.15 - Logs that record activities, exceptions, faults and other relevant events should be produced, stored, protected and analysed

What is a log?

In cybersecurity, a log acts as a digital diary, meticulously recording steps, interactions and events within information systems and networks. Logs are mostly produced by software, but we are seeing a growth in "hardware" logs driven by the digitalization of hardware systems and Industry 4.0 trends. Log files capture user actions, system functions, program behaviors and much more, with timestamps and source details. Logs are a primary data source for observability, enabling use cases ranging from cybersecurity monitoring to operational excellence and compliance.

What does the standard require?

The logging requirements are given by the standard in Annex A, more precisely in control 8.15:

🎛️ Control ISO 27001:2022 A 8.15:

“Logs that record activities, exceptions, faults and other relevant events should be produced, stored, protected and analysed.”

🎯 Purpose ISO 27001:2022 A 8.15:

“Logging in information security refers to the process of collecting and storing information about system and network activity, in order to track and analyze the actions of users, systems, and other entities. Logs can include information such as user logins and logouts, file access, network connections, and system events.” ISO 27001

Which Events Should Be Logged for ISO 27001?

Given the exponential digitization of businesses and the multiplication of systems, it's unrealistic to record every event produced on your company's network.

ISO 27001 identifies ten events as particularly important to record. These ten events have been selected because of their ability to influence risk assessments and their essential contribution to maintaining a strong information security posture 👇

  • System access attempts,
  • Data and/or resource access attempts,
  • System/OS configuration changes,
  • Use of elevated privileges,
  • Use of utility programs or maintenance facilities,
  • File access requests and what occurred,
  • Access control alarms and critical interrupts,
  • Activation and/or deactivation of front end and back end security systems, such as client-side antivirus software or firewall protection systems,
  • Identity administration work (both physical and logical),
  • Certain actions or system/data alterations carried out as part of a session within an application.
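As an illustration, the event categories above can be captured as structured log records. The sketch below is a minimal example; the category names, field names and `make_log_record` helper are our own illustrative shorthand, not terms mandated by the standard. Each event is emitted as one JSON line with a timestamp and source details:

```python
import json
from datetime import datetime, timezone

# Illustrative event categories mirroring the list above; the names are
# our own shorthand, not identifiers defined by ISO 27001.
EVENT_TYPES = {
    "system_access_attempt",
    "resource_access_attempt",
    "config_change",
    "privilege_use",
    "utility_program_use",
    "file_access",
    "access_control_alarm",
    "security_system_toggle",
    "identity_administration",
    "application_session_action",
}

def make_log_record(event_type: str, user: str, source: str, outcome: str) -> str:
    """Build one JSON log line with a timestamp and source details."""
    if event_type not in EVENT_TYPES:
        raise ValueError(f"unknown event type: {event_type}")
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event_type": event_type,
        "user": user,
        "source": source,
        "outcome": outcome,
    }
    return json.dumps(record)

# Example: recording a use of elevated privileges.
line = make_log_record("privilege_use", "alice", "10.0.0.5", "granted")
```

Emitting one self-describing JSON line per event keeps the records machine-readable, which pays off later when the logs have to be correlated and analyzed.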

How should logs be protected?

Logs serve as a chronicle, documenting each interaction, event and transaction within your systems. Preserving their integrity is paramount: the value of this data depends on it remaining unaltered. No user, however privileged, should be able to tamper with or manipulate logs. Measures must be in place to prevent users from the following actions:

  • Modifying or erasing logs,
  • Altering message types,
Preventing a log file from being generated, or overwriting log files without justification (overwrites can also arise from ongoing issues with storage media or network performance).

Safeguarding the integrity of logs protects against misrepresentation and preserves an unaltered record of events.

To protect all your logs, the ISO 27001 standard suggests four methods:

  • Cryptographic hashing.
  • Read-only recording.
  • Use of public transparency files.
  • Append-only recording.
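Two of these methods, cryptographic hashing and append-only recording, are often combined into a hash chain: each entry stores the hash of the previous one, so modifying or erasing any earlier entry breaks verification. Below is a minimal Python sketch of the idea (an illustration of the technique, not a production implementation or a feature of the standard):

```python
import hashlib
import json

class HashChainedLog:
    """Append-only log where each entry carries the hash of the previous
    entry, making tampering with earlier entries detectable."""

    def __init__(self):
        self._entries = []
        self._last_hash = "0" * 64  # genesis value before any entry exists

    def append(self, message: str) -> None:
        entry = {"message": message, "prev_hash": self._last_hash}
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self._entries.append(entry)
        self._last_hash = entry["hash"]

    def verify(self) -> bool:
        """Recompute every hash; any edit or deletion breaks the chain."""
        prev = "0" * 64
        for entry in self._entries:
            if entry["prev_hash"] != prev:
                return False
            payload = json.dumps(
                {"message": entry["message"], "prev_hash": entry["prev_hash"]},
                sort_keys=True,
            ).encode()
            if hashlib.sha256(payload).hexdigest() != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

log = HashChainedLog()
log.append("user alice logged in")
log.append("config change on host web-01")
assert log.verify()
```

This is also the principle behind public transparency logs such as Certificate Transparency, which publish the chain so anyone can verify it.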

Moreover, vendors may request logs for the purpose of troubleshooting systems within your corporate environment. Once again, safeguarding the logs you share with vendors is an absolute necessity.

Data Masking

As previously emphasized, logs encapsulate sensitive details, including individuals' personal information. Before transmitting logs to your vendors, a pivotal step is anonymizing this data.

This safeguarding technique is commonly referred to as data masking, a concept covered by Control 8.11 - Data Masking, which was introduced in the 2022 update of the standard.

Here's a list of the information that needs to be anonymized:

  • Usernames,
  • Internet Protocol addresses,
  • Hostnames,
  • Organization name.

Deep dive into CIS Controls - Control 8 : Audit Log Management

In 2021, CIS unveiled an updated version of its controls, which now comprise a simplified set of 18 key security practices designed to strengthen an organization's cybersecurity framework. In particular, Control 8, entitled "Audit Log Management", focuses on the complex aspects of log management.

Far from being a generic guideline, Control 8 is meticulously broken down into actionable sub-controls. Each of these develops best practice with operational precision. While ISO 27001 already sets out essential requirements, incorporating the CIS recommendations increases the depth of adherence, giving it universal appeal across diverse industries. By adopting these measures, organizations can not only align with ISO 27001 criteria, but also exceed them, thereby strengthening their security position.

Below is the detail of the sub-controls described in the revised CIS Control 8, highlighting best practices in audit log management.

CIS Control 8 - Audit Log Management sub-controls:

[Images: CIS Control 8 audit log management sub-control tables]

How do you analyse your logs?

As an integral part of your ISO 27001 compliance strategy, meticulous log analysis plays a key strategic role, and is based on managing the triptych: people, process, tools.

People :

Fine-grained log analysis requires a coordinated effort involving experts skilled in operations, compliance, IT and security. Their collective vision enables a comprehensive understanding of the real landscape, discerning “last mile” uniqueness from general patterns and potential impacts that might otherwise escape detection.

Process :

As part of ISO 27001 compliance, implementing a robust log analysis process is a requirement.

The process must take into account the requirements of the ISO 27001 standard regarding log analysis, such as Control 8.17 - Clock synchronization. This control mandates the harmonization of time and date settings across all systems, thereby ensuring the accuracy of any analysis.
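Clock synchronization itself is handled at the infrastructure level (e.g. via NTP), but its analysis-side counterpart is normalizing every timestamp to a single reference before correlating logs from different systems. A small sketch using only the Python standard library, assuming ISO 8601 timestamps with explicit UTC offsets:

```python
from datetime import datetime, timezone

def to_utc(ts: str) -> datetime:
    """Parse an ISO 8601 timestamp carrying an offset and normalize to UTC,
    so events from systems in different time zones can be compared directly."""
    return datetime.fromisoformat(ts).astimezone(timezone.utc)

# The same instant, recorded by two systems in different time zones:
a = to_utc("2024-03-01T10:00:00+01:00")
b = to_utc("2024-03-01T09:00:00+00:00")
assert a == b
```

Without this normalization (or with unsynchronized clocks), cross-system event ordering, and therefore incident reconstruction, becomes unreliable.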

In addition, responsibilities and objectives need to be defined, with the emphasis on compliance, risk management and uptime. A structured timetable, punctuated by management reviews and input from stakeholders, will enable progress to be made. Agility remains essential, as continuous optimisation through data-driven iterations ensures adaptability to changing regulations and emerging risks. This process cultivates a culture of continuous improvement, strengthening operational resilience and cyber security posture.

Tool :

Harnessing the potential of log analysis requires the use of specific tools, which balance accessibility and security. Secure access and complete dissection of logs are the foundation of informed decision-making. The tools chosen must not only have solid analysis capabilities, but also facilitate transparent collaboration between stakeholders. Enabling operational, IT, security and compliance teams to interact with data promotes responsiveness to emerging threats and compliance adjustments.

Log analysis should be supported by log monitoring

To adopt a multi-dimensional approach and improve a company's security and compliance posture, it is essential to combine analysis and log monitoring.

Monitoring logs enables companies to ensure that their confidentiality, integrity and availability remain aligned with their security policies. We all know the "garbage in, garbage out" principle, and log monitoring aims to solve that issue. Log monitoring should be applied continuously and connected to alerting systems that fire when a deviation, whether in specific behaviors or in trends, is detected. Log analysis, on the other hand, tackles the insights and controls that can be derived from log signals. Ranging from operational excellence to compliance and cybersecurity monitoring, companies can unlock a large number of use cases with log analysis.

Log monitoring control examples: monitor access to log files, monitor actions on log files, monitor log file volumes, etc.

Log analysis control examples: monitor user behaviors in your systems or apps, monitor network logs for potentially risky connections, monitor authentication configuration to ensure alignment with your security policy, etc.
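As a concrete illustration of the first kind of control, monitoring "actions on log files" can be as simple as recording a baseline (size plus a hash of the bytes seen so far) and re-checking it: legitimate appends leave the existing prefix intact, while truncation or rewriting trips the check. This is a sketch under the assumption of append-only log files, not a full monitoring system:

```python
import hashlib
from pathlib import Path

def baseline(path: Path) -> tuple[int, str]:
    """Record the current size and SHA-256 digest of the log file."""
    data = path.read_bytes()
    return len(data), hashlib.sha256(data).hexdigest()

def check(path: Path, base: tuple[int, str]) -> bool:
    """Return True if the file only grew by appending since the baseline.
    Truncation or rewriting of previously recorded bytes returns False,
    a deviation worth wiring to an alerting system."""
    size, digest = base
    data = path.read_bytes()
    if len(data) < size:
        return False  # the log shrank: possible erasure
    return hashlib.sha256(data[:size]).hexdigest() == digest
```

Run periodically (e.g. from a scheduler), refreshing the baseline after each successful check, this gives a cheap tamper-evidence signal on any log file.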

The difficulty of monitoring Edge Systems

The emergence of edge systems has reshaped industries, expanding the horizons of computing and data processing. However, these technologies also make log monitoring more complex.

Firstly, networks are increasingly segmented, which poses a monitoring problem. Coordinating data collection, orchestrating aggregation and analyzing information within isolated segments requires a complex network architecture. Across all these edge systems, logs are heterogeneous: logs from different edge systems arrive in different formats, structures and contexts. Unifying and analyzing them requires log management solutions capable of deciphering, normalizing and correlating multi-faceted data streams. In addition, enterprises are increasingly distributed, which raises the importance of rapid detection and a local response to alerts and problems. However, uninterrupted monitoring of dispersed devices is technically complex to achieve.

Many enterprises rely on large-scale engineering projects to solve ad-hoc issues. Everything starts to look like a centralized logging problem when you have a Splunk hammer in your hands.

How long should I retain my logs?

The ISO 27001 log retention period is three years; this must be included in your data retention policy.
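A retention policy is only useful if it's enforced. The sketch below flags log files that have aged out of the three-year window described above; the `*.log` naming convention and the use of file modification time are assumptions for illustration (real policies usually key off the event dates inside the logs):

```python
import time
from pathlib import Path

# Three-year retention window, per the policy above
# (365-day years; leap days ignored for simplicity).
RETENTION_SECONDS = 3 * 365 * 24 * 3600

def expired_logs(log_dir: Path, now=None):
    """Return log files whose last modification predates the retention
    window, as candidates for archival or deletion review."""
    if now is None:
        now = time.time()
    return [
        p for p in sorted(log_dir.glob("*.log"))
        if now - p.stat().st_mtime > RETENTION_SECONDS
    ]
```

Whether expired logs are deleted or moved to cold storage should follow from the same data retention policy, not be decided ad hoc by the script.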

Changes between ISO 27001:2013 and ISO 27001:2022

Following an update of the ISO 27001 standard in 2022, the controls to be implemented for logging have been modified.

In the ISO 27001:2013 version, three different controls covered all the topics related to log management:

A.12.4.1 Event Logging

“Event logs should be produced, retained, and regularly reviewed to record user activities, exceptions, defects, and information security events.”

A.12.4.2 Protection of Log Information

“Logging and log information should be secure from intrusion and unauthorized access.”

A.12.4.3 Administrator and Operator Logs

“Administrator and operator logs are produced, regularly reviewed, and protected.”

The October 2022 revision of the standard therefore merged all the above controls into a single control covering all logging requirements.

The logging control in ISO 27001:2022 is now control 8.15 called Logging.

How Trout Software can help you to meet this requirement 🎣

📈Flexible deployment

With a single command line, a developer can deploy a robust monitoring solution, eliminating the complexities often associated with implementing data pipelines, data lakes and their analytics capabilities.

Trout Software's platform leverages the power of WebAssembly, a technology renowned for its lightweight footprint and resource efficiency. This architectural choice ensures that Operation Hub integrates with a company's existing infrastructure (it does not dictate an architecture, but monitors an existing one), accelerating compliance efforts without impeding operational productivity.

A hallmark feature of Operation Hub is its capacity to correlate logs from heterogeneous data sources, all without the need for preliminary parsing. This intelligent capability empowers cybersecurity professionals to glean meaningful insights rapidly, fostering agile decision-making in line with ISO 27001 requirements.

🛡️ Protect your logs

Trout Software's Operation Hub allows companies to monitor not only their logs but also operating system files. By creating a notebook and monitoring access to log files and the actions taken against them, a company can ensure ISO compliance quickly and continuously.

🔎 Quickly analyze logs

For simplified log analysis, Trout Software has developed a no-code interface that facilitates the swift execution of intricate analyses by users. All that's required is the connection of their machine/software to the Trout Software platform through pre-built connectors.

This software empowers users to query their various data sources directly, eliminating the need for preliminary ingestion pipelines and conserving valuable resources.

Once the software or machine is successfully linked to the platform, users can initiate their investigation by crafting a playbook.

Here's the process:

  • Open a playbook.
  • Establish connections with the desired data sources.
  • Start the analytical process.

Upon gaining an overview of the data, the no-code interface becomes instrumental in performing rapid analyses of log information. This interface encompasses all the functions detailed in the image below.

[Image: log analysis functions in the no-code interface]

To add extra context about any environment, users can also drag and drop another data source (CSV, text or log files) into their analysis.

👨‍💻 Continuously monitor logs

Trout Software's platform facilitates the monitoring of logs through the implementation of automated controls.

The process of automating log controls equips you with the capability to promptly receive alerts in the event of any deviation detected based on predefined rules. This approach enables the proactive enhancement of security measures.

Here's how it works:

  • After creating the designated playbook, as outlined earlier,
  • Navigate to the "Schedule" segment within the Security Hub.

The automation of checks is a streamlined procedure encompassing just three steps:

  • Selection of the specific playbook established earlier.
  • Configuration of control parameters.
  • Initiation of the automation process by selecting the "Schedule" option.

