How Quality Data Builds the Foundation for Machine Learning in Cybersecurity

July 14, 2021

Machine learning (ML) is a subset of artificial intelligence that mimics the way humans learn by using data and algorithms. It's no secret that machine learning is an excellent tool for assessing and preventing cybersecurity threats. With the proper data, it can help companies avoid falling victim to repeat attacks by identifying patterns in system behavior. Machine learning also supports predictive modeling, which is very effective for detecting attacks as they happen and can even help companies prevent new ones.
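Pattern-based detection like this can start very simply: establish a baseline of normal activity, then flag values that deviate sharply from it. The sketch below is a minimal illustration of that idea, not a production detector; the failed-login counts and the three-sigma threshold are assumptions made up for the example:

```python
import statistics

def flag_anomaly(history, current, threshold=3.0):
    """Return True if `current` deviates from the historical baseline
    by more than `threshold` standard deviations (a z-score test)."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return current != mean
    return abs(current - mean) / stdev > threshold

# Hypothetical baseline: failed logins per hour during a normal week.
baseline = [3, 5, 4, 6, 2, 5, 4, 3, 5, 4]
print(flag_anomaly(baseline, 4))    # → False (a typical hour)
print(flag_anomaly(baseline, 250))  # → True (a burst of failed logins)
```

Note how directly this depends on data quality: if the baseline itself contains wrong, duplicated, or stale values, the "normal" the model learns is wrong, and both false alarms and missed attacks follow.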

With all of this functionality, it may seem like machine learning is the key to enhancing your company's cybersecurity. However, one critical factor determines whether machine learning will be a useful tool: good quality data, because you get out what you put in. When your data is bad or incomplete, machine learning cannot provide accurate insights into what is happening in your environment, which hinders your company's ability to respond quickly to cyberattacks or prevent them altogether.

You can determine the quality of your data set by evaluating it against these six dimensions of data quality:

1. Accuracy

Definition: The data values are correct.

Importance to cybersecurity: Accuracy may be the most critical dimension of the six. If the values you feed your machine learning tool are incorrect, none of the results it produces will be useful, because they are built on bad data.

2. Completeness

Definition: The data contains all expected and required values.

Importance to cybersecurity: Machine learning algorithms help you identify patterns in your company's data, and complete data makes those patterns easier to find. Spotting patterns matters in cybersecurity because it helps your company notice attacks earlier.

3. Consistency

Definition: The same data value does not change within a data set.

Importance to cybersecurity: Consistency is essential for addressing new risks that resemble events that normally happen within your company. If the same data values change within a data set, it becomes harder to establish what regular user activity looks like.

4. Timeliness

Definition: The most up-to-date data that is needed is readily available.

Importance to cybersecurity: Timeliness is important for detecting network anomalies. If your company cannot access its most recent network data, it will be harder to model normal network behavior and to detect when an anomaly occurs.

5. Uniqueness

Definition: There is no duplicate data.

Importance to cybersecurity: Uniqueness is important for modeling normal behavior and recognizing a cyberthreat within your company. Duplicate data makes normal behavior harder to model because events appear to occur more often than they actually do.

6. Validity

Definition: Each data value matches the definition of the real-world thing it represents.

Importance to cybersecurity: Validity is important for automated incident response. If the data your company uses does not match the real-world things it represents, the automated cybersecurity responses built on that data will not be effective.
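Several of these dimensions can be measured automatically before data ever reaches a model. The sketch below scores completeness, uniqueness, and validity for a batch of security events; the field names (`timestamp`, `src_ip`, `port`) and the sample records are hypothetical, chosen only to illustrate the checks:

```python
# Hypothetical schema: each event is a dict with these required fields.
REQUIRED_FIELDS = {"timestamp", "src_ip", "port"}

def completeness(records):
    """Fraction of records containing every required field (non-None)."""
    ok = sum(1 for r in records
             if all(r.get(f) is not None for f in REQUIRED_FIELDS))
    return ok / len(records)

def uniqueness(records):
    """Fraction of records that are not exact duplicates of another."""
    distinct = {tuple(sorted(r.items())) for r in records}
    return len(distinct) / len(records)

def validity(records):
    """Fraction of records whose port is a legal TCP/UDP port number."""
    ok = sum(1 for r in records
             if isinstance(r.get("port"), int) and 0 <= r["port"] <= 65535)
    return ok / len(records)

# Made-up sample batch exercising each check.
events = [
    {"timestamp": 1626264000, "src_ip": "10.0.0.5", "port": 443},
    {"timestamp": 1626264000, "src_ip": "10.0.0.5", "port": 443},  # duplicate
    {"timestamp": 1626264060, "src_ip": None, "port": 99999},      # incomplete, invalid
]
print(completeness(events), uniqueness(events), validity(events))
```

Each function returns a score between 0 and 1, so a pipeline could refuse to retrain a model, or raise an alert, when any score drops below an agreed threshold. Accuracy, consistency, and timeliness generally need external context (ground truth, history, clock data) and are harder to score from a single batch.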

Machine learning is already changing the world of cybersecurity and has the potential to enhance it even more. Though it may cost your company more time and money now, obtaining good quality data will give you the necessary foundation to use ML effectively. Only then will your company gain the full range of benefits that machine learning has to offer.

To learn more about how accurate data builds the foundation for machine learning use cases in cybersecurity, including security orchestration, automation, and response (SOAR), join our seminar with CS2AI and Splunk on August 18 at 1 PM EDT.