Log Sensitive Data Scrubbing and Scanning on Datadog - Vsceptre

6 Sep 2023

Observability, Security

In today’s digital landscape, data security and privacy have become paramount concerns for businesses and individuals alike. With the increasing reliance on cloud-based services and the need to monitor and analyze application logs, it is crucial to ensure that sensitive data remains protected. Datadog offers robust features to help organizations track and analyze their logs effectively. However, it is equally important to understand how to hide sensitive data from logs on Datadog to prevent unauthorized access and potential data breaches. In this blog post, we will explore various techniques and best practices to safeguard sensitive information within your logs while benefiting from the powerful capabilities of Datadog. By implementing these strategies, you can maintain data privacy, comply with regulations, and build customer trust.

Configuring Log Collection to Achieve Sensitive Data Scrubbing

As an example, you can hide IP addresses in log entries by adding a scrubbing rule to the Datadog Agent configuration (with log collection enabled). Suppose we are running a Python app: create a file conf.yaml in the conf.d/python.d/ directory with the following content:
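The original configuration file is not shown, so here is a sketch of what such a conf.yaml might look like, using the agent's mask_sequences log processing rule. The log path and service name below are placeholders for illustration; adjust them to your own application:

```yaml
logs:
  - type: file
    path: /var/log/myapp/app.log   # placeholder path to your app's log file
    service: my-python-app          # placeholder service name
    source: python
    log_processing_rules:
      # Replace anything that looks like an IPv4 address before the
      # log line is shipped to Datadog.
      - type: mask_sequences
        name: mask_ip_addresses
        replace_placeholder: "[masked_ip]"
        pattern: (?:[0-9]{1,3}\.){3}[0-9]{1,3}
```

The scrubbing happens agent-side, so the raw IP addresses never leave the host.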

Restart the agent to pick up the new settings, then check the logs collected in Datadog:

You can see that the IP addresses are masked in the logs collected in Datadog.

Sensitive Data Scanner on Datadog: Protecting Your Logs

Another feature offered by Datadog to help safeguard sensitive data within logs is the Sensitive Data Scanner. This powerful tool allows you to automatically scan your logs for specific patterns or keywords that may indicate the presence of sensitive information such as credit card numbers, social security numbers, or personal identification information.

The Sensitive Data Scanner works by leveraging regular expressions and predefined patterns to identify and flag any potential matches within your log data. By configuring the scanner to search for specific patterns, you can ensure that sensitive data is promptly detected and appropriately handled.
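To illustrate the idea behind regex-based scanning, here is a minimal Python sketch of pattern matching and redaction. The patterns below are simplified stand-ins, not Datadog's actual library rules, and the function names are invented for this example:

```python
import re

# Illustrative patterns similar in spirit to scanning rules for
# credit card numbers and US social security numbers. These are
# simplified sketches, not Datadog's production rule definitions.
PATTERNS = {
    "credit_card": re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_and_redact(line: str) -> str:
    """Replace any matched sensitive value with a [REDACTED:<rule>] tag."""
    for name, pattern in PATTERNS.items():
        line = pattern.sub(f"[REDACTED:{name}]", line)
    return line

print(scan_and_redact("payment card=4111 1111 1111 1111 accepted"))
```

In Datadog the equivalent logic runs inside the platform as your logs are ingested, and a rule can be configured either to redact the match, hash it, or just tag the log for follow-up.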

To use the Sensitive Data Scanner:

  1. Go to the Sensitive Data Scanner page and click on Add a scanning group.

  2. Create a scanning rule.

Scanning rules are used to identify specific sensitive information within the data. In a scanning group, you can either choose from pre-existing scanning rules in Datadog’s Scanning Rule Library or create your own rules using custom regex patterns for scanning.
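As an example of a custom regex pattern you might paste into a rule, here is a hypothetical rule for AWS access key IDs (the format AKIA followed by 16 uppercase alphanumerics); the surrounding Python code only demonstrates what the pattern matches, using AWS's documented example key:

```python
import re

# Hypothetical custom scanning rule: AWS access key IDs.
# The regex itself is what you would enter in the rule's pattern field.
AWS_ACCESS_KEY = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

log_line = "auth failed for key AKIAIOSFODNN7EXAMPLE from 10.0.0.5"
match = AWS_ACCESS_KEY.search(log_line)
print(match.group() if match else "no match")  # prints AKIAIOSFODNN7EXAMPLE
```

Testing a candidate pattern against sample log lines like this, before saving the rule, helps avoid both false positives and missed matches.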

Finally, you can view the out-of-the-box Sensitive Data Scanner Overview dashboard for a summary of matches across your organization.

In conclusion, implementing sensitive data scrubbing and scanning for your logs on Datadog is a crucial step toward ensuring data privacy and security. By defining scanning rules and utilizing custom regex patterns, you can effectively identify and protect sensitive information within your logs. With the flexibility of Datadog’s Scanning Rule Library, you have the tools necessary to safeguard your data and maintain compliance with privacy regulations. Start implementing these practices today to enhance your data protection strategy and gain peace of mind.

Further Readings

https://docs.datadoghq.com/agent/logs/advanced_log_collection/?tab=configurationfile#scrub-sensitive-data-from-your-logs

https://www.datadoghq.com/product/sensitive-data-scanner/

