diff --git a/src/content/docs/logs/ui-data/parsing.mdx b/src/content/docs/logs/ui-data/parsing.mdx
index e49de72c06c..d6879442dcb 100644
--- a/src/content/docs/logs/ui-data/parsing.mdx
+++ b/src/content/docs/logs/ui-data/parsing.mdx
@@ -326,6 +326,16 @@ Note that variable names must be explicitly set and be lowercase like `%{URI:uri
Geographic location from IP addresses. See [Geolocating IP addresses (GeoIP)](#geo) for more information.
+
+
+
+ `key value pairs`
+ |
+
+
+ Key-value pairs. See [Parsing key-value pairs](#parsing-key-value-pairs) for more information.
+ |
+
@@ -527,6 +537,131 @@ Note that variable names must be explicitly set and be lowercase like `%{URI:uri
* **region**: Abbreviation of state, province, or territory
* **regionName**: Name of state, province, or territory
+
+
+ The New Relic logs pipeline parses your log messages by default, but sometimes log messages are formatted as key-value pairs. In this situation, you may want to parse them and then filter using the key-value attributes.
+
+ If that is the case, you can use the `key value pairs` [grok type](#grok-syntax), which parses the key-value pairs captured by the grok pattern. This format relies on three main parts: the grok syntax, the prefix you would like to assign to the parsed key-value attributes, and the `keyvalue()` grok type. With it, you can extract and parse key-value pairs from logs that are not properly formatted; for example, logs prefixed with a date/time string:
+
+ ```
+ 2015-05-13T23:39:43.945958Z key1=value1,key2=value2,key3=value3
+ ```
+
+ To extract and parse the key-value data from this log format, use the following Grok expression:
+
+ ```
+ %{TIMESTAMP_ISO8601:containerTimestamp} %{GREEDYDATA:my_attribute_prefix:keyvalue()}
+ ```
+
+ The resulting log is:
+
+ ```
+ containerTimestamp: "2015-05-13T23:39:43.945958Z"
+ my_attribute_prefix.key1: "value1"
+ my_attribute_prefix.key2: "value2"
+ my_attribute_prefix.key3: "value3"
+ ```
+
+ You can also define a custom delimiter and key-value separator to extract the key-value pairs you need. For example, given this log:
+
+ ```
+ 2015-05-13T23:39:43.945958Z event:TestRequest request:bar
+ ```
+
+ use the following Grok expression:
+
+ ```
+ %{TIMESTAMP_ISO8601:containerTimestamp} %{GREEDYDATA:my_attribute_prefix:keyvalue({"delimiter": " ", "keyValueSeparator": ":"})}
+ ```
+
+ The resulting log is:
+
+ ```
+ containerTimestamp: "2015-05-13T23:39:43.945958Z"
+ my_attribute_prefix.event: "TestRequest"
+ my_attribute_prefix.request: "bar"
+ ```
+
+ If you want to omit the `my_attribute_prefix` prefix, include `"noPrefix": true` in the configuration:
+
+ ```
+ %{TIMESTAMP_ISO8601:containerTimestamp} %{GREEDYDATA:my_attribute_prefix:keyvalue({"noPrefix": true})}
+ ```
+
+ The resulting log is:
+
+ ```
+ containerTimestamp: "2015-05-13T23:39:43.945958Z"
+ event: "TestRequest"
+ request: "bar"
+ ```
+
+ If you want to set a custom quote character, include `"quoteChar"` in the configuration. For example, given this log:
+
+ ```
+ 2015-05-13T23:39:43.945958Z nbn_demo='INFO',message='This message contains information with spaces ,sessionId='abc123'
+ ```
+
+ use the following Grok expression:
+
+ ```
+ %{TIMESTAMP_ISO8601:containerTimestamp} %{GREEDYDATA:my_attribute_prefix:keyvalue({"quoteChar": "'"})}
+ ```
+
+ The resulting log is:
+
+ ```
+ "my_attribute_prefix.message": "'This message contains information with spaces",
+ "my_attribute_prefix.nbn_demo": "INFO",
+ "my_attribute_prefix.sessionId": "abc123"
+ ```
+
+ ### Grok pattern parameters
+
+ You can customize the parsing behavior with the following options to suit your log formats:
+
+ * **delimiter**
+ * **Description:** String separating each key-value pair.
+ * **Default Value:** `,` (comma)
+ * **Override:** Set the field `delimiter` to change this behavior.
+
+ * **keyValueSeparator**
+ * **Description:** String used to assign values to keys.
+ * **Default Value:** `=`
+ * **Override:** Set the field `keyValueSeparator` for custom separator usage.
+
+ * **quoteChar**
+ * **Description:** Character used to enclose values with spaces or special characters.
+ * **Default Value:** `"` (double quote)
+ * **Override:** Define a custom character using `quoteChar`.
+
+ * **dropOriginal**
+ * **Description:** Drops the original log message after parsing. Useful for reducing log storage.
+ * **Default Value:** `true`
+ * **Override:** Set `dropOriginal` to `false` to retain the original log message.
+
+ * **noPrefix**
+ * **Description:** When `true`, excludes the Grok field name as a prefix in the resulting attributes.
+ * **Default Value:** `false`
+ * **Override:** Enable by setting `noPrefix` to `true`.
+
+ * **escapeChar**
+ * **Description:** Define a custom escape character to handle special log characters.
+ * **Default Value:** `\` (backslash)
+ * **Override:** Customize with `escapeChar`.
+
+ * **trimValues**
+ * **Description:** Allows trimming of values that contain whitespace.
+ * **Default Value:** `false`
+ * **Override:** Set `trimValues` to `true` to activate trimming.
+
+ * **trimKeys**
+ * **Description:** Allows trimming of keys that contain whitespace.
+ * **Default Value:** `true`
+ * **Override:** Set `trimKeys` to `false` to disable trimming.
+
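+ These parameters can be combined in a single configuration. As a sketch (the `containerTimestamp` and `my_attribute_prefix` names are illustrative), the following expression uses a space delimiter, trims values, and keeps the original log message:
+
+ ```
+ %{TIMESTAMP_ISO8601:containerTimestamp} %{GREEDYDATA:my_attribute_prefix:keyvalue({"delimiter": " ", "trimValues": true, "dropOriginal": false})}
+ ```
+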
## Organizing by logtype [#type]
diff --git a/src/content/docs/release-notes/logs-release-notes/logs-25-01-10.mdx b/src/content/docs/release-notes/logs-release-notes/logs-25-01-10.mdx
new file mode 100644
index 00000000000..d488bc5feca
--- /dev/null
+++ b/src/content/docs/release-notes/logs-release-notes/logs-25-01-10.mdx
@@ -0,0 +1,17 @@
+---
+subject: Logs
+releaseDate: '2025-01-10'
+version: '250110'
+---
+
+### New Feature: Key-Value Parsing with Grok
+
+We are excited to introduce key-value parsing with Grok in our latest release. This feature allows you to extract key-value pairs from your logs more efficiently, enhancing your log management capabilities.
+
+### Added
+
+* **Key-Value Parsing with Grok**: You can now use Grok patterns to parse key-value pairs from your logs. This feature simplifies the extraction of structured data from unstructured log messages, making it easier to analyze and visualize your log data.
+
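+ As a sketch (the `containerTimestamp` and `my_attribute_prefix` names are illustrative), a grok expression using the new type to parse `key1=value1,key2=value2` pairs looks like this:
+
+ ```
+ %{TIMESTAMP_ISO8601:containerTimestamp} %{GREEDYDATA:my_attribute_prefix:keyvalue()}
+ ```
+
+ See the [parsing documentation](/docs/logs/ui-data/parsing) for the full list of supported options.
+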
+### Notes
+
+To stay up to date with the most recent fixes and enhancements, subscribe to our [Logs RSS feed](/docs/release-notes/logs-release-notes/).