
[Backport 2.x] Correct handling of null max aggregation values in SearchResponse #1293

Merged
1 commit merged into 2.x on Sep 3, 2024

Conversation

opensearch-trigger-bot[bot]

Backport e47e3c3 from #1292.


Previously, when the max aggregation in SearchResponse returned a null value, it was converted to a negative number (-9223372036854775808, i.e., Long.MIN_VALUE), leading to erroneous timestamps.

This fix adds a check so that only valid positive aggregation values are considered, preventing null or invalid values from being misinterpreted as negative numbers. The change applies only to timestamps, so filtering for positive values is always correct.
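
The following is a minimal sketch of the kind of guard described above, not the actual patch. It assumes the aggregation value is read through `org.opensearch.search.aggregations.metrics.Max`, and the class and method names (`TimestampUtil`, `extractLatestTimestamp`) are hypothetical:

```
import java.util.Optional;

import org.opensearch.action.search.SearchResponse;
import org.opensearch.search.aggregations.metrics.Max;

// Illustrative only: TimestampUtil and extractLatestTimestamp are hypothetical names.
final class TimestampUtil {
    static Optional<Long> extractLatestTimestamp(SearchResponse response, String aggName) {
        if (response == null || response.getAggregations() == null) {
            return Optional.empty();
        }
        Max max = response.getAggregations().get(aggName);
        if (max == null) {
            return Optional.empty();
        }
        double value = max.getValue();
        // With no matching documents the max aggregation reports -Infinity; casting that to
        // long yields -9223372036854775808 (Long.MIN_VALUE). Epoch-millisecond timestamps are
        // always positive, so only finite positive values are accepted.
        if (Double.isNaN(value) || Double.isInfinite(value) || value <= 0) {
            return Optional.empty();
        }
        return Optional.of((long) value);
    }
}
```

Callers can then treat an empty Optional as "no historical data" rather than parsing a bogus negative epoch value as a date.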

Testing done:
1. Added unit tests to cover scenarios where the max aggregation returns null or negative values, ensuring the method returns Optional.empty() when the value is invalid (see the sketch after this list).
2. Added unit tests to verify correct handling of valid positive max aggregation values.
3. Ran e2e manual testing.
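
As a rough illustration of the first test scenario, a sketch like the following could exercise the hypothetical helper shown above (JUnit 4 and Mockito assumed; the actual tests in the PR may differ):

```
import static org.junit.Assert.assertFalse;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Collections;

import org.junit.Test;
import org.opensearch.action.search.SearchResponse;
import org.opensearch.search.aggregations.Aggregations;
import org.opensearch.search.aggregations.metrics.Max;

public class TimestampUtilTests {

    @Test
    public void testLongMinValueIsTreatedAsMissing() {
        // A max aggregation whose value is Long.MIN_VALUE (what a null/empty result used to become).
        Max max = mock(Max.class);
        when(max.getName()).thenReturn("timestamp_max");
        when(max.getValue()).thenReturn((double) Long.MIN_VALUE);

        SearchResponse response = mock(SearchResponse.class);
        when(response.getAggregations()).thenReturn(new Aggregations(Collections.singletonList(max)));

        // The helper should report "no data" instead of a bogus negative timestamp.
        assertFalse(TimestampUtil.extractLatestTimestamp(response, "timestamp_max").isPresent());
    }
}
```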

Previously:

```
POST /_plugins/_anomaly_detection/detectors/_validate/model
{
    "name": "test-detector-13",
    "description": "Test detector",
    "time_field": "customer_birth_date",
    "indices": ["opensearch_dashboards_sample_data_ecommerce"],
    "feature_attributes": [
        {
            "feature_name": "cpu",
            "feature_enabled": true,
            "aggregation_query": {
                "total_revenue_usd-field": {
                    "max": {
                        "field": "day_of_week_i"
                    }
                }
            }
        }
    ],
    "detection_interval": {
        "period": {
            "interval":1,
            "unit": "Minutes"
        }
    },
    "window_delay": {
        "period": {
            "interval": 1,
            "unit": "Minutes"
        }
    },
    "category_field": ["currency"]
}
```

returns:

```
nested: OpenSearchParseException[failed to parse date field [-9223372036854775808] with format [strict_date_optional_time||epoch_millis]: [failed to parse date field [-9223372036854775808] with format [strict_date_optional_time||epoch_millis]]];
nested: IllegalArgumentException[failed to parse date field [-9223372036854775808] with format [strict_date_optional_time||epoch_millis]];
nested: DateTimeParseException[Failed to parse with all enclosed parsers]; }
opensearch-ccs-node1 | [2024-09-03T15:05:45,776][ERROR][o.o.t.r.h.ModelValidationActionHandler] [2a2cd14da04d] Failed to create search request for last data point
opensearch-ccs-node1 | org.opensearch.action.search.SearchPhaseExecutionException: all shards failed
opensearch-ccs-node1 |     at org.opensearch.action.search.AbstractSearchAsyncAction.onPhaseFailure(AbstractSearchAsyncAction.java:770) [opensearch-3.0.0.jar:3.0.0]
```

Now the call returns:

```
{ "model": { "time_field": { "message": "There isn't enough historical data found with current timefield selected." } } }
```

Signed-off-by: Kaituo Li <[email protected]>
(cherry picked from commit e47e3c3)
Signed-off-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
amitgalitz merged commit 3702497 into 2.x on Sep 3, 2024
22 of 27 checks passed