Changes default version to 2.0.0-alpha1 and fixes CVE-2020-36518 #478

Merged
22 changes: 12 additions & 10 deletions .github/workflows/CI.yml
@@ -11,7 +11,7 @@ jobs:
Build-ad:
strategy:
matrix:
java: [11, 14]
java: [11]
fail-fast: false

name: Build and Test Anomaly detection Plugin
@@ -29,20 +29,22 @@ jobs:

- name: Assemble anomaly-detection
run: |
./gradlew assemble -Dopensearch.version=2.0.0-SNAPSHOT
echo "Creating ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/2.0.0.0-SNAPSHOT ..."
mkdir -p ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/2.0.0.0-SNAPSHOT
echo "Copying ./build/distributions/*.zip to ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/2.0.0.0-SNAPSHOT ..."
plugin_version=`./gradlew properties -q | grep "opensearch_build:" | awk '{print $2}'`
echo plugin_version $plugin_version
./gradlew assemble
echo "Creating ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/$plugin_version ..."
mkdir -p ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/$plugin_version
echo "Copying ./build/distributions/*.zip to ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/$plugin_version ..."
ls ./build/distributions/
cp ./build/distributions/*.zip ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/2.0.0.0-SNAPSHOT
echo "Copied ./build/distributions/*.zip to ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/2.0.0.0-SNAPSHOT ..."
ls ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/2.0.0.0-SNAPSHOT
cp ./build/distributions/*.zip ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/$plugin_version
echo "Copied ./build/distributions/*.zip to ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/$plugin_version ..."
ls ./src/test/resources/org/opensearch/ad/bwc/anomaly-detection/$plugin_version
- name: Build and Run Tests
run: |
./gradlew build -Dopensearch.version=2.0.0-SNAPSHOT
./gradlew build
- name: Publish to Maven Local
run: |
./gradlew publishToMavenLocal -Dopensearch.version=2.0.0-SNAPSHOT
./gradlew publishToMavenLocal
- name: Multi Nodes Integration Testing
run: |
./gradlew integTest -PnumNodes=3
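The `plugin_version` line in the workflow above scrapes the `opensearch_build` property out of `./gradlew properties -q` output with `grep` and `awk`. A plain-Java sketch of the same extraction (the class and method names are illustrative, and the sample output format is assumed to match what Gradle prints):

```java
public class PluginVersion {
    // Equivalent of: ./gradlew properties -q | grep "opensearch_build:" | awk '{print $2}'
    // Scans the properties output line by line and returns the second
    // whitespace-separated token of the matching line, or null if absent.
    static String pluginVersion(String gradlePropertiesOutput) {
        for (String line : gradlePropertiesOutput.split("\n")) {
            if (line.contains("opensearch_build:")) {
                return line.trim().split("\\s+")[1];
            }
        }
        return null;
    }

    public static void main(String[] args) {
        String out = "opensearch_group: org.opensearch\n"
            + "opensearch_build: 2.0.0.0-alpha1-SNAPSHOT\n";
        System.out.println(pluginVersion(out)); // 2.0.0.0-alpha1-SNAPSHOT
    }
}
```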
25 changes: 17 additions & 8 deletions build.gradle
@@ -16,9 +16,18 @@ import org.opensearch.gradle.testclusters.StandaloneRestIntegTestTask
buildscript {
ext {
opensearch_group = "org.opensearch"
opensearch_version = System.getProperty("opensearch.version", "2.0.0-SNAPSHOT")
// 1.2.0 -> 1.2.0.0, and 1.2.0-SNAPSHOT -> 1.2.0.0-SNAPSHOT
opensearch_build = opensearch_version.replaceAll(/(\.\d)([^\d]*)$/, '$1.0$2')
isSnapshot = "true" == System.getProperty("build.snapshot", "true")
opensearch_version = System.getProperty("opensearch.version", "2.0.0-alpha1-SNAPSHOT")
Review thread on this line:

Collaborator: Infra team fixed this part; you can refer to this PR: https://github.com/opensearch-project/ml-commons/pull/262/files

Member (author): Changed it to be the same as ml-commons.

ylwu-amzn (Collaborator, Mar 30, 2022): Thanks. Btw, make sure the built AD artifact name is opensearch-anomaly-detection-2.0.0.0-alpha1-SNAPSHOT. I see line 633, version = "${project.version}" - "-SNAPSHOT", and I am not sure what project.version is.

amitgalitz (Member, author, Mar 31, 2022): It actually gives a zip of opensearch-anomaly-detection-2.0.0.0-SNAPSHOT. Same case for the ml-commons built zip (just tried on the latest code). Will investigate this; changing line 633 seems to have no effect.

Collaborator: Oh, I guess the infra team's fix broke this part for ml-commons; will take a look.

ylwu-amzn (Collaborator, Mar 31, 2022): For ml-commons, this can generate an artifact with alpha1: ./gradlew clean; ./gradlew build -Dbuild.version_qualifier=alpha1. But the artifact generated by plain ./gradlew build has no alpha1 in its name, even when we set opensearch_version = System.getProperty("opensearch.version", "2.0.0-alpha1-SNAPSHOT"). @peterzhuamazon is this what we expected?

Collaborator: From @peterzhuamazon, we are going to move to rc1 soon. So just keep this code; we can change to rc1 later.

buildVersionQualifier = System.getProperty("build.version_qualifier")
// 2.0.0-alpha1-SNAPSHOT -> 2.0.0.0-alpha1-SNAPSHOT
version_tokens = opensearch_version.tokenize('-')
opensearch_build = version_tokens[0] + '.0'
if (buildVersionQualifier) {
opensearch_build += "-${buildVersionQualifier}"
}
if (isSnapshot) {
opensearch_build += "-SNAPSHOT"
}
common_utils_version = System.getProperty("common_utils.version", opensearch_build)
job_scheduler_version = System.getProperty("job_scheduler.version", opensearch_build)
}
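The qualifier handling introduced above (tokenize the OpenSearch version, append the extra plugin digit, then re-attach the qualifier and snapshot suffix) can be sketched as a plain-Java helper. The class and method names here are illustrative, not part of the build:

```java
// Sketch of the build.gradle version logic: opensearch_version -> opensearch_build.
// E.g. "2.0.0-alpha1-SNAPSHOT" with qualifier "alpha1" and isSnapshot=true
// becomes "2.0.0.0-alpha1-SNAPSHOT".
public class OpenSearchBuildVersion {
    static String opensearchBuild(String opensearchVersion, String qualifier, boolean isSnapshot) {
        // Keep only the numeric core (tokenize('-')[0]), then append the plugin digit ".0".
        String build = opensearchVersion.split("-")[0] + ".0";
        if (qualifier != null && !qualifier.isEmpty()) {
            build += "-" + qualifier;   // build.version_qualifier, e.g. alpha1
        }
        if (isSnapshot) {
            build += "-SNAPSHOT";       // build.snapshot defaults to true
        }
        return build;
    }

    public static void main(String[] args) {
        System.out.println(opensearchBuild("2.0.0-alpha1-SNAPSHOT", "alpha1", true));
        // -> 2.0.0.0-alpha1-SNAPSHOT
    }
}
```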
@@ -106,7 +115,7 @@ configurations.all {
if (it.state != Configuration.State.UNRESOLVED) return
resolutionStrategy {
force "joda-time:joda-time:${versions.joda}"
force "com.fasterxml.jackson.core:jackson-core:2.12.6"
force "com.fasterxml.jackson.core:jackson-core:2.13.2"
force "commons-logging:commons-logging:${versions.commonslogging}"
force "org.apache.httpcomponents:httpcore:${versions.httpcore}"
force "commons-codec:commons-codec:${versions.commonscodec}"
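For context, CVE-2020-36518 is a StackOverflowError in jackson-databind when deserializing deeply nested JSON; forcing 2.13.2 core (and 2.13.2.2 databind, below) pulls in the patched parser. A minimal sketch of the input shape the advisory describes, with no Jackson dependency; this only builds the document, and the readValue call mentioned in the comment is hypothetical usage:

```java
// Builds the kind of deeply nested JSON document that CVE-2020-36518 describes:
// pre-patch jackson-databind could overflow the stack while binding it to a Map.
public class NestedJson {
    static String deeplyNested(int depth) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < depth; i++) {
            sb.append("{\"a\":");   // open one nesting level per iteration
        }
        sb.append("1");             // innermost scalar value
        for (int i = 0; i < depth; i++) {
            sb.append("}");         // close every level
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // On unpatched versions, something like
        //   mapper.readValue(deeplyNested(100_000), Map.class)
        // could throw StackOverflowError (Jackson not included in this sketch).
        System.out.println(deeplyNested(3)); // {"a":{"a":{"a":1}}}
    }
}
```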
@@ -315,7 +324,7 @@ String bwcFilePath = "src/test/resources/org/opensearch/ad/bwc/"
testClusters {
"${baseName}$i" {
testDistribution = "ARCHIVE"
versions = ["7.10.2","1.3.0-SNAPSHOT"]
versions = ["7.10.2", "2.0.0-alpha1-SNAPSHOT"]
numberOfNodes = 3
plugin(provider(new Callable<RegularFile>(){
@Override
@@ -584,9 +593,9 @@ dependencies {

// force Jackson version to avoid version conflict issue
implementation 'software.amazon.randomcutforest:randomcutforest-serialization:2.0.1'
implementation "com.fasterxml.jackson.core:jackson-core:2.12.6"
implementation "com.fasterxml.jackson.core:jackson-databind:2.12.6"
implementation "com.fasterxml.jackson.core:jackson-annotations:2.12.6"
implementation "com.fasterxml.jackson.core:jackson-core:2.13.2"
implementation "com.fasterxml.jackson.core:jackson-databind:2.13.2.2"
implementation "com.fasterxml.jackson.core:jackson-annotations:2.13.2"
implementation files('lib/randomcutforest-parkservices-2.0.1.jar')
implementation files('lib/randomcutforest-core-2.0.1.jar')

@@ -533,7 +533,7 @@ public void initAnomalyDetectorIndexIfAbsent(ActionListener<CreateIndexResponse>
*/
public void initAnomalyDetectorIndex(ActionListener<CreateIndexResponse> actionListener) throws IOException {
CreateIndexRequest request = new CreateIndexRequest(AnomalyDetector.ANOMALY_DETECTORS_INDEX)
.mapping(AnomalyDetector.TYPE, getAnomalyDetectorMappings(), XContentType.JSON)
.mapping(getAnomalyDetectorMappings(), XContentType.JSON)
.settings(settings);
adminClient.indices().create(request, markMappingUpToDate(ADIndex.CONFIG, actionListener));
}
@@ -597,7 +597,7 @@ public void initAnomalyResultIndexDirectly(
ActionListener<CreateIndexResponse> actionListener
) throws IOException {
String mapping = getAnomalyResultMappings();
CreateIndexRequest request = new CreateIndexRequest(resultIndex).mapping(CommonName.MAPPING_TYPE, mapping, XContentType.JSON);
CreateIndexRequest request = new CreateIndexRequest(resultIndex).mapping(mapping, XContentType.JSON);
if (alias != null) {
request.alias(new Alias(CommonName.ANOMALY_RESULT_INDEX_ALIAS));
}
@@ -617,7 +617,7 @@ public void initAnomalyResultIndexDirectly(
public void initAnomalyDetectorJobIndex(ActionListener<CreateIndexResponse> actionListener) {
try {
CreateIndexRequest request = new CreateIndexRequest(AnomalyDetectorJob.ANOMALY_DETECTOR_JOB_INDEX)
.mapping(AnomalyDetector.TYPE, getAnomalyDetectorJobMappings(), XContentType.JSON);
.mapping(getAnomalyDetectorJobMappings(), XContentType.JSON);
request
.settings(
Settings
@@ -649,7 +649,7 @@ public void initAnomalyDetectorJobIndex(ActionListener<CreateIndexResponse> actionListener) {
public void initDetectionStateIndex(ActionListener<CreateIndexResponse> actionListener) {
try {
CreateIndexRequest request = new CreateIndexRequest(CommonName.DETECTION_STATE_INDEX)
.mapping(AnomalyDetector.TYPE, getDetectionStateMappings(), XContentType.JSON)
.mapping(getDetectionStateMappings(), XContentType.JSON)
.settings(settings);
adminClient.indices().create(request, markMappingUpToDate(ADIndex.STATE, actionListener));
} catch (IOException e) {
@@ -671,8 +671,7 @@ public void initCheckpointIndex(ActionListener<CreateIndexResponse> actionListener) {
} catch (IOException e) {
throw new EndRunException("", "Cannot find checkpoint mapping file", true);
}
CreateIndexRequest request = new CreateIndexRequest(CommonName.CHECKPOINT_INDEX_NAME)
.mapping(CommonName.MAPPING_TYPE, mapping, XContentType.JSON);
CreateIndexRequest request = new CreateIndexRequest(CommonName.CHECKPOINT_INDEX_NAME).mapping(mapping, XContentType.JSON);
choosePrimaryShards(request);
adminClient.indices().create(request, markMappingUpToDate(ADIndex.CHECKPOINT, actionListener));
}
@@ -729,7 +728,7 @@ void rolloverAndDeleteHistoryIndex() {
}
CreateIndexRequest createRequest = rollOverRequest.getCreateIndexRequest();

createRequest.index(AD_RESULT_HISTORY_INDEX_PATTERN).mapping(CommonName.MAPPING_TYPE, adResultMapping, XContentType.JSON);
createRequest.index(AD_RESULT_HISTORY_INDEX_PATTERN).mapping(adResultMapping, XContentType.JSON);

choosePrimaryShards(createRequest);

@@ -335,34 +335,31 @@ protected void validateTimeField(boolean indexingDryRun) {
// AbstractAnomalyDetectorActionHandler.validateCategoricalField(String, boolean)
ActionListener<GetFieldMappingsResponse> mappingsListener = ActionListener.wrap(getMappingsResponse -> {
boolean foundField = false;
Map<String, Map<String, Map<String, GetFieldMappingsResponse.FieldMappingMetadata>>> mappingsByIndex = getMappingsResponse
.mappings();

for (Map<String, Map<String, GetFieldMappingsResponse.FieldMappingMetadata>> mappingsByType : mappingsByIndex.values()) {
for (Map<String, GetFieldMappingsResponse.FieldMappingMetadata> mappingsByField : mappingsByType.values()) {
for (Map.Entry<String, GetFieldMappingsResponse.FieldMappingMetadata> field2Metadata : mappingsByField.entrySet()) {

GetFieldMappingsResponse.FieldMappingMetadata fieldMetadata = field2Metadata.getValue();
if (fieldMetadata != null) {
// sourceAsMap returns sth like {host2={type=keyword}} with host2 being a nested field
Map<String, Object> fieldMap = fieldMetadata.sourceAsMap();
if (fieldMap != null) {
for (Object type : fieldMap.values()) {
if (type instanceof Map) {
foundField = true;
Map<String, Object> metadataMap = (Map<String, Object>) type;
String typeName = (String) metadataMap.get(CommonName.TYPE);
if (!typeName.equals(CommonName.DATE_TYPE)) {
listener
.onFailure(
new ADValidationException(
String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIMESTAMP, givenTimeField),
DetectorValidationIssueType.TIMEFIELD_FIELD,
ValidationAspect.DETECTOR
)
);
return;
}
Map<String, Map<String, GetFieldMappingsResponse.FieldMappingMetadata>> mappingsByIndex = getMappingsResponse.mappings();

for (Map<String, GetFieldMappingsResponse.FieldMappingMetadata> mappingsByField : mappingsByIndex.values()) {
for (Map.Entry<String, GetFieldMappingsResponse.FieldMappingMetadata> field2Metadata : mappingsByField.entrySet()) {

GetFieldMappingsResponse.FieldMappingMetadata fieldMetadata = field2Metadata.getValue();
if (fieldMetadata != null) {
// sourceAsMap returns sth like {host2={type=keyword}} with host2 being a nested field
Map<String, Object> fieldMap = fieldMetadata.sourceAsMap();
if (fieldMap != null) {
for (Object type : fieldMap.values()) {
if (type instanceof Map) {
foundField = true;
Map<String, Object> metadataMap = (Map<String, Object>) type;
String typeName = (String) metadataMap.get(CommonName.TYPE);
if (!typeName.equals(CommonName.DATE_TYPE)) {
listener
.onFailure(
new ADValidationException(
String.format(Locale.ROOT, CommonErrorMessages.INVALID_TIMESTAMP, givenTimeField),
DetectorValidationIssueType.TIMEFIELD_FIELD,
ValidationAspect.DETECTOR
)
);
return;
}
}
}
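The traversal change above drops one nesting level because OpenSearch 2.0 removes mapping types, so GetFieldMappingsResponse.mappings() now returns a map keyed by field directly. A minimal stand-in for the innermost loop using plain maps (the class, method, and sample field names are hypothetical; no OpenSearch classes involved):

```java
import java.util.Map;

public class TimeFieldCheck {
    // Mimics the loop above: given a field's sourceAsMap, which looks like
    // {timestamp={type=date}} (host2={type=keyword} for a nested field),
    // return the declared type name, or null if no nested-map entry is present.
    @SuppressWarnings("unchecked")
    static String declaredType(Map<String, Object> fieldMap) {
        for (Object type : fieldMap.values()) {
            if (type instanceof Map) {
                Map<String, Object> metadataMap = (Map<String, Object>) type;
                return (String) metadataMap.get("type");
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, Object> fieldMap = Map.of("timestamp", Map.of("type", "date"));
        // The validator would reject any type other than "date" here.
        System.out.println(declaredType(fieldMap)); // date
    }
}
```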
@@ -608,45 +605,42 @@ protected void validateCategoricalField(String detectorId, boolean indexingDryRun) {
boolean foundField = false;

// Review why the change from FieldMappingMetadata to GetFieldMappingsResponse.FieldMappingMetadata
Map<String, Map<String, Map<String, GetFieldMappingsResponse.FieldMappingMetadata>>> mappingsByIndex = getMappingsResponse
.mappings();

for (Map<String, Map<String, GetFieldMappingsResponse.FieldMappingMetadata>> mappingsByType : mappingsByIndex.values()) {
for (Map<String, GetFieldMappingsResponse.FieldMappingMetadata> mappingsByField : mappingsByType.values()) {
for (Map.Entry<String, GetFieldMappingsResponse.FieldMappingMetadata> field2Metadata : mappingsByField.entrySet()) {
// example output:
// host_nest.host2=FieldMappingMetadata{fullName='host_nest.host2',
// source=org.opensearch.common.bytes.BytesArray@8fb4de08}

// Review why the change from FieldMappingMetadata to GetFieldMappingsResponse.FieldMappingMetadata

GetFieldMappingsResponse.FieldMappingMetadata fieldMetadata = field2Metadata.getValue();

if (fieldMetadata != null) {
// sourceAsMap returns sth like {host2={type=keyword}} with host2 being a nested field
Map<String, Object> fieldMap = fieldMetadata.sourceAsMap();
if (fieldMap != null) {
for (Object type : fieldMap.values()) {
if (type != null && type instanceof Map) {
foundField = true;
Map<String, Object> metadataMap = (Map<String, Object>) type;
String typeName = (String) metadataMap.get(CommonName.TYPE);
if (!typeName.equals(CommonName.KEYWORD_TYPE) && !typeName.equals(CommonName.IP_TYPE)) {
listener
.onFailure(
new ADValidationException(
CATEGORICAL_FIELD_TYPE_ERR_MSG,
DetectorValidationIssueType.CATEGORY,
ValidationAspect.DETECTOR
)
);
return;
}
Map<String, Map<String, GetFieldMappingsResponse.FieldMappingMetadata>> mappingsByIndex = getMappingsResponse.mappings();

for (Map<String, GetFieldMappingsResponse.FieldMappingMetadata> mappingsByField : mappingsByIndex.values()) {
for (Map.Entry<String, GetFieldMappingsResponse.FieldMappingMetadata> field2Metadata : mappingsByField.entrySet()) {
// example output:
// host_nest.host2=FieldMappingMetadata{fullName='host_nest.host2',
// source=org.opensearch.common.bytes.BytesArray@8fb4de08}

// Review why the change from FieldMappingMetadata to GetFieldMappingsResponse.FieldMappingMetadata

GetFieldMappingsResponse.FieldMappingMetadata fieldMetadata = field2Metadata.getValue();

if (fieldMetadata != null) {
// sourceAsMap returns sth like {host2={type=keyword}} with host2 being a nested field
Map<String, Object> fieldMap = fieldMetadata.sourceAsMap();
if (fieldMap != null) {
for (Object type : fieldMap.values()) {
if (type != null && type instanceof Map) {
foundField = true;
Map<String, Object> metadataMap = (Map<String, Object>) type;
String typeName = (String) metadataMap.get(CommonName.TYPE);
if (!typeName.equals(CommonName.KEYWORD_TYPE) && !typeName.equals(CommonName.IP_TYPE)) {
listener
.onFailure(
new ADValidationException(
CATEGORICAL_FIELD_TYPE_ERR_MSG,
DetectorValidationIssueType.CATEGORY,
ValidationAspect.DETECTOR
)
);
return;
}
}
}

}

}
}
}