Addition of Upsert operation scenario #20

Open
wants to merge 1 commit into base: develop
38 changes: 38 additions & 0 deletions src/e2e-test/features/salesforcesink/RunTime.feature
@@ -132,3 +132,41 @@ Feature: Salesforce Sink - Run time Scenarios
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred from Bigquery to Salesforce is equal

@SINK-TS-SF-RNTM-04 @BQ_SOURCE_TEST @DELETE_TEST_DATA
Scenario: Verify user should be able to ingest the records successfully using upsert operation
When Open Datafusion Project to configure pipeline
And Select plugin: "BigQuery" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "BigQuery"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqSourceTable"
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
And Select Sink plugin: "Salesforce" from the plugins list
And Connect plugins: "BigQuery" and "Salesforce" to establish connection
And Navigate to the properties page of plugin: "Salesforce"
And fill Authentication properties for Salesforce Admin user
Then Enter input plugin property: "referenceName" with value: "ReferenceName"
And Select radio button plugin property: "operation" with value: "upsert"
Then Enter input plugin property: "externalIdField" with value: "UpsertColumnvalue"
And Enter input plugin property: "sObject" with value: "sobject.account"
And Select dropdown plugin property: "errorHandling" with option value: "Skip on error"
Then Validate "Salesforce" plugin properties
And Close the Plugin Properties page
And Save the pipeline
And Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Close the pipeline logs
Then Validate the values of records transferred from Bigquery to Salesforce is equal
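
For reviewers unfamiliar with the operation exercised above: unlike insert, upsert matches records on the configured "externalIdField" rather than on the Salesforce record Id, updating a match and inserting otherwise. Below is a minimal illustrative sketch of that semantics against the Salesforce REST API. The API version, sObject name (Automation_custom__c), external ID field (Id__c), and field values are assumptions taken from the test data in this PR, and this is not how the plugin itself issues the call.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class UpsertSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical values: instance URL, token, sObject and external ID field are placeholders.
        String instanceUrl = "https://example.my.salesforce.com";
        String accessToken = "REPLACE_WITH_SESSION_TOKEN";

        // Upsert keys on the external ID field (Id__c here), not on the Salesforce record Id:
        // an existing record with Id__c = 786777 is updated, otherwise a new record is created.
        String endpoint = instanceUrl
                + "/services/data/v57.0/sobjects/Automation_custom__c/Id__c/786777";
        String body = "{\"Name\":\"adam\",\"Col_Text__c\":\"find\"}";

        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .header("Authorization", "Bearer " + accessToken)
                .header("Content-Type", "application/json")
                .method("PATCH", HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // Salesforce responds 201 Created for an insert and 204 No Content for an update.
        System.out.println(response.statusCode());
    }
}
```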
4 changes: 1 addition & 3 deletions src/e2e-test/resources/BigQuery/BigQueryCreateTableQuery.txt
@@ -1,3 +1 @@
create table `DATASET.TABLE_NAME` (Name STRING, Col_Timestamp__c TIMESTAMP, Col_Date__c DATE, Col_Currency__c FLOAT64,
Col_Email__c STRING, Col_Number__c FLOAT64, Col_GeoLocation__Latitude__s FLOAT64,
Col_GeoLocation__Longitude__s FLOAT64, Col__c STRING, Col_Url__c STRING, Col_Time__c TIME, Col_Text__c STRING)
create table DATASET.TABLE_NAME (Id__c FLOAT64, Name STRING, Col_Timestamp__c TIMESTAMP, Col_Date__c DATE, Col_Currency__c FLOAT64, Col_Email__c STRING, Col_Number__c FLOAT64, Col__c STRING, Col_Url__c STRING, Col_Time__c TIME, Col_Text__c STRING)
8 changes: 3 additions & 5 deletions src/e2e-test/resources/BigQuery/BigQueryInsertDataQuery.txt
@@ -1,5 +1,3 @@
insert into `DATASET.TABLE_NAME` (Name, Col_Timestamp__c, Col_Date__c, Col_Currency__c, Col_Email__c, Col_Number__c,
Col_GeoLocation__Latitude__s, Col_GeoLocation__Longitude__s, Col__c, Col_Url__c, Col_Time__c, Col_Text__c) values
('adam','2019-03-10 04:50:01 UTC','2021-01-28',61.823765812,'[email protected]',898365444,37.794116,-122.3432,
'984746334','abc/123','20:26:34','find');

insert into DATASET.TABLE_NAME (Id__c, Name, Col_Timestamp__c, Col_Date__c, Col_Currency__c, Col_Email__c, Col_Number__c,Col__c, Col_Url__c, Col_Time__c, Col_Text__c) values
(786777,'adam','2019-03-10 04:50:01 UTC','2021-01-28',61.823765812,'[email protected]',-122.3432,
'984746334','abc/123','20:26:34','find');
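
The scenario's final step asserts that the values loaded from this table match what lands in Salesforce. A rough sketch of such a count comparison, assuming the google-cloud-bigquery client and a hypothetical SOQL count obtained elsewhere in the test framework, could look like the following; the dataset, table, and object names are placeholders, not the framework's actual helpers.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

public class RecordCountComparisonSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder dataset/table; the real names come from pluginParameters.properties at runtime.
        String bqSql = "SELECT COUNT(*) FROM `my_dataset.bqSourceTable`";

        BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
        TableResult result = bigquery.query(QueryJobConfiguration.newBuilder(bqSql).build());
        long bqCount = result.iterateAll().iterator().next().get(0).getLongValue();

        // Placeholder for the Salesforce side: the test would run a SOQL count such as
        // SELECT COUNT() FROM Automation_custom__c and read the returned total.
        long sfCount = 0L;

        if (bqCount != sfCount) {
            throw new AssertionError(
                "Record counts differ: BigQuery=" + bqCount + ", Salesforce=" + sfCount);
        }
    }
}
```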
2 changes: 1 addition & 1 deletion src/e2e-test/resources/pluginParameters.properties
@@ -18,7 +18,7 @@ invalid.admin.consumer.secret=lmnop891011
#SOQL Query
simple.query=SELECT Id, Name, Phone FROM Account
test.query=SELECT Id,Name,Col_Timestamp__c,Col_Date__c,Col_Currency__c,Col_Email__c,Col_Number__c,\
Col_GeoLocation__Latitude__s,Col_GeoLocation__Longitude__s,Col__c,Col_Url__c,Col_Time__c,Col_Text__c FROM Automation_custom__c
Col__c,Col_Url__c,Col_Time__c,Col_Text__c FROM Automation_custom__c
where.query=SELECT name FROM Opportunity WHERE StageName='Needs Analysis'
groupby.query=SELECT CampaignId, AVG(Amount) FROM Opportunity GROUP BY CampaignId
childtoparent.query=SELECT Id, Name, Account.Name FROM Contact WHERE Account.Industry = 'Chemicals'