mariusndini/Snowflake-Java-Native-App
GS | Snowflake - Native - App

The steps below can be followed to replicate and create the GS Java native app.
Additionally, logging has been added to both the Java native app and the JavaScript native app.


Provided
2 compiled Java files (.jar)
1 Java source file (legend.java)

SQL code is documented below and can be run on Snowflake via copy + paste.
SQL code is also provided for the producer and consumer in the SQL folder.
The SQL folder contains _ORIG and _LOG files; the _LOG files have the logging logic embedded.

The SQL folder also contains the decrypt JavaScript app to be defined on the producer side.




Embedded Logging

Logging logic has been embedded into the application. Please keep in mind that a couple of different languages are involved (Java, JavaScript & SQL).

Logging, in most cases, uses an array that collects JSON events and ultimately returns a single JSON object.

The results are ultimately written to a table and shared back to the producer from the consumer.
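The logging pattern described above can be sketched in a few lines of plain JavaScript. This is an illustrative stand-in, not the app's actual code; the `logEvent` helper and the `"SELECT 1"` payload are made up for the example.

```javascript
// Minimal sketch of the logging pattern: events are pushed onto an array
// and the whole array is returned as part of one JSON object.
const log = [];
function logEvent(event) {
  // Each event records its timestamp, mirroring the app's log entries.
  log.push({ time: Date.now(), event });
}

logEvent("step 1 - init");
logEvent("step 2 - complete");

// The final return bundles the SQL payload and the log array together.
const finalReturn = JSON.stringify({ sql: "SELECT 1", log });
```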

On the consumer side, new tables are created to hold the logging:

create or replace schema SUMMIT_APP_LOCAL.LOGS;
create or replace table SUMMIT_APP_LOCAL.LOGS.APP_LOG (evt string);
create or replace table SUMMIT_APP_LOCAL.LOGS.JS_LOG (evt string);

Logging in the Java Native App

Please reference SQL/producer_LOG.sql for the code base.
Please note you will need to import and use an ArrayList (import java.util.ArrayList).

Logging example below:

// Create Array List
static ArrayList<String> log = new ArrayList<String>(); 

// PUSH Log values to array
log.add( "{\"time\" :\" " + (System.currentTimeMillis()) + "\", \"method\": \"" + (method) + "\" , \"class\": \"" + (myclass) + "\" }" );
log.add( "{\"time\" :\" " + (System.currentTimeMillis()) + "\", \"event\": \"step 3\" }" );
log.add( "{\"time\" :\" " + (System.currentTimeMillis()) + "\", \"p1\":\"" + ( p1 ) + "\" }" );

// CREATE JSON to return SQL + Logs to Consumer App
String finalReturn =  "{\"sql\": " + planJson + ", \"log\": " + log.toString() + "}";
return finalReturn;

Please note that the above is a breaking change to the Java native app code base. The SQL and log output are now part of a single JSON object which will need to be handled on the consumer side.
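One hedged sketch of how the consumer side might split that combined JSON back apart. The `splitJavaResult` function and the sample payload below are hypothetical names invented for this illustration; they only mirror the `{ "sql": ..., "log": [...] }` shape built by `finalReturn` above.

```javascript
// Hypothetical consumer-side helper: parse the combined {sql, log} JSON
// returned by the Java native app and separate its two parts.
function splitJavaResult(finalReturn) {
  const parsed = JSON.parse(finalReturn);
  return {
    sql: parsed.sql, // the plan JSON / SQL payload
    log: parsed.log  // the array of log events
  };
}

// Example payload mirroring the Java side's output format (values made up).
const payload = '{"sql": {"query": "SELECT 1"}, "log": [{"time": "1", "event": "step 3"}]}';
const result = splitJavaResult(payload);
```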


Logging in the JavaScript App

Please reference SQL/consumer.log for the code base. This refers to the JavaScript wrapper around the Java app.

Similar to the Java native app, the JavaScript app also maintains an array into which log objects are inserted.

var log = [];
log.push( {time: Date.now(), event: 'step 1 - init'} );
...
log.push( {time: Date.now(), event: 'step 2 - complete running java code'} );
...

Toward the end of the script, the logs are saved to two tables (APP_LOG & JS_LOG): one for the native app and one for the JS consumer app.
var insertStmt = `insert into SUMMIT_APP_LOCAL.LOGS.APP_LOG(select '${JSON.stringify(JSON.parse(ResultSet.getColumnValue(1)).log)}')` ;
snowflake.createStatement({sqlText:insertStmt }).execute();

var insertStmt = `insert into SUMMIT_APP_LOCAL.LOGS.JS_LOG(select '${ JSON.stringify(log) }')` ;
snowflake.createStatement({sqlText:insertStmt }).execute();
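One caveat with the insert statements above: the JSON is spliced into a single-quoted SQL literal, so any single quote inside a log message would break the statement. A hedged sketch of a guard for that, with a hypothetical `toSqlLiteral` helper (not part of the repo's code):

```javascript
// Hypothetical helper: double any single quotes before embedding JSON in a
// single-quoted SQL literal, the standard SQL escaping for quotes.
function toSqlLiteral(obj) {
  return "'" + JSON.stringify(obj).replace(/'/g, "''") + "'";
}

// A log event whose text contains a single quote.
const log = [{ time: 1651628261651, event: "step 1 - it's init" }];
const literal = toSqlLiteral(log);
// literal can now be spliced safely into: `insert into ... select ${literal}`
```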

Tables Shared Back to Producer

At this point the tables are populated and ready to be shared back to the producer.

The producer has the option to create a stream on the shared tables to track DML changes. Additional permissioning from the consumer may be necessary.

Log Examples

Java Native App Log Example

[{"time":" 1651628262838","method":"call","class":"legend"},{"time":" 1651628262867","event":"step 3"},{"time":" 1651628262871","p1":"MyParameter"},{"time":" 1651628262926","event":"step 3"},{"time":" 1651628263374","event":"step 3"},{"time":" 1651628263375","event":"final step"},{"time":" 1651628263375","sql":"Legend SQL Goes Here - Test App (SQL THIS FROM TABLE WHERE VALUE > 1)"}]

JavaScript App Log Example
[{"time":1651628261651,"event":"step 1 - init"},{"time":1651628263512,"event":"step 2 - complete running java code"},{"time":1651628263512,"event":"step 3 - got legend query"},{"time":1651628263512,"event":"step - 4 - replaced legend query"},{"time":1651628263512,"event":"step 5 - almost done"},{"time":1651628263563,"acct":"SFSENORTHAMERICA_MARIUS"},{"time":1651628263563,"finalQuery":"select listagg(object_construct(*)::varchar, ') from (select cp.isin,cp.EARNINGS_PER_SHARE, smq."Marturity Date",smq."Sector" from DEMO_DB.DEVELOPMENT_TEST.CUSTOMER_PORTFOLIO cp join ("select \"root\".NAME as \"Name\", \"root\".FIRMID as \"FirmId\" from PERSON as \"root\"") smq on cp.ISIN = smq."Isin")"}]

Encryption & Decryption of Logs

Logs can be encrypted prior to being written to the consumer LOG tables. Possible encryption methods are outlined below.

Decrypt

create or replace procedure SUMMIT_LEGEND_APP.APP_SCHEMA.DECRYPT(SALT varchar, ENCODED varchar  )
RETURNS STRING
LANGUAGE JAVASCRIPT
EXECUTE AS OWNER
AS $$
{
const textToChars = (text) => text.split("").map((c) => c.charCodeAt(0));
const applySaltToChar = (code) => textToChars(SALT).reduce((a, b) => a ^ b, code);
return ENCODED
    .match(/.{1,2}/g)
    .map((hex) => parseInt(hex, 16))
    .map(applySaltToChar)
    .map((charCode) => String.fromCharCode(charCode))
    .join("");
}
$$;

Encrypt

const crypt = (salt, text) => {
    const textToChars = (text) => text.split("").map((c) => c.charCodeAt(0));
    const byteHex = (n) => ("0" + Number(n).toString(16)).substr(-2);
    const applySaltToChar = (code) => textToChars(salt).reduce((a, b) => a ^ b, code);

return text
    .split("")
    .map(textToChars)
    .map(applySaltToChar)
    .map(byteHex)
    .join("");
};
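To check that the two snippets above are inverses, here is a roundtrip sketch that runs both outside Snowflake. The function bodies are copied from the procedure and the encrypt snippet; the `salt` and `message` values are made up for the test.

```javascript
// Encrypt: XOR each character code with the XOR-fold of the salt, hex-encode.
const crypt = (salt, text) => {
  const textToChars = (t) => t.split("").map((c) => c.charCodeAt(0));
  const byteHex = (n) => ("0" + Number(n).toString(16)).substr(-2);
  const applySaltToChar = (code) => textToChars(salt).reduce((a, b) => a ^ b, code);
  return text.split("").map(textToChars).map(applySaltToChar).map(byteHex).join("");
};

// Decrypt: split into hex pairs, XOR with the same salt fold, restore chars.
const decrypt = (salt, encoded) => {
  const textToChars = (t) => t.split("").map((c) => c.charCodeAt(0));
  const applySaltToChar = (code) => textToChars(salt).reduce((a, b) => a ^ b, code);
  return encoded
    .match(/.{1,2}/g)
    .map((hex) => parseInt(hex, 16))
    .map(applySaltToChar)
    .map((charCode) => String.fromCharCode(charCode))
    .join("");
};

const salt = "mySalt"; // illustrative salt
const message = '{"time":"165","event":"step 3"}';
const encoded = crypt(salt, message);
const roundTrip = decrypt(salt, encoded);
```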

Permissions

use role accountadmin;
grant create share , alter share, drop share to role SUMMIT_APP_CONSUMER;
use role SUMMIT_APP_CONSUMER;

create schema SUMMIT_APP_LOCAL.LOGS;
create table SUMMIT_APP_LOCAL.LOGS.LOG_TABLE (...);
grant usage on schema SUMMIT_APP_LOCAL.LOGS to role SUMMIT_APP_LEGEND_EXECUTE;
grant insert on table SUMMIT_APP_LOCAL.LOGS.LOG_TABLE to role SUMMIT_APP_LEGEND_EXECUTE;

use role SUMMIT_APP_CONSUMER
create share SUMMIT_APP_LOGS;
grant usage on database SUMMIT_APP_LOCAL to share SUMMIT_APP_LOGS;
grant usage on schema SUMMIT_APP_LOCAL.LOGS to share SUMMIT_APP_LOGS;
grant select on table SUMMIT_APP_LOCAL.LOGS.LOG_TABLE to share SUMMIT_APP_LOGS;

alter share SUMMIT_APP_LOGS add accounts=<GS Provider Account>;




Run Locally on Mac

You can run the test locally with the code below (Java must be installed, along with various other dependencies).

java -cp GS.jar:legend.jar legend.java

The output should match the provided result.json file (or be similar).


Java Body - Unsure what the majority of the code below really does.

import org.finos.legend.engine.plan.execution.PlanExecutor;
import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.ExecutionPlan;
import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.SingleExecutionPlan;
import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.nodes.SQLExecutionNode;
import org.finos.legend.showcase.showcase2.service.persons;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.stream.Collectors;

public class legend{

    public static void main(String[] args) throws Exception{
        System.out.println("HELLO!");
        URL url = persons.class.getResource("/plans/org/finos/legend/showcase/showcase2/service/persons.json");
        BufferedReader reader1 = new BufferedReader(new InputStreamReader(url.openStream(), StandardCharsets.UTF_8));
        String planJson = reader1.lines().collect(Collectors.joining("\n"));

        BufferedReader reader2 = new BufferedReader(new InputStreamReader(url.openStream(), StandardCharsets.UTF_8));
        ExecutionPlan executionPlan = PlanExecutor.readExecutionPlan(reader2);
        SingleExecutionPlan singleExecutionPlan = executionPlan.getSingleExecutionPlan(new HashMap<>());
        SQLExecutionNode sqlExecutionNode = (SQLExecutionNode)singleExecutionPlan.rootExecutionNode.executionNodes.get(0);

        System.out.println(planJson);
    }
}



PUT JARs - Snowflake Stage

Provided are the two compiled Java files (.jar) required to create the Snowflake native app.

Upload Provided JARs to Stage

Upload legend.jar

put file://.../legend.jar  @...JARSTAGE auto_compress=false;

Upload GS.jar

put file://.../GS.jar  @...JARSTAGE auto_compress=false;

Creating Function in Snowflake

After the JARs have been uploaded, the following will create the Snowflake Java UDF.

create or replace function demo_db.public.legendGS2()
returns String
language java
imports = ('@...jarstage/GS.jar', '@...jarstage/legend.jar')
handler='legend.call'
target_path='@~/legendGS2.jar'
as
$$
    import org.finos.legend.engine.plan.execution.PlanExecutor;
    import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.ExecutionPlan;
    import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.SingleExecutionPlan;
    import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.nodes.ExecutionNode;
    import org.finos.legend.engine.protocol.pure.v1.model.executionPlan.nodes.SQLExecutionNode;
    import org.finos.legend.showcase.showcase2.service.persons;

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.HashMap;
    import java.util.stream.Collectors;

    class legend {
        public static String call() throws Exception {
          URL url = persons.class.getResource("/plans/org/finos/legend/showcase/showcase2/service/persons.json");
          BufferedReader reader1 = new BufferedReader(new InputStreamReader(url.openStream(), StandardCharsets.UTF_8));
          String planJson = reader1.lines().collect(Collectors.joining("\n"));

          BufferedReader reader2 = new BufferedReader(new InputStreamReader(url.openStream(), StandardCharsets.UTF_8));
          ExecutionPlan executionPlan = PlanExecutor.readExecutionPlan(reader2);
          SingleExecutionPlan singleExecutionPlan = executionPlan.getSingleExecutionPlan(new HashMap<>());
          SQLExecutionNode sqlExecutionNode = (SQLExecutionNode)singleExecutionPlan.rootExecutionNode.executionNodes.get(0);

          return planJson;
        }
    }
$$;

Calling Function

select legendGS2();

Call Function > Parse JSON > Get sqlQuery

select PARSE_JSON(legendGS2()::variant):rootExecutionNode.executionNodes[0].sqlQuery;

Output Below

"select \"root\".NAME as \"Name\", \"root\".FIRMID as \"FirmId\" from PERSON as \"root\""
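The same extraction can be reproduced locally in JavaScript, which can be handy for debugging the plan JSON outside Snowflake. The `planJson` object below is a minimal stand-in for the real plan returned by legendGS2(), not its actual structure beyond the fields used here.

```javascript
// Minimal stand-in for the plan JSON returned by the Java UDF; only the
// rootExecutionNode.executionNodes[0].sqlQuery path is modeled.
const planJson = JSON.stringify({
  rootExecutionNode: {
    executionNodes: [
      { sqlQuery: 'select "root".NAME as "Name", "root".FIRMID as "FirmId" from PERSON as "root"' }
    ]
  }
});

// Equivalent of PARSE_JSON(...):rootExecutionNode.executionNodes[0].sqlQuery
const sqlQuery = JSON.parse(planJson).rootExecutionNode.executionNodes[0].sqlQuery;
```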

Creating JavaScript Stored Procedure

create or replace procedure SUMMIT_LEGEND_APP.APP_SCHEMA.SECURITY_MASTER_APP(MYJAVAUDFPATH varchar, JOINEDSQLSTMT varchar  )
RETURNS STRING
LANGUAGE JAVASCRIPT
EXECUTE AS OWNER
AS $$
{
    var javaSQLQuery = `SELECT PARSE_JSON( ${MYJAVAUDFPATH}()::variant):rootExecutionNode.executionNodes[0].sqlQuery`;
    var ResultSet = snowflake.createStatement({sqlText: javaSQLQuery }).execute();
    var row = ResultSet.next();
    var legendQuery = ResultSet.getColumnValue(1);

    var legend_search_str = "{legend(security_master_app)}";
    var intermediateQuery = JOINEDSQLSTMT.replace(legend_search_str, "(" + legendQuery + ")");
    var finalQuery = "select listagg(object_construct(*)::varchar, '\n') from (" + intermediateQuery + ")";
//    var ResultSet2 = (snowflake.createStatement({sqlText: finalQuery})).execute();
//    var row2 = ResultSet2.next();
//    var returnValue = ResultSet2.getColumnValue(1);
    return finalQuery;

}
$$;
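The core of the procedure is the placeholder substitution, which can be exercised locally. This sketch extracts that logic into a standalone `buildFinalQuery` function (a name invented here); the input and legend query strings are shortened stand-ins for the real ones.

```javascript
// Sketch of the stored procedure's query assembly: replace the
// {legend(security_master_app)} placeholder with the legend-generated
// subquery, then wrap the result in a listagg/object_construct select.
function buildFinalQuery(joinedSqlStmt, legendQuery) {
  const placeholder = "{legend(security_master_app)}";
  const intermediate = joinedSqlStmt.replace(placeholder, "(" + legendQuery + ")");
  return "select listagg(object_construct(*)::varchar, '\n') from (" + intermediate + ")";
}

// Shortened stand-ins for the real input SQL and legend query.
const input = 'select * from PORTFOLIO cp join {legend(security_master_app)} smq on cp.ISIN = smq."Isin"';
const legendQuery = 'select "root".NAME as "Name" from PERSON as "root"';
const finalQuery = buildFinalQuery(input, legendQuery);
```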

Calling JS S.Proc

call SUMMIT_LEGEND_APP.APP_SCHEMA.SECURITY_MASTER_APP( 'demo_db.public.legendGS2','select cp.isin,cp.EARNINGS_PER_SHARE, smq."Marturity Date",smq."Sector" from DEMO_DB.DEVELOPMENT_TEST.CUSTOMER_PORTFOLIO cp join {legend(security_master_app)} smq on cp.ISIN = smq."Isin"');

Resultant Output SQL

select listagg(object_construct(*)::varchar, '') from (select cp.isin,cp.EARNINGS_PER_SHARE, smq."Marturity Date",smq."Sector" from DEMO_DB.DEVELOPMENT_TEST.CUSTOMER_PORTFOLIO cp join (select "root".NAME as "Name", "root".FIRMID as "FirmId" from PERSON as "root") smq on cp.ISIN = smq."Isin")

Input SQL (for reference)

'select cp.isin,cp.EARNINGS_PER_SHARE, smq."Marturity Date",smq."Sector" from DEMO_DB.DEVELOPMENT_TEST.CUSTOMER_PORTFOLIO cp join {legend(security_master_app)} smq on cp.ISIN = smq."Isin"






MISC

Misc Steps during creation process

Building Legend.jar

The legend.jar is generated with the following Maven command; the source can be downloaded here (https://github.com/finos/legend/tree/service-exec-jar-example/examples/service-execution-jar/legend-application).

mvn install

Afterward, a target folder will be created containing two JAR files. Mine was named legend-application-0.0.1-SNAPSHOT-shaded.jar.

Rename this file to legend.jar (provided).

You can now upload this file to Snowflake to create the native app.
