Commit: update runScript to work properly and readme
melotik committed Jun 27, 2024
1 parent cdbce43 commit a673149
Showing 4 changed files with 122 additions and 103 deletions.
67 changes: 44 additions & 23 deletions README.md
# scroll-adapters

## TVL by User - Adapters

### Onboarding Checklist

Please complete the following:

1. Set up a subquery indexer (e.g. Goldsky Subgraph)
   1. Follow the docs here: <https://docs.goldsky.com/guides/create-a-no-code-subgraph>
   2. General steps:
      1. Create an account at app.goldsky.com
      2. Deploy a subgraph or migrate an existing subgraph: <https://docs.goldsky.com/subgraphs/introduction>
      3. Use the slugs `scroll-testnet` and `scroll` when deploying the config
2. Prepare the Subquery query code according to the Data Requirement section below.
3. Submit your response as a Pull Request to <https://github.com/delta-hq/scroll-adapters>, with the path being `/<your_protocol_handle>`.

### Code Changes Expected

1. Create a function like below:

```ts
export const getUserTVLByBlock = async (blocks: BlockData) => {
  const { blockNumber, blockTimestamp } = blocks
  // ... (body truncated in this diff view)
};
```
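Since the diff truncates the function body, here is a minimal sketch of what a complete implementation can look like. The subgraph URL, the `positions` entity, and its fields are illustrative assumptions, not this repo's actual schema:

```ts
// Sketch only. SUBGRAPH_URL and the `positions` entity are hypothetical
// placeholders; substitute your deployed Goldsky subgraph and its schema.
// Uses the global fetch available in Node 18+.
const SUBGRAPH_URL =
  "https://api.goldsky.com/api/public/<project>/subgraphs/<slug>/gn";

export const getUserTVLByBlock = async (
  blocks: BlockData
): Promise<OutputDataSchemaRow[]> => {
  const { blockNumber, blockTimestamp } = blocks;

  // Hypothetical query: fetch all open positions at the given block
  const query = `{
    positions(block: { number: ${blockNumber} }, first: 1000) {
      user
      token
      balance
    }
  }`;

  const response = await fetch(SUBGRAPH_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const { data } = await response.json();

  // One row per user per token, with raw (undivided) balances
  return data.positions.map((p: any): OutputDataSchemaRow => ({
    block_number: blockNumber,
    timestamp: blockTimestamp,
    user_address: p.user,
    token_address: p.token,
    token_balance: BigInt(p.balance), // raw token amount
    token_symbol: "", // optional
    usd_price: 0, // assign 0 if not available
  }));
};
```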

2. The interface for the input block data is given below; `blockTimestamp` is in epoch format.

```ts
interface BlockData {
  blockNumber: number;
  blockTimestamp: number;
}
```
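For example, `{ blockNumber: 4243360, blockTimestamp: 1714773599 }` corresponds to the sample `hourly_blocks.csv` row shown later in this README.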
3. The output `csvRows` is a list:

```ts
const csvRows: OutputDataSchemaRow[] = [];
// ... (type definition truncated in this diff view)
  usd_price: number; //assign 0 if not available
};
```
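The diff hides the middle of this block; based on the column table below, a plausible full shape of `OutputDataSchemaRow` is the following sketch (the exact field types are assumptions):

```ts
type OutputDataSchemaRow = {
  block_number: number;
  timestamp: number;      // epoch seconds
  user_address: string;
  token_address: string;
  token_balance: bigint;  // raw amount; negative if the token was borrowed
  token_symbol: string;   // optional; empty string if not available
  usd_price: number;      // assign 0 if not available
};
```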
4. Make sure you add the relevant `package.json` and `tsconfig.json` (see the sketch below and the Notes section).
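As a rough sketch, a `package.json` along these lines satisfies the checklist; the package versions and script wiring here are assumptions, not taken from this repo:

```json
{
  "name": "your-protocol-adapter",
  "version": "1.0.0",
  "scripts": {
    "compile": "tsc",
    "start": "node dist/index.js"
  },
  "dependencies": {
    "csv-parser": "^3.0.0",
    "fast-csv": "^5.0.0"
  },
  "devDependencies": {
    "typescript": "^5.4.0"
  }
}
```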

### Data Requirement

Goal: **Hourly snapshot of TVL by User by Asset**

For each protocol, we are looking for the following:

1. A query that fetches all relevant events required to calculate user TVL in the protocol at an hourly level.
2. Code that uses the above query, fetches all the data, and converts it to a CSV file in the format given below.
3. Token amounts should be raw token amounts. Please do not divide by decimals.

Teams can refer to the example in this repo to write the required code.

| Column | Description |
| --- | --- |
| … | … |
| token_balance | Balance of token (**If the token was borrowed, this balance should be negative**) |
| usd_price (from oracle) | Price of token (optional) |


Sample output row will look like this:

| block_number | timestamp | user_address | token_address | token_balance | token_symbol (optional) | usd_price (optional) |
| --- | --- | --- | --- | --- | --- | --- |
| … | … | … | … | … | … | … |
Note: **Expect multiple entries per user if the protocol has more than one token asset.**

### index.ts

At this scope, the code must read a CSV file named `hourly_blocks.csv` (with headers) that contains the following columns:

- `number` - block number
- `timestamp` - block timestamp

And output a CSV file named `outputData.csv`, with headers, containing the following columns:

- `block_number` - block number
- `timestamp` - block timestamp
- `user_address` - user address
- `token_address` - token address
- `token_balance` - token balance (raw amount; negative if borrowed)
- `token_symbol` - token symbol (optional)
- `usd_price` - USD price of the token (optional)
e.g. `adapters/renzo/src/index.ts`

For testing the adapter code for a single hourly block, use the following `hourly_blocks.csv` file:
```csv
number,timestamp
4243360,1714773599
```
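For that input, the generated `outputData.csv` would then contain a header row plus one row per user per token, for example (the addresses and values here are illustrative only):

```csv
block_number,timestamp,user_address,token_address,token_balance,token_symbol,usd_price
4243360,1714773599,0x0000000000000000000000000000000000000001,0x0000000000000000000000000000000000000002,1000000000000000000,WETH,0
```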

### Adapter Example

In this repo, there is an adapter example. The adapter fetches position data from the subgraph and calculates the TVL by user; the main script generates the output as a CSV file.

[Adapter Example](adapters/example/dex/src/index.ts)

## Notes

1. Please make sure to have a `compile` script in the `package.json` file, so that the TypeScript files can be compiled into `dist/index.js`; a sketch follows below.
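A minimal `tsconfig.json` sketch that emits `dist/index.js` from `src/index.ts` (the compiler options here are assumptions; adjust to your project):

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "rootDir": "src",
    "outDir": "dist",
    "strict": true,
    "esModuleInterop": true
  },
  "include": ["src"]
}
```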

## How to execute this project?

**Please** ensure you have a comma-delimited `adapters/{PROJECT_FOLDER}/hourly_blocks.csv` file.

```bash
cd adapters
npm i # install all packages in main repo

cd {PROJECT_FOLDER}
npm i
npm run compile

cd ..
npm run start # run runScript.js (this runs in the ci/cd)
```

These commands will run your project and generate the output CSV file; they mimic the commands found in the CI/CD [workflow](.github/workflows/test.yml).
29 changes: 0 additions & 29 deletions adapters/layerbank/src/index.ts
@@ -97,32 +97,3 @@ const readBlocksFromCSV = async (filePath: string): Promise<BlockData[]> => {

  return blocks;
};

readBlocksFromCSV("hourly_blocks.csv")
  .then(async (blocks: any[]) => {
    console.log(blocks);
    const allCsvRows: any[] = []; // Array to accumulate CSV rows for all blocks

    for (const block of blocks) {
      try {
        const result = await getUserTVLByBlock(block);
        for (let i = 0; i < result.length; i++) {
          allCsvRows.push(result[i]);
        }
      } catch (error) {
        console.error(`An error occurred for block ${block}:`, error);
      }
    }
    await new Promise((resolve, reject) => {
      const ws = fs.createWriteStream(`outputData.csv`, { flags: "w" });
      write(allCsvRows, { headers: true })
        .pipe(ws)
        .on("finish", () => {
          console.log(`CSV file has been written.`);
          resolve();
        });
    });
  })
  .catch((err) => {
    console.error("Error reading CSV file:", err);
  });
113 changes: 71 additions & 42 deletions adapters/runScript.js
@@ -1,8 +1,8 @@
// runScript.js
const fs = require('fs');
const path = require('path');

const csv = require('csv-parser');
const { write } = require('fast-csv');

// Get the folder name from command line arguments
const folderName = process.argv[2];
@@ -21,27 +21,31 @@ if (!fs.existsSync(folderPath)) {
  process.exit(1);
}

// Check if the provided folder contains a dist/index.js file
const indexPath = path.join(folderPath, 'dist/index.js');
if (!fs.existsSync(indexPath)) {
  console.error(`Folder '${folderName}' does not contain dist/index.js file. Please compile index.ts.`);
  process.exit(1);
}

// Import the getUserTVLByBlock function from the provided folder
const { getUserTVLByBlock } = require(indexPath);

const readBlocksFromCSV = async (filePath) => {
  const blocks = [];

  await new Promise((resolve, reject) => {
    fs.createReadStream(filePath, { encoding: 'utf8' })
      .pipe(csv({
        separator: ',', // hourly_blocks.csv is comma-separated
        mapHeaders: ({ header, index }) => header.trim() // Trim headers to remove any leading/trailing spaces
      }))
      .on('data', (row) => {
        console.log(row);
        const blockNumber = parseInt(row.number, 10);
        const blockTimestamp = parseInt(row.timestamp, 10);
        if (!isNaN(blockNumber) && blockTimestamp) {
          blocks.push({ blockNumber, blockTimestamp });
        }
      })
      .on('end', () => {
@@ -55,40 +59,65 @@ const readBlocksFromCSV = async (filePath) => {

  return blocks;
};

// Log the full path to the CSV file
const csvFilePath = path.join(folderPath, 'hourly_blocks.csv');
console.log(`Looking for hourly_blocks.csv at: ${csvFilePath}`);

// Additional check for file existence before proceeding
if (!fs.existsSync(csvFilePath)) {
  console.error(`File '${csvFilePath}' does not exist.`);
  process.exit(1);
}

// Main function to coordinate the processing
const main = async () => {
  try {
    const blocks = await readBlocksFromCSV(csvFilePath);

    console.log('Blocks read from CSV:', blocks);

    const allCsvRows = []; // Array to accumulate CSV rows for all blocks
    const batchSize = 10; // Size of batch to trigger writing to the file
    let i = 0;

    for (const block of blocks) {
      try {
        const result = await getUserTVLByBlock(block);

        console.log(`Result for block ${block.blockNumber}:`, result); // Print the result for verification

        // Accumulate CSV rows for all blocks
        allCsvRows.push(...result);

        i++;
        console.log(`Processed block ${i}`);

        // Write to file when batch size is reached or at the end of the loop
        if (i % batchSize === 0 || i === blocks.length) {
          // The first write creates the file and emits headers; later writes append
          const isFirstWrite = i <= batchSize;
          const ws = fs.createWriteStream(`${folderName}/outputData.csv`, { flags: isFirstWrite ? 'w' : 'a' });
          write(allCsvRows, { headers: isFirstWrite })
            .pipe(ws)
            .on("finish", () => {
              console.log(`CSV file has been written.`);
            });

          // Clear the accumulated CSV rows
          allCsvRows.length = 0;
        }
      } catch (error) {
        console.error(`An error occurred for block ${block.blockNumber}:`, error);
      }
    }
  } catch (err) {
    console.error('Error reading CSV file:', err);
  }
};

// Run the main function and ensure the process waits for it to complete
main().then(() => {
  console.log('Processing complete.');
  process.exit(0);
}).catch((err) => {
  console.error('An error occurred:', err);
  process.exit(1);
});
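As the argument check at the top of the file implies, the script takes the project folder name as its first argument (e.g. `node runScript.js {PROJECT_FOLDER}` from the `adapters` directory); per the README above, `npm run start` is how this script is invoked in CI.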
16 changes: 7 additions & 9 deletions adapters/testScript.js
@@ -1,4 +1,4 @@
// testScript.js
const fs = require('fs');
const path = require('path');

@@ -30,13 +30,11 @@ if (!fs.existsSync(indexPath)) {
const { getUserTVLByBlock } = require(indexPath);

// Call the getUserTVLByBlock function with desired arguments
getUserTVLByBlock({
  blockTimestamp: 1711023841,
  blockNumber: 3041467
}).then((result) => {
  if (!result.length) {
    throw new Error("Empty result");
  }
});

