
fix: update SyncSwap adapter #51 #59

Merged Apr 25, 2024 (4 commits)
91 changes: 41 additions & 50 deletions adapters/syncswap/src/index.ts
@@ -1,66 +1,57 @@
 import { getPositionsForAddressByPoolAtBlock as getSyncSwapPositionsForAddressByPoolAtBlock} from "./sdk/positionSnapshots"

 import { promisify } from 'util';
 import stream from 'stream';
 import csv from 'csv-parser';
 import fs from 'fs';
 import { write } from 'fast-csv';


 interface CSVRow {
   block_number: number
   timestamp: string
   user_address: string
   token_address: string
   token_symbol: string
   token_balance: string
   usd_price: string
 }


 const pipeline = promisify(stream.pipeline);

 // Assuming you have the following functions and constants already defined
 // getPositionsForAddressByPoolAtBlock, CHAINS, PROTOCOLS, AMM_TYPES, getPositionDetailsFromPosition, getLPValueByUserAndPoolFromPositions, BigNumber
 interface BlockData {
   blockNumber: number;
   blockTimestamp: number;
 }

-const readBlocksFromCSV = async (filePath: string): Promise<number[]> => {
-  const blocks: number[] = [];
-  await pipeline(
-    fs.createReadStream(filePath),
-    csv(),
-    async function* (source) {
-      for await (const chunk of source) {
-        // Assuming each row in the CSV has a column 'block' with the block number
-        if (chunk.block) blocks.push(parseInt(chunk.block, 10));
-      }
-    }
-  );
-  return blocks;
-};
+export const main = async (blocks: BlockData[]) => {
+  const allCsvRows: any[] = []; // Array to accumulate CSV rows for all blocks
+  const batchSize = 10; // Size of batch to trigger writing to the file
+  let i = 0;
+
+  for (const { blockNumber, blockTimestamp } of blocks) {
+    try {
+      // Retrieve data using block number and timestamp
+      const csvRows = await getSyncSwapPositionsForAddressByPoolAtBlock(blockNumber)
+
+      // Accumulate CSV rows for all blocks
+      allCsvRows.push(...csvRows);
+
+      i++;
+      console.log(`Processed block ${i}`);
+
+      // Write to file when batch size is reached or at the end of loop
+      if (i % batchSize === 0 || i === blocks.length) {
+        const ws = fs.createWriteStream(`outputData.csv`, { flags: i === batchSize ? 'w' : 'a' });
+        write(allCsvRows, { headers: i === batchSize ? true : false })
+          .pipe(ws)
+          .on("finish", () => {
+            console.log(`CSV file has been written.`);
+          });
+
+        // Clear the accumulated CSV rows
+        allCsvRows.length = 0;
+      }
+    } catch (error) {
+      console.error(`An error occurred for block ${blockNumber}:`, error);
+    }
+  }
+};

-const getData = async () => {
-  const snapshotBlocks = [
-    296496,330000
-    // Add more blocks as needed
-  ]; //await readBlocksFromCSV('src/sdk/mode_chain_daily_blocks.csv');
-
-  const csvRows: CSVRow[] = [];
-
-  for (let block of snapshotBlocks) {
-    // SyncSwap Linea position snapshot
-    const rows = await getSyncSwapPositionsForAddressByPoolAtBlock(block)
-    rows.forEach((row) => csvRows.push(row as CSVRow))
-  }
-
-  // Write the CSV output to a file
-  const ws = fs.createWriteStream('outputData.csv');
-  write(csvRows, { headers: true }).pipe(ws).on('finish', () => {
-    console.log("CSV file has been written.");
-  });
-};
-
-getData().then(() => {
-  console.log("Done");
-});

+export const getUserTVLByBlock = async (blocks: BlockData) => {
+  const { blockNumber, blockTimestamp } = blocks
+  return await getSyncSwapPositionsForAddressByPoolAtBlock(blockNumber)
Contributor:
@amedrontadora can you please change the timestamp field to number in the schema of the returned object.

Contributor Author:
Sure, updated.

}

// main().then(() => {
// console.log("Done");
// });
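The new `main` accumulates rows per block and flushes them to `outputData.csv` every `batchSize` blocks (or at the final block). As a minimal, self-contained sketch of that accumulate-and-flush pattern — plain arrays standing in for the `fast-csv` stream writes, and `Row` reduced to two illustrative fields:

```typescript
// Sketch of the batch-flush pattern used by main(): accumulate rows per block,
// flush every `batchSize` iterations or when the last block is processed.
type Row = { block_number: number; user_address: string };

function batchRows(
  rowsPerBlock: Row[][],
  batchSize: number,
  flush: (batch: Row[]) => void
): void {
  const buffer: Row[] = [];
  let processed = 0;
  for (const rows of rowsPerBlock) {
    buffer.push(...rows);
    processed++;
    if (processed % batchSize === 0 || processed === rowsPerBlock.length) {
      flush([...buffer]); // in main() this is the fast-csv write to outputData.csv
      buffer.length = 0;  // clear the accumulator, as main() does
    }
  }
}

// Example: 5 blocks with one row each and a batch size of 2
// produce flushes of 2, 2, and 1 rows.
const flushes: Row[][] = [];
batchRows(
  Array.from({ length: 5 }, (_, i) => [{ block_number: i, user_address: "0xabc" }]),
  2,
  (batch) => flushes.push(batch)
);
console.log(flushes.map((f) => f.length)); // [2, 2, 1]
```

Note that the real loop in `main` opens the write stream with `flags: 'w'` only when `i === batchSize`, so the header row is written exactly once and later batches append.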
2 changes: 1 addition & 1 deletion adapters/syncswap/src/sdk/config.ts
@@ -3,5 +3,5 @@ export const enum CHAINS{
}

export const SUBGRAPH_URLS = {
-  [CHAINS.LINEA]: "https://api.studio.thegraph.com/query/62864/syncswap-graph-linea/v1.4.1.4"
+  [CHAINS.LINEA]: "https://gateway-arbitrum.network.thegraph.com/api/ce0ba3625ebbbd3c4b5a2af394dc8e47/subgraphs/id/3xpZFx5YNWzqemwdtRhyaTXVidKNnjY19XAWoHtvR6Lh"
Contributor:
@amedrontadora is the use of the Arbitrum network intentional here? We need the liquidity on Linea. Please confirm that it will be received from this endpoint.

Contributor Author:
This is the address provided by the graph service provider; it can obtain Linea data.

}
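The exchange above turns on a detail of The Graph's URL scheme: the `gateway-arbitrum` hostname names where the gateway itself is deployed, not the chain the subgraph indexes — the subgraph ID selects the (Linea) dataset. A sketch of building a block-pinned query against such a gateway URL (the API key is replaced with a placeholder, and the entity/field names are illustrative, not the adapter's actual schema):

```typescript
// Gateway URL from the config diff above, with the API key elided.
// The subgraph ID after /id/ identifies the SyncSwap-on-Linea dataset.
const SUBGRAPH_URL =
  "https://gateway-arbitrum.network.thegraph.com/api/<API_KEY>/subgraphs/id/3xpZFx5YNWzqemwdtRhyaTXVidKNnjY19XAWoHtvR6Lh";

// Build a GraphQL request body pinned to a historical block.
// Entity and field names here are hypothetical placeholders.
function buildSnapshotQuery(blockNumber: number): string {
  return JSON.stringify({
    query: `{
      liquidityPositionSnapshots(block: { number: ${blockNumber} }, first: 1000) {
        account
        timestamp
      }
    }`,
  });
}

console.log(SUBGRAPH_URL.split("/id/")[1]); // the subgraph ID
console.log(JSON.parse(buildSnapshotQuery(296496)).query.includes("296496")); // true
```

The body would be POSTed to `SUBGRAPH_URL` with a `Content-Type: application/json` header; only the request construction is shown here.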
4 changes: 2 additions & 2 deletions adapters/syncswap/src/sdk/positionSnapshots.ts
@@ -23,7 +23,7 @@ interface SubgraphResponse

interface UserPositionSnapshotsAtBlockData {
block_number: number
-  timestamp: string
+  timestamp: number
user_address: string
token_address: string
token_symbol: string
@@ -102,7 +102,7 @@ export const getPositionsForAddressByPoolAtBlock = async (
}
userPositionSnapshotsAtBlockData.push({
user_address: positionSnapshot.account,
-      timestamp: new Date(positionSnapshot.timestamp * 1000).toISOString(),
+      timestamp: Math.floor(positionSnapshot.timestamp),
token_address: positionSnapshot.pair.id,
block_number: snapshotBlockNumber,
token_symbol: `${positionSnapshot.pair.token0.symbol}/${positionSnapshot.pair.token1.symbol} cSLP`,
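The schema change requested in the review replaces the ISO-8601 string timestamp with plain unix seconds; `Math.floor` guards against a fractional value. A before/after sketch of the two representations (the sample timestamp is hypothetical):

```typescript
// Old behaviour: subgraph timestamp (unix seconds) rendered as an ISO string.
function toIsoString(unixSeconds: number): string {
  return new Date(unixSeconds * 1000).toISOString();
}

// New behaviour: keep the timestamp as an integer number of unix seconds.
function toUnixSeconds(raw: number): number {
  return Math.floor(raw);
}

const ts = 1714003200; // hypothetical snapshot timestamp, 2024-04-25 00:00:00 UTC
console.log(toIsoString(ts));       // "2024-04-25T00:00:00.000Z"
console.log(toUnixSeconds(ts + 0.9)); // 1714003200 — fractional part dropped
```

Keeping the field numeric avoids a parse step for downstream consumers and matches the `timestamp: number` type in `UserPositionSnapshotsAtBlockData`.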