
Review #242

Status: Open. Wants to merge 300 commits into base: main.
Conversation

@Amuhar (Contributor) commented Sep 13, 2024

PR for review DSM 1.5 and other changes

Amuhar and others added 30 commits May 3, 2024 18:52
…efactoring

Feat/deposit events verify refactoring
fix: migrate to confluentinc in kafka tests
@Amuhar Amuhar marked this pull request as ready for review September 18, 2024 13:45
@Amuhar Amuhar requested a review from a team as a code owner September 18, 2024 13:45
@Amuhar Amuhar marked this pull request as draft September 18, 2024 13:52
package.json
Comment on lines +395 to +396
blockHash: string;
blockNumber: number;

Since this object is used multiple times, we can create a type for it
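A minimal sketch of that suggestion (the type name `BlockMeta` is illustrative, not the project's actual name): extracting the repeated `{ blockHash, blockNumber }` shape into a shared interface lets every call site reuse it.

```typescript
// Hypothetical shared type for the repeated { blockHash, blockNumber } shape.
interface BlockMeta {
  blockHash: string;
  blockNumber: number;
}

// Any function that previously took the inline object literal can reuse it:
function describeBlock(meta: BlockMeta): string {
  return `block ${meta.blockNumber} (${meta.blockHash})`;
}
```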

return ignoreDeposits;
}

public isNeedToProcessNewState(newMeta: {

shouldProcessNewState would be a better name

Comment on lines +398 to +412
const lastMeta = this.lastProcessedStateMeta;
if (!lastMeta) return true;
if (lastMeta.blockNumber > newMeta.blockNumber) {
this.logger.error('Keys API returns old state', { newMeta, lastMeta });
return false;
}
const isSameBlock = lastMeta.blockHash === newMeta.blockHash;

if (isSameBlock) {
this.logger.log(`The block has not changed since the last cycle. Exit`, {
newMeta,
});
}

return !isSameBlock;

It's kind of hard to reason about this logic because the exits are not explicit. We can make it clearer by rewriting it this way:

Suggested change
const lastMeta = this.lastProcessedStateMeta;
if (!lastMeta) return true;
if (lastMeta.blockNumber > newMeta.blockNumber) {
this.logger.error('Keys API returns old state', { newMeta, lastMeta });
return false;
}
const isSameBlock = lastMeta.blockHash === newMeta.blockHash;
if (isSameBlock) {
this.logger.log(`The block has not changed since the last cycle. Exit`, {
newMeta,
});
}
return !isSameBlock;
const lastMeta = this.lastProcessedStateMeta;
if (!lastMeta) {
  return true;
}
if (newMeta.blockNumber < lastMeta.blockNumber) {
  this.logger.error('Keys API returns old state', { newMeta, lastMeta });
  return false;
}
if (newMeta.blockHash === lastMeta.blockHash) {
  this.logger.log('The block has not changed since the last cycle. Exit', { newMeta });
  return false;
}
return true;

Comment on lines +83 to +93
const keyMap = keys.reduce((acc, key) => {
const duplicateKeys = acc.get(key.key) || [];
duplicateKeys.push(key);
acc.set(key.key, duplicateKeys);

return acc;
}, new Map<string, RegistryKey[]>());

return Array.from(keyMap.entries()).filter(
([, duplicateKeys]) => duplicateKeys.length > 1,
);

We can find a lot of small optimizations all over this project by reducing the number of passes over an array of keys, using a for-loop instead of functional methods like reduce, filter, etc. I don't know if these optimizations are worth refactoring, but they are still worth pointing out.

For example, in this particular case we can use one for-of loop instead of doing a full reduce, Array.from, and filter over n keys.

e.g.

const keyMap = new Map<string, RegistryKey[]>();
const duplicateGroups: [string, RegistryKey[]][] = [];

for (const key of keys) {
  const existingGroup = keyMap.get(key.key);
  if (existingGroup) {
    existingGroup.push(key);
    if (existingGroup.length === 2) {
      duplicateGroups.push([key.key, existingGroup]);
    }
  } else {
    keyMap.set(key.key, [key]);
  }
}

return duplicateGroups;

Contributor Author:

Thank you for the suggestion, but implicit data changes are not safe, and the small performance gains from this approach don't compensate for the risks. Safety and readability are far more important.

Comment on lines +176 to +179
return operatorKeys.reduce(
(prev, curr) => (prev.index < curr.index ? prev : curr),
operatorKeys[0],
);

A little more readable way

Suggested change
return operatorKeys.reduce(
(prev, curr) => (prev.index < curr.index ? prev : curr),
operatorKeys[0],
);
return operatorKeys.sort((a, b) => a.index - b.index)[0];
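One caveat worth noting alongside this suggestion (an editorial aside, not part of the original review): `Array.prototype.sort` sorts in place, so spreading into a copy first keeps the caller's `operatorKeys` array untouched. A self-contained sketch, with an illustrative `RegistryKey` shape:

```typescript
// Illustrative shape; the real RegistryKey in the project has more fields.
interface RegistryKey {
  index: number;
  key: string;
}

// Non-mutating variant: spread into a copy before sorting, so the
// caller's operatorKeys array keeps its original order.
function earliestKey(operatorKeys: RegistryKey[]): RegistryKey {
  return [...operatorKeys].sort((a, b) => a.index - b.index)[0];
}
```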

return this.handleDepositedKeyDuplicates(suspectedDuplicateKeys);
}

return await this.handleMultiOperatorDuplicates(

Suggested change
return await this.handleMultiOperatorDuplicates(
return this.handleMultiOperatorDuplicates(

duplicates,
);
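For context on this suggestion (a generic illustration, not the project's code): `return await` only changes behavior inside a `try` block, where it lets the local `catch` see the rejection. Outside a `try`, dropping the `await` is equivalent:

```typescript
async function mayFail(): Promise<string> {
  throw new Error('boom');
}

// Without `await`, the rejected promise is returned as-is and the
// local catch never runs; the rejection escapes to the caller.
async function withoutAwait(): Promise<string> {
  try {
    return mayFail();
  } catch {
    return 'caught locally';
  }
}

// With `await`, the rejection is rethrown inside the try and caught here.
async function withAwait(): Promise<string> {
  try {
    return await mayFail();
  } catch {
    return 'caught locally';
  }
}
```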

// Filter all unresolved keys (keys without a SigningKeyAdded event) for the module,

How is this possible?

Comment on lines +117 to +123
if (cacheResult === false) {
acc.cachedInvalidKeyList.push(key);
}

if (cacheResult === undefined) {
acc.uncachedDepositKeyList.push(depositKey);
}

This cache structure can potentially cause bugs because it can store two falsy values that mean completely different things: false means the deposit data is invalid, while undefined means it is not cached.

If you are not careful, you may accidentally write a check like if (cacheResult), and it will lead to unexpected behavior.

I suggest using explicit truthy values when the data is cached, e.g.

const DEPOSIT_VALID = "valid";
const DEPOSIT_INVALID = "invalid";

cache.set(deposit, DEPOSIT_VALID);
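A sketch of the same idea with a string-union type (names are illustrative, not the project's actual ones): both cached states are truthy, and the compiler forces the uncached case to be handled explicitly.

```typescript
// Both cached states are truthy strings, so `if (cached)` can no longer
// silently conflate "invalid" with "not cached".
type DepositValidity = 'valid' | 'invalid';

const cache = new Map<string, DepositValidity>();

function classify(depositKey: string): 'valid' | 'invalid' | 'uncached' {
  const cached = cache.get(depositKey); // DepositValidity | undefined
  if (cached === undefined) {
    return 'uncached';
  }
  return cached;
}
```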

Comment on lines +92 to +108
this.securityService
.unvetSigningKeys(
nonce,
blockNumber,
blockHash,
stakingModuleId,
operatorIds,
vettedKeysByOperator,
signature,
)
.catch((error) =>
this.logger.error('Failed to send unvet transaction', {
error,
blockHash,
stakingModuleId,
}),
);

So, all guardians are potentially sending the same tx at the same time?

Contributor Author:

The transaction will be reverted for the other councils if one has already been sent successfully.

@Amuhar Amuhar marked this pull request as ready for review October 1, 2024 13:10
@Amuhar Amuhar changed the title Prerelease 3.1.0 Review Oct 1, 2024
4 participants