
Ensure the state attr from molecular graph is consistent with matgl.float_th and include linear layer in TensorNet to match the original implementations #244

Merged · 25 commits · Mar 29, 2024

Conversation

@kenko911 (Contributor) commented Mar 29, 2024

Summary

Set the dtype of state_attr to matgl.float_th and added a linear layer in TensorNet to match the original implementation.
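For context, a minimal sketch of the kind of dtype-pinning this involves (the real change lives in src/matgl/graph/data.py; the variable name and values below are illustrative, not the actual code):

```python
import numpy as np
import torch

import matgl  # matgl.float_th is the package-wide default torch float dtype

# Illustrative global state attributes attached to a molecular graph.
state_attr = np.array([0.0, 1.0])

# Without an explicit dtype, torch.tensor infers it from the input
# (float64 here), which can clash with model weights stored in matgl.float_th.
state_tensor_implicit = torch.tensor(state_attr)

# Pinning the dtype keeps state attributes consistent with the rest of the framework.
state_tensor = torch.tensor(state_attr, dtype=matgl.float_th)
```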

Checklist

  • [x] Google format doc strings added. Check with ruff.
  • [x] Type annotations included. Check with mypy.
  • [x] Tests added for new features/fixes.
  • [x] If applicable, new classes/functions/modules have duecredit @due.dcite decorators to reference relevant papers by DOI (example)

Tip: Install pre-commit hooks to auto-check types and linting before every commit:

pip install -U pre-commit
pre-commit install

Summary by CodeRabbit

  • New Features
    • Enhanced data processing by specifying the data type during the conversion to PyTorch tensors.
    • Introduced a new neural network layer to improve model initialization and processing for specific conditions.

kenko911 and others added 25 commits November 18, 2023 02:52
…oat_th and including linear layer in TensorNet to match the original implementations
@kenko911 kenko911 requested a review from shyuep as a code owner March 29, 2024 18:32
@coderabbitai bot (Contributor) commented Mar 29, 2024

Walkthrough

The recent modifications involve enhancing data processing and model architecture within a graph-based module. Specifically, there's an update to how state attributes are converted into tensors, ensuring they're in the correct data type. Additionally, a new neural layer is added to the model, with adjustments made for handling input features under certain conditions, refining both the initialization and the forward pass processes.
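As a rough, non-authoritative sketch of what the TensorNet-side change amounts to (the class name, dimensions, and structure below are illustrative stand-ins, not the actual code in src/matgl/models/_tensornet.py):

```python
import torch
from torch import nn

import matgl


class TensorNetBlockSketch(nn.Module):
    """Illustrative stand-in for the affected part of TensorNet."""

    def __init__(self, units: int = 64, dim_state_feats: int = 16, include_state: bool = False):
        super().__init__()
        # The PR adds a linear layer of this kind so the layer stack matches the
        # original TensorNet implementation; its dtype follows matgl.float_th.
        self.linear = nn.Linear(units, units, dtype=matgl.float_th)
        # Input feature width is widened when graph-level state features are used.
        self.dim_in = units + dim_state_feats if include_state else units

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        return self.linear(node_feats)
```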

Changes

Files Change Summary
.../graph/data.py Updated state_attrs tensor conversion to use matgl.float_th data type.
.../models/_tensornet.py Added a new nn.Linear layer and adjusted input feature handling for specific conditions.

🐰✨
In the realm of code, where tensors flow,
A rabbit hopped, tweaking as it go.
With layers added, and data typed right,
It crafted models, both nimble and bright.
"To improve and enhance," it chirped with glee,
A code rabbit's quest, as endless as the sea.
🌟🚀


@coderabbitai bot (Contributor) left a comment
Actionable comments posted: 0

Review Status

Configuration used: CodeRabbit UI

Commits: files that changed from the base of the PR, between ee9e987 and 065bfc2.
Files selected for processing (2)
  • src/matgl/graph/data.py (1 hunks)
  • src/matgl/models/_tensornet.py (3 hunks)
Additional Context Used
Additional comments not posted (4)
src/matgl/graph/data.py (1)

192-192: The change to explicitly set the data type of state_attrs to matgl.float_th during tensor conversion is correct and aligns with the PR objectives to ensure data type consistency for state attributes. Good job ensuring consistency across the framework.

src/matgl/models/_tensornet.py (3)

183-183: The addition of the nn.Linear layer is correctly implemented and is likely to enhance the model's ability to learn complex patterns from the data. Good job ensuring consistency in data types throughout the model.


185-185: The adjustments to input features based on the is_intensive flag and the readout_type are correctly implemented and provide flexibility in configuring the model's behavior. The thoughtful inclusion of dim_state_feats when include_state is true ensures the model can incorporate state features when available.


207-207: The configuration of the final layer for extensive tasks is correctly implemented, with appropriate handling of the task_type parameter. The explicit check to prevent classification tasks from being marked as extensive is a good practice that ensures correct usage of the model.
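For readers unfamiliar with these flags, here is a minimal, hypothetical sketch of the readout-configuration logic the review comments describe; the function name, layer widths, and activation are assumptions for illustration, not the actual code in src/matgl/models/_tensornet.py:

```python
from torch import nn


def build_readout_sketch(
    units: int,
    dim_state_feats: int,
    is_intensive: bool,
    include_state: bool,
    readout_type: str = "set2set",
    task_type: str = "regression",
) -> nn.Module:
    """Illustrative sketch of configuring a TensorNet-style readout."""
    if is_intensive:
        dim_in = units
        if readout_type == "set2set":
            # Set2Set-style pooling typically doubles the feature width (assumption here).
            dim_in *= 2
        if include_state:
            # Graph-level state features are concatenated onto the pooled features.
            dim_in += dim_state_feats
        return nn.Sequential(nn.Linear(dim_in, units), nn.SiLU(), nn.Linear(units, 1))
    # Extensive targets are summed per-atom contributions; a classification
    # target cannot sensibly be extensive, as the review comment notes.
    if task_type == "classification":
        raise ValueError("Classification tasks must be intensive, not extensive.")
    return nn.Linear(units, 1)
```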


codecov bot commented Mar 29, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 98.80%. Comparing base (ee9e987) to head (065bfc2).

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #244   +/-   ##
=======================================
  Coverage   98.80%   98.80%           
=======================================
  Files          33       33           
  Lines        2750     2752    +2     
=======================================
+ Hits         2717     2719    +2     
  Misses         33       33           

☔ View full report in Codecov by Sentry.

@kenko911 kenko911 changed the title Ensure the state attr from molecule graph is consistent with matgl.float_th and including linear layer in TensorNet to match the original implementations Ensure the state attr from molecular graph is consistent with matgl.float_th and include linear layer in TensorNet to match the original implementations Mar 29, 2024
@kenko911 kenko911 merged commit cb41e60 into materialsvirtuallab:main Mar 29, 2024
5 checks passed