Basic Data Wallet UI for Dataset Management.
The Basic Data Wallet is an application that lets a user upload a file in a standard specification (BIDS), verify the file's specification metadata, set sharing permissions, and share it peer-to-peer. Data can be shared and unshared at any time. Individual DIDs can be added to individual data-sharing spaces (like a group text). As a basic first step, sharing is accomplished by revealing the IPFS hash to authorized DIDs.
Single-User Actions
web3 wallet login
The user is prompted to log in and must do so with MetaMask. Upon login, the user is presented with a UI whose sidebar shows public, private, and shared data (empty by default). The dataset metadata view is the default. The user may browse the version history and network information tabs for each dataset.
upload file
The user can drag a file into the UI to add it to their data wallet. A version-controlled dataset is initialized with DataLad when the file is dropped into the app.
file parsing & validation
A validation script reads the metadata from the dataset. The content is added and the validation info is committed into version history with DataLad. Permissions may then be set. If set to shared or public, a remote IPFS/Infura/Textile node is initialized.
sign dataset
The user signs the data after finalizing permissions. The metadata, version history, and repository/dataset remote information are stored on-chain.
deploy to IPFS
Users choose when to publish data after adding it to the wallet and verifying its contents. The Infura gateway is used to publish the data securely. Once dataset availability is confirmed, the on-chain data is queried and updated to reflect it.
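The deploy step could be sketched as below. This is an assumption-laden illustration, not the project's implementation: Infura's hosted IPFS API exposes an `/api/v0/add` endpoint authenticated with a per-project ID and secret, and the `project_id` / `project_secret` names here are placeholders for real credentials.

```python
import subprocess

# Placeholder sketch of the publish step: push a dataset archive to IPFS
# through Infura's hosted IPFS API using curl. project_id / project_secret
# are stand-ins for the credentials Infura issues per project.
INFURA_ADD_ENDPOINT = "https://ipfs.infura.io:5001/api/v0/add"

def build_publish_cmd(archive_path, project_id, project_secret):
    """Build the curl invocation that uploads one file to Infura's IPFS API."""
    return [
        "curl", "-s",
        "-u", f"{project_id}:{project_secret}",  # Infura basic auth
        "-F", f"file=@{archive_path}",           # multipart file upload
        INFURA_ADD_ENDPOINT,
    ]

def publish(archive_path, project_id, project_secret):
    """Run the upload and return the raw response (JSON with the CID in "Hash")."""
    out = subprocess.run(
        build_publish_cmd(archive_path, project_id, project_secret),
        capture_output=True, text=True, check=True,
    )
    return out.stdout
```

The returned JSON's `Hash` field is the CID that the on-chain record would then be updated to reflect.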
Example Front-End
Figma Link Here
Working Elements
Drag and drop file box
-- check file extension
-- run bids validation on bids json
---> https://github.com/bids-standard/bids-validator
-- create new datalad dataset (local)
-- add file to new datalad dataset (local)
-- commit file status (local; bids-validation output: warnings/info)
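The local steps above might look roughly like this in Python. The real pipeline would call the bids-validator linked above, so the metadata check here is a simplified stand-in that only verifies the two fields BIDS requires in dataset_description.json (`Name`, `BIDSVersion`); the `ALLOWED` extension set is illustrative, not exhaustive.

```python
import json
import pathlib

# Illustrative extension whitelist for the drag-and-drop box; a real BIDS
# dataset contains many more file types.
ALLOWED = {".json", ".nii", ".nii.gz", ".tsv"}

def check_extension(path):
    """Accept only files whose extension is in the whitelist."""
    p = pathlib.Path(path)
    # .nii.gz is a double suffix, so join the last two parts for that case
    suffix = "".join(p.suffixes[-2:]) if p.name.endswith(".nii.gz") else p.suffix
    return suffix in ALLOWED

def validate_dataset_description(text):
    """Return a list of warnings; empty when the required metadata is present."""
    meta = json.loads(text)
    return [f"missing required field: {k}"
            for k in ("Name", "BIDSVersion") if k not in meta]
```

After these checks pass, the steps above would shell out to DataLad (`datalad create`, then `datalad save` with the validator's warnings/info in the commit message) to record the file and its validation status locally.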
Sidebar Dataset Browser
Each sidebar element filters datasets by permissions.
-- private
-- public
---> download public data (can choose to pin)
-- space shared with me: a unique space for each unique group, like a group chat
Dataset Metadata Tab
Each selected dataset has a metadata tab (default view).
-- name, size, owner, permissions, schema (BIDS)
History tab
git log --graph --pretty=format:'' --abbrev-commit
Each selected dataset has a history tab.
-- datalad version history for each dataset / file
Network Info tab
git annex info
Each selected dataset has a network info tab.
-- list of datasets/git remotes
-- ipfs CID
-- no. copies
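The tab's fields could be derived from `git annex whereis --json`, which emits one JSON object per file whose `whereis` list holds one entry per location that has a copy. The parsing below assumes that output shape (an approximation to check against git-annex's actual output, not a spec).

```python
import json

# Sketch: derive the Network Info tab's fields (remotes, no. copies) from one
# line of `git annex whereis --json` output. Field names follow git-annex's
# JSON output as assumed here; verify against a real invocation.
def network_info(whereis_json_line):
    record = json.loads(whereis_json_line)
    locations = record.get("whereis", [])
    return {
        "file": record.get("file"),
        "remotes": [loc.get("description") for loc in locations],
        "copies": len(locations),  # the "no. copies" shown in the tab
    }
```

The IPFS CID itself would come from the special remote's location log rather than this summary, so it is omitted here.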
-- Publish
-- Delete
-- Pin (keep copy for seeding)
-- Manage Permissions
-- MetaMask
-- address resolves to ENS
public/private/shared-list
-- read onchain permissions
-- search names by ENS
-- check existing permissions for modification privileges
-- publish to ipfs/remote
-- write new onchain permission
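As a rough model of the permission flow above, the sketch below keeps the records in a local dict-backed class instead of on-chain; the DIDs, the owner-only modification check, and the CID reveal follow the listed steps, and everything here is a hypothetical stand-in for the contract logic (ENS lookup is omitted).

```python
# In-memory stand-in for the on-chain permission records. In the real design
# these would be read from and written to chain, with names resolved via ENS.
class PermissionRegistry:
    def __init__(self, owner_did):
        self.owner = owner_did
        self.readers = set()  # DIDs allowed to see the IPFS CID

    def can_modify(self, did):
        # modification privileges: only the dataset owner may change permissions
        return did == self.owner

    def grant(self, caller_did, reader_did):
        """Write a new permission, after checking the caller's privileges."""
        if not self.can_modify(caller_did):
            raise PermissionError("caller may not modify permissions")
        self.readers.add(reader_did)

    def reveal_cid(self, did, cid):
        # sharing = revealing the IPFS hash to authorized DIDs
        return cid if (did == self.owner or did in self.readers) else None
```

Unsharing would be the symmetric operation: the owner removes a DID from the reader set and writes the updated record back.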
-- configure dataset remote
---> git-annex ipfs special remote: https://git-annex.branchable.com/special_remotes/ipfs/
---> other special remotes: https://git-annex.branchable.com/special_remotes/
-- create bucket
-- create folder
-- push dataset to remote
-- check remote and write metadata to chain
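The remote-configuration steps above could translate to the command sequence below, following the git-annex ipfs special remote linked above (which relies on an external `git-annex-remote-ipfs` helper script). The exact flags are an assumption to verify against those docs; the commands are shown as data so the app could hand them to a subprocess runner.

```python
# Hypothetical command sequence for "configure dataset remote" + "push dataset
# to remote", based on the git-annex ipfs special remote docs linked above.
# Flags (externaltype=ipfs, encryption=none) are assumptions to confirm there.
def remote_setup_cmds(remote_name="ipfs"):
    return [
        ["git", "annex", "initremote", remote_name,
         "type=external", "externaltype=ipfs", "encryption=none"],
        ["datalad", "push", "--to", remote_name],  # push dataset content
    ]
```

After the push succeeds, the final step would query the remote and write the resulting availability metadata to chain, as described above.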
Reach Goals / Next Steps:
Implementation: https://mesh.xyz/web3studio/sojourn/#the-vault
Materials & Examples
---> code for demo client: https://github.com/FleekHQ/space-client-workshop
---> accompanying tutorial: https://www.youtube.com/watch?v=f5LRSpGGuQE
---> space.storage client: https://github.com/FleekHQ/space-client