# AI Multimodal Public Demos

This repository is part of the AIMM project: a joint, collaborative effort between scientists from Argonne National Laboratory (ANL), Brookhaven National Laboratory (BNL), and Lawrence Berkeley National Laboratory (LBL).

The current contributors and authors working on this project are:

- ANL:
  - Maria Chan, Steve Heald, Nicholas Schwarz, Chengjun Sun, Inhui Hwang, Yiming Chen
- BNL:
  - Eli Stavitski, Daniel Allan, Stuart Campbell, Deyu Lu, Xiaohui Qu, Shinjae Yoo, Matthew Carbone, Juan Marulanda, Zhu Liang, Fanchen Meng
- LBL:
  - Dylan McReynolds, Wanli Yang, Joseph Kleinhenz

Our goal is to enable and accelerate scientific discovery by leveraging the large, complex multimodal datasets generated across BES synchrotron facilities. We are developing shared, transferable infrastructure to store, curate, analyze, interpret, and disseminate the data.

These demos are built on Tiled, a data access service for data science tools. We use Tiled to bring together the different data sets we receive from beamline scientists, so users do not have to worry about file formats, directory structure, or parsing.
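As a minimal sketch of what this looks like in Python (the server URL below is a placeholder; the notebooks in this repository use the actual AIMM server address):

```python
# Minimal sketch of browsing a Tiled server from Python.
# The URL is a placeholder; substitute the address used in the notebooks.
from tiled.client import from_uri

client = from_uri("https://tiled.example.com")

# The client behaves like a nested mapping: list the top-level catalogs,
# drill down by key, and call .read() on a leaf node to get the data in memory.
print(list(client))
```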

This repository contains both public and private data sets. Currently, we have enabled the XAS data library of Dr. Matt Newville (source) for public access on our server.

You can run all the data sets by following the instructions below. Keep in mind that you will need authorization to access the private data sets; we grant access to a select group of scientists via ORCID authentication.
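For the private data sets, here is a hedged sketch of authenticated access (the URL is hypothetical, and the exact login flow depends on how the server is configured):

```python
# Hypothetical sketch: connecting to an access-controlled Tiled server.
# The URL is a placeholder; the prompt you see depends on the server's setup.
from tiled.client import from_uri

# When a server requires authentication, the Tiled client starts an
# interactive login flow (ORCID, in our case) and caches the session.
client = from_uri("https://aimm.example.com")
```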

- You can run the examples in your browser on Binder:
  - Data access example: Binder
  - Data access with widgets: Binder
  - Demo for NSLS-II, CFN & LBMS Users' Meeting 2023: Binder
- Or you can run them locally:

```bash
git clone https://github.com/AI-multimodal/public-demos
cd public-demos
pip install -r requirements.txt
pip install jupyterlab
jupyter lab
```