
September 1, 2017, Friday

Liya Wang edited this page Sep 4, 2017 · 6 revisions
  • Create a branch for archiving back to Data Store
    • Mainly for supporting RNA-seq and ChIP-seq workflows
    • Use Agave for archiving (no pipelining)
      • Will TACC resolve the issue of jobs not looking for a local copy?
      • Is there a way to tell Agave not to move the data to TACC?
      • We need to set up a cron job if using pipelining
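As a rough illustration of the "use Agave for archiving" idea, a job request can ask Agave itself to copy outputs back to the Data Store. This is only a sketch: the app id, archive path, and input URL are hypothetical placeholders, and `data.iplantcollaborative.org` is assumed to be the Data Store's Agave system id.

```python
import json

# Sketch of an Agave job request that archives results back to the
# CyVerse Data Store instead of leaving them on the execution system.
# The appId, archivePath, and input URL below are hypothetical.
def build_job_request(username):
    return {
        "name": "rnaseq-archive-test",
        "appId": "rnaseq-pipeline-0.1",                   # hypothetical app id
        "archive": True,                                  # ask Agave to archive outputs
        "archiveSystem": "data.iplantcollaborative.org",  # assumed Data Store system id
        "archivePath": "/{}/archive/sciapps".format(username),
        "inputs": {
            "reads": "agave://data.iplantcollaborative.org/{}/reads.fq".format(username)
        },
        "parameters": {},
    }

print(json.dumps(build_job_request("maizecode"), indent=2))
```

With `archive` set to true, Agave handles the copy itself, so no cron job or pipelining step would be needed on our side.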
    • Use Agave for authentication?
      • So users can run under their own user names
      • Limit to public apps and users' own private apps
      • How to authenticate to both Agave and CyVerse?
      • When loading a workflow, do we need to check permissions on jobs and results? Do we need to re-design the database for storing workflows? Maybe it's a bad idea to store them at all?
        • Could always grant maizecode permission on all jobs so users can load workflows on SciApps
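Since Agave uses OAuth2, the "user runs under their own name" flow would look roughly like a password-grant token request. A minimal sketch, assuming the public tenant's token endpoint; the client key/secret and user credentials are placeholders, and the request is only assembled here, not sent.

```python
# Sketch of authenticating a SciApps user against Agave with the OAuth2
# password grant, so jobs run under the user's own name rather than
# maizecode. The token URL and client credentials are assumptions.
from urllib.parse import urlencode

TOKEN_URL = "https://public.agave.iplantc.org/token"  # assumed tenant endpoint

def token_request(username, password, client_key, client_secret):
    """Assemble (but do not send) an OAuth2 password-grant token request."""
    return {
        "url": TOKEN_URL,
        "auth": (client_key, client_secret),  # HTTP basic auth with client credentials
        "data": urlencode({
            "grant_type": "password",
            "username": username,
            "password": password,
            "scope": "PRODUCTION",
        }),
    }

req = token_request("alice", "s3cret", "my-key", "my-secret")
print(req["url"])
```

The returned bearer token would then be attached to job and file requests, which is what makes per-user job permissions possible in the first place.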
    • Remove 'Public' and 'Data' (top menu) to reduce the dependency on local servers
      • For CSHL, could we put data on wildcat, then mount to halcott, brie, brie7?
        • Would Data page browsing be slow? Would it break h5ai?
        • Would computing be slow?
        • This will allow us to process Sorghum data
    • May put it into production once the paper is out
    • Need to replace output URLs with Agave links (the folder needs to be made public through Agave)
    • Need to sort out how to power the Shiny app and BioDalliance
      • Very hard, since both read the data directly from the server
      • Might be possible to copy the data to the server (wget inside the Shiny script?)
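The "copy the data to the server" workaround could be a small staging step that runs before Shiny or BioDalliance opens a file. A sketch under assumed paths; the destination directory and URL are illustrative, not our actual layout.

```python
# Sketch of staging a result file onto the web server before the Shiny
# app or BioDalliance tries to read it, instead of having them read the
# compute node's filesystem directly. Paths and URLs are illustrative.
import os
import urllib.request

def stage_data(url, dest_dir="/srv/shiny-data"):
    """Download `url` into `dest_dir` once; return the local path."""
    os.makedirs(dest_dir, exist_ok=True)
    local_path = os.path.join(dest_dir, os.path.basename(url))
    if not os.path.exists(local_path):  # skip files already staged
        urllib.request.urlretrieve(url, local_path)
    return local_path
```

A Shiny script could call an equivalent helper (or plain wget) at startup, then point both Shiny and BioDalliance at the staged local copy.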
  • Background
    • Converting to user-centric is hard (users submitting jobs instead of maizecode)
      • Need to figure out how to authenticate with Agave
      • Need to rely on public apps and systems; need to figure out permissions on jobs
      • Easy to run on Stampede and access users' data
      • Easy to put on Atmosphere/Jetstream
      • Supports user private apps easily
    • Federation version is good for workshops
      • No queue issues
      • Data loads very fast
      • Easy sharing