
ICEES Facade #168


Closed

stevencox opened this issue Feb 4, 2019 · 3 comments
@stevencox (Contributor) commented Feb 4, 2019

  1. Bring the operation implementation currently in this service into alignment with current thinking on workflow five (verify use of estimated population density, inequality values, p_value threshold, etc.); see the p-value filtering sketch after this list.

  2. Verify all changes to that service continue to work with the workflow here.

  3. Update the workflow name to something more readable.

  4. Create a second endpoint using ED visits as the clustering criterion.
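As a concrete reference for item 1, here is a minimal sketch of p-value-threshold filtering over association results. The record shape, the `p_value` field name, and the 0.05 cutoff are assumptions for illustration, not the service's actual contract.

```python
# Illustrative only: keep associations at or below a p-value threshold.
from typing import Dict, List


def filter_by_p_value(associations: List[Dict], max_p_value: float = 0.05) -> List[Dict]:
    """Return only the association records whose p_value is <= the threshold."""
    return [a for a in associations if a.get("p_value", 1.0) <= max_p_value]


if __name__ == "__main__":
    sample = [
        {"feature": "EstResidentialDensity", "p_value": 0.01},
        {"feature": "SomeOtherFeature", "p_value": 0.30},
    ]
    # Only the EstResidentialDensity record survives the default 0.05 cutoff.
    print(filter_by_p_value(sample))
```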

@colinkcurtis commented Feb 5, 2019

  1. Checked with Kara and Hao... currently, "urban" is defined as "EstResidentialDensity >= 3". This may change; they are going to confer with Steve Appold on this issue.

  2. The changes did not break the system.

  3. Changed the workflow pair (.cwl and .yml) to 'workflow_5_v_3.cwl' and 'workflow_5_v_3.yml', respectively. The naming still stands to be improved.

  4. I've created a second endpoint that uses 'TotalEDInpatientVisits > 2' as the feature variable. This was originally done in 'server.py'; however, I split the code into two new files, 'icees_res_density_server.py' and 'icees_ed_visits_server.py', to reflect this bifurcation. These two files are run the same way 'server.py' has been run until now. (Both clustering criteria are sketched below.)
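For concreteness, the two clustering criteria above (the "urban" residential-density cutoff and the ED-visits cutoff) can be written as feature constraints along the following lines. The payload layout is an assumption for illustration; consult the ICEES API schema for the real request format.

```python
# Sketch of the two cohort-defining feature constraints discussed above.
# The dictionary layout is illustrative, not the exact ICEES request schema.
URBAN_CRITERION = {"EstResidentialDensity": {"operator": ">=", "value": 3}}
ED_VISITS_CRITERION = {"TotalEDInpatientVisits": {"operator": ">", "value": 2}}


def cohort_request(criterion: dict) -> dict:
    """Wrap a single feature constraint in a (hypothetical) cohort-definition payload."""
    return {"feature_constraints": criterion}


if __name__ == "__main__":
    print(cohort_request(URBAN_CRITERION))
    print(cohort_request(ED_VISITS_CRITERION))
```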

The changes described in #4 above do not produce any change in the workflow output SIZE. In both cases, the output size is:
"size": 883424,

Further, the checksum changes from test to test, even when nothing is changed and the original command is merely re-run. My current best guess is that there is a hard limit on the output size, but the gathering of contents is non-deterministic, such that once ANY set of data reaches size = 883424, it is returned. (A quick size/checksum comparison sketch follows.)
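One quick way to pin down this observation is to compare two consecutive outputs directly. A small sketch, with placeholder file paths; point them at the outputs of two back-to-back runs:

```python
# Compare two workflow outputs by size and SHA-256 digest.
import hashlib
import os


def digest(path: str) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


run_a, run_b = "output_run_a.json", "output_run_b.json"  # placeholder paths
print("sizes:", os.path.getsize(run_a), os.path.getsize(run_b))
print("same checksum:", digest(run_a) == digest(run_b))
```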

@stevencox (Contributor, Author) commented

Thanks -

    1. Sounds good.
    2. Sounds good (see 4 for caveat).
    3. Sounds good.
    4. Please put both operations in a single server.py. We don't want to have to run separate services for each operation. This will mean each operation will have a different URL. That URL needs to be reflected in the CWL parameters for the workflow. Are we doing that somehow?
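A minimal sketch of what the single-server layout could look like, assuming Flask and hypothetical route names; the actual clustering logic and the CWL parameter wiring are not shown here.

```python
# Sketch: both operations served from one server.py, each at its own URL.
# Route names and the clustering stub are assumptions for illustration only.
from flask import Flask, jsonify

app = Flask(__name__)


def run_clustering(feature: str, operator: str, value: int) -> dict:
    # Placeholder for the shared operation; the real logic lives in the ICEES facade.
    return {"feature": feature, "operator": operator, "value": value, "status": "not implemented"}


@app.route("/cluster/residential_density", methods=["POST"])
def residential_density():
    # "Urban" criterion: EstResidentialDensity >= 3
    return jsonify(run_clustering("EstResidentialDensity", ">=", 3))


@app.route("/cluster/ed_visits", methods=["POST"])
def ed_visits():
    # ED-visits criterion: TotalEDInpatientVisits > 2
    return jsonify(run_clustering("TotalEDInpatientVisits", ">", 2))


if __name__ == "__main__":
    app.run(port=8080)
```

The workflow's CWL parameters would then carry whichever of the two URLs a given step needs.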

@colinkcurtis commented
All issues have been addressed; the next set of concerns is tracked in issue #169.
