Support large search result sets #124
Comments
Is this still on the radar?
It's certainly still desirable. No ETA. I'm happy to take a PR for this issue.
Ah, this is a real deal-breaker for an otherwise nice package! Although I am glad the package did show the following warning:
If it's any guidance, a few years ago I made this implementation to deal with the pagination. Anyway, I don't think I'll have the time soon to make a PR with this contribution, but I will keep it on my radar.
How do we shut off the warnings?
That command suppresses warnings made through the warnings module. The messages that you're seeing are emitted through the logging module, and there's really no way to suppress those specifically. If you're running from the command line, the best/easiest workaround is probably to redirect stderr to a separate file (or /dev/null).
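If it helps, here is a minimal Python sketch of that stderr-redirection workaround for situations where a shell-level redirect isn't convenient; the log file name is just a placeholder:

```python
import os
import sys

# Point OS-level file descriptor 2 (stderr) at a file, so anything written
# to stderr afterwards, including logging output, lands in the file instead
# of the terminal. The filename is arbitrary.
log_fd = os.open("eutils-stderr.log", os.O_WRONLY | os.O_CREAT | os.O_APPEND)
os.dup2(log_fd, sys.stderr.fileno())
os.close(log_fd)
```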
Is this repo still maintained?
I don't need eutils in my work at the moment, so I'm not adding new features or fixing bugs. But I will gladly accept PRs if you have something to contribute.
I tried to add custom "retstart" and "retmax" variables to create a loop and get the results by paging through my PubMed search IDs. After five hours I still couldn't make it work, but I am sure that retstart and retmax can be added as customizable variables. In VS Code, you can Ctrl+click on xx.esearch to see the code behind it, which is:
And you can Ctrl+click on ESearchResult to see the code behind that, which is:
You can see my code trying to set retmax and retstart as modifiable variables, hoping to download a big chunk of articles by looping through PubMed results:
I hope someone with more experience can put an hour into this and solve this issue, which will help so many people like me :) Cheers to this future hero :)
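In case it helps anyone hitting the same wall: until retstart/retmax are exposed through the eutils client, one workaround is to page through esearch yourself by calling the E-utilities endpoint directly. A rough sketch, using the requests library rather than this package; the query term and page size are only examples:

```python
import time

import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"


def esearch_all_ids(term, db="pubmed", page_size=500):
    """Yield every ID matching `term`, paging with retstart/retmax."""
    retstart = 0
    while True:
        params = {
            "db": db,
            "term": term,
            "retstart": retstart,
            "retmax": page_size,
            "retmode": "json",
        }
        result = requests.get(ESEARCH_URL, params=params).json()["esearchresult"]
        ids = result["idlist"]
        if not ids:
            return
        yield from ids
        retstart += len(ids)
        if retstart >= int(result["count"]):
            return
        time.sleep(0.4)  # stay under NCBI's rate limit of roughly 3 requests/second


# Example usage: collect all PMIDs for a query
pmids = list(esearch_all_ids("alzheimer disease[mesh] AND 2020[pdat]"))
```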
This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 7 days.
This issue was closed because it has been stalled for 7 days with no activity. |
Just hit this issue myself -- I'm reopening this issue and will get a PR up... sometime. |
Originally reported by Reece Hart (Bitbucket: reece, GitHub: reece) in biocommons/eutils #124
Migrated by bitbucket-issue-migration on 2016-05-25 23:09:02
NCBI's E-utilities interface nicely supports large search result sets by returning results in chunks. eutils currently handles only the first chunk.
See http://www.ncbi.nlm.nih.gov/books/NBK25500/#chapter1.Demonstration_Programs for a Perl excerpt that generates the continuation URLs.
The purpose of this issue is to provide full support for large result sets using webenv histories.
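For reference (not the original Perl excerpt), the continuation pattern from the NCBI demonstration programs looks roughly like this in Python. The query, chunk size, and use of the requests library are illustrative only:

```python
import requests

BASE = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/"

# Step 1: esearch with usehistory=y stores the full result set on NCBI's
# history server and returns a WebEnv/query_key pair instead of every ID.
search = requests.get(BASE + "esearch.fcgi", params={
    "db": "pubmed",
    "term": "breast cancer[mesh]",  # placeholder query
    "usehistory": "y",
    "retmode": "json",
}).json()["esearchresult"]

count = int(search["count"])
webenv, query_key = search["webenv"], search["querykey"]

# Step 2: efetch the stored set in chunks; each request here corresponds to
# one of the continuation URLs built in the Perl demonstration program.
chunk = 200
for retstart in range(0, count, chunk):
    records_xml = requests.get(BASE + "efetch.fcgi", params={
        "db": "pubmed",
        "WebEnv": webenv,
        "query_key": query_key,
        "retstart": retstart,
        "retmax": chunk,
        "retmode": "xml",
    }).text
    # ... parse or accumulate each chunk here ...
```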
Possible implementation:
This seems like an obvious use of Python iterators for results. I'd like to keep eutils.xmlfacades.esearchresults.ESearchResults as parsing-only; however, its interface methods are appropriate. So, one implementation is to write an upper-level class (eutils.esearchresults) that wraps the xmlfacade version, holds a reference to the client, and provides an iterator over results. This upper-level ESearchResults would be passed back to callers in lieu of the xmlfacade version.
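A rough sketch of that idea; the esearch signature with retstart/retmax and the page attributes used here (count, ids) are assumptions about how the client and xmlfacade objects would need to be extended, not their current API:

```python
class ESearchResults:
    """Upper-level result set that pages through esearch lazily."""

    def __init__(self, client, db, term, page_size=250):
        self._client = client
        self._db = db
        self._term = term
        self._page_size = page_size

    def __iter__(self):
        retstart = 0
        while True:
            # Hypothetical call: the real client would need to accept and
            # forward retstart/retmax to NCBI.
            page = self._client.esearch(db=self._db, term=self._term,
                                        retstart=retstart,
                                        retmax=self._page_size)
            if not page.ids:
                return
            yield from page.ids
            retstart += len(page.ids)
            if retstart >= page.count:
                return
```

Callers would then receive this wrapper instead of the xmlfacade object and could simply iterate over all results without ever seeing the paging.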