API for making bulk attributions #3
Thanks @abubelinha for the idea. What you'd like is a write API, OR an interface to upload a crafted CSV file structured similarly to the table you included here, and/or a textarea box to paste records into on a web form. This is interesting. I'd likely not create a write API, because that would require sophisticated authentication; a CSV upload or a large textarea are considerably easier. In any case, we'd first want to follow the recommendations for how to construct an entry. In your case, you're likely to have heaps of identical entries for Q numbers or ORCID IDs in your two columns. I'll ponder this and try a few things in development to see what might be feasible. At the very least we'd probably want some form of processing queue with realtime feedback on each row, somewhat like QuickStatements for Wikidata if you're familiar with that one. Or, there'd be a report at the completion of an upload/paste with sufficient information to describe what worked and what did not (for whatever reason).
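The processing queue with per-row feedback and a completion report could be sketched roughly like this. Everything here is hypothetical: the field names (`occurrence_id`, `agent_id`) and the validation rules are illustrative assumptions, not Bionomia's actual schema or API.

```python
import re

# Assumed identifier shapes: an ORCID iD or a Wikidata Q number.
ORCID_RE = re.compile(r"^\d{4}-\d{4}-\d{4}-\d{3}[\dX]$")
WIKIDATA_RE = re.compile(r"^Q\d+$")

def validate_row(row):
    """Return (ok, message) for one attribution row.

    `row` is a dict with a GBIF occurrence ID and an agent identifier.
    Field names are assumptions for this sketch.
    """
    if not row.get("occurrence_id"):
        return False, "missing occurrence id"
    agent = row.get("agent_id", "")
    if ORCID_RE.match(agent) or WIKIDATA_RE.match(agent):
        return True, "ok"
    return False, f"unrecognized agent identifier: {agent!r}"

def process_upload(rows):
    """Process rows one by one, collecting per-row feedback and a summary."""
    report = []
    for i, row in enumerate(rows, 1):
        ok, msg = validate_row(row)
        report.append({"row": i, "ok": ok, "message": msg})
    failed = [r for r in report if not r["ok"]]
    return report, failed

rows = [
    {"occurrence_id": "1234567890", "agent_id": "0000-0001-7618-5230"},
    {"occurrence_id": "1234567891", "agent_id": "Q1340286"},
    {"occurrence_id": "", "agent_id": "Q1340286"},
]
report, failed = process_upload(rows)
print(len(failed))  # → 1 (the row with no occurrence id)
```

In a real queue each row's status would be streamed back to the browser as it is processed, with the `failed` list forming the end-of-upload report.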
I wonder if it would be even easier to provide our data as a remote CSV file which Bionomia can regularly read and process. You could add a new textbox (in the Bionomia user profile) where users can optionally enter a URL pointing to such a file (next to that textbox you could also provide a link to an example file, so users can simply replicate its structure). Optionally, we could identify attributors (other than the logged-in Bionomia user) by simply providing their ORCID in an additional column on the right.
That last column wouldn't be strictly necessary (in my case I will probably be the only attributor of my file's records). Of course you could then process this file at any time, checking first for people's IDs found in the recordedById and identifiedById columns.
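The remote-CSV idea above might look something like the sketch below. The `recordedById` / `identifiedById` column names come from the comment itself; the `gbifID` and `attributorOrcid` columns and the overall layout are assumptions for illustration only.

```python
import csv
import io

# Example of the proposed file structure; the attributorOrcid column is
# the optional "additional column on the right" mentioned above.
SAMPLE_CSV = """\
gbifID,recordedById,identifiedById,attributorOrcid
2005380410,Q1340286,,0000-0001-7618-5230
2005380411,,0000-0002-1825-0097,
"""

def parse_attribution_csv(text):
    """Yield one attribution per non-empty identifier cell."""
    for row in csv.DictReader(io.StringIO(text)):
        for field, action in (("recordedById", "recorded"),
                              ("identifiedById", "identified")):
            agent = row[field].strip()
            if agent:
                yield {
                    "occurrence": row["gbifID"],
                    "agent": agent,
                    "action": action,
                    # Fall back to the file owner when no attributor is given.
                    "attributor": row["attributorOrcid"].strip() or None,
                }

attributions = list(parse_attribution_csv(SAMPLE_CSV))
print(len(attributions))  # → 2
```

A scheduled job could periodically fetch each registered URL, run it through a parser like this, and feed the resulting attributions into the same validation queue as a manual upload.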
I have just discovered this repository, and I guess this API is for retrieving data from Bionomia (which is great).
But I wonder whether there is any chance of using an API for making attributions, instead of having to browse the bionomia.net website and manually attribute records.
I don't mean bulk attributions related to a GBIF dataset curated by me, which I can already bulk-attribute by populating it with ORCID & Wikidata IDs and then republishing to GBIF.
I mean that I know of many other GBIF records which I do not upload to GBIF, for example:
Is there any way that authenticated users can somehow upload structured data like this to bionomia.net?
I suppose it wouldn't do any harm to re-attribute data which has already been attributed by other people.
If that ever happened, you could just keep the original attribution.
(Also, if the two differ, you could take advantage of this to detect possible errors.)
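The conflict handling suggested here — keep the original attribution, but flag disagreements as possible errors — could be sketched as follows. The data shapes (plain mappings of occurrence ID to agent identifier) are illustrative assumptions, not Bionomia's real data model.

```python
def find_conflicts(existing, incoming):
    """Merge two attribution mappings (occurrence ID -> agent identifier).

    Returns (kept, conflicts): `kept` is the merged mapping in which the
    original attribution always wins; `conflicts` lists occurrences where
    the two sources disagree, for later review.
    """
    kept = dict(existing)
    conflicts = []
    for occ, agent in incoming.items():
        if occ not in kept:
            kept[occ] = agent  # new attribution: accept it
        elif kept[occ] != agent:
            # Mismatch: keep the original, but flag for error checking.
            conflicts.append((occ, kept[occ], agent))
    return kept, conflicts

kept, conflicts = find_conflicts(
    {"100": "Q1340286"},
    {"100": "0000-0001-7618-5230", "101": "Q1340286"},
)
print(conflicts)  # → [('100', 'Q1340286', '0000-0001-7618-5230')]
```

The `conflicts` list is exactly the "possible errors" signal: either the original attribution or the re-submission is wrong, and a human can decide which.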
Thanks
@abubelinha