
extract.py - download all articles locally by running a script #15

Closed
wants to merge 6 commits

Conversation

@alikatgh commented Jul 1, 2023

Download all articles by running a script included in the directory.

@facundoolano (Owner) commented:

Hi Albert, thanks for the PR. I will have to decline it though, since it doesn't match a few goals I have for this repository:

  1. I don't know what the copyright/permissions situation is for most of the papers. I imagine that, at least for some of them, you can't just copy them and serve them from a repository without getting permission. That's why I went with just storing a link to each paper.
  2. Keeping a local copy adds overhead to the maintenance of the list, which changes frequently. I prefer to just have a spec in the yaml. I also don't want to maintain the script that downloads the papers (and especially not one that requires human input to work). You'll see that there's a script to check that the articles are online (a sketch of what that amounts to follows this list), and even in that case I have plans to remove it and use a standard GitHub Action instead.
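For reference, a checker along those lines can stay very small. The sketch below is an illustration, not the repository's actual script: it assumes a `papers.yaml` file holding a list of entries with a `link` field, and both of those names are guesses rather than the real schema.

```python
"""Minimal link-checker sketch. Assumes papers.yaml is a list of
entries with a `link` field; both names are assumptions, not the
repository's actual schema."""
import sys

import requests
import yaml


def check_links(path="papers.yaml"):
    with open(path) as f:
        papers = yaml.safe_load(f)

    broken = []
    for paper in papers:
        url = paper.get("link")
        if not url:
            continue
        try:
            # HEAD keeps traffic low; some hosts reject it, so retry with GET.
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code >= 400:
                resp = requests.get(url, allow_redirects=True, timeout=10)
            if resp.status_code >= 400:
                broken.append((url, resp.status_code))
        except requests.RequestException as exc:
            broken.append((url, exc))

    for url, reason in broken:
        print(f"BROKEN {url} ({reason})")
    return broken


if __name__ == "__main__":
    # A non-zero exit makes a scheduled CI run fail loudly when links break.
    sys.exit(1 if check_links() else 0)
```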

I understand it's inconvenient when links go down. It's usually easy to find a replacement link; what I'm missing is updating that link-checking script to run on a schedule (e.g. every day) so I find out when a link goes down, along the lines of the workflow sketched below. That should solve most of the problem.
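The scheduled run itself could be a small GitHub Actions workflow with a cron trigger. This is a sketch under assumptions: `check_links.py` is a hypothetical script name, and the install step guesses at the dependencies.

```yaml
name: check-links
on:
  schedule:
    - cron: '0 6 * * *'    # once a day at 06:00 UTC
  workflow_dispatch:        # also allow manual runs
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.x'
      - run: pip install requests pyyaml
      # check_links.py is a hypothetical name for the checker script
      - run: python check_links.py
```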
