Add user stop words Madlib. #45 #48
Conversation
- Add a set for user stop words.
- Remove the misleading 'international' call until it's fixed.
Diff was too large to provide inline comments.
Looks good. The suggestions I'd make are:
- For the example `STOP_WORDS_USER` variable, let's make it a set so that they can get a feel for providing multiple values, e.g. `STOP_WORDS_USER = {'south', 'north'}`.
- Make explicit that the `STOP_WORDS_USER` words are added to the NLTK stop words (a minimal sketch follows below). This is clear from the code, but the text comments could make it clear as well.
- Let's remind the user, in the comment or text around the `# Get a list of the top words in the collection (regardless of year).` cell, that if they want to remove words from it, they can edit `STOP_WORDS_USER`. They may not know what stop words are, and this would give them the context behind what it means.
Otherwise, works quite well.
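For context, a minimal sketch of the pattern suggested above, assuming the notebook draws on NLTK's English stop word list; variable names other than `STOP_WORDS_USER` are illustrative, not necessarily the notebook's actual names:

```python
import nltk
from nltk.corpus import stopwords

# The NLTK stop word corpus has to be available locally.
nltk.download('stopwords', quiet=True)

# User-supplied words to filter out, written as a set so that multiple
# values are easy to add or remove.
STOP_WORDS_USER = {'south', 'north'}

# The user's words are added to (union'd with) the standard NLTK stop words.
STOP_WORDS = set(stopwords.words('english')) | STOP_WORDS_USER
```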
I'm fine with removing the internationalization code here – it is straightforward, and I'll defer to you and the datathon participants' findings – but I think in the future we should try to keep PRs to one thing in scope.
Okay. Will fix. I knew you were going to say that about `international()`, but I just couldn't bear seeing it be wrong and out there in the open for such a small change. Also, I knew that #42 was still in the issues, so the tracking is still available.
Latest commit should resolve all issues stated above.
I think that's a different issue which might be fixed more generally in another branch, especially if we look at making a module to cover the functions. Essentially we have general madlibs and more specific madlibs, and it's not always clear which is which. I don't think that it's that helpful to tell people to scroll up to change something. Let's look at the overall explanations / docs / comments together as a separate issue. (Maybe after receiving feedback from the datathon as well.)
OK. I still don't think the `filter STOP_WORDS` comment is clear. Filter what? Filter in? Filter out? Can you try to take a kick at the can to make it a bit more usable?
No. That's in scope for this PR. Ian's suggestion makes sense. Can you update your PR to include that? We can continue to iterate from there.
See: #48 (comment)
Now that I've been able to look at this more closely, I see your point. I have some follow-up work this morning, but will get this fixed in the afternoon.
Remove filter STOP_WORDS comment.
auk-notebook.ipynb (Outdated)
```diff
@@ -645,7 +645,7 @@
    "source": [
     "## Overall Collection Characteristics\n",
     "\n",
-    "Change the variables in the following cell to manipulate the analysis you'll be running to understand overall collection characteristics. In this case, they mostly affect the visualization that we generate.\n"
+    "Change the variables in the following cell to manipulate the analysis you'll be running to understand overall collection characteristics. In this case, they mostly affect the visualization that we generate. You may wish to add or remove words to `STOP_WORDS_USER` in the user configuration cell to fine-tune the results.\n"
```
Maybe "fine-tune the results" -> "remove words that may be overwhelming your analysis".
This PR gives the user the option to use NLTK stop words and/or add their own. This was a common demand from the datathon.
I also removed the `International()` function call because it is not doing what I originally thought it was doing.
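For readers skimming the thread, a rough illustration of how the combined stop word set might then be applied when pulling the top words in the collection. This is a hypothetical sketch only – the `top_words` helper and its parameters are invented here for illustration, not the notebook's actual code:

```python
from collections import Counter

# Hypothetical helper, not part of auk-notebook: count tokens, dropping
# anything in the stop word set, and return the n most common words.
def top_words(tokens, stop_words, n=25):
    counts = Counter(w for w in tokens if w.lower() not in stop_words)
    return counts.most_common(n)

# Example usage with the combined NLTK + user stop words sketched earlier:
# top_words(collection_tokens, STOP_WORDS, n=30)
```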