Some discussion took place between matteocontrini and me on the Flarum forum about the best way to send the posts of a discussion to crawlers using QAPages.
Possible solutions:
Don't add posts to the QAPages
Limit the posts in the QAPages: max 5 results, since search engines only preview a few of them anyway; however, Google for example also indexes the content of the posts, so including them could be useful for search results
Cache the JSON string of the discussion posts to the cache directory on the server disk, and re-cache it once per day per discussion (see the sketch after this list)
Hide the JSON (with the posts) for normal browser requests and only show it to crawlers/bots
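
A minimal sketch of the disk-cache option: the posts JSON is written to a cache directory and rebuilt at most once per day per discussion. The directory layout and the `buildPostsJson()` callback are assumptions for illustration, not part of the extension.

```php
<?php

// Minimal sketch: cache the serialized posts JSON on disk and rebuild it
// at most once per day per discussion. The cache path and the
// buildPostsJson() callback are hypothetical placeholders.

function getCachedPostsJson(int $discussionId, callable $buildPostsJson): string
{
    $cacheDir  = __DIR__ . '/cache/qapages';
    $cacheFile = $cacheDir . '/discussion-' . $discussionId . '.json';
    $ttl       = 86400; // re-cache once per day

    // Reuse the cached file while it is younger than one day.
    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return file_get_contents($cacheFile);
    }

    // Otherwise rebuild the JSON and store it for subsequent requests.
    $json = $buildPostsJson($discussionId);

    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0775, true);
    }
    file_put_contents($cacheFile, $json, LOCK_EX);

    return $json;
}
```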
Discuss
What would be a great solution? Other ideas or tips?
If I had to make a choice, I would go for the option to show the posts data to crawlers only. I'm thinking about cases where a discussion has a lot of posts, where it would be far from ideal (and useless) to send potentially hundreds of kilobytes of data to every user that reaches that post.
As mentioned "on the other side", there seems to be great libraries that take care of matching the user agent, and the one I linked supports more than one thousand patterns, with an additional generic one that matches words ending with "bot", etc. The only thing I would test is the overhead of matching a string against one thousand of regular expressions...
I'm not an SEO expert, so I'm not really sure how good or bad this solution actually is, but the only downside I can think of is that some very unpopular crawlers out there might not be in that list. Honestly, though, this feature is primarily meant for Google, so as long as you allow it, it should be OK. The Flarum API is always there if someone wants the full collection of posts for a discussion (I believe Google already has some kind of pattern for parsing Flarum discussions; I've been really impressed by how much it's able to scrape/index with zero optimizations).
EDIT: you have a typo in the title, it should say "cache"