client.get_block broken due to highest accepted "limit" value 100, but we use 100000 #304
Comments
It looks like the notion web client uses
@phoenixeliot Have you found a way to load/read more than 100 records?
@miletoda You or I would need to write a patch/PR that implements the above suggestion. I might, but I don't have any use cases for 100+ block pages yet, so it's not super relevant to me personally. It should be quite doable, though, if you want to take a crack at it!
To add more confirmation, this seems to be an issue with (edit: s/commit/comment)
This case is pretty relevant to me, and I'd like to help if I can. The problem is, I'm not sure how the pagination works. Is there an example somewhere? I'd also like to help fix monitoring, but that's unrelated to this issue. (Is there an issue open for it?) Thanks!
Okay, I've fixed block pagination. See PR #345.
I'm finding that using the client as it is now, with the "limit" set to 10000 when fetching a block, I get an HTTP error "invalid input".
notion-py/notion/store.py, line 280 (commit 3533c01)
By testing manually, I found that 100 is the highest value the server will accept as of right now; using 100 removes the error.
The library should be updated with the new limit here, and we might need new logic to paginate this request when the page has more than 100 blocks in it.
From a cursory glance it looks like there are other places with the number "10000" that may also need to be updated.
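The pagination logic described above could be sketched as follows. This is a minimal illustration, not notion-py's actual API: `fake_load_page_chunk` is a hypothetical stand-in for the server call, and the cursor/response shape is an assumption. It only shows the general pattern of requesting chunks of at most 100 blocks and looping until the server reports no more data.

```python
# Hedged sketch of chunked pagination. The function names and the
# cursor-based response shape here are illustrative assumptions,
# not the real Notion API or notion-py internals.

MAX_LIMIT = 100  # highest "limit" the server accepts, per testing in this issue


def fake_load_page_chunk(blocks, cursor, limit):
    """Stand-in for the server request: returns at most min(limit, MAX_LIMIT)
    items starting at `cursor`, plus the cursor for the next chunk
    (None when the page is exhausted)."""
    limit = min(limit, MAX_LIMIT)  # server silently caps oversized limits
    chunk = blocks[cursor:cursor + limit]
    next_cursor = cursor + limit if cursor + limit < len(blocks) else None
    return chunk, next_cursor


def load_all_blocks(blocks):
    """Page through the results in chunks of MAX_LIMIT instead of issuing
    one request with limit=10000, which the server now rejects."""
    results, cursor = [], 0
    while cursor is not None:
        chunk, cursor = fake_load_page_chunk(blocks, cursor, MAX_LIMIT)
        results.extend(chunk)
    return results


# Simulate a page with more than 100 blocks.
all_blocks = list(range(250))
fetched = load_all_blocks(all_blocks)
assert fetched == all_blocks
```

A real fix along these lines would replace the single large-limit request in `store.py` with such a loop, threading whatever continuation token the endpoint actually returns between requests.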