
Push notifications are not delivered to IOS app #21120

Open · pavloburykh opened this issue Aug 23, 2024 · 13 comments · May be fixed by #21297

Comments

@pavloburykh (Contributor)

Reproducible on iOS 17.5.1, iPhone 15.

Preconditions: User A is the message sender (any device); User B is the PN receiver on an iOS device.

Steps:

  1. User B enables push notifications
  2. User B backgrounds the app
  3. User A sends messages to any common chat with User B (1-1, group, community channel)
  4. Check whether User B receives a PN about the new message

Actual result: PN is not received by User B

Andr_PN_sender.zip
IOS_PN_receiver.zip

PNs_IOS.mp4

Additional Information

  • Status version: release 2.30
  • Operating System: iOS
@pavloburykh (Contributor, Author)

UPDATE: sometimes PNs go through. @mariia-skrypnyk and @churik were able to receive some PNs.

@cammellos (Contributor)

I have investigated the issue:

What it looks like happened, in at least one instance:

  1. The device was correctly registered with the push notification server
  2. The sender queried the push notification server for information via lightpush; the message made it to the server
  3. The server replied to the message
  4. The reply was never received by the sender, so the push notification wasn't sent

Looking at Kibana's logs, I can see that no filter node pushed that message to the right client:

https://kibana.infra.status.im/app/discover#/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-15h,to:now))&_a=(columns:!(),filters:!(),index:ffcc22b0-0116-11ed-9719-cdd3b483481c,interval:auto,query:(language:kuery,query:'%220x9f5d30ee6c008d365e20014c749bf3194d960467edaa71f89880721abf635180%22%20AND%20kv_raw:filter'),sort:!(!('@timestamp',desc)))

This is the missing message. The peer id of the sender is "16U*hhQfss"

A couple of minutes later, you can see a successful push:

https://kibana.infra.status.im/app/discover#/?_g=(filters:!(),refreshInterval:(pause:!t,value:0),time:(from:now-15h,to:now))&_a=(columns:!(),filters:!(),index:ffcc22b0-0116-11ed-9719-cdd3b483481c,interval:auto,query:(language:kuery,query:'%220x34304e1a4764bb92c5ccfec8e59a56e85b99651fb70d88c956799d2a5103a139%22%20and%20kv_raw:filter'),sort:!(!('@timestamp',desc)))

But looking at the client log, it appears they never registered for that particular topic, so I'll restart the investigation from scratch.

@cammellos (Contributor)

DEBUG[08-23|10:46:10.968|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:810]                                       checking if querying necessary 
DEBUG[08-23|10:46:10.968|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:815]                                       querying info 
DEBUG[08-23|10:46:10.989|github.com/status-im/status-go/wakuv2/common/filter.go:108]                                                         filters install                          id=3867b9e1d9cd67826de7426adbafac0aa6d9ec80374057c478ee92f35971477f
DEBUG[08-23|10:46:10.989|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:111]         adding filter                            filter-id=3867b9e1d9cd67826de7426adbafac0aa6d9ec80374057c478ee92f35971477f
DEBUG[08-23|10:46:10.989|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:124]         existing pubsubTopic batch               agg-filter-id=dcd3c805-b342-46be-a406-ebc445802ae1 topic=/waku/2/rs/16/32
DEBUG[08-23|10:46:10.989|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:145]         adding to existing pubsubTopic batch     agg-filter-id=dcd3c805-b342-46be-a406-ebc445802ae1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0x25a24866/rfc26 ] }" batch-size=3
DEBUG[08-23|10:46:10.989|github.com/status-im/status-go/protocol/transport/filters_manager.go:441]                                           registering filter for                   namespace=filtersManager chatID=contact-discovery-950 type=partitioned topic=0x25a24866
DEBUG[08-23|10:46:10.991|github.com/status-im/status-go/protocol/common/message_segmentation.go:47]                                          message segmented                        segments=1
DEBUG[08-23|10:46:11.099|github.com/status-im/status-go/wakuv2/common/filter.go:108]                                                         filters install                          id=8a08fc1993745515933e0ec0044e4ac3f8a08933e0da387c3bf5ae82c7d7ead4
DEBUG[08-23|10:46:11.099|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:111]         adding filter                            filter-id=8a08fc1993745515933e0ec0044e4ac3f8a08933e0da387c3bf5ae82c7d7ead4
DEBUG[08-23|10:46:11.099|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:124]         existing pubsubTopic batch               agg-filter-id=dcd3c805-b342-46be-a406-ebc445802ae1 topic=/waku/2/rs/16/32
DEBUG[08-23|10:46:11.099|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/filter/filter_manager.go:145]         adding to existing pubsubTopic batch     agg-filter-id=dcd3c805-b342-46be-a406-ebc445802ae1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0x815b41c6/rfc26 ] }" batch-size=4
DEBUG[08-23|10:46:11.099|github.com/status-im/status-go/protocol/transport/filters_manager.go:574]                                           registering filter for                   namespace=filtersManager chatID=95d7f2cdb31b8b1abbdd528aaf97d3a9c7a5285b1a184e6ff5d1a52a7b6e7b9889b38d120795e068ead5f5c27389f9696ec09ce06f1765bafd211e065ca37ce4 type=public ContentTopic=0x815b41c6 PubsubTopic=
DEBUG[08-23|10:46:11.100|github.com/status-im/status-go/protocol/common/message_sender.go:805]                                               sent-message: public message             recipient=[] messageID=0x17b3a4d64ebe9f38f5eb7fe8272feb1018d8c3ba5f63089f88b8974fe02c021b contentType=PUSH_NOTIFICATION_QUERY messageType=public hashes=[0xf7bbddd9f231f2601012b95ccc939b7dae76f5c2a5747da50c51f006588e7ca0]
DEBUG[08-23|10:46:11.100|github.com/status-im/status-go/wakuv2/common/filter.go:249]                                                         processing message: decrypted            envelopeHash=0xf7bbddd9f231f2601012b95ccc939b7dae76f5c2a5747da50c51f006588e7ca0
INFO [08-23|10:46:11.100|github.com/status-im/status-go/vendor/github.com/waku-org/go-waku/waku/v2/api/publish/message_sender.go:115]        publishing message via lightpush         envelopeHash=0xf7bbddd9f231f2601012b95ccc939b7dae76f5c2a5747da50c51f006588e7ca0 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x815b41c6/rfc26 timestamp=1,724,409,971,099,899,579

Looking at this message now:

HASH=0xf7bbddd9f231f2601012b95ccc939b7dae76f5c2a5747da50c51f006588e7ca0
CONTENT_TOPIC=/waku/1/0x815b41c6/rfc26

The message was received by a push notification server:

https://kibana.infra.status.im/app/discover#/context/ffcc22b0-0116-11ed-9719-cdd3b483481c/C9HXfpEBaE8Vq-sg_ypD?_g=(filters:!())&_a=(columns:!(),filters:!(('$state':(store:appState),meta:(alias:!n,disabled:!f,index:ffcc22b0-0116-11ed-9719-cdd3b483481c,key:logsource,negate:!f,params:(query:node-01.do-ams3.notify.prod),type:phrase),query:(match_phrase:(logsource:node-01.do-ams3.notify.prod)))),predecessorCount:10,successorCount:5)

@cammellos (Contributor)

The response was sent:

 INFO [08-23|10:46:10.404|github.com/status-im/status-go/wakuv2/waku.go:1016] publishing message via relay envelopeHash=0xbbd9ab3762e6cee821a5dddbf240018e43d96c4ed8786c44b519bcc9ac19b408 pubsubTopic=/waku/2/rs/16/32 contentTopic=/waku/1/0x25a24866/rfc26 timestamp=1,724,409,970,404,010,117 peerCnt=15

Notice the time:

10:46:10.404

And the content topic:

/0x25a24866

If we go back to the client:

10:46:10.989  adding to existing pubsubTopic batch     agg-filter-id=dcd3c805-b342-46be-a406-ebc445802ae1 content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0x25a24866/rfc26 ] }" batch-size=3
10:46:10.989 registering filter for                   namespace=filtersManager chatID=contact-discovery-950 type=partitioned topic=0x25a24866

That's when the filter registration is scheduled.

10:46:12.569      ticker hit, hence subscribing            agg-filter-id=dcd3c805-b342-46be-a406-ebc445802ae1 batch-size=4 agg-content-filter="{ pubsubTopic: /waku/2/rs/16/32, contentTopics: [ /waku/1/0x7bcb873b/rfc26/waku/1/0xbd4cfa16/rfc26/waku/1/0x25a24866/rfc26/waku/1/0x815b41c6/rfc26 ] }"

But the actual registration is only attempted about 2 seconds later:

10:46:12.889  subscription successful                  pubSubTopic=/waku/2/rs/16/32 contentTopics="[/waku/1/0x7bcb873b/rfc26 /waku/1/0xbd4cfa16/rfc26 /waku/1/0x25a24866/rfc26 /waku/1/0x815b41c6/rfc26]" peer=16Uiu2HAmMhsfGvYqkAryRe8WZnKHJUhKRbni7hseGpRy174hRaZv

So the timeline is:

  1. Client schedules registering a filter on content topic A
  2. Client publishes the message
  3. Server receives the message and replies on content topic A
  4. Client actually registers the filter on content topic A

The reply arrives between steps 3 and 4, but it is not pushed by the filter node because the client hasn't yet registered the filter.

You can see similar examples in the logs, where we query store nodes just after scheduling a filter registration but receive messages too early.

status-im/status-go#5440 introduced the ticker.

The issue was probably already present before (registering a filter was likely asynchronous, although I think on v1 it was synchronous), but it was most likely never noticed because the window was smaller.
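To make the window concrete, here is a minimal, self-contained Go sketch of the batching pattern described above. All names are hypothetical (this is not the actual filter_manager API): AddFilter only queues the content topic, the real subscription happens on the next ticker hit, and anything delivered in between is dropped.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// FilterBatcher mimics the ticker-batched subscription: filters are
// queued immediately, but only subscribed when the ticker fires.
type FilterBatcher struct {
	mu      sync.Mutex
	pending []string        // content topics scheduled but not yet subscribed
	active  map[string]bool // content topics actually subscribed
}

func NewFilterBatcher(interval time.Duration) *FilterBatcher {
	b := &FilterBatcher{active: make(map[string]bool)}
	go func() {
		for range time.Tick(interval) { // "ticker hit, hence subscribing"
			b.mu.Lock()
			for _, t := range b.pending {
				b.active[t] = true // the actual subscription happens only here
			}
			b.pending = nil
			b.mu.Unlock()
		}
	}()
	return b
}

// AddFilter returns immediately: the topic is only scheduled.
func (b *FilterBatcher) AddFilter(topic string) {
	b.mu.Lock()
	b.pending = append(b.pending, topic)
	b.mu.Unlock()
}

// Deliver drops the message when the filter is not yet active -- the race
// hit by the push notification query/response round trip.
func (b *FilterBatcher) Deliver(topic, msg string) {
	b.mu.Lock()
	defer b.mu.Unlock()
	if b.active[topic] {
		fmt.Println("delivered:", msg)
	} else {
		fmt.Println("dropped (filter not yet registered):", msg)
	}
}

func main() {
	b := NewFilterBatcher(2 * time.Second) // 5s in the report; 2s for the demo
	b.AddFilter("/waku/1/0x25a24866/rfc26")
	b.Deliver("/waku/1/0x25a24866/rfc26", "server reply") // before the ticker: dropped
	time.Sleep(3 * time.Second)
	b.Deliver("/waku/1/0x25a24866/rfc26", "server reply (retry)") // after the ticker: delivered
}
```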

Potential solutions:

  • we mitigate by reducing the ticker interval, but the window would still be present
  • we delay sending until the filter is registered; that would add a concept of dependency to sending a message (i.e. sending message X depends on having registered filter Y) -- see the sketch below
  • we refetch from a store node once a new filter is subscribed, covering the period between scheduling the filter and its actual registration, with some padding of course to accommodate the timing (there's still a window where the message would be neither on the store node nor pushed by the filter, though)

Either way, this looks like a fairly critical message-reliability issue; lowering the batch ticker interval (currently, I believe, 5 seconds, which is quite high) is probably the quickest way to reduce the likelihood of this happening.
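For the delayed-send option, a minimal hedged sketch of what gating a publish on its filter dependency could look like. None of these names come from status-go; registered is assumed to be a channel the filter manager would close once the subscription for the reply topic actually succeeds.

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// publishAfterFilter gates a send on the filter it depends on: the publish
// only runs once the reply filter is confirmed registered, or fails after
// a timeout instead of sending into the race window.
func publishAfterFilter(registered <-chan struct{}, publish func() error, timeout time.Duration) error {
	select {
	case <-registered: // reply filter is live; safe to send the query
		return publish()
	case <-time.After(timeout):
		return errors.New("filter registration timed out; not publishing")
	}
}

func main() {
	registered := make(chan struct{})
	go func() { time.Sleep(time.Second); close(registered) }() // simulated subscription success
	err := publishAfterFilter(registered, func() error {
		fmt.Println("publishing PUSH_NOTIFICATION_QUERY")
		return nil
	}, 5*time.Second)
	fmt.Println("err:", err)
}
```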

cc @churik @fryorcraken @chaitanyaprem @richard-ramos @ilmotta and thank you @pavloburykh for all the logs

@richard-ramos (Member)

As a way to not block the release, I'd say we can reduce the ticker interval to something fast enough that this issue is unlikely to happen (maybe 500 ms or less?).

Then we can try to work out a proper solution. @chaitanyaprem and I had previously talked about this possible window in which messages would not be received due to the ticker, and doing the store node request could indeed be a good solution, taking into account that there might be a delay before the message arrives at the store node.

Nwaku store nodes will only accept messages with a timestamp within ±20 s of their current time, so based on this: when we register a filter and the ticker is triggered, we asynchronously wait 20 s and then send a store request. What do you think, @cammellos and @chaitanyaprem?
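A sketch of that proposal (a hypothetical helper, not the actual go-waku API), assuming the ±20 s store admission window above: once the batched subscription goes through, wait out the window asynchronously and run a single store query over the gap, with some padding.

```go
package main

import (
	"fmt"
	"time"
)

// backfillAfterSubscribe runs one store query after the store node's ~20s
// admission window has passed, covering the gap between scheduling the
// filter and the subscription actually going live (plus some padding).
func backfillAfterSubscribe(scheduledAt time.Time, queryStore func(from, to time.Time) error) {
	go func() {
		time.Sleep(20 * time.Second)              // let late messages land on the store node first
		from := scheduledAt.Add(-5 * time.Second) // pad before the scheduling time
		if err := queryStore(from, time.Now()); err != nil {
			fmt.Println("store backfill failed:", err)
		}
	}()
}

func main() {
	backfillAfterSubscribe(time.Now(), func(from, to time.Time) error {
		fmt.Printf("store query %s -> %s\n", from.Format(time.RFC3339), to.Format(time.RFC3339))
		return nil
	})
	time.Sleep(21 * time.Second) // keep the demo alive long enough for the backfill
}
```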

@churik (Member) commented Aug 26, 2024

Improved in status-im/status-go@4a18c85 as a temporary measure; the issue requires more investigation and action, so it has been postponed to 2.31.

@chaitanyaprem

> Nwaku store nodes will only accept messages with a timestamp within ±20 s of their current time, so based on this: when we register a filter and the ticker is triggered, we asynchronously wait 20 s and then send a store request. What do you think, @cammellos and @chaitanyaprem?

I am wondering whether doing a store query after 20 s would help, because the current periodic missing-messages store query happens every 20 seconds anyway. That means that, in the worst case, these messages would be fetched in the next periodic store query, wouldn't they (unless I am missing something)?

@pavloburykh (Contributor, Author)

The issue is still valid for me. @status-im/mobile, could you please also re-check whether you are receiving push notifications on iOS?

@VolodLytvynenko (Contributor)

Still valid. No pushes on iOS.

Devices:

  • iPhone 11 Pro Max, iOS 17

@mariia-skrypnyk commented Sep 19, 2024

@pavloburykh checked the 19/09 nightly on iOS - no PN.

iPhone 13, iOS 17

@churik (Member) commented Sep 19, 2024

I still have notifications working on release 2.30 and in PR builds (enabled both from the profile and from onboarding).

iPhone 14 Pro, iOS 17.6.1

@chaitanyaprem commented Sep 20, 2024

Referencing the issue being tracked for go-waku: waku-org/go-waku#1211. We thought that after reducing the ticker interval the issue no longer happened.

But it looks like the fix was not propagated correctly into the develop branch, which is why push notifications must be failing. Let me create a PR pushing the fix to the develop branch as well, and we can verify.

This is why @churik is not facing any issues: she is testing on 2.30, whereas others might be testing on the develop branch.

@chaitanyaprem

Pushed a fix in #21297.

@pavloburykh @churik, can you please revalidate and confirm whether the issue still happens?
