Rare and random app crashes on start #16144

Open
jrainville opened this issue Aug 19, 2024 · 18 comments

@jrainville
Member

Bug Report

Description

This issue happens very rarely and randomly.
This morning, I opened the app after the weekend and, right after login, it crashed while it was still loading.

Then, when I tried again, it worked.

I wonder if it's because it received a message it doesn't support, or just because it hadn't been opened in a few days.

Here are the logs:

geth-crash.log

app_20240819_102513.log

One interesting thing in the logs is that the logging out line appears right after the communityTags one. It would be good to get the logs from other crashes to compare:

DEBUG[08-19|14:25:23.834|rpc/handler.go:307]                                                                  Served wakuext_communityTags             reqid=20 duration="41.795µs"
DEBUG[08-19|14:25:23.835|github.com/status-im/status-go/api/geth_backend.go:2502]                             logging out  

Steps to reproduce

No idea. This time it was after the weekend, but I think I've seen it before in other contexts.
Additional info: I had just upgraded from RC8 to RC9, but there weren't many changes in between (no migration).

Expected behavior

Never crashes

Actual behavior

Sometimes crashes on login

Additional Information

  • Status desktop version: 2.30 RC9
  • Operating System: Ubuntu 22
@dlipicar
Contributor

RC9 just crashed quickly after launching.

nim_status_client-2024-08-19-114419.ips.zip
geth.log.zip
app_20240819_114334.log

@jrainville
Member Author

Yours seems like a harder crash. It doesn't have the logging out line in the logs; it looks like the process just died.

I don't think they are the culprit, but there are a lot of wakuext_getLatestVerificationRequestFrom calls. That should be fixed as well.

@igor-sirotin
Contributor

igor-sirotin commented Aug 19, 2024

Same for me. Some stuff from the logs:

app log:

DBG 2024-08-19 14:51:32.011+01:00 [threadpool task thread] initiating task   topics="task-threadpool" tid=31968995 file=threadpool.nim:56 messageType=AsyncMarkAllMessagesReadTaskArg:ObjectType threadid=31968995 task="{\"$type\":\"AsyncMarkAllMessagesReadTaskArg:ObjectType\",\"chatId\":\"0x02b5bdaf5a25fcfe2ee14c501fab1836b8de57f61621080c3d52073d16de0d98d6a20cb2a6-e4b9-47e3-8fc1-cd815976a088\",\"vptr\":105553124926880,\"slot\":\"onMarkAllMessagesRead\"}"
DBG 2024-08-19 14:51:32.011+01:00 NewBE_callPrivateRPC                       topics="rpc" tid=31968995 file=core.nim:27 rpc_method=wakuext_markAllRead
WRN 2024-08-19 14:51:32.011+01:00 qt warning                                 topics="qt" tid=31968955 text="QML Connections: Implicitly defined onFoo properties in Connections are deprecated. Use this syntax instead: function onFoo(<arguments>) { ... }" file=qrc:/imports/shared/views/chat/MessageView.qml:383 category=default
INF 2024-08-19 14:51:32.024+01:00 shutting down...                           topics="status-app" tid=31968955 file=nim_status_client.nim:221
INF 2024-08-19 14:51:32.024+01:00 logging out...                             topics="app-controller" tid=31968955 file=app_controller.nim:300

geth.log:

DEBUG[08-19|13:51:32.024|github.com/status-im/status-go/api/geth_backend.go:2500]                                    logging out                              package=status-go/api.GethStatusBackend
INFO [08-19|13:51:32.024|github.com/status-im/status-go/services/wallet/service.go:271]                              wallet will be stopped 

So it's not a panic; it just logs out, for some reason.

@jrainville
Member Author

@igor-sirotin yours looks more similar to mine. What's the line before the logging out for you?

@igor-sirotin
Contributor

nothing suspicious 🤔

DEBUG[08-19|13:51:32.010|github.com/status-im/status-go/protocol/messenger_mailserver.go:125]                        mailserver request performed successfully mailserverID=store-01.do-ams3.shards.test
DEBUG[08-19|13:51:32.017|github.com/status-im/status-go/protocol/messenger.go:2063]                                  installations info                       Number of installations=3 Number of enabled installations=2
DEBUG[08-19|13:51:32.017|github.com/status-im/status-go/protocol/common/message_sender.go:164]                       sending a private message                public-key=0x0476397bc3b03711383d714d115b2e3999a3477eb441b295ee02385c7fa35328368eec4120c1d7286b0916d38aef7bc415f800cd26d6df1642592183e87e0ab508 site=SendPrivate
DEBUG[08-19|13:51:32.017|github.com/status-im/status-go/protocol/common/message_sender.go:446]                       sending private message                  recipient=0x0476397bc3b03711383d714d115b2e3999a3477eb441b295ee02385c7fa35328368eec4120c1d7286b0916d38aef7bc415f800cd26d6df1642592183e87e0ab508
DEBUG[08-19|13:51:32.017|github.com/status-im/status-go/protocol/pushnotificationclient/client.go:724]               handling message scheduled 
DEBUG[08-19|13:51:32.023|github.com/status-im/status-go/protocol/communities/community_description_encryption.go:85] failed to decrypt community private data keyIDSeqNo=2fb45f80ff7d248eaa1175c0fb80e72d9d3fe92dcd7c35b650348f9c57c2cb5e53 error="no ratchet key for given keyID"
DEBUG[08-19|13:51:32.024|github.com/status-im/status-go/api/geth_backend.go:2500]                                    logging out                              package=status-go/api.GethStatusBackend

@jrainville
Member Author

nothing suspicious 🤔

Indeed. So it's not the same lines as mine. We'll have to dig deeper and/or hope to catch it with a debugger.

@igor-sirotin
Contributor

igor-sirotin commented Aug 19, 2024

@jrainville I just remembered that I actually had an OS popup saying that Status crashed. So probably some goroutine crashed and the rest somehow managed to log out cleanly 🤔
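For reference, here is a minimal, standalone Go sketch (not status-go code; the messages and timings are made up) of the two behaviours we'd want to tell apart: a panic that is recovered inside its goroutine lets the rest of the process keep running and shut down cleanly, while an unrecovered panic in any goroutine kills the whole process and dumps a goroutine stack trace to stderr, with no further log lines after it.

// Standalone illustration only, not status-go code.
package main

import (
    "log"
    "time"
)

func main() {
    // Case 1: the panic is recovered inside the goroutine, so the rest of
    // the process keeps running and could still go through a clean logout.
    go func() {
        defer func() {
            if r := recover(); r != nil {
                log.Printf("recovered from panic: %v; a clean logout is still possible", r)
            }
        }()
        panic("background task failed")
    }()
    time.Sleep(100 * time.Millisecond)
    log.Println("process still alive after the recovered panic")

    // Case 2: an unrecovered panic in any goroutine terminates the whole
    // process with a stack trace on stderr; nothing is logged afterwards.
    go func() {
        panic("unrecovered panic: the entire process dies here")
    }()
    time.Sleep(100 * time.Millisecond)
    log.Println("this line is never reached")
}

Since my geth log ends with a clean logging out line and no stack trace, the unrecovered-panic case seems less likely, at least on the Go side.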

@jrainville
Member Author

New crash I got on 2.30.0
app_20240830_110147.log

geth-crash2.log

@jrainville
Member Author

Another crash on 2.30.0, again when first starting the app for the day, this time after the weekend.
app_20240903_092756.log
geth-crash-3.log

@jrainville
Member Author

Another one. This time, my app had been open since yesterday. I downloaded a new build, so I closed the app, opened the new build, and it crashed on login as usual.
app_20240905_103209.log
geth-crash4.log

@jrainville
Member Author

Some interesting findings from looking at the logs: the last log lines before the call to logging out are always the same:

DEBUG[08-19|14:25:23.834|rpc/handler.go:307]                                                                  Served wallet_getEthereumChains          reqid=18 duration="240.561µs"
DEBUG[08-19|14:25:23.834|rpc/handler.go:307]                                                                  Served wakuext_communityTags             reqid=20 duration="41.795µs"
DEBUG[08-19|14:25:23.835|github.com/status-im/status-go/api/geth_backend.go:2502]                             logging out                              package=status-go/api.GethStatusBackend

DEBUG[08-30|15:01:59.879|rpc/handler.go:307]                                                                  Served wallet_getEthereumChains          reqid=18 duration="329.087µs"
DEBUG[08-30|15:01:59.880|rpc/handler.go:307]                                                                  Served wakuext_communityTags             reqid=20 duration="52.392µs"
DEBUG[08-30|15:01:59.881|github.com/status-im/status-go/api/geth_backend.go:2535]                             logging out                              package=status-go/api.GethStatusBackend

DEBUG[09-03|13:28:29.352|rpc/handler.go:307]                                                                  Served wallet_getEthereumChains          reqid=18 duration="290.468µs"
DEBUG[09-03|13:28:29.352|rpc/handler.go:307]                                                                  Served wakuext_communityTags             reqid=20 duration="39.519µs"
DEBUG[09-03|13:28:29.353|github.com/status-im/status-go/api/geth_backend.go:2535]                             logging out                              package=status-go/api.GethStatusBackend

DEBUG[09-05|14:32:22.068|rpc/handler.go:307]                                                                  Served wakuext_communityTags             reqid=20 duration="53.745µs"
DEBUG[09-05|14:32:22.069|rpc/handler.go:307]                                                                  Served wallet_getEthereumChains          reqid=18 duration="351.101µs"
DEBUG[09-05|14:32:22.070|github.com/status-im/status-go/api/geth_backend.go:2535]                             logging out                              package=status-go/api.GethStatusBackend

I'll check whether the handling we do on the Nim side might have an error that could cause a crash.
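In case it helps to redo this comparison on future crashes, here is a rough, hypothetical helper (not part of the codebase; the name, context size and usage are arbitrary) that prints the few log lines immediately preceding each logging out entry in a geth log:

// Hypothetical helper, not part of the codebase: print the lines that
// immediately precede each "logging out" entry in a geth log.
package main

import (
    "bufio"
    "fmt"
    "os"
    "strings"
)

func main() {
    const contextLines = 3 // how many preceding lines to keep

    if len(os.Args) < 2 {
        fmt.Fprintln(os.Stderr, "usage: logctx <geth.log>")
        os.Exit(1)
    }
    f, err := os.Open(os.Args[1])
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    defer f.Close()

    var window []string // sliding window of the last few lines seen
    scanner := bufio.NewScanner(f)
    scanner.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // geth lines can be very long
    for scanner.Scan() {
        line := scanner.Text()
        if strings.Contains(line, "logging out") {
            fmt.Println("--- context before 'logging out' ---")
            for _, l := range window {
                fmt.Println(l)
            }
            fmt.Println(line)
        }
        window = append(window, line)
        if len(window) > contextLines {
            window = window[1:]
        }
    }
    if err := scanner.Err(); err != nil {
        fmt.Fprintln(os.Stderr, err)
    }
}

Running it over geth-crash.log, geth-crash2.log, geth-crash-3.log and geth-crash4.log is how I'd redo the comparison above on any new crash log.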

@jrainville jrainville self-assigned this Sep 5, 2024
@jrainville jrainville moved this to In Progress in Status Desktop/Mobile Board Sep 5, 2024
@jrainville
Member Author

I've been trying to reproduce the crash after adding logs around the possible culprits, but somehow I can't reproduce it at all. Maybe it was fixed on master, or maybe I'm just lucky and my current params or network conditions happen to avoid it.

Let me know if you can reproduce it on master, and paste the logs along with anything you can remember about the scenario.

@jrainville jrainville moved this from In Progress to Next in Status Desktop/Mobile Board Oct 4, 2024
@jrainville
Member Author

I was able to reproduce it on my branch, which is on top of b8c4dfac34d3db3cac84dcec89b3175de209f68f:

app_20241009_110939.log
geth-crash-7.log

@jrainville
Member Author

This issue keeps evading my logs. It only happens when I have no logs 💀

@jrainville
Member Author

I could reproduce it in 2.31.1. The logs seem a bit different, so it might be a different issue, or hopefully it's the same one and it helps us track down the culprit:

app_20241121_093032.log
geth.log

@jrainville
Member Author

I still can't manage to reproduce the issue when I add logs.

@jrainville jrainville modified the milestones: 2.32.0 Beta, 2.33.0 Beta Dec 4, 2024
@jrainville jrainville removed their assignment Jan 27, 2025
@jrainville jrainville modified the milestones: 2.33.0 Beta, 2.34.0 Beta Jan 27, 2025