I'm using Sonic for a bigger project and it's working great, except that it crashes at irregular intervals (every 1–2 days).
The communication is done via the Elixir Sonix client, and Sonic runs in the official Docker container (latest version).
The config is the default one, and the stored data is about 25 MB (~24 MB RAM usage; CPU usage is very low).
The log is flooded with these lines:
| 2021-07-02T11:45:39.424+02:00 | thread 'thread 'thread 'thread 'thread 'thread 'thread 'thread 'sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:sonic-channel-clientsonic-channel-client' panicked at 'closing channelsonic-channel-clientsonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:', ' panicked at 'src/channel/handle.rs188:21
| 2021-07-02T11:45:39.424+02:00 | :188188:21
| 2021-07-02T11:45:39.424+02:00 | :21
| 2021-07-02T11:45:39.424+02:00 | closing channel', src/channel/handle.rs:188:21
| 2021-07-02T11:45:39.424+02:00 | sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T11:45:39.424+02:00 | ' panicked at 'sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T11:45:39.424+02:00 | sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T11:45:39.424+02:00 | closing channel', src/channel/handle.rs:188:21
| 2021-07-02T12:00:11.153+02:00 | (WARN) - took a lot of time: 125ms to process channel message
| 2021-07-02T12:14:19.737+02:00 | thread 'thread 'sonic-channel-clientsonic-channel-client' panicked at '' panicked at 'closing channelclosing channel', ', src/channel/handle.rssrc/channel/handle.rs::188188::2121
| 2021-07-02T12:14:19.737+02:00 | thread 'thread 'sonic-channel-clientsonic-channel-client' panicked at '' panicked at 'closing channelclosing channel', ', src/channel/handle.rssrc/channel/handle.rs::188188::2121
| 2021-07-02T12:14:19.737+02:00 | thread 'sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T12:14:19.737+02:00 | thread 'sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T12:14:19.737+02:00 | thread 'sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T12:14:19.737+02:00 | thread 'sonic-channel-client' panicked at 'closing channel', src/channel/handle.rs:188:21
| 2021-07-02T13:53:21.945+02:00 | (WARN) - took a lot of time: 54ms to process channel message
| 2021-07-02T13:53:21.967+02:00 | (WARN) - took a lot of time: 76ms to process channel message
| 2021-07-02T13:53:21.997+02:00 | (WARN) - took a lot of time: 105ms to process channel message
Do you have any clue what is causing these log messages and the crashes?
After restarting the container, everything works fine again.
Thanks a lot!!
Philipp
I'm not the developer of the Elixir connector. Would you mind recording logs of all the commands you're sending via this library (e.g. from your client application), and posting here the commands that are sent right before Sonic panics?
I've never experienced this with my JS connectors, and given the thread 'thread 'thread [..] error trace I'm curious what could be happening!
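One way to capture exactly what the client sends is a small logging proxy placed between the Elixir client and Sonic, since the Sonic channel protocol is plain text over TCP (default port 1491). Below is a minimal stdlib-only Python sketch; the listen port and the upstream host/port are placeholders for your setup, not values from this issue.

```python
# Minimal logging TCP proxy (a sketch, not part of Sonic or Sonix):
# relays bytes between the client and Sonic while recording every chunk,
# so the commands sent right before a panic can be inspected afterwards.
import socket
import threading

def pump(src, dst, log, direction):
    """Forward bytes from src to dst, appending each chunk to the log."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            log.append((direction, data))
            dst.sendall(data)
    except OSError:
        pass
    finally:
        # Tear down both ends so the opposite pump thread unblocks too.
        for s in (src, dst):
            try:
                s.shutdown(socket.SHUT_RDWR)
            except OSError:
                pass

def proxy_once(listen_port, upstream_host, upstream_port, log):
    """Accept one client connection and relay it to the upstream Sonic."""
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", listen_port))
    server.listen(1)
    client, _ = server.accept()
    upstream = socket.create_connection((upstream_host, upstream_port))
    t = threading.Thread(target=pump, args=(client, upstream, log, ">>"))
    t.start()
    pump(upstream, client, log, "<<")
    t.join()
    server.close()

# Usage sketch: point the Elixir client at 127.0.0.1:11491 instead of
# Sonic directly; "sonic-host" and both ports are placeholders.
#   log = []
#   proxy_once(11491, "sonic-host", 1491, log)
#   for direction, data in log:
#       print(direction, data.decode(errors="replace").rstrip())
```

This only handles a single connection, which is usually enough to reproduce the sequence of commands preceding a panic; the `>>` entries are client commands, the `<<` entries are Sonic's responses.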