A Spotify daemon

Overview

Spotifyd


An open source Spotify client running as a UNIX daemon.

Spotifyd streams music just like the official client, but is more lightweight and supports more platforms. Spotifyd also supports the Spotify Connect protocol, which makes it show up as a device that can be controlled from the official clients.

Note: Spotifyd requires a Spotify Premium account.

To read about how to install and configure Spotifyd, take a look at our wiki!

Common issues

  • Spotifyd will not work without Spotify Premium
  • The device name cannot contain spaces
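
Both of these issues surface through the configuration file. As an illustrative sketch (key names follow the options visible in the debug log further below; values are placeholders, and the exact set of supported keys may vary by version — check the wiki for the authoritative list):

```toml
# Hypothetical minimal ~/.config/spotifyd/spotifyd.conf
[global]
username = "your-spotify-username"   # requires a Premium account
password = "your-spotify-password"
backend = "pulseaudio"
device_name = "RaspberryPiTV"        # must not contain spaces (see above)
bitrate = 320
```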

Contributing

We always appreciate help with the development of Spotifyd! If you are new to programming, open source, or Rust in general, take a look at issues tagged with good first issue. These are normally easy to resolve and don't take much time to implement.

Credits

This project would not have been possible without the amazing reverse engineering work done in librespot, mostly by plietar.

Issues
  • meta: adding contributors to file

    opened by SirWindfield 193
  • Panic: zero-initialisation of ov_callbacks

    Description

    I had successfully compiled and used Spotifyd for several days before this issue appeared. I then decided I wanted to enable additional feature flags, so I uninstalled the original instance and rebuilt.

    However, after installing this new instance, I receive the following log messages when trying to play music from a Connect-capable client:

    
    No proxy specified
    Using software volume controller.
    Connecting to AP "gew1-accesspoint-b-qzv4.ap.spotify.com:443"
    Authenticated as "USERNAME" !
    Country: "GB"
    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))
    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))
    Loading <Broken> with Spotify URI <spotify:track:6qF9QltwbDeujrVrzpCoLj>
    Caught panic with message: attempted to zero-initialize type `librespot_tremor::tremor_sys::ov_callbacks`, which is invalid
    Caught panic with message: called `Result::unwrap()` on an `Err` value: "SendError(..)"
    Player thread panicked!
    

    The application then exits.
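    The panic message points at a known class of failure: `ov_callbacks` is a C struct made up of function pointers, and newer Rust toolchains (around 1.48 onward) reject zero-initializing such types through `std::mem::zeroed`, because the all-zero bit pattern is not a valid function pointer. A small sketch (independent of librespot, illustrating why the zero pattern is reserved) shows the niche that makes zeroing invalid:

```rust
use std::mem::size_of;

fn main() {
    // `fn()` pointers are non-nullable: the all-zero bit pattern is reserved.
    // That is why `Option<fn()>` needs no extra space (None occupies the 0
    // pattern), and why `mem::zeroed` on a struct of C function pointers,
    // such as `ov_callbacks`, is rejected at runtime with a panic like the
    // one in the log above.
    assert_eq!(size_of::<Option<fn()>>(), size_of::<fn()>());
    println!("fn pointers have a niche at 0; zero-initializing them is invalid");
}
```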

    Upon realising this, I rebuilt the source with the original options and reinstalled, but the same error appeared. I tried a variety of build options and appropriate configuration tweaks, including using the crates.io release, to no avail.

    To Reproduce

    Not something I can give steps for: the issue appeared suddenly and persists across rebuilds.

    Logs

    Loading config from "/home/pi/.config/spotifyd/spotifyd.conf"

    CliConfig { config_path: None, no_daemon: true, verbose: true, pid: None, shared_config: SharedConfigValues { username: Some("taken out for privacy"), password: Some("taken out for privacy"), password_cmd: None, use_keyring: false, on_song_change_hook: None, cache_path: None, no-audio-cache: false, backend: Some(PulseAudio), volume_controller: None, device: None, control: None, mixer: None, device_name: Some("RaspberryPiTV"), bitrate: Some(Bitrate320), volume_normalisation: false, normalisation_pregain: None, zeroconf_port: None, proxy: None } }

    Found shell "/bin/bash" using SHELL environment variable.

    No proxy specified

    registering with poller

    registering with poller

    registering with poller

    registering with poller

    build; num-workers=4

    registering with poller

    Using software volume controller.

    starting background reactor

    adding I/O source: 0

    registering with poller

    Zeroconf server listening on 0.0.0.0:37183

    adding I/O source: 4194305

    registering with poller

    event Writable Token(4194305)

    adding I/O source: 8388610

    registering with poller

    loop process - 1 events, 0.000s

    event Writable Token(8388610)

    loop process - 1 events, 0.000s

    adding I/O source: 12582915

    registering with poller

    event Writable Token(12582915)

    loop process - 1 events, 0.000s

    park; waiting for idle connection: "http://apresolve.spotify.com"

    Http::connect("http://apresolve.spotify.com/")

    consuming notification queue

    resolving host="apresolve.spotify.com", port=80

    scheduling Read for: 0

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    sending packet to [ff02::fb]:5353

    scheduling Read for: 3

    loop poll - 1.90442ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 505571470 }

    loop process, 41.351µs

    event Readable | Writable Token(8388610)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    received packet from [fe80::6c2e:d9c1:337:5394]:5353

    received packet from [fe80::6c2e:d9c1:337:5394]:5353 with no query

    scheduling Read for: 2

    scheduling Read for: 2

    loop poll - 326.755µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 505963465 }

    loop process, 36.055µs

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 45.666µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 506065445 }

    loop process, 34µs

    loop poll - 128.534738ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 634651312 }

    loop process, 56.574µs

    connecting to 34.98.74.57:80

    adding I/O source: 16777220

    registering with poller

    scheduling Write for: 4

    event Writable Token(16777220)

    loop process - 1 events, 0.000s

    loop poll - 18.134288ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 653032245 }

    loop process, 22.926µs

    read_keep_alive; is_mid_message=false

    scheduling Read for: 4

    should_keep_alive(version=Http11, header=None) = true

    Client::encode has_body=false, method=None

    reclaiming write buf Vec

    flushed 47 bytes

    flushed State { reading: Init, writing: KeepAlive, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 199.664µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 653334649 }

    loop process, 16.074µs

    event Readable | Writable Token(16777220)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 594 bytes

    Response.parse([Header; 100], [u8; 594])

    Response.parse Complete(176)

    maybe_literal not found, copying "Via"

    parsed 5 headers (176 bytes)

    incoming body is content-length (418 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=None) = true

    Conn::read_body

    decode; state=Length(418)

    flushed State { reading: Body(Length(0)), writing: KeepAlive, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 27.867997ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 681227461 }

    loop process, 18.445µs

    Conn::read_body

    decode; state=Length(0)

    incoming body completed

    scheduling Read for: 4

    maybe_notify; read_from_io blocked

    read_keep_alive; is_mid_message=false

    scheduling Read for: 4

    signal: Want

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    poll_want: taker wants!

    pool dropped, dropping pooled ("http://apresolve.spotify.com")

    loop poll - 169.128µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 681475440 }

    loop process, 17.796µs

    Connecting to AP "gew1-accesspoint-b-n2lk.ap.spotify.com:443"

    adding I/O source: 20971525

    registering with poller

    scheduling Write for: 5

    read_keep_alive; is_mid_message=false

    scheduling Read for: 4

    client tx closed

    State::close_read()

    State::close_write()

    flushed State { reading: Closed, writing: Closed, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 4

    signal: Closed

    loop poll - 152.628µs

    loop time - Instant { tv_sec: 5973, tv_nsec: 814064256 }

    loop process, 18.722µs

    event Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 24.842684ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 838934625 }

    loop process, 22.519µs

    scheduling Read for: 5

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 22.965875ms

    loop time - Instant { tv_sec: 5973, tv_nsec: 884802432 }

    loop process, 22.981µs

    flushing framed transport

    writing; remaining=133

    framed transport flushed

    scheduling Read for: 5

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 179.811013ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 87816409 }

    loop process, 22.759µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 619.45243ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 707304894 }

    loop process, 119.943µs

    attempting to decode a frame

    frame decoded from buffer

    Authenticated as "REDACTED" !

    new Session[0]

    new Spirc[0]

    new MercuryManager

    input volume:65535 to mixer: 65535

    attempting to decode a frame

    frame decoded from buffer

    Session[0] strong=4 weak=2

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    Country: "GB"

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=876

    framed transport flushed

    loop poll - 562.548µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 708687302 }

    loop process, 40.351µs

    new Player[0]

    Using PulseAudio sink

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 17.572444ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 726319912 }

    loop process, 39.629µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 16.785861ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 743167309 }

    loop process, 39.944µs

    subscribed uri=hm://remote/user/REDACTED/ count=0

    loop poll - 8.352µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 743305474 }

    loop process, 38.499µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 2.744613ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 746107697 }

    loop process, 37.463µs

    Modify_watch: Watch { fd: 23, read: true, write: false }, poll_now: false

    adding I/O source: 25165828

    registering with poller

    Dropping AConnection

    scheduling Read for: 4

    scheduling Read for: 4

    D-Bus i/o poll ready: 23 is NotReady

    handle_msgs: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    handle_msgs: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    loop poll - 216.682µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 751050023 }

    loop process, 45.592µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 49.369448ms

    loop time - Instant { tv_sec: 5974, tv_nsec: 800487322 }

    loop process, 44.703µs

    Polling message stream

    msgstream found Ok(Ready(Some((Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired")))))

    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    Polling message stream

    msgstream found Ok(Ready(Some((Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired")))))

    Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698217680 1605868501989

    loop poll - 12.981µs

    loop time - Instant { tv_sec: 5974, tv_nsec: 800946334 }

    loop process, 48.759µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 295.089105ms

    loop time - Instant { tv_sec: 5975, tv_nsec: 96105753 }

    loop process, 48.814µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 4.213226017s

    loop time - Instant { tv_sec: 5979, tv_nsec: 309408065 }

    loop process, 21.741µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    loop poll - 196.614689ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 506057828 }

    loop process, 22.703µs

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 74.851µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 506169549 }

    loop process, 17.518µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 66.485193ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 572681649 }

    loop process, 20.259µs

    Conn::read_head

    adding I/O source: 29360134

    registering with poller

    scheduling Read for: 6

    event Writable Token(29360134)

    loop process - 1 events, 0.000s

    flushed State { reading: Init, writing: Init, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 107.906µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 572822221 }

    loop process, 16.407µs

    event Readable | Writable Token(29360134)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 1.541295ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 574388849 }

    loop process, 18.092µs

    event Readable | Writable Token(29360134)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 3.530881ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 577948026 }

    loop process, 17.259µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 294.498761ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 872473657 }

    loop process, 47.518µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeHello "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698222811 1605868507120

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=393

    framed transport flushed

    loop poll - 212.275µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 873014465 }

    loop process, 43.981µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 27.351412ms

    loop time - Instant { tv_sec: 5979, tv_nsec: 900432468 }

    loop process, 46.426µs

    Polling message stream

    msgstream found Ok(NotReady)

    loop poll - 12.148µs

    loop time - Instant { tv_sec: 5979, tv_nsec: 900612003 }

    loop process, 49.203µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 407.842414ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 308527657 }

    loop process, 47.777µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 220.500978ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 529104893 }

    loop process, 47.907µs

    Conn::read_head

    adding I/O source: 33554438

    registering with poller

    scheduling Read for: 6

    event Writable Token(33554438)

    loop process - 1 events, 0.000s

    flushed State { reading: Init, writing: Init, keep_alive: Busy, error: None }

    wants_read_again? false

    loop poll - 268.849µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 529448093 }

    loop process, 58.462µs

    event Readable | Writable Token(33554438)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 847.934µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 530378970 }

    loop process, 43.407µs

    event Readable | Writable Token(33554438)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 7.834475ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 538281777 }

    loop process, 47.296µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    scheduling Read for: 2

    loop poll - 77.209149ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 615564184 }

    loop process, 45.148µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 318.811µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 615953865 }

    loop process, 39.129µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 227.069635ms

    loop time - Instant { tv_sec: 5980, tv_nsec: 843083388 }

    loop process, 46.537µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698223783 1605868508092

    loop poll - 10.407µs

    loop time - Instant { tv_sec: 5980, tv_nsec: 843346959 }

    loop process, 44.536µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 478.45311ms

    loop time - Instant { tv_sec: 5981, tv_nsec: 321866568 }

    loop process, 41.704µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 938.291809ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 260224080 }

    loop process, 48.315µs

    Conn::read_head

    adding I/O source: 37748742

    registering with poller

    event Readable | Writable Token(37748742)

    loop process - 1 events, 0.000s

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 843.86µs

    loop time - Instant { tv_sec: 5982, tv_nsec: 261142976 }

    loop process, 42.648µs

    event Readable | Writable Token(37748742)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 2.008289ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 263217857 }

    loop process, 43.518µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    loop poll - 93.033412ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 356318842 }

    loop process, 49.073µs

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 221.33µs

    loop time - Instant { tv_sec: 5982, tv_nsec: 356617671 }

    loop process, 44.907µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 259.871256ms

    loop time - Instant { tv_sec: 5982, tv_nsec: 616556834 }

    loop process, 49.444µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698225544 1605868509853

    loop poll - 12.573µs

    loop time - Instant { tv_sec: 5982, tv_nsec: 616906329 }

    loop process, 53.907µs

    event Readable Token(0)

    loop process - 1 events, 0.000s

    scheduling Read for: 0

    loop poll - 543.555099ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 160539298 }

    loop process, 43.388µs

    Conn::read_head

    adding I/O source: 41943046

    registering with poller

    event Readable | Writable Token(41943046)

    loop process - 1 events, 0.000s

    read 222 bytes

    Request.parse([Header; 100], [u8; 222])

    Request.parse Complete(222)

    maybe_literal not found, copying "Keep-Alive"

    parsed 6 headers (222 bytes)

    incoming body is content-length (0 bytes)

    expecting_continue(version=Http11, header=None) = false

    should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true

    read_keep_alive; is_mid_message=true

    should_keep_alive(version=Http11, header=None) = true

    Server::encode has_body=true, method=Some(Get)

    encoding chunked 450B

    flushed 546 bytes

    scheduling Read for: 6

    maybe_notify; read_from_io blocked

    flushed State { reading: Init, writing: Init, keep_alive: Idle, error: None }

    wants_read_again? false

    loop poll - 787.379µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 161394954 }

    loop process, 38.092µs

    event Readable | Writable Token(41943046)

    loop process - 1 events, 0.000s

    Conn::read_head

    read 0 bytes

    parse eof

    State::close_read()

    read eof

    read_keep_alive; is_mid_message=true

    flushed State { reading: Closed, writing: Init, keep_alive: Disabled, error: None }

    wants_read_again? false

    shut down IO

    deregistering handle with poller

    dropping I/O source: 6

    loop poll - 4.593349ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 166047450 }

    loop process, 43.944µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 41.998152ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 208114046 }

    loop process, 41.425µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeLoad "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698226110 1605868509853

    State: context_uri: "spotify:album:69fOwmdCZIaWPE4OLLnuQi" index: 2 position_ms: 7039 status: kPlayStatusPause position_measured_at: 1605868510465 context_description: "" shuffle: false repeat: false playing_from_fallback: true row: 0 playing_track_index: 2 track {gid: "\031\327?\022\[email protected]\305\246*\272\372\235\n\334%"} track {gid: "#(\261\262\022\310G\235\267.O\203\333v8\366"} track {gid: "\323E\211\022\224\232NA\257o\303\371X\236-\025"} track {gid: "2|8p\322^Kl\203\321Loz\351\363e"} track {gid: "oUlG\312\005HF\210\266s!\3548\210m"} track {gid: "\350\312\377(\233|E.\252x\302k*\256\326A"} track {gid: "%\002\306[\262\023D\313\253\302V\213-Y\000\265"} track {gid: "_\372\34678\377MR\254\034\321\333\3013\036e"} track {gid: "?\357\312\321\336NK\022\242\343\226c\301~#\203"} track {gid: "\200\022\345\251\2455C[\246\245\271\352\206\031\2504"} track {gid: "\211\320!W/\H\221\203hN\376<6\361\220"} track {gid: "\275\261\002\316\377DI\231\221\252l\032Q\267l\205"} track {gid: "\201\345-\2178BG\213\255\306\231\211\322&"} track {gid: "\344<7!t\036C\005\227:y\242G\037\302\373"} track {gid: "\310\252\t\254\343AM\336\253\230\206JRa\310~"} track {gid: "$\327*\310\237cK\340\256\231\353\215\215\365\200\"} track {gid: "\252\365\n\227r\023O\272\251\023\234\352H\251\335\345"} track {gid: "\306\017\265\0252<EA\270\300\254#\316\306\[email protected]"} track {gid: "@ty\372\201\251G\377\220\213\r\221\302\333\202\331"}

    Frame has 19 tracks

    Loading context: spotify:album:69fOwmdCZIaWPE4OLLnuQi index: [2] of 19

    command=Load(SpotifyId { id: 280828155756069103410723341265459358997, audio_type: Track }, false, 7039)

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=899

    framed transport flushed

    loop poll - 198.812µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 209566175 }

    loop process, 40.203µs

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    received packet from 192.168.1.237:5353

    scheduling Read for: 1

    scheduling Read for: 1

    sending packet to 224.0.0.251:5353

    event Readable | Writable Token(4194305)

    loop process - 1 events, 0.000s

    scheduling Read for: 2

    loop poll - 9.944059ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 219570696 }

    loop process, 44.148µs

    received packet from 192.168.1.182:5353

    received packet from 192.168.1.182:5353 with no query

    scheduling Read for: 1

    scheduling Read for: 1

    scheduling Read for: 2

    loop poll - 167.257µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 219807786 }

    loop process, 40.777µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 13.378034ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 233247819 }

    loop process, 46.814µs

    Polling message stream

    msgstream found Ok(NotReady)

    Loading with Spotify URI spotify:track:6qF9QltwbDeujrVrzpCoLj

    new AudioKeyManager

    loop poll - 12µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 233424817 }

    loop process, 141.387µs

    Downloading file 18e55e1787e41646e9d2725de2f0c9648c046ee0

    requesting chunk 0

    new ChannelManager

    scheduling Read for: 5

    flushing framed transport

    writing; remaining=102

    framed transport flushed

    loop poll - 345.07µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 233940218 }

    loop process, 47.036µs

    consuming notification queue

    loop poll - 105.147µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 234118882 }

    loop process, 52.203µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 24.65365ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 258847753 }

    loop process, 44.685µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 13.338442ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 272257268 }

    loop process, 47.703µs

    consuming notification queue

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 384.828µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 272715392 }

    loop process, 45.499µs

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 115.499µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 272900223 }

    loop process, 41.037µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 5.391635ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 278354765 }

    loop process, 41.777µs

    loop poll - 46.869µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 278465763 }

    loop process, 42.815µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 12.544063ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 291074196 }

    loop process, 49.166µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop poll - 322.348µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 291470821 }

    loop process, 78.906µs

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 1.434075ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 293007468 }

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    loop process, 318.922µs

    attempting to decode a frame

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 921.878µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 294275989 }

    loop process, 43.758µs

    loop poll - 80.61µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 294424172 }

    loop process, 43.222µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 829.805µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 295319124 }

    loop process, 41.61µs

    loop poll - 44.759µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 295427863 }

    loop process, 42.129µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 3.165756ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 298657118 }

    loop process, 41.277µs

    loop poll - 46.221µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 298766783 }

    loop process, 42.204µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 16.197906ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 315028244 }

    loop process, 44.962µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 5.874888ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 321299090 }

    loop process, 48.573µs

    loop poll - 537.234µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 321912767 }

    loop process, 44.129µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 15.654931ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 337634031 }

    loop process, 39.814µs

    chunk 0 / 26 complete

    requesting chunk 1

    loop poll - 146.72µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 337842824 }

    loop process, 39.426µs

    scheduling Read for: 5

    Normalisation Data: NormalisationData { track_gain_db: -6.9300003, track_peak: 0.9973827, album_gain_db: -10.010002, album_peak: 1.0153364 }

    Applied normalisation factor: 0.45029798

    flushing framed transport

    writing; remaining=53

    Caught panic with message: attempted to zero-initialize type librespot_tremor::tremor_sys::ov_callbacks, which is invalid

    framed transport flushed

    loop poll - 289.238µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 338198487 }

    loop process, 50.258µs

    loop poll - 16.778µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 338295245 }

    loop process, 49.036µs

    drop Player[0]

    loop poll - 474.254µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 338848775 }

    loop process, 44.647µs

    Polling message stream

    msgstream found Ok(NotReady)

    loop poll - 14.74µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 339007976 }

    loop process, 38.537µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 33.415502ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 372480922 }

    loop process, 41.962µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 18.377471ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 390924947 }

    loop process, 41.703µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 17.24143ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 408231820 }

    loop process, 39.666µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 57.642046ms

    loop time - Instant { tv_sec: 5983, tv_nsec: 465937365 }

    loop process, 46.647µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypeNotify "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698226403 1605868510712

    loop poll - 12.5µs

    loop time - Instant { tv_sec: 5983, tv_nsec: 466218380 }

    loop process, 45.999µs

    event Readable | Writable Token(20971525)

    loop process - 1 events, 0.000s

    attempting to decode a frame

    frame decoded from buffer

    attempting to decode a frame

    scheduling Read for: 5

    flushing framed transport

    framed transport flushed

    loop poll - 1.208130976s

    loop time - Instant { tv_sec: 5984, tv_nsec: 674418207 }

    loop process, 41.963µs

    Polling message stream

    msgstream found Ok(NotReady)

    kMessageTypePlay "Nokia 5.3" a0f1362bbbc96bff82ba9a7ebcfc56ffcacfdeda 1698227616 1605868510712

    Caught panic with message: called Result::unwrap() on an Err value: "SendError(..)"

    drop Spirc[0]

    Shutting down player thread ...

    Player thread panicked!

    drop Session[0]

    drop AudioKeyManager

    drop ChannelManager

    drop MercuryManager

    Dropping AMessageStream

    AMessageStream telling ADriver to quit

    shutdown; state=pool::State { lifecycle: Running, num_futures: 0 }

    -> transitioned to shutdown

    -> shutting down workers

    dropping I/O source: 3

    dropping I/O source: 4

    deregistering handle with poller

    dropping I/O source: 0

    deregistering handle with poller

    dropping I/O source: 1

    deregistering handle with poller

    dropping I/O source: 2

    drop Dispatch

    deregistering handle with poller

    dropping I/O source: 5

    Compilation flags

    • [x] dbus_mpris
    • [x] dbus_keyring
    • [ ] alsa_backend
    • [ ] portaudio_backend
    • [x] pulseaudio_backend
    • [ ] rodio_backend

    Note that this occurs regardless of which features are enabled; the originally working build had only alsa_backend enabled, and that configuration no longer works either.
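    The primary panic (`attempted to zero-initialize type librespot_tremor::tremor_sys::ov_callbacks, which is invalid`) comes from `std::mem::zeroed` being called on a struct of non-nullable C function pointers; newer Rust compilers insert a runtime validity check that aborts with exactly this message, which is why a rebuild with a newer toolchain broke a previously working setup. A minimal sketch of the failure mode (the `Callbacks` struct below is a hypothetical stand-in for `ov_callbacks`, not librespot-tremor's real definition):

    ```rust
    use std::mem;
    use std::panic;

    // Hypothetical stand-in for librespot-tremor's `ov_callbacks`:
    // a C-style struct of non-nullable function pointers.
    #[allow(dead_code)]
    struct Callbacks {
        read_func: fn(*mut u8, usize, usize) -> usize,
        close_func: fn() -> i32,
    }

    fn main() {
        // Zero is not a valid bit pattern for a Rust `fn` pointer, so the
        // compiler's runtime check makes `mem::zeroed` panic here with
        // "attempted to zero-initialize type ..., which is invalid".
        let result = panic::catch_unwind(|| unsafe {
            let _cb: Callbacks = mem::zeroed();
        });
        assert!(result.is_err());
    }
    ```

    On older compilers the same `mem::zeroed` call compiled and ran silently (as undefined behavior), which matches the report that the binary worked before being rebuilt.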

    Versions (please complete the following information):

    • OS: Raspbian Buster (armhf) on kernel 5.4.75-v7l+
    • Spotifyd: v0.2.24, both crates.io release and 39106ed8ed270247b1203cc2eed5b05121c90cf7
    • cargo: cargo 1.48.0 (65cbdd2dc 2020-10-14)
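    The follow-up panic in the log (`called Result::unwrap() on an Err value: "SendError(..)"`) is a cascade of the first one: the player thread died, dropping its end of the command channel, so the next send from the Spirc side returns an error that then gets unwrapped. A minimal sketch with a standard `mpsc` channel (the names here are illustrative, not librespot's actual internals):

    ```rust
    use std::sync::mpsc;

    fn main() {
        let (tx, rx) = mpsc::channel::<&str>();
        // Simulate the player thread panicking: its receiving end is dropped.
        drop(rx);
        // Sending on a channel with no receiver fails...
        let result = tx.send("play");
        assert!(result.is_err());
        // ...and calling `.unwrap()` on that Err would panic with the
        // message seen in the log:
        // called `Result::unwrap()` on an `Err` value: SendError(..)
    }
    ```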
    blocked by: librespot · bug · reproducibility: easy 
    opened by ChildOfDreams 48
  • [ERROR] Caught panic with message: Authentication failed with reason: BadCredentials

    [ERROR] Caught panic with message: Authentication failed with reason: BadCredentials

    Spotifyd worked well for 3 days, until one day (2018-04-01) it stopped with the error "BadCredentials".

    Credentials were not changed, either in spotifyd's config or on spotify.com.

    I don't know how to look up the exact version, but my install came from the Arch Linux AUR package spotifyd-git 0.1.1.2.g7451cd6-1.

    log:

    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:523] build; num-workers=8
    00:05:54 [DEBUG] tokio_reactor::background: starting background reactor
    00:05:54 [INFO] Using software volume controller.
    00:05:54 [DEBUG] librespot_connect::discovery: Zeroconf server listening on 0.0.0.0:0
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(2)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(3)
    00:05:54 [DEBUG] tokio_reactor: loop process - 2 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: added a timeout: 0
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(4)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [TRACE] hyper::client::pool: [<unknown>:178] park; waiting for idle connection: "http://apresolve.spotify.com"
    00:05:54 [TRACE] hyper::client::connect: [<unknown>:118] Http::connect("http://apresolve.spotify.com/")
    00:05:54 [DEBUG] hyper::client::dns: resolving host="apresolve.spotify.com", port=80
    00:05:54 [DEBUG] tokio_core::reactor: consuming notification queue
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:275] sending packet to V4(224.0.0.251:5353)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(2)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:275] sending packet to V6([ff02::fb]:5353)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(2)
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(3)
    00:05:54 [DEBUG] tokio_reactor: loop process - 2 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: updating a timeout: 0
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 298198 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 633948051 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7023 }
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:76] received packet from V4(192.168.1.100:5353)
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:87] received packet from V4(192.168.1.100:5353) with no query
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:76] received packet from V6([fe80::59cc:f6e9:af23:975e]:5353)
    00:05:54 [TRACE] mdns::fsm: [/home/user/.cargo/git/checkouts/rust-mdns-881ed19b93df0e9d/733b2b6/src/fsm.rs:87] received packet from V6([fe80::59cc:f6e9:af23:975e]:5353) with no query
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 29906 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 633988276 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 5821 }
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 340 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 633997373 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 15680 }
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(3)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 87637707 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 721655288 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 42811 }
    00:05:54 [DEBUG] hyper::client::connect: connecting to 104.199.64.136:80
    00:05:54 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:54 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(5)
    00:05:54 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 40125343 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 761954086 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 13345 }
    00:05:54 [TRACE] hyper::proto::h1::dispatch: [<unknown>:274] Dispatcher::poll
    00:05:54 [TRACE] hyper::proto::h1::conn: [<unknown>:284] read_keep_alive; is_mid_message=false
    00:05:54 [TRACE] hyper::proto: [<unknown>:122] should_keep_alive(version=Http11, header=None) = true
    00:05:54 [TRACE] hyper::proto::h1::role: [<unknown>:327] ClientTransaction::encode has_body=false, method=None
    00:05:54 [TRACE] hyper::proto::h1::io: [<unknown>:542] reclaiming write buf Vec
    00:05:54 [DEBUG] hyper::proto::h1::io: flushed 47 bytes
    00:05:54 [TRACE] hyper::proto::h1::conn: [<unknown>:580] flushed State { reading: Init, writing: KeepAlive, keep_alive: Busy, error: None, read_task: None }
    00:05:54 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 73367 }
    00:05:54 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 762090221 }
    00:05:54 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7153 }
    00:05:55 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(5)
    00:05:55 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:55 [TRACE] hyper::proto::h1::dispatch: [<unknown>:274] Dispatcher::poll
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:172] Conn::read_head
    00:05:55 [DEBUG] hyper::proto::h1::io: read 686 bytes
    00:05:55 [TRACE] hyper::proto::h1::role: [<unknown>:231] Response.parse([Header; 100], [u8; 686])
    00:05:55 [TRACE] hyper::proto::h1::role: [<unknown>:236] Response.parse Complete(268)
    00:05:55 [TRACE] hyper::header: [<unknown>:355] maybe_literal not found, copying "Keep-Alive"
    00:05:55 [TRACE] hyper::header: [<unknown>:355] maybe_literal not found, copying "Vary"
    00:05:55 [DEBUG] hyper::proto::h1::io: parsed 9 headers (268 bytes)
    00:05:55 [DEBUG] hyper::proto::h1::conn: incoming body is content-length (418 bytes)
    00:05:55 [TRACE] hyper::proto: [<unknown>:133] expecting_continue(version=Http11, header=None) = false
    00:05:55 [TRACE] hyper::proto: [<unknown>:122] should_keep_alive(version=Http11, header=Some(Connection([KeepAlive]))) = true
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:246] Conn::read_body
    00:05:55 [TRACE] hyper::proto::h1::decode: [<unknown>:88] decode; state=Length(418)
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:580] flushed State { reading: Body(Length(0)), writing: KeepAlive, keep_alive: Busy, error: None, read_task: None }
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 41689020 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 803790242 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7605 }
    00:05:55 [TRACE] hyper::proto::h1::dispatch: [<unknown>:274] Dispatcher::poll
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:246] Conn::read_body
    00:05:55 [TRACE] hyper::proto::h1::decode: [<unknown>:88] decode; state=Length(0)
    00:05:55 [DEBUG] hyper::proto::h1::conn: incoming body completed
    00:05:55 [TRACE] hyper::client::pool: [<unknown>:332] pool dropped, dropping pooled ("http://apresolve.spotify.com")
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:829] State::close()
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:426] maybe_notify; no task to notify
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:284] read_keep_alive; is_mid_message=true
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:311] parking current task
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:423] maybe_notify; notifying task
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:580] flushed State { reading: Closed, writing: Closed, keep_alive: Disabled, error: None, read_task: Some(Task) }
    00:05:55 [TRACE] hyper::proto::h1::conn: [<unknown>:588] shut down IO
    00:05:55 [TRACE] hyper::proto::h1::dispatch: [<unknown>:74] Dispatch::poll done
    00:05:55 [TRACE] mio::poll: [<unknown>:905] deregistering handle with poller
    00:05:55 [DEBUG] tokio_reactor: dropping I/O source: 4
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 99727 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 803922570 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 7494 }
    00:05:55 [INFO] Connecting to AP "gew1-accesspoint-b-ksm3.ap.spotify.com:4070"
    00:05:55 [TRACE] mio::poll: [<unknown>:785] registering with poller
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 1442 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 856588695 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 10310 }
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 451 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 856603463 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 6833 }
    00:05:55 [TRACE] tokio_reactor: [<unknown>:330] event Writable Token(5)
    00:05:55 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 40060141 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 896673953 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 11552 }
    00:05:55 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(5)
    00:05:55 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:55 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 38608403 }
    00:05:55 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 274, tv_nsec: 941009139 }
    00:05:55 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 12343 }
    00:05:55 [TRACE] tokio_io::framed_write: [<unknown>:182] flushing framed transport
    00:05:55 [TRACE] tokio_io::framed_write: [<unknown>:185] writing; remaining=128
    00:05:55 [TRACE] tokio_io::framed_write: [<unknown>:202] framed transport flushed
    00:05:56 [TRACE] tokio_reactor: [<unknown>:330] event Readable | Writable Token(5)
    00:05:56 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:56 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 969123604 }
    00:05:56 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 275, tv_nsec: 915854447 }
    00:05:56 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 23234 }
    00:05:56 [TRACE] tokio_io::framed_read: [<unknown>:189] attempting to decode a frame
    00:05:56 [TRACE] tokio_io::framed_read: [<unknown>:192] frame decoded from buffer
    00:05:56 [TRACE] mio::poll: [<unknown>:905] deregistering handle with poller
    00:05:56 [DEBUG] tokio_reactor: dropping I/O source: 4
    00:05:56 [TRACE] tokio_reactor: [<unknown>:330] event Readable Token(0)
    00:05:56 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s
    00:05:56 [DEBUG] tokio_reactor::background: shutting background reactor down NOW
    00:05:56 [DEBUG] tokio_reactor::background: background reactor has shutdown
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:871] shutdown; state=State { lifecycle: 0, num_futures: 0 }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:917]   -> transitioned to shutdown
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:929]   -> shutting down workers
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=7; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=7
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=7
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=6; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=6
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=6
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=5; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=5
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=5
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=4; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=4
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=4
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=3; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=3
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=3
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=2; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=2
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=2
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=1; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=1
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=1
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:933]   -> shutdown worker; idx=0; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:948] signal_stop -- WORKER_SHUTDOWN; idx=0
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:983] worker_terminated; num_workers=0
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:986] notifying shutdown task
    00:05:56 [TRACE] tokio_threadpool: [/home/user/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-threadpool-0.1.0/src/lib.rs:851] Shutdown::poll
    00:05:56 [DEBUG] tokio_core::reactor::timeout_token: cancel timeout 0
    
    bug wontfix 
    opened by Deluxo 28
  • Spotifyd crashes without specifying a volume_controller

    Spotifyd crashes without specifying a volume_controller

    After building on macOS 10.4.6 with PortAudio as the backend, launching spotifyd results in the following panic:

    thread 'main' panicked at 'called `Option::unwrap()` on a `None` value', src/libcore/option.rs:347:21
    

    Running with RUST_BACKTRACE=1 shows the following stack backtrace:

       0: std::panicking::default_hook::{{closure}}
       1: std::panicking::default_hook
       2: std::panicking::rust_panic_with_hook
       3: std::panicking::continue_panic_fmt
       4: rust_begin_unwind
       5: core::panicking::panic_fmt
       6: core::panicking::panic
       7: spotifyd::config::get_internal_config
       8: spotifyd::main
       9: std::rt::lang_start::{{closure}}
      10: std::panicking::try::do_call
      11: __rust_maybe_catch_panic
      12: std::rt::lang_start_internal
      13: main
    

    I was able to build a working version from source in August, and since then I've only changed my Rust installation from the Homebrew version to rustup (I guess that's not the problem here?). Any suggestions?
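    For reference, a backtrace like the one above can be captured by enabling backtraces and running spotifyd in the foreground. A minimal sketch; the binary path is an assumption and should be adjusted to wherever your build lives:

    ```shell
    # Enable Rust panic backtraces for the current shell session.
    export RUST_BACKTRACE=1
    # Then launch in the foreground to see the panic output directly:
    # ./spotifyd/target/release/spotifyd --no-daemon
    ```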

    bug reproducibility: easy 
    opened by otahontas 28
  • called `Result::unwrap()` on an `Err` value: WireError(InvalidEnumValue(14))

    called `Result::unwrap()` on an `Err` value: WireError(InvalidEnumValue(14))

    hello

    Spotifyd started crashing. I updated to the latest git commit 2c13ff926931ff9d341b3918d2c244cf4fbf65d5, but it still crashes immediately. I have Spotify playing on my tablet when I start Spotifyd. I also have a message in the app on my tablet saying that my credit card has expired and that I need to update my payment options ASAP.

    May 17 22:28:55 kooka Spotifyd[24049]: event Readable | Writable Token(20971525)
    May 17 22:28:55 kooka Spotifyd[24049]: loop process - 1 events, 0.000s
    May 17 22:28:55 kooka Spotifyd[24049]: new Session[0]
    May 17 22:28:55 kooka Spotifyd[24049]: new Spirc[0]
    May 17 22:28:55 kooka Spotifyd[24049]: new MercuryManager
    May 17 22:28:55 kooka Spotifyd[24049]: new Player[0]
    May 17 22:28:55 kooka Spotifyd[24049]: input volume:65535 to mixer: 65535
    May 17 22:28:55 kooka Spotifyd[24049]: Using alsa sink
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: Session[0] strong=3 weak=2
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: Country: "FR"
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: flushing framed transport
    May 17 22:28:55 kooka Spotifyd[24049]: writing; remaining=368
    May 17 22:28:55 kooka Spotifyd[24049]: framed transport flushed
    May 17 22:28:55 kooka Spotifyd[24049]: loop poll - Duration { secs: 0, nanos: 198506 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop time - Instant { tv_sec: 255865, tv_nsec: 655725030 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop process, Duration { secs: 0, nanos: 10240 }
    May 17 22:28:55 kooka Spotifyd[24049]: event Readable | Writable Token(20971525)
    May 17 22:28:55 kooka Spotifyd[24049]: loop process - 1 events, 0.000s
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: flushing framed transport
    May 17 22:28:55 kooka Spotifyd[24049]: framed transport flushed
    May 17 22:28:55 kooka Spotifyd[24049]: loop poll - Duration { secs: 0, nanos: 5885066 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop time - Instant { tv_sec: 255865, tv_nsec: 661625876 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop process, Duration { secs: 0, nanos: 10093 }
    May 17 22:28:55 kooka Spotifyd[24049]: event Readable | Writable Token(20971525)
    May 17 22:28:55 kooka Spotifyd[24049]: loop process - 1 events, 0.000s
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: flushing framed transport
    May 17 22:28:55 kooka Spotifyd[24049]: event Writable Token(20971525)
    May 17 22:28:55 kooka Spotifyd[24049]: framed transport flushed
    May 17 22:28:55 kooka Spotifyd[24049]: loop process - 1 events, 0.000s
    May 17 22:28:55 kooka Spotifyd[24049]: loop poll - Duration { secs: 0, nanos: 33968326 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop time - Instant { tv_sec: 255865, tv_nsec: 695610976 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop process, Duration { secs: 0, nanos: 11760 }
    May 17 22:28:55 kooka Spotifyd[24049]: subscribed uri=hm://remote/3/user/xxxx/ count=0
    May 17 22:28:55 kooka Spotifyd[24049]: loop poll - Duration { secs: 0, nanos: 2045 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop time - Instant { tv_sec: 255865, tv_nsec: 695655399 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop process, Duration { secs: 0, nanos: 8495 }
    May 17 22:28:55 kooka Spotifyd[24049]: event Readable | Writable Token(20971525)
    May 17 22:28:55 kooka Spotifyd[24049]: loop process - 1 events, 0.000s
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: frame decoded from buffer
    May 17 22:28:55 kooka Spotifyd[24049]: attempting to decode a frame
    May 17 22:28:55 kooka Spotifyd[24049]: flushing framed transport
    May 17 22:28:55 kooka Spotifyd[24049]: framed transport flushed
    May 17 22:28:55 kooka Spotifyd[24049]: loop poll - Duration { secs: 0, nanos: 5974544 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop time - Instant { tv_sec: 255865, tv_nsec: 701642640 }
    May 17 22:28:55 kooka Spotifyd[24049]: loop process, Duration { secs: 0, nanos: 17056 }
    May 17 22:28:55 kooka Spotifyd[24049]: Caught panic with message: called `Result::unwrap()` on an `Err` value: WireError(InvalidEnumValue(14))
    May 17 22:28:55 kooka Spotifyd[24049]: drop Spirc[0]
    May 17 22:28:55 kooka Spotifyd[24049]: Shutting down player thread ...
    May 17 22:28:55 kooka Spotifyd[24049]: drop Player[0]
    May 17 22:28:55 kooka Spotifyd[24049]: drop Session[0]
    May 17 22:28:55 kooka Spotifyd[24049]: drop MercuryManager
    May 17 22:28:55 kooka Spotifyd[24049]: shutdown; state=State { lifecycle: 0, num_futures: 0 }
    May 17 22:28:55 kooka Spotifyd[24049]:   -> transitioned to shutdown
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutting down workers
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=7; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=7
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=7
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=6; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka systemd[1]: [email protected]: Main process exited, code=exited, status=101/n/a
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=6
    May 17 22:28:55 kooka systemd[1]: [email protected]: Failed with result 'exit-code'.
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=6
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=5; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=5
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=5
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=4; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=4
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=4
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=3; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=3
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=3
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=2; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=2
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=2
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=1; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=1
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=1
    May 17 22:28:55 kooka Spotifyd[24049]:   -> shutdown worker; idx=0; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true }
    May 17 22:28:55 kooka Spotifyd[24049]: signal_stop -- WORKER_SHUTDOWN; idx=0
    May 17 22:28:55 kooka Spotifyd[24049]: worker_terminated; num_workers=0
    May 17 22:28:55 kooka Spotifyd[24049]: notifying shutdown task
    May 17 22:28:55 kooka Spotifyd[24049]: Shutdown::poll
    May 17 22:28:55 kooka Spotifyd[24049]: event Readable Token(4194303)
    May 17 22:28:55 kooka Spotifyd[24049]: loop process - 1 events, 0.000s
    May 17 22:28:55 kooka Spotifyd[24049]: shutting background reactor down NOW
    May 17 22:28:55 kooka Spotifyd[24049]: background reactor has shutdown
    May 17 22:28:55 kooka Spotifyd[24049]: drop Dispatch
    
    opened by stuart12 27
  • Spotifyd.service not launching

    Spotifyd.service not launching

    Hi,

    I'm having trouble getting spotifyd.service to work, so that the application launches on boot.

    systemctl --user start spotifyd.service <- Nothing happens
    systemctl --user enable spotifyd.service <- same story

    The spotifyd.service file was copied as instructed:

    .config/systemd/user$ ls
    default.target.wants  spotifyd.service

    sudo nano spotifyd.service

    [Unit]
    Description=A spotify playing daemon
    Documentation=https://github.com/Spotifyd/spotifyd
    Wants=sound.target
    After=sound.target
    Wants=network-online.target
    After=network-online.target

    [Service]
    ExecStart=/usr/bin/spotifyd --no-daemon
    Restart=always
    RestartSec=12

    [Install]
    WantedBy=default.target

    I have MOST LIKELY done something wrong when I built it from source with this guide; I didn't find a deb package :( https://github.com/Spotifyd/spotifyd/wiki/Installing-on-Ubuntu-%28from-source%29 I need to manually launch it from ./spotifyd/target/release/spotifyd
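    When a user unit silently does nothing on start, the unit's status and journal usually explain why. A minimal diagnostic sketch, assuming the unit file sits in ~/.config/systemd/user/; the helper name spotifyd_diag is hypothetical:

    ```shell
    # Hypothetical helper: gather the usual diagnostics for a user unit
    # that does nothing on `systemctl --user start`.
    spotifyd_diag() {
        systemctl --user daemon-reload                          # pick up unit file edits
        systemctl --user status spotifyd.service                # current state + last log lines
        journalctl --user -u spotifyd.service --no-pager -n 50  # recent journal entries
    }
    ```

    Running spotifyd_diag after editing the unit would show the failure reason; a common cause is an ExecStart path that does not match where the binary was actually installed.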

    waiting for feedback 
    opened by Ru1ah 26
  • Fetching metadata through mpris sometimes returns an empty array.

    Fetching metadata through mpris sometimes returns an empty array.


    EDIT: If you encounter rate limit errors, please use your own Spotify client ID by setting an environment variable named SPOTIFYD_CLIENT_ID. You can create a client ID here.

    ~ Sven (SirWindfield)
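    As a sketch of how that variable could be set before launching the daemon (the your-client-id value below is a placeholder, not a real ID):

    ```shell
    # Set a custom Spotify client ID for the current shell session.
    export SPOTIFYD_CLIENT_ID="your-client-id"
    # spotifyd --no-daemon   # would now authenticate with the custom client ID
    ```

    For a systemd-managed instance, the equivalent would be an Environment=SPOTIFYD_CLIENT_ID=... line in the unit's [Service] section.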


    Hi, first of all thanks for the great work.

    My status bar refreshes every 5 seconds and I'm showing the currently playing music on it. That means every 5 seconds I'm fetching metadata from spotifyd with this command:

    dbus-send --print-reply --type=method_call \
        --dest=org.mpris.MediaPlayer2.spotifyd \
        /org/mpris/MediaPlayer2 \
        org.freedesktop.DBus.Properties.GetAll \
        string:org.mpris.MediaPlayer2.Player

    But sometimes that returns either an empty array or partial data, like just the title or just the artist. Example return value:

    ...
    dict entry(
        string "Metadata"
        variant array [ ]
    )
    ...

    When I run spotifyd with --no-daemon option, it prints this for every failed fetching attempt:

    Couldn't fetch metadata from spotify: Err(RateLimited(Some(8)))

    It seems the number inside Some() decreases every second, and when it reaches 0 I can fetch the data again. As I'm guessing from its name, it is some kind of rate limit. Why is this happening? Is it set intentionally? I don't get this kind of error with the official Spotify client. If there is a way to set this limit to 5 seconds so that it syncs with my bar, I would really appreciate it.

    I'm on kernel version 5.5.2 and using the latest pre-compiled binary.
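    Until the limit itself can be tuned, one workaround is to cache the metadata on the status-bar side so D-Bus is only queried every few seconds. A minimal sketch; the cache path, TTL, and the fetch_metadata placeholder are illustrative, and the placeholder stands in for the dbus-send call above:

    ```shell
    #!/bin/sh
    # Cache metadata so the bar refresh does not query spotifyd every time.
    CACHE_FILE="${TMPDIR:-/tmp}/spotifyd-metadata.cache"
    CACHE_TTL=10   # seconds between real fetches

    fetch_metadata() {
        # Placeholder: substitute the dbus-send GetAll call here.
        echo "artist - title"
    }

    now=$(date +%s)
    if [ -f "$CACHE_FILE" ]; then
        # File mtime via GNU stat, falling back to BSD stat.
        mtime=$(stat -c %Y "$CACHE_FILE" 2>/dev/null || stat -f %m "$CACHE_FILE")
        age=$(( now - mtime ))
    else
        age=$(( CACHE_TTL + 1 ))
    fi

    # Only hit D-Bus when the cached copy is stale.
    if [ "$age" -gt "$CACHE_TTL" ]; then
        fetch_metadata > "$CACHE_FILE"
    fi
    cat "$CACHE_FILE"
    ```

    The bar can then call this script on every refresh; spotifyd only sees one real request per TTL window.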

    advice bug 
    opened by TriaSirax 26
  • Spotifyd prints error, disconnects, but does not exit

    Spotifyd prints error, disconnects, but does not exit

    My spotifyd repeatedly prints an error message after a few minutes: "[ERROR] Os { code: 104, kind: ConnectionReset, message: "Connection reset by peer" }"

    After this message, either (a) spotifyd exits, or (b) playback stops and the daemon becomes invisible to Spotify apps. In the latter case, spotifyd can no longer be killed with Ctrl+C, so it does not react to signals at all. #127 could be related.

    Could it help to just handle this connection reset error more gracefully?

    blocked by: librespot pinned wontfix 
    opened by nalt 24
  • Panic on ChannelError

    Panic on ChannelError

    spotifyd is sporadically crashing during playback.

    I am getting the following error messages:

    17:40:24 [INFO] Loading track "21st Century Liability" with Spotify URI "spotify:track:5WAsLiCOLiz2iwJLcuJEgb" 17:40:24 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:24 [TRACE] tokio_io::framed_write: [:185] writing; remaining=49 17:40:24 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 559164 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 407262389 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 161979 } 17:40:24 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(20971525) 17:40:24 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:40:24 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:24 [TRACE] tokio_io::framed_read: [:192] frame decoded from buffer 17:40:24 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:24 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:24 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 42202857 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 449713370 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 163957 } 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 28020 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 450111128 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 163697 } 17:40:24 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(20971525) 17:40:24 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:40:24 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:24 [TRACE] tokio_io::framed_read: [:192] frame decoded from buffer 17:40:24 [TRACE] tokio_io::framed_read: 
[:189] attempting to decode a frame 17:40:24 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:24 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 88427672 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 538781872 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 184010 } 17:40:24 [DEBUG] librespot_audio::fetch: Downloading file 9d855ab455403b8700f3af18299930983f628192 17:40:24 [TRACE] librespot_audio::fetch: [/home/mitch/.cargo/git/checkouts/librespot-06fda9f186b35c32/817dff0/audio/src/fetch.rs:173] requesting chunk 0 17:40:24 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:24 [TRACE] tokio_io::framed_write: [:185] writing; remaining=53 17:40:24 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 774736 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 539850252 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 275467 } 17:40:24 [DEBUG] tokio_core::reactor: consuming notification queue 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 141770 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 540443739 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 158801 } 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 14635 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 540696237 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 156979 } 17:40:24 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(20971525) 17:40:24 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:40:24 [TRACE] tokio_io::framed_read: [:189] 
attempting to decode a frame 17:40:24 [TRACE] tokio_io::framed_read: [:192] frame decoded from buffer 17:40:24 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:24 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:24 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:24 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 158025181 } [ERROR] channel error: 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 699198551 } 2 1 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 202707 } 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 17760 } 17:40:24 [ERROR] Caught panic with message: called Result::unwrap() on an Err value: ChannelError 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 699512091 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 166666 } 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 16406 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 699804902 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 240520 } 17:40:24 [DEBUG] librespot_playback::player: drop Player[0] 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 7709 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 700236775 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 216040 } 17:40:24 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 6146 } 17:40:24 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93621, tv_nsec: 700572242 } 17:40:24 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 159999 } 17:40:50 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(20971525) 17:40:50 [DEBUG] tokio_reactor: loop process - 1 
events, 0.000s 17:40:50 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:50 [TRACE] tokio_io::framed_read: [:192] frame decoded from buffer 17:40:50 [DEBUG] librespot_core::session: Session[0] strong=2 weak=4 17:40:50 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:50 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:50 [TRACE] tokio_io::framed_write: [:185] writing; remaining=11 17:40:50 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:50 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 25, nanos: 872005877 } 17:40:50 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93647, tv_nsec: 572819055 } 17:40:50 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 170468 } 17:40:50 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:50 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:40:50 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 156301 } 17:40:50 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93647, tv_nsec: 573237021 } 17:40:50 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 146562 } 17:40:50 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 14844 } 17:40:50 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93647, tv_nsec: 573476968 } 17:40:50 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 151770 } 17:40:50 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(20971525) 17:40:50 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:40:50 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:50 [TRACE] tokio_io::framed_read: [:192] frame decoded from buffer 17:40:50 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:40:50 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:40:50 [TRACE] tokio_io::framed_write: [:202] framed transport 
flushed 17:40:50 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 43941749 } 17:40:50 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93647, tv_nsec: 617648612 } 17:40:50 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 217291 }

    Process aborted by user using CTRL+C at this point

    17:42:37 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(12582915) 17:42:37 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:42:37 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 106, nanos: 512648505 } 17:42:37 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93754, tv_nsec: 130634147 } 17:42:37 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 222446 } 17:42:37 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:42:37 [TRACE] tokio_io::framed_write: [:185] writing; remaining=1442 17:42:37 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:42:37 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 471456 } 17:42:37 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93754, tv_nsec: 131778099 } 17:42:37 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 159790 } 17:42:37 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 15209 } 17:42:37 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93754, tv_nsec: 132061639 } 17:42:37 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 156718 } 17:42:37 [TRACE] tokio_reactor: [:345] event Readable | Writable Token(20971525) 17:42:37 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:42:37 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:42:37 [TRACE] tokio_io::framed_read: [:192] frame decoded from buffer 17:42:37 [TRACE] tokio_io::framed_read: [:189] attempting to decode a frame 17:42:37 [TRACE] tokio_io::framed_write: [:182] flushing framed transport 17:42:37 [TRACE] tokio_io::framed_write: [:202] framed transport flushed 17:42:37 [DEBUG] tokio_core::reactor: loop poll - Duration { secs: 0, nanos: 50111700 } 17:42:37 [DEBUG] tokio_core::reactor: loop time - Instant { tv_sec: 93754, tv_nsec: 182409171 } 17:42:37 [DEBUG] tokio_core::reactor: loop process, Duration { secs: 0, nanos: 162551 } 17:42:37 [DEBUG] 
librespot_connect::spirc: drop Spirc[0] 17:42:37 [DEBUG] librespot_playback::player: Shutting down player thread ... 17:42:37 [ERROR] Player thread panicked! 17:42:37 [DEBUG] librespot_core::session: drop Session[0] 17:42:37 [DEBUG] librespot::component: drop AudioKeyManager 17:42:37 [DEBUG] librespot::component: drop ChannelManager 17:42:37 [DEBUG] librespot::component: drop MercuryManager 17:42:37 [TRACE] tokio_threadpool::inner: [:64] shutdown; state=State { lifecycle: 0, num_futures: 0 } 17:42:37 [TRACE] tokio_threadpool::inner: [:110] -> transitioned to shutdown 17:42:37 [TRACE] tokio_threadpool::inner: [:122] -> shutting down workers 17:42:37 [TRACE] tokio_threadpool::inner: [:126] -> shutdown worker; idx=3; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true } 17:42:37 [TRACE] tokio_threadpool::inner: [:141] signal_stop -- WORKER_SHUTDOWN; idx=3 17:42:37 [TRACE] tokio_threadpool::inner: [:176] worker_terminated; num_workers=3 17:42:37 [TRACE] tokio_threadpool::inner: [:126] -> shutdown worker; idx=2; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true } 17:42:37 [TRACE] tokio_threadpool::inner: [:141] signal_stop -- WORKER_SHUTDOWN; idx=2 17:42:37 [TRACE] tokio_threadpool::inner: [:176] worker_terminated; num_workers=2 17:42:37 [TRACE] tokio_threadpool::inner: [:126] -> shutdown worker; idx=1; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true } 17:42:37 [TRACE] tokio_threadpool::inner: [:141] signal_stop -- WORKER_SHUTDOWN; idx=1 17:42:37 [TRACE] tokio_threadpool::inner: [:176] worker_terminated; num_workers=1 17:42:37 [TRACE] tokio_threadpool::inner: [:126] -> shutdown worker; idx=0; state=WorkerState { lifecycle: "WORKER_SHUTDOWN", is_pushed: true } 17:42:37 [TRACE] tokio_threadpool::inner: [:141] signal_stop -- WORKER_SHUTDOWN; idx=0 17:42:37 [TRACE] tokio_threadpool::inner: [:176] worker_terminated; num_workers=0 17:42:37 [TRACE] tokio_threadpool::inner: [:179] notifying shutdown task 17:42:37 [TRACE] 
tokio_threadpool::shutdown: [:38] Shutdown::poll 17:42:37 [TRACE] tokio_reactor: [:345] event Readable Token(4194303) 17:42:37 [DEBUG] tokio_reactor: loop process - 1 events, 0.000s 17:42:37 [DEBUG] tokio_reactor::background: shutting background reactor down NOW 17:42:37 [DEBUG] tokio_reactor::background: background reactor has shutdown 17:42:37 [DEBUG] librespot_core::session: drop Dispatch

    This error occurs after some songs have been played from the playlist. After restarting the daemon and playing the exact same song, playback resumes normally.

    bug wontfix 
    opened by deify 23
  • DBus MPRIS control?

    DBus MPRIS control?

    First of all, I have to say I've been really enjoying spotifyd. Super lightweight, works just as promised! Thanks!

    Onto the topic of this issue: I'm wondering if it's possible to implement the MPRIS DBus protocol. It's a standard protocol (at least on modern Unix-like desktops) to control media players.

    Lots of tools use this to control media players, e.g. GNOME and KDE will send a pause command when the media key is pressed on the desktop, and playerctl can be used for similar commands too.

    The end-goal here is to be able to play/pause using media keys.
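    Once spotifyd exposes an MPRIS interface, wiring media keys up amounts to a small key-binding script. The sketch below is hypothetical glue code, assuming `playerctl` is installed and that spotifyd (built with MPRIS support) registers itself as the `spotifyd` player:

    ```shell
    # Map an X11 media-key name to the playerctl subcommand it should trigger.
    # The key names are standard XF86 keysyms; the mapping itself is illustrative.
    mpris_method() {
      case "$1" in
        XF86AudioPlay) echo "play-pause" ;;
        XF86AudioNext) echo "next" ;;
        XF86AudioPrev) echo "previous" ;;
        *) return 1 ;;
      esac
    }

    # With a running daemon, a desktop key binding would then invoke e.g.:
    #   playerctl --player=spotifyd "$(mpris_method XF86AudioPlay)"
    ```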

    This would be really useful. Geez, I wish I had time to sit down and learn Rust, since it looks so well done!

    enhancement 
    opened by WhyNotHugo 22
  • No output when playing a song

    No output when playing a song

    Description

    Playing a song results in no output. pavucontrol indicates that spotifyd is not playing any streams.

    I didn't always get an error, but my last attempt did show one, so including that here.

    To Reproduce

    1. Run spotifyd --no-daemon.
    2. Pick the device as an output from a phone client.
    3. Play a song.

    Expected behavior

    Should play a song.

    Logs

    Jun 14 10:34:55 victory spotifyd[128684]: No config file specified. Running with default values
    Jun 14 10:34:55 victory spotifyd[128684]: No username specified. Checking username_cmd
    Jun 14 10:34:55 victory spotifyd[128684]: No username_cmd specified
    Jun 14 10:34:55 victory spotifyd[128684]: No password specified. Checking password_cmd
    Jun 14 10:34:55 victory spotifyd[128684]: No password_cmd specified
    Jun 14 10:34:55 victory spotifyd[128684]: No proxy specified
    Jun 14 10:34:55 victory spotifyd[128684]: Using software volume controller.
    Jun 14 10:35:03 victory spotifyd[128684]: Connecting to AP "gew1-accesspoint-d-w4k9.ap.spotify.com:443"
    Jun 14 10:35:03 victory spotifyd[128684]: Authenticated as "138rlsckfdsj6jdnpgbegyd70" !
    Jun 14 10:35:03 victory spotifyd[128684]: Using alsa sink
    Jun 14 10:35:03 victory spotifyd[128684]: Country: "AR"
    Jun 14 10:35:03 victory spotifyd[128684]: Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))
    Jun 14 10:35:03 victory spotifyd[128684]: Unhandled DBus message: (Signal, Some("/org/freedesktop/DBus"), Some("org.freedesktop.DBus"), Some("NameAcquired"))
    Jun 14 10:35:04 victory spotifyd[128684]: Loading <Heartbreak (Make Me A Dancer) [feat. Sophie Ellis Bextor]> with Spotify URI <spotify:track:3BZfj4L5f1x9MlVSP4kO7p>
    Jun 14 10:35:04 victory spotifyd[128684]: <Heartbreak (Make Me A Dancer) [feat. Sophie Ellis Bextor]> (209333 ms) loaded
    Jun 14 10:35:04 victory spotifyd[128684]: ALSA lib pcm_dmix.c:1035:(snd_pcm_dmix_open) unable to open slave
    Jun 14 10:35:04 victory spotifyd[128684]: Alsa error PCM open ALSA function 'snd_pcm_open' failed with error 'ENOENT: No such file or directory'
    Jun 14 10:35:04 victory spotifyd[128684]: Could not start audio: Alsa error: PCM open failed
    Jun 14 10:35:04 victory spotifyd[128684]: ALSA lib pcm_dmix.c:1035:(snd_pcm_dmix_open) unable to open slave
    Jun 14 10:35:04 victory spotifyd[128684]: Alsa error PCM open ALSA function 'snd_pcm_open' failed with error 'ENOENT: No such file or directory'
    Jun 14 10:35:04 victory spotifyd[128684]: Could not start audio: Alsa error: PCM open failed
    Jun 14 10:35:04 victory spotifyd[128684]: Caught panic with message: called `Option::unwrap()` on a `None` value
    

    Compilation flags

    Using the archlinux [community] package.

    Versions (please complete the following information):

    • OS: ArchLinux
    • Spotifyd: spotifyd 0.3.2
    • cargo: cargo 1.49.0 (d00d64df9 2020-12-05)

    Additional context

    I'm using pipewire as a drop-in pulseaudio replacement.
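
    For setups like this, where PipeWire serves the PulseAudio interface, routing spotifyd through the PulseAudio backend instead of opening an ALSA PCM directly often avoids the dmix/ENOENT failure above. A hedged config sketch: `backend` and `device` are documented spotifyd options, but the values here are illustrative and the PulseAudio backend requires a build with the `pulseaudio_backend` feature:

    ```toml
    # ~/.config/spotifyd/spotifyd.conf (illustrative)
    [global]
    # Route audio through the PulseAudio interface (served by pipewire-pulse)
    # instead of opening an ALSA device directly.
    backend = "pulseaudio"

    # If sticking with ALSA, make sure the named device actually exists, e.g.:
    # backend = "alsa"
    # device = "default"
    ```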

    bug 
    opened by WhyNotHugo 0
  • build(deps): bump hex from 0.4.2 to 0.4.3

    build(deps): bump hex from 0.4.2 to 0.4.3

    Bumps hex from 0.4.2 to 0.4.3.

    Commits

    Dependabot compatibility score

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


    Dependabot commands and options

    You can trigger Dependabot actions by commenting on this PR:

    • @dependabot rebase will rebase this PR
    • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
    • @dependabot merge will merge this PR after your CI passes on it
    • @dependabot squash and merge will squash and merge this PR after your CI passes on it
    • @dependabot cancel merge will cancel a previously requested merge and block automerging
    • @dependabot reopen will reopen this PR if it is closed
    • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
    • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
    • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
    • @dependabot use these labels will set the current labels as the default for future PRs for this repo and language
    • @dependabot use these reviewers will set the current reviewers as the default for future PRs for this repo and language
    • @dependabot use these assignees will set the current assignees as the default for future PRs for this repo and language
    • @dependabot use this milestone will set the current milestone as the default for future PRs for this repo and language
    • @dependabot badge me will comment on this PR with code to add a "Dependabot enabled" badge to your readme

    Additionally, you can set the following in your Dependabot dashboard:

    • Update frequency (including time of day and day of week)
    • Pull request limits (per update run and/or open at any time)
    • Out-of-range updates (receive only lockfile updates, if desired)
    • Security updates (receive only security updates, if desired)
    dependencies 
    opened by dependabot-preview[bot] 0
  • build(deps): bump sha-1 from 0.9.1 to 0.9.6

    build(deps): bump sha-1 from 0.9.1 to 0.9.6

    Bumps sha-1 from 0.9.1 to 0.9.6.

    Commits

    dependencies 
    opened by dependabot-preview[bot] 0
  • build(deps): bump libc from 0.2.82 to 0.2.97

    build(deps): bump libc from 0.2.82 to 0.2.97

    Bumps libc from 0.2.82 to 0.2.97.

    Release notes

    Sourced from libc's releases.

    0.2.97

    Bump patch version to 0.2.97.

    0.2.96

    Bump patch version to 0.2.96.

    0.2.95

    Bump patch version to 0.2.95.

    0.2.94

    Bump patch version to 0.2.94.

    0.2.93

    Bump patch version to 0.2.93.

    0.2.92

    Bump patch version to 0.2.92.

    0.2.91

    Bump patch version to 0.2.91.

    0.2.90

    Bump patch version to 0.2.90.

    0.2.88

    Bump patch version to 0.2.88.

    0.2.87

    Bump patch version to 0.2.87.

    0.2.86

    Bump patch version to 0.2.86.

    Commits
    • 1c66799 Auto merge of #2230 - jonas-schievink:bump, r=JohnTitor
    • 83842ba bump libc dependency
    • 5f423a1 Auto merge of #2231 - devnexen:darwin_malloc_stats, r=Amanieu
    • 669bbfb apple add few malloc debug features specifics
    • 6743435 Bump version to 0.2.97
    • dac89a3 Auto merge of #2228 - jonas-schievink:mallinfo2, r=JohnTitor
    • 7f6ce32 Try to appease CI
    • fbcf62b Add mallinfo2 support
    • ce5cee1 Auto merge of #2224 - kolapapa:master, r=JohnTitor
    • 3048ef8 Auto merge of #2226 - JohnTitor:semverver, r=JohnTitor
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot-preview[bot] 0
  • build(deps): bump env_logger from 0.7.1 to 0.8.4

    build(deps): bump env_logger from 0.7.1 to 0.8.4

    Bumps env_logger from 0.7.1 to 0.8.4.

    Release notes

    Sourced from env_logger's releases.

    0.8.4

    Improvements:

    • Allow writing logs to a custom output target (via Target::Pipe)

    Bug fixes:

    • Actually allow overriding filter levels using env_logger::Builders methods, as documented

    0.8.3

    New features:

    • Suffix customization for the default formatter (Builder::format_suffix) #192

    Improvements:

    • Improve documentation about log level names #189

    Bug fixes:

    • Ignore whitespace-only filter specifications #188
    • Remove unneded files from crates.io tarball (including rust-toolchain whose presence caused issues for a few people)

    0.8.2

    Fixed a panic on io errors when writing to stdout / stderr (#184).

    0.8.1

    Update links in the documentation that were pointing to the old repository location.

    0.8.0

    Breaking changes:

    • Update public dependency humantime to 2.0

    Improvements:

    • Update default colors for debug (white => blue) and trace (black => cyan)

    Deprecations:

    • env_logger::from_env has been deprecated in favor of env_logger::Builder::from_env

    This release raises the minimum supported Rust version to 1.41.0.

    Commits
    • 13cafce Bump version to 0.8.4
    • 0900811 Ensure unique directive names when building filters
    • 1a8379a Allow writing logs to a custom output target (Target::Pipe)
    • 2151771 Upgrade to GitHub-native Dependabot
    • 16d982e Fix lints
    • 67adcba Re-add tests to the crates.io tarball
    • c9f033d Release version 0.8.3
    • eed1651 Only include necessary files in crates.io tarballs
    • 664ca1a Don't deny(warnings)
    • 8ef1615 Merge pull request #192 from jthacker/feat/custom-suffix
    • Additional commits viewable in compare view

    dependencies 
    opened by dependabot-preview[bot] 0
  • Update dbus_mpris.rs

    Update dbus_mpris.rs

    Add 'VolumeUp' and 'VolumeDown' methods to the DBus/MPRIS interface. Increment = 6%.

    opened by NNEU-1 0
  • Set Volume

    Set Volume

    Hi,

    I know that one can get the current Spotify volume via MPRIS or an environment variable with the on_song_change_hook, but is there any way to write the volume to Spotify?

    MPRIS doesn't work because it is a read-only property.

    This would be massively useful if the volume is changed outside of Spotify, e.g. by a control knob.

    If there is already a solution, please let me know. Otherwise, consider this as a feature request.

    enhancement 
    opened by NNEU-1 1
  • Doesn't work (thread 'main' panicked)

    Doesn't work (thread 'main' panicked)

    Description: Doesn't work. When I launch spotifyd, I get:

    thread 'main' panicked at 'Couldn't initialize logger: Error(Initialization, State { next_error: Some(Error(Io(Os { code: 2, kind: NotFound, message: "No such file or directory" }), State { next_error: None, backtrace: None })), backtrace: None })', src/main.rs:39:51
    note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
    

    To Reproduce

    Compilation flags

    • [ ] dbus_mpris
    • [ ] dbus_keyring
    • [x] alsa_backend
    • [ ] portaudio_backend
    • [ ] pulseaudio_backend
    • [ ] rodio_backend

    Versions (please complete the following information):

    • OS: Void Linux
    • Spotifyd: spotifyd 0.3.2
    • cargo: 1.52.0
    bug 
    opened by hrqmonteiro 0
  • build(deps): bump structopt from 0.3.17 to 0.3.21

    build(deps): bump structopt from 0.3.17 to 0.3.21

    Bumps structopt from 0.3.17 to 0.3.21.

    Changelog

    Sourced from structopt's changelog.

    v0.3.21 (2020-11-30)

    v0.3.20 (2020-10-12)

    • Fixed a breakage when the struct is placed inside a macro_rules! macro.

    v0.3.19 (2020-10-08)

    • Added StructOpt::from_args_safe as a shortcut for StructOpt::from_iter_safe(std::env::args_os()).
    • Some links in documentation have been corrected.

    v0.3.18 (2020-09-23)

    • Unsafe code has been forbidden. This makes cargo geiger list structopt as "safe". Maybe it will help somebody trying to locate a bug in their dependency tree.
    Commits

    dependencies 
    opened by dependabot-preview[bot] 0
  • Track ID not changing with on_song_change_hook

    Track ID not changing with on_song_change_hook

    Description: The Track ID doesn't seem to change when running on_song_change_hook.

    To Reproduce: Simply change the song and look at the logs.

    Expected behavior: I would expect the track ID to match the song that triggered the event.

    Logs

    May 29 17:12:04 hildegard spotifyd[30784]: Running "/home/<omit>/.config/spotifyd/command -id <omit> -secret <omit>" using "/bin/bash" with environment variables {"DURATION_MS": "352000", "PLAYER_EVENT": "play", "POSITION_MS": "0", "TRACK_ID": "3gOJUhQe0AHfLR9Wby6c4J", "PLAY_REQUEST_ID": "23"}
    May 29 17:12:05 hildegard spotifyd[30784]: Running "/home/<omit>/.config/spotifyd/command -id <omit> -secret <omit>" using "/bin/bash" with environment variables {"TRACK_ID": "7euBZraCkFV5bbct6mn2KP", "PLAYER_EVENT": "change", "OLD_TRACK_ID": "3gOJUhQe0AHfLR9Wby6c4J"}
    May 29 17:12:05 hildegard spotifyd[30784]: Loading <Beyond The Lost Sky> with Spotify URI <spotify:track:4Rvcs1jHrpTzBzpItH3wuA>                
    May 29 17:12:05 hildegard spotifyd[30784]: <Beyond The Lost Sky> (379946 ms) loaded                                                             
    May 29 17:12:44 hildegard spotifyd[30784]: Running "/home/<omit>/.config/spotifyd/command -id <omit> -secret <omit>" using "/bin/bash" with environment variables {"PLAY_REQUEST_ID": "24", "PLAYER_EVENT": "load", "POSITION_MS": "0", "TRACK_ID": "7euBZraCkFV5bbct6mn2KP"}
    May 29 17:12:45 hildegard spotifyd[30784]: Running "/home/<omit>/.config/spotifyd/command -id <omit> -secret <omit>" using "/bin/bash" with environment variables {"PLAYER_EVENT": "play", "PLAY_REQUEST_ID": "24", "TRACK_ID": "7euBZraCkFV5bbct6mn2KP", "POSITION_MS": "0", "DURATION_MS": "319866"}
    May 29 17:12:45 hildegard spotifyd[30784]: Loading <Unfinished Sketch Of An Unforgettable Lady> with Spotify URI <spotify:track:6mWzytEIxRov2cuZCjjRyl>
    May 29 17:12:45 hildegard spotifyd[30784]: <Unfinished Sketch Of An Unforgettable Lady> (193960 ms) loaded     
    

    Versions (please complete the following information):

    • OS: Debian 10
    • Spotifyd: 0.3.2
    • cargo: 1.42.1
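
    The environment variables visible in these logs (PLAYER_EVENT, TRACK_ID, OLD_TRACK_ID, POSITION_MS, ...) are what an on_song_change_hook script receives. A minimal sketch of such a hook; the variable names are taken from the log output above, while the dispatch logic itself is hypothetical:

    ```shell
    # Minimal on_song_change_hook sketch: log which event fired and for which track.
    handle_event() {
      case "$PLAYER_EVENT" in
        change) echo "track changed: ${OLD_TRACK_ID:-?} -> ${TRACK_ID:-?}" ;;
        play)   echo "playing ${TRACK_ID:-?} at ${POSITION_MS:-0} ms" ;;
        *)      echo "event: ${PLAYER_EVENT:-none}" ;;
      esac
    }
    ```

    Saved as an executable script and referenced via `on_song_change_hook` in the config, spotifyd would run it once per player event with the variables already set.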
    bug 
    opened by chopnico 1
Releases(v0.3.2)