Chained to the Rhythm: Echo Nest Knows What You're Gonna Play Next (Because it Decided for You)
Echo Nest was founded in 2005 by MIT alumni Brian Whitman and Tristan Jehan. It is a music intelligence and data platform that develops algorithms to predict listening patterns and to identify listeners by those patterns.

The engineers at Echo Nest teach computers human concepts like genre, and they employ collaborative filtering to collate data from all corners of the internet (social media, niche music blogs, music websites, etc.) so that the platform keeps up to date with new ways of describing and categorising music.
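
To make that concrete, here is a toy sketch (mine, not Echo Nest's) of how descriptive terms might be mined from text written about an artist. The descriptor vocabulary, the snippets and the resulting profile are all invented for illustration.

```python
# A minimal sketch of the idea, not Echo Nest's actual pipeline: scan text
# gathered from blogs and social media for descriptive terms attached to an
# artist, and keep a running count of which descriptors appear most often.
from collections import Counter

DESCRIPTORS = {"shoegaze", "dream pop", "lo-fi", "ambient", "noise", "ethereal"}

corpus = [
    "Their new record drifts between shoegaze and dream pop textures.",
    "A lo-fi, ethereal wall of noise that recalls early shoegaze.",
    "Ambient passages give way to dream pop hooks.",
]

def describe_artist(documents, vocabulary):
    """Count how often each known descriptor appears across the documents."""
    counts = Counter()
    for doc in documents:
        text = doc.lower()
        for term in vocabulary:
            if term in text:
                counts[term] += 1
    return counts.most_common()

print(describe_artist(corpus, DESCRIPTORS))
# e.g. [('shoegaze', 2), ('dream pop', 2), ...] -- a crude "cultural" profile
```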

In March 2014, Echo Nest was acquired by Spotify for $100,000,000.


According to Spotify, the service contains over 80 million tracks.

How many songs do you think you have listened to? How many do you return to frequently? 

That's not me trying to goad you into listening to all 80 million tracks. But in the balance between enabling listeners to discover new music outside their comfort zone and keeping them satiated with music they are likely to enjoy, which side benefits Spotify financially?

Are you going to spend money for a subscription to a service where you have to actively go looking for music that you will enjoy? Or are you going to spend money on a subscription to a service that just seems to *get* you? 

It is the accuracy of Spotify's algorithmic delivery that has placed it ahead of competitors like Tidal and Apple Music. By accuracy I mean its ability to know what kind of music its users want to hear next based on what they are currently listening to.
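
One simple version of that idea is item-to-item co-listening: suggest tracks that tend to turn up in the same listening histories as the track you are playing now. The sketch below runs on invented data; Spotify's real models are not public and are certainly far more elaborate.

```python
# A minimal sketch of "people who played this also played that", one common
# way to recommend music based on what someone is listening to right now.
from collections import Counter

histories = {
    "user_a": ["Track 1", "Track 2", "Track 3"],
    "user_b": ["Track 1", "Track 3", "Track 4"],
    "user_c": ["Track 2", "Track 3", "Track 5"],
}

def co_listened(current_track, histories):
    """Rank other tracks by how often they share a history with current_track."""
    counts = Counter()
    for tracks in histories.values():
        if current_track in tracks:
            for track in tracks:
                if track != current_track:
                    counts[track] += 1
    return counts.most_common()

print(co_listened("Track 3", histories))
# [('Track 1', 2), ('Track 2', 2), ('Track 4', 1), ('Track 5', 1)]
```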

As the technology advances, so will the accuracy of the algorithms, and with it the listener experience of Spotify (if ease is what you, the listener, are looking for). Echo Nest director Paul Lamere has previously spoken about enhancing the algorithms to recognise the contexts in which someone is listening to music: their mood (happy, sad, hopeful, anxious), their location (in the car, on a beach, in a library) and even their political compass, in order to guess, and effectively dictate, what they will listen to next.
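
To give a flavour of what context-aware delivery could mean in practice, here is a toy re-ranker that boosts candidate tracks whose tags match the listener's current mood and location. The tracks, tags and scoring are my own invention for illustration, not anything Lamere has described in detail.

```python
# A minimal sketch of context-aware re-ranking: score each candidate track by
# how many of its tags match the listener's current context (mood, location).
candidates = [
    {"title": "Storm Coming", "tags": {"anxious", "night"}},
    {"title": "Open Road",    "tags": {"happy", "driving"}},
    {"title": "Quiet Study",  "tags": {"calm", "library"}},
]

def rank_for_context(tracks, context):
    """Order candidates by the number of tags shared with the context."""
    def score(track):
        return len(track["tags"] & context)
    return sorted(tracks, key=score, reverse=True)

# The listener is driving and in a good mood:
for track in rank_for_context(candidates, {"happy", "driving"}):
    print(track["title"])
# "Open Road" first, then the rest
```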

The (terrifying) success of such accurate delivery can trap listeners in a sort of feedback loop, shrinking the potential for their taste or interests to expand. Of course, when you are a paid subscriber to Spotify you have the option to skip a song you don't like. But this in itself can be seen as a glitch: a moment where the technology did not understand you, a moment of miscommunication between you and the interface. Are you going to pay for a glitch? Can Spotify risk that?
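
Here is one way that loop could tighten, sketched with invented data: if every skip down-weights the tags attached to the skipped track, anything unfamiliar gets pushed further and further down the queue.

```python
# A minimal sketch of a skip treated as negative feedback. Each skip lowers
# the weight of the skipped track's tags, so recommendations drift toward
# what the listener already accepts. All data here is invented.
catalogue = {
    "Familiar Song": {"indie"},
    "Comfort Tune":  {"indie"},
    "Wild Card":     {"free jazz"},
    "Left Field":    {"noise"},
}

tag_weights = {"indie": 1.0, "free jazz": 1.0, "noise": 1.0}

def register_skip(track, catalogue, weights, penalty=0.5):
    """Reduce the weight of every tag on a skipped track."""
    for tag in catalogue[track]:
        weights[tag] = max(0.0, weights[tag] - penalty)

def recommend(catalogue, weights):
    """Rank tracks by the summed weight of their tags."""
    return sorted(catalogue,
                  key=lambda t: sum(weights[tag] for tag in catalogue[t]),
                  reverse=True)

register_skip("Wild Card", catalogue, tag_weights)   # one skip...
register_skip("Left Field", catalogue, tag_weights)  # ...and another
print(recommend(catalogue, tag_weights))
# ['Familiar Song', 'Comfort Tune', 'Wild Card', 'Left Field'] -- the loop tightens
```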

I think it's important to keep asking ourselves, as our worlds both virtual and real become more shaped or 'streamlined' by algorithms, just who is developing these algorithms and to what ends. No one is unbiased, so why would the algorithms they design be? What biases are encoded? What will the algorithm pivot you towards, and away from? As the music we listen to becomes increasingly shaped by algorithms, we need to pause and take time to look at who is being silenced and who is being platformed.

In a 2014 blog post, Lamere presented listening data broken down by gender. At the time, stating your gender during sign-up was mandatory in order to use the platform. In its Privacy Policy, Spotify states that gender is one of the points of data it collects from its users.

Furthermore, Spotify has access to its users' IP addresses. From this information it can build a profile of the user's nationality and whereabouts, and by extension maybe even their social class.
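
The mechanics here are not mysterious. A crude sketch of IP-based lookup might look like the following; the address ranges and countries are placeholder documentation values, not real user data, and commercial geolocation databases work on the same principle at much finer granularity.

```python
# A minimal sketch of IP-based location inference: check which known address
# range contains the user's IP. The ranges below are reserved documentation
# networks paired with invented countries, used purely for illustration.
import ipaddress

GEO_TABLE = {
    ipaddress.ip_network("203.0.113.0/24"): "Ireland",   # illustrative only
    ipaddress.ip_network("198.51.100.0/24"): "Germany",  # illustrative only
}

def country_for(ip_string):
    """Return the country whose range contains the address, if any."""
    address = ipaddress.ip_address(ip_string)
    for network, country in GEO_TABLE.items():
        if address in network:
            return country
    return "unknown"

print(country_for("203.0.113.42"))  # Ireland
```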

Are you comfortable with Spotify knowing these things about you? Does this change how you feel about what it might recommend to you, based on who it thinks you are?