The Echo Nest: Comprehensive Music Data
The Echo Nest powers many of the music recommendation services across the web. But what we really like is that it gives app developers direct access to its massive music database.
According to the company, The Echo Nest is the result of twelve years of research at MIT, Columbia, and Berkeley. Its database grows using techniques we would usually associate with statistics or business intelligence: data mining, machine learning, and some analysis of vocal cues. The database extends beyond anything in Amazon.com’s or Apple’s iTunes catalogues because it contains more than just metadata. In addition to pulling details from album releases, it analyzes tempo and pitch, crawls the web to see how people describe particular songs and genres, and even tracks music trends emerging across social media. Who knew a holistic music database could contain so much detail?
Developers use standard API calls to access the data, and they have a lot of options. Consider what becomes possible with this much data.
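The calls themselves are plain REST requests. Here is a minimal sketch in Python of how such a request might be assembled; the `artist/similar` endpoint and parameter names reflect The Echo Nest's v4-style API, and `YOUR_API_KEY` is a placeholder for a developer key:

```python
# Sketch: building an Echo Nest API request URL (v4-style REST endpoint).
# The endpoint and parameter names follow The Echo Nest's documented
# conventions; replace YOUR_API_KEY with an actual developer key.
from urllib.parse import urlencode

BASE = "http://developer.echonest.com/api/v4"

def build_request(endpoint, **params):
    """Return the full request URL for an Echo Nest API call."""
    params.setdefault("api_key", "YOUR_API_KEY")
    params.setdefault("format", "json")
    # Sort parameters so the resulting URL is deterministic.
    return f"{BASE}/{endpoint}?{urlencode(sorted(params.items()))}"

# Ask for artists similar to Radiohead -- the kind of call a
# recommendation feature would make.
url = build_request("artist/similar", name="Radiohead", results=5)
print(url)
```

From here, fetching the JSON response is a single HTTP GET, and the same pattern covers the other endpoints (song search, artist news, and so on) by swapping the endpoint name and parameters.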
Many of The Echo Nest’s commercial clients use the data to power music recommendations, suggesting alternatives through any combination of API calls. The real innovation, however, may lie in the non-commercial developer community. The Echo Nest’s website houses a showcase of experiments. Take Streamograph, for example, which plots an area chart mashing up the hottest artists with their recent news mentions. Or there’s the unofficial Artist Discovery Guide to SXSW, which pipes in data from your Last.fm account and recommends the SXSW performances that best match your music tastes.