There’s no reason I can’t talk about my own projects on this blog, right? Just keep in mind that this story starts in 1999, when the web was mostly HTML pages, some Director/Shockwave, and a little Flash, delivered largely over dial-up modems. The social web didn’t yet exist. Napster was being sued by everyone. And the idea of selling music online was considered crazy. But lots of companies saw that something was about to happen and wanted to get involved.
Back then Triplecode was just Pascal Wever and myself, with the occasional freelancers — a tiny interactive design studio. We were approached by a new startup, founded to help people discover new music. The company, MoodLogic (although, as we worked with them, they were also known as Emotioneering and JaBoom), was just a couple of founders, a core technology (but no sense of how best to give access to it), and a rented mansion in the hills above San Francisco. It was very dot-com.
The design problem was super-interesting: People’s musical tastes narrow as they age, whether it’s the music enthusiast stuck in a particular style of music, or the post-teen who still listens to their high-school and college favorites. This happens, in part, because musical tastes are very peer-driven, and as we get older our social groups move away from using music as a key identifier. To make matters worse, there were no good alternatives for discovering new music. Record stores (remember them? This was in the days before iTunes and legal digital downloads) didn’t make it easy for people to discover new music — and their organization of albums by genre and alphabetically definitely didn’t encourage exploration and serendipity.
MoodLogic thought they could offer a solution: apply super-smart statistical analysis algorithms to a vast musical database. And they built a big one. They categorized song data across 24+ dimensions, including perception-based music metadata and information gathered by automated signal processing. By the time the project launched, tens of thousands of music listeners had contributed more than 300 million discrete opinions, and the database had over 700 million data points collected on 550,000 songs! It probably wouldn’t be a big deal these days, but back in 1999 that was a big data set!
The design challenge was to take advantage of all this data and analytical smarts and create a great user experience. How could we help people discover new music that matched their tastes and mood? It was important that it be intuitive and give results that people could believe. But there also needed to be some transparency to the complex underlying mechanism so users could trust the system. I thought the nicest, most poetic, metaphor MoodLogic used when describing their dream solution was for the system to feel like a “magical compass” that helps you find songs.
The project began with an extensive design research and mutual-education phase. It was a “what if” process examining different user scenarios, delivery technologies, and business strategies. We worked very closely with MoodLogic’s growing team of engineers, analysts, and music taxonomers — with each group feeding ideas into the others’ work. It was a lot of fun. And we developed a wide range of exploratory designs dealing with possible conceptual structures of their vast music data.
Our sketches — with names like “dancing jiggles,” “6d space plot,” and “skyscrapers from above” — may have seemed crazy, but they helped us all get a grip on the complexity of the task. We wanted to break away from analytics and think more about emotional and intuitive experiences for music discovery. The sketches also helped MoodLogic’s scientists, who were mostly familiar with traditional scientific visualization tools, think in new ways about their data.
Early sketches emphasized user experience and flow over content and organization. And many of them were “sketched in code” — giving an interactive feel for how a user might navigate through this vast database, and how the data might restructure itself as the user moved. As things progressed, some sketches were connected to the database to evaluate their usability with real data.
Over time we settled on an interface concept where users used a combination of “filters” and “magnets” to search and explore songs. Filters were used to limit the songs displayed, magnets helped visually arrange them into meaningful distributions. The magnets also allowed users to directly manipulate the information, providing a fluidity and flexibility that would be impossible in a more traditional page-based presentation.
In the browser’s first full implementation, magnets and filters were interchangeable — so, any search criteria could be used as either a filter or a magnet. For example, users could place magnets for “happy” and “saxophone” and the songs would arrange themselves accordingly – often in unexpected but interesting ways.
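The magnet mechanics can be sketched in a few lines of code. This is my own reconstruction of the weighted-average idea, not MoodLogic’s actual implementation — the attribute names, score ranges, and layout math are all assumptions for illustration:

```python
# A sketch of the filter/magnet idea, assuming each song carries
# attribute scores in [0, 1] (names like "happy" and "saxophone" are
# illustrative -- the real MoodLogic data model isn't described here).

def filter_songs(songs, predicate):
    """Filters limit which songs appear in the display at all."""
    return [s for s in songs if predicate(s)]

def magnet_layout(songs, magnets):
    """Each magnet pulls songs toward its position in proportion to the
    song's score on that attribute; a song settles at the weighted
    average of the magnet positions."""
    placements = []
    for song in songs:
        weights = {a: song.get(a, 0.0) for a in magnets}
        total = sum(weights.values())
        if total == 0:
            # songs unaffected by any magnet rest at the center
            placements.append((song["title"], (0.0, 0.0)))
            continue
        x = sum(w * magnets[a][0] for a, w in weights.items()) / total
        y = sum(w * magnets[a][1] for a, w in weights.items()) / total
        placements.append((song["title"], (x, y)))
    return placements

# "happy" magnet placed on the left, "saxophone" magnet on the right
magnets = {"happy": (-1.0, 0.0), "saxophone": (1.0, 0.0)}
songs = [
    {"title": "Upbeat Pop Tune", "happy": 0.9, "saxophone": 0.1},
    {"title": "Smoky Sax Ballad", "happy": 0.1, "saxophone": 0.9},
]
placements = magnet_layout(songs, magnets)
```

Dragging a magnet just changes its position and re-runs the layout, which is what made the interaction feel so fluid: songs matching both criteria drift between the two magnets rather than sorting into buckets.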
While this approach allowed the greatest flexibility in exploring, user testing found it to be a bit complicated. It took too long for people to learn how to use the system. While a learning curve might be acceptable in some other context, it was definitely a problem for this web application – just as users weren’t assumed to be music experts, they shouldn’t need to become experts in this system.
As a result, we narrowed the way in which filters and magnets were used. Filters would be used for categorical qualities of the music, and magnets for the more abstract emotional characteristics. (The magnets were arranged as a Plutchik wheel of emotions.) This made it easy for users to quickly start using the browser and enabled them to make more complex search requests as they became more comfortable with the system.
Letting users drag the magnets around – manipulating the data dynamically and seeing the inter-relationships – created a very interactive experience, and revealed very interesting relationships between songs.
Looking at the system now, I’d critique it by saying that it was pretty quiet — especially considering that it was about music discovery. I’d love for it to be more of a sonic experience, and less a visual one. But the technology and legal limitations at the time made streaming music hard.
After the Magnet Browser launched, we adapted it for a wide variety of uses, including a touch-screen kiosk. So what started as design research evolved into a suite of music tools. Some were launched, while others were used for business development. Later the company released some playlist-generating tools (similar to iTunes Genius) and was eventually bought out — but it paved the way for the current generation of music discovery software.
Did I mention that the relationship with MoodLogic really was a lot of fun? Obviously it was a great design challenge. But it also helped us grow our business. It allowed us to hire our first full-time employee, Lindi Emongou, grow into a new office space, and then hire a couple of other great designers. It won us a gold IDSA award in 2001. And it led to a lot of other exciting work (which, maybe, I’ll save for future posts).
Sadly, most of this work was in Director/Shockwave (or required connections to the no-longer-active MoodLogic server) and doesn’t run any more. If I can find a solution I’ll update this post.