Adrian Holovaty has two great passions. As a web developer he co-created the popular Python framework Django. As a virtuoso guitarist he channels the music of Django Reinhardt and posts popular YouTube videos. A few years back Adrian mentioned to me that he was working on an application that would combine these interests. This month it arrived. Soundslice is a tool with many uses:
Scoring music using tablature (tab), a notation that records positions on strings instead of notes on a staff.
Synchronizing notation with YouTube performances.
Using the synchronized notation to study and learn.
Annotating videos not only with tab tracks (for 4-, 5-, 6-, or 7-string instruments) but also with free-form tracks that can indicate chords and structure.
Using synchronized free-form annotation to add layers to any YouTube video. For example, the Soundslice tutorial is a screencast divided into chapters marked on a free-form track.
For me this has been an early Christmas gift. I’ve spent a lot of time, in recent years, learning to play guitar arrangements that I really enjoy. Until now that’s meant learning from the page — sometimes using tabs, sometimes conventional notation. I’ve looked to YouTube for inspiration. It’s a great way to sample variations on a theme and get a feel for how things could sound. But I’ve struggled to learn directly from those videos. It’s great ear training if you can do it, but I’m not there yet. I need a tool that helps me analyze, in detail, what’s happening in those performances.
Soundslice is that tool. Consider this tutorial in which Duck Baker demonstrates two improvisations on “Make Me a Pallet on Your Floor.” I really, really want to learn to play the tune like that. And I’d been making progress by replaying the video in small sections. But it was hard to consolidate what I’d learned. And as I recently found out, I wasn’t hearing (or seeing) a lot of the details.
Now consider my (still unfinished) Soundslice annotation of that Duck Baker video. You’ll need to skip to 6:52 to see my annotations because I’m tackling the second of the two variations first. (Adrian, when you get around to it, the ability to append #6:52 to the Soundslice URL would be a nice enhancement!)
If you check out my Soundslice, or any other Soundslice, here are some things to notice:
As the cursor moves along the timeline, current annotations on all tracks light up. So if you’ve annotated four beats as D7 on a chord track, that annotation will light up for the duration of the four beats. And notes within that measure will light up for their own durations.
There are no fixed intervals. It’s up to you to establish the durations of notes on the tab track and of measures on the chord track. That is, admittedly, tedious. But it means that you can sync the notation to the video as precisely as your motivation dictates.
Every annotation, when clicked, selects its duration for looping. You can focus on individual notes, whole measures, larger sections, or any other divisions that you create. And you can loop any selection at full speed or half speed.
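Under the hood, the lighting-up and looping behavior described above amounts to a simple interval lookup: given a playback time, find every annotation whose span contains it. Here is a minimal sketch in Python; the Annotation class, track names, and timings are my own illustration, not Soundslice’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    track: str    # e.g. "chords" or "tab"
    label: str    # e.g. "D7" or a note on the tab track
    start: float  # seconds into the video
    end: float    # seconds into the video

def active(annotations, t):
    """Annotations that should 'light up' at playback time t."""
    return [a for a in annotations if a.start <= t < a.end]

def loop_bounds(annotation):
    """Clicking an annotation selects its own span for looping."""
    return annotation.start, annotation.end

annotations = [
    Annotation("chords", "D7", 0.0, 4.0),   # four beats annotated as D7
    Annotation("tab", "2nd fret, 3rd string", 0.0, 1.0),
    Annotation("tab", "open 1st string", 1.0, 2.0),
]

# At 1.5 seconds, the D7 chord and the note sounding at that moment
# light up together, each for its own duration.
print([a.label for a in active(annotations, 1.5)])
```

Because there are no fixed intervals, the start and end of each span are whatever you set them to be, which is exactly why syncing can be as precise as your patience allows.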
All this, mind you, is happening in an HTML5 web application that I can use effectively in Chrome, Firefox, and IE 10. It’s a remarkable demonstration of what’s becoming possible in standards-based browsers.
But since this is a column about the personal cloud, I want to focus instead on how Soundslice anticipates an ecosystem of cooperating personal clouds. In this case let’s consider Duck Baker’s and mine. His performances on YouTube form a part of his cloud. The terms of that arrangement are specific to YouTube but we can imagine other services with different kinds of access control (or not) and monetization (or not).
Meanwhile my annotations of Duck Baker’s video form part of my personal cloud. Again the terms are specific to Soundslice but we can imagine different versions of Soundslice with different flavors of access control (or not) and monetization (or not). And while some versions might work with YouTube, others might work with different public or private video services.
This architecture opens up vast realms of possibility for those who wish to study, learn, teach, and share music, as well as for those who provide services to facilitate these activities. The key enablers are:
Data in disparate personal clouds.
Services that join those clouds.
Here the joining mechanism is time-based synchronization, which of course can be used far more broadly. Mozilla’s Popcorn.js, a general framework for combining video with other web assets, points the way.
Another joining mechanism is name-based synchronization. In the scenario I envisioned in Goodbye Fax, Hello Personal Cloud the joining mechanism would involve insurance claim numbers and health-care provider IDs.
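Name-based joining of this sort is essentially a relational join across two independent data stores. A hypothetical sketch in Python, where the record fields and claim numbers are invented purely for illustration:

```python
# Hypothetical records from two personal clouds, joined on a shared claim number.
patient_cloud = [
    {"claim": "CLM-1001", "note": "office visit"},
    {"claim": "CLM-1002", "note": "lab work"},
]
insurer_cloud = [
    {"claim": "CLM-1001", "status": "paid"},
    {"claim": "CLM-1002", "status": "pending"},
]

def join_on_claim(a, b):
    """Match records from two clouds that share a claim number."""
    by_claim = {rec["claim"]: rec for rec in b}
    return [
        {**rec, **by_claim[rec["claim"]]}
        for rec in a
        if rec["claim"] in by_claim
    ]

for row in join_on_claim(patient_cloud, insurer_cloud):
    print(row["claim"], row["status"], row["note"])
```

The point is not the code but the pattern: neither cloud holds the whole picture, and a shared identifier is what lets a service stitch them together, just as a shared timeline stitches my annotations to Duck Baker’s video.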
When I wrote about that insurance/health-care scenario several commenters called it “aspirational,” which is true enough. I have similar aspirations for Soundslice. It’ll be a while yet before ecosystems of cooperating personal clouds really get going, but I don’t mind waiting a bit longer. There’s plenty of guitar practicing to do in the meantime.
Make Me a Pallet in the Cloud