For Election 2010, we built an interactive mosaic that tries to emphasize the wide diversity of perspectives NYCity News Service reporters captured before and after the election. Visitors can filter these faces by relevant metadata, and click on a face to access a short video or audio interview with that person.
The entire product was built with HTML/CSS/JavaScript (jQuery) in about an afternoon. Each mug wrapper has classes with values for each metadata field, and presentation is driven by adding further classes based on those values. Originally, I was going to pull from an API and refresh the entire view, but what I ended up with proved to be a better solution.
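The class-driven approach can be sketched as a simple predicate over a wrapper's class attribute. The function and class names below are hypothetical illustrations of the technique, not the project's actual code:

```javascript
// Minimal sketch of filtering by metadata classes. Each mug wrapper carries
// its metadata as classes, e.g. class="mug age-30s borough-queens".
// Function and class names here are made up for illustration.
function matchesFilters(classAttr, activeFilters) {
  var classes = classAttr.split(/\s+/);
  // A mug is shown only if it carries every active filter class.
  return activeFilters.every(function (f) {
    return classes.indexOf(f) !== -1;
  });
}

// With jQuery, the same idea collapses into a compound class selector:
//   $('.mug').hide();
//   $('.mug.borough-queens.age-30s').show();
```

The compound-selector form is what makes this faster to build than an API round-trip: changing a filter is just hiding and showing nodes already in the DOM.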
In the future, I think it’d be more interesting to filter based on issue instead of demographic. Also, I didn’t think about this at the start, but there is a real performance hit on page load when every single piece is a video interview, because the browser queues up every one of the hidden Vimeo players at once.
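The obvious fix for the player-queueing problem is lazy loading: don't embed any player up front, just stash the video id on the wrapper and build the iframe on click. A hypothetical sketch, assuming a `data-vimeo-id` attribute and a `.player` container that the actual markup may not have:

```javascript
// Hypothetical lazy-load sketch: build the Vimeo iframe only when a mug is
// clicked, so page load doesn't pay for dozens of hidden players.
function vimeoEmbed(videoId, width, height) {
  return '<iframe src="https://player.vimeo.com/video/' + videoId + '"' +
         ' width="' + width + '" height="' + height + '"' +
         ' frameborder="0" allowfullscreen></iframe>';
}

// With jQuery, wired up roughly like:
//   $('.mug').click(function () {
//     var id = $(this).data('vimeo-id');   // e.g. <div data-vimeo-id="1234567">
//     $(this).find('.player').html(vimeoEmbed(id, 500, 281));
//   });
```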
Later: Michelle and I took a look at the project this evening, and I think it’s the first time I’ve seen it truly from the user’s perspective. Most notably, video embed performance sucks and needs fixing. Beyond that:

- The filter values should be sorted alphabetically (and numerically).
- Clicking on a new filter should close the active video player.
- Clicking on a mug near the bottom of the page should scroll the user back up to the overlay.
- When filters are active, the informational overlays should only appear on mugs that match the filter.
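Sorting the filter values alphabetically and numerically could come down to one comparator: numeric labels (say, age ranges) ordered by their leading number, everything else ordered alphabetically. A sketch with made-up labels:

```javascript
// Hypothetical comparator for filter labels: numeric labels (e.g. "18-24",
// "65+") sort by leading number and come first; the rest sort alphabetically.
function compareFilterValues(a, b) {
  var na = parseFloat(a), nb = parseFloat(b);
  var aNum = !isNaN(na), bNum = !isNaN(nb);
  if (aNum && bNum) return na - nb;   // both numeric: numeric order
  if (aNum) return -1;                // numbers before words
  if (bNum) return 1;
  return a < b ? -1 : a > b ? 1 : 0;  // plain alphabetical
}

// ['Queens', '18-24', 'Bronx', '65+'].sort(compareFilterValues)
// yields ['18-24', '65+', 'Bronx', 'Queens']
```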