Greg Linch led a 3 pm session in room 4 on applying GitHub (and Git) to news. Andrew Spittle and I collaborated on live Google Doc notes. Here’s Greg’s previous blog post on the topic.
Andrew Spittle and Andrew Nacin led a 1 pm session on lessons to be learned from developing software. Both worked on their college newspapers. Spittle now works for WordPress.com, a service offered by Automattic, and Nacin works on WordPress.org, an open-source software project. The session contrasted two different types of communities: centralized and decentralized.
Window seat on Amtrak 651 Keystone Service to Philadelphia for my third BarCamp NewsInnovation Philly tomorrow. Stoked to hang out with Andrew, Andrew, Marc, Sean, Greg, and everyone else. Michelle arrives tomorrow night after her Yelp event. We’re staying at the Penn View Hotel, and then will enjoy a tourist day on Sunday.
Drew Geraets led a session this morning on American Public Media’s Public Insight Network, an initiative and tool to bring their audience deeper into the reporting process. Funded by the Knight Foundation, they’re currently doing a complete rebuild of their CRM for journalism to produce a fully open source project and expand usage beyond the 12 existing media partners.
Specifically, by doing the rebuild, they want to: share more insights, offer better tools for sharing, enable sources to update their profile within the system, offer sources more granular privacy controls, instantly publish insights, create credibility systems for sources, offer a better user experience, and integrate with existing sites.
The prototype dashboard for the reporter-facing Audience Insight Repository is project-based and focused on collaboration.
Journalists can search through a huge database of sources based on demographic metadata.
Once they’ve found a worthwhile lead, the journalist can click through and get contact information, background on the source, and a record of prior interactions.
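The search-and-lookup flow described above could be sketched roughly as follows. This is a hypothetical illustration: the record fields and function names are my own, not the actual Audience Insight Repository schema.

```python
# Hypothetical sketch of searching a source database by demographic
# metadata, then pulling up a matching source's record. Field names
# are illustrative, not APM's actual schema.
sources = [
    {"name": "A. Rivera", "age": 34, "city": "St. Paul", "occupation": "nurse",
     "interactions": ["2009-11 health care survey"]},
    {"name": "B. Chen", "age": 52, "city": "Duluth", "occupation": "teacher",
     "interactions": []},
]

def search_sources(records, **criteria):
    """Return sources whose metadata matches every given criterion."""
    return [r for r in records
            if all(r.get(k) == v for k, v in criteria.items())]

leads = search_sources(sources, city="St. Paul", occupation="nurse")
print([lead["name"] for lead in leads])  # ['A. Rivera']
```

From a matching record, the reporter view would surface the contact details, background, and the `interactions` history of prior contributions.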
The project also has plans for a user-facing site, tentatively called MyPIN, where sources would be able to engage more fully with the news organization’s reporting process or update their profile information.
There’s a certain amount of friction, however, in requiring sources to manually update their profiles every time a bit of their personal data changes. As the system exists now, American Public Media requires readers to submit full contact information every time they fill out a form. If the contact information on the form is different from what is in the database, that discrepancy is flagged and an analyst has to manually resolve the conflict. In the future, in addition to enabling users to update their profiles on their own, it might also be worthwhile to explore integrating with LinkedIn, Facebook, or Twitter. With LinkedIn or Facebook, the user could update their resume, contact information, etc. and have it automatically pulled into the Public Insight Network. By integrating Twitter, for example, journalists could easily find sources for a given story by localizing search to updates from users within the network.
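The conflict-flagging step described above amounts to a field-by-field comparison between a form submission and the stored source record. A minimal sketch, assuming hypothetical field names rather than APM’s actual system:

```python
def check_submission(submitted, stored):
    """Compare a form submission against the stored source record.

    Returns the list of fields whose values differ, so an analyst
    (or, eventually, the source themselves) can resolve them.
    Hypothetical sketch; field names are illustrative only.
    """
    conflicts = []
    for field in ("email", "phone", "address"):
        if submitted.get(field) and submitted[field] != stored.get(field):
            conflicts.append(field)
    return conflicts

stored = {"email": "source@example.com", "phone": "555-0100"}
submitted = {"email": "source@example.com", "phone": "555-0199"}
print(check_submission(submitted, stored))  # ['phone']
```

Letting sources edit their own profiles, or pulling updates from an external profile such as LinkedIn, would shrink the queue of conflicts analysts have to work through by hand.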
We also discussed user privacy, which American Public Media has a significant interest in getting right: specifically, what control users have over their privacy and how to make policy changes without surprising or alienating them. An idea I suggested is that, rather than presenting just a list of options for the user to choose from, they instead try a Hunch-style approach: the user would be presented with a series of questions detailing scenarios about their data and how it might be used, and the decisions the user made in responding to each scenario would then guide their privacy options. At some point, American Public Media would like to start sharing source information among all of their media partners using the software, but it will be critical for them to execute that move right the first time.
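The Hunch-style idea is essentially a mapping from concrete scenario questions to underlying privacy settings. A hypothetical sketch, with made-up scenarios and setting names, defaulting any unanswered or declined scenario to the most private option:

```python
# Hypothetical Hunch-style privacy interview: each scenario question
# maps a yes/no answer to one or more concrete privacy settings.
SCENARIOS = [
    ("A reporter at a partner newsroom wants to see your past insights. OK?",
     {"share_with_partners": True}),
    ("A reporter wants to quote you by name in a published story. OK?",
     {"allow_named_quotes": True}),
    ("May we publish your insights instantly, without editorial review?",
     {"instant_publish": True}),
]

def derive_privacy_settings(answers):
    """Fold the user's yes/no answers into a settings dict.

    Unanswered or 'no' scenarios keep the most private default.
    """
    settings = {"share_with_partners": False,
                "allow_named_quotes": False,
                "instant_publish": False}
    for (_question, grants), answer in zip(SCENARIOS, answers):
        if answer:
            settings.update(grants)
    return settings

print(derive_privacy_settings([True, False, True]))
```

The advantage over a bare checkbox list is that the user reasons about situations they can picture, and the system translates those judgments into policy.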
“What is the atomic unit of journalism?” Howard Weaver asks as he discusses the pre-launch philosophies of Peer News. This is an important question because the news startup doesn’t want to brand itself as a news site. Doing so would lock its community into thinking about it within a particular paradigm. Instead, Peer News wants to offer a range of services to its audience.
Started by Pierre Omidyar, this news startup in Honolulu will take an online-only, subscription-only approach to covering political news. Online-only means no legacy costs or issues and, at this time, they have no plans for taking advertising. In addition, one of the insights Pierre had while thinking about this is that local civic government news is an elite niche with a capacity for “hyper-efficiency.”
If you’re starting a subscription model, Howard asks, why is this going to be worth more than a free beer on Facebook? For this operation, the compelling argument is the opportunity for a close relationship with a skilled journalist in a valuable local niche. By becoming a member of the site, the user has an equal opportunity to post their opinion and join the discussion.
Peer News will be charging $20/month because that’s what the competition charges for home delivery. There are five reporters, two web developers, an assistant editor, and an editor. By keeping technology capacity in-house, they hope to be able to innovate in how they package and deliver information. For instance, they hope to emulate Google Living Stories to provide contextual, canonical pages for ongoing stories and issues.
One question from the audience: how do you reconcile the fact that, by charging a subscription fee, you’re excluding some percentage of the population from access to information and democratic debate? There was also a spirited debate, albeit lacking much data, about paywalls and whether this would work well at a local level. Howard argues, however, that they’re selling the experience and not the goods.
There are two reasons they think this news startup can work. First, the entire operation requires far fewer financial resources than running a print newspaper; Peer News can provide high-quality journalism and break even with operating revenues of a couple million dollars a year. Second, they’ve identified a local niche whose information is of high value to a certain part of the community.
Steven Johnson, with “The Glass Box and The Commonplace Book” (emphasis mine):
But they have underestimated the textual productivity of organizations that are incentivized to connect, not protect, their words. A single piece of information designed to flow through the entire ecosystem of news will create more value than a piece of information sealed up in a glass box. And ProPublica, of course, is just the tip of the iceberg. There are thousands of organizations – some of them focused on journalism, some of them government-based, some of them new creatures indigenous to the web – that create information that can be freely recombined into private commonplace books or Pulitzer-prize winning investigative journalism. A journalist today can get the idea for an investigation from a document on Wikileaks, get background information from Wikipedia, download government statistics or transcripts from open.gov or the Sunlight Foundation. You cannot measure the health of journalism simply by looking at the number of editors and reporters on the payroll of newspapers. There are undoubtedly going to be fewer of them. The question is whether that loss is going to be offset by the tremendous increase in textual productivity we get from a connected web. Presuming, of course, that we don’t replace that web with glass boxes.
Whoa, wait a second… how do we measure the health of journalism then? If we were to develop this system, would we be able to track information density of text content or derive the quality of the information produced? Could we then mash this against topical and location metadata to see how well particular communities are being served?
On Saturday, I spent the day discussing the evolution of the news at BarCamp NewsInnovation Philly. It was something I had planned on attending for over a month and, as such, I had a pretty good idea on Thursday and Friday of what I wanted to discuss. With the story of swine flu infections breaking all around us, though, I was certain we had something new that we had to talk about: the role of the news organization in an ecosystem with multiplying non-traditional means for information dissemination.
It’s the biggest story of the weekend, no doubt, but there’s a meta-discussion to be had too. I first caught wind of the story late Friday night while waiting for Sean Blanda to pick me up from PHL. Processing through Google Reader, I briefly skimmed Xark!’s “Flu: Don’t panic, but pay attention.” The honest truth, however, is that I didn’t pay attention and it didn’t stick. The next morning, as we drove to Temple University for #bcniphilly, I was skimming through Google Reader on my iPhone again. This time I came across a post from Vinay Gupta on how you should take action if it becomes a pandemic flu (i.e. what steps you should take to be proactive). His perspective is what piqued my interest to learn more.
Tomorrow morning will find me headed to Philadelphia for Saturday’s BarCamp NewsInnovation Philly. Needless to say, I’m super stoked for this opportunity. Not only will I be able to finally meet my boss, my new colleagues, and the rest of the CoPress team I haven’t met, but I’ll get to spend an entire day, and probably much of the weekend, discussing the future of journalism with some of the smartest news folk in the country. If my flight doesn’t get laid over in Atlanta, I’d like to spend my time talking about at least a couple of different things:
Designing a News Startup From Scratch in 60 Minutes
The goal would be to rapidly prototype what a news organization of the future might look like by walking the hypothetical startup from concept to a year after launch and covering things such as: