An Episode About Episodes

My latest commentary for New England Public Radio is about episodes. It starts at about 7:10 below:

The original post is here

Albania Redux?

Balkan Army

Reading in RWW today about the Internet Assault on Traditional TV:

Netflix’s dominance over HBO in particular makes for some pretty symbolic future-of-TV discussion fodder. It is, after all, HBO that refuses to offer its programming as a standalone subscription service, despite growing demand for such an option. It is precisely its old media business relationships and norms that are holding HBO back from letting non-cable subscribers use its HBO Go app, a fact that seems worth recalling at this particular moment in history. It’s no wonder that the company’s CEO is publicly rethinking that strategy and admitting to reporters that cable-free access to HBO Go may be an inevitability.

I couldn’t agree more, and thinking about “assaults” I couldn’t help thinking back a couple of years to when Time Warner’s CEO referred to Netflix as “the Albanian Army”.

I loved the Balkan reference at the time, but somehow it seems like we’re moving into a new era of big power. QZ underlines this by pointing out that Netflix:

  1. has more American subscribers than HBO
  2. is the most watched “cable network” in the US
  3. is America’s biggest bandwidth hog, by far
  4. is the S&P’s best-performing stock of the year

More than anything else, what’s remarkable about Netflix’s resurgence into a major power is that it has done this via content creation, not via new technology. In a sense it’s not really an Internet “assault” anymore; it’s more a tipping point of user preferences for how they want to consume video content. With the increasing mass of young viewers moving online, all it needed was the right content to come along. If the new season of Arrested Development is a hit, it could very well prove to be the final battle of this Balkan era for television.

Hardware’s Next Little Things

Image: http://www.flickr.com/photos/aussiegall/

It seems that we’ve finally passed the point of expecting some sort of big breakout hit to come out of SXSWi. With its size and scope, most attendees’ concern was focused on dealing with the usual long lines for highly featured speakers and panels, snagging invites for secret parties or waiting on even longer lines for the sponsored ones. On top of this, it’s become clear that with no one wanting to try anything remotely daring outside of SXSW-approved events (homeless hotspots anyone? and by the way, it looks like they worked), we’re left with the organizers to try and create the foundation or groundwork from which we might find the next big, or little, thing.

The problem with this, of course, is that much of the programming for SXSW was sealed pretty much six months ahead of the festival, which means that the “latest and greatest” breakout hit may already have happened. This pretty much seemed to be the case looking at the lineup of keynote speakers: Bre Pettis from MakerBot, Elon Musk, Tina Rosenberg and Julie Uhrman of OUYA. What did all of these speakers have in common? At the core of their offerings and interests is the strong theme of creating physical products in a digital age. Nowhere among these high profile speakers was a new killer mobile app or a hot new social network. In fact there wasn’t really much “New”. Pettis and Musk did manage to inject some serious new into their presentations: Pettis by announcing MakerBot’s new prototype 3D scanner, and Musk by showing off an amazing, freshly minted video of a reusable SpaceX rocket practicing a short take-off and landing. But without the pull of a breakout hit, it seemed to me that a theme of physical applications of digital technologies had become at least a major thread this year. Here are a few of the strings:

Big Sensor

It started for me in a Friday healthcare app session, with a questioner who asked how the presenters were planning to take “Big Sensor” into account. Big Sensor? I’d been hearing plenty about Big Data, but this was the first time I’d heard a more specific subset of it defined: the massive and rapidly growing amount of sensor data available. In the new world of the quantified self, where we, and perhaps our doctors, are all tracking our own information, sensors from Fitbits to blood meters to some scary workplace motion-tracking sensors are becoming the physical appendages of data networks. Their growing use is creating a deeper need for a more designed approach that can integrate how we use sensor data, how we control it and how we can take advantage of it while retaining privacy and humanity.

Crowdsourced Cars

The day following Elon Musk’s presentation I went to a far more sparsely attended session that took Musk’s approach to physical production and turned it on its head. Local Motors is a company I had heard about before via Neil Perkin, who has championed its crowdsourced approach to automobile production. What’s impressive about Local Motors is their ability to leverage a worldwide network of enthusiasts, experts and professionals, connected by software, to design, develop, build and constantly improve a complex physical product, i.e., an automobile. While their Rally Fighter is in production and street legal in the US, they are also developing a limited edition pizza delivery vehicle for Domino’s Pizza and natural gas powered concept cars for Shell. But the most impressive part of their story was how they worked with DARPA to concept a vehicle for specific requirements in Afghanistan. The result, the XC2V, went from concept to delivery in 14 weeks, an amazingly short period of time for a vehicle, or any sort of, government procurement project.

Listening to the Local Motors story, it became clear theirs is a case of hardware learning from software. By using a distributed model of design they are able to draw on the equivalent of over 35,000 employees, and by adapting the Agile and Lean approaches of startups to their related Toyota Production System they are able to produce limited editions of automobiles, limited for the purposes of continuous improvement. The approach is to build 1,000 vehicles and then pause and optimize, instead of taking on the expense and hassle of the traditional mass model. All of this points to the way hardware is rapidly becoming more customized and customizable to a defined user experience. We’ve all gotten used to software that can be tweaked and refined to our specific needs; hardware is now rapidly approaching these same capabilities.

Leap Motion

Leap Motion wasn’t new to me, but it rapidly became one of the smaller scale breakouts of the show, even though its product had been announced and on pre-order since at least December. The biggest reason for this lies in one of the critical differences between physical and digital adoption: hands-on experience. Leap had set up a tent for attendees to sample the controller, and the lines outside the tent, along with a presentation by its founders, created strong word-of-mouth buzz around the product.

What the Leap controller represents is another step in the growing world of gestural interfaces. Kinect got this off and running, but Leap takes it a number of steps forward, especially when you consider its price, small form factor and ability to connect with multiple systems. What Leap also brings is a new relationship between physical and digital and the promise of interfacing with both in the same way. It begins to ask serious questions about our basic device controllers such as buttons, keyboards and menus, but ultimately it starts closing the gap between physical and digital in ways that I am looking forward to imagining.

There were more examples of the deeper blending of the physical and digital landscapes, most notably a full-scale replica of NASA’s James Webb telescope, but one of my favorites brought it back to how marketing might start to use this combination in a far more interesting way than QR codes. During their presentation called Art, Copy & Code, Google demonstrated some interesting and whimsical directions for marketers that start blending digital and physical to create more personal communications experiences. My favorite was this version of an Arduino-enabled basketball shoe that talked trash to its owner:
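As a purely hypothetical sketch of how such a shoe might work, the pattern is simple: poll a motion sensor and, when a reading spikes past a threshold, trigger a pre-recorded audio clip. Every pin number, threshold and component below is invented for illustration; this is not the build Google showed.

```cpp
// Hypothetical "trash-talking shoe" sketch: a motion sensor in the sole
// triggers an audio clip on a hard jump or cut. Pins, thresholds and the
// wiring are all invented for illustration.

const int ACCEL_PIN = A0;               // analog accelerometer output
const int AUDIO_TRIGGER_PIN = 8;        // pulsed to start an audio playback module
const int JUMP_THRESHOLD = 600;         // raw ADC reading that counts as a "jump"
const unsigned long COOLDOWN_MS = 3000; // don't talk trash on every single step

unsigned long lastTalk = 0;

void setup() {
  pinMode(AUDIO_TRIGGER_PIN, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(ACCEL_PIN);

  // A spike past the threshold means the wearer just jumped or made a
  // hard cut -- time for commentary, but not more than once per cooldown.
  if (reading > JUMP_THRESHOLD && millis() - lastTalk > COOLDOWN_MS) {
    Serial.println(reading);                // debug: log what set it off
    digitalWrite(AUDIO_TRIGGER_PIN, HIGH);  // kick off the next clip
    delay(50);
    digitalWrite(AUDIO_TRIGGER_PIN, LOW);
    lastTalk = millis();
  }
}
```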

A funny, and admittedly very early, attempt at bringing advertisers into the new environment connecting digital and physical. But when you consider how hardware is making so many small, innovative advances on so many fronts, it’s hard to imagine that we won’t wake up soon to a new normal where connected communications is part of the physical world all around us.

My radio commentary on WFCR

If you missed it (and it was over in the blink of an eye), here’s my first radio commentary for WFCR on what a CD player has to do with curation:

 

 

You can read the original post this was taken from over here

House of Data

Image: modthesims.info/d/317742

I’ve had a lot of discussions with friends about Netflix’s recent release of “House of Cards”. There continues to be a lot of discussion well after the release date, and Rick Leibling published a very thorough post in which he does an excellent job of summarizing a lot of the commentary while pointing out how Netflix is doubling down on its role as disruptor. What I’d like to consider in a bit more detail are three aspects of House of Cards that, while controversial, have led to its success and to the likelihood of its being a model for the future of streaming entertainment.

1. Netflix is an Archive

Perhaps one of the biggest things that has bothered many commentators, and Netflix users, has been the decision to release the entire show all at once. A lot of this has been ascribed to the fact that Netflix used its Big Data muscle to learn that its audience went in for “binge” viewing of shows. Somehow I can’t imagine that this was some sort of new revelation for Netflix. This type of behavior has been going on for a long time, in fact well before they began streaming programs. In my case I was pretty much inhaling episodes of The Wire as soon as they became available on Netflix DVD. No, I couldn’t get an entire season at once, but if I could have I would have, so my use case was to immediately binge on the six episodes available to me at that moment.

This binge use case reflects the fact that most of its users see Netflix as an archive. Until the arrival of original content, Netflix pretty much functioned as a storehouse of video content that had been previously created and transmitted by others. Netflix’s role was to source a huge selection of this previously created content and make it available in new and innovative ways. In this role it was always possible for it to, upon acquiring a season of 30 Rock, make it available in stages or episodes. But that would never have worked, for a pretty obvious reason: why would anyone want limits on viewing content that had been previously released? The problem that many people have with the House of Cards release is that it is unreleased content, and for that reason they feel it has to be shown the way we have been used to seeing it: rationed out, thereby creating a culture of suspense and desire.

Which is not to say that I’m opposed to that culture, but it reminds me of this quote from Paul Adams: “People applied the way they worked with existing media to the new media”. While “House of Cards” is not “new” media, its all-you-can-eat distribution is “new” distribution to which we are applying old rules. Netflix’s essential nature is basically that of an archive, or a library. And like a library it offers pretty much unlimited access to the mix of content that it has available. Netflix has always understood this fundamental use case, and so when they released House of Cards it wasn’t that they were rejecting the potential of serialization. Instead, they were making a pretty clear statement to existing users and new subscribers: this is how we distribute content, get used to it.

2. So, What Is an Episode?

With all-you-can-eat distribution, another question that comes up is the nature of episodes. If you release 13 hours of content in one go, why not release it as a single 13-hour movie? This is actually a far more interesting idea to ponder than the perils of binge viewing. When you consider that there’s no sort of timed release, what reason is there anymore to create content in handy one-hour doses, and what are the new potentials for creativity and storytelling if writers and directors are freed from the bounds of an hourly format?

The idea of storytelling via episode reached its heyday with serializations in newspapers and journals. Dickens, Zola and many others churned out well known, and not so well known, works in chapters that appeared on a regularly scheduled basis. The use case for this type of serialization was all about the publisher and advertising. By getting readers to return to find out what happened to Oliver Twist, publishers could build up and maintain a readership that translated into growing ad revenue. This structure was readily adopted by radio and television networks and became the mainstay of the programming schedule that we’re all familiar with. But when you can now release an entire season at once, why shouldn’t you also be liberated from packaging it in an hourly capsule? If the story demands a fuller telling in Episode 4, why not go an hour and 15 minutes, 93 minutes or even 45 minutes? When you consider that Mad Men was nearly scuttled by an argument about two minutes of episode time, imagine the creative freedom and potential you could get by allowing creators to define episode length in order to tell a better story.

This is the challenging, and exciting, question, and one that will take lots of experimentation to figure out; Netflix’s Big Data may be very useful here. Anecdotally, it seems that the reason I and many of my friends watch streaming series on weekdays is that we don’t want to make the commitment to a full, ponderous and probably lengthy movie. Whether it’s growing up with network TV or simply a need to control time and limit deeper thinking in our weekday entertainment, the idea of specific capsules of time still seems to make sense. That being said, in today’s world of time shifting it’s no longer about what’s on at 9 pm. By managing viewer expectations and demonstrating the real value of giving an episode the time it deserves, storytellers could eventually tell better stories, and we would end up being happier viewers.

3. Data and Creativity

Which brings me to the last point, about the limits of what data can or cannot do. Much has been made of the fact that Netflix commissioned “House of Cards” based on seeing a positive data confluence around Kevin Spacey, David Fincher and the original BBC series. This has raised the specter of bean-counting software commissioning scripts based on algorithms that would deliver us a soulless version of “exactly” what we wanted. Sort of like the idea that Facebook will ultimately deliver the right content to us based on our “Likes”. While there’s always the possibility of this, I think the success of “House of Cards” proves that data, if handled correctly, can be a very good producer.

I’m a huge fan of the original BBC version, Kevin Spacey and David Fincher, very much in that order. So when I heard Spacey utter the memorable “You might think that…” line from the original, I felt the warm delight of nostalgia and the expectation of re-experiencing the old in a new setting. It was when I realized that I would only hear that line twice that I began to pay attention to the show itself and how its writers, directors and actors had wrenched it out of the past and made it very much their own. The familiar players of the original have been updated and are much more deeply defined. Zoe Barnes is not the naïve, fawning reporter that Mattie Storin was in the original, and Francis’ wife, a rarely seen, scary figure in the original, gets a far more complete and nuanced role in this version. Yes, there are a few things I miss from the original, there are a couple of episodes that feel flat (did they need to fill in a whole hour?), and I have a real problem with parts of the Pete Russo story. But overall, by taking complete ownership of the story they were able to retell it anew and in a way that kept me, and many others it seems, riveted. No matter if they watched it in one gulp or in hourly drips.

The role of data, as author and producer, should be like that of any great impresario. Find a great story, find the right people and then get out of the way and deliver it to the audience in the way they want it. This part doesn’t change.

And God created…a Remix

I tried hard not to watch the Superbowl this year. The teams weren’t that interesting and the ads felt even less compelling, especially the uninspiring “previews”. Besides, I knew I’d be able to catch up on them afterwards in the numerous review pages.

But the blackout conspired against me, which meant that when the enforced episode of “Downton” ended, there was the fourth quarter waiting for me. And, as it turned out, so was by far the best ad of the entire game: Dodge Ram’s “So God Made a Farmer” ad.

The strength of the ad was in the way it broke through the clutter and sameness of the typical Superbowl ad. No safe frat jokes, schmaltzy humor or CGI overkill. Simply by using a spoken-word soundtrack and striking photography, it teased the viewer into following a story and the soft-sell, slow reveal of its sponsor.

 

 

By the next day, it was revealed that the ad, which was already generating a great deal of positive buzz, was pretty much lifted from this YouTube video created by Farms.com.

 

Farms.com had given Dodge full approval and support, as you will likely see in their video. But this certainly brought up all sorts of questions about creativity and originality. For years marketers have begged, borrowed and outright stolen cultural artifacts and pop themes from their creators in the name of creativity and staying ahead of the curve. This case is really not all that different, with the exception of the tone and level of production. The monologue in the Farms.com video includes a mild joke about the male farmer enduring “visiting ladies” (and the audience laughter in the background). The photos are a mixed bag: they include women, but are in and out of focus, and some appear to be from Canada. And many include farm equipment and vehicles, but none made by Dodge or their partner Case Tractors. The Dodge video airbrushes many of these faults by editing out the joke, using striking, high quality photography and subtly inserting their own vehicles in the images.

Brands have been steadily increasing their role in curating and creating social content. What’s interesting about this case is that the flow has gone in a different direction. Instead of brands creating content and allowing it to be socialized, in this case the brand has taken social content and branded it. I think this is an interesting flow and one that we’ll likely see more of considering the critical success of the ad. (Though I really doubt it will sell more trucks).

In this flow from social to branded content, what does seem to get lost is the freshness, originality and vibrancy of the original, amateur content. What Dodge did with their production, and especially with their photography, was to create a highly iconic and nostalgic view of the farmer. And while this is often what advertising is supposed to do, there is a danger here. Nostalgia reminds us of times that never were and often makes us comfortable with our prejudices. This was brought out strongly when many pointed out that over 50% of farmers and farm workers in America are Hispanic. That’s clearly not part of the vision of the farmer that Dodge wanted to put up, and an issue that I don’t think comes to mind in the Farms.com video, with all of its amateur naivete. But we’re now in a new world of remixing, where anyone can create and define their own iconography and nostalgia. So when the Brave New Foundation posted their own remix, below, they again reversed the content flow and helped complete a more accurate picture.

 

Why I could, sort of, Like Graph Search

I’ve become a very reluctant user of Facebook over the past couple of years. I log in once a week at best, ignore the weekly updates and never sign in to anything with Facebook. At this point I’m down to three, pretty lame, use cases for Facebook:

  • Spying on my, adult, children (pathetic)
  • Following political news and posting views to a broader network than on twitter (this kind of ended with the election)
  • Using it as a version of Patch.com to find out what’s going on locally because I don’t follow local friends on twitter (like when we lost power in the freak Halloween storm over a year ago).

So when Graph Search launched I pretty much tried to ignore the Apple-like, shrouded in secrecy, intro event. But as I began to think and read more about Graph Search I realized that there’s potentially more to like, than there is to dislike.

For starters there’s the fact that Graph Search could be, as Danny Sullivan pointed out, a fundamentally different kind of search. It’s not the Google-type search we may have been looking for or maybe expected, and that’s kind of exciting. Sullivan calls it “multidimensional search” and John Battelle thinks of it as “Facebook is no longer flat”. The dimensional metaphors make a lot of sense. When you consider the possibilities of Graph Search you can see that it has the potential to add additional, and potentially very interesting, layers to the Facebook experience. And unlike recent Facebook copycat clones (I’m looking at you, Poke), there’s some serious thinking and innovation going on in terms of deploying natural language search and linking volumes of structured and unstructured data on a massive scale.

But beyond the potential dimensionality of the Facebook experience, there’s also the fact that Graph Search feels like a serious attempt to build a serious model for semantic search. We’ve been talking about semantic search for quite a while, and while there have been some halting attempts, this feels like the first time someone is really trying to approach it in a committed fashion. When you think about Graph Search as some sort of awesome Big Data project, it actually begins to feel interesting. Perhaps by drawing connections and inferences from all of these data points we can learn how people connect, and maybe make the whole Facebook experience a bit more interesting? Maybe Graph Search could be an alternative to what has quickly become a very tiresome stream.
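To make the “multidimensional” idea a bit more concrete, here’s a deliberately toy sketch, with entirely invented names and data, of the kind of query this implies. Rather than matching keywords against documents, it intersects one relation (friendship) with another (interests):

```cpp
// Toy illustration of a graph-style query -- not Facebook's actual
// implementation. The data and the query ("friends of Alice who like
// cycling") are invented for the example.
#include <iostream>
#include <map>
#include <set>
#include <string>

int main() {
    // Two relations over the same set of people.
    std::map<std::string, std::set<std::string>> friends = {
        {"Alice", {"Bob", "Carol", "Dave"}},
    };
    std::map<std::string, std::set<std::string>> likes = {
        {"Bob",   {"cycling", "jazz"}},
        {"Carol", {"cycling"}},
        {"Dave",  {"chess"}},
    };

    // The "multidimensional" part: traverse one relation, filter by another.
    for (const auto& friendName : friends["Alice"]) {
        if (likes[friendName].count("cycling")) {
            std::cout << friendName << " is a friend of Alice who likes cycling\n";
        }
    }
}
```

The catch, of course, is that the answer is only as good as the “likes” relation it filters on, which is exactly where the next question comes in.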

Of course the real question is, could that even happen? As Steve Cheney pointed out, “much of the structured data that makes up Graph Search is…totally irrelevant and dirty.” With all the years and dollars spent on buying “Likes”, a great deal of the semantic data in the Facebook ecosystem is pretty much polluted. It’s as if Google had launched organic search AFTER having deployed paid search, and then used paid search data as a basis for ranking.

All of this brings up the issue of the use cases for Graph Search. We’ve seen a few great examples of “Stupid Graph Search” tricks, like Mothers of Jews who like Bacon on this Tumblr. And we’ll keep seeing tricks like these for a while to come, reminding us of the pitfalls of semantic search within the Facebook environment. Between paid Likes and the “innovation” of frictionless sharing, there is going to be a need to focus more effectively on privacy and the inadvertent settings that have become part of the Facebook experience. And this can’t simply be the role of Facebook users; Facebook itself and the Graph Search team may have to play a bigger role in deciding how deep it trawls and how relevant it makes the connections. The idea of creating “obscurity” on Facebook, as discussed in this recent article, may also be a role that Graph Search will need to take on, on behalf of the users. (And maybe the impetus to do that would be to start thinking of them as customers instead of, as I just wrote, users.) By deciding how much and what type of data to relate or interweave, Facebook itself can help create a meaningful obscurity. This is a very tough problem, but it’s the responsibility Facebook has accepted by creating Graph Search, and in a way, it would be pretty exciting to see them solve it.

All of which leads to the question of the use case for Graph Search. When I first heard about Google it was “there’s this search engine that gets it right and does it really, really fast”. Right now I haven’t heard a similar statement or problem/solution set for Graph Search. Yes, there’s a LinkedIn-killer use case and a Yelp-killer use case, but I’m not so sure these will really impact established products with loyal followers. Instead it’s in the weird connections and attributes brought together by Stupid Tricks that there’s an opportunity to create value.

It may be that Facebook will have to take the lead in surfacing interesting Graph Search data and new use cases in order to gain better adoption. Obviously there’s a great use case for advertisers, but it comes with an Achilles’ heel: advertisers are already enjoying similar benefits through existing Facebook advertising programs. The problem is that unless Facebook users can find their own organic and relevant use cases for Graph Search, they will likely opt out of it. And as users opt out, it will set up a feedback loop of diminishing returns for advertisers.

Facebook’s beta approach to Graph Search gives some hope that this might happen, especially if they can be patient, build up data and let the use cases emerge before setting it loose on the advertising world. There are some other large questions, such as how relevant Graph Search will be within Facebook’s walled garden. But as an experiment that could build our understanding of how people connect, while hopefully fostering “obscurity”, I could learn to like Graph Search.

A CD Solution

Image courtesy

We recently decided to let our son take our 1993 Toyota Corolla (an amazing old car with only 95,000 miles) off to college. But before he left he surprised me with a request: he wanted to see what it would take to install a CD player in the car. The car still has its original sound system: 4 crappy speakers, AM/FM radio and a non-working cassette player. So my immediate question was “Why a CD player? Wouldn’t you want something more up to date, like an auxiliary jack so you can plug in iTunes from your phone?” No, he said, that was the problem. He didn’t want to have iTunes available; he expressly wanted to be able to use CDs. And therein lies a tale of abundance and curation.

Turns out that there’s a problem with his user experience of music players in cars. Whenever he’s been driving with friends, even over the shortest of distances, the first thing that happens is a race for someone to plug their device into the car audio. Not only that, but as soon as that person’s first selection is done, someone else immediately demands to put his or her track on, grabs his phone to look for their favorite track, or asks him to plug in their music player so they can play it. What bothers him the most is that it quickly degenerates into a completely dysfunctional music experience. (I did point out that there’s also a bit of a safety issue, but I’m just the parent.) So the real problem is that while he doesn’t mind listening to multiple music sources at times, in this case he’d really like it if he and his friends could focus on one single coherent stream of music instead of jumping all over the place.

So out of desperation he decided to turn the clock back 30 years to CDs. Why CDs? It’s really all about curation and control. The plan is: buy a few hundred CDs, burn them with full albums of artists he likes (and maybe a few mixes), and then have them available as the only music source in the car. The rationale? First, if all the car has is a CD player, the only thing it can play is CDs. Second, once a CD is cued up and playing it becomes very difficult to make instant switches to other artists or songs. Finally, and most importantly, he can now create the kind of user experience he really wants: listening to a single coherent stream of music without constant interruption, and perhaps impressing his friends with some interesting mixes.

I’m regularly fascinated by the way we are constantly cobbling together tools to try and curate the digital abundance surrounding us. What I really love about this scenario is its retro aspects. It’s all about making an overall user experience better by going back to an older technology and making things more difficult. It feels very much like some of the strategies we’ve all seen, and used, to avoid multitasking: software that locks you out of your browser, shutting down your email client or setting alarms to focus your time for an hour, as I did when I started writing this blog.

In a way this is all about one of my favorite Clay Shirky quotes, “It’s not information overload, it’s filter failure”, taken deeper and into a more specific context. Here we start with the obvious abundant overload of music, but the technology, specifically music players, causes the filter failure by allowing filters to get mixed, delivering an experience that, for him, is less about choice and more about dissonance. So even though our devices let us access and filter better than we had ever imagined, when we combine our filtering capabilities in this particular instance we get filter failure. Generally the answer to filter failure is to design a better filter, but in this case the solution is an inelegant opposite. By rolling back to an older, more difficult technology, he ends up with a filter that forces less choice, and perhaps a better experience.

The Trough of WinTel

From Mary Meeker, KPCB Internet Trends @ Stanford – Bases Kick Off 12/3/2012

I always look forward to seeing Mary Meeker’s latest Internet Trends reports. This latest one feels more like an update, and much of the core information continues the track of earlier presentations. But for some reason this slide really jumped out at me.

Maybe it’s just the idea of seeing Commodore, Amiga and TRS-80 in a Mary Meeker deck that gave me pangs of nostalgia. Or maybe it’s the surprising perspective of seeing how deeply the trough of WinTel pressed down for so long on the digital world. Imagine if the chart values had been flipped: would I have perceived it differently? The rise and fall of WinTel, instead of the pressure of a single system’s domination being relieved by the sudden rise of Apple and Android?

What’s interesting to consider is what will come over the next few years on the right hand side of this half-pipe. Will it look as fragmented as the left hand side? In her closing slide Meeker says that:

This cycle of tech disruption is materially faster & broader than prior cycles…

If that’s the case, the troughs will be shallower, but I’m not sure my nostalgia will be deeper.

 

Rumormongering

I helped spread a rumor. One that was proved false, and one that says a lot about the influence of social media on news and journalism.

In my case I was flicking through the #sandy coverage and landed on CNN just as they reported that the New York Stock Exchange floor was under 3 feet of water. At which point they tried to get a comment from their financial reporter, who was in 3 feet of water in Atlantic City; things got ridiculous and I flicked away, but not before I tweeted out my contribution.

Within a few hours, of course, the story was proved wrong. Turns out it was part of a series of tweets coming from a twitter troll/user named @comfortablysmug. He was the source of the underwater stock exchange tweet, and a number of other scurrilous rumors. The tweet was retweeted 600 times and eventually made it to the Weather Channel, who retweeted it, which is where CNN picked it up and vaulted it into the mainstream.

My tweet was also retweeted a couple of times, which gave me some social satisfaction, but I didn’t find out that it was false until the next day. So the question came up: was I wrong? Should I have waited, issued a correction? Walked it back with my followers? In the end I ignored it; at least I had attributed it to CNN, let them take the heat. But it made me think more about how social media is changing newsgathering, and about how we deal with rumor and fact.

#Sandy was a watershed for social rumormongering. The combination of a mass event that simultaneously focused millions of eyeballs and fingers, and the growing symbiotic relationship between personal media and mass media, proved irresistible for anyone wanting to put their own spin on events. Consider what happened on Instagram. #sandy brought on an unprecedented profusion of artful, believable, and frankly ridiculous Photoshop hoaxes. The Atlantic’s Alexis Madrigal did some remarkable live coverage of this here, and later pointed out that we still don’t fully understand what it’s like to experience a fast moving event through the internet:

“With old media still largely moribund and no impending changes in the information ecosystem at the major social networks, the only current systematic answer is the laissez-faire one: over time, people will learn who to trust and who not to trust based on what they post. The people who ‘provide value’ will win.”

I’m not sure about that last part, at least in the short term. As I was writing this I saw this tweet quoting Kevin Systrom that #sandy was Instagram’s biggest moment.

While Systrom based his assessment on the staggering number of #sandy-hashtagged images, I wonder if another indication of the scale of this “moment” was the level of instatrolling of events that, up to now, we’ve been used to seeing on twitter. In a way, you wonder if it’s a sign of a platform’s maturity when hoaxers, in jest or seriousness, know that they have the power to spread their hoaxes.

The larger question is: what does this mean for the future of news and information? For me, aside from the issue of attribution, addressed by Maria Popova’s sensible but somewhat clunky Curator’s Code, one possible solution may have to be a change in our mindset: a definition of news as fleeting, changeable and fungible content. When you have millions of correspondents and publishers viewing the same event and discussing it on multiple platforms, the definition of information is bound to change and to be affected by how you are curating it. It’s become a sort of Heisenberg uncertainty principle approach to consuming information.

It’s also why some new news organizations are beginning to adapt to this streamlike approach to newsgathering and publication. Quartz, a new publication from Atlantic Media, has jumped into the fray by not only creating a look and feel that is appropriate for mobile, but by also reorganizing its editorial structure around rapidly changeable topic areas that they call “obsessions”. You can also see this new approach at the ITV News site developed by madebymany. Using a mobile, stream-friendly structure, the site highlights the live stream of news stories while simultaneously allowing readers to drill down as the story changes and evolves. In both of these examples, what is coming into focus is that news content needs to be delivered in a way that accepts a certain transitory nature around journalistic truth: what you see now may be different in a few hours, when additional points of view are added.

But the best example I have seen of this came not from a journalistic institution, but from one of the new breed of citizen journalists that we are all, in a way, becoming. Last July, as the rumors of the awful movie theater shootings in Aurora began to circulate, 18-year-old Morgan Jones curated a live feed of social media posts, police radio announcements and news coverage on Reddit. There was much to admire about Jones’ initiative, gumption and desire to get the story out on social media. But along with the task of simply getting the story out there, he also displayed an innate understanding of the shifting shape, arc and ultimate changeability of social news through the simple use of strike-throughs.

“I don’t delete things and replace them with something else,” he told NPR’s “All Things Tech.” “I do a strike-through and put what [latest information] I have below it so it gives people an idea of how it’s changing, so it’s transparent.”

For me strike-throughs have always been a kind of hipsterish style irony. In this context they have become a signal for a new way of consuming content, rumors and all.