From Second Screens To Dramatically Delayed Watching, Three Reasons The TV Model Could Fall Off A Cliff

I was lucky enough to spend some time at Swarthmore College last week thanks to a kind invitation from the Swarthmore Feminists to talk about sexuality and television, but the real treat for me was a chance to sit and chat with the students there about what television they watch, and how they watch it. This is hardly a representative sample, of course: college students aren’t at the height of their purchasing power, and it isn’t always easy to set up cable subscriptions on campus. Still, it was a revealing conversation, and one that did more than cord-cutting numbers have thus far to convince me that television could face a comparatively sudden realignment of its business model as a generation of television watchers comes of age and turns out not to watch television live, in the timeslot, or through the conventional means of accessing programming at all. I was struck by three key takeaways in particular:

1. If They Own Television Sets, It’s Mostly For Gaming: I didn’t really develop the habits of a television viewer until I’d graduated from college and had access to cable for the first time, but even in college I owned a television, and my sense was that most rooms had one, if only to watch DVDs. DVD drives didn’t come standard on laptops when I purchased my first one, tablets were a theoretical product, and streaming video wasn’t commercially ready to support long clips and heavy usage. Now, with all of those things standard, televisions are less a primary means of watching video content than large objects taking up space in a single or a common room. If you’re serious about video games, they might be a necessity. But all the Swarthmore students I talked to were building their habits as television viewers through their interactions with Netflix and Hulu rather than through channel surfing. And that means a very different, much more highly curated user experience. Television watching is a habit as much as it is an optimal consumer experience, and if that habit isn’t developed early, there’s no reason to believe it will remain primary.

2. They Don’t Feel Any Real Hesitation About Pirating Content On Either Moral Or Quality Grounds: Only one of the students I talked to said she avoided, on moral grounds, watching content that had been illicitly downloaded, and she admitted that her refusal to watch movies or television that hadn’t been paid for or borrowed in some licit way marked her as something of a square. The consensus seemed to be that television networks and studios don’t really need the students’ money, that someone else is putting up the money to support the continued production of content. I obviously think that’s a fairly shortsighted perspective, but it illustrates how deeply that rationalization has taken hold as a convenient excuse for not purchasing content.

3. They Don’t Care When They See Whole Seasons, Much Less Individual Episodes: This was probably the consensus opinion that hit me hardest: the students were very comfortable with watching not just episodes but whole seasons of television long after they’d aired. Neither the prospect of so-called spoilers nor the desire to engage in a cultural conversation tied temporally to air dates seemed to matter very much. And that should scare television producers more than anything else. If there’s no such thing as must-see TV, whether in the old model, where networks assumed viewers wouldn’t see every episode of their favorite shows, or the modern one, where we’re expected to tune in to every episode of long-arc narratives, then it’s hard to see what the television business model is at all.