Thursday, December 31, 2009

The keys to media aggregation success

Not too long ago, I wrote about a new wave of disintermediation, but I've realized that consumers don't inherently dislike middlemen; in fact, they appreciate middlemen who aggregate content. For an aggregator to be successful, however, it has to:
  1. Have a comprehensive selection of content
  2. Be easy to use
  3. Charge reasonable prices (if it sells goods and services, its prices don't have to be the lowest, but they must be reasonable)
The first key is to have a comprehensive selection of content. Record companies learned years ago that consumers won't shop in stores where they can only get one or two companies' music. They want a big selection. That was why Apple didn't launch its iTunes Store until it had signed distribution deals with all of the biggest record companies.

The second key is ease of use. There were plenty of online music sites before iTunes, but they were hard to use and imposed draconian DRM schemes on users. The motion picture companies had the same problems with their early attempts at making movies available over the Internet.

The final key is reasonable pricing. Early on, record companies tried to demand more money for digital downloads than they did for CDs, and they tried to force consumers to purchase entire albums rather than single tracks. Apple sold them on the idea of a fixed price per track and discounted prices for entire albums. Now, Amazon is rewriting the pricing model for eBooks by selling almost all its titles for $9.99 or less.

Apple was the first company to get all three keys right, with the iTunes Store. Tight integration of iPods and iTunes helped the company get the ease-of-use part right. Amazon learned from Apple and implemented a similar model with the Kindle, which also gets all three keys right but is somewhat vulnerable due to the technical and ease-of-use limitations of the Kindle itself.

On the video/movies side, YouTube leads by far in the free content space. Hulu has gotten the ease-of-use key right and has the biggest selection of legitimate content from the US television networks, but many users are frustrated by convoluted policies that make episodes available for only a limited amount of time or restrict the number of episodes available. Netflix and Amazon are both working to make more movies and television shows available for immediate viewing, but they're not there yet, and both their ease-of-use and pricing models are "works in progress".

TiVo and Roku are both positioning themselves as "super-aggregators", in that they already offer access to both Netflix's and Amazon's libraries, plus content from an expanding number of producers and aggregators. TiVo got ease-of-use right a long time ago, but its Achilles' heel is its monthly service charge. Roku's user interface is less mature, but it doesn't charge a monthly fee to use its set-top boxes. Roku's weakness is distribution: its set-top boxes are sold only direct, not through outlets such as Best Buy and Wal-Mart. Without high-volume retail outlets, Roku will always be playing catch-up with its better-distributed competitors.

In music, Apple has locked up a dominant position, and Amazon is well on its way to doing the same thing in eBooks. In video, television and movies, however, the only truly dominant player, YouTube, is free. It's far from certain that YouTube can maintain its dominance once it starts to charge for access to some content, which is widely rumored to occur in 2010. For these media, the field is still wide open.

A Big January Coming from CES, Google and Apple

January 2010 is going to be a big month for new product announcements. First up is Google, which has scheduled a major announcement for next Tuesday (January 5th) on the Googleplex campus, most likely to showcase the Nexus One mobile phone in partnership with T-Mobile. As I speculated earlier, leaked details indicate that there will be only one rate plan available if you buy the phone through T-Mobile, but it's not a bad one: unlimited voice and data for under $80 US a month, with the phone subsidized down to $180. The unlocked phone from Google will cost $530; I thought that Google would at least partially subsidize the price of the phone and offset it with advertising revenues, but I was wrong.
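
Back-of-the-envelope, the unlocked price may not be as bad as it first looks. Here's a minimal sketch of the two-year cost comparison, assuming a hypothetical $60-a-month no-contract plan (neither Google nor T-Mobile has announced one):

    # Rough two-year cost of ownership for the Nexus One, using the leaked
    # prices above. The $60/month no-contract plan is a hypothetical
    # assumption, not an announced price.
    MONTHS = 24
    subsidized = 180 + 80 * MONTHS   # T-Mobile contract: $180 phone + $80/month
    unlocked = 530 + 60 * MONTHS     # unlocked phone + hypothetical cheaper plan
    print(f"Subsidized, on contract: ${subsidized:,}")  # $2,100
    print(f"Unlocked, no contract:   ${unlocked:,}")    # $1,970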

The Consumer Electronics Show (CES) opens in Las Vegas next Thursday, January 7th. There's not a lot of pre-show buzz about new products, but here's what I expect to see:
  • Several new eBook readers (both hardware and software). In software, the biggest noise is likely to come from Kurzweil and Baker & Taylor's blio--eBook reader software designed to maintain "page fidelity" rather than make eBooks readable on devices with tiny screens. There will undoubtedly be several hardware eBook reader announcements, including some with two-page displays and full color.
  • More Internet set-top boxes like the Roku. This might be where Comcast first shows the Roku-like set-top box that it's been working on to support its Xfinity service, and Video Business Magazine accidentally broke an embargo yesterday on a new set-top box called Popbox. Also expect to see lots more Internet-enabled Blu-Ray players supporting Netflix, Amazon On Demand, YouTube and other services. Internet connectivity will be the thing that drives sales of Blu-Ray players, not Blu-Ray itself.
  • More companies will jump into the "dead-simple" camcorder space pioneered by Cisco's Flip. Samsung recently shipped its first model, and I expect to see Panasonic make an announcement as well. Expect to see more models with image stabilization, more control over image quality and better sound, as well as WiFi and geolocation. The challenge will be to make camcorders that are more sophisticated but still inexpensive, small and simple to use.
  • There will undoubtedly be more waves of HDTVs, including someone pushing the "world's biggest" model. There will be more OLED models on the floor, still at stratospheric prices, but with larger screen sizes that are more practical for everyday use.
  • Expect more add-ons for the iPhone and iPod touch to increase their functionality, and possibly, the first wave of similar add-ons for the Motorola Droid. Hardware add-ons for Android phones will be much more difficult to monetize because there's no standard form-factor or dock interface, but some companies will jump into the market.
Apple's announcement at Yerba Buena Center in San Francisco on January 26th is probably the most-anticipated event of all, because the expectation is that Apple will announce a tablet computer. The rumor mill has been going wild with this one, but I wouldn't be surprised to see Apple announce, alongside the tablet, a dramatically expanded iTunes Store with a big selection of eBooks, magazines and newspapers to go along with the music, videos and movies already on sale.

Pricing will be a huge issue. If Apple announces a tablet for around $1,000, as has been rumored, its market is going to be very limited. I think that they have to keep the price of the tablet at $500 or less, and they may have to get there by striking an exclusivity deal with a mobile operator such as Verizon or AT&T to subsidize the cost of the tablet in return for a two-year service contract.
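
Here's a minimal sketch of that subsidy arithmetic; the rumored and target prices come from above, and the two-year contract term mirrors standard mobile practice:

    # How much a mobile operator would need to recoup, per month, to bring
    # a rumored $1,000 tablet down to a $500 retail price over a two-year
    # contract. Illustrative arithmetic only.
    rumored_price = 1_000
    target_price = 500
    contract_months = 24

    subsidy = rumored_price - target_price
    per_month = subsidy / contract_months
    print(f"Subsidy: ${subsidy}, recouped at ${per_month:.2f}/month")  # $20.83/month
    # That's easy to fold into a typical monthly data plan, which is why
    # an operator exclusivity deal could make $500 workable.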

In any event, this is going to be a very busy January.


Wednesday, December 30, 2009

Nothing Lasts Forever: The 12 Most Tarnished Tech Brands

Harry McCracken published his list of the 12 most tarnished tech brands on his Technologizer blog. Some of them are near-forgotten, such as AltaVista, Commodore, Packard Bell, and CompuServe and Netscape (both of which were driven into the ground by AOL). A few are once-well-known brands that are now used by anyone with the money to license them, such as Polaroid and Westinghouse, and one, AT&T, is used mainly for convenience by its acquirer, SBC. One brand that I would have included is Compaq; once one of the most powerful companies in the computer industry, the Compaq trademark is now used to brand HP's cheapest and junkiest PCs.

The list reminds us that even the biggest and most powerful companies can eventually fall apart and become irrelevant. MySpace is heading down this path, and Yahoo! isn't too far behind. Motorola has to prove that the Droid isn't a one-shot wonder like the RAZR was in order to stay off the list. Nortel will certainly be on this list if its trademark continues to be used. Alcatel-Lucent fits the list as well, since it probably won't survive in its current form for much longer.

Who do you think is a candidate for ending up on the scrap heap in 2010?

Saturday, December 26, 2009

In Silicon Valley for the holidays

I've been back in Silicon Valley for the last few days; tomorrow, I head back to Chicago. Not a lot has changed since I moved a year ago. Just about everything is more expensive, from gasoline (30 cents more a gallon) to food, although I got a great deal at a local hotel. It's certainly a lot warmer here, and there's no snow; I'm returning to heavy snow and a high around 25 degrees F in Chicago.

Stanford Hospital has taken over the old Excite@Home building in Redwood City, a huge white elephant that was a symbol of the dot-bomb implosion (Excite@Home went out of business before the company could occupy its new headquarters, and the building sat empty for years.) However, new white elephants have taken its place; there are two empty office towers near Great America in Santa Clara, and other, less visible empty buildings and "see-throughs" scattered throughout the area.

It feels as though the Valley is on hold, not dead, but the pre-recession energy of the place has dissipated. There's a sense that post-dot-com crash and post-recession, the Valley won't have the vibrancy that it once had. That's not to say that there isn't still a lot of activity here--after all, Apple, Facebook, Google, HP, Intel and Nvidia are all headquartered here, as are hundreds of other high-tech and biotech companies. However, I think that we're likely to see Silicon Valley become a satellite location for many companies, rather than headquarters.

It makes sense to have a presence in Silicon Valley when it's appropriate. For example, you may need an engineering team with skills that simply aren't available anywhere else, so it would make sense to set up a development office here. However, just about every other skill set you're likely to need is available in quantity in other parts of the country and world. Companies based outside the Valley should only put operations here that justify the extremely high cost of doing business.

Tuesday, December 22, 2009

Helping startups get started

I moved to the far northwest suburbs of Chicago a year ago from Silicon Valley, but I've stayed interested in and involved with new ventures, having founded or co-founded three companies myself. One of the things that surprised me was how much more difficult it seems to be to start new businesses here than it is in Silicon Valley. There are plenty of universities to provide technology and motivate students, including the University of Chicago, Northwestern, IIT and the University of Illinois. The Chicago area also has Argonne National Labs and Fermilab, two of the top scientific research facilities in the country. 37Signals and Threadless thrive here, and I know of a number of startups that are under the radar. Nevertheless, for those startups that do get traction, there's overwhelming pressure to move, usually to Silicon Valley or New York.

I recently signed up for the Founder Institute, a program of lectures and team assignments designed to help aspiring founders gain the skills and make the connections that they need for success. I was accepted but had to decline when I learned that I couldn't participate remotely and would have had to fly back to Silicon Valley for all of the sessions, or attend sessions in one of their other cities, none of which is even remotely convenient for me. There are other groups that do similar things, from Y Combinator (the best-known of the group) on down. The problem is that all of these groups depend on getting members and lecturers together in one place over a period of months. That doesn't work in a city without a critical mass of qualified lecturers and interested participants.

Given the success that for-profit educational institutions such as University of Phoenix and DeVry University are having with remote learning, I'm convinced that a similar approach will work for training and encouraging new business founders, no matter where they're located. We use the Web for collaboration, messaging, teleconferencing and entertainment all the time--why can't we use it to help people learn how to launch their new businesses, wherever they are?

Let's be realistic--millions of jobs in old-line manufacturing industries have disappeared in this recession and will never return. We have to encourage new ventures across the U.S. and create jobs where the people are. Let's use the tools that have so dramatically lowered the barriers to entry for technology companies to lower the barriers to entry for teaching and encouraging entrepreneurs, across the country and around the world.

Monday, December 21, 2009

It's the predictions time of the year, and some are better than others

The holiday season is wonderful, but it's also the time when bloggers, editors, pundits, psychics and all of us slightly- to completely-uninformed people issue predictions about what is going to happen in 2010. I just read a few of the predictions for 2010 from Danny King of Video Business, and I had a few, uh, disagreements.

For example, King thinks that it's a foregone conclusion that Amazon will buy Roku. Not very likely, given that the Roku set-top box was designed by Netflix and was originally supposed to be a Netflix-branded product. I'm sure that Netflix still has first dibs on the product, and perhaps on Roku itself. He also predicts that TiVo will be sold to Best Buy. Huh? TiVo is turning into an audience- and advertising-research company; TiVo's DVR sales and market share continue to decline, and Best Buy buying the company wouldn't change that. I think that both companies will end 2010 as independent entities, but if I had to come up with likely purchasers, I'd vote for Cisco for Roku and Google for TiVo.

He also thinks that Redbox will do a deal with Starbucks, but only if they come up with kiosks that do digital downloads (to thumbdrives or SD cards, I presume.) Starbucks tried it with music and it didn't work, so why would they think that it would work any better with video? My personal opinion is that the digital download kiosk model being pursued by Blockbuster and MOD Systems will be dead on arrival. Why would I drive to Starbucks to load a video onto a thumbdrive when I can download it over the Internet to my PC or stream it to my Netflix- or Amazon-equipped set-top box or Blu-Ray player?

Another of his predictions is that NCR will drop out of the video kiosk market, just after the company acquired DVDPlay; his logic is that NCR is number one in ATMs and doesn't want to be number two in video kiosks. True, they don't want to be number two--that's why they purchased DVDPlay and partnered with Blockbuster. There is certainly room for more than one company in the video kiosk business, and I think that NCR will stay in.

I've learned not to make end-of-the-year predictions, because too much can happen too quickly to anticipate. I will state a hope for 2010, however: that the economy recovers, so that the millions of people without jobs can find work, and the millions who are underemployed can find full-time work and better opportunities. I wish you and your family a happy holiday season and a healthy, loving and comfortable 2010.


Saturday, December 19, 2009

If you want a better tool, build it yourself

I just finished reading a great post on Gizmodo by Frank Beacham about Orson Welles's last project and his fascination with the then-new Sony Betacam. Welles took one look at the first professional camcorder and intuitively understood what its impact on video and film production would be. Beacham's article brought to mind how many great filmmakers over the years were also technologists: Not only did they use the tools available, but they helped to design them (or actually designed them themselves.) The recent list includes:
  • Francis Ford Coppola, who partnered with Sony for many years and was one of the first to apply video to motion picture production
  • George Lucas, who developed one of the first computer-based non-linear editing systems, owned Pixar (and still owns ILM, Skywalker Sound and Lucas Digital) and pushed the limits of digital production, post-production, computer animation and special effects
  • Garrett Brown, a cinematographer who invented the Steadicam and Skycam and changed the way that both motion pictures and sports television look
  • James Cameron, who first pushed the envelope of computer graphics and special effects, and later, with his partner Vince Pace, created the Fusion 3D system that's helping to make 3D a core production and display technology
There's an old saying: "If you want a better tool, build it yourself." These artists and others like them built their own tools in order to create art that could never have been made before. More importantly, they share their knowledge with others in order to advance the "state of the art." If anyone ever asks the question "Are engineers creative?", the answer is everywhere. Creativity is essential to engineering; sometimes, it results in masterworks in software, consumer electronics or architecture, and sometimes, it results in brilliant motion pictures and videos.

If you want a better tool, build it yourself.


Tuesday, December 15, 2009

Where should you locate your startup?

I monitor the Lean Startup Circle group on Google, and a member asked for some suggestions on how and where to find a contract development team. The discussion quickly turned to relocating to where the team is (the member who asked for advice was in Denver); Silicon Valley came up a few times, and one person even suggested relocating to India. My suggestion was to stay right where he was, find a qualified developer locally to run development (I found some good resources in the Boulder area, and other members in the area offered their help), and go from there.

So, where should you locate your startup? (I'm assuming that your business will be technology-based.) If your customer base is concentrated in one geographic location, the answer is simple--go where your customers are. However, if your customers are spread out all over the place, should you stay put or move? It depends on who you (and your partners) are and what your expertise is. If you have the experience to develop at least a portion of the product or service yourself, and you're comfortable managing a development team, you can locate wherever you're comfortable and where you can find the other business and technical resources you'll need.


If you're not a developer or engineer, you need to have at least one person on your team who can run development. That person should be a full member of the team, not a contractor or consultant. No matter how good or committed a contractor is, they're always thinking about the next client and the next project. The work is always done better when the person who does it has skin in the game. It may not be as hard to find that person as you think. If you live in or near a major city, there are always developers who might be interested, or who might know someone qualified who would be interested. Search on Google with your city's name and terms like "startup" and "venture" to find local groups and events where like-minded people congregate, or use a service such as Meetup.com.

That's fine, you say, but why not move to Silicon Valley? I spent more than 25 years living and working there; I consider it my home. You'll find experts in just about every skill set you can imagine. I love the weather; not too hot, not too cold, and you're no more than a few hours from the beach or the mountains. Now for the downside: Silicon Valley is an incredibly expensive place to live, work and run a business. I would easily have to pay 50% more than I pay now for a condo comparable to the one I rent in a suburb of Chicago. If I wanted to buy a home, I'd pay at least three times as much in Silicon Valley for a home with comparable square footage and yard space. Taxes are very high, yet the quality of schools is poor, and parents pay big premiums to live in cities that have good schools, such as Los Gatos and Palo Alto.

Just about everything else is more expensive as well: Food, gasoline, utilities and so on. Office space is much more expensive. People have to earn more money in order to have a decent quality of life, so salaries are much higher. It all adds up to a much higher burn rate than in other, less expensive places to live.

I've never understood why venture capitalists push their investments to move from lower-cost areas such as Texas and Chicago to Silicon Valley. Yes, investors can keep closer tabs on their investments if they can drive over to them, but airfare is truly not that expensive, and teleconferencing is effectively free. If I could run my business successfully at a 30% to 50% lower burn rate simply by staying right where I am, why would I move?
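
To make the burn-rate argument concrete, here's a minimal sketch with hypothetical numbers; the salary and rent multipliers are illustrative assumptions, not survey data:

    # Hypothetical monthly burn for a five-person startup: a lower-cost
    # city versus Silicon Valley. All figures are illustrative.
    HEADCOUNT = 5
    BASE_SALARY = 8_000              # fully loaded cost per person, per month
    BASE_RENT = 3_000                # office rent per month

    chicago_burn = HEADCOUNT * BASE_SALARY + BASE_RENT
    valley_burn = HEADCOUNT * BASE_SALARY * 1.4 + BASE_RENT * 2.0

    seed_round = 500_000             # hypothetical cash in the bank
    for city, burn in (("Chicago", chicago_burn), ("Valley", valley_burn)):
        print(f"{city}: ${burn:,.0f}/month, {seed_round / burn:.1f} months of runway")
    # Chicago: $43,000/month, 11.6 months of runway
    # Valley:  $62,000/month,  8.1 months of runway
    # Staying put cuts the burn by roughly 30% in this example.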



Monday, December 14, 2009

The "Google Phone" (HTC Nexus One) Begins to Make Sense

A short time ago, Engadget posted part of the FCC certification for the HTC Nexus One, the phone that Google mass-distributed to its employees last weekend. There's been a lot of speculation that the GSM-compatible phone would be sold unlocked by Google (meaning that in the U.S., it would work with AT&T and T-Mobile.) Frankly, a lot of the story didn't make sense--why would Google start competing with its biggest distributors just as Android started getting market traction?

The FCC certification shows that the Nexus One will work on a variety of international GSM networks, but in the U.S. it will only get 3G service on T-Mobile--AT&T customers can use it as a phone, but data speeds will be limited to EDGE. And now the story begins to make sense. T-Mobile has been Google's primary partner in the U.S. since the launch of the first Android phone, the G1.

So, here's my speculation: Google is going to sell the phone, and technically, it will work on either T-Mobile or AT&T, but there will be a special T-Mobile account just for the Google Phone. It will be based on T-Mobile's pay-as-you-go pricing models, and it can be considerably less expensive than T-Mobile's contract plans because T-Mobile isn't subsidizing the price of the phone.

Google will, in my opinion, subsidize the price of the phone, because the user will be locked into a suite of advertising-supported Google functions that work anywhere, even on WiFi, and even if the Nexus One doesn't have any GSM SIM card at all. (Yes, that means that Google Phone users will be able to take advantage of Google Voice wherever there's an open WiFi hotspot.)

T-Mobile won't be threatened by the Google Phone, because they'll be the preferred broadband voice and data service. Verizon won't be threatened, because the T-Mobile 3G network is even less well built out than AT&T's. Sprint has a foot in just about every camp, and they're becoming less of a market factor every day. AT&T is hostile to Android, so there's no reason for Google to play nice with them. Perhaps most importantly, Google has a chance to dramatically increase market penetration of Android phones and the appeal of the Android platform to developers, and they'll move a lot more mobile advertising inventory.

Wednesday, December 09, 2009

The inside scoop on buying market research reports

Whether you're a marketer, product manager or investor, you can't help but see articles about newly-released market research reports in the trade press, on websites and in blogs. These reports could be a gold mine of information about your customers, vendors, competition or the market in general. However, there are lots of reports and publishers to choose from. Should you consider them at all, and if you do, how do you choose?

I earned my living for a few years writing these market research reports (sometimes called "syndicated" reports, because they're not sponsored by or paid for by a single company.) In that time, I learned a lot about how the business works, what to look for and what to avoid.

The first rule is that syndicated research will never replace talking to your current and potential customers directly. You can often learn more from your customers, for less money, by talking to them yourself. However, if you need research that would be too expensive or too impractical to gather yourself, or if you're starting a new business and you don't have a customer base to talk to, syndicated research may make sense.

You may be able to find the information you need for free. If you're looking for demographic and economic data and forecasts, national and state governments publish a wealth of information that's almost always available for free, online or at a local library. For U.S. statistics, USA.gov is a great place to start.

Just about every research company sends out press releases with valuable statistics and findings when they release a new report. A table of contents is also usually available from the company's website, which will give you a good idea of what the report actually covers.

If the report is newsworthy enough that the researcher who wrote it is interviewed or quoted in an article, they usually reveal additional facts not included in the press release. By starting with the press release (which you can get from the research company's website, PR Newswire, Business Wire, Google, Bing and many other sources) and then searching for other mentions of the researcher/analyst or research company, you can start fitting together a picture of the report like pieces of a puzzle. Even if you can't get all the answers you need this way, this technique is a great way to narrow down the list of reports that might give you what you're looking for.

Let's assume that you've now got a list of two or three potential reports that might serve your needs. How do you choose? Here are a few guidelines:
  • It's common to look at the size and price of the report, and go for the "best value": A 300-page report priced at $1,500 costs $5 per page, while a 60-page report priced at $600 costs $10 per page. What matters is the quality of the information and how applicable it is to your needs, not the page count or the cost per page. Many reports are padded out with duplicate information, boilerplate tables and data that you could find by yourself with a few hours of work.

  • Most research publishers list biographies of their analysts and managers on their website. Some research firms hire freelancers to write reports, and their information may not be as readily available. Try to find out who actually wrote the report you're interested in. Search for them on Google, Bing, LinkedIn and other sites, and see how much experience they actually have in the area they're researching. It's not uncommon for reports to be written by researchers fresh out of college, or who came from a completely different market area. That doesn't necessarily disqualify either the researcher or the report, but it should definitely be a question mark.

  • The age of a report is very important. Research reports generally have short shelf lives, especially in industries and markets that are changing rapidly. Even in markets where change historically has happened much more slowly, it only takes one major disruption to invalidate a shelf of reports. A report written in 2007 forecasting the U.S. automobile market for the next five years would look like a description of an alternate universe today. I'd be skeptical of the accuracy and relevance of any market research report that's more than two years old.

  • If the report that you're interested in is part of a series, see if the publisher will send you a back copy for evaluation before you buy. A report that's a few years old shouldn't be used for planning, but it will tell you a lot about the kinds of information that you can expect from the current edition. Ask the publisher if the researcher/analyst, methodology or organization of the current report has changed since the old version was released. High turnover of researchers and analysts can be a red flag.

  • Ask the research company for a current list of clients. Most companies pad their lists with clients they had a few years ago, but have since left. Compare the current client list with the list on their website. If there are a lot of names that are on the website list but not on the current client list, that's another question mark--why did those clients stop buying reports or services?

  • Are the research company's clients mainly product/service vendors or end users? Research publishers target their reports to their primary customer base. End users generally demand greater accuracy and impartiality, while vendors are interested in making their products look better, and both their sales and the overall market opportunity look larger, than they may actually be.

  • If the report you're interested in includes forecasts, it's important to get previous years' forecasts from the publisher and compare them with what actually happened (a quick way to run that check is sketched after this list). If a research company primarily sells to vendors, their forecasts are likely to be higher than what actually happened. Optimistic research reports sell; pessimistic ones (generally) don't.

  • The market research company should be willing to share its methodology with you. How many end users or vendors did they interview, how did they conduct the interviews and gather the data, and how did they compile the data? They may not be willing to share everything with you for proprietary reasons, but they should be able to answer the questions above.

  • The size of the research firm is not a good indicator of the quality of the report. Bigger firms don't necessarily do better work. One major firm recently released a study that purported to tell end users which "white box" video content management services vendor would be right for their needs, but they cherry-picked six vendors for comparison, then were forced to issue clarifications and explanations as to why they limited their report to those six vendors. A few weeks later, the vendor who was picked as number one in the report did a webinar with the research company to sell its services. Whether or not there was a business relationship between the winning vendor and the research company before the report was released is immaterial; by partnering up to promote the findings of a clearly flawed study, both the research company and the vendor were tainted. (By the way, the report is still on sale for $1,750. Caveat emptor.)
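
On the forecast point above, here's a minimal sketch of the comparison; the forecast and actual figures are made up for illustration:

    # Compare a publisher's past forecasts with what actually happened.
    # All figures below are invented for illustration.
    forecasts = {2006: 120, 2007: 150, 2008: 185}  # forecast market size, $M
    actuals = {2006: 110, 2007: 125, 2008: 130}    # actual market size, $M

    for year in sorted(forecasts):
        error = (forecasts[year] - actuals[year]) / actuals[year]
        print(f"{year}: forecast {forecasts[year]}, actual {actuals[year]}, error {error:+.0%}")
    # 2006: +9%   2007: +20%   2008: +42%
    # A consistently positive, growing bias is what you'd expect from a
    # publisher that sells mostly to vendors.
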
In an economy where every dollar counts, don't waste your money on outdated or inaccurate syndicated research. Do your homework, and you might find what you need for free. When you need to purchase a report, spend your money wisely.

Tuesday, December 08, 2009

What does "Pivoting" really mean?

Lean Startup and Customer Development are techniques/processes being used by a lot of startups, especially software and services companies. A term commonly used in both techniques is "pivoting". It means that the company changed direction--it was developing a floor wax, which no one wanted, so it "pivoted" to develop a dessert topping. Gratuitous SNL reference aside, what usually happens is that the company developed a product with one feature set, then learned that what customers really wanted was a different feature set. In forums on the Web, I often read about companies that pivoted, sometimes three or four times.

What pivoting really means is "We got it wrong." Don't take this the wrong way--I've gotten it wrong many times in my career--but often, a company could have avoided pivoting if it had done more homework upfront. So, how do things go wrong?
  • The team understands technology but not the market: They spot what looks like an opportunity, but they don't really understand the domain all that well, so they define a product or service that looks good to them but not to their target customers.
  • They talk to customers but don't listen: Even when a startup sets out to talk to customers, they dismiss negative feedback--perhaps their product is "too advanced" for the customers they're talking to, or the problems that customers are expressing aren't really problems, or frankly, their customers are stupid.
  • They don't ask the right questions, or they don't ask them in the right way: Large and small companies alike distribute huge, complex, poorly organized surveys that scare off respondents, get low response rates, and end up with samples too small to be representative of their target customer base (the sketch after this list shows how fast the margin of error grows as samples shrink). Or, they use focus groups, which have their own set of risks and are often used the wrong way (the company wants projectable results when what it actually gets are impressions from a small subset of customers).
  • They copy what competitors are doing: The team starts with an existing product or service and then does a variation--cheaper, faster, or more features. That assumes, however, that the competitive benchmark is actually successful or has features worth replicating. It may turn out that the competitive benchmark isn't successful, and the company ends up replicating a failure.
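
On the sample-size point above, a minimal sketch of the standard margin-of-error calculation for a survey proportion shows why small samples aren't projectable (the sample sizes are illustrative):

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """95%-confidence margin of error for a proportion p from n responses."""
        return z * math.sqrt(p * (1 - p) / n)

    for n in (30, 100, 400, 1000):
        print(f"n = {n:4d}: +/- {margin_of_error(n):.1%}")
    # n =   30: +/- 17.9%  -- too coarse to tell 40% from 60%
    # n =  100: +/- 9.8%
    # n =  400: +/- 4.9%
    # n = 1000: +/- 3.1%
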
The solution is to understand more, earlier in the process. Whether that means talking to more customers, asking the right questions in the right way, doing more secondary research or bringing a domain specialist into the team, knowing more upfront is likely to result in a better product/market fit, fewer pivots and faster revenue growth and profitability. None of this invalidates creating Minimum Viable Products or an iterative product and customer development process. It just means spending more time to ensure that you're pursuing a real opportunity and not a mirage.

Friday, December 04, 2009

Comcast/NBC Universal: It's Not AOL Time Warner

Comcast's acquisition of 51% of NBC Universal from GE has been derided by some observers as the second coming of the AOL-Time Warner deal--two big media companies merging with few real synergies. On the contrary, I think that it's a very good deal for both companies--but it's not without risks.

AOL was "circling the drain" before the merger with Time Warner--subscriptions rates were flattening out, churn was increasing, as were subscriber acquisition costs. The company was hard-pressed to find growth, so it instead engineered one of the dumbest mergers in U.S. history, getting one of the biggest media companies in the world to essentially give itself to AOL. (Let's be clear...the merger was dumb for Time Warner but brilliant for AOL.)

By contrast, NBC Universal is in far better shape than AOL was. NBC's broadcast network is a mess, and the Universal movie studio is questionable (as it's been ever since MCA was acquired by Panasonic years ago), but its cable networks are generally strong, well-run and profitable. It's the cable networks that formed the primary reason for Comcast's interest.

The FCC is almost certainly going to require Comcast to either divest NBC's owned-and-operated television stations in markets where Comcast has cable systems (in Chicago, Philadelphia and Washington, D.C., among other cities) or its cable systems in those same markets. I suspect that it's the television stations rather than the cable systems that will be sold off.

Antitrust arguments against the merger are going to be a lot harder to make; for years, Time Warner owned Time Warner Cable (the second-largest cable operator), a movie studio and a collection of cable networks at least as powerful as those of the Comcast/NBC Universal combination without running afoul of antitrust regulators. Comcast has already pledged to make NBC Universal's cable networks available to competitors. The deal is likely to get done without major concessions beyond those required by the FCC.

The NBC television network can be fixed; it fell from first to fourth place in little more than a year, and one or two years of strong program development could turn things around. (To do so, however, Comcast will have to get Jeff Zucker and his cronies away from the network and install a new programming team.) Universal is a bigger problem, in that Comcast will be its sixth owner in less than 20 years, and no one in that time has figured out how to return the studio to success. The solution may be to sell off Universal in parts, keeping its library and selling off the ongoing studio operations.

NBC Universal's digital assets have been called a key reason for the deal, but I think that they're clearly the tail in this deal, not the dog. The most important digital asset is Hulu, but NBC Universal is a minority owner. Comcast will get a seat at the table, and Hulu will get to play in the TV Everywhere initiative, but it's not going to negate News Corporation's and Disney's interests.

I've learned from my own sources that Comcast is working on its own low-cost, Roku-style set-top box to make its Xfinity service available on television sets without having to replace millions of existing set-top boxes. This could become the "official" mechanism through which Hulu will get to television sets.

In short, this deal makes sense for both Comcast and GE: Comcast gets control of a treasure trove of content, decreases its costs for distributing some of the most popular cable channels (they become internal transfer costs instead of outright expenses) and gets partial ownership of the Internet video distributor that poses the biggest risk to cable operators. GE gets out of the entertainment business without taking a financial bath, and can focus on industrial, medical and financial areas. The merger will almost certainly go through.


Sunday, November 22, 2009

Cable Networks 2.0 (or 3.0)

The cable network model that we all know, which was based on the broadcast television model that we all know, is this: A centralized organization acquires programming, schedules it and distributes it to affiliates (broadcasters) or cable operators. The network produces some of its own programming (or most of it, if it's a news or sports channel), but it acts primarily as an aggregator, scheduler and distributor.

It was a wonderful model for 1925, or 1949, but it's completely obsolete today. It was based on the technical limitations of the dawn of the radio and television eras, limitations that no longer exist. It was effectively impossible to have a two-way conversation between media creators and consumers prior to the Internet and broadband speeds. Now, we've got the means for that two- (or N-) way dialog. The cost of production and distribution is a tiny fraction of what it was even thirty years ago, which was in turn far less expensive than what was being done in the 1960s. YouTube...well, you know all about YouTube, and Vimeo, and Dailymotion, and, and...

My point is that the one-way, centralized network model is obsolete. I don't believe that a new, one-way network will be successful. Future cable networks will have to bake an open, two-way model into their architecture from the very beginning. What does that mean?

It means that the network becomes more of a curator than an all-powerful programmer. It selects and makes available content from external producers, internal teams and viewer/producers (since viewers can now easily be their own producers). It also enables viewers to curate their own programming and make their own selections.

The production process will become far more distributed. Viewers with a few thousand dollars and a high degree of patience can create content that looks as good as anything seen on broadcast television or cable. Field production is simple; it's done thousands of times a day. Studios can be built and sent anywhere. A shipping container can be turned into a perfectly functional television studio. Put it on a fast-and-dirty foundation and you've got a permanent studio. If you want room for an audience, there are a lot of older movie theaters out there being underutilized or gathering dust. Extend the stage, put in LED and fluorescent lights to keep the heating load down, and voila, instant studio!

You may argue that this model has already been tried, at Current, and it hasn't worked very well: Current TV just laid off 80 staffers, shut down production on some shows, and is consolidating two Los Angeles facilities into one. However, the problem wasn't with the production model, it was with trying to fit that model into a conventional cable/broadcast channel. The most-watched shows on Current have been InfoMania and SuperNews: Fairly conventional (from a structural point of view) 30-minute productions that viewers can find easily and that are repeated many times during the week. The bulk of Current's airday has been taken up with brief, four-to-eight minute videos, many of which are submitted by viewers. The problem is that it's been impossible to know exactly what's going to be on when. If you happen to tune in when they're showing a video that's engaging, you're likely to stick around for a while, but if you don't like what you see when you first tune in, you're unlikely to wait around for something better.

The problem with Current TV is that it's programmed from the top down, just like any other cable network, even though viewers contribute a lot of content. Current also has a web presence that allows a more egalitarian approach to programming (in other words, watch what you want, when you want), but with serious limitations: Cable operators prohibit Current.com from running its on-air feed live, or from making programs available prior to their airdates.

That brings me to my last point: The cable network of the future will reside primarily on the Internet, not on cable. So long as the cable operators can dictate terms of when and where programming can be shown, no cable network can become a truly two-way operation. That's why Current is struggling, and why Hulu is only a shadow of what it could be.

In the future, the cable network will be equivalent to the "curated feed", but the open ecosystem will reside on the Internet.


OWN: DOA?

Last week, Oprah Winfrey tearfully told viewers that she will end her run on syndicated television in 2011. She said that the reason she's leaving is that 25 years are enough, but plenty of industry scuttlebutt contradicts her. Discovery Networks has been working with her on a new cable network, the Oprah Winfrey Network (OWN), which will replace Discovery Health in 2011. The launch of OWN has already been delayed twice, and there's been a revolving door in the management suite, with most executives lasting only a few months.

The oft-repeated rumor is that David Zaslav, President & CEO of Discovery, read her the riot act: Bring her show to OWN and help to get the operation under control, or lose the network. The first shoe dropped a couple of weeks ago, when she sent some of her top managers to take over key positions in OWN, and the second shoe dropped last Friday, with her announcement that she'll be leaving her syndicated show. Zaslav's ultimatum may not have been THE reason why she's leaving her syndicated show, but it's a pretty good reason nonetheless.

The question is, what is she going to? Women's networks on cable have had an uphill struggle: Oxygen, in which Winfrey was a partner, launched with high audience expectations that were never met, and ended up being sold to NBC Universal. Will OWN fare any better? Unless it pursues a radically different model than that of today's cable networks, it won't. That's the topic of my next entry.

Saturday, November 21, 2009

One Reason Why the Movie Business Is In Bad Shape

Earlier this week, Peter Sciretta of Slashfilm and Jason Kottke of kottke.org reported that only two of the top 30 films of the decade were based on original material: Disney/Pixar's "Finding Nemo" and DreamWorks Animation's "Kung Fu Panda". In fact, only nine of the top 50 films were based on original material, and five of them were from Pixar! Everything else was based on an existing motion picture, novel, comic book, or in the case of the "Pirates of the Caribbean" series, a theme park ride!

The movie business has been on a tear, with ever-increasing budgets fueled by DVD revenues. To mitigate the risk of failure for those stratospherically-budgeted movies, studios produce films based on known properties. Original works get crowded out or pushed to the studios' "independent" arms, which operate on relatively small budgets and get very little promotional support unless a movie has Academy Award potential.

The result is like never eating a meal that you haven't already eaten. Yet Pixar, which has only created one sequel in the history of the studio, has the best track record of any studio in terms of average revenue per movie. Is the lesson here to create fewer but better, more original, movies? Something to think about.


Tuesday, November 17, 2009

Disintermediation, again

Kim Masters was on NPR's "Morning Edition" this morning talking about the concerns that motion picture exhibitors (theater owners) have about the movie studios' plans to change their "release windows". Release windows are the order in which movies are released to different channels, and how long each channel has exclusivity. The issue is a seemingly innocuous request made by the studios to the FCC for "selectable output control" on set-top boxes, Blu-Ray players and other devices. Selectable output control would allow the studios to control whether, when and how much a movie or other video program could be played on a compatible device.

Consumer groups and consumer electronics vendors oppose selectable output control because the studios could use it to prevent their content from being recorded on DVRs and other devices. Now, the National Association of Theater Owners (NATO) has filed opposition to the studios' request because they fear that the studios will use selectable output control to make movies available in the home at the same time that they're in theaters.

I don't support selectable output control because it takes away consumer choice and negates thirty years of progress in consumer electronics since the Betamax decision, but the theater owners' opposition to the studios is more an effort to hold back the ocean than a friendly, consumer-oriented action. Extremely few movies make money in theaters today; theatrical distribution is most valuable for promoting films for future sale as DVDs and Blu-Ray discs. As the sales of DVDs erode and Blu-Ray fails to pick up the slack, the studios are forced to look at online digital distribution as a viable alternative. However, for digital distribution to generate the kind of revenue that physical media does, the movies have to be available much sooner, and that means cutting into the theaters' release window.

Whether or not the studios get selectable output control, theaters' release windows are going to erode; it's just a matter of when and how. When theater patrons are forced to go through metal detectors and hand over their cellphones before going in to watch a movie (as shown on a recent edition of CBS's "60 Minutes") in order to prevent piracy, theaters are not long for this world.

The studios are looking for any and every way to increase revenues, including cutting out the middleman, even if the middlemen are movie theaters. The same thing is happening with broadcast television. Comcast is close to buying 51% of NBC Universal from General Electric, which will give it control of the company. NBC has done such a superb job of running its broadcast network into the ground that it may become the first broadcast network to become a cable network. CBS and ABC are beginning to demand a cut of the retransmission payments that cable operators have to pay broadcast stations for the right to transmit their programming. Those retransmission payments have become the only thing keeping some stations on the air in this recession. Add to that the practice of some networks demanding "reverse compensation" from stations: Instead of paying stations to carry the networks' programming, the networks demand that the stations pay for the right to carry the programming.

Network television affiliates are an endangered species, because the networks can make more money, at lower cost, by dealing directly with the cable and satellite operators. For Comcast, the deal becomes almost a no-brainer, since it would control both the network and the cable systems. It will have to sell off the NBC owned-and-operated stations in cities where it has cable systems; the next step would be to take NBC to cable in city after city as network affiliate agreements expire.

Fifteen years ago, we were talking about disintermediation in retail and wholesale distribution brought about by the Internet; now we're talking about it again, this time in media. The future of theatrical motion picture exhibition and free broadcast television hang in the balance.

Update, December 5, 2009: According to the December 2nd edition of the Chicago Sun-Times, a patron at the Muvico Theater in Rosemont, IL was arrested and spent two nights in jail for videotaping four minutes of "Twilight: New Moon." She claims that she was actually videotaping her sister's birthday party at the theater, and the video on her camera (a still camera that records video segments) supports her contention. Nevertheless, the theater's managers pressed charges against her under a little-used law designed to punish film bootlegging. She faces up to three years in prison. I've lost whatever sympathy I still had for theater operators after this mind-boggling incident.



Sunday, November 08, 2009

Get Your Products Out

The new generation of high-quality, video-enabled DSLRs is thoroughly changing both the high-end still camera and camcorder markets. It's now inconceivable for a manufacturer to release a $1,000+ DSLR without some sort of HD video capability. Even Sony, the lone holdout, is rumored to be biting the bullet on November 18 with an HD-capable DSLR that will compete with the Panasonic GH1. (Technically, the GH1 isn't a DSLR; it's a Micro Four Thirds camera without a mirror or optical viewfinder, but it does everything that a DSLR does.)

The ironic thing is that RED, the company that made cinematic video production much more affordable with the RED One, first identified the need for print photojournalists to be able to shoot competent video without having to carry two cameras. It rechanneled its development effort for the Scarlet, which was originally supposed to be an inexpensive, handheld 2K camcorder, into a video-capable DSLR. That was almost two years ago, and not only has RED not yet shipped the Scarlet, it hasn't even provided a definitive feature list or release date.

When the Scarlet comes to market, it will have to compete with a variety of products from virtually every major DSLR manufacturer, at price points ranging from around $1,000 to over $5,000, with a huge range of capabilities. While Nikon, Canon, Panasonic, etc. weren't educated about the market opportunity solely by RED, it didn't do RED any good to tell its competitors so early about what it was doing. In my opinion, the early announcement was sheer hubris: "We beat you with the RED One, and we'll beat you again with the Scarlet."

Once you make a product announcement, you have to get the product to market quickly. You cannot assume that your competitors are too slow or too dimwitted to respond. The first time around, competitors took RED for granted because they were a new company, run by someone from outside the broadcast electronics business. Lots of companies like that had announced products, perhaps even shipped a few, and then sank beneath the waves. But RED was for real, and its competitors learned to pay attention.

If you're a new entrant into a market, you usually get one free pass where your competitors underestimate or dismiss you. Once you become successful, you're on their radar, and the requirement to get to market quickly becomes paramount.

Wednesday, November 04, 2009

People are Buying Blu-Ray Players, But Not for Blu-Ray

Regular readers of this blog (okay, I admit that there are no regular readers of this blog) know that I've been skeptical about Blu-Ray. It took far too long to resolve the Blu-Ray/HD DVD battle, and then to actually get Blu-Ray players to market at a reasonable price. However, I'm willing to admit that I was wrong. People are willing to buy Blu-Ray players...just not to play Blu-Ray discs.

The Blu-Con conference was held this week in Beverly Hills, and if there was one overriding theme, it was that the motion picture studios are really, really desperate. DVD sales were down more than 13% in the third quarter, and the studios depend on the profits from DVDs to underwrite the cost of producing blockbusters. As DVD sales drop, film financing gets riskier. We may be heading back into an era when a single bomb can sink a studio, as the movie "Cleopatra" almost did to 20th Century Fox in the 1960s.

As DVD sales are dropping, Blu-Ray sales are increasing, but at nowhere near the rate needed to compensate for DVD's decline. However, sales of Blu-Ray players are growing proportionally much faster than sales of Blu-Ray movies. Why? The biggest reason is that the prices of the least expensive Blu-Ray players now overlap the high end of DVD player prices--around $99. At that price, why not buy a Blu-Ray player, which can also play DVDs?

Another key reason, and the biggest motivator for sales of Blu-Ray players in the $200 range, is Internet connectivity. The studios thought that the Internet connections on Blu-Ray players would be used for games, chatrooms and other content connected with Blu-Ray movies, but that's not been the case. The biggest use for the Internet connections is to play online movies from Netflix, Amazon.com, CinemaNow and Vudu, and Internet videos from sites like YouTube. The Blu-Ray player manufacturers are in a race to add more and more online services, and retailers are racing to drop prices in time for the Christmas season.

So, are Blu-Ray players going to have a big Christmas? Yes, but I suspect that the movie studios won't be so lucky. The very Blu-Ray players on which they've been pinning their salvation have turned into Trojan Horses, bringing streaming movies right along with them. It's ironic that the success of Blu-Ray players is no longer in serious doubt, but the success of Blu-Ray as a medium for distributing movies is still questionable.

Friday, October 30, 2009

Put a Roku in your Cable Set-Top Box

For years, I've doubted the ability of third-party set-top boxes from companies like Apple and Roku to make much market impact. Consumers generally detest adding more boxes and more wiring to their televisions. That's why home theater-in-a-box systems have been so successful, and a big reason why TiVo, which still has the best PVR, has struggled to build a viable business selling hardware. Consumers know that they have to have a set-top box from the cable or satellite company, and they accept two other boxes: DVD players (slowly morphing into Blu-Ray players), and game consoles.

In turn, both Blu-Ray players and game consoles are morphing into Internet digital video players. Netflix's streaming movie service is integrated into many Blu-Ray players and into Microsoft's Xbox 360. Earlier this week, Netflix made its support for Sony's PlayStation 3 official, and support for Nintendo's Wii is right around the corner. However, none of these devices has anywhere near the household penetration of the ubiquitous cable or satellite set-top box.

Cable and satellite set-top boxes have always been closed, monolithic devices--they act as the gateway to the service provider's content, and nothing gets on them or through them without the service provider getting a cut of the action. Even with initiatives like Tru2Way, there's been little progress on opening up service provider STBs. Perhaps now is the time for them to do so.

Network-enabled Blu-Ray players and game consoles represent the first viable competitors to the service providers' programming hegemony in the living room. The cable and satellite operators can rail against the competition, try to keep their content suppliers from working with them, and try to limit the value of competitors' services with artificial release windows for movies--none of which is likely to work in the long run. Or, they could add network-enabled features to their own set-top boxes and make the competition irrelevant.

Consider a cable or satellite set-top box that allows subscribers to access the same content as Roku. That means Netflix, Amazon, Major League Baseball, and in the near future, YouTube, Hulu, Revision3, Mediafly and a host of other services. You may be thinking, "Netflix? Amazon? Are you out of your mind? Don't the cable and satellite operators have their own Video-on-Demand services that they've spent millions of dollars to build?" Yes, they do. But, those VOD systems have limited capacity and are extremely expensive to expand.

According to Comcast, in the first quarter of 2009, over half of its new VOD movies were available the same day as the DVDs. To limit the impact of $1-a-night services like Redbox, the movie studios are pushing to delay Netflix's and the kiosk operators' access to new titles until a month or so after the DVDs ship to retailers. The way things are going, if you want to see a movie as soon as it's out on DVD, you can buy the DVD or Blu-Ray, or watch it on cable or satellite VOD at a premium price. If you're willing to wait a month, you can watch it on Netflix.

But, under the model I'm proposing, even if it's from Netflix, you'll still watch it on your cable or satellite set-top box. The service provider will charge Netflix a small fee for access to your set-top box--perhaps pennies per title viewed or a dollar a month. The service provider will get to brand and sell advertising on the interactive program guide and menus that subscribers access in order to find titles. A similar approach would work for other content providers: The cable or satellite operator gets the right to surround the content with advertising, and possibly to even insert advertising directly into the content.
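To put rough numbers on that model (these figures are purely hypothetical, not drawn from any operator's actual economics): at 5 cents per title viewed and an average of eight Netflix titles watched per participating subscriber each month, the operator would collect 40 cents per subscriber per month--$4 million a month across 10 million subscribers--before counting a dime of the advertising sold around the guide and menus.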

Is the service provider cannibalizing itself? Yes, but it's capturing a chunk of the revenue that it's now losing, and will lose in even greater amounts in the future, from over-the-top (OTT) services that completely bypass the cable or satellite operator's set-top box.

The OTT services also give the cable and satellite operators leverage with the providers of conventional cable networks. For example, many service providers have long wanted to move ESPN to a premium sports tier, but ESPN demands a fee for every subscriber, even those who have no interest in sports and never watch an ESPN channel. OTT services like Major League Baseball's could be integrated into the service providers' offerings to make a premium sports tier more attractive, providing negotiating leverage to move ESPN's services into the same tier.

It's time for the service providers to stop trying to prevent the growth of over-the-top services like Netflix, and to start working with them.

Sunday, October 25, 2009

Can TV News Do Without Anchors?

Variety reports that local television stations across the U.S. are laying off high-paid anchors, cutting newsroom staffs, consolidating news operations from previously competing stations, replacing salaried reporters with stringers, and even making anchors operate their own teleprompters. Lost in all this is a way to cut costs while retaining qualified reporters and editors: getting rid of anchors altogether.

The role of the news anchor is an anachronism that dates back to the earliest days of television, when live feeds from reporters in the field were impossible and most news stories were either read from wire service copy or presented as film clips narrated by the anchor. Over time, anchors became the "brands" of local stations and television networks alike. The most popular anchors turned in the best ratings, and at the national level, a change in the anchor chair (Dan Rather replacing Walter Cronkite, Brian Williams replacing Tom Brokaw) was the subject of endless analysis.

Today, however, stations should consider doing away with the anchor entirely. Stories can easily be introduced and narrated by the reporters themselves, whether in the newsroom or in the field. Wire service stories and stories without video can be reported from a desk in the newsroom. More and more stations bring reporters onto the news set to introduce their own stories and answer questions posed by the anchor, so why not "hand the ball" to the reporters and let them do the whole job? Couldn't they ask each other questions?

In my opinion, anchors are becoming less and less important for drawing an audience as the anchor position itself increasingly becomes a revolving door. As the Variety story points out, the show that precedes the late local news (the "lead-in") is perhaps the single most important determinant of which news program a viewer will watch. NBC's local stations have lost a scary percentage of their late news audiences due to the weakness of The Jay Leno Show and NBC's overall prime time schedule--for example, KNBC, the Los Angeles NBC station, has lost 25% of its late news audience.

If stations are replacing high-paid anchors with younger, less-experienced substitutes in order to save money, why not experiment with getting rid of anchors altogether? Use the savings to retain more of the experienced beat reporters and editors who really form the backbone of a successful news organization.

Saturday, October 24, 2009

The Atomization of Media

Over the last fifteen years, I've worked on Internet software, streaming media, home video, telecommunications, and most recently, eBooks. Through that experience, it's become clear that the very nature of what constitutes media is changing. Singles have replaced albums as the primary way for people to purchase music. The six-minute short video, whether on YouTube or Hulu, is increasingly replacing the 30- or 60-minute television show (and those shows are increasingly looking like a collection of short videos). Newspapers are being replaced by their web equivalents, and by news aggregators like Google and Yahoo, enabling readers to go right to the topics and stories that they're interested in. 1Cast, a video aggregator that just launched, is doing exactly the same thing for television news.

I call this division of what used to be monolithic media "packages" into smaller, individually searchable and selectable chunks "atomization." All media are subject to atomization in one form or another. eBooks, my current area of focus, are especially vulnerable...but I think that's a good thing. The truth is that there are a lot of different book industries, segmented by category, subject and target age group. Reference books and textbooks are particularly ripe for atomization, as are computer and business books, and other types of instructional works. These books are rarely read front-to-back; readers "dive in" at different points to get specific pieces of information, relying on indices and tables of contents to find what they're looking for. These readers would love to have a robust search engine on top of a collection of books, in order to find the information they need quickly. Some services are providing just such a search engine; after all, that's the idea behind Google Books.

Publishers, on the other hand, sell books, not topics or paragraphs. They're resistant to the idea of atomization--after all, how do you price a topic? Their contracts with writers and third-party content suppliers (image libraries, illustrators, etc.) are written on the basis of revenues from book sales, not sales of chunks of books. Nevertheless, this is the direction that publishing is going in. Books will be "exploded" into bits and pieces, aggregated with other titles, augmented with videos, audio and animation, stored in databases and indexed by search engines. The concept of individual books may eventually go away, to be replaced with databases from publishers focused on a single subject area or category, or from aggregators that combine books from multiple publishers into a single database.
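To make the idea concrete, here's a minimal sketch in Python of what an atomized book collection might look like: each book is exploded into per-section chunks, and a simple inverted index lets readers search across chunks from many books at once. Everything here--the names, the sample titles, the toy tokenizer--is my own illustration, not any real publisher's or aggregator's system.

    import re
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Chunk:
        book: str      # source title
        section: str   # heading or topic label within the book
        text: str      # the chunk's actual content

    def atomize(book, sections):
        """Explode one book into individually addressable per-section chunks."""
        return [Chunk(book, heading, text) for heading, text in sections.items()]

    class ChunkIndex:
        """A toy inverted index over chunks drawn from many books."""
        def __init__(self):
            self.postings = defaultdict(set)  # word -> set of chunk ids
            self.chunks = []

        def add(self, chunks):
            for chunk in chunks:
                chunk_id = len(self.chunks)
                self.chunks.append(chunk)
                for word in re.findall(r"\w+", chunk.text.lower()):
                    self.postings[word].add(chunk_id)

        def search(self, query):
            """Return every chunk containing all of the query's terms."""
            terms = query.lower().split()
            if not terms:
                return []
            ids = set.intersection(*(self.postings[t] for t in terms))
            return [self.chunks[i] for i in sorted(ids)]

    # Index chunks from two hypothetical reference books, then search
    # across both--the reader never handles a whole "book" at all.
    index = ChunkIndex()
    index.add(atomize("Networking Handbook", {"DNS basics": "DNS maps names to addresses."}))
    index.add(atomize("Sysadmin Guide", {"Troubleshooting DNS": "Check DNS resolution first."}))
    for hit in index.search("DNS"):
        print(f"{hit.book} / {hit.section}: {hit.text}")

The design choice that matters is that the chunk, not the book, becomes the unit of storage and retrieval--which means pricing and rights would have to be renegotiated at the same granularity, exactly the contractual problem publishers are wrestling with.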

There are, of course, some categories that probably won't be atomized in this way. Fiction and narrative non-fiction are intended to be read front to back, beginning to end. Some publishers and distributors are experimenting with selling these titles on a serialized, per-chapter basis, much like the novel serializations in newspapers of the 1800s, but that's probably as far as atomization can go with these kinds of works.

The point is that we need to stop looking at media forms as monolithic and start asking two questions: "How can we break this into small, usable chunks?" and "How can we best monetize those chunks?"