Friday, December 30, 2011

Curation: Publishers' most important role

Book publishers perform many functions (some better than others, but that's a topic for another post). Some people believe that the most important thing that publishers do is edit manuscripts--both giving direction to the author and copyediting once the manuscript is complete. Others focus on sales and distribution--getting bookstores to carry their titles, and making co-op payments to bookstores in order to get display space at the front of their stores, along with better facings on the shelves. However, my opinion is that the single most important thing that publishers do is curation--selection of which titles to publish.

Yes, publishers often select and underwrite titles for a variety of reasons that have little to do with quality. They buy up the rights to titles from celebrities (titles that are usually ghostwritten by professional writers) and jump into hot markets with "copycat" titles, such as the endless stream of vampire-related books that followed the success of the "Twilight" series. However, they also impose basic quality standards on their writers, and they (usually) have standards about what they will and will not publish. To a knowledgeable consumer, seeing the Random House or Farrar, Straus and Giroux name on the spine (to take two examples) is a signal that they're likely to get a well-written, well-edited book that won't be a waste of their time or money.

In the new era of self-published books and eBooks, most of the functions of publishers are either irrelevant or can be farmed out. Editing, copyediting and cover/book design can be contracted out or done by an experienced writer. eBook conversion can also be done by the writer or by a contractor. Printing can be done by any of a variety of companies. Authors can distribute eBooks themselves; distributing print titles is more difficult, but it can still be done through companies such as Ingram's Lightning Source, which deals with most of the world's major booksellers.

However, the one thing that neither self-publishers nor the companies that assist them provide is curation. The writer of a book is the last person who can make an objective judgment about its quality--for better or worse, most authors are either far too hard on themselves or deeply emotionally invested in their work. Self-publishing services companies are concerned with generating as much revenue as possible from self-publishers. That means not turning away any manuscript, no matter how poorly written, so long as the author can pay for their services. About the only thing that will keep a title out of Amazon's and Barnes & Noble's self-published eBook collections is proof that it's largely or wholly plagiarized, and even that doesn't happen very often.

Amazon and Barnes & Noble (and other booksellers) claim that their customer reviews provide a curation service for customers, but the reviews can be gamed:

  • Authors can encourage their friends and acquaintances to post positive reviews, or they can pay people to do so.
  • Consumers sometimes give extremely low ratings to books because they believe that they're priced too high (often, the consumers giving the ratings have neither purchased nor read the books).
There are also book curation websites, but none of them are widely popular, and they can be gamed the same way as the eBook retailers' sites. That leaves the tasks of curation and quality control to the publishers. To the extent that publishers abandon those roles, or de-emphasize them in favor of chasing celebrity and copycat titles, they'll give away their biggest advantage over self-publishers.

Enhanced by Zemanta

Wednesday, December 28, 2011

DRM: The product that (almost) nobody wants

A few years ago, I was an industry analyst covering the IPTV (Internet Protocol Television) industry--the video delivery technology used by Verizon (FiOS) and AT&T (U-Verse) in the U.S., and by many other companies worldwide. One of the technology segments of IPTV that I tracked was Digital Rights Management (DRM). When I came on board, the retiring analyst whom I replaced warned me that the DRM vendors would probably cause me ten times as much grief as those in any other segment. He was right.

DRM is an unusual business: The companies that demand that DRM be used aren't the ones that pay for it. You can't distribute television shows or movies from any of the major television networks or studios unless you have an acceptable DRM system in place. The same is true if you want to distribute eBooks from most of the major publishers (O'Reilly is the biggest exception--in fact, O'Reilly demands that its eBooks be distributed without DRM).

The movie studios, television networks and publishers often specify which DRM systems are acceptable, but they don't pay for them. That cost is borne by cable and IPTV operators, over-the-top video distributors (such as Netflix and Amazon) and eBook distributors. For their part, cable and IPTV operators have their own conditional access systems, and a nearly foolproof way of keeping unauthorized users from getting their content--in the worst case, they can send out a truck and disconnect the pirates from their network. However, that's not good enough for the movie studios and television networks, who want to make sure that their content not only can't be viewed by the wrong people, but also can't be copied.

Over-the-top video and eBook distributors are less concerned about piracy than they are about making their services extremely easy to use, in order to stimulate sales. They already require usernames and passwords in order to download content, which helps to ensure that only those customers who are authorized to access their content can get it. They want DRM, but they don't want it to make their services hard for average consumers to use. The more hoops that consumers have to jump through in order to purchase, download and use content, the less likely it is that they'll continue purchasing from those vendors.

Apple and Amazon developed their own DRM systems, which were designed to protect content while making access as easy as possible for consumers. Most other companies don't have the ability to develop their own DRM systems, and that's where third-party vendors come in. Content distributors want the cheapest DRM systems they can get that are acceptable to their content suppliers, because DRM adds no value for the consumer (it actually subtracts value) and adds cost for distributors. The only parties that it serves are the content providers, who don't pay for the DRM systems, implement them or deal with customer complaints.

This has created a field of third-party DRM vendors who are fairly paranoid. DRM vendors regularly compete on price, but some companies have chosen other approaches. Widevine, which was acquired by Google in 2010, had several patents on its DRM technology and would threaten (and sometimes file) patent infringement lawsuits against competitors who were undercutting it on price. Widevine used the same tactics against market research and industry analyst companies that didn't report on the company the way that it wanted, or that put its competitors in a positive light.

In the case of the company I worked for, Widevine demanded that we lower the installation counts that we had compiled for some of its competitors. When we refused to do so, it threatened to file suit against us. We easily could have prevailed in any litigation (simply going public with the threat would have been enough to destroy Widevine's credibility), but the owner of my company caved in and removed Widevine's name from our report, replacing it with "Anonymous". Shortly afterward, Widevine signed a consulting contract with us, hoping to have more influence over our reporting. When a subsequent report had installation counts for competitors that Widevine disagreed with, it again threatened to file suit, and my company's owner again caved in to its demands. I demanded that the company take my name off the report, and I resigned shortly afterward because I didn't want my reputation to be sullied.

Another company, NDS (owned by News Corporation), refused to give us any numbers for its installed base, but after each report we issued, it would complain loudly that our numbers were inaccurate. When we said that we would be glad to adjust the numbers if NDS gave us installed base figures that we could confirm, the company said that it was under no obligation to give us any information. Given that NDS was unwilling to provide any evidence to support its complaints, we stuck with our numbers.

In short, DRM is a product that (almost) nobody wants, where the companies that want it don't pay for it, and most of the companies that are forced to pay for it don't really want it. That would be enough to make just about anyone a little paranoid.

Sunday, December 25, 2011

My year-end waste of time: Predictions for 2012

I've decided to participate in one of the most potentially embarrassing annual blogging rituals: Predictions for the coming year. So, for what it's worth, here are my predictions for 2012, in no particular order:

eBooks and Publishing

  • Both the European Commission's Directorate-General for Competition and the U.S. Justice Department will file suit against Apple and five of the "Big 6" trade publishers (Lagardère's Hachette publishing group, News Corporation's HarperCollins, Holtzbrinck's Macmillan, Pearson's Penguin Group and CBS' Simon & Schuster) for eBook price-fixing under the agency pricing model. Bertelsmann's Random House most likely won't be charged, because it joined in agency pricing long after the other five publishers. All the companies charged will strongly deny any conspiracy to fix prices, but they'll all eventually agree to a consent decree (and the European equivalent) before the cases go to court. The settlement will require Apple and the publishers to make cash payments for consumer damages, and the agency model will be discarded. eBook distribution will go back to the wholesale model.
  • There's also a possibility that the U.S. government and European Union will use the antitrust litigation as a lever to force the Big 6 to make their eBooks available to libraries on commercially reasonable terms. Currently, only HarperCollins and Penguin make their titles available for library lending, and both companies impose significant restrictions.
  • eBook sales in early 2012 will follow the same pattern as the last few years--there will be a huge burst of sales in January and February as millions of consumers who received eReaders and tablets as holiday gifts stock up on titles. However, the year-to-year growth rate in eBook sales will drop, due both to the increased share of eBooks as a percentage of all book sales and higher prices from the Big 6 publishers.
  • Even though the growth of eBook sales will slow, print sales will continue to decline. Independent booksellers in the U.S. won't pick up the slack from the closure of Borders, nor will they make big strides in increasing their overall share of U.S. book sales.
  • The Big 6 publishers' pricing policies will continue to encourage sales growth for smaller publishers and self-publishing authors, as consumers experiment with less-expensive titles and find that many of them are just as good as titles from the top publishers.
  • While the number of titles from medium, small and self-publishers continues to grow, the Big 6 will continue to cut back on the number of titles that they release, focusing even more on pre-sold authors and titles, series and backlist titles that are reissued with a variety of value-adds.
  • The "eSingle revolution" (short eBooks, no more than 50,000 words and typically 30,000 words or less) will grow, with more conventional book publishers offering titles. In addition, more media companies from other fields (magazines, broadcasting, cable and the web) will enter the eBook market with eSingles, either by themselves or in partnership with established book publishers.
  • $99 will become the top-end price for dedicated eReaders sold in the U.S.; someone (probably Amazon) will go to $49-$59 for an entry-level model. The ad-supported/no-ads issue will become moot, as consumers show that they're perfectly happy with a cheaper, ad-supported eReader.
  • The tablet market in 2012 will look very much the same as the market at the end of 2011: Apple will continue to dominate the high end of the market, with two lines of tablets: A new "iPad 3" (although I'm not sure that'll be its name) at the current iPad 2 prices, and the existing iPad 2, possibly with fewer storage and broadband options, at $100 or so below its current prices (for example, $399 for a 16GB model). At the low-end, a variety of tablets will compete in the $149 to $249 range, led (at least for the first few months) by Amazon. I wouldn't at all be surprised to see Barnes & Noble drop prices of both the Nook Color and Tablet by $50, to $149 and $199 respectively.

Cameras & Camcorders

  • We're almost certain to see new cinema camera models from Canon in 2012. The prototype cinema camera based on the EOS body will be launched, as well as at least one new model in the C3XX range, with improved electronics including auto-focus, auto-aperture and auto white balance and 10-bit log output. The new EOS model could be announced as early as NAB in April, and the new C3XX model is likely to be shown at IBC in September.
  • Panasonic's AG-AF100/101 is getting a little "long in the tooth", so I expect a refresh of the model in time for NAB in April. I also expect the GH3 to be announced in the first half of the year.
  • Given all of Sony's 2011 EVIL (electronic viewfinder, interchangeable lens), DSLR and camcorder announcements, I don't expect any big announcements from Sony in 2012.
  • AVCHD 2.0 (also called AVC Progressive) will become ubiquitous on all new cameras and camcorders supporting AVCHD.

Motion Pictures

  • We'll see major consolidation at the U.S. movie studios, like what we've already seen at Paramount, with even deeper cuts. Studios will become even more conservative about which titles they greenlight for production, continuing to focus on remakes, series and pre-sold titles (very much like the big publishers). This risk-minimization strategy will lead to even more box office and home video revenue declines.
  • Online movie rental services such as Netflix and Amazon will continue to increase their share of home video revenues, but what could have been a huge win for Netflix will be a much more competitive market, due to Netflix's self-inflicted wounds from 2011.
  • Studios will rethink the value of 3D given audiences' rejection of the format, and will put more effort into using 3D well on a smaller number of "event" titles. That means that 2D-to-3D conversion, which has never worked well, will go away. Studios will have to come to grips with the fact that 3D, like Blu-Ray before it, will not be their financial savior. Even well-done 3D won't save movies that audiences don't want to see.
  • UltraViolet, the "online digital locker" system supported by most of the major studios, will fail to get significant market share, although the studios won't give up on it in 2012. Consumers will find it too hard to use, not worth the effort and not a compelling reason to go back to buying DVDs and Blu-Ray discs.
  • With a handful of exceptions, independent films will reach audiences through VOD and online streaming services, not through theatrical exhibition or sales of physical media.

Friday, December 09, 2011

Rifles vs. shotguns: The GoPro advantage

The rule over the years for camera and camcorder manufacturers has been to make a model for every need and every price point. Canon, Nikon, Sony and Panasonic sell everything from inexpensive point & shoots to DSLRs. All but Nikon do the same with camcorders--prices run from around $100 for YouTube-focused models to upwards of $100,000 for digital cinema cameras.

The "model for every purpose and every pocket" approach means that, as a manufacturer, you won't miss a sale because you don't have a model that a customer can afford or use, but it has some significant downsides. One is that it's expensive to develop new camera designs, in terms of both money and time. Canon's new C300 digital cinema camera took two years to develop, and that was considered a "fast track" project that required adapting the electronics from an older camcorder design in order to meet its deadline. In addition, as development budgets get strained, it becomes necessary to "milk" designs by releasing cameras that are minor variations on each other. Not to pick on Canon again, but the T2i, 60D and T3i DSLRs are very similar to one another, with minor differences in areas such as LCD mountings and video settings.

There's another side-effect of having so many models--features are deliberately left out of some lower-priced models in order to avoid cannibalizing sales of more-expensive ones. Sony is famous for this; for example, a big reason that the FS100 has only an HDMI output instead of HD-SDI is to avoid cannibalizing sales of the F3 camcorder. There's no technical reason why the FS100 can't have HD-SDI--the less-expensive Panasonic AF-100 has it, and it was introduced a year before the FS100.

Some companies practice another approach--build a limited number of models (or even a single model) of camera or camcorder, with a very specific target market or application. That brings us to GoPro, a camcorder company based in Half Moon Bay, California. GoPro sells only two models: the HD Hero and the new HD Hero2. Physically, the two cameras are almost identical, but the Hero2 has improved electronics and optics. There's a $60-$70 price difference between the two models, and neither sells for more than $300. According to company founder Nick Woodman, GoPro initially built ruggedized cameras for use by surfers and skiers, but they were designed to be used by two people--one to surf or ski, and the other to shoot the action. Woodman's revelation, and the core principle behind everything that GoPro sells, is that athletes want to take video or still pictures of themselves in the act, or from their point of view. That meant that GoPro's cameras needed to be more than ruggedized--they had to be tiny, operate automatically, and be mountable just about anywhere.

GoPro sells a suite of mounting kits that allow its cameras to be mounted anywhere from the exterior of a race car to a surfboard. The company has a library of incredible footage shot underwater, on skydivers, mountain bikes, snow skis, skateboards, even as the payload for a weather balloon at the edge of space. It also has accessories to make the cameras easier to aim, extend their battery lives, transmit their video via Wi-Fi and gang two cameras together for 3D video. Yet all of it is based on the same camera design, for the same fundamental application.

I was amazed by how crowded the GoPro booth was at the NAB conference last April. This is an under-$300 camera, yet broadcast professionals were packed into the booth. GoPro's cameras are used for shooting the contestants' points of view on reality game shows, for recording experiments on Discovery's "Mythbusters", and almost anywhere danger is involved. Two thoughts went through my mind:

  • Someone is going to buy Woodman Labs, the parent of GoPro, and
  • Surely one of the big Japanese camera or camcorder makers will jump into the market.
I certainly hope that Woodman Labs isn't sold--the scariest example of what could happen is Cisco's acquisition of Pure Digital, the maker of the Flip camcorders. Before the acquisition, Flip was the leader in the market for inexpensive, simple-to-use camcorders. Earlier this year, due both to competition from smartphones and mismanagement, Cisco shut down Flip completely. Whenever a big company buys a small, focused company, it's usually the small company that suffers. As for the second possibility, a Japanese competitor could try to copy GoPro's ideas, but it would stumble over its need to be all things to all people. To build a viable competitor, you need to understand GoPro's markets and applications as well as GoPro does, and that's hard when you're also trying to build cameras for every possible market and application.

Had GoPro tried to enter the general-purpose camera or camcorder markets, it would have had its head handed to it. Instead, it dominates the point-of-view market, which it can effectively defend. There's a lesson there, not just for other small companies but for the big camera makers as well. It may be time to focus on a few markets instead of trying to compete in all of them.


Wednesday, December 07, 2011

Blackmagic Design acquires Teranex, slashes prices

Earlier today, TV Technology reported that Blackmagic Design has acquired Teranex, a digital image processing company, from Jupiter Systems for an undisclosed price. Teranex has had some excellent technology for years, especially for standards conversion, video denoising and upscaling/downscaling, but it's never been part of a company that was focused on broadcast technology. Teranex started in 1998 as a spin-off of Lockheed Martin, which invested more than $100 million in real-time video processing. Lockheed Martin, of course, was primarily focused on defense-related business, not broadcasting. In 2004, Teranex was acquired by Silicon Optix, which focused primarily on semiconductors and consumer-grade video scalers. Silicon Optix sold Teranex and most of its other products to Integrated Device Technology in October 2008, and IDT sold Teranex to Jupiter Systems, a video wall manufacturer, in June 2009. And now, 2 1/2 years later, Jupiter Systems has sold it to Blackmagic Design.

When a company has been bought and sold as many times as Teranex, it's very difficult to retain employees or to focus on long-term product plans. As a result, it's hard to know exactly what Blackmagic Design is getting. Teranex has some very interesting 3D software that enables two of its video processors to convert 2D to 3D and output 3D in a variety of formats. Combined with Blackmagic Design's ATEM production switchers, the Teranex products give the company much more extensive real-time image processing capabilities. However, many of Teranex's hardware designs are several years old, and could probably benefit from Blackmagic's abilities to redesign the products using current LSIs for lower cost and higher performance.

Update, December 14, 2011: StudioDaily reports that Blackmagic Design has slashed the price of Teranex's top-of-the-line VC100 universal frame synchronizer and format converter from $90,000 to $19,995. In addition, Blackmagic added features including dual-channel 3D support, so the VC100 no longer requires two converters to handle 3D. Existing owners of VC100s can get the new features with a $3,000 upgrade. Even without redesigned hardware, Blackmagic has managed to reduce the price by almost 80%.

In short, the acquisition is certainly a good move for Teranex, which is finally partnered with a parent company that knows what to do with its technology. Depending on how much Blackmagic Design paid and how dated Teranex's technology is, the acquisition might or might not be such a great deal for Blackmagic. We'll know more at NAB 2012, when we see the first displays of Teranex products in the Blackmagic Design booth.

Tuesday, November 22, 2011

PressBooks eBook publishing service opens its doors to the public

PressBooks, a Montreal-based startup, took its eponymous online eBook production service public today. There are lots of services and software for creating eBooks, but PressBooks has some interesting features for self-publishing authors (and even for established publishers--more on that in a minute). PressBooks is an online service built on top of WordPress. If you're familiar with the WordPress dashboard, you can jump into PressBooks right away. The service creates fully formatted EPUB and PDF files, as well as HTML online eBooks and XML documents. It's primarily designed for text-intensive trade-style eBooks; if you're planning to create picture books or heavily-formatted multi-column titles, there are better tools than PressBooks. However, you can import book covers and images into a built-in media library.

PressBooks uses a simple WYSIWYG editor for creating and editing text; text can also be edited offline and then uploaded. A small variety of templates are available for automatically formatting eBooks; PressBooks is working on more designs. Multiple authors and users can be defined, and the site can be public (anyone can read the eBook) or private (only specified users can access the site). Users can enter metadata in the Book Information section, including title and subtitle, descriptions, names of editors and translators, print and eBook ISBNs, and prices.

PressBooks has already been used for creating two commercially-published eBooks: "Book: A Futurist's Manifesto", which was edited and published by the PressBooks team for O'Reilly Media while the software was under development, and "Nine Things Successful People Do Differently" by Heidi Grant Halvorson, published by the Harvard Business Review Press.

At the present time, PressBooks is a free service, but that's likely to change once it exits beta.

Monday, November 21, 2011

A few quick presenting tips

I spent a couple of hours at an event last night where the main presenter spent about 40 minutes talking about himself and 15 minutes about the topic we were there to hear about. The information that he presented in those 15 minutes could have been reduced down to a two-paragraph blog post.

Most people who are asked to make presentations don't have a clue what they're doing. I used to believe that it was rude for audience members to check their email or browse the web during a presentation, but after sitting through years of crappy presentations, I now think that it's rude for poor, unprepared presenters to waste their audiences' time. So, in the spirit of making everyone's life easier, here are a few presenting tips from someone who's made every mistake in the book:
  • If you only had five minutes to speak, what are the points you'd want to make? Those points should become the core of your presentation.
  • It's better to focus on a few key points than it is to try to pack everything but the kitchen sink into the presentation.
  • Use PowerPoint sparingly, and put one item on each slide. Keep them simple.
  • If your presentation isn't well-organized, your audience won't understand it.
  • Keep the presentation focused on the audience, not on you.
  • Unless you're specifically doing a sales presentation, use your presentation to inform.
  • Rehearse. The first time you give the presentation should never be in front of the intended audience.
  • Arrive at the venue early enough to check out everything--make sure your laptop works with the venue's projector, your microphone works and, if you need it, you've got a network connection. Assume that nothing will work, and prepare backups.


Saturday, November 19, 2011

The publisher bypass operation

I just read an article in the latest issue of Wired about the new breed of subscription music services--companies like Spotify and MOG. The problem with these services (and, for that matter, conventional purchase services like iTunes) is that artists get a very small share of the revenue. Most artists now get the majority of their income from live performances, not music sales. Where concerts once served to promote album sales, now digital music promotes live performances.

Online video distribution has had a similar impact on movies, television and original video. Most of the revenue from movies and television shows sold or rented by Netflix, iTunes, Amazon, etc., goes to the studios and distributors, not to the original producers. Original video produced for YouTube and other services is incredibly hard to monetize; only a few series, like "The Guild" and "Easy to Assemble", have sponsorship or distribution deals that directly compensate the producers. Most original video has to make do with a trickle of advertising revenue, modest sales from iTunes, or nothing at all.

That brings us to books, where the situation for independent authors is very different. Amazon will pay as much as 70% of the revenue from sales of eBooks to self-publishing authors, and other resellers will typically pay 35%. Compare that with the typical 10% to 12% royalty on wholesale price paid by publishers, and self-publishing starts to look very appealing. Yes, the self-publisher has to pay upfront for editing and design, but many publishers recoup those costs before they start paying royalties. In addition, unless you're a top author, publishers will do little or nothing to promote your title, so you'll have to hire a publicist or do the work yourself.
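To make the royalty gap concrete, here's a rough sketch of the arithmetic. The 70% and 10% figures come from the comparison above; the 50% wholesale discount is an assumption for illustration, and actual terms vary by retailer and contract:

```python
# Back-of-the-envelope author earnings per copy. Hypothetical terms:
# 70% self-publishing revenue share, 50% wholesale discount, 10% royalty
# on the wholesale price.

def self_pub_earnings(list_price, retailer_share=0.70):
    """Author's per-copy earnings when self-publishing an eBook."""
    return list_price * retailer_share

def traditional_earnings(list_price, wholesale_discount=0.50, royalty=0.10):
    """Author's per-copy earnings under a royalty-on-wholesale contract."""
    wholesale_price = list_price * (1 - wholesale_discount)
    return wholesale_price * royalty

price = 9.99
print(f"Self-published: ${self_pub_earnings(price):.2f}")    # $6.99
print(f"Traditional:    ${traditional_earnings(price):.2f}")  # $0.50
```

On a $9.99 eBook, the self-publisher keeps roughly $6.99 per copy versus about $0.50 under the traditional contract sketched here--about a 14-to-1 difference, before the publisher's recouped costs are even considered.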

It's true that print still represents the majority of book sales, but the market is quickly shifting to eBooks. Some of the most popular titles are already selling almost as many copies of eBooks as print, and heavy book readers are adopting eBooks faster than any other group. The majority of book sales are likely to come from eBooks by the middle of this decade.

So, where does that leave publishers? Penguin, for one, is getting into the self-publishing business through its Book Country online service. In addition to charging upfront fees for formatting and designing eBooks, Book Country demands a hefty fee for distributing self-published eBooks to online bookstores--which self-publishers can do themselves. Other publishers are experimenting with "augmented" eBooks--containing audio, video and animations--which they believe are beyond the ability of self-publishers to create. There are two problems with that approach:
  1. Companies such as Vook are launching eBook creation tools that will allow self-publishers to make augmented eBooks, and
  2. Sales figures to date suggest that there's not a big market for augmented eBooks. For example, Vook's original strategy was to publish augmented eBooks itself, but the company couldn't sell enough to sustain its business, so it's now focusing on licensing its platform to others.
To be sure, publishers still provide valuable services, especially for top-tier authors--but book publishing is the first industry where creators (writers) can compete effectively with distributors (publishers). In the near future, the big publishers will likely find themselves focusing on two categories:
  1. New releases from "A-list" authors that can command high prices and sell tens of thousands of copies in print, and
  2. Milking their existing backlist for eBook reissues, bundles, and other ways of delivering "old wine in new bottles".
Some mid-tier authors may find a home with smaller specialty publishers, but almost everyone below the "A-list" will have to self-publish. We're likely to see some self-publishing authors join together in "United Artists"-like organizations to create "quasi-publishers" that perform some of the functions of existing publishers, such as design, publicity and promotion. The participating authors could serve as editors for each other.

By mid-decade, we're going to have far fewer and smaller "old-style" publishers. On the other hand, we'll have far more self-publishers and quasi-publishers that are performing most of the tasks previously done by publishers themselves. The industry power will reside with resellers such as Amazon and Barnes & Noble in the U.S., and their equivalents in other countries around the world.

Enhanced by Zemanta

Thursday, November 10, 2011

Be careful when you partner with startups

There are so many startups launching new products and services every day that it's incredibly tempting to work with them. In many cases, they offer their services for free or at a very low cost in order to get customer feedback and build a user base. In the past, the big problem with startups has been the risk of their going out of business, but that was usually visible long before the companies actually failed--for example, they couldn't find financing, their reference accounts were weak, or they were asking for investments at the same time they were trying to make the sale. Today, however, there's a new trend that magnifies the risk that successful startups will go under, and if you're not careful, they could take your business with them.

In the last two days, "talent acquisitions" have resulted in the closure of two well-regarded web startups. Talent acquisitions happen when startups are acquired not for their products but for their people. The acquirers tend to be industry giants, such as Google, Facebook, Microsoft or Apple. On Tuesday, Facebook acquired the talent running Strobe, a startup that had built a cross-platform app development system based on HTML5. Facebook didn't acquire the Strobe platform itself, and Strobe (or what was left of it) said that its service would remain available indefinitely in beta. However, with no one working on it, bugs aren't going to be fixed and new features won't be added. In other words, that parrot is definitely dead. There's still a possibility that the Strobe team may sell the software to someone else, but it's very unlikely given that the entire development team is gone. If you were developing your apps using Strobe, you're now faced with finding another development platform and, most likely, rewriting your apps to work with it.

Today, Google acquired Apture, which offered a plug-in for browsers that enabled pop-up searches for almost every word on webpages, and a JavaScript add-on that allowed multimedia content from many sites, including Wikipedia, Google and YouTube, to be integrated into pop-up windows on blogs. Apture's customers included The Economist, the Financial Times, Reuters, Scientific American and Scribd. Unlike the Facebook-Strobe deal, Google acquired all of Apture, but Google has decided to discontinue the Apture services within the next month, according to TechCrunch. The Apture development team will join the Chrome browser project.

The lesson is that you can no longer use the funding or success of a startup as an indicator that the startup will remain in business. A "successful" startup can be acquired and its services can be shut down or left in limbo by the acquirer. So, what can you do to protect yourself?
  • Be very cautious about building your product or service on top of an API offered by a startup. If anything happens to the startup or API, you may have to go into crisis mode to replace it.
  • Make sure that you have a way to export any data that you don't already have copies of, and keep a local backup.
  • Closely review the terms of service for any startup that you work with. If necessary, you should propose a revision or addendum that gives your company non-exclusive rights to continue using the service or software if the startup, or its acquirer or investors, decide to discontinue it. That may require you to host the service yourself or take possession of the software and source code from an escrow account.
  • If the startup is providing hardware that's essential for your business, you should buy sufficient additional units and/or replacement parts to meet your needs long enough to transition to other hardware.
I'm not suggesting that you shouldn't do business with startups, but you should exercise caution. A good "Plan B" is an essential insurance policy.


Saturday, November 05, 2011

"My name is Bond...Teradek Bond."

Video "uplink in a backpack" systems, pioneered by LiveU, have become very popular for use at major-market television stations for live remotes. These systems use multiple 3G/4G broadband wireless connections, as well as Wi-Fi, to send HD-quality broadcast video live from the field for streaming to the Internet, or for live broadcast. LiveU typically rents its systems for $2,500/month or leases them on an annual basis for $1,500/month; comparable systems from TVU and Streambox sell for $25,000 to $40,000 (U.S.).

Teradek, whose Cube was the first device that made live broadcast-quality Wi-Fi streaming from camcorders feasible and inexpensive, has launched a new device called Bond that shrinks the "uplink in a backpack" down to a size that fits on top of a camcorder, and a price that almost any producer can afford. The Bond is designed to be connected to a Cube, and accepts up to five 3G or 4G USB cellular modems. The Cube provides the HD/SD-SDI or HDMI video input for the Bond; some models also provide Wi-Fi output. At the station or streaming end, Sputnik, a Linux-based application, reconstructs the bonded video into a single MPEG-TS stream that can be processed with most H.264 decoders.

That's interesting, but not revolutionary: The LiveU, TVU and Streambox systems do essentially the same thing. What makes Teradek's system revolutionary is the price: The Bond's list price is $2,490 (U.S.). A Cube 250 with an HDMI interface and USB output (needed for the Bond) lists for $1,590. Sputnik is free. If you want to use an end-to-end Teradek solution, a Cube 400 decoder outputs to HDMI as well as wired Ethernet, for $1,190. That's a complete, broadcast-quality broadband ENG uplink/downlink system for $5,270. Depending on whether you rent monthly or annually, that's about two or four months' rental of a LiveU system, and about 20% of the purchase price of a TVU system. You're going to see a lot more live webcasts and broadcasts, thanks to Teradek and its Bond.
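For readers who want to check the math, here's a quick back-of-the-envelope tally of that comparison (all figures are the U.S. list prices quoted above; the variable names are mine):

```python
# Tally the end-to-end Teradek system price and compare it with
# LiveU rental rates. All figures are U.S. list prices from the text.

bond = 2490       # Teradek Bond
cube_250 = 1590   # Cube 250 encoder (HDMI interface, USB output for the Bond)
cube_400 = 1190   # Cube 400 decoder (the Sputnik software itself is free)

system_total = bond + cube_250 + cube_400
print(system_total)  # 5270

# How many months of LiveU rental the one-time purchase price equals
print(round(system_total / 2500, 1))  # 2.1 -- months at the monthly rate
print(round(system_total / 1500, 1))  # 3.5 -- months at the annual-lease rate
```

In other words, the Teradek package pays for itself in roughly two to four months versus renting.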


More thoughts on the new Canon and Red cameras

Now that the dust has cleared a bit from Thursday's announcements by Canon and Red, I've had some time to consider the new cameras, and the compromises that both companies made when developing them.

First, the Canon C300. Physically, it's a gorgeous camera--it reminds me of classic 16mm film camera designs. What I'm less impressed with are some of the compromises in the Canon design. For example, the C300 has automatic nothing--no autofocus, auto-aperture, or auto white balance. The lack of automatic controls may be good for teaching cinematography, just as when learning how to drive, it's better to start with a manual transmission. However, in the real world, cinematographers often use autofocus, especially for documentaries and sports. And why no auto white balance, when just about every other digital cinema camera has it? Another big "miss" is the lack of both a dual-link HD-SDI output and a 4:4:4 mode. Yet another questionable decision is support for 60 fps only at 720p resolution, together with a sensor that, while it's technically 4K, only outputs 1920 x 1080. The result is a weird mix: Many features of the C300 are oriented toward movie use, while the output of the camera screams "broadcast"--and yet the C300 is missing some key broadcast features.

The reason why the C300 turned out this way can be found in an interview that Larry Thorpe, Canon U.S.A.'s Senior Director, Professional Engineering and Solutions, Imaging Technologies and Communications Group, gave to Digital Photography Review. According to the interview, the C300 was a "fast-track" project inside Canon--two years from inception to product release. In order to meet the tight schedule, Canon couldn't develop new electronics for the C300, so it adapted the processor and electronics from the XF 305 camcorder for the C300. The XF 305 is a very nice camcorder, but it's not a digital cinema camera, and it's most certainly not a $20,000 camera. Most of the missing features in the C300 are due to using the XF 305's electronics.

From what Thorpe said in the interview, it's clear that the C300 is a placeholder for a broader, more functional line of digital cinema cameras coming from Canon. A few years from now, we'll probably look back at the C300 and wonder why anyone bought it, given how powerful the Canon models will be by then. For now, however, the C300 is a strange bundle--a great design, undermined by inadequate electronics.

The new Red Scarlet-X is different--it's plenty powerful enough, but it's not a Scarlet. It uses the same imager as the EPIC, except it uses it at a lower resolution and can thus utilize imagers that failed quality testing for the EPIC. It uses all the same accessories and software as the EPIC. It's the same form-factor as the EPIC. As Philip Bloom points out, it's not the Scarlet that Red's been talking about for three years, the one that was supposed to be light and cheap but still high-resolution. Perhaps the Scarlet-X should have been called the EPIC Light, and Red should have scrapped the Scarlet name. It's a niggling point--Red is going to sell lots of Scarlet-Xs--but the Scarlet-X doesn't fit into the niche that Red created for the Scarlet. Instead, it's an entry-level EPIC--perfect as a B camera to the EPIC, but not competitive in any way with the Panasonic AF101 or the Sony FS100. Speaking of which, the Canon and Red announcements have made the FS100 and AF101 look even more appealing.

Thursday, November 03, 2011

Canon, Red and Sony: It's War, I tell you! War!

First, Sony released the F3, a Super 35mm digital cinema camera for about $17,000 U.S. list price--$14,000 street price. The F3 has a 2K sensor, uses Sony's 35Mbps XDCAM EX codec, and comes with a dual-link HD-SDI interface. For an additional $4,000, it can be upgraded to 4:4:4 output and 3G-SDI.

Next, in a theater on the Paramount Pictures lot this afternoon, Canon announced its new EOS C300 digital cinema camera, based on the EOS DSLR platform, but with a new form factor and sensor. The sensor has 4K resolution, but it uses all the pixels in the sensor instead of line-skipping, and outputs a native 1920 x 1080 image. It uses Canon's 50Mbps XF codec at 4:2:2 and delivers 12 stops of dynamic range. Unlike the F3, it doesn't have options for dual-link HD-SDI, 3G-SDI or 4:4:4, but it does have Canon's Log format built-in. Two lens mounts are available: Canon's EF and Arri's industry-standard PL mount.

Unlike the Sony and most other cinema cameras and camcorders, the C300 doesn't have any automatic settings at all: No autofocus, automatic aperture, or automatic white balance. Everything is manual. That works for digital cinematography, but it's useless for "run & gun" situations, such as sports and documentaries, where the shot changes faster than most cinematographers can keep up.

The Canon, like the F3, comes with most of the essential accessories bundled, including the viewfinder, XLR audio interface, side grip, top handle, battery and charger. The list price of the C300 will be $20,000 (U.S.), and the camera will ship in January 2012.

No sooner had the ink about the C300 dried on the page--if we were still printing ink on pages--than Jim Jannard of Red was standing in front of another group in another theater in Los Angeles, introducing the Scarlet. Yep, THAT Scarlet, the one that's been announced more times than Harold Camping has predicted the Rapture. However, it's not really THAT Scarlet, the model that was supposed to cost $3,000 with a 3K 2/3" sensor and a fixed lens. The Scarlet-X that Jannard introduced has the same Mysterium-X imager as the Red EPIC, uses all the same accessories as the EPIC, and can be purchased with either an EF or PL mount.

The Scarlet-X's sensor has 5K resolution for still images, 4K at 1-25 fps, 2K at 60 fps, and 1K at 120 fps. The sensor's dynamic range is 13.5 stops, and up to 18 stops with HDRx enabled. It records REDCODE RAW at 440Mbps--almost nine times more data per second than Canon's C300. The basic Scarlet-X sells for $9,750, including the imager, an EF mount, Brain (central processor) and side mount for a Solid State Drive. Add $1,500 for a Titanium PL mount; a full configuration with viewfinder and HD-SDI output is $14,000. The Scarlet-X with the PL mount will start shipping this month, and with the EF mount will begin shipping on December 1st. Red estimates that it will take until February to fill all the existing back orders.

The Canon C300 has been seeded to a handful of cinematographers; it's not clear if anyone outside Red has used the Scarlet-X. In any case, reviews of both cameras should start showing up in a few weeks. There are now three digital cinema cameras in the $14,000 to $20,000 range, all of which can do things that required cameras of two or three times their price a year ago.

Sunday, October 30, 2011

The self-publishing conundrum

In 2002, when I was running a home video distributor, the movie "My Big Fat Greek Wedding" was a huge success. The movie cost $5 million to produce and was distributed by IFC Films, an independent distributor owned by the AMC cable channel. It grossed almost $370 million (U.S. dollars) worldwide. That film, along with "The Blair Witch Project" in 1999 and "The Passion of the Christ" in 2004, led movie producers worldwide to conclude that inexpensive, independently-distributed films could make a lot of money. The problem was that all three films were outliers--totally unrepresentative of the fate of the vast majority of inexpensive, independently-produced films, most of which never even make it into movie theaters.

A lot of money was lost by independent producers and distributors in that period, as they tried to duplicate the success of those three films. The producers of "The Blair Witch Project" even made a sequel with a much bigger budget, but it grossed only a tiny fraction of what the original film took in. A second sequel was planned but never made.

History is repeating itself in the world of self-publishing. Amanda Hocking and John Locke are both said to have sold more than a million copies of their self-published works. As a result of their success, many other authors have decided to go the self-publishing route. Unfortunately, there aren't a lot of self-publishing success stories to point to beyond Hocking and Locke. There are some self-publishers who make a nice living from a series of titles, or who have one particular title that does well, but the vast majority of self-published titles sell few, or even no, copies.

Self publishers start with several strikes against them:
  • They have to bear the expense of hiring an editor and designer or do the tasks themselves and risk releasing a book with typos, an amateurish cover and poor layout.
  • They also have to pay for a book publicist or do the work themselves.
  • Self-publishers also have to get (and pay for) ISBN numbers in order to sell their books through most retailers.
  • If they're selling print books as well as eBooks, self-publishers have very limited ways of selling to retail bookstores, and they don't have salespeople who are regularly calling on bookstore buyers.
Self-publishers can almost always make much more money "off the top" for each copy of their books sold than if they work with a publisher, but they have a lot of costs to bear that they have to pay upfront. On the other hand, self-published authors and authors working with a publisher are fairly equal when it comes to promotion. Most publishers offer little or no promotional support for new authors beyond including them in their catalogs and listing them in Bowker's databases. If you want a book tour or want to get press interviews, you're either going to have to hire a publicist or do the work yourself.

I'm a fan of self-publishing, and I don't want to discourage you from considering it. However, keep in mind that you'll have to bear a lot of costs and do a lot of work that a publisher would do for you (albeit at a high cost, especially if your book is very successful). One final point: Both Amanda Hocking and John Locke are now working with publishers: Hocking with St. Martin's Press, and Locke with Simon & Schuster.

Friday, October 28, 2011

YouTube's scattershot channel strategy

Earlier today, the New York Times reported that YouTube will launch more than 100 channels of third-party video programming. YouTube is said to be paying the producers of the new channels, which will begin launching in November and continue throughout 2012, as much as $100 million to create original programming. The producers who have signed on with YouTube range from well-known media brands such as The Wall Street Journal, The Onion, Lionsgate, Reuters, Rodale Press and the WWE, to companies that were entirely unknown until today. Celebrities such as Ashton Kutcher and Deepak Chopra are also involved.

YouTube's plan, which is intended to eventually produce 25 hours of original content each day, is to build the new channels into places where viewers will return day after day, and advertisers will be willing to pay substantially higher rates than they pay for user-generated content. It's a good idea, but YouTube is taking a very scattershot approach to implementing it.

In the past, I've written about YouTube's plans to attract more and better programming. Last December, YouTube gave $1,000 credits to 500 of its YouTube Partners. At the time, I wrote that there's not much useful that a video producer can buy for $1,000 that would make a significant improvement in their productions. YouTube would have been much better off giving $10,000 credits to 50 well-targeted producers.

I feel much the same way about YouTube's new plan. The channels selected are all over the board in terms of content, and are likely to be equally all over the board in terms of quality. Instead of starting with 100 channels, YouTube should have started with 20 or 25, and worked carefully with the producers to ensure that the quality of the channels would be high. Then, it could roll out a second wave of channels, perhaps six months down the road. By greenlighting 100 channels at the outset and rolling them out rapidly, YouTube has almost guaranteed that it will end up with a confusing mishmash of shows.

YouTube's intention is to build up a big library of compelling original programming quickly, but they're just as likely to create an assortment of channels carrying junk that would have never been produced had YouTube not committed to pay for it.

Sunday, October 16, 2011

The eSingle revolution

No, I'm not talking about online dating services or individual music tracks. eSingles are short (usually 30,000 words or less) eBooks, most often non-fiction. Amazon calls them Kindle Singles (hence the eSingles name); Apple calls them Quick Reads. There have been printed short stories and novellas almost since the birth of Gutenberg's printing press, but short non-fiction books haven't fared as well, especially in retail bookstores. There's an implicit connection in consumers' minds between the length of a book and its value: A thin book is simply worth less money than a thick one.

The manufacturing costs of books--printing and binding--are affected much less by the number of pages in a book with a given page size than they are by how many copies are printed and bound at a time. A 300-page book costs more than a 100-page book, but nowhere near three times as much. If consumers are predisposed to pay more for a bigger book, that makes the bigger book more profitable than a smaller one. The result is that publishers set word- and page-count targets for their authors that are intended to create bigger, higher-priced and more profitable books. In turn, the books are often padded out with unnecessary information--case histories, sidebars, interviews, etc., that may be interesting but that don't really contribute to the core value of the book. In other words, filler is added to increase the page count.

eBooks have changed how consumers perceive books, in that they no longer judge books by their cover (or, in this case, their size), and instead pay more attention to other factors: The author, subject and reviews. This has enabled eSingles to become viable. Seth Godin's The Domino Project, for example, publishes short non-fiction eBooks that are either primarily motivational or that focus on a single topic. Italy's 40Kbooks is bringing the same approach to fiction. Magazines and newspapers are publishing eSingles that either feature compilations of previously-published articles or new works; The U.K.'s The Guardian newspaper, for example, recently published an eSingle explaining how it broke the News International phone hacking story.

eSingles are generally less expensive than full-length eBooks--for example, they're typically priced between $1.99 and $7.99 at Amazon. Their short length means that they can be written, edited, designed and released much more quickly than full-length titles. That means that they can be more current than full-length titles from the "Big 6" trade publishers, which often take a year or more to get from submission of the manuscript into the hands of consumers as a printed book. Also, both writers and publishers can spread their risks across far more titles; a writer could potentially write and publish as many as four eSingles a year, vs. one full-length title.

Another factor in favor of eSingles, especially for self-publishing authors, is the aggressive royalty rates they can earn. Amazon, for example, pays as much as a 70% royalty on each eBook sold, depending on the price and on whether or not Amazon gets exclusivity. Given that the Big 6 typically pay only 12% to 15% of the wholesale price to authors, self-publishing authors can make significantly more per copy, even at a much lower sales price.

Here's an example:

$19.99 eBook sold on agency terms (30% to reseller) = $13.99 wholesale price
15% author's royalty (before exclusions and deductions) = $2.10 per copy

$6.99 eSingle sold exclusively through Amazon with 70% royalty = $4.89 per copy
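The comparison generalizes into a small sketch (the function names are mine; the prices, rates and "agency terms" split are the ones used in the example above):

```python
# Per-copy author earnings under the two models in the example above.
# Under "agency terms" the reseller keeps 30% of the list price, and the
# author's royalty is computed on the remaining wholesale price.

def publisher_per_copy(list_price, reseller_cut=0.30, royalty=0.15):
    """Per-copy earnings on a traditional publisher deal."""
    wholesale = list_price * (1 - reseller_cut)
    return wholesale * royalty

def self_pub_per_copy(list_price, royalty=0.70):
    """Per-copy earnings self-publishing at Amazon's 70% royalty tier."""
    return list_price * royalty

print(f"{publisher_per_copy(19.99):.2f}")  # 2.10
print(f"{self_pub_per_copy(6.99):.2f}")    # 4.89
```

Even at roughly a third of the list price, the self-published eSingle nets the author more than twice as much per copy.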

In this example, the author/self-publisher will have to pay all the costs of copyediting and design, but will still end up making substantially more money. That's why I believe that authors will gravitate to eSingles, non-conventional book publishers will embrace them, and "old line" publishers will be forced to respond.


Tuesday, October 11, 2011

Here's why the movie industry needs new revenues

AllThingsD reports that Rich Greenfield of BTIG has quantified the straits that the movie industry has found itself in. For more than a decade, DVD sales made up an ever-increasing majority of the industry's revenues. 2008's Great Recession flipped the DVD market from growth into decline, and the increasing popularity of Netflix and Redbox moved the home video market, which had transitioned from rental to purchase, back toward majority rental.

BTIG's numbers show how far the pendulum has swung. In the first half of 2010, U.S. sales of DVDs were just over $4 billion. Including Blu-Ray and electronic media, total home video sales were $4.998 billion. In the first half of this year, U.S. DVD sales were just over $3 billion--down almost 24% in one year. Blu-Ray sales, which were once seen as the great hope of the movie industry, were $810 million, up from $773 million a year earlier. The total for all physical media was less than the total for DVD alone last year. Electronic media sales increased year-over-year, but only from $260 million in 2010 to $270 million in 2011. Rental and Video-on-Demand revenues, on the other hand, increased from $3.782 billion in 2010 to $4.195 billion in 2011. The total rental market is now bigger than home video sales, and most of the rental revenues go to companies like Redbox and Netflix, not the movie studios.

This is why the movie studios are desperately trying every tactic they can think of to increase revenues, from $60 Video-on-Demand movies to UltraViolet digital copies of movies for online streaming. It's why Sony no longer wants to pay for 3D glasses, and why both movie studios and theaters are pushing 3D movies so hard. DVD sales were the lifeblood of the industry, financing ever more expensive movies and bigger promotional campaigns. With DVD revenues shrinking, studios are having to make difficult decisions, such as Paramount's recent decisions to consolidate its home video division with two other groups and to close its New York distribution office. At some point, studios are going to have to cut back movie budgets and possibly even cut the number of films they release each year.

The DVD "cash cow" is running out of milk, and there's nothing new on the horizon to replace it.

Intel surrenders in the set-top box war

NewTeeVee reports that Intel has shut down its Digital Home Group, which was working on chips such as Atom-based CE media processors for Internet-based set-top boxes, including Google TV and the Boxee Box. The Digital Home Group team has been reassigned to tablet development. Some work will continue on devices for cable and IPTV set-top boxes, such as the processor that Pace is using in the Xfinity set-top box that it developed for Comcast.

Intel was a partner in Google TV's development, and Intel's involvement was partially responsible for the failure of the first-generation product. The reason was that only Intel processors could be used in Google TV devices, and the cost of Intel's Atom processor made the price of products such as the Logitech Revue uncompetitive. The Revue was launched at $299, then dropped to $250 and now sells for $99. The price of Intel's processors wasn't the only problem, of course; poorly-designed software, inscrutable remote controls and a lack of support from content providers didn't help.

Now, processors based on ARM look like they have the upper hand in future Internet set-top box development. ARM-based processors are less expensive than Intel's Atom, and potential set-top box makers such as Samsung already manufacture them. Apple's A-series of processors is based on ARM. Now, it appears that both Google TV and Boxee will go with ARM for their next-generation products.


Thursday, October 06, 2011

A $60 Video-on-Demand Movie? It's Comcastic!

Fierce Cable reports that Comcast, and its Universal Studios subsidiary, will test releasing a movie to Video-on-Demand (VOD) just three weeks after it opens in theaters. The movie is "Tower Heist", starring Ben Stiller and Eddie Murphy, which opens November 4th, and Comcast will run the test in Atlanta and Portland, OR. And the price? $59.95, for which you get to watch the movie once.

(Update--October 13, 2011: Home Media Magazine reports that Universal and Comcast have cancelled their plans to release "Tower Heist" on VOD after two theater chains, Cinemark and National Amusements, said that they wouldn't show the movie if the companies went through with their plans.)

DirecTV already has a program in place with multiple movie studios to show VOD movies 60 days after they open in theaters for $29.95. Comcast's argument is that large families can save money by watching the movie at home instead of buying movie tickets, food and drink. My counterargument is that a family that's looking to save money will wait a few more weeks and get the movie from Redbox for $1, while a family that has to see the movie as soon as it comes out will go to the theater rather than pay $60 to watch it at home three weeks later.

The movie studios are getting more and more desperate to replace the income they're losing from the decline in DVD revenues, and Comcast is trying to fight off Netflix and Amazon by offering movies and television shows sooner than the over-the-top video providers can. However, the value proposition for a $60 VOD movie that's already been in theaters for three weeks is extremely weak.

Wednesday, October 05, 2011

Is Apple moving to an incremental product release strategy?

Yesterday's announcement of the iPhone 4S disappointed a lot of people, not just because of Tim Cook's low-key presentation. Many observers were expecting an all-new iPhone 5 with a larger display and a case design similar to the iPad 2. Instead, they got an iPhone 4 with a faster processor and better camera. As a practical matter, the functional difference between what was generally expected in the "iPhone 5" and what Apple actually delivered in the iPhone 4S isn't great. Apple is also going to sell a ton of iPhone 4S phones, regardless of what the pundits think.

The question is whether the iPhone 4S is an interim product designed to buy time in order to get a true 4G LTE model ready, or whether it represents an overall slowdown in product revisions by Apple. On the desktop, the iMac design hasn't changed much in years; the big changes have been internal, with minor cosmetic external revisions. The Mac Pro's external design has barely changed since it was introduced in 2006. There's an excellent chance that the next iPad will combine the iPad 2's physical design with a faster processor, higher-resolution display, and perhaps, a better camera.

Does it mean that Apple will slow down its product development cycles? Will Apple start releasing entirely new iPhones and iPads every 24 to 36 months? Longer product lives would certainly give Apple an opportunity to amortize tooling costs over far more units. It would also give Apple more time to develop completely new designs. After all, Apple will still be selling the iPhone 3GS when it ships the 4S, more than two years after it was first introduced. The 3GS is still a viable entry-level smartphone.

So, don't be surprised if the next iPad looks very much like the iPad 2, and if future iPad and iPhone exterior designs are changed less often. Apple may see no reason to issue an entirely new model every year, until competitors catch up.


Wednesday, September 28, 2011

Amazon and Apple's dramatically different strategies

Today's announcement of three new black & white Kindle models, plus the Kindle Fire tablet, has shaken up both Amazon's competitors and a number of industry observers. The Kindle 4 models (the Touch, Touch 3G and just plain Kindle) each set new low price points for eReaders with their functionality. The Kindle, at $79, will be a stocking-stuffer for the coming Holiday season. The Kindle Touch Wi-Fi, at $99, provides comparable functionality to Barnes & Noble's Nook for $40 less, and the Kindle Touch 3G adds always-on wireless connectivity for $149. (All these prices are for Kindles with Special Offers, or in other words, advertising. Add $30 to the Kindle and $40 to the Kindle Touch models if you don't want ads.)

The Kindle Fire tablet, at $199, is cheaper than any brand name Android tablet except Lenovo's forthcoming A1, which comes with front and back cameras and GPS, all of which Amazon leaves off. However, Amazon has built its own user interface on top of Android that's designed to make the Kindle Fire much easier to use than Google's standard Android. Amazon has also built its own browser, called Silk, that uses Amazon's cloud services to pre-render web content for better performance.

Everything about the Kindle Fire demonstrates how radically Amazon's and Apple's strategies differ. Apple uses software to sell hardware; the iTunes Store and App Store are there to increase demand for Apple's hardware. Apple makes money from software and content, but it's a drop in the bucket compared to its hardware revenues. Amazon, on the other hand, uses hardware in order to sell its goods and services. It's likely that the new Kindles, including the Kindle Fire, are all being sold at close to break-even, or possibly even at a small loss, in order to generate more sales of other goods and services. Amazon expects the Kindle Fire to stimulate sales of music, movies and television shows, as well as Amazon Prime subscriptions, just as the black & white Kindles have stimulated eBook sales.

There's not a lot of daylight between Amazon and Apple for competitors to exploit. Amazon is setting price expectations at the low end and is trying to dominate the content market; Apple dominates the developer community, and the iPad will continue to have a much bigger selection of apps than any of its competitors. Can competitors sell more expensive tablets than Amazon with a mediocre selection of content, or less expensive tablets than Apple with a mediocre selection of apps? I doubt it.

This would have been a great opportunity for a radically different tablet, like Microsoft's Courier, that could compete in a completely different market segment, but Microsoft killed the Courier, and there's nothing on the horizon from first-tier competitors that can forge its own path. In hindsight, HP was probably right to kill the TouchPad, and I wouldn't be at all surprised to see other companies rethink their entire approach to the tablet market. Cloning Apple or Amazon won't work; competitors need radically different products.

Tuesday, September 27, 2011

Who buys Hulu? Most likely, no one

Silicon Alley Insider is reporting that, now that Hulu's auction is completed, the company's owners have some hard decisions to make. Comcast, News Corporation, Disney and Providence Equity Partners were looking for well north of $2 billion for Hulu, but they didn't get it. More accurately, they got it, but not in the way they wanted.

(Update, October 13, 2011: AllThingsD has reported that Hulu's owners have called off the company's sale and will continue to manage it themselves.)

The top bidder for Hulu was Dish Network, which bid around $1.9 billion, more than either Yahoo or Amazon. Google apparently offered far more--around $4 billion--but the company wanted guaranteed access to Hulu's owners' content for much longer than the two to three years that had been offered. Without that kind of concession, the Hulu deal is really a two- to three-year non-exclusive license to its content, not an "acquisition" in any real sense.

That's why Dish, Yahoo and Amazon weren't willing to spend even $2 billion for the company. Hulu's partners could more than double Dish's bid overnight by accepting Google's terms, but I don't think they will. They believe that their content, and the investment they've made in the Hulu platform, is worth more than $1.9 billion, and they're not willing to extend longer terms, given the rate of change in the online content market. Therefore, it's most likely that they'll cancel the auction and keep Hulu themselves.