As a venture investor, one of the most difficult questions to get my arms around is when a market will actually develop. You can read plenty of market reports from Gartner, IDC, or Forrester that predict the elusive $[x] billion market for a technology 4-6 years in the future. Nobody believes these research reports, for the most part, yet every private company that I see quotes them extensively and puts them in its slide deck. If you really think about it, though, the question of “when” a market emerges and becomes addressable has very real consequences for anyone involved in cutting-edge technology.
Why “When” Matters
There have been many articles written about the mistakes that entrepreneurs make while presenting to venture capitalists. I am going to take a different path, as others have already written volumes about how entrepreneurs get hung up on valuation, don’t pay enough attention to terms, choose the wrong syndicates, and underestimate the financing requirements of their new ventures.
I think that a much more pernicious (and, in my view, underreported) problem is the question of when a given market will develop. As I mentioned in the preface, there are plenty of analyst research reports promising multi-billion dollar markets in 3-5 years. I’d argue that being wrong by 1-2 years on when a market actually develops has some very damaging effects on private companies, a few of which I list below:
Two Years of Extra Financing – One of the most important consequences of having a market develop more slowly than predicted is that two years of revenue-free operations have a profound impact on a company’s capital requirements. That’s two more years of venture money that you will have to spend to keep the company going. To make matters worse, if you have built your sales and marketing staff around the expectation of an impending cloudburst of sales, the CEO will either have to make painful, demoralizing job cuts or suffer through a bloated burn rate while waiting for the market to mature. For venture companies, this is death. You never want to be in a position where you have to take on large amounts of capital without positive business fundamentals underneath you.
Two Years of Cost Reduction – A subtler consequence of having a market develop two years later than anticipated is that in those two years the market itself can change drastically. In the early days of a breakthrough product, a company might have to build every element of the system itself. This is especially true in the world of communications and in any kind of system design effort. What happens over time, however, is that as a market begins to look viable, suppliers emerge to take on some of that R&D. As such, new competitors will be able to enter the market without having to fund an end-to-end development effort.
Two Years Educating your Competition – Because so many start-ups and analysts want to be the first to break the story on a market, a startup and its people can spend enormous amounts of time educating the public on the need for the product and the market requirements for success. In practical terms, this means that new entrants can come to market with a firm idea of product requirements, while the pioneering startup has done all of the hard work to surface those needs.
Appearance of Being Stale – Related to the issue of burning large amounts of capital, when a startup that missed the mark on timing goes back to the venture community to raise additional capital, the company might bear the VC equivalent of the scarlet letter – the moniker of being “too early” to a critical market and being a dead deal. As a venture investor, you always have the option of investing in a number of companies in a given space and sometimes that which is new appears more attractive than that which is old.
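The financing arithmetic behind the first point above can be made concrete with a back-of-the-envelope sketch. All figures here are hypothetical, chosen only to illustrate how quickly a timing miss compounds into extra capital:

```python
# Back-of-the-envelope: extra capital consumed when a market slips.
# All numbers are hypothetical illustrations, not data from the text.

monthly_burn = 800_000   # fully loaded monthly burn: payroll, facilities, etc.
slip_months = 24         # market develops two years later than planned

extra_capital = monthly_burn * slip_months
print(f"Extra capital required: ${extra_capital:,}")  # Extra capital required: $19,200,000
```

Even at a modest burn, a two-year slip adds tens of millions in financing need, raised without the revenue traction that would normally justify it.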
So, can we identify any companies that might fit under this umbrella definition? I will submit a few cases that are debatable:
Moxi (now part of Digeo) – Moxi wanted to do it all for consumers interested in networked home entertainment. They had to do a lot of the work on their own, and it looks like they might have bitten off more than they could chew. Nonetheless, the vision of what Moxi was trying to build is clearly the direction in which home entertainment is going – federated data sharing among devices with great UI and wireless communications.
Zambeel/BlueArc – I think that both of these companies have great technology and are addressing real needs for high-end NAS users. However, I am not sure that there are enough high-end NAS consumers to support start-ups in this space. In 3-5 years we will probably see end customers demanding high-throughput NAS megaboxes.
Graviton (in the interest of full disclosure, this is an In-Q-Tel portfolio company) – When Graviton first addressed the sensor market, they took a soup-to-nuts approach to wireless sensor networking. With all of the interest in plant automation, physical security, and environmental monitoring, a company that wins in this space will no longer have to bear the R&D expense of developing all of the required components on its own.
Almost any ASP founded between 1997 and 2000 – All but the most customized, sensitive applications will eventually be delivered via an ASP model once the links and infrastructure are sufficiently reliable. Just look at UpShot and Salesforce.com – these are real businesses generating real revenue. Unfortunately, however, most of the companies in this space overestimated the willingness of the F500 to outsource their apps to startups and overestimated the rate at which the infrastructure would firm up.
Are there any “hot” investment areas that could be a bit too early? Here are some guesses:
Wireless LAN – I am one of the biggest supporters of WLAN technology. In my view, it is the most painfully obvious technology to come along since, well, the Internet itself. But now that the consumer Internet is about a decade old, we know that it takes a really long time for even obvious technologies to get adopted. I am concerned that we are overestimating the rate at which this one will be.
Utility Computing – This theme has been in the works for years. Now that some of the rudiments are in place (storage virtualization, resource virtualization, and blade computing), there is renewed excitement about this space. I am concerned that we are really in the top of the first inning here – we now have just enough functional components for end customers to begin experimenting with these concepts, most of which require pretty fundamental shifts in management and implementation paradigms.