Some of mankind's greatest accomplishments have been about hiding complexity. Take water. For thousands of years, humans have been perfecting ways to let people easily grab a drink, no matter how many miles of aqueducts, how many reservoirs, or how much water treatment it takes to make that possible. And then there's electricity. Inventing the light bulb was nice. But the truly miraculous creation is the gigantic infrastructure of power plants and grids that allows consumers to merely flip a switch to turn on that light bulb.
But life isn't so easy for people who run computer networks. Chief technology officers aren't removed from the computing process. They still have to worry about the day-to-day intricacies of using technology. It's a thankless job burdened with considerable expenses. Companies around the world will spend $95 billion in 2004 just to maintain their server computers -- 80 percent more than they'll spend that year to actually buy servers.
Truth is, for all the high-tech world's advances, the simple task of assigning servers from one job to another takes a lot of grunt work. Imagine this scenario: An online promotion takes off at a retailing Web site, and its tech manager wants to assign more computing power to respond to a traffic spike of shoppers rushing to the site. "Today, it's about as easy as moving a family into a new house," says Jay Kidd, chief technology officer of Brocade Communications Systems. "It would probably take more than a day. And meanwhile, your customers are waiting."
'Something that works'
Computer execs know that's a problem. Even during the dot-com bubble, visionaries like Sun Microsystems CEO Scott McNealy and Netscape Communications co-founder Marc Andreessen -- now chairman of a small software company called Opsware -- were talking about the need to make computer networks more like utility networks such as the power grid.
While the industry has started to make progress in spots, it hasn't gone very far toward the ultimate goal of making computing equipment as easy to use as a light switch -- what has come to be known as "utility computing." "The [tech industry] needs to wise up to what's necessary to make utility computing a reality," says Andy Green, CEO of BT Global Services, a division of British Telecom. "What the customer needs is something that works."
Strange that this is taking so long in an industry so famous for its innovation. When the electricity industry was being built in the late 1800s, it took about two decades for centralized utilities to come into vogue. And by the 1920s, electricity was ubiquitous in the U.S. In comparison, the modern computer industry looks downright stunted. If you consider that it got its real start with the invention of the integrated circuit in 1958, it has taken nearly 40 years to reach a point where many people are even talking about computer utilities.
There are plenty of reasons for the slow progress. To start, no single company does everything for tech buyers. Computer makers have created servers and management software designed to let them run more easily. The same goes for makers of data-storage gear, makers of software, and even makers of computer networking equipment. "There's a holy war going on," says Jason Donahue, chief executive of software maker Meiosys, one of the many small startups trying to solve the utility computing problem. "All of these groups are developing [utility computing] technology, but there's nothing unifying it."
Certainly, making computer networks run like a utility is a daunting task. In order for utility computing to work, servers, storage gear, networking equipment, and the software that manages it all have to work like four legs of a stool. If one leg isn't cooperating with the others, the whole thing doesn't work as well as it should. Analysts say a transition to true utility computing, where tech managers or even consumers can assign the right amount of computing power to a particular task whenever they want, could take another decade or more.
Here's another scenario: A company that makes water boilers needs to find the best price on copper piping to fulfill a big order, and it needs to get the job done fast. The company starts by reassigning servers to run e-commerce software that checks into availability and price with various suppliers. That's just the start.
'Running out of time'
Those newly assigned servers have to tap into the data banks where pricing information is stored. This requires a fast connection to the computer networks of suppliers so queries don't end up in a slow-moving queue. "This all has to happen quickly, because if you can't get the parts from one supplier, you need to go somewhere else," says Clay Ryder, executive vice-president with Sageza Group, a market research company in Union City, Calif. "If you only have three days of inventory left and it takes a day to find out who's got more parts, you're running out of time."
The good news is that the work necessary for true utility computing is already being done. Computer makers like IBM (IBM ), Hewlett-Packard (HPQ ), and Sun have unveiled "virtualization" schemes that let companies manage hundreds or thousands of computers like one giant machine. Using software to manage all that gear, they can quickly shift work from machine to machine. That allows some of their corporate customers to get more out of the equipment they own. In some cases, server utilization has jumped from a crummy 15% to 60% or more.
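To make the virtualization idea concrete, here's a toy scheduler sketch (the job names, capacities, and first-fit policy are all invented for illustration, not any vendor's actual product): treat the rack as one pool of capacity and pack jobs onto as few machines as possible.

```python
# Toy illustration of virtualization: manage a rack of servers as one
# pool of capacity and pack workloads onto as few machines as needed.
# All names and numbers are hypothetical.

def assign_jobs(jobs, server_capacity, server_count):
    """Greedy first-fit: place each job on the first server with room."""
    loads = [0.0] * server_count
    placement = {}
    for name, demand in jobs:
        for i in range(server_count):
            if loads[i] + demand <= server_capacity:
                loads[i] += demand
                placement[name] = i
                break
        else:
            raise RuntimeError(f"no capacity left for {name}")
    return placement, loads

jobs = [("web", 0.30), ("db", 0.50), ("batch", 0.20), ("mail", 0.40)]
placement, loads = assign_jobs(jobs, server_capacity=1.0, server_count=10)

busy = [load for load in loads if load > 0]
print(placement)               # which server each job landed on
print(sum(busy) / len(busy))   # utilization of the servers actually in use
```

With one job per machine across the ten-server fleet, overall utilization sits at 14 percent; packed by the scheduler, the same work fits on two machines, and the servers in use average about 70 percent -- the same kind of jump the vendors report.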
Progress is also being made in the storage leg of the utility stool. Ten years ago, almost all data were stored inside servers, making info largely unavailable to other machines. Since then, most companies have created storage networks that separate the data from the servers. In addition to making the data available to more than one server, it also lets companies reduce the amount of unused space in storage drives.
More recently, companies such as EMC and Veritas Software have been working on technology and processes to help companies lower their storage costs by finding the most cost-effective way to stash info. While an e-mail may be kept on a $1 million piece of storage equipment when it's first created, it will eventually be archived on cheap tape drives. EMC and Veritas are developing ways to make sure that data is stored in the right spot -- cheap tape or pricey disk drives.
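A rough sketch of what such a tiering policy boils down to (the 90-day cutoff and tier names are assumptions made up for illustration, not EMC's or Veritas' actual logic): data starts on fast disk and migrates to cheap tape once it ages past a cutoff.

```python
# Minimal sketch of an age-based storage tiering policy.
# The 90-day cutoff and tier names are illustrative assumptions.

from datetime import date, timedelta

ARCHIVE_AFTER = timedelta(days=90)

def choose_tier(created, today):
    """Keep fresh data on pricey disk; archive anything older to cheap tape."""
    return "tape" if today - created >= ARCHIVE_AFTER else "disk"

today = date(2004, 6, 1)
objects = {
    "q2-invoice.pdf": date(2004, 5, 20),   # 12 days old
    "jan-email.msg": date(2004, 1, 5),     # about five months old
}
tiers = {name: choose_tier(created, today) for name, created in objects.items()}
print(tiers)   # {'q2-invoice.pdf': 'disk', 'jan-email.msg': 'tape'}
```

A real policy would also weigh access frequency and regulatory retention rules, but the shape is the same: a rule decides the cheapest tier that still meets the data's needs.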
More innovation is on the way. Network Appliance is looking at ways to develop a more flexible "storage grid" that would create a powerful directory to give every piece of data an ID tag of sorts. That way, it could be immediately spotted by any server in the corporate network, without human intervention. "In a true utility computing model, you're switching computers in and out all the time," says NetApp Chief Executive Daniel J. Warmenhoven. "So you want every data object to be accessible by every [server] -- just like you want your browser to be able to get to everything on the Web."
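One simplified way to picture that kind of directory (an assumption for illustration, not NetApp's actual design) is a content-addressed index: each object's ID tag is derived from its bytes, and any server can resolve the tag without knowing which box holds the data.

```python
# Toy content-addressed directory: every data object gets an ID tag
# (here, a SHA-1 hash of its contents), and a shared index maps tags
# to locations so any server can find any object. A simplified
# illustration, not NetApp's actual storage-grid design.

import hashlib

directory = {}   # ID tag -> (location, data)

def store(data, location):
    tag = hashlib.sha1(data).hexdigest()
    directory[tag] = (location, data)
    return tag

def fetch(tag):
    """Any server can resolve a tag -- no human intervention needed."""
    location, data = directory[tag]
    return location, data

tag = store(b"Q2 pricing sheet", location="array-7")
print(tag[:8])        # short form of the ID tag
print(fetch(tag)[0])  # the array that holds the object
```

The point of the hash-derived tag is that it stays valid no matter which server or storage array the bytes move to -- much like a URL works from any browser.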
The same is true in networking. One problem for companies is that on the public Internet every bit of data has the same priority, whether it's a pirated song being downloaded by a teen or a $1 million trade being executed by a major brokerage. Now, a number of industry groups are trying to set standards to add "quality-of-service" capabilities to the Net. That way, companies willing to pay slightly more can get guaranteed delivery speeds.
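The quality-of-service idea can be sketched as a priority queue (the traffic classes here are invented for illustration): traffic in the paid class is dequeued ahead of best-effort traffic instead of waiting its turn.

```python
# Sketch of "quality of service": rather than treating every packet
# equally, a router serves higher-priority traffic first.
# The two traffic classes are illustrative assumptions.

import heapq
import itertools

PRIORITY = {"guaranteed": 0, "best-effort": 1}   # lower number = served sooner
counter = itertools.count()                      # tie-breaker keeps arrival order
queue = []

def enqueue(packet, klass):
    heapq.heappush(queue, (PRIORITY[klass], next(counter), packet))

enqueue("pirated-song-chunk", "best-effort")
enqueue("million-dollar-trade", "guaranteed")
enqueue("cat-picture", "best-effort")

served = [heapq.heappop(queue)[2] for _ in range(len(queue))]
print(served)   # ['million-dollar-trade', 'pirated-song-chunk', 'cat-picture']
```

Even though the trade arrived second, it jumps the queue; within a class, packets still go out in arrival order.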
"Companies ought to be able to call up the network resources they need, when they need it," says Christine Heckart, vice-president for marketing at Juniper Networks, which has created a 26-company group, including IBM and Oracle, to look at the problem. "Networking suppliers can't do this alone. We need the participation of the computing and applications folks."
In some cases, big tech companies are looking to compete as well as collaborate. Sun acquired a small company called Nauticus earlier this year to help it install basic networking capabilities right in its servers. IBM and Cisco Systems are also expected to blur the lines between their traditional silos in the tech industry as they race to solve the utility question.
Dozens of startups are looking for entirely new ways to solve the problem. Netezza, in Framingham, Mass., makes an innovative kind of server that companies can use to do sophisticated data analysis, such as a phone company figuring out the profitability of a new promotion.
Tech for the 'next era'
Preconfigured with hundreds of disk drives and sophisticated database software, the Netezza server lets anyone on the company network get instant answers to tough questions without slowing down the company's day-to-day business. Rather than focusing on just lowering storage costs, this approach lets companies get answers to questions they couldn't even afford to ask in the past, says Netezza CEO Jit Saxena, who adds that the company has sold 35 of the $1.5 million machines.
Meanwhile, Azul Systems, a startup in Mountain View, Calif., has developed a new server computer designed to provide a massive amount of processing power that other servers can tap to handle day-to-day computing jobs. "There's a lot of good work being done to solve these problems with existing technologies. That's a good thing, and it's necessary," says Shahin Khan, a former Sun marketing executive who just joined Azul. "But wouldn't it be nice if we could go build new technology that's designed for this next era?"
Whether it's the tech industry's giants or a new class of startups that push computing into the easy-to-use world of utilities, customers will benefit the most. In this special report, BusinessWeek Online explores various facets of this trend, from wireless technology to big bets being made by Sun to the impact on small businesses. It will also examine how browser pioneer Andreessen has fared in his efforts to address the problem and follow the venture-capital money to a new generation of tech startups.