According to Maritz, the company's internal cloud operating system, vSphere, is already providing companies with more flexibility in internal systems, but the next step is to link that with external cloud services so that companies can buy more cloud time on a flexible basis and lower their operating costs.
At the same time, vSphere will be getting new capacity planning, storage configuration, operational expense planning and data recovery modules to make linking up easier. Once in place, IT managers will be able to build 'virtual datacentres' around specific processes and departments by renting capacity from cloud vendors.
However, work still needs to be done to ensure that applications can run seamlessly in such circumstances. To that end, the company is offering its vCloud API to the Distributed Management Task Force for consideration and use by developers.
"Otherwise, you'll have the ultimate 'California hotel', where you can check applications in but not be able to get them out," Maritz said.
However, this approach has a downside: the clouds will only work with VMware-compatible systems, leaving the company open to charges that it is trying to lock in customers rather than offering a truly open system.
Maritz said that the vCloud API was an effort to counter that and would provide a more open development platform.