Monday, January 4, 2010

FTC to Investigate Cloud Computing

The investigation should raise concerns within the enterprise community: such an inquiry could cover aspects of Internet communications that have been in use for years.

How would the FTC distinguish between the rights of consumers and those of businesses that also use cloud computing services? Which regulations would drift into the enterprise sector?

Any service provider could be viewed as part of the investigation under such a broad umbrella. The obvious parties would include Google, Amazon, Microsoft, Rackspace and the other large cloud computing services.

SaaS is a form of cloud computing. That could mean a company like NetSuite, Zoho or Salesforce.com would have a stake in the outcome of such an investigation.

According to The Hill, the investigation surfaced in a filing with the Federal Communications Commission (FCC).

In the filing, the FTC recognizes the cost savings of cloud computing but raises concerns about information being stored remotely:

"However, the storage of data on remote computers may also raise privacy and security concerns for consumers," wrote David Vladeck, who helms the FTC's Consumer Protection Bureau.

This statement is puzzling. People have been storing their data remotely since the early 1990s, on services that predate social networks.

The intent of the inquiry is to protect consumers' privacy. But such a broad investigation will also have repercussions throughout the enterprise community if it is not narrowed.

According to The Hill, the FTC is holding a roundtable Jan. 28 to focus on privacy protections. It will include specific discussions about cloud computing, identity management, mobile computing and social networking.

Posted via web from chetty's posterous

Turn Your iPhone into a TV Remote

Posted via web from chetty's posterous

Monday, November 2, 2009

Inside one of the world's largest data centers

CHICAGO--On the outside, Microsoft's massive new data center resembles the other buildings in the industrial area.

Even the inside of the building doesn't look like much. The ground floor resembles a large indoor parking lot containing a few parked trailers.

It's what's inside those trailers, though, that is the key to Microsoft's cloud-computing efforts. Each of the shipping containers in the Chicago data center houses anywhere from 1,800 to 2,500 servers, each of which can be serving up e-mail, managing instant messages, or running applications for Microsoft's soon-to-be-launched cloud-based operating system--Windows Azure.

Upstairs, Microsoft has four traditional raised floor server rooms, each roughly 12,000 square feet and consuming, on average, 3 megawatts of power. It's all part of a data center that will eventually occupy 700,000 square feet, making it one of the world's largest.

"I think, I'm not 100 percent sure, but I think this could be the largest data center in the world," said Arne Josefsberg, general manager of infrastructure services for Microsoft's data center operations.

Even with only half the site ready for computers, the center has 30 megawatts of capacity--many times that found in a typical facility.

On a hot day, Microsoft would rely on 7.5 miles worth of chilled water piping to keep things cool, but general manager Kevin Timmons smiled as he walked in for the facility's grand opening in late September. It was around 55 degrees outside.

"When I stepped out, I said 'what good data center weather'," he said. "I knew the chillers were off."

Although Microsoft is open about many of the details of its data centers, there are others it likes to keep quiet, including the site's exact location, the names of its employees, and even which brand of servers fill its racks and containers.

The software maker also won't say exactly which services are running in each facility, but the many Bing posters inside the upstairs server rooms in Chicago offer a pretty good indication of what is going on there.

Microsoft originally intended to open the Chicago facility last year, but the company slowed its data center buildout somewhat amid the weaker economy and an array of companywide cutbacks. Instead, the facility had its grand opening in late September.

Of Sidekick--and Azure
Within a month, though, Microsoft's data centers were attracting attention for a wholly different reason. A massive server failure at an older facility--one that Microsoft acquired as part of its Danger acquisition--left thousands of T-Mobile Sidekick owners without access to their data as part of an outage that is now stretching into its second month.

Although Sidekick uses an entirely different architecture, the failure represented a tangible example of the biggest fear about cloud computing--that users will wake up one day to find their data gone.

Microsoft is quick to highlight the differences between the Sidekick setup and what Microsoft is building in Chicago and elsewhere. "We write multiple replicas of user data to multiple devices so that the data is available in a situation where a single or multiple physical nodes may fail," Windows Azure general manager Doug Hauger said in a statement after the Sidekick failure.
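Microsoft has not published the details of that replication scheme. Purely as an illustration of the general idea Hauger describes--writing each piece of user data to several nodes and succeeding only when enough of them acknowledge--here is a minimal Python sketch; the node class, quorum size, and keys are invented for the example and are not Azure's actual design.

```python
# Toy illustration of quorum-based replicated writes (not Azure's real design).

class StorageNode:
    """A stand-in for a physical storage node."""
    def __init__(self, name):
        self.name = name
        self.data = {}

    def write(self, key, value):
        self.data[key] = value   # in reality this would be a network call
        return True


def replicated_write(nodes, key, value, quorum=2):
    """Write to every node; report success only if `quorum` nodes acknowledge."""
    acks = 0
    for node in nodes:
        try:
            if node.write(key, value):
                acks += 1
        except OSError:
            continue  # one failed node should not sink the whole write
    return acks >= quorum


nodes = [StorageNode("node-%d" % i) for i in range(3)]
print(replicated_write(nodes, "user:42:mailbox", "inbox contents"))  # True
```

The point of writing multiple replicas is exactly the one Hauger makes: a single node, or even several, can fail without the data becoming unavailable.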

As for Azure, Microsoft is expected to talk about its commercial launch at this month's Professional Developers Conference in Los Angeles, including offering more details on how the system will provide its redundancy. Microsoft has already announced some new Azure details, noting last week that it will begin charging for Azure as of February 1.

Microsoft is still trying to figure out just how much capacity at Chicago and elsewhere it needs to assign for Azure.

"Azure is incredibly hard to forecast," said Josefsberg. "We're probably erring toward having a little more capacity than we need in the short term."

What is clear is that, over time, Microsoft will need even more capacity. That's what has Josefsberg returning to a custom "heat map" that identifies the best places to build data centers based on the cost, environmental impact, and availability of power, along with political climate, weather, networking capacity, and other factors. Choosing the right spot is critical, Microsoft executives say, noting that 70 percent of a data center's economics are determined before a company ever breaks ground.
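Microsoft hasn't disclosed how the heat map weighs those factors. As a sketch of the general approach only, one could imagine a weighted score over candidate sites like the following, where the factor names, weights, and sample numbers are all invented for illustration.

```python
# Hypothetical weighted scoring of candidate data center sites.
# Each factor is normalized to 0-1, where higher is always better
# (e.g., power_cost of 1.0 means the cheapest power available).

WEIGHTS = {
    "power_cost": 0.30,
    "renewable_energy": 0.20,
    "network_capacity": 0.20,
    "cool_climate": 0.15,
    "political_stability": 0.15,
}

def score_site(site):
    """Return a 0-1 score for one candidate site."""
    return sum(weight * site[factor] for factor, weight in WEIGHTS.items())

candidates = {
    "Site A": {"power_cost": 0.8, "renewable_energy": 0.6, "network_capacity": 0.7,
               "cool_climate": 0.9, "political_stability": 0.8},
    "Site B": {"power_cost": 0.5, "renewable_energy": 0.9, "network_capacity": 0.6,
               "cool_climate": 0.4, "political_stability": 0.9},
}

best = max(candidates, key=lambda name: score_site(candidates[name]))
print(best, round(score_site(candidates[best]), 2))  # -> Site A 0.76
```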

Josefsberg said he already has the next spot picked out.

"We know exactly where it is going to be but I can't tell you right now," he said.

But Microsoft has indicated how the next generation of data center will improve upon the Chicago design.

Moving to containers allows Microsoft to bring in computing capacity as needed, but still requires the company to build the physical building, power and cooling systems well ahead of time. The company's next generation of data center will allow those things to be built in a modular fashion as well.

"The beauty of that is two-fold," Josefsberg said. "We commit less capital upfront and we can then accommodate the latest in technology along the way."

Posted via web from chetty's posterous

Friday, September 18, 2009

Enterprise Content Management in the Cloud

Provides Developer Kit Including Alfresco on Amazon EC2-Ready Application Stacks

London, UK – September 17, 2009 – Alfresco Software Inc., the leader in open source enterprise content management (ECM), today launched its Cloud Content Application Developer Program. Alfresco will provide an open source Amazon EC2-ready stack and developer kit for customers and partners to develop, deploy and monetize cloud service architecture (CSA) content applications on the EC2 platform.

Tweet This: The Alfresco Cloud Content Application stack is available for Amazon EC2 at: http://bit.ly/46EhtI

Managing content effectively at low cost while adhering to regulatory controls has become an indispensable part of running an efficient business. According to the study “Above the Clouds: A Berkeley View of Cloud Computing,” very large-scale cloud facilities can bring costs down to between one-fifth and one-seventh of those of a medium-sized data center. Alfresco is designed to take full advantage of a cloud service architecture and deliver cost-effective high availability and scalability. Alfresco is portable across internal and external clouds through the use of the Content Management Interoperability Services (CMIS) specification. Organizations wishing to use secure but shared data models also have the option of deploying multi-tenant solutions, offering compliant access between multiple legal entities.
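The press release doesn't include sample code, but as a hedged sketch of what CMIS-based portability looks like in practice, the snippet below uses Apache Chemistry's cmislib Python client against a CMIS AtomPub endpoint. The URL, credentials, and file names are placeholders rather than details of any real Alfresco deployment.

```python
# Sketch of vendor-neutral content access over CMIS using Apache Chemistry's
# cmislib client (pip install cmislib). The endpoint URL, credentials, and
# file names are placeholders, not details of a real deployment.
from cmislib import CmisClient

client = CmisClient('http://cloud-host:8080/alfresco/cmisatom', 'admin', 'admin')
repo = client.defaultRepository

# Create a folder under the repository root and add a document to it.
root = repo.rootFolder
folder = root.createFolder('cloud-demo')
with open('hello.txt', 'rb') as f:
    doc = folder.createDocument('hello.txt', contentFile=f)

print(doc.getName(), doc.getObjectId())

# Because CMIS is a vendor-neutral specification, pointing this client at a
# different CMIS-compliant repository -- an internal cloud or another vendor's
# service -- only requires changing the endpoint URL above.
```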

“Content growth requires a sophisticated ECM solution that can scale users and content volumes simply and at low cost without massive up-front capital expenditure. Alfresco today has customers in the cloud with millions of users, terabytes of data and hundreds of millions of documents,” commented John Powell, CEO, Alfresco Software. “Legacy applications may run in the cloud, but modern content service approaches consuming services resident in the same cloud are required to inherit the full benefits of a cloud service architecture. Only then can enterprises and governments achieve the cost efficiencies of on-demand scalability, fault tolerance and cloud-wide network security for documents, records and collaboration.”

The Alfresco Cloud Developer Program offers partners “early adopter” advantages to deliver cloud-ready content applications for collaboration, document and records management. Alfresco will also offer a subscription for those requiring expert Enterprise 24/7 support.

For further information regarding Alfresco's Cloud Content Application Developer Program, visit http://wiki.alfresco.com/EC2 or view this free webinar for more ideas: http://www.alfresco.com/about/events/2009/06/content-as-a-service/.

 

Posted via web from chetty's posterous

Tuesday, September 15, 2009

Obama Administration unveils cloud computing initiative

The administration's cloud computing initiative is getting started immediately, at least in small measure, on the brand-new Apps.gov Web site.

(Credit: Apps.gov)

MOUNTAIN VIEW, Calif.--The Obama administration on Tuesday announced a far-reaching and long-term cloud computing policy intended to cut costs on infrastructure and reduce the environmental impact of government computing systems.

Speaking at NASA's Ames Research Center here, federal CIO Vivek Kundra unveiled the administration's first formal efforts to roll out a broad system designed to leverage existing infrastructure and in the process, slash federal spending on information technology, especially expensive data centers.

According to Kundra, the federal government today has an IT budget of $76 billion, of which more than $19 billion is spent on infrastructure alone. And within that system, he said, the government "has been building data center after data center," resulting in an environment in which the Department of Homeland Security alone, for example, has 23 data centers.

Obama administration CIO Vivek Kundra on Tuesday unveiled the government's new cloud computing initiative.

(Credit: Daniel Terdiman/CNET)

All told, this has resulted in a doubling of federal energy consumption from 2000 to 2006. "We cannot continue on this trajectory," Kundra said.

That's why the administration is now committed to a policy of reducing infrastructure spending and instead, relying on existing systems, at least as much as is possible, given security considerations, Kundra said.

As an example of what's possible with cloud computing, Kundra pointed to a revamping of the General Services Administration's USA.gov site. Using a traditional approach to add scalability and flexibility, he said, it would have taken six months and cost the government $2.5 million a year. But by turning to a cloud computing approach, the upgrade took just a day and cost only $800,000 a year.

But while some of the benefits of the administration's cloud computing initiative are on display today--mainly at the brand-new Apps.gov Web site--Kundra's presentation was short on specifics and vague about how long it may take the government to transition fully to its new paradigm.

Indeed, Kundra hinted that it could take as much as a decade to complete the cloud computing "journey."

Three parts to initiative

While repeatedly acknowledging that many government efforts must make allowances for security in their IT needs, Kundra argued strongly that in many other cases there is little reason federal agencies cannot turn to online resources for quick, easy and cheap provisioning of applications.

As a result, the first major element of the initiative is the brand-new Apps.gov site, a clearinghouse for business, social media and productivity applications, as well as cloud IT services. To be sure, the site isn't fully functional yet, and in fact, a brief survey of it resulted in a series of error messages. But it's evident that the administration hopes that for many agencies, the site will eventually be a one-stop shop for the kinds of services that to date have required extensive IT spending, and Kundra said he believes that some at the Department of Energy have already been using the site for some of that agency's needs.

Kundra said that the second element of the effort will be budgeting. For fiscal year 2010, the administration will be pushing cloud computing pilot projects, reflecting the effort's priority and hopes that many lightweight workflows can be moved into the cloud, and for fiscal 2011, it will be issuing guidance to agencies throughout government.

Finally, the initiative will include policy planning and architecture that will be made up of centralized certifications, target architecture and security, privacy and procurement concerns. Kundra said that every effort will be made to ensure that data is protected and secure, and that whatever changes are made are "pragmatic and responsible."

Clearly, though, the administration has seen benefits in the way private industry uses cloud computing, and intends to mirror those benefits. Ultimately, he added, the idea is to make it simple for agencies to procure the applications they need. "Why should the government pay for and build infrastructure that may be available for free?" Kundra said.

One inspiration, he explained, is advances the government has already seen in the streamlining of student aid application forms. The so-called FAFSA (Free Application for Federal Student Aid) form is "more complicated" than the federal 1040 tax form, Kundra said. But in a joint effort between the IRS and the Department of Education, it has become possible with one click of a mouse button for IRS data to populate the FAFSA form, Kundra said, eliminating more than 70 questions and 20 screens.

That, then, should be the kind of thing that the government seeks to do across the board, ultimately delivering large savings to taxpayers and significantly reducing the environmental impact of government IT systems.

Posted via web from chetty's posterous