The powerful, refrigerator-sized computers that were once a hallmark of corporate and research data centers have steadily lost ground to off-the-shelf servers over the years.
Mind you, mainframes are still alive and kicking, but they’ve been relegated to workloads that require secure, dedicated data processing. Think moneyed financial firms, energy producers, and other organizations and government agencies where secrecy and security are top priorities, even if they come at the cost of some flexibility.
“The end-user interfaces are clunky and somewhat inflexible, but the need remains for extremely reliable, secure transaction oriented business applications,” writes Cureton.
In retrospect, it’s kind of hard to imagine NASA without its mainframes. In popular culture, visuals of chunky, blinking caverns of computer equipment — complete with reel-to-reel tape — are as closely associated with the agency as its scientists and space-faring exploits.
Exploring and pioneering clouds
Today, the reality is different. In recent years, the federal government as a whole has been looking for efficient, cost-cutting ways to get the computing power it needs. The solution, in large part, is cloud computing.
NASA’s own Nebula cloud computing platform is an effort to give engineers and researchers a bigger pool of computing power and to grant its partners and the public access to its data sets. But the space agency is more than an adopter; it’s also a pioneer.
NASA and Rackspace are behind the open source cloud platform called OpenStack. Of late, the technology has attracted an impressive number of supporters including mega-carrier AT&T. And startups like Cloudscaling and Piston Cloud Computing — the latter of which was co-founded by NASA Nebula Chief Technical Architect Joshua McKenty — are bringing OpenStack-based private clouds to enterprises.
So, as NASA bids farewell to the mainframe, it’s doing more than its part to advance the cloud. Fitting, isn’t it?
Image credit: NASA