Two weeks ago, I was at a large pharmaceutical client talking to a senior IT executive when the word "cloud" was mentioned in passing. He chuckled and responded that this was simply the nom du jour for something that has been in use for a number of years now. For example...
Client / Server. When Microsoft DNA became popular with redundant web, application, and database servers, this was, in essence, a cloud, albeit one limited in its ability to scale, since you couldn't rapidly add new machines to the mix as demand required. (And DNA wasn't the first time this setup was used, either - Microsoft simply made it sound fashionable.)
Application Service Provider (ASP). This was, in reality, a variant of Client / Server, because it was essentially the exact same architecture run on another company's infrastructure. From a conceptual perspective, however, it was very similar to cloud computing as it's defined today: your application is deployed elsewhere, allowing you to avoid investing in the infrastructure required to run it internally.
Service Oriented Architecture (SOA). "Do you want to be able to run your application where changes in the location of various subsystems won't affect its ability to execute? Then SOA is for you!" Of course, this wasn't the primary advantage of SOA, but it was certainly touted as one of its major benefits. This is similar in concept to a private cloud, in my opinion.
"But The Cloud is 'on demand computing'!" you exclaim. Do you really think that an application's architecture is going to change just because you run it internally, at an ASP, or on Amazon's EC2? Of course not. This is strictly a question of where the infrastructure resides and who is responsible for maintaining it.
Yet it's funny that, even though these architectural designs have been in play for a few decades now, the press would have you believe that "the cloud" is worthy of a Nobel Prize or something equivalent. When you read articles like a recent one on CIO.com in which RedMonk analyst Stephen O'Grady makes a statement like, "We are founded upon the idea that developers are the single most important constituency in technology," it sounds an awful lot like someone is trying to coerce the rest of the world into giving developers the respect that is probably due them (but never happens). Do I smell an attempt at World Domination by the world's geeks?
Regardless of what the cloud really means, it is imperative that this one fact is never overlooked: the business has needs that must be met. And while I love the concept ("something borrowed, something blue") and think that there are some very exciting cloud management solutions out there (3Tera's AppLogic, for example) that probably wouldn't exist without the excitement around the concept of "the cloud," if I ever forget that "it's all about the business (duh!)," then I've lost all relevancy in the world of IT.
After all, it's the business that pays my paycheck, not the developers, no matter what RedMonk or any other analyst firm would have you believe.