Monday, November 1, 2010

A thought on anti-virus as anti-bodies

Sooner or later the OS and the virtualization layer will become one and the same. They already perform the same function (brokering limited physical resources amongst multiple applications), and while virtualization is a huge step in that direction, today it just adds another layer of overhead to the equation.

In the future, I think each app will have its own little blended OS/virtualization wrapper and will be able to move around a cloud environment, using what it needs but not dependent on any single piece of physical hardware. That presents a problem, though: how do we secure it against the bad stuff that is sure to target these systems?

This came up in a conversation with a friend tonight, who posed the question of what happens to the applications that today, while not strictly required, sit in a shared OS environment alongside the application: applications like anti-virus software. It's a good question, and if applications are going to be more self-contained, then I don't think it can be answered from a traditional application programming perspective. I think we have to look at the cloud as more akin to a living biological organism than to a static collection of manufactured processes and compute systems.

And if that is the direction that the cloud takes, one where the location of an application is much more dynamic than in even a traditional virtual infrastructure, then we need a better way to provide these protective functions as part of the "organism". We need an immune system for the cloud.

The first line of defense is the "skin": your standard perimeter security items such as filters and firewalls. Other layers of defense would be needed if that first layer is breached, apps that act like white blood cells or antibodies. Let them flow through the cloud in search of the virus or malware or whatever bad thing is there, and then they can go to work cleaning it up. Of course, we'll still have to inoculate and create new vaccines, and we'll need the ability to introduce "cures" for the new "bugs" that show up against which there is no existing defense, just like with our own bodies.
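To make the "antibody" idea slightly more concrete, here is a toy sketch (in Python) of what one of those roaming defense agents might look like in principle. Everything in it, the workload names, the signature list, the quarantine step, is hypothetical and invented purely for illustration; it's a thought experiment in code, not a description of any real product.

```python
# Toy illustration of the "antibody" idea: an agent that drifts through a set of
# workloads, checks them against known-bad signatures, and flags anything it
# finds for cleanup. All names and data below are hypothetical.

KNOWN_BAD_SIGNATURES = {"evil_payload_v1", "botnet_beacon"}   # the "vaccine" list

# A stand-in for the cloud: each workload exposes a few observable artifacts.
workloads = {
    "app-frontend-01": ["login_service", "tls_cert"],
    "app-worker-07":   ["batch_job", "botnet_beacon"],        # infected
    "app-db-02":       ["storage_engine"],
}

def antibody_sweep(workloads, signatures):
    """Roam across every workload and report matches against known-bad signatures."""
    findings = []
    for name, artifacts in workloads.items():
        hits = [a for a in artifacts if a in signatures]
        if hits:
            findings.append((name, hits))
    return findings

def quarantine(name):
    # In the "organism" model this is where the cleanup work would happen;
    # here we just record the intent.
    print(f"quarantining {name} for cleanup")

if __name__ == "__main__":
    for name, hits in antibody_sweep(workloads, KNOWN_BAD_SIGNATURES):
        print(f"{name}: matched {hits}")
        quarantine(name)

# Introducing a "cure" for a new bug is just adding a signature to the set,
# the code equivalent of a vaccine update.
```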

I don't know that this "organism" model is where we will end up, but something like it should be our end goal. If not, then we'll ultimately end up confined by a great monolithic structure instead of something organic that can adapt, self-monitor, and heal itself, or that, at the first sign of new symptoms, can be quickly and effectively treated and inoculated against future outbreaks.

Just my two cents and it is still a little rough, but I think the premise is sound. Feel free to comment.

Tuesday, August 24, 2010

Onramp to the cloud?

As I posted back in March, we "have all of the pieces to build out the Cloud 1.0. There are still a lot of questions around access, security, and offerings" that need to be answered before we can get the cloud to the next level. With most people predicting a hybrid model based on a private-public partnership between companies and their providers, the vendors that can secure and maximize the efficiency of that link the soonest have a tremendous opportunity to capture a niche market, one that could generate a significant amount of demand if the predictions hold true.

Afore Solutions may be the first one able to fill that gap. Having witnessed their Advanced Service Edge (ASE) product in action at last year's VMWorld, I was impressed with their ability to extend Layer 2 services across very long distances at a very reasonable price. Since then they have taken that functionality, improved and bolstered it significantly, and are rolling out a virtual, software-only version of it at this year's VMWorld. Designated "CloudLink", this new product provides security, SLA management, and performance optimization for private or public clouds.

This may be a niche area but it is a critical one. Keep an eye on these guys.

Monday, August 16, 2010

A Singularity (of Sorts)

While it may not be as dramatic as the end of the human era, when machines become smarter than people and "wake up" a la Cyberdyne Systems, we have reached a minor milestone in personal technology. For several years I have been saying that our phones would get bigger and our computers would get smaller (no, I don't think I'm the only one to have made that obvious prediction), and now we have the Dell Streak. The iPad came close, but it has no native telephony capabilities. Thank goodness the Streak is not a Windows Mobile device: Dell chose Android, and while it will be interesting to see how Oracle's suit against Google for allegedly violating Java copyrights turns out, I think Android is a great long-term choice.

As Mr. Mossberg points out, the Streak is a "tweener" device, either a tiny tablet or a jumbo smartphone. What the Streak gives up to the iPad is screen size, which raises the question, "How big is too big?" We love our big-screen TVs and our 20"+ computer monitors, but we don't want to carry them around. And, aside from the initial cool factor of being seen at the coffee shop with an iPad in your lap and a steaming latte on the table next to you, how long will this foray into the tablet world last? The reality is that people will want bigger screens, but they won't want to carry them around.

The next move will be toward personal heads-up displays (HUDs), and while I don't think this model solves the problem, there are some interesting things happening with much smaller form factors. Of course, that is also the next step in the devolution of social interaction. It's bad enough that we see people apparently talking to themselves and don't realize they have a small Bluetooth earpiece in. Imagine having someone look at you with their eyes darting all around, a visual shakedown, only to realize later that they were reading the news on their contact-lens-based HUD.

I expect Oakley to develop a very stylish HUD that will also incorporate wireless connectivity and that will be the one people scoop up (after all, the problem with contact lens HUDs is getting the cable from the lens to the wireless radio!).

Don't get me wrong, I love technology and all of the things it has provided and will continue to provide. The problem is with how we use it. Once these devices are ubiquitous, we will have moved even closer to the William Gibson view of the cyberpunk future, one in which we are able to "jack in" to a virtual reality that is more appealing than our physical one. No, our greatest threat is not that the machines will become aware and take over the world; our greatest threat is that we will become so "hooked" on them that there will be no separation or identification of us without them. That is the singularity that should concern us.

Friday, April 2, 2010

Cloud Computing, the Tablet, and the Developing World

I love Nicholas Negroponte's One Laptop Per Child organization. As the name states, their mission is:

"To create educational opportunities for the world's poorest children by providing each child with a rugged, low-cost, low-power, connected laptop with content and software designed for collaborative, joyful, self-empowered learning."

One big challenge he is running into is cost, and he sees the tablet as the way to get to his goal of a $100 "laptop". He thinks all of the necessary functionality can fit into the tablet and that it is the ideal platform because it has "no moving parts, not even a hinge." It's perfect.

Except that today's tablet has a finite useful life. Like any self-contained computing platform, it is only as good as the technology that is in it. What needs to happen is a purpose-built tablet designed to tap into the cloud. Think how that could impact not only Negroponte's children but numerous others, especially women, in developing nations.

One of the things that I find fascinating (mainly due to my self-absorbed Western perspective) is the rise of the microfinance industry. Here we have people that are starting businesses on as little as $100-200! Compare that to the latest VC-financed enterprise here in the States. Now don’t get me wrong; relative to their annual incomes, that can be a steep price, but entrepreneurs exist everywhere, and few have ever had access to capital to launch. But the microfinance industry (made up of secular and faith-based participants) has opened up tremendous possibilities in those areas where it has been able to gain traction.

When we start to couple cloud-enabled tablets with microfinance opportunities and expand both, we give those folks access to information and the means to begin to move out of the abject poverty in which so many of them live. Will we ever totally eradicate poverty? Not in this world. There are too many other factors at work. And neither the cloud nor the tablet nor even the applications are where they need to be yet. But they will be. And as information becomes more ubiquitous and people have access to the means that allow them to act on it, other freedoms and opportunities can follow.

Sure, it's an idealistic view of things, but that doesn't mean we shouldn't give it a try someday.

Tuesday, March 16, 2010

The Cloud: A Simple Definition

How do I personally define the cloud? At the risk of over-simplifying: Online banking. Think about it: I ultimately run my business based on information. Information about my products, my customers, my employees, my finances, etc., and I need that information to be presented to me in such a way that it enables me to make decisions and to take (or not take) action. Online banking works just like that. It presents information about my account balances, gains/losses, etc., and based on that information I decide if I need to save more, if I can afford to spend any, if I need to move some around, etc.

Ultimately I only care that I can get to the data by calling my banker, or preferably, via a computer, netbook, terminal, or smartphone with a browser. I don't care where the data sits as long as I can get to it and it is secure. I don't care about the application that serves it up to me and I certainly don't care about the operating system or hardware on which it runs. I'd bet that most CEO/CFO types would tell you that this is exactly how they feel: Give them access to the data they need, wherever they are, whenever they need it, and make sure it is secure at rest and in transit.

Cloud services drivers?

From an executive perspective, I think we have been waiting for "the cloud" for some time now. Ever since the advent of the PC and its insertion into the corporate environment, we have been trying to figure out how to manage users and data.

In the mid-90s, I was involved in rolling out IBM Java-based thin clients as terminal replacements hanging off of their mid-range platforms. This was a small step in the right direction (IP-enabled, Ethernet-attached, GUI-based), but it was still just a terminal with no real ability to match the performance characteristics or application availability of the PC, so it died on the vine.

Around the same time, we saw a hint of things to come with the Application Service Provider (ASP) model. Sure, people were tired of paying over and over again (capex) for updated versions of software that gave them no real advantage. Most people just want to create simple documents. Did each new version of Word make us that much more creative and powerful? The problem at that time was a lack of affordable bandwidth to deliver those apps as a service.

Then we saw the telecom bubble expand, and miles and miles of fiber were laid to tap into the greatness that the Internet was quickly becoming. Then that bubble burst, 9/11 happened, and things shrank back from the bold new horizons we had been looking toward. At the same time, throughout the late '90s and into the early 2000s, we were hearing that we should focus on our core business and that IT was "context" and just a burden; and while we understood where that was coming from, there was really no alternative.

And, finally, during that time, the last piece of the puzzle was slowly emerging: Virtualization technology.

So now, here we are. We have the mindset, the bandwidth, and the virtualization technology that allow us to decouple our data and applications from the hardware, which removes geographical limitations. We have all of the pieces to build out the Cloud 1.0. There are still a lot of questions around access, security, and offerings, but we are at a point where widget manufacturers can start to focus more on manufacturing widgets and begin to outsource those (IT) things that have ceased to provide a competitive advantage but are still a necessity to function.

I think Nicholas Carr, in his book The Big Switch, does a great job of explaining why most IT functions will move to a utility type of model. He likens it to electricity. In the early days of electric power, companies had to build their own power generation capabilities on premises. Then they had to staff those systems to ensure that they were operational. One day, someone came along and offered them an outlet into which they could plug their machinery and pay only for what they used, as they used it. Now the textile mill could get out of the electricity business and leave the generation of that power to someone who could do it on a much greater scale and provide it at a lower price than the mill could generate it for itself. He sees the core functions of IT (computing and storage) as commoditized utilities that can easily be "generated" and delivered to any company that needs them, ultimately at a lower cost.

If companies can accomplish their business objectives more cost-effectively by pushing the bulk of their IT operations out to a cloud of some sort, then I am betting they will. After all, as Nicholas Carr says: "[In] the end, the savings offered by utilities become too compelling to resist, even for the largest enterprises."

Monday, March 15, 2010

VDI and Storage (briefly)

The storage subsystem may be the most important piece of the entire virtual desktop infrastructure. It is responsible for most of the performance of the virtual desktop environment and if it can’t keep up, the entire environment will be impacted. We learned this the hard way a few months back.

In this environment, I/O operations per second (IOPS) capacity is critical. Mechanical drives (your standard spinning disk) just can't keep up with the demands a high-density VDI puts on them. The products you should consider are built around solid-state memory and provide thousands of IOPS per device vs. just a couple hundred per spindle with mechanical drives. Yes, they are more expensive per GB, but they are a fraction of the cost when you break it down on a $/IOPS basis.
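To put the $/IOPS argument in concrete terms, here is a rough back-of-the-envelope comparison. The capacities, IOPS figures, and prices below are illustrative assumptions, not vendor quotes; swap in real numbers from whatever drives you are evaluating.

```python
# Back-of-the-envelope comparison of $/GB vs. $/IOPS for VDI storage.
# All figures are illustrative assumptions, not vendor quotes.

drives = {
    # name: (capacity in GB, sustained IOPS, price in USD)
    "15K RPM SAS (mechanical)": (300, 180, 250),
    "Enterprise SSD": (160, 20000, 800),
}

for name, (capacity_gb, iops, price) in drives.items():
    print(f"{name}: ${price / capacity_gb:.2f}/GB, ${price / iops:.4f}/IOPS")

# With these assumed numbers:
#   15K RPM SAS (mechanical): $0.83/GB, $1.3889/IOPS
#   Enterprise SSD: $5.00/GB, $0.0400/IOPS
# The SSD looks ~6x more expensive per GB but ~35x cheaper per IOPS,
# and IOPS is the metric that actually constrains a high-density VDI.
```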

VDI and the trouble with anti-virus

I don't know if this is news to most folks but hopefully it will provide some understanding of the problem of running traditional anti-virus software in a virtual desktop environment. This comes from experience...

Anti-virus is a definite concern in a Virtual Desktop Infrastructure (VDI) deployment, and no one that I have found out here in the blogosphere or on the vendor sites recommends running traditional AV inside the virtual desktops (except for the AV vendors). The problem is the way AV functions in the virtual environment. The people who don't believe it causes hang-ups don't seem to understand what is going on in this environment.

In this environment, the desktop is, for all intents and purposes, generated from a single, pre-built image each time someone boots up. The problem with desktop AV is that whenever it comes online, it goes out to see if it is up to date. When a virtual desktop spawns from the image that includes AV, it is always whatever version you built into the image (let's call it v3.0). So each time one or one hundred virtual desktops are spawned, that v3.0 AV tries to "phone home" to see if it is up to date. Of course it is not up to date, because you built your image five months ago, so it (and every other instance that just spawned) starts pulling down all of the updates needed to get it up to the current rev of v3.8. By the way, in our non-persistent environment, as soon as I log that desktop off it is "destroyed" and all of those updates disappear, so when I bring up my next instance, I go right back to v3.0.

Now, in a physical environment, automatic updates are no big deal, other than the fact that they may impact the Internet connection when every desktop tries to suck those updates through the same straw. In a virtual environment, all of those desktop images live on the shared storage system, so every update that gets pulled down turns into I/O against the same set of disks that are already working to make these virtual desktops behave like physical desktops, and that's when you get into trouble.
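Here's a rough sketch of why that hurts. The desktop count, update size, and I/O size below are all assumptions made up for illustration; the point is simply how quickly a per-clone update multiplies against shared storage.

```python
# Rough estimate of the extra write load an AV "update storm" puts on shared
# VDI storage. Every figure is an assumption for illustration only.

desktops = 100          # non-persistent clones booting in the same window
update_mb = 50          # definition delta each clone pulls (v3.0 -> v3.8)
io_size_kb = 8          # assumed small-block write size
window_min = 10         # boot/login window the updates land in

total_mb = desktops * update_mb
writes = total_mb * 1024 / io_size_kb
extra_write_iops = writes / (window_min * 60)

print(f"Written to shared storage: {total_mb} MB")
print(f"Extra sustained write IOPS over {window_min} min: {extra_write_iops:,.0f}")

# With these assumptions: 5,000 MB of writes and roughly 1,000 extra IOPS,
# several mechanical spindles' worth of work. And because the desktops are
# non-persistent, it all gets thrown away at logoff and repeated at the next boot.
```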

To date, and from what I have seen, there is no AV solution for the virtual desktop that is analogous to what we have used in the physical desktop world. That is one reason people deploy virtual desktops in a non-persistent manner: if I get a virus, no big deal, because I'm going to destroy that entire desktop image and everything with it when I log out. What if it infects the user files? There are AV solutions for the storage that can go and scrub everything on the storage system where we keep the user files and everything else.

So, all of that being said, the AV folks are furiously working on a solution that works here. Just keep this important element in mind as you move down the VDI path.

Into the fray

Well, on my final day of employment at one place, and in honor of new (ad)ventures, here is the introductory entry on my new blog. Since anyone can show up in the blogosphere with their own creation, why wouldn't I want to do the same? I hope there will be valuable introspection in the things I publish around cloud computing from the perspective of the utility model.

I will post non-technical, managerial/executive-level perspectives on what is going on in the world of cloud/utility computing, along with posts drawn from my experience as a senior manager of professional services engineers: developing best practices and methodologies, delivering those technologies, working through "troublesome" projects, and now pursuing new ventures.

For more info on me, feel free to check out my LinkedIn profile.