Barton George joined Dell in 2009 as the company’s cloud computing evangelist. He acts as Dell’s ambassador to the cloud computing community and works with analysts and the press. He is responsible for messaging as well as blogging and tweeting on cloud topics. Prior to joining Dell, Barton spent 13 years at Sun Microsystems in a variety of roles that ranged from manufacturing to product and corporate marketing. He spent his last three years with Sun as an open source evangelist, blogger, and driver of Sun’s GNU/Linux strategy and relationships.

In this interview, we discuss:

  • Just do it – While some people are hung up arguing about what the cloud is, others are just using it to get stuff done
  • Evolving to the cloud – Most organizations don’t have the luxury to start from scratch
  • Cloud security – People were opposed to entering their credit card details in the early days of the internet; now it’s common. Cloud security perceptions will follow a similar trajectory
  • Cost isn’t king – For many organizations, it’s the “try something quickly and fail fast” agility that’s drawing people to the cloud, not just cost savings
  • The datacenter ecosystem – The benefits of looking at the datacenter as a holistic system and not individual pieces

Robert Duffner: Could you take a minute to introduce yourself and tell us a little bit about your experience with cloud computing?

Barton George: I joined Dell a little over a year ago as cloud evangelist, and I work with the press, analysts, and customers talking about what Dell is doing in the cloud. I also act as an ambassador for Dell to the cloud computing community. So I go out to different events, and I do a lot of blogging and tweeting.

I got involved with the cloud when I was at a small company right before Dell called Lombardi, which has since been purchased by IBM. Lombardi was a business process management company that had a cloud-based software service called Blueprint.

Before that, I was with Sun for 13 years, doing a whole range of things from operations management to hardware and software product management. Eventually, I became Sun’s open source evangelist and Linux strategist.

Robert: You once observed that if you asked 10 people to define cloud, you’d get 15 answers. [laughs] How would you define it?

Barton: We talk about it as a style of computing where dynamically scalable and often virtualized resources are provided as a service. To simplify that even further, we talk about it as IT provided as a service. We define it that broadly to avoid long-winded discussions akin to how many angels can dance on the head of a pin. [laughs]

You can really spend an unlimited amount of time arguing over what the true definition of cloud is, what the actual characteristics are, and the difference between a private and a public cloud. I think you do need a certain amount of language agreement so that you can move forward, but at a certain point there are diminishing returns. You need to just move forward and start working on it, and worry less about how you’re defining it.

Robert: There are a lot of granular definitions you can put into it, but I think you’re right. And that’s how we look at it here at Microsoft, as well. It’s fundamentally about delivering IT as a service. You predict that traditional, dedicated physical servers and virtual servers will give way to private clouds. What’s led you to that opinion?

Barton: I’d say that there’s going to be a transition, but I wouldn’t say that those old models are going to go away. We actually talk about a portfolio of compute models that will exist side by side. So you’ll have traditional compute, you’ll have virtualized compute, you’ll have private cloud, and you’ll have public cloud.

What’s going to shift over time is the distribution between those four big buckets. Right now, for most large enterprises, there is a more or less equal distribution between traditional and virtualized compute models. There really isn’t much private cloud right now, and there’s a little bit of flirting with the public cloud. The public cloud stuff comes in the form of two main buckets: sanctioned and unsanctioned.

“Sanctioned” includes things like Salesforce, payroll, HR, and those types of applications. The “unsanctioned” bucket consists of people in the business units who have decided to go around their IT departments to get things done faster or with less red tape.

Looking ahead, you’re going to have some traditional usage models for quite a while, because some of that stuff is cemented to the floor, and it just doesn’t make sense to try and rewrite it or adapt it for virtualized servers or the cloud.

But what you’re going to see is that a lot of these virtualized offerings are going to be evolved into the private cloud. Starting with a virtualized base, people are going to layer on capabilities such as dynamic resource allocation, metering, monitoring, and billing.

And slowly but surely, you’ll see that there’s an evolution from virtualization to private cloud. And it’s less important to make sure you can tick off all the boxes to satisfy some definition of the private cloud than it is to make continual progress at each step along the way, in terms of greater efficiency, agility, and responsiveness to the business.

In three to five years, the majority of folks will be in the private cloud space, with still pretty healthy amounts in the public and virtualized spaces, as well.

Robert: As you know, Dell’s Data Center Solutions Group provides hardware to cloud providers like Microsoft and helps organizations build their own private clouds. How do you see organizations deciding today between using an existing cloud or building their own?

Barton: Once again, there is a portfolio approach, rather than an either-or proposition. One consideration is the size of the organization. For example, it’s not unusual for a startup to use entirely cloud-based services. More generally, decisions about what parts a business keeps inside are often driven by keeping sensitive data and functionality that is core to the business in the private cloud. Public cloud is more often used for things that are more public facing and less core to the business.

We believe that the IT department needs to remake itself into a service provider. And as a service provider, they’re going to be looking at this portfolio of options, and they’re going to be doing “make or buy” decisions. In some cases, the decision will be to make it internally, say, in the case of private cloud. Other times, it will be a buy decision, which will imply outsourcing it to the public cloud.

The other thing I’d say is that we believe there are two approaches to getting to the cloud: one is evolutionary and the other one is revolutionary. The evolutionary model is what I was just talking about, where you’ve made a big investment in infrastructure and enterprise apps, so it makes sense to evolve toward the private cloud.

There are also going to be people who have opportunities to start from ground zero. They are more likely to take a revolutionary approach, since they’re not burdened with legacy infrastructure or software architecture. Microsoft Azure is a good example. We consider you guys a revolutionary customer, because you’re starting from the ground up. You’re building applications that are designed for the cloud, designed to scale right from the very beginning.

Some organizations will primarily follow one model, and some will follow the other. I would say that right now, 95% of large enterprises are taking the evolutionary approach, and only 5% are taking a revolutionary approach.

People like Microsoft Azure and Facebook that are focused on large scale-out solutions with a revolutionary approach are in a small minority. Over time, though, we’re going to see more and more of the revolutionary approach, as older infrastructure is retired.

Robert: Let me switch gears here a little bit. You guys just announced the acquisition of Boomi. Is there anything you can share about that?

Barton: I don’t know any more than what I’ve read in the press, although I do know that the Boomi acquisition is targeted at small and medium-sized businesses. We target that other 95% on the evolutionary side with what we call the Virtual Integrated System. That’s the idea of starting with the already existing virtualized infrastructure and building capabilities on top of it.

Robert: The White House recently rolled out Cloud Security Guidelines. At Microsoft and Dell, we’ve certainly spent a lot of time dealing with technology barriers. How much of the resistance has to do with regulation, policy, and just plain fear? And how much do things like cloud security guidelines and accreditation do to alleviate these types of concerns?

Barton: To address those issues, I think you have to look at specific customer segments. For example, HIPAA regulations preclude the use of public cloud in the medical field. Government also has certain rules and regulations that won’t let them use public clouds for certain things. But as they put security guidelines in place, that’s going to, hopefully, make it possible for the government to expand its use of public cloud.

I know that Homeland Security uses the public cloud for their public-facing things, although obviously, a lot of the top secret stuff that they’re doing is not shared out on the public cloud. If you compare cloud computing to a baseball game, I think we’re maybe in the bottom of the second inning. There’s still quite a bit that’s going to happen.

One of the key areas where we will make a lot of progress in the next several years is security, and I think people are going to start feeling more and more comfortable.

I liken it to when the Internet first entered broad use, and people said, “I would never put my credit card out on the Internet. Anyone could take it and start charging up a big bill.” Now, the majority of us don’t think twice about buying something off of the web with our credit cards, and I think we’re going to see analogous change in the use of the cloud.

Robert: Regardless of whether you have a public or private cloud, what are your thoughts on infrastructure as a service and platform as a service? What do you see as key scenarios for each of those kinds of clouds?

Barton: I think infrastructure as a service is a great way to get computing power, particularly for certain things that you don’t need all the time. For example, I was meeting with a customer just the other day. They have a website that lets you upload a picture of your room and try all kinds of paint colors on it. The site renders it all for you.

They just need capacity for a short period of time, so it’s a good example of something that’s well suited to the public cloud. They use those resources briefly and then release them, so it makes excellent sense for them.

There’s also a game company we’ve heard about that does initial testing of some of their games on Amazon. They don’t know whether a game is going to be a hit or not, but rather than using their own resources, they can test on the public cloud, and if it seems to take off, they can then pull it back in and run it on their own infrastructure.

I think the same thing happens with platform as a service. Whether you have the platform internal or external, it allows developers to get access to resources and develop quickly. It allows them to use resources and then release them when they’re not needed, and only pay for what they use.
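As a minimal sketch of that “use it briefly, then release it” pattern, the snippet below provisions a short-lived test server on a public cloud and tears it down as soon as the run is over, so you pay only for the time used. It assumes the AWS boto3 SDK, and the region, AMI ID, and instance type are placeholder values rather than anything mentioned in the interview.

```python
# Illustrative sketch: provision a small instance for a short-lived test,
# run the experiment, then terminate it so billing stops.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")  # placeholder region

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",          # placeholder instance size
    MinCount=1,
    MaxCount=1,
)
instance = instances[0]
instance.wait_until_running()
print(f"Test instance {instance.id} is running")

# ... run the short-lived workload here (e.g., a load test or a game-server trial) ...

# Release the capacity as soon as the test is done.
instance.terminate()
instance.wait_until_terminated()
print(f"Instance {instance.id} has been terminated")
```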

Robert: In an article titled “Cloud Computing: the Way Forward for Business?,” Gartner was quoted as predicting that cloud computing will become mainstream in two to five years, due mainly to cost pressures. When organizations look past the cost, though, what are some of the opportunities you think cloud providers should really be focusing on?

Barton: I think it’s more about agility than cost, and that ability to succeed or fail quickly. To go back to that example of the game company, it gives them an inexpensive testing environment they can get up and going easily. They can test it without having to set up something in their own environment that might take a lot more time. A lot of the opportunity is about agility when companies develop and launch new business services.

The amount of time that it takes to provision an app going forward should, hopefully, decrease with the cloud, providing faster time to revenue and the ability to experiment with less of a downside.

Robert: Gartner also recently said that many companies are confused about the benefits, pitfalls, and demands of cloud computing. What are some of the biggest misconceptions that you still run into?

Barton: Gartner themselves put cloud at the very top of the hype cycle for emerging technologies last year, and then six weeks later, they turned around and named it the number one technology for 2010. There are a lot of misconceptions because people have seen the buzz and want to sprinkle the cloud pixie dust on what they offer.

This is true both for vendors, who want to rename their existing offerings as cloud, and for internal IT staff who, when asked about cloud by their CIO, say, “Oh, yes. We’ve been doing that for years.”

I do think people should be wary of security, and there are examples where regulations will prohibit you from using the cloud. At the same time, you also have to look at how secure your existing environment is. You may not be starting from a perfectly secure environment, and the cloud may be more secure than what you have in your own environment.

Robert: Those are the prepared questions I have. Is there anything interesting that you’d like to add?

Barton: Cloud computing is a very exciting place to be right now, whether you’re a customer, an IT organization, or a vendor. As I mentioned before, we are in the very early days of this technology, and we’re going to see a lot happening going forward.

In much the same way that we really focused on distinctions between Internet, intranet, and extranet in the early days of those technologies, there is perhaps an artificial level of distinction between virtualization, private cloud, and public cloud. As we move forward, these differences are going to melt away, to a large extent.

That doesn’t mean that we’re not going to still have private cloud or public cloud, but we will think of them as less distinct from one another. It’s similar to the way that today we keep certain things inside our firewalls and put others out on the Internet, but we don’t make a huge deal of it or regard the resources inside and outside as being all that distinct from each other.

I think that in general, as the principles of cloud grab hold, the whole concept of cloud computing as a separate and distinct entity is going to go away, and it will just become computing as we know it.

I see cloud computing as giving IT a shot in the arm and allowing it to improve in a stair-step fashion, driving what IT has always been trying to drive: greater responsiveness to the business while at the same time delivering greater efficiencies.

Robert: One big trend that we believe is going to fuel the advance of cloud computing is the innovation happening at the data center level. It’s one thing to go and either build a cloud operating system or try to deploy one internally, but it’s another thing to really take advantage of all the innovations that come with being able to manage the hardware, network connections, load balancers, and all the components that make up a data center. Can you comment a little bit about how you see Dell playing into this new future?

Barton: That’s really an area where we excel, and that’s actually why our Data Center Solutions Group was formed. We started four or five years ago when we noticed that some of our customers, rather than buying our PowerEdge servers, were all of a sudden looking at these second-tier, specialized players like Verari or Rackable. Those providers had popped up and identified the needs of these new hyperscale providers that were really taking the whole idea of scale-out and putting it on steroids.

Dell had focused on scale starting back in 2004, but this was at a whole other level, and it required us to rethink the way we approached the problem. We took a step back and realized that if we wanted to compete in this space of revolutionary cloud building, we needed to take a custom approach.

That’s where we started working with people like Microsoft Azure, Facebook, and others, sitting down with customers and focusing on the applications they are trying to run and the problems they are trying to solve, rather than starting by talking about what box they need to buy. Then we work together with the customer to design a system.

We learned early on that customers saw the system as distinct from the data center environment. Their orientation was to say, “Don’t worry about the data center environment. That’s where we have our expertise. You just deliver great systems and the two will work together.” But what we found is that if you really want to gain maximum efficiencies, you need to look at the whole data center as one giant ecosystem.

For example, with one customer, we decided to remove all the fans from the systems and the rack itself and put gigantic fans in the data center, so that the data center becomes the computer in and of itself. We have made some great strides by thinking of it in that kind of holistic way; innovation at the data center level is crucial to overall excellence in this area.

We’ve been working with key partners to deliver this modular data center idea to a greater number of people, so this revolutionary view of the data center can take shape more quickly. And then, because they’re modular, like giant Lego blocks, you can expand these sites quickly. But once again, the whole thing has to be looked at as an ecosystem.

Robert: Thanks a lot for your time. This has been a great conversation.

Barton: Thank you.
