Thought Leaders in the Cloud: Talking with Chris C. Kemp, CTO of NASA

Chris C. Kemp is CTO of NASA, a position he has held since March of 2010. Before that appointment, he was NASA’s CIO and was in charge of the Ames Research Center, the Security Operations Center, and various other facilities. Chris started with NASA as director of strategic business development; in that capacity, he forged a partnership with Google Earth regarding access to moon and Mars data as well as real-time tracking data for the space shuttle, space station, etc.

Prior to joining NASA, Chris helped create a number of Internet businesses, including Netran (formed while still in college), Classmates.com, and Escapia. He was chief architect for Classmates.com from 2000 to 2006. In his spare time, Chris enjoys skiing, snowboarding, hiking, and playing the violin.

In this interview, we cover:

  • Mandate to consider cloud computing for every major government IT project
  • “Be a Martian” – NASA using Windows Azure to let students classify craters
  • Opportunity for government to use more open source rather than write as much in-house
  • Security concerns around government data
  • Inter-country collaboration on cloud computing initiatives
  • Gartner blog post titled, “Do Government Clouds Make Any Sense?”
  • Collaboration with Microsoft Research on public/private cloud hybrid
  • NASA is leading the charge towards cloud computing and modern IT for government


Robert Duffner: To start, would you introduce yourself and your role at NASA?

Chris C. Kemp: Sure. I am Chief Technology Officer for IT at NASA. I’m responsible for the enterprise architecture division, our IT lab where we prototype and pilot new and emerging technology relating to IT, and our open government initiative.

Robert: “Information Week” editor John Foley recently stated that 58% of government agencies are in the process of assessing cloud computing initiatives. How accurate do you think that figure is?

Chris: I would think that it should be higher, given the language around our 2011 budget requests, where we have to actually look at cloud computing in every one of our major IT projects and justify any cases where we’re not using cloud.

It would surprise me if that message only got through to a subset of federal agencies. I respect John, but it seems to me that there are probably a lot of people working on their stuff down in the trenches just to be consistent with the OMB mandate, if for no other reason.

Robert: In a few video interviews, you’ve mentioned some of the work NASA has been doing with the Windows Azure team here at Microsoft. Can you tell me a bit more about NASA’s efforts with JPL and with the Mars and lunar reconnaissance orbiters?

Chris: I think the kickoff project was led out of JPL. We were building a platform for public outreach relating to some of the Mars data. It was a great collaboration between JPL and Microsoft, where we created a game in which students could use Mars data to identify and classify craters on the Martian surface.

It was a really fun interface, and the project was done without any IT infrastructure at NASA by leveraging the Azure platform. We’re looking forward to expanding beyond just public outreach projects as we start to put our cloud strategy into effect.

Robert: Not long ago, I saw that you tweeted a response to Lew Moorman. Lew had said that he wished all his tax dollars went to NASA, and you replied that you wished more taxpayers felt that way. While most federal agencies would say that, how do you feel that NASA’s cloud computing initiatives make better use of the budget you have?

Chris: I think the spirit of my response was that we were able to look to a community of open source developers to help write code that solves some of NASA’s challenges. Lew’s response was, “I wish all my taxpayer dollars went to agencies that were leveraging open source projects and were leveraging technologies that weren’t all invented here, as they say, and built from scratch by the government.”

I think that, to the extent that the government can leverage open source, and also cloud, we spend less time and attention building and operating infrastructure, and more time and attention solving some of the business challenges and mission challenges we have in IT.

Robert: Red Hat’s chief technology strategist for the public sector group recently stated that treating government organizations like a data vending machine is problematic, because consumers of that data can now demand a higher degree of reliability from government data.

Because of some recent complaints around data.gov, he went on to say that if every public sector manager ends up on the front page of the “Washington Post” or gets dressed down onstage at Gov 2.0, nothing will change.

What do you think of those comments, and should public sector cloud deployments and access to the data these deployments provide be held to a higher standard?

Chris: I think that you need to start with the data, and you need to adjust the controls you put around security based on the sensitivity of the data. I think the greatest exposure exists where we have personally identifiable information or information that can compromise the security of people’s personal information on federal systems.

These are the areas where we need to demand a higher standard of security from cloud environments. We need to think about encrypting the data that is in the cloud.
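
As one simple illustration of that point, sensitive data can be encrypted on the agency side before it ever reaches a cloud provider. The sketch below uses the widely available Python cryptography library; the file names are placeholders, not real NASA systems, and this is just one way to act on the recommendation, not a description of NASA’s actual process.

```python
# Minimal sketch of client-side encryption before any upload to a cloud
# provider. Requires the third-party "cryptography" package; the file
# names here are placeholders, not real NASA systems.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the key stays inside the agency boundary
cipher = Fernet(key)

with open("personnel_records.csv", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

# Only the ciphertext ever leaves the agency; the provider never sees the key.
with open("personnel_records.csv.enc", "wb") as f:
    f.write(ciphertext)
```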

A lot of federal agencies work with large sets of data that are not particularly sensitive; in fact, that data is being shared with the public, and those agencies’ missions are focused on serving the public. As the security conversations continue, and as all these cloud platforms work with OMB and GSA to get through the certification and accreditation process under FedRAMP, there are plenty of opportunities for us to leverage the cloud in these areas where the data is less sensitive.

The successful cloud projects that we are seeing here at NASA and across the government are those projects that appropriately use the cloud based on mature security technologies and processes.

Robert: The Brookings Institution put out a report back in July 2010 that highlights a few problems with cloud computing initiatives in the public sector. One of the concerns is the lack of standards for cloud computing initiatives that cross international boundaries in areas such as privacy, data retention, and security processes.

Given the work that you’ve done to date with the National Institute of Informatics (NII) in Japan around interoperability between the NII Cloud and Nebula, what are your thoughts on inter-country collaboration on cloud computing initiatives?

Chris: There are a lot of conversations about data locality issues when it comes to data that is being used, in particular, by federal governments across national boundaries. I think that’s probably what the Brookings Institution question focused on. It’s really a question that’s come up in some conversations around use cases at NIST as we’re working on federal cloud standards.

Governments are different than commercial companies, in the sense that there are certainly classes of data that must be kept within national boundaries.

I think the current cloud services we see from commercial providers do not provide the granularity to specify, “I want to keep three copies of this. I want to distribute it, but only within France,” or only within the United States.
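
To make that kind of constraint concrete, here is a minimal sketch of a geography-aware placement policy along the lines Chris describes. The field names and the placement_is_legal helper are illustrative assumptions for this interview, not part of any real Nebula, Azure, or other provider API.

```python
# Hypothetical sketch of a geography-aware replication policy.
# The policy fields and helper below are illustrative only; they are not
# part of any real Nebula or commercial cloud API.

ALLOWED_REGIONS = {"fr-paris", "fr-marseille"}  # e.g., "keep it within France"

policy = {
    "object": "mars/tile_000123.png",
    "replica_count": 3,               # "I want to keep three copies of this"
    "allowed_regions": ALLOWED_REGIONS,
}

def placement_is_legal(replica_regions, policy):
    """Return True if there are enough replicas and all sit inside allowed regions."""
    return (len(replica_regions) >= policy["replica_count"]
            and set(replica_regions) <= policy["allowed_regions"])

# Three replicas, all inside France -> satisfies the policy
print(placement_is_legal(["fr-paris", "fr-paris", "fr-marseille"], policy))  # True
# One replica lands in the US -> violates the data-locality constraint
print(placement_is_legal(["fr-paris", "us-east", "fr-marseille"], policy))   # False
```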

I think these are emerging areas in the standards discussions, but I think where we need to start is with use cases. We need to map out why these requirements are there and how these requirements map back to the laws within various countries.

As we start to understand the scenarios in which we’re using the cloud in these cases, we can start to have a conversation around standards. I think the cloud is emerging so quickly that standards advocacy might be a very long conversation. But where we’d like to see progress in this area is in the implementation of prototypes and pilots.

That’s what’s great about the Nebula test bed environment. We can implement some of these things as prototypes, test out the use cases that we’re seeing, and then if that proves itself out, we’ll bring that into the standards conversation, and it’ll become more and more baked into the other cloud projects and services that are out there.

Robert: Gartner recently put up a blog post titled, “Do Government Clouds Make Any Sense?” The thrust of the article was that public sector cloud efforts lead to the ancillary benefits of data center consolidation and the evolution of shared services, but rarely turn into true community clouds, as NIST defines them, that are shared by multiple organizations. What are your thoughts on that, Chris?

Chris: I tend to agree. If you look at the attributes that NIST has attributed to true cloud environments, whether you’re talking about software platforms or infrastructure services, very few of these “cloud initiatives” set up by either federal agencies or commercial companies really pass all those tests.

As an example, you can virtualize a lot of infrastructure in your data center and provide shared services without allowing metering to take place. The question then arises as to whether an individual employee has the flexibility to turn all of those services off and not pay for them.

Elasticity is also a consideration: can we grow the amount of infrastructure by a thousandfold instantly, without having to go through some sort of internal process that consumes a lot of resources?

Passing these attributes that NIST has described as being “cloud attributes” along to the end user is the key test that you have to pass in order to have a true cloud environment within a federal agency or within the federal government.

With the Nebula Project, we’re trying to really break through to the definition of a pure cloud environment as Gartner, Forrester, NIST, and others have defined it. But it’s tough. It’s not an easy bar. It’s not just about the technology; it’s about the business processes within the agency or the organization adapting to translate all those characteristics through to the end users.

Robert: A recent study by Symantec, highlighted on the Public Cloud blog, showed that public sector CIOs continue to emphasize private cloud computing over public cloud models.

How simple do you think it would be to begin a shift toward public and hybrid models, and how do you feel the Open Nebula project is contributing to that effort?

Chris: A lot of forces are driving CIOs to think about private cloud environments, and to think of public cloud environments as a second phase. One of the factors that will drive a lot of these CIOs to start looking at the public cloud is their internal customers. Just as we’re seeing a consumerization of IT, we’re seeing employees bring in portable consumer electronic devices and their own phones.

We’re going to see a lot of the employees begin using cloud infrastructure to solve problems in the context of their work. We’re going to see not the consumerization of the enterprise, but the cloudification of the enterprise.

If you think that it’s a challenge to deal with the consumer electronics proliferating in the enterprise, think about infrastructure proliferating in your enterprise. It’s a real challenge.

I think that, as more and more small groups within organizations start to use software, platform, and infrastructure services, CIOs are going to start taking clouds very seriously, and we’re going to see more emphasis on interoperability and portability between private and public clouds.

Robert: Previously, you gave an excellent example of how NASA is actually using a combination of both private and public clouds in a particular application. Can you talk a little bit about that for our Azure community?

Chris: Sure. A great example of where we think this makes sense concerns a project that we are working on right now in collaboration with Microsoft Research, where a single data set is used very differently by different kinds of end users.

If you have a half-billion files, and some of those files might never get looked at but others might be looked at millions of times, you need to use your infrastructure wisely. Over the past few years, content distribution networks have solved this problem by taking those really popular files and spreading them across the Internet.

That is not really possible in a private cloud environment, because you don’t have the geographic presence that such a service requires. It is, however, a service that Azure provides.

So what we are looking at doing is providing the full-resolution surface of Mars through World Wide Telescope, which is right now about a half-billion files. There are certain areas on Mars where NASA has created tours. And there are areas where we’ve had the Mars rovers take panoramic images and do a lot of work, where there is a lot of interesting content. These are typically the very popular areas.

We’re working on putting all of the images that are extremely popular in Azure and delivering those from the Microsoft infrastructure. We’ll put the many tens of terabytes of other data that’s not quite as popular back in the NASA private cloud.

It’s really the composition of all of this that comes together to provide a great end user experience in World Wide Telescope. As you use World Wide Telescope, it’s the Azure infrastructure that will be serving up all the really popular content.

Even for areas on Mars where there’s not a whole lot going on, NASA has an obligation under its charter to make all that data publicly accessible. We keep that back in cold storage, in our private cloud in Nebula. The composition of services makes those really high performance but also really affordable to operate.
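
As a rough illustration of that composition of services, the sketch below routes a tile request to a public cloud endpoint when the tile is on a “popular” list and to a private cloud endpoint otherwise. The host names, tile identifiers, and popularity set are made-up placeholders, not the actual World Wide Telescope, Azure, or Nebula endpoints.

```python
# Hypothetical sketch of the hot/cold split described above: popular tiles
# are served from a public cloud tier, everything else from a private
# cloud archive. Host names and the popularity set are placeholders.

PUBLIC_BASE  = "https://example-public-cloud.net/mars"           # hot tier
PRIVATE_BASE = "https://example-private-cloud.example.gov/mars"  # cold tier

# In practice this set would be driven by access statistics, e.g. rover
# panorama sites and guided-tour regions.
POPULAR_TILES = {"victoria_crater/0/0", "gale_crater/3/12"}

def tile_url(tile_id: str) -> str:
    """Pick the tier that should serve a given tile."""
    base = PUBLIC_BASE if tile_id in POPULAR_TILES else PRIVATE_BASE
    return f"{base}/{tile_id}.png"

print(tile_url("victoria_crater/0/0"))    # served from the public cloud tier
print(tile_url("arcadia_planitia/9/42"))  # falls back to the private cloud tier
```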

Robert: The Sunlight Foundation recently commented at the Government 2.0 conference that NASA is one of the organizations leading the charge into greater transparency for federal government agencies. Do you feel that projects such as Nebula and Open Nebula help?

Chris: Yeah. When NASA wrote its open-government plan, we realized that open government also needs a platform. A lot of open government has to do not just with putting spreadsheets out on the web, but in allowing the public to really visualize what we’re doing internally.

A lot of the information deep inside our data centers has typically not been very public. So when we put together our open government plan, one of the foundational elements of it was cloud computing and what we were doing with the Nebula project.

I do think that there’s a great opportunity to take this open government directive, which is about sharing things that are public anyway, and use cloud computing platforms to comply with it.

Right now, there shouldn’t be any security concerns around data that you’re putting out on the web for everybody to consume, because you’re being transparent; that’s what open government is about. So I think there’s a great linkage between those two initiatives going on right now in the federal government.

Robert: You were recently selected as a finalist by Nextgov for driving advancement within government. What are your thoughts on that selection? What do you think led to it?

Chris: I know they were looking for people who were trying to be disruptive. I read a description of the selection criteria, and it made me happy, because that’s what I’ve been trying to do here. I don’t know much more about the award selection process, but it’s always an honor when people recognize what we’re doing out here. I’m just a finalist; I don’t think I won the award. I didn’t walk out with a trophy. I clearly wasn’t quite disruptive enough last year.

[laughter]

Robert: That’s great. Well, Chris, I see that we’ve about run out of time. Thanks for talking today.

Chris: Thank you.