GIS, Landsat, and Public Data

Google has announced new integration between two of its most popular products, Google Earth and Google Book Search.

My Radar colleague Brady Forrest has eloquently discussed these features.

This is wonderful, and a step toward truly integrative functionality among diverse sets of content. But as we witness this, it is worth considering what helped get us here, particularly this spectacular use of earth imaging, and what our choices are as to where we’re headed. That consideration takes us all the way back to the middle of the Cold War, when the U.S. first spearheaded the development of public-access satellite imaging.

Landsat is a long-standing series of U.S. satellites that provide imaging data about current land usage. As early as Landsat 1’s launch in July 1972, it became apparent that such data was extremely valuable to a wide variety of users, and all Landsat data was subsequently made available to the general public. As Wikipedia’s entry notes, “The images, archived in the United States and at Landsat receiving stations around the world, are a unique resource for global change research and applications in agriculture, cartography, geology, forestry, regional planning, surveillance, education and national security.”

NASA has historically been responsible for Landsat satellite engineering, operations, and data-capture architecture; the Department of the Interior has distributed Landsat data through the U.S. Geological Survey (USGS), with program management of Landsat transitioning to USGS and NOAA. In 1984, Landsat operations were partially privatized, as NASA’s Jet Propulsion Laboratory website reports:

[T]he US government turned over the tasks of satellite operation and data distribution to a commercial entity, EOSAT (Earth Observation Satellite Company). This for-profit company, a joint venture between Hughes and RCA, operates the Landsat 4 and 5 spacecraft and sells the satellite’s data. EOSAT is also responsible for the development of Landsat 6 and 7.

This NASA site is obviously a bit long in the tooth, as EOSAT was acquired by a company called Space Imaging, which was in turn acquired by GeoEye, whose headquarters is located in Dulles, VA. GeoEye runs several commercial imaging satellites; it sells time on those satellites to customers around the world. GeoEye’s biggest customer is the U.S. National Geospatial-Intelligence Agency (NGA), with which it has a contract worth approximately $500 million for the development of a next-generation high-resolution remote sensing satellite. NGA is in the (U.S. government’s) business of providing geospatial intelligence (GEOINT) in support of national security.

GeoEye’s corporate website discloses:

Under the NGA Clearview program, NGA has committed almost $50 million to buy imagery and products from GeoEye during this calendar year.

The company has long-term contracts in place with Microsoft and Yahoo! as a supplier of commercial satellite imagery for mapping services. Imagery for search engines is primarily drawn from the GeoEye archive of imagery which consists of some 278 million square kilometers.

GeoEye and another satellite imaging company, DigitalGlobe, based in Longmont, Colorado, routinely bid competitively for the design and operation of government and commercial remote sensing projects. Like GeoEye, DigitalGlobe provides imagery to many companies and governments, including, formerly, a little company called Keyhole.

DigitalGlobe and Google have been working together since 2002 (then Keyhole Corporation) to bring more imagery information to the everyday user. This unique relationship continues to grow with Google finding more and more ways to add context to user searches and DigitalGlobe providing the global imagery coverage that guarantees a context that is as consistent and broad as possible.

Today, DigitalGlobe is the premier provider of imagery in Google’s database – providing a majority of the high-resolution imagery served from its database. DigitalGlobe imagery can be seen in many Google domains today such as Google Maps, Google Local and the world’s most popular imagery viewing tool, Google Earth.

GeoEye’s and DigitalGlobe’s arrangements with search firms have exploded the use of commercial geoimaging data; at the same time, GeoEye has been partially responsible for Landsat image processing. Although we are accustomed to thinking of Google Earth as freely available, it is actually a commercial product resold by Google, with a base layer available for free. We should not, and cannot, assume that the functionality of Google Earth will be freely available forever. Historically, assuring that kind of public access is precisely what Landsat was entrusted with.

Last week, Aviation Week – certainly one of the top trade journals in the world, and lots of fun for geeks to read – had a brief story noting that the current administration was thinking about giving “the Interior Department responsibility for coordinating and planning the future of Landsat-type imagery of the Earth’s surface.” The White House Science Adviser, John Marburger, said, “The importance of this imagery to the nation requires a more sustainable effort to ensure that land imaging data are available far into the future.” [Quotes below appear as in article].

Because of its history with Landsat and expertise in remote sensing and land science, “the U.S. Department of the Interior is judged to be the most appropriate federal department to become the lead agency for establishing the National Land Imaging Program (NLIP),” says the report by the Future of Land Imaging Interagency Working Group (FLIIWG).

The FLIIWG recommends that the U.S. “commit” to continuing the collection of moderate-resolution land imagery and maintain a “core operational capability” to gather such data with U.S.-owned satellites. “Moderate-resolution” data is defined as 5-120 meters per pixel.

Such data have been gathered by the Landsat series of spacecraft, dating back to the launch of Landsat 1 in 1972. But the currently operating Landsat 5 and 7 spacecraft are delivering degraded data due to onboard failures, and neither is expected to last beyond 2010 due to fuel limitations. NASA is planning to launch the Landsat Data Continuity Mission (LDCM) follow-on spacecraft no earlier than 2011, creating an anticipated gap in the record.

Landsat 5, with a March 1984 launch date, is over 20 years old. Landsat 6 failed at launch. The European equivalent of Landsat (Sentinel-2) [requirements: pdf] will probably be up and running by the time LDCM is mission-ready. Defining the future of the U.S.’ Landsat – a satellite system providing publicly available data – is at hand.
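To get an intuition for what the FLIIWG’s “moderate-resolution” band (5–120 meters per pixel) means in practice, a bit of back-of-the-envelope arithmetic helps. This sketch assumes Landsat’s well-known ~185 km scene swath and the 30 m multispectral resolution of the TM/ETM+ instruments:

```python
# Back-of-the-envelope arithmetic on "moderate-resolution" land imagery.
# Assumes Landsat's ~185 km scene swath and 30 m multispectral pixels.
swath_m = 185_000              # one Landsat scene is roughly 185 km across
pixel_m = 30                   # Landsat TM/ETM+ multispectral resolution

pixels_per_side = swath_m // pixel_m
pixels_per_scene = pixels_per_side ** 2

print(pixels_per_side)                # 6166 pixels across one scene
print(round(pixels_per_scene / 1e6))  # roughly 38 million pixels per scene
```

Tens of millions of pixels per scene, multiplied across decades of continuous global coverage, is why the archive is so scientifically valuable and why continuity of the record matters so much.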

Critically undetermined is what architecture might follow the LDCM. Given the revolutions in optics, sensing, and data compression, it would seem intuitively possible to develop a lightweight, build-it-cheap, build-it-quick solution that could provide near real-time access to data, readily available for integration with commercial GIS mapping applications such as Google Earth, via the definition and publication of open standards and specifications. It is also possible to design narrowly focused hardware that meets highly specified criteria, coupled with secured or restricted data distribution architectures. In other words, like many obscure things, next-generation Landsat matters a lot.
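To make the open-standards option concrete: Google Earth already consumes KML, a published XML format, so a public imaging program could expose its tiles as simple KML ground overlays that any compliant viewer could load. A minimal sketch follows; the tile URL and bounding box are hypothetical placeholders, not a real endpoint:

```python
# Sketch of open-standard integration: wrapping a public imagery tile in
# KML, the XML format Google Earth reads natively. The tile URL and the
# LatLonBox coordinates below are invented for illustration.
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.1">
  <GroundOverlay>
    <name>Hypothetical public Landsat tile</name>
    <Icon>
      <href>http://example.gov/landsat/tile_042_034.png</href>
    </Icon>
    <LatLonBox>
      <north>38.5</north>
      <south>37.0</south>
      <east>-121.0</east>
      <west>-123.0</west>
    </LatLonBox>
  </GroundOverlay>
</kml>
"""

with open("landsat_tile.kml", "w") as f:
    f.write(kml)
```

The point is not this particular snippet but the architecture it implies: if next-generation Landsat data were published against open formats like this, integration with commercial GIS viewers would be a matter of a URL, not a procurement contract.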

Why Landsat – and space imaging generally – matters can be sensed through the wide variety of new initiatives, many quite popular, that have been made possible by leading GIS applications such as Google Earth. ComputerWorld ran a story emanating from coverage of GeoWeb 2007 in Vancouver, BC. At GeoWeb, the arguments over the extent to which GIS data should be democratized ran fast and furious, archetypically pitting the experts-know-best camp against the mass-participation-benefits-everyone crowd. Google, understandably, is strongly in favor of popularized access.

According to Michael Jones, Google Earth’s chief technologist, by giving everyone access to GIS tools, you’ll end up with “a big number of users converging on a truth.” Locals, he insists, are closer to most GIS data than experts and have a vested interest in its accuracy.

As in most of these debates, there is probably enough heterogeneity in application expectations to suggest that mass participation combined with protected expert data silos or layers might be the optimal accommodation. An Autodesk executive gave one example justifying expertly formed geospatial layers.

Geoff Zeiss, director of technology at Autodesk Inc. in San Rafael, Calif., argues that North America is experiencing an infrastructure crisis, much of which could be solved if up-to-date GIS data were available to the right people at the right time. He says the situation is particularly acute for utility companies. “They have pathological problems,” Zeiss says, in that they are unable to get information about the condition of utility infrastructure from workers in the field back to the central data stores. They’ve been able to skate by, relying on a workforce that collectively carries the knowledge base in their heads, he says. But more than half of those workers are over 45 and heading toward retirement. What then? Zeiss argues that utilities need to give field workers tools to input GIS-related data into centralized systems before that information is lost.

Independently, Jeff Jarvis of BuzzMachine came to the more populist derivation of this application after the steam pipe explosion in New York City, suggesting that civic data input be combined with journalists’ investigation to enrich the available data on public infrastructure.

Put up a Google map (with Platial on top) and town and neighborhood wikis and ask them to pinpoint every failure of infrastructure — or feared failure — they see: streets that flood every time it rains, bridges that look just too damned rusty, potholes, pipes that burst, streets that don’t get plowed, streetlights that don’t work, signs that are missing. . . . Ask them for dates and other specifics and for pictures and video. Urge them to blog their stories of frustration and bureaucracy.
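The civic layer Jarvis describes is technically trivial to prototype: each citizen report is just a named point that can be emitted as a KML Placemark for display on the Google map he proposes. A minimal sketch, with entirely invented report names and coordinates:

```python
# Turning crowd-sourced infrastructure reports into KML Placemarks for a
# Google Earth/Maps layer. All report data below is invented for
# illustration; a real system would collect it from residents.
reports = [
    ("Steam pipe burst", -73.976, 40.752),
    ("Street floods every time it rains", -73.990, 40.741),
    ("Bridge underpass looks too rusty", -73.962, 40.760),
]

placemarks = "\n".join(
    "  <Placemark>"
    f"<name>{name}</name>"
    f"<Point><coordinates>{lon},{lat}</coordinates></Point>"
    "</Placemark>"
    for name, lon, lat in reports
)

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://earth.google.com/kml/2.1"><Document>\n'
    + placemarks +
    "\n</Document></kml>"
)
print(kml)
```

That the barrier to a civic infrastructure map is this low is exactly why the debate is about data access and institutional will, not technology.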

Common sense suggests that public empowerment is facilitated through access to both expert and civic layers in a GIS-based infrastructure representation. And certainly, from the business perspective of DigitalGlobe and GeoEye, the more data enrichment, the merrier.

Beyond the specifics of any such instance, let us step back and note the most critical aspect of this conversation: our ability to have these debates is possible in large part because of a tradition of imaging from Landsat and related instruments, and the commitment to keep high quality imaging data flowing freely.

Let us return to the earlier question: what will the data and sensing architectures be for our next generation public imaging solution? For what purposes and agencies will it be optimized and for what applications will it be enhanced? Is it conceivable that a bundle of open standards might be established for integrating imaging services?

The planning is happening right now, with an interesting (and logical) set of actors. Aviation Week:

The [National Land Imaging Program] NLIP would plan for the future of Landsat imagery beyond LDCM, promote the “widest beneficial use” of the imagery for civil purposes and ensure its availability to public and private users throughout the U.S. The program would convene a Federal Land Imaging Council that would include NASA, the National Geospatial-Intelligence Agency (NGA), and the departments of Defense, Commerce, Agriculture, Homeland Security and State [Department].

That’s a very diverse set of agencies, and one can easily observe discrepancies in viewpoints substantially larger than those between Interior and NASA. It is perhaps not coincidental that Aviation Week names NASA and NGA as the invited agencies most deeply committed to Landsat futures. I’m sure they will be having some very interesting conversations.

At a time when we are increasingly reliant on imaging for public consumption, commercial use, and vital national security, and while we are having serious debates about how to open up GIS access to the broader community, the Landsat program’s future is coming up for review.

Something that should be of interest to all of us.

  • http://monkeytalk.wordpress.com Geowanker

    First off, very well written article, Peter! I see the majority of next-generation applications using real-time data (specifically imagery) for a variety of tasks like search and rescue, modeling, and real-time analysis (similar to http://www.astrovision.com/ourbus.html). Although Landsat is an invaluable resource, most mapping applications out there use hi-resolution imagery derived from other satellite systems (Ikonos, etc.). So I’d definitely see more applications using on-demand hi-res imagery as opposed to the temporal data currently available to the public via Google Earth. By the same token, I highly doubt that the agencies/companies will come to a common agreement around data sharing, because the legal system is too complex and current social conditions do not permit it due to security reasons. But Google has been instrumental in breaking the barriers many times; take Street View, for example.

    There’s so much data being created, but nobody really shares it. Take, for example, Google: it generated quite a bit of GIS data for India and has not shared it yet! It’d be interesting to see what happens in the near future.

  • http://www.blinkgeo.com Andres

    Peter Brantley has a solid post on O’Reilly Radar

    Great post, Peter. Gives some additional context to the ‘democratization of GIS data’ discussion. I invite you to join us at BlinkGeo and contribute to the ‘community’ that is forming there.

  • http://www.dog-obedience.net Ajeet Khurana

    With the coming onslaught of widely available AND usable public data, matters might indeed come to a point that Jeff Jarvis is advocating in the post you linked to. Jeff says, “Put up a Google map (with Platial on top) … and neighborhood wikis and … mobilize your public”

    No wonder governments and regulators are running helter-skelter in search of protective legislation.

  • http://www.isde5.org Brian Hamlin

    I am sure no one quite knows where this jump in publicly accessible, fundamentally enabling tech might go. Wearing the hat of someone concerned with the biosphere and its millennia-old natural ecosystems, I can imagine a few directions. Great article.