Surveying User Satisfaction: The NASA DAAC Experience

A blog post by Alex de Sherbinin (WDS Scientific Committee member)

It is often said—disparagingly—that America’s culture is a consumer culture. Although it may be true that America’s consumerism is problematic, not least for the planet, the flip side is how consumer culture drives a service mentality in businesses and government. The old adage that “the customer is king” does motivate US government agencies and government-supported centers, including NASA’s Distributed Active Archive Centers (DAACs), to innovate and improve services in response to user feedback and evolving user needs.

Since 2004, NASA’s Earth Science Data and Information System (ESDIS) Project [WDS Network Member] has commissioned the CFI Group to conduct an annual customer satisfaction survey of users of Earth Observing System Data and Information System (EOSDIS) data and services available through the twelve DAACs. The American Customer Satisfaction Index (ACSI) is a uniform, cross-industry measure of satisfaction with goods and services available to US consumers, including both the private and public sectors. The ACSI represents an important source of information on user satisfaction and needs that feeds into DAAC operations and evolution. This may hold some lessons for WDS data services more broadly as they seek feedback from their users, and endeavor to expand their user bases and justify funding support.

The ACSI survey invitation is sent to anyone who has registered to download data from the NASA DAACs. In the past, registration was ad hoc and each DAAC had its own system. In early 2015, ESDIS began implementing a uniform user registration system called EarthData Login, which requires users to establish a free account before they can access datasets. Accounts are associated with a given DAAC but allow access to data across all the DAACs, and everyone who registers receives an invitation to fill out the ACSI survey. Response rates vary from a few percent for most DAACs to as high as 38% for the Land Processes DAAC [WDS Regular Member], which also has the highest number of respondents at just over 2,000.

In 2015, the overall EOSDIS ACSI was 77 out of 100, better than the overall government and National ACSI scores for 2015 (64 and 74, respectively), but lower than the National Weather Service (80). The score reflects users’ overall satisfaction with each data center, measured against their expectations and against an “ideal” data center. The ACSI model provided by the CFI Group also assesses specific “drivers” of user satisfaction—customer support, product search, product selection and order, product documentation, product quality, and data delivery—and their relative importance to the overall ACSI score. This allows the DAACs to identify the areas where improvement is most needed and would have the greatest impact on overall satisfaction.

The ACSI enables the EOSDIS to assess changes from year to year. For example, from 2014 to 2015 the customer support score went from 89 to 86, with statistically significant drops in professionalism, technical knowledge, helpfulness in correcting a problem, and timeliness of response. Many such changes likely reflect shifts in the pool of survey respondents and their expectations over time, rather than actual drops in service provision. But for individual DAACs, declining scores in certain areas, combined with free-text responses to open-ended questions, can help flag issues that need attention.

For example, the ACSI scores and free-text responses to open-ended questions helped our DAAC—the Socioeconomic Data and Applications Center (SEDAC) [WDS Regular Member]—undertake a major website overhaul in 2011. From a disparate set of pages with different designs, we created a coherent site with consistent navigation. The resulting site was evaluated very favorably by Blink UX, a user experience evaluation firm that reviewed all of the DAAC websites. Survey respondents have also pointed out deficiencies in the documentation for selected datasets, and we are now reviewing our documentation guidelines to ensure that all datasets meet a minimum standard. Some users indicated difficulty in finding the latest dataset releases, so we are developing an email alert system for new data releases.

At the Alaska Satellite Facility (ASF) DAAC [WDS Regular Member], the ACSI results have been very helpful in understanding how people are using ASF DAAC data and services. The free-text responses to questions about new data, services, search capabilities, and data formats are particularly informative. For example, one user suggested that it would be useful to have quick access to Synthetic Aperture Radar data for specific regions of the world for disaster response. After the recent Nepal earthquake, a data feed was developed that notified users of any new Sentinel-1A data received at ASF DAAC for that specific area. This feed quickly provided additional data for disaster responders and researchers studying the event. Data feeds are now available for several seismically active areas of the world designated by the scientific community as Supersites.

Overall, the strong EOSDIS ACSI scores have been important in objectively demonstrating and documenting the continuing value of EOSDIS and the individual DAACs to the broad user community. The annual score is reported as one of NASA’s annual performance metrics, supporting NASA’s goal to provide results-driven management focused on optimizing value to the American public.

Although surveys can be costly and response rates low, WDS Members would do well to consider periodic surveys of their users. We find that highly motivated users do respond and provide genuinely useful suggestions, especially when they see that their feedback leads to tangible changes in their user experience. While annual surveys may be more than is needed, surveys every 2–3 years could provide your data service with valuable feedback on its content and services. And of course, none of this should supplant other mechanisms for gathering user feedback, such as help desk software (e.g., UserVoice used by SEDAC or Kayako used by NASA’s EarthData), email, and telephone helplines. Through these multiple mechanisms, our user communities can help drive significant improvements in the services offered by WDS Members and the successful use of our valuable data by growing numbers of users.