The digital transformation in publishing is delivering more than new reading platforms, gadgets, and distribution options: it is also producing a wealth of data publishers have never before had access to, data that can be applied to new marketing and production strategies and used to build more efficient business models.
As data becomes more central to publishing ecosystems, traditional methods of metric collection and analysis are proving insufficient. This need for new measurement techniques has given rise to an approach called “alternative metrics.” I reached out to Todd Carpenter, Executive Director of NISO, to find out what’s behind these changing data needs and how altmetric applications can benefit publishers. Carpenter will explore this topic further at TOC Frankfurt on October 9, 2012. Our interview follows.
What are alternative metrics?
Todd Carpenter: Alternative metrics — referred to as “altmetrics” — are a suite of assessment criteria and measures that are being developed, particularly in the scientific and academic communities, to assess the importance of a particular work of scholarly output in a new way.
Traditional metrics have been downloads, citations, or sales, generally based on publication-level data. For example, the Thomson Reuters Impact Factor, one of the most widely used metrics in scholarly publishing, assesses quality at the journal level by counting citations to the journal in other journal articles. As academic publishing has expanded and diversified, these traditional metrics have been increasingly criticized for issues such as their granularity (i.e., measuring at the publication level, not the item level) and their bias toward citation, which is a common practice among researchers but doesn’t reflect more applied, practical, or public use.
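To illustrate the journal-level granularity Carpenter describes, the standard two-year Impact Factor is simple arithmetic: citations received in a given year to a journal’s articles from the previous two years, divided by the number of citable articles the journal published in those two years. A minimal sketch with hypothetical numbers (not data from the interview):

```python
def impact_factor(citations_to_prior_two_years, articles_prior_two_years):
    """Journal-level metric: average citations per recent article.

    The two-year Impact Factor for year Y divides citations received in Y
    to articles from Y-1 and Y-2 by the count of citable articles the
    journal published in Y-1 and Y-2.
    """
    return citations_to_prior_two_years / articles_prior_two_years

# Hypothetical journal: 420 citations in 2012 to its 2010-2011 articles,
# of which there were 150 citable items.
print(round(impact_factor(420, 150), 2))  # 2.8
```

Note how every article in the journal inherits this single number, which is exactly the granularity complaint: the metric says nothing about any individual article’s impact.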
The scope of measures that could be considered altmetrics is actually quite broad, ranging from usage-data analysis and social media references to Google PageRank and deep statistical techniques, such as betweenness centrality and other relatedness measures. Also considered for inclusion in alternative metrics are measures of non-traditional types of content production, such as the release of scientific data sets, blog posting, or social media activity, none of which are addressed in traditional metrics.
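Of the network measures mentioned above, PageRank is the easiest to sketch: it is a power iteration over a link graph, where each document repeatedly distributes its rank to the documents it links to. The four-node graph below is hypothetical, purely for illustration:

```python
# Hypothetical link graph: edges point from a citing document to a cited one.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power-iteration sketch of PageRank over a dict-of-lists graph."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Every node keeps a small baseline rank (the "teleport" term) ...
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        # ... and each node splits its damped rank among its outgoing links.
        for node, outgoing in links.items():
            share = damping * rank[node] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # "c": every other node links to it
```

The same intuition underlies citation-based altmetrics: importance flows from important neighbors, not just from raw counts.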
What role are alternative metrics playing in scholarly/academic publishing?
Todd Carpenter: There is a lot of discussion today in the scholarly community about alternative metrics. This movement is driven in large part by two factors: the push toward greater qualitative analysis of researcher output for advancement at institutions of higher learning, and the desire of funding organizations to be more quantitative in assessing the return on their investments in science. Funders are interested in seeing the direct impact of their investment of resources in some way that goes beyond publication of project results in a prestigious journal. Publication in top-tier journals may not be the best indicator of impact, especially in niche fields. Alternative metrics also provide an opportunity to be more nuanced in assessing the impact of a researcher’s output.
Adoption of alternative metrics has been modest to this point. In large part, this is due to the lack of standards for what altmetrics mean or how they are calculated. This fact was driven home in a recent Nature editorial entitled “Count on me,” which outlined the need for agreement on better metrics for assessing scholarly impact and consensus about their derivation.
How about in commercial publishing — what kinds of applications do you envision?
Todd Carpenter: It is important to recognize that altmetrics are not of use only in the STEM publishing world. One need only recall that the original Google search algorithm was derived from a study of citation linking on the web that Brin and Page were undertaking. Some examples of how these metrics could be applied in the trade publishing arena can be seen on Amazon.com, where statistical analysis of purchase data appears as links for “People who bought this also bought …” This is an application of a type of altmetric.
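The “also bought” link Carpenter points to can be approximated with a simple co-purchase count: tally how often pairs of titles appear in the same order, then rank a title’s most frequent companions. The titles and baskets below are hypothetical, just a sketch of the idea:

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase baskets (one set of titles per order).
baskets = [
    {"Book A", "Book B", "Book C"},
    {"Book A", "Book B"},
    {"Book B", "Book C"},
    {"Book A", "Book C"},
]

# Count how often each pair of titles was bought together.
co_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        co_counts[pair] += 1

def also_bought(title, co_counts, top=3):
    """Titles most often purchased alongside the given title."""
    scores = Counter()
    for (x, y), count in co_counts.items():
        if title == x:
            scores[y] += count
        elif title == y:
            scores[x] += count
    return [t for t, _ in scores.most_common(top)]

print(also_bought("Book A", co_counts))
```

A production recommender is far more sophisticated, but even this toy version shows why purchase data counts as a metric: it measures relatedness through behavior rather than through citation.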
In fact, some commercial and trade publishers are already applying non-traditional data to their businesses in innovative ways. Last summer, Dominique Raccah, CEO of Sourcebooks, presented a very interesting session at the IDPF Digital Book 2012 meeting prior to BookExpo discussing how Sourcebooks uses data that includes large-scale sales data, user studies, web traffic and social analytics to measure and optimize their business. You can listen to a recording of her presentation on the IDPF website.
What role do you see data playing in the future of publishing?
Todd Carpenter: It has become somewhat gospel lately, and perhaps a bit of hype, that organizations should all be making data-driven decisions. Data analysis certainly has its application in our environment, but like all things, it can be taken to extremes. Just as scholars don’t want to be assessed on the basis of some single all-encompassing metric, publishers shouldn’t assess quality based on data alone. That said, there are likely applications of metrics, particularly altmetrics, that could be more widely adopted in publishing.
Publishers who actively gather and analyze data could find potential new authors, or even new fields, that are growing at the intersections between communities of interest. Frequently, the most innovative ideas (and correspondingly, books) come from people operating at the intersections of domains, such as the use of computers in biology, which led to bioinformatics and genomics. The combination of zombies with literature or history has produced a variety of very popular bestsellers, a pairing few would have predicted would mesh.
Another application is informing marketing efforts. Acting on rapidly available data analysis gives publishers an opportunity to leverage market opportunities by tweaking their messages for appropriate groups or breaking into nearby market segments. Decisions on everything from cover design to marketing text can be tested using data analysis.
Organizations will likely find that incorporating more and higher-level data analysis into their businesses will, if done appropriately, make them more productive and efficient.
This interview was lightly edited and condensed.
Be sure to join us at TOC Frankfurt on October 9, 2012. Save 20% on registration with the code TOCPartner20TSpeaker.