New Scopus Article Metrics: A better way to benchmark articles | Elsevier Scopus Blog


Altmetrics were never meant as a replacement for established metrics, but rather as a way to augment mainstream bibliometrics. With the release of the new Article Metrics module on Scopus, we hope to bring altmetrics one step closer to dropping the “alt.”

The Scopus Article Metrics module was designed in accordance with 12 principles for the responsible use of metrics¹ and includes new metrics based on four alternative metrics categories² endorsed by the Snowball Metrics project:

  • Scholarly Activity — Downloads and posts in common research tools such as Mendeley and CiteULike
  • Social Activity — Mentions characterized by rapid, brief engagement on platforms used by the public, such as Twitter, Facebook and Google+
  • Scholarly Commentary — Reviews, articles and blogs by experts and scholars, such as F1000 Prime, research blogs and Wikipedia
  • Mass Media — Coverage of research output in the mass media (e.g., top-tier news outlets)

“We believe no single metric tells the whole story — you need a basket of metrics to make informed decisions,” says Michael Habib, Scopus Senior Product Manager, Elsevier. “By combining citation and alternative metrics, this new Article Metrics module will provide a comprehensive view of both the impact of and community engagement with an article.”

In addition to applying the same rigor to altmetrics that we apply to citation metrics, the new module was designed (and user-tested) to aid Scopus users, primarily scholarly researchers, in two key tasks:

  1. Determining which articles to read
  2. Gaining deep insight into how an article (possibly one’s own) compares with similar articles

Using the Article Metrics Module

On the Scopus document details (article) page, a sidebar highlights the minimal number of meaningful metrics a researcher needs to evaluate both citation impact and levels of community engagement. This can help a researcher to determine how others have received the article and, along with reading the abstract, to decide whether to read the full article.
The module displays the following (available for each article):

  • Citation count and percentile benchmark
  • Field-Weighted Citation Impact (FWCI) — see the sketch after this list
  • Mendeley readership count and benchmark
  • Count of one type of scholarly commentary (e.g., blog posts, Wikipedia)
  • Count and benchmark of one type of social activity (e.g., Twitter, Facebook)
  • Total count of additional metrics and link to see breakdown by source
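
As context for the FWCI entry above, here is a minimal Python sketch of what a field-weighted citation ratio measures. The function and inputs are illustrative assumptions, not Scopus’s production algorithm: FWCI compares an article’s citations with the expected citations of comparable publications of the same age, document type, and discipline.

```python
def fwci_like_ratio(article_citations, peer_citations):
    """Illustrative field-weighted ratio: an article's citation count
    divided by the mean citations of comparable articles (same publication
    year, document type, and discipline). 1.0 means the article is cited
    exactly at the average for its peer group; 2.0 means twice the average."""
    if not peer_citations:
        return None  # no peer group, no meaningful expectation
    expected = sum(peer_citations) / len(peer_citations)
    return article_citations / expected if expected else None

# Hypothetical peer group averaging 8 citations; our article has 12.
print(fwci_like_ratio(12, [5, 8, 11]))  # -> 1.5
```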

In addition to displaying these metrics, Scopus is introducing new percentile benchmarks³, sketched in code after the list below, to show how an article’s citations or activity compare with the averages for similar articles, taking into account:

  • Date of publication
  • Document type
  • Disciplines associated with its source
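
Here is a minimal sketch of how such a percentile benchmark can be computed, assuming a simple list of citation (or activity) counts for the peer group. The names and exact ranking rule are assumptions, though the 2,500-article minimum mirrors footnote 3 below.

```python
from bisect import bisect_left

def percentile_benchmark(article_count, peer_counts, minimum_peers=2500):
    """Share of comparable articles (matched on publication date, document
    type, and source disciplines) whose counts this article exceeds.
    Returns None when the peer set is too small to benchmark reliably."""
    if len(peer_counts) < minimum_peers:
        return None
    ranked = sorted(peer_counts)
    below = bisect_left(ranked, article_count)  # peers strictly below us
    return 100.0 * below / len(ranked)

# Hypothetical peer distribution of 3,000 similar articles.
peers = [i % 40 for i in range(3000)]
print(percentile_benchmark(35, peers))  # -> 87.5
```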

From the sidebar, clicking “View all metrics” opens the full Article Metrics module, providing an overview of all available metrics and the underlying content for further analysis and understanding.


¹ Outlined in the “Response to HEFCE’s call for evidence: independent review of the role of metrics in research assessment.”

² CiteULike, Scholarly Commentary, Social Activity, and Mass Media data provided by altmetric.com.

³ A minimum of 2,500 similar articles is required to calculate a benchmark. Citation and scholarly commentary benchmarks use an 18-month window; social activity benchmarks use a two-month window.

Release Date: July 29, 2015


Summary + Slides for 2013 SSP Panel – Measure for Measure: The role of metrics in assessing research performance

Photo from @Editage

At the 2013 SSP (Society for Scholarly Publishing) annual conference I was on a panel about research assessment and metrics. As part of my presentation I shared some findings from a large survey conducted by Elsevier’s Research and Academic Relations group (methodology in the slides).

One finding the Twitter back-channel picked up on was the surprising statistic that, in this random sample, only 1% of academics were familiar with altmetrics. I followed this up with a more optimistic statistic showing that both researchers under 35 and researchers from developing nations were more likely to view different types of potential altmetrics as useful. For this section of the talk, my primary point was that we need to focus on raising awareness among these demographics if altmetrics are to gain legitimacy in the researcher community.

The talk also covered DORA, Journal Metrics (SNIP, SJR), Snowball Metrics, and more. I summarized the key takeaways as follows:

  1. Choose methods + metrics appropriate to the level and impact type being assessed (DORA)
  2. Don’t confuse level with type (ALMs ≠ altmetrics) – Tip: Embed free Scopus Cited-By counts at the article level (see the sketch after this list)
  3. Awareness of metrics correlates with acceptance, so raising awareness matters
  4. APAC + younger researchers are open to new metrics
  5. Don’t use just one metric; promote a variety of metrics – Tip: Embed free SNIP/SJR on journal pages
  6. Choose transparent and standard methods + metrics – Tip: Learn best practices from the free Snowball Metrics ebook
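
For the embedding tip in point 2, a server-side fetch of a Cited-By count might look like the sketch below. The endpoint, parameters, and JSON path are assumptions modeled on Elsevier’s developer APIs (https://dev.elsevier.com), not details confirmed in this post; verify against the current documentation before use.

```python
import requests

API_KEY = "YOUR_ELSEVIER_API_KEY"  # placeholder credential

def scopus_cited_by(doi):
    """Assumed sketch: fetch a Scopus cited-by count for a DOI so it can
    be embedded on an article page. Endpoint and response shape are
    unverified assumptions; check https://dev.elsevier.com."""
    resp = requests.get(
        "https://api.elsevier.com/content/abstract/citation-count",
        params={"doi": doi, "apiKey": API_KEY, "httpAccept": "application/json"},
        timeout=10,
    )
    resp.raise_for_status()
    doc = resp.json()["citation-count-response"]["document"]
    return int(doc["citation-count"])

# print(scopus_cited_by("10.1000/example-doi"))  # hypothetical DOI
```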
