Abstract
This paper investigates the data accumulation velocity of 12 Altmetric.com data sources. The DOI created date recorded by Crossref and the altmetric event posted date tracked by Altmetric.com are combined to reflect altmetric data accumulation patterns over time and to compare the data accumulation velocity of the various data sources through three proposed indicators: Velocity Index, altmetric half-life, and altmetric time delay. Results show that altmetric data sources exhibit different data accumulation velocities. Some data sources, such as Reddit, Twitter, News, Facebook, Google+, and Blogs, accumulate data very fast within the first few days after publication. At the opposite end of the spectrum, research outputs accrue data at a relatively slow pace on sources like Policy documents, Peer review, Q&A, Wikipedia, Video, and F1000Prime. The velocity of most altmetric data sources also varies across document types, subject fields, and research topics. The document type Review is slower in receiving altmetric mentions than Article, while Editorial Material and Letter are typically faster. In general, most altmetric data sources show higher velocity values in the fields of Physical Sciences and Engineering and Life and Earth Sciences. Within each field, there also exist some research topics that attract social attention faster than others.
Introduction
“Speed” has been highlighted as one of the most important characteristics of altmetrics (Wouters and Costas 2012; Bornmann 2014). Compared to citations, which have often been criticized for their time delay in providing reliable measurement of research impact (Wang 2013), speed in the context of altmetrics relates to the idea that the impact of a given scientific output can be measured and analyzed much earlier (Priem et al. 2010; Mohammadi and Thelwall 2014). Publication delays are considered to substantially slow down the formal communication and dissemination of scientific knowledge (Amat 2008; Björk and Solomon 2013). In contrast, scholarly interactions on social media platforms are likely to happen within a very short time frame. For instance, Twitter mentions of scientific documents may occur within hours or even minutes after the documents become available online (Shuai et al. 2012; Haustein et al. 2015a).
However, because of the strong heterogeneity of altmetrics (Haustein 2016), which incorporate a wide range of metrics based on different types of data sources, it is difficult to establish a clear-cut and unified conceptual framework for the temporal analysis of all altmetrics. Each altmetric indicator, typically with unique functions and aimed at different audiences, may tell a different story about the reception of publications and show distinct patterns in varying contexts. Lin and Fenner (2013) concluded that altmetrics very likely represent very different things. From this point of view, we argue that the characteristic properties of different altmetrics, including their “speed”, should be interpreted for each metric separately.
Accumulation patterns and immediacy measurement of citations and usage metrics
In contrast to altmetric data, the accumulation patterns of citations have already been widely discussed in previous studies from several perspectives, such as their “obsolescence” (Line 1993), “ageing” (Aversa 1985; Glänzel and Schoepflin 1995), “durability” (Costas et al. 2010), or “delayed recognition” (Garfield 1980; Min et al. 2016). Citation histories, which relate to the analysis of the distribution of citations over time, have mainly been studied from the synchronous or diachronous perspectives (Stinson and Lancaster 1987). The former considers the distribution of the publication years of cited references, while the latter focuses on the distribution of received citations over time (Colavizza and Franceschet 2016; Sun et al. 2016); these are also referred to as “retrospective citations” and “prospective citations”, respectively (Glänzel 2004). These two approaches have been applied to studying the accumulation patterns of usage metric data as well. With the development of digital publishing, usage metrics have been proposed and adopted by publishers during the last decades to supplement citations in reflecting how frequently scientific outputs are used and, to some extent, in measuring their early impact (Schloegl and Gorraiz 2011). From the synchronous perspective, Kurtz et al. (2005) concluded that most studies of obsolescence found that the use of literature declines exponentially with age. The diachronous accumulation patterns of usage metrics, like views, downloads, and reads, have been investigated and often compared with citations. On the basis of page views data of Nature publications, Wang et al. (2014) explored the dynamic usage history over time and found that papers are used most frequently within a short period after publication, finding that, at the median, it takes only 7 days for papers to reach half of their total page views. Schlögl et al. (2014) reported that citations take several years to reach their peak, whereas most downloads of papers accrue quickly within the publication year itself. In a similar fashion, Moed (2005) had already found that citations and downloads show different patterns of obsolescence, with about 40% of downloads accumulating within the first 6 months after publication. More recently, Wang et al. (2016b), using the article-level “usage counts” provided by Web of Science to investigate the usage patterns of indexed papers, found that newly published papers accumulate more Web of Science usage counts than older papers.
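The usage half-life idea discussed above — the time a paper needs to accrue half of its total observed page views — can be sketched as a small computation. This is an illustrative example on hypothetical view data, not the actual method or data of Wang et al. (2014):

```python
from datetime import date

def days_to_half_usage(pub_date, daily_views):
    """Days after publication until cumulative views reach
    half of the total observed views (a per-paper usage half-life)."""
    total = sum(views for _, views in daily_views)
    cumulative = 0
    for day, views in sorted(daily_views):
        cumulative += views
        if cumulative >= total / 2:
            return (day - pub_date).days
    return None  # no views recorded

# Hypothetical view history for one paper: most views arrive early
views = [
    (date(2020, 1, 1), 100),  # publication-day spike
    (date(2020, 1, 3), 60),
    (date(2020, 1, 10), 30),
    (date(2020, 3, 1), 30),
]
print(days_to_half_usage(date(2020, 1, 1), views))  # -> 2
```

Applying the same function per paper and taking the median of the results would yield the kind of median half-usage figure reported in the literature reviewed here.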
As to the measurement of the “speed” of citations and usage metrics, several indicators have been created and applied in practice. For example, based on the time elapsed between the publication date and the date of the first citation of a paper, Schubert and Glänzel (1986) developed the indicator mean response time (MRT) to measure the citation speed of journals, understood as the properly formed average number of years between the publication of articles in a journal and the time of their first citation. To measure how quickly articles in a journal are cited, the Journal Citation Reports (JCR) calculates an indicator named Immediacy Index for each journal in each year. This indicator is defined as the average number of times an article is cited in the same year it is published (see Note 1). In addition, at the journal level, Cited Half-Life and Citing Half-Life are also calculated by JCR to measure how fast journals accumulate half of their citations and how far back that citing relationship extends (see Note 2). Analogous to the citation-based Immediacy Index and half-life, the “usage immediacy index” and “usage half-life” (Rowlands and Nicholas 2007) and the “download immediacy index” (Wan et al. 2010) were proposed to describe the life cycle of usage metrics. By analyzing usage data in the field of oncology collected from Science Direct, Schloegl and Gorraiz (2010) calculated the mean usage half-life and found that it is much shorter than the average cited half-life, observing also different obsolescence patterns between downloads and citations.
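As a simplified illustration of the journal-level indicators just described, the sketch below computes a non-interpolated Immediacy Index and cited half-life. JCR interpolates the half-life to one decimal place; this version only counts whole years, and the input figures are hypothetical:

```python
def immediacy_index(cites_to_items_of_this_year, items_published_this_year):
    """JCR-style Immediacy Index: average citations received by a
    journal's items within their own publication year."""
    return cites_to_items_of_this_year / items_published_this_year

def cited_half_life(cites_by_age):
    """Smallest number of publication years, counting back from the
    current year, that accounts for at least half of this year's
    citations to the journal. cites_by_age[0] = citations to items
    published this year, cites_by_age[1] = to last year's items, etc."""
    total = sum(cites_by_age)
    cumulative = 0
    for years_back, cites in enumerate(cites_by_age, start=1):
        cumulative += cites
        if cumulative >= total / 2:
            return years_back
    return None

print(immediacy_index(50, 200))                   # -> 0.25
print(cited_half_life([10, 30, 40, 20, 10, 5]))   # -> 3
```

The same cumulative-threshold logic underlies the usage half-life and the altmetric half-life proposed in this paper, with downloads or altmetric events substituted for citations.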
Accumulation patterns and immediacy measurement of altmetric data
Since the emergence of altmetrics, most related studies have focused on the coverage of publications across altmetric sources and their correlation with citation counts (Thelwall et al. 2013; Haustein et al. 2014; Costas et al. 2015a). Less attention has been paid to the accumulation velocity of altmetric data over time, and only a few altmetric data sources have been investigated from the perspective of their immediacy. Maflahi and Thelwall (2018) conducted a longitudinal weekly study of the Mendeley readers of articles in six library and information science journals and found that readers start to accrue from when articles first become available online and continue to build steadily over time, even for journals with large publication delays. Thelwall (2017) also found that articles attracted between 0.1 and 0.8 Mendeley readers on average in the month they first appeared in Scopus, with some variability across subject fields. The results based on PeerJ social referrals data of Wang et al. (2016a) suggested that the number of “visits” to papers from social media (Twitter and Facebook) accumulates very quickly after publication. By comparing the temporal patterns of Twitter mentions and downloads of ar…

Keeping altmetric events separate seems to be an important recommendation, given not only their fundamental differences (Haustein et al. 2016; Wouters et al. 2019) but also their time accumulation patterns as demonstrated in this study. Moreover, the pace and tempo of different altmetrics cannot be seen as equivalent and, similar to what happens with citations, these time differences need to be taken into account when considering different time windows in altmetric research.
Variations across document types
Zahedi et al. (2014) concluded that the coverage of several altmetric data sources varies across document types and subject fields. This study shows that the same type of variation also applies to the data accumulation velocity of different altmetric data sources. In terms of document types, Reviews (a document type mainly focused on retrospectively reviewing existing findings) are overall the slowest in accumulating altmetric events. A possible reason for this slower reception lies in the less innovative nature of Reviews: Review papers are less prone to present new research discoveries and tend instead to condense the state of the art in a subject field or research topic, therefore lacking the novelty component of other document types. By contrast, the research topics presented in Editorial Materials and Letters may be more likely to evoke social buzz immediately, since they cover more novel topics, debates, scientific news, etc., without using overly complicated and technical language (Haustein et al. 2015b). The thematic properties of these two document types might help them receive users’ attention more immediately, particularly on Peer review platforms, a type of altmetric data source mainly used by researchers, who are faster to take notice of controversial topics emerging in the scientific community. This finding is quite similar to the ageing patterns of citations to different document types: Editorial Materials and Letters were found more likely to be “early rise-rapid decline” papers, with most citations accumulated in a relatively short time period, while Reviews were observed to be a delayed document type with slower growth (Costas et al. 2010; Wang 2013).
Variations across scientific fields and topics
In terms of scientific fields, research outputs from Physical Sciences and Engineering and Life and Earth Sciences are more attractive to social media audiences shortly after publication, accruing altmetric events faster than those from other fields. Research outputs from Social Sciences and Humanities and from Mathematics and Computer Science are relatively slower to be disseminated on altmetric data sources, although publications in these two fields have quite different altmetric data coverage, with the former much higher than the latter (Costas et al. 2015a). Such field-related data accumulation dynamics have also been observed in the context of citations: citation ageing in social sciences and mathematics journals is similarly slower than in medical and chemistry journals (Glänzel and Schoepflin 1995), and the physical, chemical, and earth sciences, fields in which the research fronts are fast-moving, have more papers showing a rapidly declining citation pattern (Aksnes 2003). From the perspective of first-citation speed, papers in physics are the fastest to receive their first citation, followed by biological, biomedical, and chemical research, while mathematics papers show a lower first-citation speed (Abramo et al. 2011). Even though the overall accumulation patterns of citation data and most altmetric data are obviously different, they share very similar tempos across scientific fields.
Furthermore, the variations exist not only at the main subject field level but also at the research topic level. Within each subject field, different research topics show distinct velocity patterns in receiving altmetric attention, on both fast and slow sources. This points to a thematic dependency in how users follow up-to-date research outputs, just as certain research topics drive more social attention than others (Robinson-Garcia et al. 2019). Further research should therefore focus on identifying the main distinctive patterns of publications and research topics that determine their faster or slower reception across altmetric sources, and on how different observation time windows and the selection of different data sources may affect real-time assessment in altmetric practice.
Limitations
The main limitation of this study lies in the precision of Crossref’s DOI created date as a proxy for the actual publication date of research outputs. There might still be a small gap between the date on which a DOI was created and the date on which the research output actually became publicly available, which could introduce some inaccuracies into our results. Besides, as mentioned in the data section, DOI created dates might be updated due to changes of DOI status, thereby producing unreliable time intervals. One effect of these inaccuracies is that some publications may have altmetric event posted dates even earlier than their DOI created dates. Publications with such unexpected time intervals have therefore been excluded from this study to reduce the negative influence of questionable DOI created dates. Future research should focus on refining accurate methods for identifying the effective publication date of research outputs; as shown in this study, these dates have important repercussions for determining accurate time windows in altmetric research.
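The cleaning step described above — discarding records whose altmetric event posted date precedes the DOI created date — can be sketched as follows. The field names and dates are hypothetical, not the actual schema of the Crossref or Altmetric.com data:

```python
from datetime import date

def event_delay_days(doi_created, event_posted):
    """Time delay (in days) between DOI creation, used as a proxy for
    publication, and an altmetric event. Returns None to flag
    unreliable records where the event predates the DOI created date."""
    delta = (event_posted - doi_created).days
    return delta if delta >= 0 else None

# Hypothetical records: the second one has an event before DOI creation
records = [
    {"doi_created": date(2019, 5, 1), "posted": date(2019, 5, 3)},
    {"doi_created": date(2019, 5, 1), "posted": date(2019, 4, 20)},  # excluded
]

delays = []
for r in records:
    d = event_delay_days(r["doi_created"], r["posted"])
    if d is not None:
        delays.append(d)

print(delays)  # -> [2]
```

Only the non-negative intervals feed into the velocity indicators, so questionable DOI created dates are kept out of the half-life and time delay computations.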
Conclusions
Several conclusions can be derived from this study. First, not all altmetrics are fast, and they do not accumulate at the same speed: there is a fundamental differentiation between fast sources (e.g. Reddit, Twitter, News, Facebook, Google+, and Blogs) and slow sources (e.g. Policy documents, Q&A, Peer review, Wikipedia, Video, and F1000Prime). Another important conclusion is that the accumulation velocity of different kinds of altmetric data varies across document types, subject fields, and research topics. The velocity of most altmetric data for Review papers is lower than that for Articles, while Editorial Material and Letter are generally the fastest document types in terms of altmetric reception. From the perspective of scientific fields, the velocity ranking of different data sources changes across subject fields, with most altmetric data sources showing higher velocity values in Physical Sciences and Engineering and Life and Earth Sciences, and lower values in Social Sciences and Humanities and Mathematics and Computer Science. Finally, with regard to individual research topics, substantial differences in the velocity of reception of altmetric events have been identified, even among topics within the same broader field. Such topical differences in velocity suggest that it is worth studying the underlying reasons (e.g. hotness, controversies, scientific debates, media coverage, etc.) why some topics within the same research area receive social (media) attention much faster than others.
Notes
See more information about Immediacy Index at: https://clarivate.com/webofsciencegroup/blog/know-your-metrics-immediacy-index/.
See more information about Cited and Citing Half-Lives at: https://clarivate.com/webofsciencegroup/blog/a-closer-look-at-cited-and-citing-half-lives/.
This is the date on which a given altmetric event (e.g. a tweet, a News mention, a Blog citation, etc.) was posted online or published (for policy documents).
Extracted from personal communication with Euan Adie from Altmetric.com.
See more information about CWTS classification system at: https://www.leidenranking.com/information/fields.
References
Abramo, G., Cicero, T., & D’Angelo, C. A. (2011). Assessing the varying level of impact measurement accuracy as a function of the citation window length. Journal of Informetrics,5(4), 659–667.
Aksnes, D. W. (2003). Characteristics of highly cited papers. Research Evaluation,12(3), 159–170.
Alperin, J. P. (2015). Geographic variation in social media metrics: An analysis of Latin American journal articles. Aslib Journal of Information Management,67(3), 289–304.
Amat, C. (2008). Editorial and publication delay of papers submitted to 14 selected food research journals. Influence of online posting. Scientometrics,74(3), 379–389.
Aversa, E. (1985). Citation patterns of highly cited papers and their relationship to literature aging: A study of the working literature. Scientometrics,7(3–6), 383–389.
Björk, B. C., & Solomon, D. (2013). The publishing delay in scholarly peer-reviewed journals. Journal of Informetrics,7(4), 914–923.
Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics,8(4), 895–903.
Colavizza, G., & Franceschet, M. (2016). Clustering citation histories in the physical review. Journal of Informetrics,10(4), 1037–1051.
Costas, R., van Leeuwen, T. N., & van Raan, A. F. (2010). Is scientific literature subject to a ‘Sell-By-Date’? A general methodology to analyze the ‘durability’ of scientific documents. Journal of the American Society for Information Science and Technology,61(2), 329–339.
Costas, R., Zahedi, Z., & Wouters, P. (2015a). Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. Journal of the Association for Information Science and Technology,66(10), 2003–2019.
Costas, R., Zahedi, Z., & Wouters, P. (2015b). The thematic orientation of publications mentioned on social media. Aslib Journal of Information Management,67(3), 260–288.
Darling, E. S., Shiffman, D., Côté, I. M., & Drew, J. A. (2013). The role of Twitter in the life cycle of a scientific publication. PeerJ PrePrints,1, e16v1.
Didegah, F., & Thelwall, M. (2018). Co-saved, co-tweeted, and co-cited networks. Journal of the Association for Information Science and Technology,69(8), 959–973.
Fang, Z., & Costas, R. (2018). Studying the posts accumulation patterns of Altmetric.com data sources. In The 2018 altmetrics workshop (Altmetrics18), London, UK. Retrieved from http://altmetrics.org/wp-content/uploads/2018/04/altmetrics18_paper_5_Fang.pdf.
Garfield, E. (1980). Premature discovery or delayed recognition-Why. Current Contents,21, 5–10.
Glänzel, W. (2004). Towards a model for diachronous and synchronous citation analyses. Scientometrics,60(3), 511–522.
Glänzel, W., & Schoepflin, U. (1995). A bibliometric study on ageing and reception processes of scientific literature. Journal of Information Science,21(1), 37–53.
Haustein, S. (2016). Grand challenges in altmetrics: Heterogeneity, data quality and dependencies. Scientometrics,108(1), 413–423.
Haustein, S. (2019). Scholarly Twitter metrics. In W. Glänzel, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Springer handbook of science and technology indicators (pp. 729–760). Heidelberg: Springer. Retrieved from http://arxiv.org/abs/1806.02201.
Haustein, S., Bowman, T. D., & Costas, R. (2015a). When is an article actually published? An analysis of online availability, publication, and indexation dates. In Proceedings of the 15th international conference on scientometrics and informetrics (ISSI), (pp. 1170–1179), Istanbul, Turkey. Retrieved from https://arxiv.org/abs/1505.00796.
Haustein, S., Bowman, T. D., & Costas, R. (2016). Interpreting “altmetrics”: viewing acts on social media through the lens of citation and social theories. In C. R. Sugimoto (Ed.), Theories of informetrics and scholarly communication: A Festschrift in honor of Blaise Cronin (pp. 372–405). Berlin: De Gruyter Mouton. Retrieved from https://arxiv.org/abs/1502.05701.
Haustein, S., Costas, R., & Larivière, V. (2015b). Characterizing social media metrics of scholarly papers: The effect of document properties and collaboration patterns. PLoS ONE,10(3), e0120495.
Haustein, S., Peters, I., Bar-Ilan, J., Priem, J., Shema, H., & Terliesner, J. (2014). Coverage and adoption of altmetrics sources in the bibliometric community. Scientometrics,101(2), 1145–1163.
Kurtz, M. J., Eichhorn, G., Accomazzi, A., Grant, C., Demleitner, M., Murray, S. S., et al. (2005). The bibliometric properties of article readership information. Journal of the American Society for Information Science and Technology,56(2), 111–128.
Lin, J., & Fenner, M. (2013). Altmetrics in evolution: Defining and redefining the ontology of article-level metrics. Information Standards Quarterly,25(2), 20–26.
Line, M. B. (1993). Changes in the use of literature with time—Obsolescence revisited. Library Trends,41(4), 665–683.
Maflahi, N., & Thelwall, M. (2018). How quickly do publications get read? The evolution of Mendeley reader counts for new articles. Journal of the Association for Information Science and Technology,69(1), 158–167.
Min, C., Sun, J., Pei, L., & Ding, Y. (2016). Measuring delayed recognition for papers: Uneven weighted summation and total citations. Journal of Informetrics,10(4), 1153–1165.
Moed, H. F. (2005). Statistical relationships between downloads and citations at the level of individual documents within a single journal. Journal of the American Society for Information Science and Technology,56(10), 1088–1097.
Mohammadi, E., & Thelwall, M. (2014). Mendeley readership altmetrics for the social sciences and humanities: Research evaluation and knowledge flows. Journal of the Association for Information Science and Technology,65(8), 1627–1638.
Ortega, J. L. (2018). The life cycle of altmetric impact: A longitudinal study of six metrics from PlumX. Journal of Informetrics,12(3), 579–589.
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from http://altmetrics.org/manifesto/. Accessed 26 Nov 2019.
Robinson-Garcia, N., Arroyo-Machado, W., & Torres-Salinas, D. (2019). Mapping social media attention in microbiology: Identifying main topics and actors. FEMS Microbiology Letters, 366(7), fnz075.
Rowlands, I., & Nicholas, D. (2007). The missing link: Journal usage metrics. Aslib Proceedings,59(3), 222–228.
Schloegl, C., & Gorraiz, J. (2010). Comparison of citation and usage indicators: The case of oncology journals. Scientometrics,82(3), 567–580.
Schloegl, C., & Gorraiz, J. (2011). Global usage versus global citation metrics: The case of pharmacology journals. Journal of the American Society for Information Science and Technology,62(1), 161–170.
Schlögl, C., Gorraiz, J., Gumpenberger, C., Jack, K., & Kraker, P. (2014). Comparison of downloads, citations and readership data for two information systems journals. Scientometrics,101(2), 1113–1128.
Schubert, A., & Glänzel, W. (1986). Mean response time—A new indicator of journal citation speed with application to physics journals. Czechoslovak Journal of Physics B,36(1), 121–125.
Shuai, X., Pepe, A., & Bollen, J. (2012). How the scientific community reacts to newly submitted preprints: Article downloads, twitter mentions, and citations. PLoS ONE,7(11), e47523.
Stinson, E. R., & Lancaster, F. W. (1987). Synchronous versus diachronous methods in the measurement of obsolescence by citation studies. Journal of Information Science,13(2), 65–74.
Sun, J., Min, C., & Li, J. (2016). A vector for measuring obsolescence of scientific articles. Scientometrics,107(2), 745–757.
Thelwall, M. (2017). Are Mendeley reader counts high enough for research evaluations when articles are published? Aslib Journal of Information Management,69(2), 174–183.
Thelwall, M., Haustein, S., Larivière, V., & Sugimoto, C. R. (2013). Do altmetrics work? Twitter and ten other social web services. PLoS ONE,8(5), e64841.
Waltman, L., & Van Eck, N. J. (2012). A new methodology for constructing a publication-level classification system of science. Journal of the American Society for Information Science and Technology,63(12), 2378–2392.
Wan, J. K., Hua, P. H., Rousseau, R., & Sun, X. K. (2010). The journal download immediacy index (DII): Experiences using a Chinese full-text database. Scientometrics,82(3), 555–566.
Wang, J. (2013). Citation time window choice for research impact evaluation. Scientometrics,94(3), 851–872.
Wang, X., Fang, Z., & Guo, X. (2016a). Tracking the digital footprints to scholarly articles from social media. Scientometrics,109(2), 1365–1376.
Wang, X., Fang, Z., & Sun, X. (2016b). Usage patterns of scholarly articles on Web of Science: A study on Web of Science usage count. Scientometrics,109(2), 917–926.
Wang, X., Mao, W., Xu, S., & Zhang, C. (2014). Usage history of scientific literature: Nature metrics and metrics of Nature publications. Scientometrics,98(3), 1923–1933.
Wouters, P., & Costas, R. (2012). Users, narcissism and control-tracking the impact of scholarly publications in the 21st century. Utrecht: SURFfoundation. Retrieved from http://research-acumen.eu/wp-content/uploads/Users-narcissism-and-control.pdf.
Wouters, P., Zahedi, Z., & Costas, R. (2019). Social media metrics for new research evaluation. In W. Glänzel, H. F. Moed, U. Schmoch, & M. Thelwall (Eds.), Springer Handbook of science and technology indicators (pp. 687–713). Heidelberg: Springer. Retrieved from http://arxiv.org/abs/1806.10541.
Yu, H., Xu, S., Xiao, T., Hemminger, B. M., & Yang, S. (2017). Global science discussed in local altmetrics: Weibo and its comparison with Twitter. Journal of Informetrics, 11(2), 466–482.
Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics,101(2), 1491–1513.
Acknowledgements
Zhichao Fang is financially supported by the China Scholarship Council (Grant No. 201706060201). Rodrigo Costas is partially funded by the South African DST-NRF Centre of Excellence in Scientometrics and Science, Technology and Innovation Policy (SciSTIP). The authors thank Prof. Paul Wouters (Leiden University) for valuable suggestions, thank the anonymous reviewer for helpful comments, and thank Altmetric.com for providing the altmetric data of scientific publications.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Fang, Z., Costas, R. Studying the accumulation velocity of altmetric data tracked by Altmetric.com. Scientometrics 123, 1077–1101 (2020). https://doi.org/10.1007/s11192-020-03405-9