There are a number of ways of measuring and describing the performance of individual research outputs. These can be broadly categorised as citation metrics, altmetrics and indicators of esteem. Determining where to focus your attention will depend on your discipline, the type of research output and an understanding of the likely performance strengths of an individual output.
Find suggested performance measures for each type of research output below:
The number of citations received by an article is an indicator of the level of engagement it has achieved. To find who is citing a journal article, search citation databases such as Web of Science, Scopus, Dimensions, or Google Scholar.
This is how the citation counts will appear in each database:
Citation databases allow you to set an alert so you can be notified as soon as a new citation is added to the database. Alerts enable researchers to track where, by whom, and how often an article has been cited. Create an author alert in citation databases such as Scopus or Web of Science.
The FWCI (Field-Weighted Citation Impact) and CNCI (Category Normalised Citation Impact) both calculate the ratio of citations a publication has received to the expected world average for its subject category, publication type and publication year. These metrics are particularly useful in applications because they put citation counts in the context of the average for similar papers, demonstrating more objectively that a paper is highly cited.
In both cases, an FWCI or CNCI of 1 indicates that the publication has been cited at the world average for similar publications:
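Purely as an illustration of the ratio both metrics are built on (the function name and figures below are hypothetical; SciVal and InCites derive the expected baseline from database-wide counts, not a number you supply), the calculation can be sketched as:

```python
def normalised_citation_impact(citations: int, expected_citations: float) -> float:
    """Ratio of a publication's actual citations to the expected world
    average for publications of the same field, type and year."""
    return citations / expected_citations

# Hypothetical example: a paper with 12 citations in a field where
# similar papers average 4 citations is cited at 3x the world average.
print(normalised_citation_impact(12, 4.0))  # 3.0
print(normalised_citation_impact(4, 4.0))   # 1.0 -> cited exactly at the world average
```

A value above 1 means the paper is cited more than similar publications; below 1, less.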
Many researchers find that the FWCI and CNCI metrics for the same article are different – sometimes significantly so. This is due to differences in database coverage between Web of Science and Scopus, affecting the number of citations that are indexed in each system.
Field-Weighted Citation Impact (FWCI)
The FWCI can be found in SciVal, where it can be seen for each publication and as an average across all of a researcher's publications. FWCI can be added to most SciVal reports. In Explore, after setting up your researcher profile, locate a list of FWCIs for each paper by selecting the Summary page, clicking View List of All Publications, then selecting FWCI as the metric.
The FWCI can also be found in Scopus, where it can only be seen on individual publication records.
UWA authors can also view the FWCI for their individual publications when signed in to the UWA Research Repository:
Category Normalised Citation Impact (CNCI)
The CNCI is calculated on Web of Science data, and is accessible using InCites (benchmarking & analytics). InCites can be used to find the CNCI for each individual publication, or an average for all publications authored by an individual researcher indexed in Web of Science.
In InCites, select Analyze Researchers. Use the Filters to locate the individual researcher by ORCID, Researcher ID, or Name. If CNCI is not already visible, click Add Indicator to select it. The average CNCI across all papers will be displayed.
View and download a list of all the selected Researcher's papers, including the CNCI for each paper, by clicking the number under the 'Web of Science Documents' column.
Journal Normalized Citation Impact (JNCI)
The JNCI is similar to the CNCI but normalizes the citation rate for the journal in which the document is published. The JNCI of a single publication is the ratio of the actual number of citing items to the average citation rate of publications in the same journal, in the same year and with the same document type. The JNCI for a set of publications is the average JNCI for each publication.
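The two-step calculation described above (a per-paper ratio, then the average of those ratios for a set) can be sketched as follows. The function names and figures are hypothetical; InCites computes the journal baselines from Web of Science data.

```python
from statistics import mean

def jnci(citations: int, journal_avg_citations: float) -> float:
    """Ratio of a paper's citations to the average citation rate of papers
    in the same journal, same year and same document type."""
    return citations / journal_avg_citations

def jnci_for_set(papers: list[tuple[int, float]]) -> float:
    """JNCI for a set of publications: the mean of the individual JNCIs."""
    return mean(jnci(cites, avg) for cites, avg in papers)

# Hypothetical example: two papers, one cited at twice its journal's
# average rate (10 vs 5.0) and one at half (3 vs 6.0).
print(jnci_for_set([(10, 5.0), (3, 6.0)]))  # (2.0 + 0.5) / 2 = 1.25
```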
Percentile (Web of Science / InCites (Benchmarking & Analytics))
A percentile indicates how a paper has performed relative to other Web of Science-indexed publications in its field, year, and document type, and is therefore a normalized indicator. The percentile for a publication is determined by creating a citation frequency distribution for all the publications in the same year, subject category and of the same document type. The percentile is the percent of items cited less often than the item of interest, and therefore a higher percentile indicates better relative performance.
Average percentile
Average (mean) of the percentiles for all of your Web of Science-indexed publications.
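The definition above ("the percent of items cited less often than the item of interest") can be illustrated with a small sketch. The function name and cohort figures are hypothetical, and InCites' handling of ties and baselines is more involved than this simplified version.

```python
def citation_percentile(citations: int, cohort_citations: list[int]) -> float:
    """Percent of publications in the same year, subject category and
    document type cited less often than the item of interest.
    A higher percentile indicates better relative performance."""
    below = sum(1 for c in cohort_citations if c < citations)
    return 100 * below / len(cohort_citations)

# Hypothetical cohort of 10 papers in the same field, year and type:
cohort = [0, 1, 2, 2, 5, 8, 12, 20, 33, 50]
# A paper with 12 citations outperforms 6 of the 10 cohort papers.
print(citation_percentile(12, cohort))  # 60.0
```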
Find your article percentile
You can find the percentile for each of your Web of Science-indexed publications in Web of Science.
To find highly cited articles on a particular topic, run a keyword search in a citation database such as Scopus or Web of Science. Sort the search results by "Cited by (highest)" (Scopus) or "Sort by: Times cited: Highest to lowest" (Web of Science), so the most highly cited work appears at the top of your results.
This Scopus tutorial demonstrates the use of Scopus article metrics.
Altmetrics measure how many times a research output has been shared, mentioned or downloaded from online sources such as social media sites, blogs, mainstream media and reference managers. Find out more about Altmetrics.
The Altmetrics tools covered in this guide collect social and traditional media mentions, but there are some other sources you might like to check as well.
Policy citations count the number of times your research outputs have been mentioned in policy documents from government bodies, professional organizations, or non-government organizations (NGOs). They are a good way of demonstrating how your research has influenced policy or practice in a particular field, and they support the development of your impact narrative beyond citations.
There are a few tools you can use to track policy citations, all of which use data from Overton, the world’s largest searchable index of policy documents, guidelines, think tank publications and working papers. Overton collects data globally from over 180 countries and in many different languages.
Patents often cite research papers. If your research has been cited in a patent, this shows the connection between your work and industry or commercial activity.
Several tools collect Patent citations and information:
There is no single measure that captures the impact of books and book chapters across the board, and which measures are effective may depend on the type, discipline, content, format, publisher, etc. You may find you need to use a combination of traditional metrics, altmetrics and other measures of esteem to fully reflect the value of your book in research, teaching and other professional activities. Various recommended strategies are detailed in the subsequent tabs.
Books and book chapters are increasingly indexed in databases, so it is worth checking to see if you can find citation metrics. Resources to check:
See the Clarivate Master Book List for publishers that are indexed.
If your book or chapter only has a few citations, consider looking more closely at who has cited your work. Are they a high-profile researcher in your area? Have they reflected favourably on your findings?
Book reviews can be a good source of examples of impact in a field, especially if they’re from notable people in the field. To find more formal, ‘reputable’ reviews (meaning published reviews, rather than those written by members of the public), search in the following locations:
1. OneSearch indexes a vast array of content, including reviews. Search by book title and use the ‘Review’ filter to limit your results:
2. Established cultural magazines and publications are also good sources of non-scholarly but still reputable book reviews:
3. Quantity can also be a useful measure (e.g. "my book received over 200 reviews on Amazon"). Check book review sites such as Goodreads, Amazon and Google Books.
There are many other ways to measure the quality and impact of your books and book chapters.
Libraries holding your book or the book you contributed to in their collections is another sign of impact. Useful metrics may be:
For holdings in international libraries, search WorldCat. Look out for different editions, which may have separate records!
For holdings in Australian Libraries, search the National Library tool, Trove, using their Books and Libraries filter. Look out for different editions, which may have separate records!
Here are other ways books can be received and used that can indicate impact and quality:
Reports are rarely indexed in the large citation databases Scopus and Web of Science. The UWA Profiles and Research Repository is indexed by Google Scholar, which not only allows UWA authors to track citations within Google Scholar but also increases the discoverability of UWA publications and the opportunity for citation. We recommend that UWA authors:
Add their reports to the UWA Repository, including a PDF copy if their publisher agreement allows, or the link to its online location.
Google Scholar will only index publication records which include an abstract, and may preferentially index papers with full text available (Google Scholar Help, 2022).
If the full text of your report can be made available through the UWA Repository, the number of times the report has been downloaded will be displayed in the Repository, which can be a useful measure of engagement.
Request a Digital Object Identifier (DOI) from the Library as soon as possible so that "altmetrics" such as mentions of the report on social media can be tracked accurately using the DOI. These will appear next to your report in your Repository profile as colourful visualisations from PlumX and Altmetric.com. Request a DOI through Service Desk.
Search for your report in Capital Monitor to locate mentions in government documents and proceedings, for example references to your report in Hansard. These are excellent evidence of the impact of your work.
Non-traditional research outputs (NTROs) are scholarly works or creative endeavours that go beyond traditional academic publications like journal articles and books. NTROs are typically produced by researchers in disciplines where non-textual forms of expression and dissemination are valued, such as the arts, design, music, film, and digital media.
Measuring NTROs can be challenging because of their diverse nature and formats, and because they are typically not indexed by academic databases and therefore do not receive citation counts. Altmetrics tools have made tracking easier for some NTRO outputs, but they rely on the work having a persistent identifier (PID) that is used in all communications around that output. Read more about PIDs for publications and how to apply for them in our How to Publish and Disseminate Research guide.
Other possible measures of quality and esteem that can help when building a narrative around engagement and impact are listed below by output type.
Measures of quality, esteem and engagement include:
Measures of quality, esteem and engagement include:
Measures of esteem, quality and engagement include:
Measures of esteem, quality and engagement include:
Measures of quality, esteem and engagement include:
Measures of quality, esteem and engagement include:
Measures of quality, esteem and engagement include:
Just as for other research outputs, use of your data by other researchers can be described when discussing your research performance.
Data citation is an emerging practice. "Data citation refers to the practice of providing a reference to data in the same way as researchers routinely provide a bibliographic reference to outputs such as journal articles, reports and conference papers. Citing data is increasingly being recognised as one of the key practices leading to recognition of data as a primary research output."
Australian National Data Service (ANDS) data citation support material
Except for logos, Canva designs, AI generated images or where otherwise indicated, content in this guide is licensed under a Creative Commons Attribution-ShareAlike 4.0 International Licence.