Navigating Changes in Research Assessment
Why research assessment is changing
Research assessment is evolving as funders and institutions move beyond traditional metrics to reflect a broader understanding of impact.
Globally, agencies like Canada’s Tri-Agency, the US National Institutes of Health (NIH), the European Research Council, and the UK’s Wellcome Trust are shifting away from journal-based metrics and placing less emphasis on citation counts. Evaluation now includes mentorship, policy impact, and knowledge mobilization.
MRU researchers must engage with these evolving standards to remain competitive.
What this means for MRU researchers
Traditional metrics still have a place, but researchers need to demonstrate impact in new ways. This includes:
✅ Better showcasing the real-world relevance of your work
- How your research influences policy, communities, and students
- Recognition for mentorship, interdisciplinary collaboration, and public engagement
✅ Knowing how to use qualitative and quantitative assessment techniques to describe your work
- Qualitative (narrative-based): Policy influence, teaching contributions, open science, real-world impact
- Quantitative (metrics-based): Citations, h-index, altmetrics, downloads, media mentions
Many researchers at MRU support the principles outlined in the San Francisco Declaration on Research Assessment (DORA) as part of our commitment to fair and transparent research evaluation. However, research assessment is shifting worldwide regardless of institutional endorsement of DORA. The framework provides a clear, widely recognized foundation for evolving expectations.
Library support & resources
The MRU Library is here to support researchers as they navigate these changes. We can help with:
✔ Understanding new assessment models
✔ Understanding expectations for narrative CVs
✔ Finding responsible metrics for your field
✔ Showcasing research impact beyond citations
We are available to provide guidance and resources for your research assessment conversations.
📌 For support, contact your subject librarian.
Quantitative research assessment
Quantitative metrics
Traditionally, the metrics used for research assessment focus on citations to research outputs. These metrics offer several advantages, such as ease of collecting analytics data and the simplicity of reporting with numbers and statistics. The metrics described in this section reflect the most popular and well-recognized options, but the list is not exhaustive. There are many other metrics sources, tools, and models that researchers and research evaluators can use, and each has benefits and limitations.
Responsible Use of Metrics
Be aware that while these metrics are all based on quantitative data, each metric or score described in this section may be affected by several factors, and each uses a different calculation, timeframe, and analysis. Take care when comparing two journals or two authors, whether within a single scale or between scales, as direct comparisons do not account for all of these variables.
Further, as research expands and more formats are adopted, it is important to consider quantitative metrics in the context of other factors. The San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto (see References) offer principles for doing so responsibly.
Publication metrics
Publication-level metrics are used to quantify the impact of an individual publication: a specific article, a book or chapter, a report or whitepaper, a creative production, or almost any other single published artefact. Three key metrics are described in this section:
- citations to the publication
- usage metrics (downloads/views)
- holdings (for books)
Citations
Where citations help: The most-used metric for demonstrating impact; a simplified measure of your publication's research impact; easy to find and understand.
Citation counts are literally a tally of how many times a particular article or other publication has been cited by other researchers in their publications and reports. A high citation count suggests that the publication has been impactful, informing others' research or generating important conversations in the discipline.
- “My article was published two years ago and has already been cited in 88 other research papers.”
Where to find your article's citation metrics: Google Scholar, Scopus, Web of Science, Dimensions.ai, OpenAlex
Usage Metrics
Where usage metrics help: Useful for articulating interest in a publication beyond just citation counts; sometimes easy to find; easy to understand.
Usage is typically measured using views and downloads data for a publication. While often treated as less impactful than citations, these metrics help demonstrate interest in your publication even when it hasn’t been formally cited, by showing that it is being accessed by others.
- “On the journal website I can see that my article PDF has been downloaded 421 times. I can even see a graph showing the frequency of downloads for the past 12 months.”
Where to find your article's usage metrics: Abstract/description page on publisher’s website; Author submission dashboard for the publication; Contact the publisher.
Holdings Metrics (for books)
Where holdings metrics help: For disciplines where book publishing (monographs) is the dominant form of research dissemination, highlighting how widely held your works are is a valid option that goes beyond citations.
Holdings is a simple count of how many unique libraries have copies of your book, usually found in a union catalogue like WorldCat. (When the example for this guide was captured, the book was available from 695 libraries.)
Author-level metrics
Author-level metrics are used to quantitatively report on engagement with your research over time by tracking and combining the number of citations to your publications.
Where author metrics help: a simple summary of your cumulative research impact over your career; particularly useful if you have a lot of publications; easy to find and understand.
These metrics aim to highlight both the quantity of your publications and the impact of your research as measured by citations in a single number. Since it takes time to generate multiple publications, and even more time to generate citations to those, these metrics are more accurate as your career progresses. Consider this example:
- citations: The total number of citations to all of the author's publications found and tracked by Google Scholar.
- h-index: This author has an all-time score of 58, which means that 58 of this author's publications (quantity) have received 58 or more citations each (impact) since being published.
- i10-index: This author has 190 works (quantity) that have been cited at least 10 times (impact). Unique to Google Scholar, this index is useful for highly productive authors at varied career stages.
Note that Google Scholar also provides a metric for recent publications only, in this case the last 5 years. This recency score is useful for reporting current impact and can help normalize for different levels of productivity at different career stages.
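To make the two definitions concrete, here is a minimal sketch of how the h-index and i10-index can be computed from a list of per-publication citation counts (the counts below are invented for illustration):

```python
def h_index(citations):
    """Largest h such that at least h publications have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the rank-th most-cited paper still has >= rank citations
        else:
            break
    return h

def i10_index(citations):
    """Number of publications with at least 10 citations (Google Scholar's i10)."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical author with six publications
counts = [120, 45, 12, 9, 3, 0]
print(h_index(counts))    # 4 -> four papers each have at least 4 citations
print(i10_index(counts))  # 3 -> three papers have at least 10 citations
```

As the example shows, the h-index rewards a balance of quantity and impact: one blockbuster paper cannot raise it on its own.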
Where to find author metrics:
- Google Scholar profile
- Scopus
- Web of Science
- Publish or Perish (software).
Alternative metrics (or altmetrics)
Altmetrics primarily use the social web (mentions in news and media sources, social media, blogs, shared citation management libraries, and policy documents) to analyze the attention your work receives. They provide a complementary way to assess the impact of your scholarship and can be examined at the publication, author, or journal level.
Altmetrics can be especially helpful for non-traditional outputs, such as datasets, software, reports, and public scholarship.
Altmetrics:
- are data tracked from discussions happening online
- can help measure how research is having an impact outside of the academy
- offer a greater sense of immediacy
- complement established metrics, but cannot replace them
- are often not considered strong indicators of impact on their own
Advantages/Disadvantages
Altmetrics are generated more quickly than traditional metrics, such as the impact factor, which permits the impact of your publications to be assessed much sooner. They provide broader and deeper insights into the impact of scholarly articles, researchers, universities, and other entities that lie outside the scope of traditional metrics. They can reveal how an article or piece of research affects diverse groups such as practitioners, educators, and the general public.
However, it can be difficult to use altmetrics comparatively between different disciplines, or even in the same discipline. Some disciplines are more active than others online, and some may favor particular social media tools that are used less often in other areas of scholarship. Fluctuations in the popularity of social media tools can reduce the reliability of altmetrics scores.
Where to find alternative metrics: within LibrarySearch, Scopus, Web of Science, Altmetric.com, PlumX, publisher website
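For researchers comfortable with a little scripting, attention data can often be pulled programmatically. The sketch below assumes Altmetric.com's free v1 details endpoint; the URL pattern, field names, and the example DOI are assumptions to verify against their current documentation:

```python
import requests

# Hedged sketch: querying Altmetric.com's public details endpoint for one DOI.
# Endpoint pattern and response fields are assumptions based on Altmetric's
# free v1 API; check https://api.altmetric.com for current terms and schema.
doi = "10.1038/nature12373"  # hypothetical example DOI
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)

if resp.status_code == 200:
    data = resp.json()
    # "score" is Altmetric's weighted attention score for this output
    print(data.get("score"), data.get("cited_by_posts_count"))
else:
    print("No altmetric record found for this DOI")  # 404 = not tracked
```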
Journal-level Metrics
Journal-level metrics, sometimes called journal rankings, evaluate the importance of a particular journal based on citations to that journal and are usually calculated annually. Rankings are most meaningful when compared against other journals in the same discipline. Five key journal metrics are described in this section:
- Journal Impact Factor
- CiteScore
- SCImago Journal Rank
- Google Scholar Publication Metrics
- Eigenfactor Score
Journal Impact Factor (JIF)
Where JIF helps: the most common metric for journal impact; measures citations to a particular journal over the previous 2 years; easy to understand, but hard to calculate.
A journal’s JIF will change over time as both the number of citations and the window of measurement shift. A high JIF signals a high-impact or influential journal. JIF data is proprietary to Clarivate and accessible via subscription to Journal Citation Reports in Web of Science.
- “I published in a top-5 journal in my discipline. It has an impact factor of 8.9.”
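As a rough illustration of the two-year window (all numbers here are invented, and the real calculation also depends on Clarivate's definition of "citable items"):

```python
# Hedged sketch of the JIF idea: citations received this year to a journal's
# items from the previous two years, divided by the number of citable items
# it published in those two years.
citations_2024_to_2022_2023 = 1780  # hypothetical citation count
citable_items_2022_2023 = 200       # hypothetical articles + reviews

jif = citations_2024_to_2022_2023 / citable_items_2022_2023
print(round(jif, 1))  # 8.9, matching the example quote above
```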
CiteScore
Where CiteScore helps: measures citations to a particular journal over a 4-year window; easy to understand, but hard to calculate.
Like JIF, a journal’s CiteScore will change over time as the citations and the window of measurement shift, and a higher score indicates a more influential journal. CiteScore data is proprietary to Elsevier via Scopus, but is free to access.
- “I published in a top-5 journal in my discipline. It has a CiteScore of 8.9.”
SCImago Journal Rank (SJR)
Where SJR helps: measures average citations to a particular journal over the previous 3 years, weighted by the prestige of the citing journals; not as common as JIF but just as easy to understand.
Like JIF, a journal's SJR will change over time based on the citation count and timeframe being analysed, and a higher SJR indicates higher impact. SCImago rankings are free to access but are based on Elsevier's proprietary Scopus data.
Eigenfactor Score
Where Eigenfactor score helps: less commonly used, but the calculation uses the previous 5 years; scaled for more impactful citation sources; not as easy to understand.
Yet another metric used to highlight a journal's perceived importance, the Eigenfactor score accounts for where citations come from: citations appearing in highly cited journals are weighted more heavily, while citations from less eminent journals count for less. This score uses data from Journal Citation Reports but is free to access and use.
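The weighting idea behind Eigenfactor (and, similarly, SJR) is a form of eigenvector centrality, the same family of techniques as Google's PageRank. The toy sketch below runs a bare power iteration on an invented three-journal citation matrix; it is only meant to show why citations from influential journals end up counting more, not to reproduce Clarivate's full algorithm (which also normalizes by article counts and damps self-citation):

```python
import numpy as np

# Toy citation matrix for three hypothetical journals (invented numbers).
# C[i, j] = share of journal j's outgoing citations that point to journal i.
C = np.array([
    [0.0, 0.7, 0.5],
    [0.8, 0.0, 0.5],
    [0.2, 0.3, 0.0],
])

score = np.full(3, 1 / 3)   # start with equal influence for every journal
for _ in range(100):        # power iteration: repeatedly pass influence
    score = C @ score       # along citation links...
    score /= score.sum()    # ...and renormalize
print(np.round(score, 3))   # journals cited by influential journals score higher
```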
Google Scholar Publication Metrics
Where these metrics help: uses Google Scholar's index instead of proprietary data, so there is a larger dataset to pull from; reports the h5-index and h5-median for each venue; easy to understand.
Google Scholar Publication Metrics only include journals that have published 100+ articles in the last 5 years. The data comes from a combination of open repositories, websites, publisher information, and aggregators. It is free to use and access.
Where to find journal rankings and impact factors
An Introduction to Journal Impact Factor™ (JIF™) https://clarivate.com/academia-government/scientific-and-academic-research/research-funding-analytics/journal-citation-reports/publishers/first-time-publishers/
Qualitative research assessment: Highlighting research impact
Why it matters: The benefits of going beyond numbers
Global Shift Toward Qualitative Research Assessment
Incorporating qualitative impacts into research assessments provides a richer understanding of your work’s significance. Increasingly, funding agencies, such as the Tri-Agency (SSHRC, NSERC, CIHR), are adopting frameworks inspired by initiatives such as DORA (Declaration on Research Assessment) and the Leiden Manifesto, which advocate for a shift away from a narrow focus on quantitative metrics. This approach highlights diverse contributions and real-world relevance, shifting the emphasis from quantity to qualitative impact.
Why is this change happening?
Funders & institutions are shifting away from narrow metrics (e.g., journal impact factors, citation counts) to recognize a broader range of research contributions.
Agencies like CIHR, NSERC, SSHRC, UKRI, and the European Research Council are leading this shift to ensure fairer, more meaningful assessment.
This approach values mentorship, policy influence, community engagement, and interdisciplinary work, which don’t always show up in traditional metrics.
Key benefits of this more content-rich approach include:
- Better demonstration of the real-world relevance of your work.
- Alignment with funding agencies' evolving priorities, such as equity, diversity, and societal impact.
- A more comprehensive view of your career, capturing contributions beyond publications.
Qualitative assessment recognizes the diverse activities that shape research, teaching, and engagement. It helps showcase the full impact of your work on policy, students, communities, and research culture, moving you beyond traditional metrics like citation counts that don’t tell the whole story of your research.
Why should I care?
- Tells the full story of your research impact—not just how often you’re cited
- Strengthens grant and promotion applications with richer evidence of your contributions
- Recognizes all the ways your work makes a difference—mentorship, policy influence, community partnerships, and more
- Helps meet evolving expectations from funders and publishers
Why Should Researchers Adapt to These Assessment Methods, Including Narrative CVs?
More than metrics: Funders & institutions are moving beyond journal impact factors & citation counts to recognize mentorship, policy influence, & community engagement.
Global trend: Agencies like CIHR, NSERC, SSHRC, UKRI, & ERC are shifting toward holistic assessment.
Visibility of contributions: Many researchers already do impactful work that isn’t reflected in traditional CVs.
Will This Be More Work?
✔ Yes, but it’s worth it.
The goal is not to write more, but to make contributions more visible.
It levels the playing field, particularly for interdisciplinary, early-career, or non-traditional researchers.
📌 How to Get Started:
✅ Think beyond citations – Who benefits from your work? How?
✅ Use real examples – Mentorship, collaboration, policy influence, knowledge mobilization.
✅ Check funder guidelines – Priorities vary.
✅ Start small – Revise one section at a time.
The narrative CV: Showcasing your work with context
A narrative CV goes beyond publication lists and metrics to highlight the quality, relevance, and real-world impact of your work. This format provides meaningful context for your contributions, reflecting individual strengths, values, and broader research impact.
📢 Tri-Agencies (CIHR, NSERC, SSHRC) will require this format by 2025.
Tri-Agency Narrative CV: Structure & Key Components
The narrative CV format consists of three sections, each designed to capture your broader research contributions.
Personal Statement
✔ Outlines your professional philosophy, research focus, and career goals.
✔ Highlights the themes and motivations behind your work.
✔ Provides a contextual overview of your approach to research, mentorship, and leadership.
Think of this as your introduction—why does your work matter?
Most Significant Contributions & Experiences
✔ Focuses on quality over quantity—showcasing relevance, impact, and key achievements.
✔ Includes research outputs, policy impact, leadership roles, interdisciplinary collaborations, & knowledge mobilization efforts.
✔ Uses contextualized descriptions (e.g., how research influenced policy, supported communities, or advanced the field).
This section answers: What are your most impactful contributions?
Supervisory & Mentorship Activities
✔ Highlights mentorship, supervision, and team leadership.
✔ Covers student training, interdisciplinary collaboration, research group leadership, & capacity building.
✔ Recognizes contributions to EDI (Equity, Diversity, and Inclusion) in training and mentorship.
This section answers: How have you supported and developed others?
➡️ For examples of how to structure your contributions and describe impact effectively, see the 'In Practice' tab.
Practical applications:
- Grant applications: Required for Tri-Agency competitions starting in 2025; see the Tri-Agency announcement of October 24, 2024.
- Career advancement: Strengthens tenure, promotion, and award applications.
- Leadership and mentorship roles: Ideal for showcasing contributions beyond research outputs.
Tri-Agency Narrative CV: What you need to know
- What? SSHRC, NSERC, and CIHR (the Tri-Agency) are replacing the Canadian Common CV (CCV) with a narrative CV
- When? Rolling out in 2025, starting with SSHRC Impact Awards
- Why? Focuses on quality, impact, and diverse contributions instead of lists of publications, highlighting societal impact, leadership, and knowledge mobilization
- How? Three sections: Personal statement, Significant contributions, Mentorship. Max 5 pages (English), 6 pages (French)
If you're looking for the bigger picture on why this shift is happening, check out 'What and Why.'
📌 Need help? Many institutions (including MRU Library!) offer support on narrative CVs, research impact, & responsible metrics.
What is qualitative assessment? Exploring broader research contributions
Beyond the Narrative CV—Demonstrating Your Impact Daily
Researchers can document and highlight their work in ways that align with evolving assessment practices:
Track public engagement – Media coverage, public talks, podcast interviews, blog posts
Document policy influence – Citations in government reports, white papers, or advisory roles
Recognize open science contributions – Preprints, data sharing, open-access publications
Showcase knowledge mobilization – Industry collaborations, public outreach, patents, software development
Shape Your Research Impact Strategically
To proactively position your work in this evolving landscape, consider:
Expanding dissemination – Share findings in multiple formats (e.g., reports, infographics, videos)
Engaging in interdisciplinary collaborations – Strengthen connections across disciplines
Using institutional repositories & altmetrics – Track engagement beyond citation counts
Maintaining research profiles – Ensure ORCID, Google Scholar, and institutional profiles are up to date
🔗 [Link to MRU research profiles and repositories.]
Framing your research contributions to demonstrate impact
The table below suggests ways to identify your different contributions and frame them, for example on your professional website, in line with responsible-metrics principles such as the Leiden Manifesto.
| Category | Description | Example |
|---|---|---|
| Clearly define your contribution using the CRediT Taxonomy | Make sure your contributions are recognized by using a clear description. A growing list of publishers use the CRediT Taxonomy, which defines 14 contributor roles, to ensure transparency and accountability. | Data Curation: managing and preserving data. Visualization: creating figures, graphs, and models. Writing – Original Draft: preparing the initial manuscript. |
| Showcase research impact | Go beyond citations—highlight policy influence, innovative ideas, tools, and methodologies. | "Research findings on equitable AI systems contributed to national guidelines for ethical technology deployment." |
| Recognize your contributions to teaching | Demonstrate how your research shapes student learning and mentorship. | "Co-designed a climate change module with local organizations, now part of undergraduate courses." |
| Highlight community engagement | Document partnerships, stakeholder collaboration, and research mobilization beyond academia. | "Collaborated with Calgary’s immigrant business owners to create a toolkit for underrepresented entrepreneurs." |
| Demonstrate knowledge mobilization | Show how your research reaches wider audiences through events, reports, and media. | "Created interactive webinars on mental health literacy, reaching over 1,000 community service providers." |
| Advance your professional growth | Show leadership, mentorship, interdisciplinary collaboration, and efforts to foster an inclusive research culture. | "Supervised undergraduate students, with several advancing to graduate programs." |
| Engage with open scholarship | Support research transparency and accessibility through open access and data sharing. | "Created a public data repository, facilitating further research and community engagement." |
Advice for preparing a narrative CV
- Seek institutional support – Many institutions have research facilitators or grant support teams that review narrative CVs before submission. Take advantage of this resource if available.
- Expect a different process – Unlike traditional CVs, a narrative CV requires reflection on how to present your contributions effectively. Be prepared to reorganize and refine your content.
- Focus on impact – Instead of just listing achievements, provide concrete examples of your contributions and their real-world effects. The more tangible and specific, the better.
- Plan for extra time – Crafting a strong narrative takes effort. Factor in additional time to shape your story, highlight key contributions, and ensure clarity.
- Group activities by impact area, not just by year
- Frame service in terms of leadership, innovation & institutional change
- Use active, outcome-driven language
- Tailor explanations to match audience expectations
📌 Tip: Have a colleague review your CV for clarity and impact before submission.
📌 Example:
❌ "Chair, COPPUL Collections Community (2021–2023)"
✅ "Led national discussions on licensing & open-access strategies, shaping resource-sharing models for 22 academic libraries."
📌 Example:
❌ "Supervised five graduate students."
✅ "Mentored five graduate students, supporting their research in open science methodologies and guiding two to successful Tri-Agency grant applications."
✔ Instead of listing publications, grants, & service roles, describe their impact.
✔ What counts?
Research that changes policy or practice
Mentorship & training contributions
Community engagement & public outreach
Open science initiatives, software, datasets, & non-traditional outputs
📌 Next Steps: Check funding competition guidelines & templates, review FAQs, and tailor your CV accordingly.
Next Steps
📌 Want to ensure your contributions are recognized?
🔗 Learn more about CRediT: credit.niso.org
Where to learn more: Tips, resources, and examples
For those seeking deeper insight into qualitative research assessment and narrative CVs, these resources offer guidance, templates, and best practices:
General Resources
- Research Culture: Changing how we evaluate research is difficult, but not impossible (Hatch & Curry, 2020, eLife).
- DORA (Declaration on Research Assessment): Advocates for reducing reliance on metrics like journal impact factor and promotes qualitative approaches.
- Leiden Manifesto for Research Metrics: Offers 10 principles for responsible research evaluation, emphasizing qualitative context and broader contributions.
- CoARA (Coalition for Advancing Research Assessment): An international collective of organisations committed to reforming the methods and processes by which research, researchers, and research organisations are evaluated
Narrative CV
- Tri-Agency CV template
- CIHR: Tri-Agency CV instructions
- NSERC: see the Discovery Horizons (pilot) instructions for helpful information
- SSHRC: Tri-Agency CV instructions
- CIHR FAQs; SSHRC FAQs
- How to write a narrative CV. Henville, L. (2024). University Affairs. A concise guide to crafting a compelling narrative CV, with tips on structuring key sections, showcasing research impact, and using clear, evidence-based storytelling.
- University of Winnipeg’s guidance on narrative CVs
- Western University’s library guide on structuring your CV effectively.
- Crafting a Compelling Narrative CV. Slides from OOR Workshop, Concordia University, October 2023. This workshop covered the growing use of narrative CVs in funding applications, emphasizing storytelling, research impact beyond citations, and evidence-based contributions. It provided strategies for writing persuasive narratives and preparing for the transition to this format.
Qualitative Impact & Knowledge Mobilization
- Research Impact Canada: A national network focused on supporting researchers in enhancing the impact of their work within communities. They offer tools, resources, training, and insights into best practices.
- Knowledge Mobilization (University of Victoria Libraries)
- Knowledge Impact Assessment Toolkit (University of Calgary)
- Guidance recently released by the University of Calgary on writing a most significant contributions statement, with template
Storytelling and Digital Tools
- Storytelling for impact - Free online course module for ethical and inclusive practices relating to collecting stories from diverse individuals
- Impact story toolkit - Guide to sharing evidence of change after a specific intervention has been put in place (CIVICUS)
- Digital storytelling for social impact. (The Rockefeller Foundation)
TL;DR – Narrative CVs & Research Assessment
The ERC is adopting narrative CVs in grant applications. Researcher Anja Leist shares insights as an applicant and reviewer. She found the narrative CV provided more context: it highlights policy impact, open science, and interdisciplinary work beyond traditional metrics.
Challenges: Language barriers, self-promotion differences, and evolving review criteria may affect fairness.
https://erc.europa.eu/news-events/magazine-article/researchers-experience-narrative-CV
References and Further reading
Thelwall, M., Kousha, K., Makita, M., et al. (2023). In which fields do higher impact journals publish higher quality articles? Scientometrics, 128, 3915–3933. https://doi.org/10.1007/s11192-023-04735-0
This study explores the relationship between journal impact factors and article quality across different disciplines. Using citation-based analyses, the authors examine whether higher-ranked journals consistently publish higher-quality research or if the correlation varies by field. The findings highlight disciplinary differences in research dissemination and assessment.
A Pledge for Community-Driven Evaluation of Research Quality and Impact
https://blog.scienceopen.com/2023/09/a-pledge-for-community-driven-evaluation-of-research-quality-and-impact/
This initiative advocates for a more inclusive and researcher-led approach to assessing research quality and impact. It emphasizes transparency, diverse contributions, and the responsible use of both qualitative and quantitative indicators. The pledge encourages the academic community to move beyond journal-based metrics and adopt evaluation practices that reflect the full range of scholarly contributions.
Informing Research Choices: Indicators and Judgment
Published by the Council of Canadian Academies, this report examines how indicators and assessment methods are used in natural sciences and engineering. It provides recommendations for more effective and responsible research evaluation in these fields.
The Leiden Manifesto
A 2015 article presenting ten guiding principles to prevent the misuse of research metrics. It emphasizes the need for transparency, contextual understanding, and expert judgment in evaluating scholarly work.
The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management
This UK-based review explores the role of quantitative indicators in evaluating and overseeing research. It offers 20 recommendations on how metrics can be applied thoughtfully in research assessment.
San Francisco Declaration on Research Assessment (DORA)
Developed by the American Society for Cell Biology (ASCB) and scholarly journal editors, DORA calls for improvements in how scientific research outputs are assessed. It challenges the reliance on journal-based metrics and advocates for fairer evaluation practices. The University of Calgary endorsed DORA in 2021.
CoARA (Coalition for Advancing Research Assessment) – A global coalition of institutions committed to implementing fairer research evaluation systems, often aligning with DORA and Leiden principles.
Narrative CVs. Western University (2025). Retrieved from https://uwo.ca/research/services/kex/narrative_cvs.html
- Western University’s narrative CV guide explains how researchers can showcase impact beyond publications.