
Research assessment

Tools, guidance, and examples to help you navigate changing research assessment practices and highlight the full value of your work.

Navigating changes in research assessment

Mount Royal University signs DORA

MRU signed the San Francisco Declaration on Research Assessment (DORA) in 2025. Learn more about this commitment and what it means for researchers at MRU.

DORA is a global initiative supporting improvements to how research is assessed—encouraging fair, transparent, and inclusive approaches. MRU’s decision to sign reflects longstanding institutional values: recognizing diverse research contributions, prioritizing real-world impact, and fostering scholarly integrity.


Why it matters at MRU

Research assessment is shifting across Canada and internationally. Funders, institutions, and scholarly communities are expanding what “impact” means—ensuring recognition for high-impact practices such as mentorship, collaboration, public engagement, and knowledge sharing.

At MRU, these principles are not new. Our Research and Scholarship Plan 2024-2030 already reflects many of the values championed by DORA.


Related frameworks & global initiatives

MRU’s commitment to the values outlined in DORA aligns with broader movements toward responsible research assessment, including the initiatives listed below.

Tip: Use these frameworks to support and inspire continuous improvement, building on the research practices already in place.

What MRU researchers need to know about changing research assessment

Research assessment is changing, and so are the ways you’re expected to show your impact. Citation counts still count—but they’re just one piece of the puzzle. Get ahead by showing the full story of your work in diverse and meaningful ways.

What's expanding in research assessment:  

  • Relevance. Highlight the impact of your work on students, communities, or policy
  • Contributions. Showcase mentorship, teaching, interdisciplinary work, and collaboration
  • Reach. Emphasize public engagement, open scholarship, and knowledge mobilization

Key takeaway: Use a mix of quantitative (citations, altmetrics, downloads) and qualitative (narrative, peer recognition, impact stories) indicators to ensure a comprehensive demonstration of your impact.

Comparing qualitative and quantitative indicators of impact

Approach: Qualitative (narrative or expert judgment)
  • Examples: Mentorship, peer review, policy influence, community engagement
  • Strengths: Contextual, nuanced, peer- or community-recognized
  • Considerations: Less standardized, needs interpretation; opportunities for bias or lack of objectivity

Approach: Quantitative (numeric indicators)
  • Examples: Citation counts, h-index, altmetrics
  • Strengths: Relatively comparable, scalable, perceived objectivity
  • Considerations: Doesn’t capture all forms of impact; can be manipulated

Resource spotlight - Practical tools for impact: Explore the Metrics Toolkit for practical ideas on how to demonstrate research impact. It offers a comprehensive list of metrics, along with their sources, strengths, limitations, and when to use them.  

DORA and aligned research assessment values emphasize:

  • Assessing research on its own merits, not solely on where it’s published, and avoiding over-reliance on journal-based metrics such as the Journal Impact Factor (JIF)
  • Recognizing diverse contributions, including mentorship, collaboration, and community engagement
  • Using both qualitative and quantitative indicators thoughtfully, selecting those that best reflect the type of impact, and explaining their relevance (rebalancing the use of qualitative and quantitative metrics)
  • Supporting openness and accessibility through digital tools and open publishing practices

How can the Library support research assessment?

MRU librarians can help you explore responsible assessment practices and connect you with tools and discipline-relevant examples.

  • Book a consultation for support with academic profiles, metrics, or research visibility
  • Get guidance on narrative CVs and qualitative indicators
  • Learn about tools and examples tailored to your field

Common questions about the changes in research assessment

Global changes in research assessment

How is research assessment evolving globally?

Research assessment is evolving globally, with greater emphasis on openness, collaboration, interdisciplinarity, and societal impact. In Canada, the Tri-Agency funders (CIHR, NSERC, SSHRC) are aligning with responsible research assessment principles. Whether or not an institution signs declarations such as DORA, researchers will encounter these shifts in funding applications, hiring, and collaborations.


Balancing metrics with qualitative measures

How do updated research assessment approaches affect the use of quantitative metrics like journal impact factor?

Quantitative metrics still have a role in research assessment, but the emphasis is on using them effectively and in context, rather than as the sole measure of quality. Increasingly, funders and institutions recommend combining metrics (citations, altmetrics, downloads) with qualitative indicators such as mentorship, policy influence, and interdisciplinary collaboration.

The library can help researchers find resources on the evolving use of metrics in research assessment and connect them with best practices in their discipline.

Resource spotlight:  Responsible Use of Metrics: The Metric Tide Report (UK)

Can we continue to ensure research quality and rigor even while adapting research assessment criteria?

Yes — the shift in assessment practices does not eliminate evaluation criteria, rather it broadens them. Various national and international research organizations, including the Tri-Agency funders, emphasize that quality and rigor are best assessed through peer review, methodological soundness, reproducibility, and impact and contribution to the field, rather than relying solely on citation counts or journal prestige.

Resource spotlight: Experiences with the narrative CV - a researcher shares. Researcher Anja Leist shares her experience, noting benefits such as added context and impact, and challenges including language, self-promotion styles, and evolving expectations.

Publishing and career impact

Do evolving approaches mean loss of choice over where we publish? 

No, assessment practices such as those described in the DORA recommendations do not restrict researchers' publication choices. Instead, they discourage using journal prestige as a proxy for research quality, advocating for assessment based on the actual content and impact of research.

How can I explore different ways to share my work beyond top impact-rated journals? 

While journal visibility still matters, there are growing opportunities to broaden the reach of your research. Funders and institutions increasingly value these alternative approaches, especially when documented in CVs and applications. Alternative approaches include:

  • Open and early-access platforms:
    • Preprint servers or institutional repositories to share early or open versions of your work
  • Academic presentations and outputs:
    • Conference presentations, posters, or proceedings
    • Policy briefs or technical reports for decision-makers or practitioners
  • Public and community engagement:
    • Blog posts, op-eds, and podcasts to share insights with the public
    • Social media platforms like X (Twitter), LinkedIn, or Mastodon to engage with your field
  • Visibility and discoverability tools:
    • Researcher profiles (e.g., ORCID, Google Scholar) to make your work discoverable
    • Community-based or practitioner-oriented publications relevant to your area of impact

Tip: Matching your dissemination strategy to your intended audience or type of impact is key.

How can I ensure my contributions to research projects are properly recognized?

Researchers often contribute in many ways beyond authorship, and funders and institutions are recognizing the need for clearer attribution of contributions. The CRediT taxonomy provides a structured way to describe specific roles in research projects, from conceptualization and methodology to supervision and data curation, ensuring more accurate credit for contributions.

Researcher workload and transitioning to new models

How do I find time to adapt my approach to research assessment? How do I get started if this feels overwhelming?

Like most things in research, adapting your assessment approach takes some work—but it doesn’t have to happen all at once. A useful first step is to see what funders are already asking for. For example, the NSERC Discovery Horizons pilot provides clear directions for narrative CVs.

If you know what kinds of evidence or stories funders look for—like meaningful impacts or contributions—you can start tracking those as your research progresses. Taking a bit of time now can make it easier to prepare when you need it.

Start small. Draft one section, seek peer feedback, or join a writing group if available.

Explore additional readings in the resource list below for more on responsible metrics and research assessment.

What if I find it hard to promote myself or write about my contributions?

You’re not alone. It is common to feel uncomfortable with self-promotion, especially when relying more on qualitative measures to describe your impact is new to you, when translating complex work into plain language, or when working in an additional language. It’s okay if this doesn’t come naturally: these skills can be developed over time, and support is available. Start by focusing on describing your work clearly and authentically, and think of it as telling the story of your impact rather than self-promotion or bragging.


Could these approaches put early-career researchers or those with shorter research careers at a disadvantage?

Many traditional metrics favour well-established researchers with long publication records. Narrative CVs and alternative assessments allow early-career researchers to highlight a broader range of contributions, including:

  • Mentoring
  • Interdisciplinary work
  • Public engagement

These contributions may not always be reflected in citation counts, but responsible assessment practices provide alternative ways to showcase impact.

Do I still need my Canadian Common CV (CCV)?

Yes—for now. Use your CCV as a personal archive and for award or hiring purposes.

The Canadian Common CV (CCV) is the current standard CV system used by Canada’s research funding agencies. While a transition to a narrative-style CV (TCV) is underway, the CCV remains required for many competitions. The Tri-Council indicates that researchers will be given notice before it is phased out, and guidance will be provided on how to export their information.

Considerations across the disciplines

Are there alternative ways to assess research quality in STEM fields?

Yes. Many scientific organizations and funders are adopting broader, field-appropriate models that combine quantitative indicators with qualitative evidence. These approaches may include narrative CVs, documentation of open science practices, mentorship, societal impact, and more. The goal is to better reflect the full value and context of research contributions.

Similar shifts toward balanced and context-sensitive assessment are also underway in other disciplines.

How does research assessment account for outputs beyond journal articles in the humanities and social sciences — and in other fields?

Research in the humanities and social sciences often takes the form of books, exhibitions, performances, or public scholarship — and diverse outputs are increasingly common across disciplines. To ensure fair evaluation, hiring committees, peer reviewers, and adjudicators need to recognize these different forms of scholarship and understand their significance in disciplinary contexts. Evolving assessment practices emphasize quality, impact, and relevance over format, helping ensure researchers in all fields are evaluated on the full scope of their contributions.

Does broader research assessment mean disciplinary context matters less?

Not at all. Different disciplines have distinct norms for publishing, timelines, and ways of demonstrating impact. For example, citation patterns in STEM may differ significantly from those in the humanities or social sciences. Responsible assessment encourages evaluation that reflects these differences rather than applying uniform criteria across disciplines.

How is team-based or collaborative research recognized in assessment?

Many research projects today are collaborative, involving co-investigators, interdisciplinary teams, and shared outputs. Traditional metrics often emphasize individual outputs, which may overlook the value of collaborative contributions. Responsible research assessment encourages clear documentation of each researcher's role, for example through tools like the CRediT taxonomy or narrative CVs that describe collaborative work and team impact.

 

 

Content note: AI tools were used for this section as part of an editing process to improve clarity, eliminate duplication, correct grammar and spelling, and tighten the writing.

Qualitative Assessment

Purpose

This Library resource gives broad guidance on narrative CVs and qualitative approaches to research impact.

  • For official Tri-Agency CV requirements and support: see the ORSCE Narrative CV page. 
  • For practical help: this page focuses on how to prepare, write, and showcase your contributions beyond metrics, both for Tri-Agency funding and for broader contexts like promotion, awards, and hiring.

What is a narrative CV?

A structured way to describe your research and career path using context, outcomes, and contributions instead of lists of metrics.

Emerging evidence suggests they help reviewers assess mentorship, outreach, and policy influence alongside traditional research outputs, providing richer context for your contributions.

They are increasingly used in:

  • Tri-Agency funding applications (gradual rollout, 2025 onward)
  • Tenure and promotion
  • Leadership and award applications

Key takeaway: Researchers should start preparing now.


Preparing now

Start collecting evidence:
  • Keep a running document of significant contributions, mentorship roles, EDI initiatives, policy or community impact.
  • Track non-traditional outputs (datasets, reports, exhibitions).
Practice narrative writing:
  • Write in plain language for a non-specialist audience.
  • Focus on outcomes: What changed because of your work?
Strengthen your visibility:
  • Keep ORCID and other researcher profiles updated.
  • Save examples of impact (downloads, citations in policy documents, media, collaborations).

Tri-Agency narrative CV

The Tri-Council (SSHRC, NSERC, CIHR) is replacing the Canadian Common CV (CCV) with a narrative format, starting with the SSHRC Impact Awards in 2025.

Format:

  • Personal statement – your research focus, motivations, and career path.
  • Most significant contributions & experiences – relevance, influence, leadership.
  • Supervisory & mentorship activities – mentorship, collaboration, team leadership, EDI efforts.

For MRU-specific details, guidance, and updates: see ORSCE’s Narrative CV page.


How narrative CVs vary by institution & purpose

While Tri-Council guidelines provide a standardized format, narrative CVs vary based on the funder, institution, or purpose. Some key differences include:

  • Structure:  Some funders emphasize mentorship, while others focus on interdisciplinary work.
  • Length: Tri-Council allows 5 pages (English), 6 pages (French), while other institutions may set different limits.
  • Emphasis: Some CVs highlight societal impact, while others prioritize leadership or research excellence.

Tip: Reuse strong content across applications, tailoring your CV to the specific guidelines of the funder or institution you’re applying to.


MRU support

The Research Office (ORSCE) leads Tri-Agency narrative CV guidance, including official requirements. The Library complements this by helping you find examples, tools, and resources that show how to describe contributions and impact across different contexts.

Qualitative metrics: What they are and why they matter

Non-numerical indicators that provide context, meaning, and nuance to research contributions.

Why it matters: Beyond numbers in research assessment

Research assessment is evolving globally. Funders—including Canada’s Tri-Council (SSHRC, NSERC, CIHR)—are moving beyond citation counts to give more recognition to qualitative contributions beyond peer-reviewed publications.

Why broaden our use of qualitative metrics?

They support stronger applications and offer a fuller picture of research contributions by recognizing:

  • Real-world relevance: How research influences policy, students, communities, and research culture
  • Alignment with funding priorities: Many funders prioritize equity, diversity, and societal impact
  • Application readiness: Helps communicate diverse contributions clearly in grant, tenure, and promotion materials
  • Broader contribution types: Captures mentorship, outreach, and public engagement

Challenges & considerations

  • Subjectivity & bias: Peer review and narrative assessments can be influenced by unconscious bias.
  • Lack of standardization: No benchmarks like citation counts.
  • Time-intensive: Evaluating mentorship, collaboration, and policy influence requires effort.
  • Difficult to quantify impact: Some contributions are harder to measure numerically
  • Field-specific norms: STEM fields often prioritize citation-based measures, while humanities/social sciences emphasize qualitative outputs.

In practice: Demonstrating research impact

Why this matters

Impact is more than citations. Showing how your work influences people, policy, or practice strengthens applications and CVs.

Track your impact

Document outcomes as they happen:

  • Contributions: Use the CRediT Taxonomy to describe your role in collaborative work.
  • Beyond citations: Note how your work shaped policy, practice, tools, or public understanding
  • Teaching: Show how research enriches learning or mentorship
  • Community: Record partnerships and uptake of research outside academia
  • Feedback: Save testimonials and informal evidence
  • Knowledge mobilization: Track presentations, reports, briefs, media, open outputs
  • Profiles: Keep ORCID and researcher profiles updated
  • Open scholarship: Share work through open access and data

Tip: The Library can help you choose tools to track downloads, shares, or altmetrics.


Write strong impact statements

Impact statements show how your work made a difference:

  • Describe: what you did → how you did it → why it mattered
  • Use plain language for non-specialists
  • Group by theme (leadership, mentoring, community) rather than date
  • Highlight leadership, collaboration, and EDI contributions
Examples (focus on outcomes):
  • Led a national open access project that shaped resource-sharing for 22 libraries
  • Mentored five students; two published peer-reviewed papers
  • Partnered with community groups to deliver 1,000+ mental health workshops, informing policy
  • Co-authored a policy brief adopted in regional disaster planning

Get started with your narrative CV

Practical tips
  • Frame it like a job interview: connect contributions to your goals
  • Align with funder guidelines and structure
  • Track key contributions early
  • Go beyond citations: add why it matters, your role, and results
  • Include context for career gaps (e.g., caregiving)
  • Revise in small sections over time
  • Seek feedback: peer review sharpens narratives

See the Library's CRediT FAQ for examples.

Tip: Generative AI tools can help with structure (especially for multilingual researchers), but use them cautiously. Reviewers want your voice; explain any AI use if allowed.

Next step: Start a running document now. Record key contributions and context as they happen - future CVs and grant applications will be easier.


MRU support

The Library can help you explore tools and resources to document, describe, and share your impact. We don’t evaluate or write on your behalf but can connect you with guidance to strengthen your CVs, applications, or research profiles.

Peer review: Value and evolving practices

Why peer review matters

Peer review remains central to research: it improves quality, validates work, and supports research integrity. New models are expanding how it is done.

The roles of peer review
  • Development: Strengthens research through expert feedback
  • Validation: Confirms quality and relevance
  • Integrity: Promotes transparency and fairness
  • Innovation: Includes new approaches like open and post-publication review

Tip: When publishing outside traditional journals, choose venues that include peer review; these carry weight in grants and promotion.


Peer review as evidence of impact

  • Peer-reviewed datasets, software, and reports strengthen your portfolio
  • Supports recognition of diverse research outputs beyond articles
  • Aligns with evolving DORA and Tri-Agency assessment practices

Changing models

Peer review is expanding. In addition to traditional journal review, many other forms of peer review contribute to research quality. Some of these approaches are well-established (e.g., book and policy review), while others are newer, open, or evolving models. Staying aware of these options can help you choose the right venue and get recognition for diverse outputs.

Examples of peer review beyond journal articles
  • Policy reports: Reviewed by experts and stakeholders (e.g., Canada in a Changing Climate Reports)
  • Books: Reviewed by editorial boards (e.g., McGill-Queen's University Press)
  • Guidelines: Expert panels and advisory bodies (e.g., PHAC, WHO)
  • Indigenous knowledge: Community validation led by Indigenous scholars and Elders
  • Preprints: Community comments and open peer review (e.g., medRxiv, OSF)
  • Data and software: Reviewed for structure and reproducibility (e.g., FRDR datasets)
  • Post-publication review: Open feedback after publication (e.g., F1000Research, eLife)

MRU support

The Library can help you explore resources on peer review practices in your field — including emerging models, options for non-traditional outputs, and where to look for guidance if you want to publish in newer formats.

Quantitative assessment

Quantitative metrics: What they are and why they matter

Numerical indicators used to measure research activity, productivity, and influence.

Why it matters: The role of citation-based measures

Citation counts and journal metrics have traditionally been the primary tools for assessing research impact. While useful for tracking influence, these measures often exclude other valuable contributions to research and society.

Why use quantitative metrics?

These measures offer several advantages:

  • Easy to report: Simple numerical indicators
  • Automated tracking:  Large datasets make it scalable
  • Long-term impact measurement: Shows citation trends over time
  • Supports applications: Used in tenure, promotion, and funding decisions
  • Informs publishing decisions: Helps identify high-visibility venues

The metrics described here reflect the most widely recognized options, but many other sources, tools, and models exist, each with benefits and limitations.

Limitations of quantitative metrics

  • Not a direct measure of quality: High citations don't always indicate strong research
  • Can be manipulated: Strategic use of citations or controlled timing of certain publications can skew metrics
  • Excludes non-traditional outputs: Books, performances, exhibitions, and policy contributions can be overlooked
  • Proprietary data: Many metrics are controlled by publishers, limiting transparency and inclusion

Responsible use of metrics

To ensure fair and meaningful research assessment, quantitative metrics should be used alongside qualitative indicators. Funders and institutions now encourage:

  • Balanced assessment: Combining citation-based and qualitative measures for a fuller picture of impact
  • Contextual evaluation: Recognizing different practices, especially across disciplines, and avoiding direct comparisons
  • Peer review & expert judgment: Ensuring scholarly rigor remains central across disciplines.

Numbers alone aren’t enough; qualitative indicators are essential for a fuller picture of research contributions. As the saying often attributed to Albert Einstein goes:

"Not everything that can be counted counts, and not everything that counts can be counted."

Journal metrics: Assessing where research is published

Metrics that reflect the visibility, selectivity, or influence of journals where research appears.

Journal-level metrics help assess the influence of journals, not individual researchers or articles. Some journal metrics are behind paywalls (e.g., JCR), while others (e.g., Scimago, Google Scholar or information posted to a journal's website) are free.

Journal metrics help with:  

  • Choosing where to publish
  • Highlighting journal prestige in tenure or grant applications

Tip: Journal impact changes over time; don’t assume the “usual” venue is always the best fit.


Use journal metrics responsibly

  • Stay discipline aware: Only compare journals within the same field
  • Use multiple metrics: Each captures different aspects of influence
  • Think beyond numbers: Consider peer review model, open access status, and audience reach

Key Takeaway: Use journal metrics to evaluate journals, not to assess individual research quality.


Where to find journal metrics

  • Journal Impact Factor (JIF): Subscription only (Clarivate); MRU does not subscribe, but some journals list their JIF on their homepage. The most recognized metric; a 2-year citation ratio (illustrated in the short sketch below), widely used in tenure and grant contexts.
  • Eigenfactor & Article Influence Score (AIS): Free; based on Clarivate/JCR data. Eigenfactor counts 5-year citations, weighted by source quality; AIS measures average article influence over 5 years.
  • CiteScore: Free; based on Elsevier/Scopus data. 4-year citation average; broader journal coverage than JIF.
  • Scimago Journal Rank (SJR): Free. Weighted 3-year citations; adjusts for discipline norms; highlights field-specific trends.
  • Source Normalized Impact per Paper (SNIP): Free; based on Elsevier/Scopus data. Citation impact normalized by subject field; helpful across disciplines.
  • Google Scholar Metrics: Free; less curated. 5-year citation count; broad coverage including grey literature.

Tip: Google Scholar includes a wide range of sources; not all are peer-reviewed. Always check journal quality.
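For readers who want to see the arithmetic behind a journal-level ratio like the JIF, here is a minimal, illustrative Python sketch. The numbers are hypothetical and are not drawn from any real journal:

    # Illustrative only: the idea behind a 2-year citation ratio such as the JIF.
    def two_year_citation_ratio(citations_this_year, citable_items_prev_two_years):
        """Citations received this year to items published in the previous two
        years, divided by the number of citable items published in those years."""
        return citations_this_year / citable_items_prev_two_years

    # Hypothetical journal: 480 citations in 2024 to its 2022-2023 articles,
    # which numbered 200 citable items, gives a ratio of 2.4.
    print(two_year_citation_ratio(480, 200))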

Journal metric examples (screenshots): CiteScore, Scimago Journal Rank (SJR), Google Scholar Publication Metrics, Eigenfactor Score.


Find journal metrics on publisher sites

Many journals share metrics on their own websites, typically under About the Journal, Submission Guidelines, or Editorial Policies. Within these pages you may find:

  • Journal Impact Factor, CiteScore, or SJR
  • Index inclusion (e.g., DOAJ, PubMed, Web of Science)
  • Peer review model or acceptance rates
  • Publisher or subject affiliations

Example: Canadian Journal of Bioethics publicly shares its Scopus metrics and indexing status on its website.

Author-level metrics: A quantitative snapshot of impact

Author-level metrics like h-index or citation counts that track cumulative scholarly influence.

These metrics usually help track how often your work has been cited, providing a cumulative snapshot of your publishing impact. 

Types of author-level metrics:

  • Citations: Total citations across your work. Found in Google Scholar, Scopus, and Web of Science
  • h-index: Reflects productivity and impact. An h-index of 15 means the author has 15 papers with 15+ citations each (see the short sketch after this list)
  • i10-index (Google Scholar): Counts how many of your works have at least 10 citations. Useful early/mid-career
  • Recent 5-Year Metrics (Google Scholar): Citations and h-index from the past 5 years. Useful for showing recent activity or accounting for career breaks
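To make the h-index and i10-index definitions above concrete, here is a minimal Python sketch with hypothetical citation counts; real values come from platforms such as Google Scholar or Scopus:

    def h_index(citation_counts):
        # h-index: the largest h such that at least h works have h or more citations each.
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    def i10_index(citation_counts):
        # i10-index: the number of works with at least 10 citations.
        return sum(1 for cites in citation_counts if cites >= 10)

    # Hypothetical citation counts for one author's publications
    counts = [42, 18, 12, 9, 7, 3, 1]
    print(h_index(counts))    # 5: five works have at least 5 citations each
    print(i10_index(counts))  # 3: three works have 10 or more citations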

How author-level metrics help

They track how much you’ve published against how often your work has been cited to offer a picture of your research influence.

Tip: Combine these with qualitative context to establish impact.


Where to find author metrics

  • Google Scholar Profile: Free. Tracks citations, h-index, and i10-index; widely used but less curated.
  • Scopus: Subscription only (MRU has access). Provides curated citation data and author metrics like the h-index.
  • Web of Science: Subscription only (MRU does not have access; limited public access). Offers curated citation tracking and metrics.
  • Publish or Perish (software): Free. Analyzes citations using data from Google Scholar or other sources; useful for custom metrics.

Example author metrics snapshot: Citations, h-index, and i10-index

Key takeaway: These metrics are cumulative and vary by platform. Use them as one part of a broader story about your contributions.

Tip: Keep your scholarly profiles, such as ORCID and Google Scholar, up to date; this makes it easier for others to find your work.

Article & output metrics: Impact of individual works

Metrics tied to specific works, such as citations, downloads, or library holdings, that show their reach and engagement

Publication-level metrics quantify the impact of an individual publication, whether that’s an article, book, report, or creative work. These metrics show the reach or influence of specific research findings, rather than just their source.

Example use cases:

  • In tenure or award applications: “My article was published two years ago and has already been cited in 97 other research papers.”
  • To demonstrate interest before formal citations: “My article PDF has been downloaded 421 times this year.”
  • If your book is in demand: “Nearly 700 libraries worldwide own copies of my book.”

Key takeaway: Use publication metrics to highlight the specific impact of your work, not just where it was published.

Tip: Screenshots of citation or download dashboards can be a helpful visual in tenure or grant applications.


Citations

Citations count how often your work is referenced by others. This traditional metric reflects how your research informs or influences the scholarly conversation. Citations help because they are widely used, easily understood, and show strong evidence of scholarly uptake. You can find citation counts in sources such as Google Scholar, Scopus, Web of Science, and Crossref.

Example: Citation metrics retrieved from Scopus

 

  • Citations: Overall count of citations reported by the platform.
  • Citation Indexes: This confirms all citations from indexed sources.
  • Scopus: Citations from Scopus, a citation database.
  • CrossRef: Citation count is lower, as citations are limited to items with DOIs registered in CrossRef (one way to retrieve this count yourself is sketched below).

Tip: Numbers may vary across platforms, due to differences in coverage, indexing, and citation-tracking methods.
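If you want to check a Crossref-based citation count yourself, the public Crossref REST API returns an "is-referenced-by-count" field for each DOI. A minimal sketch is below; the DOI shown is a placeholder, and the count reflects only citations between DOI-registered works:

    import json
    import urllib.request

    def crossref_citation_count(doi):
        """Look up how many DOI-registered works cite the given DOI,
        using the public Crossref REST API."""
        url = f"https://api.crossref.org/works/{doi}"
        with urllib.request.urlopen(url) as response:
            record = json.load(response)
        return record["message"]["is-referenced-by-count"]

    # Replace with your own DOI; counts will differ from Scopus or Google Scholar.
    print(crossref_citation_count("10.1000/example-doi"))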


Usage metrics

Usage metrics show how often your publication is accessed (e.g., views, downloads), even when it’s not cited. Usage metrics can help demonstrate early or public interest, and are useful for newer or non-traditional outputs. They can be found in the following types of places:

  • Publisher’s website, usually the abstract or download page
  • Author dashboard or from the publisher
  • Institutional repository, including MRU's repository (MROAR), where views and downloads are displayed on item pages.
Example: Download trends
  • Monthly download activity for a specific item (e.g., article or report)
  • Useful for: Demonstrating interest over time, or supporting impact in the absence of citations.
Example: Usage and citation snapshot from a publisher or repository page
  • PDF views: Number of views
  • Crossref citations: Number of DOI-based citations. Note: Only includes citations between works with registered DOIs, so the total may be lower than calculated by other metrics.
Example: MROAR (Mount Royal Open Access Repository)

  • Views/Downloads: Usage within the MRU repository only. The same item may have additional views/downloads from other platforms like journal websites or databases.
  • Citations via Crossref: This means two other publications with DOIs have cited this work, as tracked by Crossref.

Note: Citation counts may be lower than in Google Scholar or Scopus, as Crossref only tracks citations between DOI-registered items.

Key takeaway: Usage is not a substitute for citations, but it shows visibility and engagement.


Holdings metrics (for books)

Holdings reflect how many unique libraries own your book. This metric is primarily useful in disciplines where book publishing (monographs) is the dominant form of research dissemination, and it reinforces credibility and reach. Holdings counts can be found in library catalogues and union catalogues.

Tip: Holdings are especially impactful when you can also show positive reviews or that your work is included in course syllabi.

Alternative metrics (altmetrics): Capturing broader engagement

Alternative indicators that track how research is shared, saved, or discussed online, often beyond academia.

Altmetrics track how research is shared, discussed, or mentioned outside traditional academic publishing. The term refers broadly to attention-based research metrics (not to be confused with the subscription product by the same name).

They capture attention from:

  • Social media (e.g., X/Twitter, Bluesky, Mastodon, Facebook)
  • News media & blogs
  • Policy documents
  • Wikipedia and forums
  • Citation managers like Mendeley or Zotero

Alternative metrics can help:

  • Demonstrate interest when citations are not yet available or relevant
  • Show engagement from practitioners, educators, or the public
  • Strengthen impact narratives, alongside citations and usage stats

 


Limitations of alternative metrics

  • Not a quality measure: Visibility ≠ credibility
  • Disciplinary bias: Fields vary in online activity
  • Tool-dependent: Results shift with social media trends 
  • Less established: Often secondary in evaluation processes

Key takeaway: Altmetrics track attention, not quality. Use them as a complement, not a replacement, for traditional metrics.

Tip: Especially useful for grey literature, public reports, creative works, or newer publications that haven't had time to accrue citations.


Where to find alternative metrics

  • LibrarySearch: Free. Look for the donut badge on select items.
  • Scopus: Subscription (MRU has access). Includes limited altmetric data.
  • Web of Science: Subscription (not at MRU).
  • Altmetric.com: Subscription (not at MRU); free bookmarklet tool. Widely used by journals & funders.
  • PlumX Metrics: Subscription (not at MRU). Often embedded on publisher platforms.
  • Dimensions.ai: Subscription (not at MRU). Includes citations + altmetrics.

Key takeaway: Commercial altmetrics tools collect and display data—each with different coverage, algorithms, and visualizations.

Tip: Although altmetrics can be tracked for free, paid tools offer dashboards, alerts, and more detail.


How to track altmetrics without a subscription

You can collect alternative metrics manually by checking:

  • Social media: Search for your article title or DOI on X/Twitter, LinkedIn, Reddit, Mastodon, Bluesky
  • News: Use Google News or set up Google Alerts for your work.
  • Policy documents: Search government/NGO sites using site: in Google (e.g., site:canada.ca); a small sketch after the tips below shows one way to build these searches
  • Reference managers: See if your work is saved in Zotero or Mendeley libraries (some articles display a “saved by” count or readership metric).
  • Publisher websites: Many display views, downloads, and share counts

Key takeaway:  Manual tracking is time-consuming but effective. Be prepared to screenshot or document evidence if using in an impact case.

Tip: Most altmetrics tools don’t yet index newer platforms like Bluesky or Mastodon—search them directly to find mentions.
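As a small illustration of the manual approach described above, the sketch below builds site-restricted Google search URLs for a work's title or DOI. The domains and the DOI are placeholders you would swap for your own; paste the resulting URLs into a browser:

    from urllib.parse import quote_plus

    def build_policy_searches(query, sites=("canada.ca", "who.int", "oecd.org")):
        """Build Google 'site:' search URLs to look for mentions of a work
        (by title or DOI) on government or NGO websites."""
        urls = []
        for site in sites:
            q = quote_plus(f'"{query}" site:{site}')
            urls.append("https://www.google.com/search?q=" + q)
        return urls

    # Example: search a placeholder DOI across a few policy-related domains.
    for url in build_policy_searches("10.1000/example-doi"):
        print(url)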

References and further reading

Have something to suggest for the list? Let us know! (fmay at mtroyal.ca)

Responsible metrics & research impact

Understanding responsible research metrics

Responsible use of metrics ensures that research evaluation considers context, quality, and impact rather than relying solely on citation counts or journal rankings.

Learn more about research metrics and evaluation

Tri-Council research assessment resources

CIHR, NSERC & SSHRC guidelines

Qualitative impact & knowledge mobilization

Resources for demonstrating research impact

Discipline-specific consultations

Contact your subject Librarian