Research Assessment
Tools, guidance, and examples to help you navigate changing research assessment practices and highlight the full value of your work.
Navigating Changes in Research Assessment
Mount Royal University Signs DORA
On March 27, 2025, Mount Royal University became a signatory to the San Francisco Declaration on Research Assessment (DORA).
DORA is a global initiative supporting improvements to how research is assessed—encouraging fair, transparent, and inclusive approaches. MRU’s decision to sign reflects longstanding institutional values: recognizing diverse research contributions, prioritizing real-world impact, and fostering scholarly integrity.
Why It Matters at MRU
Research assessment is shifting across Canada and internationally. Funders, institutions, and scholarly communities are expanding what “impact” means—ensuring recognition for high impact practices like mentorship, collaboration, public engagement, and knowledge sharing.
At MRU, these principles are not new. Our Research and Scholarship Plan 2024-2030 already reflects many of the values championed by DORA.
Related Frameworks & Global Initiatives
MRU’s commitment to values such as those outlined in DORA aligns with broader movements toward responsible research assessment, such as those listed below.
- Leiden Manifesto - Ten principles for responsible metrics use
- CoARA - Coalition for Advancing Research Assessment
- Berlin Declaration – Advocates for open access
- Hong Kong Principles – Emphasizes integrity and transparency
- Singapore Statement – Global statement on research integrity
- Concordia’s Research Impact Pathways – Canadian model using narrative and qualitative evidence
Tip: Use these frameworks to support and inspire continuous improvement, building on the research practices already in place.
What Changing Research Assessment Means for MRU Researchers
As global research assessment evolves, expectations for demonstrating impact are shifting too. Traditional metrics like citation counts still matter—but they’re no longer enough on their own.
What's expanding in research assessment:
- Relevance – Highlight the impact of your work on students, communities, or policy
- Contributions – Showcase mentorship, teaching, interdisciplinary work, and collaboration
- Reach – Emphasize public engagement, open scholarship, and knowledge mobilization
Key Takeaway: Use a mix of quantitative (citations, altmetrics, downloads) and qualitative (narrative, peer recognition, impact stories) indicators to ensure a comprehensive demonstration of your impact.
Approach | Examples | Strengths | Considerations |
---|---|---|---|
Qualitative (narrative or expert judgment) | Mentorship, peer review, policy influence, community engagement | Contextual, nuanced, peer- or community-recognized | Less standardized, needs interpretation; opportunities for bias or lack of objectivity |
Quantitative (numeric indicators) | Citation counts, h-index, altmetrics | Relatively comparable, scalable, perceived objectivity | Doesn’t capture all forms of impact; can be manipulated |
Resource spotlight: Explore the Metrics Toolkit for practical ideas on how to demonstrate research impact. It offers a comprehensive list of metrics — along with their sources, strengths, limitations, and when to use them.
DORA and aligned research assessment values emphasize:
- Assessing research on its own merits—not solely based on where it’s published
- Recognizing diverse contributions, including mentorship, collaboration, and community engagement
- Using both qualitative and quantitative indicators thoughtfully—selecting those that best reflect the type of impact, and explaining their relevance—while avoiding overreliance on journal-based metrics like the Journal Impact Factor (JIF)
- Supporting openness and accessibility through digital tools and open publishing practices
How can the Library support research assessment?
We know these changes can raise questions. MRU librarians can help you explore responsible assessment practices and connect you with discipline-relevant resources.
- Book a consultation to get help navigating changes in research assessment
- Get support on creating and managing academic profiles, understanding metrics, and showcasing diverse contributions
- Find resources on narrative CVs and qualitative indicators
- Connect with your subject librarian for tools and examples in your field
Tip: Stay ahead of evolving research assessment by preparing to show your impact in diverse and meaningful ways.
Understanding Changes in Research Assessment
Why is Research Assessment Changing?
How is research assessment evolving globally?
Research assessment is evolving globally, with greater emphasis on openness, collaboration, interdisciplinarity, and societal impact. In Canada, the Tri-Agency funders (CIHR, NSERC, SSHRC) are aligning with responsible research assessment principles. Whether or not an institution signs declarations such as DORA, researchers will encounter these shifts in funding applications, hiring, and collaborations.
Resource spotlight
- The DORA Movement in Canada: Working Together to Advance Assessment of Research Excellence (2024)
- The San Francisco Declaration on Research Assessment (DORA)
Explore additional References and Further Readings below for more on responsible metrics and research assessment.
Balancing Metrics with Qualitative Measures
How do updated research assessment approaches affect the use of quantitative metrics like journal impact factor?
Quantitative metrics still have a role in research assessment, but the emphasis is on using them effectively and in context, rather than as the sole measure of quality. Increasingly, funders and institutions recommend combining metrics (citations, altmetrics, downloads) with qualitative indicators such as mentorship, policy influence, and interdisciplinary collaboration.
The library can help researchers find resources on the evolving use of metrics in research assessment and connect them with best practices in their discipline.
Resource spotlight: Responsible Use of Metrics: The Metric Tide Report (UK)
Explore additional References and Further Readings below for more on responsible metrics and research assessment.
Can we continue to ensure research quality and rigor even while adapting research assessment criteria?
Yes — the shift in assessment practices does not eliminate evaluation criteria, rather it broadens them. Various national and international research organizations, including the Tri-Agency funders, emphasize that quality and rigor are best assessed through peer review, methodological soundness, reproducibility, and impact and contribution to the field, rather than relying solely on citation counts or journal prestige.
Resource spotlight: Experiences with the narrative CV - a researcher shares. Researcher Anja Leist shares her experience with narrative CVs, noting benefits like added context and impact, and challenges such as language, self-promotion styles, and evolving expectations.
Explore additional References and Further Readings below for more on responsible metrics and research assessment.
Publishing and Career Impact
Do evolving approaches mean loss of choice over where we publish?
No — assessment practices such as those described in the DORA recommendations do not restrict researchers' publication choices. Instead, they discourage using journal prestige as a proxy for research quality, advocating for assessment based on the actual content and impact of research.
How can I explore different ways to share my work beyond top impact-rated journals?
While journal visibility remains important, there are increasing ways to expand research reach. Funders and institutions increasingly recognize these when included in CVs and application documents. Alternative approaches include:
- Institutional or disciplinary repositories (e.g., MRU’s MROAR, Canadian HSS Commons, arXiv)
- Open access journals with rigorous peer review, even if not “top-ranked” by impact factor
- Preprints to share early findings quickly (check out a CSP blog post "What's the deal with preprints?" to find out more.)
- Conference presentations, posters, or proceedings
- Policy briefs or technical reports for decision-makers or practitioners
- Blog posts, op-eds, and podcasts to share insights with the public
- Social media platforms like X (Twitter), LinkedIn, or Mastodon to engage with your field
- Researcher profiles (e.g., ORCID, Google Scholar) to make your work discoverable
- Community-based or practitioner publications relevant to your area of impact
Tip: Matching your dissemination strategy to your intended audience or type of impact is key.
How can I ensure my contributions to research projects are properly recognized?
Researchers often contribute in many ways beyond authorship, and funders and institutions are recognizing the need for clearer attribution of contributions. The CRediT taxonomy provides a structured way to describe specific roles in research projects, from conceptualization and methodology to supervision and data curation, ensuring more accurate credit for contributions.
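For illustration only (the names below are invented, and CRediT itself is a shared vocabulary rather than a piece of software), one low-effort way to keep track of roles as a project evolves is to record them in a structured form and generate a contribution statement from that record:

```python
# Hypothetical contributor-role log using official CRediT role names
contributions = {
    "A. Researcher": ["Conceptualization", "Methodology", "Writing – Original Draft"],
    "B. Student": ["Data Curation", "Visualization"],
    "C. Community Partner": ["Resources", "Writing – Review & Editing"],
}

def contribution_statement(roles_by_person):
    """Build a plain-text author contribution statement from recorded CRediT roles."""
    parts = ["{} ({})".format(person, ", ".join(roles))
             for person, roles in roles_by_person.items()]
    return "Author contributions: " + "; ".join(parts) + "."

print(contribution_statement(contributions))
# Author contributions: A. Researcher (Conceptualization, Methodology, ...); ...
```

A simple spreadsheet works just as well; the point is to capture roles while the project is active rather than reconstructing them at submission time.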
Researcher Workload and Transitioning to New Models
How do I find time to adapt my approach to research assessment? How do I get started if this feels overwhelming?
Like most things in research, adapting your assessment approach takes some work—but it doesn’t have to happen all at once. A useful first step is to see what funders are already asking for. For example, the NSERC Discovery Horizons pilot provides clear directions for narrative CVs.
If you know what kinds of evidence or stories funders look for—like meaningful impacts or contributions—you can start tracking those as your research progresses. Taking a bit of time now can make it easier to prepare when you need it.
Start small. Draft one section, seek peer feedback, or join a writing group if available.
Explore additional readings in the resource list below for more on responsible metrics and research assessment.
What if I find it hard to promote myself or write about my contributions?
You’re not alone. It is common to feel uncomfortable with self-promotion—especially if relying more on qualitative measures to describe the impact of your work is new to you, if you are translating complex work into plain language, or if you are working in an additional language. It’s okay if this doesn’t come naturally. These skills can be developed over time, and support is available. Start by focusing on describing your work clearly and authentically. It can also help to think of it as telling the story of your impact rather than as self-promotion or bragging.
Resource spotlight
- Researchers: fight back against your struggle with self-promotion (Williams, 2021, Times Higher Education) – A short piece offering reassurance and practical tips to help reframe the idea of self-promotion as advocacy for your work.
- Academic Phrasebank (University of Manchester) – A tool designed to help researchers—especially those using English as an additional language—navigate academic writing more confidently. Offers language you can adapt to your own work and is useful for both qualitative and quantitative writing.
- Guides to writing and research (UBC Centre for Writing and Scholarly Communication) – Evidence-informed guides outlining typical conventions and approaches used in key academic writing tasks, including abstracts, plain language summaries, literature reviews, and introductions to research articles.
Could these approaches put early-career researchers or those with shorter research careers at a disadvantage?
Many traditional metrics favour well-established researchers with long publication records. Narrative CVs and alternative assessments allow early-career researchers to highlight a broader range of contributions, including:
- Mentoring
- Interdisciplinary work
- Public engagement
These contributions may not always be reflected in citation counts, but responsible assessment practices provide alternative ways to showcase impact.
Do I still need my Canadian Common CV (CCV)?
Yes—for now. Use your CCV as a personal archive and for award or hiring purposes.
The Canadian Common CV (CCV) is the current standard CV system used by Canada’s research funding agencies. While a transition to a narrative-style CV (TCV) is underway, the CCV remains in use for many competitions. The Tri-Agency indicates that researchers will be given notice before it is phased out, and guidance will be provided on how to export their information.
Considerations Across the Disciplines
Are there alternative ways to assess research quality in STEM fields?
Yes. Many scientific organizations and funders are adopting broader, field-appropriate models that combine quantitative indicators with qualitative evidence. These approaches may include narrative CVs, documentation of open science practices, mentorship, societal impact, and more. The goal is to better reflect the full value and context of research contributions.
While this FAQ focuses on STEM, similar shifts toward balanced and context-sensitive assessment are also underway in other disciplines.
How does research assessment account for outputs beyond journal articles in the humanities and social sciences — and in other fields?
Research in the humanities and social sciences often takes the form of books, exhibitions, performances, or public scholarship — and diverse outputs are increasingly common across disciplines. To ensure fair evaluation, hiring committees, peer reviewers, and adjudicators need to recognize these different forms of scholarship and understand their significance in disciplinary contexts. Evolving assessment practices emphasize quality, impact, and relevance over format, helping ensure researchers in all fields are evaluated on the full scope of their contributions.
Does broader research assessment mean disciplinary context matters less?
Not at all. Different disciplines have distinct norms for publishing, timelines, and ways of demonstrating impact. For example, citation patterns in STEM may differ significantly from those in the humanities or social sciences. Responsible assessment encourages evaluation that reflects these differences rather than applying uniform criteria across disciplines.
How is team-based or collaborative research recognized in assessment?
Many research projects today are collaborative, involving co-investigators, interdisciplinary teams, and shared outputs. Traditional metrics often emphasize individual achievements, which may not capture the value of collective contributions. Responsible research assessment encourages clear documentation of each researcher's role, for example through tools like the CRediT taxonomy or narrative CVs that describe collaborative work and team impact.
Qualitative Assessment
Qualitative Metrics
Why It Matters: Beyond Numbers in Research Assessment
Research assessment is evolving globally. Funders—including Canada’s Tri-Agency (SSHRC, NSERC, CIHR)—are moving beyond citation counts to give more recognition to qualitative contributions beyond peer-reviewed outputs, such as mentorship, policy influence, and societal impact. Inspired by DORA and the Leiden Manifesto, this shift ensures research is valued for its real-world influence, not just numerical indicators.
Why Broaden our use of Qualitative Metrics?
They provide a fuller picture of research contributions by recognizing:
- Real-world relevance – How research influences policy, students, communities, and research culture.
- Alignment with funding priorities – Many funders prioritize equity, diversity, and societal impact.
- Career impact – Highlights leadership, teaching, and interdisciplinary work.
- Stronger applications – Strengthens grant, tenure, and promotion materials.
- Recognition of diverse contributions – Captures mentorship, outreach, and public engagement.
Challenges & Considerations
- Subjectivity & bias – Peer review and narrative assessments can be influenced by unconscious bias.
- Lack of standardization – No benchmarks like citation counts.
- Time-intensive – Evaluating mentorship, collaboration, and policy influence requires effort.
- Difficult to quantify impact – Some contributions are harder to measure numerically.
- Field-specific norms – STEM fields often prioritize citation-based measures, while humanities/social sciences emphasize qualitative outputs.
Change takes effort—but brings lasting benefits, laying the foundation for a more equitable and transparent system that leads to stronger research outcomes. Rosemary Brown, the first Black Canadian woman elected to a provincial legislature, emphasized the importance of keeping doors open for future generations:
"We must open the doors and we must see to it they remain open, so that others can pass through."
The Narrative CV: Showcasing Your Work with Context
Key Takeaway: Narrative CVs go beyond metrics, showcasing the quality, relevance, and real-world impact of your work.
Emerging evidence suggests they help reviewers assess mentorship, outreach, and policy impact alongside traditional research outputs, providing richer context for your contributions.
Practical Applications:
- Career advancement: Strengthens tenure, promotion, and award applications.
- Leadership and mentorship roles: Ideal for showcasing contributions beyond research outputs.
- Stronger funding applications – Increasingly required by funders worldwide.
The Tri-Agencies will require this format; see the Tri-Agency announcement, which describes pilot projects and a gradual rollout. More details are available in the Tri-Agency CV FAQs.
Key Takeaway: Researchers should prepare now - narrative CV formats will be required in more contexts, from funding applications to hiring and promotion.
How Narrative CVs Vary by Institution & Purpose
While Tri-Agency guidelines provide a standardized format, narrative CVs vary based on the funder, institution, or purpose. Some key differences include:
- Structure – Some funders emphasize mentorship, while others focus on interdisciplinary work.
- Length – Tri-Agency allows 5 pages (English), 6 pages (French), while other institutions may set different limits.
- Emphasis – Some CVs highlight societal impact, while others prioritize leadership or research excellence.
Tip: Reuse strong content across applications, tailoring your CV to the specific guidelines of the funder or institution you’re applying to.
Tri-Agency Narrative CV: What you Need to Know
The Tri-Council agencies in Canada are replacing the current Canadian Common CV (CCV) with a narrative CV format to promote more holistic, fair, and meaningful research assessment.
- What? A new CV format that evaluates impact beyond publication lists.
- When? Rolling out in 2025, starting with SSHRC Impact Awards
- Why? Focuses on quality, leadership, and societal impact, rather than emphasizing citation metrics.
- How? Three sections: Personal statement, Significant contributions, Mentorship.
Explore the resources on this page and others for guidance on structuring your narrative CV.
Key Takeaway: The narrative CV reflects a national shift toward recognizing the full range of a researcher’s impact—not just what can be counted.
Tip: Unlike the CCV, the Tri-Agency CV is grant-specific. You will want to keep a living draft and refine it for each application.
Tri-Agency Narrative CV: Structure & Key Components
The Tri-Agency template consists of three sections, each designed to capture the broader impact of your research.
Section 1: Personal Statement. Your introduction—why does your work matter?
- Outlines research focus, motivations, and career trajectory.
- Provides a narrative of your contributions, rather than just listing outputs.
Tip: Use this section to frame your research in terms of purpose and direction—not just past achievements.
Section 2: Most Significant Contributions & Experiences. What has had the most impact?
- Focuses on quality over quantity—showcasing relevance, impact, and key achievements.
- Includes policy impact, leadership roles, interdisciplinary work, and knowledge mobilization.
Tip: Don’t be afraid to highlight fewer but more meaningful contributions—quality and context matter more than volume.
Section 3: Supervisory & Mentorship Activities. How have you supported others?
- Highlights mentorship, team leadership, student training, and EDI contributions.
- Recognizes collaborative and interdisciplinary efforts in research development.
For examples of how to structure your contributions for a narrative CV see the 'In Practice' tab.
- DORA Building Blocks for Impact (2022) – Capturing scholarly “impact” often relies on familiar suspects like h-index, JIF, and citations, despite evidence that these indicators are narrow, often misleading, and generally insufficient to capture the full richness of scholarly work. Considering a wider breadth of contributions in assessing the value of academic activities may require a new mental model.
In Practice: How To Demonstrate Research Impact
Researchers can document and highlight their work in ways that align with evolving assessment practices:
Practical Ways to Track & Showcase Impact
Category | Description | Example |
---|---|---|
Clearly define your contribution using the CRediT Taxonomy | Make sure your contributions are recognized by using a clear description. A growing list of publishers is using the CRediT Taxonomy to ensure transparency and accountability. CRediT includes 14 defined roles. | Data Curation: managing and preserving data. Visualization: creating figures, graphs, and models. Writing – Original Draft: preparing the initial manuscript. |
Showcase research impact | Go beyond citations—highlight policy influence, innovative ideas, tools, and methodologies. | "Research findings on equitable AI systems contributed to national guidelines for ethical technology deployment." |
Recognize your contributions to teaching | Demonstrate how your research shapes student learning and mentorship. | "Co-designed a climate change module with local organizations, now part of undergraduate courses." |
Highlight community engagement | Document partnerships, stakeholder collaboration, and research mobilization beyond academia. | "Collaborated with Calgary’s immigrant business owners to create a toolkit for underrepresented entrepreneurs." |
Demonstrate knowledge mobilization | Show how your research reaches wider audiences through events, reports, and media. | "Created interactive webinars on mental health literacy, reaching over 1,000 community service providers." |
Advance your professional growth | Show leadership, mentorship, interdisciplinary collaboration, and efforts to foster an inclusive research culture. | "Supervised undergraduate students, with several advancing to graduate programs." |
Engage with open scholarship | Support research transparency and accessibility through open access and data sharing. | "Created a public data repository, facilitating further research and community engagement." |
Strategically Shaping your Research Impact
- Expand how you share findings – Use reports, infographics, videos, and social media.
- Engage in interdisciplinary work – Collaborate across fields to broaden impact.
- Use institutional repositories & altmetrics – Track downloads, shares, and non-traditional reach.
- Keep researcher profiles updated – Maintain your ORCID profile as well as any others you use, such as Google Scholar, LinkedIn, and institutional pages. Reviewers may be looking there for additional details.
Tip: Check out MRU Library's guide to scholar profile tools
Framing of Contributions for Stronger Impact Statements
- Don't list roles > Instead, describe their effect.
- Highlight leadership & innovation > Show how your work influenced systems, policies, or practices. Include how you support students or peers (teaching counts!). Add training philosophy, inclusive practices, or notable outcomes.
- Use active, outcome-driven language > Make your contributions stand out by showing what changed.
- Group activities by impact area—not just year > Helps reviewers see the bigger picture and your growth over time.
Tip: The University of Calgary offers a practical guide for researchers - Most Significant Contributions Statement Guide
Writing Stronger Impact Statements: Before & After
When describing your impact, go beyond listing what you did. Show how your work led to results, shaped policies, or benefited others.
Weaker Description | Enhanced Description of Impact | Impact Area |
---|---|---|
Chaired the COPPUL Library Collections Community from 2021-2025 | Led national discussions on licensing & open-access strategies, shaping resource-sharing models for 22 academic libraries. | Leadership |
Supervised five undergraduate student research projects | Mentored five undergraduate students; two published peer-reviewed, open access articles based on their work. | Mentorship |
Developed a community outreach project on mental health literacy. | Led a university-community initiative that provided mental health literacy workshops to 1,000+ service providers and informed regional mental health policy recommendations. | Community engagement |
Conducted interviews with social workers on housing policy. | Directed a mixed-methods study with 50+ social workers; findings influenced municipal housing policy changes. | Policy impact |
Published a report on community resilience. | Co-authored a policy brief with local partners; adopted by regional disaster response agencies. | Knowledge mobilization |
Tip: Think of the TCV like a written job interview—just with time to reflect. Use it to connect the dots: show how your experiences, growth, and contributions align with the work you’re proposing.
Getting Started with Your Narrative CV
- Check the guidelines – Align with funder priorities and structure.
- Track key contributions early – Don’t wait for the deadline.
- Go beyond citations – For each entry: Why it matters, your role, what changed.
- Use real examples – Show mentorship, collaboration, policy or community impact.
- Include career slowdowns – Context like caregiving or part-time roles is welcome, along with, if applicable, challenges, reasons, and lessons learned.
- Start small, revise often – Focus on one section at a time.
- Seek feedback – Peer review can sharpen your narrative and impact.
Tip: Refer to CRediT Taxonomy for help defining contributions in a systematic way
Tip: AI tools may be able to help—but use with care. Generative AI may help with language or formatting, especially for researchers working in English or French as an additional language, but reviewers still want your voice. Check for guidelines in the grant documentation and describe how the tools were used in your application.
Why Peer Review Matters in Evolving Research Assessment
Peer review remains essential—but it’s evolving. New models are helping recognize a broader range of contributions.
Peer Review: Where Tradition Meets Innovation
Peer review remains central to research assessment—even frameworks like DORA uphold its importance. While traditionally focused on journal articles and monographs, peer review now applies more broadly: to datasets, software, policy reports, preprints, and even public engagement efforts.
Key Takeaway: Peer review is evolving—not disappearing. It supports a wider range of research outputs in more flexible and inclusive ways.
The Continuing Importance of Peer Review
Peer review plays a critical role in ensuring research meets scholarly standards across different formats.
- Maintains rigor – Validates research through expert evaluation.
- Supports research integrity – Ensures transparent and fair assessment.
- Adapts to new models – Open peer review, preprint sharing and commenting, and post-publication review are gaining traction.
Tip: When submitting work outside traditional journals, look for platforms or processes that offer peer review—these can carry weight in funding and promotion.
How Peer Review Benefits Researchers
- Strengthens career recognition – Peer-reviewed datasets, software, and reports strengthen academic portfolios.
- Broadens evidence of impact – Supports assessment models that go beyond citations.
- Aligns with evolving standards – Recognized in DORA-aligned and Tri-Agency practices.
Key Takeaway: Peer-reviewed outputs—whether reports, datasets, or software—are valuable scholarly contributions.
Moving Forward: Balancing Tradition and Innovation
Peer review is evolving to enhance transparency, inclusivity, and responsiveness to new research formats—but change takes time.
- Tensions remain. For example, eLife was removed from Clarivate's Journal Citation Reports, highlighting the friction between traditional indexing and evolving assessment models.
- Challenge. Embedding inclusive peer review models into legacy systems requires time and resources.
- Opportunity: Continued engagement and advocacy will shape future models.
Tip: Stay informed about how your field and funders view new peer review models—especially open and post-publication review.
Peer Review Across Different Research Outputs
Output Type | How Peer Review is Applied | Example |
---|---|---|
Policy Reports | Reviewed by experts and stakeholders; may include public consultation | Canada in a Changing Climate Reports - peer reviewed by scientists and policymakers. |
Academic Books | Reviewed by subject experts via university press/editorial boards | McGill-Queen's University Press – Publication Review Committee |
Public Health Guidelines | Expert panels apply structured evaluation frameworks | Public Health Agency of Canada's External Advisory Bodies; WHO Guidelines – the development of these guidelines includes review by relevant experts. |
Indigenous Knowledge | Community validation led by Indigenous scholars and Elders | National Collaborating Centre for Indigenous Health (NCCIH) – Indigenous-led peer review process for public health research. |
Preprints | Community feedback and public peer review pre-publication | medRxiv, SocArXiv, bioRxiv, and OSF.io – some preprint platforms are for specific disciplines, others cover a wide array of research areas. |
Data & Software | Reviewed for transparency, structure, and reproducibility | FRDR (Federated Research Data Repository) – Supports formal peer review of datasets |
Post-Publication Review | Ongoing open commentary after publication | F1000Research & eLife – examples of open peer review for research outputs. |
Investigative Journalism | Editorial and fact-checking processes for credibility | High editorial standards in investigative reporting |
Tip: Consider adding peer-reviewed outputs beyond articles to your CV or narrative CV—these demonstrate impact and rigor.
How Can the Library Help?
- Offer consultations on peer review models, journal selection, and alternative publication formats.
- Host workshops on open peer review and preprint repositories.
- Help faculty navigate publisher policies, open-access options, and data-sharing best practices.
- Raise awareness of research assessment changes to support strong presentation of work.
Key Takeaway: The Library can help you leverage peer review practices—traditional and emerging—to strengthen your research visibility and credibility.
Quantitative Assessment
Quantitative Metrics
Why it Matters: The Role of Citation-Based Measures
Citation counts and journal metrics have traditionally been the primary tools for assessing research impact. While useful for tracking influence, these measures often exclude other valuable contributions to research and society.
Why Use Quantitative Metrics?
These measures offer several advantages:
- Easy to report – Simple numerical indicators.
- Automated tracking – Large datasets make it scalable.
- Long-term impact measurement – Shows citation trends over time.
- Supports applications – Used in tenure, promotion, and funding decisions.
- Informs publishing decisions – Helps identify high-visibility venues.
The metrics described here reflect the most widely recognized options, but many other sources, tools, and models exist, each with benefits and limitations.
Limitations of Quantitative Metrics
- Not a direct measure of quality – High citations don't always indicate strong research.
- Can be manipulated – Strategic use of citations or controlled timing of certain publications can skew metrics
- Excludes non-traditional outputs – Books, performances, exhibitions, and policy contributions can be overlooked.
- Proprietary data – Many metrics are controlled by publishers, limiting transparency and inclusion.
Responsible Use of Metrics
To ensure fair and meaningful research assessment, quantitative metrics should be used alongside qualitative indicators. Funders and institutions now encourage:
- Balanced assessment – Combining citation-based and qualitative measures for a fuller picture of impact.
- Contextual evaluation – Recognizing different practices, especially across disciplines, and avoiding direct comparisons
- Peer review & expert judgment – Ensuring scholarly rigor remains central across disciplines.
Numbers alone aren’t enough—qualitative indicators are essential for a fuller picture of research contributions.
As Albert Einstein said:
"Not everything that can be counted counts, and not everything that counts can be counted."
Journal-Level Metrics
Journal-level metrics help assess the influence of journals, not individual researchers or articles. Some journal metrics are behind paywalls (e.g., JCR), while others (e.g., Scimago, Google Scholar, or information posted to a journal's website) are free.
How journal metrics help:
- Choosing where to publish
- Highlighting journal prestige in tenure or grant applications
Tip: Journal impact changes over time—don’t assume the “usual” venue is always the best fit.
Use Journal Metrics Responsibly
- Stay discipline aware: Only compare journals within the same field.
- Use multiple metrics: Each captures different aspects of influence.
- Think beyond numbers: Consider peer review model, open access status, and audience reach.
Key Takeaway: Use journal metrics to evaluate journals—not to assess individual research quality.
- Learn more about article metrics > Article-Level Metrics
- Track your own impact > Scholarly Profiles
Where to Find Journal Metrics
Metric | Access details | Description |
---|---|---|
Journal Impact Factor (JIF) | Subscription only (Clarivate); MRU does not subscribe, but some journals list their JIF on their homepage. | Most recognized metric; 2-year citation ratio. Widely used in tenure and grant contexts. |
Eigenfactor & AIS | Free; based on Clarivate/JCR data | Eigenfactor: 5-year citations, weighted by source quality. Includes Article Influence Score (AIS), which measures average article influence over 5 years. |
CiteScore | Free; based on Elsevier/Scopus data. | 4-year citation average; broader journal coverage than JIF. |
Scimago Journal Rank (SJR) | Free; based on Elsevier/Scopus data. | Weighted 3-year citations; adjusts for discipline norms; highlights field-specific trends. |
Source Normalized Impact per Paper (SNIP) | Free; based on Elsevier/Scopus data. | Citation impact normalized by subject field; helpful across disciplines. |
Google Scholar Metrics | Free; less curated. | 5-year citation count; broad coverage including grey literature. |
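For readers unfamiliar with how the “2-year citation ratio” behind the Journal Impact Factor in the table above is calculated, here is a worked illustration; the numbers are invented, and official JIF values come only from Clarivate’s Journal Citation Reports:

```python
def two_year_impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Illustrative JIF-style ratio: citations received this year to items published in the
    previous two years, divided by the number of citable items published in those two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal in 2024: 480 citations to its 2022-2023 papers, which number 200
print(round(two_year_impact_factor(480, 200), 1))  # 2.4
```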
Tip: Google Scholar includes a wide range of sources—not all are peer-reviewed. Always check journal quality.
Journal Metric Examples
- CiteScore
- Scimago Journal Rank (SJR)
- Google Scholar Publication Metrics
- Eigenfactor Score
Find Journal Metrics on Publisher Sites
Many journals share metrics on their own websites—typically under:
- About the Journal
- Submission Guidelines
- Editorial Policies
You may find:
- Journal Impact Factor, CiteScore, or SJR
- Index inclusion (e.g., DOAJ, PubMed, Web of Science)
- Peer review model or acceptance rates
- Publisher or subject affiliations
Example: Canadian Journal of Bioethics publicly shares its Scopus metrics and indexing status on its website.
Author-Level Metrics
These metrics usually help track how often your work has been cited, providing a cumulative snapshot of your publishing impact.
Types of Author-Level Metrics
- Citations – Total citations across your work. Found in Google Scholar, Scopus, and Web of Science.
- h-index – Reflects productivity and impact. An h-index of 15 means the author has 15 papers with 15+ citations each (see the sketch after this list).
- i10-index (Google Scholar) – Counts how many of your works have at least 10 citations. Useful early-/mid-career.
- Recent 5-Year Metrics (Google Scholar) – Citations and h-index from the past 5 years. Useful for showing recent activity or accounting for career breaks.
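To make the arithmetic behind these cumulative indicators concrete, here is a minimal sketch using invented citation counts; platforms such as Google Scholar or Scopus compute the same quantities from their own (differing) citation data:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citations):
    """i10-index (Google Scholar): number of papers with at least 10 citations."""
    return sum(1 for count in citations if count >= 10)

# Hypothetical citation counts for nine papers
paper_citations = [52, 31, 18, 15, 12, 9, 7, 3, 0]
print(h_index(paper_citations))    # 7 -> seven papers each have 7+ citations
print(i10_index(paper_citations))  # 5 -> five papers have 10+ citations
```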
How Author-Level Metrics Help
They combine how much you’ve published with how often your work has been cited, offering a picture of your research influence.
Tip: If you're early in your career or publishing in fields with slower citation cycles, these metrics may underrepresent your impact. Use qualitative context too.
Where to Find Author Metrics
Platform | Access Details | Description |
---|---|---|
Google Scholar | Free | Tracks citations, h-index, i10-index; widely used but less curated |
Scopus | Subscription only (MRU has access) | Provides curated citation data and author metrics like h-index |
Web of Science | Subscription only (MRU does not have access) | Offers curated citation tracking and metrics; limited public access |
Publish or Perish (software) | Free | Analyzes citations using data from Google Scholar or other sources; useful for custom metrics |
Key Takeaway: These metrics are cumulative and vary by platform. Use them as one part of a broader story about your contributions.
Tip: Keep your scholarly profiles, such as ORCID and Google Scholar, updated—many citation tools pull from them, and it makes your work easy for others to find.
Publication-Level Metrics
Publication-level metrics quantify the impact of an individual publication—whether that’s an article, book, report, or creative work. These metrics show the reach or influence of specific research findings, rather than just their source.
Example use cases:
- In a tenure or award application: “My article was published two years ago and has already been cited in 97 other research papers.”
- To demonstrate interest before formal citations: “My article PDF has been downloaded 421 times this year.”
- If your book is in demand: “Nearly 700 libraries worldwide own copies of my book.”
Key Takeaway: Use publication metrics to highlight the specific impact of your work—not just where it was published.
Tip: Screenshots of citation or download dashboards can be a helpful visual in tenure or grant applications.
Citations
Citations count how often your work is referenced by others. This traditional metric reflects how your research informs or influences the scholarly conversation.
- Where citations help: Widely used; easily understood; strong evidence of scholarly uptake.
- Where to find them:
- Google Scholar; free
- Scopus; subscription (MRU has access)
- Web of Science; subscription (MRU does not have access)
- OpenAlex; free
- Dimensions.ai; subscription (MRU does not have access)
Example: Citation metrics retrieved from Scopus
Tip: Numbers may vary across platforms—due to differences in coverage, indexing, and citation-tracking methods.
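If you want to pull a quick citation count from one of the free sources above, the sketch below queries OpenAlex for a single work; the DOI is a placeholder, and the endpoint and `cited_by_count` field reflect OpenAlex’s public API documentation, so verify against the current docs before quoting numbers:

```python
import requests  # third-party package: pip install requests

def openalex_citation_count(doi):
    """Look up a work by DOI in OpenAlex and return its citation count."""
    response = requests.get("https://api.openalex.org/works/doi:" + doi, timeout=30)
    response.raise_for_status()
    return response.json().get("cited_by_count", 0)

# Placeholder DOI -- substitute one of your own publications
print(openalex_citation_count("10.1234/example-doi"))
```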
Usage Metrics
Usage metrics show how often your publication is accessed (e.g., views, downloads), even when it’s not cited.
- Where usage metrics help: Demonstrates early or public interest; useful for newer or non-traditional outputs.
- Where to find them:
- Publisher’s website (abstract or download page)
- Author dashboard or contact with publisher
- Institutional repositories, including MRU's repository, where views and downloads are displayed on item pages.
Example: Download trends
Example: Usage and citation snapshot from a publisher or repository page
Example: MROAR (Mount Royal Open Access Repository)
Views/Downloads: usage within the MRU repository only. The same item may have additional views/downloads from other platforms like journal websites or databases. Citations via Crossref: this means two other publications with DOIs have cited this work, as tracked by Crossref. Note: citation counts may be lower than in Google Scholar or Scopus, as Crossref only tracks citations between DOI-registered items.
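The Crossref-tracked count described above can likewise be checked for any DOI through Crossref’s public REST API; this is a minimal sketch with a placeholder DOI, and the `is-referenced-by-count` field name should be confirmed against Crossref’s API documentation:

```python
import requests  # third-party package: pip install requests

def crossref_citation_count(doi):
    """Return the number of DOI-registered works citing this DOI, as tracked by Crossref."""
    response = requests.get("https://api.crossref.org/works/" + doi, timeout=30)
    response.raise_for_status()
    return response.json()["message"].get("is-referenced-by-count", 0)

# Placeholder DOI -- substitute a DOI from your own work
print(crossref_citation_count("10.1234/example-doi"))
```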
Key Takeaway: Usage is not a substitute for citations—but it shows visibility and engagement.
Holdings Metrics (for books)
Holdings reflect how many unique libraries (academic, public, etc.) own your book—usually tracked via union catalogues. This metric is primarily useful for disciplines where book publishing (monographs) is the dominant form of research dissemination.
- Where holdings help: Useful for disciplines where books are key research outputs; reinforces credibility and reach.
- Where to find them:
- WorldCat.org for worldwide holdings
- Voilà, Canada's national union catalogue (powered by WorldCat), for Canadian holdings
Tip: Holdings are especially impactful when you can also show positive reviews or that your work is included in course syllabi.
Alternative Metrics (Altmetrics)
Altmetrics track how research is shared, discussed, or mentioned outside traditional academic publishing. The term refers broadly to attention-based research metrics (not to be confused with the subscription product by the same name).
They capture attention from:
- Social media (e.g., X/Twitter, Bluesky, Mastodon, Facebook)
- News media & blogs
- Policy documents
- Wikipedia and forums
- Citation managers like Mendeley or Zotero
Where Alternative Metrics Help
- When citations are not yet available or relevant
- To show engagement from practitioners, educators, or the public
- In impact narratives, alongside citations and usage stats
Limitations of Alternative Metrics
- Not a quality measure— Visibility ≠ credibility
- Disciplinary bias—Fields vary in online activity
- Tool-dependent—Results shift with social media trends
- Less established— Often secondary in evaluation processes
Key Takeaway: Altmetrics track attention, not quality. Use them as a complement—not a replacement—for traditional metrics.
Tip: Especially useful for grey literature, public reports, creative works, or newer publications that haven't had time to accrue citations.
Where to Find Alternative Metrics
Tool / Platform | Access | Notes |
---|---|---|
LibrarySearch | Free | Look for the donut badge on select items |
Scopus | Subscription (MRU has access) | Includes limited altmetric data |
Web of Science | Subscription (not at MRU) | -- |
Altmetric.com | Subscription (not at MRU), free bookmarklet tool | Widely used by journals & funders |
PlumX Metrics | Subscription (not at MRU) | Often embedded on publisher platforms |
Dimensions.ai | Subscription (not at MRU) | Includes citations + altmetrics |
Key Takeaway: Commercial altmetrics tools collect and display data—each with different coverage, algorithms, and visualizations.
Tip: Although altmetrics can be tracked for free, paid tools offer dashboards, alerts, and more detail.
How to Track Altmetrics Without a Subscription
You can collect alternative metrics manually by checking:
- Social media – Search for your article title or DOI on X/Twitter, LinkedIn, Reddit, Mastodon, Bluesky
- News – Use Google News or set up Google Alerts for your work.
- Policy documents – Search government/NGO sites using site: in Google (e.g., site:canada.ca); see the sketch after this list.
- Reference managers – See if your work is saved in Zotero or Mendeley libraries (some articles display a “saved by” count or readership metric).
- Publisher websites – Many display views, downloads, and share counts
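For the policy-document searches mentioned in the list above, a small helper can assemble the site-restricted Google search links for you; the domains shown are only examples, and these are ordinary web-search URLs rather than calls to any API:

```python
from urllib.parse import quote_plus

def policy_mention_searches(doi, sites=("canada.ca", "alberta.ca", "who.int")):
    """Build site-restricted Google search URLs to manually check for mentions of a DOI."""
    urls = []
    for site in sites:
        query = 'site:{} "{}"'.format(site, doi)
        urls.append("https://www.google.com/search?q=" + quote_plus(query))
    return urls

# Placeholder DOI -- swap in your own, then open the links in a browser
for url in policy_mention_searches("10.1234/example-doi"):
    print(url)
```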
Key Takeaway: Manual tracking is time-consuming but effective. Be prepared to screenshot or document evidence if using in an impact case.
Tip: Most altmetrics tools don’t yet index newer platforms like Bluesky or Mastodon—search them directly to find mentions.
References and further reading
Responsible Metrics & Research Impact
Understanding responsible research metrics
Responsible use of metrics ensures that research evaluation considers context, quality, and impact rather than relying solely on citation counts or journal rankings.
Learn more about research metrics and evaluation
- The Metric Tide report – A 2015 UK report on the role of metrics in research assessment, with recommendations for responsible evaluation.
- Harnessing the Metric Tide – A 2022 update exploring how responsible metrics have been implemented, offering 20 recommendations for future assessment.
- Metrics Toolkit – A practical guide explaining different research metrics, their proper use, strengths, and limitations.
Tri-Council Research Assessment Resources
CIHR, NSERC & SSHRC guidelines
- Tri-Agency
- CIHR
- NSERC
- SSHRC
Qualitative Impact & Knowledge Mobilization
Resources for demonstrating research impact
- General guides
- Research Impact Canada – National network supporting knowledge mobilization.
- Knowledge Mobilization Guide (University of Victoria) – Strategies for making research findings accessible.
- Knowledge Impact Assessment Toolkit (University of Calgary) – A structured toolkit for assessing research impact.
- Qualitative impact support
- Most Significant Contributions Guide (University of Calgary) - Helps researchers create evidence-based narratives to showcase the quality and impact of their work, aligned with DORA and Tri-Council CV requirements.
- Narrative CV
- Developing a narrative CV: Guidance for researchers from the University of Oxford – A comprehensive resource including downloadable guides, webinars, presentations and case studies
- Taming Complexity: Narrative CVs in Grant Funding Evaluations - Explores how narrative CVs can influence evaluative practices in peer review. The authors propose a conceptual framework for understanding the impact of narrative CVs and present preliminary findings on their effectiveness in capturing researchers' diverse contributions. (Varga, Kaltenbrunner, and Woods, 2024)
Need help?
Book a consultation with your subject librarian for support.
Content note: AI tools were used in the creation of the content of this page as part of an iterative editing process—submitting drafts, reviewing AI-generated revisions, and refining the results through repeated cycles to improve clarity, eliminate duplication, correct grammar and spelling, and tighten the writing