Research metrics are increasingly central to how academic work is evaluated, yet many researchers struggle to navigate the growing array of indicators. This blog shares insights from a recent ONOS Researcher Connect webinar that explored the fundamentals of research metrics, their role in academic publishing, and how institutions can support more confident, responsible use. It also underscores the importance of metrics literacy as part of a broader, more holistic approach to research assessment.
As research assessment frameworks evolve globally, institutions and funders are under growing pressure to promote responsible metrics use. Yet awareness among researchers remains low. According to 'The State of Research Assessment' white paper, only 50% of researchers are aware of reform initiatives within their institutions, and of those, 40% perceive little to no change. This disconnect may stem from the fact that assessment often happens at the faculty or research group level, making institutional efforts less visible. Awareness and education are therefore essential to drive meaningful, large-scale change.
That's why structured, accessible training on research metrics is more important than ever. It not only supports institutional strategy but also empowers researchers to make informed publishing decisions and evaluate their own impact more confidently.
This global need for metrics literacy was reflected in a recent national webinar hosted as part of the ONOS Researcher Connect series. The session, titled "Understanding Research Metrics: Essential Tools for Academic Excellence", drew over 1,450 attendees from across India's academic community and focused on demystifying the tools and indicators that shape research visibility and evaluation.
Guiding the discussion was Sonal Shukla, Head of Indexing, whose presentation helped demystify both foundational and advanced metrics. From Impact Factor and h-index to CiteScore, SCImago Journal Rank (SJR), and the Journal Citation Indicator (JCI), the session offered a comprehensive overview tailored to both early-career researchers and seasoned academics.
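To make two of those headline indicators concrete: the h-index is the largest number h such that an author has h papers with at least h citations each, and the Journal Impact Factor divides the citations a journal receives in a given year (to content from the previous two years) by the number of citable items it published in those two years. A minimal, illustrative Python sketch (not from the webinar; function names are my own):

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank       # the paper at this rank still has enough citations
        else:
            break          # ranks only grow, citations only shrink: stop here
    return h

def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Citations in year Y to content from Y-1 and Y-2, divided by the
    citable items published in Y-1 and Y-2."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Six papers cited [10, 8, 5, 4, 3, 0] times: four papers have >= 4 citations
print(h_index([10, 8, 5, 4, 3, 0]))       # 4
print(impact_factor(250, 100))            # 2.5
```

Even this toy version shows why the webinar stressed context: both numbers compress very different citation distributions into a single figure, which is exactly why they need qualitative complements.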
This emphasis on researcher development aligns with broader institutional priorities, such as promoting ethical research practices, a theme explored in the blog "How libraries enhance research ethics for researchers".
With over 200 audience questions and a satisfaction score of 8.78/10, the session revealed a strong appetite for structured, accessible training on metrics. For research offices and librarians, this level of engagement signals a clear demand for ongoing education and support around responsible metrics use.
These insights echo findings from the recent blog "The state of research assessment: Insights from a survey of 6,600+ researchers", which highlights the gap between researcher confidence and institutional expectations around metrics use.
While the session focused on understanding research metrics, it also underscored the importance of using these tools as part of a broader, more holistic approach to research assessment. Metrics can offer valuable insights into visibility and influence, but they should be complemented by qualitative indicators such as peer recognition, societal relevance, and collaboration outcomes. This balanced perspective aligns with global reform efforts, including initiatives that advocate for using quantitative evaluation only in support of expert assessment, and frameworks that call for recognizing a broader range of research outputs and impacts.
"Metrics are powerful tools, but they're not the whole story. We need to combine them with qualitative insights to truly understand the value and impact of research, especially in diverse and interdisciplinary fields." - Sonal Shukla, Head of Indexing.
Just as mentorship programmes help early-career researchers navigate publishing and career decisions, initiatives like the ONOS webinar series play a vital role in raising awareness of the tools and resources available to support research evaluation. By demystifying research metrics and showcasing their practical applications, these sessions help researchers better understand what's available and why it matters, helping them to act with confidence and clarity.
Interactive polls and a lively Q&A segment encouraged reflection and engagement, turning the webinar into a dynamic learning experience. Discussion covered everything from the role of regulatory bodies in metrics to how researchers can balance qualitative and quantitative indicators of impact.
One attendee described it as "one of the most enriching academic webinars I have attended in recent times," reflecting the session's ability to translate complex concepts into actionable insights.
As research institutions continue to adapt to evolving assessment frameworks, scalable learning opportunities like this webinar series are proving essential. By equipping researchers with a clearer understanding of metrics, libraries and research offices can strengthen their role as strategic partners in advancing research quality and visibility. With more sessions on the horizon, the series is helping institutions build lasting capacity for responsible, informed, and impactful research practices.
The ONOS webinar offered valuable insights into how researchers can use metrics more effectively, but it's just one part of a broader dialogue around research assessment reform.
For a deeper understanding of how institutions and researchers navigate this evolving landscape, explore the white paper 'The State of Research Assessment'. It shares perspectives from over 6,600 researchers worldwide on current practices, challenges, and opportunities for more holistic and responsible approaches to evaluating research.
To keep up with the latest news and blogs, subscribe to The Link Alerts!