Research Evaluation & Assessment Service (REAS)

The Research Intelligence community supports services that enhance research impact: increasing the visibility of researchers and their scholarly outputs, optimizing publishing strategies, and evaluating research performance. A team of bibliometric practitioners at the University Library can advise on effective strategies for maximizing research impact and help the research community understand research evaluation metrics and intelligence. The community also engages in developing new methodologies for measuring the quality, relevance, and societal impact of research. Our services include:

    • Awareness: advocating sensible use of metrics-based research assessment and analytics
    • Guidance on how and where faculty can learn about, locate, and apply bibliometric indicators to their scholarly output (best practices)
    • Training: developing e-modules and instructions
    • Giving advice on publication strategies
    • Increasing researchers' visibility (author IDs, including ORCID)
    • Supporting bibliometric analyses for funding, and for reappointments, tenure, and promotion
    • Developing and maintaining Research impacts, an e-course covering the three most widely used databases for measuring research impact: Web of Science, Scopus and Google Scholar
    • Engaging in research projects:
      • Quality and Relevance of Research in Law, Social Sciences and Humanities (with CWTS and EIPK)
      • Research Intelligence project (with CWTS, EUR/ErasmusMC and TU Delft).

The Research Intelligence team:
    • Drs. Gert Goris, coordinator Research Intelligence Community EUR
    • Drs. Gusta Drenthe, faculty liaison librarian, Dept. Research Services of the University Library, member Research Intelligence Community EUR
    • Drs. Judith Gulpers, faculty liaison librarian, Dept. Research Services of the University Library, member Research Intelligence Community EUR

When using metrics for impact measurement, we suggest the following rule of thumb: if a researcher shows strong citation metrics, it is very likely that she or he has made a significant impact on the field. The reverse, however, is not necessarily true. Weak citation metrics may be caused by a lack of impact on the field, but also by one or more of the following:

  • working in a small field (therefore generating fewer citations in total);
  • being an early career academic;
  • publishing in a language other than English;
  • publishing mainly in or as books.

Furthermore, we adhere to the Leiden Manifesto for Research Metrics, ten principles to guide research evaluation: 'a distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account, and evaluators can hold their indicators to account'.