Research Evaluation and Assessment Service - REAS
Enhancing research impact by increasing the visibility of researchers and their scholarly outputs, and by optimizing publishing strategies, is a core service the University Library offers.
Evaluating research performance is another service the library supports with its resources and tools. A team of bibliometric practitioners can advise on effective strategies for maximizing research impact and can help the research community understand research evaluation metrics and intelligence. The library also engages in research on the development of new methodologies for measuring the quality and relevance of research and its societal impact.
- Awareness: advocating sensible use of metrics-based research assessment and analytics
- Guidance: showing faculty how and where to learn about, locate, and apply bibliometric indicators to their scholarly output (best practices)
- Training: developing e-modules and instructions
- Advising on publication strategies
- Increasing researchers' visibility (author IDs)
- Supporting bibliometric analyses for funding, and for reappointments, tenure, and promotion
- Developing and maintaining 'Research impacts: sources and metrics', an e-course covering the three most widely used databases for measuring research impact: Web of Science, Scopus, and Google Scholar
- Engaging in the research project Quality and Relevance of Research in Law, Social Sciences and Humanities (with CWTS and EIPK).
- drs. Gert Goris, Deputy Director University Library & Project leader Research Intelligence
- drs. Gusta Drenthe, faculty liaison librarian, Department Academic Services, member of the EUR Research Intelligence community
- drs. Judith Gulpers, faculty liaison librarian, Department Academic Services, member of the EUR Research Intelligence community
When using metrics for impact measurement, we would like to suggest the following rule of thumb: if a researcher shows good citation metrics, it is very likely that she or he has made a significant impact on the field. However, the reverse is not necessarily true. If a researcher shows weak citation metrics, this may be caused by a lack of impact on the field, but also by one or more of the following:
- working in a small field (therefore generating fewer citations in total);
- being an early career academic;
- publishing in a language other than English;
- publishing mainly (in) books.
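The caveats above apply to any single citation metric. As an illustration (the source does not name a specific metric), one widely used citation metric is the h-index: the largest number h such that a researcher has at least h publications cited at least h times each. A minimal sketch of its computation, with hypothetical citation counts, shows why a small field or a short career mechanically caps the score:

```python
def h_index(citations):
    """Compute the h-index: the largest h such that the researcher
    has at least h publications with at least h citations each."""
    h = 0
    # Rank publications from most to least cited; h is the last rank
    # at which the citation count still meets or exceeds the rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical early-career profile: few papers limit the h-index
# even if each paper is well cited relative to its (small) field.
print(h_index([45, 30, 22, 15, 9, 6, 3, 1]))  # → 6
```

Note how the metric is bounded by the number of publications, which is exactly why weak citation metrics need not indicate weak impact for early-career academics or researchers in small fields.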
Furthermore, we adhere to the Leiden Manifesto for Research Metrics, ten principles to guide research evaluation: 'a distillation of best practice in metrics-based research assessment so that researchers can hold evaluators to account, and evaluators can hold their indicators to account'.