Bibliometrics uses statistical methods for quantitative analysis of scientific publications and their citations. Bibliometric analyses provide information on the visibility and impact of research by individual researchers, project groups, and whole institutions.
Personal indicators form the basis of bibliometric analyses; they are derived from a researcher's citation counts and number of publications. A prominent example is the h-index.
Example: A researcher’s h-index is 5 if at least 5 of their publications have each been cited 5 or more times.
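The definition above can be sketched as a short calculation. This is a minimal illustration, not an official tool of any database; the function name and the sample citation counts are made up for the example.

```python
def h_index(citations):
    """Compute the h-index: the largest h such that at least h
    publications have been cited h or more times each."""
    h = 0
    # Walk through the citation counts from highest to lowest;
    # position i qualifies as long as the i-th paper has >= i citations.
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical citation counts for 7 publications:
print(h_index([10, 8, 6, 5, 5, 3, 1]))  # → 5
```

Here the fifth-most-cited paper has 5 citations while the sixth has only 3, so exactly 5 papers meet the "cited 5 or more times" threshold.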
Journal-based indicators measure the influence or impact of scientific journals. They are not suitable for evaluating the performance of individual researchers, as they do not allow any conclusions about the quality of individual published articles. They merely serve as a basis for assessing the visibility of articles within the specialist community.
The best known is the so-called Journal Impact Factor (JIF). It indicates how often, on average, the articles published in a particular journal during the two preceding years were cited in a given year.
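The standard two-year JIF can be illustrated with a small calculation. The function name and the figures below are hypothetical; they only demonstrate the ratio behind the indicator.

```python
def journal_impact_factor(citations_in_year, citable_items_prev_two_years):
    """Simplified two-year JIF for year Y: citations received in Y to
    articles from Y-1 and Y-2, divided by the number of citable items
    the journal published in Y-1 and Y-2."""
    return citations_in_year / citable_items_prev_two_years

# Hypothetical journal: 600 citations in 2024 to its 200 articles
# published in 2022 and 2023.
print(journal_impact_factor(600, 200))  # → 3.0
```

A JIF of 3.0 thus means that, on average, each recent article of the journal was cited three times in the reference year; it says nothing about how often any single article was cited.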
The Ruhr-Universität has licensed both Web of Science and Scopus. Both are interdisciplinary citation databases and as such form the data basis of the most important international research and university rankings. They already contain basic analytical tools, e.g. for determining the h-index. The Journal Citation Reports and Essential Science Indicators can also be accessed via Ruhr-Universität. Besides the University Bibliography, SciVal serves as the most important and powerful analytical tool. You may access it via the university network and need to create an account. If you have already created an account with Scopus, you can also use it for SciVal.
Bibliometrics and bibliometric analyses have limitations. They are never a substitute for the qualitative assessment of research, e.g. through peer review. For a comprehensive analysis, several metrics should always be considered together, and great care is needed when selecting metrics and interpreting the results. The informative value of such an evaluation also depends heavily on the underlying data: depending on the focus and coverage of the database used, the selected indicators may yield different results. Furthermore, interdisciplinary comparisons are not possible due to the large differences in publication behavior across fields.