Quantifying research contributions in Haiti: Combining altmetrics and bibliometrics for life and Earth sciences


The academic world faces growing pressure to quantify research output and impact. This pressure has led to the development of various metrics, including bibliometrics (measuring publication numbers and citations) and altmetrics (measuring online attention and engagement). These metrics are used to evaluate researchers, universities, and journals, often influencing funding allocation and career advancement. While initially focused on individual researchers, the evaluation process has expanded to include institutions and research networks. This has fueled international collaboration and network density, while potentially neglecting research conducted outside these networks. Beyond traditional bibliometrics like the H-index and journal impact factor, new metrics like altmetrics are gaining traction. These metrics capture the broader impact of research beyond traditional citations, including online discussions and social media mentions.

This trend of evaluation is particularly relevant in developing countries like Haiti, where research evaluation is often informal and project-based. However, the implementation of bibliometric and altmetric indicators can help Haitian researchers participate in the global academic conversation and demonstrate the quality and impact of their work.

 

Theoretical Background

Bibliometrics: Measuring the Pulse of Research

Bibliometrics, a field within scientometrics, focuses on quantitatively analyzing scientific publications. It describes research landscapes, evaluates research performance, and monitors scientific progress. Developed by Pritchard in 1969, it focuses on counting and analyzing published research to gauge its scientific impact. Traditionally, bibliometrics has served as a key tool for research evaluations and performance measurement. By analyzing factors like the number of citations and the H-index, it allows researchers and institutions to demonstrate their productivity and impact.

However, in recent years, expectations for bibliometrics have grown significantly. It is now seen as a potentially powerful tool for evaluating research, with the potential to shape funding decisions and career advancements.

 Despite its growing importance, it's crucial to remember that bibliometrics alone cannot provide a complete picture of research quality and impact. Other factors, such as the originality of the research, its contribution to specific fields, and its real-world applications, also play a crucial role.

Traditionally, research impact has been measured by the number of citations a publication receives. This metric, championed by Konkiel, reflects how much a researcher's work influences their field. However, relying solely on citations can be limiting.

Hirsch's H-index offers a more nuanced approach by considering both the number of publications and their citation counts. This provides a more holistic view of a researcher's output and impact.
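To make the definition concrete, the H-index can be computed in a few lines: a researcher has index h if h of their papers have each been cited at least h times. A minimal sketch (the citation counts below are illustrative, not from the study):

```python
def h_index(citations):
    """Hirsch's h-index: the largest h such that the researcher
    has h papers with at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

# A researcher with papers cited 10, 8, 5, 4, and 3 times has h = 4:
# four papers each have at least 4 citations, but not five with 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Note how the index rewards sustained output: one highly cited paper alone cannot raise h above 1.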

However, the rise of social media and online platforms has led to the emergence of "altmetrics," an umbrella term for alternative metrics of research impact. Adriaanse and Rensleigh highlight the increasing integration of social media tools in academia, where researchers use platforms like blogs, Twitter, and Mendeley to share and discuss their work. This online activity, captured by altmetrics, reflects the real-world reach and impact of research beyond traditional scholarly circles.

ResearchGate, a social network for researchers with 20 million members, aims to revolutionize research by providing tools for collaboration and reputation building. It offers various metrics, including the RG Score, Total Research Interest, and H-index, to assess scientific impact and peer review.

The RG Score, based on user interactions with a researcher's work on the platform, aims to measure scientific reputation. However, its lack of transparency and reliance on journal impact factors raise concerns about its validity. Some studies criticize its limitations, including the inability to detect manipulation and the questionable practice of using journal impact factors to assess individual researchers.

Total Research Interest, a different metric, gauges the interest other researchers have in a researcher's work. It combines bibliometric and altmetric indicators, including reads, full-text reads, recommendations, and citations, to provide a broader picture of impact beyond traditional citations.

While ResearchGate offers valuable tools for connecting and collaborating, the validity of its metrics remains a subject of debate. Researchers should be aware of the limitations of these metrics and use them with caution when evaluating their own or others' work.

 

Method Used

Data from ResearchGate, Scopus, and Google Scholar were analyzed for 47 researchers with an RG Score above 1. The environment sector had the highest RG Score, followed by "others," agronomy, and health. This order matches the number of researchers per field, except for agronomy, which had the fewest researchers.

In terms of citations, agronomy had the highest total but also the highest concentration, with a single researcher contributing 97.4% of them. Citations in environment and health were more evenly distributed, while the environment sector had the highest H-index, followed by health and agronomy. All sectors showed significant increases in both RG Score and bibliometric indicators throughout 2020.

 

Increase in Citations and H-index

All fields showed significant increases in citations and H-index between January and December 2020.

Citation counts grew by 21.93% in agronomy, 27.78% in environment, 17.47% in health, and 114.77% in "others." H-index growth ranged from 33.33% to 140%, in the order "others" > agronomy > environment > health.
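The growth figures above are simple relative changes between the January and December values. A minimal sketch (the two counts below are hypothetical, not the study's actual data):

```python
def pct_increase(start, end):
    """Relative growth between two indicator values, as a percentage."""
    return (end - start) / start * 100

# Hypothetical example: a field's citation count rising from 285 to 347
# over the year corresponds to roughly a 21.75% increase.
print(round(pct_increase(285, 347), 2))  # 21.75
```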

 

Distribution of Productivity

Almost 64% of researchers have a publication score below 5, while only 10% have scores above 10. This indicates a large gap in productivity among researchers.

 

Refined Analysis with Stricter Criteria

To minimize bias from self-citations, researchers with an RG-Score ≥10 and H-index unaffected by self-citations were analyzed. Only 5 researchers met these criteria (1 in agronomy, 1 in environment, 3 in health). This analysis suggests potentially lower productivity in agronomy and environment compared to health.

 

Relationship between RG-Score and Bibliometrics

The average RG-Score was 52.59, with agronomy having the lowest. The average citation count in Google Scholar was 3,226, with health at 0.

While all fields showed increased productivity in 2020, there's a significant gap between researchers. A refined analysis suggests potentially lower productivity in agronomy and environment compared to health.

 

Statistical Analysis with XLSTAT and R

The Kolmogorov-Smirnov normality test was applied to various indicators like citation counts and H-index from different platforms. The null hypothesis (normal distribution) was not rejected for any of the indicators, suggesting they follow a normal distribution.
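A one-sample Kolmogorov-Smirnov test of this kind can be reproduced with `scipy.stats.kstest`. The sketch below uses simulated indicator values, since the study's raw data are not reproduced here, and standardizes them before comparing against the standard normal CDF:

```python
import numpy as np
from scipy import stats

# Simulated citation counts for 47 researchers (assumption: the real
# per-researcher values are not available in this summary).
rng = np.random.default_rng(42)
citations = rng.normal(loc=50, scale=15, size=47)

# Standardize, then compare the empirical distribution to the
# standard normal CDF with a one-sample Kolmogorov-Smirnov test.
z = (citations - citations.mean()) / citations.std(ddof=1)
stat, p = stats.kstest(z, "norm")

# p > 0.05 means the null hypothesis of normality is not rejected.
print(f"KS statistic = {stat:.3f}, p-value = {p:.3f}")
```

One caveat: estimating the mean and standard deviation from the same sample makes this p-value approximate; a Lilliefors-corrected test would be stricter.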

 

Distribution of Indicators:

ResearchGate reports more total citations than Google Scholar and Scopus (19.53% and 50.02% more, respectively).

Correlation Analysis:

Pearson's correlation coefficient was used to analyze the relationships between the RG Score and bibliometric indicators (citations and H-index) from the various platforms. Strong correlations were observed between:

·         RG Score and Scopus citations (r = 0.99)

·         ResearchGate citations and Google Scholar citations (r = 0.928)

·         RG Score and Scopus H-index (r = 0.90)

·         RG H-index and Scopus H-index (r = 0.997)
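Pearson's r measures how strongly two indicator series move together linearly, from -1 to 1. A minimal sketch of the computation (the two series below are illustrative, not the study's data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson's correlation coefficient between two indicator series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    # Covariance of the centered series over the product of their norms
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Illustrative counts for five researchers on two platforms:
# similar values on both platforms yield r close to 1.
rg = [12, 40, 7, 55, 23]
scopus = [10, 38, 9, 60, 20]
print(round(pearson_r(rg, scopus), 3))  # ≈ 0.990
```

Note that a high r indicates the two series rank researchers similarly; it does not mean the underlying sets of cited works overlap by that percentage.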

 

 

Conclusion

This study assessed the scientific productivity of Quisqueya University researchers in agronomy, environment, and health. All fields showed significant increases in the RG Score (altmetric) and in bibliometric indicators (citations, H-index). Strong positive correlations were found between RG indicators and those from established platforms such as Scopus, supporting the validity of the RG Score as a productivity measure. However, strong negative correlations were observed between RG indicators and Google Scholar indicators, a discrepancy that requires further investigation to understand its underlying causes.

DOI: https://doi.org/10.19044/esj.2021.v17n21p316

