Critical issues
Does the African academy need its own citation index?
David Mills
Associate Professor, Pedagogy and the Social Sciences
University of Oxford
david.mills@education.ox.ac.uk
Toluwase Asubiaro
Doctor of Philosophy
African Research Visibility Initiative, Calgary, Alberta, Canada
Department of Information Science, University of South Africa, South Africa
Why does being indexed in Web of Science and Scopus matter so much for African journals? And why is getting indexed so difficult? In this paper we revisit the history of the first citation index, the logic behind its highly selective coverage, its persistent under-representation of Africa's journals, and its symbolic importance for many researchers. We ask if the solution is to create an alternative African citation index, or if there are other ways to promote the visibility and findability of African journals. Eugene Garfield's company, the Institute for Scientific Information (ISI), launched the first Science Citation Index in 1963. An analysis of extant ISI documentation shows that 95% of the first 613 science journals indexed were published in Europe and America. This coverage changed author behavior and journal choices and reinforced existing status hierarchies. Garfield later justified journal selection decisions mathematically, defending his decision to prioritize "core" journals, and coining what he called "Garfield's law of concentration". In 1973, the ISI launched a Social Science Citation Index, indexing 1,000 core social science journals, again with no African representation. In the 1990s, the indexes were digitized, allowing their data to be mined. The creation of university rankings amplified the reputational importance and commercial value of the indexes. Today Web of Science and Scopus continue to use the "citedness" of candidate journals by journals already within the index to inform selection decisions. As a result, journals published in the global peripheries, in small fields, or in languages other than English struggle to get indexed. In 2023, if one excludes South Africa, only around 60 of the 30,000-plus journals indexed in Web of Science were published from sub-Saharan Africa. One response is to create an alternative Africa-focused journal index and database. We end by describing the history of attempts to create such an index, including current initiatives. Another is to promote the international visibility and findability of African journals through the provision of high-quality metadata, the use of DOIs and hosting on international portals.
Keywords
Citation indexes, Open Science, journal publishing, Africa, bibliodiversity
Plan of the paper
Introduction
A short history of Garfieldās index
Why are so few African journals indexed in Web of Science and Scopus?
Is an African citation index the solution?
Conclusion: Global science, African science?
Introduction
In November 2023, the editor of the Nigerian Journal of Technology announced that Volume 42, issue 3 of the serial was a "brand new" journal, because it had just been accepted for indexing in Scopus (Nnaji, 2023). This had, Nnaji pointed out, been a long journey. The journal had first been published online in 2012, had already applied for indexing twice, and had upgraded its website. In case potential authors doubted the veracity of the claim, the editorial also included a link to the Scopus tracking tool[1]. A few months later, the Nigerian political scientist Jideofor Adibe was also celebrating. His March 2024 newsletter marked 20 years of journal publishing by his company Adonis and Abbey. In it he reminded readers that a program set up to "help address the question of the high mortality of journals published by Africans" had led to it launching 21 journals, including seven indexed in Scopus (Adibe, 2024).
For every winner in the indexing game there are many losers. High-profile Nigerian medical journals have been delisted from Scopus, including the West African Journal of Medicine (discontinued from Scopus in 2021), the African Journal of Medicine and Medical Sciences and the Nigerian Journal of Medicine (both discontinued in 2016). Despite being amongst the top 10 journals for Nigerian scholars, and being indexed in other databases such as the WHO's African Index Medicus, all three had struggled to attract submissions and keep up their publication schedules. As journals of national or regional medical associations, some had been publishing since the 1980s, relying on the unpaid labor of volunteer editors and reviewers. The West African Journal of Medicine was revived in 2021, after a seven-year publishing hiatus. The African Journal of Medicine and Medical Sciences has not published an issue since 2021, whilst the Nigerian Journal of Medicine has been published since 2021 by Medknow, an imprint of the international publisher Wolters Kluwer.
Why does being indexed in Web of Science and Scopus matter so much for African journal editors? Web of Science and Scopus, a rival index set up by Elsevier in 2004, have become influential multidisciplinary citation indexes, partly because of their selectivity. They claim to provide the global coverage necessary to support the science communication system (Schott, 1991), yet they index a much smaller fraction of journals and research from the majority world than of that published in Western Europe and North America (Archambault et al., 2006; Mongeon & Paul-Hus, 2016; Rafols et al., 2019; Asubiaro et al., 2024). Despite repeated critiques of these geographical disparities in coverage, the dominance and influence of these indexes remains.
In this article, drawing on extant historical sources, we show how linguistic, geographical and disciplinary biases were built into Garfield's Science Citation Index from the very start. Initial decisions about which journals to index reified existing status divides between science in what was in the 1980s called the "first" and "third" world (Gibbs, 1995). This was amplified through citation metrics-based journal selection and evaluation processes. Authors prioritized indexed journals for publication, strengthening the reputation of existing academic networks and geographies. Journals from across the majority world suffered as a result. The legacy of these initial indexing decisions is still visible in the indexes' contemporary coverage. How might this be remedied? If so few African journals are indexed, should Africa build its own alternative citation index? Or would this just create new inequalities? The last part of the paper explores efforts, past and present, to improve the international visibility and findability of African scholarship, and the potential now offered by the Open Science movement.
A short history of Garfieldās index
The idea for an academic citation index was first developed by the American "information scientist" Eugene Garfield in the 1950s. Fascinated by the challenge of managing the growing information flows generated by the post-WW2 growth in science, he had set up his own company in 1955: the Institute for Scientific Information (ISI). Its first publication was Current Contents, a stapled booklet of the contents pages of life science journals, which research librarians used to decide which journals their institutions should subscribe to. Launched in 1958 with 150 journals, its popularity rapidly grew. By 1967 Current Contents covered 1,500 journals in physics, chemistry and the life sciences, supported by university library and corporate subscriptions.
His next invention was inspired by Shepard's Citations, a paper-based US legal research tool. Dating back to the 1870s, it allowed lawyers to research case law and track precedent. Garfield saw the "shepardization" of science as a way of managing the exponential increase in scientific publishing after the Second World War (Garfield, 1955). He saw it as helping librarians identify the most important scientific developments and cope with what he later called "iatrogenic information overload" (1984). He also felt that scientists ought to know how an article they were citing had itself been cited, and that links to earlier work would help them to understand the "transmission of ideas" and the intellectual structure of thought. By counting the total number of citations, one could measure the "impact factor" of an article, and hence quantify its importance (see also Small, 2017).
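Garfield's intuition was later formalized at the journal level. As standardly computed today (a definition that postdates the 1955 paper, given here for orientation), the two-year journal impact factor for year y is

\mathrm{JIF}_{y} = \frac{C_{y}(y-1) + C_{y}(y-2)}{N_{y-1} + N_{y-2}}

where C_y(t) is the number of citations received in year y by items the journal published in year t, and N_t is the number of citable items published in year t. A journal whose 2021-2022 articles received 500 citations in 2023, from 200 citable items, would have a 2023 impact factor of 2.5.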
The story of Garfield's index is also a story of cold war rivalry. As Russia and the US competed for scientific influence, both sides turned to bibliometrics to measure and track science. Garfield was aware of earlier attempts to create comprehensive catalogues of science (1961). Csiszar (2023) traces the origins of Garfield's index to what he calls "bibliometric imperialism" (Csiszar, 2023, p. 105), such as the unsuccessful efforts by the Royal Society of London to create an International Catalogue of Scientific Literature in the years leading up to World War 1. Work by historians of science (e.g. Guédon, 2001; Daling, 2023; Fyfe et al., 2017) also traces the impact of commercial journal publishing on this growing information flow.
Garfield's journey is recounted in carefully footnoted detail by Wouters (1999), who notes how the "Sputnik crisis" of 1957 "turned the librarians' problem of bibliographic control into a national information crisis" (Wouters, 1999, p. 62). Very early on, Garfield sought to make a pragmatic case for selectivity, arguing that "lack of complete coverage is not necessarily an argument against a citation index. It is in fact an argument in its favor" (1955, p. 109). Garfield insisted that decisions about which journals to index were based on citation data and expert guidance, but they inevitably also drew on ISI's experience of publishing Current Contents, and its track record of selecting the most significant and important journals for abstracting, based on feedback from researchers and librarians. Csiszar suggests that it was simply a "business decision based on the needs of potential and actual subscribers" (2023, p. 120).
After finally getting funding from the US Navy, Garfield was able to start work assembling the Science Citation Index (Garfield, 1963). After several prototypes built with genetics journals, the first index, published in 1963, assembled citation data from 613 scientific journals. Whilst this included journals from 28 countries, our analysis shows that 70% were published from the US or UK, and nearly all the rest from Europe. The selection reflected a US-based perspective on the scientific landscape. Ten Russian journals were included, along with two from China, three from India, seven from Japan and four from Latin America, but none from Africa. The academic geography of the Euro-American publishing economy was hard-wired into the index from the start.
The initial challenge was keeping the task and costs manageable, given that millions of footnotes, references and citations needed to be typed in manually, stored on magnetic disks and processed on an early IBM computer. Garfield made the most of early computer technology to keep costs down. ISI employed a team of 100 data operators adding data to a central mainframe via desk tapes. Working two shifts five days a week, they were able to process 25,000 references a day (Garfield, 1979). As Csiszar notes, turning footnotes into citations was a "massively labor-intensive operation" (2023, p. 120). The index grew, and as the SCI began to shape purchasing decisions, journal publishers and editors "clamored to get their journals covered" (Small, 2017, pp. 605-606). By 1966 the Science Citation Index included 1,150 journals, increasing to 2,000 by 1968. Gradually more non-European journals were indexed, but their overall proportion remained small, given the parallel growth in the number of US and European-published serials. As more "international" journals were launched or taken over by commercial publishers, having a journal impact factor became a kitemark of a serial's importance and status (Mills, 2024). Over the next two decades the index doubled in size, and by 1990 it was indexing around 4,000 journals. Despite Garfield's marketing efforts, including several films, and his success at selling his index to Soviet state agencies (Aronova, 2021), the index never made a profit for ISI.
As soon as it was launched, sociologists and science scholars began questioning the coverage of the index, and the meaningfulness of citation data for different disciplines and regions, given their very different citation cultures (Cole & Cole, 1971). Some mocked the idea that objectivity could be achieved by "not reading the literature" (Garfield, 1972; Oliver, 1970). Concerns grew about the disciplinary, geographical and linguistic selectivity intrinsic to these indexes. The geographical and linguistic bias of the Science Citation Index (now called Web of Science) against "peripheral science" was first highlighted by science scholars in the 1970s (Narin, 1976). Narin drew on available evidence to suggest that there were between 550 and 650 African journals being published across the continent in the early 1960s, yet none of these was indexed.
The debates continued. In the 1990s Scientific American published a critique of the citation indexes' systematic discrimination against what were called "third world journals" (Gibbs, 1995). Analyzing 1994 SCI data, Gibbs showed that the share of journals indexed from the third world had dropped by 40% over the decade from 1983 to 1993, from around 2.5% of all journals in the index down to around 1.5% (Gibbs, 1995, p. 194). The article also highlighted the marked drop in the impact factors of several Brazilian and Indian journals over the same period.
Garfield was always quick to respond to these critiques in his monthly "Essays of an Information Scientist", published in Current Contents, questioning the quality of third-world journals (Garfield, 1983, 1997; Goodwin & Garfield, 1980). Under pressure to justify the index's selectivity, Garfield had long been aware of Bradford's law of scattering, named after the British librarian Samuel Bradford, which held that the most important literature in any scientific field is published in only a narrow group of journals. As early as 1955, Garfield had highlighted the 80/20 distribution of significant articles. In 1971, he returned to this as a justification for journal selectivity. He claimed that ISI's own work had confirmed what he called "Garfield's law of concentration": that a "basic list of 500 to 1000 journals will account for 80 to 100% of all journal references" (Garfield, 1971, p. 223) in a field. He noted that "25 journals accounted for 20-25% of the 4 million citations" processed for the 1969 index, before archly pointing out that the "implications of this finding for establishing future libraries, especially in developing countries, should be quite obvious" (Garfield, 1971, p. 223).
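The arithmetic of such concentration is easy to illustrate. The short Python sketch below assumes a hypothetical Zipf-like citation distribution across 10,000 journals (the exponent is our assumption, not ISI's 1969 data) and counts how many top-ranked journals carry 80% of all citations:

import numpy as np

# Illustrative only: assume journal citation counts follow a Zipf-like
# rank distribution; the exponent 1.3 is an arbitrary assumption.
n_journals = 10_000
ranks = np.arange(1, n_journals + 1, dtype=float)
weights = ranks ** -1.3
cumulative_share = np.cumsum(weights) / weights.sum()

# Smallest "core" of top-ranked journals accounting for 80% of citations.
core = int(np.searchsorted(cumulative_share, 0.80)) + 1
print(f"{core} of {n_journals:,} journals carry 80% of citations")

The steeper the assumed skew, the smaller the core: precisely the property Garfield invoked to defend a selective index.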
The costs of inputting citation data proved a drain on ISI resources, especially as the index grew (Small, 2017). Thomson Reuters acquired ISI from Garfield in 1992 and fully digitized the citation database. The growth in computing power, and the potential this provided for large-scale data mining and analysis, changed the data landscape, as did the arrival of the internet. The index was rebranded as Web of Science; the business was acquired by private equity in 2016 for $3.5 billion and became Clarivate (Pranckutė, 2021). The global influence of citation data was amplified by its use in global university rankings and the rise of national research evaluation policies.
The "core collection" of Web of Science now covers more than 21,500 journals within four different indexes: Science Citation Index Expanded (SCIE), Social Sciences Citation Index (SSCI), Arts & Humanities Citation Index (A&HCI) and the Emerging Sources Citation Index (ESCI)[2]. Clarivate Analytics, the owner of Web of Science and many other information analytics and data products (e.g. ProQuest, EndNote), is a major multinational company with an operating revenue of $2.7 billion in 2022.
Elsevier, the largest of the academic publishing houses, based in Amsterdam and London, launched a rival index, Scopus, in 2004. Its coverage is around 30% larger than that of Web of Science and continues to expand: in 2023 Scopus indexed 29,750 active journals, as well as journal pre-prints and conference papers[3]. Elsevier markets related services to universities, including support on preparing journals for an application for indexation[4]. Elsevier is part of the RELX group, which has an operating revenue of £8.5 billion and a profit margin of 31%. Both index owners benefit from the European dominance of a "global" research publishing industry that can be traced back to the start of the twentieth century (Mills, 2024).
Why are so few African journals indexed in Web of Science and Scopus?
The Ulrichsweb journal directory is a database of almost 90,000 academic journals. Of the approximately 2,200 active academic journals from sub-Saharan Africa it includes, only 169 are indexed in Web of Science and 178 in Scopus (Asubiaro & Onaolapo, 2023). Of these, more than 100 are published from South Africa. If one excludes South Africa, around sixty journals published from sub-Saharan Africa are currently indexed in Scopus. Only a handful are published wholly or partly in French, including two from Senegal, one from Mali, and a few from Morocco, Algeria and Tunisia. The number indexed in Web of Science is lower still. The Open Access journal directory DOAJ indexed only 213 sub-Saharan African journals in 2023, of which 142 were from South Africa. This is fewer than Scopus or Web of Science. Its coverage of Francophone African Open Access journals is particularly poor, with only one journal each from Côte d'Ivoire, Burkina Faso and Mali, and none from Senegal. By comparison, more than 700 African academic journals are hosted on the respected AJOL platform (African Journals Online), of which around two-thirds are active.
About a quarter of the world's journals in Ulrich's are indexed in Web of Science and Scopus, but some regions are covered much more systematically than others. Sub-Saharan Africa is one of the least represented regions (Asubiaro, 2023; Asubiaro et al., 2024). According to Asubiaro, Onaolapo and Mills (2024), 32% of European journals listed in Ulrich's directory are indexed in Web of Science, a figure similar to that for North American journals. Conversely, a mere 8% of journals from Central and Southern Asia, Eastern and South-eastern Asia, and sub-Saharan Africa listed in Ulrich's directory are indexed in Web of Science. Europe and North America are thus "over-represented" in Scopus and Web of Science, with over-representation rates ranging from 19% to 35%. These geographical inequalities are self-reinforcing, driving publishing choices across the world (Huang et al., 2020; Selten et al., 2020). Researchers in the majority world are incentivized to publish in, and cite, indexed journals, undermining local and non-mainstream journals and national science systems (Rafols et al., 2019). For instance, many Nigerian research universities have instituted promotion policies that require publications in "international" rather than Nigerian journals (Mills & Branford, 2022). This is because "international" journals, often defined vaguely, are seen as more reputable and hence a better judge of the quality of research. This has raised further questions as to what counts as a reputable "international" journal (Omobowale et al., 2014). In some cases, such as at the private Covenant University in Nigeria, "international" is defined more specifically as journals indexed in the top quartile of WoS and Scopus.
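Returning to the coverage figures above: one simple way to express such over- or under-representation is to compare a region's share of an index with its share of Ulrich's. The Python helper below is illustrative, with invented figures, and is not necessarily the exact method of Asubiaro, Onaolapo and Mills (2024):

def representation_ratio(indexed_region, indexed_total, ulrichs_region, ulrichs_total):
    # A ratio above 1 means the region is over-represented in the index.
    return (indexed_region / indexed_total) / (ulrichs_region / ulrichs_total)

# Hypothetical figures for illustration only.
print(representation_ratio(indexed_region=8_000, indexed_total=21_500,
                           ulrichs_region=28_000, ulrichs_total=90_000))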
The dynamism of non-Anglophone research ecosystems across the majority world, along with their research outputs and citations, is better captured in regional databases, such as SciELO, Redalyc and REDIB, serving Latin America, Garuda, serving Indonesia, and CNKI in China. Comparisons with these databases make clear how far research from countries outside Western Europe and North America is under-represented in Web of Science and Scopus. These indexing inequalities are compounded by citation practices, whereby individual researchers are most likely to cite indexed work. Whilst there is a growing volume of African scientific production, much of it continues to be overlooked by these indexes (Rabkin et al., 1979; Tijssen, 2007; Asubiaro et al., 2024). Given the exponential growth in scientific outputs and citations, existing citation inequalities continue to widen (Horton, 2022; Vanderstraeten & Vandermoere, 2021). Nielsen and Andersen (2021) show how the top 1% of "highly cited" global scientists are extending their share of publications and citations. South Africa is the only exception to the general pan-African trend, increasing its concentration of "highly cited" researchers.
Sixty years after Garfield launched his first index, getting journals indexed in WoS or Scopus remains challenging. The indexes have exacting metrics-based selection and evaluation policies, and regularly delist "underperforming" journals. Web of Science[5] uses 28 different criteria to evaluate journals. Journal impact criteria include assessment of content significance and three citation-based metrics: author citations, editorial board citations, and comparative citation data.
Scopus evaluates the standing of the 3,500 journal applications it receives each year according to the "citedness of journal articles in Scopus". It also measures the "diversity in geographical distribution" of editors and authors. Scopus uses citation-based peer benchmarks as part of what it calls "title enrichment". Journals must have a self-citation rate no higher than 200% of the average for their field, and citation rates, numbers of articles, and numbers of clicks on Scopus no less than 50% of the field average. These metrics discriminate against small journals and those that cater for a relatively autonomous or specialist research community, such as many journals published from African universities. The same metrics are used to assess and delist "underperforming" journals. Between 2016 and 2020, 536 journals were removed from Scopus, including several from Nigeria.
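The benchmark logic described above can be sketched as a set of threshold checks. All field averages and journal figures below are invented for illustration; this is not Scopus code:

def benchmark_flags(journal, field_avg):
    # True marks a benchmark the journal fails, per the thresholds above.
    return {
        "self_citation": journal["self_citation_rate"] > 2.0 * field_avg["self_citation_rate"],
        "citedness": journal["citations_per_article"] < 0.5 * field_avg["citations_per_article"],
        "articles": journal["articles_per_year"] < 0.5 * field_avg["articles_per_year"],
        "usage": journal["clicks"] < 0.5 * field_avg["clicks"],
    }

# Hypothetical numbers: a small, specialist journal against its field average.
field = {"self_citation_rate": 0.15, "citations_per_article": 4.0,
         "articles_per_year": 120, "clicks": 50_000}
journal = {"self_citation_rate": 0.36, "citations_per_article": 1.5,
           "articles_per_year": 30, "clicks": 8_000}
print(benchmark_flags(journal, field))

Small journals serving specialist communities fail such relative thresholds almost by construction, whatever the quality of their content.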
Both indexes are aware that citation data can be manipulated. Journals within these indexes attract ever more submissions, because of the pressures (and incentives) on researchers to publish. There have been a number of cases of journals lowering their reviewing standards or having special issues "hijacked" (Moussa, 2021). Both indexes have also been accused of unintentionally including so-called "predatory" journals, and have responded by implementing more stringent journal monitoring. Despite the claims of their designers and advocates (Baas et al., 2020; Birkle et al., 2020), the integrity of these indexes is constantly under question. A growing community of scientific watchdogs and "sleuths" is quick to point out retractions, especially when these involve large numbers of papers in a guest-edited special issue[6]. Within an "author-pays" publishing model, publishers and journals are rewarded for accepting more papers, but rapid expansion comes with quality challenges. Growing numbers of indexed journals are being delisted, either because of breaches in editorial processes or because of unusual citation and authorship trends. In response, the indexes set ever more stringent bibliometric standards for inclusion, using AI tools to detect citation anomalies and authorship patterns. Over-reliance on citations also encourages citation gaming (Biagioli & Lippman, 2020). In an unequal system, many resort to tactics such as self-citation or citation clubs to increase their visibility and scores.
Questions continue to be asked about the indexes' methodological biases (Gallagher & Barnaby, 1998; Seglen, 1992), linguistic biases (Harzing, 2016; Mas-Bleda & Thelwall, 2016; Vera-Baceta et al., 2019), and different cultures of citation (Callaham et al., 2002; Velho, 1986). Critical work has also highlighted the over-representation within the indexes of research published in English (Albarillo, 2014), of work from the UK and America (Gingras & Khelfaoui, 2018; Luwel, 1999) and from the natural and physical sciences (Martín-Martín et al., 2018; Mongeon & Paul-Hus, 2016).
Recent analyses reveal the consequences of excluding journals published in languages other than English from the indexes. More than 95% of Web of Science (and 92.5% of Scopus) indexed documents are in English (Vera-Baceta et al., 2019). The Ulrichsweb database of journals lists 1,800 African journals publishing in English, around 97.5% of all Africa-published journals, along with 90 published wholly or partly in French, and 73 that publish articles in Afrikaans. In Web of Science and Scopus, journals that publish in Afrikaans outnumber those publishing work in French by three to one.[7] This underlines the under-representation of Francophone West African research within these indexes. Indeed, Afrikaans is among the top 20 non-European scientific languages in Web of Science and Scopus, ranking higher than Arabic and Persian.
A related challenge is that many of Africa's journals are oriented to African concerns and debates. These can seem less relevant to global audiences, and so such work is less likely to be published, and sometimes less likely to be researched. Some African scholars, in a bid to become globally visible, abandon local problems in favor of "Northern" conceptual and theoretical debates (Nyamnjoh, 2004). This can undermine local research ecosystems, especially if a strong national or regional publication profile counts for little when it comes to promotion decisions.
Is an African citation index the solution?
Some see the creation of an African citation index as the solution. This has long been the ambition of the Nigerian information scientist Williams Nwagwu, who argues that an autonomous "citation index could be used to leverage the limited publicity of African resources" (Nwagwu, 2006, p. 11). Nwagwu is critical of the international citation indexes, the way they "homogenize, centralize and globalize scholarly performance criteria" (2006, p. 228), and their lack of "deference to global diversity and complexity" (2006, p. 228). He led CODESRIA's[8] initiative to develop such an index in the 2000s, calling for it to be modelled on the principle of "Africanism, recognizing and putting African knowledge into a global perspective" (2006, p. 238). When he presented the idea at a 2006 conference at the Centre for African Studies in Leiden, it was enthusiastically endorsed (Nwagwu, 2006).
CODESRIA began to seek institutional backing for the index. The hope was that Africa's universities and disciplinary associations would see the value of the initiative and provide seed funding. But many of these associations operate on a shoestring, and underfunded university libraries were little better placed. The proposals also failed to attract the support of donors, national governments, or the African Development Bank. The most interested party was Elsevier, which was enthusiastic about appending an African index to Scopus. Some within CODESRIA pursued this option, whilst others suspected that it would simply reinforce the influence and market share of Elsevier's own index. Working with Elsevier also ran against CODESRIA's long-standing commitment to creating a pan-African knowledge community as a way of challenging knowledge inequalities (Hoffman, 2018). There was also the question of whether an African citation index would create new status divides. As negotiations stalled in 2017, a new CODESRIA director faced funding shortfalls and difficult negotiations with the organization's main Scandinavian funders about its future. The decision was taken that an index was less of a priority than CODESRIA's research training, mentoring and publications programs, and it was not included in the organization's strategic plans.
Some still aspire to create a comprehensive African journal database as a way of promoting the visibility and findability of African scholarship. This is the vision of AfricaRVI (the African Research Visibility Initiative), set up by the authors along with Nigerian colleagues in 2022. Our aim is to create an inclusive index of around 1,000 active African journals in all the scholarly languages used on the continent, and to generate useful bibliographic and citation data for universities and researchers. Inevitably it will have to establish minimal criteria for inclusion, such as a track record of regular, timely publishing, and basic technological standards. Still at an early stage of development, the initiative may help plug the indexing gap and generate African citation data, but like all indexes, it too will face questions of selectivity and funding.
There are other ways to promote the visibility and findability of African journals, such as the production and sharing of high-quality metadata, the use of DOIs, and the use of platforms such as AJOL, Project MUSE and JSTOR, as well as aggregators like EBSCO. Open Science practices also help with findability. Funders such as Wellcome are now seeking to build professional capacity within African journals, and INASP has long promoted journal publishing standards. The JPPS framework (Journal Publishing Practices and Standards), adopted by AJOL (African Journals Online), is one example: it assesses journals against a three-star rating system to support and reward quality improvements.[9]
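Well-formed metadata attached to a registered DOI is what makes an article machine-findable. As a minimal illustration, the Python sketch below queries the public Crossref REST API for a DOI's metadata record, using this article's own DOI and assuming it is registered with Crossref:

import requests

# Fetch the Crossref metadata record for a DOI (assumes Crossref registration).
doi = "10.57832/18yw-xv96"
response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
response.raise_for_status()
record = response.json()["message"]
print(record.get("title"), record.get("container-title"), record.get("publisher"))

Any aggregator, discovery service or index can harvest the same record, which is why depositing rich metadata matters as much as the DOI itself.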
Conclusion: Global science, African science?
Over the course of 60 years, the reach and power of the citation index has come a long way from Garfield's early experiments. Web of Science and Scopus are global infrastructures owned by major multinational companies. Bibliometric data underpins university rankings and researchers are expected to prioritize publishing in indexed journals. Citation data has enabled the post-war world science system to visualize itself as one vast connected circuit of knowledge flows (Mills, 2024), even as these infrastructures reinforce Euro-American dominance of the global research economy.
Citations are no substitute for the relationships of trust that underpin many scientific communities, yet the geographies of academic credibility are often exclusionary (Mills & Robinson, 2021). The linguistic inequalities created by an Anglophone research economy (Chen & Chan, 2021; Rowlands & Wright, 2022) have profound consequences for bibliodiversity (Shearer et al., 2020; Albornoz et al., 2020). Writing from an African perspective, Bhakuni and Abimbola (2021) point out that the credibility "surplus" of one social or epistemic group comes at the cost of a credibility "deficit" for a marginalized group.
In the future, are we likely to see more African humanities and social science journals being indexed, following the path established by Adonis and Abbey? Scopus offers consultancy support to African universities on preparing journals for indexation, providing advice, guidance and training to librarians, researchers and editors alike. This may be an astute commercial strategy for Elsevier, as the continent's research infrastructures continue to develop. African university leaders are aware of the reputational rewards that accrue from getting journals indexed, and some are willing to pay for this consultancy advice. Yet Web of Science and Scopus are designed to support knowledge flows at a global scale, rather than nurture national research cultures and knowledge ecosystems. The challenge is to be simultaneously local and global.
Some scholars and Open Access-oriented librarians call for "scaling small", rejecting the market-led assumption that publishing requires economies of scale, and instead nurturing scale through "intentional collaborations between community-driven projects that promote a bibliodiverse ecosystem" (Adema & Moore, 2021, p. 27). This is a vision that prioritizes local community building over profit, fostering diversity and building alliances. Perhaps funders and research policies need to scale both small and large, supporting a range of different publishing initiatives to promote visibility and resilience. In the short term, an African citation index may be less urgent than sustainable funding for national and regional research infrastructures.
Notes
[1] The journal also provided a link to its Scopus validation: https://suggestor.step.scopus.com/progressTracker/?trackingID=87AEB0D2DB8706F3.
[2] https://clarivate.com/blog/unveiling-the-journal-citation-reports-2023-supporting-research-integrity-with-trusted-tools-and-data/
[3] https://blog.scopus.com/posts/scopus-now-includes-90-million-content-records
[4] https://www.elsevier.com/en-gb/connect/journal-indexation-why-does-it-matter
[5] https://clarivate.com/products/scientific-and-academic-research/research-discovery-and-workflow-solutions/web-of-science/core-collection/editorial-selection-process/editorial-selection-process/
[6] https://retractionwatch.com/2023/04/05/wiley-and-hindawi-to-retract-1200-more-papers-for-compromised-peer-review/
[7] There are no French-language citation indexes, but Francophone journal databases include Cairn (with 630+ humanities and social sciences journals) and the Persée journal portal.
[8] The Council for the Development of Social Science Research in Africa
References
Adema, J., & Moore S. (2021). Scaling Small; Or How to Envision New Relationalities for Knowledge Production. Westminster Papers in Communication and Culture, 16(1), 27-45.
Adibe, J. (2024). From Adonis and Abbey: Commemorating 20 years of journal publishing. Newsletter sent via email, March 16.
Albarillo, F. (2014). Language in Social Science Databases: English Versus Non-English Articles in JSTOR and Scopus. Behavioral & Social Sciences Librarian, 33(2), 77-90.
Albornoz, D., Okune, A., & Chan, L. (2020). Can Open Scholarly Practices Redress Epistemic Injustice? In M. P. Eve & J. Gray (Eds.), Reassembling Scholarly Communications: Histories, Infrastructures, and Global Politics of Open Access. MIT Press.
Archambault, É., Vignola-Gagné, É., Côté, G., Larivière, V., & Gingras, Y. (2006). Benchmarking scientific output in the social sciences and humanities: The limits of existing databases. Scientometrics, 68(3), 329-342. DOI: 10.1007/s11192-006-0115-z.
Aronova, E. (2021). Scientometrics with and without computers: The cold war transnational journeys of the science citation index. In M. Solovey & C. Dayé (Eds.), Cold War Social Science: Transnational entanglements (pp. 73-98). Springer International Publishing.
Asubiaro, T. V. (2023). Variations in Web of Science and Scopus Journal Coverage, Visibility and Prestige between 2001 and 2020. arXiv. DOI: 10.48550/arXiv.2311.18165.
Asubiaro, T. V., & Onaolapo, S. (2023). A Comparative Study of the Coverage of African Journals in Web of Science, Scopus and CrossRef. Journal of the Association for Information Science and Technology. DOI: 10.1002/asi.24758.
Asubiaro, T., Onaolapo, S., & Mills, D. (2024). Regional disparities in Web of Science and Scopus journal coverage. Scientometrics,Ā 129(3): 1469-1491.
Baas, J., Schotten, M., Plume, A., Côté, G., & Karimi, R. (2020). Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies. Quantitative Science Studies, 1(1), 377-386. DOI: 10.1162/qss_a_00019.
Bhakuni, H., & Abimbola, S. (2021). Epistemic injustice in academic global health. The Lancet Global Health, 9(10), e1465-e1470. DOI: 10.1016/S2214-109X(21)00301-6.
Biagioli, M., & Lippman, A. (2020). Gaming the Metrics: Misconduct and Manipulation in Academic Research. DOI: 10.7551/mitpress/11087.001.0001.
Birkle, C., Pendlebury, D. A., Schnell, J., & Adams, J. (2020). Web of Science as a data source for research on scientific and scholarly activity, Quantitative Science Studies, 1(1), 363ā76. DOI: 10.1162/qss_a_00018.
Callaham, M., Wears, R. L., & Weber, E. (2002). Journal prestige, publication bias, and other characteristics associated with citation of published studies in peer-reviewed journals. JAMA, 287(21), 2847-2850. DOI: 10.1001/jama.287.21.2847.
Chen, G., & Chan, L. (2021). University rankings and governance by metrics and algorithms. In E. Hazelkorn & G. Mihut (Eds.), Research Handbook on University Rankings (pp. 425-443). Edward Elgar Publishing.
Cole, J. R., & Cole, S. (1971). Measuring the Quality of Sociological Research: Problems in the Use of the "Science Citation Index". The American Sociologist, 6, 23-29.
Csiszar, A. (2023). Provincializing Impact: From Imperial Anxiety to Algorithmic Universalism. Osiris, 38(1), 103-126.
Daling, D. (2023). "On the ruins of seriality": The scientific journal and the nature of the scientific life. Endeavour, 47(4), 100885.
Escobar, A. (2020). Pluriversal Politics: The Real and the Possible. Duke University Press.
Fyfe, A., Coate, K., Curry, S., Lawson, S., Moxham, N., & RĆøstvik, C. M. (2017). Untangling academic publishing: A history of the relationship between commercial interests, academic prestige and the circulation of research. St Andrews. https://doi.org/10.5281/zenodo.546100.
Gallagher, E. J., & Barnaby, D. P. (1998). Evidence of Methodologic Bias in the Derivation of the Science Citation Index Impact Factor. Annals of Emergency Medicine, 31(1), 83-86. DOI: 10.1016/S0196-0644(98)70286-0.
Garfield, E. (1955). Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas. Science, 122(3159), 108-111.
Garfield, E. (1963). "Science Citation Index." Science Citation Index 1961, 1, v-xvi.
Garfield, E. (1967). Current Contents: Ninth Anniversary. Essays of an information scientist: the informatorium, 1, 12.
Garfield, E. (1970). Citation Indexing for Studying Science. Nature,Ā 227(5260), 669-671. DOI: 10.1038/227870b0.
Garfield, E. (1971). The mystery of the transposed journal lists, wherein Bradford's Law of Scattering is generalized according to Garfield's Law of Concentration. Current Contents, 222-223.
Garfield, E. (1972). Citation Analysis as a Tool in Journal Evaluation. Science, 178(4060), 471-479.
Garfield, E. (1979). Citation indexing: Its theory and application in science, technology, and humanities. New York: Wiley.
Garfield, E. (1983). Mapping science in the Third World. Science and Public Policy, 10(3), 112-127.
Garfield, E. (1984). Iatrogenic information overload. Journal of Information Science, 8(1), 39.
Garfield, E. (1997). A statistically valid definition of bias is needed to determine whether the Science Citation Index discriminates against third world journals. Current Science, 73(8), 639-641.
Gibbs, W. W. (1995). Lost Science in the Third World, Scientific American, 273(2), 92-99. DOI: 10.1038/scientificamerican0895-92.
Gingras, Y., & Khelfaoui, M. (2018). Assessing the effect of the United States' "citation advantage" on other countries' scientific impact as measured in the Web of Science (WoS) database. Scientometrics, 114(2), 517-532. DOI: 10.1007/s11192-017-2593-6.
Goodwin, J., & Garfield, E. (1980). Citation Indexing - Its Theory and Application in Science, Technology, and Humanities, Technology and Culture, 21(4), 714-715. DOI: 10.2307/3104125.
Guédon, J.-C. (2001). In Oldenburg's Long Shadow: Librarians, Research Scientists, Publishers, and the Control of Scientific Publishing. Association of Research Libraries.
Harzing, A.-W. (2016). Do Google Scholar, Scopus and the Web of Science speak your language? Harzing.com. Retrieved August 9, 2022, from https://harzing.com/publications/white-papers/do-google-scholar-scopus-and-the-web-of-science-speak-your-language
Hoffman, N. (2018). The knowledge commons, pan-Africanism, and epistemic inequality: a study of CODESRIA. [PhD thesis, Faculty of Humanities, Rhodes University].
Horton, R. (2022). Offline: The scramble for science. The Lancet, 400(10357), 983. DOI: 10.1016/S0140-6736(22)01750-0.
Huang, C.-K. (Karl), Neylon, C., Brookes-Kenworthy, C., Hosking, R., Montgomery, L., Wilson, K., & Ozaygen, A. (2020). Comparison of bibliographic data sources: Implications for the robustness of university rankings. Quantitative Science Studies, 1(2), 445-478. DOI: 10.1162/qss_a_00031.
Luwel, M. (1999). Is the science citation index US-biased? Scientometrics, 46(3), 549-562. DOI: 10.1007/BF02459611.
Martín-Martín, A., Orduna-Malea, E., & López-Cózar, E. D. (2018). Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison. Scientometrics, 116(3), 2175-2188. DOI: 10.1007/s11192-018-2820-9.
Mas-Bleda, A., & Thelwall, M. (2016). Can alternative indicators overcome language biases in citation counts? A comparison of Spanish and UK research. Scientometrics, 109(3), 2007-2030. DOI: 10.1007/s11192-016-2118-8.
Mills, D. (2024). One index, two publishers and the global research economy. Oxford Review of Education, 1-16. https://doi.org/10.1080/03054985.2024.2348448.
Mills, D., & Branford, A. (2022). Getting by in a bibliometric economy: scholarly publishing and academic credibility in the Nigerian academy. Africa, 92(5): 839-859.
Mills, D., & Robinson, N. (2021). Democratising Monograph Publishing or Preying on Researchers? Scholarly Recognition and Global āCredibility Economiesā. Science as Culture, 31(2), 187-211.
Mongeon, P., & Paul-Hus, A. (2016). The journal coverage of Web of Science and Scopus: a comparative analysis. Scientometrics, 106(1), 213-228. DOI: 10.1007/s11192-015-1765-5.
Moussa, S. (2021). Journal hijacking: Challenges and potential solutions. Learned Publishing, 34(4), 688-695.
Narin, F. (1976). Evaluative Bibliometrics: The Use of Publication and Citation Analysis in the Evaluation of Scientific Activity. Cherry Hill, NJ: Computer Horizons, Inc.
Nielsen, M. W., & Andersen, J. P. (2021). Global citation inequality is on the rise. Proceedings of the National Academy of Sciences, 118(7), e2012208118.
Nnaji, C. (2023). Editorial. Nigerian Journal of Technology, 42(3), 304-305.
Nwagwu, W. E. (2006). Organising and monitoring research production and performance in Africa: towards an African Citation Index. Unpublished paper presented at Bridging the North-South Divide in Scholarly Communication on Africa: Threats and Opportunities in the Digital Era, Centre for African Studies, Leiden.
Nyamnjoh, F. A. (2004). A Relevant Education for African Development: Some Epistemological Considerations. Africa Development, 29(1). https://doi.org/10.4314/ad.v29i1.22190.
Omobowale, A. O., Akanle, O., Adeniran, A. I., & Adegboyega, K. (2014). Peripheral scholarship and the context of foreign paid publishing in Nigeria. Current Sociology, 62(5), 666-684. https://doi.org/10.1177/0011392113508127.
Pranckutė, R. (2021). Web of Science (WoS) and Scopus: The titans of bibliographic information in today's academic world. Publications, 9(1), 12.
Rabkin, Y. M., Eisemon, T. O., Lafitte-Houssat, J.-J., & Rathgeber, E. M. (1979). Citation Visibility of Africa's Science. Social Studies of Science, 9(4), 499-506.
Rafols, I., Ciarli, T., & Chavarro, D. (2019). Under-reporting research relevant to local needs in the south. Database biases in rice research. In R. Arvanitis & D. O'Brien (Eds.), The Transformation of Research in the South: Policies and outcomes (pp. 105-110). Editions des archives contemporaines. https://eac.ac/articles/2080.
Rowlands, J., & Wright, S. (2022). The role of bibliometric research assessment in a global order of epistemic injustice: a case study of humanities research in Denmark. Critical Studies in Education, 63(5), 572-588. DOI: 10.1080/17508487.2020.1792523.
Schott, T. (1991). The world scientific community: Globality and globalisation. Minerva, 29(4), 440-462. DOI: 10.1007/BF01113491.
Seglen, P. O. (1992). The skewness of science. Journal of the American Society for Information Science, 43(9), 628-638. DOI: 10.1002/(SICI)1097-4571(199210)43:9<628::AID-ASI5>3.0.CO;2-0.
Selten, F., Neylon, C., Huang, C.-K., & Groth, P. (2020). A longitudinal analysis of university rankings. Quantitative Science Studies, 1(3), 1109-1135. DOI: 10.1162/qss_a_00052.
Shearer, K., Chan, L., Kuchma, I., & Mounier, P. (2020). Fostering Bibliodiversity in Scholarly Communications: A Call for Action! http://doi.org/10.5281/zenodo.3752923.
Small, H. (2017). A tribute to Eugene Garfield: Information innovator and idealist. Journal of Informetrics, 11(3), 599-612.
Tijssen, R. J. W. (2007). Africa's contribution to the worldwide research literature: New analytical perspectives, trends, and performance indicators. Scientometrics, 71(2), 303-327. DOI: 10.1007/s11192-007-1658-3.
Vanderstraeten, R., & Vandermoere, F. (2021). Inequalities in the growth of Web of Science. Scientometrics, 126(10), 8635-8651. DOI: 10.1007/s11192-021-04143-2.
Velho, L. (1986). The "meaning" of citation in the context of a scientifically peripheral country. Scientometrics, 9(1), 71-89. DOI: 10.1007/BF02016609.
Vera-Baceta, M.-A., Thelwall, M., & Kousha, K. (2019). Web of Science and Scopus language coverage. Scientometrics, 121(3), 1803-1813. DOI: 10.1007/s11192-019-03264-z
Wouters, P. (1999). The citation culture (Doctoral dissertation, Universiteit van Amsterdam).
To cite this paper:
APA
Mills, D., & Asubiaro, T. (2024). Does the African academy need its own citation index? Global Africa, (7), pp. 115-125. https://doi.org/10.57832/18yw-xv96
MLA
Mills, D. & Asubiaro, T. "Does the African academy need its own citation index?". Global Africa, no. 7, 2024, p. 115-125. doi.org/10.57832/18yw-xv96
DOI
https://doi.org/10.57832/18yw-xv96
© 2024 by author(s). This work is openly licensed via CC BY-NC 4.0