The data used in the analysis and selection of the new Highly Cited Researchers came from Essential Science Indicators, 2004-2014, which at that time included 128,887 Highly Cited Papers. Each of these papers ranked in the top 1% by total citations for its Essential Science Indicators field assignment and year of publication. For more information on how Highly Cited Papers are identified, see the Essential Science Indicators help file.
Essential Science Indicators surveys the Science Citation Index Expanded and Social Sciences Citation Index components of the Web of Science, meaning journal articles in the sciences and social sciences. The analysis is further limited to items indexed as articles or reviews only, and does not include letters to the editor, correction notices, and other marginalia.
In Essential Science Indicators, all papers, including Highly Cited Papers, are assigned to one of 22 broad fields (the 22nd is Multidisciplinary, on which see below). Each journal in Essential Science Indicators is assigned to only one field, and papers appearing in that title are assigned accordingly. For multidisciplinary journals such as Science, Nature, Proceedings of the National Academy of Sciences of the USA, and others, however, a special analysis is undertaken. Each article in such publications is individually reviewed, including an examination of the journals cited in its references. The paper is then reclassified to the most frequently occurring category represented by the article's cited references. For more information about this reclassification process, see our article, Classification of Papers in Multidisciplinary Journal.
A ranking of author names in each Essential Science Indicators category by number of Highly Cited Papers produced during 2004-2014 determined the identification and selection of our new list of Highly Cited Researchers. We used algorithmic analysis to help distinguish between individuals with the same name or name form (surname and initials). Where any ambiguity remained, manual inspection was needed. This entailed searching for papers by author surname and one or more initials, ordering them chronologically, visually inspecting each (noting journal of publication, research topic or theme, institutional addresses, co-authorships, and other attributes), and deciding which ones could be attributed to a specific individual. As noted in the FAQ section, we examined original papers, if necessary, as well as the websites of researchers themselves and their curricula vitae. This was often required when a researcher changed institutional affiliations several times during the period surveyed.
Once the data on Highly Cited Papers within an Essential Science Indicators field were verified and assigned to specific individuals, the authors in the field were ranked by number of Highly Cited Papers. To determine how many researchers to select for inclusion in the new list, we considered the size of each Essential Science Indicators field in terms of the number of authors (as a proxy for population) represented on the Highly Cited Papers for the field. The Essential Science Indicators fields vary greatly in size, a consequence of how each field is defined, including the number of journals assigned to it. Clinical Medicine, for example, makes up some 18.2% of the content of Essential Science Indicators, while Economics and Business, Immunology, Microbiology, and Space Science (Astronomy and Astrophysics) account for 1.8%, 1.8%, 1.4%, and 1.1%, respectively. For each Essential Science Indicators field, author names (before use of the disambiguation algorithm and therefore not disambiguated) were counted, and the square root of that number was calculated. That number was used to decide approximately how many researchers to include in each Essential Science Indicators field. From the list of authors in a field ranked by number of Highly Cited Papers, the number of papers at the rank represented by the square-root score determined the threshold number of Highly Cited Papers required for inclusion. Authors with one fewer Highly Cited Paper than this threshold were also selected if citations to their Highly Cited Papers were sufficient to rank them in the top 50% by citations among authors with Highly Cited Papers at or above the threshold.
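The square-root rule above can be sketched in a few lines of Python. This is an illustrative reconstruction, not Clarivate's actual pipeline; the function name, data shapes, and tie-handling are assumptions.

```python
import math

def select_threshold(author_paper_counts):
    """Given a mapping of (non-disambiguated) author names to their
    number of Highly Cited Papers in one ESI field, return the paper
    threshold implied by the square-root rule: the paper count at the
    rank equal to the square root of the number of author names."""
    # Approximate the field's "population" by its count of distinct author names.
    n_authors = len(author_paper_counts)
    target = round(math.sqrt(n_authors))  # approximately how many researchers to include

    # Rank authors by number of Highly Cited Papers, descending.
    ranked = sorted(author_paper_counts.values(), reverse=True)

    # The paper count at the target rank becomes the inclusion threshold.
    return ranked[min(target, len(ranked)) - 1]

# Hypothetical field with nine author names: sqrt(9) = 3, so the
# 3rd-ranked author's paper count sets the threshold.
counts = {"A": 20, "B": 17, "C": 15, "D": 14, "E": 9,
          "F": 8, "G": 7, "H": 5, "I": 3}
print(select_threshold(counts))  # 15
```

In this sketch the threshold comes from the count at the square-root rank; selection of the near-threshold authors described above would then be a second pass over the ranking.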
In addition, citations to an individual’s Highly Cited Papers had to meet or exceed the threshold for total citations used in the 2004-2014 version of Essential Science Indicators for including a researcher in the top 1% (highly cited list) for an Essential Science Indicators field.
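The two conditions for a near-threshold author, ranking in the top 50% by citations among those at or above the paper threshold, and meeting the Essential Science Indicators citation floor, can be combined in a small check. The function name, the crude top-50% cutoff, and the example numbers are assumptions for illustration only.

```python
def passes_near_miss_rule(author_cites, qualifier_cites, esi_floor):
    """Decide whether an author one Highly Cited Paper short of the
    threshold is still selected: their citations to Highly Cited Papers
    must (a) place them in the top 50% by citations among authors at or
    above the paper threshold and (b) meet the ESI citation floor."""
    ranked = sorted(qualifier_cites, reverse=True)
    # Crude top-50% cutoff: the citation count at the median rank.
    cutoff = ranked[(len(ranked) - 1) // 2]
    return author_cites >= cutoff and author_cites >= esi_floor

# Hypothetical citation totals for four authors at or above the threshold;
# the top 50% cutoff here is the 2nd-ranked total, 4200.
print(passes_near_miss_rule(4300, [2600, 5000, 3100, 4200], 2000))  # True
print(passes_near_miss_rule(3000, [2600, 5000, 3100, 4200], 2000))  # False
```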
In a few fields, such as chemistry, engineering, and materials science, there are many Chinese names that appear on Highly Cited Papers. These name forms (especially surname and initials) often represent multiple researchers. Manual inspection often results in removal of a name since none of the individuals represented by the name form qualify for selection. This adjustment occurs so frequently in certain fields that there are significantly fewer researchers than the square root number who have published the threshold number of Highly Cited Papers determined from analysis of the raw data. In these cases, the threshold number of Highly Cited Papers in a field is reduced until the square root number of disambiguated researchers is obtained. For example, before this procedure the required number of Highly Cited Papers in Chemistry, 2004-2014, was 17 but after disambiguation and removal of false positives the threshold number was 14.
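The threshold reduction described above is effectively an iterative relaxation: lower the paper requirement one step at a time until enough disambiguated researchers qualify. A minimal sketch, assuming a simple name-to-count mapping (not the actual implementation):

```python
def adjusted_threshold(disambiguated_counts, initial_threshold, target):
    """Lower the Highly Cited Paper threshold until at least `target`
    (the square-root number) disambiguated researchers meet it.
    `disambiguated_counts` maps verified individual researchers to
    their Highly Cited Paper counts after false positives are removed."""
    threshold = initial_threshold
    while threshold > 1:
        qualified = sum(1 for n in disambiguated_counts.values() if n >= threshold)
        if qualified >= target:
            break
        threshold -= 1  # too few real researchers remain, so relax the bar
    return threshold

# Hypothetical post-disambiguation counts: at a threshold of 17 only one
# researcher qualifies, so the bar drops until the target of 4 is met.
disamb = {"W": 18, "X": 16, "Y": 15, "Z": 14, "Q": 14, "R": 10}
print(adjusted_threshold(disamb, 17, 4))  # 14
```

The Chemistry example in the text (17 reduced to 14 for 2004-2014) follows this same pattern, though the real adjustment was of course driven by the full disambiguated data.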
The methodology described above was applied to all Essential Science Indicators fields except Physics. Many Highly Cited Papers in Physics report high-energy experiments and typically carry hundreds of author names. Whole counting of such papers produced a list consisting only of high-energy physicists and excluded those working in other subfields. We therefore eliminated from consideration any paper in the Physics category with more than 30 institutional addresses, which removed the overweighting toward high-energy physics.
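The Physics exclusion rule amounts to a simple filter on institutional address counts. A sketch under the assumption that each paper record carries a list of addresses (the field names here are hypothetical):

```python
def eligible_physics_papers(papers, max_addresses=30):
    """Drop Physics papers with more than 30 institutional addresses,
    so large high-energy collaborations do not dominate the ranking.
    Each paper is assumed to be a dict with an 'addresses' list."""
    return [p for p in papers if len(p["addresses"]) <= max_addresses]

# A hypothetical 120-institution collaboration paper is excluded; a
# 4-institution paper is kept.
papers = [{"id": 1, "addresses": ["Inst"] * 120},
          {"id": 2, "addresses": ["Inst"] * 4}]
print([p["id"] for p in eligible_physics_papers(papers)])  # [2]
```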
Finally, we excluded retracted articles from our final analysis of Highly Cited Papers. In addition, researchers found to have committed scientific misconduct in formal proceedings conducted by a researcher's institution, a government agency, a funder, or a publisher were excluded from our list of Highly Cited Researchers.
The final new list contains about 3,000 Highly Cited Researchers in 21 fields of the sciences and social sciences.