Limitations of the “AD Scientific Index”: Missing or Inaccurate Profiles or Missing Institution Names
This index is a comparative platform built by ranking accessible and verified profiles. First and foremost, absence from this index, for whatever reason, does not indicate that an academician lacks merit, nor does it mean that only those listed in the index are meritorious. This needs to be carefully noted. A meritorious scientist may not have been included simply because he or she has no Google Scholar profile, or because we could not access that profile for various reasons. The unavailability of verified Google Scholar profiles of scientists working in well-known and respected academic institutions may likewise prevent us from finding those institutions and scientist profiles. Because updating the profiles in the system and collecting data from open sources require considerable effort, and because these data have been compiled for the first time, the index cannot be completely free of errors. Accurate and instantaneous updating of profiles and institution names entails an endless workload that no institution can manage with available resources alone, despite all endeavors. A high h-index in Web of Science, Scopus, Publons, or similar databases does not mean that a Google Scholar profile is automatically created for the academician. Indeed, Google Scholar profiles are created and made public by scientists themselves on a voluntary basis. An individual may not have created a profile for various reasons and will therefore not be listed in the “AD Scientific Index”.
Furthermore, a profile may be rejected or may not be listed at a particular time. It should be considered that, at the time of our search, a profile may not exist or may not be public, some profiles may be public only at particular times, the information in a profile may not be standardized, more than one profile may belong to the same person, profiles may be unverified, the name of the institution may be missing, surnames or institution names may change, profile owners may have died, or other known or unforeseen problems may occur. However, missing information is completed in the system regularly, and the list is updated and corrected continuously. Profiles whose owners have passed away are removed from the system.
When we detect, or are informed of, unethical situations in profile information that go beyond the limits of good faith, the person is excluded from the list. Problematic and misleading profiles can be reported on our “Rejection List” page. Just as individuals are responsible for the accuracy of their profiles, organizations should also place the review of academic staff profiles on their agenda.
Articles with thousands of authors, such as CERN studies in physics, or multi-author works such as classification studies in medicine or statistical studies, raise debates about how much of an article’s content can be attributed to a single author. Because such papers may create inequality of opportunity, a separate grouping system may be needed in the future.
The pros and cons of “ranking” systems, including Web of Science, Scopus, Google Scholar, and similar platforms, are well known, and the limits of their use have long been recognized in the scientific community. Interpreting this study beyond these limits may therefore lead to incorrect conclusions. The “AD Scientific Index” needs to be evaluated in light of all of the abovementioned potential limitations.
Comparisons of Ranking Systems
In addition to ranking lists of scientists, which consist of many tables and charts of trend analyses delivered for the first time, this comprehensive system offers data and analysis results that add value to branches and institutions within the limits of its inherent advantages and limitations. We would like to emphasize that comparisons should not be made between two branches with different potentials to produce scientific papers. For example, it is not reasonable to expect the same number of articles from completely different branches such as law, social sciences, music, physics, or biochemistry. Ranking comparisons should not overlook the inherent publication potential of each branch. For this reason, we primarily encourage comparisons within the same subject or department and on the basis of recent productivity.
Through the contributions of many scientists from different fields, the “AD Scientific Index” undergoes systematic updates aimed at continuous improvement. The index is an independent institution and does not receive support from any institution, organization, country, or fund. As the number of universities and scientists registered in the index continues to grow, we are improving the methodology, software, data accuracy, and data-cleaning procedures every day through the contributions of a large team. Your remarks and feedback on our shortcomings will guide our efforts toward continuous improvement.
Could this work have been designed in another way?
It is not possible to measure the research capacity of a university or a scientist exactly by using only a few parameters. Assessments should also include many other types of data, such as patents, research funds, incentives, published books, tutoring intensity, congress presentations, and graduate and doctoral teaching positions. A frequently voiced criticism is the question of why the Web of Science h-index is not used. It is not possible to access the entire body of data covering all academic components, such as the h-indexes of Web of Science, Scopus, or Publons, or organizations, patents, awards, and so on; therefore, only the available qualified data have been included.
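For readers unfamiliar with the metric discussed above, the h-index is the largest number h such that an author has h publications cited at least h times each. A minimal sketch of this standard definition, applied to a hypothetical citation list (the function name and the data are illustrative, not part of the index’s software):

```python
def h_index(citations):
    """Largest h such that the author has h papers with at least h citations each."""
    # Sort citation counts in descending order, then keep increasing h while
    # the paper at (1-based) rank h still has at least h citations.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Illustrative example: five papers with these citation counts yield h = 3.
print(h_index([10, 8, 5, 3, 0]))  # -> 3
```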
What’s in the “AD Scientific Index”?
The “AD Scientific Index” includes nearly 210 countries, more than 13,600 universities, and nearly a million academicians.
The “AD Scientific Index” provides data from all over the world. Besides the global view, the system presents data by continent, including Africa, Asia, Europe, North America, Latin America, and Oceania. In addition, data are presented by country and by country group, including the Arab League, EECA, BRICS, Latin America, and COMESA. Clicking the country flag in a list shows the top 50 researchers in that country, and clicking the name of a university shows the top 50 researchers at that university. It is also possible to select a specialty area from the list below to see the ranking specific to the selected field of research (marked area 1 in Figure 1).
[Figure 1: “Asia Top 10.000 Scientists, AD Scientific Index – 2021 Version 1.1”]
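A minimal sketch of this kind of browsing logic, assuming the index can be represented as a table of scientist records with country, university, field, and ranking-score columns (the column names, scores, and records below are illustrative assumptions, not the index’s actual schema or data):

```python
import pandas as pd

# Illustrative records; the real index holds nearly a million academicians.
scientists = pd.DataFrame([
    {"name": "A. Author",  "country": "Japan",  "university": "Univ. X", "field": "Physics",  "score": 912.0},
    {"name": "B. Scholar", "country": "Japan",  "university": "Univ. Y", "field": "Medicine", "score": 845.5},
    {"name": "C. Writer",  "country": "Brazil", "university": "Univ. Z", "field": "Law",      "score": 512.3},
])

def top_n(df, n=50, **filters):
    """Return the n highest-scoring scientists matching the given column filters."""
    for column, value in filters.items():
        df = df[df[column] == value]
    return df.sort_values("score", ascending=False).head(n)

# Analogous to clicking a country flag, a university name, or choosing a specialty:
print(top_n(scientists, country="Japan"))
print(top_n(scientists, university="Univ. X"))
print(top_n(scientists, field="Medicine"))
```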