
Oceania Top 10,000 Scientists. AD Scientific Index 2021 Version 1


“AD Scientific Index” (Alper-Doger Scientific Index):

The AD Scientific Index (Alper-Doger Scientific Index), unlike other systems that provide evaluations of journals and universities, is a ranking and analysis system based on the scientific performance and the added value of the scientific productivity of individual scientists. Furthermore, it provides rankings of institutions based on the scientific characteristics of affiliated scientists.

This new index has been developed by Prof. Dr. Murat ALPER (MD) and Associate Prof. Dr. Cihan DÖĞER (MD) using the total and last 5 years’ values of the i10 index, h-index, and citation scores in Google Scholar. In addition, the ratio of the last 5 years’ value to the total value of each of these indexes is used. Using a total of nine parameters, the “AD Scientific Index” shows the ranking of an individual scientist by 12 subjects (Agriculture & Forestry; Arts, Design and Architecture; Business & Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy, Theology; Law / Law and Legal Studies; Medical and Health Sciences; Natural Sciences; Social Sciences; and Others), 256 branches, 11,940 institutions of employment, 195 countries, 11 regions (Africa, Asia, Europe, North America, Latin America, Oceania, Arab League, EECA, BRICS, USAN, and COMESA), and in the world. The “AD Scientific Index” is the first and only study that shows the total and last five-year productivity coefficients of scientists based on the h-index and i10 index scores and citations in Google Scholar. In other words, the “AD Scientific Index” provides both academic rankings and analysis results.

One of the major differences of the AD Scientific Index is the provision of the last five years’ scores and the total scores of the h-index and the i10 index, and the total and last five years’ number of citations. Other unique differences of the AD Scientific Index include rankings in all fields and subjects of scientific interest and the emphasis on the scientific productivity of the scientist. Thus, scientists and universities can obtain their academic rankings and monitor developments in the ranking over time.

We are an independent organization that does not receive support from any institution, organization, country, or fund. Your contributions support our ongoing efforts to improve the software and the size and accuracy of the data.

For a more detailed explanation, please refer to www.adscientificindex.com

Best regards.

AD Scientific Index Ltd.

Methodology

Ranking academic journals according to the impact factor is a practice that started many years ago. The need to access scientifically valuable studies within limited time frames, or to find scientists working in a certain field, has led to the ranking of scientists and scientific studies. For this purpose, many scoring systems, such as the h-index, i10 index, g-index, m-index, the Erdös number, the tori index, the riq index, and the read-10 index, have been studied as numerical indicators of how productive and effective a researcher is. Each of these systems has many advantages as well as disadvantages.

Of the abovementioned indexes, the most widely accepted is the h-index. The h-index is the number h such that the researcher has published h articles that have each been cited at least h times. To achieve a high h-index, an academician must have published a high number of articles and have received a high number of citations. For example, an h-index of 15 indicates that the academician has received at least 15 citations to each of 15 published articles. To increase the h-index from 15 to 16, the same academician must receive at least 16 citations to each of 16 published papers.

To find the h-index, several databases can be used, including Google Scholar, Web of Science, Scopus, and Publons, some of which are public while others require a subscription. In calculating h-indexes, these databases draw on different sources, such as SCI-E or otherwise indexed journals, or non-indexed elements such as other journals, books, or patents. Because the set of sources used by each database differs from the others, each database may calculate a different h-index. Therefore, the h-indexes calculated by Google Scholar, Web of Science, Scopus, and Publons may differ for the same researcher. For example, a researcher who has authored more books than scientific papers may receive a low h-index in the Web of Science despite a high number of citations. None of these indexes is equivalent to another because of differences in their scopes.
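For illustration, here is a minimal Python sketch (ours, not part of the AD Scientific Index itself) of how an h-index can be computed from a list of per-paper citation counts; each database applies its own rules for which publications and citations count.

```python
def h_index(citations):
    """Largest h such that at least h papers have been cited at least h times each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# 16 papers, each cited at least 16 times, give an h-index of 16.
print(h_index([40, 33, 30, 28, 25, 22, 21, 20, 19, 18, 18, 17, 17, 16, 16, 16]))  # 16
```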

Having a large number of publications indicates that the researcher is productive, but this figure alone may not be a true indicator of the researcher’s success. For example, a researcher may have 10 publications that have received 400 citations. We can argue that this researcher is more successful than a researcher with more than a hundred published papers that have received, say, 200 citations. Besides, some valuable studies may not have been attributed the value they deserve for various reasons, such as the failure to use methods that would make them easily accessible through scientific channels. Frequent citation of a paper by other authors shows its value and the extent of its contribution to the scientific literature.

The i10-index is another academic scoring system, calculated by Google Scholar. In this system, only scientific works such as articles and books that have received 10 or more citations are taken into consideration: the number of works cited ten or more times yields the i10-index value. The i10-index and h-index values calculated for the last five years do not indicate that the works were written and published in the last five years. Instead, these values show citation activity over the last five years, indicating whether a work is still influential.
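Again as an illustrative sketch only (the citation counts are invented), the i10-index can be computed from the same kind of per-paper citation list:

```python
def i10_index(citations):
    """Number of publications that have received 10 or more citations."""
    return sum(1 for cites in citations if cites >= 10)

print(i10_index([120, 45, 12, 10, 9, 3, 0]))  # 4
```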

Google Scholar provides the total values of the i10-index, the h-index, and the number of citations, along with the last 5 years’ values, through a voluntary system: scientists create their accounts, select their papers, and add them to their profiles. Viewing these profiles does not require a password and is free of charge. Here, we introduce a new index based on the public Google Scholar profiles of scientists. We named this system the “AD Scientific Index” and developed it with a robust intellectual infrastructure and maximum effort, aiming to contribute to global scientific endeavours.

Why is the “AD Scientific Index” needed?

The “AD Scientific Index” is the first and only study that shows the total and the last five-year productivity coefficients of scientists based on h-index and i10 index scores and citations in Google Scholar. Furthermore, the index provides the ranking and assessment of scientists in academic subjects and branches and in 11,940 universities, 195 countries, regions, and the world. In other words, the “AD Scientific Index” provides both the ranking and analysis results.


Data Collection and Standardization:

Data are collected manually, following the rankings in Google Scholar; profiles with up to 300 citations and verified addresses, or profiles that inspire confidence in their accuracy, are listed first. The aim is to standardize names, institutions, and branches as much as possible. Non-standardized data, including wide variation in the information provided, the use of abbreviations, and a variety of languages, have caused difficulties. After data mining and scrutiny of the acquired information, many profiles were excluded from the index, and further profiles have been excluded during subsequent regular examinations of the data. Data cleaning is an ongoing process that must be conducted meticulously. We welcome your contributions to data cleaning and to ensuring accuracy.

Determining the subjects/departments to which scientific fields belong may seem easy in some branches and in many countries, but it can create considerable confusion in other countries, regions, and schools. We would like to emphasize that fields such as Engineering, Natural and Environmental Sciences, Biology and Biochemistry, Material Science, Chemistry, and Social Sciences may span quite different spectrums in different countries. Therefore, the standardization of subjects and branches has not been easy. To perform standardization, we accepted the official names of institutions and academic branches as specified on the university website. We adopted this strategy in order to standardize this complex situation at least partially. Furthermore, we have started adding an asterisk (*) after the names of authors whose papers have very large author lists, such as CERN’s scientific papers.

Ranking Criteria:

Rankings of scientists by university, country, region, and in the world are based on the “total h-index”. The “total h-index” is also used in rankings by branch and subbranch.

Ranking based on the total h-index uses the criteria in the following order: 1. total h-index scores, 2. total number of citations, 3. total i10 index scores.

Ranking based on the last 5 years’ h-index uses the criteria in the following order: 1. last 5 years’ h-index scores, 2. total number of citations in the last 5 years, 3. last 5 years’ i10 index scores.

Ranking based on the total i10 index uses the criteria in the following order: 1. total i10 index scores, 2. total h-index scores, 3. total number of citations.

Ranking based on the last 5 years’ i10 index uses the criteria in the following order: 1. last 5 years’ i10 index scores, 2. last 5 years’ h-index scores, 3. total number of citations in the last 5 years.

Ranking based on the total number of citations uses the criteria in the following order: 1. total number of citations, 2. total h-index scores, 3. total i10 index scores.

Ranking based on the total number of citations in the last 5 years uses the criteria in the following order: 1. total number of citations in the last 5 years, 2. last 5 years’ h-index scores, 3. last 5 years’ i10 index scores.
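These tie-breaking orders amount to multi-key sorting. The following sketch, with hypothetical record fields of our own naming (the real data schema is not published), shows the ranking by total h-index; the other rankings only change the key order.

```python
# Hypothetical records; field names are illustrative only.
scientists = [
    {"name": "A", "h_total": 45, "citations_total": 9800, "i10_total": 120},
    {"name": "B", "h_total": 45, "citations_total": 10100, "i10_total": 110},
    {"name": "C", "h_total": 52, "citations_total": 7600, "i10_total": 140},
]

# Ranking by total h-index: 1. total h-index, 2. total citations,
# 3. total i10 index, all in descending order.
ranked = sorted(
    scientists,
    key=lambda s: (s["h_total"], s["citations_total"], s["i10_total"]),
    reverse=True,
)
print([s["name"] for s in ranked])  # ['C', 'B', 'A']
```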

Why are the last 5 years’ ratios / total ratios important?

The h-index, the i10 index, and the ratios of the last 5 years’ values to the total values are major unique characteristics of the AD Scientific Index, showing both the development of an individual scientist’s performance and how universities’ institutional policies are reflected in the overall scientific picture.
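As a simple illustration (the function name is ours, not the index’s), the ratio referred to here is the last five years’ value divided by the career total of the same metric:

```python
def last5_to_total_ratio(last5_value, total_value):
    """Ratio of the last five years' value to the career total (h-index, i10 index, or citations)."""
    return last5_value / total_value if total_value else 0.0

# Example: a last-5-years h-index of 18 against a career total of 30.
print(round(last5_to_total_ratio(18, 30), 2))  # 0.6
```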

Academic collaboration

Scientific fields of interest specified in the profiles of scientists are available for other scientists from different countries and institutions to enable academic collaboration.

Ranking Criteria for Top Universities:

Among the many different university ranking systems, the “AD Scientific Index” has developed a ranking with a different methodology, based on the principle of counting only meritorious scientists. Based on Google Scholar total h-index scores, we list all academicians at each university who rank in the world’s top 10,000 and top 100,000, and we also provide a breakdown of this ranking by main subject. The ranking criteria are applied in the following order: first, the number of a university’s scientists in the overall top 10,000 list; second, the number in the top 100,000 list; third, the total number of the university’s scientists listed in the AD Scientific Index. In the case of ties, the world ranking of the highest-ranked scientist at the respective university is used.
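A minimal sketch of this ordering, assuming hypothetical per-university aggregates (the field names are illustrative, not the index’s):

```python
# Hypothetical per-university aggregates.
universities = [
    {"name": "U1", "in_top_10k": 12, "in_top_100k": 85, "listed": 640, "best_world_rank": 311},
    {"name": "U2", "in_top_10k": 12, "in_top_100k": 85, "listed": 640, "best_world_rank": 97},
    {"name": "U3", "in_top_10k": 20, "in_top_100k": 60, "listed": 500, "best_world_rank": 45},
]

# Order: top-10,000 count, then top-100,000 count, then total listed scientists
# (all descending); ties are broken by the best (lowest) world rank at the university.
ranked = sorted(
    universities,
    key=lambda u: (-u["in_top_10k"], -u["in_top_100k"], -u["listed"], u["best_world_rank"]),
)
print([u["name"] for u in ranked])  # ['U3', 'U2', 'U1']
```

The country ranking described below follows the same multi-key pattern, using country-level counts in place of university-level counts.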

You may sort the ranking from the highest score to the lowest, or vice versa, in any of these fields, and see which fields bring the respective university to the forefront. Furthermore, the name of the academician with the highest total h-index at the respective university is displayed together with his or her world ranking. The Top University Ranking by the “AD Scientific Index” not only lists the areas in which a university excels or has room for improvement, but also reflects the outcomes of institutions’ policies toward scientists. This report reveals the ability of institutions to attract prized scientists, to encourage advances, and to retain scientists.

Ranking Criteria for Countries:

As described in the university ranking section, it is not easy to obtain and standardize data from about 12,000 universities for a country ranking. Therefore, we based the country ranking on the number of meritorious scientists. Four criteria are used to rank the countries: the first is the number of scientists in the top 10,000 list; the second is the number of scientists in the top 100,000 list; the third is the number of scientists listed in the AD Scientific Index. In the case of ties after applying these three criteria, the world rank of that country’s highest-ranked scientist is used.

World Top 100 Scientists 2021

The ranking of the “Top 100” scientists is based on total h-index scores. The Top 100 can be ranked globally or within the following regions: Africa, Asia, Europe, North America, Latin America, Oceania, Arab League, EECA, BRICS, USAN, and COMESA, based on total h-index scores and without any breakdown by subject area. Top 100 rankings in the world, in a continent, or in a region cover the standardized subject areas of Agriculture & Forestry; Arts, Design and Architecture; Business & Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy, Theology; Law/Law and Legal Studies; Medical and Health Sciences; Natural Sciences; and Social Sciences. Subjects indicated as “Others” are not included in the rankings by region and subject; therefore, you may wish to specify your subject and branch and contribute to standardizing your performance. As explained in the Data Collection and Standardization section, assigning scientific fields to subjects/departments is not straightforward across countries, regions, and schools, and fields such as Engineering, Natural and Environmental Sciences, Biology, Biochemistry, Material Science, Biotechnology, Chemistry, and Social Sciences may span quite different spectrums. We therefore rely on the official names of institutions and academic branches as specified on university websites, and we add an asterisk (*) after the names of authors of papers with very large author lists, such as CERN’s scientific papers.

Limitations of the “AD Scientific Index”:

Missing or Inaccurate Profiles or Missing Institution Names

This index is a comparative platform developed by ranking accessible and verified profiles. First and foremost, it must be carefully noted that not being included in this index, for whatever reason, does not mean that an academician is not prized, nor does it mean that only the academicians listed in the index are the prized ones. A meritorious scientist may not be included because they do not have a Google Scholar profile or because we could not access that profile for various reasons. The unavailability of verified Google Scholar profiles for scientists working in well-known and respected academic institutions may prevent us from finding those institutions and scientist profiles. Because updating the profiles in the system and collecting data from open sources require effort, and because the data have been collected for the first time, the index cannot be completely free of errors. Accurate and instant updating of profiles and institution names demands an enormous, ongoing workload that no institution can fully cover with the available resources, despite all endeavors.

A high h-index (in WOS, Scopus, Publons, etc.) does not mean that a profile will be automatically created for the academician in Google Scholar. Indeed, Google Scholar profiles are created and made public by scientists themselves on a voluntary basis. An individual may not have created a profile for various reasons and, therefore, will not be listed in the “AD Scientific Index”. Furthermore, a profile can be rejected or may not be listed at a particular time. It needs to be considered that, at the time of our search, a profile may not exist or may not be public, some profiles may be public only at particular times, the information in a profile may not be standard, more than one profile may belong to the same person, profiles may not be verified, the name of the institution may be missing, surnames or institution names can change, profile owners may have died, or other known or unforeseen problems may occur. However, missing information is completed in the system regularly, and the list is updated and corrected continuously. Profiles whose owners have passed away are removed from the system.

When we detect, or are informed of, unethical situations in profile information that go beyond the limits of goodwill, the person is excluded from the list. You can report problematic and misleading profiles on our “Rejection List” page. Just as individuals are responsible for the accuracy of their profiles, organizations too should put the regular review of academic staff profiles on their agenda.

Articles with thousands of authors, such as CERN studies in physics, or multi-author classification and statistical studies in medicine, raise debates about how much of an article’s content can be attributed to a single author. Because such papers may cause inequality of opportunity, a separate grouping system may be needed in the future.

Pros and cons of “ranking” systems including Web of Science, Scopus, Google Scholar, and similar others are well known and the limits of use of such systems have long been recognized in the scientific community. Therefore, interpreting this study beyond these limits may lead to incorrect results. The “AD Scientific Index” needs to be evaluated considering all of the abovementioned potential limitations.

Comparisons of Ranking Systems

In addition to ranking lists of scientists, with many tables and trend-analysis charts delivered for the first time, this comprehensive system offers data and analysis results that add value to branches and institutions, within the limits of its inherent advantages and limitations. We would like to emphasize that comparisons should not be made between two branches that have different potentials to produce scientific papers. For example, it is not reasonable to expect the same number of articles from completely different branches such as law, social sciences, music, physics, or biochemistry. Ranking comparisons should not overlook the inherent publication potential of each branch. For this reason, we focus primarily on observations within the same subject/department and on recent productivity.

Could this work have been designed in another way?

It is not possible to measure the research capacity of a university or scientist exactly by using a few parameters. Assessments should include many other types of data, such as patents, research funds, incentives, published books, tutoring intensity, congress presentations, and graduate and doctoral teaching positions. A frequently voiced criticism asks why the Web of Science h-index is not used. Since it is not possible to access the entire data set covering all academic components, such as the h-indexes of Web of Science, Scopus, or Publons, or organizations, patents, awards, and so on, only the available, qualified data have been included.

What’s in the “AD Scientific Index”?

The “AD Scientific Index” includes nearly 200 countries, more than 11,900 universities, and nearly a million academicians.

The “AD Scientific Index” provides data from all over the world. Besides providing data globally, the system presents data by continent, including Africa, Asia, Europe, North America, Latin America, and Oceania. In addition, data are presented by country and by country group, including the Arab League, EECA, BRICS, USAN, and COMESA. When the country flag in the list is clicked, the researchers ranked in the top 50 in that country are listed. When the name of a university is clicked, the researchers ranked in the top 50 at that university are listed. In addition, it is possible to select a specialty area from the list below to see the ranking specific to the selected field of research.
