University, Subject, Country, Region, World

Methodology

Ranking academic journals by impact factor is a practice that started many years ago. The need to access scientifically valuable studies within limited time frames, and the need to find scientists working in a certain field, led to the practice of ranking scientists and scientific studies. For this purpose, many scoring systems, such as the h-index, i10-index, g-index, m-index, Erdős number, tori index, riq index, and read-10 index, have been studied as numerical indicators of how productive and effective a researcher is. Each of these systems has advantages as well as disadvantages. Of the abovementioned indexes, the most widely accepted is the h-index. The h-index is the largest number h such that the researcher has h articles cited at least h times each. To achieve a high h-index, an academician must have published a large number of articles and received a large number of citations. For example, an h-index of 15 indicates that the academician has received at least 15 citations to each of 15 published articles. To increase the h-index from 15 to 16, the same academician must receive at least 16 citations to each of 16 published papers. Several databases can be used to find h-index values, including Google Scholar, Web of Science, Scopus, and Publons, some of which are public while others require a subscription. In calculating h-indexes, these databases use different parameters, such as journals indexed in SCI-E, or non-indexed auxiliary elements such as other journals, books, or patents. Because the set of parameters used by each database differs, each database may calculate a different h-index. Therefore, the h-indexes calculated by Google Scholar, Web of Science, Scopus, and Publons may differ for the same researcher. For example, a researcher who has authored more books than scientific papers may receive a low h-index score in Web of Science despite a high number of citations. None of these indexes is equivalent to the others because of differences in their scopes.
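
To make the definition concrete, here is a minimal sketch in Python, using hypothetical citation counts, of how an h-index can be computed from a list of per-paper citation counts:

```python
def h_index(citations):
    """Return the largest h such that at least h papers
    have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical example: 16 papers, each cited at least 16 times,
# raise the h-index from 15 to 16.
print(h_index([50, 40, 33, 30, 28, 25, 22, 20, 19, 18,
               17, 17, 16, 16, 16, 16]))  # -> 16
```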

Having a large number of publications indicates that a researcher is productive, but that number alone may not be a true indicator of the researcher's success. For example, a researcher may have 10 publications that have received 400 citations. One can argue that this researcher is more successful than a researcher with more than a hundred published papers that have received, say, 200 citations. Besides, some valuable studies may never receive the recognition they deserve for various reasons, such as not being easily accessible through scientific channels. How often a paper is cited by other authors shows its value and the extent of its contribution to the scientific literature.

The i10-index is another academic scoring system, calculated by Google Scholar. In this system, only scientific works, such as articles and books, that have received 10 or more citations are taken into consideration: the number of works cited ten or more times is the i10-index value. The i10-index and h-index values calculated for the last five years do not indicate that the works were written and published in the last 5 years. Instead, these values show citation power over the last 5 years, indicating whether the work is still influential.
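
As a minimal sketch under the same assumptions (Python, hypothetical citation counts), the i10-index simply counts the papers with ten or more citations, and the last-5-years variant applies the same count to citations accrued in that window:

```python
def i10_index(citations):
    """Number of papers cited 10 or more times."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical counts: lifetime citations vs. citations in the last 5 years.
total = [120, 45, 12, 10, 9, 3]
last5 = [15, 8, 6, 2, 1, 0]
print(i10_index(total))  # -> 4 (counted on lifetime citations)
print(i10_index(last5))  # -> 1 (only recent citation power counts)
```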

Google Scholar provides the total values of the i10-index, the h-index, and citation counts, along with the last 5 years' values, through a system based on the principle of voluntariness. In this system, scientists create their accounts, select their papers, and add the selected papers to their profiles. Viewing these profiles does not require a password, and the service is free of charge. Here, we introduce a new index that we have developed based on the public Google Scholar profiles of scientists. We named this new system the “AD Scientific Index” and have built it on a robust intellectual infrastructure, with maximum effort, aiming to contribute to global scientific endeavors.

Why is the “AD Scientific Index” needed?

The “AD Scientific Index” is the first and only study that shows the total and last-five-year productivity coefficients of scientists based on the h-index, i10-index, and citation scores in Google Scholar. Furthermore, the index provides the ranking and assessment of scientists within academic subjects and branches, in 19,500 universities, in 216 countries, in regions, and in the world. In other words, the “AD Scientific Index” provides both rankings and analysis results. Besides its indexing and ranking functions, the AD Scientific Index sheds light on academic careers and offers users the opportunity to perform efficient academic analyses to scrutinize and detect faulty and unethical profiles, plagiarism, forgery, distortion, duplication, fabrication, salami slicing, unfair authorship, and several manifestations of academic mobbing. Such analyses also help reveal the medium- and long-term results of several policies implemented by institutions, including academic staff employment and retention policies, wage policies, academic incentives, and scientific working environments.

“AD Scientific Index” (Alper-Doger Scientific Index):

This new index has been developed by Prof. Dr. Murat ALPER (MD) and Associate Prof. Dr. Cihan DÖĞER (MD) using the total and last 5 years' values of the i10-index, h-index, and citation scores in Google Scholar. In addition, the ratio of the last 5 years' value to the total value of each of these metrics is used. Using a total of nine parameters, the “AD Scientific Index” shows the ranking of an individual scientist by 12 subjects (Agriculture & Forestry; Arts, Design and Architecture; Business & Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy, Theology; Law / Law and Legal Studies; Medical and Health Sciences; Natural Sciences; Social Sciences; and Others), 256 branches, 19,500 institutions of employment, 216 countries, 10 regions (Africa, Asia, Europe, North America, Oceania, Arab League, EECA, BRICS, Latin America, and COMESA), and in the world. Thus, scientists can obtain their academic rankings and monitor developments in those rankings over time.

Data Collection and Standardization:

Data are collected manually, based on rankings in Google Scholar; profiles with at least 300 citations and verified addresses, or profiles that otherwise inspire confidence in their accuracy, are listed first. The aim is to standardize names, institutions, and branches as much as possible. Non-standardized data, including wide variation in the information provided, the use of abbreviations, and the variety of languages, have caused difficulties. After data mining and scrutiny of the acquired information, many profiles were excluded from the index, and further profiles are excluded during the regular examinations of the data that follow. Data cleaning is an ongoing process that must be conducted meticulously, and we welcome your contributions to data cleaning and accuracy.

Determining the subject or department to which a scientific field belongs may seem easy for some branches and in a variety of countries. However, it can create considerable confusion in other countries, regions, and schools. We would like to emphasize that fields such as Engineering, Natural and Environmental Sciences, Biology and Biochemistry, Materials Science, Chemistry, and the Social Sciences may span quite different spectrums in different countries. Therefore, we stress that the standardization of subjects and branches has not been easy. To perform standardization, we accepted the official names of institutions and academic branches as accurate in the form specified on each university's website. We adopted this strategy in order to standardize this complex situation at least partially.

We are gradually increasing the number of countries, universities, and academicians covered, free of charge and within our means. The current list of registered academicians includes 1,090,000 individuals. Frequent updates will be limited to redlist removals and deletions of deceased scientists' profiles, new profile registrations added by our institution, and new individual and corporate registrations.

Although various additions, corrections, and improvements are made every day, please bear in mind that this process requires devoted effort and time.

Importantly, one should remember that taking 300 citations as the lower limit for inclusion in the index may exclude profiles with a wide range of h-index values. We will do our best to respond to e-mails questioning why a scientist with a high h-index is not included in the list. For example, an academician with an h-index of 1 and 300 citations may be included in the ranking, while another with an h-index of 5 and 30 citations, or an h-index of 10 and 120 citations, may be excluded. Such issues require regular updates, and updating the rankings may take a considerably long time because of workload limits. We will answer such e-mails within our means and to the best of our ability while we continue to expand our workforce. This process can be accelerated through the individual and corporate registration options.

As a non-funded and independent organization, we have become the most comprehensive index, covering 19,500 universities and institutions and about 1,100,000 scientists, with registration free of charge. Currently, we do not aim to include every scientist in the world in the system; rather, we aim to maintain our expansion policy with the data we have already collected. This will take a certain period, which is needed to maintain data cleanliness and promote reliability. We would like to emphasize that we have a transparent methodology, which allows academicians and scientists to learn, free of charge, their scores and their rankings alongside familiar measures such as the h-index and citation counts. Therefore, registration in our index is not mandatory. Through generous and demanding efforts, we have achieved sufficient capacity to represent institutions adequately. Currently, the way to be added to our database is to register as an individual or a corporate entity through the "ADD / EDIT PROFILE" link.

Studies that influence the ranking order because of the high number of citations they receive, such as CERN papers:

We have started a procedure of adding an “i” mark at the end of authors' names when a scientific paper has very many authors, as with CERN, ATLAS, ALICE, and CMS papers, statistical data reports, guidelines, updates, and the like. We expect that new criteria will be defined for such studies; until such criteria are described, we mark these studies with an “i” sign.

Profile information and ethical responsibility:

The ethical responsibility for correct profile information rests entirely with the relevant scientist. However, we think it would be prudent for institutions, countries, and even branch associations to conduct periodic reviews of the profiles of scientists affiliated with them, since misleading information may compromise the reputation of the organization or the country. Organizations should also review profiles to identify and report scientists who are not actually affiliated with them. To avoid any compromise to institutional reputation, institutions should take the necessary corrective and preventive actions against unethically arranged scientist profiles.

Data Cleaning and the Redlist

Data cleaning is a dynamic process that we perform systematically and continuously. Despite all our best efforts, we may not be completely accurate, and we welcome your contributions through redlist notifications. Rarely, scientists are included in the redlist because of innocent mistakes made in good faith, with no unethical behavior; most such errors result from inadequate periodic profile checks. Correcting such an error is easy through the submission of a correction request. To prevent such undesirable situations, scientists should check their profiles regularly and institutions should review the profiles of their staff systematically. Use [email protected] to report an inappropriate profile, a death, or other conditions that would require the deletion of the relevant profile.

Ranking Criteria:

Scientists are ranked by university, country, region, and worldwide based on the “total h-index”, which is also used in rankings by branch and subbranch.

Ranking based on the “total h-index” was performed using the criteria in the following order: 1. Total h-index scores, 2. Total number of citations, 3. Total i10-index scores, 4. Last 5 years’ h-index scores.

Ranking based on the last 5 years’ h-index scores was performed using the criteria in the following order: 1. Last 5 years’ h-index scores, 2. Number of citations in the last 5 years, 3. Last 5 years’ i10-index scores, 4. Total h-index scores.

The ranking criteria for the total i10-index were used in the following order: 1. Total i10-index scores, 2. Total h-index scores, 3. Total number of citations, and 4. Last 5 years’ i10-index scores.

Ranking based on the last 5 years’ i10-index scores was performed using the criteria in the following order: 1. Last 5 years’ i10-index scores, 2. Last 5 years’ h-index scores, 3. Number of citations in the last 5 years, and 4. Total i10-index scores.

Ranking based on the total number of citations was performed using the criteria in the following order: 1. Total number of citations, 2. Total h-index scores, 3. Total i10-index scores, and 4. Number of citations in the last 5 years.

Ranking based on the number of citations in the last 5 years was performed using the criteria in the following order: 1. Number of citations in the last 5 years, 2. Last 5 years’ h-index scores, 3. Last 5 years’ i10-index scores, and 4. Total number of citations.
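
Taken together, the six orders above are multi-key descending sorts. As an illustrative sketch (Python, with hypothetical profile records and field names), the first ranking, by total h-index, could be expressed as:

```python
# Tie-breakers for the "total h-index" ranking, per the order above:
# 1. total h-index, 2. total citations, 3. total i10-index,
# 4. last 5 years' h-index. Field names here are hypothetical.
scientists = [
    {"name": "A", "h": 40, "cites": 9000, "i10": 120, "h5": 25},
    {"name": "B", "h": 40, "cites": 9000, "i10": 130, "h5": 22},
    {"name": "C", "h": 42, "cites": 7000, "i10": 110, "h5": 30},
]

ranked = sorted(
    scientists,
    key=lambda s: (s["h"], s["cites"], s["i10"], s["h5"]),
    reverse=True,  # higher values rank first on every key
)
print([s["name"] for s in ranked])  # -> ['C', 'B', 'A']
```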

Productivity Rankings

Productivity Rankings are a unique service offered only by the “AD Scientific Index”. This ranking system is derived from the i10-index in order to show a scientist's productivity in publishing scientific articles of value. Productivity Rankings list productive scientists in a given area, discipline, university, or country and can guide the development of meaningful incentives and academic policies. The world, regional, and university rankings of scientists in this table are based on the total i10-index.

Why are the ratios of the last 5 years’ values to the total values important?

The ratios of the last 5 years’ h-index, i10-index, and citation values to their lifetime totals are major unique characteristics of the AD Scientific Index, showing both the development of a scientist's individual performance and the reflection of universities' institutional policies on the overall scientific picture.
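
As a minimal sketch (Python; the values are hypothetical and this is not the index's published formula), such a ratio can be computed directly from the paired values:

```python
def activity_ratio(last5_value, total_value):
    """Share of a lifetime metric accrued in the last 5 years.
    A value near 1.0 indicates impact that is mostly recent."""
    return last5_value / total_value if total_value else 0.0

# Hypothetical profile: total h-index 30, last 5 years' h-index 24.
print(round(activity_ratio(24, 30), 2))  # -> 0.8
```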

Ranking Criteria for Universities:

For global university rankings, ranking organizations use parameters such as the quality of education, employment rates of graduates, the quality of the faculties within a university, international collaborations, the number of graduates and staff awarded Nobel Prizes and Fields Medals, the number of highly cited researchers selected by Clarivate Analytics, the total number of research papers, the number of articles published in the journals Nature and Science, the number of articles indexed in the Science Citation Index-Expanded (SCIE) and the Social Science Citation Index (SSCI), and the number of research articles receiving high numbers of citations. Each ranking organization develops a methodology that assigns different percentage weights to a selection of these parameters. Experienced ranking organizations evaluate 2,000 to 3,000 universities for ranking.

The AD Scientific Index performs rankings using a single parameter: the number of "Valued and Productive Scientists" employed by a specific university. Selected after years of observation, this parameter is calculated using the total h-index and i10-index values together with the number of citations, and the last 5 years' h-index and i10-index values along with the number of citations received over the last 5 years. We rank more than 19,300 universities using this method. A careful examination will reveal that most of the other parameters are natural academic products of "Valued and Productive Scientists." Institutions that employ many "Valued and Productive Scientists", for example, scientists in the Top 2%, Top 10%, Top 20%, Top 40%, Top 60%, and Top 80% and later ranks, will naturally produce more of the academic outputs listed as parameters above.
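
A rough sketch of this tier-counting idea follows (Python; the tier boundaries mirror the percentages named above, but the function names and input format are assumptions, not the index's published procedure):

```python
from collections import Counter

# percentile_rank: a scientist's world percentile (lower = better).
# Tier boundaries follow the text: top 2%, 10%, 20%, 40%, 60%, 80%, rest.
TIERS = [2, 10, 20, 40, 60, 80, 100]

def tier_of(percentile_rank):
    """Map a world percentile to the smallest tier containing it."""
    for t in TIERS:
        if percentile_rank <= t:
            return t
    return 100

def university_tier_profile(faculty_percentiles):
    """Count a university's scientists in each world-percentile tier."""
    return Counter(tier_of(p) for p in faculty_percentiles)

# Hypothetical faculty: two top-2% scientists, one top-10%, one top-40%.
print(university_tier_profile([1.5, 1.9, 7.0, 35.0]))
# -> Counter({2: 2, 10: 1, 40: 1})
```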

A comparison of "AD Scientific Index" scores of universities, some examples of which are given below, with the scores from other ranking institutions will show great agreement among scores. We use our methodology to rank universities of different characteristics and sizes from different countries and every continent, achieving very successful results through the yielded ranking figures. Considering the continuing processes of data entry and data cleaning for more than 19.500 universities, we foresee that data entry issues, such as not completed entries or human errors in data entry caused either by universities or our team mistakenly, will be resolved and will yield to improved accuracy of results over time.

The Top University Ranking by the “AD Scientific Index” not only lists the areas where a university is strongest or has room for improvement but also reflects the outcomes of the institution's policies toward scientists. This report reveals an institution's competency in attracting prized scientists and its ability to encourage advancement and retain them.

Ranking Criteria for Countries:

As described in the university ranking section, it is not easy to obtain and standardize data from about 19,500 universities for the country ranking. Therefore, we based our ranking system on the number of meritorious scientists. Four criteria are used to rank the countries. The first is the number of scientists in the Top 2% list. The second and third are the numbers of scientists in the Top 10%, Top 20%, Top 40%, Top 60%, and Top 80% and later ranks. The fourth is the total number of scientists listed in the AD Scientific Index. In the case of a tie after applying all four criteria, the world rank of that country's most meritorious scientist is used.
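
A minimal sketch of this four-criteria comparison (Python; field names are hypothetical, and the final tie-breaker negates the best scientist's world rank because a lower rank is better):

```python
# Criteria: 1) scientists in the top 2%; 2-3) counts in the lower tiers;
# 4) total scientists listed; final tie-breaker: world rank of the
# country's best-ranked scientist (lower is better, hence negated).
countries = [
    {"name": "X", "top2": 50, "tiers": (300, 900), "listed": 5000, "best": 120},
    {"name": "Y", "top2": 50, "tiers": (300, 900), "listed": 5000, "best": 45},
]

ranked = sorted(
    countries,
    key=lambda c: (c["top2"], c["tiers"], c["listed"], -c["best"]),
    reverse=True,
)
print([c["name"] for c in ranked])  # -> ['Y', 'X']
```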

Top 100 Scientists

The ranking of the “Top 100” scientists is based on total h-index scores. Top 100 scientists can be ranked globally or within the following regions: Africa, Asia, Europe, North America, Oceania, the Arab League, EECA, BRICS, and Latin America, based on total h-index scores without any breakdown by subject area. Top 100 rankings in the world, a continent, or a region use the standardized subject areas of Agriculture & Forestry; Arts, Design and Architecture; Business & Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy, Theology; Law / Law and Legal Studies; Medical and Health Sciences; Natural Sciences; and Social Sciences. Subjects indicated as “Others” are not included in the rankings by region and subject; therefore, you may wish to specify your subject and branch to help standardize your ranking. As explained under “Data Collection and Standardization” above, the assignment of fields such as Engineering, Natural and Environmental Sciences, Biology, Biochemistry, Materials Science, Biotechnology, Chemistry, and the Social Sciences to subjects varies considerably across countries, regions, and schools, so we accepted the official names of institutions and academic branches as specified on each university's website. Furthermore, as noted above, we add an “i” mark at the end of the names of authors of papers with very many authors, such as CERN's scientific papers.

Academic collaboration

The scientific fields of interest specified in scientists' profiles are visible to scientists from other countries and institutions in order to enable academic collaboration.

Limitations of the “AD Scientific Index”: Missing or Inaccurate Profiles or Missing Institution Names

This index is a comparative platform developed by ranking accessible and verified profiles. First and foremost, it should be carefully noted that exclusion from this index, for whatever reason, does not indicate that an academician is not esteemed, nor does it mean that only the listed academicians are esteemed. A meritorious scientist may be absent from this index because they have no Google Scholar profile or because we could not access that profile for various reasons. The unavailability of verified Google Scholar profiles of scientists working in well-known and respected academic institutions may likewise prevent us from finding those institutions and profiles. Because updating the profiles in the system and collecting data from open sources require effort, and because these data have been collected for the first time, the index cannot be completely free of errors. Accurate and instant updating of profiles and institution names entails an endless workload that no institution could fully cover with available resources, despite all endeavors.

A high h-index (in WoS, Scopus, Publons, etc.) does not mean that a profile is automatically created for the academician in Google Scholar. Indeed, Google Scholar profiles are created and made public by scientists themselves on a voluntary basis. An individual may not have created a profile for various reasons and, therefore, will not be listed in the "AD Scientific Index". Furthermore, a profile can be rejected or may not be listed at a particular time. It should be considered that, at the time of our search, a profile may not exist or may not be public; some profiles may be public only at particular times; the information in the profile may not be standardized; there may be more than one profile belonging to the same person; the profile may not be verified; the name of the institution may be missing; surnames or institution names may change; profile owners may have died; or other known or unforeseen problems may occur. However, missing information is completed in the system regularly, and the list is updated and corrected continuously. Profiles whose owners have passed away are removed from the system.

When we detect, or are informed of, unethical situations in profile information that go beyond the limits of good faith, the person is excluded from the list. While individuals are responsible for the accuracy of their own profiles, organizations, too, should put the review of academic staff profiles on their agendas.

Articles with thousands of authors, such as CERN studies in the field of physics, or heavily multi-authored classification and statistical studies in medicine, raise debates about how much of an article's content can be attributed to any one author. Because such papers may cause inequality of opportunity, a separate grouping system may be needed in the future.

To minimize this problem, it is also possible to sort using the option "List without CERN, Statistical Data, etc.". This feature can only be found in the "AD Scientific Index".

The pros and cons of "ranking" systems such as Web of Science, Scopus, Google Scholar, and similar others are well known, and the limits of such systems have long been recognized in the scientific community. Therefore, interpreting this study beyond those limits may lead to incorrect conclusions. The “AD Scientific Index” should be evaluated with all of the abovementioned potential limitations in mind.

Possible reasons why a scientist is not on this list...

Since its foundation, the AD Scientific Index has expanded at a rapid pace across individuals, regions, universities, countries, and continents. Currently, it includes 1,150,000 scientists and academicians from 216 countries and 19,600 universities and institutions. We are in continuous pursuit of comprehensiveness, with close attention to the accuracy, cleanliness, reliability, and up-to-dateness of the data, so as to ensure sustainability. During each update, all records showing notable increases in their figures are reviewed. So far, we have excluded almost 200,000 records for various reasons during the various stages of list development.


Reasons why a name is not on the list:

  • A Google Scholar profile is not available,
  • The Google Scholar profile is not PUBLIC,
  • The information in the profile is incomplete or irrelevant,
  • A change in the PUBLIC status of the profile,
  • Some publications do not belong to the respective profile,
  • Detection of non-compliance during controls upon a notification about the profile,
  • The profile has recently been developed after our data expansion process about that institution,
  • The address is not clear or reliable,
  • Personal preferences for exclusion from a respective list,
  • Exclusions based on various non-compliance notifications made by institutions,
  • Exclusion of previously listed profiles because the profiles were not available during several search activities,
  • In addition, a name may not be found on the list because of our inadvertent mistakes.

How can individuals find out their ranking if they are not already included in the list?

You do not need to be included in a given list to find out your ranking: your ranking will be the same as that of other academicians or scientists with similar scores on the list. However, there are two possible ways to be included in the list. The first is that you may be included spontaneously during an update, since the index periodically continues to expand across continents, countries, universities, and branches.

May 25, 2021: 417,605 scientists, 167 countries, 9,525 universities

June 18, 2021: 700,093 scientists, 182 countries, 11,350 universities

October 18, 2021: 708,675 scientists, 206 countries, 13,542 universities

December 8, 2021: 710,652 scientists, 212 countries, 14,124 universities

January 3, 2022: 716,979 scientists, 212 countries, 14,130 universities

February 1, 2022: 744,886 scientists, 216 countries, 14,172 universities

March 27, 2022: 838,396 scientists, 216 countries, 15,165 universities

April 27, 2022: 907,146 scientists, 217 countries, 15,450 universities

June 5, 2022: 948,737 scientists, 216 countries, 15,652 universities

August 1, 2022: 1,037,806 scientists, 16,043 universities

October 1, 2022: 1,082,054 scientists, 19,490 universities

November 1, 2022: 1,113,263 scientists, 216 countries, 19,520 universities

November 13, 2022: 1,144,571 scientists, 216 countries, 19,528 universities

The second way is to use the registration page of the website, on which you can register as an individual or use the discounted institutional registration option. We are unable to respond to requests submitted through other means, including e-mail.

Comparisons of Ranking Systems

In addition to ranking lists of scientists, with many tables and charts of trend analyses delivered for the first time, this comprehensive system offers data and analysis results that add significant value for branches and institutions, within its inherent advantages and limitations. We would like to emphasize that comparisons should not be made between two branches with different potentials to produce scientific papers. For example, it is not reasonable to expect the same number of articles from branches as different as law, the social sciences, music, physics, or biochemistry. Ranking comparisons should not overlook the inherent publication potential of each branch. For this reason, we primarily favor comparisons within the same subject/department and based on recent productivity.

Through the contributions of many scientists from different fields, the "AD Scientific Index" undergoes systematic updates with the aim of continuous improvement. The index is an independent institution and does not receive support from any institution, organization, country, or fund. As the number of universities and scientists registered in the index continuously increases, we improve our methodology, software, data accuracy, and data cleaning procedures every day through the contributions of a large team. Your remarks and reports of our shortcomings will help guide our efforts toward continuous improvement.

Could this work have been designed in another way?

It is not possible to measure the research capacity of a university or a scientist exactly by using only a few parameters. Assessments should include many other types of data, such as patents, research funds, incentives, published books, teaching load, congress presentations, and graduate and doctoral supervision. A frequently voiced criticism asks why the Web of Science h-index is not used. The reason is that it is not possible to access complete data covering all academic components, such as the h-indexes of Web of Science, Scopus, or Publons, or records of organizations, patents, awards, and the like.

Because it is not possible to obtain the abovementioned information for about 19,500 universities, the only common parameter available for evaluation is the one our methodology uses. Our methodology yields results consistent with those of other ranking systems that use a large number of parameters.

The Concept of Predatory:

A journal or an academic service cannot be considered predatory only because it is not free. The concept of “predatory” describes unethical actions of a factitious, spurious, exaggerated, or deceptive nature performed in return for a fee; any predatory activity is misleading and unfair. We are an institution that receives no governmental, institutional, or financial support, and we aim to maintain the sustainability of our academic services and preserve our editorial independence. Through the extensive efforts of a large team, and within the scope of expanding our data across countries, branches, and universities, we have reached 1,090,000 academicians and 19,500 universities included in our database completely free of charge. Our expansion continues at a steady pace. However, we charge a small service fee to those who prefer to be included in the system faster, without compromising ethical principles.

Methodology that increases transparency and visibility:

The "AD Scientific Index" not only provides ranking services but also lights the way against ethical noncompliance by presenting data openly to the public, paving the way for ethical violations to be identified. Bearing a torch in this way, we improve controllability, transparency, and accountability at both the individual and corporate levels. Through such efforts, individuals and institutions have been led to focus on academic profiles, and tens of thousands of academicians have revised and rearranged their profiles, removing incorrect data. Besides stressing that academicians should regularly review the information in their profiles, we emphasize that institutions, too, should check the profiles of their academic staff. You are always welcome to contribute by reporting incorrect data through the redlist link.

How will the new rankings be updated in the “AD Scientific Index”?

Updates and new rankings will be produced from the current list of profiles and from the pool of academicians, which will expand along with new registrations. As noted above, taking 300 citations as the lower limit for inclusion may exclude profiles across a range of h-index values, and we will do our best to respond to e-mails questioning why a scientist with a high h-index is not included in the list.

Because data processing with simultaneous data input entails a risk of data pollution, we prefer not to work with live online data. Although it is difficult and time-consuming to check all profiles whose values have increased during each data extraction, we perform such checks regularly. Therefore, please do not send an e-mail requesting an update when the data in your profile change. However, you are always welcome to report an accidentally overlooked inappropriate profile by e-mail.

How can I be included in the “AD Scientific Index”?

First of all, you must have a Google Scholar profile, and this profile must be set to PUBLIC. If you do not have a Google Scholar profile, you can create one at https://scholar.google.com/ and add your published scientific articles. It is the responsibility of the scientist to ensure the accuracy and the ethical integrity of the profile. Furthermore, it is recommended that institutions check the profiles of their employees. We remind you to check your profile regularly and keep it updated: published scientific papers added to your profile may cause ethical issues if they do not belong to you.

Inappropriate or unethical profiles will be deleted, even if a fee is paid.

Is there a specified lower limit for the h-index and i10 index scores or the number of citations to be included in “AD Scientific Index”?

For SUBMISSIONS, no lower limits have been specified for the number of citations or for the h-index or i10-index scores required for inclusion in the “AD Scientific Index”.

See our ADD/ EDIT PROFILE page to be included in the index through individual and corporate registration options.

We request a small contribution as a processing fee for the sustainability and independence of this system, which has been developed through the hard work of many people without receiving any institutional or financial support.