University, Subject, Country, Region, World

Methodology

Ranking scientific journals according to their impact factor is a long-standing practice. The need to access scientifically valuable studies within a limited time frame, or to find scientists working in a particular field, has led to the practice of ranking scientists and scientific studies. To this end, many scoring systems, such as the h-index, i10 index, g-index, m-index, Erdős number, tori index, riq index and read-10 index, have been studied as numerical indicators of how productive and effective a researcher is. Each of these systems has advantages and disadvantages. Of the above indices, the most widely accepted is the h-index. The h-index is the largest number h such that the researcher has published h articles that have each been cited at least h times. To have a high h-index, an academic must have published many articles and received many citations. For example, an h-index of 15 indicates that the academic has received at least 15 citations on each of 15 published articles. To increase the h-index from 15 to 16, the same academic would need at least 16 citations on each of 16 papers. Several databases can be used to find an h-index value, including Google Scholar, Web of Science, Scopus and Publons; some are public and some require a subscription. These databases use different source sets to calculate h-indexes, such as SCI-E or otherwise indexed journals, or non-indexed ancillary sources such as other journals, books or patents. Because each database draws on a different source set, each may calculate a different h-index, so the h-indexes reported by Google Scholar, Web of Science, Scopus and Publons may differ for the same researcher. For example, a researcher who has written more books than scientific papers may have a low h-index in Web of Science despite having a high number of citations. None of these values is equivalent to another, because their scopes differ.
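
As an illustration only, here is a minimal sketch of how an h-index can be computed from a list of per-paper citation counts (the counts below are hypothetical):

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# 15 papers, each cited at least 15 times -> h-index of 15.
print(h_index([40, 30, 22, 20, 18, 17, 17, 16, 16, 15, 15, 15, 15, 15, 15]))  # 15
```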

Having a large number of publications indicates that a researcher is productive, but that number alone may not be a true indicator of success. For example, a researcher may have 10 publications that have received 400 citations. We can argue that this researcher is more successful than one with more than a hundred published papers that have received, say, 200 citations. Moreover, some valuable studies may not have received the recognition they deserve for various reasons, such as not being easily discoverable through scientific channels. A high number of citations by other authors shows the value and extent of a contribution to the scientific literature.

The i10 index is another academic scoring system, calculated by Google Scholar. In this system, only scientific outputs, such as articles and books, that have received 10 or more citations are taken into account: the number of works cited ten or more times gives the i10 index. The i10 index and h-index values calculated for the last six years do not mean that the works were written and published in the last six years. Instead, these values count only citations received over the last six years, which indicates whether a paper is still influential.
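
By the same token, a minimal sketch of the i10 index; passing only the citations accrued in the last six years would yield the six-year variant (again, illustrative only):

```python
def i10_index(citations: list[int]) -> int:
    """Number of works cited 10 or more times."""
    return sum(1 for c in citations if c >= 10)

print(i10_index([40, 12, 10, 9, 3]))  # 3 works have >= 10 citations
```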

Google Scholar provides the total i10 index, h-index and citation counts, as well as the corresponding values for the last six years, through a voluntary system: researchers create their accounts, select their papers and add them to their profiles. Viewing these profiles is free of charge and requires no login. Here we present a new index based on the public Google Scholar profiles of scientists. We have named this system the "AD Scientific Index" and developed it through a robust intellectual infrastructure and maximum effort, with the aim of contributing to global scientific endeavours.

Why is the “AD Scientific Index” needed?

The "AD Scientific Index" is the first and only study that shows the total and six-year productivity coefficients of scientists based on h-index and i10 index scores and citations in Google Scholar. In addition, the index provides the ranking and assessment of scientists in academic subjects and fields as well as in 22.388 universities, 218 countries, regions and the world. In other words, the "AD Scientific Index" provides both ranking and analysis results. In addition to the indexing and ranking functions, AD Scientific Index enlivens the academic life and offers the user the possibility to carry out an efficient academic analysis to verify and detect incorrect and unethical profiles, plagiarism, falsification, distortion, duplication, fabrication, slicing, salamisation, unfair authorship and various manifestations of academic harassment. Such analyses also help to reveal the medium- and long-term results of various policies implemented by institutions, including those related to academic staff recruitment and retention policies, salary policies, academic incentives and the scientific working environment.

“AD Scientific Index” (Alper-Doger Scientific Index):

This new index has been developed by Prof. Dr. Murat ALPER (MD) and Associate Prof. Dr. Cihan DÖĞER (MD) using the total and last-6-years values of the i10 index, the h-index and the citation counts in Google Scholar, together with the ratio of each last-6-years value to its total value. Using a total of nine parameters, the "AD Scientific Index" shows the ranking of an individual scientist in 12 subject areas (Agriculture & Forestry; Arts, Design & Architecture; Business & Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy & Theology; Law / Legal Studies; Medical & Health Sciences; Natural Sciences; Social Sciences; and Others), 256 branches, 22,388 institutions, 218 countries, 10 regions (Africa, Asia, Europe, North America, Oceania, Arab League, EECA, BRICS, Latin America and COMESA) and the world. This allows researchers to see their academic rankings and follow how those rankings evolve over time.
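
Our reading of the nine parameters, expressed as a sketch (the field and property names are ours, not the index's internal ones): the three Google Scholar totals, the same three values for the last six years, and the three last-6-years-to-total ratios.

```python
from dataclasses import dataclass

@dataclass
class ScholarMetrics:
    # Totals from the public Google Scholar profile
    h_index: int
    i10_index: int
    citations: int
    # Values for the last 6 years
    h_index_6y: int
    i10_index_6y: int
    citations_6y: int

    @property
    def h_ratio(self) -> float:
        """Last-6-years h-index as a share of the total (0.0 if no total)."""
        return self.h_index_6y / self.h_index if self.h_index else 0.0

    @property
    def i10_ratio(self) -> float:
        return self.i10_index_6y / self.i10_index if self.i10_index else 0.0

    @property
    def citation_ratio(self) -> float:
        return self.citations_6y / self.citations if self.citations else 0.0
```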

Data Policy:

All data presented here are taken from Google Scholar and from the information provided during registration; no information that has not been made public with the individual's consent is shared here, except for academic purposes. If you would like your information removed, you may send a message to [email protected], and it will be deleted within 6 business days. We do not collect credit card information.

Data Update, Data Collection and Expansion Policy:

Data is updated every 30-60 days. Data is collected manually from Google Scholar rankings, with priority given to profiles with at least 300 citations and verified addresses, or profiles whose accuracy we are confident of. The aim is to standardise names, institutions and branches as much as possible. Non-standardised data, including wide variations in information, the use of abbreviations and a variety of languages, has caused difficulties. Updates and new rankings will be drawn from the current list of profiles and from the pool of academics, which grows with new registrations. Through data mining and review of the information obtained, many profiles have been excluded from the index, and some profiles are excluded during the regular data-cleaning process. Data cleansing is a regular process that must be carried out meticulously, and we welcome your input in cleaning the data and ensuring accuracy.

Assigning scientific fields to subjects/departments may seem easy in some disciplines and in a number of countries, but it can cause considerable confusion in other countries, regions and schools. We would like to emphasise that fields such as engineering, natural and environmental sciences, biology and biochemistry, materials science, chemistry and the social sciences may be organised quite differently in different countries, so standardising subjects and branches has not been easy. To carry out the standardisation, we have accepted the official names of institutions and academic branches as they appear on each university's website. We adopted this strategy in order to at least partially standardise this complex situation.

We are gradually increasing, free of charge and within our means, the number of universities covered in each country and the number of academics listed per university. The current list of registered academics contains 1,353,897 individuals. Between expansions, frequent updates are limited to removals due to Red List entries and deaths, new profiles added by our team, and new individual and institutional registrations.

Although various additions, corrections and improvements are made on a daily basis, it should be noted that this process requires dedicated effort and time.

It is important to remember that using 300 citations as the lower limit for inclusion means that scientists with quite different h-index values may fall on either side of the threshold. For example, an academic with an h-index of 1 and 300 citations may be included in the ranking, while another with an h-index of 5 and 30 citations, or an h-index of 10 and 120 citations, may be excluded. Such issues require regular updates, and updating the rankings can take considerable time owing to workload constraints. This time can be reduced through the individual and institutional submission options.
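
On our reading, inclusion during data collection hinges on the citation count rather than the h-index; a minimal sketch of the three cases above (the rule and threshold are our interpretation of the description):

```python
CITATION_THRESHOLD = 300  # lower limit for inclusion, per the description

def eligible(total_citations: int) -> bool:
    # The h-index plays no role in this particular check.
    return total_citations >= CITATION_THRESHOLD

print(eligible(300))  # h-index 1, 300 citations   -> True (included)
print(eligible(30))   # h-index 5, 30 citations    -> False (excluded)
print(eligible(120))  # h-index 10, 120 citations  -> False (excluded)
```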

As a non-funded and independent organisation, we have managed to become the most comprehensive index, covering 22,388 universities and institutions and 1,353,897 scientists through free registrations. At present we do not aim to include every scientist in the world; rather, we aim to maintain our expansion policy with the data we have already collected. This will take time, which is necessary to preserve data cleanliness and reliability. We would like to emphasise that our methodology is transparent: academics and scientists can learn their scores and rankings free of charge, just as with other measures such as the h-index and citation counts. Registration in our index is therefore not mandatory. Through sustained and demanding effort, we have achieved sufficient capacity to represent institutions adequately. Currently, the way to be included in our database is to register as an individual or institution via the "Registration" link.

Studies that influence rankings because of a very high number of citations (e.g., CERN papers):

We have started adding an “i” marker at the end of authors' names when a paper has a very large number of authors, as with CERN, ATLAS, ALICE and CMS collaboration papers, or with statistical datasets, guidelines, updates and similar studies. We expect that new criteria will be defined for such studies; until such criteria are described, we mark these studies with an “i”.

Profile information and ethical responsibility:

The ethical responsibility for accurate profile information rests entirely with the individual scientist. However, we believe it would be prudent for institutions, countries and even professional societies to conduct periodic reviews of the profiles of scientists affiliated with them, as misleading information can damage the reputation of an organisation or country. Organisations should also review profiles to identify and report scientists who falsely claim affiliation with the institution. To avoid reputational damage, institutions should take the necessary corrective and preventive action against published profiles that have been arranged unethically.

Data Cleaning and the Red List

Data cleansing is a dynamic process that we perform systematically and on an ongoing basis. Despite our best efforts, we may not be completely accurate, and we welcome your Red List notifications. Rarely, scientists are placed on the Red List because of innocent mistakes made in good faith, without unethical behaviour; most such errors result from inadequate periodic profile checks. Such an error can easily be corrected by submitting a correction request. To avoid this undesirable situation, researchers should check their profiles regularly and institutions should systematically check the profiles of their staff. Use [email protected] to report an inappropriate profile, a death, or any other condition that would require a profile to be removed.

Ranking Criteria:

Scientists are ranked by university, country, region and worldwide based on the “total h-index”, which is also used for rankings by branch and sub-branch.

Ranking based on the “total h-index” was performed using criteria in the following order: 1. Total h-index scores, 2. Total number of citations, 3. Total i10 index scores, 4. Last 6 years’ h-index scores.

Ranking based on the last 6 years’ h-index scores was performed using criteria in the following order: 1. Last 6 years’ h-index scores, 2. Number of citations in the last 6 years, 3. Last 6 years’ i10 index scores, 4. Total h-index scores.

The ranking criteria for the total i10 index were used in the following order: 1. Total i10 index scores, 2. Total h-index scores, 3. Total number of citations, and 4. Last 6 years’ i10 index scores.

Ranking based on the last 6 years’ i10 index scores was performed using the criteria in the following order: 1. Last 6 years’ i10 index scores, 2. Last 6 years’ h-index scores, 3. Number of citations in the last 6 years and 4. Total i10 index scores.

Ranking based on the total number of citations was performed using the criteria in the following order: 1. Total number of citations, 2. Total h-index scores, 3. Total i10 index scores and 4. Number of citations in the last 6 years.

Ranking based on the number of citations in the last 6 years was performed using the criteria in the following order: 1. Number of citations in the last 6 years, 2. Last 6 years’ h-index scores, 3. Last 6 years’ i10 index scores and 4. Total number of citations.
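
Each ranking above is a lexicographic sort over four keys, where later keys only break ties on earlier ones. A minimal sketch for the total-h-index ranking with hypothetical records; the other rankings, including the Productivity Ranking described below, differ only in key order:

```python
# Hypothetical records; in practice these values come from Google Scholar profiles.
scientists = [
    {"name": "A", "h": 40, "cites": 9000, "i10": 120, "h6": 25},
    {"name": "B", "h": 40, "cites": 9000, "i10": 110, "h6": 30},
    {"name": "C", "h": 42, "cites": 7000, "i10": 100, "h6": 20},
]

def total_h_index_key(s: dict) -> tuple:
    # 1. total h-index, 2. total citations, 3. total i10 index,
    # 4. last 6 years' h-index; negated so that higher values rank first.
    return (-s["h"], -s["cites"], -s["i10"], -s["h6"])

for rank, s in enumerate(sorted(scientists, key=total_h_index_key), start=1):
    print(rank, s["name"])  # 1 C, 2 A (i10 breaks the tie), 3 B
```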

Productivity Rankings

Productivity Rankings are a unique service offered only by the "AD Scientific Index". They are derived from the i10 index to show scientists' productivity in publishing high-value scientific articles. The Productivity Ranking lists the most productive scientists in a given field, discipline, university and country, and can guide the development of meaningful incentives and academic policies. The world, regional and university rankings of scientists in this table are calculated on the basis of the total i10 index.

Why are the ratios of the last 6 years’ values to the total values important?

The ratios of the last 6 years’ h-index, i10 index and citation counts to their respective totals are important and unique features of the AD Scientific Index, showing both the trajectory of a scientist's individual performance and the impact of universities' institutional policies on the overall scientific picture.

Ranking Criteria for Universities:

We provide a ranking that covers all universities, private universities, public universities, institutions, hospitals and companies together, as well as rankings restricted to each category. For a private university, for example, you can see its ranking in its country, its region and the world among all institutions, among all private universities and among all universities.

For global university rankings, ranking organisations use the following parameters: quality of education, employment rates of graduates, quality of faculties within an individual university, international collaborations, number of alumni and staff awarded Nobel Prizes and Fields Medals, number of highly cited researchers selected by Clarivate Analytics, total number of research papers, number of articles published in Nature and Science journals, number of articles indexed in Science Citation Index-Expanded (SCIE) and Social Science Citation Index (SSCI), and number of highly cited research articles. Each ranking organisation develops a ranking methodology that assigns different weightings to selected elements of these parameters. Experienced ranking organisations evaluate 2,000-3,000 universities for the ranking.

The AD Scientific Index performs rankings using a single parameter: the number of "Valued and Productive Scientists" employed by a given university. This parameter, selected after years of observation, is calculated using the total h-index and i10 index values together with the total number of citations, and the last 6 years' h-index and i10 index values together with the citations received in the last 6 years. We rank more than 22,350 universities in this way. Careful examination will reveal that most of the other parameters are reflections of the natural academic output of valued and productive academics: institutions employing many such scientists, for example scientists in the top 3%, top 10%, top 20%, top 40%, top 60%, top 80% and later ranks, will naturally produce more of the academic outputs listed as parameters above. The AD Scientific Index is the only university ranking system that analyses the distribution of an institution's scientists across the top 3, 10, 20, 30, 40, 50, 60, 70, 80 and 90 per cent bands.

The ranking of institutions starts by identifying each institution's scientists who fall within the top 3, 10, 20, 30, 40, 50, 60, 70, 80 and 90 per cent bands. Institutions with more scientists in these bands are ranked higher. If two institutions have an equal number of scientists in a band, the next band is considered; if the numbers are still equal, the institution with the higher total number of listed scientists is ranked higher. A sketch of this comparison is given below.
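
On our reading, institutions are compared lexicographically on their per-band counts, from the top 3% band downwards, with the total number of listed scientists as the final tie-breaker. A minimal sketch with hypothetical counts:

```python
BANDS = (3, 10, 20, 30, 40, 50, 60, 70, 80, 90)  # top-percentile bands

def institution_key(band_counts: dict[int, int], total_listed: int) -> tuple:
    # Compare band by band, from top 3% to top 90%; if every band is
    # equal, the institution with more listed scientists ranks higher.
    return tuple(-band_counts.get(b, 0) for b in BANDS) + (-total_listed,)

# Hypothetical example: tied on the first two bands, decided at top 20%.
keys = {
    "A": institution_key({3: 5, 10: 12, 20: 30}, total_listed=400),
    "B": institution_key({3: 5, 10: 12, 20: 25}, total_listed=500),
}
print(sorted(keys, key=keys.get))  # ['A', 'B']
```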

Comparing the AD Scientific Index scores of institutions with the scores produced by other ranking organisations shows a high degree of consistency. We use our methodology to rank institutions of different characteristics and sizes from different countries and all continents, with very consistent results. Given the ongoing data entry and data cleansing for over 21,500 universities, we expect that issues such as incomplete entries or human error, whether on the universities' side or ours, will be resolved and accuracy will improve over time.

The AD Scientific Index top university rankings not only show the areas in which a university excels or has room for improvement, but also reflect the results of institutions' science policies. This report reveals institutions' ability to attract highly regarded researchers, to foster their progress and to retain them.

Ranking Criteria for Countries:

As described in the university ranking section, it is not easy to obtain and standardise data from 22,388 universities for the ranking of 218 countries. We therefore based our country ranking on the number of meritorious scientists, using four criteria. The first is the number of scientists in the top 3% list. The second and third are the numbers of scientists in the top 10%, top 20%, top 40%, top 60% and top 80% bands and in later ranks. The fourth is the number of scientists listed in the AD Scientific Index. If countries remain tied after all four criteria, the world rank of that country's top scientist is used, as sketched below.
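
A sketch of the four country criteria as we understand them, with hypothetical figures; the final tie-breaker is the world rank of the country's top scientist, where lower is better, so it is not negated:

```python
def country_key(top3: int, band_counts: tuple[int, ...],
                total_listed: int, best_world_rank: int) -> tuple:
    # 1. scientists in the top 3%; 2.-3. scientists in the top 10%, 20%,
    # 40%, 60%, 80% and later bands; 4. scientists listed in the index;
    # ties broken by the world rank of the country's top scientist.
    return (-top3, tuple(-c for c in band_counts), -total_listed, best_world_rank)

countries = {
    "X": country_key(120, (300, 500, 900, 1200, 1500), 8000, 57),
    "Y": country_key(120, (300, 500, 900, 1200, 1500), 8000, 41),
}
print(sorted(countries, key=countries.get))  # ['Y', 'X'] because rank 41 beats 57
```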

Top 100 Institutions

With this ranking, you can see the top 100 institutions among all universities, private universities, public universities, all institutions, hospitals and companies in any country, region and the world.

Top 100 Scientists

The Top 100 Scientists ranking is based on total h-index scores. It can be generated globally or for the following regions: Africa, Asia, Europe, North America, Oceania, Arab League, EECA, BRICS and Latin America, without any breakdown by subject area. The top 100 rankings for the world, a continent or a region cover the standardised subject areas of Agriculture & Forestry; Arts, Design & Architecture; Business & Management; Economics & Econometrics; Education; Engineering & Technology; History, Philosophy & Theology; Law & Legal Studies; Medical & Health Sciences; Natural Sciences; and Social Sciences. Subjects listed as "Other" are not included in the rankings by region and subject; you may therefore wish to specify your subject and branch and so contribute to the standardisation of your record. Assigning scientific fields to subjects/departments may seem easy in some disciplines and in a number of countries, but it can cause considerable confusion in other countries, regions and schools. Fields such as engineering, natural and environmental sciences, biology, biochemistry, materials science, biotechnology, chemistry and the social sciences may be organised quite differently in different countries, so standardising subjects and branches was not easy. To carry out the standardisation, we have accepted the official names of institutions and academic branches as they appear on each university's website, a strategy adopted to at least partially standardise this complex situation. We have also started adding an "i" marker at the end of authors' names when a paper has a very large number of authors, such as CERN papers.

Academic collaboration

The scientific fields of interest specified in scientists' profiles are visible to scientists from other countries and institutions, in order to enable academic collaboration.

Limitations of the “AD Scientific Index”: Missing or Inaccurate Profiles or Missing Institution Names

This index is a comparative platform built by ranking accessible and verified profiles. First and foremost, it should be noted carefully that absence from this index, for whatever reason, does not mean that an academic is not valued, nor that only the academics listed here are valued. A meritorious scholar may not be included because he or she has no Google Scholar profile, or because we could not access that profile for various reasons. The unavailability of verified Google Scholar profiles for scholars working at well-known and respected institutions may likewise prevent us from finding those institutions and scholars. Because updating profiles and collecting data from open sources requires effort, and because the data are being compiled for the first time, the index cannot be completely error-free. Keeping profiles and institution names accurate and immediately up to date would require an endless amount of work that no institution could accomplish with the resources available.

A high h-index (in WoS, Scopus, Publons, etc.) does not automatically mean that a scholar has a Google Scholar profile: Google Scholar profiles are created and published by scholars themselves on a voluntary basis. An individual may not have created a profile for a variety of reasons and will therefore not be listed in the AD Scientific Index; a profile may also be rejected or unlisted at any given time. It is important to remember that a profile may not exist or may not be public at the time of our search, some profiles are public only at certain times, the information in a profile may be inconsistent, more than one profile may belong to the same person, profiles may be unverified, the institution name may be missing, surnames or institution names may change, profile owners may have died, and other known or unforeseen problems may occur. Missing information is regularly added to the system, and the list is continually updated and corrected. Profiles whose owners have died are removed from the system.

If we discover, or are informed of, unethical situations in profile information that go beyond the bounds of decency, the person will be removed from the list. Just as individuals are responsible for the accuracy of their profiles, organisations should put regular reviews of their academic staff's profiles on their agenda.

Articles with thousands of authors, such as CERN studies in physics, or multi-author classification and statistical studies in medicine, raise debates about how much of an article's content can be attributed to each author. As such papers may create inequality of opportunity, a separate grouping system may be needed in the future.

To minimise this problem, it is also possible to sort using the "List without CERN, Statistical Data, etc." option, a feature found only in the AD Scientific Index.

The pros and cons of systems such as Web of Science, Scopus, Google Scholar and the like are well known, and their limitations have long been recognised in the scientific community. Interpreting this study beyond those limitations may therefore lead to erroneous conclusions. The AD Scientific Index should be evaluated with all of the above potential limitations in mind.

Possible reasons why a scientist is not on this list...

Since its foundation, the AD Scientific Index has expanded rapidly across individuals, universities, countries, regions and continents. It currently includes 1,353,897 scientists and academics from 218 countries and 22,388 universities and institutions. We continuously pursue comprehensiveness while closely monitoring the accuracy, cleanliness, reliability and currency of the data to ensure sustainability. During each update, all profiles showing marked increases in their figures are reviewed. So far, we have excluded almost 200,000 profiles for various reasons during the several stages of list development.

Reasons why a name is not on the list:

  • No Google Scholar profile is available,
  • The person has notified us that they do not wish to be listed,
  • The Google Scholar profile is not PUBLIC,
  • The information in the profile is incomplete or irrelevant,
  • The profile's PUBLIC status has changed,
  • Some publications in the profile do not belong to its owner,
  • An inappropriateness was found, and the profile deleted, during the review of a complaint,
  • The personal profile was made public outside the periodic data-expansion window for the organisation,
  • The address is not clear or reliable,
  • The profile was deleted following notifications of non-compliance from the researcher's institution,
  • A previously listed profile became inaccessible during an update and was deleted,
  • In addition, a name may be missing from the list due to various errors.

How can individuals find out their ranking if they are not already included in the list?

You do not need to be included in a list to find out your ranking: it will be the same as that of other academics or scientists with similar scores who are already listed. However, there is only one way to get on the list: the registration page of the website, which offers individual and institutional registration options. We do not respond to individual registration requests sent by e-mail.

May 25, 2021: 417,605 scientists, 167 countries, 9,525 universities

June 18, 2021: 700,093 scientists, 182 countries, 11,350 universities

June 5, 2022: 948,737 scientists, 216 countries, 15,652 universities

October 1, 2022: 1,082,054 scientists, 19,490 universities

April 1, 2023: 1,350,571 scientists, 218 countries, 21,500 universities

Comparisons of Ranking Systems

In addition to the rankings of scientists, with their many tables and trend-analysis graphs provided here for the first time, this comprehensive system offers data and analyses that, within its inherent advantages and limitations, provide important added value to branches and institutions. We would like to emphasise that comparisons should not be made between two branches with different potentials for producing scientific publications. For example, it is not correct to expect the same number of articles from fields as different as law, the social sciences, music, physics or biochemistry; ranking comparisons should not overlook each field's inherent publication potential. For this reason, we try to focus on observations within the same subject/field and on recent productivity.

Could this work have been designed in another way?

It is not possible to measure the research capacity of a university or a researcher accurately on the basis of a few parameters. Assessments should include many other types of data, such as patents, research funding, incentives, published books, teaching intensity, congress presentations, and graduate and postgraduate teaching positions. A common criticism is that the Web of Science h-index is not used. The reason is that it is not possible to access all the data covering every academic component, such as the h-indexes from Web of Science, Scopus or Publons, or records of organisations, patents, awards and the like.

Because the above information cannot be obtained for all 22,388 universities, the only common parameter available for evaluation is the methodology we use. Our methodology yields results consistent with those of other ranking systems that use a large number of parameters.

The Concept of "Predatory":

A journal or an academic service cannot be considered predatory merely because it is not free. The term "predatory" describes unethical actions of a factitious, spurious, exaggerated or deceptive nature performed in return for a fee; any predatory activity is misleading and unfair. As an organisation without governmental, institutional or financial support, and with the aim of sustaining our academic services and preserving editorial independence, we have reached 1,353,897 academics and 22,388 universities in our database completely free of charge, through the extensive efforts of a large team expanding our data across countries, branches and universities. Our expansion continues at a steady pace. However, we charge a small service fee to those who prefer to be included in the system faster, without compromising ethical principles.

A methodology that increases transparency and visibility: the "AD Scientific Index" not only provides ranking services but also shines a light on ethical violations by presenting publicly available data, paving the way for their resolution. By carrying the torch in this way, we improve controllability, transparency and accountability at both the individual and the institutional level. These efforts have led individuals and institutions to focus on academic profiles, and tens of thousands of academics have revised their profiles and removed inaccurate data. As well as stressing that academics should regularly review the information in their profiles, we emphasise that institutions should review the profiles of their academic staff. You are always welcome to contribute by reporting incorrect data via the Red List link.

How will the new rankings be updated in the “AD Scientific Index”?

Updates and new rankings will be drawn from the current list of profiles and from the pool of academics, which grows with new registrations. Importantly, one should remember that taking 300 citations as the lower limit for inclusion means that scientists with quite different h-index values may fall on either side of the threshold. We will do our best to respond to e-mails questioning why a scientist with a high h-index is not included in the list.

Because processing data while new data is simultaneously being entered entails a risk of data pollution, we prefer not to work with live online data. Although it is difficult and time-consuming to check, at each data extraction, all profiles whose figures have increased, we perform such checks regularly. Please therefore do not send an e-mail requesting an update when the data in your profile changes. However, you are always welcome to report an accidentally overlooked inappropriate profile by e-mail.

How can I be included in the “AD Scientific Index”?

First of all, you must have a Google Scholar profile, and this profile must be set to PUBLIC. If you do not have one, you can create a profile at https://scholar.google.com/ and add your published scientific articles. Ensuring the accuracy and ethical soundness of the profile is the responsibility of the scientist, and institutions are advised to check the profiles of their employees. We would like to remind you to check your profile regularly and keep it up to date: published papers added to your profile that do not belong to you may cause ethical issues.

Inappropriate or unethical profiles will be deleted, even if a fee is paid.

Is there a specified lower limit for the h-index and i10 index scores or the number of citations to be included in “AD Scientific Index”?

For SUBMISSIONS, no lower limits have been specified for the number of citations or the h-index or i10-index scores to be included in the “AD Scientific Index”.

See our ADD/EDIT PROFILE page to be included in the index through the individual and corporate registration options.

Fee Policy: For the sustainability and independence of this system, which has been developed through the labour of many people without any institutional or financial support, we request a small transaction fee. With the contribution of many scientists from different fields, the "AD Scientific Index" is systematically updated for continuous improvement. In parallel with the continuous growth in the number of universities and scientists registered in the index, a large team improves the methodology, software, data accuracy and data-cleaning procedures every day. Free changes: university/institution changes (by e-mailing [email protected] with evidence). Paid changes: 24-30 USD depending on country, covering Add Profile, Add Branch and subfield, ORCID iD, Web of Science Researcher ID, ResearchGate, Scopus Author ID, e-mail, personal Twitter, Facebook, LinkedIn, Awards and Achievements, Office, Company or Private Business link, Books / E-books, Institutional Web Address and Lecture Notes link.

Your comments and contributions regarding our shortcomings will shed light on our continuous improvement efforts.