Browsing by Author "Aparna Basu"
Now showing 1 - 6 of 6
Conference Paper: "An empirical analysis of existence of power laws in social media mentions to scholarly articles" (International Society for Scientometrics and Informetrics, 2021)
Authors: Sumit Kumar Banshal; Aparna Basu; Vivek Kumar Singh; Hiran H. Lathabai; Solanki Gupta; Pranab K. Muhuri
Abstract: Power laws are a characteristic distribution found in both natural and man-made systems. Previous studies have shown that citations to scientific articles follow a power law, i.e., the number of papers with a given citation count x is proportional to x raised to some negative power. However, the distributional character of altmetrics (such as reads, likes, and mentions) has not been studied in much detail, particularly with respect to the existence of power-law behaviour. This article therefore presents an empirical analysis of altmetric mention data for a large set of scholarly articles to see whether they exhibit a power law. The individual and composite data series of 'mentions' on the various platforms are fit to a power-law distribution, and the parameters and goodness of fit are determined using least-squares regression. We also explore fits to other distributions such as the log-normal and the hooked power law. The results confirm the existence of power-law behaviour in social media mentions of scholarly articles, and we conclude that altmetric distributions also follow a power law with a fairly good fit over a wide range of values. © 2021 18th International Conference on Scientometrics and Informetrics, ISSI 2021.
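The fitting procedure this abstract describes — estimating the exponent of y ∝ x^(−α) by least-squares regression — is commonly done in log-log space, where a power law becomes a straight line. A minimal sketch on synthetic data (the variable names and the NumPy-based approach are illustrative, not the authors' actual code):

```python
import numpy as np

def fit_power_law(x, y):
    """Least-squares fit of y = C * x**(-alpha) in log-log space.

    Returns (alpha, C). Assumes all x, y > 0.
    """
    log_x, log_y = np.log(x), np.log(y)
    # A power law y = C * x**(-alpha) is linear in log-log coordinates:
    # log y = log C - alpha * log x, so a degree-1 fit recovers both parameters.
    slope, intercept = np.polyfit(log_x, log_y, 1)
    return -slope, np.exp(intercept)

# Synthetic mention counts following an exact power law.
x = np.arange(1, 100, dtype=float)
y = 500.0 * x ** -2.1
alpha, C = fit_power_law(x, y)  # fitted alpha ≈ 2.1, C ≈ 500
```

Goodness of fit can then be assessed on the residuals in log space; on real, noisy mention data the tail is usually binned or truncated before fitting.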
Article: "Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India" (Springer Netherlands, 2016)
Authors: Aparna Basu; Sumit Kumar Banshal; Khushboo Singhal; Vivek Kumar Singh
Abstract: It is now generally accepted that institutions of higher education and research, which are largely publicly funded, need to be subjected to some benchmarking or performance-evaluation process. Several international ranking exercises currently rank institutions at the global level, using a variety of performance criteria such as research publication data, citations, awards, and reputation surveys. In these exercises, the data are combined in specified ways to create an index which is then used to rank the institutions. The resulting lists are generally limited to the top 500–1000 institutions in the world. Further, some criteria used in these exercises (e.g., the Nobel Prize) are not relevant for the large number of institutions in the medium range. In this paper we propose a multidimensional ‘Quality–Quantity’ Composite Index for a group of institutions using bibliometric data, which can be used for ranking and for decision-making or policy purposes at the national or regional level. The index is applied here to rank Central Universities in India. The resulting ranks compare well with those obtained with the h-index, and partially with the size-dependent Leiden Ranking and University Ranking by Academic Performance. A generalized model for the index using other variables and variable weights is proposed.
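The generalized weighted model this abstract mentions can be illustrated as a weighted sum of normalized indicators. The indicator names, weights, and data below are hypothetical and stand in for the authors' actual formula:

```python
def composite_index(indicators, weights):
    """Weighted sum of min-max normalized indicators.

    indicators: {indicator_name: {institution: raw_value}}
    weights: {indicator_name: weight}, assumed to sum to 1.
    Returns {institution: score} on a 0-1 scale.
    """
    scores = {}
    for name, values in indicators.items():
        lo, hi = min(values.values()), max(values.values())
        for inst, v in values.items():
            # Min-max normalization puts each indicator on a common scale.
            norm = (v - lo) / (hi - lo) if hi > lo else 0.0
            scores[inst] = scores.get(inst, 0.0) + weights[name] * norm
    return scores

# Hypothetical data: one quantity and one quality indicator, equal weights.
indicators = {
    "papers": {"U1": 1200, "U2": 600, "U3": 400},          # quantity
    "cites_per_paper": {"U1": 4.0, "U2": 6.0, "U3": 2.0},  # quality
}
weights = {"papers": 0.5, "cites_per_paper": 0.5}
scores = composite_index(indicators, weights)
ranking = sorted(scores, key=scores.get, reverse=True)
```

Changing the weights shifts the ranking toward quantity or quality, which is what makes a variable-weight model useful for policy purposes.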
© 2016, Akadémiai Kiadó, Budapest, Hungary.

Erratum: "Erratum to: Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India (Scientometrics, (2016), 107, (1171-1193), 10.1007/s11192-016-1935-0)" (Springer Netherlands, 2016)
Authors: Aparna Basu; Sumit Kumar Banshal; Khushboo Singhal; Vivek Kumar Singh
Abstract: In the original publication, two entries in the last column of Table 1, "Date of inception/conversion", for Hemwati Nandan Bahuguna University (HNBGU) and Central University of Himachal Pradesh (CUHP), were erroneously interchanged. The correct date of inception is 1973/2009 for HNBGU and 2009 for CUHP. The corrected Table 1 is provided in this erratum. © Akadémiai Kiadó, Budapest, Hungary 2016.

Article: "Research performance of Indian Institutes of Technology" (Indian Academy of Sciences, 2017)
Authors: Sumit Kumar Banshal; Vivek Kumar Singh; Aparna Basu; Pranab Kumar Muhuri
Abstract: This article presents a computational analysis of the research performance of the 16 older Indian Institutes of Technology (IITs). Research publication data indexed in Web of Science for all 16 IITs is analysed computationally to identify productivity, per-capita productivity, growth rate of research output, authorship and collaboration patterns, citation impact, and discipline-wise research strengths of the different IITs. The research performance of the IITs is compared with that of two top-ranking engineering and technology institutions of the world (MIT, USA and NTU, Singapore), and the most cited papers from these IITs are identified.
The analytical results are expected to provide an informative, up-to-date, and useful account of research performance assessment of the IITs.

Letter: "Response to the Letter to the Editor by Gangan Prathap on the article: Designing a composite index for research performance evaluation at the national or regional level: ranking Central Universities in India" (Springer Netherlands, 2016)
Authors: Aparna Basu; Sumit Kumar Banshal; Khushboo Singhal; Vivek Kumar Singh
[No abstract available]

Conference Paper: "Scientific vs. public attention: A comparison of top cited papers in WoS and top papers by altmetric score" (Springer Verlag, 2018)
Authors: Sumit Kumar Banshal; Aparna Basu; Vivek Kumar Singh; Pranab K. Muhuri
Abstract: Alternative metrics, or altmetrics, are article-level metrics that have been used to quantify the attention given to scholarly papers in online fora and social media. Altmetrics were proposed in part as a replacement for journal-level metrics such as the Impact Factor, and also to see whether altmetric scores could predict highly cited papers. Early studies, done soon after altmetrics were proposed in 2010, were somewhat premature because the use of social media was not yet widely prevalent, and their results may no longer be relevant now that usage levels have risen significantly. In 2016, Altmetric.com tracked over 17 million mentions of 2.7 million different research outputs. Of these, the top 100 most-discussed journal articles of 2016 were presented along with details of the journals in which the research was published, author affiliations, fields, countries, etc. We attempt to obtain a bibliometric profile of these papers with high altmetric scores in order to uncover the underlying patterns and characteristics that drive high public attention. In parallel, we analyse the top 100 highly cited papers from the same year.
Our objective in this empirical study is to examine the similarities and distinguishing features of scientific attention, as measured by citations, and public attention in online fora. A significant finding is that there is very little overlap between the most highly cited papers and those that received the highest altmetric scores. © Springer Nature Singapore Pte Ltd. 2018.
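The overlap finding amounts to intersecting two top-100 lists of article identifiers. A toy illustration with hypothetical DOIs (the ranges below are made up to mimic a small overlap):

```python
# Hypothetical article IDs: top 100 by citations vs. top 100 by altmetric score.
top_cited = {f"doi:{i}" for i in range(100)}
top_altmetric = {f"doi:{i}" for i in range(95, 195)}

# Set intersection gives the papers that are top-ranked on both measures.
overlap = top_cited & top_altmetric
print(len(overlap))  # 5
```

On real data, a small intersection like this (here 5 of 100) is what "very little overlap" means operationally.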
