Browsing by Author "Sumit Kumar Banshal"
Now showing 1 - 20 of 21
Article: A large-scale comparison of coverage and mentions captured by the two altmetric aggregators: Altmetric.com and PlumX (Springer Science and Business Media B.V., 2021). Mousumi Karmakar; Sumit Kumar Banshal; Vivek Kumar Singh.
The increased social media attention to scholarly articles has resulted in the creation of platforms and services to track the social media transactions around them. Altmetric.com and PlumX are two such popular altmetric aggregators. Scholarly articles get mentions on different social platforms (such as Twitter, Blog, Facebook) and academic social networks (such as Mendeley, Academia and ResearchGate). The aggregators track activity and events on social media and academic social networks and provide the coverage and transaction data to researchers for various purposes. Some previous studies have compared different altmetric aggregators and found differences in the coverage and mentions captured by them. This paper revisits the question through a large-scale analysis of the altmetric mentions captured by the two aggregators for a set of 1,785,149 publication records from Web of Science. Results show that PlumX tracks more altmetric sources and captures altmetric events for a larger number of articles than Altmetric.com. However, the coverage and average mentions of the two aggregators, for the same set of articles, vary across platforms, with Altmetric.com recording higher mentions on Twitter and blogs, and PlumX recording higher mentions on Facebook and Mendeley. The article also analysed the coverage and average mentions captured by the two aggregators across different document types, subjects and publishers.
© 2021, Akadémiai Kiadó, Budapest, Hungary.

Conference Paper: A scientometric analysis of computer science research in India (Institute of Electrical and Electronics Engineers Inc., 2015). Khushboo Singhal; Sumit Kumar Banshal; Ashraf Uddin; Vivek Kumar Singh.
This paper presents the results of our scientometric and text-based analysis of computer science research output from India during the last 25 years. We collected the data for research output indexed in Scopus and performed a detailed computational analysis to obtain important indicators, such as total research output, citation impact, collaboration patterns, and top institutions, authors and publication sources. We also performed a text-based analysis of the keywords of all papers indexed in Scopus to identify thematic trends during the period. The analytical results present a detailed and useful picture of the status and competence of computer science research in India.
© 2015 IEEE.

Article: Altmetric data quality analysis using Benford's law (Akademiai Kiado ZRt., 2024). Solanki Gupta; Vivek Kumar Singh; Sumit Kumar Banshal.
Altmetrics, or alternative metrics, refer to newer kinds of events around scholarly articles, such as the number of times an article is read, tweeted or mentioned in blog posts. These metrics have gained considerable popularity in recent years and are now being collected and used in several ways, ranging from early measures of article impact to potential indicators of the societal relevance of research. However, several studies have cautioned about the use of altmetrics on account of the quality and reliability of altmetric data, which may be prone to manipulation and artificial inflation. This study proposes a framework based on the application of Benford's Law to evaluate the quality of altmetric data. A large-sized altmetric data sample is considered and the fits with Benford's Law are computed.
The analysis is performed by plotting the empirical data distributions against the theoretical Benford distribution and by employing relevant statistical measures and tests. Results for fits on the first and second leading digits of the altmetric data show conformity to the Benford distribution. To further explore the usefulness of the framework, the altmetric data are subjected to artificial manipulation through a systematic process and the fits to Benford's Law are reassessed to check for distortions. The results and analysis suggest that a framework based on Benford's Law can be used to test the quality of altmetric data. Relevant implications of the research are discussed.
© Akadémiai Kiadó, Budapest, Hungary 2024.

Article: An altmetric analysis of scholarly articles from India (IOS Press, 2018). Sumit Kumar Banshal; Vivek Kumar Singh; Golam Kaderye; Pranab Kumar Muhuri; Belém Priego Sánchez.
Scholarly articles are considered one of the primary media for the dissemination of inventions and discoveries. Traditionally, the usefulness and popularity of a scholarly article have been measured in terms of the citations it receives. However, in the changed research publishing landscape, where most publications are now available in digital form accessible through various digital libraries, new measures of the usefulness of scholarly articles have emerged. Scholarly articles are now easily available for access and download from various digital access portals, and the use and popularity of these portals have made it possible to integrate various social media platforms with journal access and use. Most journals now maintain statistics on reads, downloads, social profile shares, etc. Newer platforms like ResearchGate, Academia and Mendeley have also become popular. Researchers now often share their articles on such platforms and use social media channels to disseminate their articles to a wider audience. This transformed environment has made it possible to track and measure the usefulness and popularity of scholarly articles through alternative metrics (now popularly known as altmetrics), alongside traditional citation impact measures. Altmetrics attempt to derive the impact of a scholarly article from data of different kinds, such as social network shares, mentions, tweets, etc. The use of altmetrics varies widely from country to country and from discipline to discipline. This paper presents the findings of an exploratory analysis of the relevance of altmetric data through a case study of scholarly articles from India, published during 2016, indexed in Web of Science and also available on ResearchGate. The results provide interesting insights into the relatedness and correlation of the presence of scholarly articles in Web of Science and ResearchGate. It is observed that about 61% of papers indexed in Web of Science have an entry in ResearchGate. There are, however, disciplinary variations in the presence of articles in ResearchGate: only about 61% of the disciplines in Web of Science are found to be covered in ResearchGate.
© 2018 IOS Press and the authors. All rights reserved.

Conference Paper: An empirical analysis of the existence of power laws in social media mentions to scholarly articles (International Society for Scientometrics and Informetrics, 2021). Sumit Kumar Banshal; Aparna Basu; Vivek Kumar Singh; Hiran H. Lathabai; Solanki Gupta; Pranab K. Muhuri.
Power laws are a characteristic distribution found in both natural and man-made systems. Previous studies have shown that citations to scientific articles follow a power law, i.e., the number of papers having a certain level of citation x is proportional to x raised to some negative power. However, the distributional character of altmetrics (such as reads, likes, mentions, etc.) has not been studied in much detail, particularly with respect to the existence of power-law behaviour. This article therefore presents an empirical analysis of the altmetric mention data of a large set of scholarly articles to see whether they exhibit a power law. The individual and composite data series of mentions on the various platforms are fit to a power-law distribution, and the parameters and goodness of fit are determined using least-squares regression. We also explore fits to other distributions, such as the log-normal and hooked power law. The results confirm the existence of power-law behaviour in social media mentions to scholarly articles, and we conclude that altmetric distributions also follow a power law with a fairly good fit over a wide range of values.
© 2021 18th International Conference on Scientometrics and Informetrics, ISSI 2021. All rights reserved.

Article: Can altmetric mentions predict later citations? A test of validity on data from ResearchGate and three social media platforms (Emerald Group Holdings Ltd., 2020). Sumit Kumar Banshal; Vivek Kumar Singh; Pranab Kumar Muhuri.
Purpose: The main purpose of this study is to explore and validate the question of whether altmetric mentions can predict citations to scholarly articles. The paper explores the nature and degree of correlation between altmetrics (from ResearchGate and three social media platforms) and citations. Design/methodology/approach: A large sample of scholarly articles published from India in the year 2016 is obtained from the Web of Science database, and the corresponding altmetric data are obtained from ResearchGate and three social media platforms (Twitter, Facebook and blogs, through the Altmetric.com aggregator). Correlations are computed between early altmetric mentions and later citation counts for data grouped into different disciplinary groups. Findings: Results show that the correlations between altmetric mentions and citation counts are positive, but weak. Correlations are relatively higher for data from ResearchGate than for data from the three social media platforms. Further, significant disciplinary differences are observed in the degree of correlation between altmetrics and citations. Research limitations/implications: The results support the idea that altmetrics do not necessarily reflect the same kind of impact as citations. However, articles that get higher altmetric attention early may actually have a slight citation advantage. Further, altmetrics from academic social networks like ResearchGate are more correlated with citations than those from social media platforms. Originality/value: The paper is novel in two respects. First, it takes altmetric data for a window of about 1–1.5 years after article publication and citation counts for a longer citation window of about 3–4 years after publication. Second, it is one of the first studies to analyze data from ResearchGate, a popular academic social network, to understand the type and degree of such correlations. Peer review: The peer review history for this article is available at: https://publons.com/publon/10.1108/OIR-11-2019-0364.
© 2020, Emerald Publishing Limited.

Article: Comparing research performance of private universities in India with IITs, central universities and NITs (Indian Academy of Sciences, 2019). Sumit Kumar Banshal; Vivek Kumar Singh; Philipp Mayr.
During the last two decades the number of private universities in India has increased significantly. According to the AISHE report of 2016, 277 of the 799 universities in India are private, i.e., one out of every three universities in India is a private university. A significant proportion of colleges (about 78%) are also privately managed; colleges, however, do not contribute much to research activities and are therefore not included in this analysis. Private universities are now becoming a major component of the Indian higher education system.
Some of the private universities are positioning and projecting themselves as universities for high-quality research and innovation, and a few are now well placed in the national-level NIRF ranking framework. It is in this context that this paper presents a comparative account of the research performance of the 25 most productive private universities against the Indian Institutes of Technology (IITs), Central Universities (CUs) and National Institutes of Technology (NITs), all of which have a well-established environment and culture of research. A set-based comparison methodology is followed. The results show good performance of private universities in research, especially in terms of output and rate of growth of output. However, on quality and on productivity per capita and per rupee spent, they have a long way to go to match the performance levels of the well-established centrally funded higher education institutions of India. The study presents a detailed scientometric assessment of some of the most productive private universities in India.
© 2019 Current Science Association, Bengaluru.

Article: Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India (Springer Netherlands, 2016). Aparna Basu; Sumit Kumar Banshal; Khushboo Singhal; Vivek Kumar Singh.
It is now generally accepted that institutions of higher education and research, largely publicly funded, need to be subjected to some benchmarking process or performance evaluation. Currently there are several international ranking exercises that rank institutions at the global level, using a variety of performance criteria such as research publication data, citations, awards and reputation surveys. In these exercises, the data are combined in specified ways to create an index which is then used to rank the institutions. The resulting lists are generally limited to the top 500–1000 institutions in the world. Further, some criteria (e.g., the Nobel Prize) used in some of the ranking exercises are not relevant for the large number of institutions in the medium range. In this paper we propose a multidimensional ‘Quality–Quantity’ Composite Index for a group of institutions, based on bibliometric data, that can be used for ranking and for decision-making or policy purposes at the national or regional level. The index is applied here to rank the Central Universities in India. The ranks obtained compare well with those obtained with the h-index, and partially with the size-dependent Leiden Ranking and the University Ranking by Academic Performance. A generalized model for the index using other variables and variable weights is proposed.
© 2016, Akadémiai Kiadó, Budapest, Hungary.

Conference Paper: Disciplinary variations in altmetric coverage of scholarly articles (International Society for Scientometrics and Informetrics, 2019). Sumit Kumar Banshal; Vivek Kumar Singh; Pranab K. Muhuri; Philipp Mayr.
Popular social media platforms now make it possible for scholarly articles to be shared rapidly in different forms, which in turn can significantly improve their visibility and reach. Many authors use social media platforms to disseminate their scholarly articles (often as pre- or post-prints) beyond the paywalls of journals. It is, however, not well established whether the level of social media coverage and attention of scholarly articles is the same across all research disciplines or whether there are discipline-wise variations. This paper explores the disciplinary variations in coverage and altmetric attention by analyzing a significantly large amount of data from Web of Science and Altmetric.com. The results show interesting patterns. Medical Sciences and Biology are found to account for more than 50% of all altmetric instances. In terms of coverage, disciplines like Biology, Medical Science and Multidisciplinary Sciences have more than 60% of their articles covered in altmetric sources, whereas disciplines like Engineering, Mathematics and Material Science have less than 25% of their articles covered. Coverage percentages further vary across altmetric platforms, with Twitter and Mendeley having much higher overall coverage than Facebook and News. Disciplinary variations in coverage are also found within individual altmetric platforms, ranging from 7.5% for Engineering to 55.7% for Multidisciplinary on Twitter. The paper also looks into the possible role of the source of publication in the altmetric coverage of articles. Interestingly, some journals are found to have higher altmetric coverage than the average coverage level of their discipline.
© 2019 17th International Conference on Scientometrics and Informetrics, ISSI 2019 - Proceedings. All rights reserved.

Article: Does presence of social media plugins in a journal website result in higher social media attention of its research publications? (Springer Netherlands, 2020). Mousumi Karmakar; Sumit Kumar Banshal; Vivek Kumar Singh.
Social media platforms have emerged as an important medium for the wider dissemination of research articles, with authors, readers and publishers creating different kinds of social media activity around articles. Some studies have even shown that articles that get more social media attention may get higher visibility and citations. These factors are now persuading journal publishers to integrate social media plugins into their webpages to facilitate the sharing and dissemination of articles on social media platforms. Many past studies have analyzed factors (such as journal impact factor, open access and collaboration) that may affect the social media attention of scholarly articles. However, there are no studies analyzing whether the presence of a social media plugin in a journal website results in higher social media attention for articles published in that journal. This paper aims to bridge this gap by analyzing a sufficiently large sample of 99,749 articles from 100 different journals. The results show that journals with social media plugins integrated into their webpages get significantly higher social media mentions and shares for their articles than journals without such plugins. Authors and readers visiting journal webpages appear to be major contributors to the social media activity around articles published in such journals. The results suggest that publishing houses should actively integrate social media plugins into their journal webpages to increase the social media visibility (altmetric impact) of their articles.
© 2020, Akadémiai Kiadó, Budapest, Hungary.

Erratum: Erratum to: Designing a Composite Index for research performance evaluation at the national or regional level: ranking Central Universities in India (Scientometrics, (2016), 107, (1171-1193), 10.1007/s11192-016-1935-0) (Springer Netherlands, 2016). Aparna Basu; Sumit Kumar Banshal; Khushboo Singhal; Vivek Kumar Singh.
In the original publication of the article, in Table 1, the two entries of the last column “Date of inception/conversion” for Hemwati Nandan Bahuguna University (HNBGU) and Central University of Himachal Pradesh (CUHP) were erroneously interchanged. The correct date of inception of HNBGU is 1973/2009 and of CUHP is 2009. The correct Table 1 is provided in this erratum.
© Akadémiai Kiadó, Budapest, Hungary 2016.

Article: How much research output from India gets social media attention? (Indian Academy of Sciences, 2019). Sumit Kumar Banshal; Vivek Kumar Singh; Pranab K. Muhuri; Philipp Mayr.
Scholarly articles are now increasingly mentioned and discussed on social media platforms, sometimes even as pre- or post-print uploads. Measures of social media mentions and coverage are emerging as alternative indicators of the impact of scholarly articles. This article explores how much scholarly research output from India is covered on different social media platforms, and how similar to or different from the world average it is. It also analyses the discipline-wise variations in coverage and altmetric attention for Indian research output, including a comparison with the world average. The results show interesting patterns. Only 28.5% of the total research output from India is covered on social media platforms, which is about 18% less than the world average. ResearchGate and Mendeley are the most popular social media platforms in India for scholarly article coverage. In terms of discipline-wise variation, medical sciences and biological sciences have relatively higher coverage across different platforms compared to disciplines like information science and engineering.
© 2019 Current Science Association, Bengaluru.

Conference Paper: Measuring altmetric events: the need for longer observation period and article level computations (International Society for Scientometrics and Informetrics, 2021). Mousumi Karmakar; Sumit Kumar Banshal; Vivek Kumar Singh.
[No abstract available]

Article: Measuring altmetric events: the need for longer observation period and article level computations (Emerald Publishing, 2025). Mousumi Karmakar; Vivek Kumar Singh; Sumit Kumar Banshal.
Purpose: This paper explores the impact of the data observation period on the computation of altmetric measures like velocity index (VI) and half-life. It also attempts to determine whether article-level computations are better than computations on the whole of the data for such measures.
Design/methodology/approach: The complete publication records for the year 2016 indexed in Web of Science, along with their altmetric data (original tweets) obtained from PlumX, are analysed. The creation dates of articles are taken from Crossref. Two time-dependent variables, namely half-life and VI, are computed. The altmetric measures are computed for all articles at different observation points, using whole-group as well as article-level averaging. Findings: The results show that the use of a longer observation period significantly changes the computed values of the different altmetric measures. Furthermore, article-level delineation is advocated for computing such measures, as a more accurate representation of the true values for the article distribution. Research limitations/implications: The analytical results show that using different observation periods changes the measured values of time-related altmetric measures. It is suggested that a longer observation period be used for the appropriate measurement of altmetric measures, and article-level delineation is advocated as a more accurate way to capture the true values of such measures. Practical implications: The work suggests that altmetric mentions accrue over a longer period than the commonly believed short life span, and that altmetric measurement should therefore not be limited to early accrued data. Social implications: The study indicates that the use of altmetric measures for research evaluation or other purposes should be based on data for a longer observation period, and article-level delineation may be preferred. It contradicts the common belief that tweet accumulation about scholarly articles decays quickly. Originality/value: Several studies have shown that altmetric data correlate well with citations and hence that early altmetric counts can be used to predict future citations. Inspired by these findings, the majority of monitoring and measuring exercises have focused mainly on capturing immediate altmetric event data just after the publication of a paper. This paper demonstrates the impact of the observation period and of article-level aggregation on such computations, and suggests using a longer observation period and article-level delineation. To the best of the authors’ knowledge, this is the first study of its kind and presents novel findings.
© 2023, Emerald Publishing Limited.

Article: Power Laws in altmetrics: An empirical analysis (Elsevier Ltd, 2022). Sumit Kumar Banshal; Solanki Gupta; Hiran H. Lathabai; Vivek Kumar Singh.
Power laws are a characteristic distribution found in both natural and man-made systems. Previous studies have shown that citations to scientific articles follow a power law, i.e., the number of papers having a certain level of citation x is proportional to x raised to some negative power. However, the distributional character of altmetrics (such as reads, likes, mentions, etc.) has not been studied in much detail, particularly with respect to the existence of power-law behaviour. This article therefore presents an empirical analysis of the altmetric mention data of a large set of scholarly articles to see whether they exhibit a power law. The individual and composite data series of mentions on the various platforms are fit to a power-law distribution, and the parameters and goodness of fit are determined using both least-squares regression and the Maximum Likelihood Estimate (MLE) approach. We also explore the fit of the mention data to other distribution families, such as the log-normal and exponential distributions. The results confirm the existence of power-law behaviour in social media mentions to scholarly articles. The log-normal distribution also looks plausible but is not found to be statistically significant, and the exponential distribution does not show a good fit. Major implications of power laws in altmetrics are given, and interesting research questions are posed in pursuit of enhancing the reliability of altmetrics for research evaluation purposes.
© 2022 Elsevier Ltd. All rights reserved.

Article: Research performance of central universities in India (Indian Academy of Sciences, 2017). Marisha; Sumit Kumar Banshal; Vivek Kumar Singh.
This article presents the research performance of the 39 central universities in India. Their research publication data indexed in the Web of Science over a 25-year period (1990-2014) are used for the analysis. The data are computationally analysed to identify productivity, productivity per capita, productivity per crore rupees of grant, rate of growth of research output, authorship and collaboration patterns, citation impact and discipline-wise research strengths of these institutions. The research performance of the central universities is measured and compared with that of two top-ranking world universities, namely the University of Cambridge and Stanford University. While older, well-established big universities such as the University of Delhi and Banaras Hindu University perform better than newer universities, some relatively smaller universities, such as the University of Hyderabad, have impressive research performance. What is disturbing is that the combined research output of all central universities taken together is less than that of either the University of Cambridge or Stanford University alone.
The results also provide the discipline-wise research strengths of all the universities.

Article: Research performance of Indian Institutes of Technology (Indian Academy of Sciences, 2017). Sumit Kumar Banshal; Vivek Kumar Singh; Aparna Basu; Pranab Kumar Muhuri.
This article presents a computational analysis of the research performance of the 16 relatively older Indian Institutes of Technology (IITs). The research publication data indexed in the Web of Science for all 16 IITs are used for the analysis. The data are computationally analysed to identify productivity, productivity per capita, rate of growth of research output, authorship and collaboration patterns, citation impact and the discipline-wise research strengths of the different IITs. The research performance of the IITs is compared with that of two top-ranking engineering and technology institutions of the world (MIT, USA and NTU, Singapore), and the most cited papers from these IITs are also identified. The analytical results are expected to provide an informative, up-to-date and useful account of research performance assessment of the IITs.

Article: Research performance of the National Institutes of Technology in India (Indian Academy of Sciences, 2018). Sumit Kumar Banshal; Tanu Solanki; Vivek Kumar Singh.
This article presents a bibliometric assessment of the research performance of the National Institutes of Technology (NITs) in India. While many of these institutes were originally established in the 1960s as Regional Engineering Colleges (RECs), they were upgraded to NITs around 2002 and later. Initially, the NITs offered only undergraduate programmes in engineering; however, during the last decade, several NITs have started postgraduate teaching and are focusing more on research activities. It is in this context that this article assesses the research performance of the NITs during 2005-2016. The performance assessment uses research publication data obtained from the Web of Science index. The data collected are computationally analysed to identify productivity, productivity per capita, rate of growth of research, international collaboration patterns, citation impact and the discipline-wise distribution of the research output of the NITs. The performance of the NITs is also viewed vis-à-vis two top-performing Indian institutions, namely the Indian Institute of Science, Bengaluru and the Indian Institute of Technology Bombay, Mumbai. A simple single-value composite ranking of the research performance of the NITs is also presented by combining quantity and quality factors. The study presents an informative and useful account of the assessment of research work in the NITs.
© 2018, Indian Academy of Sciences.

Letter: Response to the Letter to the Editor by Gangan Prathap on the article: Designing a composite index for research performance evaluation at the national or regional level: ranking Central Universities in India (Springer Netherlands, 2016). Aparna Basu; Sumit Kumar Banshal; Khushboo Singhal; Vivek Kumar Singh.
[No abstract available]

Conference Paper: Scientific vs. public attention: A comparison of top cited papers in WoS and top papers by altmetric score (Springer Verlag, 2018). Sumit Kumar Banshal; Aparna Basu; Vivek Kumar Singh; Pranab K. Muhuri.
Alternative metrics, or altmetrics, are article-level metrics that have been used to quantify the attention given to scholarly papers in online fora or social media. In a way, they were proposed as a replacement for journal-level metrics such as the Impact Factor, and also to see whether altmetric scores could predict highly cited papers. Early studies, done soon after altmetrics were proposed in 2010, were somewhat premature, as the use of social media was not then widely prevalent, and their results may no longer be relevant now that usage levels have risen significantly. In 2016, Altmetric.com tracked over 17 million mentions of 2.7 million different research outputs. Of these, the top 100 most-discussed journal articles of 2016 were presented with details of the journals in which the research was published, the affiliations of authors, fields, countries, etc. Our attempt is to obtain a bibliometric profile of these papers with high altmetric scores, to identify the underlying patterns and characteristics that motivate high public attention. In parallel, we have analyzed the top 100 highly cited papers from the same year. Our objective in this empirical study is to examine the similarities and distinguishing features of scientific attention, as measured by citations, and public attention in online fora. A significant finding is that there is very little overlap between the most highly cited papers and those that received the highest altmetric scores.
© Springer Nature Singapore Pte Ltd. 2018.
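The overlap finding in the last abstract above rests on a simple set intersection between two top-N lists. A minimal sketch of that computation follows; the function name and the DOI-like identifiers are illustrative stand-ins, not the paper's actual data or code:

```python
def overlap(top_cited, top_altmetric):
    """Return the items common to two top-N lists and the overlap percentage."""
    common = set(top_cited) & set(top_altmetric)
    pct = 100 * len(common) / len(top_cited) if top_cited else 0.0
    return sorted(common), pct

# Hypothetical identifiers standing in for the two top-100 lists
top_cited = ["10.1/a", "10.1/b", "10.1/c", "10.1/d"]
top_altmetric = ["10.1/c", "10.1/e", "10.1/f", "10.1/g"]

common, pct = overlap(top_cited, top_altmetric)
# Only "10.1/c" appears in both lists, i.e. a 25% overlap
```

A low percentage here, computed over the real top-100 lists, is what the paper reports as the small overlap between high citation and high altmetric attention.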
