Browsing by Author "D.R. Sahu"
Now showing 1 - 20 of 85
Article: A derivative free projection method for the singularities of vector fields with convex constraints on Hadamard manifolds (Taylor and Francis Ltd., 2024)
Authors: D.R. Sahu; Shikher Sharma
The objective of this paper is to introduce a derivative free projection method designed to find the singularities of pseudomonotone vector fields with convex constraints on Hadamard manifolds. This innovative approach combines the hyperplane projection method with a novel search direction. The global convergence of the proposed method is established under certain conditions. Our method improves some existing results in the literature on Hadamard manifolds. Additionally, illustrative numerical examples are provided to demonstrate the practical efficacy of our method. © 2024 Informa UK Limited, trading as Taylor & Francis Group.

Article: A general implicit iteration for finding fixed points of nonexpansive mappings (International Scientific Research Publications, 2016)
Authors: D.R. Sahu; Shin Min Kang; Ajeet Kumar; Sun Young Cho
The aim of the paper is to construct an iterative method for finding the fixed points of nonexpansive mappings. We introduce a general implicit iterative scheme for finding an element of the set of fixed points of a nonexpansive mapping defined on a nonempty closed convex subset of a real Hilbert space. The strong convergence theorem for the proposed iterative scheme is proved under certain assumptions imposed on the sequence of parameters. Our results extend and improve the results given by Ke and Ma [Y. Ke, C. Ma, Fixed Point Theory Appl., 2015 (2015), 21 pages], Xu et al. [H. K. Xu, M. A. Alghamdi, N. Shahzad, Fixed Point Theory Appl., 2015 (2015), 12 pages], and many others. © 2016 all rights reserved.

Article: A generalized hybrid steepest descent method and applications (Biemdas Academic Publishers, 2017)
Authors: D.R. Sahu; J.C. Yao
The purpose of this paper is to investigate a generalized hybrid steepest descent method and develop a convergence theory for solving a monotone variational inequality over the fixed point set of a mapping which is not necessarily Lipschitz continuous. Using this result, we consider the convex minimization problem for a continuously differentiable convex function whose gradient is not necessarily Lipschitzian. © 2017 Journal of Nonlinear and Variational Analysis

Article: A generalized hybrid steepest-descent method for variational inequalities in Banach spaces (2011)
Authors: N.C. Wong; D.R. Sahu; J.C. Yao
The hybrid steepest-descent method introduced by Yamada (2001) is an algorithmic solution to the variational inequality problem over the fixed point set of a nonlinear mapping, applicable to a broad range of convexly constrained nonlinear inverse problems in real Hilbert spaces. Lehdili and Moudafi (1996) introduced the prox-Tikhonov regularization method for the proximal point algorithm to generate a strongly convergent sequence, and established a convergence property for it by using the technique of variational distance in Hilbert spaces. In this paper, motivated by Yamada's hybrid steepest-descent algorithm and Lehdili and Moudafi's algorithm, a generalized hybrid steepest-descent algorithm is proposed for computing the solutions of the variational inequality problem over the common fixed point set of a sequence of nonexpansive-type mappings in the framework of Banach spaces. The strong convergence of the proposed algorithm to the solution is guaranteed under some assumptions. Our strong convergence theorems extend and improve certain corresponding results in the recent literature. Copyright 2011 D. R. Sahu et al.

Article: A new iteration technique for nonlinear operators as concerns convex programming and feasibility problems (Springer, 2020)
Authors: D.R. Sahu; A. Pitea; M. Verma
The aim of this work is to develop an S-iteration technique for finding common fixed points of nonself quasi-nonexpansive mappings in the framework of a uniformly convex Banach space. Convergence properties of the proposed algorithm are analyzed in the setting of uniformly convex Banach spaces. To prove the usability of our results, some novel applications are provided, focused on zeros of accretive operators, convex programming, and feasibility problems. Some numerical experiments with real datasets for Lasso problems are provided. © 2019, Springer Science+Business Media, LLC, part of Springer Nature.

Article: A Newton-like method for generalized operator equations in Banach spaces (Kluwer Academic Publishers, 2014)
Authors: D.R. Sahu; Krishna Kumar Singh; Vipin Kumar Singh
In this paper, we are concerned with the semilocal convergence analysis of a Newton-like method discussed by Bartle (Amer Math Soc 6: 827–831, 1955) to solve generalized operator equations containing a nondifferentiable term in Banach spaces. This method has also been studied by Rheinboldt (SIAM J Numer Anal 5: 42–63, 1968). The aim of the paper is to discuss the convergence analysis under a local Lipschitz condition; our results extend and improve the previous ones in the sense of local Lipschitz conditions. We apply our results to solve Fredholm-type operator equations. © 2014, Springer Science+Business Media New York.

Article: A Newton-like method for solving generalized operator equations and variational inequalities (Yokohama Publications, 2015)
Authors: D.R. Sahu; K.K. Singh; V.K. Singh; Y.J. Cho
In this paper, we present a semilocal convergence analysis of a Newton-like method for solving generalized operator equations in Hilbert spaces and also discuss the convergence analysis of the proposed algorithm under weak conditions. We establish sharp generalizations of the Kantorovich theory for operator equations when the derivative is not necessarily invertible.
As a simple consequence of our result, we discuss the existence and uniqueness of solutions of mixed variational inequality problems. Finally, we give numerical examples for equations involving single-valued as well as multi-valued mappings. © 2015.

Article: A third order Newton-like method and its applications (MDPI AG, 2018)
Authors: D.R. Sahu; Ravi P. Agarwal; Vipin Kumar Singh
In this paper, we design a new third order Newton-like method and establish its convergence theory for finding the approximate solutions of nonlinear operator equations in the setting of Banach spaces. First, we discuss the convergence analysis of our third order Newton-like method under the ω-continuity condition. Then we apply our approach to solve nonlinear fixed point problems and Fredholm integral equations, where the first derivative of an involved operator does not necessarily satisfy the Hölder and Lipschitz continuity conditions. Several numerical examples are given, which compare the applicability of our convergence theory with the ones in the literature. © 2018 by the authors.

Article: A unified framework for three accelerated extragradient methods and further acceleration for variational inequality problems (Springer Science and Business Media Deutschland GmbH, 2023)
Authors: D.R. Sahu
The main strategy of this paper is to speed up the convergence of the inertial Mann iterative method, and to accelerate it further through the normal S-iterative method, for a certain class of nonexpansive-type operators that are linked with variational inequality problems. Our new convergence theory permits us to resolve the difficulty of unifying Korpelevich's extragradient method, Tseng's extragradient method, and the subgradient extragradient method for solving variational inequality problems through an auxiliary algorithmic operator, which is associated with the seed operator.
The paper establishes the interesting fact that the relaxed inertial normal S-iterative extragradient methods have a considerably stronger influence on convergence behaviour. Finally, numerical experiments are carried out to illustrate that the relaxed inertial iterative methods, in particular the relaxed inertial normal S-iterative extragradient methods, may have a number of advantages over other methods in computing solutions to variational inequality problems in many cases. © 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.

Article: A unified hybrid iterative method for hierarchical minimization problems (2013)
Authors: D.R. Sahu; Q.H. Ansari; J.C. Yao
In this paper, we introduce and analyze a new unified hybrid iterative method to compute the approximate solution of the general optimization problem defined over the set D = Fix(T) ∩ Ω[GMEP(Φ,Ψ,φ)], where Fix(T) is the set of common fixed points of a family T = {T(t) : 0 ≤ t < ∞} of nonexpansive self-mappings on a Hilbert space H, and Ω[GMEP(Φ,Ψ,φ)] is the set of solutions of the generalized mixed equilibrium problem (in short, GMEP). This type of minimization problem is called a hierarchical minimization problem. We establish the strong convergence of the sequences generated by the proposed algorithm. Our strong convergence theorem extends, improves and unifies the previously known results in the literature. We also give a numerical example to illustrate our algorithm and results. © 2013 Elsevier B.V. All rights reserved.

Article: A unified hybrid iterative method for solving variational inequalities involving generalized pseudocontractive mappings (2012)
Authors: D.R. Sahu; N.C. Wong; J.C. Yao
We study in this paper the existence and the approximation of solutions of variational inequalities involving generalized pseudocontractive mappings in Banach spaces.
The convergence analysis of a proposed hybrid iterative method for approximating common zeros or fixed points of a possibly infinitely countable or uncountable family of such operators will be conducted within the conceptual framework of the "viscosity approximation technique" in reflexive Banach spaces with uniformly Gâteaux differentiable norms. This technique should make existing or new results in solving variational inequalities more applicable. © 2012 Society for Industrial and Applied Mathematics.

Article: Accelerated iterative splitting methods on Hadamard manifolds (Taylor and Francis Ltd., 2024)
Authors: D.R. Sahu; Shikher Sharma; J.C. Yao; Xiaopeng Zhao
This paper aims to solve the monotone inclusion problem, the minimization problem for a sum of multiple functions, and the generalized Heron problem. We present an innovative approach, the modified normal S-iteration method, designed to approximate common fixed points of nearly nonexpansive sequences and families of operators via the property (Formula presented.). Some deductions of our results improve some existing results in the literature. To show the applicability of our results, we give an application to the inclusion problem via a forward–backward splitting version of our algorithm, and to the minimization problem via a Douglas–Rachford splitting version of our algorithm. To demonstrate the practical utility of the algorithm, we apply it to the generalized Heron problem. © 2024 Informa UK Limited, trading as Taylor & Francis Group.

Article: Accessibility of solutions of operator equations by Newton-like methods (Academic Press Inc., 2015)
Authors: D.R. Sahu; Y.J. Cho; R.P. Agarwal; I.K. Argyros
The concept of a majorizing sequence, introduced and applied by Rheinboldt in 1968, is taken up to develop a convergence theory of the Picard iteration xn+1 = G(xn), n ≥ 0, for fixed points of an iteration mapping G : D0 ⊂ X → X in a complete metric space X satisfying the iterated contraction-like condition d(G(y), G(x)) ≤ ψ(d(y, x), d(y, x0), d(x, x0)) d(y, x) for all x ∈ D0 with y = G(x) ∈ D0, where x0 ∈ D0 and ψ ∈ Φ(3); here Φ(3) is a suitable set of functions on (ℝ+)3, defined in Section 2 of the paper. We study the region of accessibility of fixed points of G by the Picard iteration un+1 = G(un), where the starting point u0 ∈ D0 is not necessarily x0. Our convergence theory is applied to Newton-like iterations in Banach spaces under the center Lipschitz condition ∥F′x − F′x0∥ ≤ ω(∥x − x0∥) for a given point x0 ∈ D0. Our results extend and improve the previous ones in the sense of the center Lipschitz condition and the region of accessibility of solutions. We apply our results to solve nonlinear Fredholm operator equations of the second kind. © 2015 Elsevier Inc.

Article: Altering points and applications (Touch Briefings, 2014)
Authors: D.R. Sahu
It is well known that the rate of convergence of the S-iteration process introduced by Agarwal et al. [J. Nonlinear Convex Anal., 8 (1) (2007), 61-79] is faster than that of the Picard iteration process for contraction operators. Following the ideas of the S-iteration process, we introduce a parallel S-iteration process for finding altering points of nonlinear operators. We apply our algorithms to solve a system of operator equations in the Banach space setting. This work also includes a convergence analysis of a hybrid steepest-descent-like method and a hybrid Newton-like method in the context of altering points. © CSP - Cambridge, UK; I&S - Florida, USA, 2014.

Article: An accelerated forward-backward splitting algorithm for solving inclusion problems with applications to regression and link prediction problems (Biemdas Academic Publishers, 2021)
Authors: A. Dixit; D.R. Sahu; P. Gautam; T. Som; J.C. Yao
The forward-backward method is a very popular approach to solving composite inclusion problems. In this paper, we propose a novel accelerated forward-backward algorithm to find a zero of the sum of two operators, of which one is maximal monotone and the other is M-cocoercive, where M is a bounded linear operator on the underlying spaces. Our proposed algorithm is more general than previously known algorithms. We study the convergence behavior of the proposed algorithm under mild assumptions in the framework of real Hilbert spaces. We employ our model to solve regression problems and link prediction problems for high-dimensional datasets and conduct numerical experiments to support our results. This model improves convergence speed and accuracy in the respective problems. © 2021 Journal of Nonlinear and Variational Analysis

Article: An extragradient iterative scheme for common fixed point problems and variational inequality problems with applications (Ovidius University, 2015)
Authors: Adrian Petruşel; D.R. Sahu; Vidya Sagar
In this paper, by combining a modified extragradient scheme with the viscosity approximation technique, an iterative scheme is developed for computing a common element of the set of fixed points of a sequence of asymptotically nonexpansive mappings and the set of solutions of the variational inequality problem for an α-inverse strongly monotone mapping. We prove a strong convergence theorem for the sequences generated by this scheme and give some applications of our convergence theorem.

Article: An extragradient iterative scheme for common fixed point problems and variational inequality problems with applications (Sciendo, 2015)
Authors: Adrian Petruşel; D.R. Sahu; Vidya Sagar
In this paper, by combining a modified extragradient scheme with the viscosity approximation technique, an iterative scheme is developed for computing a common element of the set of fixed points of a sequence of asymptotically nonexpansive mappings and the set of solutions of the variational inequality problem for an α-inverse strongly monotone mapping. We prove a strong convergence theorem for the sequences generated by this scheme and give some applications of our convergence theorem. © 2015 Sciendo. All rights reserved.

Article: Application of a new accelerated algorithm to regression problems (Springer, 2020)
Authors: Avinash Dixit; D.R. Sahu; Amit Kumar Singh; T. Som
Many iterative algorithms, such as the Picard, Mann and Ishikawa iterations, are very useful for solving fixed point problems of nonlinear operators in real Hilbert spaces. A recent trend is to enhance their convergence rate by using inertial terms. The purpose of this paper is to investigate a new inertial iterative algorithm for finding the fixed points of nonexpansive operators in the framework of Hilbert spaces. We study the weak convergence of the proposed algorithm under mild assumptions. We apply our algorithm to design a new accelerated proximal gradient method. This new proximal gradient technique is applied to regression problems. Numerical experiments have been conducted for regression problems with several publicly available high-dimensional datasets, comparing the proposed algorithm with already existing algorithms on the basis of their performance in accuracy and objective function values. Results show that our proposed algorithm outperforms the other algorithms, while keeping the iteration parameters unchanged.
© 2019, Springer-Verlag GmbH Germany, part of Springer Nature.

Article: Application of new strongly convergent iterative methods to split equality problems (Springer Science and Business Media Deutschland GmbH, 2020)
Authors: Pankaj Gautam; Avinash Dixit; D.R. Sahu; T. Som
In this paper, we study the generalized problem of the split equality variational inclusion problem. For this purpose, we introduce the problem of finding a zero of a nonnegative lower semicontinuous function over the common solution set of a fixed point problem and a monotone inclusion problem. We propose and study the convergence behaviour of different iterative techniques to solve the generalized problem. Furthermore, we study an inertial form of the proposed algorithm and compare the convergence speeds. Numerical experiments have been conducted to compare the convergence speed of the proposed algorithm, its inertial form, and already existing algorithms for solving the generalized problem. © 2020, SBMAC - Sociedade Brasileira de Matemática Aplicada e Computacional.

Article: Applications of a variable anchoring iterative method to equation and inclusion problems on Hadamard manifolds (Elsevier B.V., 2024)
Authors: D.R. Sahu; Ariana Pitea; Shikher Sharma; Amit Kumar Singh
In this paper, we introduce a new iterative technique with a variable anchoring operator for computing the solution of a variational inequality problem over the set of common fixed points of a nearly nonexpansive sequence of operators in the framework of Hadamard manifolds. We also establish a convergence result for the proposed algorithm for approximating a solution of the problem, under suitable assumptions. We apply our results to finding the solutions of a system of nonlinear equations and of inclusion problems, to support their utility. Our work improves results in the recent literature. Numerical simulations are given for a better understanding of the effectiveness of our outcomes. © 2024 Elsevier B.V.
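Many of the entries above concern iterative approximation of fixed points of nonexpansive mappings (Mann-type and S-iteration schemes). As a minimal generic sketch of the averaged iteration underlying these methods, and not the specific scheme of any paper listed here, the classical Krasnoselskii–Mann iteration can be written in a few lines of Python; the example map math.cos and the step alpha=0.5 are illustrative choices only.

```python
import math

def krasnoselskii_mann(T, x0, alpha=0.5, n_iters=100):
    """Krasnoselskii-Mann iteration: x_{n+1} = (1 - alpha) x_n + alpha T(x_n).
    For a nonexpansive map T, this averaged scheme converges to a fixed point."""
    x = x0
    for _ in range(n_iters):
        x = (1 - alpha) * x + alpha * T(x)
    return x

# math.cos is nonexpansive on the reals (|cos'| <= 1); its unique fixed
# point is the Dottie number, approximately 0.739085.
x_star = krasnoselskii_mann(math.cos, x0=0.0)
```

The averaging parameter alpha trades per-step progress for stability: any fixed alpha in (0, 1) guarantees convergence for nonexpansive maps, where the plain Picard iteration may fail to converge.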
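Several entries build on Korpelevich's extragradient method for variational inequalities VI(F, C): find x in C with <F(x), y - x> >= 0 for all y in C. The following is a textbook sketch of that classical method, not the accelerated or relaxed inertial variants studied in the papers above; the operator F(x) = x - b and the nonnegative-orthant constraint are illustrative assumptions.

```python
def project_nonneg(x):
    # Euclidean projection onto the nonnegative orthant C = R_+^n
    return [max(v, 0.0) for v in x]

def extragradient(F, proj, x0, tau=0.5, n_iters=200):
    """Korpelevich's extragradient method:
        y_n     = P_C(x_n - tau * F(x_n))   (predictor step)
        x_{n+1} = P_C(x_n - tau * F(y_n))   (corrector step)
    Requires F monotone and L-Lipschitz, with step tau < 1/L."""
    x = list(x0)
    for _ in range(n_iters):
        Fx = F(x)
        y = proj([xi - tau * fi for xi, fi in zip(x, Fx)])
        Fy = F(y)
        x = proj([xi - tau * fi for xi, fi in zip(x, Fy)])
    return x

# Example: F(x) = x - b is strongly monotone with L = 1; the VI solution
# over the nonnegative orthant is the projection of b, here (1, 0).
b = [1.0, -1.0]
sol = extragradient(lambda x: [xi - bi for xi, bi in zip(x, b)],
                    project_nonneg, x0=[0.5, 0.5])
```

The extra "predictor" evaluation of F is what lets the method handle merely monotone (not cocoercive) operators, which is the difficulty the Tseng and subgradient extragradient variants mentioned above also address.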
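Two of the entries apply forward-backward splitting to Lasso-type regression problems. As a generic illustration of that splitting, here is the basic proximal gradient method (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1, without the inertial acceleration those papers study; the diagonal matrix A, the data b, and the parameters lam and step are made-up toy values chosen so the solution has a closed form.

```python
import math

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1, applied componentwise
    return [math.copysign(max(abs(vi) - t, 0.0), vi) for vi in v]

def ista(A, b, lam, step, n_iters=500):
    """Forward-backward splitting (ISTA) for the Lasso problem.
    Forward step: gradient descent on the smooth term 0.5*||Ax - b||^2;
    backward step: proximal (soft-thresholding) step on lam*||x||_1.
    Requires step <= 1 / ||A^T A||."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(n_iters):
        Ax = [sum(aij * xj for aij, xj in zip(row, x)) for row in A]
        r = [axi - bi for axi, bi in zip(Ax, b)]               # residual Ax - b
        grad = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]  # A^T r
        x = soft_threshold([xj - step * gj for xj, gj in zip(x, grad)],
                           step * lam)
    return x

# Diagonal example: A = diag(1, 2), b = (3, 4), lam = 1. The problem
# separates per coordinate, giving the closed-form solution x* = (2, 1.75).
x_hat = ista([[1.0, 0.0], [0.0, 2.0]], [3.0, 4.0], lam=1.0, step=0.25)
```

Inertial/accelerated variants of this scheme, as in the papers above, add a momentum term built from the previous two iterates before the forward-backward step.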
