This dissertation develops distributed algorithms that solve general convex optimization problems over centralized and decentralized networks. First, we study a centralized algorithm for parameter-server networks that is robust to the straggler problem. We prove that the server nodes' estimates converge to the minimizer of the global objective function even when up to an allowed number of worker connections straggle. We then show that convergence is also attained in two further scenarios: using only the local gradients actually received, or substituting delayed local gradients for those not received. Convergence rates for all of these scenarios are established and verified through numerical simulations. Next, we design a synchronous distributed algorithm for decentralized networks that achieves improved convergence rates through an appropriate coding scheme, and we prove its convergence over both static and time-varying networks consistent with that scheme. The convergence rate is derived explicitly, and the criteria governing its behavior are analyzed; the algorithm's performance is again verified through numerical simulations. Finally, we present a further distributed algorithm that serves as a default form for distributed coded gradient algorithms, and we analyze both its convergence and its convergence rate to better understand the general behavior of such algorithms.
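To make the straggler-robust aggregation idea concrete, the following is a minimal sketch of gradient coding in the style the abstract describes. It is a hypothetical illustration, not the dissertation's exact scheme: three workers each send one coded combination of three partition gradients, and the server can recover the full gradient sum from any two responses, tolerating s = 1 straggler.

```python
# Toy gradient-coding sketch (hypothetical illustration): 3 workers,
# 3 data partitions, robust to any single straggler. The encoding and
# decoding coefficients follow the classic 3-worker example.

def encode(g1, g2, g3):
    """Each worker sends one coded linear combination of partition gradients."""
    return {
        1: 0.5 * g1 + g2,   # worker 1's coded message
        2: g2 - g3,         # worker 2's coded message
        3: 0.5 * g1 + g3,   # worker 3's coded message
    }

# For each possible pair of surviving (non-straggling) workers, these
# coefficients combine their messages to recover g1 + g2 + g3 exactly.
DECODE = {
    frozenset({1, 2}): {1: 2.0, 2: -1.0},
    frozenset({1, 3}): {1: 1.0, 3: 1.0},
    frozenset({2, 3}): {2: 1.0, 3: 2.0},
}

def aggregate(coded, survivors):
    """Server-side decoding: recover the full gradient sum from any 2 workers."""
    coeffs = DECODE[frozenset(survivors)]
    return sum(coeffs[w] * coded[w] for w in survivors)
```

Whichever single worker straggles, `aggregate` returns the same full gradient sum, so the server's update step never waits on the slowest node.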
Doctor of Philosophy (Ph.D.)
College of Engineering and Computer Science
Electrical and Computer Engineering
Doctoral Dissertation (Open Access)
Atallah, Elie, "Stragglers-Robust First-Order Distributed Optimization Algorithms Over Networks Utilizing Gradient Coding" (2019). Electronic Theses and Dissertations. 6843.