12-04-2017 | Aditya Devarakonda: Communication-Avoiding Algorithms

High Performance Computing Incubator Speaker Series: Seminar #1: Communication-Avoiding Algorithms

Speaker: Aditya Devarakonda, Ph.D. Candidate, University of California, Berkeley

Date: Monday, December 4, 2017

Time: 10:00 AM – 11:00 AM

Location: NASA IESB (Building 2102), Room 114

Abstract: The cost of communication in modern parallel and sequential computer architectures is a primary performance bottleneck. Recent work has shown that communication-avoiding algorithms can be designed for most of numerical linear algebra and, more recently, for optimization methods in machine learning. In this talk, I will cover the state-of-the-art in the theory and practice of communication-avoiding algorithms and illustrate that, by reducing communication, we can obtain faster and more scalable algorithms. Finally, I will present current work, open problems, and challenges to extending this work to other domains.
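As a rough illustration of the kind of communication reduction the abstract refers to, the sketch below (Python with mpi4py; the function names, data layout, and use of mpi4py are illustrative assumptions, not the speaker's implementation) contrasts paying one collective operation per inner product with packing several inner products into a single collective:

# Illustrative sketch only: assumes mpi4py and an MPI launcher (e.g. mpirun).
# Each rank owns a row block of X and y. Computing s inner products naively
# costs s allreduces; blocking them costs a single allreduce.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD

def inner_products_naive(X_local, y_local):
    # One communication round (allreduce) per column: s messages in total.
    return [comm.allreduce(X_local[:, j] @ y_local) for j in range(X_local.shape[1])]

def inner_products_blocked(X_local, y_local):
    # Do all local work first, then one allreduce carrying all s partial sums.
    partial = X_local.T @ y_local
    return comm.allreduce(partial)

if __name__ == "__main__":
    rng = np.random.default_rng(comm.Get_rank())
    X_local = rng.standard_normal((1000, 8))  # this rank's row block
    y_local = rng.standard_normal(1000)
    results = inner_products_blocked(X_local, y_local)
    if comm.Get_rank() == 0:
        print(results)

Both versions do the same arithmetic; the blocked version simply trades s small messages for one larger one, which is the basic trade-off communication-avoiding algorithms exploit.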


High Performance Computing Incubator Speaker Series: Seminar #2: Communication-Avoiding Algorithms for Krylov Subspace Methods

Speaker: Aditya Devarakonda, Ph.D. Candidate, University of California, Berkeley

Date: Monday, December 4, 2017

Time: 1:00 PM – 2:00 PM

Location: NASA IESB (Building 2102), Room 114

Abstract: Krylov subspace methods (KSMs) are popular iterative methods that are widely used in scientific computing. Parallel versions of these methods require communication at every iteration, and on modern distributed-memory parallel machines this communication cost is often the primary performance bottleneck. Communication-avoiding KSMs (CA-KSMs) avoid this communication for s consecutive iterations at a time, where s is a tunable parameter. This talk will cover the derivation, numerical stability, and performance of CA-KSMs.
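As a minimal sketch of the s-step idea described above (illustrative only: the function name is hypothetical, and practical CA-KSMs use better-conditioned polynomial bases such as Newton or Chebyshev rather than the monomial basis shown here), the following Python/NumPy function builds s+1 Krylov basis vectors in one pass, which is the role the matrix powers kernel plays in replacing s per-iteration communication rounds with one:

import numpy as np

def krylov_basis(A, v, s):
    # Compute the monomial Krylov basis [v, A v, ..., A^s v].
    # In a CA-KSM this block of basis vectors is generated together, so the
    # communication normally incurred at every iteration is paid once per s steps.
    V = np.empty((len(v), s + 1))
    V[:, 0] = v
    for j in range(s):
        V[:, j + 1] = A @ V[:, j]
    return V

# Example: 5 basis vectors for a small tridiagonal matrix.
A = np.diag(2.0 * np.ones(6)) - np.diag(np.ones(5), 1) - np.diag(np.ones(5), -1)
v = np.ones(6) / np.sqrt(6)
V = krylov_basis(A, v, s=4)
print(V.shape)  # (6, 5)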

Biography: Aditya Devarakonda is a Ph.D. Candidate in Computer Science at the University of California, Berkeley, where he is co-advised by James Demmel and Michael W. Mahoney. His thesis research extends the theory and practice of communication-avoiding algorithms to optimization methods in machine learning. He is a recipient of a National Science Foundation Graduate Research Fellowship and a University of California, Berkeley EECS Fellowship.