Let's Agree on Computing Flops for the Symmetric Sparse Matrix Vector Product
Keywords
Compressed sparse row; Performance metrics; Sparse matrix-vector multiplication; Symmetric sparse matrices
Abstract
It is well known that the sparse matrix-vector product Ax requires two floating-point operations for each nonzero element of A. However, when counting flops for the symmetric sparse matrix-vector product (Sym-SpMV), some subtleties must be considered, because the number of nonzero (nnz) elements reported for symmetric sparse matrices varies from one research work to another. Matrices are generally chosen from the University of Florida Sparse Matrix Collection, and for symmetric matrices in this collection the nnz count in the matrix profile differs from the nnz count stored in the file. Consequently, two works using similar algorithms may report different nnz counts for the same matrix, because one uses the nnz elements listed in the matrix profile while the other reports the nnz elements in storage (the lower triangular part), which is misleading. When symmetry is exploited, counting four floating-point operations for each off-diagonal nonzero element is correct, but it is not accurate to count four floating-point operations for diagonal nonzero coefficients, since these produce only one or two floating-point operations per nonzero, depending on how the diagonal is handled. Furthermore, there are symmetric matrices whose diagonal is not dense or whose diagonal elements are all zero. We evaluate three algorithms proposed in the literature for computing the symmetric SpMV and observe their behavior on a small set of symmetric sparse matrices of different types. Based on our experimental results, we propose a more accurate way to measure flops for the symmetric sparse matrix-vector product. Finally, we show that one algorithm can run faster than another while performing fewer flops than the slower one.
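To make the counting convention concrete, the following is a minimal C sketch, not the paper's code, of a Sym-SpMV kernel that operates on the lower triangle stored in compressed sparse row (CSR) format; the function name sym_spmv_lower_csr, the array names, and the example matrix are hypothetical. With the fused update shown, each stored diagonal nonzero costs two flops and each stored off-diagonal nonzero costs four (its contribution is mirrored to both y[i] and y[j]), which illustrates why counting two flops per stored nonzero, or four flops per stored nonzero including the diagonal, misstates the work actually done.

#include <stdio.h>

/* Sketch: y = A*x for symmetric A, with only the lower triangle
 * (diagonal included) stored in CSR. Also tallies the flops performed. */
void sym_spmv_lower_csr(int n, const int *row_ptr, const int *col_idx,
                        const double *val, const double *x, double *y,
                        long *flops)
{
    long ops = 0;
    for (int i = 0; i < n; i++)
        y[i] = 0.0;

    for (int i = 0; i < n; i++) {
        for (int k = row_ptr[i]; k < row_ptr[i + 1]; k++) {
            int j = col_idx[k];
            double a = val[k];
            y[i] += a * x[j];       /* 2 flops (multiply + add) */
            ops += 2;
            if (j != i) {           /* mirror the off-diagonal entry */
                y[j] += a * x[i];   /* 2 more flops */
                ops += 2;
            }
        }
    }
    *flops = ops;
}

int main(void)
{
    /* Hypothetical 3x3 symmetric matrix
     *     [4 1 0]
     * A = [1 3 2]
     *     [0 2 5]
     * stored as its lower triangle in CSR (5 stored nonzeros,
     * 7 nonzeros in the full profile). */
    int row_ptr[] = {0, 1, 3, 5};
    int col_idx[] = {0, 0, 1, 1, 2};
    double val[]  = {4.0, 1.0, 3.0, 2.0, 5.0};
    double x[]    = {1.0, 1.0, 1.0};
    double y[3];
    long flops;

    sym_spmv_lower_csr(3, row_ptr, col_idx, val, x, y, &flops);
    /* Prints y = [5 6 7] and flops = 14, i.e. 2 per diagonal nonzero
     * plus 4 per stored off-diagonal nonzero. */
    printf("y = [%g %g %g], flops = %ld\n", y[0], y[1], y[2], flops);
    return 0;
}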
Publication Date
1-1-2016
Publication Title
Simulation Series
Volume
48
Issue
4
Pages
62-67
Document Type
Article; Proceedings Paper
Personal Identifier
scopus
Copyright Status
Unknown
Scopus ID
84977090729 (Scopus)
Source API URL
https://api.elsevier.com/content/abstract/scopus_id/84977090729
STARS Citation
Montagne, Euripides and Aymerich, Edward, "Let's Agree on Computing Flops for the Symmetric Sparse Matrix Vector Product" (2016). Scopus Export 2015-2019. 4501.
https://stars.library.ucf.edu/scopus2015/4501