
LinearAlgebra traited matrix operations #48861

Open · wants to merge 1 commit into master

Conversation

@KlausC (Contributor) commented Mar 2, 2023

The functions `*` and `\` for `AbstractMatrix`, and `lmul!`/`rmul!` and `ldiv!`/`rdiv!` for triangular matrices, are equipped with an indirection trait, which allows specialized implementations to be called even when the type of the matrices is hidden from standard dispatch by multiple wrappers.
This is an effort to remedy the "Abstract Array Fallback Trap".

This PR does not introduce new behavior, nor should it worsen runtime for the mentioned functions. But it allows other stdlibs or packages to use their own implementations for selected subtypes of `AbstractMatrix` (for example SparseArrays).
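For illustration, here is a minimal self-contained sketch of the kind of trait indirection described above. The names `AbstractStorageTrait`, `storage_trait`, and `DenseStorage` come from the PR and the discussion below, but the concrete struct layout, the wrapper handling, and the `_mul` entry point are placeholder assumptions for the example, not the PR's actual code.

```julia
using LinearAlgebra

# Trait hierarchy (name taken from the PR); concrete subtypes carry the
# wrapped object in a `data` field, as the PR's docstring requires.
abstract type AbstractStorageTrait{T} end

struct DenseStorage{T} <: AbstractStorageTrait{T}
    data::T
end

# Default: any AbstractMatrix is treated as densely stored.
storage_trait(::Type{<:AbstractMatrix}) = DenseStorage
# Wrappers forward to the storage trait of the matrix they wrap, so the
# trait survives nesting like Symmetric(transpose(A)).
storage_trait(::Type{Symmetric{T,S}}) where {T,S} = storage_trait(S)
storage_trait(::Type{Transpose{T,S}}) where {T,S} = storage_trait(S)

# Hypothetical entry point: one trait lookup, then dispatch on the trait,
# so a package only needs to add methods for its own trait type.
_mul(A::AbstractMatrix, B::AbstractMatrix) =
    _mul(storage_trait(typeof(A)), storage_trait(typeof(B)), A, B)

# Generic fallback for dense-dense products.
_mul(::Type{<:DenseStorage}, ::Type{<:DenseStorage}, A, B) = A * B

_mul(Symmetric(rand(3, 3)), rand(3, 3))  # resolves to the dense fallback
```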

@vchuravy (Member) left a comment

Very interesting idea!

In your mind, would GPUArrays define its own StorageType? Or would it also re-use DenseStorage?

(code under review, from the PR diff:)

```julia
The concrete subtypes shall always contain a field named `data` keeping the
attributed object.
"""
abstract type AbstractStorageTrait{T} end
```

@KlausC (Contributor, Author) commented Mar 2, 2023

> In your mind, would GPUArrays define its own StorageType? Or would it also re-use DenseStorage?

I am not fluent in GPUArrays, but I think all of that could be managed inside that package (see the sketch below):

- It would define a trait `GPUStorageType{T} <: AbstractStorageTrait` ...
- ... and a method `storage_trait(::Type{<:GPUArray}) = GPUStorageType`,
- and of course the new GPUArrays implementations of the various methods to be supported.
- It is also possible to fall back to the dense implementation intentionally, if that is appropriate, but all of this stays in the external package.
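Building on the sketch above, a hypothetical external package could opt in along these lines; `MyGPUMatrix` is a made-up stand-in for a real GPU array type, and the concrete `GPUStorageType` definition is only an assumption based on the comment above, not existing GPUArrays code.

```julia
# Placeholder "GPU" matrix type standing in for a real GPU array.
struct MyGPUMatrix{T} <: AbstractMatrix{T}
    data::Matrix{T}
end
Base.size(A::MyGPUMatrix) = size(A.data)
Base.getindex(A::MyGPUMatrix, i::Int, j::Int) = A.data[i, j]

# 1. The package defines its own trait ...
struct GPUStorageType{T} <: AbstractStorageTrait{T}
    data::T
end

# 2. ... registers it for its matrix type ...
storage_trait(::Type{<:MyGPUMatrix}) = GPUStorageType

# 3. ... and adds methods that dispatch on the trait.
_mul(::Type{<:GPUStorageType}, ::Type{<:GPUStorageType}, A, B) =
    MyGPUMatrix(A.data * B.data)   # stand-in for a GPU kernel launch

# Falling back to the dense path intentionally is also possible:
_mul(::Type{<:GPUStorageType}, ::Type{<:DenseStorage}, A, B) =
    _mul(DenseStorage, DenseStorage, A, B)
```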

A paradigmatic use is in JuliaSparse/SparseArrays.jl#354.

@jishnub (Contributor) commented Mar 3, 2023

This seems similar in some ways to https://github.com/JuliaLinearAlgebra/ArrayLayouts.jl. Do you envisage combining the two, such that the same traits may be used in both packages?

@KlausC (Contributor, Author) commented Mar 3, 2023

> This seems similar in some ways to https://github.com/JuliaLinearAlgebra/ArrayLayouts.jl. Do you envisage combining the two, such that the same traits may be used in both packages?

There is an essential difference: this PR redirects the standard methods `*`, `ldiv!`, ... to the correct implementation depending on the storage type of the matrix, while ArrayLayouts introduces new functions `mul`, `muladd!`, ... to gain control over the various kinds of strided matrices.
To combine both approaches, ArrayLayouts could use the features introduced in this PR to let the user call the standard functions and still obtain the implementations from that package.

(All of that can be managed inside ArrayLayouts. I think the traits defined there cannot be re-used unmodified as long as they are not subtypes of `AbstractStorageTrait`; it is probably more appropriate to add a new concrete subtype and keep the others as they are.)
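A speculative sketch of such a combination, reusing the definitions from the earlier sketches; `LayoutStorage` is a made-up name, and routing the standard entry point through ArrayLayouts like this is an assumption, not something provided by either package today.

```julia
using ArrayLayouts   # assumed available; LayoutMatrix and mul are part of its API

# One new concrete trait subtype for ArrayLayouts-managed matrices.
struct LayoutStorage{T} <: AbstractStorageTrait{T}
    data::T
end

# Register it for ArrayLayouts' LayoutMatrix subtypes ...
storage_trait(::Type{<:LayoutMatrix}) = LayoutStorage

# ... and forward the standard operation to ArrayLayouts' own machinery,
# which then selects an implementation from its layout traits.
_mul(::Type{<:LayoutStorage}, ::Type{<:LayoutStorage}, A, B) = ArrayLayouts.mul(A, B)
```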

Labels: linear algebra
4 participants