Add isunitary & isorthogonal functions to LinearAlgebra stdlib? #1098

Open
singularitti opened this issue Oct 12, 2024 · 8 comments
Labels
feature Indicates new feature / enhancement requests

Comments

@singularitti
Contributor

singularitti commented Oct 12, 2024

Would it be possible to add these? We already have something similar, like ishermitian.

@nsajko added the feature label Oct 12, 2024
@singularitti
Contributor Author

Could I help with this?

@jishnub
Collaborator

jishnub commented Oct 14, 2024

@PredictiveManish There's no formal assignment. You may submit a PR adding the implementations, along with appropriate tests.

I wonder if there is an easy way to check this without verifying U*U' == I to within some tolerance (forming U*U' is O(N^3)). Checking for symmetry, by contrast, is only O(N^2), since the elements just need to be compared.
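For concreteness, a naive tolerance-based check along those lines could look like the sketch below (the name `isunitary` and the tolerance default are illustrative assumptions, not a proposed API):

```julia
using LinearAlgebra

# Minimal sketch, not a proposed API: `isunitary` and the tolerance default
# are assumptions for illustration. This is the O(N^3) U*U' ≈ I approach.
function isunitary(U::AbstractMatrix;
                   atol::Real = sqrt(eps(real(float(one(eltype(U)))))))
    size(U, 1) == size(U, 2) || return false
    return norm(U * U' - I) <= atol
end

isunitary([0 1; 1 0])  # true
isunitary([1 1; 0 1])  # false
```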

@barucden
Contributor

barucden commented Oct 14, 2024

Is it faster to compute the product U * U' than to loop over the columns and just check dot(u, v) == 0 (or 1)? The latter could return early for non-orthogonal matrices (plus it would not allocate).

Anyway, for certain matrices (e.g., Diagonal), the check can be simpler, right?
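A rough sketch of that column-wise idea (names and tolerance again purely illustrative): it returns early as soon as one pair of columns fails, and a structured type such as Diagonal can get its own cheap method via dispatch.

```julia
using LinearAlgebra

# Sketch only: check the columns pairwise for orthonormality, returning early.
function isorthogonal(A::AbstractMatrix{<:Real};
                      atol::Real = sqrt(eps(float(one(eltype(A))))))
    size(A, 1) == size(A, 2) || return false
    n = size(A, 2)
    for j in 1:n, i in 1:j
        target = (i == j) ? 1 : 0
        abs(dot(view(A, :, i), view(A, :, j)) - target) <= atol || return false
    end
    return true
end

# A real Diagonal matrix is orthogonal iff every diagonal entry is ±1,
# so the check reduces to O(N).
isorthogonal(D::Diagonal{<:Real}; atol::Real = sqrt(eps(float(one(eltype(D)))))) =
    all(d -> abs(abs(d) - 1) <= atol, D.diag)
```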

@dkarrasch
Member

I'm not sure how these functions would be used. Symmetry (or Hermitian-ness) is checked exactly, based on the components, and we don't provide an "almost-symmetric" check, for good reason: bad things can happen when you perturb symmetric matrices out of symmetry. But what do you do with a result like "is orthogonal up to an error"? Is it "not orthogonal", or "almost orthogonal"? I believe somebody has requested adding an orthogonality check for vectors, but again, what do you do with "numerically almost orthogonal" vectors?

It's a different matter when you have an object that comes from an algorithm that is theoretically guaranteed to return orthogonal objects (like the columns of AbstractQ operators, or AbstractQs themselves), but checking arbitrary matrices or vectors for being "sort of orthogonal"? I'm not sure.

I think one would need more background and a very good reason to add such fragile functions, otherwise it could very well be implemented in a few lines in a package.

@singularitti
Contributor Author

I've been thinking about that too, @dkarrasch. These functions are implemented in https://github.com/jlapeyre/IsApprox.jl.

@aravindh-krishnamoorthy
Contributor

> I think one would need more background and a very good reason to add such fragile functions, otherwise it could very well be implemented in a few lines in a package.

I agree with this.

Furthermore, istriangular(.), ishermitian(.), and the like are needed in the stdlib because they are used to select, e.g., which LAPACK routines to call. For now, I don't see routines that explicitly require unitary or orthogonal inputs, so it may be best to provide this functionality in a separate package.

@KristofferC KristofferC transferred this issue from JuliaLang/julia Nov 26, 2024
@singularitti
Contributor Author

singularitti commented Dec 3, 2024

Maybe the others are not so easy to implement, but could we add is_skew_symmetric just like issymmetric without any issue? It is pretty easy to verify: $A^\intercal = -A$.
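A sketch of what that could look like, mirroring the exact elementwise style of issymmetric (the name is_skew_symmetric is simply the one suggested above, not an existing function):

```julia
using LinearAlgebra

# Sketch: exact elementwise check of transpose(A) == -A, in the spirit of
# issymmetric; `is_skew_symmetric` is just the name suggested in this thread.
function is_skew_symmetric(A::AbstractMatrix)
    size(A, 1) == size(A, 2) || return false
    n = size(A, 1)
    for j in 1:n, i in 1:j
        A[i, j] == -transpose(A[j, i]) || return false
    end
    return true
end

is_skew_symmetric([0 2; -2 0])  # true
is_skew_symmetric([0 2; 2 0])   # false
```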

@stevengj
Member

stevengj commented Dec 4, 2024

@singularitti, the SkewLinearAlgebra.jl package provides an isskewhermitian(A) function (and lots of functions to take advantage of skew-Hermitian matrices).

In LinearAlgebra.jl, checking for skew-symmetric matrices seems pretty useless since we have no functions that take advantage of it?

One place where it might be useful is in exp: a lot of people (e.g. coming from quantum mechanics) compute exp(im*H) where H is Hermitian, not realizing that it is probably much more efficient in that case to use cis(H). It might be worth checking for this case in the exp function.
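For illustration (assuming a Julia version where LinearAlgebra's cis accepts a matrix argument), the two computations agree, but cis can exploit the Hermitian structure through an eigendecomposition:

```julia
using LinearAlgebra

H = Hermitian([2.0 1.0-1.0im; 1.0+1.0im 3.0])

U1 = exp(im * Matrix(H))  # generic dense matrix exponential
U2 = cis(H)               # exp(im*H) that can exploit the Hermitian structure

U1 ≈ U2  # true
```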
