
Add Hessenberg matrix format #99

Closed
dlfivefifty opened this issue Apr 6, 2014 · 14 comments

Comments

@dlfivefifty
Contributor

There should be a Hessenberg matrix format to match LAPACK's optimized routines, à la Triangular.

@jiahao
Member

jiahao commented Apr 6, 2014

+1,000,000

I've been wanting this for a while and wondering if I was the only one.

@dlfivefifty
Contributor Author

I would also think schurfact(A,B) should return matrices in Hessenberg format, but at the moment not even lufact(A) returns matrices in Triangular format.

@jiahao
Member

jiahao commented Apr 6, 2014

Yes, we don't return the tightest possible types for matrices in matrix factorizations. I'm not sure whether this is historical design predating the introduction of the special matrix types, or whether there is a deeper issue behind it. cc: @ViralBShah @andreasnoackjensen

@tkelman

tkelman commented Apr 6, 2014

Don't know whether this is the reason within Julia, but many of these special matrix factorization formats are stored in a way that is only "psychologically" a certain structure, and/or packed in a non-intuitive way to reduce storage requirements. It can be extra work (or at least extra copying and memory) to return the actual triangular/Hessenberg/what-have-you mathematically interesting matrix, but when you're using the LAPACK data structures for all operations it's never absolutely necessary.

It could be worth exposing helper functions to convert from the LAPACK-style data structures back to the textbook versions (and maybe vice versa). That functionality is probably already there, but it might require looking up the LAPACK routine name and API for the various transformations.
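A rough sketch of such a helper (`unpack_lower` is a hypothetical name, not an existing Julia or LAPACK function): LAPACK's column-major packed lower-triangular storage keeps column `j`'s entries `A[j:n, j]` consecutively in a length-`n(n+1)÷2` vector, so unpacking it back to a full matrix is just a double loop:

```julia
# Hypothetical helper: unpack LAPACK-style column-major packed
# lower-triangular storage into a full n×n matrix.
function unpack_lower(ap::Vector{T}, n::Int) where {T}
    length(ap) == n * (n + 1) ÷ 2 || throw(DimensionMismatch("bad packed length"))
    L = zeros(T, n, n)
    k = 1
    for j in 1:n, i in j:n       # column j holds A[j:n, j]
        L[i, j] = ap[k]
        k += 1
    end
    return L
end

ap = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]   # packed 3×3 lower triangle
unpack_lower(ap, 3)                    # [1 0 0; 2 4 0; 3 5 6]
```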

@dlfivefifty
Contributor Author

It's a little unfortunate that matrix structure and how the matrix is stored in memory are conflated, because exploiting structure can have a huge effect on algorithmic efficiency, even if a full matrix is allocated. So ideally a matrix could be flagged with special structure without doing any data manipulation, but I suppose that's a bit unrealistic.
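A minimal sketch of that idea, assuming a hypothetical wrapper type (`UpperHessenbergView` is not an existing Julia type): the dense array is left untouched, the "flag" lives purely in the type, and dispatch can then pick structure-exploiting methods:

```julia
# Hypothetical wrapper: flag a dense matrix as upper Hessenberg without
# copying or repacking; the structure lives only in the type.
struct UpperHessenbergView{T,S<:AbstractMatrix{T}} <: AbstractMatrix{T}
    data::S
end

Base.size(H::UpperHessenbergView) = size(H.data)

# Entries below the first subdiagonal are structural zeros, whatever
# the underlying storage happens to contain there.
Base.getindex(H::UpperHessenbergView{T}, i::Int, j::Int) where {T} =
    i <= j + 1 ? H.data[i, j] : zero(T)

A = [1.0 2.0 3.0; 4.0 5.0 6.0; 7.0 8.0 9.0]
H = UpperHessenbergView(A)
H[3, 1]   # 0.0 (structural zero), even though A[3, 1] == 7.0
```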


@andreasnoack
Member

First of all, I think we should implement a Hessenberg matrix. I plan to do a general restructuring of the code where this could be added. Originally, the special matrices were defined for dispatch to efficient methods and not for storage in general, but I think it would make sense for the linear algebra functions to return Triangular and Symmetric when these structures are known. However, the return type of schurfact(A,B) should be Triangular, right?

At some point I considered the packed storage schemes in LAPACK, but decided they were not worth the extra complexity: all BLAS and LAPACK operations seemed to be slower on packed matrices, and you only save half the storage. If somebody argues convincingly that packed storage is needed, we could revisit this.

I plan to parametrise Triangular and Symmetric on the input matrix type, which allows an easy way to define the packed storage versions. The usual version would be something like LowerTriangular{Float64,Matrix{Float64}} and the packed version would be LowerTriangular{Float64,Vector{Float64}}. The number of methods we would need to write to make this work is quite large, so buying more RAM might be a better solution.

Finally, there is actually some functionality for a packed storage format hidden deep down in the linalg code. Recently LAPACK added a compact (rectangular full packed) format that allows blocked operations, which the usual packed format doesn't. You can try:

julia> A=randn(10,10)
10x10 Array{Float64,2}:
 -2.0458     0.0606019   0.0542024  …  -1.28826     0.106455   3.26719 
 -0.11179    0.592715   -1.27365       -0.0076927  -0.998839  -1.61936 
  0.52874    0.755426    0.370643       0.0286099  -0.848458   0.218889
 -0.241565  -1.80039    -1.31543       -0.299369   -0.383693   0.880289
  0.618063   1.2247     -0.225992       0.6614     -0.568612   0.999198
 -0.239916   1.75287    -1.21257    …   0.860337    0.752382   1.16358 
  1.08867   -0.914929    1.41425        0.87509    -2.51558    0.512537
 -0.780965  -0.349939    0.840016      -0.334041    1.12887   -0.530548
 -1.32258   -0.396148    0.860297      -0.457098   -0.779192   0.337591
  0.586829   0.945906   -0.350257      -0.735366   -1.13461    0.965293

julia> typeof(Base.LinAlg.Ac_mul_A_RFP(A))
SymmetricRFP{Float64} (constructor with 1 method)

julia> cholfact(Base.LinAlg.Ac_mul_A_RFP(A))
CholeskyDenseRFP{Float64}([0.107713,-0.0219899,-0.503069,-0.181281,-1.07789,2.59979,2.97724,0.448993,0.0795152,-0.387418  …  0.725883,0.186391,3.39004,0.509515,0.292814,1.43889,-0.550847,0.186724,0.489196,2.69069],'N','U')

but actually I don't think it belongs in Base. At least not in its present state.

@dlfivefifty
Contributor Author

schurfact() returns only quasi-triangular matrices, which can have dense 2x2 blocks on the diagonal; this is a subclass of Hessenberg. Ideally it would return Triangular when schurfact() finds a triangular decomposition and Hessenberg when not.

If I remember right, LAPACK uses dense storage for Hessenberg matrices anyway, so an implementation of Hessenberg could just consist of wrapping a standard Array and calling the appropriate LAPACK routines.
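That plan can be sketched with the existing gehrd! binding (dgehrd is the LAPACK routine that reduces a dense matrix to upper Hessenberg form in place; afterwards the entries below the first subdiagonal hold Householder reflector data, which can be zeroed out if the plain Hessenberg matrix is wanted):

```julia
using LinearAlgebra

A = [4.0 1.0 -2.0  2.0;
     1.0 2.0  0.0  1.0;
    -2.0 0.0  3.0 -2.0;
     2.0 1.0 -2.0 -1.0]
n = size(A, 1)

# Reduce a copy of A to upper Hessenberg form via LAPACK's dgehrd.
H, tau = LinearAlgebra.LAPACK.gehrd!(1, n, copy(A))
for j in 1:n-2, i in j+2:n
    H[i, j] = 0.0   # discard the Householder reflector storage
end

istriu(H, -1)                         # true: H is upper Hessenberg
sort(eigvals(H)) ≈ sort(eigvals(A))   # true: similarity transform
```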


@andreasnoack
Member

@dlfivefifty Oh you are right. I knew I should have thought longer before answering. At least it is triangular in the complex case.

@dlfivefifty
Contributor Author

Occasionally I am ;)


@kshyatt
Contributor

kshyatt commented Aug 11, 2015

We now have a Hessenberg type and ops for it, right?

@kshyatt kshyatt closed this as completed Aug 11, 2015
@tkelman

tkelman commented Aug 11, 2015

Not really; the current Hessenberg is a QR-related factorization type. JuliaLang/julia#7965 implements a little more of the LAPACK bindings for a Hessenberg matrix type, but is about a year out of date.

edit: amazingly there's apparently no conflict there, but it needs tests and updating for JuliaLang/julia#8734 and probably a few other changes.

@kshyatt
Contributor

kshyatt commented Aug 11, 2015

Should I not have closed it? Sorry. Should I get that PR up in a good state as penance?

@tkelman

tkelman commented Aug 11, 2015

It's probably fine to close the issue as long as there's a PR open on the same topic - just reopen the issue if all related PRs somehow get closed without merging.

I don't think anyone would mind if you felt inclined to adopt that PR, add tests and docs, etc., to finish it. Some of the more obscure RFP LAPACK bindings were moved out of base a while back, but the Hessenberg matrix format type and operations could be useful. No penance, but if you want to use the code then go for it.

@stevengj
Member

Update to this old issue:

Nowadays, we indeed have a hessenberg(A) function that returns an UpperHessenberg matrix type (via LAPACK), or SymTridiagonal for A::Hermitian, with various operations supported (#31738), though more could be added (e.g. #858).
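For reference, a short usage sketch of that API (LinearAlgebra stdlib, Julia ≥ 1.3):

```julia
using LinearAlgebra

A = [4.0 1.0 -2.0  2.0;
     1.0 2.0  0.0  1.0;
    -2.0 0.0  3.0 -2.0;
     2.0 1.0 -2.0 -1.0]

F = hessenberg(A)             # Hessenberg factorization via LAPACK
F.H isa UpperHessenberg       # true: H comes back in the dedicated wrapper type
Q = Matrix(F.Q)
Q' * A * Q ≈ F.H              # true: A = Q * H * Q'

# Hermitian/symmetric input reduces all the way to tridiagonal:
hessenberg(Hermitian(A)).H isa SymTridiagonal   # true
```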

@KristofferC KristofferC transferred this issue from JuliaLang/julia Nov 26, 2024