
Varimax did not converge for echelon-pattern matrix #48

Closed
alyst opened this issue Mar 27, 2024 · 4 comments
alyst (Contributor) commented Mar 27, 2024

I'm trying a varimax rotation of a 62×8 matrix A_factors with an echelon pattern:

A_rotated = rotate(A_factors, Varimax())

and I am getting

ERROR: ConvergenceError: Algorithm did not converge after 1000 iterations

(I also tried 10^4 iterations without success, as well as other rotations, e.g. geomin.)

The lavaan implementation of varimax converges instantly and gives reasonable results:

A_rotated_lav = rcopy(R"lavaan:::lav_matrix_rotate($A_factors, method='varimax', rstarts=10, warn=TRUE, verbose=FALSE)")

Here's the A_factors matrix:

62×8 Array{Float64, 2}:
 -1.8031194480220782   0.0                     0.0                    0.0                     0.0                    0.0                    0.0                   0.0
 -1.780254378158767    0.8320815077043094      0.0                    0.0                     0.0                    0.0                    0.0                   0.0
 -1.8066020073724849   0.6860861819708227      0.23333141990373732    0.0                     0.0                    0.0                    0.0                   0.0
 -1.0884633869922222   1.2078318626461864     -0.014148358570189328  -0.7687259678984802      0.0                    0.0                    0.0                   0.0
 -1.594455847645369    0.5900444638522866      0.3355252683178977     0.13634827579831255     0.1922082926558959     0.0                    0.0                   0.0
 -1.622055937580073    0.8935060997826365      0.46609489031479867    0.22347448650237323     0.11020391952543476   -0.3732211147918004     0.0                   0.0
 -1.3583153251937512   1.1997424103482002      0.6570826080721309    -0.10833691066635939    -0.14042618044854474   -0.33154656943437627    0.4930803720861098    0.0
 -1.522998941021372    1.1432984289777777      0.3163336690266741     0.16089432010473903    -0.052787196401661074  -0.4679971737299509     0.4882215603739261    0.18692641916259384
 -1.3470176823633673   0.3684993326867659      0.5856812839369648    -0.047102515838819595    0.474095832306008     -0.55577455778375      -0.2788114484550642   -0.044421087141533616
 -1.3926156992508074   1.024319172161821       0.4998697292913641     0.10934787908667518    -0.23441297080703768   -0.7834255094127972     0.6176832784578933    0.14003496261594237
 -1.391031378168573    0.3629107287699976      0.7020791221911443    -0.2647613112366453      0.10988378615961196   -0.6254972955544791     0.5822135142048368    0.2758399537502801
 -1.4386991145675003   0.7331417681548938      0.29187626401554023    0.0019331471902012992  -0.015145452735129397  -0.9219194401798962     0.4895012267546539    0.20295059667793708
 -1.1067158852420826   0.6683817877931757      0.8856572626803916     0.1037120791202086     -0.730057994395716     -0.6687698843209683     0.6600848108732608   -0.19390341758632923
 -1.3765538848772576   0.9550989719672984      0.1587527682614311     0.07147966188051864     0.03411986886896091   -0.834425006666808      0.34502239093151016   0.16367596481352437
 -0.9867883131214238   1.008460191611201       0.5517901500029752    -0.010803614738111677   -0.1999048285736874    -0.5737650553782931     0.4321913970162881   -0.1190688313796259
 -1.204576035995242    0.6536751595112944      0.8365937426430078     0.08430266598938294     0.044634249730299405  -0.8543276197957083     0.42256231651885184  -0.16767686563001155
 -0.8595287964338265  -0.7208510984205633      0.00673426710999219   -0.32962776789590675     0.7312052701990258    -0.07395130127630392    0.5331556810268492   -0.04953208672704367
 -0.9365463432946263  -0.22195894470650607     0.2584899700368313    -0.4157070826478478      0.7164346219724458    -0.20023972442630458    0.6981088376612178    0.024850837356375208
 -1.2415191109727028   0.9655644906991633     -0.08296533574821885    0.20363040042407662    -0.18054396157396577   -0.9066447658772612     0.4814527411071327    0.10876819017734714
 -1.2424729632046516   0.8461973921439498      0.0657763307396866     0.0717710278394688     -0.1338981664668663    -1.0323530943267887     0.4235911317078656    0.15092214620336541
 -1.3375369534732768   0.8922938098944122      0.15457200744660474    0.15051813428289731    -0.06689466377038078   -1.0425235926869274     0.6321727642189441    0.1536553142948441
 -1.3425966381531036   0.680280299481588       0.2826646908815103    -0.12046272604472064     0.00965389138439711   -0.9905904763803932     0.5050381249457256    0.1658817548509178
 -1.312910654120439    0.37945244111778165    -0.09777836831710249   -0.08715975975972504    -0.4716686686507357    -1.0259555151689657     0.3668696797055012    0.22980335065688143
 -1.1972155039702221  -0.45579546896769474    -0.10434038060487133   -0.1500445266854162      0.44775293744359745   -0.1744725070995336     0.9657028056541566    0.15351823820065058
 -1.3369427153419915   0.019524396517193978    0.1449194853717558    -0.10240792395619082     0.3669021458702285    -0.44581784450027206    1.1526825343179745    0.19156025220541176
 -1.2270646211848601   0.7097934372646774      0.65087168995325       0.18824287259519387    -0.34558070223317233   -0.8915119628394351     0.8006507075999326   -0.15946422703377383
 -1.3641058478781631   0.6428113971682583      0.025713610004139898   0.16451230950549206    -0.07547705543487752   -1.07008784747311       0.4523345869866829    0.13541208750304715
 -1.2304801155331775   0.8328285533130361     -0.044827259530393314   0.13128067084026793    -0.20058882551901489   -1.0917869202087795     0.4618251614789089    0.21456863118302313
 -1.2135083351211609   0.7721448330246423     -0.0906791677817307     0.1027333973353973     -0.13600838769438628   -1.0197715837608217     0.49047035432160085   0.16284002391198193
 -1.3711737429274236   0.4818575947164035      0.2952424445022958     0.0015369516800574675  -0.29616210258593245   -1.0924430067756807     0.3147724163222799    0.26779801283718324
 -0.881623507326852   -0.27499536602152386     0.0641009543733715    -0.15758095653368792     0.5149075280617658     0.004028433150330064   1.1724994742917823   -0.00100380468676092
 -1.1319113389477742  -0.02994624004545933     0.08145416696634103    0.11603134546572053     0.42914939836218957   -0.2833423557454894     1.0582676322121192    0.10098831541822546
 -1.257959277486438    0.7847802603276001      0.3236333906513619    -0.0009921149417387108   0.18740260479996487   -1.0081915557042498     0.5535069312549558    0.22391969556718702
 -1.2907015143011498   0.8994886120979219      0.23357000221847482    0.22261049867918822    -0.026947296783202304  -1.0432458103453437     0.6619384361973949    0.31646247779751024
 -1.1060576677852068   0.3442054981080828      0.864336608296191     -0.1652160031943413      0.165217534403898     -0.8450006731083897     0.3684228145960439    0.07413149354344642
 -0.5674794827805819   0.326708389404871       0.20948259417610846   -0.7840656849370239      0.6191293892703074    -0.1498471711959651     0.3477798227300398    0.08406489925827725
 -1.323008722575005    0.135335563360369       0.6630656777410613    -0.20677966860175157    -0.2451644211281073    -0.9269915933597859     0.17593312399111585   0.7356251648313544
 -0.6663191421449417  -0.6248120551689771     -0.008171753359386337  -0.11142507181929844     0.5836506814780484     0.1896698942550045     0.9381252481486407    0.26763303890838713
 -1.2610662476754595   0.48318088901465506     0.8251362419645101    -0.05262493589010914    -0.5401842049351399    -0.8663476756939109     0.41415746845528884   0.6083022361243062
 -1.2290072844206048   0.5975916087611479      0.5536207675107245     0.08733903615042092    -0.15047682120639125   -0.8244474245648942     0.4151906666556279    0.8227233093876255
 -1.177458200478683    0.46063557581242714     0.8132390933406021     0.10214339223269972    -0.29807659940534725   -0.9070892392227131     0.2913772228405803    0.6764859632246042
 -0.4214688280017217  -0.15972076409574815     1.1995229180242242     0.019920520482620496   -0.14957061327529017   -0.598452398174669     -0.346721927038896    -0.11344883905218622
 -1.1335840903502818   0.6649039810338493      0.603957329499418      0.3693125880590583     -0.3710600236294928    -0.9045494286915285     0.23630427178853958   0.6960663680151772
 -0.5594058780323587  -0.3969927682751389     -0.060699042154902696   0.009209201433501589    0.6533509469529127     0.20182263396277697    0.9770088743482498    0.4033331803124932
 -0.8532528985000337  -0.032788449367812264    0.23455887904307693    0.14489564923141202     0.5263354629590897    -0.16040651502770745    0.9583701110282514    0.5372833429374374
 -1.1366264022675754   0.6550780064402162      0.6251481402094554     0.19000152681548807    -0.39819885423618945   -1.0508475229410277     0.28553499142700245   0.7352381235401934
 -1.0235118650832922   1.0234090108092657      0.2552819160772967     0.49969504890911054    -0.2699145292862323    -0.9168174672278988     0.2733804608962816    0.5664638957412536
 -1.1411011341270563   0.8259754145829105      0.3743170940144502     0.3760478498619468      0.036055607956490154  -0.7967277955252248     0.6087309399222001    0.6023368774672232
 -1.05022651719759     0.8833275981120378      0.7256748709733882     0.3857836393789754     -0.44865304994404315   -0.8563782830390544     0.6775328436111433    0.29306365146974345
 -1.2416420172821132   0.5288304389285903      0.624820046189931     -0.05564592532667939    -0.3539216377188395    -0.9372004884523375     0.3441192814049033    0.7325399911744743
 -0.5810718886837715  -0.5433492110154212     -0.1325289993132914    -0.12048256649042689     0.43826832385422165    0.42466617692669817    1.115357715035477     0.39950091048409553
 -0.8361880151610976  -0.5072018057871102     -0.12498101859790083   -0.05228016543758773     0.45536661300196385    0.2901571932232159     0.8457971203854567    0.6286206247192335
 -1.1152986817531951  -0.26762800124007496     0.2968952332315172    -0.0859620517467263      0.23544075214900398   -0.13394412581563167    0.6829985833963735    0.7718861679814695
 -1.2314826838634585   0.5605490848718775      0.47495257136080193    0.051953283549940645   -0.6361861279384355    -0.8733732108409983     0.14912536692936923   0.8556878222121429
 -1.2274614131730566   0.5231991947219401      0.7340894830616401     0.17726020409396265    -0.4939478342294616    -0.844552810336331      0.4073965635110079    0.762505103092431
 -1.1015584910958005   1.023949033743692       0.4853117910595453     0.38891454241480794    -0.2774555672126431    -0.7575688521907774     0.43356287076507      0.6245157431989959
 -1.1388216434524      0.7931178825383353      0.6293495046578295     0.31214898481468434    -0.6011526083218882    -0.8191665540094917     0.4308627195666771    0.5880678437903624
 -0.8191952603686615  -0.5262939115962224      0.25634917338296687   -0.32567208179871615     0.3178949066532508     0.2515245530309727     0.8736718777008866    0.507678116491813
 -0.8369737590507431  -0.30409292913585617     0.6204655535710238    -0.06291015248506651    -0.06159437708208639   -0.018592936193299932   0.8708712655129236    0.4505491318911019
 -1.0428679819045816   0.3076250647071792      0.8565937182453383    -0.21520387146889466    -0.3337330079701434    -0.7109078122985785    -0.07088128356600733   0.8026439529585926
 -1.2240644597416597   0.7005033027268256      0.44558686071397574    0.24193475107588083    -0.3111467355786483    -0.9641533176349951     0.27270876543558603   0.7319508214475807
 -0.783440047275303    0.0024135824718757126   1.089496321674055      0.04385978246129533    -0.13523389200469982   -0.7359936058134907    -0.21244363288761542   0.02744257340283202
p-gw (Owner) commented Mar 27, 2024

It seems the lavaan function uses different default values for several of its arguments:

  1. The tolerance is set to 1e-5 instead of 1e-6
  2. The number of maximum iterations is set to 10000 instead of 1000
  3. It does Kaiser normalization by default (at least for Varimax rotation)
  4. It uses 100 random starts

Otherwise it uses the same algorithm, so it should produce the same results in this case.

Matching these argument values makes the rotation converge for me. For now you have to apply the Kaiser normalization manually (see #47):

Anorm, weights = kaiser_normalize(A)
rotate!(Anorm, Varimax(), atol = 1e-5, maxiter1 = 10_000, randomstarts = 10)
Arot = kaiser_denormalize(Anorm, weights)
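For reference, Kaiser normalization just rescales each row of the loading matrix to unit length before rotation and undoes the scaling afterwards. A minimal standalone sketch with a stand-in matrix (not the package internals; names are illustrative):

```julia
using LinearAlgebra

A = randn(62, 8)  # stand-in for the loading matrix

# Normalize: divide each row by its Euclidean norm
# (the square root of the row's communality).
weights = map(norm, eachrow(A))
Anorm = A ./ weights

# ... rotate Anorm here ...

# Denormalize: restore the original row scales.
Adenorm = Anorm .* weights
```

After normalization every row of Anorm has unit length, and denormalizing recovers the original matrix exactly.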

Please note that FactorRotations.jl does not yet perform some of the convenience transformations, such as reflecting signs (#50) or reordering the columns (#49).

alyst (Contributor, Author) commented Mar 28, 2024

@p-gw Thank you, adjusting the method parameters helped! With the bigger matrix the method struggles more, though (but so does the lavaan implementation).
I have noticed that in the Varimax criterion() the Lambda-squared columns are not centered, whereas in criterion_and_gradient() they are. Are the two equivalent?
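As background (a numerical sketch, not a statement about the package internals): the classical varimax criterion is the per-column variance of the squared loadings, while dropping the centering term yields the quartimax criterion; since var(x) = mean(x²) − mean(x)², the two differ exactly by the squared column means of Λ²:

```julia
using Statistics

Λ = randn(62, 8)  # stand-in loading matrix
Λsq = Λ .^ 2

# Varimax: sum over columns of the (uncorrected) variance of squared loadings.
varimax_Q = sum(var(c; corrected=false) for c in eachcol(Λsq))
# Without the centering term this reduces to the quartimax criterion (mean λ⁴).
quartimax_Q = sum(mean(c .^ 2) for c in eachcol(Λsq))

# The difference is exactly the sum of squared column means of Λ².
diff_term = sum(mean(c)^2 for c in eachcol(Λsq))
```

So the two formulas agree only when every column of Λ² has zero mean, which generally does not hold for loading matrices.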

alyst mentioned this issue Mar 29, 2024
alyst (Contributor, Author) commented Mar 30, 2024

I've checked the convergence criterion: both lavaan and FactorRotations.jl use the Frobenius norm.
Since the norm tends to be larger for larger factor matrices, the convergence criterion effectively gets stricter as the matrix size grows.
That means user scripts that deal with a variable number of factors or observed variables would have to adjust the tolerance to the matrix size.
Would it make sense to change it to the mean of the squared matrix elements on the method's side?
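To illustrate the scaling concern (a sketch with stand-in matrices, not package code): for matrices with entries of the same magnitude, the Frobenius norm grows with the number of entries, while the mean of the squared entries does not:

```julia
using LinearAlgebra

# Two matrices with identical entry magnitudes but different sizes.
G_small = fill(0.1, 10, 4)
G_large = fill(0.1, 1000, 4)

frob(M)   = norm(M)                     # Frobenius norm: grows like sqrt(length)
meansq(M) = sum(abs2, M) / length(M)    # mean squared entry: size-independent

frob(G_small)    # ≈ 0.632  (0.1 * sqrt(40))
frob(G_large)    # ≈ 6.325  (0.1 * sqrt(4000))
meansq(G_small)  # ≈ 0.01, same as meansq(G_large)
```

A fixed absolute tolerance on the Frobenius norm therefore demands proportionally smaller per-entry values as the matrix grows, whereas a tolerance on the mean squared entry would not.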

p-gw (Owner) commented Apr 2, 2024

Would it make sense to change it to the mean of the squared matrix elements on the method's side?

To be honest, I don't know. The paper remains pretty vague about how the specific stopping rule was chosen. I'd probably rather keep compatibility with existing packages (GPArotation, lavaan, etc.), which means the default values in FactorRotations.jl would need to be changed.

On the other hand, I also had the idea of adding a relative tolerance to the stopping rule (see #7). I think this would also have caught your convergence error, where the criterion just stops decreasing after a while.
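A combined rule along those lines could stop either when the gradient is small in absolute terms or when the criterion stops improving in relative terms. A hedged sketch (the function and keyword names here are hypothetical, not the package API):

```julia
using LinearAlgebra

# Hypothetical helper: converged if the current gradient matrix G is small
# (absolute tolerance) OR the criterion value has stopped decreasing
# relative to its previous value (relative tolerance).
function is_converged(Q_prev, Q, G; atol=1e-6, rtol=1e-8)
    norm(G) < atol || abs(Q_prev - Q) < rtol * abs(Q_prev)
end

is_converged(1.0, 1.0 - 1e-12, fill(0.5, 2, 2))  # true: criterion has stalled
is_converged(1.0, 0.5, fill(0.5, 2, 2))          # false: still improving
```

The relative branch is what would catch a run where the gradient norm never drops below atol but the criterion value has effectively plateaued.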

alyst closed this as completed Apr 26, 2024