
[WIP] PETSc #165

Open · wants to merge 5 commits into master

Conversation

ESeNonFossiIo
Contributor

Requires mathLab/deal2lkit#292

I have some problems with the reinit of a LinearOperator when the LinearOperator is associated with a PETSc matrix.
I was trying to write a test similar to linear_operator_04 for PETSc, but it got too messy; I will look for another approach.
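
For reference, this is roughly the kind of test I have in mind. It is only a minimal sketch, not working code: it assumes the generic linear_operator() wrapper accepts a PETScWrappers::MPI::SparseMatrix with PETScWrappers::MPI::Vector as range/domain type, and the function name check() is made up.

#include <deal.II/lac/linear_operator.h>
#include <deal.II/lac/petsc_parallel_sparse_matrix.h>
#include <deal.II/lac/petsc_parallel_vector.h>

using namespace dealii;

// Hypothetical test body: wrap a PETSc matrix and reinit vectors through it.
void check(const PETScWrappers::MPI::SparseMatrix &A)
{
  const auto op = linear_operator<PETScWrappers::MPI::Vector>(A);

  PETScWrappers::MPI::Vector x, y;
  // These reinit calls are where things go wrong for me with PETSc:
  op.reinit_domain_vector(x, /*omit_zeroing_entries=*/false);
  op.reinit_range_vector(y, /*omit_zeroing_entries=*/false);

  x = 1.;
  op.vmult(y, x); // same as A.vmult(y, x)
}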

@asartori86
Contributor

We also need to add PETSc to Travis...

@ESeNonFossiIo
Contributor Author

I've got a problem with the test:

libc++abi.dylib: terminating with uncaught exception of type dealii::PETScWrappers::internal::VectorReference::ExcAccessToNonlocalElement:
--------------------------------------------------------
An error occurred in line <129> of file </Users/esenonfossiio/git/dealii/dealii/source/lac/petsc_vector_base.cc> in function
    PetscScalar dealii::PETScWrappers::internal::VectorReference::operator double() const
The violated condition was:
    (index >= static_cast<size_type>(begin)) && (index < static_cast<size_type>(end))
The name and call sequence of the exception was:
    ExcAccessToNonlocalElement (index, begin, end-1)
Additional Information:
You tried to access element 0 of a distributed vector, but only elements 9 through 8 are stored locally and can be accessed.
--------------------------------------------------------

I am going to work on other stuff and then I will fix it.
If you have ideas... :D
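
For the record, the assertion fires when an entry that is not stored locally is read; "elements 9 through 8" in the message means this rank owns an empty local range. A minimal sketch of a guarded access (the function name read_first_entry() is made up, not code from this PR):

#include <deal.II/lac/petsc_parallel_vector.h>

using namespace dealii;

// Only dereference entries that this MPI rank actually owns.
void read_first_entry(const PETScWrappers::MPI::Vector &v)
{
  if (v.in_local_range(0))
    {
      const double value = v(0);
      (void)value; // use the value only on the owning rank
    }
  // Reading arbitrary remote entries requires communication, e.g. a
  // ghosted vector initialized with the needed ghost indices.
}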

@ESeNonFossiIo
Contributor Author

@nicola-giuliani any ideas?

49: --------------------------------------------------------
49: An error occurred in line <474> of file </Applications/deal.II.app/Contents/Resources/opt/dealii_dev/include/deal.II/lac/petsc_parallel_vector.h> in function
49:     dealii::PETScWrappers::MPI::Vector &dealii::PETScWrappers::MPI::Vector::operator=(const dealii::PETScWrappers::MPI::Vector &)
49: The violated condition was:
49:     v.last_action == VectorOperation::unknown
49: The name and call sequence of the exception was:
49:     internal::VectorReference::ExcWrongMode (VectorOperation::unknown, v.last_action)
49: Additional Information:
49: You tried to do a ??? operation but the vector is currently in 'set' mode. You first have to call 'compress()'.
49: --------------------------------------------------------
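
As far as I understand the assertion, after writing into a PETSc vector in 'set' mode one has to call compress() before the vector can be copied or assigned. A minimal sketch of the pattern it asks for (standalone illustration, not taken from the code in this PR; it assumes element 0 is locally owned):

#include <deal.II/lac/petsc_parallel_vector.h>

using namespace dealii;

void copy_after_set(PETScWrappers::MPI::Vector &a,
                    PETScWrappers::MPI::Vector &b)
{
  a(0) = 1.0;                          // puts 'a' into 'set' mode
  a.compress(VectorOperation::insert); // finish the pending insertions

  b = a;                               // operator= no longer trips ExcWrongMode
}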

@ESeNonFossiIo
Contributor Author

his=0x00007fff5fbf9b00, v=0x0000000125a4f8e0) + 285 at petsc_parallel_block_vector.h:338
   335        this->block_indices = v.block_indices;
   336
   337        for (unsigned int i=0; i<this->n_blocks(); ++i)
-> 338          this->components[i] = v.components[i];
   339      }
   340
   341      inline
(lldb)
frame #7: 0x000000010027365d libpidomus-lib.g.dylib`dealii::PETScWrappers::MPI::BlockVector::BlockVector(this=0x00007fff5fbf9b00, v=0x0000000125a4f8e0) + 29 at petsc_parallel_block_vector.h:333
   330      BlockVector::BlockVector (const BlockVector &v)
   331        :
   332        BlockVectorBase<Vector > ()
-> 333      {
   334        this->components.resize (v.n_blocks());
   335        this->block_indices = v.block_indices;
   336
(lldb)
frame #8: 0x00000001002f9c2b libpidomus-lib.g.dylib`piDoMUS<2, 2, LAPETSc>::syncronize(this=0x00007fff5fbfbf60, t=0x00007fff5fbfaaf0, solution=0x0000000125a4f8e0, solution_dot=0x0000000125a53d50) + 155 at pidomus.cc:544
   541        // previous explicit solution will be zero
   542        current_time = t;
   543        update_functions_and_constraints(t);
-> 544        typename LAC::VectorType tmp(solution);
   545        typename LAC::VectorType tmp_dot(solution_dot);
   546        constraints.distribute(tmp);
   547        constraints_dot.distribute(tmp_dot);
(lldb)
frame #9: 0x00000001002fb1d8 libpidomus-lib.g.dylib`piDoMUS<2, 2, LAPETSc>::residual(this=0x00007fff5fbfbf60, t=0, solution=0x0000000125a4f8e0, solution_dot=0x0000000125a53d50, dst=0x0000000125a54080) + 248 at pidomus.cc:1033
   1030 {
   1031   auto _timer = computing_timer.scoped_timer ("Residual");
   1032
-> 1033   syncronize(t,solution,solution_dot);
   1034
   1035   const QGauss<dim> quadrature_formula(fe->degree + 1);
   1036   const QGauss < dim - 1 > face_quadrature_formula(fe->degree + 1);
(lldb)
frame #10: 0x00000001002fcac9 libpidomus-lib.g.dylib`non-virtual thunk to piDoMUS<2, 2, LAPETSc>::residual(this=0x00007fff5fbfbf60, t=0, solution=0x0000000125a4f8e0, solution_dot=0x0000000125a53d50, dst=0x0000000125a54080) + 73 at pidomus.h:108
   105
   106    /** For dae problems, we need a
   107     residual function. */
-> 108    virtual int residual(const double t,
   109                         const typename LAC::VectorType &src_yy,
   110                         const typename LAC::VectorType &src_yp,
   111                         typename LAC::VectorType &dst);
(lldb)
frame #11: 0x00000001009cfc39 libdeal2lkit.g.dylib`int deal2lkit::t_dae_residual<dealii::PETScWrappers::MPI::BlockVector>(tt=0, yy=0x0000000125a473e0, yp=0x0000000125a33c00, rr=0x0000000125a2c6b0, user_data=0x00007fff5fbfbf60) + 249 at ida_interface.cc:60
   57     copy(*src_yy, yy);
   58     copy(*src_yp, yp);
   59
-> 60     int err = solver.residual(tt, *src_yy, *src_yp, *residual);
   61
   62     copy(rr, *residual);
   63
(lldb)
frame #12: 0x0000000122d668a0 libsundials_ida.2.0.0.dylib`IDAnlsIC + 128
libsundials_ida.2.0.0.dylib`IDAnlsIC:
    0x122d668a0 <+128>: movl   %eax, -0x14(%rbp)
    0x122d668a3 <+131>: movq   -0x10(%rbp), %rcx
    0x122d668a7 <+135>: movq   0x318(%rcx), %rdx
    0x122d668ae <+142>: addq   $0x1, %rdx
(lldb)
frame #13: 0x0000000122d6660f libsundials_ida.2.0.0.dylib`IDACalcIC + 1247
libsundials_ida.2.0.0.dylib`IDACalcIC:
    0x122d6660f <+1247>: movl   %eax, -0x3c(%rbp)
    0x122d66612 <+1250>: cmpl   $0x0, -0x3c(%rbp)
    0x122d66619 <+1257>: jne    0x122d66624               ; <+1268>
    0x122d6661f <+1263>: jmp    0x122d666f9               ; <+1481>
(lldb) exit

@ESeNonFossiIo
Contributor Author

If I try to add compress:

/Users/esenonfossiio/git/project/pi-DoMUS/source/pidomus.cc:546:7: error: member function 'compress' not
      viable: 'this' argument has type 'const typename LAPETSc::VectorType' (aka 'const
      dealii::PETScWrappers::MPI::BlockVector'), but function is not marked const
      solution_dot.compress(VectorOperation::insert);

@nicola-giuliani
Contributor

Where did you add the compress?

@nicola-giuliani
Contributor

At a glance, it looks like the original vector has not been compressed by the time syncronize calls the copy constructor. We should work out the last operation performed on the vector and call compress before syncronize. That way I hope the error at least changes... like in Potions in the first semester ;)
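
Something like this, sketched with a made-up helper (the point being that compress() has to happen where the vectors are last written, since inside syncronize they are const):

#include <deal.II/lac/petsc_parallel_block_vector.h>

using namespace dealii;

// Hypothetical: called right after the last write into the two vectors,
// before they are handed (as const references) to syncronize().
void finish_writes(PETScWrappers::MPI::BlockVector &solution,
                   PETScWrappers::MPI::BlockVector &solution_dot)
{
  // Use ::insert or ::add depending on the last operation performed.
  solution.compress(VectorOperation::insert);
  solution_dot.compress(VectorOperation::insert);
}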

@ESeNonFossiIo
Contributor Author

I tried:

   543        update_functions_and_constraints(t);
   ---> here
   544        typename LAC::VectorType tmp(solution);
   545        typename LAC::VectorType tmp_dot(solution_dot);

but I get errors related to the const member functions and to solution being const. If I try to remove const, it becomes a mess.

@ESeNonFossiIo
Contributor Author


(lldb) run
Process 23235 launched: './tests/poisson_04.debug/poisson_04.debug' (x86_64)
Number of active cells: 256 (on 5 levels)
Number of degrees of freedom: 289(289)

computing consistent initial conditions with the option use_y_diff please be patient.
compute initial conditions: done.
     0 ---->  0.01
  0.01 ---->  0.02      5
  0.02 ---->  0.03      6
  ################ restart #########
max_kelly > threshold
0.0100929 >  0.01
######################################
Number of active cells: 274 (on 6 levels)
Number of degrees of freedom: 331(331)

computing consistent initial conditions with the option use_y_dot please be patient.
compute initial conditions: done.
  0.03 ---->  0.04
  0.04 ---->  0.05      15
  0.05 ---->  0.06      19
  ################ restart #########
max_kelly > threshold
0.0101164 >  0.01
######################################
Number of active cells: 328 (on 6 levels)
Number of degrees of freedom: 392(392)

computing consistent initial conditions with the option use_y_dot please be patient.
compute initial conditions: done.
  ################ restart #########
max_kelly > threshold
0.0107257 >  0.01
######################################
Number of active cells: 394 (on 6 levels)
Number of degrees of freedom: 462(462)

computing consistent initial conditions with the option use_y_dot please be patient.
compute initial conditions: done.
  0.06 ---->  0.07
  0.07 ---->  0.08      22
  0.08 ---->  0.09      19
  0.09 ---->   0.1
  ################ restart #########
max_kelly > threshold
0.0102897 >  0.01
######################################
Number of active cells: 442 (on 6 levels)
Number of degrees of freedom: 507(507)

computing consistent initial conditions with the option use_y_dot please be patient.
compute initial conditions: done.
   0.1 ---->  0.11
 iterations:            5
cells dofs    u_L2        u_H1
  442  507 7.964e-02 - 5.706e-01 -
*** The MPI_Attr_get() function was called after MPI_FINALIZE was invoked.
*** This is disallowed by the MPI standard.
*** Your MPI job will now abort.
[(null):23235] Local abort after MPI_FINALIZE completed successfully; not able to aggregate error messages, and not able to guarantee that all other processes were killed!
Process 23235 exited with status = 1 (0x00000001)
(lldb) up
error: invalid thread
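
For what it's worth, the "MPI_Attr_get() after MPI_FINALIZE" abort at the end usually means some PETSc-backed object is destroyed only after MPI has been finalized. A minimal sketch of the usual pattern that avoids it (a made-up main(), not the test in this PR): keep every PETSc object in a scope that closes before MPI_InitFinalize is torn down.

#include <deal.II/base/mpi.h>
#include <deal.II/lac/petsc_parallel_vector.h>

int main(int argc, char **argv)
{
  dealii::Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv);

  {
    // PETSc vectors/matrices (and the classes holding them) live in this
    // inner scope, so their destructors run before MPI is finalized.
    dealii::PETScWrappers::MPI::Vector v(MPI_COMM_WORLD, 10, 10);
    v = 1.;
  }

  return 0;
}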

@asartori86
Contributor

If poisson_04.cc is the one in this PR, it is normal that it blows up at the end... or is the error not related to the for loop?

@asartori86
Contributor

Out of curiosity... can you set use_y_dot in ida to see whether the number of iterations changes?

@ESeNonFossiIo
Contributor Author

"If poisson_04.cc is the one in this PR, it is normal that it blows up at the end... or is the error not related to the for loop?" Sorry, but I really don't understand. I copied it from poisson_03.cc... Can you be more verbose? :)

@luca-heltai
Contributor

One important thing.... PETSc is NOT multithreaded. Here we always have to force the number of threads to one. Does it work if you do that?
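
A minimal sketch of what I mean (assuming the test initializes MPI through deal.II; either mechanism below is enough on its own):

#include <deal.II/base/mpi.h>
#include <deal.II/base/multithread_info.h>

int main(int argc, char **argv)
{
  // Third argument = maximum number of threads deal.II is allowed to use.
  dealii::Utilities::MPI::MPI_InitFinalize mpi_init(argc, argv,
                                                    /*max_num_threads=*/1);

  // Equivalent explicit call, redundant with the argument above:
  dealii::MultithreadInfo::set_thread_limit(1);

  // ... run the PETSc-based test ...
  return 0;
}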
