
Examples fail with HYPRE >= 2.29 #279

Closed
@sshiraiwa

Description


When building PyMFEM with Hypre 2.29 (changing the Hypre version in setup.py), I realized the Hypre solver seems to have a problem. Below is the output from ex0p.py. It seems BoomerAMG was able to process the input matrix, but somehow the CG iteration cannot start and fails with error code 1.
Interestingly, this error happens only with PyMFEM; MFEM itself does not have this issue. Also, with HYPRE 2.28, everything seems fine.

Does anyone have a clue where I should look?

(py312) bash-4.4$ mpirun -np 1 python ../examples/ex0p.py 
Ignoring PCI device with non-16bit domain.
Pass --enable-32bits-pci-domain to configure to support such devices
(warning: it would break the library ABI, don't enable unless really needed).
Options used:
   --mesh  star.mesh
   --order  1
Number of finite element unknowns: 101


 Num MPI tasks = 1

 Num OpenMP threads = 1


BoomerAMG SETUP PARAMETERS:

 Max levels = 25
 Num levels = 2

 Strength Threshold = 0.250000
 Interpolation Truncation Factor = 0.000000
 Maximum Row Sum Threshold for Dependency Weakening = 0.900000

 Coarsening Type = HMIS 

 No. of levels of aggressive coarsening: 1

 Interpolation on agg. levels= multipass interpolation
 measures are determined locally


 No global partition option chosen.

 Interpolation = extended+i interpolation

Operator Matrix Information:

             nonzero            entries/row          row sums
lev    rows  entries sparse   min  max     avg      min         max
======================================================================
  0     101      781  0.077     4   11     7.7  -7.216e-16   1.915e+00
  1       1        1  1.000     1    1     1.0   1.469e+01   1.469e+01


Interpolation Matrix Information:
                    entries/row        min        max            row sums
lev  rows x cols  min  max  avgW     weight      weight       min         max
================================================================================
  0   101 x 1       0    1   0.6   3.171e-01   1.000e+00   0.000e+00   1.000e+00


     Complexity:    grid = 1.009901
                operator = 1.001280
                memory = 1.079385




BoomerAMG SOLVER PARAMETERS:

  Maximum number of cycles:         1 
  Stopping Tolerance:               0.000000e+00 
  Cycle type (1 = V, 2 = W, etc.):  1

  Relaxation Parameters:
   Visiting Grid:                     down   up  coarse
            Number of sweeps:            1    1     1 
   Type 0=Jac, 3=hGS, 6=hSGS, 9=GE:      8    8     9 
   Point types, partial sweeps (1=C, -1=F):
                  Pre-CG relaxation (down):   0
                   Post-CG relaxation (up):   0
                             Coarsest grid:   0



Verification failed: (!err_flag) is false:
 --> Error during setup! Error code: 1
 ... in function: virtual void mfem::HypreSolver::Setup(const mfem::HypreParVector&, mfem::HypreParVector&) const
 ... in file: /home/sshiraiw/venvs/py312/src/PyMFEM/external/mfem/linalg/hypre.cpp:4092

--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
