Some questions about poroelastic-outerrise-3d

Dear Mr. Aagaard,

I encountered some issues while using the poroelasticity module in PyLith 4.1.3. Specifically, I am analyzing the diffusion behavior of a three-dimensional model under surface pressure; the relevant files are attached below. The main problem I encountered is:
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
[0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
[0]PETSC ERROR: or see https://petsc.org/release/faq/#valgrind and https://petsc.org/release/faq/
[0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
[0]PETSC ERROR: to get more information on the crash.
[0]PETSC ERROR: Run with -malloc_debug to check if memory corruption is causing the crash.
Abort(59) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/nemesis: mpiexec: exit 15
/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/pylith: /public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/nemesis: exit 1
file.zip (7.2 MB)

I would like to ask what might be causing this issue.

I also want to add temporal variations to my pressure file, for example, setting the pressure to 500 in the first year, 600 in the second year, and so on. How should I do that?

For simple temporal variations in boundary conditions, refer to equation 161 in the PyLith manual. Starting from an initial pressure and increasing the value linearly with time involves setting an initial value and a constant rate. This is analogous to setting an initial displacement and a constant velocity for a displacement Dirichlet boundary condition, which is what is done in examples/box-3d/step05.
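As an illustration, a minimal sketch of a Dirichlet boundary condition on the pressure field with an initial value plus a constant rate might look like the following; the section name bc_surface, the label, and the numbers are placeholders rather than values from your setup:

[pylithapp.problem.bc.bc_surface]
field = pressure
constrained_dof = [0]
label = boundary_surface
use_initial = True
use_rate = True
db_auxiliary_field = spatialdata.spatialdb.UniformDB
db_auxiliary_field.description = Dirichlet BC on pressure (initial value plus constant rate)
db_auxiliary_field.values = [initial_amplitude, rate_amplitude, rate_start_time]
db_auxiliary_field.data = [500.0*Pa, 100.0*Pa/year, 0.0*year]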

When I try to run your simulation, I get the following output indicating an error associated with a spatial database for your material properties. In running your simulation, I modified lines 92 and 100 to include the name of the material. This results in a more informative error message ([0]PETSC ERROR: Could not find values for solid_density at (-119839 -163906 -30000) in spatial database 'Spatial database for upcrust material properties and state variables'.), indicating the problem is with the spatial database for upcrust. It looks like there are cells below z=-30.0e+3.
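For reference, naming the spatial database looks something like the following sketch (the material name upcrust is from your setup; the description text is what appears in the error message):

[pylithapp.problem.materials.upcrust]
db_auxiliary_field.description = Spatial database for upcrust material properties and state variables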

When I turn off the linear interpolation to resolve the spatial database error, I do get a segmentation fault during the solve. I will try to investigate this further later this week.

pylith step01_no_faults_no_flexure.cfg
 >> tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/apps/PyLithApp.py:77:main
 -- pylithapp(info)
 -- Running on 1 process(es).
 >> tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/Problem.py:116:preinitialize
 -- timedependent(info)
 -- Performing minimal initialization before verifying configuration.
 >> tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/Solution.py:39:preinitialize
 -- solution(info)
 -- Performing minimal initialization of solution.
 >> /Users/baagaard/tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/Problem.py:174:verifyConfiguration
 -- timedependent(info)
 -- Verifying compatibility of problem configuration.
 >> tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/Problem.py:219:_printInfo
 -- timedependent(info)
 -- Scales for nondimensionalization:
    Length scale: 1000*m
    Time scale: 3.15576e+06*s
    Pressure scale: 1e+10*m**-1*kg*s**-2
    Density scale: 9.95882e+16*m**-3*kg
    Temperature scale: 1*K
 >> tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/Problem.py:185:initialize
 -- timedependent(info)
 -- Initializing timedependent problem with quasistatic formulation.
 >> ../../../pylith-4.1.3/libsrc/pylith/utils/PetscOptions.cc:250:static void pylith::utils::_PetscOptions::write(pythia::journal::info_t &, const char *, const PetscOptions &)
 -- petscoptions(info)
 -- Setting PETSc options:
fieldsplit_displacement_pc_type = lu
fieldsplit_pressure_pc_type = bjacobi
fieldsplit_pressure_t_pc_type = bjacobi
fieldsplit_trace_strain_pc_type = bjacobi
fieldsplit_trace_strain_t_pc_type = bjacobi
fieldsplit_velocity_pc_type = bjacobi
ksp_atol = 1.0e-12
ksp_converged_reason = true
ksp_error_if_not_converged = true
ksp_guess_pod_size = 8
ksp_guess_type = pod
ksp_rtol = 1.0e-12
pc_fieldsplit_0_fields = 2
pc_fieldsplit_1_fields = 1
pc_fieldsplit_2_fields = 0
pc_fieldsplit_3_fields = 3
pc_fieldsplit_4_fields = 4
pc_fieldsplit_5_fields = 5
pc_fieldsplit_type = multiplicative
pc_type = fieldsplit
snes_atol = 1.0e-9
snes_converged_reason = true
snes_error_if_not_converged = true
snes_monitor = true
snes_rtol = 1.0e-12
ts_error_if_step_fails = true
ts_monitor = true
ts_type = beuler
viewer_hdf5_collective = true

[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Error in external library
[0]PETSC ERROR: Could not find values for solid_density at (  -119839  -163906  -30000) in spatial database 'Spatial database for material properties and state variables'.
[0]PETSC ERROR: WARNING! There are unused option(s) set! Could be the program crashed before usage or a spelling mistake, etc!
[0]PETSC ERROR:   Option left: name:-fieldsplit_displacement_pc_type value: lu source: code
[0]PETSC ERROR:   Option left: name:-fieldsplit_pressure_pc_type value: bjacobi source: code
[0]PETSC ERROR:   Option left: name:-fieldsplit_pressure_t_pc_type value: bjacobi source: code
[0]PETSC ERROR:   Option left: name:-fieldsplit_trace_strain_pc_type value: bjacobi source: code
[0]PETSC ERROR:   Option left: name:-fieldsplit_trace_strain_t_pc_type value: bjacobi source: code
[0]PETSC ERROR:   Option left: name:-fieldsplit_velocity_pc_type value: bjacobi source: code
[0]PETSC ERROR:   Option left: name:-ksp_atol value: 1.0e-12 source: code
[0]PETSC ERROR:   Option left: name:-ksp_converged_reason (no value) source: code
[0]PETSC ERROR:   Option left: name:-ksp_error_if_not_converged (no value) source: code
[0]PETSC ERROR:   Option left: name:-ksp_guess_pod_size value: 8 source: code
[0]PETSC ERROR:   Option left: name:-ksp_guess_type value: pod source: code
[0]PETSC ERROR:   Option left: name:-ksp_rtol value: 1.0e-12 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_0_fields value: 2 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_1_fields value: 1 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_2_fields value: 0 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_3_fields value: 3 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_4_fields value: 4 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_5_fields value: 5 source: code
[0]PETSC ERROR:   Option left: name:-pc_fieldsplit_type value: multiplicative source: code
[0]PETSC ERROR:   Option left: name:-pc_type value: fieldsplit source: code
[0]PETSC ERROR:   Option left: name:-snes_atol value: 1.0e-9 source: code
[0]PETSC ERROR:   Option left: name:-snes_converged_reason (no value) source: code
[0]PETSC ERROR:   Option left: name:-snes_error_if_not_converged (no value) source: code
[0]PETSC ERROR:   Option left: name:-snes_monitor (no value) source: code
[0]PETSC ERROR:   Option left: name:-snes_rtol value: 1.0e-12 source: code
[0]PETSC ERROR:   Option left: name:-ts_error_if_step_fails (no value) source: code
[0]PETSC ERROR:   Option left: name:-ts_monitor (no value) source: code
[0]PETSC ERROR:   Option left: name:-ts_type value: beuler source: code
[0]PETSC ERROR:   Option left: name:-viewer_hdf5_collective (no value) source: code
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.21.3-425-g32e03751087  GIT Date: 2024-07-29 20:12:56 +0000
[0]PETSC ERROR: tmp/pylith-4.1.3-macOS-11.0-arm64/bin/mpinemesis with 1 MPI process(es) and PETSC_ARCH  by baagaard Mon Oct 14 13:40:43 2024
[0]PETSC ERROR: Configure options: --prefix=scratch/build/clang-15.0/cig/pylith-binary-arm64/dist --with-c2html=0 --with-x=0 --with-clanguage=C --with-mpicompilers=1 --with-shared-libraries=1 --with-64-bit-points=1 --with-large-file-io=1 --download-parmetis=1 --download-metis=1 --download-f2cblaslapack=1 --download-ml --with-fc=0 --with-hwloc=0 --with-ssl=0 --with-x=0 --with-c2html=0 --with-lgrind=0 --with-hdf5=1 --with-zlib=1 --LIBS=-lz --with-debugging=0 --with-fc=0 CFLAGS=-mmacos-version-min=11.0 CXXFLAGS="-mmacos-version-min=11.0 -DMPICH_IGNORE_CXX_SEEK" FCFLAGS= CPPFLAGS="-I/Users/baagaard/scratch/build/clang-15.0/cig/pylith-binary-arm64/dist/include  -Iscratch/build/clang-15.0/cig/pylith-binary-arm64/dist/include" LDFLAGS="-Lscratch/build/clang-15.0/cig/pylith-binary-arm64/dist/lib -mmacos-version-min=11.0 -Lscratch/build/clang-15.0/cig/pylith-binary-arm64/dist/lib" PETSC_DIR=scratch/build/clang-15.0/cig/pylith-binary-arm64/build/cig/petsc-pylith PETSC_ARCH=arch-pylith
[0]PETSC ERROR: #1 static PetscErrorCode pylith::topology::FieldQuery::queryDBPointFn(PylithInt, PylithReal, const PylithReal *, PylithInt, PylithScalar *, void *)() at ../../../pylith-4.1.3/libsrc/pylith/topology/FieldQuery.cc:316
[0]PETSC ERROR: #2 DMProjectPoint_Func_Private() at scratch/build/clang-15.0/cig/pylith-binary-arm64/build/cig/petsc-pylith/src/dm/impls/plex/plexproject.c:128
[0]PETSC ERROR: #3 DMProjectPoint_Private() at scratch/build/clang-15.0/cig/pylith-binary-arm64/build/cig/petsc-pylith/src/dm/impls/plex/plexproject.c:483
[0]PETSC ERROR: #4 DMProjectLocal_Generic_Plex() at scratch/build/clang-15.0/cig/pylith-binary-arm64/build/cig/petsc-pylith/src/dm/impls/plex/plexproject.c:1003
[0]PETSC ERROR: #5 DMProjectFunctionLocal_Plex() at scratch/build/clang-15.0/cig/pylith-binary-arm64/build/cig/petsc-pylith/src/dm/impls/plex/plexproject.c:1034
[0]PETSC ERROR: #6 DMProjectFunctionLocal() at scratch/build/clang-15.0/cig/pylith-binary-arm64/build/cig/petsc-pylith/src/dm/interface/dm.c:8185
[0]PETSC ERROR: #7 void pylith::topology::FieldQuery::queryDB()() at ../../../pylith-4.1.3/libsrc/pylith/topology/FieldQuery.cc:223
Fatal error. Calling MPI_Abort() to abort PyLith application.
Traceback (most recent call last):
  File "tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/apps/PetscApplication.py", line 55, in onComputeNodes
    self.main(*args, **kwds)
  File "tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/apps/PyLithApp.py", line 103, in main
    self.problem.initialize()
  File "tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/Problem.py", line 187, in initialize
    ModuleProblem.initialize(self)
  File "tmp/pylith-4.1.3-macOS-11.0-arm64/lib/python3.10/site-packages/pylith/problems/problems.py", line 165, in initialize
    return _problems.Problem_initialize(self)
RuntimeError: Error detected while in PETSc function.
Abort(-1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
tmp/pylith-4.1.3-macOS-11.0-arm64/bin/nemesis: mpiexec: exit 255
tmp/pylith-4.1.3-macOS-11.0-arm64/bin/pylith: /Users/baagaard/tmp/pylith-4.1.3-macOS-11.0-arm64/bin/nemesis: exit 1

Thank you very much for your reply, and I look forward to your resolution of this issue. Wishing you all the best. I have also been researching this issue recently, because the error keeps showing the following:
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Out of memory. This could be due to allocating
[0]PETSC ERROR: too large an object or bleeding by not properly
[0]PETSC ERROR: destroying unneeded objects.
[0]PETSC ERROR: Memory allocated 0 Memory used by process 100422066176
[0]PETSC ERROR: Try running with -on_error_malloc_dump or -malloc_view for info.
[0]PETSC ERROR: Memory requested 18446744073046507520
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[0]PETSC ERROR: Petsc Development GIT revision: v3.21.3-425-g32e03751087 GIT Date: 2024-07-29 20:12:56 +0000
[0]PETSC ERROR: /public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/mpinemesis with 1 MPI process(es) and PETSC_ARCH on cc1201.para.bscc by t0s000462 Tue Oct 15 19:54:41 2024
[0]PETSC ERROR: Configure options: --prefix=/opt/pylith/dist --with-c2html=0 --with-x=0 --with-clanguage=C --with-mpicompilers=1 --with-shared-libraries=1 --with-64-bit-points=1 --with-large-file-io=1 --download-parmetis=1 --download-metis=1 --download-f2cblaslapack=1 --download-ml --with-fc=0 --with-hwloc=0 --with-ssl=0 --with-x=0 --with-c2html=0 --with-lgrind=0 --with-hdf5=1 --with-zlib=1 --LIBS=-lz --with-debugging=0 --with-fc=0 CFLAGS="-g -O2" CXXFLAGS="-g -O2 -DMPICH_IGNORE_CXX_SEEK" FCFLAGS= CPPFLAGS="-I/opt/pylith/dist/include -I/opt/pylith/dist/include" LDFLAGS="-L/opt/pylith/dist/lib -L/opt/pylith/dist/lib64 -L/opt/pylith/dist/lib" PETSC_DIR=/opt/pylith/build/cig/petsc-pylith PETSC_ARCH=arch-pylith
[0]PETSC ERROR: #1 PetscMallocAlign() at /opt/pylith/build/cig/petsc-pylith/src/sys/memory/mal.c:53
[0]PETSC ERROR: #2 PetscShmgetAllocateArray() at /opt/pylith/build/cig/petsc-pylith/src/sys/utils/server.c:265
[0]PETSC ERROR: #3 MatLUFactorSymbolic_SeqAIJ() at /opt/pylith/build/cig/petsc-pylith/src/mat/impls/aij/seq/aijfact.c:379
[0]PETSC ERROR: #4 MatLUFactorSymbolic() at /opt/pylith/build/cig/petsc-pylith/src/mat/interface/matrix.c:3252
[0]PETSC ERROR: #5 PCSetUp_LU() at /opt/pylith/build/cig/petsc-pylith/src/ksp/pc/impls/factor/lu/lu.c:87
[0]PETSC ERROR: #6 PCSetUp() at /opt/pylith/build/cig/petsc-pylith/src/ksp/pc/interface/precon.c:1077
[0]PETSC ERROR: #7 KSPSetUp() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/interface/itfunc.c:415
[0]PETSC ERROR: #8 KSPSolve_Private() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/interface/itfunc.c:826
[0]PETSC ERROR: #9 KSPSolve() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/interface/itfunc.c:1075
[0]PETSC ERROR: #10 PCApply_FieldSplit() at /opt/pylith/build/cig/petsc-pylith/src/ksp/pc/impls/fieldsplit/fieldsplit.c:1501
[0]PETSC ERROR: #11 PCApply() at /opt/pylith/build/cig/petsc-pylith/src/ksp/pc/interface/precon.c:495
[0]PETSC ERROR: #12 KSP_PCApply() at /opt/pylith/build/cig/petsc-pylith/include/petsc/private/kspimpl.h:411
[0]PETSC ERROR: #13 KSPInitialResidual() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/interface/itres.c:64
[0]PETSC ERROR: #14 KSPSolve_GMRES() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/impls/gmres/gmres.c:227
[0]PETSC ERROR: #15 KSPSolve_Private() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/interface/itfunc.c:900
[0]PETSC ERROR: #16 KSPSolve() at /opt/pylith/build/cig/petsc-pylith/src/ksp/ksp/interface/itfunc.c:1075
[0]PETSC ERROR: #17 SNESSolve_NEWTONLS() at /opt/pylith/build/cig/petsc-pylith/src/snes/impls/ls/ls.c:220
[0]PETSC ERROR: #18 SNESSolve() at /opt/pylith/build/cig/petsc-pylith/src/snes/interface/snes.c:4855
[0]PETSC ERROR: #19 TSTheta_SNESSolve() at /opt/pylith/build/cig/petsc-pylith/src/ts/impls/implicit/theta/theta.c:174
[0]PETSC ERROR: #20 TSStep_Theta() at /opt/pylith/build/cig/petsc-pylith/src/ts/impls/implicit/theta/theta.c:225
[0]PETSC ERROR: #21 TSStep() at /opt/pylith/build/cig/petsc-pylith/src/ts/interface/ts.c:3403
[0]PETSC ERROR: #22 TSSolve() at /opt/pylith/build/cig/petsc-pylith/src/ts/interface/ts.c:4049
[0]PETSC ERROR: #23 void pylith::problems::TimeDependent::solve()() at ../../../pylith-4.1.3/libsrc/pylith/problems/TimeDependent.cc:487
Fatal error. Calling MPI_Abort() to abort PyLith application.
Traceback (most recent call last):
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/apps/PetscApplication.py", line 55, in onComputeNodes
    self.main(*args, **kwds)
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/apps/PyLithApp.py", line 114, in main
    self.problem.run(self)
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/problems/TimeDependent.py", line 134, in run
    ModuleTimeDependent.solve(self)
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/problems/problems.py", line 217, in solve
    return _problems.TimeDependent_solve(self)
RuntimeError: Error detected while in PETSc function.
Abort(-1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/nemesis: mpiexec: exit 255
/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/pylith: /public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/nemesis: exit 1
So, I am wondering if the issue might be insufficient memory on a single process during the computation. Of course, I am not entirely sure, and I am still trying different solutions. Thank you again for your help!

What I want to achieve is to model pressure changes caused by variations in reservoir water storage. Therefore, the changes are not simply linear. I also want to apply pressure conditions to specific points. Is it possible to implement conditions similar to the following?
#SPATIAL.ascii 1
SimpleDB {
  num-values = 1 # Number of values (e.g., initial amplitude)
  value-names = initial_amplitude # Name of the value
  value-units = Pa # Units of the value
  num-locs = 5 # Number of spatial locations
  data-dim = 2 # Data dimension (values + time)
  space-dim = 3 # Spatial dimension (x, y, z)
  cs-data = cartesian {
    to-meters = 1 # Conversion factor
    space-dim = 3 # Spatial dimension
  }
}

Data format:

x_position y_position z_position initial_amplitude time

-120000.000000 -165000.000000 0.000000 1.000000 1.0 # Initial state at time t=1
-130000.000000 -170000.000000 0.000000 1.000000 1.0 # State at time t=1
-140000.000000 -175000.000000 0.000000 1.000000 1.0 # State at time t=1
-150000.000000 -180000.000000 0.000000 1.000000 1.0 # State at time t=1
-120000.000000 -165000.000000 0.000000 1.000000 2.0 # Initial state at time t=2
-130000.000000 -170000.000000 0.000000 1.000000 2.0 # State at time t=2
-140000.000000 -175000.000000 0.000000 1.000000 2.0 # State at time t=2
-150000.000000 -180000.000000 0.000000 1.000000 2.0 # State at time t=2

There are a few likely explanations for the segmentation fault and error message. You may be running out of memory, as poroelastic problems involve quite a few fields and material properties. Alternatively, there may be a bad memory access or a bad size used when allocating memory.

I recommend creating a much coarser mesh to make debugging easier. If you create one, please send it, as it will speed up my investigation as well.

In equation 161, the term f2(x) a(t - t2(x)) is a user-defined temporal variation. a(t) is a time series that applies to all points (same shape), with spatial variations in the starting time defined by t2(x) and spatial variations in the amplitude defined by f2(x).
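As an illustration of the time series a(t), here is a minimal sketch of a time history file; the filename, units, and values are placeholders. With the amplitude f2(x) set to 500 Pa in the boundary condition spatial database, these values give a pressure of 500 Pa in the first year and 600 Pa in the second year:

#TIME HISTORY ascii
TimeHistory {
  num-points = 3
  time-units = year
}
0.0  1.0
1.0  1.2
2.0  1.4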

We have plans to implement a more general formulation for spatial and temporal boundary conditions, but there are a number of higher priority items ahead of it in our development plan.

This file should be available
data.zip (1.4 MB)

So, at present, the best option is to divide the injection data into several stages of linear change and then run separate pressure-change simulations, right? In that case, we would not be able to capture the coupling between the pressure changes in the first and second stages (including changes in pore pressure, surface deformation, etc.) and could only superimpose the two parts.

Your smaller problem ran for me without errors.

As for the temporal and spatial variations in the Dirichlet boundary condition: does the amplitude vary at a given point between stages? Does the timing of each stage vary spatially? If the answer to both questions is "no", then you can use the user-defined time series a(t) to define the "shape" and a spatially varying amplitude f2(x).
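If I understand the component names correctly, wiring a user-defined time history into a Dirichlet boundary condition looks something like the following sketch; the boundary name, descriptions, and filenames are placeholders:

[pylithapp.problem.bc.bc_surface]
field = pressure
constrained_dof = [0]
label = boundary_surface
use_time_history = True
db_auxiliary_field = spatialdata.spatialdb.SimpleDB
db_auxiliary_field.description = Amplitude f2(x) and start time t2(x) for surface pressure
db_auxiliary_field.iohandler.filename = pressure_amplitude.spatialdb
time_history = spatialdata.spatialdb.TimeHistory
time_history.description = Time series a(t) for surface pressure
time_history.filename = pressure.timedb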

Yes, sir, the coarser mesh model runs fine, but the refined mesh model still reports errors, so I am not sure what is causing it. Also, I looked into the example you mentioned yesterday, examples/box-3d/step05, but I noticed that the pressure applied in that example is actually in three directions. As you know, for a reservoir area the diffusion of water is not just downward due to vertical pressure; at a certain depth near the reservoir boundary, there is also pressure diffusing outward along the vertical boundary, which leads to leftward and rightward diffusion as well. Therefore, I would like to apply pressure in the +x and -x directions on points on a certain face. However, examples/box-3d/step05 only seems to use something like db_auxiliary_field.data = [2*MPa, 2*MPa, 2*MPa, 1.0*year, 0.1*MPa/year, 0.1*MPa/year, 0.1*MPa/year]. How should I do that? Or does the isotropic permeability setting in your program mean that horizontal diffusion also exists at the surface, so there is no need to apply lateral pressure?

Sir, I followed your suggestion and set the parameters time_history_amplitude_tangential_1, time_history_amplitude_tangential_2, time_history_amplitude_normal, and time_history_start_time, and I provided the corresponding spatial parameter file. However, I keep getting the error:
Fatal error. Calling MPI_Abort() to abort PyLith application.
Traceback (most recent call last):
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/apps/PetscApplication.py", line 55, in onComputeNodes
    self.main(*args, **kwds)
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/apps/PyLithApp.py", line 103, in main
    self.problem.initialize()
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/problems/Problem.py", line 187, in initialize
    ModuleProblem.initialize(self)
  File "/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/lib/python3.10/site-packages/pylith/problems/problems.py", line 165, in initialize
    return _problems.Problem_initialize(self)
RuntimeError: Could not find value 'time_history_amplitude' in spatial database 'Neumann BC shangyou edge'. Available values are:
  time_history_amplitude_tangential_1
  time_history_amplitude_tangential_2
  time_history_amplitude_normal
  time_history_start_time
Abort(-1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
[server]: PMIU_parse_keyvals: unexpected key delimiter at character 48 in cmd
/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/nemesis: mpirun: exit 255
/public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/pylith: /public1/home/t0s000462/pylith-4.1.3-linux-x86_64/bin/nemesis: exit 1

I have tried many times but have not been able to solve this issue. What could be the cause? Also, are time_history_amplitude_normal and the other two amplitude columns initial values, and do the values in the spatial file represent the changes over time in the three directions?
pylithapp.cfg (12.2 KB)
spatial.txt (610.2 KB)

I also want to use actual surface displacement observations to invert for the poroelastic model parameters. Do you know whether PyLith 4.1.3 supports this kind of research? I think it might involve computing the Green's function response relating the poroelastic model to surface displacement.

Fluid pressure is a scalar field, so a Dirichlet boundary condition on pressure can only specify a scalar value. A Neumann boundary condition in elasticity specifies a traction, so it is a vector field. The strong form of the equations given in equation 143 shows the boundary conditions supported in the poroelasticity formulation.
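To make the scalar case concrete: if the time history boundary condition is applied to the pressure field (Dirichlet), the spatial database supplies a single amplitude instead of the three traction components. A minimal sketch with placeholder values:

#SPATIAL.ascii 1
SimpleDB {
  num-values = 2
  value-names = time_history_amplitude time_history_start_time
  value-units = Pa year
  num-locs = 1
  data-dim = 0
  space-dim = 3
  cs-data = cartesian {
    to-meters = 1.0
    space-dim = 3
  }
}
0.0  0.0  0.0  5.0e+5  0.0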

PyLith has a special Green’s function problem type for computing Green’s functions for static fault slip impulses, because this is a common use case. You can setup your own suite of simulations to compute Green’s functions for other cases.

Okay, sir, thank you very much for your reply. I would like to know whether there are any issues with my usage of use_time_history and the related component parameters in my previous file, as well as with the format of impulse.timedb. I only found one example, in examples/barwaves-2d, so I want to confirm the usage format for use_time_history, time_history_start_time, time_history_amplitude_tangential_1, time_history_amplitude_tangential_2, and time_history_amplitude_normal in the spatial files.

An update on our investigation of the segmentation fault with the large mesh and the other issues.

  1. The default preconditioner for the displacement field for poroelasticity is LU. This uses a lot of memory and is slow for large problems, which likely explains why you are running out of memory and the generally poor performance when running in serial. Using the default settings for running in parallel (--problem.petsc_defaults.parallel) will use the ML algebraic multigrid preconditioner; see the command sketch after this list. It uses less memory, but we are finding it slower than expected for your simulation (see #2).
  2. We are looking at the performance of the solver for poroelasticity. For your simulation with the coarse mesh, the portion of the solve for the displacement field seems much slower than we would expect. I am going to investigate this to see if I can determine why. We also think we can improve the performance of the solver for poroelasticity, and we are going to try a few things (this is all done by changing the PETSc options).
  3. I found a couple of cases where we use a user time history, but both are for fault slip. I will create a test case in which we use it for a boundary condition to verify the implementation. Once I have a verified test case, then I can point you to a specific configuration that you can use as a guide.
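For reference, a sketch of invoking those parallel defaults from the command line; the process count is a placeholder, and the boolean syntax is my assumption about how the flag is passed:

pylith step01_no_faults_no_flexure.cfg --nodes=4 --problem.petsc_defaults.parallel=True

Here --nodes sets the number of MPI processes.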

Thank you very much for your help! I sincerely wish you success in resolving the issues. Thanks again!

One of the first things I found in looking at your problem setup in detail is that your mesh has quite a few highly distorted cells. Running the MeshQuality filter in ParaView shows that the maximum condition number is 15. One of the first steps you should take is to regenerate your mesh (it looks like you are using CUBIT) and use condition number smoothing to get the maximum condition number down to around 2 or 3.
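In a CUBIT journal file, condition number smoothing looks something like the following sketch; the beta value and the volumes it applies to are placeholders to adjust for your mesh:

# Apply condition number smoothing to improve cell quality.
volume all smooth scheme condition number beta 2.0 cpu 10
smooth volume all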

Matt Knepley and I have worked on finding better solver settings for medium to large poroelasticity simulations. The following settings provide reasonable performance. We think we can improve performance a bit more by tweaking the settings for the displacement block. We will be updating the PyLith documentation and share/settings to include these.

It is important to make sure that the scales for nondimensionalization give values near 1 for the ratio of the permeability to the viscosity. This has a significant effect on the conditioning of the solve for the pressure field block.
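For example, a minimal sketch of setting the nondimensionalization scales; the values are placeholders that you would adjust so the nondimensional permeability/viscosity ratio is near 1 (these particular scales reproduce the ones shown in the log above):

[pylithapp.problem]
normalizer = spatialdata.units.NondimElasticQuasistatic
normalizer.length_scale = 1.0*km
normalizer.relaxation_time = 0.1*year
normalizer.shear_modulus = 10.0*GPa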

Poroelasticity without state variables

[pylithapp.petsc]
# Multiplicative field split: solve for the solution fields one after another.
# With the default poroelasticity solution field order (displacement=0,
# pressure=1, trace_strain=2), split 0 is trace_strain, split 1 is pressure,
# and split 2 is displacement.
pc_type = fieldsplit
pc_fieldsplit_type = multiplicative
pc_fieldsplit_0_fields = 2
pc_fieldsplit_1_fields = 1
pc_fieldsplit_2_fields = 0


fieldsplit_displacement_ksp_type = gmres
fieldsplit_displacement_pc_type = ml
fieldsplit_displacement_ksp_rtol = 1.0e-10
fieldsplit_displacement_ksp_atol = 1.0e-10

fieldsplit_pressure_ksp_type = gmres
fieldsplit_pressure_pc_type = ml
fieldsplit_pressure_ksp_rtol = 1.0e-10
fieldsplit_pressure_ksp_atol = 1.0e-10

fieldsplit_trace_strain_ksp_type = preonly
fieldsplit_trace_strain_pc_type = bjacobi

Poroelasticity with state variables

[pylithapp.petsc]
pc_type = fieldsplit
pc_fieldsplit_type = multiplicative
pc_fieldsplit_0_fields = 2
pc_fieldsplit_1_fields = 1
pc_fieldsplit_2_fields = 0


fieldsplit_displacement_ksp_type = gmres
fieldsplit_displacement_pc_type = ml
fieldsplit_displacement_ksp_rtol = 1.0e-10
fieldsplit_displacement_ksp_atol = 1.0e-10

fieldsplit_pressure_ksp_type = gmres
fieldsplit_pressure_pc_type = ml
fieldsplit_pressure_ksp_rtol = 1.0e-10
fieldsplit_pressure_ksp_atol = 1.0e-10

fieldsplit_trace_strain_ksp_type = preonly
fieldsplit_trace_strain_pc_type = bjacobi

fieldsplit_velocity_ksp_type = preonly
fieldsplit_velocity_pc_type = bjacobi

fieldsplit_pressure_t_ksp_type = preonly
fieldsplit_pressure_t_pc_type = bjacobi

fieldsplit_trace_strain_t_ksp_type = preonly
fieldsplit_trace_strain_t_pc_type = bjacobi

I would like to express my heartfelt gratitude to you and Matt Knepley for your dedicated efforts. I am currently working on optimizing my model, and it is truly an honor to receive the support and guidance of the developers. Your expertise has not only helped me resolve practical issues but also provided invaluable insights. I sincerely wish you both a pleasant weekend, and I look forward to your upcoming examples on time history analysis. Once again, thank you both for your guidance and support!