Questions Regarding the Computation of Green's Functions in PyLith

When PyLith gets “stuck,” it is usually because the job is running in parallel and one process aborts while the others hang. Are you running in parallel? If so, can you run in serial for one time step (see the sketch below)? That might allow you to see the actual error. Have you also tried the same geometry with a coarser mesh?
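A minimal sketch of how to do this, assuming a quasistatic simulation using the default implicit formulation under pylithapp.timedependent (the file name mysim.cfg is a placeholder for your own parameter file). Setting the total time to zero limits the run to a single time step:

[pylithapp.timedependent.formulation.time_step]
# Run only one time step so any error surfaces quickly.
total_time = 0.0*s

Then run in serial by omitting the --nodes argument, which defaults to 1:

pylith mysim.cfg

rather than, for example, pylith mysim.cfg --nodes=4.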

I have resolved the previous configuration issue, but now, when PyLith writes the output files, I encounter the following error:

Fatal error. Calling MPI_Abort() to abort PyLith application.
Traceback (most recent call last):
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/apps/PetscApplication.py", line 74, in onComputeNodes
    self.main(*args, **kwds)
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/apps/PyLithApp.py", line 143, in main
    self.problem.finalize()
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/problems/TimeDependent.py", line 224, in finalize
    self.formulation.finalize()
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/problems/Implicit.py", line 278, in finalize
    Formulation.finalize(self)
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/problems/Formulation.py", line 280, in finalize
    output.close()
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/meshio/OutputManager.py", line 202, in close
    self._close()
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/meshio/OutputManager.py", line 464, in _close
    self.writer.close()
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/meshio/DataWriterHDF5.py", line 83, in close
    xdmf.write(ModuleDataWriterHDF5.hdf5Filename(self), verbose=False)
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/pylith/meshio/Xdmf.py", line 75, in write
    self.h5 = h5py.File(filenameH5, "r")
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/h5py-2.9.0-py2.7-linux-x86_64.egg/h5py/_hl/files.py", line 394, in __init__
    swmr=swmr)
  File "/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/lib/python2.7/site-packages/h5py-2.9.0-py2.7-linux-x86_64.egg/h5py/_hl/files.py", line 170, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 85, in h5py.h5f.open
IOError: Unable to open file (truncated file: eof = 3756584, sblock->base_addr = 0, stored_eof = 4240840)
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/bin/nemesis: mpirun: exit 255
/home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/bin/pylith: /home/zhengyong/users/xiaoyang/xiaoy/xiaoy/Pylith/pylith-2.2.2-linux-x86_64/bin/nemesis: exit 1

I have searched the community forum for a solution but could not find an answer. What is causing this error?

The backtrace shows that PyLith is trying to generate the Xdmf metadata file from the HDF5 file, and h5py is reporting that the HDF5 file is truncated (the end of the file is at byte 3756584, but the stored metadata expects 4240840). In other words, the file is corrupt. An HDF5 file can become corrupt if multiple processes write to it and it is not closed properly, if one process aborts before a write is complete, or if you run out of disk space. If you are running in parallel, then I recommend using DataWriterHDF5Ext instead of DataWriterHDF5; it writes the raw datasets to external binary files using MPI I/O and puts only the metadata in the HDF5 file. It produces many more files, but it is more robust. A configuration sketch is below.
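A minimal sketch of switching an output writer to DataWriterHDF5Ext, assuming the formulation's default output facility (named output) is being used; adjust the section name and the placeholder file name output/mysim.h5 to match your own setup. The same writer setting can be applied to material, fault, and boundary output facilities:

[pylithapp.timedependent.formulation.output.output]
# Write raw datasets to external binary files via MPI I/O;
# the .h5 file then contains only metadata referencing them.
writer = pylith.meshio.DataWriterHDF5Ext
writer.filename = output/mysim.h5

The resulting .h5 file can still be opened in ParaView or VisIt through the generated Xdmf file, just like output from DataWriterHDF5; the external binary data files sit alongside it in the output directory.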

Thank you very much for your help; I have resolved the issue! Wishing you all the best.