Cannot increase fastscape_eroding_box resolution

Dear experts.

I modified the fastscape_eroding_box.prm cookbook a bit to increase the mesh resolution, but the code crashed after Timestep 4. I have uploaded the modified parameter file; any help on how to solve this is appreciated.

fastscape_eroding_box_test_resolution.prm (6.5 KB)

Vlad

@vladycm Can you show what specifically the error is that you get?

Best

W.

Here it is:

…

*** Timestep 4: t=200000 years, dt=50000 years

Executing FastScape… 5 timesteps of 10000 years.

  Writing FastScape VTK...

---------------------------------------------------------

TimerOutput objects finalize timed values printed to the

screen by communicating over MPI in their destructors.

Since an exception is currently uncaught, this

synchronization (and subsequent output) will be skipped

to avoid a possible deadlock.

-------------------------------------------------------------------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 0 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 17 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 21 on processing:

--------------------------------------------------------

An error occurred in line <1337> of file </opt/apps/candi/29may25/deal.II-v9.5.2/include/deal.II/lac/solver_cg.h> in function

void dealii::SolverCG<VectorType>::solve(const MatrixType&, VectorType&, const VectorType&, const PreconditionerType&) [with MatrixType = dealii::TrilinosWrappers::SparseMatrix; PreconditionerType = dealii::TrilinosWrappers::PreconditionAMG; VectorType = dealii::TrilinosWrappers::MPI::Vector]

The violated condition was:

solver_state == SolverControl::success

Additional information:

Iterative method reported convergence failure in step 1837395. The

residual in the last step was 1.34222e-05.



This error message can indicate that you have simply not allowed a

sufficiently large number of iterations for your iterative solver to

converge. This often happens when you increase the size of your

problem. In such cases, the last residual will likely still be very

small, and you can make the error go away by increasing the allowed

number of iterations when setting up the SolverControl object that

determines the maximal number of iterations you allow.



The other situation where this error may occur is when your matrix is

not invertible (e.g., your matrix has a null-space), or if you try to

apply the wrong solver to a matrix (e.g., using CG for a matrix that

is not symmetric or not positive definite). In these cases, the

residual in the last iteration is likely going to be large.

Stacktrace:

-----------

#0 /opt/apps/aspect/3.0.0/bin/aspect-release: void dealii::SolverCG<dealii::TrilinosWrappers::MPI::Vector>::solve<dealii::TrilinosWrappers::SparseMatrix, dealii::TrilinosWrappers::PreconditionAMG>(dealii::TrilinosWrappers::SparseMatrix const&, dealii::TrilinosWrappers::MPI::Vector&, dealii::TrilinosWrappers::MPI::Vector const&, dealii::TrilinosWrappers::PreconditionAMG const&)

#1 /opt/apps/aspect/3.0.0/bin/aspect-release: aspect::MeshDeformation::MeshDeformationHandler<3>::compute_mesh_displacements()

#2 /opt/apps/aspect/3.0.0/bin/aspect-release: aspect::MeshDeformation::MeshDeformationHandler<3>::execute()

#3 /opt/apps/aspect/3.0.0/bin/aspect-release: aspect::Simulator<3>::solve_timestep()

#4 /opt/apps/aspect/3.0.0/bin/aspect-release: aspect::Simulator<3>::run()

#5 /opt/apps/aspect/3.0.0/bin/aspect-release: void run_simulator<3>(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool, bool, bool, bool)

#6 /opt/apps/aspect/3.0.0/bin/aspect-release: main

--------------------------------------------------------

Aborting!

----------------------------------------------------

--------------------------------------------------------------------------

MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD

with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.

You may or may not see output from other processes, depending on

exactly when Open MPI kills them.

--------------------------------------------------------------------------

In: PMI_Abort(1, N/A)

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 8 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 5 on processing:

srun: Job step aborted: Waiting up to 32 seconds for job step to finish.

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 32 on processing:

slurmstepd: error: *** STEP 24799.0 ON node04 CANCELLED AT 2025-09-24T07:30:32 ***

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 4 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 1 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 2 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 29 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 33 on processing:

----------------------------------------------------

Exception ‘SolverControl::NoConvergence(it, worker.residual_norm)’ on rank 23 on processing:

srun: error: node04: tasks 0,8: Killed

srun: error: node04: tasks 1-7,9-39: Killed

@vladycm The error is not actually in the FastScape plugin. That just happens to be the last line of text that was printed; the error happens in what is executed after that: the linear solver for the mesh displacement fails, which typically happens if you have too large a deformation and end up with a distorted mesh. There are many other questions on this forum relating to distorted meshes, and you may find solutions in those. Other people here have more knowledge and can perhaps recommend other strategies, but a good first step would be to see whether a smaller time step helps.
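For example, the time step can be limited in the .prm file along these lines (just a sketch; these are the standard global parameters, and the values are purely illustrative, not tuned for this model):

```
# Illustrative values only: force smaller time steps than the CFL criterion alone would give.
set CFL number        = 0.25   # default is 1.0; smaller values mean smaller advective time steps
set Maximum time step = 1e4    # hard cap on the time step, in years if the model uses years
```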

Best

W.

Thanks. I tried lowering the time step by an order of magnitude, but still no luck…

@vladycm I also encountered the same issue. When I increase the grid resolution, an error occurs, but when I use the “aspect-release” optimized mode, the error disappears. That is just my experience, and I hope it helps you.

Best

Yuan

Hi all,

I have never used FastScape before. I agree with @bangerth that the error might be related to excessive deformation leading to a distorted mesh. While reducing the time step helps in some cases, in others it may only delay the problem rather than avoid it completely.

I think another important factor lies in the initial topography in the z-direction, which has a maximum height of 2500 m and a very sharp transition. This height is more than three times the cell length in the x and y directions. Such a configuration could lead to significant element distortion and even negative element volumes under large time steps.

From my previous experience, ASPECT is highly sensitive to the aspect ratio of cells. I am currently unsure how to use deal.II to locally refine the mesh in a specific direction (e.g., near the surface in the z-direction). If @bangerth could provide guidance on this, I would be happy to try implementing such a refinement to see if it resolves the issue.

Best,
Ninghui

@tiannh7 There is currently no way to do “anisotropic mesh refinement” in ASPECT as you suggest. If you want cells that are substantially smaller in one direction than in another, then this aspect ratio already has to be present in the coarse mesh (level 0). I must admit that I don’t know how one would want to choose the coarse mesh for a situation like you describe with large vertical deformations – these cases are simply very difficult to deal with (in ASPECT and in any other numerical software not specifically built for large deformations).
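As a sketch of what building that aspect ratio into the coarse mesh could look like for a box geometry (the extents below are made up for illustration and are not taken from the cookbook):

```
subsection Geometry model
  set Model name = box
  subsection Box
    set X extent      = 100e3   # made-up extents, for illustration only
    set Y extent      = 100e3
    set Z extent      = 20e3
    # With these repetitions the level-0 cells are 20 km x 20 km x 5 km,
    # i.e. already flatter in z than in x and y before any refinement.
    set X repetitions = 5
    set Y repetitions = 5
    set Z repetitions = 4
  end
end
```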

@YUAN Switching from debug to release mode to make an error go away is the same as disconnecting the “check engine” light in your car because you don’t like what it warns you about: It’s true that you don’t have to look at the light any more, but it’s probably not a good sign for the trust you should have in your car’s continued functioning: It may get you to your weekend destination, or perhaps it doesn’t.

Best

W.


@bangerth @tiannh7 Thank you for your corrections and guidance; I learned a lot from this discussion.

Hi.

This is a follow-up regarding the high-resolution FastScape test model: changing the initial topography solved the issue. It looks like high-resolution models are quite sensitive to the initial topography. Avoiding sharp edges solves the problem; my initial topography is now a smooth long-wavelength surface and the code works well.
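For reference, a smooth long-wavelength surface could, for example, also be prescribed directly with the function initial topography model, roughly along these lines (a sketch only; subsection and parameter names as I understand them from the manual, and the amplitude and wavelength are just examples; in my own runs I read the topography from a text file instead):

```
subsection Geometry model
  subsection Initial topography model
    set Model name = function
    subsection Function
      # One broad, gentle bump instead of a sharp-edged plateau;
      # A (amplitude) and L (wavelength) are example values only.
      set Function constants  = A=1000, L=400e3
      set Function expression = A * (1 + cos(2*pi*x/L)) * (1 + cos(2*pi*y/L)) / 4
    end
  end
end
```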

@vladycm

my initial topography is now a smooth long-wavelength surface and the code works well.

Thank you for posting the update and glad to hear things are working well now.

Would you mind posting the changes you made to the parameter file? It would be good to make a note on the test case regarding your findings (you are of course welcome to do this if you would like to make a pull request).

Thanks, John

Dear John.

Sure, here is the prm file I use to test the smooth topography with FastScape (I also added an initial thermal structure):

fastscape_eroding_random_topo_wide_temp.prm (6.9 KB)

I also attached the smooth topography file I use:

long_wavelength_topography_wide.txt (322.4 KB)

I hope you find this useful.

Best.

Vlad

And this is a screen capture of the topography I obtained (vertically exaggerated several times).