From the paper, page 14:

Since the derivative and Laplacian operators are sparse matrices, and since the maximum allowable time step scales with the resolution h and thus with the linear problem size N due to the Courant-Friedrichs-Lewy (CFL) criterion, the overall cost to calculate a solution scales as O(N^2 log^(D−1)(N)), which is significantly more efficient than the O(N^(D+1)) scaling for a full grid. Here D is the number of spatial dimensions. In principle it is possible to employ a sparse space-time discretisation as well, reducing the cost further to O(N log^D(N)). However, this will require a more complex time evolution algorithm instead of the ODE solver we employ, and we thus do not investigate this approach in this paper.
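(For context, my own reading of where those exponents come from, not anything quoted from the paper: the scalings are just point-count times step-count bookkeeping, assuming a sparse grid in D spatial dimensions carries O(N log^(D−1) N) points, the CFL condition forces O(N) time steps, and the work per point is roughly constant up to log factors.)

```math
\begin{aligned}
\text{sparse grid in space, stepped in time:}\quad & O(N \log^{D-1} N)\ \text{points} \times O(N)\ \text{steps} = O(N^{2} \log^{D-1} N),\\
\text{full grid in space, stepped in time:}\quad & O(N^{D})\ \text{points} \times O(N)\ \text{steps} = O(N^{D+1}),\\
\text{sparse grid in space-time, global solve:}\quad & O(N \log^{D} N)\ \text{points, with no separate time stepping.}
\end{aligned}
```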
But the paper doesn't go into what the ODE solvers would need to satisfy to be able to get this optimization. Is there any chance this could be implemented with DifferentialEquations.jl internals?
No, this cannot be done with DifferentialEquations.jl, since it requires using a sparse structure in space-time. The way to do this would be to set up a sparse (D+1)-dimensional grid and solve the hyperbolic equation via a global (in space-time!) solver, supplying the usual initial and boundary conditions.
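To make that concrete, here is a minimal toy sketch (my own illustration, not code from the paper and not a DifferentialEquations.jl API): a 1+1D advection equation discretised implicitly on a full tensor-product space-time grid and solved as one global sparse linear system, with the initial condition entering only through the right-hand side. A real implementation of the idea above would replace the dense space-time grid with a sparse (D+1)-dimensional one, which is exactly the part that has no ODE-solver analogue.

```julia
# Toy sketch (my own, not from the paper): solve u_t + a*u_x = 0 as ONE global
# linear system over the whole space-time grid, instead of stepping in time.
# Upwind in space (periodic), backward Euler in time, full tensor-product grid;
# a real version of the comment above would use a sparse (D+1)-d grid instead.
using LinearAlgebra, SparseArrays

Nx, Nt = 64, 64                 # spatial points, time levels (after t = 0)
a, L, T = 1.0, 1.0, 0.5
dx, dt = L / Nx, T / Nt

# First-order upwind d/dx with periodic wrap-around.
Dx = spdiagm(0 => fill(1.0, Nx), -1 => fill(-1.0, Nx - 1)) / dx
Dx[1, Nx] = -1.0 / dx

# One backward-Euler step written as a block relation  A*u^n + B*u^(n-1) = 0.
A = sparse(I, Nx, Nx) / dt + a * Dx
B = -sparse(I, Nx, Nx) / dt

# Global space-time operator: block lower-bidiagonal, assembled with kron.
M = kron(sparse(I, Nt, Nt), A) + kron(spdiagm(-1 => fill(1.0, Nt - 1)), B)

# The initial condition only appears as data on the right-hand side.
x   = dx .* (0:Nx-1)
u0  = exp.(-100 .* (x .- 0.5) .^ 2)
rhs = zeros(Nx * Nt)
rhs[1:Nx] .= u0 ./ dt

# A single global solve yields the solution at every time level at once.
u = reshape(M \ rhs, Nx, Nt)    # column n is the solution at t = n*dt
```

The point of the sketch is structural: there is no time loop and no integrator to hook into, so the sparse space-time variant would be a different kind of solver rather than a configuration of an existing ODE package.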