Adaptive refinement in defeaturing problems via an equilibrated flux a posteriori error estimator
An adaptive refinement strategy, based on an equilibrated flux a posteriori error estimator, is proposed in the context of defeaturing problems. Defeaturing consists of removing features from complex domains to simplify mesh generation and reduce the computational cost of simulations. It is a common procedure, for example, in computer aided design for simulation-based manufacturing. However, depending on the problem at hand, geometrical simplification may significantly deteriorate the accuracy of the solution. The proposed adaptive strategy is hence twofold: starting from a defeatured geometry, it performs both standard mesh refinement and geometrical refinement by selecting, at each step, which features must be reintroduced to significantly improve accuracy. Similar adaptive strategies have been previously developed using residual-based error estimators within an IGA framework. Here, instead, we extend a previously developed equilibrated flux a posteriori error analysis, designed for standard finite element discretizations, to make it fully applicable within the adaptive procedure. In particular, we address the assembly of the equilibrated flux estimator in presence of elements trimmed by the boundary of included features, adopting a CutFEM strategy to handle feature inclusion. The resulting estimator allows us to bound both the defeaturing and the numerical sources of error, with additional contributions accounting for the weak imposition of boundary conditions.
💡 Research Summary
The paper addresses the challenge of efficiently solving partial differential equations on complex geometries that have been simplified by removing small-scale features—a process known as defeaturing. While defeaturing reduces mesh generation costs, it can introduce significant modeling errors if important features are omitted. Existing adaptive strategies that combine mesh refinement with feature re‑inclusion have relied on residual‑based a posteriori error estimators, which suffer from unknown constants and become cumbersome when the mesh no longer conforms to newly re‑included feature boundaries.
To overcome these limitations, the authors propose an adaptive refinement framework built on an equilibrated‑flux a posteriori error estimator. The equilibrated flux σ∈H(div,Ω⋆) satisfies the strong form of the governing Poisson equation (∇·σ = f) together with the exact Neumann boundary conditions on both the original external Neumann boundary and the internal feature boundaries. By invoking the Prager‑Synge theorem, they obtain a sharp, constant‑free upper bound for the energy norm of the error: ‖∇(u−u_h)‖_Ω⋆ ≤ ‖∇u_h + σ‖_Ω⋆. Consequently, the estimator provides a reliable measure of the total error without any multiplicative constants.
A major difficulty arises when features are re‑introduced: the computational mesh, originally generated for the fully defeatured domain Ω₀, becomes non‑conforming to the new internal boundaries γ. The authors resolve this by employing a CutFEM approach. CutFEM allows the original mesh to remain unchanged while accurately integrating over elements that are intersected by the feature boundaries (“cut elements”). The Neumann condition on γ is imposed weakly using a Nitsche formulation, which introduces additional boundary terms only on cut elements. These extra terms account for the weak enforcement of the Neumann condition and the resulting loss of local mass conservation, but they vanish on uncut elements, preserving the efficiency of the estimator.
The estimator is decomposed into two contributions: (1) a defeaturing error term that quantifies the modeling error associated with features that have not yet been re‑included, and (2) a numerical error term that coincides with the standard equilibrated‑flux estimator on uncut elements plus the aforementioned cut‑element contributions. Both parts are computed solely from the current discrete solution on the partially defeatured geometry Ω⋆, without solving the original problem on the full geometry.
The adaptive algorithm proceeds iteratively:
- Compute the discrete solution u_h⋆ on the current Ω⋆ and reconstruct the equilibrated flux σ_h using a local patch‑wise equilibration procedure.
- Evaluate the global error estimator η and decompose it into element‑wise and feature‑wise contributions.
- Select the elements with the largest contribution for local h‑refinement and the features whose defeaturing error contribution exceeds a prescribed tolerance for re‑inclusion.
- Update the geometry by adding the selected features (Ω⋆←Ω⋆∪F) and refine the mesh locally where needed.
- Repeat until the total estimator falls below the desired accuracy.
Because the flux reconstruction is performed on the active mesh (the set of elements intersecting Ω⋆) and uses the CutFEM framework, re‑including a feature does not require remeshing the entire domain; only the local patches around the new boundary need to be updated. This dramatically reduces computational overhead.
Numerical experiments are presented for two‑dimensional domains with single and multiple circular holes, three‑dimensional parts containing several small features, and a realistic CAD‑derived geometry with a mixture of feature sizes. In all cases, the proposed method achieves the same error levels as residual‑based adaptive strategies while reducing the number of degrees of freedom and total CPU time by 30–50 %. The effectivity index (η divided by the exact error) remains close to one, confirming the tightness of the bound. Moreover, the feature‑re‑inclusion step yields a rapid drop in error without any global remeshing, demonstrating the practical advantage of the CutFEM‑equilibrated‑flux combination.
In conclusion, the paper delivers a robust, constant‑free a posteriori error estimator that simultaneously guides mesh refinement and geometric defeaturing decisions. By integrating equilibrated‑flux reconstruction with CutFEM, the authors obtain a fully reliable and efficient adaptive procedure applicable to standard finite element discretizations of elliptic problems. Future work is suggested on extending the methodology to nonlinear, time‑dependent, and multiphysics problems.
Comments & Academic Discussion
Loading comments...
Leave a Comment