Simulation Governance — An Organizational CAE Quality Management Framework

Category: V&V / Best Practices | Updated 2026-04-12
[Figure: Simulation governance framework with analyst competency matrix, review gates, and SPDM data flow. An overview of the three-pillar framework uniting competency management, review gates, and data management.]

Theory and Background

What is Governance?

🧑‍🎓

Professor, is "simulation governance" about how an organization manages CAE?

🎓

Exactly. In a nutshell, it's the transformation from CAE dependent on individual craftsmanship to CAE with organizationally managed quality. NAFEMS published a document titled "Simulation Governance & Quality Management" in 2019, systematizing its framework.

🧑‍🎓

Specifically, what kind of things are decided?

🎓

There are three main pillars.

  • Who does it — Classifying analysts' capabilities (competency) into levels and defining the types of analysis permitted at each level.
  • How to check — Setting up review gates at key points in the analysis process to verify quality step by step.
  • How to record — Centrally managing models, input data, results, and review records to ensure traceability.

For example, in an automotive company, it means defining as a system, not as a personal custom, that only experienced engineers can be in charge of crash safety analysis.

🧑‍🎓

I see. So it's about turning what used to be like "it's okay because that person said so" into a system.

🎓

Exactly. Personal judgment ceases to function the moment that person transfers or leaves the company. The essence of governance is creating a "system that maintains quality even when people change."

NAFEMS Governance Framework

🧑‍🎓

Can you tell me the overall picture of NAFEMS's framework?

🎓

NAFEMS's Simulation Governance framework has the following 5-layer structure.

| Layer | Name | Content |
| --- | --- | --- |
| Layer 1 | Policy | Basic policies approved by management. Declares the conditions for using CAE results in design decisions. |
| Layer 2 | Process | Definition of analysis workflows, review gates, and approval authorities. |
| Layer 3 | Procedure | Specific work procedures, tools used, and settings for each analysis type. |
| Layer 4 | Competency | Definition of analyst skill levels and training programs. |
| Layer 5 | Data Management (SPDM) | Version control, access control, and storage rules for models/results. |

The higher layers focus more on "why to do it," and the lower layers focus more on "how to do it." Governance only functions when all five layers are in place.

🧑‍🎓

It's a hierarchical structure like ISO 9001 for quality management.

🎓

Good point. In fact, NAFEMS's framework is designed to align with ISO 9001 and AS9100 (aerospace quality). It's like adding "simulation-specific requirements" to an existing quality management system.

Analysis Risk Classification

🧑‍🎓

Applying the same level of management to all analyses seems tough. Are there any clever approaches?

🎓

Exactly. NAFEMS recommends a risk-based approach, classifying analysis risk along two axes.

$$ \text{Analysis Risk} = f(\text{Consequence},\; \text{Uncertainty}) $$
  • Consequence (Impact of Results): How much would safety, cost, or schedule be affected if the analysis results were wrong?
  • Uncertainty: How much uncertainty exists in the analysis model itself (new materials, complex physics, lack of experimental data, etc.)
|  | Low Uncertainty | Medium Uncertainty | High Uncertainty |
| --- | --- | --- | --- |
| High Consequence | Medium Risk | High Risk | Highest Risk |
| Medium Consequence | Low Risk | Medium Risk | High Risk |
| Low Consequence | Lowest Risk | Low Risk | Medium Risk |

For example, "strength confirmation for a minor shape change of an existing part" is classified as low risk, while "failure analysis of a new composite aircraft wing" is classified as highest risk. The strictness of review, required analyst level, and number of verification items are varied according to the risk level.
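
The risk matrix above can be sketched as a simple lookup. This is an illustrative implementation, not an official NAFEMS API; the function and level names are my own.

```python
# Risk classification from the consequence/uncertainty matrix above.
# Keys are (consequence, uncertainty); values are the risk class.
RISK_MATRIX = {
    ("high", "low"): "medium",    ("high", "medium"): "high",   ("high", "high"): "highest",
    ("medium", "low"): "low",     ("medium", "medium"): "medium", ("medium", "high"): "high",
    ("low", "low"): "lowest",     ("low", "medium"): "low",     ("low", "high"): "medium",
}

def classify_risk(consequence: str, uncertainty: str) -> str:
    """Return the risk class for a (consequence, uncertainty) pair."""
    return RISK_MATRIX[(consequence, uncertainty)]

# Strength check on a minor shape change of an existing part:
# well-understood physics, limited impact if wrong.
print(classify_risk("low", "low"))    # -> lowest
# Failure analysis of a new composite aircraft wing.
print(classify_risk("high", "high"))  # -> highest
```

In practice the two inputs would themselves come from checklists (e.g. safety relevance for consequence, availability of test data for uncertainty), so the matrix stays auditable rather than being a gut call.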

🧑‍🎓

If you applied the highest level of management to everything, work wouldn't get done, but it's also scary to have a new hire do life-critical analysis without checks. So it's about setting priorities.

🎓

Exactly. In practice, an efficient pattern is "80% of analyses are handled with a simplified process for low-medium risk, and resources are concentrated on the remaining 20% of high-risk analyses."

Simulation Maturity Model

🧑‍🎓

Is there an indicator for an organization's CAE "maturity"?

🎓

Yes. NAFEMS defines simulation maturity in 5 stages. This applies the concept of CMMI (Capability Maturity Model Integration).

| Level | Name | Characteristics |
| --- | --- | --- |
| Level 1 | Initial | Individual-dependent, ad hoc. Result quality depends entirely on the individual analyst's capability. |
| Level 2 | Managed | Processes are defined on a project basis, but there is no cross-department standardization. |
| Level 3 | Defined | Organizational standard processes are documented and applied across all departments. |
| Level 4 | Quantitatively Managed | Process performance is measured and managed with KPIs. Data-driven improvement cycle. |
| Level 5 | Optimizing | Continuous improvement is established as a culture. Technological innovation is incorporated autonomously. |
🧑‍🎓

Honestly, I feel like many Japanese manufacturers are still at Level 1-2...

🎓

That's true in reality. According to one survey, about 60% of global manufacturers remain at Level 1-2. However, many companies in aerospace (Airbus, Boeing, etc.) and among automotive OEMs (BMW, Toyota, etc.) have achieved Level 3 or higher. The important thing is not to aim for Level 5 in one leap: the transition from Level 2 to Level 3 has the greatest impact.

Simulation maturity is determined by the minimum value among all elements of the analysis process.

$$ \text{Maturity}_{\text{org}} = \min\!\bigl(L_{\text{process}},\; L_{\text{competency}},\; L_{\text{data}},\; L_{\text{tools}},\; L_{\text{V\&V}}\bigr) $$

In other words, if any one element is weak, the overall maturity is dragged down to that level.
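
The "weakest link" rule above is trivial to express in code. A minimal sketch, with element names and example levels of my own choosing:

```python
# Organizational maturity is the minimum level across all elements,
# per the min(...) formula above.
def org_maturity(levels: dict[str, int]) -> int:
    """Return the organization's maturity: the weakest element sets it."""
    return min(levels.values())

levels = {
    "process": 4,
    "competency": 3,
    "data": 2,    # SPDM still managed per project
    "tools": 4,
    "vv": 3,
}
print(org_maturity(levels))  # -> 2: data management drags the whole org down
```

This is why maturity assessments are done per element: raising the strongest pillar further changes nothing, while fixing the weakest one moves the organization immediately.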

Competency Matrix and Review Gates

Analyst Competency Level Definition

🧑‍🎓

Is a "competency matrix" like ranking analysts? That makes me a bit nervous...

🎓

It's more of a tool to visualize "what level of analysis can be entrusted to this person" rather than ranking. Based on NAFEMS's Professional Simulation Engineer (PSE) system, many companies define 4 competency levels.

| Level | Name | Analyses That Can Be Performed | Required Experience |
| --- | --- | --- | --- |
| Level 1 | Assistant | Execution of analysis based on standard templates. Initial verification of results. | Completed basic training. Less than 1 year of practical experience. |
| Level 2 | Practitioner | Standard linear analysis; nonlinear analysis based on established procedures. | 1-3 years of practical experience. Independent work under a mentor. |
| Level 3 | Specialist | Complex nonlinear analysis, multiphysics, application of new modeling methods. | 3-7 years of practical experience. V&V track record. Ability to review others' work. |
| Level 4 | Expert | Safety-critical analysis, method development/validation, governance system design. | 7+ years of practical experience. External presentations/papers. Industry recognition. |
🧑‍🎓

So a new hire wouldn't do automotive crash analysis alone, right?

🎓

Exactly. For high-risk analysis like crash safety, a typical combination is that a Level 3 or higher analyst is in charge, and a Level 4 expert reviews it. Conversely, for a simple strength check of an existing part, it's sufficient for a Level 1 person to use a template and a Level 2 person to check it.
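
Combining the risk classes with the competency levels gives a simple assignment rule. The thresholds below are an illustrative reading of the dialogue above (Level 3+ analyst with Level 4 reviewer for high risk, Level 1 with a Level 2 check for simple work), not an official NAFEMS table:

```python
# Assumed minimum analyst/reviewer levels per risk class (illustrative).
ASSIGNMENT_RULES = {
    "lowest":  {"analyst": 1, "reviewer": 2},
    "low":     {"analyst": 1, "reviewer": 2},
    "medium":  {"analyst": 2, "reviewer": 3},
    "high":    {"analyst": 3, "reviewer": 4},
    "highest": {"analyst": 3, "reviewer": 4},
}

def can_assign(risk: str, analyst_level: int, reviewer_level: int) -> bool:
    """Check whether an analyst/reviewer pairing meets the risk class minimums."""
    rule = ASSIGNMENT_RULES[risk]
    return analyst_level >= rule["analyst"] and reviewer_level >= rule["reviewer"]

# Crash safety (high risk): Level 3 analyst + Level 4 reviewer is acceptable.
print(can_assign("high", 3, 4))  # -> True
# A Level 1 new hire with a Level 2 checker is not.
print(can_assign("high", 1, 2))  # -> False
```

Encoding the rule this way is exactly what turns "only experienced engineers do crash analysis" from a personal custom into a system that can be checked at Gate 0.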

🧑‍🎓

But it would be a problem if a Level 1 person stayed at Level 1 forever. Is there a promotion system?

🎓

Good question. A competency matrix must always be defined together with a career path. Specifically:

  • Training required for promotion to each level (in-house training, NAFEMS seminars, e-learning, etc.)
  • Required type and number of practical experiences (e.g., Level 3 requires at least 5 independent analysis experiences across different analysis types).
  • Evaluation method (technical interview by higher-level personnel, portfolio review, etc.)
  • Recertification cycle (typically every 2-3 years)

Human resource development and governance are two sides of the same coin.

Review Gate Process

🧑‍🎓

Are review gates like setting up "checkpoints" at stages during the analysis?

🎓

Good image. In typical CAE governance, 4 gates are set up in the analysis process.

| Gate | Timing | Check Content | Approver |
| --- | --- | --- | --- |
| Gate 0 | Before starting analysis | Clarification of analysis purpose, risk classification, assignment of an appropriate analyst, identification of the procedure to use. | Project Leader |
| Gate 1 | After model construction | Validity of geometry simplification, mesh quality, material model, boundary/load conditions. | Level 3+ Analyst |
| Gate 2 | After calculation execution | Convergence status, energy balance, physical validity of results, mesh sensitivity check. | Level 3+ Analyst |
| Gate 3 | After report creation | Validity of conclusions, quantification of uncertainty, specificity of recommendations, data archiving. | Level 3-4 depending on analysis risk |
🧑‍🎓

It's interesting that Gate 0 decides "whether this analysis is even necessary and who should do it." In the field, I feel like there's often a "just run it first" attitude.

🎓

That's exactly the point. Without Gate 0, cases where analysts start on their own and then later realize "this analysis was actually insufficient" never end. Spending just 10 minutes on Gate 0 at the beginning dramatically reduces rework effort. In practice, this is said to be the gate with the highest ROI.
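
The four-gate flow is essentially a sequential state machine: a gate can only be approved after every earlier gate has passed. A minimal sketch (the class and project name are illustrative, not from any SPDM tool):

```python
# Gates must be approved strictly in order; skipping is rejected.
GATES = ["Gate 0", "Gate 1", "Gate 2", "Gate 3"]

class AnalysisProject:
    def __init__(self, name: str):
        self.name = name
        self.passed: list[str] = []  # gates approved so far, in order

    def approve(self, gate: str) -> None:
        """Approve the next gate in sequence; out-of-order approval raises."""
        expected = GATES[len(self.passed)]
        if gate != expected:
            raise ValueError(f"{gate} cannot be approved before {expected}")
        self.passed.append(gate)

proj = AnalysisProject("bracket redesign")
proj.approve("Gate 0")      # purpose, risk class, analyst assignment
proj.approve("Gate 1")      # model review
try:
    proj.approve("Gate 3")  # attempting to skip Gate 2
except ValueError as e:
    print(e)                # -> Gate 3 cannot be approved before Gate 2
```

In a real SPDM system each `approve` call would also record who approved, when, and against which model version, giving the traceability the framework requires.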

Gate Checklist

🧑‍🎓

What specific items are checked in Gate 1 for model construction?

🎓

Let me introduce a typical Gate 1 checklist.

  • Geometry Simplification: List of omitted features (fillets, holes, chamfers, etc.) and reason for omission.
  • Mesh Quality: Aspect Ratio < 5, Skewness < 0.8, Jacobian > 0.3 satisfied for all elements.
  • Material Model: Source of material data used (material test report number or handbook name), consideration of temperature/speed dependency.
  • Boundary Conditions: The model is neither over- nor under-constrained (for static analysis, reaction forces are physically reasonable).
  • Load Conditions: Comprehensiveness of load cases, basis for safety factor.
  • Contact Definition: Selection of contact surfaces, basis for friction coefficient, check of initial gap.

Standardizing this as a template in Excel or an SPDM tool makes reviews run efficiently.
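
The mesh-quality criteria in the checklist above (aspect ratio < 5, skewness < 0.8, Jacobian > 0.3) lend themselves to automation. A sketch, where the function name and the sample element data are made up for illustration:

```python
# Flag elements that violate the Gate 1 mesh-quality thresholds above.
def check_mesh_quality(elements: list[dict]) -> list[int]:
    """Return IDs of elements failing aspect ratio < 5,
    skewness < 0.8, or Jacobian > 0.3."""
    bad = []
    for e in elements:
        if (e["aspect_ratio"] >= 5
                or e["skewness"] >= 0.8
                or e["jacobian"] <= 0.3):
            bad.append(e["id"])
    return bad

elements = [
    {"id": 1, "aspect_ratio": 2.1, "skewness": 0.35, "jacobian": 0.92},
    {"id": 2, "aspect_ratio": 7.4, "skewness": 0.15, "jacobian": 0.88},  # too elongated
    {"id": 3, "aspect_ratio": 3.0, "skewness": 0.85, "jacobian": 0.75},  # too skewed
]
print(check_mesh_quality(elements))  # -> [2, 3]
```

Running such a check automatically before the human review lets the Gate 1 reviewer spend time on the judgment items (simplification rationale, material data provenance) instead of counting bad elements.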

🧑‍🎓

Having a checklist means you don't have to worry about "what to check." It seems like it would reduce the reviewer's burden too.

Practical Guide

Implementation Steps

🧑‍🎓

I want to introduce governance in my department too, but where should I start?

🎓

Rolling it out company-wide from the start is a recipe for failure. The golden rule is to start small with a pilot project. The recommended implementation steps are as follows.

  1. Current State Assessment (1-2 months): Take inventory of the current analysis process, self-assess maturity level. Uncover judgment criteria that have become "tacit knowledge."
  2. Pilot Selection (2 weeks): Select one product line or analysis team as the pilot. A mid-sized team (5-10 people) where success is easier to achieve is ideal.
  3. Governance Document Preparation (2-3 months): Create drafts of policy, process flow, checklists, and competency matrix.
  4. Pilot Operation (3-6 months): Trial governance on a real project. Collect field feedback and revise documents.
  5. Company-wide Rollout (6 months~): Report pilot results and KPI data to management and expand to other departments gradually.
🧑‍🎓

Step 1, "uncovering tacit knowledge," seems the hardest. How do you formalize know-how that only exists in veterans' heads?

🎓

An effective method is "past trouble analysis." Gather analysis mistakes, rework, and defect outflow cases from the past 3-5 years and analyze "why they couldn't be prevented." Then, points like "if there had been a checkpoint here, it could have been prevented" or "only veterans knew this judgment" naturally become apparent. Trouble cases are a treasure trove of tacit knowledge.

SPDM (Simulation Process and Data Management)

🧑‍🎓

I hear about SPDM a lot lately. Is it essentially a system to manage CAE data?

🎓

Half correct and half insufficient. SPDM stands for Simulation Process and Data Management. The key point is that it manages not only data but also processes. Specifically:

  • Data Management: Version control and access control for CAD models, FE meshes, input files, result files, and reports.
  • Process Management: Capturing the analysis workflow itself, i.e. which procedure was followed, who executed and reviewed each step, and the review gate approval status, so the full history remains traceable.
Written by NovaSolver Contributors (Anonymous Engineers & AI)