NAFEMS QSS (Quality System Supplement) — International Standards for CAE Simulation Quality Management
Theory and Background
What is QSS?
Professor, what are the NAFEMS quality control standards? Are they different from ISO 9001?
Good question. NAFEMS's QSS—Quality System Supplement—is something that complements ISO 9001, specifically tailored for CAE simulation. ISO 9001 is a generic framework for "quality management of products and services," but it doesn't address simulation-specific challenges—for example, "Who performed the analysis?" "Was mesh convergence checked?" "How was model validation done?"
I see, so with just ISO 9001, how to manage "CAE analysis quality" remains ambiguous.
Exactly. QSS specifically defines four pillars:
- Standardization of Analysis Processes: Document "what to do and in what order" in procedures (SOPs)
- Analyst Competency Certification: Objectively prove a technician's capability through the PSE (Professional Simulation Engineer) qualification
- Model Review Procedures: A system for independent third-party review
- Result Verification Procedures: A framework for systematically implementing V&V (Verification & Validation)
For example, in aerospace, when EASA or FAA grants type certification based on simulation results, they increasingly require such quality management systems. The NRC (U.S. Nuclear Regulatory Commission) is moving similarly in the nuclear field.
What, regulatory authorities even look at simulation quality management systems? That's quite strict...
Of course. If you're proving aircraft wing strength via simulation, you must demonstrate "whether that analysis was done using reliable procedures" and "whether the analyst has sufficient competence." Precisely because the trend of reducing physical tests and substituting with simulation is accelerating, quality standards like QSS have become essential.
Relationship with ISO 9001
Could you explain the relationship between QSS and ISO 9001 in more concrete terms? If we implement QSS, does ISO 9001 become unnecessary?
No, not at all. QSS is a supplementary layer built on top of ISO 9001. It takes the ISO 9001 Quality Management System (QMS) as a base and adds simulation-specific requirements.
| Item | ISO 9001 | NAFEMS QSS (Additional Requirements) |
|---|---|---|
| Scope | All products/services | Specialized for CAE simulation processes |
| Personnel Management | "Ensure competent personnel" (abstract) | Specifically defined via PSE qualification/competency matrix |
| Process Management | "Documented procedures" (generic) | Explicit analysis SOPs, model review, V&V procedures |
| Result Validity | "Monitoring and measurement" (generic) | Mesh convergence check, benchmark comparison, uncertainty quantification |
| Software Management | Not specified | Version control, validated solver list, patch management |
| Configuration Management | "Product identification and traceability" (generic) | Complete traceability of input files, mesh, boundary conditions, results |
So ISO 9001 only says "ensure competent personnel," but QSS defines "a person with this level of qualification in this field." That's quite specific.
QSS Framework Components
The QSS framework is largely composed of five domains:
- Governance: Defines policies, responsibilities, and authorities for simulation activities. Includes management commitment.
- People: Analyst competency requirements, PSE qualification, education/training plans.
- Process: Analysis procedure documents, model review, approval flow, change management.
- Technology: Software/hardware selection criteria, configuration management, license management.
- Data: Input/output data management, traceability, archiving policy.
"Governance" sounds like it's for large corporations, but is it necessary for SMEs and startups too?
It can be scaled according to size. For example, for a team of 5 engineers, just summarizing "who reviews" and "which solver version to use" on a single rule sheet is already proper governance. The important thing is converting tacit knowledge into explicit knowledge. The essence of QSS is moving away from person-dependent quality control like "It's fine if veteran A looks at it" to creating a state where "anyone can guarantee a certain quality level by following the procedure manual."
Quantitative Evaluation of Quality Score
Can QSS compliance be measured quantitatively?
NAFEMS doesn't officially prescribe a scoring formula, but many organizations quantitatively assess compliance with QSS requirements. For example, a model like this:

$$Q = \sum_{i=1}^{M} w_i \, c_i$$

Here $w_i$ is the weight coefficient for requirement item $i$, $c_i$ is its compliance level (0 to 1), and $M$ is the total number of QSS requirement items. The compliance rate is:

$$S = \frac{\sum_{i=1}^{M} w_i \, c_i}{\sum_{i=1}^{M} w_i} \times 100\%$$
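This weighted model can be sketched in a few lines of Python. The weights and item descriptions below are illustrative assumptions; NAFEMS does not prescribe them:

```python
def compliance_score(items):
    """Weighted QSS compliance rate: sum(w_i * c_i) / sum(w_i).

    items: list of (weight, compliance) pairs, compliance in [0, 1].
    """
    total_weight = sum(w for w, _ in items)
    return sum(w * c for w, c in items) / total_weight

# Hypothetical assessment: (weight, compliance level)
items = [
    (3.0, 1.0),  # SOPs documented and in use
    (3.0, 0.5),  # model review defined, but not yet independent
    (2.0, 0.0),  # no validated solver version list
    (2.0, 1.0),  # full input/result traceability
]
score = compliance_score(items)
print(f"Compliance: {score:.0%}")  # 65% for this example
```

Weighting lets safety-relevant requirements (e.g., independent review) count more heavily than administrative ones, which matches how QSS ties rigor to analysis criticality.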
I see, it's like a checklist completion rate. In practice, what percentage is typically targeted?
In aerospace, 90% or more is often required. In automotive, 70-80% is a realistic initial target. However, the goal is not "raising the score" itself; the essence is continuous improvement. What matters is running the PDCA (Plan-Do-Check-Act) cycle and improving the score year over year.
Historical Background of QSS
- 1983: NAFEMS established in the UK. Initially, the main purpose was standardizing finite element method benchmark problems.
- 1990s: NAFEMS benchmark problem sets gained international popularity. Software verification foundations were established.
- Around 2005: "Simulation-Driven Design," using simulation results for design decisions, became mainstream, making the need for quality management apparent.
- 2011: First edition of NAFEMS QSS published. Added CAE-specific requirements based on ISO 9001:2008.
- 2016: Formal operation of the PSE (Professional Simulation Engineer) qualification system began.
- 2020s: Application scope of QSS is expanding alongside digital twins and AI/ML utilization.
Related International Standards & Guidelines
- ASME V&V 10: Guidelines for verification and validation of computational models in solid mechanics
- ASME V&V 20: V&V guidelines for fluid dynamics and heat transfer
- ASME V&V 40: Standard for simulation credibility in medical devices
- NASA-STD-7009A: NASA standard for models and simulations
- ISO 13528: Statistical methods for use in proficiency testing by interlaboratory comparisons (applied to simulation result comparisons)
PSE Qualification and Analyst Competency
Overview of the PSE Qualification System
What kind of qualification is PSE? I don't hear much about it in Japan...
PSE (Professional Simulation Engineer) is an international qualification for CAE engineers certified by NAFEMS. It's set up by field, and currently includes the following categories:
- PSE-Structural: Structural analysis (linear/nonlinear)
- PSE-CFD: Fluid analysis
- PSE-Thermal: Thermal analysis
- PSE-Electromagnetics: Electromagnetic field analysis
- PSE-Multiphysics: Coupled analysis
Acquisition requires practical experience (typically 5+ years), recommendation, and a technical interview (portfolio-based). It's less about a written exam and more about evaluating "what kind of analysis you've actually done."
Huh, it's not a paper test. I think basing it on practical experience is a good system. But do Japanese companies ever say "Go get a PSE"?
Honestly, recognition is still low within Japan. But among European aerospace tier-one suppliers (Airbus, Safran, etc.), cases requiring PSE-equivalent competency proof from analysis personnel are increasing. As long as Japanese companies are part of the global supply chain, they can't remain unaffected. Especially in the context of "replacing physical tests with simulation" (Certification by Analysis, CBA), proof of analyst competency will become essential.
Designing a Competency Matrix
Even without a PSE qualification, is there a way to manage competency within a company?
Of course. QSS recommends defining a Competency Matrix. This is a table of skill items x levels, visualizing each analyst's competency.
| Skill Area | Level 1 (Beginner) | Level 2 (Intermediate) | Level 3 (Advanced) | Level 4 (Expert) |
|---|---|---|---|---|
| Mesh Generation | Automatic mesh for simple shapes | Manual control for complex shapes | Convergence evaluation + optimization | Special elements, conformal mesh |
| Material Models | Linear elastic only | Elastoplastic, creep | Damage, fracture models | User-defined materials |
| Boundary Conditions | Fixed constraints, concentrated loads | Symmetry, periodic conditions | Contact, submodeling | Coupled BC design |
| Result Verification | Contour plot confirmation | Comparison with theoretical solutions | Mesh sensitivity, uncertainty | V&V framework operation |
| Reporting | Result screenshots | Creating standard reports | Documenting technical judgments | Audit-compliant reports |
Having a matrix like this makes it clear "who should be assigned this analysis." It can prevent things like dumping a complex nonlinear analysis on a new hire.
Exactly. QSS requires defining the required competency level according to the "criticality" of the analysis. For example, a rule like: assign a Level 3+ analyst to analyses directly related to safety (e.g., aircraft fatigue life evaluation) and make independent third-party review mandatory.
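A rule like that can be encoded directly from the competency matrix. The analyst names, skill areas, levels, and criticality thresholds below are illustrative assumptions, not QSS-mandated values:

```python
# Hypothetical competency matrix: analyst -> {skill area: level 1-4}
MATRIX = {
    "analyst_a": {"mesh": 2, "materials": 2, "verification": 1},
    "analyst_b": {"mesh": 4, "materials": 3, "verification": 3},
}

# Minimum level required per analysis criticality (illustrative rule,
# e.g. safety-critical work needs a Level 3+ analyst, as above)
REQUIRED_LEVEL = {"routine": 1, "design-critical": 2, "safety-critical": 3}

def qualified(analyst: str, criticality: str) -> bool:
    """True if the analyst meets the required level in every skill area."""
    need = REQUIRED_LEVEL[criticality]
    return all(level >= need for level in MATRIX[analyst].values())

print(qualified("analyst_b", "safety-critical"))  # True
print(qualified("analyst_a", "safety-critical"))  # False
```

Requiring every skill area to meet the threshold (rather than the average) is deliberate: a Level 4 mesher who cannot verify results is still the wrong person for a safety-critical job.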
Continuing Professional Development (CPD)
Once you get the qualification, is that the end?
PSE has CPD (Continuing Professional Development) requirements. You must continue a certain amount of annual learning/training to maintain the qualification. The CAE world advances rapidly, so knowledge from 5 years ago alone is insufficient. You need to keep catching up with new element technologies, solver function updates, and advances in V&V methods.
Specifically:
- Attending seminars/workshops hosted by NAFEMS
- Presenting/reviewing technical papers
- Organizing/participating in internal study sessions
- Challenging new analysis project fields
These are recorded and submitted for the PSE renewal review. It's a mechanism similar to lifelong learning for doctors.
Practical Guide: QSS Implementation Steps
Gap Analysis and Roadmap
I want to implement QSS in our company too, but where should we start?
The first step is gap analysis. Compare your current analysis processes with QSS requirements and identify "what's missing."
- Current State Assessment: Does your current team have SOPs (Standard Operating Procedures)? Who does model review? How is solver version management done?
- Gap Identification: For each of QSS's five domains (Governance, People, Process, Technology, Data), organize the difference between current state and requirements.
- Prioritization: Start with high-risk areas (safety-related analysis, regulatory compliance).
- Roadmap Creation: Phased implementation plan for 6 months, 1 year, 2 years.
You don't need to aim for 100 points right away. Starting with the "most painful point" is smart.
How do you actually do gap analysis? Is there something like a checklist?
NAFEMS provides a QSS Self-Assessment Tool. It's a questionnaire of 50-80 items, in a format where you answer "Not implemented (0)", "Partial (0.5)", "Fully implemented (1)" for each item. For example, questions like:
- "Are analyst competency requirements documented?"
- "Is an independent model review process defined?"
- "Do you maintain a validated version list for the software used?"
- "Is complete traceability of analysis input data/results ensured?"
- "Is the mesh convergence confirmation procedure standardized?"
When first done, most organizations score around 40-60%. From there, you can set a concrete goal like "First, aim for 70%."
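Tallying the answers per QSS domain shows where the effort toward that 70% should go first. A sketch, with illustrative domain tags and question counts rather than the actual NAFEMS questionnaire:

```python
from collections import defaultdict

# Self-assessment answers as (QSS domain, score) pairs;
# 0 = not implemented, 0.5 = partial, 1 = fully implemented.
answers = [
    ("Governance", 0.5), ("Governance", 1.0),
    ("People", 1.0), ("People", 0.5),
    ("Process", 0.5), ("Process", 0.0),
    ("Technology", 0.0), ("Technology", 0.5),
    ("Data", 1.0), ("Data", 0.5),
]

sums = defaultdict(float)
counts = defaultdict(int)
for domain, score in answers:
    sums[domain] += score
    counts[domain] += 1

overall = sum(sums.values()) / len(answers)
print(f"Overall: {overall:.0%}")  # 55% here: below a 70% target
for domain in sorted(sums, key=lambda d: sums[d] / counts[d]):
    print(f"  {domain:<11} {sums[domain] / counts[domain]:.0%}")
```

Sorting domains from lowest to highest score makes the gap analysis actionable: here, Technology and Process would be the first-year priorities.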
Preparation of Analysis Procedure Documents (SOPs)
How detailed should SOPs (Standard Operating Procedures) be? If they're too detailed, they might not be used in practice...
That's a very good point. The granularity of SOPs should be at a level where "a new engineer can read it and execute it without mistakes." Even things veterans think "that's obvious" should be documented.
Typical structure of a CAE SOP:
1. Scope: Which analysis types it applies to (e.g., linear static analysis, modal analysis)
2. Required Competency: The competency level needed to execute this procedure
3. Input Data Requirements: CAD data format, material data source, definition source for load conditions
4. Mesh Standards: Element type, minimum size, quality criteria (aspect ratio < 5.0, etc.)
5. Boundary Condition Check: Procedure for confirming physical validity of constraints
6. Analysis Execution: Solver settings, convergence criteria, computational resources
7. Result Verification: Mandatory check items (reaction force balance, energy conservation, mesh convergence)
8. Review/Approval: Who reviews, who approves
9. Record Retention: Storage location and retention period for input files, result files, reports
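Of these, the mesh-quality and result-verification checks lend themselves to automation. A minimal sketch using the illustrative thresholds from the SOP outline (aspect ratio < 5.0, reaction force balance); the tolerance value is an assumption:

```python
def check_aspect_ratios(ratios, limit=5.0):
    """Return indices of elements violating the aspect-ratio criterion."""
    return [i for i, r in enumerate(ratios) if r >= limit]

def reaction_balance_ok(applied_load, reaction_sum, rel_tol=1e-3):
    """Reaction forces must balance applied loads within tolerance
    (sum of reactions should equal minus the applied load)."""
    return abs(applied_load + reaction_sum) <= rel_tol * abs(applied_load)

print(check_aspect_ratios([1.2, 4.9, 6.3]))      # [2]: one bad element
print(reaction_balance_ok(1000.0, -999.8))       # True: 0.02% imbalance
print(reaction_balance_ok(1000.0, -990.0))       # False: 1% imbalance
```

Scripting such checks and attaching their output to the analysis report turns the SOP's "mandatory check items" from a memory exercise into evidence an auditor can inspect.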
About #9, "Record Retention," does everyone actually do that properly? Honestly, result files often get lost...
Actually, that's the point where QSS implementation shows the most effect. If someone says "Please reproduce the analysis from 3 years ago," and you have all the input files, solver version, and mesh settings ready, you can respond in 10 minutes. If not, it takes 3 days. Traceability is unglamorous but has a very high return on investment.
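That kind of 10-minute reproducibility depends on capturing a run manifest at analysis time. A minimal sketch using only the standard library; the field names and solver name are illustrative:

```python
import datetime
import hashlib
import json

def run_manifest(input_files, solver_name, solver_version, mesh_settings):
    """Capture what is needed to reproduce a run: SHA-256 hashes of every
    input file, the solver version, and mesh settings, as archivable JSON."""
    def sha256(path):
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()
    return json.dumps({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "solver": solver_name,
        "solver_version": solver_version,
        "mesh_settings": mesh_settings,
        "inputs": {p: sha256(p) for p in input_files},
    }, indent=2)

# Demo with a throwaway input deck
import os
import tempfile
with tempfile.NamedTemporaryFile("w", suffix=".inp", delete=False) as f:
    f.write("*NODE\n1, 0.0, 0.0, 0.0\n")
    deck = f.name
manifest = json.loads(run_manifest([deck], "solver_x", "2024.1",
                                   {"max_aspect_ratio": 5.0}))
os.unlink(deck)
print(manifest["solver_version"])  # 2024.1
```

Hashing the inputs (rather than just naming them) is what makes the record auditable: if the stored hash matches the archived file, you know you are rerunning exactly what was reviewed three years ago.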
Model Review Process
What exactly do you check in a model review?
QSS recommends a three-stage review: