DO-178C Software Certification (Complete Guide)
DO-178C is the cornerstone standard for certifying airborne software. This comprehensive topic brings together the standard itself, its supplements (DO-330 for tool qualification, DO-331 for model-based development, DO-332 for object-oriented technology, DO-333 for formal methods), and all the key concepts you encounter in a DO-178C certification program — from planning documents through verification and the Software Accomplishment Summary.
32 terms in this topic
All Terms
The primary guidance document used by certification authorities and industry for the development of airborne software. DO-178C defines the objectives, activities, and design considerations for software that performs functions in airborne systems and equipment. It establishes a framework of software lifecycle processes — planning, requirements, design, coding, integration, verification, configuration management, quality assurance, and certification liaison — with objectives that scale based on the software level (DAL A through E). DO-178C replaced DO-178B in 2011, adding technology-specific supplements and clarifying objectives.
A supplement to DO-178C that provides guidance on the qualification of software tools used in the development and verification of airborne software. DO-330 defines Tool Qualification Levels (TQL-1 through TQL-5) based on the potential impact of the tool on the airborne software and the software level. Tools that could introduce errors into the airborne software (development tools) or that could fail to detect errors (verification tools) require qualification at levels commensurate with their impact. The qualification process involves defining tool operational requirements, verifying the tool against those requirements, and demonstrating that the tool satisfies its qualification objectives.
A supplement to DO-178C that provides additional guidance for the use of model-based development and verification in airborne software. DO-331 addresses the use of models (such as Simulink, SCADE, or UML models) as design and requirements representations, including the specification of model-level requirements, simulation-based verification, and auto-code generation from models. The supplement defines when models can be used as requirements, design, or source code, and specifies additional objectives for model coverage analysis, model reviews, and traceability between models and the airborne software.
A supplement to DO-178C that provides additional guidance for the use of object-oriented technology (OOT) and related techniques in airborne software. DO-332 addresses the specific concerns that OOT introduces — including inheritance, polymorphism, dynamic dispatch, overloading, type conversion, exception handling, and templates/generics — and defines additional objectives to ensure these features do not compromise software development assurance. The supplement includes guidance on OOT-specific structural coverage criteria, such as subtype and dynamic coupling measures.
A supplement to DO-178C that provides guidance for the use of formal methods in airborne software development and verification. Formal methods use mathematically rigorous techniques to specify, develop, and verify software. DO-333 allows certain DO-178C objectives to be satisfied through formal analysis rather than through testing, provided the formal analysis is demonstrated to be sound and complete for the properties being verified. The supplement addresses formal specification, formal verification (theorem proving, model checking, abstract interpretation), and the relationship between formal analysis and traditional testing and review activities.
The EUROCAE publication of the software certification guidance document that is technically identical to RTCA DO-178C. ED-12C is published by EUROCAE (European Organisation for Civil Aviation Equipment) and is the European designation for the same standard. ED-12C is referenced by EASA through AMC 20-115D as the acceptable means for airborne software development assurance in the European regulatory framework. All technical content, objectives, tables, and appendices are identical to DO-178C.
Software that is intended to be used in airborne systems and equipment, and that performs or contributes to a function on the aircraft. Airborne software is subject to development assurance requirements as defined in DO-178C / ED-12C. The scope of airborne software includes embedded software in avionics equipment (flight management systems, display systems, engine controllers), software in line-replaceable units (LRUs), and software that performs functions necessary for continued safe flight and landing. Software used only for ground-based applications (manufacturing test, maintenance ground support) is not airborne software, although it may still require qualification as a tool under DO-330.
A separately identifiable part of a computer program that is a constituent of the airborne software. A software item is the unit at which software development assurance is applied — it has its own software level (DAL), its own set of lifecycle data, and its own compliance demonstration. A software item may be a complete standalone application, a partition in an IMA platform, or a distinct functional module with well-defined interfaces. Software items are identified during the software planning process and documented in the Plan for Software Aspects of Certification (PSAC).
A distinct part of a software item, typically identified at the architectural design level. Software components are the building blocks of a software item's architecture: they implement specific functions, have defined interfaces, and may be composed of lower-level components or code modules. In DO-178C, the term is used in the context of software architecture, where the software item is decomposed into software components that implement the high-level requirements through low-level requirements and source code.
The designation of the software development assurance effort required for a software item, based on the failure condition classification of the system function to which the software contributes. Software levels correspond to Development Assurance Levels (DALs): Level A software contributes to functions whose failure could cause or contribute to a catastrophic failure condition; Level B to hazardous; Level C to major; Level D to minor; and Level E to no safety effect. The software level determines which DO-178C objectives are applicable, the number of objectives that must be satisfied with independence, and the overall rigor of the development and verification processes.
The primary planning document for software certification, submitted to and agreed upon by the certification authority. The PSAC describes the system overview, software overview, certification considerations, software lifecycle processes, software lifecycle data, schedule, and any means of compliance deviations or alternative methods. It identifies the software items, their software levels, the applicable DO-178C objectives, the software lifecycle processes that will be used, the tools that require qualification, and any previously developed or COTS software that will be used. The PSAC is the certification authority's primary reference for understanding and overseeing the software development effort.
A lifecycle planning document that describes the verification methods, procedures, and environment that will be used to verify the software. The SVP defines the overall verification strategy, including what will be verified by review, what by analysis, what by testing, and what by a combination of these methods. It specifies the verification environment (target hardware, host simulation, emulation), the tools used for verification, the coverage criteria (statement, decision, MC/DC as applicable to the software level), and the criteria for verification completeness. The SVP also addresses independence requirements for verification activities.
A lifecycle planning document that describes the configuration management activities, procedures, and environment for the software. The SCMP defines how configuration items are identified, how changes are controlled (change request and problem report processes), how baselines are established and maintained, how configuration audits are performed, and how the integrity of the software lifecycle data is preserved. Configuration management under DO-178C is not optional: it is an integral process that ensures traceability, repeatability, and control throughout the software lifecycle.
A lifecycle planning document that describes the software quality assurance activities, methods, and responsibilities. The SQAP defines how the SQA function will provide assurance that the software development and verification processes conform to the approved plans and standards. SQA activities include process audits, transition criteria checks (ensuring activities are complete before proceeding to the next phase), review of lifecycle data for completeness and correctness, and reporting of deviations and non-conformances. The SQA function provides independence from the development team.
Software requirements that are developed directly from the system requirements allocated to the software item. High-level requirements specify the functional behavior, performance characteristics, timing constraints, interface definitions, and safety-related requirements of the software item in terms that are implementation-independent. HLRs describe what the software must do, not how it does it. Each HLR must be traceable to the system requirement(s) from which it was derived. HLRs that are not traceable to system requirements are classified as derived requirements and must be evaluated for their safety impact.
Software requirements that are developed from the high-level requirements to provide a more detailed description of the software behavior, closer to the implementation level. Low-level requirements are derived from the software architecture and detailed design process. They describe the software behavior at a level of detail sufficient to enable coding without further design interpretation. LLRs include algorithm details, data structure definitions, input/output descriptions, error handling logic, and timing requirements. Each LLR must be traceable to the HLR(s) from which it was derived, and the source code must be traceable to LLRs.
Software requirements (at either the HLR or LLR level) that are not directly traceable to a higher-level requirement but are generated by the software development process itself. Derived requirements arise from design decisions, implementation constraints, or the need to implement functions that are necessary for the software to work correctly but that were not explicitly stated in the system requirements. Examples include requirements for initialization sequences, internal data structures, error handling mechanisms, and resource management. DO-178C requires that derived requirements be provided to the system safety assessment process because they may introduce new failure modes or modify the failure behavior assumed in the system safety analysis.
The ability to trace relationships between lifecycle data items in both forward (from requirements to implementation and test) and backward (from implementation and test back to requirements) directions. DO-178C requires bi-directional traceability at multiple levels: (1) System requirements to HLRs and HLRs back to system requirements; (2) HLRs to LLRs and LLRs back to HLRs; (3) LLRs to source code and source code back to LLRs; (4) HLRs to test cases/procedures and test cases/procedures back to HLRs; (5) LLRs to test cases/procedures (for applicable levels) and test cases/procedures back to LLRs. Forward traceability ensures that all requirements are implemented and tested. Backward traceability ensures that all code and tests can be justified by a requirement (detecting extraneous code and unnecessary tests).
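The bi-directional trace checks described above can be sketched as a simple consistency pass over trace links. This is a minimal illustration with hypothetical requirement IDs and link data, not a real traceability tool:

```python
# Minimal sketch of a bi-directional traceability check.
# All requirement IDs and trace links are hypothetical examples.

# Forward links: each HLR maps to the LLRs that implement it.
hlr_to_llr = {
    "HLR-001": ["LLR-001", "LLR-002"],
    "HLR-002": ["LLR-003"],
}

# Backward links: each LLR maps to its parent HLR(s).
# LLR-004 has no parent -- a candidate derived requirement.
llr_to_hlr = {
    "LLR-001": ["HLR-001"],
    "LLR-002": ["HLR-001"],
    "LLR-003": ["HLR-002"],
    "LLR-004": [],
}

def unimplemented_hlrs(fwd):
    """Forward check: HLRs with no implementing LLR."""
    return [h for h, lls in fwd.items() if not lls]

def untraced_llrs(bwd):
    """Backward check: LLRs with no parent HLR (possible derived requirements)."""
    return [l for l, hs in bwd.items() if not hs]

def inconsistent_links(fwd, bwd):
    """Links present in the forward direction but missing in the backward one."""
    gaps = []
    for h, lls in fwd.items():
        for l in lls:
            if h not in bwd.get(l, []):
                gaps.append((h, l))
    return gaps

print(unimplemented_hlrs(hlr_to_llr))              # []
print(untraced_llrs(llr_to_hlr))                   # ['LLR-004']
print(inconsistent_links(hlr_to_llr, llr_to_hlr))  # []
```

The backward check flags LLR-004 as untraced — in a real program that finding would trigger evaluation as a derived requirement and disposition through the safety assessment process.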
The evaluation of the outputs of a software lifecycle process to ensure correctness and consistency with respect to the inputs and standards for that process. In DO-178C, verification encompasses three primary methods: (1) Reviews — systematic examination of lifecycle data by qualified personnel to detect errors, omissions, and inconsistencies; (2) Analyses — examination of lifecycle data using mathematical or logical reasoning to detect errors or demonstrate properties (e.g., data flow analysis, control flow analysis, stack usage analysis, timing analysis); (3) Testing — execution of the software with defined inputs and comparison of actual outputs to expected outputs. Requirements-based testing is the primary testing strategy, supplemented by structural coverage analysis to assess the thoroughness of the test set.
A testing strategy in which test cases are derived from the software requirements (both HLRs and, where applicable, LLRs) rather than from the software implementation. Each test case is designed to verify one or more specific requirements by defining inputs that exercise the required behavior and expected outputs that demonstrate the requirement is correctly implemented. Requirements-based testing includes normal range testing (inputs within specified operating ranges), boundary value testing (inputs at the boundaries of specified ranges), and robustness testing (inputs outside specified ranges, where applicable). The goal is to demonstrate that every requirement is correctly implemented, that the software produces the correct outputs for the specified inputs, and that no unintended function exists.
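The three categories of requirements-based test cases can be illustrated with a small sketch. The requirement, function name, and range limits below are invented for illustration only:

```python
# Sketch of requirements-based test cases for a hypothetical requirement:
# "The function shall accept altitude inputs from 0 to 50000 ft; inputs
#  outside this range shall be rejected." Names are illustrative only.

def validate_altitude_ft(alt):
    """Hypothetical unit under test: True if the altitude is in range."""
    return 0 <= alt <= 50000

# Normal-range case: a value well inside the specified operating range.
assert validate_altitude_ft(10000) is True

# Boundary-value cases: values exactly at the range limits.
assert validate_altitude_ft(0) is True
assert validate_altitude_ft(50000) is True

# Robustness cases: values just outside the specified range.
assert validate_altitude_ft(-1) is False
assert validate_altitude_ft(50001) is False

print("all requirements-based test cases pass")
```

Note that every case traces back to the stated requirement — none is derived from inspecting the implementation.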
An analysis of the source code structure to determine which code statements, decision branches, and conditions are exercised by the requirements-based test set. Structural coverage is not a testing method but a measure of test thoroughness. DO-178C defines three levels of structural coverage, required based on the software level: (1) Statement Coverage (Level C and above) — every statement in the code has been executed at least once; (2) Decision Coverage (Level B and above) — every decision (branch) in the code has taken both its true and false outcomes at least once; (3) Modified Condition/Decision Coverage (MC/DC) (Level A) — every condition within a decision has been shown to independently affect the decision outcome. If structural coverage analysis reveals code that is not exercised by requirements-based tests, the analysis must determine whether the gap indicates missing requirements, missing test cases, dead code, or deactivated code.
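The MC/DC criterion can be made concrete with a small sketch that searches for "independence pairs" — two test vectors that differ only in one condition and produce different decision outcomes — for the example decision `A and (B or C)`:

```python
from itertools import product

# Sketch: finding MC/DC independence pairs for the decision (A and (B or C)).
# For each condition, MC/DC requires two test vectors that differ only in
# that condition and yield different decision outcomes. Illustration only.

def decision(a, b, c):
    return a and (b or c)

def independence_pairs(cond_index):
    """All vector pairs differing only at cond_index with differing outcomes."""
    pairs = []
    for v in product([False, True], repeat=3):
        w = list(v)
        w[cond_index] = not w[cond_index]
        w = tuple(w)
        # v < w keeps each unordered pair only once
        if decision(*v) != decision(*w) and v < w:
            pairs.append((v, w))
    return pairs

for i, name in enumerate("ABC"):
    print(name, independence_pairs(i))
```

For this decision, condition B only shows independent effect when A is true and C is false — exactly the kind of constraint that makes MC/DC test-vector selection harder than plain decision coverage.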
Testing that evaluates the software's response to abnormal inputs, conditions, and environmental stresses that are outside the normal operating envelope but that the software might encounter. Robustness testing verifies that the software handles invalid inputs, out-of-range values, corrupted data, timing anomalies, and resource exhaustion gracefully — without producing hazardous outputs, entering undefined states, or crashing. Robustness testing complements normal-range requirements-based testing and is driven by the requirements for error handling, input validation, and fault tolerance.
The requirement that certain verification activities be performed by persons who are not the developers of the item being verified. Independence in DO-178C means separation of the verification function from the development function such that the verifier has no vested interest in the outcome and can provide an objective assessment. The degree of independence required increases with software level: at Level A, many verification objectives require independence (the person verifying an output must not be the same person who produced it); at Level D, fewer independence requirements apply. Independence can be provided by another engineer, a separate team, or an independent organization.
A designation under DO-330 / DO-178C that defines the qualification effort required for a software development or verification tool based on the potential impact of the tool on the airborne software. Five levels are defined: TQL-1 (most rigorous) applies to tools whose output is part of the airborne software and that could insert errors, when used for Level A software; TQL-2 through TQL-4 apply to lesser combinations of tool impact and software level; TQL-5 (least rigorous) applies to tools that automate processes but whose output can be verified by other means. The TQL determines which DO-330 objectives must be satisfied for the tool qualification.
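The mapping from tool criteria and software level to TQL follows the scheme in DO-178C Table 12-1; the sketch below encodes that lookup (verify against the standard itself before relying on it in an actual program):

```python
# Sketch of TQL determination per the DO-178C / DO-330 scheme
# (tool criteria 1-3 vs. software level, DO-178C Table 12-1).
# Confirm against the published standard for a real program.

TQL_TABLE = {
    # criteria: {software_level: TQL}
    1: {"A": "TQL-1", "B": "TQL-2", "C": "TQL-3", "D": "TQL-4"},
    2: {"A": "TQL-4", "B": "TQL-4", "C": "TQL-5", "D": "TQL-5"},
    3: {"A": "TQL-5", "B": "TQL-5", "C": "TQL-5", "D": "TQL-5"},
}

def tool_qualification_level(criteria, software_level):
    """criteria 1: tool output is part of the airborne software and could
    insert an error; criteria 2: tool could fail to detect an error and is
    used to justify reducing other processes; criteria 3: tool could fail
    to detect an error."""
    return TQL_TABLE[criteria][software_level]

# A code generator (criteria 1) used for Level A software:
print(tool_qualification_level(1, "A"))  # TQL-1
# A structural-coverage analysis tool (criteria 3) for Level A software:
print(tool_qualification_level(3, "A"))  # TQL-5
```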
A design technique that provides isolation between software components or applications sharing common computing resources (processor, memory, I/O), such that a fault in one partition cannot adversely affect software in another partition. Partitioning has two dimensions: time partitioning (ensuring each partition receives its allocated processing time regardless of the behavior of other partitions) and space partitioning (ensuring each partition can only access its own memory regions and cannot corrupt another partition's data or code). Robust partitioning provides sufficient isolation that a software error in one partition cannot propagate to another partition, allowing partitions of different software levels to coexist on the same hardware platform.
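The space-partitioning dimension can be illustrated with a toy MMU-style address check. The partition names, address ranges, and levels below are hypothetical:

```python
# Illustrative sketch of a space-partitioning check: each partition may
# only access its own memory region. Names and addresses are hypothetical.

PARTITIONS = {
    "FMS":     (0x1000_0000, 0x1FFF_FFFF),  # e.g. a Level A partition
    "DISPLAY": (0x2000_0000, 0x2FFF_FFFF),  # e.g. a Level C partition
}

def access_allowed(partition, address):
    """Simple MMU-style check: address must fall in the partition's region."""
    lo, hi = PARTITIONS[partition]
    return lo <= address <= hi

assert access_allowed("FMS", 0x1000_1234)
assert not access_allowed("FMS", 0x2000_0000)  # cross-partition access denied
print("space-partitioning checks pass")
```

In a real IMA platform this enforcement is done in hardware (MMU/MPU) under control of a partitioning kernel; the sketch only shows the access rule, not the enforcement mechanism.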
Commercial Off-The-Shelf (COTS) software is software that was not developed with DO-178C compliance as a primary objective and is available commercially (e.g., operating systems, libraries, protocol stacks). Previously Developed Software (PDS) is software that was developed under DO-178C for a prior certification program and is being reused in a new application. Both categories present certification challenges. For COTS software, the full DO-178C lifecycle data is typically unavailable, so alternative means of compliance must be established — such as extensive testing, operational history credit (with caveats), or wrapping the COTS component with qualified interface protection. For PDS, change impact analysis and configuration management verification are required to ensure the software is applicable to the new installation.
A configuration item (CI) is a hardware or software entity that is designated for configuration management. In DO-178C, software lifecycle data — including plans (PSAC, SVP, SCMP, SQAP), requirements and design documents (SRD, SDD), source code, object code, test cases, test procedures, test results, traceability data, and the Software Accomplishment Summary — are all configuration items subject to configuration identification, change control, status accounting, and configuration audit. Each CI has a unique identifier, a defined baseline state, and a controlled change history.
A baseline is a formally established and configuration-controlled snapshot of the software lifecycle data at a specific point in the development process. DO-178C identifies several key baselines in the software lifecycle: the requirements baseline (after requirements are reviewed and approved), the design baseline (after design is reviewed and approved), the code baseline (after code is reviewed and passes testing), and the release baseline (the final configuration of the software approved for certification). Once a baseline is established, any change to its constituent configuration items must go through the formal change control process, with appropriate review, approval, and regression analysis.
A problem report (PR) is a document that records a discrepancy, deficiency, or anomaly discovered in any software lifecycle data — including requirements, design, code, test cases, test results, or documentation. A change request (CR) is a formal request to modify a configuration item. DO-178C requires that all problems discovered during the software lifecycle be recorded in problem reports, that problem reports be analyzed for their impact, and that corrective actions be tracked to closure. Open problem reports at the time of certification must be evaluated for their safety impact and documented in the Software Accomplishment Summary (SAS). The problem reporting and change control system is a critical element of configuration management.
Certification liaison is the ongoing communication between the applicant and the certification authority throughout the software lifecycle. Stages of Involvement (SOIs) are structured review points at which the certification authority evaluates the software development and verification progress. The FAA defines four SOIs for software: SOI #1 (Planning Review) — reviews the PSAC, plans, and standards before significant development begins; SOI #2 (Development Review) — reviews requirements, design, and initial development outputs; SOI #3 (Verification Review) — reviews verification results, including test results, coverage analysis, and traceability; SOI #4 (Final Review) — reviews the Software Accomplishment Summary, open problem reports, and the complete lifecycle data package. SOIs may also include audits of the development and verification environment.
The primary certification document produced at the conclusion of the software development and verification process. The SAS provides a summary of the software lifecycle, including: the software identification (part number, version); the system and software overview; the software lifecycle processes used; deviations from plans and standards; a summary of the software verification results; the status of configuration management activities; a summary of open problem reports and their disposition; a statement of compliance with DO-178C objectives; and a description of any unresolved issues. The SAS is the certification authority's primary evidence that the software development and verification process has been completed in accordance with the approved plans.
A designation of the rigor of the development assurance process applied to a system, software item, or hardware item, based on the severity of the most severe failure condition to which the item contributes. DAL is sometimes referred to as Item Development Assurance Level (IDAL). Five levels are defined: DAL A (most rigorous, associated with catastrophic failure conditions), DAL B (hazardous), DAL C (major), DAL D (minor), and DAL E (no safety effect, no development assurance objectives). The DAL drives the rigor of planning, development, verification, and configuration management activities as specified in standards like DO-178C (software), DO-254 (hardware), and ARP4754B (systems).
Related Topics
Software Certification (DO-178C)
Airborne software certification under DO-178C — planning, requirements, verification, structural coverage, and certification liaison.
The Big Standards Map
The core standards that form the spine of aviation certification — ARP4754B, ARP4761A, DO-178C, DO-254, DO-160G, and their European equivalents.
DO-254 Hardware Certification (Complete Guide)
Everything about DO-254 — complex electronic hardware certification including FPGA, ASIC, IP cores, verification, and hardware assurance.
Safety Assessment Process
The complete safety assessment process for aviation — from Functional Hazard Assessment through System Safety Assessment, using ARP4761A methods.
Need help navigating certification?
Understanding the terminology is the first step. If you need expert guidance on DO-178C, DO-254, ARP4754B, or any aspect of FAA, EASA, or TCCA certification, our team is here to help.