Reliability Issues for DOD Systems

Author: National Research Council
Publisher: National Academies Press
ISBN: 0309168805
Category: Technology & Engineering
Language: English
Pages: 103

Book Description
The final report of the National Research Council's (NRC) Panel on Statistical Methods for Testing and Evaluating Defense Systems (National Research Council, 1998) was intended to provide broad advice to the U.S. Department of Defense (DoD) on current statistical methods and principles that could be applied to the developmental and operational testing and evaluation of defense systems. To that end, the report contained chapters on the use of testing as a tool of system development; current methods of experimental design; evaluation methods; methods for testing and assessing reliability, availability, and maintainability; software development and testing; and validation of modeling and simulation for use in operational test and evaluation. While the examination of such a wide variety of topics was useful in helping DoD understand the breadth of problems to which statistical methods could be applied, and in providing direction as to how the methods currently used could be improved, there was, quite naturally, a lack of detail in each area. To address the need for further detail, two DoD agencies (the Office of the Director of Operational Test and Evaluation and the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics) asked the NRC's Committee on National Statistics to initiate a series of workshops on statistical issues relevant to defense acquisition. The aim of each workshop is to inform DoD about the methods that represent the statistical state of the art and, through interactions of the statistical and defense communities, to explore their relevance for DoD application.

Reliability Growth

Author: Panel on Reliability Growth Methods for Defense Systems
Publisher: National Academy Press
ISBN: 9780309314749
Category: Technology & Engineering
Language: English
Pages: 235

Book Description
A high percentage of defense systems fail to meet their reliability requirements. This is a serious problem for the U.S. Department of Defense (DOD), as well as for the nation. Those systems are not only less likely to carry out their intended missions successfully, but they could also endanger the lives of their operators. Furthermore, reliability failures discovered after deployment can result in costly and strategic delays and the need for expensive redesign, which often limits the tactical situations in which the system can be used. Finally, systems that fail to meet their reliability requirements are much more likely to need additional scheduled and unscheduled maintenance and to need more spare parts and possibly replacement systems, all of which can substantially increase the life-cycle costs of a system. Beginning in 2008, DOD undertook a concerted effort to raise the priority of reliability through greater use of design-for-reliability techniques, reliability growth testing, and formal reliability growth modeling, by both contractors and DOD units. To this end, handbooks, guidances, and formal memoranda were revised or newly issued to reduce the frequency of reliability deficiencies for defense systems in operational testing and the effects of those deficiencies. "Reliability Growth" evaluates these recent changes and, more generally, assesses how current DOD principles and practices could be modified to increase the likelihood that defense systems will satisfy their reliability requirements. This report examines changes to the reliability requirements for proposed systems; defines modern design and testing for reliability; discusses the contractor's role in reliability testing; and summarizes the current state of formal reliability growth modeling. The recommendations of "Reliability Growth" will improve the reliability of defense systems and protect the health of the valuable personnel who operate them.

Statistics, Testing, and Defense Acquisition

Author: National Research Council
Publisher: National Academies Press
ISBN: 0309066271
Category: Technology & Engineering
Language: English
Pages: 180

Book Description
The Panel on Statistical Methods for Testing and Evaluating Defense Systems had a broad mandate: to examine the use of statistics in conjunction with defense testing. This involved examining methods for software testing, reliability test planning and estimation, validation of modeling and simulation, and the use of modern techniques for experimental design. Given the breadth of these areas, including the great variety of applications and special issues that arise, making a contribution in each of them required that the Panel's work and recommendations be at a relatively general level. However, a variety of more specific research issues were either brought to the Panel's attention by members of the test and acquisition community, e.g., what was referred to as Dubin's challenge (addressed in the Panel's interim report), or were identified by members of the Panel. In many of these cases the Panel thought that a more in-depth analysis or a more detailed application of its suggestions or recommendations would either be useful as input to its deliberations or could help communicate the more individual views of Panel members to the defense test community. This resulted in several research efforts. Given various criteria, especially immediate relevance to the test and acquisition community, the Panel decided to make available three technical or background papers, each authored by a Panel member jointly with a colleague. These papers are individual contributions and are not a consensus product of the Panel; however, the Panel has drawn from them in preparing its final report, Statistics, Testing, and Defense Acquisition. The Panel found each of these papers extremely useful, and they are strongly recommended to readers of the final report.

Reliability Issues for DOD Systems

Author: National Research Council
Publisher: National Academies Press
ISBN: 030908606X
Category: Technology & Engineering
Language: English
Pages: 103

Book Description
The final report of the National Research Council's (NRC) Panel on Statistical Methods for Testing and Evaluating Defense Systems (National Research Council, 1998) was intended to provide broad advice to the U.S. Department of Defense (DoD) on current statistical methods and principles that could be applied to the developmental and operational testing and evaluation of defense systems. To that end, the report contained chapters on the use of testing as a tool of system development; current methods of experimental design; evaluation methods; methods for testing and assessing reliability, availability, and maintainability; software development and testing; and validation of modeling and simulation for use in operational test and evaluation. While the examination of such a wide variety of topics was useful in helping DoD understand the breadth of problems to which statistical methods could be applied, and in providing direction as to how the methods currently used could be improved, there was, quite naturally, a lack of detail in each area. To address the need for further detail, two DoD agencies (the Office of the Director of Operational Test and Evaluation and the Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics) asked the NRC's Committee on National Statistics to initiate a series of workshops on statistical issues relevant to defense acquisition. The aim of each workshop is to inform DoD about the methods that represent the statistical state of the art and, through interactions of the statistical and defense communities, to explore their relevance for DoD application.

Industrial Methods for the Effective Development and Testing of Defense Systems

Author: National Research Council
Publisher: National Academies Press
ISBN: 0309222737
Category: Technology & Engineering
Language: English
Pages: 102

Book Description
During the past decade and a half, the National Research Council, through its Committee on National Statistics, has carried out a number of studies on the application of statistical methods to improve the testing and development of defense systems. These studies were intended to provide advice to the Department of Defense (DOD), which sponsored them. The previous studies have been concerned with the role of statistical methods in testing and evaluation, reliability practices, software methods, combining information, and evolutionary acquisition. Industrial Methods for the Effective Development and Testing of Defense Systems is the latest in this series and, unlike the earlier studies, identifies current engineering practices that have proved successful in industrial applications for system development and testing. This report explores how developmental and operational testing, modeling and simulation, and related techniques can improve the development and performance of defense systems, particularly techniques that have been shown to be effective in industrial applications and are likely to be useful in defense system development. In addition to these broad issues, the report identifies three specific topics for its focus: finding failure modes earlier, technology maturity, and the use of all relevant information for operational assessments.

Testing of Defense Systems in an Evolutionary Acquisition Environment

Author: National Research Council
Publisher: National Academies Press
ISBN: 030918097X
Category: Technology & Engineering
Language: English
Pages: 76

Book Description
The Department of Defense (DoD) recently adopted evolutionary acquisition, a dynamic strategy for the development and acquisition of its defense systems. Evolutionary defense systems are planned, in advance, to be developed through several stages within a single procurement program. Each stage is planned to produce a viable system that could be fielded. The system requirements for each stage of development may be specified in advance of a given stage or may be decided at the outset of that stage's development. Because of the multiple stages that make up an evolutionary system, current testing and evaluation policies and processes, which were designed for single-stage developments, need careful reexamination. The Office of the Under Secretary of Defense for Acquisition, Technology and Logistics (USD-AT&L) and the Director of Operational Test and Evaluation (DOT&E) asked the Committee on National Statistics (CNSTAT) of the National Academies to examine the key issues and implications for defense testing arising from the introduction of evolutionary acquisition. CNSTAT was charged with planning and conducting a workshop to study test strategies for evolutionary acquisition. The committee reviewed defense materials defining evolutionary acquisition and interviewed test officials from the three major test service agencies to understand the current approaches used in testing systems procured through evolutionary acquisition. The committee also examined possible alternatives to identify problems in implementation. At the workshop, held on December 13-14, 2004, the committee addressed questions including: What are the appropriate roles and objectives for testing in an evolutionary environment? Can a systematic, disciplined process be developed for testing and evaluation in such a fluid and flexible environment? Is there adequate technical expertise within the acquisition community to fully exploit data gathered from previous stages and to effectively combine information from various sources for test design and analysis? Testing of Defense Systems in an Evolutionary Acquisition Environment provides the conclusions and recommendations that CNSTAT reached following the workshop and its other investigations.

Test and Evaluation

Author: Michael E. Motley
Publisher: DIANE Publishing
ISBN: 9780788141324
Category: History
Language: English
Pages: 54

Book Description
This report addresses test and evaluation (T&E) of software-intensive systems and DoD's efforts to improve the software process. DoD software costs total over $30 billion a year, of which two-thirds is for maintaining, upgrading, and modifying operational systems already in production. Today's major defense systems depend largely on the quality of this complex and increasingly costly software. Because software errors can cause a system to fail, possibly with life-threatening consequences, software-intensive systems need to be thoroughly tested before production. Charts and tables.

System Reliability Toolkit

Author: David Nicholls
Publisher: RIAC
ISBN: 1933904003
Category: Reliability (Engineering)
Language: English
Pages: 872

The Role of Autonomy in DOD Systems - Reports on Unmanned Aerial Vehicles (UAV), Robotics, Teleoperation, Haptics, Centibot, Remote Presence, UxV, DARPA Research, and Space and Ground Systems

Author: U.S. Military
Publisher:
ISBN: 9781549763373
Category:
Language: English
Pages: 120

Book Description
The Defense Science Board (DSB) Task Force on the Role of Autonomy in DoD Systems was asked to study relevant technologies, ongoing research, and the current autonomy-relevant plans of the Military Services in order to help DoD identify new opportunities to use autonomy more aggressively in military missions, to anticipate vulnerabilities, and to recommend ways of overcoming the operational difficulties and systemic barriers to realizing the full potential of autonomous systems. The Task Force concluded that, while currently fielded unmanned systems are making positive contributions across DoD operations, autonomy technology is being underutilized as a result of material obstacles within the Department that inhibit the broad acceptance of autonomy and the ability to more fully realize the benefits of unmanned systems.

Overall, the Task Force found that unmanned systems are having a significant, positive impact on DoD objectives worldwide. Their true value, however, is not to provide a direct human replacement but to extend and complement human capability. These systems extend human reach by providing potentially unlimited persistent capabilities without degradation due to fatigue or lack of attention, and they reduce human exposure to life-threatening tasks. They offer the warfighter more options and flexibility to access hazardous environments, work at small scales, or react at speeds and scales beyond human capability. With proper design of bounded autonomous capabilities, unmanned systems can also reduce the high cognitive load currently placed on operators and supervisors. Moreover, increased autonomy can enable humans to delegate tasks that are more effectively done by computer, including synchronizing activities among multiple unmanned systems, software agents, and warfighters, freeing humans to focus on more complex decision making.

Contents:
1.0 Executive Summary
  1.1 Misperceptions about Autonomy are Limiting its Adoption
  1.2 Create an Autonomous Systems Reference Framework to Replace "Levels of Autonomy"
  1.3 Technical Challenges Remain, Some Proven Autonomy Capability Underutilized
  1.4 Autonomous Systems Pose Unique Acquisition Challenges
  1.5 Avoid Capability Surprise by Anticipating Adversary Use of Autonomous Systems
2.0 Operational Benefits of Autonomy
  2.1 Unmanned Aerial Vehicles
  2.2 Unmanned Ground Systems
  2.3 Unmanned Maritime Vehicles
  2.4 Unmanned Space Systems
  2.5 Conclusion
3.0 Technical Issues of Autonomy
  3.1 Motivation: What Makes Autonomy Hard
  3.2 Defining Levels of Autonomy is Not Useful
  3.3 Autonomous System Reference Framework
  3.4 Needed Technology Development
  3.5 Technical Recommendations
4.0 Acquisition Issues of Autonomy
  4.1 Requirements and Development
  4.2 Test and Evaluation
  4.3 Transition to Operational Deployment
5.0 Capability Surprise in Autonomy Technology
  5.1 Overview of Global Unmanned Systems
  5.2 Unmanned Symmetric Adversary Scenarios
  5.3 Value for Asymmetric Adversaries
  5.4 External Vulnerabilities
  5.5 Self-Imposed Vulnerabilities
  5.6 Recommendations
Appendix A: Details of Operational Benefits by Domain (A.1 Aerial Systems Strategy; A.2 Maritime Systems; A.3 Ground Systems; A.4 Space Systems)
Appendix B: Bibliography
Appendix C: Task Force Terms of Reference
Appendix D: Task Force Membership
Appendix E: Task Force Briefings
Appendix F: Glossary

Information Technology and Military Power

Author: Jon R. Lindsay
Publisher: Cornell University Press
ISBN: 1501749579
Category: Political Science
Language: English
Pages: 366

Book Description
Militaries with state-of-the-art information technology sometimes bog down in confusing conflicts. To understand why, it is important to understand the micro-foundations of military power in the information age, and this is exactly what Jon R. Lindsay's Information Technology and Military Power gives us. As Lindsay shows, digital systems now mediate almost every effort to gather, store, display, analyze, and communicate information in military organizations. He highlights how personnel now struggle with their own information systems as much as with the enemy. Throughout this foray into networked technology in military operations, we see how information practice—the ways in which practitioners use technology in actual operations—shapes the effectiveness of military performance. The quality of information practice depends on the interaction between strategic problems and organizational solutions. Information Technology and Military Power explores information practice through a series of detailed historical cases and ethnographic studies of military organizations at war. Lindsay explains why the US military, despite all its technological advantages, has struggled for so long in unconventional conflicts against weaker adversaries. This same perspective suggests that the US retains important advantages against advanced competitors like China that are less prepared to cope with the complexity of information systems in wartime. Lindsay argues convincingly that a better understanding of how personnel actually use technology can inform the design of command and control, improve the net assessment of military power, and promote reforms to improve military performance. Warfighting problems and technical solutions keep on changing, but information practice is always stuck in between.