Found 130 Papers
Training and Performance of Multiteam Systems in Naval Warfare Environments
Year: 2017
Authors: Leah Ellison, Jessica Wildman, Patrick Converse, Erin Richard, Trevor Fry, Shelby-Jo Ponto, Jennifer Pagan, Alyssa Mercado, Melissa Walwanis, Andrea Postlewate, Amy Bolton
Abstract: Multiteam systems (MTSs) often provide benefits over traditional teams when completing work or tasks in the context of complex and dynamic environments. However, challenges still exist in understanding and capturing the processes driving successful MTS performance. In the current effort, a cognitive task analysis (CTA) methodology was utilized to explore the driving antecedents of successful MTS coordination and integration within a carrier strike group (CSG) operating in a Naval warfare environment. The CTA identified critical incidents and emergent themes through structured interviews of 59 subject matter experts across Naval surface and air units operating in warfare environments. Researchers utilized a top-down approach, leveraging existing frameworks (Ishak & Ballard, 2012; Marks, Mathieu, & Zaccaro, 2001; Mathieu, Maynard, Rapp, & Gilson, 2008; Pagan, Kaste, Zemen, Walwanis, Wood, & Jorett, 2015; Wildman et al., 2012) of team knowledge, skills, and attitudes (KSAs) to be applied to the multiteam domain of the CSG. The framework was used to code CTA data to determine the KSAs necessary for successful MTS performance and was modified to reflect domain specificity as required. The KSA framework was then used as guidance to provide recommendations for MTS training and performance measurement. These recommendations are currently being used to develop specific, multilevel performance measures of the KSAs needed to effectively operate in changing, complex environments. The development of these performance measures also coincides with efforts to develop training to provide feedback on coordination, information exchange, and other elements of MTS performance. Finally, efforts are also underway to develop experimental, quasi-experimental, and agent-based modeling studies to evaluate the recommendations and performance measurement criteria. Execution of these recommendations, performance measures, and training is expected to improve decision-making and information exchange of the CSG as a whole within these complex warfare environments where these processes are critical to mission success.
Developing an Immersive Virtual Reality Aircrew Training Capability
Year: 2017
Authors: Dr. Eric Sikorski, Amanda Palla, Dr. Linda Brent
Abstract: The Combating Terrorism Technical Support Office (CTTSO) and the Air Force Special Operations Command (AFSOC) are assessing the training effectiveness of an immersive virtual reality part task trainer (vrPTT) for AC-130 cockpit familiarization. AFSOC wants to move AC-130 checklist procedures and cockpit familiarization training from low-density/high-demand weapons system simulators to a widely available, low-cost vrPTT, while simultaneously migrating the tutorial courseware and instructor-led portions of the course to a higher fidelity, more immersive environment. This blended learning vrPTT will allow pilots to receive instruction and immediately proceed to practicing scenarios in a highly realistic, immersive Virtual Reality (VR) environment. It also reduces time in the simulators by allowing pilots to learn the procedures and develop muscle memory beforehand. The vrPTT was designed and developed from an extensive front-end analysis, including task performance interviews with AC-130 pilots and instructors. The system consists of a 3D VR representation of the AC-130 cockpit viewed through an Oculus Rift head-mounted display (HMD) and underlying equipment behavior models that produce appropriate responses to pilot inputs. Pilots interact with the virtual control systems via an integrated Leap Motion infrared sensor that tracks the position and motion of all ten fingers. Tutorial content is laid over the equipment simulation, and an integrated intelligent tutor provides adaptive feedback during a scenario and adjusts the initial instruction for future exercises. This paper describes the front-end analyses performed to create the measures for the intelligent tutor, approaches to overlaying courseware in a VR HMD environment, and the overall system required to achieve acceptance by the pilot user group. It also details future phases of this program, including a training effectiveness study comparing a control group of simulator students who have not used the VR system to those who have.
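The abstract does not describe the tutor's adaptation logic in detail. As a minimal sketch of how such adaptive feedback might work, the snippet below applies a simple rule-based policy: hypothetical step-completion times and error counts drive the amount of scaffolding shown in the next exercise. All names and thresholds are assumptions for illustration, not the vrPTT's implementation.

```python
from dataclasses import dataclass

# Hypothetical record of one checklist step attempt inside a vrPTT-style trainer.
@dataclass
class StepResult:
    step_id: str
    completion_time_s: float   # time to finish the step
    errors: int                # wrong switch/control activations

# Assumed thresholds; the real tutor's criteria are not given in the paper.
TIME_LIMIT_S = 20.0
ERROR_LIMIT = 1

def next_scaffolding_level(results: list[StepResult], current_level: int) -> int:
    """Raise scaffolding (more prompts) after a weak run, lower it after a clean one."""
    slow = sum(r.completion_time_s > TIME_LIMIT_S for r in results)
    error_steps = sum(r.errors > ERROR_LIMIT for r in results)
    if slow + error_steps > len(results) // 2:
        return min(current_level + 1, 3)   # add guidance in the next exercise
    if slow == 0 and error_steps == 0:
        return max(current_level - 1, 0)   # fade guidance toward free practice
    return current_level

# Example: a weak run raises the scaffolding level for the next exercise.
run = [StepResult("battery_on", 12.4, 0), StepResult("fuel_pumps", 27.9, 2)]
print(next_scaffolding_level(run, current_level=1))
```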
Modelling and Simulation as a Service from End User Perspective
Year: 2017
Authors: Lt.Col. Marco Biagini, Lt.Col. Jason Jones, Lt.Col. Michelle La Grotta, Maj. Alfio Scaccianoce, Capt. Fabio Corona, Dr. Dalibor Prochazka, Agatino Mursia, Marco Picollo, Christian Faillace
Abstract: Modelling and Simulation as a Service (MSaaS) is a new approach being explored by the NATO Science and Technology Organization (STO) Modelling & Simulation Group (MSG) Panel for a permanently available, flexible, service-based framework that provides more cost-effective availability of Modelling and Simulation (M&S) products, data, and processes to a large number of users on demand. This Research Task Group is developing the implementation of this framework, defining policies, stakeholders’ roles, services, a reference architecture, and reference engineering processes. MSaaS can be defined as “enterprise-based level architecture for discovery, orchestration, deployment, delivery and management of M&S services”. The University of Defence of the Czech Republic and the NATO M&S Centre of Excellence are investigating and proposing an approach to contribute to the definition of MSaaS from an End User perspective. The paper proposes definitions of M&S Software as a Service (MSSaaS), M&S Platform as a Service (MSPaaS), and M&S Infrastructure as a Service (MSIaaS) to introduce new roles and new business connections, taking into consideration Service Oriented Architecture (SOA) definitions and those stated in the NATO Modelling and Simulation Master Plan (NMSMP). In particular, the authors propose a contribution to the definition of the different stakeholders’ roles and their relationships, starting from those of the MSG 136 group (M&S Group 136, Modelling and Simulation as a Service) and introducing new roles regarding the End User. In conclusion, this research activity proposes, in addition to the existing definitions, a taxonomy comparing roles across service models (MSSaaS, MSPaaS, and MSIaaS). Furthermore, the classification of M&S services is analysed in the framework of the MSG 136 Operational Concept draft, in order to identify the services that must be properly composed and orchestrated to satisfy the End User requirements.
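As a rough illustration of the kind of role-by-service-model taxonomy the authors propose, the sketch below encodes a hypothetical mapping from the three layers (MSSaaS, MSPaaS, MSIaaS) to example roles. The layer names come from the abstract; the role assignments are placeholders, not the MSG 136 or NMSMP definitions.

```python
from enum import Enum

class ServiceModel(Enum):
    MSSAAS = "M&S Software as a Service"
    MSPAAS = "M&S Platform as a Service"
    MSIAAS = "M&S Infrastructure as a Service"

# Placeholder role taxonomy; the actual MSG 136 roles are richer and differ.
ROLE_TAXONOMY = {
    ServiceModel.MSSAAS: ["End User", "Exercise Planner"],
    ServiceModel.MSPAAS: ["Service Composer", "Integrator"],
    ServiceModel.MSIAAS: ["Infrastructure Provider", "Security Officer"],
}

def roles_for(model: ServiceModel) -> list[str]:
    """Return the roles associated with one service model in this sketch."""
    return ROLE_TAXONOMY[model]

for model in ServiceModel:
    print(f"{model.value}: {', '.join(roles_for(model))}")
```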
Performance Measurement in LVC Distributed Simulations: Lessons from OBW
Year: 2017
Authors: Jaclyn Hoke, Lisa Townsend, Sam Giambarberee, Sae Schatz
Abstract: Operation Blended Warrior (OBW) 2016 marked the second year of a three-year effort to document lessons learned and understand barriers to implementing Live, Virtual, Constructive (LVC) distributed training. In the first year of the event, LVC focus areas included connectivity, interoperability, data standards, after-action review, and cyber security. Year two introduced additional focus areas: multi-level security, cross domain solutions, long-haul feeds, and performance measurement. This paper focuses on this latter area: defining and collecting performance measures. Performance measurement in simulation-based training faces formidable obstacles, including the identification of individual and collective performance dimensions, how these dimensions relate to training goals, and how training transfers to operational readiness. Blending of LVC elements introduces additional complexity, not only for human performance assessment but also for evaluating the effectiveness and efficiency of the technical system. In this paper, we present the measures defined and collected during OBW in four primary areas: 1) cost analysis, 2) network performance, 3) trainee performance, and 4) whether OBW met the expectations of participating organizations. We also discuss three categories of Measures of Effectiveness (MoEs) and Measures of Performance (MoPs) established by the OBW Strategic Integrated Product Team: Programmatic, Technological, and Learning. These MoEs and MoPs will facilitate annual comparisons of performance measurement at OBW and encourage use of the event as a sandbox to design and validate LVC performance measurement tools. Finally, we present the goals and measures established for performance measurement during OBW 2017 and recommendations for future events.
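To make the measurement structure concrete, here is a minimal sketch of how OBW-style measures could be recorded by category (Programmatic, Technological, Learning) and by primary area; the field names and example values are assumptions for illustration, not the Strategic Integrated Product Team's actual instrument.

```python
from dataclasses import dataclass, field
from enum import Enum

class Category(Enum):
    PROGRAMMATIC = "Programmatic"
    TECHNOLOGICAL = "Technological"
    LEARNING = "Learning"

@dataclass
class Measure:
    name: str
    category: Category
    area: str          # e.g. cost analysis, network performance, trainee performance
    unit: str
    values: list[float] = field(default_factory=list)

    def record(self, value: float) -> None:
        """Append one observation collected during the event."""
        self.values.append(value)

# Hypothetical example; actual OBW measures are defined by the Strategic IPT.
latency = Measure("Cross-site network latency", Category.TECHNOLOGICAL,
                  "network performance", "ms")
latency.record(87.0)
latency.record(102.5)
print(latency.name, sum(latency.values) / len(latency.values), latency.unit)
```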
Predicting Manufacturing Aptitude using Augmented Reality Work Instructions
Year: 2017
Authors: Anastacia MacAllister, Eliot Winer, Jack Miller
Abstract: The complexity of manufactured equipment for the U.S. military has increased substantially over the past decade. As more complex technology is integrated into battlefield equipment, it is more important than ever that workers manufacturing this equipment have the necessary skills. These specialized manufacturing skills require careful workforce selection and training. However, traditionally, workers are assigned roles based on instructor evaluation and qualitative self-assessments. Unfortunately, these assessments provide limited detail about a candidate’s aptitude. By using more detailed data captured from assembly operations, a more complete profile of an operator’s skills can be developed. This profile can then guide assignment of a worker to maximize productivity. This paper develops a Bayesian Network (BN) to predict worker performance using data captured from 75 participants via augmented reality guided assembly instructions. Information collected included step completion times, spatial abilities, and time spent on different assembly operations. For analysis, participant data was divided into training and testing sets. The data was mined for trends that could statistically predict measures of performance like errors or completion time. Based on these trends, the training set was used to construct the BN. The authors found that the model could predict some aspects of performance accurately, such as assembly completion time in the testing set. While these results were encouraging, further analysis demonstrated the network was biased by probabilities that were greatly influenced by the number of data points present in a category. The results highlight that, with small data sets, there is often not enough observed evidence to produce accurate predictions with a BN. This suggests that a method of data simulation or generation is required to increase the number of training set samples. This would enable powerful BN tools to be used in real-world manufacturing applications where collecting hundreds of thousands of data points is not feasible.
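The abstract does not give the network structure or features, so the following is only a sketch of the general approach under assumed data: discretize AR-captured features, estimate conditional probability tables by counting on a training split, and predict a completion-time category on the test split. Column names, bins, and values are hypothetical, and no external BN library is assumed.

```python
import pandas as pd

# Hypothetical AR-capture data: spatial-ability category, typical step time,
# and an overall completion-time category to predict.
df = pd.DataFrame({
    "spatial":    ["high", "high", "low", "low", "high", "low", "high", "low"],
    "step_time":  ["fast", "fast", "slow", "slow", "fast", "fast", "slow", "slow"],
    "completion": ["fast", "fast", "slow", "slow", "fast", "slow", "slow", "slow"],
})
train, test = df.iloc[:6], df.iloc[6:]

# Tiny Bayesian network with CPTs estimated by counting:
# completion -> spatial, completion -> step_time.
prior = train["completion"].value_counts(normalize=True)
cpts = {col: train.groupby("completion")[col].value_counts(normalize=True).to_dict()
        for col in ("spatial", "step_time")}

def predict(row) -> str:
    """Pick the completion category with the highest posterior score."""
    scores = {}
    for c in prior.index:
        p = prior[c]
        for col in cpts:
            p *= cpts[col].get((c, row[col]), 1e-6)  # smooth unseen combinations
        scores[c] = p
    return max(scores, key=scores.get)

print([predict(r) for _, r in test.iterrows()])
```

With so few rows the estimated probabilities are dominated by category counts, which mirrors the small-sample bias the abstract reports.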
Creating Effective LVC Training with Augmented Reality
Year: 2017
Authors: Nathan Jones, Koren Odermann, Peter Squire, Adrienne Read, Natalie Steinhauser
Abstract: While live training is often the preferred method for Marines, Force-On-Force (FoF) training exercises lack the visual cues necessary to effectively train Call for Fire (CFF), Close Air Support (CAS), and other engagements. For decades, imagination has been the method of training and feedback on live ranges when weapons and platforms were unavailable or limited by cost. Augmented Reality (AR) training systems now offer the opportunity to provide realistic visuals of virtual and/or constructive entities and engagements on the live range. To ensure that AR technology can be utilized to support FoF training, an assessment of the Augmented Immersive Team Trainer (AITT), an AR training system, was conducted to determine how well AITT supports specified training objectives. The AITT system was developed by the Office of Naval Research (ONR) and transitioned to Program Manager Training Systems (PM TRASYS). The program office technical assessment team utilized a task- and attribute-based approach to assess the simulator and simulation on both the activities an individual is required to do in the performance of a specific job (i.e., tasks) and the fidelity the device is required to provide to support that performance (i.e., attributes). This paper describes the AITT technology and the assessment conducted to support training objectives. In addition, the paper discusses the potential for AR technology to enhance the live component in Live-Virtual-Constructive (LVC) environments. The paper provides discussion points based on lessons learned and on the development required for AR systems to enable an effective LVC training solution.
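The task-and-attribute approach lends itself to a simple illustration: map each training task to the fidelity attributes it requires and compare them against the attributes the device provides. The tasks and attributes below are invented placeholders, not the PM TRASYS assessment items.

```python
# Placeholder task/attribute data; the real assessment items are not public here.
REQUIRED = {
    "Call for Fire": {"terrain visual cues", "round impact effects", "radio comms"},
    "Close Air Support": {"aircraft visual model", "ordnance effects", "radio comms"},
}
PROVIDED = {"terrain visual cues", "round impact effects", "radio comms",
            "aircraft visual model"}

def coverage(task: str) -> float:
    """Fraction of a task's required fidelity attributes the AR system supports."""
    needed = REQUIRED[task]
    return len(needed & PROVIDED) / len(needed)

for task in REQUIRED:
    print(f"{task}: {coverage(task):.0%} of required attributes supported")
```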
Expert-Assisted Field Maintenance using Augmented Reality
Year: 2017
Authors: Jonathan Schlueter, Eliot Winer
Abstract: In 2016, the US military had an operation and maintenance budget of almost USD 200 billion, the largest budget of any appropriation category. As such, it is imperative to minimize errors and costs when performing maintenance tasks. Unfortunately, the military may not always have enough skilled technicians on hand to send to all maintenance sites. Because of this, warfighters must often perform maintenance on systems outside their area of expertise. Augmented reality (AR) has been shown to effectively deliver context-aware instructions, reducing the time needed to identify suitable maintenance steps by more than 50%. However, even with the use of AR, it would be impractical to create maintenance instructions for each unique piece of machinery. By connecting a remote warfighter to a skilled technician, a wealth of maintenance knowledge can be quickly transferred in a targeted manner. This paper details a mobile application that puts both the power of AR and the knowledge of a technician directly into the hands of a warfighter in the field. The application was developed using the Unity3D game engine and enables technicians to use AR to visually share maintenance knowledge with a remote warfighter. A live camera feed of the warfighter’s immediate area is streamed to the technician. Observing this feed, the technician sends back real-time maintenance guidance in the form of augmented 3D models and animations, selected from a list or created dynamically. Once received by the remote warfighter, the augmentations are overlaid onto the physical object, and the technician-recommended maintenance step is visible to the warfighter. While there has been ample research regarding AR-assisted processes, little focus has been given to leveraging the detailed knowledge of existing personnel. This paper discusses the application development and the technical evaluation performed to ensure real-time connectivity across geographically distributed locations.
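The paper does not publish its message protocol; the sketch below shows a hypothetical annotation schema of the kind a technician might send back alongside the video stream (model ID, pose, short instruction), simply to make the data flow concrete. The field names are assumptions, not the application's actual format.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical annotation the remote technician sends to the field device:
# which 3D model to show, where to anchor it relative to a tracked reference,
# and a one-line instruction for the current maintenance step.
@dataclass
class ArAnnotation:
    model_id: str                         # e.g. an asset name in a shared library
    position: tuple[float, float, float]  # metres, relative to the tracked anchor
    rotation: tuple[float, float, float]  # Euler angles in degrees
    instruction: str

def encode(annotation: ArAnnotation) -> str:
    """Serialize an annotation for transmission alongside the video stream."""
    return json.dumps(asdict(annotation))

def decode(payload: str) -> ArAnnotation:
    """Rebuild the annotation on the warfighter's device before overlaying it."""
    data = json.loads(payload)
    return ArAnnotation(data["model_id"], tuple(data["position"]),
                        tuple(data["rotation"]), data["instruction"])

msg = encode(ArAnnotation("drain_plug_arrow", (0.12, 0.05, 0.0), (0.0, 90.0, 0.0),
                          "Loosen the drain plug a quarter turn."))
print(decode(msg))
```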
Emerging Network and Architecture Technology Enhancements to Support Future Training Environments
Year: 2017
Authors: Bruce Caulkins Ph.D., Brian Goldiez Ph.D., Paul Wiegand Ph.D., Glenn Martin Ph.D., Paul Dumanoir, Tom Torres
Abstract: The Operational Environment (OE) has become increasingly complex, with challenging human factors, exponential proliferation of technology, and an increasingly determined, adaptive threat. Training Army Soldiers, leaders, and units in a complex world requires modernized, integrated, realistic, and adaptive training capabilities. The Army must leverage emerging technologies to transform the way it develops and delivers training to enable agile, adaptive Soldiers and leaders and versatile units. It must provide a consistent, persistent ability to train at the point of need (PON) for both current and future operations as part of a Joint, Inter-organizational, and Multinational (JIM) force. The training venues must allow the Army to train as it fights, using its wartime systems on its operational networks, and all training environments must replicate the OE to the greatest extent possible. To address this reality, the U.S. Army Program Executive Office for Simulation, Training and Instrumentation (PEO STRI) and the University of Central Florida (UCF) Institute for Simulation and Training (IST) are conducting research to create new capabilities that support both operations and training while enabling software application migration to Army enterprise data centers and cloud environments. This research pivots on Army directives that focus on modernizing information technology systems and applications. To achieve the distributed nature of this vision, technical enhancements to the underlying Army Enterprise Network (AEN) must be made in the next few years. This paper investigates potential gaps in simulation enterprise network architectures and describes research results in three major technical areas that address these gaps and will benefit future simulation and network architectures. The research topics include technologies that (1) efficiently deliver simulations from cloud-like environments using Software Defined Networks (SDNs); (2) facilitate individual and collective home station or field-based training through the use of thin clients; and (3) optimize computational resources through load-balancing techniques and processes.
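As a toy illustration of the third research topic, load balancing of computational resources, the sketch below picks the least-loaded of several hypothetical simulation hosts before assigning a new training session. Host names, metrics, and weights are invented for the example and do not describe the PEO STRI/UCF prototype.

```python
# Minimal least-loaded host selection, one simple load-balancing policy.
# Host names and metrics are placeholders, not the actual AEN architecture.
hosts = {
    "sim-node-01": {"cpu": 0.62, "sessions": 14},
    "sim-node-02": {"cpu": 0.35, "sessions": 9},
    "sim-node-03": {"cpu": 0.81, "sessions": 20},
}

def score(metrics: dict) -> float:
    """Lower is better: weight CPU utilisation and active session count."""
    return 0.7 * metrics["cpu"] + 0.3 * (metrics["sessions"] / 32)

def pick_host(pool: dict) -> str:
    """Assign the next training session to the least-loaded simulation host."""
    return min(pool, key=lambda name: score(pool[name]))

print(pick_host(hosts))   # -> sim-node-02 with these placeholder figures
```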
Using Virtual Simulation for Training the Brazilian Armored TTP
Year: 2017
Authors: Andrei Piccinini Legg, Osmar Marchi dos Santos, Pedro Procopio de Castro, Victor Emanuel Neves, Cristiano de Souza Dorneles, Rodrigo Dias Neto
Abstract: The Brazilian Army has been pushing the use of simulation technology as a key component during training across its different training establishments. One of the first schools inside the Brazilian Army to adopt simulation technology was the Brazilian Armor School (CI Bld - Centro de Instrução de Blindados). During training, CI Bld instructors use virtual simulators to train students in the latest Brazilian Armored Tactics, Techniques, and Procedures (TTP) at the company level. Training exercises follow a very tight schedule, which includes up-to-date TTP subjects taught with virtual simulators and a tailored after-action review. The exercises with virtual simulators have been taking place since 2012, and more than 1000 students have attended them since then. This work presents the methodology used in the exercises at CI Bld and how it has improved training of the Brazilian Armored TTP. Moreover, the changes made over these years to enhance the training exercises are discussed.
Web-Based GUI System for Controlling Entities in Constructive Simulations
Year: 2017
Authors: Per-Idar Evensen, Kristian Selvaag, Dan Helge Bentsen, Håvard Stien
Abstract: At the Norwegian Defence Research Establishment (FFI) we investigate how to increase combat effectiveness in land force operations. As part of this work we need to conduct detailed, entity-level simulations of battalion to brigade level operations, to assess the performance of different land force structures and operational concepts. Traditional constructive simulation systems often do not have the required level of resolution, are too complex and cumbersome to use, or are not flexible enough with respect to representation of new technologies (e.g. new sensor systems, weapon systems, and protection systems). Previously, we have successfully employed Virtual Battlespace (VBS) for several smaller-sized (platoon to company) virtual simulation experiments for evaluating the operational benefit of new technologies and new concepts. Recent improvements allow simulation of more than a thousand constructive, semi-automated entities in VBS, but VBS currently lacks an appropriate user interface for controlling constructive entities. We are developing an easy-to-use, web-based graphical user interface (GUI) system for controlling constructive entities simulated in VBS. So far we have developed functionality for controlling indirect fire entities and maneuver entities. In the future we plan to extend the system with functionality for controlling combat service and support entities simulated in VBS, and air and air defense entities possibly simulated in VR-Forces. Since we conduct simulations for experimentation and analysis purposes, and not command and staff training, the system has been designed to require only a minimum amount of input from the operators. It is a goal that military officers should be able to control the entities with minimal instruction. In addition, the simulations should be conducted with a minimum number of operators on each side. In this paper we describe the overall design and implementation of the GUI system, as well as the experiences from the initial experiments with the system.
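The paper does not specify its web API, but the general pattern (a browser GUI posting orders that a bridge process forwards to the simulation) can be sketched with a hypothetical Flask endpoint that accepts a move order for a constructive entity. The route, payload fields, and forward_to_vbs stub are assumptions, not FFI's implementation.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def forward_to_vbs(order: dict) -> None:
    """Stub: a real bridge would translate the order into simulation calls."""
    print("forwarding order to simulation:", order)

@app.route("/entities/<entity_id>/move", methods=["POST"])
def move_entity(entity_id: str):
    """Accept a move order from the web GUI for one constructive entity."""
    body = request.get_json(force=True)
    order = {
        "type": "move",
        "entity": entity_id,
        "waypoint": body["waypoint"],      # e.g. {"x": 1250.0, "y": 870.5}
        "speed": body.get("speed", "normal"),
    }
    forward_to_vbs(order)
    return jsonify({"status": "accepted", "order": order}), 202

if __name__ == "__main__":
    app.run(port=8080)   # local sketch only
```

Keeping the GUI behind a small set of coarse-grained order endpoints like this matches the paper's stated goal of requiring only minimal operator input.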