Augmented Reality Atmospheric Transport & Dispersion Simulation

BACKGROUND

The Augmented REality Sandtable (ARES), developed by the Army Research Laboratory (ARL) Advanced Simulation and Training Division (ATSD) in Orlando, Florida, is a prototype interactive visualization environment used by ARL for research in battlespace visualization. ARES is in use at several locations, such as West Point, Camp Blanding, and a few ROTC units, as a training aid. The software (and in some cases, additional tables) is being shared with several organizations for collaborative research. The ARES architecture is intended to be an open, extensible platform, expanded by the addition of new capabilities (as "apps") through the ARES APIs. A critical future use case involves plume dispersion of chemical, biological, radiological, nuclear, or explosive (CBRNE) constituents. This project aims to couple an NCAR Atmospheric Transport & Dispersion (AT&D) simulation capability to the ARES system.

Figure 1: User Interaction with the Augmented Reality Sandtable (upper left). ARES-SimBox integrated visualization and simulation system (right). Simulated plume evolution rendered via animation onto a sandtable (lower left).

RAL staff have developed a software framework and custom hardware platform for advanced atmospheric flow and constituent plume transport and dispersion simulations, which a sandtable user can request through the ARES APIs. The user prescribes and submits a simulation request via a RabbitMQ message exchange to the "SimBox" server. The server then performs automated setup and launch of a specific simulation, based on allowable user inputs, through a Python model-view-controller software framework. The simulation server is a multithreaded Python daemon responsible for setting up and launching a specific MPI-capable WRF simulation. Upon successful launch, the simulation server uses Python watchdog threads to track WRF simulation results as they are written to file. As a WRF simulation progresses and new results are continuously written, the simulation server encodes these data and publishes them, again via RabbitMQ, for rendering plume/meteorology animations back onto the sandtable.
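The result-reporting path can be illustrated with a minimal sketch, assuming the pika RabbitMQ client alongside the watchdog package; the output directory, exchange name, and raw-bytes payload below are hypothetical placeholders rather than the actual SimBox API.

# Minimal sketch of the SimBox result-reporting loop: a watchdog handler
# detects newly written WRF output files and republishes them on a RabbitMQ
# exchange for rendering on the sandtable. Directory, exchange name, and
# payload encoding are illustrative; the real server encodes results
# (e.g., via protocol buffers) rather than shipping raw files.
import time

import pika
from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

WRF_OUTPUT_DIR = "/data/wrf/run"      # hypothetical WRF run directory
RESULTS_EXCHANGE = "simbox.results"   # hypothetical exchange name


class WrfOutputHandler(FileSystemEventHandler):
    """Publish each new WRF output file as it appears on disk."""

    def __init__(self, channel):
        self.channel = channel

    def on_created(self, event):
        # A production daemon would wait for the file to be fully written
        # and coordinate threads (pika channels are not thread-safe).
        if event.is_directory or "wrfout" not in event.src_path:
            return
        with open(event.src_path, "rb") as f:
            payload = f.read()
        self.channel.basic_publish(
            exchange=RESULTS_EXCHANGE, routing_key="", body=payload)


def main():
    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.exchange_declare(exchange=RESULTS_EXCHANGE, exchange_type="fanout")

    observer = Observer()
    observer.schedule(WrfOutputHandler(channel), WRF_OUTPUT_DIR, recursive=False)
    observer.start()
    try:
        while True:  # the real simulation server runs as a multithreaded daemon
            time.sleep(1)
    finally:
        observer.stop()
        observer.join()
        connection.close()


if __name__ == "__main__":
    main()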

This integration of NCAR AT&D simulation capability with an augmented reality visualization medium offers a potentially disruptive technological advance over current training and strategic planning paradigms across many emergency response preparedness application spaces, from urban air transport and dispersion consequence assessment to wildland fire or extreme weather events.

FY2017 & FY2018 ACCOMPLISHMENTS

  • Specialized hardware specification, procurement, and deployment. Twin "SimBox" machines, each containing 20 CPU cores and an NVIDIA GP100 graphics processing unit (GPU) capable of performing simulations with WRF and the new GPU-accelerated FastEddy large eddy simulation (LES) model, were obtained and distributed (one to NCAR, and one to ARL collaborators) to facilitate integrated system development.
  • SimBox-server API design and implementation. The simulation server API and model-view-controller (MVC) design and implementation, combining RabbitMQ, watchdog threads, Google Protocol Buffers, and Python initiation of WRF simulations, were completed and disseminated to ARES collaborators as an alpha release of the SimBox-server (see the request-submission sketch following this list).
  • WRF customizations for automated setup, launch, and results reporting of AT&D simulations were completed.
  • System integration testing was performed and minor technical issues were ironed out in preparation for a full-system beta release.
  • Robustness testing and hardening of the simulation setup/launch parameters and allowable configurations were performed to meet the specific needs of the instructional-setting use case.
  • Successful integrated system deployment in a plume dynamics instructional setting.
  • Addition of extended model-data communications to facilitate advanced augmented reality rendering, including three-dimensional plume animation, and incorporation of accelerated turbulence-resolving simulations (GPU-LES on the SimBox).
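
To make the request path concrete, the following minimal sketch (referenced from the SimBox-server API bullet above) publishes a protobuf-encoded simulation request to the server over RabbitMQ. The sim_request_pb2 module, its message fields, and the queue name are hypothetical stand-ins for the actual SimBox-server messages.

# Illustrative client-side submission of a simulation request to the SimBox
# server. The generated module sim_request_pb2 and all field names are
# hypothetical; the real schema defines the allowable user inputs.
import pika

import sim_request_pb2  # would be generated by protoc from sim_request.proto

REQUEST_QUEUE = "simbox.requests"  # hypothetical queue name

request = sim_request_pb2.SimulationRequest()
request.release_lat = 28.538       # plume release location (deg)
request.release_lon = -81.379
request.release_height_m = 2.0     # release height above ground (m)
request.duration_s = 3600          # simulation length (s)

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue=REQUEST_QUEUE, durable=True)
channel.basic_publish(exchange="", routing_key=REQUEST_QUEUE,
                      body=request.SerializeToString())
connection.close()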

 

Figure 2: User Interaction with the Augmented Reality Sandtable (ARES) and SimBox model integration system. Projection of near-surface windfield quivers and plume concentration footprint both onto the sandtable (background), and through a tablet (foreground) for an enhanced end-user immersive experience.
Figure 3: Integration of high space and time fidelity, dynamically evolving three-dimensional turbulent plume structure and concentration via virtual reality and augmented reality modalities (Hololens-left, tablet-right).

FY2019 PLANS

Urban environment effects will be addressed through two candidate approaches. Buildings will be represented either on the resolved grid through the immersed boundary method, or through a geometry-resolved, subgrid-scale, porous-media-like drag formulation.
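
The drag-formulation approach lends itself to a compact illustration: within grid cells flagged as containing building volume, a quadratic canopy-style drag term decelerates the resolved wind. In the minimal sketch below, the drag coefficient, frontal-area density, and building_frac field are illustrative assumptions, not the project's actual formulation.

# Minimal sketch of a subgrid-scale, porous-media-like building drag term:
# momentum tendencies of the form -c_d * a_f * |V| * (u, v), applied only
# where buildings occupy part of the grid cell. Coefficients are illustrative.
import numpy as np


def building_drag_tendency(u, v, building_frac, c_d=1.0, a_f=0.3):
    """Return (du/dt, dv/dt) tendencies [m/s^2] from building drag.

    u, v          : resolved horizontal wind components [m/s]
    building_frac : fraction of each grid cell occupied by buildings (0-1)
    c_d           : bulk drag coefficient (illustrative value)
    a_f           : frontal area density [1/m] (illustrative value)
    """
    speed = np.sqrt(u**2 + v**2)
    factor = -c_d * a_f * building_frac * speed
    return factor * u, factor * v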

WRF to GPU-LES coupling for combined mesoscale and microscale modeling in one system, utilizing the cell perturbation method for resolved-turbulence instigation at the nested boundaries of the LES domains.
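
For illustration, the cell perturbation method seeds resolved turbulence by applying random potential-temperature perturbations that are uniform within coarse blocks ("cells") of grid points in a strip along the inflow boundary of the nested LES domain. The cell size, strip depth, and amplitude in this minimal sketch are illustrative choices rather than the tuned values used in practice.

# Minimal sketch of the cell perturbation method for one model level: random
# potential-temperature perturbations, constant within cell x cell blocks,
# applied only within a few cells of the western (inflow) lateral boundary.
# Cell size, strip depth, and amplitude are illustrative.
import numpy as np


def cell_perturbations(nx, ny, cell=8, n_cells_in=3, amplitude=0.5, rng=None):
    """Return a (ny, nx) potential-temperature perturbation field [K]."""
    rng = np.random.default_rng() if rng is None else rng
    theta_p = np.zeros((ny, nx))
    width = n_cells_in * cell  # depth of the perturbed boundary strip
    # One uniform random value per coarse cell in the boundary strip.
    blocks = rng.uniform(-amplitude, amplitude,
                         size=(ny // cell + 1, width // cell))
    for j in range(0, ny, cell):
        for i in range(0, width, cell):
            theta_p[j:j + cell, i:i + cell] = blocks[j // cell, i // cell]
    return theta_p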