DARE Demonstrates Magic Leap and HoloLens Playing Nice

Posted on March 22, 2019

RNI has recently demonstrated the Dismounted Augmented Reality Environment (DARE) abstracted-plugin concept, with client and server master libraries, using the Magic Leap 1 and the HoloLens (rev 1). The demonstration uses the DARE master library as both a client and a multiplayer server that can potentially serve all cloud (or local DARE) connections (spectator, player, battle master, admin, configurator, etc.).

A surveyed fiducial marker (a simple paper sticker) was pasted on the wall. Before the exercise, each AR client simply looks at it to set its local-to-world coordinate-system pose reference at the training location.
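
The alignment itself is straightforward once both poses of the marker are known. Below is a minimal sketch of the math in Python (assuming 4x4 homogeneous transforms; the function and variable names are illustrative, not the DARE API):

```python
import numpy as np

def world_from_local(marker_in_world: np.ndarray,
                     marker_in_local: np.ndarray) -> np.ndarray:
    """Compute the transform that maps a headset's local tracking frame
    into the shared world frame, given one surveyed fiducial marker.

    marker_in_world: 4x4 pose of the marker in the surveyed world frame.
    marker_in_local: 4x4 pose of the same marker as detected by the HMD
                     in its own local tracking frame.
    """
    # world_T_local = world_T_marker @ inverse(local_T_marker)
    return marker_in_world @ np.linalg.inv(marker_in_local)

# Any point tracked in the HMD's local frame can then be shared:
# p_world = world_from_local(M_world, M_local) @ p_local
```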

This concept allows the user to select whichever commercial product provides the most effective solution for each of the key dismounted components (HMD, Weapon, Radio, Body Motion Capture, Biometrics, Haptics, etc.).

The DARE includes a remote configurator that sets up the connections between abstracted plugins and the different master libraries (client or server). This allows battle masters and local administrators to easily change which COTS components are used in a given exercise for AR/VR dismounted training.
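
As a rough illustration of what such an exercise configuration might look like (all plugin names, roles, and URL schemes below are hypothetical; the actual DARE configurator format is not published):

```python
# Hypothetical exercise configuration: maps each dismounted-soldier
# component role to the abstracted plugin that fills it, and each
# connection to a master library endpoint. All names are illustrative.
exercise_config = {
    "master_libraries": {
        "server": "dare://training-site/server",
        "client": "local",
    },
    "plugins": {
        "hmd":        "plugin.hmd.magic_leap_1",
        "weapon":     "plugin.weapon.instrumented_m4",
        "radio":      "plugin.radio.voip_sim",
        "body_mocap": "plugin.mocap.bsn",
        "biometrics": "plugin.bio.chest_strap",
        "haptics":    "plugin.haptics.hitbox",
    },
}

def swap_component(config: dict, role: str, plugin_id: str) -> None:
    """A battle master or admin swaps a COTS component between exercises."""
    config["plugins"][role] = plugin_id

swap_component(exercise_config, "hmd", "plugin.hmd.hololens_1")
```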

RNI Wins Award to Develop the Dismounted Augmented Reality Environment (DARE)

Posted on January 9, 2017

Many organizations within the military (and academia/industry) are involved in developing Augmented Reality (AR) technologies that could benefit and enhance dismounted soldier training; however, most of this activity has focused solely on the head-mounted display, with very little attention paid to the many other factors required for AR to be fully immersive in dismounted training. Research Network, Inc. (RNI) is addressing these deficiencies by developing the DARE: a complete environment for AR implementation that not only allows integration of the latest AR technologies but also provides service-based content, live/virtual interaction, and natural blending of augmented content into live ambient environments. The DARE is unique in its design through its use of abstracted plug-in techniques, which allow third parties to easily integrate their products. The DARE also provides a complete distributed or stand-alone cloud-based service model that supports all aspects of DARE content, scenarios, after-action metrics, and distributed development through the implementation of a master library.
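
As a hedged sketch of the abstracted plug-in idea (the real DARE interfaces are not published; the class and method names below are illustrative only):

```python
from abc import ABC, abstractmethod

class DAREPlugin(ABC):
    """Illustrative sketch of an abstracted plugin contract: a third-party
    device is wrapped behind a device-neutral interface so the master
    library never touches vendor SDKs directly."""

    @abstractmethod
    def connect(self, master_library_url: str) -> None:
        """Register this component with a client or server master library."""

    @abstractmethod
    def poll(self) -> dict:
        """Return the component's latest state (pose, biometrics, etc.)."""

    @abstractmethod
    def apply(self, event: dict) -> None:
        """Push an event (e.g., a haptic cue) down to the device."""

class HoloLensHMD(DAREPlugin):
    """A vendor wrapper translates between the neutral interface and the
    vendor SDK; stubbed here for illustration."""
    def connect(self, master_library_url: str) -> None: ...
    def poll(self) -> dict: return {"pose": [0, 0, 0, 1, 0, 0, 0]}
    def apply(self, event: dict) -> None: ...
```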

DARE Components

The DARE subsystems, designed as abstracted, discrete components, can be extended to a number of markets and uses.  The Wireless Head-Mounted Display (WHMD) will address limitations of the current market such as Field of View (FOV) and usability in direct sunlight.  The Integrated Combat Vest can serve as a solution for integrating soldier training equipment and tactical engagement systems while providing stand-alone soldier tracking technologies, an extension of what RNI does now in support of the Soldier Battle Lab.  The Body Sensor Network (BSN) can be migrated from training into live military, commercial, and first-responder applications where pose and physiological information on humans is recorded and monitored.  Imagine finally realizing the ability to monitor live soldiers or responders from anywhere in the world using video-game-like interfaces, seeing what the unit sees from any perspective along with the unit's actual representation, biometrics, fatigue, and pose instead of just icons and symbols.

The Master Library concept with Abstracted Plugins was recently demonstrated between a Magic Leap 1 and a HoloLens AR HMD at RNI's offices.

RNI Demonstrates HitBox Haptics Systems

Posted on May 15, 2013

Under recent efforts with the US Army RDECOM STTC, RNI has built and demonstrated a real-time haptics system and its integration with the Unity3D game engine through RNI's Game Distributed Interactive System (GDIS) technology.  The system provides both an "initial bang" sensation to simulate a gunshot wound and a "Persisting Pain" sensation that throbs the localized area of the body until treated or healed.  In addition to enhancing immersion, the system furthers the use of game-based simulation systems in medical and Combat Medic (TC3) applications.

The HitBox Haptics system operates in real time and integrates with the avatar in a multiplayer environment.  Communications to the system are based on network messages sent to a special stimulator worn by the immersed (or desktop) player.  The initial prototype developed by RNI used body-located sensors (stimulators), driven by a custom driver and integrated into a standard "under-armor" shirt worn by the player.
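
A minimal sketch of what such a network-driven stimulator message might look like (the actual GDIS wire format is not public; the field layout, port, and names below are assumptions for illustration):

```python
import socket
import struct

# Hypothetical wire format for one hit event. Fields: stimulator zone,
# intensity (0-255), mode (0 = initial bang, 1 = persisting pain),
# and duration in milliseconds.
HIT_EVENT = struct.Struct("!BBBH")

def send_hit(sock: socket.socket, addr, zone: int, intensity: int,
             persisting: bool, duration_ms: int) -> None:
    """Send one haptic event to the body-worn stimulator driver."""
    mode = 1 if persisting else 0
    sock.sendto(HIT_EVENT.pack(zone, intensity, mode, duration_ms), addr)

# Example: a torso hit delivers a sharp bang; a follow-up persisting-pain
# message would keep throbbing the zone until a "healed" event clears it.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_hit(sock, ("192.0.2.10", 9000), zone=3, intensity=200,
         persisting=False, duration_ms=80)
```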

The level of stimulation is fully adjustable, as are the sensation-type algorithms (Bang vs. Persisting Pain).   Algorithms for "kill shot" sensations and extension of the system to the full body are currently under development at RNI.  The upper-body system is very low cost and uses mature stimulator technologies that have been approved for general public use in the United States.

The primary intent of the HitBox Haptics system is to provide more realism to immersed players during training exercises, along with the necessary negative feedback associated with bad decisions in a hostile environment.  The system demonstrated these improvements, along with a reduced capability to function in the training simulation while the stimulator was active.  The perceived latency between the visual indication of a gunshot in the simulation environment (GDIS-Unity) and the initial bang sensation was very small (unnoticeable).  The HitBox Haptics system can also be used as an extension to current RNI VIKENG technologies.

RNI Demonstrates VIKENG Locomotion Modes in UnReal3

Posted on October 17, 2012

Under sponsored research with RDECOM-STTC under the EDGE initiative, RNI was tasked to perform a feasibility demonstration of the Virtual Immersive Kinetic Engine (VIKENG) technology along with components of the GDIS SimBridge Gesture Extension in the UnReal3 game engine.  This is the third commercial game engine with which RNI has demonstrated the Gesture Extension integration.  The first two (Valve's Source engine and Unity 3D) were demonstrated on previous efforts, proving the compatibility of this technology with commercial engines and programming languages.  RNI has also used a number of different vendors for the base motion-capture subsystem, including Xsens (Moven), Inertial Labs (3D Suit), and YEI 3-Space (wireless) sensors.

The First-Step Virtual Locomotion Algorithm (RNI patent pending) was also used in this demonstration to illustrate seamless transitions between live and virtual motion modes, along with a "hybrid" model approach when in virtual locomotion modes.
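
The patent-pending algorithm itself is not described in this post, so the following is only a generic sketch of a live/virtual/hybrid mode switch of the kind the demonstration illustrates (all names and conditions are illustrative):

```python
from enum import Enum

class MotionMode(Enum):
    LIVE = 1      # tracked live steps move the avatar 1:1
    VIRTUAL = 2   # in-place gestures drive virtual translation
    HYBRID = 3    # blend: live stepping plus virtual motion gain

def select_mode(step_detected: bool, stepping_in_place: bool,
                play_space_ok: bool) -> MotionMode:
    """Generic mode-switch sketch; the actual First-Step algorithm is
    patent pending and not reproduced here."""
    if step_detected and play_space_ok and not stepping_in_place:
        return MotionMode.LIVE
    if stepping_in_place:
        return MotionMode.VIRTUAL
    return MotionMode.HYBRID
```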

RNI Enters an SBIR Phase III Transition Program with the Army

Posted on June 9, 2011

RNI was recently awarded an SBIR Phase III project with RDECOM-STTC to commercialize the GDIS and MMIST (product name VIKENG) products to government and industry.  The Phase III project extends a contracting vehicle to interested parties and has scope to migrate GDIS and man-worn immersive technologies to different game engines (CRY3 engine illustrated) in order to demonstrate the real-time API and Simulation Bridge capabilities of GDIS for using multiple game engines and LVC simulations simultaneously in a common experiment; a sketch of the bridging idea follows the list below.  The CRY3 engine integration is supported under a research license with Crytek and Real-Time Immersive.  The use of multiple engines allows the best engine to be used for each player role, such as:

  • Air, Land, Sea Assets – Mixed forces (Army, Navy, Air Force, Marines, Civilian, etc.)
  • Large/Small Groups and Entity Counts
  • MMORPG or First Person Shooter Interfaces
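
As a rough sketch of that bridging idea (the GDIS SimBridge message format is not public; the record fields and adapter interface below are illustrative, loosely modeled on DIS-style entity-state records):

```python
from dataclasses import dataclass

@dataclass
class EntityState:
    """Engine-neutral entity state; field names are illustrative."""
    entity_id: int
    position: tuple      # world meters (x, y, z)
    orientation: tuple   # quaternion (w, x, y, z)
    velocity: tuple      # meters/second (x, y, z)

class EngineAdapter:
    """One adapter per game engine translates the neutral record into
    engine-native updates (Unity, UnReal3, CRY3, etc.)."""
    def push(self, state: EntityState) -> None:
        raise NotImplementedError

def bridge_tick(adapters: list, states: list) -> None:
    # Fan each entity update out to every connected engine so players
    # in different engines share one common experiment.
    for state in states:
        for adapter in adapters:
            adapter.push(state)
```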

Other items in the Phase III scope include extension of man-worn immersive systems to haptic devices (shot location, persisting pain, and virtual-object physics force/feedback and manipulation).   The scope also extends virtual locomotion technology research into the experimentation phase.

RNI Develops Real-Time Communications QoS API

Posted on May 23, 2011

Under a sponsored effort with Northrop Grumman, RNI has developed a real-time communications model for realistic voice communications between squad nets and platoon leaders.  The COMMS API includes a parameter-based model of "ICOM land mobile system" portable radios and a central voice station.  The real-time API is applied to in-game voice-over-IP (VOIP) systems to simulate the path loss, reflection, and obscuration effects associated with radio propagation through lossy and reflective materials, which cause low signal strength and, in some cases, complete signal loss between squad networks and point-to-point communications with central platoon leaders.  The game HUD was enhanced to show a real-time received signal strength indication (RSSI) display and "talker" S/I relative to "me", along with push-to-talk status and signal-to-interference ratio calculations.
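
A minimal sketch of the kind of model involved, using the standard log-distance path-loss formula (the parameters RNI fit to the ICOM radios are not public; the constants below are textbook placeholders):

```python
import math

def rssi_dbm(tx_power_dbm: float, distance_m: float,
             wall_loss_db: float = 0.0,
             path_loss_exponent: float = 3.0) -> float:
    """Log-distance path-loss sketch. Higher exponents model lossy,
    reflective indoor environments; wall_loss_db adds obscuration."""
    d0, pl0 = 1.0, 40.0  # reference distance (m) and loss at d0 (dB)
    path_loss = pl0 + 10 * path_loss_exponent * math.log10(
        max(distance_m, d0) / d0)
    return tx_power_dbm - path_loss - wall_loss_db

def voip_gain(rssi: float, sensitivity_dbm: float = -110.0) -> float:
    """Map received power onto a 0..1 gain applied to the in-game VOIP
    stream; below the receiver's sensitivity the signal drops entirely."""
    if rssi <= sensitivity_dbm:
        return 0.0
    return min(1.0, (rssi - sensitivity_dbm) / 40.0)
```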

COMMS API HUD in action

RNI Demonstrates Real-Time Sensor Virtualization in GDIS

Posted on May 19, 2011

Under sponsorship from Northrop Grumman, RNI recently updated the GDIS software with a real-time API that allows sensor streams to be visualized on a .NET user interface.  The "Scorpion" extensions provided real-time video and thermal imagery, along with aural (audio) data, from virtualized sensor entities placed in the world.   The API provides sensor configuration and positioning controls, along with image configuration and interval settings.
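
As an illustration of what driving such an API might look like (the real Scorpion API is .NET-based and not shown here; the Python class and method names below are hypothetical):

```python
# Hypothetical sensor-virtualization API of the kind the post describes;
# class and method names are illustrative, not the real Scorpion API.
class VirtualSensor:
    def __init__(self, kind: str, position, orientation):
        self.kind = kind               # "video", "thermal", or "audio"
        self.position = position       # world coordinates (x, y, z)
        self.orientation = orientation # Euler angles (pitch, yaw, roll)
        self.interval_ms = 33          # default frame interval

    def configure_image(self, width: int, height: int, interval_ms: int):
        """Set the imagery resolution and update interval."""
        self.width, self.height = width, height
        self.interval_ms = interval_ms

# Place a thermal sensor in the world and stream it to a UI at ~10 Hz.
sensor = VirtualSensor("thermal", position=(120.0, 4.5, -38.0),
                       orientation=(0.0, 90.0, 0.0))
sensor.configure_image(width=640, height=480, interval_ms=100)
```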

RNI Wins Award for Virtual Locomotion Technology and Metrics Research

Posted on April 8, 2010

Under a current BAA with RDECOM-STTC, RNI is performing a multi-phase study with experiments to assess, identify, and evaluate virtual locomotion techniques for the US Army. Over 27 systems were researched, evaluated, and discussed in order to address the three goals of this project. The different sensors, devices, and systems that enable those products were also studied and discussed.

STTC and RNI will present the results of the first phase at I/ITSEC this year (November 2010) in a paper entitled "Virtual Locomotion Concepts and Metrics Study" by Tim Roberts, Jay Saffold, and Pat Garrity. Over 27 concepts in 6 discrete categories were studied for potential experimentation in the next phase.

RNI Wins Award for CNTPO Extensions to Multi-Modal Interfaces in Serious Training Games

Posted on April 18, 2009

GDIS and MMIST technologies will be updated to support numerous human-interface modality extensions for Counter Narco-Terrorism missions, equipment, scenarios, and agents. These updates will extend the system to a wider range of playable levels (Afghanistan, Colombian jungles, border patrol, etc.) and include animal artificial intelligence in support of searches for contraband.

RNI is performing direct motion capture and avatar bone manipulation. The man-worn sensory systems emphasize COTS technologies integrated over a wireless PAN. The CNTPO extensions are a plus-up to the current MMIST Phase II program.
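
As a hedged sketch of the direct bone-manipulation step (the actual MMIST pipeline, calibration, and engine-side API are not described here; this only shows the generic quaternion composition involved):

```python
import numpy as np

def quat_multiply(q1, q2):
    """Hamilton product of two (w, x, y, z) quaternions."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def drive_bone(bind_pose_quat, sensor_quat):
    """Direct bone manipulation sketch: compose a body-worn sensor's
    orientation with the avatar bone's bind pose each frame.
    Sensor-to-bone calibration offsets are omitted for brevity."""
    return quat_multiply(sensor_quat, bind_pose_quat)
```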

RNI GDIS Supports Army Research Institute in Game-Based Training Effectiveness Study with Immersive and Desktop Systems

Posted on November 8, 2008

GDIS technologies will be used in both immersive (man-worn) and desktop game-based trainers to assess the pros and cons of training with these systems. The Immersive Desktop Analysis (IDA) effort will culminate in running candidates and evaluators through common scenarios and conducting exit interviews to score the users' comments on each system.

The study is being performed by the Army Research Institute, and the results have been published in behavioral and social sciences forums as "Usability of Wearable and Desktop Game-Based Simulations." The primary authors at ARI were John Barnett and Grant Taylor.