OASIS is a command-and-control web application used on the U.S. Army's Nuclear Biological Chemical Reconnaissance Vehicle (NBCRV). It integrates multiple chemical, biological, radiological, and nuclear (CBRN) sensors, UAVs, and vehicles into a single real-time interface for threat detection, mapping, and mission control. I was responsible for building most of the web app and real-time UI.
“This team has moved MRIGlobal into a leadership position for software development and sensor integration.” — MRIGlobal ISR Director, on the OASIS / NBCRV sensor suite program
The OASIS team received certificates of appreciation from JPEO-CBRND's Joint Project Manager for NBC Contamination Avoidance for delivering the integrated software under a tight deadline and supporting field demonstrations of the upgraded NBCRV sensor suite. I was personally honored with a plaque and a handwritten note from the Lt. Colonel overseeing the program.

Our ISR software team at the award ceremony for the NBCRV Sensor Suite Upgrade.
Mission: Real-Time CBRN Awareness from a Single Interface
The upgraded NBCRV Sensor Suite brings together multiple CBRN sensors on a Stryker platform, unmanned ground and aerial vehicles, and a unified command-and-control system. OASIS is the software layer that:
- Displays the health, status, and alarms of all integrated sensors in real time.
- Shows geospatial detections on a live map, including markers for positive identifications.
- Supports triangulation of threats and Named Areas of Interest (NAIs) using both manned and unmanned platforms.
- Allows operators to command and control UAVs directly from the map to scan specific regions for CBRN threats.
Everything had to work under field conditions, on tight timelines, with zero tolerance for ambiguous behavior. If the UI lied or lagged, crews could not trust the system.
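To make that concrete, here is a minimal sketch of the kind of data model behind a "health, status, and alarms" view. The field names and the worst-state-wins rollup rule are my illustrative assumptions, not the actual OASIS schema:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Health(Enum):
    OK = 1
    DEGRADED = 2
    FAULT = 3

@dataclass
class SensorStatus:
    sensor_id: str
    health: Health
    alarms: List[str] = field(default_factory=list)

def fleet_health(statuses: List[SensorStatus]) -> Health:
    """Roll individual sensor states up into one top-level indicator:
    the worst health level present wins, so a single faulted sensor
    is never hidden behind a sea of green."""
    return max((s.health for s in statuses),
               key=lambda h: h.value, default=Health.OK)
```

The point of a rollup like this is that the operator-facing summary can never be more optimistic than the worst sensor feeding it.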
My Role: Real-Time Web App & UI Engineer
As a software engineer on the OASIS team, I built the majority of the web application and real-time user interface, working across the full stack:
- Implemented the single-page application (SPA) using Knockout.js, JavaScript, and an ASP.NET backend.
- Built rich, dynamic UI components for sensor status, alarms, and telemetry, including dashboards and detail views.
- Used D3.js for interactive visuals and overlays tied to live data.
- Worked with Entity Framework on the backend to persist logs, events, and configuration.
- Developed Python drivers for several sensors, helping bridge raw sensor protocols into the application.
I wasn't just implementing screens—I was designing how operators would think through and act on CBRN information in real time.
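The driver work was essentially translation: take a device-specific wire format and emit something the web app can render uniformly. A minimal sketch of that bridge, with an invented ASCII frame format (the real sensor protocols are not public):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Reading:
    sensor: str   # which device produced this reading
    agent: str    # threat identification reported by the sensor
    level: int    # alarm bar level (0 = clear)

def parse_frame(raw: bytes) -> Reading:
    """Parse one hypothetical comma-separated ASCII frame,
    e.g. b'CAM,GB,3\r\n', into a normalized Reading."""
    sensor, agent, level = raw.decode("ascii").strip().split(",")
    return Reading(sensor=sensor, agent=agent, level=int(level))

def to_ui_json(reading: Reading) -> str:
    """Serialize for the web app's real-time channel."""
    return json.dumps(asdict(reading))
```

Every driver, whatever the raw protocol looks like, ends at the same normalized shape, so the UI never has to know which vendor's hardware is on the other end.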
Map-Centric C2: NAIs, Triangulation, and UAV Tasking
A huge part of OASIS is the map-centric experience. I implemented most of the mapping UI using OpenLayers, including:
- Named Area of Interest (NAI) drawing tools: operators can mark regions on the map, and the system keeps sensors scanning those areas.
- Display of triangulation lines from IR and standoff sensors across both manned and unmanned platforms.
- GPS-based tracking of assets, vehicles, and UAVs, with real-time updates.
- A map-based UAV command-and-control interface: operators can select locations and task UAVs to investigate, all within the same UI.
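The triangulation display above reduces to a classic bit of geometry: two sensors at known positions each report a compass bearing toward a detection, and the fix is where the rays cross. A simplified sketch on a local flat plane (real implementations would account for geodesy and bearing uncertainty):

```python
import math

def triangulate(p1, brg1_deg, p2, brg2_deg):
    """Intersect two bearing rays on a local flat plane (x east, y north).
    Each ray starts at a sensor position and points along a compass
    bearing in degrees clockwise from north. Returns the intersection
    point, or None if the bearings are parallel (no unique fix)."""
    # Direction vectors: bearing 0 = +y (north), bearing 90 = +x (east).
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None  # parallel rays never cross
    # Solve p1 + t*d1 == p2 + s*d2 for t (Cramer's rule), then walk the ray.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```

For example, a sensor at the origin looking northeast (45°) and a sensor 10 units east looking northwest (315°) place the threat at (5, 5) on the local grid.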
My goal was to make the map feel like the primary story of the mission, not just a background visualization. Every click had to move the mission forward.
Sensor Integration & Simulation Tools
On top of the web UI, I spent a significant amount of time on sensor integration and testing. That included:
- Working on Python-based sensor drivers to standardize data from multiple CBRN devices.
- Integrating live feeds into the UI and verifying timing, formatting, and edge cases for each sensor.
- Building a separate mock-data generator application capable of streaming simulated sensor data into the UI.
That mock-data tool turned out to be critical—it allowed us to iterate on the UI and workflows quickly without waiting on full hardware setups, and made it possible to reproduce tricky field scenarios during development.
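The core idea of that mock-data tool can be sketched in a few lines. The message shape and alarm rate below are illustrative assumptions; what matters is the seeded generator, which is what made tricky field scenarios reproducible during development:

```python
import json
import random
from typing import Iterator

def mock_sensor_stream(sensor_id: str, n: int, seed: int = 0) -> Iterator[str]:
    """Yield n simulated JSON status messages for one sensor.
    Occasionally raises an alarm so the UI's alarm paths get exercised
    without real hardware; the fixed seed makes a scenario replayable."""
    rng = random.Random(seed)
    for seq in range(n):
        alarm = rng.random() < 0.1  # roughly 10% of messages carry an alarm
        yield json.dumps({
            "sensor": sensor_id,
            "seq": seq,
            "alarm": alarm,
            "level": rng.randint(1, 4) if alarm else 0,
        })
```

In practice a tool like this would push messages over the same transport the real drivers use, so the UI cannot tell simulated data from live data.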
Field Testing in the Snow
Much of the OASIS work happened on-site at Edgewood Arsenal, integrating sensors onto the NBCRV Stryker and running full mission scenarios in winter conditions. These are some scenes from those trials.
Live sensor-integration trials at Aberdeen Proving Ground, where OASIS was validated under real operational conditions.
Field Demonstrations Under a Four-Month Deadline
The program had a very aggressive four-month delivery schedule for the integrated software, plus on-site testing and demonstrations. I deployed to Edgewood Arsenal at Aberdeen Proving Ground for intensive field work.
On-site we were running long days (and nights), often well over 100 hours per week, debugging live sensor feeds on vehicles, validating UI behavior in real missions, and making fixes on the fly so operators could trust the system.
The team ultimately delivered on time, supported multiple field demonstrations, and received formal recognition from JPEO-CBRND leadership.
What This Project Represents
OASIS was a crash course in building mission-critical systems where UI, data, and hardware all collide. It pushed me in:
- Real-time web UI design with Knockout.js, D3.js, and OpenLayers.
- Back-end API and data modeling with ASP.NET and Entity Framework.
- Sensor integration, Python tooling, and simulation.
- Field-first engineering: watching real crews use the system and iterating until they trusted it.
It's one of those projects where the stakes are very real, the constraints are tight, and the reward is knowing that your code directly helps keep people safer.