Edge AI has been a priority for the U.S. military for years. What’s been harder to achieve is edge AI that actually works the way defense leaders want it to: portable, interoperable, hardware-agnostic, and resilient when the network disappears.
At AFCEA West, Latent AI plans to show what that looks like in practice.
The edge AI specialist announced it will demonstrate a full-stack, interoperable edge AI ecosystem next week in San Diego alongside Sigma Defense and Abaco Systems. The live demos are aligned with the Department of Defense’s Modular Open Systems Approach (MOSA) and focus squarely on plug-and-play AI that runs across multiple hardware platforms—without custom integration or vendor lock-in.
In short: this is edge AI designed for how modern military operations actually work, not how cloud-first architectures wish they did.
Why Interoperability at the Edge Still Matters
The Pentagon has been clear about its priorities. Programs like MOSA and the Sensor Open Systems Architecture (SOSA) exist for a reason: too many defense systems are brittle, proprietary, and expensive to modify once deployed. AI has only amplified that problem.
While edge AI promises real-time decision-making closer to the fight, models deployed in isolation tend to degrade as conditions change, data drifts, and adversaries adapt. Updating those models often requires reachback to centralized infrastructure, something that simply isn't available in denied, degraded, intermittent, and limited-bandwidth (DDIL) environments.
Latent AI’s message at AFCEA West is that edge AI alone isn’t enough. What matters is an ecosystem that can deploy, orchestrate, adapt, and sustain AI in the field—without depending on the cloud.
Adaptive Edge AI, Proven in the Field
At the core of Latent AI’s approach is its Field Tactical Suite (FTS), an AI pipeline designed for warfighters, not data scientists. FTS allows operators to deploy, update, and refine AI models in minutes rather than days or weeks, directly at the tactical edge.
This isn’t theoretical. Latent AI says its technology has already been field-proven through Navy Project APFIT and Army Project Linchpin, two programs focused on accelerating AI adoption in operational environments.
The key differentiator is adaptability. Rather than treating AI models as static artifacts, Latent AI’s optimized models can adjust to shifting operational conditions—maintaining performance even as data inputs, mission parameters, and threat profiles evolve.
That capability becomes critical when systems can’t rely on cloud connectivity or higher-echelon support to stay current.
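To make the idea concrete, the sketch below shows one generic way an edge node might detect that kind of degradation on its own: compare a rolling window of prediction confidence against a baseline captured at deployment, and flag when the gap grows too large. This is an illustration of the pattern only; the class, thresholds, and refresh trigger are hypothetical and are not drawn from Latent AI's FTS.

```python
import random
from collections import deque

class DriftMonitor:
    """Rolling check of prediction confidence against a calibration baseline (illustrative only)."""

    def __init__(self, baseline_confidence, window=200, tolerance=0.15):
        self.baseline = baseline_confidence   # mean confidence measured at deployment time
        self.scores = deque(maxlen=window)    # most recent per-inference confidence scores
        self.tolerance = tolerance            # allowed drop before flagging drift

    def record(self, confidence):
        """Log one inference score; return True once sustained drift is detected."""
        self.scores.append(confidence)
        if len(self.scores) < self.scores.maxlen:
            return False                      # wait for a full window of samples
        rolling_mean = sum(self.scores) / len(self.scores)
        return (self.baseline - rolling_mean) > self.tolerance


if __name__ == "__main__":
    # Simulated scores standing in for real per-frame model confidences.
    monitor = DriftMonitor(baseline_confidence=0.87, window=50)
    for step in range(500):
        degradation = min(step / 1000, 0.4)   # conditions slowly drift away from training data
        score = random.gauss(0.87 - degradation, 0.03)
        if monitor.record(score):
            print(f"Drift flagged at step {step}: trigger an on-device model refresh")
            break
```

In a fielded system, that flag would kick off whatever local refinement or redeployment workflow the platform supports, rather than simply printing a message.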
The “Better Together” Stack: AI, Orchestration, and Hardware
What Latent AI is showing at AFCEA West isn’t just software. It’s a coordinated demonstration of how AI models, orchestration platforms, and rugged hardware function as a single, mission-ready system.
Sigma Defense: Orchestration at the Tactical Edge
Through its partnership with Sigma Defense, Latent AI integrates its containerized AI services with Olympus, Sigma’s platform for secure orchestration and software mobility.
Olympus handles what many edge AI deployments struggle with: the last mile. It pushes updates, monitors runtime health, and optimizes compute resources under strict size, weight, and power (SWaP) constraints—even when bandwidth is limited or connectivity is lost entirely.
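A rough sketch of that last-mile pattern, in generic terms: keep inference running against a locally cached model, stage updates whenever a link happens to be up, and promote them atomically so a dropped connection never leaves the node with a half-written artifact. The paths, connectivity probe, and loop below are placeholders for illustration, not Olympus interfaces.

```python
import time
from pathlib import Path

# Hypothetical paths and connectivity probe: this only sketches the
# store-and-forward pattern, not any vendor's actual orchestration API.
STAGING_DIR = Path("/var/edge/updates")          # updates land here whenever a link is up
ACTIVE_MODEL = Path("/var/edge/model/active.bin")

def link_available() -> bool:
    """Placeholder connectivity probe; a real node would check its radio or network state."""
    return False  # assume a denied or intermittent link by default

def apply_staged_update() -> bool:
    """Promote the newest staged model atomically so inference never sees a partial file."""
    staged = sorted(STAGING_DIR.glob("model-*.bin"))
    if not staged:
        return False
    staged[-1].replace(ACTIVE_MODEL)             # atomic rename on the same filesystem
    return True

def orchestration_loop(cycles: int = 10, poll_seconds: float = 1.0) -> None:
    """Keep serving the cached model; pull and promote updates only when a link appears."""
    for _ in range(cycles):
        if link_available():
            # A real orchestrator would download staged artifacts into STAGING_DIR here.
            if apply_staged_update():
                print("Promoted a staged model update")
        # Inference keeps running against ACTIVE_MODEL regardless of connectivity.
        time.sleep(poll_seconds)

if __name__ == "__main__":
    orchestration_loop()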
This pairing enables AI capabilities like EO/IR tracking and automatic target recognition to remain operational and up to date, even in disconnected environments. For Combined Joint All-Domain Command and Control (CJADC2) ambitions, that's a non-trivial requirement.
As Sigma Defense CEO Matt Jones put it, superiority won’t come from piling on more hardware—it will come from delivering the right applications, reliably, where warfighters actually operate.
Abaco Systems: Rugged, Modular Compute Without Lock-In
Latent AI’s collaboration with Abaco Systems completes the stack.
Abaco provides defense-grade mission computers built for harsh operational environments, delivering high-performance compute in low-SWaP, SOSA-aligned form factors. Latent AI’s contribution is equally important: hardware-agnostic AI models optimized to run efficiently across CPU and GPU configurations without bespoke tuning.
The result is true plug-and-play AI. Models can move across platforms without rewriting code or revalidating integrations—a capability the military has been asking vendors to deliver for years.
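As a rough illustration of what hardware-agnostic deployment looks like at the code level, the snippet below uses ONNX Runtime, a common open inference runtime standing in here for any portable runtime rather than Latent AI's own tooling, to load a single model artifact and pick a GPU execution provider when the host exposes one, falling back to CPU otherwise. The model filename is hypothetical.

```python
import numpy as np
import onnxruntime as ort

# Hypothetical model file; the point is that the same artifact runs on GPU-equipped
# or CPU-only hosts by choosing an execution provider at load time, with no code changes.
MODEL_PATH = "detector.onnx"

# Prefer CUDA when available, otherwise fall back to plain CPU execution.
preferred = ["CUDAExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession(MODEL_PATH, providers=providers)

# Run one inference with a dummy frame shaped to the model's declared input.
input_meta = session.get_inputs()[0]
shape = [dim if isinstance(dim, int) else 1 for dim in input_meta.shape]  # resolve dynamic dims
frame = np.zeros(shape, dtype=np.float32)
outputs = session.run(None, {input_meta.name: frame})

print(f"Ran on {session.get_providers()[0]}; output shapes: {[o.shape for o in outputs]}")
```

The same pattern is what makes portability testable: the artifact stays fixed while the runtime negotiates with whatever compute the mission computer provides.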
MOSA in Practice, Not Just on Slides
Many vendors claim MOSA alignment. Fewer demonstrate it live.
At AFCEA West, Latent AI will showcase its tracking and adaptive edge AI capabilities at its own booth (4306), as well as within Sigma Defense’s booth (1133) and Abaco Systems’ booth (2315). The demos emphasize interoperability across vendors, platforms, and environments—exactly the kind of flexibility MOSA is meant to enable.
For defense buyers wary of vendor lock-in and integration debt, that matters. Interoperable systems don’t just reduce acquisition risk; they also shorten timelines for upgrades, experimentation, and mission adaptation.
Why This Signals a Broader Shift
The Latent AI, Sigma Defense, and Abaco Systems collaboration reflects a broader shift in defense technology priorities.
Edge AI is no longer just about running models on smaller devices. It’s about sustaining AI as a capability—keeping it trustworthy, adaptable, and operational when connectivity is unreliable and stakes are high.
That requires ecosystems, not point solutions.
Latent AI CEO and co-founder Jags Kandasamy summed it up succinctly: AI must be adaptive, interoperable, and resilient enough to operate without waiting on distant infrastructure. That philosophy aligns closely with how the Department of Defense is thinking about future force design.
The Bottom Line
At AFCEA West, Latent AI isn't pitching edge AI as a novelty. It's presenting it as infrastructure: something that can evolve, interoperate, and survive in the environments that matter most.
By pairing adaptive AI software with secure orchestration and rugged, modular hardware, the company is making a case that edge AI doesn’t have to be fragile or siloed. Done right, it can be updated in the field, trusted by operators, and deployed across platforms without friction.
For a military that has been explicit about avoiding lock-in and accelerating capability delivery, that combination may be exactly what the edge AI conversation needs next.