Achieving the 8 guiding principles of the DoD’s Data Strategy with Elastic

A modified version of this blog post appeared in the June 2021 issue of Signal magazine.

Decisions that need to be made in an instant require answers in real time, but existing big data systems are unable to return queries quickly enough for real-time analytics. And with growing data being queried by more connected users than ever before, it’s getting increasingly challenging to maintain fast reaction times. 

The DoD Data Strategy approaches this challenge by providing a blueprint for how data should be managed and accessed by the services, ensuring that trusted information gets to the right destinations at the right time. In lockstep, Elastic securely ingests and queries massive amounts of diverse datasets in near real time, across a distributed environment, to support all eight of the DoD Data Strategy’s guiding principles, making data a force multiplier for warfighters.

  1. Data is a strategic asset – With Elastic, globally distributed organizations like the DoD can use data much more effectively because they can bring their questions directly to the data, collecting it at the edge or regionally while allowing access from anywhere within a second of ingest. Security controls remain intact, and data can be cost-effectively leveraged throughout the full retention lifecycle.
  2. Collective data stewardship – Schema One (which is based on the Elastic Common Schema) enables interoperability across all data sources and services. When managing data access and retention, stewards, custodians, and managers only need to define the policies related to their function and let the Elastic platform move the data through each lifecycle phase automatically. 
  3. Data ethics – Elastic supports monitoring, auditing, and anomaly detection of data access, usage, and behavioral indicators to help protect data privacy and ethical standards. It is much easier to maintain, monitor, and apply standards to a single data system than the multitude of disparate systems currently in use.
  4. Data collection – Elastic includes tools to securely capture data at the point of creation, integrates seamlessly with other tools, and can automate the creation of pedigree and attribution tags. Subsequently combined or created products can then have the appropriate tags applied to them for data management and assurance concerns.  
  5. Enterprise-wide data access and availability – Elastic can create a global data mesh where data from all sources is immediately available via a single yet distributed search-based platform that taps into remote cluster computing power. Access controls can be dynamically created, updated, or removed at a granular level to support standing or ad-hoc missions, ensuring that only those with a legitimate need and the appropriate security clearance can access the information.
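The granular access controls described above can be sketched as follows. This is a minimal illustration using plain Python dicts that mirror the shape of an Elasticsearch role definition with field- and document-level security; the index pattern, field names, and classification value are hypothetical, and no live cluster is required.

```python
# Hypothetical role granting read-only access to a subset of fields and documents,
# mirroring the structure of an Elasticsearch role definition.
mission_role = {
    "indices": [
        {
            "names": ["mission-reports-*"],   # index patterns this role can reach
            "privileges": ["read"],           # read-only access
            "field_security": {
                # field-level security: only these fields are visible
                "grant": ["timestamp", "summary", "region"]
            },
            "query": {
                # document-level security: filter to unclassified documents only
                "term": {"classification": "unclassified"}
            },
        }
    ]
}

def allowed_fields(role: dict) -> set:
    """Collect the fields a role is allowed to see across its index grants."""
    fields = set()
    for grant in role["indices"]:
        fields.update(grant.get("field_security", {}).get("grant", []))
    return fields

print(sorted(allowed_fields(mission_role)))  # ['region', 'summary', 'timestamp']
```

Because policies like this live alongside the data platform rather than in each downstream application, they can be updated dynamically as standing or ad-hoc missions change.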

[Image: blog-dod-strategy.png]

Legacy methods for analyzing widely distributed data have critical issues with respect to speed and scalability. Through the combination of cross-cluster replication and cross-cluster search, data can be replicated and indexed across remote clusters yet still accessed locally via search, subject to role-based security controls. This provides more scale and speed than data duplication, data federation, or data centralization.
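The addressing scheme behind cross-cluster search can be sketched briefly: remote clusters are referenced with a cluster-alias prefix (`cluster_alias:index`), so a single query expression can span local and remote data. The cluster aliases and index names below are hypothetical; no cluster connection is needed for this sketch.

```python
# Build the comma-separated index expression used to target a cross-cluster
# search: local indices are named directly, remote ones as "cluster:index".
def ccs_target(local_indices, remote_indices):
    """Return a search target spanning local and remote (alias, index) pairs."""
    targets = list(local_indices)
    for cluster, index in remote_indices:
        targets.append(f"{cluster}:{index}")
    return ",".join(targets)

target = ccs_target(
    ["logs-local"],
    [("pacific-edge", "logs-*"), ("atlantic-edge", "logs-*")],
)
print(target)  # logs-local,pacific-edge:logs-*,atlantic-edge:logs-*
```

One query against an expression like this fans out to each cluster and merges the results, which is why the data can stay where it was collected while still being searchable from anywhere.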
  6. Data for artificial intelligence training – Elastic can create a speed layer over all organizational data sources, simplifying and accelerating the most tedious aspects of creating, training, testing, and applying AI and supervised ML. Those models can then be applied to the production data stored in Elastic. Fully integrated unsupervised and supervised ML capabilities for building custom analytics are built into Elastic, as well as hundreds of prebuilt detection rules and ML jobs for solution-specific activities related to security and observability.
  7. Data fit for purpose – At its core, Elastic is an extremely flexible general-purpose data access layer — it provides Data-as-a-Service. This means that Elastic does not shoehorn data into a particular solution or viewpoint (e.g., everything does not suddenly take on the characteristics of a security problem in order to fit into a SIEM solution). When all potentially relevant data is made available together, data is not only fit for purpose, it also becomes a mission enabler.
  8. Design for compliance – With all data available and monitored within a single platform, compliance and auditing become much easier and truly comprehensive. Leveraging the common data platform, compliance becomes more effective, standardized, and automated.
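The anomaly detection jobs mentioned above can be sketched as a configuration fragment. This is a plain-dict illustration of the general shape of an Elastic ML anomaly detection job, with a detector that models event volume per user; the field names are hypothetical and no cluster is needed to run it.

```python
# Hypothetical anomaly detection job body: flag unusually high event counts,
# modeled separately for each user.
anomaly_job = {
    "analysis_config": {
        "bucket_span": "15m",                    # time bucket analyzed per step
        "detectors": [
            {
                "function": "high_count",            # alert on unusually high volume
                "partition_field_name": "user.name"  # build one model per user
            }
        ],
    },
    "data_description": {"time_field": "@timestamp"},  # which field carries time
}

detector = anomaly_job["analysis_config"]["detectors"][0]
print(detector["function"], "per", detector["partition_field_name"])
# high_count per user.name
```

Partitioning by a field such as the user name is what lets a single job learn a separate behavioral baseline per entity, which is how behavioral indicators of misuse surface without hand-written thresholds.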

To learn how Elastic powers instant availability, distributed operations, and true interoperability, read the full blog in the June 2021 issue of Signal magazine. For more information, visit elastic.co/federal or email [email protected].

Source: Elastic