How to Accelerate Time-to-Insight with the Knowledge Graph
Content sponsored by Cambridge Semantics
Let’s talk about how your organization can speed up time to insight and improve decision-making with a knowledge graph.
Knowledge graphs describe, represent, and connect all enterprise data, regardless of its source or structure. The promise of knowledge graphs is that data consumers, human and automated, gain visibility, access, and understanding of all the information available within an enterprise.
A variety of industries are achieving this today with a growing list of use cases. Here are some examples:
- Health care: achieve smarter, informed patient diagnosis and treatment through AI-based recommendations.
- Life sciences: assemble research and genome-sequencing data to accelerate drug discovery and approval.
- Manufacturing: improve the reliability and quality of end-to-end products and parts.
- Government: better understand security threats and risks, and improve the effectiveness of operations or missions.
- Financial services: improve customer experience, compliance and risk management.
It all starts with the foundation. Many perceive the graph data model as applicable only to networks and similar constructs, but its primary enterprise application is the rapid integration of large-scale data. Here is a proven process for realizing the promise of the graph data model, specifically using the W3C standards RDF and OWL.
Knowledge graph technology has reached a tipping point: near real-time data integration is now achievable. It is entirely possible to create knowledge graphs directly from relational, semi-structured, and unstructured data sources, without creating copies of the source data.
This means the promise of initial data integration is achievable with a single click, creating an interconnected knowledge graph ready for analysis. More often, though, true, robust data integration involves a hybrid approach of automation and human interaction to weave your data fabric together.
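To make the idea concrete, here is a minimal sketch of the kind of mapping such tooling automates: lifting rows from a relational source into RDF-style subject/predicate/object triples, without copying the source's own model. The table, column names, and the `ex:` prefix are hypothetical, and a real platform would emit proper URIs and typed literals.

```python
# Illustrative sketch: mapping relational rows into RDF-style triples.
# Entity URIs, predicate names, and the "ex:" prefix are hypothetical.

def rows_to_triples(table, id_col, uri_prefix):
    """Turn each row of a relational table into (subject, predicate, object) triples."""
    triples = set()
    for row in table:
        subject = f"{uri_prefix}{row[id_col]}"       # one entity per row
        for col, value in row.items():
            if col != id_col:                        # every other column becomes an edge
                triples.add((subject, f"ex:{col}", value))
    return triples

patients = [
    {"patient_id": 1, "name": "Alice", "diagnosis": "J45"},
    {"patient_id": 2, "name": "Bob", "diagnosis": "E11"},
]
graph = rows_to_triples(patients, "patient_id", "ex:patient/")
# graph now holds two triples per patient: one for name, one for diagnosis.
```

Because the mapping is purely structural, the same function applies to any table; only the identifier column and URI prefix change per source.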
We first normalize away syntactic and format heterogeneity by lifting each source into the RDF graph model. Then we harmonize the entities, their relationships, and their characteristics using the OWL knowledge representation model.
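The harmonization step can be pictured as resolving equivalent identifiers from different sources to one canonical entity, which is what OWL assertions such as `owl:sameAs` express. The sketch below, with hypothetical identifiers, computes that resolution with a simple union-find over equivalence pairs:

```python
# Sketch of entity harmonization: owl:sameAs-style assertions say two source
# identifiers denote the same real-world entity. The identifiers are hypothetical.

def canonical_map(same_as_pairs):
    """Union-find mapping from each identifier to a single canonical one."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for a, b in same_as_pairs:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[max(ra, rb)] = min(ra, rb)   # merge: smaller id is canonical
    return {x: find(x) for x in parent}

# The same customer known under three identifiers across CRM, ERP, and web systems:
pairs = [("crm:cust42", "erp:acct-9"), ("erp:acct-9", "web:user7")]
mapping = canonical_map(pairs)
# All three identifiers now resolve to one canonical subject.
```

Once every source identifier resolves to a canonical entity, triples from all systems attach to the same node, which is what makes cross-source questions answerable in one query.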
Once the basic knowledge graph is created, we apply transformations, calculations, and reasoning to blend and prepare partitions of the knowledge graph for business analysis and for exposure as data services.
A robust implementation applies these blending steps without altering the content of the base knowledge graph. This yields many benefits: a more modular design, multiple independent views, hypothesis testing, and rapid application development.
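The non-destructive layering described above can be sketched as a view that combines the base graph with an overlay of derived triples, leaving the base untouched. The part entities and predicates here are hypothetical:

```python
# Sketch of non-destructive layering: a view adds derived triples on top of
# the base graph; the base set itself is never mutated. Names are hypothetical.

base = {
    ("ex:part1", "ex:weight_g", 1500),
    ("ex:part2", "ex:weight_g", 250),
}

def with_derived(graph):
    """Return a new view that adds a derived weight-in-kilograms triple per part."""
    overlay = {
        (s, "ex:weight_kg", grams / 1000)
        for (s, p, grams) in graph
        if p == "ex:weight_g"
    }
    return graph | overlay   # union builds a new set; base is left intact

view = with_derived(base)
# base still holds exactly its original two triples; view holds four.
```

Because each view is computed from the base rather than written into it, several teams can maintain independent views, or test a hypothesis in one view, without coordinating changes to shared data.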
Once the content of the knowledge graph is blended and prepared for consumption, a good knowledge graph solution allows exploration of the entire knowledge graph in arbitrary combinations, on demand. This means users are empowered to answer both known and unforeseen questions.
Out of the box, so to speak, a knowledge graph solution enables broad and deep descriptive analysis. From this foundation, organizations create predictive and prescriptive analytics. The value of the knowledge graph here is that a user, application, or automated client can immediately access all relevant data in a single query. Robust solutions automatically generate queries based on user interactions.
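The "all relevant data in a single query" idea can be illustrated with a small pattern matcher over triples that originated in different systems; in a production knowledge graph this role is played by SPARQL. The predicates and the health-care framing below are hypothetical:

```python
# Sketch of on-demand querying: one pattern match spans data from different
# source systems. Predicates are hypothetical; production graphs use SPARQL.

graph = {
    ("ex:alice", "ex:diagnosis", "ex:asthma"),    # originated in the EHR system
    ("ex:asthma", "ex:recommended", "ex:drugX"),  # originated in a research source
}

def match(graph, pattern):
    """Return triples matching a (s, p, o) pattern; None acts as a wildcard."""
    return [
        triple for triple in graph
        if all(want is None or want == got for want, got in zip(pattern, triple))
    ]

# Which conditions does Alice have, and what is recommended for each?
conditions = [o for (_, _, o) in match(graph, ("ex:alice", "ex:diagnosis", None))]
recs = [o for c in conditions
        for (_, _, o) in match(graph, (c, "ex:recommended", None))]
```

The second query is derived from the results of the first, which is the shape of the chained, ad hoc exploration the text describes: no join logic is written per source, because everything is already one graph.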
The knowledge graph should expose data services that give AI/ML and other intelligent software clients a uniform, simple interface, one that is not necessarily domain specific.
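A uniform, non-domain-specific interface might look like the sketch below: any client submits a triple pattern and receives JSON rows, whatever domain the graph happens to cover. The endpoint shape and the data are assumptions for illustration, not a specific product's API:

```python
# Sketch of a uniform data service: clients send a (s, p, o) pattern and get
# JSON rows back, independent of domain. Hypothetical endpoint and data.
import json

GRAPH = {
    ("ex:acct1", "ex:riskScore", "0.82"),
    ("ex:acct2", "ex:riskScore", "0.11"),
}

def data_service(pattern):
    """Generic endpoint: match a triple pattern (None = wildcard), return JSON."""
    s, p, o = pattern
    rows = [
        {"s": ts, "p": tp, "o": to}
        for (ts, tp, to) in GRAPH
        if (s is None or s == ts)
        and (p is None or p == tp)
        and (o is None or o == to)
    ]
    return json.dumps(rows)

payload = data_service((None, "ex:riskScore", None))
# payload is a JSON array of bindings, consumable by any ML pipeline or client.
```

Because the request is just a pattern over the graph, the same service serves a risk model today and a compliance dashboard tomorrow without a new, domain-specific API.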
Once the previous steps are established, decision-making becomes more dynamic, better informed, and more adaptable. Because such a knowledge graph implementation makes data accessible with the flexibility to change quickly, decision time improves. Because the knowledge graph can include all data sources, decisions are more accurate and complete. A knowledge graph built with RDF and OWL anticipates change, so adding new data sources or changing schemas is natural. This means emerging requirements are easy to accommodate, especially compared to traditional approaches.
You might be pleasantly surprised! For example, the knowledge graph platform Anzo is deployed in production at a federal government organization, which uses Anzo to provide a 360-degree view of relevant information and to act as its primary data warehouse. The program architect said: “The delivery of the analytics platform over 8 months is considered a major achievement by [our] Leadership. The platform must connect the data across [our] entire supply chain.” The organization is now adding new data sources to the knowledge graph in weeks instead of months, and new data models and dashboards in days instead of weeks. Savings for the initial use cases implemented within the first 90 days of using Anzo were over $17 million.