Meaning making – separating signal from noise. How do we transform the customer's next input into an action that creates a positive customer experience? We make the data more intelligent, so that it can guide our actions. The Data Lake builds on Big Data strengths by automating many manual development tasks, providing self-service features to end users, and adding an intelligent management layer to organize it all. The result is lower cost to create solutions, "smart" analytics, and faster time to business value.
Financial Services: delivering innovative products and services, based on a 360° view of the customer, across all business lines, engaging all available data assets, internal and external
• Low-Cost, High-Performance Storage
• Flexible, Easy-to-Use Data Organization
• Performance-Optimized Analytics
• Automation of Most Manual Development and Query Activities
• Self-Service End-User Features
• Intelligent Processing
• Different business entities in physical systems actually share many of the same concepts, meanings, and relationships
• Semantic data science exposes common business concepts and connects them with their physical expression in production systems
• Data is “glued” together by its business meaning, rather than physical structures dictated by the underlying technologies
The conceptual model can be used directly by both business and IT users to operationalize data services, understand the data landscape, and track data lineage.
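As an illustrative sketch of "gluing" data together by business meaning: a minimal semantic layer can map differently named physical columns from separate systems onto shared business concepts. All system, column, and concept names below are hypothetical.

```python
# Minimal semantic-layer sketch: physical columns from different systems
# are mapped onto shared business concepts, so records are joined by
# meaning rather than by physical structure. Names are hypothetical.

# Business concept -> {system: physical column}
SEMANTIC_MAP = {
    "customer_id": {"crm": "CUST_NO", "billing": "client_ref"},
    "customer_name": {"crm": "FULL_NAME", "billing": "name"},
}

def to_concepts(system: str, record: dict) -> dict:
    """Translate a physical record into the shared business vocabulary."""
    out = {}
    for concept, columns in SEMANTIC_MAP.items():
        column = columns.get(system)
        if column is not None and column in record:
            out[concept] = record[column]
    return out

crm_row = {"CUST_NO": "C-1001", "FULL_NAME": "Ada Lovelace"}
billing_row = {"client_ref": "C-1001", "name": "Ada Lovelace"}

# Both physical records resolve to the same conceptual view.
print(to_concepts("crm", crm_row))
print(to_concepts("billing", billing_row))
```

Because both systems resolve to the same concepts, a lineage or integration tool can work entirely in the business vocabulary.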
• Comprehensive analysis creates a rigid structure that is difficult to change, or minimal definition of the data organization requires detailed understanding of the data contents
• A flexible data model can be revised or extended without redesign of the database
• Agile, evolutionary refinement of the data organization leverages new insights as users work with the data
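One way to picture a model that evolves without database redesign is schema-on-read: records are stored as-is, and the "model" is applied at read time, so it can be extended later without migrating stored data. The field names and defaults below are hypothetical.

```python
# Schema-on-read sketch (hypothetical fields): stored records are never
# rewritten; the current model is applied when data is read, so the
# model can be extended without a database redesign or migration.

records = [
    {"id": 1, "amount": 120.0},
    {"id": 2, "amount": 80.0, "channel": "mobile"},  # newer record, extra attribute
]

def read(record: dict, model: dict) -> dict:
    """Apply the current model (field -> default value) at read time."""
    return {field: record.get(field, default) for field, default in model.items()}

# Initial model
model_v1 = {"id": None, "amount": 0.0}
# Extended model: adding "channel" touches only the read-time definition
model_v2 = {"id": None, "amount": 0.0, "channel": "unknown"}

print([read(r, model_v2) for r in records])
```

Older records simply pick up the default for fields they predate, which is the evolutionary refinement described above.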
Connect External Data
External data is collected and loaded into the analytics repository.
Data is streamed, or is refreshed on a schedule.
External data can be sourced from databases, spreadsheets, Web pages, news feeds, and more; data is queried through common methods, without regard to location, with real-time values delivered at query time.
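A minimal sketch of this location transparency, assuming a simple in-process registry (the source names, fetch functions, and TTL mechanism below are hypothetical): each external source registers a fetch function, and values are resolved at query time, so callers never deal with where the data lives or how it is refreshed.

```python
# Location-transparent access sketch (hypothetical sources): callers
# query by name; a TTL of 0 means the value is fetched live at query
# time, while a positive TTL approximates a scheduled refresh.

import time

class ExternalSources:
    def __init__(self):
        self._fetchers = {}  # name -> (fetch function, ttl in seconds)
        self._cache = {}     # name -> (value, timestamp)

    def register(self, name, fetch, ttl_seconds=0):
        self._fetchers[name] = (fetch, ttl_seconds)

    def query(self, name):
        fetch, ttl = self._fetchers[name]
        cached = self._cache.get(name)
        if cached and time.monotonic() - cached[1] < ttl:
            return cached[0]          # still fresh: serve cached value
        value = fetch()               # otherwise fetch at query time
        self._cache[name] = (value, time.monotonic())
        return value

sources = ExternalSources()
sources.register("fx_rate_usd_eur", lambda: 0.92)  # real-time value at query time
sources.register("reference_data", lambda: {"region": "EU"}, ttl_seconds=3600)

print(sources.query("fx_rate_usd_eur"))
print(sources.query("reference_data"))
```

In practice the fetch functions would wrap database drivers, HTTP clients, or feed readers; the query interface stays the same regardless of source.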
Integrate Data between Business Units or Business Partners
Governance activities establish a common vocabulary and data definitions, and systems of record publish existing data specifications or ontology models; each organization defines data in the manner best suited to its business.
Shared data is copied to an integrated database.
Federation and virtualization features provide choices about which data to copy and which data to retain in the system(s) of record.
Organization-specific definitions may require duplicating certain data in marts.
All models can be supported through a single copy of the data, maintained in the data lake or system of record.
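A sketch of how one shared copy can serve several organization-specific models (all field names and view definitions here are hypothetical): each organization's model is a virtual view applied to the single dataset, instead of a duplicated mart.

```python
# Virtualization sketch (hypothetical fields): one shared copy of the
# data in the lake; each organization's model is a view definition
# applied at query time rather than a physical duplicate.

shared = [
    {"cust": "C-1", "gross": 100.0, "tax": 20.0},
    {"cust": "C-2", "gross": 50.0, "tax": 10.0},
]

# Organization-specific views: rename or derive fields per each
# organization's own definitions, over the same single copy.
views = {
    "sales":   lambda r: {"customer": r["cust"], "revenue": r["gross"]},
    "finance": lambda r: {"account": r["cust"], "net": r["gross"] - r["tax"]},
}

def query(view_name):
    return [views[view_name](r) for r in shared]

print(query("sales"))
print(query("finance"))
```

Only when a view's performance or retention needs diverge would the data be physically duplicated into a mart, matching the choice described above.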
Capture and Embed Expertise
Expertise is often captured in the reporting and analytics, which creates a change-management challenge when updates are required.
Expertise captured in the data definitions instead yields a single, shared definition that minimizes change-management effort.
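To illustrate the contrast (the "risk score" metric and its rule below are hypothetical), the expertise can be captured once as a shared definition that every report reuses, so a rule change is made in one place rather than in each report:

```python
# Sketch of expertise captured in a shared data definition
# (hypothetical business rule): reports reuse the single definition
# instead of embedding their own copies of the logic.

def risk_score(balance, late_payments):
    """Shared definition of the 'risk score' business rule."""
    return late_payments * 10 + (5 if balance < 0 else 0)

def monthly_report(accounts):
    # Reuses the shared definition
    return [{"id": a["id"], "risk": risk_score(a["balance"], a["late"])}
            for a in accounts]

def dashboard(accounts):
    # Also reuses it: no divergent copy to update when the rule changes
    return max(risk_score(a["balance"], a["late"]) for a in accounts)

accounts = [
    {"id": 1, "balance": -50.0, "late": 2},
    {"id": 2, "balance": 200.0, "late": 0},
]

print(monthly_report(accounts))
print(dashboard(accounts))
```

If the scoring rule changes, only `risk_score` is edited, and both consumers pick up the update, which is the change-management benefit described above.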