Exploding Demands for Big Data, Analytics, Risk Management, Ultra-low Latency and Compute Power Require Optimized HPC Infrastructures
Building a smarter planet: Financial Services
Robert Brinkman, Infrastructure Architect for Banking and Financial Markets, IBM Banking Center of Excellence
Financial Markets Industry Imperatives
• Re-engineer for profitable growth: renewed focus on the customer; near real-time analytics
• Improve the trade life cycle: cloud and business process outsourcing
• Optimize enterprise risk management: data-driven transformation and common industry services
[Diagram: optimized HPC infrastructure stacks]
Workload drivers: high message rates, data value decay, Big Data, big compute, high transaction rates, complex data models, specialized workloads, variable workloads
Stacks: Grid Stack, Low Latency Stack, Transaction Stack, Cloud Stack
Layers: Applications (IBM provided, ISVs, partners, custom); Appliances & Packages (packaged hardware and software); discrete components or applications; messaging and security
Dino Vitale, Director, Cross Technology Services, Morgan Stanley
Morgan Stanley: Road to Compute as a Service

Trends
• Maximize efficiency of compute infrastructure
• Cost / run-rate
• Utilization – more with less, linear scale, sharing
• Operational normalization
Challenges
• Phasing
• Dynamic provisioning and on-demand scaling of resources to applications according to varying business needs and SLAs
• Multi-tenant workload protection
• Application design and dependency management
• Utility charge-back model options: pay-per-use, fixed allocation, or a hybrid approach (see the sketch after this slide)
• Sharing resources based on workload supply and demand
• BCP (business continuity planning)
Convergence opportunities with “Big Data”
• Increasing data volumes
• Adaptive / real-time scheduling
• Resource management
• Metrics / data mining
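The charge-back options named above differ mainly in what gets metered. Below is a minimal sketch, assuming a simple core-hour rate model; the Usage record, function names, and rates are illustrative, not Morgan Stanley's actual billing scheme.

```python
# Illustrative sketch of the three utility charge-back options from the slide:
# pay-per-use, fixed allocation, and a hybrid. All names and rates are
# hypothetical assumptions for this example.
from dataclasses import dataclass


@dataclass
class Usage:
    core_hours_used: float      # metered consumption for the billing period
    core_hours_reserved: float  # capacity pre-allocated to the tenant


def pay_per_use(u: Usage, rate: float) -> float:
    """Bill only what was actually consumed."""
    return u.core_hours_used * rate


def fixed_allocation(u: Usage, rate: float) -> float:
    """Bill the reserved capacity regardless of consumption."""
    return u.core_hours_reserved * rate


def hybrid(u: Usage, rate: float, overage_rate: float) -> float:
    """Bill the reservation, plus a metered charge for bursting above it."""
    overage = max(0.0, u.core_hours_used - u.core_hours_reserved)
    return u.core_hours_reserved * rate + overage * overage_rate


if __name__ == "__main__":
    u = Usage(core_hours_used=1200.0, core_hours_reserved=1000.0)
    print(pay_per_use(u, 0.05))       # 60.0
    print(fixed_allocation(u, 0.05))  # 50.0
    print(hybrid(u, 0.04, 0.08))      # 40.0 + 16.0 = 56.0
```

The hybrid variant captures the "sharing resources based on workload supply and demand" point: a tenant pays a predictable base for its reservation and a market-style rate only when it bursts.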
ON-DEMAND DATA IN A HIGH-PERFORMANCE ENVIRONMENT
Emile Werr, VP, Global Data Services; Global Head of Enterprise Data Architecture & Identity Management
• Big Data: billions of transactions and multiple terabytes captured daily
• Speed and business agility are essential to our business
• Different viewpoints and data patterns need to be analyzed
• Data coming out of a trading plant is not user-friendly
• Correlating and integrating disparate data is hard
• Moving large data around is expensive and complex
• System capacity must efficiently handle 5x our average daily volume
• Data spikes: the day after the Flash Crash, volume peaked at over 18.4 billion transactions for the NYSE Classic matching engine (excluding Options and other markets such as Arca, Amex, Liffe, Euronext, etc.)
• Transaction volume growth is sustained year over year
• Data must be readily available for a minimum of 7 years for compliance, yet it is too expensive to keep it all online
• Change is constant
• Full quote size: the best-quote size from the last published best quote
• Price level used for calculating shares ahead and shares available (see the sketch below)
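A minimal sketch of how those two quantities might be computed from a displayed book; the book layout and the price/time priority tie-break are assumptions for illustration, not the NYSE DBK specification.

```python
# Minimal sketch: deriving "shares ahead" and "shares available" for a resting
# buy order at a given price level. The book layout and price/time priority
# rule are illustrative assumptions.

# Displayed bid side of the book: (price, arrival_seq, size), best price first.
bids = [
    (100.10, 1, 300),
    (100.10, 4, 200),
    (100.05, 2, 500),
    (100.00, 3, 400),
]


def shares_ahead(bids, my_price, my_seq):
    """Displayed size with priority over our order: better price, or the
    same price with an earlier arrival."""
    return sum(
        size
        for price, seq, size in bids
        if price > my_price or (price == my_price and seq < my_seq)
    )


def shares_available(bids, price_level):
    """Total displayed size at or better than the chosen price level."""
    return sum(size for price, _, size in bids if price >= price_level)


print(shares_ahead(bids, my_price=100.10, my_seq=4))  # 300
print(shares_available(bids, price_level=100.05))     # 300 + 200 + 500 = 1000
```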
Trading systems generate vast transaction volumes at high speed. The grid is used to transform, normalize, and enrich the time-series data with massively parallel computing, run as end-of-day or intra-day batch processing, as sketched below. Date-level table scans (queries) likewise need massively parallel processing (MPP). Appropriate technologies must be used: 10 Gb networking, virtualized CPU/memory, appliance databases, and scalable storage pools.
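A sketch of that partition-parallel pattern, using Python's multiprocessing as a stand-in for the grid scheduler; the directory layout and field names (ts, symbol, price, qty) are hypothetical.

```python
# Sketch of the grid pattern described above: date-partitioned time-series
# files are transformed, normalized, and enriched in parallel, one worker per
# partition. multiprocessing stands in for the grid scheduler; the CSV layout
# is a hypothetical example.
import csv
import glob
from multiprocessing import Pool


def transform_partition(path: str) -> str:
    out_path = path.replace(".csv", ".normalized.csv")
    with open(path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(
            dst, fieldnames=["ts", "symbol", "price", "qty", "notional"]
        )
        writer.writeheader()
        for row in reader:
            writer.writerow({
                "ts": row["ts"],
                "symbol": row["symbol"].upper(),        # normalize
                "price": f"{float(row['price']):.4f}",  # normalize
                "qty": row["qty"],
                # enrich with a derived field
                "notional": f"{float(row['price']) * int(row['qty']):.2f}",
            })
    return out_path


if __name__ == "__main__":
    partitions = sorted(glob.glob("trades/date=*/part-*.csv"))
    with Pool() as pool:  # one worker per partition, grid-style
        for done in pool.imap_unordered(transform_partition, partitions):
            print("transformed", done)
```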
USE CASE: Market Reconstruction for Trading Surveillance
The Electronic Book (NYSE DBK) and Market Depth need to be reconstructed and made accessible via a fast database
Who traded ahead, or who interpositioned? This can be answered with a database query, illustrated in the sketch below
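As a hedged illustration of such a query, the sketch below joins proprietary executions to customer orders that were resting at the same or better price. The schema (prop_executions, customer_orders) is hypothetical, and sqlite3 merely keeps the example self-contained where a real deployment would use an MPP engine.

```python
# Illustrative trading-ahead query over a reconstructed book. The schema and
# surveillance rule are simplified assumptions, not NYSE's actual logic.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customer_orders (order_id TEXT, symbol TEXT, side TEXT,
                              price REAL, arrival_ts INTEGER, filled_ts INTEGER);
CREATE TABLE prop_executions (exec_id TEXT, symbol TEXT, side TEXT,
                              price REAL, exec_ts INTEGER);
-- A customer buy resting at 100.10, then a proprietary buy filled at 100.05:
-- the firm traded ahead of an equal-or-better-priced customer order.
INSERT INTO customer_orders VALUES ('C1', 'IBM', 'B', 100.10, 1000, NULL);
INSERT INTO prop_executions VALUES ('P1', 'IBM', 'B', 100.05, 2000);
""")

TRADED_AHEAD_SQL = """
SELECT p.exec_id, c.order_id, p.symbol, p.exec_ts,
       p.price AS prop_price, c.price AS cust_price
FROM prop_executions AS p
JOIN customer_orders AS c
  ON c.symbol = p.symbol AND c.side = p.side
 AND c.arrival_ts < p.exec_ts                        -- customer was resting first
 AND (c.filled_ts IS NULL OR c.filled_ts > p.exec_ts)
 AND ((p.side = 'B' AND c.price >= p.price)          -- equal-or-better buy price
   OR (p.side = 'S' AND c.price <= p.price))         -- equal-or-better sell price
ORDER BY p.exec_ts
"""

for row in con.execute(TRADED_AHEAD_SQL):
    print(row)  # ('P1', 'C1', 'IBM', 2000, 100.05, 100.1)
```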
Data Lifecycle Management Methodology

[Diagram: the flow below runs from data capture in enterprise systems through to the end-user workflow]

Data Capture
• Trading data, market data, reference data, user-generated data

Data Transformation & Archive
• Transform, normalize, enrich
• Partition, compress, and archive in storage pools (see the sketch after this list)
• Create metadata (mappings)

On-Demand Data (ODD)
• Secure data access and navigation
• Load, extract, stream, filter, transform, purge
• User-driven data mart provisioning (“sandboxing”)
• Schema change capture (“data structure lineage”)

User Analytics (“Business Intelligence”)
• Utilize MPP databases and HDFS
• Integrate reporting tools
• Facilitate user collaboration
• Capture knowledge (KM)
• Automate data archive and purge
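A minimal sketch of the partition/compress/archive step with its metadata mapping; the storage-pool paths and JSON catalog format are assumptions, not NYSE's actual implementation.

```python
# Sketch of the Data Transformation & Archive stage: partition a day's
# normalized file, gzip it into a storage pool, and record a metadata mapping
# so on-demand loads can locate it later. Paths and the JSON catalog layout
# are illustrative assumptions.
import gzip
import json
import os
import shutil


def archive_partition(src_file: str, dataset: str, trade_date: str,
                      pool_root: str = "/pools/cold",
                      catalog: str = "catalog.json") -> str:
    # Partition by dataset and date inside the storage pool.
    part_dir = os.path.join(pool_root, dataset, f"date={trade_date}")
    os.makedirs(part_dir, exist_ok=True)

    # Compress the partition into the pool.
    dst = os.path.join(part_dir, os.path.basename(src_file) + ".gz")
    with open(src_file, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)

    # Append a metadata mapping: (dataset, date) -> archived location.
    entries = []
    if os.path.exists(catalog):
        with open(catalog) as f:
            entries = json.load(f)
    entries.append({"dataset": dataset, "date": trade_date, "path": dst,
                    "bytes": os.path.getsize(dst)})
    with open(catalog, "w") as f:
        json.dump(entries, f, indent=2)
    return dst


# e.g. archive_partition("trades-2010-05-07.normalized.csv",
#                        "nyse_trades", "2010-05-07")
```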
[Diagram: data capture architecture. Feed handlers emit files as a continuous flow (“trickle batch”) into data pumps; the layers run from Data Capture through Data Virtualization & Abstraction to Business Demand.]
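A sketch of the trickle-batch capture pattern in that diagram, assuming feed handlers roll completed files into a drop directory that a data pump polls; the directory names and the load step are hypothetical.

```python
# Sketch of the "continuous flow (trickle batch)" pattern: feed handlers roll
# files into a drop directory and the data pump loads each new file as it
# appears, instead of waiting for one large end-of-day batch. Directory names
# and the downstream load are illustrative assumptions.
import glob
import os
import time


def load_into_staging(path: str):
    print("loading", path)  # stand-in for the real loader


def data_pump(drop_dir: str = "capture/ready", poll_secs: float = 5.0):
    loaded = set()
    while True:
        for path in sorted(glob.glob(os.path.join(drop_dir, "*.csv"))):
            if path in loaded:
                continue
            load_into_staging(path)
            loaded.add(path)
        time.sleep(poll_secs)  # trickle: small batches, all day long


# data_pump()  # run the pump (loops until stopped)
```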