Deep Water - Bringing Tensorflow, Caffe, Mxnet to H2O

Post on 16-Apr-2017


Deep Water, or: Bringing TensorFlow et al. to H2O

Arno Candel, PhD, Chief Architect, Physicist & Hacker, H2O.ai

@ArnoCandel, July 18, 2016

Overview

Machine Learning (ML)

Artificial Intelligence (A.I.)

Computer Science (CS)

H2O.ai

Deep Learning (DL): hot hot hot hot hot

A Simple Deep Learning Model: Artificial Neural Network

IN: data (heartbeat, blood pressure, oxygen) OUT: prediction (send to regular care, or send to intensive care unit (ICU))

nodes: neuron activations (real numbers), representing features
arrows: connecting weights (real numbers), learned during training
non-linearity: x -> f(x), adds model complexity

from 1970s, now rebranded as DL
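The toy triage network described above (inputs: heartbeat, blood pressure, oxygen; output: regular care vs. ICU) can be sketched as a single forward pass in NumPy. All weight values below are made-up illustrative numbers, not a trained model:

```python
import numpy as np

def relu(x):
    # the non-linearity x -> f(x) = max(0, x) that adds model complexity
    return np.maximum(0.0, x)

def sigmoid(x):
    # squashes the output into (0, 1) so it reads as a probability
    return 1.0 / (1.0 + np.exp(-x))

def forward(inputs, w_hidden, w_out):
    # IN: data, OUT: prediction
    hidden = relu(w_hidden @ inputs)   # node activations (real numbers)
    return sigmoid(w_out @ hidden)     # probability of "send to ICU"

# IN: [heartbeat, blood pressure, oxygen], scaled to comparable ranges
x = np.array([0.9, 0.8, 0.3])
# arrows: connecting weights, normally learned during training (made up here)
w_hidden = np.array([[0.5, -0.2, 0.1],
                     [0.3,  0.8, -0.5]])
w_out = np.array([1.2, -0.7])

p_icu = forward(x, w_hidden, w_out)
print(p_icu)  # a value strictly between 0 and 1
```

Training would adjust `w_hidden` and `w_out` to minimize prediction error on labeled patient data; this sketch shows only the inference step the slide diagrams.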

Deep Learning Pros and Cons

Pros:
• conceptually simple
• nonlinear
• highly flexible and configurable
• learned features can be extracted
• can be fine-tuned with more data
• efficient for multi-class problems
• world-class at pattern recognition

Cons:
• hard to interpret
• theory not well understood
• slow to train and score
• overfits, needs regularization
• many hyper-parameters
• inefficient for categorical variables
• very data hungry, learns slowly

Deep Learning got boosted recently by faster computers

Brief History of A.I., ML and DL

John McCarthy: Princeton, Bell Labs, Dartmouth; later: MIT, Stanford

1955: “A proposal for the Dartmouth summer research project on Artificial Intelligence”

with Marvin Minsky (MIT), Claude Shannon (Bell Labs) and Nathaniel Rochester (IBM)

http://www.asiapacific-mathnews.com/04/0403/0015_0020.pdf

A step back: A.I. was coined over 60 years ago

1955 proposal for the Dartmouth summer research project on A.I.

“We propose that a 2-month, 10-man study of artificial intelligence be carried out during the summer of 1956 at Dartmouth College in Hanover, New Hampshire. The study is to proceed on the basis of the conjecture that every aspect of learning and any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it. An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves. We think that a significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for one summer.”

Step 1: Great Algorithms + Fast Computers

http://nautil.us/issue/18/genius/why-the-chess-computer-deep-blue-played-like-a-human

1997: Playing Chess (IBM Deep Blue beats Kasparov)

Computer Science: 30 custom CPUs, 60 billion moves in 3 mins

“No computer will ever beat me at playing chess.”

Step 2: More Data + Real-Time Processing

http://cs.stanford.edu/group/roadrunner/old/presskit.html

2005: Self-driving Cars. DARPA Grand Challenge, 132 miles (won by Stanford A.I. lab*)

Sensors & Computer Science: video, radar, laser, GPS, 7 Pentium computers

“No computer will ever drive a car!?”

*A.I. lab was established by McCarthy et al. in the early 60s

Step 3: Big Data + In-Memory Clusters

2011: Jeopardy (IBM Watson)

In-Memory Analytics/ML: 4 TB of data (incl. Wikipedia), 90 servers, 16 TB RAM, Hadoop, 6 million logic rules

https://www.youtube.com/watch?v=P18EdAKuC1U https://en.wikipedia.org/wiki/Watson_(computer)

Note: IBM Watson received the question in electronic written form, and was often able to press the answer button faster than the competing humans.

“No computer will ever answer random questions!?”

“No computer will ever understand my language!?”

2014: Google Translate (acquired Quest Visual)

Deep Learning: Convolutional and Recurrent Neural Networks, with training data from users

Step 4: Deep Learning

• Translate between 103 languages by typing
• Instant camera translation: use your camera to translate text instantly in 29 languages
• Camera Mode: take pictures of text for higher-quality translations in 37 languages
• Conversation Mode: two-way instant speech translation in 32 languages
• Handwriting: draw characters instead of using the keyboard in 93 languages

Step 5: Augmented Deep Learning

2014: Atari Games (DeepMind)

2016: AlphaGo (Google DeepMind)

Deep Learning + reinforcement learning, tree search, Monte Carlo, GPUs, playing against itself, …

The Go board has approx. 2E170 possible positions.

trained from raw pixel values, no human rules

“No computer will ever beat the best Go master!?”
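The ~2E170 figure can be sanity-checked: a 19x19 board has 361 points, each empty, black, or white, giving 3^361 ≈ 1.7E172 raw colourings, of which only roughly 1% are legal positions, hence the ~2E170 count. A quick check of the upper bound:

```python
from math import log10

# 19x19 board: 361 points, each empty, black, or white
raw_colourings = 3 ** 361
print(round(log10(raw_colourings), 1))  # ~172.2, i.e. about 1.7E172 raw colourings
```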

Microsoft won the Visual Recognition challenge: http://image-net.org/challenges/LSVRC/2015/

Step 6: A.I. Chatbots have Opinions too!

What Will Change?

Today vs. Tomorrow

Better Data, Better Models, Better Results

Example: Fraud Prediction

What about Jobs?

Anything that can be automated will be automated.

Jobs of the past: assembly line work, teller (ATMs), taxi-firm receptionist

Jobs being automated away now: resume matching, driving, language translation, education

Jobs being automated away soon: healthcare, arts & crafts, entertainment, design, decoration, software engineering, politics, management, professional gaming, financial planning, auditing, real estate agent

Jobs of the future: professional sports, food & wine reviewer

Maybe we'll finally get the eating machine?

https://www.youtube.com/watch?v=n_1apYo6-Ow

1936 Modern Times (Charlie Chaplin)

Live H2O Deep Learning Demo: Predict Airplane Delays

10 nodes: all 320 cores busy

real-time, interactive model inspection in Flow

116M rows, 6 GB CSV file, 800+ predictors (numeric + categorical)

model trained in <1 min: 2M+ samples/second

Deep Learning Model

H2O Elastic Net (GLM): 10 secs; alpha=0.5, lambda=1.379e-4 (auto)

H2O Deep Learning: 45 secs; 4 hidden ReLU layers of 20 neurons, 1 epoch
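The GLM run above combines L1 and L2 regularization via the elastic net penalty lambda * (alpha * L1 + (1 - alpha)/2 * L2²): alpha=0.5 mixes lasso and ridge evenly, and lambda=1.379e-4 sets the overall strength. A minimal NumPy sketch of that penalty term (the coefficient vector is made up, and this is not H2O's internal code):

```python
import numpy as np

def elastic_net_penalty(beta, lam, alpha):
    """Elastic net: lam * (alpha * L1 + (1 - alpha)/2 * L2^2).

    alpha=1 is pure lasso (L1), alpha=0 is pure ridge (L2);
    alpha=0.5 mixes both evenly.
    """
    l1 = np.sum(np.abs(beta))
    l2 = np.sum(beta ** 2)
    return lam * (alpha * l1 + (1.0 - alpha) / 2.0 * l2)

beta = np.array([0.4, -1.2, 0.0, 2.5])   # illustrative coefficients
print(elastic_net_penalty(beta, lam=1.379e-4, alpha=0.5))
```

The L1 part drives weak coefficients exactly to zero (feature selection); the L2 part shrinks large ones smoothly, which matters with 800+ predictors.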

Features have non-linear impact

Chicago, Atlanta, Dallas: often delayed

Significant Performance Gains with Deep Learning

Predict departure delay (Y/N) on 20 years of airline flight data (116M rows, 12 cols, categorical + numerical data with missing values)


AUC: 0.656

AUC: 0.703 (higher is better, ranges from 0.5 to 1)
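AUC can be read as the probability that a randomly chosen positive example is ranked above a randomly chosen negative one, which is why 0.5 means random guessing and 1.0 means perfect ranking. A minimal pure-Python sketch on made-up delay predictions:

```python
def auc(labels, scores):
    """Probability that a random positive outranks a random negative (ties count 0.5)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# made-up predicted delay probabilities for 6 flights
labels = [1, 0, 1, 0, 1, 0]            # 1 = delayed
scores = [0.9, 0.2, 0.6, 0.4, 0.3, 0.3]
print(auc(labels, scores))  # 0.8333...
```

On 116M rows this pairwise form is too slow; production implementations sort by score and sweep thresholds instead, but the ranking interpretation is the same.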

Feature importances

10 nodes: Dual E5-2650 (8 cores, 2.6GHz), 10GbE


Kaggle challenge 2nd place winner Colin Priest

“For my final competition submission I used an ensemble of models, including 3 deep learning models built with R and h2o.”

“I did really like H2O's deep learning implementation in R, though - the interface was great, the backend extremely easy to understand, and it was scalable and flexible. Definitely a tool I'll be going back to.”

H2O Deep Learning Community Quotes



“H2O Deep Learning models outperform other Gleason predicting models.”


“…combine ADAM and Apache Spark with H2O's deep learning capabilities to predict an individual's population group based on his or her genomic data. Our results demonstrate that we can predict these very well, with more than 99% accuracy.”


H2O Booklets


R, Python, Deep Learning, GLM, GBM, Sparkling Water

KDNuggets Poll about Deep Learning Tools & Platforms

http://www.kdnuggets.com

H2O and TensorFlow are tied

usage of Deep Learning tools in the past year

TensorFlow + H2O + Apache Spark = Anything is possible

https://github.com/h2oai/sparkling-water/blob/master/py/examples/notebooks/TensorFlowDeepLearning.ipynb https://www.youtube.com/watch?v=62TFK641gG8

Integration with existing GPU backends

Leverage open-source tools and research for TensorFlow, Caffe, mxnet, Theano, etc.

Scalability and Ease of Use / Deployment of H2O

Distributed training, real-time model inspection; Flow, R, Python, Spark/Scala, Java, REST, POJO, Steam

Convolutional Neural Networks

Image, video, speech recognition, etc.

Recurrent Neural Networks

Sequences, time series, etc. NLP: natural language processing

Hybrid Neural Network Architectures

Speech-to-text translation, image captioning, scene parsing, etc.

Deep Water: Next-Gen Deep Learning in H2O

Enterprise Deep Learning for Business Transformation

Deep Water: Next-Gen Deep Learning in H2O

Don't miss our Deep Learning sessions tomorrow afternoon!

TensorFlow, mxnet, Caffe, H2O
