Page 1: Kaggle Deep Learning Competition

Kaggle Deep Learning Competition

in Computer Vision

Alexander Eckert <[email protected]>

Oct, 2018

Thanks to Nicole Finnie for preparing the bulk of the slides!

Page 2: Kaggle Deep Learning Competition

Agenda

- A very brief introduction to Kaggle
- Kaggle 2018 Data Science Bowl – Nuclei detection
- Q & A

https://www.kaggle.com/c/data-science-bowl-2018

Page 3: Kaggle Deep Learning Competition

A brief introduction to Kaggle

Prizes | Medals | Points

https://www.kaggle.com

Page 4: Kaggle Deep Learning Competition

Ranking & Rewards

- Competition ranking
  - Public leaderboard
    - evaluated on a subset of the ground truth
    - visible during the competition
  - Private leaderboard
    - unseen data
    - determines the final ranking
  - Rewards
    - money, medals, ranking points
- Global ranking
  - Competitions
  - Kernels (Notebooks)
  - Forum discussions

https://www.kaggle.com

Page 5: Kaggle Deep Learning Competition

Our team

https://www.kaggle.com/c/data-science-bowl-2018

- Our objectives
  - Apply and learn state-of-the-art CV algorithms, ANN frameworks & architectures
  - Take the perspective of a data scientist, discuss approaches
  - Compete

Page 6: Kaggle Deep Learning Competition

2018 Data Science Bowl – Nuclei Detection

[Figure: sample image – detect the mask of one nucleus]

https://www.kaggle.com/c/data-science-bowl-2018

Page 7: Kaggle Deep Learning Competition

Exploratory Data Analysis (EDA) – K-Means
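The slide names only K-Means, so here is a minimal sketch of one plausible EDA use: clustering the training images by simple colour statistics to separate the different staining/imaging types. The feature choice and cluster count are assumptions for illustration, not the team's actual code.

```python
# Hypothetical EDA sketch: group training images into visual types
# (e.g. fluorescence vs. stained tissue) by clustering colour statistics.
import numpy as np
from sklearn.cluster import KMeans

def cluster_image_types(images, n_clusters=3):
    """images: list of HxWx3 uint8 arrays; returns one cluster id per image."""
    # Per-channel mean and std as a cheap 6-d descriptor (an assumption).
    features = np.array([
        np.concatenate([img.mean(axis=(0, 1)), img.std(axis=(0, 1))])
        for img in images
    ])
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
```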

Page 8: Kaggle Deep Learning Competition

(Mini) U-Net – a CNN autoencoder

[Architecture diagram: stacked 3x3 convolution blocks with 16, 32, 64, 128 and 256 filters; dropout between 0.1 and 0.3 per block]

activation: sigmoid
optimization: Adam
loss: binary cross-entropy
input: 256x256x2, output: 256x256x2

https://lmb.informatik.uni-freiburg.de/people/ronneber/u-net/
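Based on the figures on this slide, a mini U-Net along these lines could look as follows in Keras. The exact layer ordering, skip connections and per-block dropout placement are assumptions reconstructed from the diagram; treat it as a sketch, not the team's implementation.

```python
# Sketch of a mini U-Net: conv/dropout blocks (16-256 filters, as on the
# slide), max-pool downsampling, transposed-conv upsampling, skip links.
from tensorflow.keras import layers, Model

def mini_unet(input_shape=(256, 256, 2)):
    inputs = layers.Input(input_shape)
    skips, x = [], inputs
    # Encoder: filter counts grow, resolution halves per block.
    for filters, rate in [(16, 0.1), (32, 0.1), (64, 0.2), (128, 0.2)]:
        x = layers.Conv2D(filters, 3, activation="relu", padding="same")(x)
        x = layers.Dropout(rate)(x)
        x = layers.Conv2D(filters, 3, activation="relu", padding="same")(x)
        skips.append(x)
        x = layers.MaxPooling2D(2)(x)
    # Bottleneck with the largest filter count on the slide.
    x = layers.Conv2D(256, 3, activation="relu", padding="same")(x)
    x = layers.Dropout(0.3)(x)
    x = layers.Conv2D(256, 3, activation="relu", padding="same")(x)
    # Decoder: upsample and concatenate the matching encoder output.
    for filters, rate, skip in zip([128, 64, 32, 16], [0.2, 0.2, 0.1, 0.1],
                                   reversed(skips)):
        x = layers.Conv2DTranspose(filters, 2, strides=2, padding="same")(x)
        x = layers.concatenate([x, skip])
        x = layers.Conv2D(filters, 3, activation="relu", padding="same")(x)
        x = layers.Dropout(rate)(x)
        x = layers.Conv2D(filters, 3, activation="relu", padding="same")(x)
    # Two sigmoid output channels, matching the 256x256x2 output shape.
    outputs = layers.Conv2D(2, 1, activation="sigmoid")(x)
    model = Model(inputs, outputs)
    model.compile(optimizer="adam", loss="binary_crossentropy")
    return model
```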

Page 9: Kaggle Deep Learning Competition

Hidden feature engineering

[Pipeline diagram: train image → KNN on edge pixels → channel 0 / channel 1 → windowing crops and flips (data augmentation) → train 2 U-Net models (colour & grey)]
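As a concrete illustration of the augmentation step, here is a minimal sketch of windowed crops plus random flips. The window size follows the 256x256 U-Net input above; the flip probabilities are illustrative assumptions (the repository linked on the ensembling slide contains the real pipeline).

```python
import numpy as np

# Hypothetical "windowing crops, flip" augmentation: cut a fixed-size
# window out of a training image and randomly mirror it.
def augment(image, mask, window=256, rng=np.random.default_rng()):
    h, w = image.shape[:2]
    top = rng.integers(0, max(h - window, 0) + 1)
    left = rng.integers(0, max(w - window, 0) + 1)
    img = image[top:top + window, left:left + window]
    msk = mask[top:top + window, left:left + window]
    if rng.random() < 0.5:  # horizontal flip
        img, msk = img[:, ::-1], msk[:, ::-1]
    if rng.random() < 0.5:  # vertical flip
        img, msk = img[::-1], msk[::-1]
    return img, msk
```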

Page 10: Kaggle Deep Learning Competition

Post-processing of predictions

- Necessary for instance segmentation
- (Predicted masks – predicted contours) => label seeds
- Labelling (using random walker/watershed algorithm; see the sketch after the figure)

[Figure: test image → predicted contours → labelling (instance segmentation) → post-processed result next to the ground truth]
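A minimal sketch of the seeds-then-label idea using scikit-image's watershed; the 0.5 threshold and the function name are assumptions for illustration. The slide also mentions the random walker alternative (skimage.segmentation.random_walker).

```python
from scipy import ndimage
from skimage.segmentation import watershed

# Sketch: turn predicted mask/contour probabilities into instance labels.
def label_instances(pred_mask, pred_contour, threshold=0.5):
    mask = pred_mask > threshold
    # Subtracting the contours separates touching nuclei => label seeds.
    seeds = mask & ~(pred_contour > threshold)
    markers, _ = ndimage.label(seeds)
    # Grow each seed back out to the full mask extent with watershed.
    return watershed(-pred_mask, markers, mask=mask)
```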

Page 11: Kaggle Deep Learning Competition

Model ensembling

- Pixel prediction using weighted majority vote
- Final mean IoU: 0.545

[Diagram: train images → baseline / transform / noise variants (data augmentation) → ensembling → prediction on the test image]
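A minimal sketch of a pixel-wise weighted majority vote over several model predictions; the per-model weights and the 0.5 cut-offs are illustrative assumptions.

```python
import numpy as np

# Sketch: combine HxW probability maps from several models into one mask.
def weighted_majority_vote(predictions, weights, threshold=0.5):
    weights = np.asarray(weights, dtype=float)
    # Binarise each model's prediction, then take the weighted vote.
    votes = np.stack([p > threshold for p in predictions]).astype(float)
    score = np.tensordot(weights, votes, axes=1) / weights.sum()
    return score >= 0.5  # foreground where the weighted vote passes
```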

https://github.com/nicolefinnie/kaggle-dsb2018

Silver medal

Page 12: Kaggle Deep Learning Competition

Learnings

- Very time intensive
  - Important to efficiently (e.g. in parallel) evaluate promising paths and check classification results/errors
  - GPUs and patience required for (deep learning) model training
- Follow forum discussions for new ideas, knowledge sharing and the competition timeline
- Combine orthogonal approaches and create weighted ensembles
- Steep learning curve, but great for acquiring (or improving) skills
  - Keras/TensorFlow & PyTorch NN frameworks
  - Python libraries like OpenCV, Pandas, Scikit, Jupyter
  - Image processing techniques
  - Network architectures for semantic segmentation
  - Building modelling & prediction pipelines

Page 13: Kaggle Deep Learning Competition

Q & A

Sometimes it can go really wrong... (toxic image)

What? A painting in the test set?

Good job, U-Nets!