Really using Oracle analytic SQL functions

Post on 27-Jan-2015

DESCRIPTION

Presentation on Oracle analytic SQL functions as I presented it at the UKOUG 2012 conference in Birmingham

Transcript

Really Using Analytic Functions

Kim Berg Hansen, T. Hansen Gruppen A/S

#ukoug2012 Really Using Analytic Functions2

Who is this Kim?
• A Danish SQL and PL/SQL developer: http://dspsd.blogspot.com
• Professional geek since 1996
• Oracle programmer since 2000
• Single SQL Statement mantra (© Tom Kyte)
• Danish beer enthusiast (http://ale.dk)
• Likes to cook
• Reads sci-fi

2012-12-05

What’s up?
• Why analytics?
• Case 1: Top selling items
• Case 2: Picking by FIFO
• Case 3: Efficient picking route
• Case 4: Picking efficiency
• Case 5: Forecasting sales
• Case 6: Forecast zero firework stock
• Case 7: Multi-order FIFO picking (time permitting)
• Any questions?

Why analytics?
• Normal SQL functions operate on one row
• Aggregates can handle more rows but lose detail
• When you need details together with subtotals, ranks, ratios or comparisons, you could do:
  – Client operations (tool or code with variables/arrays)
  – Scalar subqueries (multiple accesses of the same data)
  – Analytic functions (often much more efficient)
• Analytics allow you to operate across the entire resultset, not just a single row
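The contrast can be run end to end with a small sketch. Python's built-in sqlite3 is used only because SQLite's window functions mimic Oracle's analytics for this purpose; the tiny sales table and its numbers are invented for illustration (requires SQLite 3.25+, bundled with current Python releases).

```python
import sqlite3

# In-memory toy table (invented data, not the presentation's schema).
con = sqlite3.connect(":memory:")
con.execute("create table sales (item text, qty integer)")
con.executemany("insert into sales values (?, ?)",
                [("A", 10), ("B", 30), ("C", 60)])

# Aggregate: one total row -- the detail rows are gone.
total = con.execute("select sum(qty) from sales").fetchone()[0]

# Analytic: every detail row is kept, and the total rides along on each row.
rows = con.execute(
    "select item, qty, sum(qty) over () as total from sales order by item"
).fetchall()

print(total)  # 100
print(rows)   # [('A', 10, 100), ('B', 30, 100), ('C', 60, 100)]
```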

Case 1: Top Selling Items

Top selling items
• Classic task for a programmer:
• Show top three by product group
• Also show how big a percentage they sold of the total
  – Both of the total by product group
  – And of the grand total

Tables

create table items (
   item varchar2(10) primary key
 , grp  varchar2(10)
 , name varchar2(20)
)
/

create table sales (
   item varchar2(10) references items (item)
 , mth  date
 , qty  number
)
/

Items with groups

Sales per month

Data - items

insert into items values ('101010','AUTO','Brake disc');
insert into items values ('102020','AUTO','Snow chain');
insert into items values ('103030','AUTO','Sparc plug');
insert into items values ('104040','AUTO','Oil filter');
insert into items values ('105050','AUTO','Light bulb');
insert into items values ('201010','MOBILE','Handsfree');
insert into items values ('202020','MOBILE','Charger');
insert into items values ('203030','MOBILE','iGloves');
insert into items values ('204040','MOBILE','Headset');
insert into items values ('205050','MOBILE','Cover');

5 autoparts

5 mobile accessories

Data – sales AUTO

insert into sales values ('101010',date '2011-04-01',10);
insert into sales values ('101010',date '2011-05-01',11);
insert into sales values ('101010',date '2011-06-01',12);
insert into sales values ('102020',date '2011-03-01', 7);
insert into sales values ('102020',date '2011-07-01', 8);
insert into sales values ('103030',date '2011-01-01', 6);
insert into sales values ('103030',date '2011-02-01', 9);
insert into sales values ('103030',date '2011-11-01', 4);
insert into sales values ('103030',date '2011-12-01',14);
insert into sales values ('104040',date '2011-08-01',22);
insert into sales values ('105050',date '2011-09-01',13);
insert into sales values ('105050',date '2011-10-01',15);

Sales for various months of 2011 for the autoparts

Data – sales MOBILE

insert into sales values ('201010',date '2011-04-01', 5);
insert into sales values ('201010',date '2011-05-01', 6);
insert into sales values ('201010',date '2011-06-01', 7);
insert into sales values ('202020',date '2011-03-01',21);
insert into sales values ('202020',date '2011-07-01',23);
insert into sales values ('203030',date '2011-01-01', 7);
insert into sales values ('203030',date '2011-02-01', 7);
insert into sales values ('203030',date '2011-11-01', 6);
insert into sales values ('203030',date '2011-12-01', 8);
insert into sales values ('204040',date '2011-08-01',35);
insert into sales values ('205050',date '2011-09-01',13);
insert into sales values ('205050',date '2011-10-01',15);

Sales for various months of 2011 for the mobile accessories

Base select

select i.grp
     , i.item
     , max(i.name) name
     , sum(s.qty) qty
  from items i
  join sales s
    on s.item = i.item
 where s.mth between date '2011-01-01' and date '2011-12-01'
 group by i.grp, i.item
 order by i.grp, sum(s.qty) desc, i.item

Join items and sales

Sales for 2011

Group by to get total sales for 2011 per item

Base select

GRP ITEM NAME QTY

---------- ---------- -------------------- -----

AUTO 101010 Brake disc 33

AUTO 103030 Sparc plug 33

AUTO 105050 Light bulb 28

AUTO 104040 Oil filter 22

AUTO 102020 Snow chain 15

MOBILE 202020 Charger 44

MOBILE 204040 Headset 35

MOBILE 203030 iGloves 28

MOBILE 205050 Cover 28

MOBILE 201010 Handsfree 18

A couple of items in each group have identical sales

Which TOP?

select g.grp, g.item, g.name, g.qty
     , dense_rank() over (partition by g.grp order by g.qty desc) drnk
     , rank()       over (partition by g.grp order by g.qty desc) rnk
     , row_number() over (partition by g.grp order by g.qty desc, g.item) rnum
  from (
   select i.grp
        , i.item
        , max(i.name) name
        , sum(s.qty) qty
     from items i
     join sales s
       on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
  ) g
 order by g.grp, g.qty desc, g.item

Base select as inline view

Which TOP?

GRP ITEM NAME QTY DRNK RNK RNUM

---------- ---------- -------------------- ----- ----- ----- -----

AUTO 101010 Brake disc 33 1 1 1

AUTO 103030 Sparc plug 33 1 1 2

AUTO 105050 Light bulb 28 2 3 3

AUTO 104040 Oil filter 22 3 4 4

AUTO 102020 Snow chain 15 4 5 5

MOBILE 202020 Charger 44 1 1 1

MOBILE 204040 Headset 35 2 2 2

MOBILE 203030 iGloves 28 3 3 3

MOBILE 205050 Cover 28 3 3 4

MOBILE 201010 Handsfree 18 4 5 5

The three different functions handle ties differently
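The tie behavior can be reproduced outside Oracle; a hedged sketch with Python's sqlite3 (SQLite supports the same three ranking functions) and a made-up two-way tie:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table t (item text, qty integer)")
# "x" and "y" tie on quantity, like Brake disc and Sparc plug on the slide.
con.executemany("insert into t values (?, ?)",
                [("x", 33), ("y", 33), ("z", 28)])

rows = con.execute("""
    select item, qty
         , dense_rank() over (order by qty desc)       as drnk
         , rank()       over (order by qty desc)       as rnk
         , row_number() over (order by qty desc, item) as rnum
      from t
     order by qty desc, item
""").fetchall()

print(rows)  # [('x', 33, 1, 1, 1), ('y', 33, 1, 1, 2), ('z', 28, 2, 3, 3)]
```

After the tie, dense_rank() continues with 2 (no gap), rank() jumps to 3 (gap), and row_number() simply counts.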

Without inline view

select i.grp
     , i.item
     , max(i.name) name
     , sum(s.qty) qty
     , dense_rank() over (partition by i.grp order by sum(s.qty) desc) drnk
     , rank()       over (partition by i.grp order by sum(s.qty) desc) rnk
     , row_number() over (partition by i.grp order by sum(s.qty) desc, i.item) rnum
  from items i
  join sales s
    on s.item = i.item
 where s.mth between date '2011-01-01' and date '2011-12-01'
 group by i.grp, i.item
 order by i.grp, sum(s.qty) desc, i.item

Analytics are calculated last, so they can include group by expressions as well as aggregates

Without inline view

GRP ITEM NAME QTY DRNK RNK RNUM

---------- ---------- -------------------- ----- ----- ----- -----

AUTO 101010 Brake disc 33 1 1 1

AUTO 103030 Sparc plug 33 1 1 2

AUTO 105050 Light bulb 28 2 3 3

AUTO 104040 Oil filter 22 3 4 4

AUTO 102020 Snow chain 15 4 5 5

MOBILE 202020 Charger 44 1 1 1

MOBILE 204040 Headset 35 2 2 2

MOBILE 203030 iGloves 28 3 3 3

MOBILE 205050 Cover 28 3 3 4

MOBILE 201010 Handsfree 18 4 5 5

Identical results

TOP 3 – rank()

select g.grp, g.item, g.name, g.qty, g.rnk
  from (
   select i.grp
        , i.item
        , max(i.name) name
        , sum(s.qty) qty
        , rank() over (partition by i.grp order by sum(s.qty) desc) rnk
     from items i
     join sales s
       on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
  ) g
 where g.rnk <= 3
 order by g.grp, g.rnk, g.item

Analytic functions cannot be used in the where clause, so use an inline view and filter on the alias

TOP 3 – rank()

GRP ITEM NAME QTY RNK

---------- ---------- -------------------- ----- -----

AUTO 101010 Brake disc 33 1

AUTO 103030 Sparc plug 33 1

AUTO 105050 Light bulb 28 3

MOBILE 202020 Charger 44 1

MOBILE 204040 Headset 35 2

MOBILE 203030 iGloves 28 3

MOBILE 205050 Cover 28 3

rank() works like the Olympics – two gold medals mean no silver medal

TOP 3 – dense_rank()

select g.grp, g.item, g.name, g.qty, g.rnk
  from (
   select i.grp
        , i.item
        , max(i.name) name
        , sum(s.qty) qty
        , dense_rank() over (partition by i.grp order by sum(s.qty) desc) rnk
     from items i
     join sales s
       on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
  ) g
 where g.rnk <= 3
 order by g.grp, g.rnk, g.item

TOP 3 – dense_rank()

GRP ITEM NAME QTY RNK

---------- ---------- -------------------- ----- -----

AUTO 101010 Brake disc 33 1

AUTO 103030 Sparc plug 33 1

AUTO 105050 Light bulb 28 2

AUTO 104040 Oil filter 22 3

MOBILE 202020 Charger 44 1

MOBILE 204040 Headset 35 2

MOBILE 203030 iGloves 28 3

MOBILE 205050 Cover 28 3

dense_rank() also gives equal rank at ties, but does not skip ranks

TOP 3 – row_number()

select g.grp, g.item, g.name, g.qty, g.rnk
  from (
   select i.grp
        , i.item
        , max(i.name) name
        , sum(s.qty) qty
        , row_number() over (partition by i.grp order by sum(s.qty) desc, i.item) rnk
     from items i
     join sales s
       on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
  ) g
 where g.rnk <= 3
 order by g.grp, g.rnk, g.item

TOP 3 – row_number()

GRP ITEM NAME QTY RNK

---------- ---------- -------------------- ----- -----

AUTO 101010 Brake disc 33 1

AUTO 103030 Sparc plug 33 2

AUTO 105050 Light bulb 28 3

MOBILE 202020 Charger 44 1

MOBILE 204040 Headset 35 2

MOBILE 203030 iGloves 28 3

row_number() just numbers rows consecutively

If there are ties, the result is ”random”, so it is a good idea to always use a ”unique” order by

Percent of total

select g.grp, g.item, g.name, g.qty, g.rnk
     , round(g.g_pct,1) g_pct
     , round(g.t_pct,1) t_pct
  from (
   select i.grp
        , i.item
        , max(i.name) name
        , sum(s.qty) qty
        , rank() over (partition by i.grp order by sum(s.qty) desc) rnk
        , 100 * ratio_to_report(sum(s.qty)) over (partition by i.grp) g_pct
        , 100 * ratio_to_report(sum(s.qty)) over () t_pct
     from items i
     join sales s
       on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
  ) g
 where g.rnk <= 3
 order by g.grp, g.rnk, g.item

ratio_to_report() returns a number between 0 and 1

Multiply by 100 to get percent
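ratio_to_report() is Oracle-specific, but the same percentages fall out of a plain sum() over (...); a sketch in Python/sqlite3 with invented quantities:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table t (grp text, qty integer)")
con.executemany("insert into t values (?, ?)",
                [("AUTO", 30), ("AUTO", 10), ("MOBILE", 60)])

# qty / sum(qty) over (partition by grp) emulates ratio_to_report per group;
# qty / sum(qty) over ()                 emulates it over the whole resultset.
rows = con.execute("""
    select grp, qty
         , round(100.0 * qty / sum(qty) over (partition by grp), 1) as g_pct
         , round(100.0 * qty / sum(qty) over (), 1)                 as t_pct
      from t
     order by grp, qty desc
""").fetchall()

print(rows)
# [('AUTO', 30, 75.0, 30.0), ('AUTO', 10, 25.0, 10.0), ('MOBILE', 60, 100.0, 60.0)]
```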

Percent of total

GRP ITEM NAME QTY RNK G_PCT T_PCT

---------- ---------- -------------------- ----- ----- ------ ------

AUTO 101010 Brake disc 33 1 25.2 11.6

AUTO 103030 Sparc plug 33 1 25.2 11.6

AUTO 105050 Light bulb 28 3 21.4 9.9

MOBILE 202020 Charger 44 1 28.8 15.5

MOBILE 204040 Headset 35 2 22.9 12.3

MOBILE 203030 iGloves 28 3 18.3 9.9

MOBILE 205050 Cover 28 3 18.3 9.9

Top selling items
• What kind of top three do you wish?
  – DENSE_RANK()
  – RANK()
  – ROW_NUMBER()
• PARTITION BY groups of items
• RATIO_TO_REPORT for percentages

Case 2: Picking by FIFO

Picking by FIFO

• Items stored in different locations in warehouse
• Pick an order by First-In First-Out principle

Tables

create table inventory (
   item  varchar2(10)
 , loc   varchar2(10)
 , qty   number
 , purch date
)
/

create table orderline (
   ordno number
 , item  varchar2(10)
 , qty   number
)
/

Item, location, quantity and date of purchase

Order number, item and quantity ordered

Data

insert into inventory values('A1', '1-A-20', 18, DATE '2004-11-01');
insert into inventory values('A1', '1-A-31', 12, DATE '2004-11-05');
insert into inventory values('A1', '1-C-05', 18, DATE '2004-11-03');
insert into inventory values('A1', '2-A-02', 24, DATE '2004-11-02');
insert into inventory values('A1', '2-D-07',  9, DATE '2004-11-04');
insert into inventory values('B1', '1-A-02', 18, DATE '2004-11-06');
insert into inventory values('B1', '1-B-11',  4, DATE '2004-11-05');
insert into inventory values('B1', '1-C-04', 12, DATE '2004-11-03');
insert into inventory values('B1', '1-B-15',  2, DATE '2004-11-02');
insert into inventory values('B1', '2-D-23',  1, DATE '2004-11-04');
insert into orderline values (1,'A1',24);
insert into orderline values (1,'B1',18);

2 items, each in 5 locations, with various purchase dates

One order with a quantity of both items

What to pick

variable pick_order number;

begin
   :pick_order := 1;
end;
/

The picking application sets a bind variable for which order to pick (could be a sales order, batch order or shop refill order)

What can we pick

select o.item
     , o.qty ord_qty
     , i.loc
     , i.purch
     , i.qty loc_qty
  from orderline o
  join inventory i
    on i.item = o.item
 where o.ordno = :pick_order
 order by o.item, i.purch, i.loc

Join orderline to inventory to see all that potentially can be picked

What can we pick

ITEM ORD_QTY LOC PURCH LOC_QTY

---------- ------- ---------- ---------- -------

A1 24 1-A-20 2004-11-01 18

A1 24 2-A-02 2004-11-02 24

A1 24 1-C-05 2004-11-03 18

A1 24 2-D-07 2004-11-04 9

A1 24 1-A-31 2004-11-05 12

B1 18 1-B-15 2004-11-02 2

B1 18 1-C-04 2004-11-03 12

B1 18 2-D-23 2004-11-04 1

B1 18 1-B-11 2004-11-05 4

B1 18 1-A-02 2004-11-06 18

Visually we can see we need 18 of A1 from the first location and 6 from the second location

Likewise we will empty the first 3 locations of B1 and pick 3 from the fourth location

Accumulate

select o.item
     , o.qty ord_qty
     , i.loc
     , i.purch
     , i.qty loc_qty
     , sum(i.qty) over (
          partition by i.item
          order by i.purch, i.loc
          rows between unbounded preceding and current row
       ) sum_qty
  from orderline o
  join inventory i
    on i.item = o.item
 where o.ordno = :pick_order
 order by o.item, i.purch, i.loc

Let’s try to create a rolling sum of the qty for each item

Accumulate

ITEM ORD_QTY LOC PURCH LOC_QTY SUM_QTY

---------- ------- ---------- ---------- ------- -------

A1 24 1-A-20 2004-11-01 18 18

A1 24 2-A-02 2004-11-02 24 42

A1 24 1-C-05 2004-11-03 18 60

A1 24 2-D-07 2004-11-04 9 69

A1 24 1-A-31 2004-11-05 12 81

B1 18 1-B-15 2004-11-02 2 2

B1 18 1-C-04 2004-11-03 12 14

B1 18 2-D-23 2004-11-04 1 15

B1 18 1-B-11 2004-11-05 4 19

B1 18 1-A-02 2004-11-06 18 37

Yup, when our sum is greater than the ordered qty, it looks like we have enough

Filter accumulated

select s.*
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and current row
          ) sum_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_qty < s.ord_qty
 order by s.item, s.purch, s.loc

So let’s try to filter on that

Filter accumulated

ITEM ORD_QTY LOC PURCH LOC_QTY SUM_QTY

---------- ------- ---------- ---------- ------- -------

A1 24 1-A-20 2004-11-01 18 18

B1 18 1-B-15 2004-11-02 2 2

B1 18 1-C-04 2004-11-03 12 14

B1 18 2-D-23 2004-11-04 1 15

FAIL!

Missing the last location for each item

Accumulate previous

select o.item
     , o.qty ord_qty
     , i.loc
     , i.purch
     , i.qty loc_qty
     , sum(i.qty) over (
          partition by i.item
          order by i.purch, i.loc
          rows between unbounded preceding and 1 preceding
       ) sum_prv_qty
  from orderline o
  join inventory i
    on i.item = o.item
 where o.ordno = :pick_order
 order by o.item, i.purch, i.loc

One small change to our rolling sum:

Sum of rows up to but not including the current row

Accumulate previous

ITEM ORD_QTY LOC PURCH LOC_QTY SUM_PRV_QTY

---------- ------- ---------- ---------- ------- -----------

A1 24 1-A-20 2004-11-01 18

A1 24 2-A-02 2004-11-02 24 18

A1 24 1-C-05 2004-11-03 18 42

A1 24 2-D-07 2004-11-04 9 60

A1 24 1-A-31 2004-11-05 12 69

B1 18 1-B-15 2004-11-02 2

B1 18 1-C-04 2004-11-03 12 2

B1 18 2-D-23 2004-11-04 1 14

B1 18 1-B-11 2004-11-05 4 15

B1 18 1-A-02 2004-11-06 18 19

As long as the sum of the previous rows is not enough, we continue

When the previous rows are sufficient, we stop

Filter previous

select s.*
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.item, s.purch, s.loc

Now we can filter correctly

nvl() for first row

least() to get qty to be picked at that location
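The whole FIFO trick (rolling sum of the previous rows, filter, then a least-style cap for the last partial pick) can be tried on toy data with Python's sqlite3; coalesce() and the two-argument scalar min() stand in for nvl() and least(), and the single-item inventory below is invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("create table inv (loc text, purch text, qty integer)")
con.executemany("insert into inv values (?, ?, ?)",
                [("1-A-20", "2004-11-01", 18),
                 ("2-A-02", "2004-11-02", 24),
                 ("1-C-05", "2004-11-03", 18)])

# Pick 24 units FIFO: sum_prv_qty is what the previous locations already cover.
rows = con.execute("""
    select loc, qty
         , min(qty, :ord - sum_prv_qty) as pick_qty
      from (select loc, purch, qty
                 , coalesce(sum(qty) over (
                        order by purch, loc
                        rows between unbounded preceding and 1 preceding
                   ), 0) as sum_prv_qty
              from inv)
     where sum_prv_qty < :ord
     order by purch, loc
""", {"ord": 24}).fetchall()

print(rows)  # [('1-A-20', 18, 18), ('2-A-02', 24, 6)]
```

The third location never appears because the first two already cover the ordered 24 units.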

Filter previous

ITEM ORD_QTY LOC PURCH LOC_QTY SUM_PRV_QTY PICK_QTY

---------- ------- ---------- ---------- ------- ----------- --------

A1 24 1-A-20 2004-11-01 18 0 18

A1 24 2-A-02 2004-11-02 24 18 6

B1 18 1-B-15 2004-11-02 2 0 2

B1 18 1-C-04 2004-11-03 12 2 12

B1 18 2-D-23 2004-11-04 1 14 1

B1 18 1-B-11 2004-11-05 4 15 3

Picking list FIFO

select s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

Now order by location to make a pick list

Picking list FIFO

LOC ITEM PICK_QTY

---------- ---------- --------

1-A-20 A1 18

1-B-11 B1 3

1-B-15 B1 2

1-C-04 B1 12

2-A-02 A1 6

2-D-23 B1 1

Ready for operator to go picking

Picking small qty

select s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.qty, i.loc -- << only line changed
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

Change pick policy by changing order:

Here empty small quantities first to clean out locations

Picking small qty

LOC ITEM PICK_QTY

---------- ---------- --------

1-A-20 A1 3

1-A-31 A1 12

1-B-11 B1 4

1-B-15 B1 2

1-C-04 B1 11

2-D-07 A1 9

2-D-23 B1 1

Lots of picks

Will clean locations quickly for new incoming goods

Picking few picks

select s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.qty desc, i.loc -- << only line changed
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

Or policy of picking as few times as possible

Picking few picks

LOC ITEM PICK_QTY

---------- ---------- --------

1-A-02 B1 18

2-A-02 A1 24

Only two picks

But it will be at the expense of leaving small quantities all over the warehouse

Picking short route

select s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.loc -- << only line changed
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

Policy of not driving to the far warehouse if possible

Picking short route

LOC ITEM PICK_QTY

---------- ---------- --------

1-A-02 B1 18

1-A-20 A1 18

1-A-31 A1 6

All picked in the very first aisle in the warehouse

Picking by FIFO
• SUM() by item

• Ordered by purchase date

• Rolling sum to find how much was picked by ”previous rows”

• Filter away rows where sufficient has already been picked

Case 3: Efficient picking route

Picking small qty

select s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.qty, i.loc -- << only line changed
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

Same data as FIFO picking case

But for this case we will use the policy of picking small quantities

Picking small qty

LOC ITEM PICK_QTY

---------- ---------- --------

1-A-20 A1 3

1-A-31 A1 12

1-B-11 B1 4

1-B-15 B1 2

1-C-04 B1 11

2-D-07 A1 9

2-D-23 B1 1

Because that gives many picks and shows this case best

Notice anything about these data?

Picking route

• Is this a smart route to drive?

Better picking route

• We need to change direction every other aisle

Decipher loc

select to_number(substr(s.loc,1,1)) warehouse
     , substr(s.loc,3,1) aisle
     , to_number(substr(s.loc,5,2)) position
     , s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty
        , i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.qty, i.loc
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

In this case the location can be split into warehouse, aisle and position simply with substr()

Decipher loc

WAREHOUSE AISLE POSITION LOC ITEM PICK_QTY

--------- ----- -------- ---------- ---------- --------

1 A 20 1-A-20 A1 3

1 A 31 1-A-31 A1 12

1 B 11 1-B-11 B1 4

1 B 15 1-B-15 B1 2

1 C 4 1-C-04 B1 11

2 D 7 2-D-07 A1 9

2 D 23 2-D-23 B1 1

Now we can use analytics on the individual parts of the location

Rank aisles

select to_number(substr(s.loc,1,1)) warehouse
     , substr(s.loc,3,1) aisle
     , dense_rank() over (
          order by to_number(substr(s.loc,1,1)) -- warehouse
                 , substr(s.loc,3,1)            -- aisle
       ) aisle_no
     , to_number(substr(s.loc,5,2)) position
     , s.loc
     , s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.qty, i.loc
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderline o
     join inventory i
       on i.item = o.item
    where o.ordno = :pick_order
  ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

Ordering by warehouse and aisle gives the same rank to all positions in the same aisle

dense_rank() ensures consecutive ranks

Rank aisles

WAREHOUSE AISLE AISLE_NO POSITION LOC ITEM PICK_QTY

--------- ----- -------- -------- ---------- ---------- --------

1 A 1 20 1-A-20 A1 3

1 A 1 31 1-A-31 A1 12

1 B 2 11 1-B-11 B1 4

1 B 2 15 1-B-15 B1 2

1 C 3 4 1-C-04 B1 11

2 D 4 7 2-D-07 A1 9

2 D 4 23 2-D-23 B1 1

Now we have numbered the aisles in the order they are to be visited

Odd up, even down

select s2.warehouse, s2.aisle, s2.aisle_no, s2.position, s2.loc, s2.item, s2.pick_qty
  from (
   select to_number(substr(s.loc,1,1)) warehouse
        , substr(s.loc,3,1) aisle
        , dense_rank() over (
             order by to_number(substr(s.loc,1,1)) -- warehouse
                    , substr(s.loc,3,1)            -- aisle
          ) aisle_no
        , to_number(substr(s.loc,5,2)) position
        , s.loc, s.item
        , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
     from (
      select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
           , nvl(sum(i.qty) over (
                partition by i.item
                order by i.qty, i.loc
                rows between unbounded preceding and 1 preceding
             ), 0) sum_prv_qty
        from orderline o
        join inventory i
          on i.item = o.item
       where o.ordno = :pick_order
     ) s
    where s.sum_prv_qty < s.ord_qty
  ) s2
 order by s2.warehouse
        , s2.aisle_no
        , case when mod(s2.aisle_no,2) = 1 then s2.position else -s2.position end

mod() and case allow us to order ascending by position on odd aisles and descending (negated position) on even aisles
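The serpentine ordering itself is just arithmetic, so a few lines of plain Python show the idea on made-up (aisle_no, position) pairs: negate the position on even aisles.

```python
# Invented picks as (aisle_no, position); aisles 1 and 3 are odd, aisle 2 even.
picks = [(1, 20), (1, 31), (2, 11), (2, 15), (3, 4)]

# Odd aisles sort ascending by position, even aisles descending (negated).
route = sorted(picks, key=lambda p: (p[0], p[1] if p[0] % 2 == 1 else -p[1]))

print(route)  # [(1, 20), (1, 31), (2, 15), (2, 11), (3, 4)]
```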

Odd up, even down

WAREHOUSE AISLE AISLE_NO POSITION LOC ITEM PICK_QTY

--------- ----- -------- -------- ---------- ---------- --------

1 A 1 20 1-A-20 A1 3

1 A 1 31 1-A-31 A1 12

1 B 2 15 1-B-15 B1 2

1 B 2 11 1-B-11 B1 4

1 C 3 4 1-C-04 B1 11

2 D 4 23 2-D-23 B1 1

2 D 4 7 2-D-07 A1 9

And so aisle 1-A is ascending, 1-B is descending, 1-C is ascending, 2-D is descending

Single door

• Direction has to ”restart” per warehouse

Partition warehouse

select s2.warehouse, s2.aisle, s2.aisle_no, s2.position, s2.loc, s2.item, s2.pick_qty
  from (
   select to_number(substr(s.loc,1,1)) warehouse
        , substr(s.loc,3,1) aisle
        , dense_rank() over (
             partition by to_number(substr(s.loc,1,1)) -- warehouse
             order by substr(s.loc,3,1)                -- aisle
          ) aisle_no
        , to_number(substr(s.loc,5,2)) position
        , s.loc, s.item
        , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
     from (
      select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
           , nvl(sum(i.qty) over (
                partition by i.item
                order by i.qty, i.loc
                rows between unbounded preceding and 1 preceding
             ), 0) sum_prv_qty
        from orderline o
        join inventory i
          on i.item = o.item
       where o.ordno = :pick_order
     ) s
    where s.sum_prv_qty < s.ord_qty
  ) s2
 order by s2.warehouse
        , s2.aisle_no
        , case when mod(s2.aisle_no,2) = 1 then s2.position else -s2.position end

Move the warehouse part from the order by to the partition by

Partition warehouse

WAREHOUSE AISLE AISLE_NO POSITION LOC ITEM PICK_QTY

--------- ----- -------- -------- ---------- ---------- --------

1 A 1 20 1-A-20 A1 3

1 A 1 31 1-A-31 A1 12

1 B 2 15 1-B-15 B1 2

1 B 2 11 1-B-11 B1 4

1 C 3 4 1-C-04 B1 11

2 D 1 7 2-D-07 A1 9

2 D 1 23 2-D-23 B1 1

Now the aisle_no is restarted for each warehouse, so the first visited aisle of a warehouse is always odd and therefore sorted ascending

Efficient picking route
• DENSE_RANK() to number the aisles in the order visited
• Order the output
  – ”Up” on odd aisles
  – ”Down” on even aisles
• Partition by warehouse if door is missing

Case 4: Picking efficiency

Picking efficiency
• How fast can operators pick items?
• How much do they wait idle for totes to arrive?

Table

create table missions (
   missionid  number primary key
 , loadunit   number
 , departpos  varchar2(10)
 , departtime date
 , arrivepos  varchar2(10)
 , arrivetime date
)
/

A mission is recorded every time a tote (loadunit) goes from one position to another position on the conveyor system

Mission data

insert into missions values ( 35986751, 10063485, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:00:03' );
insert into missions values ( 35986752, 10016906, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:01:41' );
insert into missions values ( 35986754, 10059580, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:01:09' );
insert into missions values ( 35986755, 10056277, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:01:16' );
insert into missions values ( 35986757, 10051547, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:02:40' );
...2690 inserts snipped out...
insert into missions values ( 35992214, 10064588, 'PLF4', timestamp '2012-04-12 11:13:20', 'STORE', timestamp '2012-04-12 11:15:12' );
insert into missions values ( 35992216, 10066518, 'PLF4', timestamp '2012-04-12 11:13:22', 'STORE', timestamp '2012-04-12 11:15:30' );
insert into missions values ( 35992219, 10082114, 'PLF4', timestamp '2012-04-12 11:13:43', 'STORE', timestamp '2012-04-12 11:15:35' );
insert into missions values ( 35992220, 10033235, 'PLF4', timestamp '2012-04-12 11:13:52', 'STORE', timestamp '2012-04-12 11:15:50' );
insert into missions values ( 35992223, 10056459, 'PLF4', timestamp '2012-04-12 11:14:59', 'STORE', timestamp '2012-04-12 11:21:03' );

Arrivals

select a.arrivepos pos
     , a.arrivetime time
     , a.loadunit
     , a.missionid
  from missions a
 where a.arrivepos in ('PLF4','PLF5')
   and a.arrivetime >= to_date('2012-04-12 08:00:00', 'YYYY-MM-DD HH24:MI:SS')
   and a.arrivetime <= to_date('2012-04-12 23:59:59', 'YYYY-MM-DD HH24:MI:SS')
 order by a.arrivepos, a.arrivetime

All missions arriving at picking stations PLF4 and PLF5 on April 12th after 08:00

Arrivals

POS  TIME     LOADUNIT  MISSIONID
---- -------- --------- ---------
PLF4 08:00:03 10063485  35986751
PLF4 08:00:11 10069588  35986762
PLF4 08:01:09 10059580  35986754
...
PLF4 12:47:51 10069370  35990243
PLF4 12:47:58 10026743  35990248
PLF4 12:49:06 10013439  35990250
PLF5 08:00:00 10040198  35987250
PLF5 08:00:07 10008351  35987251
PLF5 08:00:14 10068629  35987225
...
PLF5 11:28:47 10078376  35990936
PLF5 11:28:56 10035491  35990918
PLF5 11:29:07 10010287  35991015

1453 rows selected.

Departures

select d.departpos pos
     , d.departtime time
     , d.loadunit
     , d.missionid
  from missions d
 where d.departpos in ('PLF4','PLF5')
   and d.departtime >= to_date('2012-04-12 08:00:00', 'YYYY-MM-DD HH24:MI:SS')
   and d.departtime <= to_date('2012-04-12 23:59:59', 'YYYY-MM-DD HH24:MI:SS')
 order by d.departpos, d.departtime

All missions departing from picking stations PLF4 and PLF5 on April 12th after 08:00

Departures

POS  TIME     LOADUNIT  MISSIONID
---- -------- --------- ---------
PLF4 08:00:00 10067235  35988299
PLF4 08:00:08 10063485  35988300
PLF4 08:01:07 10069588  35988307
...
PLF4 11:13:43 10082114  35992219
PLF4 11:13:52 10033235  35992220
PLF4 11:14:59 10056459  35992223
PLF5 08:00:06 10040198  35988296
PLF5 08:00:13 10008351  35988302
PLF5 08:00:35 10068629  35988303
...
PLF5 11:08:36 10018796  35992157
PLF5 11:08:45 10058221  35992158
PLF5 11:09:00 10030575  35992159

1247 rows selected.

Combined events

select pos, time, ad, loadunit, missionid
  from (
        select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
          from missions a
         where a.arrivepos in ('PLF4','PLF5')
           and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
           and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
        union all
        select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
          from missions d
         where d.departpos in ('PLF4','PLF5')
           and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
           and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
       ) s1
 order by pos, time


POS  TIME     AD LOADUNIT  MISSIONID
---- -------- -- --------- ---------
PLF4 08:00:00 D  10067235  35988299
PLF4 08:00:03 A  10063485  35986751
PLF4 08:00:08 D  10063485  35988300
PLF4 08:00:11 A  10069588  35986762
PLF4 08:01:07 D  10069588  35988307
PLF4 08:01:09 A  10059580  35986754
PLF4 08:01:14 D  10059580  35988308
PLF4 08:01:16 A  10056277  35986755
PLF4 08:01:24 D  10056277  35988309
PLF4 08:01:26 A  10081310  35986764
PLF4 08:01:39 D  10081310  35988310
PLF4 08:01:41 A  10016906  35986752
...
2700 rows selected.

Joining arrivals and departures lets us see each loadunit arriving and, a little later, departing

Lead the next event

with s1 as (
   select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
     from missions a
    where a.arrivepos in ('PLF4','PLF5')
      and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
      and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
   union all
   select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
     from missions d
    where d.departpos in ('PLF4','PLF5')
      and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
      and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
)
select pos
     , time
     , lead(time) over (
          partition by pos order by time, missionid
       ) nexttime
     , ad
     , loadunit
  from s1
 order by pos, time

The analytic function lead() gives for each row the time of the next row


POS  TIME     NEXTTIME AD LOADUNIT
---- -------- -------- -- ---------
PLF4 08:00:00 08:00:03 D  10067235
PLF4 08:00:03 08:00:08 A  10063485
PLF4 08:00:08 08:00:11 D  10063485
PLF4 08:00:11 08:01:07 A  10069588
PLF4 08:01:07 08:01:09 D  10069588
PLF4 08:01:09 08:01:14 A  10059580
PLF4 08:01:14 08:01:16 D  10059580
PLF4 08:01:16 08:01:24 A  10056277
PLF4 08:01:24 08:01:26 D  10056277
PLF4 08:01:26 08:01:39 A  10081310
PLF4 08:01:39 08:01:41 D  10081310
PLF4 08:01:41 08:01:57 A  10016906
...
2700 rows selected.

So on each ’D’ row NEXTTIME is the time of the following ’A’ row

And on each ’A’ row NEXTTIME is the time of the following ’D’ row
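The effect of lead() over a partition can be sketched outside the database. The following Python helper is an illustrative re-implementation for pre-sorted rows, not Oracle internals; the `lead` function name and the miniature `events` list are made up for this sketch:

```python
from collections import defaultdict

def lead(rows, key, offset=1):
    """Per-partition lead(): for each row, attach the `key` value
    found `offset` rows further ahead in the same partition."""
    by_pos = defaultdict(list)
    for row in rows:                 # rows assumed pre-sorted by (pos, time)
        by_pos[row["pos"]].append(row)
    out = []
    for part in by_pos.values():
        for i, row in enumerate(part):
            nxt = part[i + offset][key] if i + offset < len(part) else None
            out.append({**row, f"next{offset}_{key}": nxt})
    return out

# three events from the PLF4 partition of the slide
events = [
    {"pos": "PLF4", "time": "08:00:00", "ad": "D"},
    {"pos": "PLF4", "time": "08:00:03", "ad": "A"},
    {"pos": "PLF4", "time": "08:00:08", "ad": "D"},
]
for r in lead(events, "time"):
    print(r["time"], r["ad"], r["next1_time"])
```

As in the SQL, the last row of each partition gets a null (`None`) lead value.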

Lead on

with s1 as (
   select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
     from missions a
    where a.arrivepos in ('PLF4','PLF5')
      and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
      and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
   union all
   select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
     from missions d
    where d.departpos in ('PLF4','PLF5')
      and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
      and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
)
select pos, time
     , lead(time) over (
          partition by pos order by time, missionid
       ) nexttime
     , lead(time,2) over (
          partition by pos order by time, missionid
       ) next2time
     , ad, loadunit
  from s1
 order by pos, time

lead() accepts a second parameter telling how many rows forward the function should ”look”


POS  TIME     NEXTTIME NEXT2TIM AD LOADUNIT
---- -------- -------- -------- -- ---------
PLF4 08:00:00 08:00:03 08:00:08 D  10067235
PLF4 08:00:03 08:00:08 08:00:11 A  10063485
PLF4 08:00:08 08:00:11 08:01:07 D  10063485
PLF4 08:00:11 08:01:07 08:01:09 A  10069588
PLF4 08:01:07 08:01:09 08:01:14 D  10069588
PLF4 08:01:09 08:01:14 08:01:16 A  10059580
PLF4 08:01:14 08:01:16 08:01:24 D  10059580
PLF4 08:01:16 08:01:24 08:01:26 A  10056277
PLF4 08:01:24 08:01:26 08:01:39 D  10056277
PLF4 08:01:26 08:01:39 08:01:41 A  10081310
PLF4 08:01:39 08:01:41 08:01:57 D  10081310
PLF4 08:01:41 08:01:57 08:01:59 A  10016906
...
2700 rows selected.

The NEXT2TIME column ”looks” 2 rows forward

Filter double lead

with s1 as (
   select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
     from missions a
    where a.arrivepos in ('PLF4','PLF5')
      and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
      and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
   union all
   select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
     from missions d
    where d.departpos in ('PLF4','PLF5')
      and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
      and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
)
select pos, time arrive, nexttime depart, next2time nextarrive, loadunit
  from (
        select pos, time
             , lead(time) over (
                  partition by pos order by time, missionid
               ) nexttime
             , lead(time,2) over (
                  partition by pos order by time, missionid
               ) next2time
             , ad, loadunit
          from s1
       ) s2
 where ad = 'A'
 order by pos, arrive

Since we use the double lead we now have all the data necessary on the ’A’ rows and do not need the ’D’ rows anymore


POS  ARRIVE   DEPART   NEXTARRI LOADUNIT
---- -------- -------- -------- ---------
PLF4 08:00:03 08:00:08 08:00:11 10063485
PLF4 08:00:11 08:01:07 08:01:09 10069588
PLF4 08:01:09 08:01:14 08:01:16 10059580
PLF4 08:01:16 08:01:24 08:01:26 10056277
PLF4 08:01:26 08:01:39 08:01:41 10081310
PLF4 08:01:41 08:01:57 08:01:59 10016906
...
PLF4 10:59:47 10:59:54 10:59:56 10076144
PLF4 10:59:56 11:00:11 11:00:12 10012882
PLF4 11:00:12 11:00:28 11:00:29 10035898
PLF4 11:00:29 11:00:42 11:00:44 10076793
...
1453 rows selected.

We can now see a tote arrives 08:00:03, leaves again at 08:00:08, and a new tote arrives at 08:00:11

Note the tote that arrived 10:59:56 leaves after 11:00:00

Pick and wait

with s1 as ( ... )
select pos, arrive, depart, nextarrive
     , (depart - arrive) * 24 * 60 * 60 pickseconds
     , (nextarrive - depart) * 24 * 60 * 60 waitseconds
  from (
        select pos, time arrive, nexttime depart, next2time nextarrive, loadunit
          from (
                select pos, time
                     , lead(time) over (
                          partition by pos order by time, missionid
                       ) nexttime
                     , lead(time,2) over (
                          partition by pos order by time, missionid
                       ) next2time
                     , ad, loadunit
                  from s1
               ) s2
         where ad = 'A'
       ) s3
 where arrive >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
   and arrive <= to_date('2012-04-12 10:59:59','YYYY-MM-DD HH24:MI:SS')
 order by pos, arrive

Calculate pick seconds and wait seconds

Filter on desired 3 hour interval


POS  ARRIVE   DEPART   NEXTARRI PICKSECONDS WAITSECONDS
---- -------- -------- -------- ----------- -----------
PLF4 08:00:03 08:00:08 08:00:11           5           3
PLF4 08:00:11 08:01:07 08:01:09          56           2
PLF4 08:01:09 08:01:14 08:01:16           5           2
PLF4 08:01:16 08:01:24 08:01:26           8           2
...
PLF4 08:58:08 08:58:08 09:11:36           0         808
PLF4 09:11:36 09:12:55 09:12:56          79           1
...
PLF4 10:59:47 10:59:54 10:59:56           7           2
PLF4 10:59:56 11:00:11 11:00:12          15           1
PLF5 08:00:00 08:00:06 08:00:07           6           1
PLF5 08:00:07 08:00:13 08:00:14           6           1
...
PLF5 10:57:54 10:59:58 10:59:59         124           1
PLF5 10:59:59 11:00:09 11:00:10          10           1

1155 rows selected.

How fast did the operator pick, and how long did he wait for a new tote to arrive?
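Oracle date subtraction yields a difference in days, hence the `* 24 * 60 * 60` to get seconds. A minimal Python sketch of the same arithmetic, using the first PLF4 row from the slide:

```python
from datetime import datetime

def secs(a, b):
    """Seconds between two timestamps - the Oracle (d2 - d1) * 24 * 60 * 60."""
    return (b - a).total_seconds()

fmt = "%Y-%m-%d %H:%M:%S"
arrive     = datetime.strptime("2012-04-12 08:00:03", fmt)
depart     = datetime.strptime("2012-04-12 08:00:08", fmt)
nextarrive = datetime.strptime("2012-04-12 08:00:11", fmt)

pickseconds = secs(arrive, depart)       # operator picking from the tote
waitseconds = secs(depart, nextarrive)   # operator waiting for the next tote
print(pickseconds, waitseconds)          # 5.0 3.0
```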

Hourly stats

with s1 as ( ... )
select pos
     , trunc(arrive,'HH24') hour
     , count(*) picks
     , avg(pickseconds) secondsprpick
     , sum(pickseconds)/60 minutespicked
     , 100*sum(pickseconds)/sum(pickseconds+waitseconds) pickpct
     , avg(waitseconds) secondsprwait
     , sum(waitseconds)/60 minuteswaited
     , 100*sum(waitseconds)/sum(pickseconds+waitseconds) waitpct
     , avg(pickseconds+waitseconds) secondsprcycle
     , sum(pickseconds+waitseconds)/60 minutestotal
     , 60 * count(*) / sum(pickseconds+waitseconds) cyclesprmin
  from (
        select pos, arrive, depart, nextarrive
             , (depart - arrive) * 24 * 60 * 60 pickseconds
             , (nextarrive - depart) * 24 * 60 * 60 waitseconds
          from (
                select pos, time arrive, nexttime depart, next2time nextarrive, loadunit
                  from (
                        select pos, time
                             , lead(time) over (
                                  partition by pos order by time, missionid
                               ) nexttime
                             , lead(time,2) over (
                                  partition by pos order by time, missionid
                               ) next2time
                             , ad, loadunit
                          from s1
                       ) s2
                 where ad = 'A'
               ) s3
         where arrive >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
           and arrive <= to_date('2012-04-12 10:59:59','YYYY-MM-DD HH24:MI:SS')
       ) s4
 group by pos, trunc(arrive,'HH24')
 order by pos, trunc(arrive,'HH24')

Now we can use the previous select as the basis for some plain statistics by the hour


POS   HOUR      PICKS  SEC/PICK  MIN PICKED  PICK%  SEC/WAIT  MIN WAITED  WAIT%  SEC/CYCLE  MIN TOTAL  CYCLES/MIN
----- --------  -----  --------  ----------  -----  --------  ----------  -----  ---------  ---------  ----------
PLF4  08:00:00    156      20.3        52.9   73.9       7.2        18.7   26.1       27.5       71.6         2.2
PLF4  09:00:00    159      13.2        35.0   71.9       5.1        13.6   28.1       18.3       48.6         3.3
PLF4  10:00:00    165      19.8        54.5   90.8       2.0         5.5    9.2       21.8       60.0         2.8
PLF5  08:00:00    247      12.9        53.2   85.3       2.2         9.2   14.7       15.2       62.4         4.0
PLF5  09:00:00    179      15.9        47.4   82.3       3.4        10.2   17.7       19.3       57.6         3.1
PLF5  10:00:00    249      10.9        45.4   75.3       3.6        14.9   24.7       14.5       60.4         4.1

6 rows selected.

Picking efficiency
• Log over tote missions arriving at and departing from the picking stations
• LEAD() on the mission log to find the departure following an arrival => picking time
• LEAD(,2) on the mission log to find the arrival following a departure => waiting time

Case 5: Forecasting sales

Forecasting sales
• Forecast the sales of next year
• But follow the trend of the item

Table

create table sales
( item varchar2(10)
, mth  date
, qty  number
)
/

Simple table of monthly sales by item

Data

insert into sales values ('Snowchain', date '2008-01-01', 79);
insert into sales values ('Snowchain', date '2008-02-01', 133);
insert into sales values ('Snowchain', date '2008-03-01', 24);
...
insert into sales values ('Snowchain', date '2010-10-01', 1);
insert into sales values ('Snowchain', date '2010-11-01', 73);
insert into sales values ('Snowchain', date '2010-12-01', 160);
insert into sales values ('Sunshade' , date '2008-01-01', 4);
insert into sales values ('Sunshade' , date '2008-02-01', 6);
insert into sales values ('Sunshade' , date '2008-03-01', 32);
...
insert into sales values ('Sunshade' , date '2010-10-01', 11);
insert into sales values ('Sunshade' , date '2010-11-01', 3);
insert into sales values ('Sunshade' , date '2010-12-01', 5);

Item Snowchain sells well in winter and trends up

Item Sunshade sells well in summer and trends down

Slope

select sales.item
     , sales.mth
     , sales.qty
     , regr_slope(
          sales.qty
        , extract(year from sales.mth) * 12 + extract(month from sales.mth)
       ) over (
          partition by sales.item
          order by sales.mth
          range between interval '23' month preceding and current row
       ) slope
  from sales
 order by sales.item, sales.mth

Graph slope: the y-axis is qty, the x-axis is a number with a scale of 1 = one month
Range between gives a sliding 2-year window
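REGR_SLOPE is the least-squares slope, covariance(x, y) / variance(x). An illustrative Python re-implementation of that formula (the `regr_slope` helper and the sample points are made up, with a known slope of 10 qty per month):

```python
def regr_slope(pairs):
    """Least-squares slope over (y, x) pairs - the formula behind
    Oracle's REGR_SLOPE: covariance(x, y) / variance(x)."""
    n = len(pairs)
    sx  = sum(x for _, x in pairs)
    sy  = sum(y for y, _ in pairs)
    sxy = sum(x * y for y, x in pairs)
    sxx = sum(x * x for _, x in pairs)
    return (sxy - sx * sy / n) / (sxx - sx * sx / n)

# (qty, month_number) with months numbered on a 1 = one month scale,
# qty rising by roughly 10 per month
pts = [(10, 1), (22, 2), (29, 3), (41, 4)]
print(regr_slope(pts))
```

Feeding it a sliding two-year window of (qty, month number) pairs per item mimics the windowed REGR_SLOPE in the query.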


ITEM       MTH          QTY    SLOPE
---------- ---------- ----- --------
Snowchain  2008-01-01    79
Snowchain  2008-02-01   133   54.000
Snowchain  2008-03-01    24  -27.500
...
Snowchain  2010-10-01     1   -2.274
Snowchain  2010-11-01    73   -2.363
Snowchain  2010-12-01   160    -.991
Sunshade   2008-01-01     4
Sunshade   2008-02-01     6    2.000
Sunshade   2008-03-01    32   14.000
...
Sunshade   2010-10-01    11     .217
Sunshade   2010-11-01     3    -.200
Sunshade   2010-12-01     5    -.574

72 rows selected.

The slope value is most accurate for the 2010 data, where the 2-year sliding window contains a full set of data

Transpose using slope

select item, mth, qty
     , qty + 12 * slope qty_next_year
  from (
        select sales.item, sales.mth, sales.qty
             , regr_slope(
                  sales.qty
                , extract(year from sales.mth) * 12 + extract(month from sales.mth)
               ) over (
                  partition by sales.item
                  order by sales.mth
                  range between interval '23' month preceding and current row
               ) slope
          from sales
       )
 where mth >= date '2010-01-01'
 order by item, mth

As the x-axis had a scale of 1 = one month and the y-axis was qty, multiplying the slope by 12 gives how much qty goes up or down in a year


ITEM MTH QTY QTY_NEXT_YEAR

---------- ---------- ----- -------------

Snowchain 2010-01-01 167 188,313043

Snowchain 2010-02-01 247 304,855652

Snowchain 2010-03-01 42 96,3913043

Snowchain 2010-04-01 0 42,6991304

Snowchain 2010-05-01 0 30,8869565

Snowchain 2010-06-01 0 19,0747826

Snowchain 2010-07-01 0 7,2626087

Snowchain 2010-08-01 1 -3,4295652

Snowchain 2010-09-01 0 -16,121739

Snowchain 2010-10-01 1 -26,292174

Snowchain 2010-11-01 73 44,6434783

Snowchain 2010-12-01 160 148,109565

Sunshade 2010-01-01 2 -11,617391

Sunshade 2010-02-01 8 -11,137391

Sunshade 2010-03-01 28 9,11304348

Sunshade 2010-04-01 26 8,86086957

Sunshade 2010-05-01 23 9,66434783

Sunshade 2010-06-01 46 39,1130435

Sunshade 2010-07-01 73 79,4486957

Sunshade 2010-08-01 25 31,7147826

Sunshade 2010-09-01 13 18,0504348

Sunshade 2010-10-01 11 13,6086957

Sunshade 2010-11-01 3 ,594782609

Sunshade 2010-12-01 5 -1,8869565

Forecast

select item
     , add_months(mth, 12) mth
     , greatest(round(qty + 12 * slope), 0) forecast
  from (
        select sales.item, sales.mth, sales.qty
             , regr_slope(
                  sales.qty
                , extract(year from sales.mth) * 12 + extract(month from sales.mth)
               ) over (
                  partition by sales.item
                  order by sales.mth
                  range between interval '23' month preceding and current row
               ) slope
          from sales
       )
 where mth >= date '2010-01-01'
 order by item, mth

Rather than column QTY_NEXT_YEAR we add a year to the month and call it a forecast

We round the numbers and skip any negatives
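The per-month transformation is just qty shifted by a year's worth of trend, rounded, and floored at zero. A small Python sketch; the slope value below is derived from the slide's QTY_NEXT_YEAR output (188.313043 for Snowchain January), not computed here:

```python
def forecast(qty, slope, months=12):
    """greatest(round(qty + 12 * slope), 0): shift last year's actual
    by a year's worth of trend, round, and clip negatives to zero."""
    return max(round(qty + months * slope), 0)

print(forecast(167, 1.7760869565))  # Snowchain January: slope derived from slide
print(forecast(1, -2.274))          # steep negative trend clipped to 0
```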


ITEM MTH FORECAST

---------- ---------- ---------

Snowchain 2011-01-01 188

Snowchain 2011-02-01 305

Snowchain 2011-03-01 96

Snowchain 2011-04-01 43

Snowchain 2011-05-01 31

Snowchain 2011-06-01 19

Snowchain 2011-07-01 7

Snowchain 2011-08-01 0

Snowchain 2011-09-01 0

Snowchain 2011-10-01 0

Snowchain 2011-11-01 45

Snowchain 2011-12-01 148

Sunshade 2011-01-01 0

Sunshade 2011-02-01 0

Sunshade 2011-03-01 9

Sunshade 2011-04-01 9

Sunshade 2011-05-01 10

Sunshade 2011-06-01 39

Sunshade 2011-07-01 79

Sunshade 2011-08-01 32

Sunshade 2011-09-01 18

Sunshade 2011-10-01 14

Sunshade 2011-11-01 1

Sunshade 2011-12-01 0

Actual + forecast

select item, mth, qty, type
  from (
        select sales.item, sales.mth, sales.qty, 'Actual' type
          from sales
        union all
        select item
             , add_months(mth, 12) mth
             , greatest(round(qty + 12 * slope), 0) qty
             , 'Forecast' type
          from (
                select sales.item, sales.mth, sales.qty
                     , regr_slope(
                          sales.qty
                        , extract(year from sales.mth) * 12 + extract(month from sales.mth)
                       ) over (
                          partition by sales.item
                          order by sales.mth
                          range between interval '23' month preceding and current row
                       ) slope
                  from sales
               )
         where mth >= date '2010-01-01'
       )
 order by item, mth

UNION ALL of the actual data and the forecast data for a complete set of sales data that can be shown in a graph


ITEM MTH QTY TYPE

---------- ---------- ----- ----------

Snowchain 2008-01-01 79 Actual

Snowchain 2008-02-01 133 Actual

Snowchain 2008-03-01 24 Actual
...
Snowchain 2010-10-01 1 Actual

Snowchain 2010-11-01 73 Actual

Snowchain 2010-12-01 160 Actual

Snowchain 2011-01-01 188 Forecast

Snowchain 2011-02-01 305 Forecast

Snowchain 2011-03-01 96 Forecast
...
Snowchain 2011-10-01 0 Forecast

Snowchain 2011-11-01 45 Forecast

Snowchain 2011-12-01 148 Forecast

Sunshade 2008-01-01 4 Actual

Sunshade 2008-02-01 6 Actual

Sunshade 2008-03-01 32 Actual
...
Sunshade 2010-10-01 11 Actual

Sunshade 2010-11-01 3 Actual

Sunshade 2010-12-01 5 Actual

Sunshade 2011-01-01 0 Forecast

Sunshade 2011-02-01 0 Forecast

Sunshade 2011-03-01 9 Forecast
...
Sunshade 2011-10-01 14 Forecast

Sunshade 2011-11-01 1 Forecast

Sunshade 2011-12-01 0 Forecast

Actual + forecast
Data from the previous slide shown as a graph:
• ”Actual” is solid lines
• ”Forecast” is dashed lines

Forecasting sales
• REGR_SLOPE() to calculate the trend
• RANGE window for a sliding trend calculation over two years
• ”Transpose” last year's sales by the slope to get next year's forecast

Case 6: Forecast zero stock

Forecast zero stock
• Fireworks sell like crazy the last week of December
• What hour will a store run out of stock?

Tables

create table fw_store
( shopid     varchar2(10) primary key
, containers integer
)
/

create table fw_sales
( shopid    varchar2(10) references fw_store (shopid)
, saleshour date
, salesnem  number
)
/

Stores are defined by how many storage containers they have

Sales are hourly data per shop in Net Explosive Mass (NEM)


create table fw_daybudget
( shopid     varchar2(10) references fw_store (shopid)
, budgetdate date
, budgetnem  number
)
/

create table fw_hourbudget
( hour    integer
, percent number
)
/

Daily budget of Net Explosive Mass per shop

Percentage of a day's budget expected to fall in each hour

Data - store

insert into fw_store values ('AALBORG'  , 4);
insert into fw_store values ('GLOSTRUP' , 4);
insert into fw_store values ('HADERSLEV', 3);

Data - sales

insert into fw_sales
select shopid, day + numtodsinterval(hour,'hour') saleshour, salesnem
  from (
        select 'AALBORG' shopid, date '2011-12-27' day,
               4 h9, 6 h10, 5 h11, 20 h12, 19 h13, 22 h14, 27 h15, 11 h16, 16 h17, 4 h18 from dual
        union all select 'AALBORG'  , date '2011-12-28', 7, 17, 18, 13, 27, 28, 20, 14, 10, 19 from dual
        union all select 'AALBORG'  , date '2011-12-29', 10, 14, 20, null, null, null, null, null, null, null from dual
        union all select 'GLOSTRUP' , date '2011-12-27', 1, 6, 6, 14, 17, 17, 13, 15, 7, 7 from dual
        union all select 'GLOSTRUP' , date '2011-12-28', 4, 14, 30, 35, 22, 21, 35, 34, 15, 25 from dual
        union all select 'GLOSTRUP' , date '2011-12-29', 6, 13, 50, null, null, null, null, null, null, null from dual
        union all select 'HADERSLEV', date '2011-12-27', 4, 7, 13, 15, 17, 13, 18, 19, 10, 3 from dual
        union all select 'HADERSLEV', date '2011-12-28', 8, 5, 14, 18, 20, 18, 15, 24, 12, 1 from dual
        union all select 'HADERSLEV', date '2011-12-29', 1, 19, 33, null, null, null, null, null, null, null from dual
       ) s1
unpivot exclude nulls (
   salesnem for hour in (
      h9 as 9, h10 as 10, h11 as 11, h12 as 12, h13 as 13,
      h14 as 14, h15 as 15, h16 as 16, h17 as 17, h18 as 18
   )
)

Data - daybudget

insert into fw_daybudget values ('AALBORG'  , date '2011-12-27', 150);
insert into fw_daybudget values ('AALBORG'  , date '2011-12-28', 200);
insert into fw_daybudget values ('AALBORG'  , date '2011-12-29', 300);
insert into fw_daybudget values ('AALBORG'  , date '2011-12-30', 500);
insert into fw_daybudget values ('AALBORG'  , date '2011-12-31', 400);
insert into fw_daybudget values ('GLOSTRUP' , date '2011-12-27', 150);
insert into fw_daybudget values ('GLOSTRUP' , date '2011-12-28', 200);
insert into fw_daybudget values ('GLOSTRUP' , date '2011-12-29', 300);
insert into fw_daybudget values ('GLOSTRUP' , date '2011-12-30', 500);
insert into fw_daybudget values ('GLOSTRUP' , date '2011-12-31', 400);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-27', 100);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-28', 150);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-29', 200);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-30', 400);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-31', 300);

Data - hourbudget

insert into fw_hourbudget values ( 9,  4);
insert into fw_hourbudget values (10,  8);
insert into fw_hourbudget values (11, 10);
insert into fw_hourbudget values (12, 12);
insert into fw_hourbudget values (13, 12);
insert into fw_hourbudget values (14, 12);
insert into fw_hourbudget values (15, 14);
insert into fw_hourbudget values (16, 14);
insert into fw_hourbudget values (17, 10);
insert into fw_hourbudget values (18,  4);

Starting NEM

select s.shopid
     , s.containers * 250 startnem
  from fw_store s
 order by s.shopid

SHOPID     STARTNEM
---------- --------
AALBORG        1000
GLOSTRUP       1000
HADERSLEV       750

Three stores:
two have 4 containers (= 1000 kg NEM),
one has 3 containers (= 750 kg NEM)

Budget per hour

select db.shopid
     , db.budgetdate + numtodsinterval(hb.hour,'hour') budgethour
     , db.budgetnem * hb.percent / 100 budgetnem
  from fw_daybudget db
 cross join fw_hourbudget hb
 order by db.shopid, budgethour

Cartesian join of daily budget with hour percentages gives us an hourly budget
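A Cartesian join simply pairs every daily-budget row with every hour-percentage row. A minimal Python sketch of the same pairing, using a truncated subset of the document's budget data:

```python
# Truncated subsets of the fw_daybudget and fw_hourbudget data
fw_daybudget  = [("AALBORG", "2011-12-27", 150), ("AALBORG", "2011-12-28", 200)]
fw_hourbudget = [(9, 4), (10, 8), (11, 10)]   # (hour, percent)

# cross join: every daily row paired with every hourly percentage row
hourly = [
    (shop, f"{day} {hour:02d}:00:00", nem * pct / 100)
    for shop, day, nem in fw_daybudget
    for hour, pct in fw_hourbudget
]
for row in hourly:
    print(row)
```

The first row, 4% of AALBORG's 150 kg day budget, matches the 6 kg shown for 09:00 on the next slide.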


SHOPID     BUDGETHOUR          BUDGETNEM
---------- ------------------- ---------
AALBORG    2011-12-27 09:00:00         6
AALBORG    2011-12-27 10:00:00        12
AALBORG    2011-12-27 11:00:00        15
...
AALBORG    2011-12-31 15:00:00        56
AALBORG    2011-12-31 16:00:00        56
AALBORG    2011-12-31 17:00:00        40
AALBORG    2011-12-31 18:00:00        16
GLOSTRUP   2011-12-27 09:00:00         6
GLOSTRUP   2011-12-27 10:00:00        12
GLOSTRUP   2011-12-27 11:00:00        15
...
HADERSLEV  2011-12-31 16:00:00        42
HADERSLEV  2011-12-31 17:00:00        30
HADERSLEV  2011-12-31 18:00:00        12

150 rows selected.

This hourly budget data is now directly comparable to the hourly sales data

WITH clauses

with shop as (
   select s.shopid
        , s.containers * 250 startnem
     from fw_store s
), budget as (
   select db.shopid
        , db.budgetdate + numtodsinterval(hb.hour,'hour') budgethour
        , db.budgetnem * hb.percent / 100 budgetnem
     from fw_daybudget db
    cross join fw_hourbudget hb
)
...

Use the starting NEM and hourly budget selects as WITH clauses

Budget + sales

... Shop and Budget WITH clauses ...
select budget.shopid, shop.startnem, budget.budgethour hour
     , budget.budgetnem
     , sum(budget.budgetnem) over (
          partition by budget.shopid order by budget.budgethour
          rows between unbounded preceding and current row
       ) budgetnemacc
     , sales.salesnem
     , sum(sales.salesnem) over (
          partition by budget.shopid order by budget.budgethour
          rows between unbounded preceding and current row
       ) salesnemacc
  from shop
  join budget
    on budget.shopid = shop.shopid
  left outer join fw_sales sales
    on sales.shopid = budget.shopid
   and sales.saleshour = budget.budgethour
 order by budget.shopid, budget.budgethour

Join shop and budget, outer join to sales – then we can accumulate both budget and sales


SHOPID   STARTNEM HOUR                BUDGETNEM BUDGETNEMACC SALESNEM SALESNEMACC
-------- -------- ------------------- --------- ------------ -------- -----------
AALBORG      1000 2011-12-27 09:00:00         6            6        4           4
AALBORG      1000 2011-12-27 10:00:00        12           18        6          10
AALBORG      1000 2011-12-27 11:00:00        15           33        5          15
AALBORG      1000 2011-12-27 12:00:00        18           51       20          35
...
AALBORG      1000 2011-12-29 10:00:00        24          386       14         331
AALBORG      1000 2011-12-29 11:00:00        30          416       20         351
AALBORG      1000 2011-12-29 12:00:00        36          452                  351
AALBORG      1000 2011-12-29 13:00:00        36          488                  351
...
AALBORG      1000 2011-12-31 15:00:00        56         1438                  351
AALBORG      1000 2011-12-31 16:00:00        56         1494                  351
AALBORG      1000 2011-12-31 17:00:00        40         1534                  351
AALBORG      1000 2011-12-31 18:00:00        16         1550                  351

”Now” is December 29th at exactly 12:00, so sales data stops there
The accumulated data show we are behind budget

Yet another WITH

... Shop and Budget WITH clauses ...
), nem as (
   select budget.shopid, shop.startnem, budget.budgethour hour
        , case
             when budget.budgethour < to_date('2011-12-29 12:00:00','YYYY-MM-DD HH24:MI:SS')
             then 'S'
             else 'B'
          end salesbudget
        , case
             when budget.budgethour < to_date('2011-12-29 12:00:00','YYYY-MM-DD HH24:MI:SS')
             then nvl(sales.salesnem,0)
             else budget.budgetnem
          end qtynem
     from shop
     join budget
       on budget.shopid = shop.shopid
     left outer join fw_sales sales
       on sales.shopid = budget.shopid
      and sales.saleshour = budget.budgethour
)

Real code would use SYSDATE rather than a hardcoded timestamp

qtynem contains actual sales for as long as we have it, and budget data after ”now”

Stock level

... Shop, Budget and Nem WITH clauses ...
select nem.shopid
     , nem.hour
     , nem.salesbudget
     , nem.qtynem
     , sum(nem.qtynem) over (
          partition by nem.shopid order by nem.hour
          rows between unbounded preceding and current row
       ) sumnem
     , greatest(nem.startnem - nvl(
          sum(nem.qtynem) over (
             partition by nem.shopid order by nem.hour
             rows between unbounded preceding and 1 preceding
          )
       ,0),0) stocknem
  from nem
 order by shopid, hour

Accumulate qtynem similarly to the FIFO code and subtract it from startnem to calculate the stock at the beginning of each hour
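The "rows between unbounded preceding and 1 preceding" window is a running total that excludes the current row, so each row sees the consumption of all earlier hours only. A minimal Python sketch of that offset accumulation, with the first four AALBORG hourly quantities:

```python
from itertools import accumulate

qtynem   = [4, 6, 5, 20]   # hourly consumption (sales before "now", budget after)
startnem = 1000            # 4 containers of 250 kg NEM

# running total *excluding* the current row:
# "rows between unbounded preceding and 1 preceding"
prior = [0] + list(accumulate(qtynem))[:-1]

# stock at the *beginning* of each hour, floored at zero like greatest(...,0)
stocknem = [max(startnem - p, 0) for p in prior]
print(stocknem)
```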


SHOPID   HOUR                SB QTYNEM SUMNEM STOCKNEM
-------- ------------------- -- ------ ------ --------
AALBORG  2011-12-27 09:00:00 S       4      4     1000
AALBORG  2011-12-27 10:00:00 S       6     10      996
AALBORG  2011-12-27 11:00:00 S       5     15      990
...
AALBORG  2011-12-29 10:00:00 S      14    331      683
AALBORG  2011-12-29 11:00:00 S      20    351      669
AALBORG  2011-12-29 12:00:00 B      36    387      649
AALBORG  2011-12-29 13:00:00 B      36    423      613
...
AALBORG  2011-12-30 15:00:00 B      70    945      125
AALBORG  2011-12-30 16:00:00 B      70   1015       55
AALBORG  2011-12-30 17:00:00 B      50   1065        0
AALBORG  2011-12-30 18:00:00 B      20   1085        0
...
AALBORG  2011-12-31 17:00:00 B      40   1469        0
AALBORG  2011-12-31 18:00:00 B      16   1485        0

At the beginning of hour 16 on December 30th, there will be 55 kg NEM left
During that hour we expect to sell 70 kg and will run out

The hour of zero stock

... Shop, Budget and Nem WITH clauses ...
select shopid
     , max(hour) + numtodsinterval(
          max(stocknem) keep (dense_rank last order by hour)
        / max(qtynem) keep (dense_rank last order by hour)
       ,'hour') zerohour
  from (
        select nem.shopid, nem.hour, nem.salesbudget, nem.qtynem
             , sum(nem.qtynem) over (
                  partition by nem.shopid order by nem.hour
                  rows between unbounded preceding and current row
               ) sumnem
             , greatest(nem.startnem - nvl(
                  sum(nem.qtynem) over (
                     partition by nem.shopid order by nem.hour
                     rows between unbounded preceding and 1 preceding
                  )
               ,0),0) stocknem
          from nem
       )
 where stocknem > 0
 group by shopid
 order by shopid

The last hour we still have stock
The stock we have left divided by the qty expected to be sold that hour gives the last fraction of an hour before we reach zero
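The fraction-of-hour arithmetic can be checked in Python with the AALBORG numbers from the previous slide (55 kg left at 16:00, 70 kg expected sold in that hour):

```python
from datetime import datetime, timedelta

last_hour = datetime(2011, 12, 30, 16, 0, 0)  # last hour with stock left
stocknem, qtynem = 55, 70                     # from the AALBORG slide

# numtodsinterval(stock / qty, 'hour') added to the last stocked hour
zerohour = last_hour + timedelta(hours=stocknem / qtynem)
print(zerohour.strftime("%Y-%m-%d %H:%M:%S"))
```

55/70 of an hour is 47 minutes and about 8.6 seconds, matching the 16:47:08 shown for AALBORG on the next slide.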


SHOPID ZEROHOUR

---------- -------------------

AALBORG 2011-12-30 16:47:08

GLOSTRUP 2011-12-30 15:59:08

HADERSLEV 2011-12-30 15:58:55

And so our logistics planner has a forecast for when the shops will run out of fireworks

Model the same

... Shop, Budget and Nem WITH clauses ...
select shopid, rn, hour, startnem, salesbudget, qtynem, sumnem, stocknem, zerohour
  from nem
 model
    partition by (shopid)
    dimension by (rn)
    measures (
       hour, startnem, salesbudget, qtynem,
       qtynem sumnem, startnem stocknem, cast(null as date) zerohour
    )
    rules sequential order iterate (49) (
       sumnem[iteration_number+1] =
          sumnem[iteration_number] + qtynem[iteration_number],
       stocknem[iteration_number+1] =
          stocknem[iteration_number] - qtynem[iteration_number]
          + case
               when qtynem[iteration_number] > stocknem[iteration_number]
               then startnem[0]
               else 0
            end,
       zerohour[iteration_number+1] =
          case
             when qtynem[iteration_number+1] > stocknem[iteration_number+1]
             then hour[iteration_number+1]
                  + numtodsinterval(stocknem[iteration_number+1]
                                    / qtynem[iteration_number+1], 'hour')
             else null
          end
    )
 order by shopid, hour


SHOPID   RN  HOUR                STARTNEM SB QTYNEM SUMNEM STOCKNEM ZEROHOUR
-------- --- ------------------- -------- -- ------ ------ -------- -------------------
AALBORG    0 2011-12-27 09:00:00     1000 S       4      4     1000
AALBORG    1 2011-12-27 10:00:00     1000 S       6      8      996
AALBORG    2 2011-12-27 11:00:00     1000 S       5     14      990
...
AALBORG   35 2011-12-30 14:00:00     1000 B      60    819      185
AALBORG   36 2011-12-30 15:00:00     1000 B      70    879      125
AALBORG   37 2011-12-30 16:00:00     1000 B      70    949       55 2011-12-30 16:47:08
AALBORG   38 2011-12-30 17:00:00     1000 B      50   1019      985
AALBORG   39 2011-12-30 18:00:00     1000 B      20   1069      935
...
AALBORG   48 2011-12-31 17:00:00     1000 B      40   1433      571
AALBORG   49 2011-12-31 18:00:00     1000 B      16   1473      531

Change the law

with shop as (
   select s.shopid
        , s.containers * 100 startnem
     from fw_store s
), budget as (...), nem as (...)
select shopid, rn, hour, startnem, salesbudget, qtynem, sumnem, stocknem, zerohour
  from nem
 model
 ...

If politicians decide a container may no longer hold 250 kg NEM, but only 100 kg NEM


SHOPID   RN  HOUR                STARTNEM SB QTYNEM SUMNEM STOCKNEM ZEROHOUR
-------- --- ------------------- -------- -- ------ ------ -------- -------------------
AALBORG    0 2011-12-27 09:00:00      400 S       4      4      400
AALBORG    1 2011-12-27 10:00:00      400 S       6      8      396
...
AALBORG   23 2011-12-29 12:00:00      400 B      36    355       49
AALBORG   24 2011-12-29 13:00:00      400 B      36    391       13 2011-12-29 13:21:40
AALBORG   25 2011-12-29 14:00:00      400 B      36    427      377
...
AALBORG   33 2011-12-30 12:00:00      400 B      60    699      105
AALBORG   34 2011-12-30 13:00:00      400 B      60    759       45 2011-12-30 13:45:00
AALBORG   35 2011-12-30 14:00:00      400 B      60    819      385
...
AALBORG   42 2011-12-31 11:00:00      400 B      40   1137       67
AALBORG   43 2011-12-31 12:00:00      400 B      48   1177       27 2011-12-31 12:33:45
AALBORG   44 2011-12-31 13:00:00      400 B      48   1225      379
...
AALBORG   49 2011-12-31 18:00:00      400 B      16   1473      131

Using the MODEL clause allows the forecast to include repeated refills of stock
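The iterative refill logic of the MODEL rules can be mirrored in a plain loop: whenever an hour's demand exceeds the remaining stock, record the zero hour and refill. This Python sketch mirrors that logic under simplified assumptions (a single partition, refilling by the full starting amount); the hours and quantities below are made up:

```python
from datetime import datetime, timedelta

def zero_hours(hours, qtynem, startnem):
    """Mirror of the MODEL rules: walk the hours, and whenever demand
    exceeds remaining stock, record the zero hour and refill."""
    stock, zeros = startnem, []
    for hour, qty in zip(hours, qtynem):
        if qty > stock:
            # fraction of the hour until stock hits zero, then refill
            zeros.append(hour + timedelta(hours=stock / qty))
            stock += startnem
        stock -= qty
    return zeros

hours = [datetime(2011, 12, 27, 9) + timedelta(hours=h) for h in range(4)]
for z in zero_hours(hours, [40, 70, 60, 50], 100):
    print(z)
```

With 100 kg to start, demand of 70 kg in the second hour and 50 kg in the fourth each empty the stock mid-hour, so two zero hours are reported.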

Zero hours

... Shop, Budget and Nem WITH clauses ...
select shopid, zerohour
  from (
        select shopid, rn, hour, startnem, salesbudget, qtynem, sumnem, stocknem, zerohour
          from nem
         model
            partition by (shopid)
            dimension by (rn)
            measures (
               hour, startnem, salesbudget, qtynem,
               qtynem sumnem, startnem stocknem, cast(null as date) zerohour
            )
            rules sequential order iterate (49) (
               sumnem[iteration_number+1] =
                  sumnem[iteration_number] + qtynem[iteration_number],
               stocknem[iteration_number+1] =
                  stocknem[iteration_number] - qtynem[iteration_number]
                  + case
                       when qtynem[iteration_number] > stocknem[iteration_number]
                       then startnem[0]
                       else 0
                    end,
               zerohour[iteration_number+1] =
                  case
                     when qtynem[iteration_number+1] > stocknem[iteration_number+1]
                     then hour[iteration_number+1]
                          + numtodsinterval(stocknem[iteration_number+1]
                                            / qtynem[iteration_number+1], 'hour')
                     else null
                  end
            )
       )
 where zerohour is not null
 order by shopid, zerohour

Zero hours

SHOPID ZEROHOUR

---------- -------------------

AALBORG 2011-12-29 13:21:40

AALBORG 2011-12-30 13:45:00

AALBORG 2011-12-31 12:33:45

GLOSTRUP 2011-12-29 11:51:36

GLOSTRUP 2011-12-30 12:49:00

GLOSTRUP 2011-12-31 11:16:30

HADERSLEV 2011-12-29 11:47:16

HADERSLEV 2011-12-30 13:01:15

HADERSLEV 2011-12-31 11:02:00

With the smaller amount in the containers, we need to refill the shops multiple times


Forecast zero stock
• SUM() on budget sales data from ”now” forward
• Identify hour when rolling sum exceeds stock
  – ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
  (Similar technique as picking by FIFO)
• More than analytics:
  – MODEL clause for repeated refill of stock

Multi-order FIFO picking
Case 7

Multi-order FIFO
Monty Latiolais, president of ODTUG:
• Need to pick multiple orders
• Each order by First-In-First-Out
• Second order ”continues” where first order stops, and so on

Tables

create table inventory (
   item  varchar2(10), -- identification of the item
   loc   varchar2(10), -- identification of the location
   qty   number,       -- quantity present at that location
   purch date          -- date that quantity was purchased
)
/

create table orderline (
   ordno number,       -- id-number of the order
   item  varchar2(10), -- identification of the item
   qty   number        -- quantity ordered
)
/

Just like Case 2

Data

insert into orderline values (51, 'A1', 24);
insert into orderline values (51, 'B1', 18);
insert into orderline values (62, 'A1', 8);
insert into orderline values (73, 'A1', 16);
insert into orderline values (73, 'B1', 6);

Inventory data exactly like Case 2
Orderline data this time for three orders:

Batch pick

with orderbatch as (
   select o.item
        , sum(o.qty) qty
     from orderline o
    where o.ordno in (51, 62, 73)
    group by o.item
)
select <FIFO sql>
...

Group by on orderline creates an orderbatch

Batch pick

with orderbatch as (
   ...
)
select s.loc, s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
  from (
   select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
     from orderbatch o
     join inventory i
          on i.item = o.item
      ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.loc

We can apply the FIFO code on the orderbatch
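The FIFO rolling-sum logic is easy to mirror procedurally. A minimal Python sketch (not part of the presentation; the A1 inventory rows are taken from the slide output, except the on-hand qty at 1-C-05, which is an assumption since only 6 of it are picked):

```python
def fifo_picks(inventory, ord_qty):
    """inventory: list of (loc, purch, loc_qty), already in FIFO order.
    Mirrors the analytic query: sum_prv is the rolling sum of all
    previous rows; we keep picking while sum_prv < ord_qty and pick
    LEAST(loc_qty, ord_qty - sum_prv) at each location."""
    picks, sum_prv = [], 0
    for loc, purch, loc_qty in inventory:
        if sum_prv >= ord_qty:
            break
        picks.append((loc, min(loc_qty, ord_qty - sum_prv)))
        sum_prv += loc_qty
    return picks

# A1 inventory in FIFO order (last location's on-hand qty is assumed)
a1 = [('1-A-20', '2004-11-01', 18),
      ('2-A-02', '2004-11-02', 24),
      ('1-C-05', '2004-11-03', 20)]   # 20 is an assumption
print(fifo_picks(a1, 48))
```

This reproduces the batch pick for A1: 18 at 1-A-20, 24 at 2-A-02 and the remaining 6 at 1-C-05.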

Batch pick

LOC ITEM PICK_QTY

------ ---- --------

1-A-02 B1 5

1-A-20 A1 18

1-B-11 B1 4

1-B-15 B1 2

1-C-04 B1 12

1-C-05 A1 6

2-A-02 A1 24

2-D-23 B1 1

Works OK, but the operator cannot see how much of each pick goes to which order

Pick qty intervals

with orderbatch as (
   ...
)
select s.loc, s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
     , sum_prv_qty + 1 from_qty
     , least(sum_qty, ord_qty) to_qty
  from (
   select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and 1 preceding
          ), 0) sum_prv_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and current row
          ), 0) sum_qty
     from orderbatch o
     join inventory i
          on i.item = o.item
      ) s
 where s.sum_prv_qty < s.ord_qty
 order by s.item, s.purch, s.loc

Let’s use more analytics

Two rolling sums allow us to calculate from_qty and to_qty
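Outside the database, the same two rolling sums can be sketched in a few lines of Python (illustrative only; the B1 row quantities are taken from the slide output):

```python
def qty_intervals(rows):
    """rows: list of (key, qty) in order. Returns (key, qty, from_qty,
    to_qty) using the two rolling sums from the query: the sum of all
    previous rows plus one, and the sum including the current row."""
    out, running = [], 0
    for key, qty in rows:
        out.append((key, qty, running + 1, running + qty))
        running += qty
    return out

# The B1 inventory rows in FIFO order, as in the slide output
b1 = [('1-B-15', 2), ('1-C-04', 12), ('2-D-23', 1), ('1-B-11', 4), ('1-A-02', 5)]
for row in qty_intervals(b1):
    print(row)
```

This yields the intervals 1–2, 3–14, 15–15, 16–19 and 20–24 shown on the next slide.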

Pick qty intervals

LOC ITEM PICK_QTY FROM_QTY TO_QTY

------ ---- -------- -------- --------

1-A-20 A1 18 1 18

2-A-02 A1 24 19 42

1-C-05 A1 6 43 48

1-B-15 B1 2 1 2

1-C-04 B1 12 3 14

2-D-23 B1 1 15 15

1-B-11 B1 4 16 19

1-A-02 B1 5 20 24

So ”from 1 to 18” of the 48 pieces of A1 are picked in the first location, ”from 19 to 42” are picked in the second location, and so on…

Order qty intervals

select o.ordno
     , o.item
     , o.qty
     , nvl(sum(o.qty) over (
          partition by o.item
          order by o.ordno
          rows between unbounded preceding and 1 preceding
       ), 0) + 1 from_qty
     , nvl(sum(o.qty) over (
          partition by o.item
          order by o.ordno
          rows between unbounded preceding and current row
       ), 0) to_qty
  from orderline o
 where ordno in (51, 62, 73)
 order by o.item, o.ordno

Do the same with the order quantities

Order qty intervals

ORDNO ITEM QTY FROM_QTY TO_QTY

----- ---- -------- -------- --------

51 A1 24 1 24

62 A1 8 25 32

73 A1 16 33 48

51 B1 18 1 18

73 B1 6 19 24

”From 1 to 24” of the 48 pieces ordered of A1 is on order no 51, and so on.

Overlapping intervals

with orderlines as (
   select o.ordno, o.item, o.qty
        , nvl(sum(o.qty) over (
             partition by o.item
             order by o.ordno
             rows between unbounded preceding and 1 preceding
          ), 0) + 1 from_qty
        , nvl(sum(o.qty) over (
             partition by o.item
             order by o.ordno
             rows between unbounded preceding and current row
          ), 0) to_qty
     from orderline o
    where ordno in (51, 62, 73)
), orderbatch as (
...

We put the orderlines with qty intervals in a with clause

Overlapping intervals

...
), orderbatch as (
   select o.item
        , sum(o.qty) qty
     from orderlines o
    group by o.item
), fifo as (
...

We create the orderbatch from the orderlines as a second with clause

Overlapping intervals

...

), fifo as (
   select s.loc, s.item, s.purch
        , s.loc_qty  -- carried along for the later LEAST() calculation
        , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
        , sum_prv_qty + 1 from_qty
        , least(sum_qty, ord_qty) to_qty
     from (
      select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
           , nvl(sum(i.qty) over (
                partition by i.item
                order by i.purch, i.loc
                rows between unbounded preceding and 1 preceding
             ), 0) sum_prv_qty
           , nvl(sum(i.qty) over (
                partition by i.item
                order by i.purch, i.loc
                rows between unbounded preceding and current row
             ), 0) sum_qty
        from orderbatch o
        join inventory i
             on i.item = o.item
         ) s
    where s.sum_prv_qty < s.ord_qty
)
...

And the FIFO calculation with qty intervals as a third with clause

Overlapping intervals

with orderlines as (
   ...
), orderbatch as (
   ...
), fifo as (
   ...
)
select f.loc, f.item, f.purch, f.pick_qty, f.from_qty, f.to_qty
     , o.ordno, o.qty, o.from_qty, o.to_qty
  from fifo f
  join orderlines o
       on o.item = f.item
      and o.to_qty >= f.from_qty
      and o.from_qty <= f.to_qty
 order by f.item, f.purch, o.ordno

Now we join the fifo and orderlines on overlapping intervals
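The overlap predicate itself deserves a small sanity check. A Python sketch of the same condition (illustrative; intervals taken from the slide output):

```python
def overlaps(a_from, a_to, b_from, b_to):
    """The join predicate: two closed intervals overlap exactly when
    each one starts no later than the other one ends."""
    return b_to >= a_from and b_from <= a_to

# The pick 19..42 of A1 at 2-A-02 against the three order intervals
orders = [(51, 1, 24), (62, 25, 32), (73, 33, 48)]
print([o for o, f, t in orders if overlaps(19, 42, f, t)])
```

All three orders overlap the 19..42 pick, which is why that single pick row joins to three orderlines below.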

Overlapping intervals

LOC    ITEM PURCH      PICK_QTY FROM_QTY TO_QTY ORDNO    QTY FROM_QTY TO_QTY
------ ---- ---------- -------- -------- ------ ----- ------ -------- ------
1-A-20 A1   2004-11-01       18        1     18    51     24        1     24
2-A-02 A1   2004-11-02       24       19     42    51     24        1     24
2-A-02 A1   2004-11-02       24       19     42    62      8       25     32
2-A-02 A1   2004-11-02       24       19     42    73     16       33     48
1-C-05 A1   2004-11-03        6       43     48    73     16       33     48
1-B-15 B1   2004-11-02        2        1      2    51     18        1     18
1-C-04 B1   2004-11-03       12        3     14    51     18        1     18
2-D-23 B1   2004-11-04        1       15     15    51     18        1     18
1-B-11 B1   2004-11-05        4       16     19    51     18        1     18
1-B-11 B1   2004-11-05        4       16     19    73      6       19     24
1-A-02 B1   2004-11-06        5       20     24    73      6       19     24

The single pick of 24 at location 2-A-02 is joined to three orderlines all with overlapping intervals

Individual pick qty

with orderlines as (
   ...
), orderbatch as (
   ...
), fifo as (
   ...
)
select f.loc, f.item, f.purch, f.pick_qty, f.from_qty, f.to_qty
     , o.ordno, o.qty, o.from_qty, o.to_qty
     , least(
          f.loc_qty
        , least(o.to_qty, f.to_qty) - greatest(o.from_qty, f.from_qty) + 1
       ) pick_ord_qty
  from fifo f
  join orderlines o
       on o.item = f.item
      and o.to_qty >= f.from_qty
      and o.from_qty <= f.to_qty
 order by f.item, f.purch, o.ordno

The intervals can now be used for calculating how much from the location is picked for the individual order
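The LEAST/GREATEST expression is simply the size of the intersection of two closed intervals. A worked Python check (illustrative; the `loc_qty` cap is omitted since it does not bind here):

```python
def pick_ord_qty(pick_from, pick_to, ord_from, ord_to):
    """Size of the intersection of two closed intervals, i.e. the
    LEAST(...) - GREATEST(...) + 1 expression from the query."""
    return min(pick_to, ord_to) - max(pick_from, ord_from) + 1

# The 24 pieces picked at 2-A-02 (interval 19..42) split over the orders
print(pick_ord_qty(19, 42, 1, 24),    # order 51
      pick_ord_qty(19, 42, 25, 32),   # order 62
      pick_ord_qty(19, 42, 33, 48))   # order 73
```

The split is 6 + 8 + 10 = 24, exactly the pick quantity at that location.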

Individual pick qty

LOC ITEM PURCH PICK_QTY FROM_QTY TO_QTY ORDNO QTY FROM_QTY TO_QTY PICK_ORD_QTY

------ ---- ---------- -------- -------- ------ ----- ------ -------- ------ ------------

1-A-20 A1 2004-11-01 18 1 18 51 24 1 24 18

2-A-02 A1 2004-11-02 24 19 42 51 24 1 24 6

2-A-02 A1 2004-11-02 24 19 42 62 8 25 32 8

2-A-02 A1 2004-11-02 24 19 42 73 16 33 48 10

1-C-05 A1 2004-11-03 6 43 48 73 16 33 48 6

1-B-15 B1 2004-11-02 2 1 2 51 18 1 18 2

1-C-04 B1 2004-11-03 12 3 14 51 18 1 18 12

2-D-23 B1 2004-11-04 1 15 15 51 18 1 18 1

1-B-11 B1 2004-11-05 4 16 19 51 18 1 18 3

1-B-11 B1 2004-11-05 4 16 19 73 6 19 24 1

1-A-02 B1 2004-11-06 5 20 24 73 6 19 24 5

Pick list

with orderlines as (
   ...
), orderbatch as (
   ...
), fifo as (
   ...
)
select f.loc
     , f.item
     , f.pick_qty pick_at_loc
     , o.ordno
     , least(
          f.loc_qty
        , least(o.to_qty, f.to_qty) - greatest(o.from_qty, f.from_qty) + 1
       ) qty_for_ord
  from fifo f
  join orderlines o
       on o.item = f.item
      and o.to_qty >= f.from_qty
      and o.from_qty <= f.to_qty
 order by f.loc, o.ordno

Tidy up the select, order by location, and we have the new pick list

Pick list

LOC ITEM PICK_AT_LOC ORDNO QTY_FOR_ORD

------ ---- ----------- ----- -----------

1-A-02 B1 5 73 5

1-A-20 A1 18 51 18

1-B-11 B1 4 51 3

1-B-11 B1 4 73 1

1-B-15 B1 2 51 2

1-C-04 B1 12 51 12

1-C-05 A1 6 73 6

2-A-02 A1 24 51 6

2-A-02 A1 24 62 8

2-A-02 A1 24 73 10

2-D-23 B1 1 51 1

The operator now knows to pick 24 pieces of A1 at location 2-A-02 and distribute them with 6 for order 51, 8 for order 62 and 10 for order 73

Pick list with route

... (orderlines, orderbatch and fifo with clauses) ...
), pick as (
   select to_number(substr(f.loc, 1, 1)) warehouse
        , substr(f.loc, 3, 1) aisle
        , dense_rank() over (
             order by to_number(substr(f.loc, 1, 1)) -- warehouse
                    , substr(f.loc, 3, 1)            -- aisle
          ) aisle_no
        , to_number(substr(f.loc, 5, 2)) position
        , f.loc, f.item, f.pick_qty pick_at_loc, o.ordno
        , least(
             f.loc_qty
           , least(o.to_qty, f.to_qty) - greatest(o.from_qty, f.from_qty) + 1
          ) qty_for_ord
     from fifo f
     join orderlines o
          on o.item = f.item
         and o.to_qty >= f.from_qty
         and o.from_qty <= f.to_qty
)
...

Move the pick list into a fourth with clause

Add the ranking aisle_no calculation

Pick list with route

with orderlines as (
   ...
), orderbatch as (
   ...
), fifo as (
   ...
), pick as (
   ...
)
select p.loc, p.item, p.pick_at_loc, p.ordno, p.qty_for_ord
  from pick p
 order by p.warehouse
        , p.aisle_no
        , case
             when mod(p.aisle_no, 2) = 1 then p.position
             else -p.position
          end

And so one big select of four with clauses gives the final pick list of multiple orders by FIFO with an efficient route
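The serpentine ORDER BY (odd aisles walked in ascending position, even aisles descending) can be sketched as a Python sort key (illustrative; here the aisle number is derived directly from the aisle letter, a simplification of the DENSE_RANK in the query that works when every aisle letter occurs in every warehouse):

```python
def route_key(loc):
    """Sort key mirroring the ORDER BY: warehouse, then aisle, then
    ascending position in odd-numbered aisles and descending position
    in even-numbered aisles (a serpentine walking route)."""
    warehouse, aisle, pos = loc.split('-')
    aisle_no = ord(aisle) - ord('A') + 1   # A=1, B=2, ... (assumption)
    position = int(pos)
    return (int(warehouse), aisle_no,
            position if aisle_no % 2 == 1 else -position)

locs = ['1-A-02', '1-A-20', '1-B-11', '1-B-15', '1-C-04', '1-C-05']
print(sorted(locs, key=route_key))
```

Aisle A is walked upwards, aisle B downwards, aisle C upwards again, matching the route-ordered pick list on the next slide.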

Pick list with route

LOC ITEM PICK_AT_LOC ORDNO QTY_FOR_ORD

------ ---- ----------- ----- -----------

1-A-02 B1 5 73 5

1-A-20 A1 18 51 18

1-B-15 B1 2 51 2

1-B-11 B1 4 51 3

1-B-11 B1 4 73 1

1-C-04 B1 12 51 12

1-C-05 A1 6 73 6

2-A-02 A1 24 51 6

2-A-02 A1 24 73 10

2-A-02 A1 24 62 8

2-D-23 B1 1 51 1

All done

What more can we wish for?


Multi-order FIFO
• Do FIFO on the sum of the orders
• Calculate From/To qty intervals of picks
• Calculate From/To qty intervals of orders
• Join overlapping intervals

“Don’t you just love these kind of challenges? It’s why we do what we do!” – Monty Latiolais
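The four steps above can be sketched end-to-end in Python (not from the presentation; single-item case, and the on-hand qty at 1-C-05 is an assumption):

```python
def multi_order_fifo(inventory, orderlines):
    """inventory: [(loc, qty)] in FIFO order for one item.
    orderlines: [(ordno, qty)] in order-number sequence.
    Returns [(loc, ordno, qty_for_ord)]: FIFO-pick the batch total,
    build from/to intervals on both sides, join on overlap."""
    def intervals(rows):
        out, run = [], 0
        for key, qty in rows:
            out.append((key, run + 1, run + qty))
            run += qty
        return out
    total = sum(q for _, q in orderlines)
    # Step 1: FIFO on the sum of the orders, capped at the batch total
    picks, run = [], 0
    for loc, qty in inventory:
        if run >= total:
            break
        picks.append((loc, min(qty, total - run)))
        run += qty
    # Steps 2-4: intervals on both sides, joined on overlap
    result = []
    for loc, pf, pt in intervals(picks):
        for ordno, of, ot in intervals(orderlines):
            if ot >= pf and of <= pt:
                result.append((loc, ordno, min(pt, ot) - max(pf, of) + 1))
    return result

# Item A1: orders 51, 62, 73 and the slide's inventory (last qty assumed)
inv_a1 = [('1-A-20', 18), ('2-A-02', 24), ('1-C-05', 20)]
orders_a1 = [(51, 24), (62, 8), (73, 16)]
for row in multi_order_fifo(inv_a1, orders_a1):
    print(row)
```

For A1 this reproduces the slide result: 18 for order 51 at 1-A-20, the 24 at 2-A-02 split 6/8/10 over orders 51/62/73, and 6 for order 73 at 1-C-05.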


Analytics forever… and ever and ever…

It never ends…

• We use analytic functions all the time

• We can’t imagine living without analytics

• WheelGuide®
• Replenish shop stock
• Call Center statistics
• Spare parts guide
• Customer count / work schedule / number of orders
• Booking calendar for mechanics
• Shop space management
• Discover idle hands
• Detect seasonal variations for sales
• Efficiency of Royal Danish Mail
• …


Do It Yourself
• Just start using analytics
• The more you do, the more often you find cases
• When you start to think you need to process your data procedurally – think again!
• Use the power of SQL to let the database do the hard work of processing data
• That’s what the database does best
• And you’re paying for it, so why not use it
• If not – you are missing out on great functionality

Any questions?
• Download presentation from UKOUG
• Or you can get the presentation as well as the scripts at:
  http://goo.gl/g46b4
@kibeha
kibeha@gmail.com
http://dspsd.blogspot.com
