December 2001
IAB Online Ad Measurement Study
The Interactive Advertising Bureau (IAB), the Media Rating Council (MRC) and the Advertising Research Foundation (ARF), along with PricewaterhouseCoopers are pleased to present the Online Advertising Measurement Study. This report aggregates data and information compiled by PricewaterhouseCoopers from 11 participating companies identified and selected by the IAB.
The 11 participating companies include a cross-section of the key industry players -- portals, destination sites, ad networks and third-party ad servers -- which combined represent nearly two-thirds of total industry revenues.
PricewaterhouseCoopers would like to thank all of the participants and their companies for their contribution of valuable time spent in the development of the IAB Online Ad Measurement Guidelines. We appreciate the support that the participants provided us to complete this study.
The overall purpose of this project is to determine the general source of measurement and reporting differences between the aforementioned industry players. The project findings in this report include a summary of those discrepancies found and contributing factors, in addition to recommendations designed to address the discrepancies in measurement and reporting of online advertising data.
This study was conducted independently by PricewaterhouseCoopers on behalf of the IAB. The report aggregates the findings generated from over 60 interviews with management responsible for key aspects of the online advertising process, in addition to validating leading online advertising metrics against definitions via scripted testing. This consulting report does not represent an audit as described by the AICPA.
The IAB, MRC and ARF have reviewed the results and proposed recommendations, and subsequently worked with the 11 participants and other key constituents (e.g., the AAAA) to develop a set of Industry Measurement Guidelines. The Guidelines represent a consensus of acceptable practices and will be distributed to the industry at large on January 15, 2002 for adoption on a voluntary basis.
© 2001 PricewaterhouseCoopers. All rights reserved.
3rd Party Ad Networks / Ad Servers: Avenue A, DoubleClick, Engage
Portals: America Online, Lycos, MSN, Yahoo!
Destination Sites: CNET, Forbes.com, New York Times Digital, Walt Disney Internet Group
Table of Contents
Background and Scope .... 4
About this Report .... 5
Detailed Findings
Metrics & Definitions .... 6
Processes & Controls .... 33
Recommendations
Metrics & Definitions .... 44
Processes & Controls .... 47
Appendix .... 49
Background and Scope
The primary objectives of the IAB Online Ad Measurement Study were to:
review the current measurement criteria and practices used by a representative group of sell-side companies for online advertising and audience measurement reporting
document and report the comparability of existing metrics used by the industry
propose a common set of industry definitions and guidelines for data analysis and reporting
The scope of work consisted of:
coordinating and working with 11 participating companies identified and selected by the IAB: three 3rd party ad servers / ad networks, four destination sites, four portal sites
scheduling and conducting on-site interviews with appropriate business unit management to understand and document each participant's online advertising measurement and reporting system: what types of audience and advertising data are measured, how the data is measured and how it is reported
performing scripted testing to assess whether each participating company's collection and reporting systems track audience and advertising metrics in accordance with the company's definitions
determining discrepancies between definitions, editing procedures and reporting, and following up on testing issues
Our work focused on specific metrics and ad formats:
a comprehensive list of audience measurement and ad delivery metrics was documented
the scope of this study focused on the five metrics identified on page 7 of this report
five metrics were tested, with an emphasis on ad impressions and clicks
Background and Scope -- About this Report
Standard Metric Definitions + Well-Controlled Process = Reliable Ad Campaign Measurement Reporting
A fundamental premise of this report is that in order to achieve reliable, accurate and comparable ad campaign measurement reporting, there must exist a set of standardized metric definitions that are applied to a well-controlled process.
Refer to pages 33-41 for additional information on the concept of how a well-controlled process manages the risks inherent in high-volume, low-dollar transactions -- hallmarks of online advertising management systems.
This study focused on specific audience measurement and advertising metrics. There are, however, many other metrics and related research aspects that were not addressed, but can be identified through the work of various industry organizations (e.g., MRC Minimum Standards for Media Rating Research).
Detailed Findings: Metrics, Definitions, Audits
Online Ad Measurement Study
Five Primary Metrics Measure and Track Audience and Ad Transactions
Top Metrics
Based on responses from participants, the top five metrics used consistently for ad delivery reporting and audience measurement include the following:
Ad Impressions
Clicks
Unique Visitors
Total Visits
Page Impressions
Although other metrics exist, they are not uniformly defined or utilized. For example:
time spent on page, number of completed user registrations, conversions
[Chart: Leading Ad and Audience Measurement Metrics (# participants): Ad Impressions 11, Clicks 11, Unique Visitors 10, Total Visits 10, Page Impressions 8]
Ad Impressions are the Dominant Currency Metric for Ad Revenues
Ad Impressions are the dominant Currency Metric
While all five of the key metrics are tracked and reported, Ad Impressions are the dominant currency metric (metric upon which revenue-generating contracts are based).
Clicks as a Currency Metric
Only participants using a Cost-per-Action pricing model base contract revenues on click results. Three of eleven participants use Cost per Action.
Page Impressions - Secondary Currency Metric
In addition to ad impressions, one participant uses page impressions to track transactions where advertisers pay for embedded content or sponsor a page.
Other Currency Metrics
Other less common currency metrics include email ad metrics (e.g., subscribers, messages delivered, messages opened) and ad metrics based on user actions (e.g., conversions and referrals).
[Chart: Currency Metrics by number of participants: Ad Impressions 11, Clicks 3]
9 2001 PricewaterhouseCoopers. All rights reserved.
Ad Impressions
10 2001 PricewaterhouseCoopers. All rights reserved.
Ad Impression Measurement: Server Initiated and Client Initiated
The measurement of ad impression transactions requested by a server or a client (browser) is tied to the process used to request the ad. Listed below are definitions of the two ad measurement processes that result from server or client initiated ad impression requests.
Server Initiated Measurement
This process occurs when a web server, prior to serving a web page to a user agent request (browser, robot, other), builds the web page with links to an ad resource (image/asset server, internal ad server, 3rd party ad server), and records an ad impression transaction. The ad impression transaction is recorded (via logs on the web server, or logs/real-time on an internal ad server) prior to serving the requested web page to the user agent.
Client Initiated Measurement
Occurs when the measurement of an ad impression is the result of a direct connection between the user agent (browser, robot, other), and the ad server. This process can take two forms:
1. Impressions served via advanced HTML tags (IFRAME/JavaScript/ILAYER tags). In this case the ad server (typically) records an impression transaction and responds to the user agent with the contents of the selected creative, which may include HTML that refers to images/assets, or HTML that refers to another ad resource (3rd party server).
2. In some cases the ad impression transaction is recorded via an independent request (via an HTTP 302 redirect) to a special ad transaction logging server. This independent request may utilize web beacon technology and is initiated by the user agent at the same time it requests the image/rich media from the image/asset server.
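The two counting models above can be sketched in code. The following is an illustrative sketch only (not any participant's actual system; all function names are hypothetical): it shows where the impression record is written under server initiated counting (before the page is returned) versus client initiated counting (when the browser's separate ad call arrives).

```python
# Illustrative sketch of the two impression-counting models; names are
# hypothetical, not taken from any participant's system.

impression_log = []

def serve_page_server_initiated(page_id, ad_id):
    """Server initiated: the web server builds the page with the ad link
    and records the impression BEFORE serving the page to the user agent."""
    impression_log.append(("server-initiated", ad_id))
    # page already contains the reference to the ad asset
    return f'<html><body><img src="/assets/{ad_id}.gif"></body></html>'

def serve_ad_tag_client_initiated(ad_id):
    """Client initiated: the browser parses the page and makes a direct
    call to the ad server, which records the impression on that call."""
    impression_log.append(("client-initiated", ad_id))
    # respond with the creative markup (or a 302 redirect to an asset server)
    return f'<iframe src="/assets/{ad_id}.html"></iframe>'

serve_page_server_initiated("home", "ad42")
serve_ad_tag_client_initiated("ad42")
```

The key difference is timing: the server initiated count happens whether or not the browser ever renders the ad, while the client initiated count requires the browser to complete the secondary request.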
Ad Impressions: Server Initiated vs Client Initiated Ad Requests
Process for Tracking Impressions
The delivery processes listed below are for requesting ad impressions only.
Server Initiated Ad Request
Five of eleven participants use a server initiated process to request the majority of ads.
Client Initiated Ad Request
Seven of eleven participants use a client initiated process to request the majority of ads.
One participant uses both client and server initiated approaches depending on where the ad is served in the ad management system.
[Chart: Process for Tracking Impressions (# participants): Server Initiated 5, Client Initiated 7]
Narrative
1. Browser calls Publisher Web Server.
2. Web Server calls Publisher Ad Server.
2a. Impression counted at the Publisher Ad Server (see Type 1 on chart) when the ad request occurs.
3. Publisher Ad Server responds with a reference to an ad asset (image, video, audio, etc.) on an Asset or Image Server. The reference may also be a link to a Third Party Ad Server.
4. Publisher Web Server responds with HTML content including embedded ad content from the Publisher Ad Server and any ad assets (image, audio, video) stored on the Publisher Web Server.
4a. Impression counted at the Publisher Web Server (see Type 2 on chart) when the HTML content is prepared, but after the Publisher Ad Server response.
5. If the ad involves a remote asset (i.e., image, audio, video not located on the web server), the browser requests the asset from the asset server.
5a. If the Publisher Web Server response includes a link to a Third Party Ad Server, the Browser will request the ad from the Third Party Ad Server.
5b. The Third Party Ad Server will record the ad request.
5c. The Third Party Ad Server will respond with a 302 redirect or HTML to an Asset Server.
6. The Asset Server responds to the asset request by sending the asset back to the browser.
[Diagram: Server Initiated Ad Request and Counting Process -- Browser, Publisher Web Server, Publisher Ad Server, Third Party Ad Server, Asset or Image Server. Type 1: Publisher logging on Ad Server (2a: ad request counted). Type 2: Publisher logging on Web Server after retrieving ad (4a: ad request counted). Third Party logging (5b: ad request counted).]
Server Initiated Ad Request and Counting Process
Server Initiated Process: Majority of Participants Count at the Ad Server
Server Initiated Transaction Logging Point
Five of eleven participants use a server initiated ad request process. The ad request process that a participant uses determines in part where the participant counts ad impressions. The statistics listed below are for counting ad impressions only.
Ad Server Logging
Four of the five participants using a server initiated ad request process count ad impressions at the ad server, after receiving a request from the web server, but prior to rendering the content.
Web Server Logging
One of the five participants counts ad impressions at the web server prior to rendering the content, after the ad server responds to the web server request (illustrated as Type 2 on the Illustrative Server Initiated Ad Request diagram on page 12).
[Chart: Server Initiated Transaction Logging Point (# participants): Ad Server 4, Web Server 1]
[Diagram: Client Initiated Ad Request and Counting Process -- Browser, Publisher Web Server, Publisher Ad Server, Publisher Ad Counting Server, Third Party Ad Server, Asset or Image Server. Type 1: Publisher Ad logging prior to Browser requesting ad asset (3a: ad request counted). Type 2: Publisher Ad Counting Server logging simultaneously with Browser requesting ad asset (5e: ad request counted). Third Party logging (5b: ad request counted).]
Narrative
1. Browser (user agent) calls the Publisher Web Server.
2. Publisher Web Server responds with HTML content including a reference to make a request to the Publisher Ad Server.
3. Browser parses the HTML from the Publisher Web Server and makes secondary calls to the Publisher Ad Server (usually IMG SRC/IFRAME SRC/ILAYER/SCRIPT SRC tags).
3a. Type 1: Publisher Ad Server records the ad impression prior to the Browser requesting the ad asset.
4. Publisher Ad Server responds to the Browser with a 302 redirect (if an IMG SRC tag) or HTML.
5. Browser requests the asset from the Asset Server.
5a. If the Publisher Ad Server responds with a link to a Third Party Ad Server, the Browser will request the ad from the Third Party Ad Server.
5b. The Third Party Ad Server records the ad request.
5c. Third Party Ad Server responds to the Browser with a 302 redirect (if an IMG SRC tag) or HTML to the Asset Server.
5d. A Publisher may record an ad impression at the same time the image is rendered by the Browser by issuing a request to a Publisher Ad Counting Server using either a web beacon (for rich media ads) or an image call to a portion of an ad.
5e. Type 2: Publisher Ad Counting Server records the ad impression simultaneously with the Browser requesting the ad asset.
5f. Publisher Ad Counting Server responds to the Browser with a 1x1 transparent image (web beacon) or an image call to a portion of the ad.
6. Asset Server responds to the asset request from the Browser with the image or rich media content.
Client Initiated Ad Request and Counting Process
Client Initiated Process: Majority of Participants Count at the Ad Server
Client Initiated Transaction Logging Point
The ad request process that a participant uses determines in part where the participant counts ad impressions. The statistics listed below are for counting ad impressions only.
Counting in a Client Initiated Ad Request Process
Six of the seven participants using a client initiated ad request process count ad impressions at the ad server, after receiving a request for an ad by the client.
One participant counts ad impressions at a separate ad logging server, after the ad server responds to the client, and when the client initiates a separate redirect call to the ad logging server.
[Chart: Client Initiated Transaction Logging Point (# participants): Ad Server 6, Separate Ad Logging Server 1]
All Participants Support the Use of Cache Busting
All eleven participants support the use of cache busting technology. Cache busting mechanisms are employed to reduce the potential for an ad request to be cached in either a web browser or a proxy server. Cached ads result in undercounting impressions, because the impression is served from a proxy or browser cache rather than an ad server.
Types of cache busting technology utilized include:
Appending a random number to the end of the ad request.
Appending a time/date stamp to the end of the ad request.
Third party ad serving firms provide cache-busting guidelines to websites that do not have cache busting capabilities.
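The two techniques above (random number, time/date stamp) can be sketched as a small helper. This is an illustrative sketch only; the parameter names (`ord`, `ts`) are assumptions for the example, not a standard.

```python
import random
import time

def cache_bust(ad_url: str) -> str:
    """Append a random number and a timestamp to an ad request URL so that
    browsers and proxy servers treat each request as unique and cannot
    serve the ad from cache (cached ads would undercount impressions).
    Parameter names 'ord' and 'ts' are illustrative only."""
    sep = "&" if "?" in ad_url else "?"
    return f"{ad_url}{sep}ord={random.randint(0, 2**31)}&ts={int(time.time())}"

url = cache_bust("http://ads.example.com/serve?campaign=123")
```

Because the query string differs on every request, each ad call reaches the ad server and can be logged rather than being answered from a browser or proxy cache.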
[Chart: Participants Supporting Cache Busting Functionality (# participants): Support Cache Busting 11, No Support for Cache Busting 0]
Clicks
Clicks: Uniform Definition and Use of 302 Redirects
All eleven participants track clicks and have a consistent approach to the definition.
Uniform Definition
The definition of the click metric is the most consistently accepted and applied of the five key metrics.
A click is a user-initiated action of clicking on an ad element, causing a re-direct to another web location. A click does not include information on whether or not the user completed the redirect transaction.
Use of 302 Redirects
All participants base click metrics on redirects (or transfers) successfully processed by the ad server.
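The 302-redirect counting pattern can be sketched as follows. This is an illustrative sketch of the general technique, not any participant's implementation; the handler shape is an assumption.

```python
def handle_click(ad_id, target_url, click_log):
    """Sketch of click counting via HTTP 302 redirect: the ad server logs
    the click, then tells the browser to redirect to the advertiser's
    target URL. Note the count says nothing about whether the user
    actually completed the redirect, matching the definition above."""
    click_log.append(ad_id)                 # click counted here
    status = 302                            # HTTP "Found" redirect
    headers = {"Location": target_url}      # where the browser goes next
    return status, headers

log = []
status, headers = handle_click("ad42", "http://advertiser.example.com/", log)
```

The click is recorded at the moment the redirect is processed by the ad server, which is why all participants can apply the definition consistently.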
[Chart: Participants Using 302 Redirects for Tracking Clicks (# participants): Click tracking via 302 redirects 11, Not tracking clicks via 302 redirects 0]
Click Request and Counting Process
Narrative
1. The user clicks on an ad, which causes the browser to request a target site from the Publisher Ad/Click Transaction Server. The target site URL is typically included in the request.
1a. Ad/Click Transaction Server records the click.
2. Ad/Click Transaction Server responds to the Browser with a redirect (HTTP 302) to the Target Site location.
3. Browser follows the redirect to the Target Site.
3a. In the case of a third party-served ad, the target site location is actually that of the Third Party Ad Server.
3b. The Third Party Ad/Click Server records the click request.
3c. The Third Party Ad/Click Server responds with a redirect to the Target Site.
4. Target Site server responds to the Browser.
[Diagram: Click Request and Counting Process -- Browser, Publisher Ad/Click Transaction Server (1a: click request counted), Third Party Ad/Click Server (3b: click request counted), Target Site.]
Unique Visitors
Unique Visitors: Cookie and Registration Based Methods Utilized
Ten of eleven participants track unique visitors. There are two primary methods used to track each unique visitor:
Cookie Based Definition
Eight of the ten participants tracking unique visitors use cookies, and two of the eight also use IP Address in addition to cookies. Participants typically distinguish between recurring cookies (repeat visitors) and new cookies (new visitors or repeat visitors that delete cookies).
Registration Based Definition
Two of the ten participants tracking unique visitors use registered users or user login counts.
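The cookie-based method (with the IP-address fallback some participants use) can be sketched as follows. This is an illustrative sketch under the assumption that each log record carries a cookie ID and an IP address; it is not any participant's actual algorithm.

```python
def count_unique_visitors(requests):
    """Sketch of cookie-based unique visitor counting: prefer the cookie
    ID as the visitor key, falling back to IP address when no cookie is
    present. Registration-based systems would count distinct logins
    instead of cookies."""
    seen = set()
    for req in requests:
        visitor_key = req.get("cookie") or req.get("ip")
        seen.add(visitor_key)
    return len(seen)

sample = [
    {"cookie": "abc", "ip": "1.2.3.4"},
    {"cookie": "abc", "ip": "5.6.7.8"},   # same cookie, new IP: same visitor
    {"cookie": None, "ip": "9.9.9.9"},    # no cookie: fall back to IP
]
```

Note how the same cookie seen from two IP addresses counts as one visitor, while a cookieless request is distinguished only by IP, which is one source of the new-cookie ambiguity discussed on the next page.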
[Chart: Methods for Tracking Unique Visitors (# participants): Cookie-Based 8, Registrations 2, Not Tracking Unique Visitors 1]
Cookie Based Unique Visitors: Process for Determining New Visitors
Among the eight participants tracking visitors using cookies, there are different techniques used to determine if a new cookie should be considered a new visitor.
Count All New Cookies
Four of the eight participants count all new cookies as new visitors. Two of the eight also use IP Address in addition to cookies for additional user validation.
Exclude All New Cookies
One participant does not count any new cookies as a new visitor. In this case, a unique cookie must visit the site at least twice to be considered a new visitor.
Exclude Some New Cookies Based On Historical Data
Three of the eight participants attempt to estimate the number of repeat visitors with new cookies using known user data. For example, a portion of new cookies is excluded from the unique visitor count based on an estimate of new cookies that represent repeat visitors who do not accept cookies, or who have deleted cookies from previous visits.
[Chart: Process for Determining New Visitors (# participants): Count All New Cookies 4, Exclude All New Cookies 1, Exclude Some New Cookies Based on Historical Data 3]
Total Visits
Calculating Total Visits: Based on Actual, Sample or Statistical Analysis
Ten of the eleven participants calculate total visits. The specific business rules used to define a visit (or session) vary among the participants. For example, some participants use a time-based rule that terminates a visit after 30 minutes of inactivity. In addition to time-based rules, there are three methods used to calculate total visits.
Actual
Six of ten participants use all of the user activity data to calculate total visits.
Sampling
Three of ten participants use a sample (several days during the period) of user activity to estimate total visits. Some participants rely on outsourced service providers for this measurement.
Statistical Analysis
One participant performs statistical analysis to estimate total visits.
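The 30-minutes-of-inactivity rule mentioned above can be sketched for a single visitor's activity. This is an illustrative sketch of one common sessionization rule, not any participant's defined business rule.

```python
def count_visits(timestamps, timeout=30 * 60):
    """Count visits (sessions) for one visitor from sorted request
    timestamps in seconds: a new visit begins whenever the gap since
    the previous request exceeds the inactivity timeout (30 minutes
    here, matching the time-based rule cited in the text)."""
    visits = 0
    last = None
    for t in timestamps:
        if last is None or t - last > timeout:
            visits += 1
        last = t
    return visits

# requests at 0s and 10 minutes, then another 2 hours later: two visits
```

Total visits for a site would sum this per-visitor count over all visitors (or over a sample, under the sampling and statistical approaches above).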
[Chart: Use of Actual, Sample or Statistical Analysis to Determine Total Visits (# participants): Actual Data 6, Sampling Days or Periods 3, Statistical Analysis 1]
Page Impressions
Page Impression Tracking
The page impression metric is used by eight of the eleven participants; the 3rd party ad server participants do not track this metric.
Publisher participants may use this metric if content-based advertising is embedded within a page or a page is sponsored by an advertiser.
Page impressions are tracked using two different methods:
Logging at the Web Server
Six of the eight participants use standard web server logs for page impressions. Page impression transactions are usually only counted if accompanied by successful HTTP status codes and are filtered for robotic activity.
Logging at a Separate Tracking Server
Two of the eight participants use web beacon technology to track page impressions. These participants utilize either 3rd party web metric tracking firms or internal tracking servers to record the request.
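The web-server-log method can be sketched as follows: count only successfully served pages and exclude known robots. An illustrative sketch only; real implementations parse standard log formats and use maintained robot lists.

```python
def count_page_impressions(log_entries, robot_agents):
    """Sketch of page impression counting from web server logs: count a
    transaction only if it has a successful (2xx) HTTP status code and
    its user agent does not match a known robot substring."""
    count = 0
    for entry in log_entries:
        ok_status = 200 <= entry["status"] < 300
        is_robot = any(bot in entry["user_agent"].lower() for bot in robot_agents)
        if ok_status and not is_robot:
            count += 1
    return count

logs = [
    {"status": 200, "user_agent": "Mozilla/4.0"},     # counted
    {"status": 404, "user_agent": "Mozilla/4.0"},     # failed request: excluded
    {"status": 200, "user_agent": "ExampleBot/1.0"},  # robot: excluded
]
```

The beacon-based method differs only in where the record is written: the tracking server logs the beacon request instead of the web server logging the page request.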
[Chart: Page Impression Logging Points (# participants): Web Server 6, Separate Tracking Server 2]
Page Impression Delivery and Measurement Process
(1) The user browser calls the web server using a URL entered by the user (may be initiated by the user or via an automated page refresh).
No participants measure page impressions at this point.
[Diagram: User Browser, Web Server, Separate Tracking Server; steps (1)-(4) described below.]
(2) The web server responds to the user browser by checking for an existing cookie and creating one if one does not exist. The web server then renders the content and web beacons if beacons are used for external tracking.
Six of eight publisher participants measure page impressions at this point.
(3) Where web beacons are used for tracking purposes, the user's browser calls the ad server to request an invisible tracking image (i.e., one clear pixel) and passes the user's cookie along with the beacon request.
No participants track page impressions at this point.
(4) Where web beacons are used for tracking purposes, the ad server responds to the tracking request by rendering and logging the tracking image.
Two of eight publisher participants measure page impressions at this point.
Note: 3rd party servers may track page impressions at this point as part of specific Cost per Action contracts.
Filtering
[Diagram: raw log data is filtered for robot/spider activity and internal IP addresses to produce filtered log data.]
All Participants Perform Some Level of Robotic Activity Filtering
Filtering for robotic activity falls into three categories:
Basic
All eleven participants perform some basic robotic activity filtering. Basic filtering techniques include:
Use of a robots.txt file to prevent well-behaved robots from scanning the ad server
Exclusion of transactions from User Agent Strings that are either empty and/or contain the word bot
List of Known Robots
Eight of eleven participants also exclude transactions from lists of known robots. The lists are typically based on User Agent String and/or IP Address and may be maintained by a 3rd Party Auditor. The number of identified robots each participant lists varies from approximately ten to over seven hundred.
Behavioral Filtering
Two of eleven participants also conduct advanced behavioral filtering. These participants define business rules, such as 50 clicks by a single cookie during a daily period, to identify robotic behavior.
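The three filtering tiers can be combined in a single predicate, sketched below. This is an illustrative sketch, not any participant's rule set; the 50-clicks-per-cookie threshold comes from the example business rule above, and the sample robot list is hypothetical.

```python
def is_robot(entry, known_robots, clicks_by_cookie, click_threshold=50):
    """Sketch of layered robot filtering:
    1) basic: empty user agent, or user agent containing 'bot';
    2) list: user agent string or IP address on a known-robots list;
    3) behavioral: e.g. 50+ clicks by a single cookie in a day."""
    ua = entry["user_agent"].strip().lower()
    if not ua or "bot" in ua:                                  # basic filtering
        return True
    if ua in known_robots or entry["ip"] in known_robots:      # list filtering
        return True
    if clicks_by_cookie.get(entry["cookie"], 0) >= click_threshold:  # behavioral
        return True
    return False
```

Transactions for which the predicate returns True would be excluded before metric totals are calculated.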
[Chart: Robotic Filtering Approaches (# participants): Basic Filtering 11, List Filtering 8, Behavioral Filtering 2]
Filtering of Internal IP Addresses
Four of eleven participants filter transactions from within their company by removing all activity originating from IP Address ranges on their company network.
The reasons for filtering this activity include:
- Eliminating any activity generated by internal monitoring tools. These tools are similar to robots and are often used to verify that a server is working properly.
- The demographic associated with a user within the company does not represent the primary demographic of the website.
- Removing all activity generated by a company user conducting testing within the live environment to ensure that a creative is being served properly.
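Internal IP filtering reduces to dropping log entries whose source address falls inside the company's own network ranges. A minimal sketch, assuming log entries carry a source IP; the range value is illustrative.

```python
import ipaddress

def filter_internal(entries, internal_ranges):
    """Drop log entries whose source IP falls inside any of the company's
    own network ranges, so internal monitoring, testing and employee
    traffic is excluded from reported metrics."""
    nets = [ipaddress.ip_network(r) for r in internal_ranges]
    return [e for e in entries
            if not any(ipaddress.ip_address(e["ip"]) in n for n in nets)]

entries = [{"ip": "10.0.0.5"}, {"ip": "203.0.113.9"}]
kept = filter_internal(entries, ["10.0.0.0/8"])  # internal 10.x entry dropped
```

In practice the range list would cover all corporate network blocks, including those used by monitoring tools and test machines.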
[Chart: Internal IP Address Filtering (# participants): Removes Internal IP Addresses 4, Includes Internal IP Addresses 7]
Independent Verification Audits
Independent Verification
Nine of the eleven participants use 3rd party firms to conduct independent audits over the ad delivery processes or individual (or campaign) transactions. Two of the eleven participants do not employ independent audits.
Process Audits
Five of the eleven participants employ outside auditors to conduct process audits over the entire transactional process used to generate online ad metrics. These audits are conducted periodically (e.g., every 6 months) and result in an audit opinion (performed under AICPA standards) on the effectiveness of controls and processes in place.
Activity Audits
Seven of the eleven participants employ outside firms to re-count their transactional data and provide verified metric reports to advertisers. These activity reports are typically produced monthly.
Both Process and Activity Audits
Three of the nine participants conduct both process and activity audits.
No audits
Two of the eleven participants do not employ independent verification.
[Chart: Types of Audits Conducted by Participants (# participants): Process Audit 5, Activity Audit 7, No Audit Performed 2]
Detailed Findings: Processes & Controls
Online Ad Measurement Study
The Online Advertising Process
The diagram on the following slide illustrates the basic process involved with selling, delivering and reporting online advertising.
Slide 36 describes how a well-controlled framework is the foundation for assessing the control risks associated with managing and delivering online advertising, as well as reporting accurate, complete and reliable data.
Slides 37-42 step through the online advertising process, beginning with the sales insertion order and ending with campaign reporting. Each of these slides describes the controls expected to successfully complete that component of the process.
Adjacent to each expected-controls box is a box highlighting what was actually observed through inquiry with the eleven participants, including how many participants did not have the requisite controls in place to effectively manage that component of the process.
The resulting gap analysis from this benchmarking exercise provides the evidence supporting our findings and proposed recommendations for achieving reliable and comparable reporting metrics.
Good Controls Result in Reliable Reporting
A Simplified Overview of the Online Advertising Process
Sales >> Order Processing >> Trafficking >> Delivery >> Data Aggregation >> Advertiser Reporting
Sales: Agreement on advertiser campaign requirements and sales terms in the Insertion Order
Order Processing: Capture of Insertion Order detail in the delivery system
Trafficking: Establishment of creative in the delivery system
Delivery: Delivery and recording of ad and traffic activity by the delivery system (the area most critical to traffic metrics)
Data Aggregation: Collection and aggregation of delivery and traffic data from the delivery system logs into the reporting system
Advertiser Reporting: Summarization and presentation of delivery or traffic data to external parties
Detailed Steps Involved in the Reporting Process
1) Capture: Ad and/or traffic activity is captured in the server log or in real time.
2) Collection: Transaction data is collected from servers.
3) Formatting: Logs are reformatted and/or sorted without altering values.
4) Filtering: Logs are filtered to exclude invalid entries (robotic activity, internal IP addresses, automated page refreshes).
5) Summarization: Data is summarized and metric results are calculated following company-established definitions (see Appendix).
6) Extrapolation: If data sampling is used for audience measurement, results are extrapolated to the entire population of data.
7) Compilation: Metric data from separate systems is combined to determine the total metric.
8) Presentation: Metric data is presented to the appropriate parties.
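The reporting steps above can be sketched end to end on toy log data. This is a hypothetical illustration of the pipeline shape only; all function and field names are assumptions, and the filtering rules are deliberately simplistic stand-ins for the techniques described earlier.

```python
# Hypothetical sketch of the eight reporting steps on toy log data.

raw_logs = [  # (1) capture and (2) collection: entries gathered from servers
    {"metric": "impression", "ua": "Mozilla/4.0", "ip": "203.0.113.9"},
    {"metric": "impression", "ua": "crawler-bot", "ip": "203.0.113.7"},
    {"metric": "click", "ua": "Mozilla/4.0", "ip": "10.0.0.5"},
]

def fmt(entries):            # (3) formatting: sort without altering values
    return sorted(entries, key=lambda e: e["metric"])

def filt(entries):           # (4) filtering: drop robots and internal IPs
    return [e for e in entries
            if "bot" not in e["ua"].lower() and not e["ip"].startswith("10.")]

def summarize(entries):      # (5) summarization: totals per metric definition
    totals = {}
    for e in entries:
        totals[e["metric"]] = totals.get(e["metric"], 0) + 1
    return totals

def extrapolate(totals, sample_fraction=1.0):  # (6) only if sampling is used
    return {m: round(n / sample_fraction) for m, n in totals.items()}

def compile_report(*system_totals):            # (7) combine separate systems
    report = {}
    for totals in system_totals:
        for m, n in totals.items():
            report[m] = report.get(m, 0) + n
    return report

report = compile_report(extrapolate(summarize(filt(fmt(raw_logs)))))
# (8) presentation: report is delivered to the appropriate parties
```

Each stage is a point where a weak control can corrupt the final numbers, which is the premise of the controls discussion that follows.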
Poorly Designed Controls Have Serious Consequences
Process Category / Standard Criteria
! Sales -- Do I/Os accurately reflect advertiser specifications?
! Order Processing -- Are ad campaigns loaded as specified in the I/O?
! Trafficking -- Are campaigns trafficked to the appropriate server?
! Ad Delivery -- Are ads being served on schedule to meet campaign commitments?
! Data Aggregation -- Are all ad transactions completely and accurately written to the logs?
! Campaign Reporting -- Are logs from all servers consolidated for report generation?
Key Risks / Consequences
! Manual data entry errors
! Inaccurate sales records
! Invalid insertion orders being processed
! Wrong ad creative loaded for delivery
! Improper ad modifications entered in the system
! Ad deliveries not recorded in the proper period
! Log files for ads delivered not reconciling with ads reported
! Over-reporting of ad transactions
Key Control Objectives -- Completeness, Accuracy, Validity & Restricted Access
Sales Process Cumbersome Due to Lack of Standard Insertion Order
Observed Weaknesses
VALIDITY
Format and content of the Insertion Order vary by participant.
Initial Insertion Orders are typically signed by the advertiser before the campaign is established, but revision Insertion Orders are often not signed before processing.
COMPLETENESS & ACCURACY
Revision Insertion Orders are not always linked to the original Insertion Order.
RESTRICTED ACCESS
Security controls vary by participant.
Expected Controls
VALIDITY
A formal, written, consistent I/O is used for all ad sales.
Sales are recorded only after a valid sales order or contract has been properly authorized.
Changes made to existing sales contracts are properly authorized by the customer and reviewed by appropriate company personnel.
COMPLETENESS & ACCURACY
Controls exist to ensure that valid sales are recorded once and only once.
Appropriate personnel review Insertion Orders for accuracy after they are established in the system.
RESTRICTED ACCESS
Access to Insertion Order data is restricted to appropriate users.
Sales personnel do not have the ability to execute campaigns in the delivery system.
Order Processing Controls Weaker for Revised Insertion Orders
Expected Controls
VALIDITY
Approval of the detailed Insertion Order should be obtained from both the client and internal management.
A clear policy and enforcement mechanism should exist for managing Insertion Order revisions.
COMPLETENESS & ACCURACY
Sales should create or review the Insertion Order directly in the delivery system to avoid errors due to manual data entry.
The order system should perform validation on data inputs including dates, guaranteed ad impressions, etc. to prevent errors.
A standard Insertion Order should be used for all ad sales.
RESTRICTED ACCESS
Access to input data should be restricted to appropriate individuals.
Observed Weaknesses
VALIDITY
Insertion Order format and content vary by participant.
The controls surrounding Insertion Order revisions are universally weaker than those surrounding initial Insertion Orders.
COMPLETENESS & ACCURACY
Many of the participants manually enter the Insertion Order data into the delivery system from the Insertion Order.
Many of the participants' order processing systems do not perform validations on key fields.
RESTRICTED ACCESS
Security controls vary by participant.
Management of General System Risks Often Informal
Expected Controls
SYSTEM MAINTENANCE
System changes are appropriately approved.
System changes are adequately tested.
Access to production is restricted to appropriate individuals.
SYSTEM SECURITY
Physical access is appropriately restricted.
Logical access is appropriately restricted.
SYSTEM OPERATIONS
A disaster recovery plan exists.
Data is backed up on a regular basis.
System performance/availability is monitored.
SYSTEM DEVELOPMENT
Projects are authorized and tracked.
Requirements are appropriately defined.
Controls are considered during development.
Observed Weaknesses
SYSTEM MAINTENANCE
Change management policies and procedures are often informal.
SYSTEM SECURITY
Strength and nature of controls vary significantly by participant and by system.
SYSTEM OPERATIONS
Many participants do not have a functional disaster recovery plan.
SYSTEM DEVELOPMENT
Control requirements are often left out of the development process.
Security Controls Vary by Participant
Expected Controls
VALIDITY
All log-generating servers are configured consistently, ensuring that proper naming conventions and logging are implemented for all servers.
COMPLETENESS & ACCURACY
Cache busting techniques are employed to ensure the capture of all user activity.
Controls exist to ensure the delivery system logs activity once and only once.
Controls exist to ensure that the delivery system logs activity in the proper period.
Campaigns are paced to achieve targets.
RESTRICTED ACCESS
Access to system resources and data is restricted using user groups and passwords.
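The cache-busting control mentioned above is commonly implemented by appending a unique random token to each ad request URL, so browser and proxy caches cannot serve a stored copy and the ad server sees, and logs, every request. A minimal Python sketch, with a hypothetical URL and parameter name:

```python
# Illustrative cache-busting sketch: a fresh random token per request
# makes each ad URL unique, defeating browser/proxy caches. The URL
# "ads.example.com" and the parameter name "ord" are hypothetical.
import random
from urllib.parse import urlencode

def cache_busted_url(base_url: str) -> str:
    """Append a per-request random token so cached copies are never reused."""
    token = random.randrange(10**12)
    return f"{base_url}?{urlencode({'ord': token})}"

url1 = cache_busted_url("https://ads.example.com/banner")
url2 = cache_busted_url("https://ads.example.com/banner")
print(url1 != url2)  # True (with overwhelming probability)
```

Without such a technique, repeat views served from a cache never reach the delivery system's logs, which is one of the sources of under-counting the study observed.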
Observed Weaknesses
VALIDITY
New servers are sometimes deployed and configured inconsistently.
COMPLETENESS & ACCURACY
Participants do not all support cache busting techniques.
RESTRICTED ACCESS
Security controls vary by participant.
Data Aggregation Exclusions Not Consistent
Expected Controls
VALIDITY
Error codes are excluded from metrics.
Non-user initiated activity (e.g. robots, spiders) is excluded from summarized activity.
Internal IP Addresses are excluded from aggregated data.
Automated checks identify corrupt or suspect data.
COMPLETENESS & ACCURACY
Collection controls ensure all logs are collected once and only once.
An audit trail is maintained that reports the number of files aggregated and lists the files that could not be processed.
RESTRICTED ACCESS
Access to logs is restricted to appropriate personnel to prevent tampering, loss or destruction.
Observed Weaknesses
VALIDITY
Robot filtering is not consistently used and involves various techniques.
Internal IP addresses are typically not filtered out of aggregated data.
COMPLETENESS & ACCURACY
Data retention policies differ by participant and may not satisfy audit or contractual requirements.
Participants are occasionally forced to restate previously reported activity due to corrupt or lost data.
Data recovery capabilities differ by participant.
Timing constraints or latency may cause reporting discrepancies.
RESTRICTED ACCESS
Security controls vary widely by participant.
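The "once and only once" collection control and audit trail expected above can be sketched as follows. This is an illustrative Python fragment under assumed conditions: log files are identified by a content digest to detect duplicates, and a simple decode check stands in for real parsing and validation.

```python
# Hypothetical sketch of a data-aggregation collection control: each
# log file is fingerprinted, processed at most once, and the run
# records how many files were aggregated and which ones failed.
import hashlib

def aggregate(log_files: dict[str, bytes], seen: set[str]) -> dict:
    processed, failed = [], []
    for name, data in log_files.items():
        digest = hashlib.sha256(data).hexdigest()
        if digest in seen:
            continue                     # already collected: skip duplicate
        try:
            data.decode("ascii")         # stand-in for parsing/validation
            seen.add(digest)
            processed.append(name)
        except UnicodeDecodeError:
            failed.append(name)          # corrupt file goes on the audit list
    return {"aggregated": len(processed), "could_not_process": failed}

seen: set[str] = set()
files = {"web1.log": b"impression,click", "web2.log": b"\xff\xfe corrupt"}
print(aggregate(files, seen))  # {'aggregated': 1, 'could_not_process': ['web2.log']}
print(aggregate(files, seen))  # {'aggregated': 0, 'could_not_process': ['web2.log']}
```

The audit output makes restatements visible: a file that could not be processed is reported rather than silently dropped, and a re-delivered file is never counted twice.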
Multiple Systems and Terms Increase Reporting Risk and Complexity
Expected Controls
VALIDITY
Reporting is based on the same standard metric definitions used during the sales process and included in the Insertion Order.
Estimates used in reporting are based on standard criteria and are periodically validated for reasonableness and accuracy.
COMPLETENESS & ACCURACY
Report functionality ensures that all data is summarized and presented for the specified period and advertiser.
Advertisers are notified when reporting data is available.
RESTRICTED ACCESS
Advertisers have direct and timely access to their data and only to their data.
Observed Weaknesses
VALIDITY
Participants do not always use the terms and definitions included in the Insertion Order for reporting purposes.
Estimates used in reporting vary and are not consistently validated for reasonableness and accuracy.
COMPLETENESS & ACCURACY
Participants generally report the data only for the specified period with some exceptions.
Advertisers are generally notified when reporting data is available.
RESTRICTED ACCESS
Most participants allow advertisers direct access to reports via a web interface; however, some participants continue to compile and distribute reports manually from single or multiple data sources.
Proposed Recommendations: Metrics & Definitions
Online Ad Measurement Study
Proposed Recommendations for Ad Delivery & Audience Measurement Metrics
Primary
Adopt standard definitions for the following metrics (include specific exclusions/inclusions and estimation criteria in these standard definitions):
1. ad impression -- Consider a definition that includes best practice filtering and logging attributes. Proposed definition: A measurement of responses from an ad delivery system to an ad request from the user browser, which is filtered for robotic activity and is recorded at a point as close as possible to the actual viewing of the creative material by the user browser.
2. click (through) -- Consider a definition that includes best practice filtering and logging attributes. Proposed definition: A measurement of the user-initiated action of clicking on an ad element, causing a re-direct to another web location. Tracked and reported as a 302 redirect at the ad server. This measurement is filtered for robotic activity and is recorded at a point as close as possible to the actual viewing of the destination web location by the user browser.
3. total visits -- Resolve whether the approaches to determining visitor counts can be addressed in one definition (i.e., cookies, user registration) and require disclosure of the definition. Resolve whether session time limits should also be included in the definition(s).
4. unique visitor -- After resolving the two issues related to the visitor definition, consider the additional issues for defining unique visitors, including the use of sampling and estimates, and the treatment (i.e., include or exclude visitors that do not accept cookies) of new cookies for cookie-based calculations.
5. page impressions -- Consider a definition that includes best practice filtering and logging attributes. Proposed definition: A measurement of responses from a web server to a page request from the user browser, which is filtered for robotic activity and error codes, and is recorded at a point as close as possible to the actual viewing of the page by the user browser.
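To illustrate the definitional choices in items 3 and 4, the sketch below counts visits from cookie IDs with a session time limit and unique visitors as distinct cookie IDs. The 30-minute timeout and the cookie-based approach are assumptions chosen for the example; the study deliberately leaves both open.

```python
# Hedged sketch of visit and unique-visitor counting. A new visit is
# opened when a cookie is seen for the first time or after a gap longer
# than the session timeout; uniques are distinct cookie IDs. The
# 30-minute limit is an assumed value, not one the study prescribes.
SESSION_TIMEOUT = 30 * 60  # seconds; assumed session time limit

def count_visits_and_uniques(events: list[tuple[str, int]]) -> tuple[int, int]:
    """events: (cookie_id, unix_timestamp) pairs, assumed time-sorted."""
    last_seen: dict[str, int] = {}
    visits = 0
    for cookie, ts in events:
        prev = last_seen.get(cookie)
        if prev is None or ts - prev > SESSION_TIMEOUT:
            visits += 1                  # gap exceeds the limit: a new visit
        last_seen[cookie] = ts
    return visits, len(last_seen)        # unique visitors = distinct cookies

events = [("A", 0), ("A", 600), ("B", 700), ("A", 5000)]
print(count_visits_and_uniques(events))  # (3, 2)
```

Changing only the timeout constant changes the reported visit total for identical activity, which is exactly why the recommendation asks the industry to settle session time limits inside the definition itself.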
Secondary
Disclose the recording technique used for all ad delivery and audience measurement metrics using the agreed-upon industry definitions. Suggested disclosure items include time definitions, the type of metric used, what is excluded, and when processes change.
Use a distinct term for reach estimates that are extrapolated from a sample of user activity rather than measured from the entire population.
Establish definitive periods for reporting (e.g., day, week, month, four-week period).
Consider adopting standard definitions for additional formats (e.g., email, rich media).
Recommendations: Processes & Controls
Online Ad Measurement Study
Proposed Recommendations for Processes and Controls
Develop a standard for cache-busting practices, including the responsibilities of the respective parties in a third-party serving arrangement.
Standardize exclusions such as filtering for robots/spiders and auto page refreshes.
Require inclusion of general system technology controls (e.g., security, data backup/retention procedures, change control practices) and other process controls.
Encourage the use of independent third parties to perform periodic reviews of compliance with the underlying processes used to generate reporting information.
Adopt a standard industry insertion order, including format, content, and standard terms & conditions.
Establish a clear policy and process for limiting and managing revision insertion orders.
Appendix
Online Ad Measurement Study
Appendix: Project Team
The study team included professionals specializing in the online advertising industry; study advisors and contributors were drawn from a variety of organizations.
PricewaterhouseCoopers
Study Team
Study Advisors / Contributors
Engagement Partners: Tom Hyland, Russ Sapienza Project Directors: Suzanne Faulkner, Pete Petrusky, Troy Skabelund, Matt McKittrick Team Members: Michael Hulet, Chad Fisher, Jee Cho, Justin Wright,
Nic Pacholski, Brianna Sorenson, Ying Li
Interactive Advertising Bureau
Greg Stuart
Robin Webster
Media Rating Council
George Ivie
Advertising Research Foundation
Jim Spaeth
Appendix: Organizational Profiles
The report has been conducted independently by PricewaterhouseCoopers on behalf of the IAB.
PricewaterhouseCoopers (www.pwcglobal.com), the world's largest professional services organization, helps its clients build value, manage risk and improve their performance. Drawing on the talents of more than 150,000 people in 150 countries, PricewaterhouseCoopers provides a full range of business advisory services to leading global, national and local companies and to public institutions.
PricewaterhouseCoopers' New Media Group was the first practice of its kind at a Big Five firm. Currently located in New York, Los Angeles, Boston, Seattle and the Bay Area, our New Media Group includes accounting, tax and consulting professionals who have broad and deep experience in the three areas that converge to form New Media: advanced telecommunications, enabling software and content development/distribution.
Services include: M&A assistance; tax planning and compliance; capital structuring; employee benefits and executive compensation packages; management consulting; business assurance services; Web advertising delivery auditing; and privacy auditing and consultation.
Founded in 1996, the Interactive Advertising Bureau (IAB) is the leading global online advertising industry trade association, with over 300 active member companies in the United States alone.
IAB activities include evaluating and recommending standards and practices, fielding research to document the effectiveness of the online medium and educating the advertising industry about the use of online and digital advertising.
Current membership includes companies that are actively engaged in the sales of Internet advertising (publishers), with associate membership including companies that support advertising, interactive advertising agencies, measurement companies, research suppliers, technology providers, traffic companies and other organizations from related industries.
The IAB is an expanding global organization with certified International chapters and Corporate members in Asia and the Far East, North and South America, Europe and Eastern Europe, South Africa and The Caribbean.
Appendix: Participating Organizations
The following companies, selected by the IAB, participated in the study:
Destination Sites
Portals
3rd Party Ad Servers and/or Networks