1. Introduction SenseWrap EventCaster Television Statistics Paradigm Comparison AdScorer
Event Processing Applied to
Streams of TV Channel Zaps and
Sensor Middleware with Virtualization
PhD Defense
Pål Evensen
Department of Electrical Engineering and Computer Science
University of Stavanger, Norway
April 23rd, 2013
2. Outline
Introduction
Background
Project Context
Research Contributions
SenseWrap
SenseWrap
EventCaster
System Architecture
Esper and EPL
Television Statistics
Background
Implementation
Statistics
Deployment
Paradigm Comparison
Performance
Complexity
Conclusions
AdScorer
Overview
Experiments
Live Scoring Results
Summary
10. Altibox at home
• INTERNET: from 10 to 400 Mbit/s, both ways
• DIGITAL TV: choose between 150 TV channels, film rental, and more
• ALARM SERVICES: alarm monitoring centre; direct alarm connection to the fire brigade
• MOBILE PHONE SERVICES: same rate, no matter who you call
• IP TELEPHONY: no charges to landlines in Scandinavia
11. TV
• FILM RENTAL: choose from 1000 titles, now in HD too
• FOOTBALL: the whole Premier League and Tippeligaen (the Norwegian top football division)
• NEWS: up-to-date local news
• TV-GUIDE: a complete overview for your TV evening
12. Research Contributions
Two main topics:
1. Sensors in smart homes
• Middleware for hiding the heterogeneity of devices
2. Event processing applied to TV channel zaps
• Architecture for efficient processing of high volumes of events
• Evaluating programming models for stateful event processing
15. SenseWrap Features
• IP-enabling sensor devices
• Uniform interface to heterogeneous devices
• A “blueprint” for developers of smart home applications
• Automatic network configuration and service discovery
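The uniform-interface idea can be sketched in plain Java. All names below (VirtualSensor, ZigBeeThermometer, Client) are illustrative assumptions, not SenseWrap's actual API:

```java
// Sketch only: SenseWrap-style uniform access to heterogeneous sensors.
// Every name here is hypothetical, not SenseWrap's real interface.
interface VirtualSensor {
    String serviceName();  // name advertised via zeroconf service discovery
    double read();         // uniform reading, whatever the underlying radio
}

// A driver hides the protocol-specific details behind the interface.
class ZigBeeThermometer implements VirtualSensor {
    public String serviceName() { return "temp-livingroom"; }
    public double read() { return 21.5; }  // a real driver would poll ZigBee
}

class Client {
    // Clients depend only on VirtualSensor, never on ZigBee or Bluetooth.
    static double poll(VirtualSensor s) { return s.read(); }

    public static void main(String[] args) {
        System.out.println(Client.poll(new ZigBeeThermometer())); // prints 21.5
    }
}
```

Swapping ZigBee for Bluetooth then only means writing another driver; clients are unchanged.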
16. SenseWrap Architecture
Figure: SenseWrap architecture. Physical sensors connect over their native protocols (ZigBee, Bluetooth, etc.) to per-device drivers behind a gateway. SenseWrap exposes each device as a virtual sensor service (over UDP, TCP, or HTTP), which clients discover via zeroconf.
21. Event Processing Language (EPL)
• Declarative query language derived from SQL
• Operates on stream data as opposed to relational data
• Looks for event patterns that match the query, and produces an output event
• Includes additional operators, such as sliding windows
• Part of the Esper framework (Open Source)
23. Sliding Windows
Figure: a time window win:time(10 sec) sliding over a stream; at any instant, only the events (e1-e5) whose timestamps fall within the last 10 seconds are retained.

select * from ChannelWin.win:time(10 sec)
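What win:time(10 sec) maintains can be sketched in plain Java: events are kept only while their timestamps fall within the window span. Class and method names are illustrative, not Esper's:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of a time-based sliding window like EPL's win:time(10 sec).
class TimeWindow {
    private final long spanMillis;
    private final Deque<Long> events = new ArrayDeque<>();

    TimeWindow(long spanMillis) { this.spanMillis = spanMillis; }

    // Add an event arriving at 'now' (millis) and evict expired events.
    int add(long now) {
        events.addLast(now);
        while (events.peekFirst() < now - spanMillis) events.removeFirst();
        return events.size();  // number of events currently in the window
    }

    public static void main(String[] args) {
        TimeWindow w = new TimeWindow(10_000);
        w.add(0);                           // e1
        w.add(5_000);                       // e2
        System.out.println(w.add(12_000));  // e1 has expired; prints 2
    }
}
```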
24. Event Patterns
Figure: two simple events, A and B, are combined into one complex event, C.

select * from pattern [every a=A -> b=B]
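The followed-by pattern can be approximated in plain Java: every A that has been seen is completed by the next B, producing a complex event. This is a sketch of the idea, not Esper's actual matching engine:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of [every a=A -> b=B]: each pending A is completed by the next B.
class FollowedBy {
    static List<String> match(List<String> stream) {
        List<String> complexEvents = new ArrayList<>();
        int pendingA = 0;
        for (String event : stream) {
            if (event.equals("A")) {
                pendingA++;                  // a new 'every A' sub-pattern starts
            } else if (event.equals("B") && pendingA > 0) {
                for (int i = 0; i < pendingA; i++) complexEvents.add("C");
                pendingA = 0;                // all open A's completed by this B
            }
        }
        return complexEvents;
    }

    public static void main(String[] args) {
        // Two A's before the first B yield two complex events.
        System.out.println(FollowedBy.match(List.of("A", "X", "A", "B")).size()); // prints 2
    }
}
```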
26. Current State of TV Viewer Statistics
• Sample size is 0.045% of Norwegian television households
• Only 0.022% of American households are sampled
• Data collection requires specialized equipment
• Data is transferred once a day
27. Motivation
• IP-based set-top boxes (STBs) allow for more accurate statistics
• Ads can be evaluated on an individual basis
• Additional behavioral markers, such as mute and volume changes, allow for a better understanding of viewer behavior
28. The Future of Media Measurement
• Current TV measurement methods are not in line with new models of media consumption
• Gartner: the online/offline division of media will be replaced by measured/unmeasured by 2015
• STB data is increasingly being used to augment traditional TV ratings
29. Altibox Deployment of IPTV
• Over 320,000 STBs deployed in Norway and Denmark
30. Our Scenario
• Two-way communication
• Immediate transfer of zap, mute, HDMI, and volume events
31. Observed Events
• Channel change event (also called a zap event)
• HDMI status event: TV set on/off
• STB audio on/off event (mute)
• STB volume change event
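A minimal event model for these four types might look as follows. The field and class names are hypothetical, not the actual EventCaster schema:

```java
// Sketch of the four observed STB event types; all names are illustrative.
abstract class StbEvent {
    final String stbId;     // which set-top box emitted the event
    final long timestamp;
    StbEvent(String stbId, long timestamp) { this.stbId = stbId; this.timestamp = timestamp; }
}

class ZapEvent extends StbEvent {     // channel change
    final String toChannel;
    ZapEvent(String id, long ts, String toChannel) { super(id, ts); this.toChannel = toChannel; }
}

class HdmiEvent extends StbEvent {    // TV set on/off
    final boolean tvOn;
    HdmiEvent(String id, long ts, boolean tvOn) { super(id, ts); this.tvOn = tvOn; }
}

class MuteEvent extends StbEvent {    // STB audio on/off
    final boolean muted;
    MuteEvent(String id, long ts, boolean muted) { super(id, ts); this.muted = muted; }
}

class VolumeEvent extends StbEvent {  // STB volume change
    final int level;
    VolumeEvent(String id, long ts, int level) { super(id, ts); this.level = level; }
}

class EventDemo {
    public static void main(String[] args) {
        StbEvent e = new ZapEvent("stb-1", 0L, "TV2 Norge");
        System.out.println(e.stbId + " -> " + ((ZapEvent) e).toChannel);
    }
}
```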
32. Number of Zap Events per Day over a 15-Day Period
Figure: daily zap-event counts from the 31st through the 14th of the following month, ranging between roughly 2.0 and 2.6 million; the average is 2,212,097 events/day.
35. Viewers for TV2 Norge + TV2 HD, Saturday 21.05.2011, Separated by Zip Codes
Figure: viewer counts (0-60,000) from 12:00 to 22:00, shown as a total and split into zip-code ranges 0001-3999, 4000-5999, 6000-7999, and 8000-9999, annotated with the programs airing (news, sports, Giro d'Italia cycling coverage, films, and prime-time shows).
36. Viewer Statistics for TV2 Norge and NRK1, Saturday 21.05.2011
Figure: viewer counts (30,000-70,000) from 10:00 to 22:00 for TV2 Norge + HD and NRK1 + HD.
37. Annoyance Detection
• Aimed at detecting ads/programs that cause viewers to change channel
• Triggers an output event on a rapid drop in viewers on a particular channel
• Sliding-window algorithm, continuously comparing the current viewer count with the last-minute average
38. Annoyance Detector
Figure: dataflow of the annoyance detector; zap snapshots enter a one-minute time window (ZapSnap.win:time(1 min)), per-channel viewer counts are compared against avg(viewers), and a having-clause emits channelName and viewers on a match.

select channelName, viewers, avg(viewers)
from ChannelWin(viewers > 2000).win:time(1 min)
group by channelName
having viewers < avg(viewers) * 0.85
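The having-clause condition reduces to a simple predicate, sketched here in plain Java; the window bookkeeping and per-channel grouping that Esper performs are omitted:

```java
// Sketch of the annoyance condition: a channel with more than 2000 viewers
// is flagged when its current count drops below 85% of its 1-minute average.
class AnnoyanceRule {
    static boolean triggers(int viewers, double oneMinuteAvg) {
        return viewers > 2000 && viewers < oneMinuteAvg * 0.85;
    }

    public static void main(String[] args) {
        System.out.println(AnnoyanceRule.triggers(3000, 4000)); // prints true
        System.out.println(AnnoyanceRule.triggers(3900, 4000)); // prints false
    }
}
```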
39. Deployment Details
• Processes over 40,000 events/min at peak hours
• Deployed in the Altibox network across four servers: a ZapCollector, a message bus, a database, and a Core + Manager node
47. Memory Consumption
Figure: memory used (MB) when loading from memory vs. from disk. Esper: 1,570 MB (from memory) and 1,401 MB (from disk); Java: 590 MB and 577 MB.
48. Performance Evaluation - Observations
• The Java implementation outperforms the Esper version
• This can be attributed to the abstraction overhead imposed by the Esper engine
• The performance margin decreases as network overhead is introduced, but remains significant
• In the UDP benchmarks, the Java version offers 45% higher throughput
49. Halstead's Metrics
• Metrics that describe the complexity of a program and the effort required to write it
• Treat a program as a series of tokens, each classified as either an operand or an operator
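From those token counts the basic Halstead measures follow directly; vocabulary and length use the slide's notation, and the volume formula V = N · log2(n) is the standard one. A small sketch:

```java
// Halstead's basic measures from operator/operand token counts.
class Halstead {
    static int vocabulary(int n1, int n2) { return n1 + n2; }  // n = n1 + n2
    static int length(int N1, int N2) { return N1 + N2; }      // N = N1 + N2

    // Program volume V = N * log2(n)
    static double volume(int N1, int N2, int n1, int n2) {
        return length(N1, N2) * (Math.log(vocabulary(n1, n2)) / Math.log(2));
    }

    public static void main(String[] args) {
        // 10 unique operators, 6 unique operands; 20 + 12 total tokens:
        // mathematically V = 32 * log2(16) = 128
        System.out.println(Halstead.volume(20, 12, 10, 6));
    }
}
```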
50. Complexity Metrics
Figure: Halstead metrics (vocabulary n1+n2, program length N1+N2, total operators N1, total operands N2, unique operators n1, unique operands n2) and lines of code, on a 0-400 scale, for the Java and EPL implementations, broken down by component: viewer statistics, annoyance detector, parsing and query setup, and utility functions.
51. Complexity Evaluation - Observations
• The EPL implementation scores slightly better for the viewer statistics application
• EPL scores significantly better for the annoyance detector
• The initial effort was similar for both applications
• Expanding the EPL applications is significantly easier than expanding the Java ones
52. Summary
• The Java implementation is the most performant
• The Esper implementation is easier to expand
• The difference can be attributed to the different degrees of abstraction
54. Deployment
Figure: AdScorer deployment in the TV network. STBs emit channel-zap, mute, and ad-start events; queueing input adapters perform filtering and transformation, and the AdDetector and AdSuccessEvaluator combine the live stream with historical statistics to produce an ad success score.
55. Defining an Advertisement as a Context in EPL

Listing 1: Context declaration

create context AdBreakCtx as
  initiated by
    AdIdentified(begin=true) as ad
  terminated by
    AdIdentified(detectId=ad.detectId, begin=false) as endAd

Listing 2: Populating the context

context AdBreakCtx
create window STBsnapshots.win:keepall() as STBWin
insert where
  channel in (context.ad.channel) and hdmi=true and mute=false
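The filter in Listing 2 amounts to a simple admission predicate, sketched here in plain Java; Esper's context machinery and named-window handling are omitted, and the names are illustrative:

```java
// Sketch of Listing 2's filter: an STB snapshot counts toward an ad break
// only if it is tuned to the ad's channel, the TV is on (HDMI), and audio
// is not muted.
class AdBreakFilter {
    static boolean admit(String snapshotChannel, String adChannel,
                         boolean hdmiOn, boolean muted) {
        return snapshotChannel.equals(adChannel) && hdmiOn && !muted;
    }

    public static void main(String[] args) {
        System.out.println(AdBreakFilter.admit("TV2 Norge", "TV2 Norge", true, false)); // prints true
        System.out.println(AdBreakFilter.admit("NRK1", "TV2 Norge", true, false));      // prints false
    }
}
```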
56. Experimental Setup for AdScorer Evaluation
• 1.5 hours of prime-time TV were recorded
• Advertisement times were manually recorded
• 23 days of STB data were sent through the system in order to reach a correct state
• The logged STB data and AdIdentified events were then pushed to the system
• The simulation was then run
57. Viewer numbers
Figure: Viewership (in thousands) for the three largest channels, NRK1,
TV2, and TVN.
58. Viewer numbers
Figure: Viewership (in thousands, roughly 12-21) for a single channel (TVN), with S and E markers along the time axis.
61. Raw output for a single ad

Event: {
  AdId: Mitsubishi ASX,
  channel: TV2 Norge,
  startTime: 22:55:12,
  stopTime: 22:55:32,
  viewersBegin: 83846,
  retained: 79244,
  IAR: 94.51,
  mutes: 141,
  viewersLost: 4602,
  average volume: 49.64
}
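The IAR and viewersLost fields in the output above are consistent with retained viewers expressed as a percentage of viewers at ad start; that interpretation is an inference from the numbers shown, sketched below:

```java
// Sketch: derive IAR and viewersLost from the raw counts in the event above.
class AdMetrics {
    // Retained viewers as a percentage of viewers at ad start, two decimals.
    static double iar(int retained, int viewersBegin) {
        return Math.round(retained * 10000.0 / viewersBegin) / 100.0;
    }

    static int viewersLost(int viewersBegin, int retained) {
        return viewersBegin - retained;
    }

    public static void main(String[] args) {
        System.out.println(AdMetrics.iar(79244, 83846));         // prints 94.51
        System.out.println(AdMetrics.viewersLost(83846, 79244)); // prints 4602
    }
}
```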
62. Summary
• AdScorer is capable of delivering an unprecedented level of detail, not possible under the current measurement regime
• The results indicate that our implementation is capable of scoring advertisements on multiple channels simultaneously, in near real-time, with consistent results