3. AGENDA
Background
HTTP Streaming - where we came from and where we are
Varnish and Streaming
Two types of streaming and how they are different
Live Streaming
OTT Streaming
4. Media Streaming on HTTP - Why?
HTTP is the closest thing we have to a universal transport protocol, and its content is cacheable
The web is full of audio and video media besides static objects such as images
High latency = a global audience with a content origin located far away
Far away means slow. Users have no patience; they want the content NOW!
Solution: break the media file into smaller objects and provide a manifest (playlist).
This allows HTTP caching to ensure an efficient delivery infrastructure
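The manifest is typically a playlist such as an HTTP Live Streaming (HLS) .m3u8 file. A minimal hypothetical example (the segment names and durations are illustrative, not from the deck) listing three .ts segments:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXT-X-ENDLIST
```

Each segment is an ordinary HTTP object, which is what makes the whole scheme cacheable.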
5. September 5, 1995
First live stream event: ESPN SportsZone internet radio streams a baseball game using Progressive Networks technology.
Mid 2000s
The vast majority of Internet traffic was HTTP-based and content delivery networks (CDNs) were increasingly being used. Move Networks introduces HTTP-based adaptive streaming.
Mid 2010s
Facebook, YouTube, Netflix and Spotify have mobile users as their primary market segment. Most content on the Internet is media (audio/video). Cloud solutions and hybrid CDN usage dominate the content delivery space.
2020?
Video rules the world and the amount of existing media is enormous. Over 80% of content on the Internet will be streamed media.
7. Streaming video accounts for over two-thirds of all internet traffic, and this share is expected to jump to 82% by 2020, according to Cisco's June 2016 Visual Networking Index report.
8. Distribution Infrastructure
Principles for HTTP streaming
[Diagram: Audio/Video input → Media Encoder → Stream Segmenter → Origin Web Server (index file + .ts segments) → HTTP Client]
An origin web server (e.g. Adobe Media Server) is quite slow at managing each request; it would require a large number of servers to deliver to a wide, global audience.
9. Distribution Infrastructure
Principles for HTTP streaming
[Same media server diagram as slide 8]
This provides an excellent opportunity for HTTP caching!
10. Distribution Infrastructure
Principles for HTTP streaming with Varnish
[Diagram: Audio/Video input → Media Encoder → Stream Segmenter → Origin Web Server → Varnish Server → HTTP Client]
A Varnish server would take care of all transactions and deliver instantly. As an additional bonus it acts as an efficient origin shield, since very few requests would reach the origin web server.
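The caching policy behind this setup can be sketched in VCL. This is a minimal, hypothetical configuration (the origin address and the TTL values are assumptions, not from the deck): playlists change as new segments appear, so they are cached briefly, while the immutable .ts segments are cached for much longer.

```vcl
vcl 4.0;

backend default {
    .host = "origin.example.com";   # hypothetical origin web server
    .port = "80";
}

sub vcl_backend_response {
    if (bereq.url ~ "\.m3u8$") {
        # Playlists are rewritten as new segments are produced: cache briefly.
        set beresp.ttl = 1s;
    } elsif (bereq.url ~ "\.ts$") {
        # Segments are immutable once written: cache for a long time.
        set beresp.ttl = 1h;
    }
}
```

With rules like these, nearly all segment requests are served from cache and only a trickle of misses reaches the origin.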
11. 2008
Globo (2nd largest commercial TV network)
Streams Big Brother Brasil online, reaching millions of users with multiple 24/7 broadcast streams.
Ustream, now IBM Live Video
Uses Varnish to cache live streams and for its GeoIP recognition capabilities.
2009
Consortium of public broadcasters in Germany (ARD)
Increases capacity of an Apple HLS linear stream server by delivering video over IP with Varnish.
2012
BBC, London Olympics: "Varnish is very performant when it comes to live streaming."
2014
Star TV (India), using Accenture Video Solution
25 million video views with Varnish during an India-Pakistan cricket match.
2016
Varnish in use for HTTP Media Streaming
12. Two Types of Streaming
Live Streaming (Linear)
Streaming of a live event where all viewers watch the same event (in one or a few formats). Dataset is very limited.
OTT Streaming
Streaming videos on demand from a huge catalogue of titles. Dataset is massive.
14. Adapting Varnish for HTTP Media Streaming
Varnish 4 introduces separated frontend and backend requests and many other streaming-related improvements.
Basic live streaming can be done out of the box with Varnish Cache and works well for small-scale projects and tests; at large scale, its shortcomings become visible.
Varnish Plus builds on the solid base of Varnish Cache and adds all the necessary additional pieces to the media streaming puzzle.
What can it do? 6,000 HD streams at 3 Mbit/s H.264 and up to 80 Gbps of throughput per server.
16. Developing Varnish Plus for HTTP Media Streaming
2008 - Varnish 2 - Objects need to be fetched entirely before being served. Clients had to wait for Varnish to receive the full object.
2011 - Varnish 3 - One client can stream content, but other clients need to wait.
2011 - Varnish 3.0.2 s+ branch is developed for Comcast, Facebook, Brightcove and Tidal. Fetch&Pass allows 1...n clients to stream content; it is required to support live streams with multiple clients.
2012 - Varnish 3.0.3 plus branch stabilized for SFR, Hostworks and Dyn. Includes Streaming and Persistence, the pillars of media content caching, especially for mobile.
2013 - Varnish Plus is launched - Streaming and basic persistent storage featured and stable. Features world-class support plus Varnish enhancement software and admin/monitoring tools.
2014 - Varnish 4 - Streaming is supported out of the box. Persistent storage is deprecated.
2014 - Varnish Plus introduces Massive Storage Engine (MSE) & Varnish High Availability (VHA).
2015 - Varnish Plus adds SSL/TLS support on both the HTTP backend and client side.
2016 - Varnish Plus introduces Massive Storage Engine 2 with Persistence.
17. Varnish High Availability (Peer replication)
In live streaming, all chunk requests focus on the few "current" video segments, making for a very predictable caching strategy.
VHA leverages this by replicating chunks across the whole caching layer with no additional load on the origin.
The greater the number of cache servers, the greater the savings in origin bandwidth and in computing and network resources.
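A back-of-the-envelope sketch of those savings (the bitrate and server count are illustrative assumptions, not figures from the deck): without replication every cache server fetches each new segment from the origin on its first miss, while with peer replication the origin serves each segment roughly once.

```python
# Illustrative VHA origin-bandwidth estimate (all figures are assumptions).
SEGMENT_BITRATE_KBPS = 2400   # one 720p rendition of the live stream
NUM_CACHE_SERVERS = 20        # size of the caching layer

# Without peer replication: each cache server pulls its own copy
# of every new segment from the origin.
origin_kbps_without_vha = SEGMENT_BITRATE_KBPS * NUM_CACHE_SERVERS

# With peer replication: the origin serves each segment about once,
# and the caches distribute it among themselves.
origin_kbps_with_vha = SEGMENT_BITRATE_KBPS

savings = 1 - origin_kbps_with_vha / origin_kbps_without_vha
print(f"Origin bandwidth without VHA: {origin_kbps_without_vha} Kbps")
print(f"Origin bandwidth with VHA:    {origin_kbps_with_vha} Kbps")
print(f"Savings: {savings:.0%}")  # 95%
```

The savings scale with the number of cache servers, which is the point made above.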
20. Varnish High Availability
To learn all about VHA, don't miss Francisco's talk at 14.10!
[Diagram: Client → Varnish 1 / Varnish 2 (VHA replication between peers) → Backend Cluster]
21. Live Streaming with Varnish Plus
Varnish Plus provides:
Varnish High Availability (peer replication)
Real-time statistics, admin & monitoring tools
"Unlimited dataset": Massive Storage Engine
Unbroken SSL/TLS chain back to origin servers
Supported by core developers
Stable versions (no forced upgrades)
22. How to size your live streaming infrastructure
Streaming requirements example: regular 24-30 fps movie
Varnish server with 2x10Gb interfaces for a 20Gb bonded link

Resolution (p)   Bitrate (Kbps)   Theoretical concurrent streams per server
240              400              50,000
360              800              25,000
480              1,200            16,667
720              2,400            8,333
1080             4,800            4,167
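The table follows from a single division: theoretical concurrent streams ≈ link capacity / per-stream bitrate. A small sketch reproducing the numbers above, assuming the 20 Gb bonded link from the slide:

```python
# Theoretical concurrent streams = link capacity / per-stream bitrate.
LINK_KBPS = 20_000_000  # 2x10Gb bonded link, expressed in Kbps

# Typical bitrate per resolution, as given on the sizing slide.
bitrates = {240: 400, 360: 800, 480: 1200, 720: 2400, 1080: 4800}

for resolution, kbps in bitrates.items():
    streams = round(LINK_KBPS / kbps)
    print(f"{resolution}p @ {kbps} Kbps -> {streams:,} concurrent streams")
```

"Theoretical" is the operative word: real deployments lose capacity to protocol overhead and uneven load.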
24. What about OTT? Over-The-Top Content
Distribution Infrastructure
[Diagram: Media Storage Server → Origin Web Server (index file + .ts segments) → Varnish Plus Server → HTTP Client]
OTT requires secure HTTP transport (client- and backend-side SSL/TLS), bandwidth-saving strategies (VHA peer replication) and a solution for managing the size of the video content library, which means a cache in the hundreds of TBs (MSE with persistence can handle 100+ TB caches).
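To see why an OTT catalogue pushes the cache into the hundreds of terabytes, here is a rough sizing sketch (catalogue size and title duration are illustrative assumptions; the bitrate ladder is the one from the sizing slide):

```python
# Rough OTT catalogue size estimate (catalogue figures are assumptions).
TITLES = 20_000           # titles in the catalogue
AVG_DURATION_S = 90 * 60  # ~90 minutes per title

# Each title is stored in every rendition of the ABR ladder.
ABR_LADDER_KBPS = [400, 800, 1200, 2400, 4800]

total_bits = TITLES * AVG_DURATION_S * sum(ABR_LADDER_KBPS) * 1000
total_tb = total_bits / 8 / 1e12
print(f"Catalogue size: {total_tb:.0f} TB")  # Catalogue size: 130 TB
```

Even this modest catalogue lands at roughly 130 TB, which is why MSE-scale persistent storage matters for OTT.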
26. OTT Streaming with Varnish Plus
Varnish Plus provides:
Varnish High Availability (peer replication)
Real-time statistics, admin & monitoring tools
"Unlimited dataset": Massive Storage Engine
"Persistent dataset": Massive Storage Engine
Unbroken SSL/TLS chain back to origin servers
Supported by core developers
Stable versions (no forced upgrades)
Know-how from experts who have engineered such solutions before
30. Other companies using Varnish for Streaming
Viaplay
Tidal
SoundCloud
Twitch
YouSee
SFR
Bell Canada
Echostar
Dish Network
Sling Box TV
RTE
Deutsche Telekom
Vimond
Altibox
Freesat
Multichoice
Sky Cable Philippines
RTS
Globo
Videotron
Amedia
MTV Europe
Blizzard
Canadian Broadcast Corp
Cinesoft
Vimeo
Eurosport
TF1
France Television
NRK
BBC
ESPN
NBA
The New York Times
Stackpath/MaxCDN
Cachefly
32. What does the future look like?
By 2020, an estimated 80% of all traffic on the Internet will be video.
OTT, and more specifically video media, has been, is and will continue to be a major factor driving decisions for future versions of Varnish.
Roadmap / Plans:
Pre-fetching (Q3)
Video Dashboards - Real Time Statistics
Better support for HTTP Long Polling
33. What we are cooking in the Varnish Lab
MSE3
Improve start-up time
Better I/O utilization
Use Hugepages (on Linux)
Removing transient storage
A better-than-LRU eviction algorithm
VHA.next
Supporting POST request replication
35. Streaming on HTTP
The technology had a huge impact because it allowed streaming media to be distributed far and wide using
CDNs (over standard HTTP) and cached for efficiency, while at the same time eliminating annoying
buffering and connectivity issues for customers.
Other HTTP-based adaptive streaming solutions soon followed:
Microsoft Smooth Streaming, 2008
Netflix, 2008 (developed its own technology)
Apple HTTP Live Streaming (HLS), 2009
Adobe HTTP Dynamic Streaming (HDS), 2010
MPEG-DASH, 2014