Varnish Introduces the Varnish Streaming Server

Varnish Software, a long-time leader in caching software and digital content delivery, recently unveiled a new package for streaming content, the Varnish Streaming Server. The Swedish-headquartered firm has provided streaming solutions for a number of years to various global companies, including Sky, RTÉ and Twitch. What is new is that these capabilities are now packaged into a single solution, which Varnish describes as “scalable, easy and flexible”.

The motivation behind the new package? The huge growth in streaming taking place worldwide. Cisco’s latest research report predicts that 82% of IP traffic will be video by 2021. Companies across industries and sectors (from universities to car makers) are seeing unparalleled growth in video as part of their content delivery strategy. This includes live video, which has become more important in many sectors, including healthcare, education and recruitment. Inside this rapidly growing landscape, Varnish wanted to make its services for over-the-top (OTT), video on demand (VoD) and live streaming as straightforward as possible.

The biggest challenges for streaming relate to quality and volume. Users expect seamless, rapid delivery at a high quality. Companies on their own will not be able to solve all the challenges associated with streaming, such as network bandwidth limitations or compression efficiency. However, there are some areas that are possible to control and it is these that Varnish is focusing its attention on. These include scalability of software, latency, resilience, flexibility, origin shield/backend protection, transparency and security.

Varnish has an edge in terms of its HTTP streaming capabilities because of its leading expertise in caching software and its uniquely flexible caching technology. In its own words, “HTTP is what Varnish was built for”. Its Streaming Server can be deployed in multiple ways to address the most persistent challenges related to streaming:

  • As a “standalone component” for delivering video, which can offer an efficient way to scale out a company’s platform
  • As a storage platform for delivering huge amounts of content/traffic from a single location resourcefully (high-volume VoD)
  • As an “origin shield” – when used in conjunction with a CDN to protect backend and content
  • As a complex policy and logic engine that enables essential services, such as geographically restricted content
  • As a performance engine that reduces latency by combining a pre-warmed cache with pre-fetch technology, ensuring content is delivered quickly.
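To make the “standalone component” deployment concrete, here is a minimal VCL sketch of Varnish caching video in front of an origin. The hostname, port, URL patterns and TTLs are illustrative assumptions, not values from Varnish:

```vcl
vcl 4.0;

# Hypothetical origin server; host and port are placeholders.
backend origin {
    .host = "origin.example.com";
    .port = "80";
}

sub vcl_backend_response {
    # Cache immutable video segments for a while, but keep
    # frequently changing manifests on a very short TTL.
    if (bereq.url ~ "\.(ts|m4s|mp4)$") {
        set beresp.ttl = 1m;
    } else if (bereq.url ~ "\.(m3u8|mpd)$") {
        set beresp.ttl = 2s;
    }
}
```

Scaling out then amounts to putting more such nodes in front of the same origin.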

Varnish Streaming Server has six critical features, which allow it to deliver the solutions listed above:

(i) Stores and serves huge amounts of content through the Varnish Massive Storage Engine (MSE), which provides a large amount of local cache storage. MSE is built for 100+ terabytes of storage on each node and has a persistent datastore, so the entire cache isn’t lost on restart.

(ii) Its Origin Shield protects your backend from excess traffic, allowing for solid streaming performance and a reasonable cache-hit rate, even during live streams. An extra layer of security is also built in, as all traffic passes through Varnish. Cache replication is built into your architecture to ensure that there is no single point of failure. Unlike CDNs, Varnish is able to act both as a content replication engine and as a protective layer against traffic floods, which reduces pressure on the origin and leads to greater resilience of your streaming service.
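Two standard Varnish mechanisms underpin the origin-shield pattern: Varnish coalesces concurrent requests for the same object into a single backend fetch by default, and grace mode lets it keep serving a stale object while one background fetch revalidates it. A hedged VCL sketch (the TTL and grace values are arbitrary assumptions):

```vcl
vcl 4.0;

backend origin {
    .host = "origin.example.com";  # placeholder origin
    .port = "80";
}

sub vcl_backend_response {
    # Fresh for 10 seconds; after that, serve the stale copy for up
    # to an hour while a single background fetch refreshes it, so
    # the origin sees one request per object instead of thousands.
    set beresp.ttl = 10s;
    set beresp.grace = 1h;
}
```

During a live stream, this means a spike of viewers requesting the same segment produces at most one origin fetch.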

(iii) Varnish Configuration Language (VCL) is highly customizable and flexible, allowing you to configure and control the logic of your content. This is done through additional Varnish modules (VMODs) and might, for instance, include geo-blocking at a country or even city level. VMODs can also be used for flexible rate limiting, filtering IP addresses and abuse suppression.
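As one example of VMOD-driven policy, the open-source varnish-modules collection includes vsthrottle for rate limiting. A minimal sketch (the limit of 15 requests per 10 seconds is an arbitrary assumption):

```vcl
vcl 4.0;

import vsthrottle;

backend origin {
    .host = "origin.example.com";  # placeholder origin
    .port = "80";
}

sub vcl_recv {
    # client.identity defaults to the client IP; deny any client
    # exceeding 15 requests within a 10-second window.
    if (vsthrottle.is_denied(client.identity, 15, 10s)) {
        return (synth(429, "Too Many Requests"));
    }
}
```

Geo-blocking follows the same pattern: a geolocation VMOD resolves the client IP in `vcl_recv` and the request is served or rejected accordingly.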

(iv) Prefetching content in VoD is available to “keep your cache warm” and boost performance by anticipating future needs. The http VMOD allows you to act predictively, anticipating which chunks of content a client will most logically request next and prefetching that content. When that prediction is accurate, latency is reduced because the delay between content request and delivery is eliminated.
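The idea can be sketched in VCL along the following lines. This is a hypothetical illustration: the URL scheme is invented, and the `http.*` function names are assumptions based on the Varnish Enterprise http VMOD and may differ from the actual API:

```vcl
vcl 4.0;

import http;  # Varnish Enterprise http VMOD (assumed)

sub vcl_backend_response {
    # Hypothetical scheme: when the playlist is fetched, issue a
    # background request back through Varnish for the first segment,
    # so it is already cached when the player asks for it.
    if (bereq.url == "/vod/master.m3u8") {
        http.init(0);
        http.req_copy_headers(0);
        http.req_set_url(0, http.varnish_url("/vod/seg_1.ts"));
        http.req_send_and_finish(0);
    }
}
```

The same pattern extends to fetching segment N+1 whenever segment N is requested.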

(v) Many streaming solutions on the market function like a “black box”, offering users no insight into how they work and no way to reconfigure the service to their needs. Varnish, by contrast, has prioritized transparency from the start. Compared to other caching solutions, Varnish has a “very verbose log output”: it writes to a circular buffer called the Varnish Shared Memory Log (VSL), which the user can access and read.
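VCL itself can write into the VSL through the standard std VMOD; a minimal sketch (the log message is an assumption):

```vcl
vcl 4.0;

import std;

backend origin {
    .host = "origin.example.com";  # placeholder origin
    .port = "80";
}

sub vcl_recv {
    # Emits a VCL_Log record into the shared-memory log, readable
    # alongside Varnish's own records with: varnishlog -i VCL_Log
    std.log("client requested: " + req.url);
}
```

Tools such as varnishlog and varnishstat read the same buffer, so operators can inspect every transaction without restarting or reconfiguring the server.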

(vi) Security features are built into the Varnish Streaming Server, including secure connections with TLS/SSL, even though SSL requirements vary by content type. This improves security on the Varnish platform, particularly over mobile. One of Varnish’s popular features is Varnish Total Encryption, which encrypts the entire cache in order to safeguard cached data against bugs. It also prevents cache leaks, “providing the lockdown the cache historically lacked”.
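In Varnish Enterprise, Total Encryption is reportedly enabled by including a bundled VCL file; the include path below is an assumption drawn from memory of Varnish’s documentation and should be verified against the current docs:

```vcl
vcl 4.0;

# Assumed Varnish Enterprise include: derives a unique encryption
# key per cached object, so cache bodies are stored encrypted and a
# leaked object cannot be decrypted with another object's key.
include "total-encryption/random_key.vcl";
```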

Varnish is pitching its Streaming Server at CDNs and all types of content provider, from corporations to media outlets to broadcast networks.

In its white paper, Varnish promises of its service, “We have helped our customers globally to build advanced, scalable and fast streaming solutions on their own terms through the whole lifecycle of the software: Design, feature development and enhancements, implementation and optimization. Varnish Streaming Server offers all the flexibility and performance to make the streaming experience high performance, robust and efficient while giving end-users what they want and expect.”
