
Quality assurance and the promise of ‘big data’

Quality assurance software for multiscreen and OTT can help deliver a better experience and retain subscribers, but is the transformative promise of ‘big data’ from IP systems oversold? Stuart Thomson reports.

Amid the rush to launch and develop multiscreen and over-the-top TV services, ensuring consistent – and consistently high – quality across such services remains something of a challenge.

Specialist technology suppliers have for some time been providing solutions to enable service providers to deliver better Quality of Service for multiscreen TV services, while broadcasters have also sought to improve catch-up and on-demand services delivered to the main screen via IP.

However, there is still uncertainty about how to make money from OTT services, and about the best business model for multiscreen delivery. That makes the case for investing in quality assurance technologies a tough call at times, particularly for technologies that monitor the experience of every user in a service provider’s subscriber base or footprint.

In this environment, there has been much talk recently about the promise of ‘big data’ – the totality of information about user behaviour and the performance of different services and delivery architectures that could be provided by quality assurance agents embedded in user devices at the player level. This, it is said, could help transform the way in which service providers evolve their offerings over time.

Business case

For multiscreen services offered by pay TV providers in particular, quality assurance itself is a key concern, but questions remain over how cost-effective it is for service providers to invest heavily in the technology.

It is useful to distinguish in this context between multiscreen or TV everywhere services that are delivered by a pay TV service provider over their own network, services of the same type that allow users to access the content while roaming, where it may be delivered over third-party networks, and ‘pure’ over-the-top services delivered by content providers or aggregators over the web to multiple devices.

“Normally, to deliver some form of transparency [to operators] is a cost saver, but it depends on how you calculate things. There is a difference between business cases for pure OTT, multiscreen and free-to-air,” says Mikaël Dahlgren, CEO of quality assurance provider Agama Technologies. “The business cases are also different in different parts of the world. For many quality assurance vendors the free-to-air market has not been a big area to focus on, because there is a straightforward business case for [quality assurance for] pay TV customers.”

For Simen Frostad, CEO of quality assurance specialist Bridge Technologies, the acquisition of systems designed to check quality of video after it has emerged from the operator’s headend and before it disappears into the cloud is now a “pretty mature” market.

However, he says, there is a distinction to be drawn between systems of this type and the ultimate step of installing agents on all consumer premises equipment.

“When you want to check individual devices the cost is still a concern,” says Frostad.

According to Yoann Hinard, professional services manager at quality assurance specialist Witbe, one of the main changes to have taken place over the last year or so is that service providers are now focusing more on the quality of delivery over third-party networks, using multiple transit partners and CDNs.

“Service providers wanted to make things work on their own network and were satisfied with best-effort on third-party networks. The main change is that they now rely on multiple CDN providers to deliver good quality on any kind of network,” says Hinard.

What to monitor

One of the key questions facing service providers is what exactly to monitor. The use of CDN providers implies a loss of control and also means that content owners and operators are paying a lot to CDN providers. Witbe’s customers, says Hinard, want to make sure that what comes out at the other end matches what is promised. “They want to check the CDN provider can deliver the necessary performance on the ISPs’ networks including peering, transit and all the things CDN providers can put on different networks to make it work. They don’t care how it is done but it must work,” he says.

Content and service providers can use quality assurance technology to monitor what is played on devices once the content has passed through the cloud. However, says Hinard, they are still some way off automating the process of choosing the best CDN to ensure optimal delivery of their content. “A lot of customers have tried automated switching but these solutions have not met their expectations that they could solve everything,” he says. “So they are working on monitoring the performance of each CDN individually and checking how that changes over time.”
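The approach can be illustrated with a short sketch: probe the same stream through each CDN, record the result, and track each provider’s error rate and responsiveness over time rather than switching automatically. The Python below is an illustration only, with hypothetical endpoint URLs; it is not Witbe’s implementation, and a real probe would measure far more than time-to-first-byte.

```python
# A minimal per-CDN monitoring sketch. The endpoint URLs are hypothetical;
# this illustrates the approach, not any vendor's product.
import time
import statistics
import urllib.request

# The same stream published through two hypothetical CDN partners.
CDN_ENDPOINTS = {
    "cdn-a": "https://cdn-a.example.com/live/channel1/master.m3u8",
    "cdn-b": "https://cdn-b.example.com/live/channel1/master.m3u8",
}

history = {name: [] for name in CDN_ENDPOINTS}  # samples accumulated over time

def probe(url, timeout=5.0):
    """Fetch the manifest once; return time-to-first-byte in seconds,
    or None on failure. A real probe would measure much more."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read(1)  # first byte has arrived
            return time.monotonic() - start
    except OSError:  # DNS failure, timeout, HTTP error, ...
        return None

def report(name):
    """Summarise one CDN: error rate and median TTFB over all samples."""
    samples = history[name]
    good = [s for s in samples if s is not None]
    return {
        "cdn": name,
        "error_rate": (len(samples) - len(good)) / len(samples),
        "median_ttfb_ms": 1000 * statistics.median(good) if good else None,
    }

# Monitor each CDN individually and watch how it changes over time,
# rather than switching between CDNs automatically.
for _ in range(3):
    for name, url in CDN_ENDPOINTS.items():
        history[name].append(probe(url))
    time.sleep(60)  # one sample per CDN per minute
for name in CDN_ENDPOINTS:
    print(report(name))
```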

In the best-case scenario, quality assurance technology can monitor what is being played out from the headend, what is being delivered by the CDN and what the user experience is on a selection of the most popular devices, says Hinard. Witbe will typically deploy a mix of protocol-layer probes and robots to monitor the experience on a sample of a range of devices.

“To understand customer Quality of Experience you have to look at the headend where you create your services because the service can only be as good as what you create,” says Johan Görsjö, director of product management at Agama. “But you have to look at the subscriber experience to understand what is the actual quality of what’s delivered. You have multiple CDNs. You could be using a hotspot or [be] in the home.”

For Görsjö, quality assurance measures should focus on the end points of the delivery chain – what comes out of the headend and what the experience is on the end user’s device. “The part in between doesn’t require an architecture tailored for each and every deployment because you have different ways of reaching end customers. You may use a number of CDNs and it could be difficult to mandate them to install monitoring equipment in their data centres,” he says.

Kirk George, marketing and strategy director at another quality assurance specialist, IneoQuest, says it has “virtualised” its monitoring probe, enabling it to monitor the performance of a service in a particular region even when the CDN over which content is delivered is provided by a third party – something it already does for customers.

“We notice errors and we have someone manning it 24/7 – from the metrics we create we can tell [our customer] what the problem is with their CDN provider,” he says. “We are looking for other service provider customers as they build their own CDN. Any service provider that builds their own CDN wants to have visibility all the way to the subscriber and they want to advertise to customers that they offer a better viewing experience.”

George says that IneoQuest is currently selling its technology to pay TV service providers, OTT providers and manufacturers alike, and is also seeing opportunities with mobile operators. “Mobile operators are the conduits for content but subscribers are leaving their network provider because they are blaming them for poor quality video on the network,” he says.

George says that, in the case of mobile providers, IneoQuest can measure at the level of individual cells.

Mariner Partners, whose customers for OTT quality assurance include French service provider Bouygues Telecom, is another vendor that emphasises its ability to deliver solutions flexibly via software. “This allows the operator to avoid relying on hardware technology,” says president and general manager Marc Savoie. “It leverages the fact that you can virtualise all of this…in the cloud and scale it up on-demand as you need it.”

Rather than relying on hardware-based probes placed as close as possible to the customer, software can enable all end points to be monitored for Quality of Experience. “Our agent sits on the set-top box with an MPEG-DASH client natively and data can be extracted on the quality of the video being delivered – although DASH right now is not widely deployed as a standard.”
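What such a player-level agent collects can be sketched generically: hook the player’s playback events and reduce them to a handful of Quality of Experience counters such as startup delay, rebuffering and bitrate switches. The event hooks in the Python sketch below are assumptions for illustration, not Mariner’s agent or any particular DASH client’s API.

```python
# A generic player-side QoE agent sketch. The event hooks are hypothetical;
# this is not Mariner's agent or any particular DASH client's API.
import time

class QoEAgent:
    """Reduces playback events to a handful of Quality of Experience counters."""

    def __init__(self):
        self.session_start = time.monotonic()
        self.first_frame_at = None   # used to derive startup delay
        self.rebuffer_count = 0
        self.rebuffer_seconds = 0.0
        self.bitrate_switches = 0
        self._stall_started = None
        self._last_bitrate = None

    # --- hooks the player is assumed to call ---
    def on_first_frame(self):
        self.first_frame_at = time.monotonic()

    def on_buffer_empty(self):
        self._stall_started = time.monotonic()

    def on_buffer_refilled(self):
        if self._stall_started is not None:
            self.rebuffer_count += 1
            self.rebuffer_seconds += time.monotonic() - self._stall_started
            self._stall_started = None

    def on_representation_change(self, bitrate_bps):
        if self._last_bitrate is not None and bitrate_bps != self._last_bitrate:
            self.bitrate_switches += 1
        self._last_bitrate = bitrate_bps

    def summary(self):
        """The record the agent would periodically post to the backend."""
        startup = (self.first_frame_at - self.session_start
                   if self.first_frame_at is not None else None)
        return {
            "startup_s": startup,
            "rebuffer_count": self.rebuffer_count,
            "rebuffer_s": round(self.rebuffer_seconds, 2),
            "bitrate_switches": self.bitrate_switches,
            "current_bitrate_bps": self._last_bitrate,
        }
```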

Savoie says that adoption of quality assurance for multiscreen video inside and outside the home has been faster than in the managed IPTV world, with cable and satellite TV providers also launching IP-based multiscreen services to complement their DVB-based broadcast offerings.

Development to execution

Added to the increasing complexity of the delivery chain that service providers must face is the fact that the time between the conception, development and execution of new services is becoming ever shorter as competition intensifies.

“The cycle of developing, operating and optimising the network is accelerating. The interfaces between them are fusing and the traditional segue between development and operations is disappearing,” says John Maguire, director of strategy and marketing for TV technology at quality assurance specialist S3 Group. From selling quality assurance solutions to development teams prior to services being launched, he says, S3’s technology is now used by operations teams. Maguire says that operations and development functions within service providers are increasingly fusing into ‘DevOps’, with product development specialists working alongside operations teams to launch new features and products on an ongoing basis.

This means that quality assurance technology providers have to cater to a wider range of users of their products and therefore have to tailor their user interfaces and the datasets they provide to the needs of individuals working with them, says Maguire.

According to Maguire, service providers are increasingly looking for ways to identify and predict problems rather than reacting to them after the event. Data from quality assurance technology can assist with this.

A focus on operations means that quality assurance must identify a wide range of problems: not only the quality of the picture being played out, but also whether the right catalogue appears in user guides on iPads and whether on-demand movies are correctly identified and registered so that they play out in response to a user’s request.

“People expect movies downloaded to their iPads to work. When we are sitting on the couch we have no idea of the complexity of everything that has to go right for that to work,” says Maguire. “We have to make sure it goes right at the packet level but we are focused on the full package of services.”

For Agama’s Dahlgren, successful service providers include those that manage to get different parts of their organisation to work together well, although it is difficult to generalise about how operators’ internal structures are evolving to accommodate the complex requirements of multiscreen and OTT delivery.

“In general we deal with different parties at different operators who are all organised in different ways. Those operators that are successful have a quite holistic view and good interconnection between departments,” he says. “You could have a small technical issue in production but then have a lot of customers experiencing problems because of it. It is very important to sort out these problems at an early stage so good communications between development teams and back-office teams and customer care is important. Sorting out problems through continuous improvement is important.”

For the quality assurance technology provider, the key is to provide information that is meaningful to each of these different teams.

Useful information

Quality assurance technology has to be deployed intelligently if it is to deliver useful ‘actionable’ information. Bridge’s Frostad points out that in the case of peer-to-peer delivery, commonly used for OTT services, it is meaningless to try to measure packet loss, as “errors are a normal state”. If a million devices are simultaneously used to watch a live stream, there will be a significant rate of failure. Alarm systems therefore have to be calibrated so that they remain meaningful to the staff monitoring them.

“Traditional alarming has to be rethought,” he says. Providers have to be able to distinguish between problems caused by the video they output from their headend and problems related to the network. If the latter, they have to be able to distinguish between problems related to their CDNs and those related – for example – to a cell tower in a mobile network.
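One way to rethink alarming along these lines is to judge each dimension of the delivery chain (a CDN, a device class, a cell tower) against its own recent baseline rather than against zero, so that a normal background level of errors never rings alarms. The sketch below uses invented thresholds and numbers purely for illustration.

```python
# Sketch of baseline-relative alarming: alert only when a dimension's
# failure rate rises well above its own normal level. Numbers invented.
from collections import deque

class BaselineAlarm:
    """Rolling failure-rate baseline for one dimension of the delivery
    chain (a CDN, a device class, a cell tower, ...)."""

    def __init__(self, name, window=288, ratio=3.0, floor=0.02):
        self.name = name
        self.samples = deque(maxlen=window)  # e.g. 24h of 5-minute samples
        self.ratio = ratio    # how far above baseline counts as abnormal
        self.floor = floor    # below this, "errors are a normal state"

    def observe(self, failed, total):
        rate = failed / total if total else 0.0
        baseline = (sum(self.samples) / len(self.samples)
                    if self.samples else rate)
        self.samples.append(rate)
        if rate > self.floor and rate > self.ratio * baseline:
            return f"ALERT {self.name}: {rate:.1%} vs baseline {baseline:.1%}"
        return None

# Separate trackers per dimension let staff tell a headend problem from
# a CDN problem from a single misbehaving cell tower.
cdn_a = BaselineAlarm("cdn-a")
for failed, total in [(210, 10000), (195, 10000), (1800, 10000)]:
    alert = cdn_a.observe(failed, total)
    if alert:
        print(alert)  # fires only on the third, abnormal sample
```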

For Mariner Partners’ Savoie, quality assurance technology can be put to multiple uses and must be tailored according to the requirements of different groups of users.

“Part of our value is in visualisation,” he says. Mariner has achieved success by focusing on the needs of operations teams within customers’ businesses, he says. By providing data and metrics that can be put to practical use by frontline customer care staff and field engineers, quality assurance technology providers can deliver real value, says Savoie.

For Bridge’s Frostad, making data provided by systems useable by operations staff is key. “Data visualisation is what we are calibrating,” he says. “We have the hard part done and we can feed data into central storage. Now we have to look at displaying operational data in real time.”

Frostad says that there is currently a mismatch between service providers’ internal processes and what needs to be done to make the most of the data on offer.

“OTT broadcasters do not have huge operational centres and visualisation walls. They need different tools and they have a different cost structure,” he says. “They need real-time tools. In normal broadcast situations you can sit and watch a monitor. But for OTT you need to understand whether it is the CDN that fails or the device that doesn’t perform as it’s meant to and be presented with more intelligent data that enables you to do stuff.”

The goal, as with traditional video service delivery, is to anticipate problems and head them off before they happen. Customers increasingly use social networks and forums to share their experience of problems, and monitoring this is key to any quality assurance regime. However, monitoring technology must be able to address issues before they get to this point, according to Agama’s Dahlgren.

“If you have a problem with your delivery it is going to be too late if you wait till you see it on social media. You are more dependent on technology for that. But if you want to know what customers think in general that type of social media information is very relevant,” he says.

Big data

In addition to providing useful information for development engineers and operations support staff, some providers of quality assurance technology have also evangelised the potentially transformative benefits of ‘big data’ offered by software agents that can be embedded in end-user devices at the player level. The wealth of data that quality assurance technology can potentially make available can, it is argued, be used by service providers to shape the way their services are packaged and provisioned, as well as to identify the types of devices and user experiences that will deliver growth.

But can such data be managed and usefully distilled by service providers and broadcasters?

Some technology providers certainly believe so. IneoQuest recently launched a version of its existing Audience Measurement Platform – initially released to provide viewership statistics to US switched digital video implementations – for adaptive bit-rate streaming, enabling service providers to gather viewership metrics across an unmanaged network with no set-top box.
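The general idea can be sketched independently of any product: because an adaptive bit-rate player must keep fetching media segments over HTTP for as long as it plays, counting the distinct clients requesting a channel’s segments in a time window approximates the concurrent audience, with no set-top box involved. The log records below are invented for illustration.

```python
# Illustrative only (not IneoQuest's Audience Measurement Platform).
# Estimate viewership from segment requests seen at a CDN edge or probe.
from collections import defaultdict

# Each record: (timestamp, client_id, requested_path).
requests = [
    (1000, "c1", "/live/sports1/seg_481.ts"),
    (1002, "c2", "/live/sports1/seg_481.ts"),
    (1003, "c1", "/live/sports1/seg_482.ts"),
    (1005, "c3", "/live/news/seg_913.ts"),
]

def audience_per_channel(records, window_start, window_end):
    """Distinct clients fetching a channel's segments in a window
    approximate its concurrent audience: an active player has no
    choice but to keep requesting segments while it plays."""
    viewers = defaultdict(set)
    for ts, client, path in records:
        if window_start <= ts < window_end:
            channel = path.split("/")[2]  # "/live/<channel>/seg_N.ts"
            viewers[channel].add(client)
    return {channel: len(clients) for channel, clients in viewers.items()}

print(audience_per_channel(requests, 1000, 1010))
# -> {'sports1': 2, 'news': 1}
```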

“Our probe technology monitors the output of a CDN,” says George. “It performs QoS [monitoring] and communication between devices, but now also viewer analytics.”

According to George, this enables service providers to monitor how content is being consumed and will enable them to better understand their subscriber base and personalise their content offering. George admits that “not all are ready” to make full use of the data on offer from quality assurance systems, although he says that IneoQuest has a customer that is using its data platform.

“In the linear world the data was not used in this way and they just monitored [the video quality], but the market has matured,” he says. “They are still trying to understand what you can do. We are customising data for different players. We can modify the UI to show what data they want – what are the best performing CDNs and video assets and how many devices there are in a particular region. This is all marketing data that allows service providers to optimise their infrastructure.”

For George, service providers face a growing challenge from the proliferation of devices – particularly in the Android ecosystem – that can receive video, often with multiple operating system versions on the same generation of device adding complexity. Data can enable them to fine-tune what they offer by providing the information that lets them focus only on the most popular devices – thus saving money.
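The cost-saving logic is simple enough to sketch: rank device models by their share of viewing sessions and concentrate testing on the smallest set that covers most of them. The device names and the 95% coverage target below are invented for illustration.

```python
# Sketch: find the smallest set of device models covering most viewing.
# Device names and the 95% coverage target are invented for illustration.
from collections import Counter

sessions_by_device = Counter({
    "android-model-a": 48000, "ios-model-b": 31000,
    "android-model-c": 12000, "smarttv-model-d": 6000,
    "android-model-e": 2500,  "other": 500,
})

def coverage_set(counts, target=0.95):
    """Rank devices by share of sessions; keep adding models until the
    chosen set accounts for the target share of all viewing."""
    total = sum(counts.values())
    covered, chosen = 0, []
    for device, n in counts.most_common():
        chosen.append(device)
        covered += n
        if covered / total >= target:
            break
    return chosen

# Test and optimise for these models first; the long tail can wait.
print(coverage_set(sessions_by_device))
# -> ['android-model-a', 'ios-model-b', 'android-model-c', 'smarttv-model-d']
```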

The same applies to mobile video, where IneoQuest is seeing growing interest in its technology, according to George. “Here, reducing latency involves understanding what [programmes] are most popular at that point,” he says. Consumption of mobile video could be influenced by what sports are playing on TV in a local bar, for example.

S3’s Maguire agrees that data is becoming increasingly important, but cautions against “throwing all big data into a big pot.” He says: “We are hopeful of combining data from our probes with [other sources] but we are taking it step by step. There are interesting opportunities in terms of the data sets we have available.”

Limitations

Others are more cautious still, and not everyone is convinced that ‘big data’ from end devices will fulfil all of the promises being made.

Bridge’s Frostad, in addition to his reservations about the cost of implementing quality assurance agents in end devices, points out that existing audience measurement providers, including Nielsen, already have sophisticated techniques for measuring online viewing.

Data based on modelling behaviour can be more meaningful to service providers – and therefore deliver a more accurate reflection of what’s actually happening – than raw data provided by quality assurance agents, he says.

“You have to live in the real world. You can measure this or that but you have to remember that the first priority [for quality assurance technology] is to gauge quality,” he says. “When we do that right we can think about other things. I don’t see [big data] driving this market.”

For Witbe’s Hinard, too, technologies that test each and every end-user device have proved disappointing, largely because this approach delivers a vast quantity of data that is, in aggregate, mostly useless to the service provider. “By monitoring each and every device you get a huge amount of data,” he says. “We have some deployments of this but what we have seen is that the data is very ‘noisy’. You have many Android devices that don’t report things in the same way because the chipsets are different. You can have metrics that could be collected in different ways.” The data can also be confusing because it offers no visibility into what the individual user is doing. Someone using peer-to-peer technology to access content while watching something could give one set of results, while someone else could be trying to watch something on a mobile device while riding the elevator or entering a subway.
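Part of what makes such data ‘noisy’ is that nominally identical metrics arrive under different names and units from different chipsets. The sketch below illustrates the kind of canonicalisation a backend would need before the numbers become comparable; all chipset and field names are invented.

```python
# Sketch of normalising device-reported metrics into one schema.
# Chipset names, field names and unit quirks are invented.

# Canonical metric -> (reported field, scale factor to milliseconds).
CHIPSET_SCHEMAS = {
    "chipset-x": {"startup_ms": ("startupTimeMs", 1.0)},
    "chipset-y": {"startup_ms": ("time_to_first_frame_s", 1000.0)},
}

def normalise(report):
    """Return a canonical record, or None when the device's reporting
    scheme is unknown: better dropped than silently mixed in."""
    schema = CHIPSET_SCHEMAS.get(report.get("chipset"))
    if schema is None:
        return None
    record = {"device": report.get("device"), "chipset": report["chipset"]}
    for canonical, (field, scale) in schema.items():
        if field not in report:
            return None  # an incomplete report is unusable too
        record[canonical] = report[field] * scale
    return record

raw = [
    {"device": "a", "chipset": "chipset-x", "startupTimeMs": 1800},
    {"device": "b", "chipset": "chipset-y", "time_to_first_frame_s": 2.1},
    {"device": "c", "chipset": "chipset-z", "boot_ms": 900},  # unknown scheme
]
print([normalise(r) for r in raw])  # the third report is dropped as None
```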

Operators are left with the ability to use data from end devices for statistical analysis of consumer usage patterns rather than active troubleshooting of problems. “For the marketing guys it is great, but not so useful for the engineering teams,” says Hinard. In fact, says Hinard, service providers can typically gather data from end users on their own, without the intervention of third-party quality assurance solutions, “because they control the app”. The true value of quality assurance technology, he suggests, is to ensure quality rather than to provide data to feed into marketing strategies.

Demand for quality assurance is likely to remain strong as customers’ expectations of video quality grow. Developments likely to accelerate this include demand for higher-resolution video over the web to feed the growing number of large, connected screens.

“Consumers’ expectations are rising and that is good for us,” says Mariner Partners’ Savoie. “4K being delivered to consumers will create another wave of bandwidth requirements, but we are starting to see better quality video on better screens in the home all the time.”

At the same time, the sheer breadth of the video distribution ecosystem means that service providers are finding it increasingly difficult – or unaffordable – to invest in systems designed to guarantee a uniform quality of experience to their end users, and some may be beginning to question in particular whether the transformative promise of ‘big data’ will match the hype.

For service providers and technologists alike, setting realistic goals and deploying technologies to meet them is likely to be at the centre of their thinking about quality assurance.

