February 15, 2022

Changing the Game: How Second Screens are Transforming the In-Stadium Experience

By Todd Erdley | Founder and President at Videon

By incorporating ultra-low latency (ULL) streaming into the stadium experience, every seat in the house can offer an immersive experience.

The sports broadcasting industry is hungry for new ways to monetize and drive engagement among fans, and we are on the brink of a revolution in in-stadium experiences. To understand the transformative effect latency can have on that experience, it helps to walk through a real-world example.

Latency delays don’t win championships

Say you and your best bud drive out to Indianapolis to experience the National Championship between UGA and Alabama. Your friend graduated from Alabama, and you from Georgia, but somehow you still manage to break bread together. The stadium offers a novel experience: log in on your smart device and stream replays, stats, and behind-the-scenes footage from any camera in the complex. UGA scores, again, and you feel the 41-year title drought screeching to a halt. You decide to replay the touchdown to gloat to your buddy, but it takes 40 seconds to load. The moment's over, and you don't feel the rush quite as much as you could have. The novelty is just that: a novelty.

Now, what if that delay was less than a second? Oh, the face-rubbing glory. Oh, the opportunity to live in the moment experiencing the game in a manner you never could have conceived.

Forty-second latency in over-the-top streaming content is fairly common in the industry, but common doesn't win championships. As more people use second-screen devices at live events, the delay caused by latency creates frustration and leads users to disengage.
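To see where a 40-second delay comes from, it helps to add up the stages of a typical streaming pipeline. The figures below are assumed, illustrative values for a legacy HLS workflow versus an ultra-low latency chunked workflow, not measurements of any specific system:

```python
# Illustrative glass-to-glass latency budgets, in seconds.
# All numbers are assumed typical values for the sake of the arithmetic.
legacy_hls = {
    "capture_and_encode": 2.0,
    "segmentation": 10.0,       # one full 10 s segment must finish before publish
    "cdn_propagation": 2.0,
    "player_buffer": 30.0,      # players commonly buffer ~3 full segments
    "decode_and_render": 0.5,
}

ull_chunked = {
    "capture_and_encode": 0.2,
    "segmentation": 0.2,        # sub-second chunks published as they are encoded
    "cdn_propagation": 0.2,
    "player_buffer": 0.3,
    "decode_and_render": 0.1,
}

def total(budget):
    """Sum the latency contributions of each pipeline stage."""
    return sum(budget.values())

print(f"Legacy HLS:        ~{total(legacy_hls):.1f} s")
print(f"ULL chunked flow:  ~{total(ull_chunked):.1f} s")
```

The dominant terms in the legacy budget are segment duration and player buffering; shrinking the publishing unit from whole segments to sub-second chunks is what collapses the total from tens of seconds to around one.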

Disengagement by fans leaves a lot of money on the table. At Videon, we want to help drive engagement and in doing so, drive monetization. We are able to do so through EdgeCaster.

Moving production workflows out of the cloud

So let’s say that you are producing the National Championship. There will be 100,000 fans in the stadium, and you need second-screen viewing that performs as well as, or better than, traditional broadcast feeds.

This is where EdgeCaster comes in. Up to this point, video encoders have just been video encoders, but EdgeCaster is much more. Because it is an edge compute device with an encoder, it handles the critical production-workflow steps that would otherwise run in the cloud and create delay. That is why we say we’ve moved functions like packaging from the cloud to the point of video origination: to the edge of the edge. Since EdgeCaster enables player-ready video, the cloud needs to do nothing but deliver that video at the lowest latency possible.

How it Works

When we say that we’ve created an edge compute device, we mean that EdgeCaster with LiveEdge® has the computing power and programming to format data and send it to the cloud ready for delivery. By using cutting-edge technology like chunked transfer for HTTP workflows, SRT, or WebRTC, LiveEdge® does the work that would normally cause latency in the cloud.

EdgeCaster with LiveEdge Streaming does not force you into a specific solution; it enables a wide range of solutions, like a Swiss Army knife. When it comes to ultra-low latency delivery, everyone has an opinion, so we give you the power to choose. Perhaps you want the lowest latency possible with WebRTC. Perhaps you want SRT. Or perhaps you want chunked transfer over HTTP. EdgeCaster and LiveEdge put that choice in your control, to use when you want and how you want. And if you really want choice, you can use multiple output formats at the same time. A Swiss Army knife with both the scissors and a blade extended might be awkward, but EdgeCaster with LiveEdge streaming supporting WebRTC and chunked transfer HTTP at the same time is anything but clunky; it is fantastic.
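The chunked-transfer option mentioned above rests on a standard HTTP mechanism worth seeing concretely. The sketch below (illustrative function names and placeholder payloads, not Videon code) frames a sequence of CMAF-style fragments with HTTP/1.1 chunked transfer encoding, which is what lets a CDN and player start pulling a segment before the encoder has finished producing it:

```python
def chunked_body(parts):
    """Frame an iterable of byte strings using HTTP/1.1 chunked transfer
    encoding (RFC 9112, section 7.1): each chunk is its size in hex,
    CRLF, the data, CRLF; a zero-size chunk terminates the body."""
    for part in parts:
        yield b"%x\r\n" % len(part) + part + b"\r\n"
    yield b"0\r\n\r\n"

# A 4 s CMAF segment modeled as eight 500 ms moof+mdat fragments
# (placeholder bytes). Each fragment can be flushed downstream the
# moment it is encoded, instead of waiting for the whole segment.
fragments = [b"moof+mdat:%d" % i for i in range(8)]
wire = b"".join(chunked_body(fragments))
```

Because the player receives the first chunk a few hundred milliseconds after capture, its buffer can shrink from multiple segments to a fraction of one, which is the core of the latency savings.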

But that’s not enough.

We also make sure that the content is “cloud-friendly”. That means we work with our cloud partners to ensure all of the kinks are out of the system. A lot of work goes into optimizing the Common Media Application Format (CMAF), and our CMAF output has been fully integrated and tested with AWS, Fastly, Akamai, and other CDN partners that accept HLS/DASH delivery. Likewise, a lot of work is required to optimize WebRTC with partners like Phenix and Red5Pro. In short, ultra-low latency second-screen experiences will not disappoint.

Workflow Using AWS for Stream Delivery

The Outcome

Video streamed using our technology can arrive at the second-screen device faster than broadcast. That might not mean much in your living room, but it matters enormously when you want to experience a live event in person with a second-screen experience that creates moments never before considered. Enhancing the audience experience will be a game-changer, resulting in an explosion of engagement and mind-blown fans who feel more involved than ever.

In the meantime, check out the latest news.


