Artificial Intelligence: Transforming the Live Sports Landscape
PFT Blog Team | 06 Sep 2019


By Adrish Bera, Senior Vice President, AI and Machine Learning, OVP and Analytics

Sports broadcasters and streaming platforms are always looking for new ways to engage fans and to deliver immersive experiences that bring them closer to the real-time action. To gain speed and efficiency, and to create new revenue opportunities, live sports producers are now exploring innovative technologies, with Artificial Intelligence (AI) and Machine Learning (ML) at the forefront.

Today, advanced AI-led solutions are capable of identifying and extracting metadata for specific game objects, constructs, players, events and actions. This aids in near real-time content discovery and helps lead viewers to the content most relevant to them. Such solutions can also create sports highlight packages based on the events taking place in a game as well as what viewers want to see. AI and ML play a vital role in achieving unprecedented efficiency in sports production, boosting viewership and increasing ad monetization. Let’s take a deep dive to understand how AI is transforming the live sports production landscape.

Powerful use cases

Linear broadcast: Improved storytelling using the power of AI

  • Cataloguing, discovery and search
    AI-led tagging of game content, segment by segment, leveraging automatic content recognition so that the live match as well as the producer’s entire sports archive become searchable.
  • Auto-highlight package creation
    Once the content is tagged automatically and exhaustively, machines can cut auto-discovered highlight packages based on pre-defined and auto-adjustable rules. Editors can also input parameters to order specific highlight packages.
  • Interactive TV experience
    Empowering set top box providers with dynamic match content, compelling stories and highlight clips during the match.
    Key events and VoD highlight packages, such as a five-wicket haul or a batsman’s milestones, can be surfaced to viewers via an “Active TV” button.
  • Improved post-match storytelling
    Post-match presenters can search match content extensively for creating powerful visual commentary.
  • Support for long-form content creation
    Making sports content archives discoverable enables easy retrieval of content for long-form content creation (like the career story of a player) as well as to syndicate/sell clips as a storefront.
  • In-stadia brand impression measurement
    Measuring brand exposure from billboards, stumps, sight screens, bats, jerseys and more. This data can be used in production or post-production to charge brands premium rates and increase ad monetization.

OTT Platforms: Immersive experiences for viewer engagement and improved monetization

  • Video notifications
    Users automatically get notified about key events of a match as soon as the event occurs. They can play these videos in a single click. This attracts more users to the app, while the video views can be monetized through pre-roll and other AVOD mechanisms.
  • Video Scorecards
    AI-generated and curated video clips can be added to Scorecards or the commentary feed. This makes boring, textual scorecards and live commentary come alive.
  • Immersive OTT experiences
    The metadata extracted using AI engines can be overlaid on top of the video to provide an “Amazon X-Ray” type experience. Users can explore different facets of the game in greater detail, without having to compromise the live viewing experience.
    Auto-generated highlight packages for the match are available as an overlay on the OTT video player, so users can explore the details of a shot or instantly search for any event in the match.

  • Customized playlists & search
    Users can be presented with auto-curated highlights packages or playlists as the game progresses. They can use free text search capabilities to look for match events, and can also generate their own personalized playlists.

AI Models for Sports

Content Tagging

To deliver the powerful use cases listed above, we need to be able to tag a sport exhaustively, and with extremely high accuracy. The more nuances you can tag in a game, the richer the downstream use cases.

ML is based on the premise that a machine must be supplied with large volumes of training data to build a classification algorithm. Once the algorithm is built, it can predict results on new, unseen data (test data). By this logic, if we feed the machine enough footage of a particular sport, it should begin to understand the sport’s actions. The reality, however, is quite different, because each sport is complex and unique. Even an uninitiated adult would not be able to decipher a game like cricket if simply left with thousands of hours of match footage.

We need to therefore codify basic game logic & rules, and apply ML within a focused context for the machine to mimic human cognition. Let’s take the example of cricket to understand this better. While watching a cricket match, we decipher and appreciate the game through different elements in the match content. These are: 1) Our knowledge of the game actions like bowling, fielding, umpire signals etc. 2) On-screen graphics telling us the score highlights and the current state of the game 3) Sounds from the stadium like ball hitting bat, applause, appeal etc. and 4) Experts’ commentary.

AI mimics human cognition to discern the game

To help machines understand the game like a human, we need to build a model based on these varied inputs. Typically, we deploy different kinds of neural networks, such as CNNs, R-CNNs and LSTMs, as well as Computer Vision techniques, to decipher various aspects of the four elements listed above. A wide variety of classification and cognition engines are used in tandem to “discern” the game from different perspectives:

  • Object detection – Different engines focused on specific groups of game objects and formations
  • Optical flow-based Computer Vision to identify movement of a ball/moving object
  • Face detection to recognize players
  • Scale-Invariant Feature Transform (SIFT) to detect complex objects and actions
  • Audio Spectrogram to find key contact points like a ball hitting a bat/racquet
  • Audio classification and excitement level classification
  • Image classification
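
Of these, the audio path is the easiest to sketch in a few lines. The snippet below is a minimal, self-contained illustration, using only NumPy with synthetic audio and a hand-picked threshold (none of which reflects the production engines), of how short-time spectra can expose the broadband “crack” of bat on ball against quieter background noise:

```python
import numpy as np

def detect_contact_times(audio, sr=16000, frame_len=512, hop=256, ratio=6.0):
    """Flag frames whose high-frequency power spikes far above the median,
    a crude stand-in for a spectrogram-based bat/ball contact detector."""
    window = np.hanning(frame_len)
    cutoff = int(2000 * frame_len / sr)  # ignore bins below ~2 kHz (crowd rumble)
    power = []
    for start in range(0, len(audio) - frame_len, hop):
        spectrum = np.abs(np.fft.rfft(audio[start:start + frame_len] * window))
        power.append(np.square(spectrum[cutoff:]).sum())
    power = np.array(power)
    peaks = np.where(power > ratio * (np.median(power) + 1e-12))[0]
    return peaks * hop / sr  # frame start times in seconds

# Synthetic check: low-level noise with one sharp 5 kHz "crack" at t = 1.0 s
rng = np.random.default_rng(0)
sr = 16000
audio = 0.01 * rng.standard_normal(2 * sr)
audio[sr:sr + 64] += np.hanning(64) * np.sin(2 * np.pi * 5000 * np.arange(64) / sr)
hits = detect_contact_times(audio, sr)
```

A real engine would of course classify the detected transients (bat, pad, stumps, applause) with a trained model rather than a fixed threshold.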

To train these engines, we need to sift through hundreds of hours of match footage, annotating frames, objects, actions and so on to generate training data for ML. These cognition engines are then stitched together using game logic and an understanding of sports production to catalogue the game segment by segment. For example, for cricket we deploy more than 11 such engines to extract 25+ attributes per ball, including batsman, non-striker, bowler, runs scored, type of shot, fours, sixes, replays, crowd excitement levels, celebrations, wickets, bowling type, ball synopsis and more.
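
To make the stitching step concrete, here is a hedged sketch of how per-segment engine outputs might be fused into a single per-ball record. The engine names, fields and override rules are all illustrative assumptions for this post, not the actual production pipeline:

```python
def tag_ball_segment(engine_outputs):
    """Stitch per-segment outputs from several engines into one ball record.
    Engine names and rules here are illustrative, not the production set."""
    faces = engine_outputs.get("face_recognition", {})
    ocr = engine_outputs.get("graphics_ocr", {})
    audio = engine_outputs.get("audio_classifier", {})
    signal = engine_outputs.get("umpire_signal", {}).get("type")

    record = {
        "batsman": faces.get("striker"),
        "bowler": faces.get("bowler"),
        "runs": ocr.get("runs_delta", 0),
        "excitement": audio.get("excitement", 0.0),
        "is_wicket": signal == "out",
    }
    # Game logic: an umpire boundary signal overrides a noisy OCR reading
    if signal == "four":
        record["runs"] = 4
    elif signal == "six":
        record["runs"] = 6
    record["highlight_worthy"] = (
        record["is_wicket"] or record["runs"] >= 4 or record["excitement"] > 0.8
    )
    return record

# Example: the score OCR misread the delivery, but the umpire signalled a four
ball = tag_ball_segment({
    "face_recognition": {"striker": "Batsman A", "bowler": "Bowler B"},
    "graphics_ocr": {"runs_delta": 1},
    "audio_classifier": {"excitement": 0.9},
    "umpire_signal": {"type": "four"},
})
```

The point of the sketch is the cross-checking: no single engine is trusted on its own, and explicit game rules arbitrate when engines disagree.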

An AI model for sports compiles the outputs of several AI and ML engines

Search and Discovery

To make an archive or the footage of a single match discoverable, the AI-generated metadata should be fed into a search index backed by an advanced semantic search engine such as Elasticsearch. Techniques that help deliver sharper search results include Natural Language Processing (NLP), named entity recognition, stemming, thesaurus expansion, fuzzy matching and duplicate removal.
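
As a toy illustration of the indexing side, the sketch below builds a tiny inverted index in pure Python, with crude suffix stemming and fuzzy matching via `difflib`. A real deployment would delegate all of this to Elasticsearch analyzers; the clip IDs and text here are made up:

```python
import difflib
from collections import defaultdict

def stem(word):
    # Crude suffix stripping; a real system would use a proper stemmer
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

class ClipIndex:
    """Toy inverted index over AI-generated clip metadata."""

    def __init__(self):
        self.postings = defaultdict(set)

    def add(self, clip_id, text):
        for token in text.lower().split():
            self.postings[stem(token)].add(clip_id)

    def search(self, query):
        hits = set()
        vocab = list(self.postings)
        for token in query.lower().split():
            # Fuzzy matching tolerates typos and near-miss spellings
            for match in difflib.get_close_matches(stem(token), vocab, n=3, cutoff=0.8):
                hits |= self.postings[match]
        return hits

idx = ClipIndex()
idx.add("clip-17", "Kohli hits a six over midwicket")
idx.add("clip-18", "Bumrah bowling a searing yorker")
```

Stemming lets a query for “bowled” reach a clip tagged “bowling”, and the fuzzy pass catches misspellings such as “yorkar”.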

Highlight Creation

Once the AI engine catalogues a game thoroughly, the next challenge is to create instant highlight packages that capture the game’s key events and drama. For spectators, excitement levels typically peak during the high points of a game. For instance, in football this could be when a goal is scored or missed. Visual clues like a referee/umpire signal or text overlay on screen also denote key events. A highlight creation engine can tap these high points and attributes to create a simple highlight package.

But creating compelling highlights involves much more. For instance, a good cricket match highlight package is not just a string of fours, sixes and wickets. A human editor artfully cuts a package that captures suspense, comedy and drama, and tells a compelling story. If a batsman is beaten a couple of times before being out, the editor shows all three balls, not just the last one. Other elements, such as tournament montages, the pre-match ceremony, player entry and the coin toss, also need to be learned and included in highlights.

AI-generated highlights should be created using learnable business rules built from past match highlight productions. These rules need to improve over time as the engine learns what works and what does not, and they draw on the comprehensive attributes/tags captured for each match segment. Much like a human editor, machines can also apply AI-based audio smoothing, scene transitions between segments and clean commentary cuts, so that human editors need to put in minimal effort for Quality Check (QC) and finishing.
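
A hand-written sketch of the kind of rule the text describes is shown below. The specific thresholds and the "lead-up" rule (pulling in the deliveries before a wicket, so a batsman beaten twice before being out contributes all three balls) are illustrative assumptions, not the learned production rule set:

```python
def select_highlights(balls, lead_up=2):
    """Pick ball indices for a highlight package from tagged segments.
    Rules are an illustrative sketch, not the trained business rules."""
    keep = set()
    for i, ball in enumerate(balls):
        # Obvious high points: boundaries and crowd-excitement spikes
        if ball["runs"] >= 4 or ball["excitement"] > 0.8:
            keep.add(i)
        # Storytelling rule: a wicket drags in the deliveries before it
        if ball["wicket"]:
            keep.update(range(max(0, i - lead_up), i + 1))
    return sorted(keep)

balls = [
    {"runs": 0, "excitement": 0.2, "wicket": False},  # 0: quiet dot ball
    {"runs": 4, "excitement": 0.7, "wicket": False},  # 1: boundary
    {"runs": 0, "excitement": 0.5, "wicket": False},  # 2: batsman beaten
    {"runs": 0, "excitement": 0.6, "wicket": False},  # 3: beaten again
    {"runs": 0, "excitement": 0.9, "wicket": True},   # 4: out
]
picks = select_highlights(balls)
```

Here the package keeps the boundary plus the full three-ball wicket sequence while dropping the quiet dot ball, mirroring the editorial behaviour described above.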

Technology and Logistics Challenges

No two sports are alike. Given the complexities of each sport, producers cannot use off-the-shelf, third-party video recognition engines to tag content, discover meaningful clips or create highlights. One needs to build and train custom models to tag sports content effectively, and specific models need to be built for specific sports, incorporating game logic, expert methods and historical learning. This is a painstaking process, but any attempt to create a generic sports model is likely to fail.

Each sport is also produced differently. For example, in football a movement towards a goal is shown as a combination of long and close-up shots taken from different camera angles. For cricket or baseball, the camera tries to keep the ball in the middle of the frame while the batsman is hitting. We need to train our engines with specific knowledge of these nuances and production techniques.

Despite these challenges, the importance of automatic live or near real-time highlight package creation cannot be overstated. Today, producers need to publish highlight clips swiftly to platforms like Facebook, Twitter and YouTube, which requires game tagging and highlight generation to happen quickly. AI engines with both on-premises and cloud hosting capabilities can help achieve the necessary speed.

Business Benefits

The use cases described above, if delivered effectively, can bring concrete business benefits: reducing the Total Cost of Operations (TCOP), increasing monetization from content and ads, and improving user engagement through enhanced storytelling. Our research has found that employing an AI-led custom model for cricket can save 60-70% of editors’ time. It significantly reduces the manual effort involved in cricket tagging, leaving editors free to focus on overall QC and on capturing the few attributes that AI may have missed. Overall, the potential for using AI and ML in the live sports arena is enormous. To see powerful outcomes, we need to create AI-led custom models for different sports, deploy them at scale and fine-tune them over several months. Such models can automate routine tasks, freeing human editors to focus their intellect on far more creative pursuits.
