Articles | January 18, 2021

Sports Innovation Challenge Winner: Using Audio and Natural Language Processing to Increase Engagement

Author: CapTech

The Inspiration Behind The Solution

Founded in 1999, the music-identification service Shazam rose to popularity on a simple premise: a song is playing, but the listener can’t name it. For the user, identifying that pesky earworm is simple: open the app and tap the smartphone screen, and the artist and song title are promptly delivered. Of course, the technology behind the app is far more complicated. Using the smartphone’s microphone, the app identifies the track by finding a match in its directory of “audio fingerprints.”

Our winning innovation challenge team set out to solve a similar problem in the context of sports. They imagined someone watching a sport with which they were unfamiliar, and the questions that viewer might have about the rules, players, or teams. For their proof of concept, the team chose football because it offered a treasure trove of data.

Research indicates that football is the most popular sport to watch online or on broadcast TV in the U.S., and a Facebook study found that 94% of participants keep a smartphone on hand while watching TV, making the phone a de facto remote control. So the team hatched their plan: with no strong broadcast-to-mobile crossover application on the market, they could seize an unrealized opportunity to increase sports engagement through a mobile app of their own design.

How the App Works


The team started with a journey map to visualize how a potential user might interact with the application. Their persona was a football newbie at a friend’s house to watch a game, using the app to follow along and learn. By the end of the journey, the user has a deeper understanding of and interest in the sport, boosting engagement and potentially creating a new fan.

The app uses four steps to create this experience:

Step 1: The app “listens” to a live sports broadcast in real time

Step 2: The app then recognizes key words and phrases, such as player names or sports-specific terminology

Step 3: Information on targeted phrases is gathered from a variety of web-based services via an API

Step 4: Finally, the information is presented on an endlessly scrolling screen that updates in real time
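
Below is a rough sketch of how these four steps might fit together in Python. Everything named here is a hypothetical stand-in: a real implementation would stream captions from the live broadcast and query licensed sports-data APIs, rather than the hard-coded caption list and stubbed lookup used to keep the sketch self-contained.

    import time

    # Hypothetical stand-ins for the broadcast caption feed and the terms
    # the app knows how to explain.
    CAPTION_STREAM = [
        "that'll be a flag on jackson",
        "he has the whole corner route and they gotta go to kelce",
        "pass interference, defense, number 31",
    ]
    KNOWN_TERMS = {
        "jackson": "player",
        "kelce": "player",
        "corner route": "terminology",
        "pass interference": "terminology",
    }

    def lookup(term, category):
        # Step 3: placeholder for a web-service call (player stats, rule
        # definitions, etc.), stubbed so the sketch runs standalone.
        return f"[{category}] background info for '{term}'"

    def run_feed():
        feed = []                    # Step 4: backing list for the scrolling screen
        for line in CAPTION_STREAM:  # Step 1: "listen" to the caption text
            for term, category in KNOWN_TERMS.items():
                if term in line:     # Step 2: recognize key words and phrases
                    feed.append(lookup(term, category))
            time.sleep(0.1)          # simulate real-time pacing
        return feed

    if __name__ == "__main__":
        for card in run_feed():
            print(card)

In production, the listening and lookup steps would run asynchronously so that a slow web-service call never stalls the incoming caption stream.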

The technology at the heart of this app is Natural Language Processing (NLP), which depends on good data. Ironically, the NLP solution was the team’s initial stumbling block: it struggled to interpret sound captured directly from a television. After much trial and error, the team landed on an ingenious workaround: closed captioning could supply the critical data they needed in the form of keywords. The next challenge was getting the computer to understand that data, especially since everything has to happen in real time. The team used machine learning to train the NLP component of the app to pick up things like player and team names. Along the way, this led to some humorous interpretations:

Actual closed captioning:

Announcer 1: “That’ll be a flag on Jackson…”

Announcer 2: “Watch — inside leverage he has the whole corner route and they gotta go to Kelce.”

Referee: “Pass interference, defense, number 31.”

How the app interpreted:

“Lag on Jackson much inside leverage he has the whole corner route and gotta go to Kelsey bathroom ferrets defense always.”

This illustrates how critical it was to train the NLP model. Poor audio and closed-caption quality degraded the app’s performance, even with the impressive technology stack the team employed. Accurate data is the difference between an engaging fan experience and seeing “bathroom ferrets” on the scrolling interface. The team found that by combining closed captioning with additional data sources, they could train the model to understand and see the game far more clearly.
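
As a small illustration of one piece of that training problem, a garbled caption token like “Kelsey” can often be mapped back to the intended name by fuzzy-matching it against a known roster. The team’s actual solution used a trained machine learning model; the roster and similarity cutoff below are illustrative assumptions, using only Python’s standard library.

    from difflib import get_close_matches

    # Illustrative roster; a real app would pull rosters from a sports-data API.
    ROSTER = ["Jackson", "Kelce", "Mahomes", "Hill"]

    def correct_name(token, cutoff=0.7):
        # Map a possibly garbled caption token to the closest roster name,
        # or None if nothing is similar enough.
        matches = get_close_matches(token.title(), ROSTER, n=1, cutoff=cutoff)
        return matches[0] if matches else None

    print(correct_name("Kelsey"))   # -> Kelce (the caption error above)
    print(correct_name("ferrets"))  # -> None (correctly rejected)

Tuning that cutoff is the crux: set it too low and “bathroom ferrets” sneaks back in; set it too high and legitimate variants of a name are missed, a balance the team’s model training had to strike as well.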

For a future state, the team envisions a more robust NLP model that can handle complex phrase recognition, allowing users to receive more detailed player information and customization options (a user could choose to follow a specific team, for example).

Additional Applications and Considerations 

The team sees incredible potential for their app. To start, it could be used for other sports, from baseball to golf to basketball. Broadcast and sports-specific networks could use the app to enhance the viewer experience and generate new fans of their respective sports. The app could also let these entities connect viewers to other properties or market additional products and services, even gambling platforms.

Now imagine this technology applied to other forms of entertainment, like TV and movies. Since many people keep their smartphones in hand while watching their diversion of choice, viewers could learn more about an actor, historical figure, or event on the spot. The app could also be used while watching a foreign-language film to better understand key phrases or cultural references.

The potential for personalization is vast. Tailoring the experience to each individual’s interests creates a virtuous cycle: companies continually gain new data and insights about their viewers, which in turn lets them keep improving the experience they deliver.

The Team 

  • Jonathan Tang (Director, Systems Integration)
  • Patrick Maribojoc (Senior Consultant, Systems Integration)
  • Robert Stanley (Manager, Management Consulting)
  • Nancy Zhang (Consultant, Systems Integration)
  • Raymond Hummels (Consultant, Management Consulting)
  • Ryan Donovan (Consultant, Systems Integration)