More than one Machine Box customer has asked about detecting when there are color bars, black video, end credits, and production logos in their media assets using Tagbox. They have oodles of archival video footage that was either digitized from videotape or produced in a way where color bars, slates, and other elements are present in the video file.
I wanted to test out whether one could solve this in a few minutes with Tagbox’s training endpoint, so I found about 100 examples of end credits, color bars, black screens, and production logos.
The idea is for media asset management solutions, workflow solutions, and transcoding solutions to use Videobox to run every video through Tagbox’s trained image recognition model, and get back a JSON object with time stamps of where these things appear.
I stored each example image in its own folder on my laptop, spun up Tagbox, and ran a handy little script to iterate over every image and teach it to Tagbox. After about 3 minutes it was ready to test.
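A script along these lines does the job. This is a sketch, not the exact code from the post: it assumes Tagbox is listening on localhost:8080 and that its teach endpoint is `/tagbox/teach` taking a multipart POST with `file` and `tag` fields, with one folder per tag. Check the Tagbox docs for the exact path and field names before using it:

```python
import os
import urllib.request

TAGBOX = "http://localhost:8080"  # assumed local Tagbox address

def collect_examples(root):
    """Pair each image with the tag named by its parent folder,
    e.g. examples/color_bars/001.jpg -> ("color_bars", <path>)."""
    pairs = []
    for tag in sorted(os.listdir(root)):
        folder = os.path.join(root, tag)
        if not os.path.isdir(folder):
            continue
        for name in sorted(os.listdir(folder)):
            if name.lower().endswith((".jpg", ".jpeg", ".png")):
                pairs.append((tag, os.path.join(folder, name)))
    return pairs

def teach(tag, path):
    """POST one labelled example to Tagbox's teach endpoint
    as a multipart/form-data request (stdlib only)."""
    boundary = "----tagboxteach"
    with open(path, "rb") as f:
        img = f.read()
    body = (
        f'--{boundary}\r\n'
        f'Content-Disposition: form-data; name="tag"\r\n\r\n{tag}\r\n'.encode()
        + f'--{boundary}\r\n'
          f'Content-Disposition: form-data; name="file"; '
          f'filename="{os.path.basename(path)}"\r\n'
          f'Content-Type: application/octet-stream\r\n\r\n'.encode()
        + img
        + f"\r\n--{boundary}--\r\n".encode()
    )
    req = urllib.request.Request(
        TAGBOX + "/tagbox/teach",  # endpoint path is an assumption
        data=body,
        headers={"Content-Type": f"multipart/form-data; boundary={boundary}"},
    )
    urllib.request.urlopen(req)

# Usage (with Tagbox running locally):
# for tag, path in collect_examples("examples"):
#     teach(tag, path)
```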
I found some video samples with ending credits, opening production logos, color bars, and black between segments and ran it through Tagbox via Videobox.
The resulting response from the Videobox API told me exactly where these elements were in each video file, with excellent accuracy. A developer can take this JSON and use it to improve their search indexes or the timelines in a UI.
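For illustration, suppose the response looks roughly like this (the field names and structure here are hypothetical, not the exact Videobox schema). Turning it into UI-ready timeline entries is then a few lines:

```python
# Hypothetical response shape -- consult the Videobox docs for the real schema.
response = {
    "tags": [
        {"tag": "color_bars", "start_ms": 0, "end_ms": 9000, "confidence": 0.97},
        {"tag": "production_logo", "start_ms": 9000, "end_ms": 14000, "confidence": 0.91},
        {"tag": "black", "start_ms": 14000, "end_ms": 15000, "confidence": 0.99},
    ],
}

def to_timeline(resp):
    """Convert tagged spans into (start_s, end_s, label) tuples
    sorted by start time, ready to render on a timeline."""
    return sorted(
        (t["start_ms"] / 1000, t["end_ms"] / 1000, t["tag"])
        for t in resp["tags"]
    )
```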
The Videobox console has a way to visualize this metadata on a crude timeline for demonstration purposes. I took a screenshot of that and overlaid it on the video inside an iMovie timeline.
As you can see, the color bars end at 9 seconds into the video, followed by some production logos, a second of black, and then more logos.
You can also adjust the confidence threshold in your UI to suit different situations and use cases, so that elements Tagbox has low confidence in are filtered out.
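That filtering is a one-liner. Here is a minimal sketch, using a hypothetical detection shape with a `confidence` field:

```python
# Hypothetical detections -- field names are illustrative, not the Videobox schema.
detections = [
    {"tag": "end_credits", "confidence": 0.95},
    {"tag": "black", "confidence": 0.42},
]

def filter_by_confidence(tags, threshold=0.8):
    """Keep only detections at or above the chosen confidence threshold."""
    return [t for t in tags if t["confidence"] >= threshold]
```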
Give it a try now for free — takes about 20 minutes.