As someone who used to work at Critical Mention I can speak a little to this. The infrastructure was set up about 9-10 years ago and initially involved one very dedicated engineer driving around the country installing DVRs at the different affiliate stations in almost all (if not all) 210 DMAs. These DVRs would "phone in" to the datacenters with video as well as closed-captioning text. Over the last 10 years the stack has changed a few times, but more recently the real-time processing ran as a series of Python jobs queued in RabbitMQ. The resulting data (closed-captioning text along with schedule, demographic, and affiliate information) was indexed in a full-text search cluster. Creating a searchable set of all TV video across 90 days is an engineering feat in and of itself. That said, I don't think any company has been able to mine, summarize, or aggregate that data in a way that is truly beneficial to a brand.
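To make the pipeline shape concrete, here is a minimal stdlib-only sketch of the pattern described: caption messages flow through a queue and land in a full-text index. The real system used RabbitMQ and a search cluster; here `queue.Queue` and a dict-based inverted index stand in, and all field names and sample data are invented for illustration.

```python
import queue

# Stand-in for a RabbitMQ queue: each message carries a caption
# snippet plus metadata (DMA, affiliate, timestamp -- hypothetical fields).
mq = queue.Queue()
mq.put({"dma": "New York", "affiliate": "WNBC",
        "ts": "2014-05-01T18:00:05Z",
        "caption": "breaking news on the local election"})
mq.put({"dma": "Chicago", "affiliate": "WLS",
        "ts": "2014-05-01T18:00:07Z",
        "caption": "weather update ahead of the election broadcast"})

# Stand-in for the full-text search cluster: a naive inverted index
# mapping each caption word to the documents that contain it.
docs = []
index = {}

def process(msg):
    """Consume one queued message: store it and index its caption words."""
    doc_id = len(docs)
    docs.append(msg)
    for word in msg["caption"].lower().split():
        index.setdefault(word, set()).add(doc_id)

# Drain the queue, as a worker would.
while not mq.empty():
    process(mq.get())

def search(word):
    """Return every stored document whose caption contains the word."""
    return [docs[i] for i in sorted(index.get(word.lower(), []))]

hits = search("election")
print(len(hits))  # both sample captions mention "election"
```

In the production setup the queue would be durable and the index would handle stemming, ranking, and time-windowed retention (e.g. the 90-day window), but the consume-then-index flow is the same.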