Social TV - View and Contribute to Public Opinions about Your Content Live

What is Social TV?

Until recently, the TV viewing experience has not been very social compared with activity on the World Wide Web. To address this, the voice-enabled Social TV system lets users follow, monitor, and interact with the online social media messages related to a TV show while watching it. To make interaction easier, messages can be created using spoken language. Social TV also provides metadata about TV shows, such as trends, hot topics, and popularity, as well as aggregated sentiment, all of which are valuable for TV program search and recommendation. The goal of the Social TV prototype is to create a TV experience that combines TV viewing with interaction on social media networks (Twitter). It uses machine learning techniques to retrieve high-quality tweets related to the TV programs being watched; specifically, it applies data mining techniques for sentiment analysis, popularity estimation, top tweets, and hot topics. It also supports all basic Twitter functions, such as reading tweets, sending tweets, replying to tweets, and retweeting. Tweets can be created by speaking into the Voice Activated Remote Control.
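The retrieval and sentiment steps described above can be sketched in a few lines. This is a minimal illustration only: the keyword lists, sentiment lexicons, and function names below are hypothetical placeholders, not the prototype's actual machine-learned models.

```python
from collections import Counter

# Illustrative placeholder lexicons (the real system uses trained models).
POSITIVE = {"love", "great", "awesome", "best"}
NEGATIVE = {"hate", "boring", "worst", "awful"}

def relevant(tweet: str, show_keywords: set) -> bool:
    # Naive relevance test: any keyword overlap with the show's terms.
    return bool(set(tweet.lower().split()) & show_keywords)

def sentiment(tweet: str) -> str:
    # Count positive vs. negative lexicon hits and label the tweet.
    words = set(tweet.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = [
    "I love Inside Edition tonight",
    "this episode is boring",
    "watching the news",
]
keywords = {"inside", "edition", "episode"}
matches = [t for t in tweets if relevant(t, keywords)]
labels = Counter(sentiment(t) for t in matches)
# labels tallies the aggregated sentiment over the relevant tweets
```

In the actual prototype, the relevance and sentiment decisions come from data mining over large tweet archives rather than fixed word lists; the sketch only shows where those decisions fit in the flow.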

Architecture

The architecture includes three major function blocks: Data Manager, Data Mining Module and Application Manager. The Data Manager retrieves tweets relevant to TV shows and archives them. The Data Mining Module indexes the collected tweets, analyzes them, and supports various types of searches. The Application Manager hosts the TV application page (on the Set Top Box) which integrates the archived Twitter data, Twitter stream data, and the generated data from the Data Mining Module into an interactive interface on the TV. The prototype also plays live TV so the user truly has an enhanced viewing experience.
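The division of labor among the three function blocks can be sketched as follows. All class and method names here are hypothetical, chosen only to mirror the roles described above; they are not the prototype's actual API.

```python
# Hypothetical sketch of the three-block architecture:
# Data Manager archives tweets, Data Mining Module indexes and searches
# them, Application Manager assembles the result for the TV app page.

class DataManager:
    """Retrieves tweets relevant to TV shows and archives them."""
    def __init__(self):
        self.archive = []

    def ingest(self, tweet):
        self.archive.append(tweet)

class DataMiningModule:
    """Indexes the collected tweets and supports searches over them."""
    def index(self, tweets):
        idx = {}
        for t in tweets:
            idx.setdefault(t["show"], []).append(t["text"])
        return idx

    def search(self, idx, show):
        return idx.get(show, [])

class ApplicationManager:
    """Integrates archived and mined data into the TV application page."""
    def render(self, show, tweets):
        return {"show": show, "tweets": tweets, "count": len(tweets)}

dm = DataManager()
dm.ingest({"show": "Dr. Phil", "text": "great episode"})
dm.ingest({"show": "Inside Edition", "text": "breaking story"})

miner = DataMiningModule()
idx = miner.index(dm.archive)

app = ApplicationManager()
page = app.render("Dr. Phil", miner.search(idx, "Dr. Phil"))
```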

Social TV Architecture

Interface Examples

Some of the key features of the prototype are demonstrated in the screen captures below. The user can see and navigate tweets about the TV program currently being watched; as new tweets arrive, they become available immediately. This near real-time interaction with the crowd is what makes the feature so valuable, and the same information can help recommend what to watch.


Social TV Relevant Messages

This image demonstrates Twitter messages relevant to the live television program "Inside Edition". More than one potentially relevant message was found, so the user can navigate through these messages with intuitive movements on the remote.  

Social TV Sending Tweets

This image illustrates the interface for sending tweets using the Voice Activated Remote Control. The user presses the TALK button on the remote, speaks the tweet (in this case, "who is bold and who is beautiful"), and then releases the TALK button. After the tweet appears on the TV, the user can send it by pressing the OK button on the remote. These tweets, along with tweets from the people the user follows, appear in the user's own Twitter feed (My Tweets).

Social TV Program Trends Summary

This image provides various charts for the program "Dr. Phil" showing the results of sentiment analysis (negative, neutral, and positive), a word cloud containing words that occur most frequently in tweets about this program, and a popularity chart that shows the number of tweets in the last week about this program.  
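Two of the summaries shown in these charts, the word cloud and the weekly popularity trend, reduce to simple aggregations over the tweet archive. The sketch below illustrates that reduction; the field names, stopword list, and dates are illustrative assumptions, not the prototype's actual data model.

```python
from collections import Counter
from datetime import date, timedelta

# Illustrative stopword list; the real system's filtering is more elaborate.
STOPWORDS = {"the", "a", "is", "and", "to"}

def word_cloud(tweets):
    # Frequency of non-stopwords, the raw input for a word cloud.
    words = (w for t in tweets for w in t["text"].lower().split())
    return Counter(w for w in words if w not in STOPWORDS)

def weekly_popularity(tweets, today):
    # Tweet count per day for the last seven days, oldest first.
    week = [today - timedelta(days=d) for d in range(6, -1, -1)]
    counts = Counter(t["day"] for t in tweets)
    return [(day, counts.get(day, 0)) for day in week]

tweets = [
    {"text": "dr phil is great", "day": date(2012, 5, 1)},
    {"text": "great show", "day": date(2012, 5, 1)},
    {"text": "dr phil again", "day": date(2012, 4, 30)},
]
cloud = word_cloud(tweets)
trend = weekly_popularity(tweets, date(2012, 5, 1))
```

The sentiment chart in the same summary would be the analogous tally over per-tweet sentiment labels (negative, neutral, positive).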


Project Members

Bernard Renger

Behzad Shahraray
