Communication Kit


Find below project communication material that is available for download:

Project Communication

  • Project Flyer (PDF)
  • Project Presentation (PDF)
  • SocialSensor 2014 Newsletter
  • SocialSensor 2013 Q1 Newsletter
  • SocialSensor 2012 Q2 Newsletter

SocialSensor STCSN E-Letter

After STCSN published its second E-Letter earlier this year, we are proud to announce the third edition, which contains fresh results from the SocialSensor project on Sensing User Generated Input for Improved Media Discovery and Experience. We are confident you will enjoy this edition, as it contains many interesting ideas and results from this European research project.

SocialSensor: Sensing User Generated Input for Improved Media Discovery and Experience || The video

ThessFest || Promotional Video for tiff54

SocialSensor: Unfolding Digital Truth || A video on the News Use case

Infotainment app/UI


The infotainment application suite consists of three main parts (cf. figure below), each of which serves different end-user requirements:

  • EventLive: This is the primary infotainment application, targeting event attendees. Its features aim at improving the event experience by facilitating information access (e.g. easy access to the program, film information, recommendations, etc.), interaction (ratings, comments) and social sharing. It also serves as a means of collecting aggregate usage data of interest to event organizers.
  • Dashboard: The event dashboard exposes analytics functionalities to event organizers (a sketch of one such aggregate follows this list). In V1, the functionalities were limited to top rated/most popular films, most active users, a Twitter activity map and a gallery of shared MediaItems. An instance of the dashboard was set up and made available to the organizers of the latest Thessaloniki International Film Festival.
  • Social Media Walls: This application is administered by event organizers but is offered to end users at the premises of the event. Eventually, the admin part of the social media walls will be blended into the dashboard.
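
To make the dashboard's role concrete, here is a minimal sketch of how one such aggregate (average rating per film, sorted to yield the top-rated list) could be computed; the Rating record and all names are illustrative assumptions, not the actual dashboard implementation.

    import java.util.Comparator;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Collectors;

    // Hypothetical sketch of one dashboard aggregate: average rating per film,
    // sorted so the top-rated films come first.
    public class TopRatedFilms {
        record Rating(String filmTitle, int stars) {}

        public static List<Map.Entry<String, Double>> topRated(List<Rating> ratings) {
            return ratings.stream()
                    // Group the individual ratings by film and average the star values.
                    .collect(Collectors.groupingBy(Rating::filmTitle,
                            Collectors.averagingInt(Rating::stars)))
                    .entrySet().stream()
                    // Highest average first.
                    .sorted(Map.Entry.<String, Double>comparingByValue(Comparator.reverseOrder()))
                    .collect(Collectors.toList());
        }
    }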

[Figure: infotainment-ui]

News app/UI


The news app consists of three distinct major layers (storage, business and data access, presentation). The storage, data access and business layers constitute a unified framework (the backend) that merges the various repositories and exposes them as web services following the business rules of the system. The presentation layer consists of a set of end-user applications (mobile UI, tablet UI, web front-end). The front-end user interfaces exchange data with the backend by means of HTTP messages. To facilitate development, the SocialSensor java-client SDK is used in certain cases internally by the client applications. More specifically, the news app web front-end is built on the JavaServer Faces web framework together with PrimeFaces, following a Model-View-Controller approach. A loosely coupled architecture is also followed for the tablet and mobile apps, which are implemented in native Objective-C for iOS 7 and communicate with the backend through HTTP calls to the exposed RESTful services.
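
As an illustration of the front-end/backend contract described above, the following sketch shows how a client application might retrieve data from the backend over HTTP. The endpoint URL, query parameter and class names are hypothetical; the actual RESTful services are those exposed by the SocialSensor backend.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class NewsBackendClient {
        // Hypothetical endpoint; the real services are the RESTful ones exposed by the backend.
        private static final String ITEMS_URL = "http://backend.example.org/api/items?dysco=";

        public static String fetchItemsForDysco(String dyscoId) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create(ITEMS_URL + dyscoId))
                    .header("Accept", "application/json")
                    .GET()
                    .build();
            // The backend would return the Items of the requested DySCO as JSON.
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            return response.body();
        }
    }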

[Figure: news-ui]

semantic-middleware


This module mainly consists of components that are deployed within a mobile device (which acts as a peer node in a network of peer devices) and operate “under-the-hood” to offer a set of network-aware services. More specifically, it consists of the following (cf. figure below):

  • Core app: This includes all the application logic, ranging from the implementation of the “standard” features (e.g. program browsing, search, my films, rating, etc.) to the control of the middleware components described below.
  • DASH recording and streaming: This implements the DASH video adaptation encoder and decoder, which adapts the video streaming depending on the network conditions.
  • Peer-to-peer IPsetter: This maintains an index of the peer devices' IP addresses by polling them at regular intervals, and exposes the index through the infotainment API (see the sketch after this list).
  • My Media: This implements the app/UI view of the peer-to-peer video/trailer streaming functionality, enabling users to assign categories to videos they stream and to select the categories of videos they want to play. 
  • 2D Map: This implements the 2D map visualization, including overlays of different pieces of information, e.g. tweets, venues, screenings, etc. and components such as the timeslider.
  • 3D Vis: 3D map visualization relies on a server-side component. The functionality is available to the mobile app end users through an embedded Chrome browser.
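
To illustrate the polling mechanism of the IPsetter, here is a minimal sketch of a peer IP index refreshed at a fixed interval; the class names, the 30-second interval and the pollPeerForAddress helper are illustrative assumptions.

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Hypothetical sketch of a peer IP index maintained by periodic polling.
    public class PeerIpIndex {
        private final Map<String, String> peerAddresses = new ConcurrentHashMap<>();
        private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        public void start(Iterable<String> peerIds) {
            // Poll every 30 seconds; the actual interval is not specified in the text.
            scheduler.scheduleAtFixedRate(() -> {
                for (String peerId : peerIds) {
                    String address = pollPeerForAddress(peerId); // hypothetical helper
                    if (address != null) {
                        peerAddresses.put(peerId, address);
                    }
                }
            }, 0, 30, TimeUnit.SECONDS);
        }

        // Exposed (e.g. through the infotainment API) so that peers can locate each other.
        public Map<String, String> snapshot() {
            return Map.copyOf(peerAddresses);
        }

        private String pollPeerForAddress(String peerId) {
            // Placeholder: in practice this would contact the peer device over the network.
            return null;
        }
    }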

[Figure: semantic-middleware]

infotainment-orchestrator & recommender


The infotainment orchestrator takes care of the collection, indexing and retrieval of social content around large events. To a large extent, its main research components are described in the EventSense paper (Schinas et al., 2013) and in D2.2. The orchestrator coordinates the collection of content from Twitter, Facebook and other social media sources, using the event hashtag, its Facebook page and the event name as input. The orchestrator also arranges the operation of the following components (cf. figure below):

  • Entity Extractor: For each incoming Item, the Entity Extractor detects references to entities of interest for the event, e.g. films. To achieve high accuracy, it makes use of a bilingual entity gazetteer (available through the infotainmentDB); a sketch follows this list.
  • Sentiment Analyzer: Same as the one used by the news-orchestrator.
  • Dysco Creator: The DySCO Creator clusters incoming Items based on the document-pivot method described in D2.2 and in (Aiello et al., 2013).
  • Title Extractor: The title extractor is the same as the one used by the news orchestrator.
  • Contributor Statistics: For each contributor (Twitter account), aggregate statistics are computed regarding their activity and the sentiment of their posts.
  • Film Statistics: For each film, aggregate statistics are computed regarding its rating, popularity (based both on the myFilms feature of the mobile app and on tweets about the film), and sentiment.
  • User Profiler: For each SocialSensor user (mobile app user), we collect the ratings and bookmarks (films added to myFilms). In case the user is logged in with their Twitter account, we also collect their followers and friends for use by the recommender.
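
To make the gazetteer-based approach of the Entity Extractor concrete, here is a minimal sketch of dictionary-based entity matching; the class name, the map-based gazetteer representation and the simple substring matching are illustrative assumptions rather than the actual implementation.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Locale;
    import java.util.Map;

    // Hypothetical sketch of gazetteer-based entity extraction: the gazetteer maps
    // surface forms (in both languages) to canonical entity names, e.g. film titles.
    public class GazetteerEntityExtractor {
        private final Map<String, String> gazetteer; // surface form -> canonical entity

        public GazetteerEntityExtractor(Map<String, String> gazetteer) {
            this.gazetteer = gazetteer;
        }

        public List<String> extract(String text) {
            List<String> entities = new ArrayList<>();
            String lower = text.toLowerCase(Locale.ROOT);
            for (Map.Entry<String, String> entry : gazetteer.entrySet()) {
                // Naive substring matching for illustration only.
                if (lower.contains(entry.getKey().toLowerCase(Locale.ROOT))) {
                    entities.add(entry.getValue());
                }
            }
            return entities;
        }
    }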

In addition to the infotainment-orchestrator, the recommender, which implements the recommendation methods described in D5.2, runs as an independent service that makes use of the user profiles and ratings through the BigIndex server (cf. D4.2). The indexing of user profiles and ratings is handled by the orchestrator, while a REST service has been set up to serve the actual recommendations to the end users (of the mobile app).

[Figure: infotainment-orchestrator]

news-orchestrator


The news-orchestrator acts as the monitoring and controlling entity of the analysis and indexing phase of the workflow. Its role is to trigger in a sequential way the various modules that participate in the DySCO formulation. It starts by synchronizing the stream-manager's output to mongoDB with the analysis workflow, acting as an intermediate buffer that pushes content in batches to the various analysis modules. Once the analysis modules have filled in the different metadata fields of the Item and DySCO objects, the orchestrator encodes the objects into Solr-compatible documents and feeds them into the Solr server. From this point on, all DySCOs and Items are available to be retrieved and visualized at the presentation layer through the available user interfaces.
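
The buffering and indexing step described above can be pictured with the following minimal sketch, which pulls one batch of analysed Items from mongoDB, encodes them as Solr documents and commits them to the Solr server; connection strings, collection and field names are illustrative assumptions.

    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.common.SolrInputDocument;
    import org.bson.Document;

    // Hypothetical sketch of the orchestrator's buffering step: pull a batch of
    // Items from mongoDB, encode them as Solr documents, push them to Solr.
    public class OrchestratorIndexer {
        public static void main(String[] args) throws Exception {
            MongoCollection<Document> items = MongoClients.create("mongodb://localhost:27017")
                    .getDatabase("socialsensor").getCollection("items");
            try (HttpSolrClient solr = new HttpSolrClient.Builder("http://localhost:8983/solr/items").build()) {
                for (Document item : items.find().limit(100)) { // one batch
                    SolrInputDocument doc = new SolrInputDocument();
                    doc.addField("id", item.getString("id"));
                    doc.addField("title", item.getString("title"));
                    doc.addField("text", item.getString("text"));
                    solr.add(doc);
                }
                solr.commit(); // make the batch searchable
            }
        }
    }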

More specifically, the following components are operated by the news-orchestrator (cf. figure below):

  • Entity Extractor: For each incoming Item, the Entity Extractor detects references to named entities. This is based on the Stanford CoreNLP library. Note that this entity extractor is different from the one used by the infotainment-orchestrator.
  • Sentiment Analyzer: The Sentiment Analyzer is responsible for the detection of sentiment labels (positive/neutral/negative) for each incoming Item. Details on the adopted approach are provided in D2.2.
  • DyscoCreator: The DySCO Creator clusters incoming Items based on the BN-gram method described in D2.2 and in (Aiello et al., 2013). In V2, we plan to explore improvements of the method, as well as additional methods (e.g. the SFPM approach described in D2.2). Several of these topic detection implementations have been made available as an open-source project on GitHub.
  • DyscoMatcher: This matches the newly created DySCOs with DySCOs created in previous timeslots, provided their similarity exceeds a certain threshold (see the sketch after this list). In V2, this component might be considerably revised due to the foreseen changes in the DySCO management lifecycle.
  • Aggregator: This aggregates the different elements that were extracted per Item (n-grams, keywords, named entities) on a per DySCO basis.
  • Title Extractor: This uses a set of business rules and heuristics to extract a human readable title for each new DySCO. The set of these rules has been revised during the evaluation based on feedback from end users, and is expected to be further updated in V2.
  • Ranker: This component (to be created in V2) will associate importance weights to the discovered DySCOs. It will take into account external sources (e.g. RSS feeds, Reddit topics).
  • Influencer Extractor: This is executed in an asynchronous way (on top of Hadoop) and periodically extracts influencers per keyword (for a set of trending keywords defined on the basis of the created DySCOs).
  • Query Creator: This will be responsible for (a) forming appropriate SolrQueries that are used for the retrieval (from the SocialSensor store) of Items, MediaItems and WebPages related to a DySCO of interest, and (b) forming appropriate queries that are used by the stream-manager to fetch (from the wrapped online social networks) additional Items and MediaItems related to the newly created DySCO. The source code of the query creator is available in the socialsensor-query-builder GitHub project.
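
To illustrate the threshold-based matching performed by the DyscoMatcher, here is a minimal sketch; the use of Jaccard similarity over keyword sets and the 0.5 threshold are illustrative assumptions, as the text does not specify the actual similarity measure.

    import java.util.HashSet;
    import java.util.Set;

    // Hypothetical sketch of DySCO matching across timeslots: a new DySCO is linked
    // to a previous one when the similarity of their keyword sets exceeds a threshold.
    public class DyscoMatcher {
        private static final double THRESHOLD = 0.5; // illustrative value

        public static boolean matches(Set<String> newKeywords, Set<String> oldKeywords) {
            Set<String> intersection = new HashSet<>(newKeywords);
            intersection.retainAll(oldKeywords);
            Set<String> union = new HashSet<>(newKeywords);
            union.addAll(oldKeywords);
            // Jaccard similarity: |A ∩ B| / |A ∪ B|
            double similarity = union.isEmpty() ? 0.0 : (double) intersection.size() / union.size();
            return similarity > THRESHOLD;
        }
    }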

[Figure: news-orchestrator]

storage-indexing


The storage-indexing components of SocialSensor include the following:

  • mongoDB: This stores the metadata of Items, MediaItems and WebPages, as well as auxiliary data, such as the Twitter accounts to monitor, and the URLs to fetch.
  • Solr: This hosts the Items (after they have been populated with their metadata) and the DySCOs in two separate fully searchable collections. In a third collection, it stores MediaItems together with a set of properties so that they can be retrieved through full-text search queries. Finally, for the purposes of the n-gram analysis involved in DySCO creation, the TopicDetectionItems collection has been created; this is not permanent storage but a temporary repository for the processed Items of each timeslot.
  • mm-index: This index is dedicated to the indexing of image features to enable fast and scalable similarity-based search. The underlying indexing mechanism is documented in D4.2 and thorough evaluation results are available in D4.3. Its source code is also available in the multimedia-indexing GitHub project.
  • infotainmentDB: This is a database dedicated to the infotainment use case, storing the schema and content around an event of interest (e.g. for a film festival: the film program, film details, directors, etc.).

Access to these stores is possible through methods of the socialsensor-framework-client as well as through a REST API (e.g. the infotainment API).
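
As an example of such access, the sketch below issues a full-text query against the Solr collection holding MediaItems; the core name and field names are illustrative assumptions.

    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.QueryResponse;
    import org.apache.solr.common.SolrDocument;

    // Hypothetical sketch of a full-text search over the MediaItems collection.
    public class MediaItemSearch {
        public static void main(String[] args) throws Exception {
            try (HttpSolrClient solr = new HttpSolrClient.Builder("http://localhost:8983/solr/mediaitems").build()) {
                SolrQuery query = new SolrQuery("title:festival"); // illustrative field and term
                query.setRows(10);
                QueryResponse response = solr.query(query);
                for (SolrDocument doc : response.getResults()) {
                    System.out.println(doc.getFieldValue("id") + " -> " + doc.getFieldValue("title"));
                }
            }
        }
    }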

storm-focused-crawler


The storm-focused-crawler deals with the management of the URLs that were extracted from the Items and MediaItems collected by the stream-manager. This process is described in D4.3 in more detail. The main operations in this process are the following:

  • Multimedia Fetcher: For URLs pointing to media content (e.g. links to YouTube, twitpic, etc.), the actual media content is downloaded (for videos, only the thumbnails are downloaded, not the video itself).
  • Article Extractor: For URLs pointing to general web pages, a simple article extraction technique is applied in order to extract the main article text and title. If a photo is featured in the article, its URL is also extracted and forwarded to the Multimedia Fetcher.
  • VLAD Feature Extractor: In this step, a single feature vector (VLAD) is extracted from the image content. The local features (SURF descriptors) used for its computation are not stored. Further details on the implementation and evaluation of this process are described in D4.2 and D4.3.
  • Feature Indexer: The next step, after feature extraction, is the indexing of the feature vector using Product Quantization (PQ) and Asymmetric Distance Computation (ADC) for fast similarity-based search. Further details are available in D4.2 and D4.3.
  • Location Estimation: In this step, a geographical location is inferred for an input Item or MediaItem based on its textual metadata and its extracted features. This step is not implemented yet, but some research that is necessary for its development has been conducted (described in D4.3).

The aforementioned steps are implemented on top of a Storm topology. The project source code is available in the storm-focused-crawler GitHub project.
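
The following minimal sketch illustrates how crawler steps of this kind can be wired together as a Storm topology; the spout, bolt and stream names are illustrative placeholders, and the real wiring is the one found in the storm-focused-crawler GitHub project.

    import java.util.Map;
    import org.apache.storm.Config;
    import org.apache.storm.LocalCluster;
    import org.apache.storm.spout.SpoutOutputCollector;
    import org.apache.storm.task.TopologyContext;
    import org.apache.storm.topology.BasicOutputCollector;
    import org.apache.storm.topology.OutputFieldsDeclarer;
    import org.apache.storm.topology.TopologyBuilder;
    import org.apache.storm.topology.base.BaseBasicBolt;
    import org.apache.storm.topology.base.BaseRichSpout;
    import org.apache.storm.tuple.Fields;
    import org.apache.storm.tuple.Tuple;
    import org.apache.storm.tuple.Values;

    // Hypothetical sketch of a focused-crawler Storm topology.
    public class FocusedCrawlerTopology {

        public static class UrlSpout extends BaseRichSpout {
            private SpoutOutputCollector collector;
            public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
                this.collector = collector;
            }
            public void nextTuple() {
                // In practice, URLs would come from the queue fed by the stream-manager.
                collector.emit(new Values("http://example.org/article"));
            }
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                declarer.declare(new Fields("url"));
            }
        }

        public static class ArticleExtractorBolt extends BaseBasicBolt {
            public void execute(Tuple input, BasicOutputCollector collector) {
                String url = input.getStringByField("url");
                // Placeholder for article extraction; forward the URL downstream.
                collector.emit(new Values(url));
            }
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                declarer.declare(new Fields("url"));
            }
        }

        public static void main(String[] args) throws Exception {
            TopologyBuilder builder = new TopologyBuilder();
            builder.setSpout("urls", new UrlSpout());
            builder.setBolt("article-extractor", new ArticleExtractorBolt()).shuffleGrouping("urls");
            // Further bolts (multimedia fetcher, feature extractor, indexer) would chain on here.
            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology("focused-crawler", new Config(), builder.createTopology());
            Thread.sleep(10_000);
            cluster.shutdown();
        }
    }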

[Figure: storm-focused-crawler]

stream-manager


This module is responsible for the collection of Items (tweets, posts, etc.) and MediaItems from Online Social Networks in two ways:

  • Using the streaming API of Twitter, the stream-manager continuously monitors a set of Twitter accounts (e.g. lists of newshounds) producing a real-time stream of Items as input to the system. In V1, a single static list of Twitter accounts was monitored. In V2, multiple lists are supported and the provenance information (which list each Item came from) is maintained. In addition, V2 will support dynamic updates of the monitored lists (offering appropriate UI controls to end users).
  • Using the REST API of Twitter, Facebook and other media sharing platforms (Flickr, Tumblr, YouTube, Google+ and Instagram), the stream-manager collects MediaItems in a targeted way after a DySCO is created (using the DySCO fields such as entities and keywords to form appropriate queries) and associates them with the input DySCO. In addition to this, in V2 the stream-manager will also collect Items (from Twitter and Facebook) using the REST API and provide them as input to the DySCO generation process.

The module architecture is illustrated below. The Item Collector takes care of collecting Items from the Streaming API of Twitter, while the Search Manager performs the targeted collection of content following the generation of DySCOs. The MediaItem Extractor is responsible for extracting MediaItems from the collected Items. The stream-manager stores the metadata of the collected Items and MediaItems in mongoDB. In V1, the stream-manager was also responsible for invoking their indexing in Solr; in V2, this responsibility was moved to the orchestrators for efficiency reasons.
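
As an illustration of the Item Collector's use of the Twitter Streaming API, here is a minimal sketch based on the Twitter4J library; the choice of library, the account IDs and the handling code are illustrative assumptions, not the actual implementation.

    import twitter4j.FilterQuery;
    import twitter4j.Status;
    import twitter4j.StatusAdapter;
    import twitter4j.TwitterStream;
    import twitter4j.TwitterStreamFactory;

    // Hypothetical sketch of the Item Collector: monitor a list of Twitter accounts
    // through the Streaming API and hand each incoming status over for storage.
    public class ItemCollectorSketch {
        public static void main(String[] args) {
            // Reads credentials from twitter4j.properties.
            TwitterStream stream = new TwitterStreamFactory().getInstance();
            stream.addListener(new StatusAdapter() {
                @Override
                public void onStatus(Status status) {
                    // In the real module, the status would be converted to an Item
                    // and its metadata stored in mongoDB.
                    System.out.println("@" + status.getUser().getScreenName() + ": " + status.getText());
                }
            });
            long[] monitoredAccounts = {12345L, 67890L}; // placeholder user IDs from a monitored list
            stream.filter(new FilterQuery().follow(monitoredAccounts));
        }
    }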

The project source code is available in the socialsensor-stream-manager GitHub project.

[Figure: stream-manager]

Infotainment Use Case


The infotainment use case targets individuals attending large events, such as festivals, expos, etc. The use case is centred on the principle that the different aspects of a user's context can constitute valuable cues for proactive search and discovery of relevant media content. By leveraging the user's context for search, especially in mobile settings, the physical surroundings of a user act as a lens on the social media content that relates to her current activities, location, and physical social ties. SocialSensor will deliver a platform supporting diverse usage scenarios, such as context-triggered multimedia search, proximity-based real-time activity recommendation, facilitation of social networking aspects, and real-time interaction with the event acts through innovative interaction methods.

Applications | Preliminary Evaluation Results | 54th Thessaloniki International Film Festival | Fête de la Musique

The core of the platform is based on empowering attendees through increased awareness of their own behaviour compared with that of the crowd. In particular:


EventLive is a mobile app that aims to improve the experience of attendees of large events, mainly through a rich set of intelligent features available in a mobile setting. With personalized recommendations based on each user's profile and social graph, sentiment scores that uncover public opinion, rich interactive maps for effective visualization and navigation, and mobile-to-mobile live streaming and sharing, EventLive allows event spectators to plan, share, discover and enjoy the event. It facilitates information access (e.g. easy access to the program, film information, recommendations according to user profiles, the option to create a personal schedule, sentiment scores, etc.), user interaction (ratings and comments) as well as social sharing. With the help of EventSense, EventLive captures and displays what is around the attendee, be it venues, points of interest, trends and happenings, or profile-based recommendations, while at the same time allowing users to keep in touch with their friends and keep them updated about the event.

[Figure: ThessFest dashboard]

 

EventSense Dashboard is a real-time event web application offering insights to event professionals. By detecting trends and influencers and performing sentiment analysis, it allows event managers to harness the power of social media to discover highlights, reveal the social sentiment, deliver better experiences and improve event marketing strategies. Giving event organizers the ability to stay on top of emerging trends, news, happenings, user comments, and live, real-time data analytics is of utmost importance and a key asset for capturing the pulse of the crowd. The EventSense dashboard bridges the gap between event organizers and event attendees and turns big data into awareness for the former. Its social media analytics tools provide a consolidated, relevant view of social media and web key performance metrics, such as multimedia shared on Instagram, YouTube or Flickr, map-based social media activity and, most importantly, influential users, keeping decision makers armed with the vital intelligence to drive meaningful results.

 

[Figure: timeline]

 

A Social Media Wall is an innovative way to display social streams as part of an event. Moving beyond the traditional social feed consumed on users' devices, it is a novel engagement tool that event organizers can use to create dynamic, media-rich visualizations that are projected both online and at the event premises, targeting event attendees. A Social Media Wall is a display breakthrough, as it combines pieces of social media content from big social streams with the visual elements of infographics and interactive platforms, engaging the spectator and allowing the event manager to benefit from real-time social media content throughout the lifecycle of the event. Geolocated user tweets popping up live on a map, and timelines updated in real time to tell the story of an event through photos, the most retweeted and shared media items, attendees' views and highlights, videos and live streams, are but a few of the tools that the social media walls platform delivers to event organizers to keep discussions going and to turn attendees from online socializing to real-life socializing.

The Thessaloniki International Film Festival and the Fête de la Musique Berlin events leverage SocialSensor to offer a stellar mobile app to their fans, and novel, collective knowledge models to the organizers.

> Watch the promotional video created for the official application of the Thessaloniki International Film Festival here.

To learn more about the Infotainment use case, click on the links to find out about our applications, how we evaluated the platform, and the latest results from the 54th Thessaloniki International Film Festival, held in November, and Fête de la Musique Berlin, which took place in June 2013. A Festival Timeline, enriched with crowdsourced suggestions from end users as well as with collective stories from TIFF54 and aggregated social data connected to the event, is also available.

The prototype features:

EventLive was implemented as a mobile application for festival attendees that aims to help them plan, share, discover and enjoy the event by providing recommendations, insights and advanced mobile-to-mobile networking, including live video streaming and social sharing. EventLive also provides advanced 2D and 3D mappings that assist the user in spotting film venues and localise Twitter conversations in a time-aware manner.

Twitter Sentiment: Powered by EventSense and located under the description of every film, this feature reveals the social sentiment around a film, showing in different colors the percentages of positive (green), negative (red) and neutral (yellow) tweets related to the specific film.

Social Search and Recommendations: Recommendations are based on the user's personal selections already in her "MyFilm" list, her film ratings, and the "MyFilm" selections of the other users (a rough sketch of this kind of ranking follows below). Search is performed in a separate tab of the EventLive user interface: the user enters one or more keywords and receives a list of films relevant to the query, ranked according to the same criteria as the film recommendations.
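
As a rough illustration of this kind of profile-based ranking (the actual recommendation methods are described in D5.2), the sketch below scores candidate films by how often they appear in the MyFilm lists of users who share selections with the target user; all names and the scoring scheme are illustrative assumptions.

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;

    // Hypothetical sketch of MyFilm-based scoring: films are ranked by how often
    // they appear in the MyFilm lists of users sharing selections with the target user.
    public class MyFilmRecommender {
        public static Map<String, Integer> score(Set<String> myFilms, List<Set<String>> otherUsersFilms) {
            Map<String, Integer> scores = new HashMap<>();
            for (Set<String> otherList : otherUsersFilms) {
                boolean overlaps = otherList.stream().anyMatch(myFilms::contains);
                if (!overlaps) continue; // only users with shared tastes contribute
                for (String film : otherList) {
                    if (!myFilms.contains(film)) {
                        scores.merge(film, 1, Integer::sum);
                    }
                }
            }
            return scores; // higher score = stronger recommendation
        }
    }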

My media search and share: After connecting, the user is able either to (a) record a video and live-stream it or (b) search for and discover other services that live-stream videos. Before recording/live-streaming a video, the user is asked to annotate it using tags from a cloud of film festival-related tags, and can also provide a free-text annotation of the video. The video search functionality allows the user to discover videos that are currently live-streaming by selecting the desired video type from the tag cloud list.

2D maps: These allow users to explore information such as screenings, venues, or tweets, geolocated on a map, with a timeslider extension for easily navigating the temporal aspect of this information. In addition, the system offers basic analytical tools, such as heat maps, to identify spots with a high density of tweets and similar data.

3D maps: Users can see spatially where interesting places and, possibly, friends currently are. They can touch/click on 3D objects, e.g. a building, and get background information on the venue and the current schedule of films. Switching between "bird's-eye" and "walking in the street" views provides maximum support for orientation.