Abstract:
News is no longer simply produced and consumed, but instead continually evolves over time as a cooperative dialog between news outlets and the public at large. News presentation must fundamentally reflect this, providing anytime organization of the latest events, conveying how story elements developed over time, and integrating the story into the larger world context. In short, the days of simple online aggregation are over; the world has already moved on. While the idea of journalists using computers as information discovery tools goes back several decades, never before has computation been understood to be so tightly integrated with the core of journalistic practice. Journalistic excellence today requires advanced data mining and search technologies, together with novel web services and integrative mashups.
We identify the following important challenges facing the field:
Automatic analysis of content, including news, blogs, micro-blogs, and comments: detect and resolve references to named entities (e.g., public figures); track these entities and the events involving them; assess quality (e.g., readability); infer polarity (e.g., sentiment); detect cases and patterns of re-use (e.g., via "memes" or larger units of similar text) and trace information flow.
Automatic analysis of explicit and implicit social networks: infer implicit social networks based on information flow patterns involving content producers and consumers; discover communities; infer authority and credibility of sources; find experts; identify influential community members.
Design of rich visualization and interaction interfaces for presenting dynamic, personalized news and learning about implicit relationships between news stories and reader communities.
Case-study evaluation of developed computational journalism methodology in a production setting, to provide a critical analysis of practical impact on newsroom quality, efficiency, and economics (cost and revenue).
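To make the first two challenge areas concrete, the toy sketch below illustrates two of the analyses named above: lexicon-based polarity inference over a comment, and community discovery in an implicit social network via connected components. This is an illustrative sketch only, not the project's actual methodology; the word lists, the reshare graph, and all names in it are invented for demonstration.

```python
from collections import defaultdict

# (1) Polarity: score a text by counting hits against tiny hypothetical
# positive/negative lexicons (real systems use far richer models).
POSITIVE = {"excellent", "insightful", "accurate"}
NEGATIVE = {"misleading", "biased", "wrong"}

def polarity(text: str) -> int:
    """Positive-minus-negative lexicon hit count for a whitespace-tokenized text."""
    tokens = text.lower().split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

# (2) Community discovery: treat "user A reshared content from user B" links
# as edges of an undirected graph and return its connected components.
def communities(edges):
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, groups = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:                       # iterative depth-first traversal
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            group.add(n)
            stack.extend(adj[n] - seen)
        groups.append(group)
    return groups

print(polarity("an insightful and accurate piece"))               # 2
print(communities([("ana", "bob"), ("bob", "cy"), ("dee", "eli")]))
```

In practice both steps would run over mined content and reshare/citation logs; here they only show the shape of the computation each challenge calls for.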
We research new tools for providing greater automation in news gathering, analysis, and delivery, while respecting the practical constraints of news producers and consumers. We emphasize the decomposition of stories into finer-grained elements and the discovery of implicit relations between them. We also emphasize the relationship between news and social networks, both explicit and implicit, which underlie the news and significantly shape its content, quality, and authority. Hands-on experience in the newsroom will enable practitioners to innovate on current news-production practice and identify important avenues for future research in computational journalism.
REACTION is organized into seven complementary research tasks, which jointly address the four problem areas identified above:
Mining Resources (led by: Paula Carvalho, LASIGE)
Entity and Event Tracking (led by: Francisco Couto, LASIGE)
Web Community Sensing (led by: Eduarda Mendes Rodrigues, FEUP)
Tracking Information Flow (led by: Mathew Lease, UTA)
Interaction and Personalization (led by: Luis Francisco-Revilla, UTA)
Query and Visualization (led by: Luís Sarmento, SAPO/LIACC)
Computational Newsroom (led by: António Granado, CIMJ)