Tactile reasoning and adaptive architecture for intelligence sense-making

Conference paper


Wong, B., Choudhury, S. and North Atlantic Treaty Organization 2011. Tactile reasoning and adaptive architecture for intelligence sense-making. Symposium on "Emerged/Emerging "Disruptive" Technologies (E2DT)". Madrid, Spain, 09-10 May 2011.
Type: Conference paper
Title: Tactile reasoning and adaptive architecture for intelligence sense-making
Authors: Wong, B., Choudhury, S. and North Atlantic Treaty Organization
Abstract

Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces [1]. Visual analytics combines automated analysis techniques with interactive visualisations to facilitate reasoning and making sense of large and complex data sets [2]. A key component of visual analytics is information visualisation: the communication of abstract data through visual representations that simplify, aggregate and reveal important relationships [3]. However, information visualisation is only one part of the equation. The other major component is the ability to manipulate the data directly, and to query and initiate analytic processes through that manipulation and the resulting information [1]. Together, interaction, visualisation and analytics combine to create powerful tools for supporting analysis and reasoning with large, mixed-format, multi-source data sets.
We are interested in the application of tactile reasoning to visual analytics. We define tactile reasoning as an interaction technique that supports the analytical reasoning process through the direct manipulation of information objects in a graphical user interface (GUI). In a study by Maglio et al. [4], participants using Scrabble pieces (individual letter tiles) generated more words when they were allowed to manipulate the pieces than when they were not allowed to interact with them. The tactile manipulation of the Scrabble pieces, i.e. the ability to rearrange them, allowed the participants to form words that they could not form without interaction. We therefore hypothesise that tactile reasoning, through the manipulation, rearrangement of and other interaction with information objects, enables individuals to see patterns in visually presented data sets that they might otherwise not see.
In this paper we describe the concept of tactile reasoning in the context of visual analytics, and the adaptive architecture needed to support it during real-time manipulation. We conduct our investigation through a lab prototype, INVISQUE (Interactive Visual Search and Query Environment) [4,5]. INVISQUE provides an information visualisation interface coupled with a "reasoning workspace" that facilitates tactile reasoning. INVISQUE was funded by JISC to provide an alternative interface that improves information search, retrieval and sense-making in electronic library resource discovery systems such as the Emerald and ISI electronic journal databases. We have developed an adaptive architecture that underlies INVISQUE and supports sense-making by giving the system the capability to rapidly adapt to changing circumstances.
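
To make the idea of direct manipulation of information objects concrete, the following is a minimal, hypothetical TypeScript sketch of a "reasoning workspace" in which retrieved items are movable cards and the analyst's spatial rearrangement of them can seed further queries. The names used here (InfoCard, ReasoningWorkspace, clusterNear) are assumptions for illustration only and do not describe the actual INVISQUE implementation.

```typescript
// Hypothetical sketch: information objects (index cards of search results)
// that can be dragged and regrouped on a 2-D workspace canvas.
interface InfoCard {
  id: string;
  title: string;      // e.g. the title of a retrieved journal article
  keywords: string[]; // metadata that can seed follow-up queries
  x: number;          // position on the workspace canvas
  y: number;
}

class ReasoningWorkspace {
  private cards = new Map<string, InfoCard>();

  add(card: InfoCard): void {
    this.cards.set(card.id, card);
  }

  // Direct manipulation: the analyst drags a card to a new position.
  moveCard(id: string, x: number, y: number): void {
    const card = this.cards.get(id);
    if (card) {
      card.x = x;
      card.y = y;
    }
  }

  // Spatial arrangement as implicit analytic input: cards placed close
  // together by the analyst are treated as a cluster.
  clusterNear(x: number, y: number, radius: number): InfoCard[] {
    return [...this.cards.values()].filter(
      c => Math.hypot(c.x - x, c.y - y) <= radius
    );
  }

  // Derive candidate query terms from a manually formed cluster.
  keywordsOfCluster(cluster: InfoCard[]): string[] {
    return [...new Set(cluster.flatMap(c => c.keywords))];
  }
}

// Usage: dragging two related cards next to each other lets the system
// infer shared keywords and initiate a new search from them.
const ws = new ReasoningWorkspace();
ws.add({ id: "a", title: "Visual analytics", keywords: ["sense-making"], x: 0, y: 0 });
ws.add({ id: "b", title: "Tactile reasoning", keywords: ["interaction", "sense-making"], x: 400, y: 300 });
ws.moveCard("b", 20, 10);
console.log(ws.keywordsOfCluster(ws.clusterNear(10, 5, 50)));
```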

Keywords: visual analytics, tactile reasoning, human-computer interaction
Conference: Symposium on "Emerged/Emerging "Disruptive" Technologies (E2DT)"
Publication dates
Print: May 2011
Publication process dates
Deposited: 06 Jun 2011
Output status: Published
Web address (URL): http://ftp.rta.nato.int/public/PubFullText/RTO/MP/RTO-MP-IST-099/MP-IST-099-17.doc
Language: English
Permalink: https://repository.mdx.ac.uk/item/835yv
