Abstract (EN):
GUI interfaces demand considerable visual attention to operate, excluding users from important information encoded only in the layout. In an era of mobile devices, we must enhance auditory designs to facilitate access to interactive content for blind users, people with low vision, and users in any context of use. This paper is part of an experimental approach to human perception based on Gestalt theories of form and on computational methods for processing acquired brain signals, relating visual and auditory stimuli. We present a computational approach underlying the acquisition of the brain's electrical response to stimuli, Event-Related Potentials (P300), based on a fundamental visual syntax grounded in Gestalt phenomenology, with new interim statistical results for the multi-perceptive modeling of information processing (visual and auditory). The ultimate goal is to frame a lexicon and/or basic common patterns that can be applied directly to the well-grounded development of GUIs (Graphical User Interfaces) and AUIs (Auditory User Interfaces).
Language:
English
Type (Faculty Evaluation):
Scientific
No. of pages:
4