Cognitive Model for Image Sensor Motion Control
Abstract
This paper presents a model inspired by neurobiology and cognitive psychology, implemented using a neural-network-like parallel computational strategy. The goal of the model is to show how a visual-cortex-inspired system can control camera alignment on a given camera hardware setup, in a way analogous to how the brain controls eye movements. The system (computational model and camera hardware) is integrated into an experimental environment, the Intelligent Space, a room equipped with ubiquitous computing devices. The Intelligent Space is built from several Distributed Intelligent Network Devices (DINDs). A DIND consists of a sensor input, integrated intelligence, and a communication interface. The model presented in this paper serves as the integrated intelligence component of a particular DIND. Implemented on parallel hardware, the proposed model operates in real time.
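To make the DIND structure described above concrete, the following is a minimal, hypothetical Python sketch of its three components (sensor input, integrated intelligence, communication interface) and one control cycle. All class and method names are illustrative assumptions and are not taken from the paper or its implementation.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Frame:
    """A single image captured by the DIND's sensor (camera)."""
    pixels: list          # raw pixel data (placeholder type)
    timestamp: float      # capture time in seconds


class Sensor(Protocol):
    """Sensor input component: delivers image frames."""
    def read(self) -> Frame: ...


class Intelligence(Protocol):
    """Integrated intelligence: the cortex-inspired model that maps
    a frame to a camera alignment (pan/tilt) command."""
    def compute_alignment(self, frame: Frame) -> tuple[float, float]: ...


class Communication(Protocol):
    """Communication interface: shares results with other DINDs."""
    def broadcast(self, pan: float, tilt: float) -> None: ...


class DIND:
    """Distributed Intelligent Network Device: sensor input,
    integrated intelligence, and a communication interface."""

    def __init__(self, sensor: Sensor, intelligence: Intelligence,
                 comm: Communication) -> None:
        self.sensor = sensor
        self.intelligence = intelligence
        self.comm = comm

    def step(self) -> None:
        """One control cycle: sense, decide on camera alignment, communicate."""
        frame = self.sensor.read()
        pan, tilt = self.intelligence.compute_alignment(frame)
        self.comm.broadcast(pan, tilt)
```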