The European Union is funding a new project to let citizens worldwide search emerging wireless sensor networks: the "Search engine for MultimediA enviRonment generated contenT." The SMART effort is smarter than its acronym, promising to let anyone ask questions like "Which places are crowded?" after a soccer game, or even "Where are riots and fights happening?", as well as more traditional sensor-related searches like "Where is the live music going on that my friends are attending?"

R. Colin Johnson
The goal of the Intelligent Fusion Manager is to reason about events and provide them in the form of collection files to be indexed by the search engine.
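To make the data flow concrete, here is a minimal sketch of the idea: fused sensor events are serialized into a line-oriented "collection file" that a text search engine could then index. All names and fields here are hypothetical illustrations, not the actual SMART Fusion Manager API.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SensorEvent:
    """A hypothetical fused event produced by perceptual processing."""
    event_type: str   # e.g. "crowd_density" or "acoustic_event"
    location: str
    timestamp: str
    confidence: float
    description: str

def events_to_collection(events):
    """Serialize fused events as JSON lines (one indexable document
    per line), combining structured metadata into a searchable field."""
    lines = []
    for e in events:
        doc = asdict(e)
        doc["text"] = f"{e.event_type} at {e.location}: {e.description}"
        lines.append(json.dumps(doc))
    return "\n".join(lines)

events = [
    SensorEvent("crowd_density", "stadium_exit_3", "2013-06-01T21:45:00Z",
                0.92, "high crowd density detected by camera tracker"),
]
print(events_to_collection(events))
```

A real deployment would emit whatever collection format the target engine expects; the point is only that reasoning output becomes ordinary indexable documents.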
Here is what SMART says about its sensor node search engine: The Future Internet will include a large number of internet-connected sensors (including cameras and microphone arrays), which provide opportunities for searching and analyzing large amounts of multimedia data from the physical world, while also integrating them into value-added applications. Despite the emergence of techniques for searching physical world multimedia (including the proliferation of participatory sensing applications), existing multimedia search solutions do not provide effective search over arbitrarily large and diverse sources of multimedia data derived from the physical world.
SMART will introduce a holistic open source web-scale multimedia search framework for multimedia data stemming from the physical world. To this end, SMART will develop a scalable search and retrieval architecture for multimedia data, along with intelligent techniques for real-time processing, search and retrieval of physical world multimedia. The SMART framework will boost scalability in both functional and business terms, while being extensible in terms of sensors and multimedia data processing algorithms. The SMART framework will enable answering of queries based on the intelligent collection and combination of sensor generated multimedia data, using sensors and perceptual (A/V) signal processing algorithms that match the application context at hand.
This matching will be based on the sensors’ context and metadata (e.g., location, state, capabilities), as well as on the dynamic context of the physical world as the latter is perceived by processing algorithms (such as face detectors, person trackers, classifiers of acoustic events and components for crowd analysis). At the same time, SMART will be able to leverage Web 2.0 social network information in order to facilitate social queries on physical world multimedia. The main components of the SMART search framework will be implemented as open source software over the Terrier open source engine.
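The matching step described above can be sketched as a simple filter: given a query's location and the perceptual algorithm it requires, select only the sensors whose metadata (location, capabilities) can feed that algorithm. The sensor records, capability table, and function below are illustrative assumptions, not SMART's actual data model.

```python
# Hypothetical sensor registry with context metadata.
SENSORS = [
    {"id": "cam-01", "location": "square", "capabilities": {"video"}},
    {"id": "mic-07", "location": "square", "capabilities": {"audio"}},
    {"id": "cam-02", "location": "station", "capabilities": {"video"}},
]

# Which perceptual algorithms each capability can feed
# (mirroring the examples in the text).
ALGORITHMS = {
    "video": ["face_detector", "person_tracker", "crowd_analysis"],
    "audio": ["acoustic_event_classifier"],
}

def match_sensors(query_location, needed_algorithm):
    """Return IDs of sensors at the query location whose
    capabilities support the requested processing algorithm."""
    return [
        s["id"]
        for s in SENSORS
        if s["location"] == query_location
        and any(needed_algorithm in ALGORITHMS[c] for c in s["capabilities"])
    ]

print(match_sensors("square", "crowd_analysis"))  # prints ['cam-01']
```

A query like "Which places are crowded?" would route to crowd-analysis-capable cameras this way, while an acoustic query would instead select the microphone array.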