Abstract (EN):
The exponential growth of the Internet of Things (IoT) has led to massive data production, characterized by the continuous collection and transmission of data from interconnected devices and sensors to Internet-based storage and processing services. In this scenario, several challenges arise in efficiently managing such large volumes of data, especially when IoT applications involve multiple heterogeneous sensor-based sources, which calls for effective data fusion techniques. This article proposes a new Artificial Intelligence (AI)-based approach to process data at the edge of the network infrastructure, adopting a contextual relevance paradigm as an optimization criterion. For this purpose, an AI-based mechanism is developed to predict analytical metrics, notably reduction and distortion ratios, for different sensor data configurations across a range of pre-selected Data Reduction (DR) algorithms. Using these predictions, trained on existing datasets, a selector module identifies the most suitable DR approach for a given IoT application. The proposed approach differs from existing work by using contextual relevance to guide the dynamic selection of DR algorithms and by predicting results rather than computing them, which significantly reduces computational overhead. It is technique-agnostic, scalable, and adaptable to any DR method with available training data, acting as an open decision-support solution. Its ability to dynamically select data reduction algorithms for performance enhancement can contribute significantly to improving many heterogeneous IoT-based applications. Experimental results using IoT time-series data demonstrate its effectiveness in predicting DR metrics and selecting optimal algorithms with minimal latency, highlighting its potential in time-critical and data-intensive IoT scenarios.
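The selection step summarized above can be illustrated with a minimal sketch: given metrics predicted for each candidate DR algorithm, a scoring rule weighted by contextual relevance picks the most suitable one. The names, the scoring formula, and the example values below are illustrative assumptions, not the article's actual implementation.

```python
# Minimal sketch of the selector idea described in the abstract.
# All names and the scoring rule are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class PredictedMetrics:
    algorithm: str          # candidate data-reduction (DR) algorithm
    reduction_ratio: float  # predicted fraction of data removed (higher is better)
    distortion_ratio: float # predicted signal distortion (lower is better)

def select_dr_algorithm(candidates, relevance_weight=0.5):
    """Pick the DR algorithm whose predicted metrics best fit the context.

    `relevance_weight` is a hypothetical contextual-relevance knob: values
    near 1.0 favour fidelity (low distortion), values near 0.0 favour
    aggressive reduction.
    """
    def score(m: PredictedMetrics) -> float:
        return (1.0 - relevance_weight) * m.reduction_ratio \
               - relevance_weight * m.distortion_ratio
    return max(candidates, key=score)

# Example: predictions that, in the article, would come from the trained AI model.
predictions = [
    PredictedMetrics("piecewise-aggregation", 0.80, 0.12),
    PredictedMetrics("wavelet-compression",   0.65, 0.04),
    PredictedMetrics("uniform-sampling",      0.90, 0.25),
]
best = select_dr_algorithm(predictions, relevance_weight=0.7)
print(best.algorithm)  # algorithm chosen for a fidelity-sensitive context
```

Because the metrics are predicted rather than computed by running every DR algorithm on the incoming stream, such a selection step stays cheap enough for edge deployment, which is the overhead reduction the abstract highlights.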
Language:
English
Type (Professor's evaluation):
Scientific
No. of pages:
21