Abstract (EN):
The development of biomedical signal processing algorithms typically assumes that the data can be sampled at a uniform rate and without loss of samples. Although these are valid assumptions for Holter applications or clinical testing, they become questionable when patients are monitored remotely over inherently lossy communication networks. The task for networking engineers has been to design better, more reliable protocols that prevent packet losses from affecting the signal processing algorithms. However, the inherent constraints of resource-limited devices and lossy networks used for remote monitoring make this objective infeasible in many situations. Given that some transmission losses are irreparable, this paper poses the following questions: (i) how do current algorithms react to losses, and (ii) what alternatives are available that still guarantee reliable monitoring and detection of emergency events? For the latter, we consider two options: the use of current algorithms after a loss concealment stage, and the design of loss-aware algorithms. We argue that a joint design of network protocols and signal processing algorithms is instrumental in providing reliable biomedical monitoring. We propose a simple yet powerful network model covering a variety of packet-loss channels and data packetization mechanisms. Extensive numerical results address question (i), focusing on the sensitivity and positive predictivity of standard ECG algorithms under a variety of network scenarios. Using the MIT-BIH arrhythmia database and simple loss concealment mechanisms, we show that even small percentages of packet loss can have a significant impact on an algorithm's performance.
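As a concrete illustration of the pipeline the abstract describes (not code from the paper), the following minimal Python sketch packetizes a signal, drops packets through a two-state bursty loss channel, and applies a simple concealment before computing the detection metrics named above. The Gilbert-Elliott channel, the zero-order-hold concealment, and all parameter values are assumptions chosen for illustration only; the paper's actual channel models and concealment mechanisms may differ.

```python
import numpy as np

def gilbert_elliott_losses(n_packets, p_gb=0.05, p_bg=0.5, loss_in_bad=1.0, rng=None):
    """Boolean mask, True where a packet is lost (bursty two-state channel).

    p_gb: prob. of moving Good -> Bad; p_bg: prob. of moving Bad -> Good.
    These parameters are illustrative, not taken from the paper.
    """
    rng = rng or np.random.default_rng(0)
    lost = np.zeros(n_packets, dtype=bool)
    bad = False  # start in the Good state
    for i in range(n_packets):
        # Stay Bad with prob (1 - p_bg); enter Bad from Good with prob p_gb.
        bad = rng.random() < ((1 - p_bg) if bad else p_gb)
        lost[i] = bad and rng.random() < loss_in_bad
    return lost

def conceal_zero_order_hold(signal, packet_len, lost):
    """Replace each lost packet with the last correctly received sample."""
    out = signal.copy()
    last = 0.0
    for i, is_lost in enumerate(lost):
        seg = slice(i * packet_len, (i + 1) * packet_len)
        if is_lost:
            out[seg] = last
        elif out[seg].size:
            last = out[seg][-1]
    return out

def sensitivity_ppv(tp, fn, fp):
    """Beat-detection metrics: Se = TP / (TP + FN), +P = TP / (TP + FP)."""
    return tp / (tp + fn), tp / (tp + fp)

# Example: 10 s of a dummy ECG-like trace at 360 Hz (the MIT-BIH sampling
# rate), split into 90-sample packets (~0.25 s each).
ecg = np.sin(2 * np.pi * 1.2 * np.arange(3600) / 360.0)
mask = gilbert_elliott_losses(len(ecg) // 90)
concealed = conceal_zero_order_hold(ecg, 90, mask)
print(f"lost {mask.sum()} of {mask.size} packets")
print("Se, +P on illustrative counts:", sensitivity_ppv(tp=95, fn=5, fp=3))
```

In such a setup, the concealed stream would be fed to a standard QRS detector and its sensitivity and positive predictivity compared against the loss-free case, which is the kind of comparison the abstract's question (i) refers to.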
Language:
English
Type (Professor's evaluation):
Scientific