
Paulo Lopez-Meyer, Edward Sazonov, The 5th International Conference on Sensing Technology, Nov. 28th - Dec. 1st, 2011, Palmerston North, New Zealand, pp. 156-160.


Respiratory Inductance Plethysmography (RIP) allows studying the physiology of the breathing process and provides the ability to characterize it in an automated manner. An efficient and robust automatic breath segmentation technique that can be applied to sensor signals acquired in free-living conditions is of interest due to the significant variability in the breathing patterns of different activities. The gold standard for breath segmentation has been visual recognition of breath cycles in respiratory signals interpreted by an expert. This process is impractical for long-term monitoring of respiration, especially when performed by wearable sensors. In this work, a feasibility study is presented of a technique for automatic breath segmentation based on peak and valley detection to determine the beginning and end of each breath segment. The proposed segmentation method is applied to breathing recordings collected by a wearable RIP sensor during four different activities: resting, reading aloud, food intake, and smoking. Significant differences in the breathing waveform patterns are easily observed for each of these activities. Results suggest that the breath segmentation technique studied in this paper is robust enough to be used across different activities, with up to 96.6% accuracy for resting, 89.9% for reading, 91.1% for food intake, and 89.2% for smoking.
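The general idea of valley-to-valley breath segmentation can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the strict/non-strict comparison scheme for tie-breaking flat extrema, and the `min_prominence` noise threshold are all assumptions introduced here for demonstration on a synthetic signal.

```python
import math

def segment_breaths(signal, min_prominence=0.1):
    """Segment a respiratory signal into breath cycles.

    A breath is taken to span from one valley (start of inhalation)
    to the next valley; the intervening peak marks the transition
    from inhalation to exhalation.
    """
    valleys, peaks = [], []
    for i in range(1, len(signal) - 1):
        # Mixed strict/non-strict comparisons so a flat extremum
        # (two equal adjacent samples) is detected exactly once.
        if signal[i] < signal[i - 1] and signal[i] <= signal[i + 1]:
            valleys.append(i)
        elif signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            peaks.append(i)

    breaths = []
    for v0, v1 in zip(valleys, valleys[1:]):
        # Require a peak between consecutive valleys with enough
        # amplitude to reject noise-level oscillations.
        cycle_peaks = [p for p in peaks if v0 < p < v1]
        if cycle_peaks:
            p = max(cycle_peaks, key=lambda i: signal[i])
            if signal[p] - signal[v0] >= min_prominence:
                breaths.append((v0, p, v1))
    return breaths

# Synthetic RIP-like waveform: a sinusoid with a 50-sample period
# over 200 samples, i.e. four breathing cycles.
signal = [math.sin(2 * math.pi * t / 50) for t in range(200)]
breaths = segment_breaths(signal)
```

Note that valley-to-valley pairing yields only the complete cycles: four periodic cycles contain four valleys and hence three fully bounded breath segments. A real RIP recording would additionally need baseline-drift removal and smoothing before extrema detection, which this sketch omits.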