
Original Research

Open Access

An accurately supervised motion-aware deep network for non-contact pain assessment of trigeminal neuralgia mouse model

  • Zhiheng Feng1
  • Mingcai Chen2
  • Jue Zhang3
  • Xin Peng4,*

1Academy for Advanced Interdisciplinary Studies, Peking University, 100871 Beijing, China

2State Key Laboratory for Novel Software Technology, Nanjing University, 210023 Nanjing, Jiangsu, China

3College of Engineering, Peking University, 100871 Beijing, China

4Department of Oral and Maxillofacial Surgery, Peking University School of Stomatology, 100081 Beijing, China

DOI: 10.22514/jofph.2024.008 Vol. 38, Issue 1, March 2024, pp. 77-92

Submitted: 13 April 2023 Accepted: 10 November 2023

Published: 12 March 2024

*Corresponding Author(s): Xin Peng E-mail: pxpengxin@263.net

Abstract

Pain assessment in trigeminal neuralgia (TN) mouse models is essential for exploring its pathophysiology and developing effective analgesics. However, pain assessment methods for TN mouse models have not been widely studied, resulting in a critical gap in our understanding of TN. With the rapid advancement of deep learning, numerous deep learning-based pain assessment methods have emerged. Nonetheless, these methods have some limitations: (1) insufficiently objective supervision signals for training, (2) failure to account for the dynamic behavioral characteristics of mouse models in the constructed models and (3) inadequate generalization ability of the models. In this study, we initially constructed an objective pain grading dataset as the ground truth for model training, which remedies the limitation of prior studies that relied on subjective evaluation as supervisory signals. Then, we proposed a novel deep neural network, named trigeminal neuralgia pain assessment network (TNPAN), which fuses the static texture characteristics and dynamic behavioral characteristics of mouse facial expressions. The promising experimental results demonstrate that TNPAN exhibits exceptional accuracy and generalization capability in pain assessment.
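The fusion of static texture and dynamic behavioral characteristics described above can be illustrated with a minimal two-stream sketch: a 2D-convolutional branch encodes a single face frame (texture), a 3D-convolutional branch encodes a short frame stack (motion), and the two feature vectors are concatenated before a classifier head. This is a hypothetical PyTorch illustration of the general two-stream idea; the layer sizes, branch structure and class count are assumptions, not the published TNPAN architecture.

```python
import torch
import torch.nn as nn


class TwoStreamFusionNet(nn.Module):
    """Illustrative two-stream fusion network (not the published TNPAN)."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        # Static branch: 2D convolutions over a single frame capture texture cues.
        self.static_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # Dynamic branch: 3D convolutions over a frame stack capture motion cues.
        self.dynamic_branch = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
            nn.Flatten(),
        )
        # Concatenated features feed a small classification head.
        self.head = nn.Linear(16 + 16, num_classes)

    def forward(self, frame: torch.Tensor, clip: torch.Tensor) -> torch.Tensor:
        # frame: (B, 3, H, W); clip: (B, 3, T, H, W)
        fused = torch.cat(
            [self.static_branch(frame), self.dynamic_branch(clip)], dim=1
        )
        return self.head(fused)


model = TwoStreamFusionNet()
logits = model(torch.randn(2, 3, 64, 64), torch.randn(2, 3, 8, 64, 64))
print(logits.shape)  # torch.Size([2, 3])
```

In practice, each branch would be a deeper backbone (e.g., a 2D ResNet and a 3D/pseudo-3D network), but the concatenate-then-classify fusion pattern is the core of combining appearance with motion.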


Keywords

Oral diseases; Trigeminal neuralgia; Pain assessment; Convolutional neural networks


Cite and Share

Zhiheng Feng, Mingcai Chen, Jue Zhang, Xin Peng. An accurately supervised motion-aware deep network for non-contact pain assessment of trigeminal neuralgia mouse model. Journal of Oral & Facial Pain and Headache. 2024. 38(1); 77-92.

References

[1] Colloca L, Ludman T, Bouhassira D, Baron R, Dickenson AH, Yarnitsky D, et al. Neuropathic pain. Nature Reviews Disease Primers. 2017; 3: 17002.

[2] Cruccu G, Di Stefano G, Truini A. Trigeminal neuralgia. The New England Journal of Medicine. 2020; 383: 754–762.

[3] Langford DJ, Bailey AL, Chanda ML, Clarke SE, Drummond TE, Echols S, et al. Coding of facial expressions of pain in the laboratory mouse. Nature Methods. 2010; 7: 447–449.

[4] Dalla Costa E, Minero M, Lebelt D, Stucke D, Canali E, Leach MC. Development of the horse grimace scale (HGS) as a pain assessment tool in horses undergoing routine castration. PLOS ONE. 2014; 9: e92281.

[5] Littlewort GC, Bartlett MS, Lee K. Automatic coding of facial expressions displayed during posed and genuine pain. Image and Vision Computing. 2009; 27: 1797–1803.

[6] Khan RA, Meyer A, Konik H, Bouakaz S. Framework for reliable, real-time facial expression recognition for low resolution images. Pattern Recognition Letters. 2013; 34: 1159–1168.

[7] Werner P, Al-Hamadi A, Limbrecht-Ecklundt K, Walter S, Gruss S, Traue HC. Automatic pain assessment with facial activity descriptors. IEEE Transactions on Affective Computing. 2017; 8: 286–299.

[8] Bargshady G, Zhou X, Deo RC, Soar J, Whittaker F, Wang H. Enhanced deep learning algorithm development to detect pain intensity from facial expression images. Expert Systems with Applications. 2020; 149: 113305.

[9] Tuttle AH, Molinaro MJ, Jethwa JF, Sotocinal SG, Prieto JC, Styner MA, et al. A deep neural network to assess spontaneous pain from mouse facial expressions. Molecular Pain. 2018; 14: 1744806918763658.

[10] Vidal A, Jha S, Hassler S, Price T, Busso C. Face detection and grimace scale prediction of white furred mice. Machine Learning with Applications. 2022; 8: 100312.

[11] Deuis JR, Vetter I. The thermal probe test: a novel behavioral assay to quantify thermal paw withdrawal thresholds in mice. Temperature. 2016; 3: 199–207.

[12] McLennan K, Mahmoud M. Development of an automated pain facial expression detection system for sheep (Ovis Aries). Animals. 2019; 9: 196.

[13] Sotocina SG, Sorge RE, Zaloum A, Tuttle AH, Martin LJ, Wieskopf JS, et al. The rat grimace scale: a partially automated method for quantifying pain in the laboratory rat via facial expressions. Molecular Pain. 2011; 7: 1744–1755.

[14] Deuis JR, Dvorakova LS, Vetter I. Methods used to evaluate pain behaviors in rodents. Frontiers in Molecular Neuroscience. 2017; 10: 284.

[15] Kopaczka M, Ernst L, Heckelmann J, Schorn C, Tolba R, Merhof D. Automatic key frame extraction from videos for efficient mouse pain scoring. 2018 5th International Conference on Signal Processing and Integrated Networks (SPIN). IEEE: New York. 2018.

[16] Szegedy C, Vanhoucke V, Ioffe S, Shlens J, Wojna Z. Rethinking the inception architecture for computer vision. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE: New York. 2016.

[17] Lu J, Yang B, Liao J, Chen B, Lu M, Zhang W, et al. Olfactory ensheathing cells alleviate facial pain in rats with trigeminal neuralgia by inhibiting the expression of P2X7 receptor. Brain sciences. 2022; 12: 706.

[18] Dixon WJ. Efficient analysis of experimental observations. Annual Review of Pharmacology and Toxicology. 1980; 20: 441–462.

[19] Chaplan SR, Bach FW, Pogrel JW, Chung JM, Yaksh TL. Quantitative assessment of tactile allodynia in the rat paw. Journal of Neuroscience Methods. 1994; 53: 55–63.

[20] Bonin RP, Bories C, De Koninck Y. A simplified up-down method (SUDO) for measuring mechanical nociception in rodents using von frey filaments. Molecular Pain. 2014; 10: 1744–1726.

[21] He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE: New York. 2016.

[22] Insafutdinov E, Andriluka M, Pishchulin L, Tang S, Levinkov E, Andres B, et al. ArtTrack: articulated multi-person tracking in the wild. 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE: New York. 2017.

[23] Insafutdinov E, Pishchulin L, Andres B, Andriluka M, Schiele B. DeeperCut: a deeper, stronger, and faster multi-person pose estimation model. In Leibe B, Matas J, Sebe N, Welling M (eds) Computer Vision—ECCV 2016. Lecture Notes in Computer Science (pp. 34–50). 1st edn. Springer: Cham. 2016.

[24] Pishchulin L, Insafutdinov E, Tang S, Andres B, Andriluka M, Gehler P, et al. DeepCut: joint subset partition and labeling for multi person pose estimation. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE: New York. 2016.

[25] Bottou L. Stochastic gradient descent tricks. Lecture Notes in Computer Science. 2012; 10: 421–436.

[26] Paszke A, Gross S, Massa F, Lerer A, Bradbury J, Chanan G, et al. PyTorch: an imperative style, high-performance deep learning library. Advances in Neural Information Processing Systems 32. NeurIPS: Vancouver. 2019.

[27] Terrier L, Hadjikhani N, Destrieux C. The trigeminal pathways. Journal of Neurology. 2022; 269: 3443–3460.

[28] Gambeta E, Chichorro JG, Zamponi GW. Trigeminal neuralgia: an overview from pathophysiology to pharmacological treatments. Molecular Pain. 2020; 16: 1744806920901890.

[29] Melzack R, Wall PD. Pain mechanisms: a new theory. A gate control system modulates sensory input from the skin before it evokes pain perception and response. Science. 1965; 150: 971–979.

[30] Wilson SG, Mogil JS. Measuring pain in the (knockout) mouse: big challenges in a small mammal. Behavioural Brain Research. 2001; 125: 65–73.

[31] Yang L, Ertugrul IO, Cohn JF, Hammal Z, Jiang D, Sahli H. FACS3D-Net: 3D convolution based spatiotemporal representation for action unit detection. 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII). IEEE: New York. 2019.

[32] Qiu Z, Yao T, Mei T. Learning spatio-temporal representation with pseudo-3D residual networks. 2017 IEEE International Conference on Computer Vision (ICCV). IEEE: New York. 2017.

[33] Dolensek N, Gogolla N. Machine-learning approaches to classify and understand emotion states in mice. Neuropsychopharmacology. 2021; 46: 250–251.

[34] Wotton JM, Peterson E, Anderson L, Murray SA, Braun RE, Chesler EJ, et al. Machine learning-based automated phenotyping of inflammatory nocifensive behavior in mice. Molecular Pain. 2020; 16: 1744806920958596.

[35] Neff EP. Painless pain assessments with machine learning. Lab Animal. 2018; 47: 149.

[36] Salama ES, El-Khoribi RA, Shoman M, Shalaby MAW. EEG-based emotion recognition using 3D convolutional neural networks. International Journal of Advanced Computer Science and Applications. 2018; 9: 329–337.

[37] Ji S, Xu W, Yang M, Yu K. 3D convolutional neural networks for human action recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2013; 35: 221–231.


Abstracted / indexed in

Science Citation Index (SCI)

Science Citation Index Expanded (SCIE)

BIOSIS Previews

Scopus

Cumulative Index to Nursing and Allied Health Literature (CINAHL)
