Stress echocardiography with smartphone: real-time remote reading for regional wall motion

Citro R;
2017-01-01

Abstract

The diffusion of smartphones offers access to the best remote expertise in stress echocardiography (SE). The aim of this study was to evaluate the reliability of SE based on smartphone filming and reading. A set of 20 SE video clips was read in random sequence, with a multiple-choice six-answer test, by ten readers from five countries (Italy, Brazil, Serbia, Bulgaria, Russia) of the "SE2020" study network. The gold standard for accuracy was a core-lab expert reader in agreement with angiographic verification (0 = wrong, 1 = right). The same set of 20 SE studies was read, in random order and > 2 months apart, on a desktop workstation and via smartphone by the ten remote readers. Image quality was graded from 1 (poor but readable) to 3 (excellent). Kappa (κ) statistics were used to assess intra- and inter-observer agreement. Image quality was comparable on the desktop workstation vs. the smartphone (2.0 ± 0.5 vs. 2.4 ± 0.7, p = NS), as was the average reading time per case (90 ± 39 vs. 82 ± 54 s, p = NS). The overall diagnostic accuracy of the ten readers was similar for desktop workstation vs. smartphone (84% vs. 91%, p = NS). Intra-observer agreement (desktop vs. smartphone) was good (κ = 0.81 ± 0.14), and inter-observer agreement was good and similar via desktop or smartphone (κ = 0.69 vs. κ = 0.72, p = NS). In conclusion, the diagnostic accuracy and consistency of SE reading among certified readers were high and similar via desktop workstation and smartphone.
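The agreement measure reported above is Cohen's kappa, which corrects raw agreement between two readers for the agreement expected by chance. As an illustration only (the case labels below are hypothetical, not data from the study), a minimal sketch in Python:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same cases:
    kappa = (p_o - p_e) / (1 - p_e),
    where p_o is observed agreement and p_e is chance agreement
    estimated from each rater's marginal label frequencies."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # observed proportion of cases where the two raters agree
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # chance agreement from the product of marginal frequencies
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two readers classifying 10 SE cases
# as normal ("N") or ischemic ("I")
reader_1 = ["N", "I", "N", "N", "I", "I", "N", "I", "N", "N"]
reader_2 = ["N", "I", "N", "I", "I", "I", "N", "I", "N", "N"]
print(round(cohens_kappa(reader_1, reader_2), 2))  # → 0.8
```

Values around 0.7–0.8, as reported in the abstract, are conventionally read as good to very good agreement.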
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11695/135092
Citations
  • PMC: n/a
  • Scopus: n/a
  • Web of Science (ISI): 8