Author Information
- Received September 4, 2013
- Revision received September 9, 2013
- Accepted September 12, 2013
- Published online November 1, 2013.
- Partho P. Sengupta, MD, DM∗
- Zena and Michael A. Wiener Cardiovascular Institute, Icahn School of Medicine at Mount Sinai, New York, New York
- ∗Reprint requests and correspondence: Dr. Partho P. Sengupta, Icahn School of Medicine at Mount Sinai, One Gustave L. Levy Place, New York, New York 10029.
Accelerating trends in the dynamic digital era (from 2004 onward) have resulted in the emergence of novel parametric imaging tools that allow easy and accurate extraction of quantitative information from cardiac images. This review principally attempts to heighten awareness of newer emerging paradigms that may advance the acquisition, visualization, and interpretation of the large functional data sets obtained during cardiac ultrasound imaging. Incorporation of innovative cognitive software that allows advanced pattern recognition and disease forecasting will likely transform the human-machine interface and the interpretation process to achieve a more efficient and effective work environment. Novel technologies for automation and big data analytics that are already active in other fields need to be rapidly adapted to the health care environment, with new academic-industry collaborations to enrich and accelerate the delivery of newer decision-making tools for enhancing patient care.
Echocardiography remains the most versatile cardiac imaging tool in clinical practice, offering an ever-increasing array of measurements. Improved computer capabilities over the years, as predicted by Moore's law, have enhanced data extraction and visualization. New opportunities are emerging with interactive web-based networks that provide easy access to massive datasets in cardiac imaging. In addition to the rise of big data, the migration of hardware functionality to software has made possible the miniaturization of clinical ultrasound devices. These emerging technologies have the potential to make clinical imaging smarter and more cost-effective. This review provides a broad perspective on projected developments in cardiac ultrasound, highlighting new pathways to acquire and analyze large datasets using cognitive tools that are particularly relevant for cardiac function assessment.
The Big Data Revolution in Cardiac Ultrasound
Big data computing is perhaps the most significant development in the digital world over the last decade. It has been estimated that somewhere between 2003 and 2004, the amount of data created by the digital world accelerated exponentially, surpassing the total amount of data created in the previous 40,000 years of human civilization (1). A similar growth rate can be seen in the field of echocardiography, where the number of publications has increased exponentially over the past few decades (Fig. 1A). Beyond the sheer increase in publication volume, closer examination reveals a shift in what researchers are studying (Fig. 1B). In the 1980s, publications about Doppler imaging experienced the greatest growth in publication volume. By the beginning of the 21st century, however, the number of publications on parametric imaging had surpassed that of all other echocardiography disciplines; coincidentally, this occurred sometime between 2003 and 2004.
The number of quantitative parameters that can be measured during an echocardiogram continues to grow. For example, current guidelines emphasize the routine reporting of at least 12 measures of cardiac chamber quantification and the use of 8 variables for grading the severity of diastolic dysfunction (2,3). Moreover, with the recent growth in parametric imaging techniques such as 2-dimensional (2D) speckle tracking echocardiography, several new quantitative variables have been tested in clinical trials and are awaiting further standardization. It remains unclear, however, how feasibly these variables can be incorporated into a busy clinical practice.
Notwithstanding the increase in the number of variables, the new parameters offer a better understanding of the relationship between the structure and function of the myocardium. For example, speckles that are tracked to measure myocardial deformation are not just random acoustical interference patterns, but rather they have their genesis in the underlying counter-directional helical muscle geometry of the left ventricle (LV) (4). Knowing this has enabled researchers to comprehend the peculiarities of heart motion during the phases of the cardiac cycle as well as the contributions of 3-directional deformation toward maintaining global LV pump function. As a consequence, we know that a disease state that affects LV function impacts deformation in all 3 directions. For example, multidirectional deformation in heart failure reveals the presence of unique interactions that differentiate the phenotypic presentations (5). Nontransmural disease is associated with reduced longitudinal mechanics, whereas the mechanics in other directions are relatively preserved. On the other hand, a reduction of LV deformation in all 3 directions is associated with transmural disease and a reduction of the LV ejection fraction. Thus, it is often the cluster of deformation variables, rather than a single variable, that identifies a diseased phenotypic presentation.
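The cluster logic described above can be sketched as a simple rule-based check. Everything in the sketch below, including the reference strain magnitudes and the 0.8 "reduced" fraction, is a hypothetical placeholder chosen for illustration, not a validated clinical threshold:

```python
from dataclasses import dataclass

@dataclass
class Deformation:
    # Peak systolic strain magnitudes (absolute %, hypothetical values).
    longitudinal: float
    circumferential: float
    radial: float

# Hypothetical normal reference magnitudes (placeholders, not guideline values).
NORMAL = Deformation(longitudinal=20.0, circumferential=22.0, radial=40.0)

def classify(d: Deformation, reduced_fraction: float = 0.8) -> str:
    """Mirror the cluster logic from the text: reduced longitudinal strain with
    preserved circumferential and radial strain suggests a nontransmural
    pattern, whereas reduction in all 3 directions suggests a transmural
    pattern."""
    reduced = {
        axis: getattr(d, axis) < reduced_fraction * getattr(NORMAL, axis)
        for axis in ("longitudinal", "circumferential", "radial")
    }
    if all(reduced.values()):
        return "transmural pattern"
    if reduced["longitudinal"] and not (reduced["circumferential"] or reduced["radial"]):
        return "nontransmural pattern"
    return "indeterminate"

print(classify(Deformation(12.0, 21.0, 38.0)))  # nontransmural pattern
print(classify(Deformation(10.0, 11.0, 18.0)))  # transmural pattern
```

The point of the sketch is that no single variable drives the label; it is the joint pattern across the 3 directions that defines the phenotype.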
The introduction of 3-dimensional (3D) parametric imaging, pending standardization, may further increase the number of indices. Moreover, the interaction of myocardial tissue mechanics with LV fluid mechanics may further improve understanding of the hemodynamic presentations associated with diseased states. For example, the characterization of intracavity blood flow in 2 dimensions reveals the presence of a rotating mass of blood or a vortex formation inside the LV (6). The strength of the vortex formation may be relevant to in-depth understanding of the variables useful for improving the mechanical efficiency of the LV in heart failure (7,8).
As the quantity of parametric data increases, 2 challenges are surfacing. First, how can we store and later access these large databases? Second, what if the number of potential data combinations is so enormous that making sense of the data is practically unmanageable for a busy clinician? For example, if one examines the decision tree for grading the severity of diastolic dysfunction in current guidelines, the algorithm recognizes patterns based on outcomes from 8 echocardiographic variables (3). However, these 8 variables can mathematically be ordered in a staggering 40,320 ways (8 factorial), with an equivalent number of possible decision trees. Yet we hope that in real life, a single decision tree, based on human observations and consensus, would faithfully capture all patterns. How feasible will it be to recognize and handle these combinations? One can only imagine that a clinician would be interested in methods that assemble data automatically, with decision trees that score and categorize any variance appropriately.
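The arithmetic behind the 40,320 figure is easy to verify. In the sketch below, the variable names are generic placeholders rather than the guideline's actual list:

```python
import math
from itertools import permutations

# Eight generic echocardiographic variables (placeholders, not the
# guideline's actual list).
variables = [f"v{i}" for i in range(1, 9)]

# Each order in which the variables are interrogated corresponds to a
# structurally different decision tree, so the number of candidate trees
# grows factorially with the number of variables.
n_orderings = math.factorial(len(variables))
print(n_orderings)  # 40320

# Sanity check by direct enumeration (feasible only for small n).
assert sum(1 for _ in permutations(variables)) == n_orderings

# With the 12 chamber-quantification measures the count is already
# roughly 479 million orderings.
print(math.factorial(12))  # 479001600
```

The factorial growth is the crux of the problem: adding even a few parametric variables pushes the space of candidate decision trees far beyond what consensus-based tree building can explore by hand.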
Data Efficiency Using Cloud-Based Automation
The Internet has broadened the scope of medical information systems and has led to the development of distributed and interoperable information sources and services. Global health care systems offer a rich information sink that facilitates patient and practitioner mobility. An example of such a converged framework of shared services, known as “cloud computing,” was first tested for large-scale echocardiography use in the landmark ASE-REWARD (American Society of Echocardiography: Remote Echocardiography With Web-Based Assessments for Referrals at a Distance) study (9), conducted at a meditation camp in a remote part of India in January 2012. A total of 1,024 patients were imaged with handheld ultrasound units in <48 h. The scanned data were then uploaded to a cloud-based platform and distributed to 75 centers worldwide. We learned from this study that clinicians easily recognized structural datasets such as valve lesions, but the greatest variability came from their assessment of functional data such as the LV ejection fraction. To standardize the functional assessments, we subsequently performed the ASE VISION (American Society of Echocardiography: Value of Interactive Scanning for Improving Outcomes of New Learners) study (10). During this study, sonographers on the West Coast of the United States trained physicians in India using a novel Internet-based imaging solution (StatVideo, Andover, Massachusetts) that allowed real-time interactions as the Indian physicians scanned the patients. These physicians then scanned over 1,000 patients for cardiac function assessment. Both studies established that physicians can be educated remotely in a cloud computing environment to perform focused echocardiograms, which proved useful in expediting mass triage in remote areas.
Use of Robotics and Automation in Cardiac Ultrasound
Quantitative measurements in echocardiography show a marked dependence on the skills of the operator to ensure image quality during repeated measurements. Compared with traditional hand-held cardiac ultrasound examinations, automation in image acquisition and analysis may enable more cost-effective and widespread applications of cardiac ultrasound while maintaining image quality. Currently, robotic arms are able to process complete 3D image volumes without human intervention and are being used for laparoscopic procedures as an intraoperative guidance tool (11). The use of robotic arms brings accuracy, consistency, dexterity, and maneuverability. Moreover, robotic arms are capable of precisely tracking and generating evenly spaced slices without human limitations such as hand cramps, tremors, and fatigue. A robotic cardiac ultrasound arm may be a useful complement for the emerging field of telemedicine. A robotized tele-echocardiography approach was first developed in the late 1990s and is currently being used to enhance the quality of health care in medically remote areas (12). This concept is also being explored for performing long-distance intercontinental examinations (Fig. 2). Future developments in robotic arms are expected to couple with automated image analysis (machine vision) for completely automated acquisition, recognition, and quantitative assessment of cardiac images without the need for manual interactions. This may be helpful for performing high-throughput cardiac ultrasound scanning in hospitals and communities, and for collecting large-volume datasets in conjunction with cloud-based automation.
As more and more data become available, the challenge will be how to mine the vast datasets to facilitate and expedite decision making. One step toward meeting this challenge could be to develop cognitive tools for automated analysis of big functional datasets. Until recently, computer-aided intelligence was labeled “artificial” and was thought to be potentially useful for clinical decision making through artificial intelligence–based techniques such as a neural network, a support vector machine, a decision tree, or a fuzzy expert system with rules produced by a genetic algorithm. Recent developments in the field of cognitive computing have been boosted by the development of natural, brain-like intelligence (13). Our minds do not work in the slow, sequential, logical ways of thinking that are characteristic of artificial intelligence (M. Aparicio, personal communication, September 2013). Instead, our thinking is fast and mostly based on our memory of associations that “connect the dots.” In the past, brain-like thinking was dismissed in favor of pure logic, and the inaccuracies of human reasoning were cited as grounds for favoring statistical analytics. However, statistics was originally applied to dice and card games, which do not reflect the more general world. When a player trains with high intensity, a better performance is due to more than just chance. Just as our brains can make sensible decisions in a complex and ever-changing world, computer-aided natural intelligence can help with anticipatory sense making so that we connect the dots for predictive decision making. Such estimates could additionally be supplemented with statistical reasoning to enable the selection of the best connections, especially in nonlinear, high-dimensional situations such as those posed by complex diseases.
Figure 3 shows the application of a natural-intelligence cognitive tool (Saffron Technology, Cary, North Carolina) for assessing speckle-tracking echocardiography–derived functional data. This technology has been used for national security and defense, and for war zone assessments and aerospace missions (14). We first programmed the software to examine a dataset for developing a memory bank of associations. For this model, we used one of the toughest examples in the clinical world in which multivariable data are important for arriving at a correct diagnosis: the differentiation of deformation patterns seen in constrictive pericarditis versus restrictive cardiomyopathy. Previous investigations had suggested a specific constellation of deformation patterns that differentiates the former from restrictive cardiomyopathy (15). The challenge was to see whether the cluster of deformation variables that differentiates constrictive pericarditis from restrictive cardiomyopathy could be recognized using a brain-like cognitive tool. The 2D speckle tracking data from each patient included 20 time points in a cardiac cycle and 9 measurements (velocity, displacement, strain, and strain rate in 2 directions, and area) obtained from 6 segments each in the apical 4-chamber view and the short-axis view of the LV at the level of the papillary muscles. Close to 2 million associations were examined in 15 patients with constriction and 14 patients with restrictive cardiomyopathy. Given the high dimensionality and assumed nonlinearity of echocardiographic deformation data, a coincidence matrix of associations was constructed and the entropy-based information was computed for each class. Because the resulting matrices were extremely large, heat map visualization (Figs. 3 to 5) was limited to the most informative interactions, although all the coincident information was exploited for classification. Differentiating these heat maps made the separation of the disease states obvious.
Dominant associations between variables seen in restrictive cardiomyopathy (Fig. 4, red) were separable from dominant associations seen in constrictive pericarditis (Fig. 3, green), although some overlap was seen to exist (Fig. 5, yellow).
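The proprietary algorithm itself is not public, but the general idea of a coincidence matrix with entropy-based scoring can be sketched on synthetic data. Everything below, including the 2 stand-in variables, the high/low binning rule, and the simulated patient values, is an illustrative assumption, not the study's actual method or data:

```python
import math
import random
from collections import Counter
from itertools import combinations

random.seed(0)

def coincidence_matrix(records):
    """Count how often each pair of (variable, binned value) events co-occurs
    across the records of one class."""
    counts = Counter()
    for rec in records:
        events = sorted((name, "high" if value > 0 else "low")
                        for name, value in rec.items())
        for a, b in combinations(events, 2):
            counts[(a, b)] += 1
    return counts

def pair_entropy(counts):
    """Shannon entropy of the co-occurrence distribution: lower entropy means
    a few dominant associations carry most of the class's information."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Synthetic stand-ins for the 2 classes: in "constriction" the 2 variables
# move together, whereas in "restriction" they move in opposite directions.
constriction = [{"v1": random.gauss(1, 0.3), "v2": random.gauss(1, 0.3)}
                for _ in range(15)]
restriction = [{"v1": random.gauss(1, 0.3), "v2": random.gauss(-1, 0.3)}
               for _ in range(14)]

for label, records in (("constriction", constriction), ("restriction", restriction)):
    cm = coincidence_matrix(records)
    dominant = cm.most_common(1)[0][0]
    print(label, dominant, round(pair_entropy(cm), 2))
```

In this toy setting, the dominant association differs between the 2 classes, echoing how the dominant green and red associations in the heat maps separate constriction from restriction despite some overlap.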
The aforementioned strategies of cloud automation, robotics, data analytics, and cognitive tools illustrate one of the many ways in which intelligent and efficient platforms could develop in the field of cardiac imaging. Advances in digital sensors, communications, computation, and storage will continue to create huge collections of data. For example, search engine companies such as Google, Yahoo, and Microsoft have created an entirely new industry by capturing information that is freely available on the World Wide Web and providing it to people in useful ways such as satellite images, driving directions, and image retrieval. Just as search engines have transformed how we access information, other forms of big data computing will transform the activities of scientific researchers and medical practitioners. The National Science Foundation has enthusiastically embraced data-intensive computing, with several new funding initiatives for creating the cyber infrastructure of the 21st century (16). In 2012, the White House also announced support for big data initiatives and for building the road map for a trustworthy cyberspace (17,18).
For those who fear that this growth of computer use in cardiac ultrasound imaging will distance us from our patients, the answer is that the merger will be seamless. Cardiac images and analytics will be available in real time on wearable computers. Wearable computers on glasses, wristbands, and professional clothing are no longer a futuristic dream (19,20). Their applicability is currently being tested at point-of-care locations. Because this technology is at an early stage of development, the devices are still complex. Once they reach maturity, cardiac ultrasound imaging and advanced analytics will be easily integrated into the professional workflow. The potential of such augmented reality is exciting and offers us many new challenges. As a community, we have a responsibility to embrace these changes and to build this future for our patients and ourselves.
Presented as the 2013 Feigenbaum Lecture at the 49th Annual Scientific Session of the American Society of Echocardiography, Minneapolis, Minnesota, July 1, 2013.
Dr. Sengupta has received research funding from Forest Laboratories; and is on the advisory board for Medical Intelligence Ltd.
- Goldin I.
- Lang R.M., Bierig M., Devereux R.B., et al., for the Chamber Quantification Writing Group, American Society of Echocardiography's Guidelines and Standards Committee, European Association of Echocardiography
- Sengupta P.P., Korinek J., Belohlavek M., et al.
- Goliasch G., Goscinska-Bis K., Caracciolo G., et al.
- Abe H., Caracciolo G., Kheradvar A., et al.
- deBruyn J. Making a Computer ‘Think’. Triangle Business Journal. December 2, 2011. Available at: http://www.bizjournals.com/triangle/print-edition/2011/12/02/making-a-computer-think.html?page=2. Accessed September 4, 2013.
- Shane H.
- Sengupta P.P., Krishnamoorthy V.K., Abhayaratna W.P., et al.
- Kalil T. Big data is a big deal. The White House, Office of Science and Technology Policy. Available at: http://www.whitehouse.gov/blog/2012/03/29/big-data-big-deal. Accessed September 4, 2013.
- Obama B. “Remarks by the President on securing our nation's cyber infrastructure” [transcript]. May 29, 2009. The White House Office of the Press Secretary. Available at: http://www.whitehouse.gov/the-press-office/remarks-president-securing-our-nations-cyber-infrastructure. Accessed September 4, 2013.
- First US Surgery Transmitted Live via Google Glass (w/Video). Medical Xpress. August 27, 2013. Available at: http://medicalxpress.com/news/2013-08-surgery-transmitted-google-glass-video.html. Accessed August 29, 2013.