
Computer Assisted Medical & Surgical Diagnostics
Lovely Professional University, Phagwara, Punjab

Abstract: Computer Assisted Surgery represents new concepts and a set of methods that use computer technology for presurgical planning and for guiding or performing surgical interventions. This paper presents various new techniques and technologies used for computer assisted medical and surgical diagnostics. It presents the evolution of diagnostic and therapeutic procedures as a process of convergence of technologies coming from different fields and involving different disciplines. In particular, it illustrates how modern surgery evolved thanks to fundamental knowledge of biology; with the introduction of intraoperative imaging techniques and of robotics, surgical procedures became much more predictable, precise and effective. Various organizations are currently working on computer aided medical surgery.

Index Terms: Smart instruments, smart surgical tools, remote patient monitoring, computer assisted surgery, virtual surgery, image guided operating robot, wireless instruments.

Wireless instruments play a very vital role in computer aided surgery. There is a great demand for wireless medical instruments targeting a variety of applications including diagnostics, surgical, in vivo, remote patient monitoring, and indoor positioning. The term smart instrument highlights the fact that many of these wireless devices combine bio-sensors with a wireless capability. This wireless capability has many facets, as it can allow the bio-sensor to connect to other devices and larger networks in an ad hoc and flexible way. For example, small battery-powered bio-sensors such as an insulin pump with glucometer, a wearable fall detector, an electrocardiogram monitor, and an oximeter can be wirelessly connected through a body area network (BAN) to allow an elderly patient to be remotely monitored by a healthcare provider while at home.
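The BAN-based remote monitoring scenario above can be illustrated with a minimal sketch of the gateway side: readings from several bio-sensors are collected and out-of-range values are flagged for the healthcare provider. The sensor names, normal ranges, and triage logic here are illustrative assumptions, not part of any specific BAN standard.

```python
# Hypothetical BAN gateway sketch: collect readings from wireless bio-sensors
# and flag out-of-range values for the remote healthcare provider.
# Sensor names and normal ranges are illustrative assumptions.
NORMAL_RANGES = {
    "glucose_mg_dl": (70, 140),
    "spo2_pct": (94, 100),
    "hr_bpm": (50, 110),
}

def triage(readings):
    """Return the names of sensors whose latest reading is out of range."""
    alerts = []
    for name, value in readings.items():
        lo, hi = NORMAL_RANGES[name]
        if not lo <= value <= hi:
            alerts.append(name)
    return alerts

print(triage({"glucose_mg_dl": 182, "spo2_pct": 97, "hr_bpm": 72}))
# → ['glucose_mg_dl']
```

In a real deployment, each reading would arrive over the BAN radio link and the alert would be forwarded to the provider's network rather than printed.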
Other wireless hospital applications include locating assets and streamlining the work of hospital staff through integrated enterprise architectures that directly link a doctor's smartphone or personal digital assistant (PDA) to the larger hospital network. X-ray imaging is routinely used to guide intravascular therapy. Percutaneous transluminal coronary angioplasty and stent placement are two common cardiac procedures that are performed under X-ray fluoroscopy. First, a contrast-enhanced angiogram is acquired to diagnose and locate stenosis in the coronary artery tree. Then, a catheter carrying a balloon or stent is advanced to the site of the lesion and deployed under image guidance.
I. INTRODUCTION

The goal of this paper is to review various technologies used in today's era for computer aided medical surgery and diagnostics. The motivation for establishing virtual and computer aided surgery as part of the clinical routine is based on the objectives of decreasing surgical risks through an individualized, patient-centered operation planning process and increasing surgical precision. Before the dawn of biomedical engineering, the advancement of medical instrumentation was primarily driven by doctors, physicians and surgeons. A transitional phase began about three decades ago, when medical professionals and engineers began to apply their specialties together to solve various medical science problems. Bioinstrumentation became one of the fastest growing industries, providing numerous solutions from clinical equipment to life-sustaining medical implants. Advancements in sensor technologies and innovation in computerization spawned a vast number of new devices for numerous applications. There are various methods for performing medical surgery with the help of computers. Augmented reality (AR) has been the topic of intensive research in recent decades. Many scientific and commercial fields identified AR as a promising technology to improve existing solutions and to enable new applications that were not possible without it. Beside the gaming and entertainment community and various industrial engineering branches, medical engineering recognized the potential of AR for preoperative diagnosis, intraoperative navigation, and postoperative control. Intraoperative navigation systems for orthopedic surgery, such as spine, knee or hip surgery, present the imaging data and the navigational information on monitors in the operating room. Apart from augmented reality, wireless instruments also play a vital role in computer aided surgery.

II. SMART BIOINSTRUMENTS

Smart bioinstrumentation is the offspring of the IEEE 1451 Smart Transducer Interface Standard and medical instruments. It is achieved by enabling each sensing instrument with a network capable application processor (NCAP) to provide local control and feedback and, at the same time, connecting each instrument to a higher level network for remote monitoring, control and analysis, thus infusing the sensing system with a higher level of cognitive ability, or smarts. Smart instruments can be divided into five categories: diagnostics, surgical, in vivo, remote patient monitoring, and patient or asset tracking (indoor positioning). Each of these applications operates in a different indoor environment and has a different set of functional requirements. The requirements for a specific application can be defined by answering some basic questions: What is the minimum required data rate? What is the maximum allowable latency of the application? How many access points are needed? What kind of interference exists within the operational environment?
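The requirement questions above feed directly into protocol selection. A minimal sketch of that selection step is shown below; the candidate protocols are real, but the latency figures and the idea of encoding them in a lookup table are illustrative assumptions for this example.

```python
# Hypothetical helper: shortlist wireless protocols for a smart instrument
# from two of the basic requirement questions (data rate, latency).
# Rates are nominal maximums; latency figures are illustrative assumptions.
PROTOCOLS = {
    # name: (max data rate in kb/s, assumed typical latency in ms)
    "ZigBee":    (250, 30),
    "Bluetooth": (2000, 100),
    "UWB":       (27000, 5),
}

def shortlist(min_rate_kbps, max_latency_ms):
    """Return the protocols that satisfy both requirements."""
    return [name for name, (rate, lat) in PROTOCOLS.items()
            if rate >= min_rate_kbps and lat <= max_latency_ms]

print(shortlist(min_rate_kbps=100, max_latency_ms=50))
# → ['ZigBee', 'UWB']
```

A full selection procedure would also weigh the remaining questions (number of access points, interference environment) and power budget, which matter greatly for implanted and battery-powered devices.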

Fig. 2. The primary angle (PA) and secondary angle (SA) define the geometric orientation of the imaging system with respect to the patient. Zero degree primary and secondary angles correspond to an anterior-posterior projection. The primary angle diagram is viewed from the patient's feet.

IV. IMAGE GUIDED OPERATING ROBOT


The aim of the Image Guided Operating Robot (IGOR) is to help surgeons and physicians use multimodality data in a rational and quantitative way, in order to plan and perform medical interventions. The IGOR objectives aim at improving the quality of an intervention: they make it easier, more accurate, faster, and closer to a pre-operative simulation in which accurate objectives can be defined, and they also make it possible to envisage new interventions and to validate protocols of therapeutic research. Obviously, this is long term research, with many potential clinical applications.

These factors drive the choice of the wireless protocol to be used for these applications. Figure 1 provides an overview of different applications, including diagnostics, surgical, in vivo, remote patient monitoring, and patient/asset tracking, with examples from current research and commercial devices.

III. X-RAY PROJECTION METHODS

A. X-Ray Projection Imaging

X-ray projection imaging is a three-dimensional (3-D) to two-dimensional (2-D) imaging process that can be described by a 3×4 projection matrix P.

A. General Methodology


The IGOR methodology is based on three non-sequential steps, briefly described in this section.

Acquisition and modelling of multi-modality information: At this stage, information is acquired mostly with medical imaging devices (CT, MRI, digital radiology, echography, positron or single photon emission tomography, ...), but also with multimodality sensors coming from classical computer vision, or devised to analyze various signals: electrophysiology, pressure, Doppler, tactile sensing.

Definition of a surgical strategy: A strategy now has to be defined, using all the available information. A medical objective is assigned, and simulation of the intervention makes it possible to anticipate the morphological and functional consequences of the intervention. This definition of the strategy is first based on pre-operative data (3D images and anatomical models).

Registration of multimodality information and systems: Pre-operative images, anatomical models, intra-operative information and finally guiding systems (such as robots) must be registered. First, a reference system is associated with each modality or system; then, after calibration, the transformations between all the reference systems are estimated during the registration step. These transformations are usually defined by 6 parameters (rigid matching) or by piecewise functions (elastic matching). A general method is based on segmentation of reference anatomical structures and on accurate 3D/2D and 3D/3D registration between the reference structures.

V. VIRTUAL SURGERY
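The 6-parameter rigid matching used in the registration step of the IGOR methodology can be sketched as a least-squares alignment of two corresponding point sets, for example fiducials segmented in a pre-operative image and touched intra-operatively. The sketch below uses the Kabsch algorithm; the function name and the point data are illustrative, and real registration pipelines add outlier handling and validation.

```python
import numpy as np

def rigid_register(src, dst):
    """Estimate the 6-parameter rigid transform (R, t) mapping the
    N x 3 point set src onto dst in the least-squares sense (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps R a proper rotation (det(R) = +1)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Illustrative check: recover a known rotation about the z-axis plus a shift
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, 2.0, 3.0])
src = np.random.default_rng(0).random((12, 3))
dst = src @ R_true.T + t_true
R, t = rigid_register(src, dst)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

Elastic (piecewise) matching, mentioned in the text for deformable anatomy, generalizes this by replacing the single (R, t) with locally varying transforms.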

The matrix P is the product of a 3×3 matrix representing a perspective projection and a 3×4 matrix describing the orientation of the imaging system relative to a world coordinate system. Nu and Nv are the image dimensions in pixels. The 3×3 matrices Rpa and Rsa represent the orientation of the imaging C-arm, as defined by the primary and secondary angles (Fig. 2). IS is the intensifier size, SOD is the source-to-isocenter distance, and SID is the source-to-intensifier distance (Fig. 2). Patients undergoing diagnostic left heart catheterization were recruited to participate in an IRB approved protocol. All images were acquired on a Siemens biplane cardiovascular angiography system at a rate of 30 frames/s. The clinicians were not constrained in positioning the C-arms. The technicians were requested not to move the patient table during the image acquisition.
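The construction of P from the quantities named above (PA, SA, SID, SOD, IS, Nu, Nv) can be sketched as an intrinsic matrix times an extrinsic matrix. The axis conventions, rotation order, and principal point placement below are assumptions made for illustration; a real C-arm calibration would fix these from the system geometry.

```python
import numpy as np

def projection_matrix(pa_deg, sa_deg, sid, sod, is_mm, nu, nv):
    """Sketch of a 3x4 X-ray projection matrix P = K @ [R | t].
    Axis conventions and signs are illustrative assumptions."""
    pa, sa = np.radians(pa_deg), np.radians(sa_deg)
    # C-arm orientation from the primary and secondary angles (Fig. 2)
    R_pa = np.array([[np.cos(pa), 0.0, np.sin(pa)],
                     [0.0,        1.0, 0.0],
                     [-np.sin(pa), 0.0, np.cos(pa)]])
    R_sa = np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(sa), -np.sin(sa)],
                     [0.0, np.sin(sa),  np.cos(sa)]])
    R = R_sa @ R_pa
    # Translation placing the isocenter at distance SOD from the source
    t = np.array([0.0, 0.0, sod])
    # Intrinsics: focal length SID, scaled to pixels by the intensifier size
    fu, fv = sid * nu / is_mm, sid * nv / is_mm
    K = np.array([[fu, 0.0, nu / 2],
                  [0.0, fv, nv / 2],
                  [0.0, 0.0, 1.0]])
    return K @ np.hstack([R, t[:, None]])

P = projection_matrix(pa_deg=30, sa_deg=0, sid=1000, sod=750,
                      is_mm=220, nu=512, nv=512)
print(P.shape)  # (3, 4)
```

With this convention the isocenter (world origin) projects to the image center, a quick sanity check: P @ [0, 0, 0, 1] normalizes to (Nu/2, Nv/2).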

Standards for digital images, the better connectivity of modern imaging devices, and networks or telephone lines with a bandwidth suitable for image communication resulted in the development of various teleradiology systems. Beyond common basic functions, researchers must consider the variations in national standards; therefore, different approaches exist. Grigsby reviews the state of teleradiology in the United States. Similar systems have been developed and used in Europe. Although many radiological procedures have been implemented so far, it is desirable to extend the role of teleradiology systems in the clinical routine by employing dedicated software modules. The possibility of adding specific components becomes important so that the software can be adapted to the local workflow and tasks. Furthermore, interdisciplinary collaborations can benefit from an appropriate interface description. Providing functional openness to developers overcomes the limitations of systems that are based on data interfaces only. As time has passed, the objectives and demands placed on the computer-assisted processing of medical images have grown. As before, visualization still plays a significant role in the range of available methods. However, it is supplemented with image processing tasks in order to offer problem-oriented operation planning and technical developments in the field of virtual reality. Many applications have been developed to analyze medical data sets with tools for interactive segmentation, measurement, surgical simulation, and planning. Some take advantage of fast graphics hardware; others benefit from special input or output devices.
Since virtual reality is a new medical technology compared to conventional procedures, most of the work is experimental. Recently, however, scenarios have been worked out in which virtual reality promises better diagnostic support or intraoperative execution of the planning results.
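The "functional openness" described above, letting developers add specific components so software adapts to the local workflow, is essentially a plug-in architecture. A minimal sketch of such a registry follows; the class names and the `run` interface are hypothetical, not any actual teleradiology API.

```python
# Minimal plug-in registry sketch for an extensible teleradiology viewer.
# The registry, decorator, and tool interface are hypothetical examples.
PLUGINS = {}

def register(name):
    """Class decorator that publishes a tool under a stable name."""
    def decorator(cls):
        PLUGINS[name] = cls
        return cls
    return decorator

@register("measure")
class MeasurementTool:
    """A site-specific component added without modifying the core viewer."""
    def run(self, image_id):
        return f"measuring on {image_id}"

# The core application looks tools up by name and knows only the interface.
tool = PLUGINS["measure"]()
print(tool.run("ct_slice_042"))
```

The design point is the one the text makes: the core system exposes a functional interface rather than only a data interface, so local workflow tools can be added by third parties.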

VI. COMPUTER ASSISTED DIAGNOSIS SYSTEM

To develop a computer-assisted diagnosis system that may receive physician acceptance requires an analysis of the cognitive processes that underlie medical decision making. This analysis involves a comprehensive identification of the indices and the diagnostic decisions, as well as their organization, which can be defined as the knowledge representation. It also involves the description of the mechanisms for mobilizing this knowledge: faced with a new patient, how does the endoscopist reach the correct decision, according to his level of expertise? Cognitive and decision sciences, as well as artificial intelligence, have yielded substantial insights into the nature of diagnostic reasoning. Concrete applications have been developed in different fields of medical diagnosis. Nevertheless, the particular domain of digestive endoscopy has not been explored yet.
A. IDENTIFYING THE MEDICAL DECISION ELEMENTS

Endoscopy differs from traditional medical imaging

Fig. 3. Visualizations of liver tumors and vessel trees. (a) Visualization of a simulated tumor within the vascular tree and transparent liver tissue. (b) Liver segment VIII, which depends on vessels touched by the tumor, has been virtually resected before visualization. (c) The simulated tumor has been placed in a more proximal location (segments V and VI) than in (a). (d) Visualization with virtually resected liver segment.

modalities in several aspects. First, endoscopy is not based on the bio-physical response of organs to X-rays or ultrasound, but allows a direct observation of the human internal cavities via an optical device and a light source. Second, the endoscopic investigation imposes physical contact between the patient and the physician, and the endoscopist can assess the patient's complaints before the endoscopic procedure. Finally, the patient's discomfort during the investigation prohibits repeated examinations and, given the usual absence of a storage system, no decision element remains available at the end of the examination; this requires that all information is gathered during a limited time period. The endoscopic procedure entails a systematic approach. After a local pharyngeal anesthesia, the endoscopic probe is introduced into the esophagus. The second step consists of a rapid progression in the digestive lumen until the third duodenum. The position of the distal extremity of the device is deduced from anatomical references (esogastric junction or pylorus, for example) and the length of intubation. The third step is the withdrawal of the device, which involves a systematic exploration of the whole digestive mucosa in order to detect parietal lesions and abnormal lumen sizes and contents. Each anomaly is described (anatomical location, size, and specific features such as shape, color or relief regularity). The endoscopic diagnosis is based on the analysis of the association between these elementary lesions and the medical context, which can lead to the proposal of complementary procedures that could (or could not) confirm the endoscopic diagnosis. These include vital coloration of the digestive mucosa, histological examination of biopsy specimens obtained during endoscopy, or other morphological or functional procedures. All these elements, as well as the diagnostic decisions, are written in a medical report that concludes the investigation.
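The mapping just described, from a lesion's descriptors (location, size, shape, color, relief) to a diagnostic finding, can be sketched as a naive rule-based fusion. The finding names, descriptor keys, and rules below are invented for illustration and are not taken from any clinical terminology.

```python
# Hypothetical descriptor fusion: map syntactic descriptors of an observed
# anomaly to candidate endoscopic findings. Rules are illustrative only.
FINDING_RULES = {
    "ulcer": {"relief": "excavated", "color": "white-based"},
    "polyp": {"relief": "protruding", "shape": "rounded"},
}

def classify_finding(descriptors):
    """Return the findings whose rule descriptors all match (naive fusion)."""
    return [name for name, rule in FINDING_RULES.items()
            if all(descriptors.get(key) == value
                   for key, value in rule.items())]

print(classify_finding({"relief": "protruding", "shape": "rounded",
                        "size": "5mm"}))
# → ['polyp']
```

A realistic system would combine such finding-level evidence with the medical context (age, gender, reasons for examination) before proposing a diagnosis, as the text describes.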

B. NATURE OF OBJECTS

From an engineering point of view, the medical reasoning during endoscopy can be compared to the interpretation of a scene of objects, in which the scenes are the endoscopic diagnoses and the objects are the set of information gathered before and during the examination (see Fig. 5). Two distinct decision spaces are identified: the finding diagnosis and the endoscopic diagnosis. The diagnosis of the endoscopic findings can be assimilated to a pattern-recognition process. It involves focusing on areas that differ from the aspect of the contiguous organ or from the normal anatomic schema of the organ, identifying the syntactic descriptors of these areas, and fusing these elements of information in order to identify organ lesions. The diagnosis of the endoscopic scene is based on the fusion of information derived from the collected objects. The nature of the objects is heterogeneous and includes information about the patient (age and gender), the reasons for examination, and findings. The focus of the investigation is to standardize the endoscopic record for current practice and to facilitate education and training in digestive endoscopy. The minimal standard terminology (MST) for a computerized endoscopic database constitutes the framework for the identification and organization of endoscopic information and decisions, while the basic concepts of the terminology imply the incompleteness of the description. However, the syntactic descriptors of endoscopic findings and the set of values (or indices) for each descriptor cannot be retrieved in MST, since the terminology does not detail all the features of the endoscopic lesions.

Fig. 4. (1) From syntactic descriptors to endoscopic findings and (2) from objects to endoscopic diagnosis.

CONCLUSION

This paper has described various techniques of computer assisted medical and surgical diagnostics. In today's world there are many kinds of computer assisted medical surgeries, which help make medical diagnosis fast and largely error-free.

REFERENCES

[1] R. Pausch, D. Profitt, and G. Williams, "Quantifying immersion in virtual reality," in Computer Graphics (Proc. ACM SIGGRAPH '97), Los Angeles, CA, 1997, pp. 13-18.
[2] G. Faulkner and M. Krauss, "Evaluation of 3D input and output devices for their suitability for medical applications," in Proc. Computer Assisted Radiology, Berlin, Germany, 1995, pp. 1069-1074.
[3] K. H. Englmeier, M. Haubner, A. Lösch, F. Eckstein, M. D. Seemann, W. van Eimeren, and M. Reiser, "Hybrid rendering of multidimensional image data," Methods of Information in Medicine, vol. 36, pp. 1-10, 1997.
[4] H. Evers, A. Mayer, U. Engelmann, A. Schröter, U. Baur, K. Wolsiffer, and H. P. Meinzer, "Integration of volume visualization and tools for data exploration in a teleradiology system using a plug-in concept," Int. J. Medical Informatics, vol. 53, nos. 2-3, pp. 265-275, 1999.
[5] M. Haubner, C. Krapichler, A. Lösch, K. H. Englmeier, and W. van Eimeren, "Virtual reality in medicine: Computer graphics and interaction techniques," IEEE Trans. Inform. Technol. Biomed., vol. 1, pp. 61-72, 1997.
[6] K. H. Englmeier, M. Haubner, C. Krapichler, D. Schuhmann, M. Seemann, H. Furst, and M. Reiser, "Virtual bronchoscopy based on spiral-CT images," in Proc. SPIE Conf. Medical Imaging, San Diego, CA, 1998, pp. 427-438.
[7] H. M. Fenlon and J. T. Ferucci, "Virtual colonoscopy," AJR, vol. 169, pp. 453-458, 1997.
[8] G. W. Hunt, P. F. Hemler, and D. J. Vining, "Automated virtual colonoscopy," in Proc. SPIE Conf. Medical Imaging: Image Display, San Diego, CA, 1997, pp. 535-541.
[9] J. Beier, D. Schmitz, M. Gutberlet, T. Rohlfing, T. Vogel, and R. Felix, "Quantification and virtual angioscopy of aortic stenoses by CT and MR," in Proc. CAR '98, Amsterdam, The Netherlands, 1998, p. 876.
[10] T. Nakagohri, F. A. Jolesz, S. Okuda, T. Asano, T. Kenmochi, O. Kainuma, Y. Tokor, H. Aoyama, W. E. Lorensen, and R. Kikinis, "Virtual endoscopy of mucin-producing pancreas tumors," in Proc. MICCAI '98, Cambridge, MA, 1998, pp. 926-933.
[11] A. M. Demiris, A. Mayer, and H. P. Meinzer, "3-D visualization in medicine: An overview," in Contemporary Perspectives in Three-Dimensional Biomedical Imaging, C. Roux et al., Eds. Amsterdam, The Netherlands: IOS Press, 1997, vol. 30, pp. 79-105.
[12] S. Morris and J. Paradiso, "Shoe-integrated sensor system for wireless gait analysis and real-time feedback," in Proc. 2nd Joint IEEE EMBS and BMES Conf., Oct. 2002, pp. 2468-2469.
[13] W. Qu, S. Islam, M. Mahfouz, G. To, and S. Mofasta, "Micro-cantilever array pressure measurement system for biomedical instrumentation," IEEE Sensors Journal, vol. 10, no. 2, pp. 321-330, Feb. 2010.
[14] M. Kuhn, M. Mahfouz, J. Turnmire, Y. Wang, and A. Fathy, "A multi-tag access scheme for indoor UWB localization systems used in medical environments," in Proc. IEEE Topical Conf. on Bio Wireless, Phoenix, AZ, Jan. 2011, pp. 75-78.
[15] A. Rohlmann, U. Gabel, F. Graichen, A. Bender, and G. Bergmann, "An instrumented implant for vertebral body," vol. 3, no. 5, pp. 339-347, Oct. 2009.
[16] New England Healthcare Institute, "Tele-ICUs: Remote Management in Intensive Care Units," 2011, www.masstech.org/ehealth/cmyk_tele_icu.pdf.
[17] F. Graichen, R. Arnold, A. Rohlmann, and G. Bergmann, "Implantable 9-channel telemetry system for in vivo load measurements with orthopedic implants," IEEE Trans. Biomed. Eng., vol. 54, no. 2, pp. 253-261, Feb. 2007.
[18] F. Graichen, G. Bergmann, and A. Rohlmann, "Patient monitoring system for load measurement with spinal fixation devices," Med. Eng. & Phys., vol. 18, no. 2, pp. 167-174, Mar. 1996.
[19] F. Graichen, G. Bergmann, and A. Rohlmann, "Hip endoprosthesis for in vivo measurement of joint force and temperature," J. Biomech., vol. 32, no. 10, pp. 1113-1117, Oct. 1999.
[20] D. D'Lima, C. Townsend, S. Arms, B. Morris, and C. Colwell, "An implantable telemetry device to measure intra-articular tibial forces," J. Biomech., vol. 38, no. 2, pp. 299-304, Feb. 2005.
[21] A. Rohlmann, U. Gabel, F. Graichen, A. Bender, and G. Bergmann, "An instrumented implant for vertebral body replacement that measures loads in the anterior spinal column," Med. Eng. & Phys., vol. 29, no. 5, pp. 580-585, Jun. 2007.
[22] M. MacDonald and M. Szpuszta, Pro ASP.NET 3.5 in C# 2008: Includes Silverlight 2, 3rd ed., 2009, 1478 pp.
[23] J. Sack, SQL Server 2008 Transact-SQL Recipes (Books for Professionals by Professionals), Apress, 2008, 872 pp.
[24] Classification Manual for Voice Disorders-I, Special Interest Division 3, Voice and Voice Disorders, ASHA, Lawrence Erlbaum Associates, Mahwah, New Jersey, 2006.
