TY - JOUR
AU - Shuhaiber, Jeffrey H.
AB -
Objective: To evaluate the history and current knowledge of computer-augmented reality in the field of surgery and its potential goals in education, surgeon training, and patient treatment.

Data Sources: The National Library of Medicine's database and additional library searches.

Study Selection: Only articles suited to the surgical sciences with a well-defined aim of study, a clear methodology, and a precise description of outcome were included.

Data Synthesis: Augmented reality is an effective tool for executing surgical procedures requiring low-performance surgical dexterity; it remains a science shaped mainly by stereotactic registration and ergonomics. Strong evidence was found that it is an effective teaching tool for training residents. Weaker evidence suggests a significant influence on surgical outcome, both morbidity and mortality. No evidence of cost-effectiveness was found.

Conclusions: Augmented reality is a new approach to executing detailed surgical operations. Because its application is still at a preliminary stage, further research is needed to evaluate its long-term clinical impact on patients, surgeons, and hospital administrators. Its widespread use and the universal transfer of such technology will remain limited until registration and ergonomics are better understood.

The computer has invaded society and has become an integral part of continual advancements in medicine and science. Surgeons can no longer ignore the impact of this technology on daily practice and patient treatment. Patient histories are now stored as electronic records. Computer programs exist to place patient orders, including imaging tests. Computer-based simulation empowers surgical residents in visualizing anatomy.

In the preoperative phase, most surgeons form a mental image of where the target lesion is and plan the route of exposure. Marking structures of interest on radiographic images that can be superimposed on live video camera images allows a surgeon to visualize the surgical site and the overlaid graphics simultaneously, creating a so-called semi-immersive environment. The term is synonymous with augmented reality (AR). Virtual reality (VR) and AR are the 2 principal means by which computer technology will meet reality and offer the ultimate surgical environment.

This article reviews the developmental milestones and application of AR in the operating room across surgical specialties and discusses the hopes and fears engendered by this evolving technology.

WHAT IS AR?

Augmented reality is a recent technology similar to the VR paradigm. It combines 3-dimensional (3D) computer-generated objects and text superimposed onto real images and video, all in real time. The main difference between the two is that AR incorporates real images and video frames, whereas VR is built from 3D graphics alone.

The ability to quantify and manipulate spatial information so that one set of data can be related to another (registration) is fundamental to surgical navigation. Registration is the product of mathematical methods that relate 2 or more coordinate spaces, combined with stereotactic operating systems that integrate these databases into the operative field.

Stereoscopy, the science of vision and the perception of parallax, is not new to medical imaging. It has been extensively used in general radiography and cerebral angiography. Once a stereoscopic image is generated, interaction and display in the operating room become possible.
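To make the registration concept concrete, the sketch below (a generic illustration, not a method reported in this review) relates 2 coordinate spaces from paired fiducial points using the standard least-squares rigid-body solution; all coordinates are invented for the example.

```python
# A minimal sketch of paired-point rigid registration, the mathematical core
# of relating one coordinate space to another. Given fiducials located both
# in image space and in patient (operating room) space, the least-squares
# rotation R and translation t are recovered with an SVD (the classic
# Arun/Kabsch solution widely used in surgical navigation research).
import numpy as np

def register_rigid(image_pts, patient_pts):
    """Return R (3x3) and t (3,) minimizing ||R @ image + t - patient||."""
    ci, cp = image_pts.mean(axis=0), patient_pts.mean(axis=0)
    H = (image_pts - ci).T @ (patient_pts - cp)                  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    return R, cp - R @ ci

# Four skin fiducials in CT coordinates, then touched with a tracked pointer
# in the operating room (all coordinates invented for illustration, in mm).
ct = np.array([[0.0, 0.0, 0.0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0.0, 0.0, 1.0]])
room = ct @ R_true.T + np.array([5.0, -2.0, 1.0])
R, t = register_rigid(ct, room)
print(np.allclose(ct @ R.T + t, room))   # True: the 2 spaces are now related
```

In practice, the measured fiducials are noisy, so the residual of this fit is reported as an accuracy figure rather than an exact match.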
The need for an ideal head-tracking system that accurately and continuously follows all subtle movements of the surgical site and transparently adjusts the initial registration throughout the procedure, in a noninvasive way, was the driving force behind the evolution of AR. Several hurdles were recognized and overcome early on in neurosurgery by a series of surgical navigation systems that advanced from mechanical arms to 2-dimensional charge-coupled devices. Limitations in orientation, anatomic landmark registration, and freedom of navigation became apparent to the operating surgeon, for example, as the constant alternation of conjugate gaze between computed tomographic images and the surgical field.

HOW IS AR ACHIEVED?

The Hybrid Patient Model

An important and crucial problem to be solved was how to merge all the data and systems necessary for achieving AR. Building a hybrid patient model was a plausible answer, but it created a registration problem for both the developer and the surgeon. Virtual reality offered a potential solution: build a virtual 3D patient model. Enthusiasts appreciated that creating a hybrid model (real and virtual) required a complete representation merging the real patient during surgery with useful computerized patient data. The latter would encompass preoperative images (computed tomography, magnetic resonance imaging, magnetic resonance angiography, and others), anatomic models, intraoperative images (x-rays, ultrasound, video endoscope, and microscope), and position and shape information, and would coordinate auditory or visual systems with operative guiding systems, ie, systems that give the accurate position of a tool freely moved by the surgeon or a robot.

Lavallee et al put forward a definition of the hybrid model construction. As they defined it, a coordinate system would be associated with each preoperative and intraoperative imaging modality, each statistical geometrical model, each sensor, each surgical tool, and each guiding system. Building the hybrid model would then require computing a chain of geometrical transformations (T1, T2, . . . Tn) between all the coordinate systems involved. This chain of transformations is the essence of a successfully functioning AR system.

The AR System

Once the hybrid patient model provided the virtual component of AR, the next task was to register the virtual frame of reference with what the user is seeing: the real patient. This registration is more critical in an AR system than in a VR system because our eyes are more sensitive to visual misalignments than to the visual-kinesthetic errors that arise in VR.

The scene is viewed by an imaging device, here a video camera. The camera performs a perspective projection of the 3D world onto a 2-dimensional image plane. The intrinsic (focal length and lens distortion) and extrinsic (position and pose) parameters of the device determine exactly what is projected onto its image plane. The virtual image is generated with a standard computer graphics system, in which the virtual objects are modeled in an object reference frame. The graphics system requires information about the imaging of the real scene so that it can correctly render these objects; these data control the synthetic camera used to generate the image of the virtual objects. The synthetic image is then merged with the image of the real scene to form the AR image. Research continues to improve both registration technology and quality of display in the most ergonomic fashion.
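As a generic illustration of this pipeline (assumed values throughout, not the system of Lavallee et al), the sketch below composes a short chain of homogeneous transformations carrying a virtual tumor model into the camera frame and then projects it through an intrinsic matrix onto the image plane.

```python
# Illustrative sketch: composing a chain of transformations T1 ... Tn that
# carries a virtual object into the camera image. A preoperative model is
# taken from model space -> patient space -> camera space (extrinsics), then
# projected to pixels with the intrinsic matrix K. All values are invented.
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed transforms: model -> patient would come from registration, and
# patient -> camera from tracking. Identity rotations with translations (mm).
T_model_to_patient = hom(np.eye(3), [0.0, 0.0, 50.0])
T_patient_to_camera = hom(np.eye(3), [0.0, 0.0, 400.0])
T_chain = T_patient_to_camera @ T_model_to_patient   # compose the chain

# Intrinsic matrix K: focal lengths and principal point in pixels (invented).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_model):
    """Carry a 3D model-space point through the chain and project to pixels."""
    p_cam = (T_chain @ np.append(point_model, 1.0))[:3]  # into camera space
    u, v, w = K @ p_cam                                  # homogeneous pixel coords
    return np.array([u / w, v / w])

print(project(np.array([0.0, 0.0, 0.0])))  # -> [320. 240.], the image centre
```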
Display Technology in AR

For the successful execution of AR, a large number of display modalities have been considered, and 2 main ones have been adopted: head-mounted displays and heads-up displays.

Two types of head-mounted displays exist: video see-through and optical see-through. The video see-through display does not allow the operator's visual field to have direct contact with the real world, while the optical see-through display does. The optical see-through display offers less of a feeling of immersion in the environment created by the display. Although no studies exist to show which type of head-mounted display is superior, the optical see-through display offers more control of the environment should an emergency arise or when misalignment of anatomic graphic images is recognized.

The heads-up display has already been used in airplane cockpits and recently in some experimental automobiles; it allows 2 images to be merged on a monitor facing the head rather than on the window of the cockpit or the windshield. All displays have an obligatory delay for image processing, but each type has distinct advantages. Video see-through displays allow the video-generated image to reach the rest of the AR system, providing immediate tracking information. The optical see-through device instead relies on the human brain for further transformation processes related to tracking; this can cause eyestrain and, in severe cases, nausea and headaches for the surgeon. The resolution of the virtual image is directly mapped over the real-world view when an optical see-through display is used, whereas with a monitor or video see-through display, both the real and virtual worlds are reduced to the resolution of the display device. Magnetic trackers also introduce errors caused by surrounding metal objects in the environment, as well as measurement delays.

In summary, imaging devices project a 3D world onto a 2-dimensional image plane, and the intrinsic and extrinsic parameters of the device determine exactly what is projected. None of these features is error free.
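The merging step itself can be pictured as a simple per-pixel blend. In the sketch below, arrays stand in for the video hardware, and the shapes, colors, and region are invented; it shows how a video see-through or monitor-based system might overlay rendered graphics onto a camera frame.

```python
# Illustrative sketch of the merging step a video see-through display performs:
# a rendered overlay (e.g. tumor margins) is alpha-blended onto the live
# camera frame wherever the overlay's mask is set.
import numpy as np

def composite(frame, overlay, mask, alpha=0.4):
    """Blend overlay into frame where mask is True; alpha sets transparency."""
    out = frame.astype(float)
    m = mask[..., None]                      # broadcast mask over RGB channels
    out = np.where(m, (1.0 - alpha) * out + alpha * overlay, out)
    return out.astype(np.uint8)

frame = np.full((480, 640, 3), 90, np.uint8)            # stand-in camera image
overlay = np.zeros_like(frame); overlay[..., 1] = 255   # green graphics layer
mask = np.zeros((480, 640), bool)
mask[200:280, 280:360] = True                           # invented "tumor" region
merged = composite(frame, overlay, mask)
print(merged[240, 320], merged[0, 0])   # blended pixel vs untouched background
```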
THE AR OPERATING ROOM ENVIRONMENT

Augmented reality is still at a rapidly progressing stage of development, with further challenges ahead. Despite its infancy, attempts to apply AR in surgery have been successful and promising. Neurosurgery, otolaryngology, and maxillofacial surgery are the main disciplines that have used this technology to navigate their specific surgical fields.

Neurosurgery

No specialty has been more involved than neurosurgery in the implementation of computer-aided surgery since its inception. Neurosurgeons always try to resect the smallest possible volume of brain tissue containing tumor. While methods exist (eg, magnetic resonance imaging and computed tomography) for imaging and displaying the 3D structure of the brain, the surgeon must relate what he or she sees on the 3D display to the patient's actual anatomy. An understanding of registration, stereotactic surgery, and stereoscopic surgery offered answers on how to navigate to a brain tumor. Primitive solutions to this problem involved fixing a stereotactic frame to the patient's skull and imaging the skull and frame as a unit. The search for a more reliable frame, suitable for the surgeon and comfortable for the patient, initiated the development of automatic registration methods for frameless stereotaxy, image-guided surgery, and AR.

In the AR environment, a navigation system superimposes a 3D image (volume graph) of the relevant anatomic part of the brain on the real operating field, creating an interactive, 3D anatomic atlas-like environment for the navigating surgeon. Surgical navigation is therefore key to reducing surgical intervention in a narrow operative field. To the neurosurgeon's advantage, the surgical anatomy is more fixed in space than abdominal organs are, making registration feasible.

These new technologies for surgical navigation and image analysis have been termed interactive image-guided neurosurgery. Such a system is composed of 5 fundamental elements: a method of registration of images and physical space, an interactive localization device, a computer with its requisite software interface and video display system, the integration of real-time feedback, and robotics.

Concerns surrounding the application of AR are similar to those in other surgical disciplines. Tissue movement during surgery caused by cerebrospinal fluid leakage, gravity, and tumor resection can affect registration.

General Surgery

This specialty took up AR later than others; already established traditional modes of general surgery and limited sources of funding are possible reasons for the delay. Nonetheless, a number of steps toward the development of an AR system combined with computer-assisted surgery have been made, especially in liver surgery. Soler et al published the first fully automatic 3D-reconstruction liver model, built through detailed translation of anatomic knowledge into topologic and geometric constraints. Such an approach allows the surgeon to automatically build an anatomic segmentation of the liver, based on the Couinaud definition of its 8 subsegments, with delineation of the hepatic and portal veins in VR. Other steps toward visualizing complex anatomy included the development of teleimmersive collaborations in the virtual pelvic floor and the virtual abdomen. Although these models have not been used in the operating room, it is hoped that such environments will support widespread dissemination of surgical expertise.

Another important step was the application of frameless stereotactic liver surgery in tumor resection. As in neurosurgery, an interactive image-guided surgery system for liver surgery was evaluated for accurate instrument tracking. Results from human and porcine data showed accuracies ranging from 1.4 to 2.1 mm. Liver motion due to insufflation was 2.5 ± 1.4 mm in laparoscopy, while total liver motion during respiration was 10.8 ± 2.5 mm.

In the field of breast cancer, AR visualization was shown to be effective in phantom and clinical data. This novel approach allowed superimposition of 3D tumor models onto live video images of the breast, enabling the surgeon to perceive the exact 3D position of the tumor as if it were visible through the breast skin. Sato et al claimed that surgical AR helped the surgeon target the resection in a more objective and accurate way, thereby minimizing the risk of relapse and maximizing breast conservation. Further research is needed to establish AR's reliability and validity in surgical oncology.
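Accuracy figures such as the 1.4 to 2.1 mm reported above are commonly summarized as a root-mean-square target registration error. The sketch below shows that computation on invented coordinates; it illustrates the metric in general, not the evaluation performed in the cited study.

```python
# Illustrative sketch: target registration error (TRE) is the distance between
# a tracked point's predicted position and its true position; a study then
# reports the RMS over many targets. All values below are invented.
import numpy as np

def rms_tre(predicted, actual):
    """Root-mean-square Euclidean error over corresponding 3D targets (mm)."""
    return float(np.sqrt(np.mean(np.sum((predicted - actual) ** 2, axis=1))))

actual = np.array([[12.0, 40.0, 7.0],
                   [25.0, 33.0, 9.0],
                   [18.0, 51.0, 4.0]])
predicted = actual + np.array([[ 1.2, -0.5,  0.8],    # per-target errors (mm)
                               [-0.9,  1.1,  0.4],
                               [ 0.7,  0.6, -1.3]])
print(f"RMS TRE: {rms_tre(predicted, actual):.2f} mm")
```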
Orthopedic Surgery

Although AR applications in musculoskeletal surgery are not yet clinically available, several research systems are being used to solve orthopedic problems. These applications include implant alignment in total hip and knee replacement, where an AR system can guide the proper placement of implant components on the basis of preoperative plans.

Limb kinematics involves the mathematical analysis of pressure distribution during motion and of soft-tissue tension. In the AR environment, visualization of this information in vector form could allow total knee replacements and high tibial osteotomies to be adjusted and tailored to the individual patient. In knee surgery, the application of AR is highlighted by a recent experimental, stand-alone device, the mechatronic arthroscope. This tool allows the surgeon to apply force without causing damage and virtually integrates preoperative and intraoperative images, permitting navigation of the knee joint during the planning phase and intervention during the AR phase. Variations on the same theme include accurate placement of an intramedullary rod, proper manipulation of the bones, tumor resection, and cartilage resurfacing.

Maxillofacial Surgery

The use of AR in maxillofacial surgery has extended to orthognathic surgery, tumor surgery, temporomandibular joint motion analysis, foreign body removal, osteotomy, minimally invasive biopsy, prosthetic surgery, and dental implantation. One of its chief attractions is the provision of information on deep-tissue structures during the operation, allowing surgery to be less invasive. Applying AR technology to osteotomies of the facial skeleton could allow points, lines, and planes to be transferred to the patient from stereolithographic skull models, cephalometric drawings, splints, and diagnostic imaging data.

In the context of oncology, the surgeon draws the tumor borders manually as an overlay on the computed tomographic data set, using VR system software tools. After adjustments and alignments are made, the overlay can be transmitted into any other data set through the video image of the operative field and later into the heads-up display or head-mounted display units. The resection margins can then be seen in the context of the tumor borders. This may minimize the deformity generated by traditional surgical methods while optimizing the chances of successful curative and reconstructive surgery.

Otolaryngology

Augmented reality has also stepped into the field of otolaryngology, where most of the advances have come from the introduction of minimally invasive surgery of the head and neck. Such systems are revolutionary in supplying the surgeon with intraoperative anatomic landmarks, especially when these are distorted or absent. The use of AR in diagnosis, biopsy of the sinuses, skull base surgery, orbital decompression, carcinoma excision, and foreign body removal has many advantages and disadvantages. Improved patient safety, with improved mechanical and registration accuracy (within 0.2-3 mm) during real-time surgery, allows for surgical precision, and the technology is easy to use. However, surgeons cannot always predict which cases will benefit from localization, especially when it is associated with increased operative time and expense. The actual surgical time is unlikely to be prolonged, and the cost disadvantage may not apply to routine surgery of the head and neck.
Recently, computerized imaging was used to reconstruct an anatomically accurate 3D computer model of the human temporal bone from serial histologic sections, and this virtual model has been demonstrated to be an efficient tool for education. The human temporal bone is a complex 3D anatomic region with many unique qualities that make its anatomy difficult to teach and learn. The model may be interactively navigated from any viewpoint, greatly simplifying the task of conceptualizing and learning the anatomy. Automated tracking of tissue motion, however, remains an open research problem.

Cardiovascular and Thoracic Surgery

Minimally invasive surgery in the chest via a thoracoscope has allowed AR to be used in thoracic surgery. Whether a thoracoscopic approach to diagnosis or treatment can replace more conventional approaches remains to be seen; however, according to Colt, its training capabilities will soon be enhanced by the incorporation of VR simulators. Thoracoscopy is partly the result of the impact of laparoscopic surgery on general surgical practice. This window to the pleural and pericardial cavities allows the diagnosis and treatment of pleural effusions, lung cancer, mediastinal tumors, vasospastic disease (via thoracoscopic sympathectomy), empyema, and ligation of the patent ductus arteriosus.

In cardiac surgery, the adoption of thoracoscopic access and a robot remotely operated by the surgeon's hands promises a novel method of endoscopic coronary artery bypass grafting. The robot provides the surgeon with delicate prehensile function through its instruments, and a television-video screen allows the surgeon to visually track both the robot hands and the anatomy for the coronary anastomoses. It has been stated that surgical technique with this method can be challenging for the surgical team. No literature exists yet on real-time AR in this setting. However, the technology promises to contribute to fewer hospital days, earlier return to normal activity, less pain, and better cosmesis.

Totally endoscopic mitral valve repair and aortic valve replacement are now feasible. Further studies need to develop ways to facilitate the anastomosis, reduce errors, and superimpose anatomic images on real anatomic landmarks. A multicenter study will be essential to define the efficacy and clinical value of these techniques. As it stands, the graft occlusion rate after minimally invasive direct coronary artery bypass remains slightly higher than that after traditional revascularization. In the field of off-pump coronary artery bypass grafting, both real-time imaging and automation will lead the way to improving the quality of the coronary anastomosis. Visual synchronization and motion compensation will be required to present a still image of a beating heart, as sketched below.
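The synchronization idea can be illustrated simply, though the article only anticipates such systems; the sketch below uses invented timestamps. It keeps only the video frames that fall at a fixed phase of the cardiac cycle, so successive displayed frames show the heart in approximately the same position.

```python
# Illustrative sketch of cardiac-phase gating (an assumed approach, not a
# system described in this review): given frame timestamps and ECG R-wave
# times, keep the frames that fall at a fixed fraction of the R-R interval.
import numpy as np

def frames_at_phase(frame_times, r_wave_times, phase=0.3, tol=0.05):
    """Indices of frames whose cardiac phase (0..1 within an R-R interval)
    lies within tol of the requested phase."""
    r = np.asarray(r_wave_times)
    keep = []
    for i, t in enumerate(frame_times):
        k = np.searchsorted(r, t) - 1
        if 0 <= k < len(r) - 1:                    # inside a complete R-R interval
            ph = (t - r[k]) / (r[k + 1] - r[k])    # fraction of the cycle elapsed
            if abs(ph - phase) < tol:
                keep.append(i)
    return keep

ecg_r = [0.0, 0.8, 1.6, 2.4]               # R-waves at roughly 75 beats/min
frames = np.arange(0.0, 2.4, 1 / 30)       # 30 frames/s video timestamps
print(frames_at_phase(frames, ecg_r))      # frames near the chosen phase
```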
LIMITATIONS AND ONGOING RESEARCH IN AR

The absolute role of, and indications for, AR in surgery are yet to be established. The data generated so far are not substantial. The outcomes discussed in most publications to date include user-friendliness, accuracy of targeting tissues, and costs as end points. These outcomes are not measured quantitatively, and subjective statements are not supported by well-organized research that accounts for patient case mix. Multicenter trials and structured research will help determine the cost-effectiveness of AR and answer questions in an evidence-based fashion.

A universal problem for any surgeon in the AR environment is that the organ of interest does not behave as expected. Human organs are not rigid: they deform with the rhythms of the heartbeat and respiration, under pressure during laparoscopic insufflation, or when physically probed. This physical problem is more marked for liver and intestinal surgery (pliable organs) than for bone and brain surgery (semirigid organs).

Standard platforms for stereoscopic AR computer projection are recent innovations and have not yet reached "wearable" applicability. Comfort issues may limit prolonged use; for example, the weight of a head-mounted display is determined by the type of motion-tracking system: electromagnetic, ultrasonic, or optical. Concerns may also arise regarding fitting such devices into an already crowded operating room environment. In addition, outcome has yet to be measured qualitatively (risk-benefit ratio) and quantitatively.

A common underlying error-generating process will always exist in AR because of the tremendous variability in its fundamental elements: the definition of accuracy, image acquisition, registration techniques, computers and software interfaces, interactive localization devices and their intraoperative use, integration of real-time data, tissue displacement, robotics, and, finally, judgment and clinical experience. A simple way to reason about these combined sources of error is sketched below.
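One common way to reason about such chained error sources, assumed here for illustration rather than taken from the article, is a root-sum-square error budget that treats each source as independent; the magnitudes below are invented.

```python
# Illustrative sketch: a first-order error budget for an AR navigation chain.
# Treating the error sources as independent, their combined effect is often
# approximated in quadrature (root-sum-square). Magnitudes are invented.
import math

error_sources_mm = {
    "image acquisition (voxel size)": 0.6,
    "fiducial registration": 1.0,
    "tracker measurement": 0.4,
    "tissue displacement": 1.5,
}

total = math.sqrt(sum(e ** 2 for e in error_sources_mm.values()))
for name, e in error_sources_mm.items():
    print(f"{name:32s} {e:4.1f} mm")
print(f"{'combined (RSS) estimate':32s} {total:4.1f} mm")
```

Note that this approximation understates the total whenever sources correlate, for example when tissue displacement degrades the registration itself.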
WHAT DOES THE FUTURE HOLD FOR AR?

Augmented reality promises us additional information that cannot be detected by the 5 human senses. Beyond the basic function of AR systems as "x-ray vision" for surgical planning, the technology extends to robots and simulation. Interventional AR systems are the most recent application, providing a "third hand" as an assistant. The clinical application of this tool is still very basic and passively driven by the surgeon because of concerns such as safety and the need to minimize device sophistication; current versions of the passive-arm manipulator are used as tool holders or retractors.

The dynamic association of operating on a real organ with imaging data may create new modes of diagnosing and treating technically challenging patients. Very experienced surgeons can benefit from such systems by extending the limits of the safe area to allow more complete and radical operative therapy, while less experienced surgeons may at least benefit from being oriented to critical anatomic landmarks. A new sense of perceiving the real and virtual worlds has been achieved. Advancing AR to become user-friendly has rekindled interest in real-time surgical anatomy as a way to maximize the number of safe surgical hands in the next century.

REFERENCES

1. Tang SL, Kwoh CK, Teo MY, Sing NW, Ling KV. Augmented reality systems for medical applications. IEEE Eng Med Biol Mag. 1998;17(3):49-58.
2. Day R, Heilbrun MP, Koehler S, McDonald P, Peters W, Siemionow V. Three-point transformation for integration of multiple coordinate systems: applications to tumor, functional, and fractionated radiosurgery stereotactic planning. Stereotact Funct Neurosurg. 1994;63:76-79.
3. Peters TM. Enhanced display of three-dimensional data from computerized x-ray tomograms. Comput Biol Med. 1975;5:49-52.
4. St-Jean P, Sadikot AF, Collins L. Automated atlas integration and interactive three-dimensional visualization tools for planning and guidance in functional neurosurgery. IEEE Trans Med Imaging. 1998;17:672-680.
5. Smith KR, Frank KJ, Bucholz RD. The NeuroStation—a highly accurate, minimally invasive solution to frameless stereotactic neurosurgery. Comput Med Imaging Graph. 1994;18:247-256.
6. Lavallee S, Cinquin P, Szeliski R. Building a hybrid patient's model for augmented reality in surgery: a registration problem. Comput Biol Med. 1995;25:149-164.
7. Tuceryan M, Greer DS, Whitaker RT. Calibration requirements and procedures for a monitor-based augmented reality system. IEEE Trans Vis Comput Graph. 1995;1:255-273.
8. Kutulakos KN, Vallino J. Affine object representations for calibration-free augmented reality. In: Proceedings of the 1996 IEEE Virtual Reality Annual International Symposium; March 30-April 3, 1996; Santa Clara, Calif. Washington, DC: IEEE Computer Society; 1996:25-26.
9. Azuma RT, Bishop G. Improving static and dynamic registration in an optical see-through HMD. In: Proceedings of the 21st Annual Conference on Computer Graphics and Interactive Techniques; July 1994; New York, NY. New York, NY: ACM Press; 1994:197-204.
10. Edwards EK, Rolland JP, Keller KP. Video see-through design for merging of real and virtual environments. In: Proceedings of the IEEE Virtual Reality Annual International Symposium; September 18-22, 1993; Seattle, Wash. Washington, DC: IEEE Computer Society; 1993:223-233.
11. Adelstein BD, Johnston ER, Ellis SR. A testbed for characterizing dynamic response of virtual environment spatial sensors. In: Proceedings of the Fifth Annual Symposium on User Interface Software and Technology; November 15-18, 1992; Monterey, Calif. New York, NY: ACM Press; 1992:15-22.
12. Grimson WL, Lozano-Perez T, Wells WM, Ettinger GJ, White SJ, Kikinis R. An automatic registration method for frameless stereotaxy, image-guided surgery, and enhanced reality visualization. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; June 20-24, 1994; Seattle, Wash. Washington, DC: IEEE Computer Society; 1994:430-436.
13. Masutani Y, Dohi T, Yamane F, Iseki H, Takakura K. Augmented reality visualization system for intravascular neurosurgery. Comput Aided Surg. 1998;3:239-247.
14. Satava RM. Emerging medical applications of virtual reality: a surgeon's perspective. Artif Intell Med. 1994;6:281-288.
15. Soler L, Delingette H, Malandain G. An automatic virtual patient reconstruction from CT-scans for hepatic surgical planning. Stud Health Technol Inform. 2000;70:316-322.
16. Pearl RK, Evenhouse R, Rasmussen M. The Virtual Pelvic Floor, a tele-immersive educational environment. Proc AMIA Symp. 1999:345-348.
17. Satava RM. Accelerating technology transfer: new relationships for academia, industry, and government. Stud Health Technol Inform. 1998;50:1-6.
18. Herline AJ, Stefansic JD, Debelak JP. Image-guided surgery: preliminary feasibility studies of frameless stereotactic liver surgery. Arch Surg. 1999;134:644-650.
19. Sato Y, Nakamoto M, Tamaki Y. Image guidance of breast cancer surgery using 3-D ultrasound images and augmented reality visualization. IEEE Trans Med Imaging. 1998;17:681-693.
20. DiGioia AM, Jaramaz B, Blackwell M. The Otto Aufranc Award: image guided navigation system to measure intraoperatively acetabular implant alignment. Clin Orthop. 1998;355:8-22.
21. Blackwell M, Morgan F, DiGioia AM III. Augmented reality and its future in orthopaedics. Clin Orthop. 1998;354:111-122.
22. Dario P, Carrozza MC, Marcacci M. A novel mechatronic tool for computer-assisted arthroscopy. IEEE Trans Inf Technol Biomed. 2000;4:15-29.
23. Enislidis G, Wagner A, Ploder O, Truppe M, Ewers R. Augmented reality in oral and maxillofacial surgery. J Med Virtual Real. 1995;1:22-24.
24. Wagner A, Rasse M, Millesi W, Ewers R. Virtual reality for orthognathic surgery: the augmented reality environment concept. J Oral Maxillofac Surg. 1997;55:456-463.
25. Enislidis G, Wagner A, Ploder O, Ewers R. Computed intraoperative navigation guidance—a preliminary report on a new technique. Br J Oral Maxillofac Surg. 1997;35:271-274.
26. Roth M, Lanza DC, Zinreich J, Yousem D, Scanlan KA, Kennedy DW. Advantages and disadvantages of three-dimensional computed tomography intraoperative localization for functional endoscopic sinus surgery. Laryngoscope. 1995;105:1279-1286.
27. Freysinger W, Gunkel AR, Thumfart WF. Image-guided endoscopic ENT surgery. Eur Arch Otorhinolaryngol. 1997;254:343-346.
28. Metson R, Cosenza M, Gliklich RE, Montgomery WW. The role of image-guidance systems for head and neck surgery. Arch Otolaryngol Head Neck Surg. 1999;125:1100-1104.
29. Mason TP, Applebaum EL, Rasmussen M, Millman A, Evenhouse R, Panko W. Virtual temporal bone: creation and application of a new computer-based teaching tool. Otolaryngol Head Neck Surg. 2000;122:168-173.
30. Fried MP, Kleefield J, Gopal H, Reardon E, Ho BT, Kuhn FA. Image-guided endoscopic surgery: results of accuracy and performance in a multicenter clinical study using an electromagnetic tracking system. Laryngoscope. 1997;107:594-601.
31. Colt HG. Therapeutic thoracoscopy. Clin Chest Med. 1998;19:383-394.
32. Damiano RJ. Endoscopic coronary artery bypass grafting: the first steps on a long journey. J Thorac Cardiovasc Surg. 2000;120:806-807.
33. Schaff HV. New surgical techniques: implications for the cardiac anesthesiologist: mini-thoracotomy for coronary revascularization without cardiopulmonary bypass. J Cardiothorac Vasc Anesth. 1997;11(2 suppl 1):6-9.
34. Tang LW, D'Ancona G, Bergsland J, Kawaguchi A, Karamanoukian HL. Robotically assisted video-enhanced-endoscopic coronary artery bypass graft surgery. Angiology. 2001;52:99-102.
35. Mehmanesh H, Henze R, Lange R. Totally endoscopic mitral valve repair. J Thorac Cardiovasc Surg. 2002;123:96-97.
36. Moussa I, Oetgen M, Subramanian V, Kobayashi Y, Patel N, Moses J. Frequency of early occlusion and stenosis in bypass grafts after minimally invasive direct coronary arterial bypass surgery. Am J Cardiol. 2001;88:311-313.
37. Nakamura Y, Kishi K. Robotic stabilization that assists cardiac surgery on beating hearts. Stud Health Technol Inform. 2001;81:355-361.
38. Czernuszenko M, Pape D, Sandin D, DeFanti T, Dawe G, Brown M. The ImmersaDesk and Infinity Wall projection-based virtual reality displays. Comput Graph (ACM). 1997;31(2):46-49.
39. Siegel J, Bauer M. A field usability evaluation of a wearable system. In: Proceedings of the First International Symposium on Wearable Computers; October 13-14, 1997; Cambridge, Mass. Washington, DC: IEEE Computer Society; 1997:18-23.
40. Berguer R. Surgery and ergonomics. Arch Surg. 1999;134:1011-1016.
41. Troccaz J, Peshkin M, Davies B. Guiding systems for computer-assisted surgery: introducing synergistic devices and discussing the different approaches. Med Image Anal. 1998;2:101-119.

Corresponding author and reprints: Jeffrey H. Shuhaiber, MD, Department of Surgery, University of Illinois at Chicago, 840 Southwood St (CSB Suite 518-E), Chicago, IL 60612 (e-mail: shuhaibr@uic.edu). Accepted for publication July 12, 2003.
TI - Augmented Reality in Surgery
JF - JAMA Surgery
DO - 10.1001/archsurg.139.2.170
DA - 2004-02-01
UR - https://www.deepdyve.com/lp/american-medical-association/augmented-reality-in-surgery-sLGtdLD1nr
SP - 170
EP - 174
VL - 139
IS - 2
DP - DeepDyve
ER -