
Biomedical Imaging and Intervention Journal
doi: 10.2349/biij.3.1.e11

REVIEW ARTICLE

Image-guided surgery and medical robotics in the cranial area

G Widmann, MD

Department of Radiology, Innsbruck Medical University, Anichstr. 35, Innsbruck, Austria

* Corresponding author. Present address: Interdisciplinary Stereotactic Intervention and Planning Laboratory Innsbruck (SIP-Lab), Department of Radiology, Innsbruck Medical University, Anichstr. 35, A-6020 Innsbruck, Austria. Tel.: +43/512/504-80927; Fax: +43/512/504-22758; E-mail: gerlig.widmann@i-med.ac.at (Gerlig Widmann).

Received 5 October 2006; accepted 21 February 2007

ABSTRACT

Surgery in the cranial area includes complex anatomic situations with high-risk structures and high demands for functional and aesthetic results. Conventional surgery requires the surgeon to transfer complex anatomic and surgical planning information using spatial sense and experience. The surgical procedure depends entirely on the manual skills of the operator. The development of image-guided surgery provides revolutionary new opportunities by integrating presurgical 3D imaging with intraoperative manipulation. Augmented reality, mechatronic surgical tools, and medical robotics may continue to advance surgical instrumentation and, ultimately, surgical care. The aim of this article is to review and discuss state-of-the-art surgical navigation and medical robotics, image-to-patient registration, aspects of accuracy, and clinical applications for surgery in the cranial area. © 2007 Biomedical Imaging and Intervention Journal. All rights reserved.

Keywords: image-guided surgery, mechatronic surgical tools, medical robotics, image-to-patient registration, accuracy

INTRODUCTION

Surgery in the cranial area includes operations of the fronto-zygomatico-maxillary complex, nasal cavity, paranasal sinuses, ear, and skull base, which lie in close proximity to highly critical structures such as nerves, vessels, the eye, the cochlea and labyrinth, or the brain. Such operations often require re-establishing functional and aesthetic anatomy by repositioning displaced skeletal elements, or by grafting and contouring abnormal bony contours and transplants [1-5]. Accurate preoperative determination of the proposed surgical procedure is essential, and excellent intraoperative orientation and manual skills are required for surgical precision and reliable protection of vital anatomic structures [6-12]. Next-generation surgical systems should explore and enhance imaging or manipulation, the two basic components of a surgical procedure [14]. The development of image-guided surgery provides revolutionary new opportunities by integrating presurgical 3D imaging, obtained by computed tomography (CT) or magnetic resonance imaging (MRI), with intraoperative manipulation through three fundamental issues [4,15,16]:

(1) Localisation - determination of a target's locus (for example, tumour, foreign body, and so on) that defines a task the surgeon performs;

(2) Orientation - information on the current location in the patient's anatomy that defines where the surgeon (with respect to the surgical tool) is operating; and

(3) Navigation - the process of (passive) guidance to reach a desired target from the current location (for example, biopsy, tumour resection, bone segment manipulation, implant positioning, and so on).

As a logical extension of image-guided surgery, the development of mechatronic surgical tools, tele-manipulated robotic arms, and semi- or fully-automated surgical robots is beginning to introduce the next revolution [17,18].

SURGICAL NAVIGATION SYSTEMS

Surgical navigation systems generally consist of a (transportable) workstation, a monitor, a graphical user interface with software to plan and guide therapy, and a position measuring system (a three-dimensional coordinate-detection or tracking system, which can be mechanical, electromagnetic, or optical) [6,12,19-22]. By providing a spatial coordinate system relative to the patient's anatomy (see the section on image-to-patient transformation), the actual position of a probe or tracked surgical tool is shown with respect to cross-sectional images of the preoperative dataset (see the section on image guidance).
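As a minimal illustration of this principle, the Python sketch below shows how a tracked probe tip measured in the camera coordinate system could be mapped into the coordinate system of the preoperative image data set, assuming the image-to-patient registration has already yielded rigid transforms. All variable names and matrices are hypothetical and for illustration only, not the interface of any specific navigation system.

    import numpy as np

    def to_homogeneous(R, t):
        """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    # Hypothetical transforms obtained at registration time:
    # camera -> patient reference (DRF), and patient -> preoperative image volume.
    T_patient_from_camera = to_homogeneous(np.eye(3), np.array([10.0, -5.0, 250.0]))
    T_image_from_patient = to_homogeneous(np.eye(3), np.array([128.0, 128.0, 40.0]))

    # Probe-tip position reported by the tracking system (millimetres, camera space).
    tip_camera = np.array([12.3, -4.1, 245.7, 1.0])

    # Chain the transforms: image <- patient <- camera.
    tip_image = T_image_from_patient @ T_patient_from_camera @ tip_camera
    print("Tip position in image coordinates (mm):", tip_image[:3])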

Mechanical navigation systems

A mechanical navigation system consists of an articulated arm with six degrees of freedom [23-26]. Position is calculated from the angular changes recorded by sensors within the joints of the articulated arm. As the spatial system is entirely self-referential, rigid fixation of both the patient and the navigation arm is an important prerequisite [19,20,23,24,27].

The advantages of mechanical systems are acceptable precision, low susceptibility to failure, and simple sterile covering with a tube [21,23,25,29]. The disadvantages are impractical handling during some surgeries, restricted range (circa 60 cm) and mobility, as well as the space requirements at the operating table [6,19,25,27]. Due to their bulkiness, mechanical systems have generally been replaced by more flexible electromagnetic and optical navigation systems.

Electromagnetic navigation systems

In electromagnetic navigation, position is measured by detecting changes of a magnetic field with coils [19,21,29]. The electromagnetic transmitter is located near the operative site and the receiver is integrated into the surgical instrument. The advantages of electromagnetic navigation systems are the use of very small detector coils, no requirement for visual contact between instrument and sensor system, rapid computation of the signals, and easy sterilisation [21,25].

However, due to interference by external magnetic fields and metal objects, particularly those associated with drilling and sawing tools [12,21,30,31], position-sensing errors of up to 4 mm may occur. To reduce these errors, a special titanium or ceramic instrument set is required [19,20,32]. Electromagnetic navigation systems are relatively contraindicated in patients with pacemakers and cochlear implants [29].

Optical navigation systems

Optically based systems are used for intraoperative navigation [4,16,21,25,33,34]. Position is calculated from a minimum of three infrared diodes or passive light-reflecting reference elements mounted on the registered patient by means of a dynamic reference frame (DRF) and on the surgical tool (tracker); the resulting patterns are recognised by a stereotactic camera array. The advantages of optical navigation systems are high technical accuracy in the range of 0.1-0.4 mm [35,36], convenient handling, and easy sterilisation. The disadvantages are the need for constant visual contact between camera array, DRF and instruments, and the potential susceptibility to interference from light reflections off metallic surfaces in the operating environment [21,25,37-39].

MEDICAL ROBOTICS

Robots are generally defined as computer controlled devices with five to six degrees of freedom that can execute complex movements with high accuracy [14,40]. Medical robots can be classified based on technology, application, or role [14].

Using a technology-based classification, two groups of systems that differ substantially from each other can be distinguished:

● telemanipulated robots (not pre-programmed)
● pre-programmed surgical robots (automated or semi-automated)

Application-based taxonomy distinguishes robots on the basis of surgical disciplines and operative procedures. Role-based taxonomy distinguishes robots into three discrete categories:

● passive (the role of the robot is limited in scope or its involvement is largely low risk)
● restricted (the robot is responsible for more invasive tasks with higher risk but still restricted from essential portions of the procedure)
● active (the robot is intimately involved in the procedure and carries high responsibility and risk).

Telemanipulated robots

Telemanipulated robots are non-autonomously working robotic arms (manipulators) that are controlled remotely by the surgeon using force-feedback joysticks or more advanced haptic devices (master console) [18,41]. Compared with conventional endoscopic arms with limited mechanical control, telemanipulated robots provide more degrees of freedom and have a computer-controlled man-machine interface that allows the input to the manipulator system to be processed automatically, without active interaction by the surgeon, for motion scaling, tremor filtering, indexing, and so on [18,41-44].

Pre-programmable surgical robots

Pre-programmable surgical robots can automatically or semi-automatically execute surgical tasks directly on the patient. These systems include:

● floor- or operating-table-mounted robots with six degrees of freedom
● roof-mounted modified surgical microscopes with generally six to seven active and one passive degree of freedom [45-49]

The surgeon in the operating theatre supervises the execution of the plan by the robot [7,50].

Interactive assistant robots are navigated tool support systems that carry, guide, and move surgical instruments. The robot is primarily moved passively by the surgeon, but it can limit the degrees of freedom of the movements. Favourable positions can be saved and reached again with high precision. The surgeon is given a spatial interval in which free movements are allowed, while movement into high-risk areas is prevented [21,40].

Mechatronic surgical tools

As a separate development in surgical instrumentation, mechatronic surgical tools are dedicated to special tasks such as drilling or bone shaving [5,51,52]. These tools may include force-feedback sensors to prevent bone perforation, or navigated-control systems that only operate within a certain surgical accuracy threshold.

IMAGE-TO-PATIENT TRANSFORMATION

Image-to-patient (IP) transformation or registration is the essential determination of a one-to-one mapping between the coordinates in the image data and those in the patient [53,54]. The registration procedure is based on anatomical landmarks (bone or skin), artificial markers (fiducials, bone-affixed or skin-applied), teeth-supported registration templates, external registration frames, or laser surface scanning [12,25,55-60].
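For point-based registration, the rigid mapping is typically computed as a least-squares fit between corresponding point sets, for example with the well-known SVD-based method of Arun et al. The Python sketch below is a minimal illustration of this general approach, not the algorithm of any particular navigation system; the fiducial coordinates are hypothetical.

    import numpy as np

    def rigid_registration(points_patient, points_image):
        """Least-squares rigid transform (rotation R, translation t) mapping
        patient-space fiducials onto their image-space counterparts."""
        p = np.asarray(points_patient, dtype=float)
        q = np.asarray(points_image, dtype=float)
        p_mean, q_mean = p.mean(axis=0), q.mean(axis=0)
        H = (p - p_mean).T @ (q - q_mean)          # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
        R = Vt.T @ D @ U.T
        t = q_mean - R @ p_mean
        return R, t

    # Hypothetical fiducial positions (mm): measured on the patient and in the CT volume.
    patient_fiducials = [[0, 0, 0], [60, 0, 0], [0, 55, 0], [10, 10, 40]]
    image_fiducials   = [[12, 8, 5], [72, 9, 4], [11, 63, 6], [22, 19, 44]]

    R, t = rigid_registration(patient_fiducials, image_fiducials)
    print("Rotation:\n", R, "\nTranslation (mm):", t)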

Anatomical landmarks

Registration with anatomical landmarks uses clearly defined external landmarks (such as the nasion, spina nasalis, tragi, medial canthi, mastoid, umbo, and so on) and/or internal landmarks [61,62]. However, precise identification of the landmarks in both the patient and the image data set is subjective and depends on the experience of the operator [63]. Surface matching, which is performed by touching about 40-80 points on the patient's skin or bone, can refine anatomical registration [62,64]. However, this method is generally inaccurate and time-consuming.

Fiducial markers

The advantage of fiducial markers over anatomical landmarks is the enhanced localisation accuracy in the image data and on the patient. Consequently, registration with skin-applied fiducials is more accurate than registration with surface anatomical landmarks [65-67]. However, the use of skin-applied fiducials involves considerable logistic effort because the markers must be placed prior to data set acquisition and must be kept in position until the patient enters the operating room. The time lag between imaging and surgery, and the sensitivity to skin shift, can lead to unfavourable inaccuracies [25,56,63,68-70]. Bone-implanted fiducials provide invariant spatial registration points with the highest possible accuracy and generally serve as the reference gold standard in registration [21,53,66,68,71-73]. The drawbacks of bone-implanted fiducials are their invasiveness, the need for additional surgery, and possible major patient discomfort, for which reason they should not be left in place for an extended period [55,63,70,71].

Registration templates

Registration templates are non-invasive, denture-fixed acrylic splints with integrated fiducial markers [36,39,60,71,74-81]. Accuracy similar to that of bone-implanted fiducials has been proven for the regions of the maxilla, mandible, orbit and face [36,72,81]. Registration templates cannot be applied in edentulous patients, except when the templates are invasively secured to the underlying bone.

Vogele-Bale-Hohner (VBH) mouthpiece / external registration frame

The Vogele-Bale-Hohner (VBH) vacuum mouthpiece is an individualised mouthpiece that can be objectively and rigidly secured against the maxilla with submillimetric repositioning control, which is regulated by the amount of negative pressure shown on the scale of a vacuum pump [56,82-84]. Alternatively, the VBH mouthpiece can be glued to an acrylic template, similar to registration templates. In contrast to registration templates, where the markers are integrated in the template, an external registration frame is connected to the VBH mouthpiece, which can be removed after registration [55,59,82,83,85-89]. The external registration frame allows for broad marker distribution around the entire head volume. Equipped with exchangeable markers for CT/MRI/PET/SPECT, the external registration frame can serve as a single reference device for multimodal surgical navigation and fusion imaging [56,84,90-92].

Laser surface registration

Laser surface registration is based on the projection of visible laser beams onto the patient's skin [67-70,93]. The skin reflections are detected by a camera array and a virtual three-dimensional matrix of the patient's skin anatomy is generated. This matrix is then matched, using an advanced surface-matching algorithm, to the surface of the preoperative image data set.

Currently, up to 300,000 skin surface points can be registered. This allows the registration accuracy to reach values comparable to those of bone markers or registration templates [67]. However, shift of the patient's skin surface or different tension in the muscles of expression between CT data acquisition and preoperative or intraoperative recording may lead to an invalid data set correlation [68,69,93]. Although the patient may be continuously tracked during surgery, the original geometry of the facial soft tissue may be destroyed by intraoperative swelling, surgical cuts, or during repositioning osteotomies [21,33,69,94]. To compensate, a combination with dynamic reference frames for intraoperative tracking after the initial laser registration has been reported [94]. Laser surface registration is unsuitable for surgery in the mandible but is expected to serve as a sufficiently stable and relatively invariable reference base for many applications in cranio-maxillofacial surgery [66,67,70,93,94].
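Surface-based registration of this kind is commonly implemented with an iterative closest point (ICP) scheme: each scanned skin point is repeatedly paired with its nearest point on the image-derived surface, a rigid transform is estimated for those pairs, and the process iterates. The following Python sketch is a simplified, self-contained illustration of the idea (brute-force nearest-neighbour search, synthetic point clouds) and is not the algorithm of any specific commercial system.

    import numpy as np

    def best_fit_transform(src, dst):
        """Rigid transform (R, t) that maps src points onto dst in a least-squares sense."""
        src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - src_mean).T @ (dst - dst_mean))
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, dst_mean - R @ src_mean

    def icp(scan_points, surface_points, iterations=20):
        """Align laser-scanned skin points to an image-derived surface point cloud."""
        R_total, t_total = np.eye(3), np.zeros(3)
        moved = scan_points.copy()
        for _ in range(iterations):
            # Pair each scan point with its nearest surface point (brute force).
            d = np.linalg.norm(moved[:, None, :] - surface_points[None, :, :], axis=2)
            matched = surface_points[d.argmin(axis=1)]
            R, t = best_fit_transform(moved, matched)
            moved = moved @ R.T + t
            R_total, t_total = R @ R_total, R @ t_total + t
        return R_total, t_total

    # Synthetic example: the "image" surface is the scan shifted by a known offset.
    rng = np.random.default_rng(0)
    scan = rng.uniform(0, 100, size=(200, 3))
    surface = scan + np.array([5.0, -3.0, 2.0])
    R, t = icp(scan, surface)
    print("Estimated translation (mm), should be close to [5, -3, 2]:", np.round(t, 2))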

IMAGE-GUIDANCE

For image-guidance, the correlation between the space coordinates of the image data in the navigation system and the patient's coordinates defined during registration is preserved during the surgical procedure. This is achieved by rigid fixation of the patient on the operating table, for example invasively via the Mayfield head clamp, or non-invasively via the vacuum-mouthpiece-based VBH head holder [56,64,95]. Alternatively, bone-affixed (invasive) or registration-template-affixed (non-invasive) DRFs are used for continuous patient tracking after initial registration [33,36,60,74,76].

During surgery, the navigation software indicates the actual real-time position of the tracked surgical tool within the patient's presurgical 3D data for intraoperative orientation, and shows the calculated accuracy of the tool's position and angulation relative to the predefined surgical plan. Integrated mechatronic surgical tools provide automatic on/off regulation depending on the current position in relation to the planned working space, or couple the drill speed to the operator's accuracy. Such tools are immediately stopped when damage to vital structures is imminent (navigated control) [5,51,52]. In addition, the development of adjustable rigid aiming devices enables a steady linear approach to defined targets [56,85,96-98].
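The core of such navigated control can be expressed as a simple position check executed for every tracking update: the tool is allowed to run only while its tracked tip lies inside the planned working space and keeps a safety margin from protected structures. The Python sketch below is a schematic, hypothetical illustration of that logic; the plan geometry, margins and coordinates are invented and it is not the control loop of any specific product.

    import numpy as np

    # Hypothetical plan: drilling corridor as a sphere, protected structure as a point plus margin (mm).
    PLAN_CENTER = np.array([42.0, 18.0, 30.0])
    PLAN_RADIUS = 6.0
    NERVE_POSITION = np.array([45.0, 25.0, 31.0])
    SAFETY_MARGIN = 2.0

    def drill_enabled(tip_position):
        """Return True only if the tool tip is inside the planned working space
        and keeps a safety margin from the protected structure."""
        inside_plan = np.linalg.norm(tip_position - PLAN_CENTER) <= PLAN_RADIUS
        safe_distance = np.linalg.norm(tip_position - NERVE_POSITION) > SAFETY_MARGIN
        return inside_plan and safe_distance

    # Simulated tracking updates (mm): the second position violates the safety margin.
    for tip in [np.array([41.0, 18.5, 30.2]), np.array([44.0, 23.9, 30.8])]:
        print(tip, "-> drill", "ON" if drill_enabled(tip) else "OFF")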

Visualisation of the navigation process is generally provided via the computer screen of the navigation system's transportable workstation. A disadvantage of such a display is that the surgeon has to look up at the screen and therefore cannot simultaneously view the surgical field [4,13,39,51,71].

In contrast, augmented reality (AR) provides navigational support by direct projection of segmented structures from the preoperative image data (surgical targets, resection lines, and planned implant positions) onto the patient, thereby allowing complete interaction with the real world while simultaneously making the virtual environment accessible [30,58,71,99,100-104,111]. AR can be based on monocular or binocular projection into the optics of a tracked surgical microscope, on semitranslucent screens placed between the operating field and the surgeon, or on head-mounted displays [4,22,30,58,71,99,100,105-110]. Recently, a promising AR concept using laser registration and stereotactic optical projection of tumour margins and osteotomy lines directly onto the patient was presented; this concept does not require navigation instruments [104,111,112].

ASPECTS OF ACCURACY

Terminology

Accuracy is of utmost importance for clinical application of image-guided surgery and medical robotics. Use of standardised terminology and measurement types is essential for correct understanding and comparability of accuracy reports [113].

Accuracy is qualitatively determined as how closely the mean of the measurements approximates the true value (which refers to the term trueness) and is quantitatively determined through the margin of error and the uncertainty of measurement, which is characterised by the variation of the mean value obtained from several single measurements.

Precision is the internal accuracy of measurements obtained by repeated measurements (under the same circumstances and with the same measurement technique and system) and refers to the quantitative characterisation of the resolution of the measuring instrument and its readout. Although often used as a synonym for accuracy, precision must be clearly distinguished from it.

For the evaluation of image-guided surgery, the suggested measurement types are as follows [66,72,113-115]:

● Fiducial Localisation Error (FLE): the error in locating the fiducial points.
● Fiducial Registration Error (FRE): the error between corresponding fiducial points after registration.
● Target Registration Error (TRE): the error between corresponding points other than the fiducial points after registration.
● Target Positioning Error (TPE): the error between the real position of the navigated surgical tool and the calculated position during the actual surgical procedure (TRE plus additional factors).
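To make the distinction between FRE and TRE concrete, the Python sketch below applies a registration transform (for example one computed with the SVD method shown earlier) to hypothetical fiducials and to an independent target point, and reports the root-mean-square fiducial error and the target error. All coordinates are invented for illustration.

    import numpy as np

    def apply_transform(R, t, points):
        """Map patient-space points into image space with a rigid transform."""
        return np.asarray(points, dtype=float) @ R.T + t

    def fre(R, t, patient_fiducials, image_fiducials):
        """Root-mean-square distance between registered and measured fiducials."""
        residuals = apply_transform(R, t, patient_fiducials) - np.asarray(image_fiducials)
        return np.sqrt((np.linalg.norm(residuals, axis=1) ** 2).mean())

    def tre(R, t, patient_target, image_target):
        """Distance between the registered target and its true image-space position."""
        return np.linalg.norm(apply_transform(R, t, [patient_target])[0] - image_target)

    # Hypothetical data: identity rotation, 2 mm translational registration error along x.
    R, t = np.eye(3), np.array([2.0, 0.0, 0.0])
    patient_fids = [[0, 0, 0], [50, 0, 0], [0, 50, 0]]
    image_fids   = [[0, 0, 0], [50, 0, 0], [0, 50, 0]]
    print("FRE (mm):", round(fre(R, t, patient_fids, image_fids), 2))   # 2.0
    print("TRE (mm):", round(tre(R, t, [25, 25, 60], [25, 25, 60]), 2)) # 2.0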

The best indicator of a navigation system's or medical robot's accuracy is the TPE, but the definitive overall accuracy of the surgical procedure ultimately has to be evaluated by directly comparing the achieved surgical result with the initial planning data.

Influential factors of accuracy

The overall accuracy of image-guided / robotic surgery depends on all systematic and non-systematic (random) errors, from data-set acquisition to the surgical procedure [116]. The accurate linking of the virtual planning to the surgical site depends on the accuracy of the registration procedure, which includes limitations in the image space and the device space (see the section on image-to-patient transformation). Image quality depends on the image resolution as represented by the voxel size and slice thickness: the thinner the slices and the smaller the voxels, the higher the accuracy of determining the centre of the fiducial markers (fiducial-based registration) or of the calculated 3D surface model (surface-based registration) [88,119,120]. In principle, multidetector CT is more accurate than MRI, because MRI is prone to inhomogeneities of the magnetic field and, due to the longer examination time, more susceptible to motion artefacts [64,117-119].

The arrangement of the fiducial markers is a critical factor: it is important to use as many points as possible (although the return diminishes rapidly after five or six markers), to avoid near-collinear configurations, and to ensure that the centroid of the fiducial points is as near as possible to the target [12,54]. The typical feedback provided by the registration software is a measure of the degree of alignment of the points used in the registration. Unfortunately, these measures show no direct correlation with the TRE. To reliably control the registration accuracy intraoperatively, the real error between the image and the patient's anatomy has to be checked prior to surgery with a few independent markers not used for the initial registration and/or with anatomical landmarks [10,12,36,39,77,93]. This can be performed with the probe of the navigation system by comparing the probe's real position (device space) to the virtual position displayed on the computer screen (image space).

The accuracy of the surgical transfer depends on the technical accuracy of the navigation system or of the mechatronic, semi-active, or active robotic system, and on the surgical application accuracy. Notably, human error contributes to imaging, registration, and transfer errors, so every step has to be carefully managed.
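The influence of marker arrangement on the TRE can be estimated with the approximation published by Fitzpatrick, West and Maurer [114]: the expected TRE grows with the target's distance from the principal axes of the fiducial configuration and shrinks with the number of markers. The Python sketch below evaluates that published approximation for a hypothetical marker configuration; the marker coordinates and FLE value are illustrative only.

    import numpy as np

    def expected_tre(fiducials, target, fle_rms):
        """Approximate RMS TRE at a target point for a given fiducial configuration,
        following Fitzpatrick, West and Maurer (1998):
        TRE^2(r) ~= (FLE^2 / N) * (1 + (1/3) * sum_k d_k^2 / f_k^2),
        where d_k is the target's distance from principal axis k of the fiducials
        and f_k is the RMS distance of the fiducials from that axis."""
        fids = np.asarray(fiducials, dtype=float)
        centred = fids - fids.mean(axis=0)
        _, _, axes = np.linalg.svd(centred, full_matrices=False)  # principal axes
        r = np.asarray(target, dtype=float) - fids.mean(axis=0)
        ratio = 0.0
        for axis in axes:
            d2 = np.linalg.norm(r - np.dot(r, axis) * axis) ** 2                    # target distance from axis
            f2 = (np.linalg.norm(centred - np.outer(centred @ axis, axis), axis=1) ** 2).mean()
            ratio += d2 / f2
        n = len(fids)
        return np.sqrt(fle_rms ** 2 / n * (1 + ratio / 3))

    # Hypothetical setup: four skin fiducials, a deep target, FLE of 1 mm RMS.
    fiducials = [[0, 0, 0], [60, 0, 0], [0, 60, 0], [60, 60, 10]]
    print("Expected TRE (mm):", round(expected_tre(fiducials, [30, 30, 80], 1.0), 2))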

CLINICAL APPLICATIONS

Image-guided surgery

Successful clinical applications of image-guided surgery in the cranial area have already been described for many procedures, such as the following (neurosurgical procedures excluded): oral implant surgery [10,16,37,38,52,73,77,79,103,121], removal of tumours and foreign bodies [16,33,58,76,81,122], bone segment navigation [60,122,123], temporomandibular joint surgery [74,124], biopsy [16], frameless stereotactic interstitial brachytherapy [28,87], percutaneous radiofrequency ablation of the Gasserian ganglion in medically untreatable trigeminal neuralgia [88,95,125], and functional endoscopic sinus surgery and skull base surgery [5,9,12,22,107,126-128]. The use of mechatronic surgical tools has been tested for navigated-control drilling in oral implant surgery [52] and shaving in functional endoscopic sinus surgery [13,51].

Medical robotics

In the cranial area, robotic systems have been considered to help the surgeon interactively with the following tasks [1,7,21,40,45,129]: (1) drilling holes with an automatic stop after penetrating the bone, to protect the tissue lying deep to the bone; (2) defined drilling of the implant bed for the positioning of implants or bone fixtures for anaplastology; (3) milling of bone surfaces in plastic surgery according to a 3D operation plan; (4) performing deep saw-cuts for osteotomies and allowing for the precise three-dimensional transport of the subsequent bone segments or of a CAD/CAM (computer aided design / computer aided manufacturing) transplant; (5) preoperative automatic selection of the necessary osteosynthesis plates, their bending by a special machine and their intraoperative placement in defined positions; or (6) automated guidance for non-flexible catheter implantation in brachytherapy.

Pre-clinical and clinical studies were started around the turn of the millennium in Germany, France, the USA and Japan for robot-assisted placement of craniofacial implants in ear anaplastology [130], resection of frontotemporal bone segments [131], implant fabrication combined with CAD/CAM technology in reconstructive surgery [21,79,131], model surgery in orthognathic surgery [26], passive guidance for the positioning of oral implants [133-135], and videoendoscopic ENT and skull base surgery [18,47-49,132].

Cost-benefit ratio

Image-guided surgery is considered to be more accurate than standard surgery. Comparative studies in oral implant surgery indicate significantly higher accuracy than the manual freehand procedure, even when performed by experienced surgeons [79,136,137]. In addition, no significant difference between experienced surgeons and trainees was found, which demonstrates that image-guidance is a valuable means for achieving a predictable and reproducible result without heavy reliance on the clinician's surgical experience [10,79,136,138]. In other procedures, such as percutaneous interventions (which are generally "blind" surgical procedures), removal of foreign bodies, access to deep-seated locations, and orientation in complex and changed anatomic regions, a clear benefit of image-guidance is evident [4,12,16,33,128,143]. Generally, shorter operation times, safer manipulation around delicate structures and higher intraoperative accuracy have been reported [9,16,20,60,128,139,140]. Further, image-guidance may allow for more thorough surgical resection, potentially decreasing the need for revision procedures [140].

In a large clinical study of image-guided ENT surgery, it was found that image-guidance can provide additional relevant information that was not available to the surgeon solely by virtue of his existing knowledge, and that every second application of the navigation system may lead to a change in surgical strategy [5]. Accordingly, further benefit is obtained from the additional orientation and the resulting cognitive relief in stressful and distracting surgical situations. Another clinical study, including 158 surgical procedures in cranio-maxillofacial surgery, showed high to very high medical benefits for image-guided biopsies, punctures of the trigeminal ganglion, removal of foreign bodies, osteotomies of the facial skeleton, arthroscopies of the temporomandibular joint and positioning of dental implants [16].

Image-guided surgery is more expensive than the standard procedure (navigation systems cost about USD 60,000 to USD 200,000) and requires presurgical imaging with registration elements, intraoperative image-to-patient registration and specialised equipment for tool tracking. However, these systems can be used for a wide range of surgical procedures in different medical specialities [56,57,59,83-85,89,144] and thus may represent a valuable acquisition for an institution [16,33].

A further beneficial aspect is the associated automatic and complete electronic documentation of the intervention [16,116].

Robots are expected to be more accurate and more reliable than a human being. Robots can work as part of an interactive system, are immune to radiation and can be automatically programmed for documentation, evaluation and training protocols [14,40,45,46,129]. Except for very few cases, surgical robots will not execute operations fully autonomously but will support the physician to achieve optimal results [1,7,21,40,44, 45,129,141].

Considering the advantages mentioned above, image-guided surgery and medical robotics may have a positive cost/effort–benefit ratio, depending on the individual surgical task and the developmental stage of each system. The necessity of special knowledge for this technology is indisputable and the relationship between cost and benefit may additionally be dependent on familiarity and availability [15,113].

CONCLUSION

Due to the complex anatomic situations with high-risk structures and the high demands for functional and aesthetic results, surgery in the cranial area is a prototype for the application of image-guided surgery and medical robotics. Successful clinical use has already been described for many different procedures, and a clear benefit has been demonstrated in terms of intraoperative orientation, surgical accuracy, safety and reduced operation time. The development of mechatronic surgical tools may additionally improve safety and surgical accuracy. For appropriate clinical application of image-guided surgery, it is important that the surgeon is aware of all influential factors of accuracy and of the maximum error of each system or technique in relation to the surgical accuracy required for the individual operation.

In the future, surgical navigation with integration of intraoperative imaging, improved augmented reality techniques, sophisticated mechatronic surgical tools and new robotic developments that are smaller, less expensive and easier to operate will enable continued progress in surgical instrumentation and, ultimately, surgical care.

REFERENCES

1.Hassfeld S, Brief J, Krempien R et al.[Computer-assisted oral,

maxillary and facial surgery]. Radiologe 2000; 40(3):218-26.

2. Xia J, Samman N, Yeung RW et al. Computer-assisted three-dimensional surgical planning and simulation. 3D soft tissue planning and prediction. Int J Oral Maxillofac Surg 2000; 29(4):250-8.

3.Troulis MJ, Everett P, Seldin EB et al.Development of a three-

dimensional treatment planning system based on computed tomographic data. Int J Oral Maxillofac Surg 2002; 31(4):349-57.

4.Nijmeh AD, Goodger NM, Hawkes D et al.Image-guided

navigation in oral and maxillofacial surgery. Br J Oral Maxillofac Surg 2005; 43(4):294-302.

5.Strauss G, Koulechov K, Rottger S et al.Evaluation of a

navigation system for ENT with surgical efficiency criteria.

Laryngoscope 2006; 116(4):564-72.

6.Hassfeld S, Muhling J, Zoller J. Intraoperative navigation in oral

and maxillofacial surgery. Int J Oral Maxillofac Surg 1995; 24(1 Pt

2):111-9.

7.Lueth T, Heissler E, Bier J. Evaluierung von navigations- und

Robotersystemen für den Einsatz in der Chirurgie. Navigations- und Robotersysteme als Führungshilfen. Schlag PM, ed. Tele- und computergestützte Chirurgie. Springer Verlag, 1998.

8.Gaggl A, Schultes G, Karcher H. Navigational precision of drilling

tools preventing damage to the mandibular canal. J Craniomaxillofac Surg 2001; 29(5):271-5.

9.Caversaccio M, Bachler R, Ladrach K et al. Frameless computer-

aided surgery system for revision endoscopic sinus surgery.

Otolaryngol Head Neck Surg 2000; 122(6):808-13.

10. Siessegger M, Schneider BT, Mischkowski RA et al. Use of an image-guided navigation system in dental implant surgery in anatomically complex operation sites. J Craniomaxillofac Surg 2001; 29(5):276-81.

11.Xia J, Ip HHS, Samman N et al. Three-dimensional virtual-reality

surgical planning and soft tissue prediction for orthognathic surgery.

IEEE Trans Inform Tech Biomed 2001; 5(2):97-107.

12.Wise SK, DelGaudio JM. Computer-aided surgery of the paranasal

sinuses and skull base. Expert Rev Med Devices 2005; 2(4):395-408.

13.Strauss G, Koulechov K, Stopp S et al.[Improved accuracy and

precision of the automated shaver (navigated control) in functional endoscopic sinus surgery]. Laryngorhinootologie 2006; 85(8):559-

66.

14.Camarillo DB, Krummel TM, Salisbury JKJ. Robotic technology

in surgery: past, present, and future. Am J Surg 2004; 188(4A Suppl):2-15.

15.Vannier MW, Haller JW. Navigation in diagnosis and therapy. Eur

J Radiol 1999; 31(2):132-40.

16.Ewers R, Schicho K, Undt G et al. Basic research and 12 years of

clinical experience in computer-assisted navigation technology: a review. Int J Oral Maxillofac Surg 2005; 34(1):1-8.

17.Marescaux J, Solerc L. Image-guided robotic surgery. Semin

Laparosc Surg 2004; 11(2):113-22.

18.Strauss G, Winkler D, Jacobs S et al.Mechatronic in functional

endoscopic sinus surgery. First experiences with the daVinci Telemanipulatory System. HNO 2005; 53(7):623-30.

19. Marmulla R, Hilbert M, Niederdellmann H. Intraoperative precision of mechanical, electromechanical, infrared and laser-guided navigation systems in computer-assisted surgery. Mund Kiefer Gesichtschir 1998; 2(Suppl 1):145-8.

20.Gunkel AR, Freysinger W, Thumfart WF. Experience with various

3-dimensional navigation systems in head and neck surgery. Arch Otolaryngol Head Neck Surg 2000; 126(3):390-5.

21.Hassfeld S, Muhling J. Computer assisted oral and maxillofacial

surgery--a review and an assessment of technology. Int J Oral Maxillofac Surg 2001; 30(1):2-13.

22.Caversaccio, Freysinger. Computer assistance for intraoperative

navigation in ENT surgery. Minim Invasive Ther Allied Technol 2003; 12(1):36-51.

23.Freysinger W, Gunkel AR, Martin A et al.Advancing ear, nose,

and throat computer-assisted surgery with the arm-based ISG viewing wand: the stereotactic suction tube. Laryngoscope 1997;

107(5):690-3.

24.Freysinger W, Gunkel AR, Bale R et al.Three-dimensional

navigation in otorhinolaryngological surgery with the viewing wand. Ann Otol Rhinol Laryngol 1998; 107(11 Pt 1):953-8.

25.Hassfeld S, Muhling J. Comparative examination of the accuracy

of a mechanical and an optical system in CT and MRT based instrument navigation. Int J Oral Maxillofac Surg 2000; 29(6):400-

7.

26.Theodossy T, Bamber MA. Model surgery with a passive robot

arm for orthognathic surgery planning. Am Ass Oral Maxillofac Surg 2003; 61:1210-317.

27.Hassfeld S, Muhling J. Navigation in maxillofacial and

craniofacial surgery. Comput Aided Surg 1998; 3(4):183-7.

28.Kremer B, Klimek L, Andreopoulos D et al. A new method for the

placement of brachytherapy probes in paranasal sinus and nasopharynx neoplasms. Int J Radiat Oncol Biol Phys 1999;

43(5):995-1000.

29.Koele W, Stammberger H, Lackner A et al. Image guided surgery

of paranasal sinuses and anterior skull base--five years experience with the InstaTrak-System. Rhinology 2002; 40(1):1-9.

30.Wagner A, Ploder O, Enislidis G et al. Image-guided surgery. Int J

Oral Maxillofac Surg 1996; 25(2):147-51.

31.Wagner A, Schicho K, Birkfellner W et al. Quantitative analysis of

factors affecting intraoperative precision and stability of optoelectronic and electromagnetic tracking systems. Med Phys 2002; 29(5):905-12.

32.Birkfellner W, Watzinger F, Wanschitz F et al.Systematic

distortions in magnetic position digitizers. Med Phys 1998;

25(11):2242-8.

33.Heiland M, Habermann CR, Schmelzle R. Indications and

limitations of intraoperative navigation in maxillofacial surgery. J Oral Maxillofac Surg 2004; 62(9):1059-63.

34.Henderson JM, Holloway KL, Gaede SE et al.The application

accuracy of a skull-mounted trajectory guide system for image-guided functional neurosurgery. Comput Aided Surg 2004;

9(4):155-60.

35. Khadem R, Yeh CC, Sadeghi-Tehrani M et al. Comparative tracking error analysis of five different optical tracking systems. Comput Aided Surg 2000; 5(2):98-107.

36.Casap N, Wexler A, Persky N et al. Navigation surgery for dental

implants: assessment of accuracy of the Image Guided Implantology System. J Oralmaxillofac Surg 2004; 62(Suppl

2):116-9.

37.Watzinger F, Birkfellner W, Wanschitz F et al.Positioning of

dental implants using computer-aided navigation and an optical tracking system: case report and presentation of a new method. J Craniomaxillofac Surg 1999; 27(2):77-81.

38. Wagner A, Wanschitz F, Birkfellner W et al. Computer-aided placement of endosseous oral implants in patients after ablative tumour surgery: assessment of accuracy. Clin Oral Implants Res 2003; 14(3):340-8.

39. Meyer U, Wiesmann HP, Runte C et al. Evaluation of accuracy of insertion of dental implants and prosthetic treatment by computer-aided navigation in minipigs. Br J Oral Maxillofac Surg 2003; 41(2):102-8.

40. Hassfeld S, Raczkowsky J, Bohner P et al. Robotik in der Mund-, Kiefer- und Gesichtschirurgie. Möglichkeiten - Chancen - Risiken. Mund Kiefer GesichtsChir 1997; 1:316-23.

41.Rockall TA, Darzi AW. Tele-manipulator robots in surgery. Br J

Surg 2003; 90(6):641-3.

42.Darzi A, Mackay S. Recent advances in minimal access surgery.

BMJ 2002; 324(7328):31-4.

43. Das H, Ohm T, Boswell C et al.Dexterity-enhanced telerobotic

microsurgery. Proceedings of the 8th International Conference on Advanced Robotics. IEEE, 1997: 5-10.

44.Cleary K, Nguyen C. State of the art in surgical robotics: clinical

applications and technology challenges. Comput Aided Surg 2001;

6(6):312-28.

45.Lueth T, Bier J. Robot assisted intervention in surgery. Gilsbach

JM, Stiehl HS, eds. Neuronavigation - neurosurgical and computer scientific aspects. Springer Verlag, 1999.

46.Bier J. Robotik. Mund Kiefer GesichtsChir 2000; 4(Suppl 1):356-

68.

47.Steinhart H, Bumm K, Wurm J et al. Surgical application of a new

robotic system for paranasal sinus surgery. Ann Otol Rhinol Laryngol 2004; 113(4):303-9.

48.Bumm K, Wurm J, Rachinger J et al.An automated robotic

approach with redundant navigation for minimal invasive extended transsphenoidal skull base surgery. Minim Invasive Neurosurg 2005; 48(3):159-64.

49.Wurm J, Bumm K, Steinhart H et al.[Development of an active

robot system for multi-modal paranasal sinus surgery]. HNO 2005;

53(5):446-54.

50.Woern H, Muehling J. Computer- and robot-based operation

theater of the future in cranio-facial surgery. Lemke HU, Vannier MV, Inamura K et al., eds. International Congress Series. Vol.

1230. 2001: 753-9.

51.Strauss G, Koulechov K, Richter R et al. [Navigated control: a new

concept in computer assisted ENT-surgery]. Laryngorhinootologie 2005; 84(8):567-76.

52.Brief J, Edinger D, Hassfeld S et al.Accuracy of image-guided

implantology. Clin Oral Implants Res 2005; 16(4):495-501.

53.Maurer CR Jr, Fitzpatrick JM, Wang MY et al.Registration of

head volume images using implantable fiducial markers. IEEE Trans Med Imaging 1997; 16(4):447-62.

54.West JB, Fitzpatrick JM, Toms SA et al. Fiducial point placement

and the accuracy of point-based, rigid body registration.

Neurosurgery 2001; 48(4):810-6; discussion 816-7.

55.Alp MS, Dujovny M, Misra M et al. Head registration techniques

for image-guided surgery. Neurol Res 1998; 20(1):31-7.

56. Bale RJ, Burtscher J, Eisner W et al. Computer-assisted neurosurgery by using a noninvasive vacuum-affixed dental cast that acts as a reference base: another step toward a unified approach in the treatment of brain tumors. J Neurosurg 2000; 93(2):208-13.

57.Bale RJ, Hoser C, Rosenberger R et al.Osteochondral lesions of

the talus: computer-assisted retrograde drilling--feasibility and accuracy in initial experiences. Radiology 2001; 218(1):278-82. 58.Schultes G, Gaggl A. [Ct-assisted navigation for insertion of dental

implants in maxilla models]. Schweiz Monatsschr Zahnmed 2001;

111(7):828-33.

59.Widmann G, Widmann R, Widmann E et al. In vitro accuracy of a

novel registration and targeting technique for image-guided template production. Clin Oral Implants Res 2005; 16(4):502-8. 60.Klug C, Schicho K, Ploder O et al.Point-to-point computer-

assisted navigation for precise transfer of planned zygoma osteotomies from the stereolithographic model into reality. J Oral Maxillofac Surg 2006; 64(3):550-9.

61. Anon JB, Lipman SP, Oppenheim D et al. Computer-assisted endoscopic sinus surgery. Laryngoscope 1994; 104(7):901-5.

62. Caversaccio M, Zulliger D, Bachler R et al. Practical aspects for

optimal registration (matching) on the lateral skull base with an optical frameless computer-aided pointer system. Am J Otol 2000;

21(6):863-70.

63.Wolfsberger S, Rossler K, Regatschnig R et al.Anatomical

landmarks for image registration in frameless stereotactic neuronavigation. Neurosurg Rev 2002; 25(1-2):68-72.

64.Germano IM, Villalobos H, Silvers A et al.Clinical use of the

optical digitizer for intracranial neuronavigation. Neurosurgery 1999; 45(2):261-9; discussion 269-70.

65.Helm PA, Eckel TS. Accuracy of registration methods in frameless

stereotaxis. Comput Aided Surg 1998; 3(2):51-6.

66.Maurer CR Jr, Maciunas RJ, Fitzpatrick JM. Registration of head

CT images to physical space using a weighted combination of points and surfaces. IEEE Trans Med Imaging 1998; 17(5):753-61.

67.Marmulla R, Muhling J, Luth T et al. Advanced surface-recording

techniques for computer-assisted oral and maxillofacial surgery. Br J Oral Maxillofac Surg 2004; 42(6):511-9.

68.Marmulla R, Hassfeld S, Luth T et al. Laser-scan-based navigation

in cranio-maxillofacial surgery. J Craniomaxillofac Surg 2003;

31(5):267-77.

69.Marmulla R, Luth T, Muhling J et al. Automated laser registration

in image-guided surgery: evaluation of the correlation between laser scan resolution and navigation accuracy. Int J Oral Maxillofac Surg 2004; 33(7):642-8.

70.Marmulla R, Muhling J, Wirtz CR et al.High-resolution laser

surface scanning for patient registration in cranial computer-assisted surgery. Minim Invasive Neurosurg 2004; 47(2):72-8. 71.Edwards PJ, King AP, Maurer CR Jr et al. Design and evaluation

of a system for microscope-assisted guided interventions (MAGI).

IEEE Trans Med Imaging 2000; 19(11):1082-93.

72.Birkfellner W, Solar P, Gahleitner A et al. In-vitro assessment of a

registration protocol for image guided implant dentistry. Clin Oral Implants Res 2001; 12(1):69-78.

73.Ewers R, Schicho K, Truppe M et al. Computer-aided navigation

in dental implantology: 7 years of clinical experience. J Oral Maxillofac Surg 2004; 62(3):329-34.

74.Schmelzeisen R, Gellrich NC, Schramm A et al.Navigation-

guided resection of temporomandibular joint ankylosis promotes safety in skull base surgery. J Oral Maxillofac Surg 2002;

60(11):1275-83.

75.Kniha F, Gahlert M, Lassen T et al.Anwendung der CT-

unterstützten Navigation in der dentalen Implantation unter besonderer Berücksichtigung der Sofortbelastung. Quintessenz Zahntech 2003; 29(7):842-63.

76.Schultes G, Zimmermann V, Feichtinger M et al.Removal of

osteosynthesis material by minimally invasive surgery based on 3-dimensional computed tomography-guided navigation. J Oral Maxillofac Surg 2003; 61(3):401-5.

77.Neugebauer J, Karapetian VE, Schuler M et al.Fabrication of

surgical template for CT-based implant planning. Int Poster J Dent Oral Med 2004; 6(4):Poster 248.

78.Blanchet E, Lucchini JP, Jenny R et al. An image-guided system

based on custom templates: case reports. Clin Implant Dent Relat Res 2004; 6(1):40-7.

79.Kramer FJ, Baethge C, Swennen G et al.Navigated vs.

conventional implant insertion for maxillary single tooth replacement. Clin Oral Impl Res 2005; 16:60-8.

80.Eggers G, Haag C, Hassfeld S. Image-guided removal of foreign

bodies. Br J Oral Maxillofac Surg 2005; 43(5):404-9.

81.Eggers G, Muhling J, Marmulla R. Template-based registration for

image-guided maxillofacial surgery. J Oral Maxillofac Surg 2005;

63(9):1330-6.

82.Martin A, Bale RJ, Vogele M et al.Vogele-Bale-Hohner

mouthpiece: registration device for frameless stereotactic surgery.

Radiology 1998; 208(1):261-5.

83.Sweeney RA, Bale R, Auberger T et al. A simple and non-invasive

vacuum mouthpiece-based head fixation system for high precision radiotherapy. Strahlenther Onkol 2001; 177(1):43-7.

84.Sweeney RA, Bale RJ, Moncayo R et al.Multimodality cranial

image fusion using external markers applied via a vacuum mouthpiece and a case report. Strahlenther Onkol 2003;

179(4):254-60.

85.Bale RJ, Vogele M, Freysinger W et al. Minimally invasive head

holder to improve the performance of frameless stereotactic surgery. Laryngoscope 1997; 107(3):373-7.

86.Bale RJ, Vogele M, Martin A et al. VBH head holder to improve

frameless stereotactic brachytherapy of cranial tumors. Comput Aided Surg 1997; 2(5):286-91.

87.Bale RJ, Freysinger W, Gunkel AR et al. Head and neck tumors:

fractionated frameless stereotactic interstitial brachytherapy-initial experience. Radiology 2000; 214(2):591-5. 88.Bale RJ, Laimer I, Martin A et al.Frameless stereotactic

cannulation of the foramen ovale for ablative treatment of trigeminal neuralgia. Neurosurgery 2006; 59(4 Suppl 2):ONS394-401; discussion ONS402.

89.Burtscher J, Sweeney R, Bale RJ et al. Neuroendoscopy based on

computer assisted adjustment of the endoscope holder in the laboratory. Minim Invas Neurosurg 2003; 46:208-14.

90.Kainz H, Bale R, Donnemiller E et al.Image fusion analysis of

99m Tc-HYNIC-octreotide scintigraphy and CT/MRI in patients with thyroid-associated orbitopathy: the importance of the lacrimal gland. Eur J Nucl Med Mol Imaging 2003; 30(8):1155-9.

91. Profanter C, Prommegger R, Gabriel M et al. Computed axial tomography-MIBI image fusion for preoperative localization in primary hyperparathyroidism. Am J Surg 2004; 187(3):383-7.

92.Profanter C, Wetscher GJ, Gabriel M et al. CT-MIBI image fusion:

a new preoperative localization technique for primary, recurrent,

and persistent hyperparathyroidism. Surgery 2004; 135(2):157-62.

93. Raabe A, Krishnan R, Wolff R et al. Laser surface scanning for patient registration in intracranial image-guided surgery. Neurosurgery 2002; 50(4):797-801; discussion 802-3.

94.Marmulla R, Eggers G, Muhling J. Laser surface registration for

lateral skull base surgery. Minim Invasive Neurosurg 2005;

48(3):181-5.

95.Hajioff D, Dorward NL, Wadley JP et al.Precise cannulation of

the foramen ovale in trigeminal neuralgia complicating osteogenesis imperfecta with basilar invagination: technical case report. Neurosurgery 2000; 46(4):1005-8.

96.Dorward NL, Alberti O, Dijkstra A et al. Clinical introduction of

an adjustable rigid instrument holder for frameless stereotactic interventions. Comput Aided Surg 1997; 2(3-4):180-5.

97.Patel N, Sandeman D. A simple trajectory guidance device that

assists freehand and interactive image guided biopsy of small deep intracranial targets. Comput Aided Surg 1997; 2(3-4):186-92.

98.Germano IM, Queenan JV. Clinical experience with intracranial

brain needle biopsy using frameless surgical navigation. Comput Aided Surg 1998; 3(1):33-9.

99.Wagner A, Rasse M, Millesi W et al.Virtual reality for

orthognathic surgery: the augmented reality environment concept.

J Oral Maxillofac Surg 1997; 55(5):456-62; discussion 462-3. 100.W agner A, Kremser J, Watzinger F et al.[Telenavigation and expert consultation using a stereotaxic surgical videoserver]. Mund Kiefer Gesichtschir 2000; 4 Suppl 1:S369-74.

101.E nislidis G, Wagner A, Ploder O et https://www.sodocs.net/doc/346990627.html,puted intraoperative navigation guidance--a preliminary report on a new technique. Br J Oral Maxillofac Surg 1997; 35(4):271-4.

102.E dwards PJ, King AP, Hawkes DJ et al. Stereo augmented reality in the surgical microscope. Stud Health Technol Inform 1999;

62:102-8.

103.W anschitz F, Birkfellner W, Figl M et https://www.sodocs.net/doc/346990627.html,puter-enhanced stereoscopic vision in a head-mounted display for oral implant surgery. Clin Oral Implants Res 2002; 13(6):610-6.

104.K ahrs LA, Hoppe H, Eggers G et al. Visualization of surgical 3D information with projector-based augmented reality. Stud Health Technol Inform 2005; 111:243-6.

105.R oessler K, Ungersboeck K, Dietrich W et al.Frameless stereotactic guided neurosurgery: clinical experience with an infrared based pointer device navigation system. Acta Neurochir (Wien) 1997; 139(6):551-9.

106.L evy ML, Day JD, Albuquerque F et al.Heads-up intraoperative endoscopic imaging: a prospective evaluation of techniques and limitations. Neurosurgery 1997; 40(3):526-30; discussion 530-1. 107.Z heng G, Caversaccio M, Bachler R et al.Frameless optical computer-aided tracking of a microscope for otorhinology and skull base surgery. Arch Otolaryngol Head Neck Surg 2001;

127(10):1233-8.

108.F einer SK. Augmented reality: a new way of seeing. Sci Am 2002;

286(4):44-55.

109.B irkfellner W, Figl M, Huber K et al. A head-mounted operating binocular for augmented reality visualization in medicine--design and initial evaluation. IEEE Trans Med Imaging 2002; 21(8):991-7. 110.B irkfellner W, Figl M, Matula C et https://www.sodocs.net/doc/346990627.html,puter-enhanced stereoscopic vision in a head-mounted operating binocular. Phys Med Biol 2003; 48(3):49-57.

111.M armulla R, Hoppe H, Muhling J et al.An augmented reality system for image-guided surgery. Int J Oral Maxillofac Surg 2005;

34(6):594-6.

112.E ggers G, Salb T, Hoppe H et al. Intraoperative augmented reality: the surgeons view. Stud Health Technol Inform 2005; 111:123-5. 113.S trauss G, Hofer M, Korb W et al. Accuracy and precision in the evaluation of computer assisted surgical systems. A definition.

HNO 2006; 54(2):78-84.

114.F itzpatrick JM, West JB, Maurer CR Jr. Predicting error in rigid-body point-based registration. IEEE Trans Med Imaging 1998;

17(5):694-702.

115.F itzpatrick JM, West JB. The distribution of target registration error in rigid-body point-based registration. IEEE Trans Med Imaging 2001; 20(9):917-27.

116.W idmann G, Bale RJ. Accuracy in computer-aided implant surgery--a review. Int J Oral Maxillofac Implants 2006; 21(2):305-

13.

117.C ohen DS, Lustgarten JH, Miller E et al. Effects of coregistration of MR to CT images on MR stereotactic accuracy. J Neurosurg 1995; 82(5):772-9.

118.G olfinos JG, Fitzpatrick BC, Smith LR et al.Clinical use of a frameless stereotactic arm: results of 325 cases. J Neurosurg 1995;

83(2):197-205.

119.D orward NL, Alberti O, Palmer JD et al.Accuracy of true frameless stereotaxy: in vivo measurement and laboratory phantom studies. Technical note. J Neurosurg 1999; 90(1):160-8.

120.G alloway RL Jr, Maciunas RJ, Latimer JW. The accuracies of four stereotactic frame systems: an independent assessment. Biomed Instrum Technol 1991; 25(6):457-60.

121.C asap N, Kreiner B, Wexler A et al. Flapless approach for removal of bone graft fixing screws and placement of dental implants using computerized navigation: a technique and case report. Int J Oral Maxillofac Implants 2006; 21(2):314-9.

122.M armulla R, Niederdellmann H. Computer-assisted bone segment navigation. J Cranio Maxillofac Surg 1998; 26:347-59.

123.M armulla R, Niederdellmann H. Surgical planning of computer-assisted repositioning osteotomies. Plast Reconstr Surg 1999;

104(4):938-44.

124.W agner A, Undt G, Watzinger F et al.Principles of computer-assisted arthroscopy of the temporomandibular joint with optoelectronic tracking technology. Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2001; 92(1):30-7.

125.B ale RJ, Laimer I, Schlager A, et al. Frameless stereotactic cannulation of the foramen ovale for ablative treatment of trigeminal neuralgia. Neurosurgery 2006;59(4 Suppl 2):ONS394-ONS402,

126.C arrau RL, Snyderman CH, Curtin HD et https://www.sodocs.net/doc/346990627.html,puter-assisted intraoperative navigation during skull base surgery. Am J Otolaryngol 1996; 17(2):95-101.

127.C aversaccio M, Nolte LP, Hausler R. Present state and future perspectives of computer aided surgery in the field of ENT and skull base. Acta Otorhinolaryngol Belg 2002; 56(1):51-9.

128.C aversaccio M, Romualdez J, Baechler R et al.Valuable use of computer-aided surgery in congenital bony aural atresia. J Laryngol Otol 2003; 117(4):241-8.

129.K orb W, Marmulla R, Raczkowsky J et al. Robots in the operating theatre--chances and challenges. Int J Oral Maxillofac Surg 2004;

33(8):721-32.

130.K lein M, Hein A, Lueth T et al.Robot-assisted placement of craniofacial implants. Int J Oral Maxillofac Implants 2003;

18(5):712-8.

131.W eihe S, Wehmoller M, Schliephake H et al.Synthesis of CAD/CAM, robotics and biomaterial implant fabrication: single-step reconstruction in computer-aided frontotemporal bone resection. Int J Oral Maxillofac Surg 2000; 29(5):384-8.

132.K avanagh KT. Applications of image-directed robotics in otolaryngologic surgery. Laryngoscope 1994; 104(3 Pt 1):283-93. 133.B rief J, Hassfeld S, Redlich T et al.Robot assisted insertion of dental implants - a clinical evaluation. Lemke HU, Vannier MV, Inamura K et al., eds. CARS 2000. Elsevier Science B.V., 2000: 932-7.

134.B rief J, Hassfeld S, Boeseke R et al.Int Poster J Dent Oral Med 2002; 4(1):Poster 109.

135.G oulette F, Dutreuil J, Laurgeau et al. A new method and clinical case for computer assisted dental implantology. Lemke HU, Vannier MV, Inamura K et al., eds. CARS 2002. Elsevier Science

B.V., 2002: 953-8.

136.S chermeier O, Hildebrand D, Lueth TC et al.Accuracy of an image-guided system for oral implantology. Lemke HU, Vannier

MV, Inamura K et al., eds. International Congress Series. Vol.

1230. 2001: 748-52.

137.S arment DP, Sukovic P, Clinthorne N. Accuracy of implant placement with a stereolithographic surgical guide. Int J Oral Maxillofac Implants 2003; 18(4):571-7.

138.S arment DP, Al-Shammari K, Kazor CE. Stereolithographic surgical templates for placement of dental implants in complex cases. Int J Periodont Restor Dent 2003; 23(3):287-95.

139.P aleologos TS, Dorward NL, Wadley JP et al. Clinical validation of true frameless stereotactic biopsy: analysis of the first 125 consecutive cases. Neurosurgery 2001; 49(4):830-5; discussion 835-7.

140.F ried MP, Moharir VM, Shin J et https://www.sodocs.net/doc/346990627.html,parison of endoscopic sinus surgery with and without image guidance. Am J Rhinol 2002;

16(4):193-7.

141.T abaee A, Kacker A, Kassenoff TL et al.Outcome of computer-assisted sinus surgery: a 5-year study. Am J Rhinol 2003;

17(5):291-7.

142.H owe RD, Matsuoka Y. Robotics for surgery. Annu Rev Biomed Eng 1999; 1:211-40.

143.W ittwer G, Adeyemo WL, Schicho K et https://www.sodocs.net/doc/346990627.html,puter-guided flapless transmucosal implant placement in the mandible: a new combination of two innovative techniques. Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2006; 101(6):718-23.

144.W idmann G, Widmann R, Widmann E et https://www.sodocs.net/doc/346990627.html,e of surgical navigation systems for CT-guided template production. Int J Oral Maxillofac Implants 2007; 22(1):72-78.
