A typical operation in otolaryngology takes place under a microscope. The surgeon works with a large set of very delicate instruments, which are managed by a scrub nurse/tech. In practice, the surgeon is intimately familiar with all of the instruments, which can number in the hundreds, but the nurse may not be. This situation can lead to miscommunication and longer operations, and any delay is a costly proposition: typical procedure costs can run $250 per 15-minute interval, so every minute counts. The scrub nurse's responsibilities include the organization and sterilization of the instruments. When the surgeon requests an instrument during the operation, the technician must identify it from a list of hundreds and place it correctly in the surgeon's hand. Instruments may differ from one another only microscopically and look very similar, yet each costs several hundred dollars and is difficult to replace or repair.

This study examines the steps needed to automate the activities of a scrub nurse. The first step toward this goal is the study of haptic and spoken-language interfaces: the robot must know when to pick up a new instrument and when to release it. Next, the robot must be able to plan a stable, collision-free motion and grasp. To enable all of this, vision-based localization of the surgeon's hands, the instrument palette, and any moving obstacles is necessary. A sketch of how these subsystems might fit together is given below.
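To make the division of labor concrete, the following Python sketch models a single instrument-handoff cycle as a small state machine. It is illustrative only, not an implementation from this study; the phase and event names are hypothetical labels for signals that the speech, vision, planning, and haptic subsystems described above would emit.

```python
from enum import Enum, auto


class Phase(Enum):
    """Phases of one instrument-handoff cycle (hypothetical decomposition)."""
    IDLE = auto()      # waiting for a spoken request from the surgeon
    LOCATE = auto()    # vision: find the requested instrument on the palette
    PLAN = auto()      # compute a stable, collision-free motion and grasp
    HANDOFF = auto()   # move to the surgeon's hand and wait for take-over
    RETRIEVE = auto()  # accept the returned instrument and restock it


def step(phase: Phase, event: str) -> Phase:
    """Advance the cycle on an interface or perception event.

    Event names ('request', 'localized', 'planned', 'released',
    'returned') are assumed placeholders for subsystem outputs.
    """
    transitions = {
        (Phase.IDLE, "request"): Phase.LOCATE,        # spoken command recognized
        (Phase.LOCATE, "localized"): Phase.PLAN,      # instrument pose known
        (Phase.PLAN, "planned"): Phase.HANDOFF,       # collision-free path found
        (Phase.HANDOFF, "released"): Phase.RETRIEVE,  # haptics: surgeon took it
        (Phase.RETRIEVE, "returned"): Phase.IDLE,     # instrument back on palette
    }
    # Out-of-order events leave the phase unchanged rather than raising.
    return transitions.get((phase, event), phase)


if __name__ == "__main__":
    phase = Phase.IDLE
    for event in ["request", "localized", "planned", "released", "returned"]:
        phase = step(phase, event)
        print(f"{event} -> {phase.name}")
```

Framing the cycle this way separates the interface question (when to pick up and release, driven by speech and haptics) from the planning and vision questions, which is the same decomposition the study pursues.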