
Equilibrium

Balance System Function

Input

Perception and control of the orientation and motion of the body in space are achieved by a system that involves simultaneous input from three sources: vision, the vestibular organ in the inner ear, and sensors in the muscles, joints and skin that provide somatosensory or “proprioceptive” information about movement of the body and physical contact with the environment (figure 1). The combined input is integrated in the central nervous system, which generates appropriate actions to restore and maintain balance, coordination and well-being. Failure of compensation in any part of the system may produce unease, dizziness and unsteadiness, which in turn can give rise to distressing symptoms and to falls.

Figure 1.  An outline of the principal elements of the balance system


The vestibular system directly registers the orientation and movement of the head. The vestibular labyrinth is a tiny bony structure located in the inner ear, and comprises the semicircular canals, filled with fluid (endolymph), and the otoliths (figure 2). The three semicircular canals are positioned at right angles to one another, so that acceleration can be detected in each of the three possible planes of angular motion. During head turns, the relative movement of the endolymph within the canals (caused by inertia) results in deflection of the cilia projecting from the sensory cells, inducing a change in the neural signal from these cells (figure 2). The otoliths contain heavy crystals (otoconia) which respond to changes in the position of the head relative to the force of gravity and to linear acceleration or deceleration, again bending the cilia and so altering the signal from the sensory cells to which they are attached.


Figure 2. Schematic diagram of the vestibular labyrinth.


Figure 3. Schematic representation of the biomechanical effects of a ninety-degree (forward) inclination of the head.


Integration

The central interconnections within the balance system are extremely complex; information from the vestibular organs in both ears is combined with information derived from vision and the somatosensory system at various levels within the brainstem, cerebellum and cortex (Luxon 1984).

Output

This integrated information provides the basis not only for the conscious perception of orientation and self-motion, but also for the preconscious control of eye movements and posture, by means of what are known as the vestibulo-ocular and vestibulospinal reflexes. The purpose of the vestibulo-ocular reflex is to maintain a stable point of visual fixation during head movement by automatically compensating for the head movement with an equivalent eye movement in the opposite direction (Howard 1982). The vestibulospinal reflexes contribute to postural stability and balance (Pompeiano and Allum 1988).

Balance System Dysfunction

In normal circumstances, the input from the vestibular, visual and somatosensory systems is congruent, but if an apparent mismatch occurs between the different sensory inputs to the balance system, the result is a subjective sensation of dizziness, disorientation or an illusory sense of movement. If the dizziness is prolonged or severe it will be accompanied by secondary symptoms such as nausea, cold sweating, pallor, fatigue and even vomiting. Disruption of reflex control of eye movements and posture may result in a blurred or flickering visual image, a tendency to veer to one side when walking, or staggering and falling. The medical term for the disorientation caused by balance system dysfunction is “vertigo”, which can be caused by a disorder of any of the sensory systems contributing to balance or by faulty central integration. Only 1 to 2% of the population consult their doctor each year on account of vertigo, but the incidence of dizziness and imbalance rises steeply with age. “Motion sickness” is a form of disorientation induced by artificial environmental conditions with which our balance system has not been equipped by evolution to cope, such as passive transport by car or boat (Crampton 1990).

Vestibular causes of vertigo

The most common causes of vestibular dysfunction are infection (vestibular labyrinthitis or neuronitis) and benign paroxysmal positional vertigo (BPPV), which is triggered principally by lying on one side. Recurrent attacks of severe vertigo accompanied by loss of hearing and noises (tinnitus) in one ear are typical of a syndrome known as Ménière’s disease. Vestibular damage can also result from disorders of the middle ear (including bacterial disease, trauma and cholesteatoma), ototoxic drugs (which should be used only in medical emergencies), and head injury.

Non-vestibular peripheral causes of vertigo

Disorders of the neck, which may alter the somatosensory information relating to head movement or interfere with the blood supply to the vestibular system, are believed by many clinicians to be a cause of vertigo. Common aetiologies include whiplash injury and arthritis. Sometimes unsteadiness is related to a loss of feeling in the feet and legs, which may be caused by diabetes, alcohol abuse, vitamin deficiency, damage to the spinal cord or a number of other disorders. Occasionally the origin of feelings of giddiness or illusory movement of the environment can be traced to some distortion of the visual input. An abnormal visual input may be caused by weakness of the eye muscles, or may be experienced when adjusting to powerful lenses or to bifocal glasses.

Central causes of vertigo

Although most cases of vertigo are attributable to peripheral (mainly vestibular) pathology, symptoms of disorientation can be caused by damage to the brainstem, cerebellum or cortex. Vertigo due to central dysfunction is almost always accompanied by some other symptom of central neurological disorder, such as sensations of pain, tingling or numbness in the face or limbs, difficulty speaking or swallowing, headache, visual disturbances, and loss of motor control or loss of consciousness. The more common central causes of vertigo include disorders of the blood supply to the brain (ranging from migraine to strokes), epilepsy, multiple sclerosis, alcoholism and, occasionally, tumours. Temporary dizziness and imbalance are potential side effects of a vast array of drugs, including widely used analgesics, contraceptives, and drugs used in the control of cardiovascular disease, diabetes and Parkinson’s disease, and in particular the centrally acting drugs such as stimulants, sedatives, anti-convulsants, anti-depressants and tranquillizers (Ballantyne and Ajodhia 1984).

Diagnosis and treatment

All cases of vertigo require medical attention in order to ensure that the (relatively uncommon) dangerous conditions which can cause vertigo are detected and appropriate treatment is given. Medication can be given to relieve symptoms of acute vertigo in the short term, and in rare cases surgery may be required. However, if the vertigo is caused by a vestibular disorder the symptoms will generally subside over time as the central integrators adapt to the altered pattern of vestibular input—in the same way that sailors continuously exposed to the motion of waves gradually acquire their “sea legs”. For this to occur, it is essential to continue to make vigorous movements which stimulate the balance system, even though these will at first cause dizziness and discomfort. Since the symptoms of vertigo are frightening and embarrassing, sufferers may need physiotherapy and psychological support to combat the natural tendency to restrict their activities (Beyts 1987; Yardley 1994).

Vertigo in the Workplace

Risk factors

Dizziness and disorientation, which may become chronic, are common symptoms in workers exposed to organic solvents; furthermore, long-term exposure can result in objective signs of balance system dysfunction (e.g., abnormal vestibulo-ocular reflex control) even in people who experience no subjective dizziness (Gyntelberg et al. 1986; Möller et al. 1990). Changes in pressure encountered when flying or diving can cause damage to the vestibular organ which results in sudden vertigo and hearing loss requiring immediate treatment (Head 1984). There is some evidence that noise-induced hearing loss can be accompanied by damage to the vestibular organs (van Dijk 1986). People who work for long periods at computer screens sometimes complain of dizziness; the cause of this remains unclear, although it may be related to the combination of a stiff neck and moving visual input.

Occupational difficulties

Unexpected attacks of vertigo, such as occur in Ménière’s disease, can cause problems for people whose work involves heights, driving, handling dangerous machinery, or responsibility for the safety of others. An increased susceptibility to motion sickness is a common effect of balance system dysfunction and may interfere with travel.

Conclusion

Equilibrium is maintained by a complex multisensory system, and so disorientation and imbalance can result from a wide variety of aetiologies, in particular any condition which affects the vestibular system or the central integration of perceptual information for orientation. In the absence of central neurological damage the plasticity of the balance system will normally enable the individual to adapt to peripheral causes of disorientation, whether these are disorders of the inner ear which alter vestibular function, or environments which provoke motion sickness. However, attacks of dizziness are often unpredictable, alarming and disabling, and rehabilitation may be necessary to restore confidence and assist the balance function.



Physically-Induced Hearing Disorders

By virtue of its position within the skull, the auditory system is generally well protected against injuries from external physical forces. There are, however, a number of physical workplace hazards that may affect it. They include:

Barotrauma. Sudden variation in barometric pressure (due to rapid underwater descent or ascent, or sudden aircraft descent) associated with malfunction of the Eustachian tube (failure to equalize pressure) may lead to rupture of the tympanic membrane with pain and haemorrhage into the middle and external ears. In less severe cases stretching of the membrane will cause mild to severe pain. There will be a temporary impairment of hearing (conductive loss), but generally the trauma has a benign course with complete functional recovery.

Vibration. Simultaneous exposure to vibration and noise (continuous or impact) does not increase the risk or severity of sensorineural hearing loss; however, the rate of onset appears to be increased in workers with hand-arm vibration syndrome (HAVS). The cochlear circulation is presumed to be affected by reflex sympathetic spasm, when such workers have bouts of vasospasm (Raynaud’s phenomenon) in their fingers or toes.

Infrasound and ultrasound. The acoustic energy from both of these sources is normally inaudible to humans. The common sources of ultrasound (for example, jet engines, high-speed dental drills, and ultrasonic cleaners and mixers) all emit audible sound, so the effects of ultrasound on exposed subjects are not easily discernible. Ultrasound is presumed to be harmless below 120 dB and therefore unlikely to cause noise-induced hearing loss (NIHL). Likewise, low-frequency noise is relatively safe, but at high intensity (119-144 dB), hearing loss may occur.

“Welder’s ear”. Hot sparks may penetrate the external auditory canal to the level of the tympanic membrane, burning it. This causes acute ear pain and sometimes facial nerve paralysis. With minor burns, the condition requires no treatment, while in more severe cases, surgical repair of the membrane may be necessary. The risk may be avoided by correct positioning of the welder’s helmet or by wearing ear plugs.



Chemically-Induced Hearing Disorders

Hearing impairment due to the cochlear toxicity of several drugs is well documented (Ryback 1993), but until the last decade little attention was paid to the audiological effects of industrial chemicals. Recent research on chemically-induced hearing disorders has focused on solvents, heavy metals and chemicals inducing anoxia.

Solvents. In studies with rodents, a permanent decrease in auditory sensitivity to high-frequency tones has been demonstrated following weeks of high-level exposure to toluene. Histopathological and auditory brainstem response studies have indicated a major effect on the cochlea with damage to the outer hair cells. Similar effects have been found in exposure to styrene, xylenes or trichloroethylene. Carbon disulphide and n-hexane may also affect auditory functions while their major effect seems to be on more central pathways (Johnson and Nylén 1995).

Several human cases with damage to the auditory system together with severe neurological abnormalities have been reported following solvent sniffing. In case series of persons with occupational exposure to solvent mixtures, to n-hexane or to carbon disulphide, both cochlear and central effects on auditory functions have been reported. Exposure to noise was prevalent in these groups, but the effect on hearing has been considered greater than would be expected from the noise alone.

Only a few controlled studies have so far addressed the problem of hearing impairment in humans exposed to solvents without significant noise exposure. In a Danish study, a statistically significant elevated risk of self-reported hearing impairment (relative risk 1.4; 95% CI 1.1-1.9) was found after exposure to solvents for five years or more. In a group exposed to both solvents and noise, no additional effect from solvent exposure was found. Good agreement between reported hearing problems and audiometric criteria for hearing impairment was found in a subsample of the study population (Jacobsen et al. 1993).

In a Dutch study of styrene-exposed workers a dose-dependent difference in hearing thresholds was found by audiometry (Muijser et al. 1988).

In another study from Brazil the audiologic effect from exposure to noise, toluene combined with noise, and mixed solvents was examined in workers in printing and paint manufacturing industries. Compared to an unexposed control group, significantly elevated risks for audiometric high frequency hearing loss were found for all three exposure groups. For noise and mixed solvent exposures the relative risks were 4 and 5 respectively. In the group with combined toluene and noise exposure a relative risk of 11 was found, suggesting interaction between the two exposures (Morata et al. 1993).

Metals. The effect of lead on hearing has been studied in surveys of children and teenagers from the United States. A significant dose-response association between blood lead and hearing thresholds at frequencies from 0.5 to 4 kHz was found after controlling for several potential confounders. The effect of lead was present across the entire range of exposure and could be detected at blood lead levels below 10 μg/100 ml. In children without clinical signs of lead toxicity, a linear relationship between blood lead and the latencies of waves III and V in brainstem auditory evoked potentials (BAEP) has been found, indicating a site of action central to the cochlear nucleus (Otto et al. 1985).

Hearing loss is described as a common part of the clinical picture in acute and chronic methyl-mercury poisoning. Both cochlear and postcochlear lesions have been involved (Oyanagi et al. 1989). Inorganic mercury may also affect the auditory system, probably through damage to cochlear structures.

Exposure to inorganic arsenic has been implicated in hearing disorders in children. A high frequency of severe hearing loss (>30 dB) has been observed in children fed with powdered milk contaminated with inorganic arsenic V. In a study from Czechoslovakia, environmental exposure to arsenic from a coal-burning power plant was associated with audiometric hearing loss in ten-year-old children. In animal experiments, inorganic arsenic compounds have produced extensive cochlear damage (WHO 1981).

In acute trimethyltin poisoning, hearing loss and tinnitus have been early symptoms. Audiometry has shown pancochlear hearing loss between 15 and 30 dB at presentation. It is not clear whether the abnormalities have been reversible (Besser et al. 1987). In animal experiments, trimethyltin and triethyltin compounds have produced partly reversible cochlear damage (Clerisi et al. 1991).

Asphyxiants. In reports on acute human poisoning by carbon monoxide or hydrogen sulphide, hearing disorders have often been noted along with central nervous system disease (Ryback 1992).

In experiments with rodents, exposure to carbon monoxide had a synergistic effect with noise on auditory thresholds and cochlear structures. No effect was observed after exposure to carbon monoxide alone (Fechter et al. 1988).

Summary

Experimental studies have documented that several solvents can produce hearing disorders under certain exposure circumstances. Studies in humans have indicated that the effect may be present following exposures that are common in the occupational environment. Synergistic effects between noise and chemicals have been observed in some human and experimental animal studies. Some heavy metals may affect hearing, most of them only at exposure levels that produce overt systemic toxicity. For lead, minor effects on hearing thresholds have been observed at exposures far below occupational exposure levels. A specific ototoxic effect from asphyxiants has not been documented at present although carbon monoxide may enhance the audiological effect of noise.



The Ear

Anatomy

The ear is the sensory organ responsible for hearing and the maintenance of equilibrium, via the detection of body position and of head movement. It is composed of three parts: the outer, middle, and inner ear; the outer ear lies outside the skull, while the other two parts are embedded in the temporal bone (figure 1).

Figure 1. Diagram of the ear.


The outer ear consists of the auricle, a cartilaginous skin-covered structure, and the external auditory canal, an irregularly-shaped cylinder approximately 25 mm long which is lined by glands secreting wax.

The middle ear consists of the tympanic cavity, an air-filled cavity whose outer wall forms the tympanic membrane (eardrum), and which communicates proximally with the nasopharynx by the Eustachian tube, which maintains pressure equilibrium on either side of the tympanic membrane. This communication explains, for instance, how swallowing equalizes pressure and restores the hearing acuity lost through a rapid change in barometric pressure (e.g., in landing airplanes or fast elevators). The tympanic cavity also contains the ossicles—the malleus, incus and stapes—which are controlled by the stapedius and tensor tympani muscles. The tympanic membrane is linked to the inner ear by the ossicles, specifically by the mobile foot of the stapes, which lies against the oval window.

The inner ear contains the sensory apparatus per se. It consists of a bony shell (the bony labyrinth) within which is found the membranous labyrinth—a series of cavities forming a closed system filled with endolymph, a potassium-rich liquid. The membranous labyrinth is separated from the bony labyrinth by the perilymph, a sodium-rich liquid.

The bony labyrinth itself is composed of two parts. The anterior portion is known as the cochlea and is the actual organ of hearing. It has a spiral shape reminiscent of a snail shell, and is pointed in the anterior direction. The posterior portion of the bony labyrinth contains the vestibule and the semicircular canals, and is responsible for equilibrium. The neurosensory structures involved in hearing and equilibrium are located in the membranous labyrinth: the organ of Corti is located in the cochlear canal, while the maculae of the utricle and the saccule and the ampullae of the semicircular canals are located in the posterior section.

Hearing organs

The cochlear canal is a spiral triangular tube, comprising two and one-half turns, which separates the scala vestibuli from the scala tympani. One edge is anchored to the spiral lamina, a process of the cochlea’s central column, while the other is connected, by the spiral ligament, to the bony wall of the cochlea.

The scala vestibuli and scala tympani end in the oval window (the foot of the stapes) and the round window, respectively. The two chambers communicate through the helicotrema, at the tip of the cochlea. The basilar membrane forms the inferior surface of the cochlear canal and supports the organ of Corti, which is responsible for the transduction of acoustic stimuli. All auditory information is transduced by only 15,000 hair cells (the organ of Corti), of which the so-called inner hair cells, numbering 3,500, are critically important, since they form synapses with approximately 90% of the 30,000 primary auditory neurons (figure 2). The inner and outer hair cells are separated from each other by an abundant layer of support cells. The cilia of the hair cells traverse an extraordinarily thin membrane and are embedded in the tectorial membrane, whose free end is located above the cells. The superior surface of the cochlear canal is formed by Reissner’s membrane.

Figure 2. Cross-section of one loop of the cochlea. Diameter: approximately 1.5 mm.


The bodies of the cochlear sensory cells resting on the basilar membrane are surrounded by nerve terminals, and their approximately 30,000 axons form the cochlear nerve. The cochlear nerve crosses the inner ear canal and extends to the central structures of the brain stem, the oldest part of the brain. The auditory fibres end their tortuous path in the temporal lobe, the part of the cerebral cortex responsible for the perception of acoustic stimuli.


Organs of Equilibrium

The sensory cells are located in the ampullae of the semicircular canals and the maculae of the utricle and saccule, and are stimulated by pressure transmitted through the endolymph as a result of head or body movements. The cells connect with bipolar cells whose peripheral processes form two tracts, one from the anterior and external semicircular canals, the other from the posterior semicircular canal. These two tracts enter the inner ear canal and unite to form the vestibular nerve, which extends to the vestibular nuclei in the brainstem. Fibres from the vestibular nuclei, in turn, extend to cerebellar centres controlling eye movements, and to the spinal cord.

The union of the vestibular and cochlear nerves forms the 8th cranial nerve, also known as the vestibulocochlear nerve.

Physiology of Hearing

Sound conduction through air

The ear is composed of a sound conductor (the outer and middle ear) and a sound receptor (the inner ear).

Sound waves passing through the external auditory canal strike the tympanic membrane, causing it to vibrate. This vibration is transmitted to the stapes through the malleus and incus. The surface area of the tympanic membrane is almost 16 times that of the foot of the stapes (55 mm²/3.5 mm²), and this, in combination with the lever mechanism of the ossicles, results in a 22-fold amplification of the sound pressure. Because of the middle ear’s resonant frequency, the transmission ratio is optimal between 1,000 and 2,000 Hz. As the foot of the stapes moves, it causes waves to form in the liquid within the vestibular canal. Since the liquid is incompressible, each inward movement of the foot of the stapes causes an equivalent outward movement of the round window, towards the middle ear.
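The arithmetic behind these figures is straightforward. The following minimal sketch (in Python, with variable names of our own choosing) reproduces them; the residual ossicular lever factor is simply inferred from the quoted 22-fold overall gain:

```python
import math

# Middle-ear pressure amplification, using the values quoted above:
# tympanic membrane ~55 mm^2, stapes footplate ~3.5 mm^2, overall gain ~22-fold.
area_tympanic_mm2 = 55.0
area_stapes_mm2 = 3.5

area_ratio = area_tympanic_mm2 / area_stapes_mm2  # ~15.7-fold from the area ratio alone
lever_factor = 22.0 / area_ratio                  # ~1.4-fold implied by the ossicular lever
gain_db = 20 * math.log10(22.0)                   # pressure gain expressed in decibels

print(f"Area ratio: {area_ratio:.1f}-fold")
print(f"Implied lever factor: {lever_factor:.2f}-fold")
print(f"22-fold pressure amplification = {gain_db:.1f} dB")
```

A 22-fold pressure gain thus corresponds to roughly 27 dB.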

When exposed to high sound levels, the stapes muscle contracts, protecting the inner ear (the attenuation reflex). In addition to this function, the muscles of the middle ear also extend the dynamic range of the ear, improve sound localization, reduce resonance in the middle ear, and control air pressure in the middle ear and liquid pressure in the inner ear.

Between 250 and 4,000 Hz, the threshold of the attenuation reflex is approximately 80 decibels (dB) above the hearing threshold, and increases by approximately 0.6 dB/dB as the stimulation intensity increases. Its latency is 150 ms at threshold, and 24-35 ms in the presence of intense stimuli. At frequencies below the natural resonance of the middle ear, contraction of the middle ear muscles attenuates sound transmission by approximately 10 dB. Because of its latency, the attenuation reflex provides adequate protection from noise generated at rates above two to three per second, but not from discrete impulse noise.

The speed with which sound waves propagate through the ear depends on the elasticity of the basilar membrane. The elasticity increases, and the wave velocity thus decreases, from the base of the cochlea to the tip. The transfer of vibration energy to Reissner’s membrane and the basilar membrane is frequency-dependent. At high frequencies, the wave amplitude is greatest at the base, while for lower frequencies it is greatest at the tip. Thus, the point of greatest mechanical excitation in the cochlea is frequency-dependent. This phenomenon underlies the ability to detect frequency differences.

Movement of the basilar membrane induces shear forces in the stereocilia of the hair cells and triggers a series of mechanical, electrical and biochemical events responsible for mechanical-sensory transduction and initial acoustic signal processing. The shear forces on the stereocilia cause ionic channels in the cell membranes to open, modifying the permeability of the membranes and allowing the entry of potassium ions into the cells. This influx of potassium ions results in depolarization and the generation of an action potential.

Neurotransmitters liberated at the synaptic junction of the inner hair cells as a result of depolarization trigger neuronal impulses which travel down the afferent fibres of the auditory nerve toward higher centres. The intensity of auditory stimulation depends on the number of action potentials per unit time and the number of cells stimulated, while the perceived frequency of the sound depends on the specific nerve fibre populations activated. There is a specific spatial mapping between the frequency of the sound stimulus and the section of the cerebral cortex stimulated.

The inner hair cells are mechanoreceptors which transform signals generated in response to acoustic vibration into electric messages sent to the central nervous system. They are not, however, responsible for the ear’s threshold sensitivity and its extraordinary frequency selectivity.

The outer hair cells, on the other hand, send no auditory signals to the brain. Rather, their function is to selectively amplify mechano-acoustic vibration at near-threshold levels by a factor of approximately 100 (i.e., 40 dB), and so facilitate stimulation of inner hair cells. This amplification is believed to function through micromechanical coupling involving the tectorial membrane. The outer hair cells can produce more energy than they receive from external stimuli and, by contracting actively at very high frequencies, can function as cochlear amplifiers.

In the inner ear, interference between outer and inner hair cells creates a feedback loop which permits control of auditory reception, particularly of threshold sensitivity and frequency selectivity. Efferent cochlear fibres may thus help reduce cochlear damage caused by exposure to intense acoustic stimuli. Outer hair cells may also undergo reflex contraction in the presence of intense stimuli. The attenuation reflex of the middle ear, active primarily at low frequencies, and the reflex contraction in the inner ear, active at high frequencies, are thus complementary.

Bone conduction of sound

Sound waves may also be transmitted through the skull. Two mechanisms are possible:

In the first, compression waves impacting the skull cause the incompressible perilymph to deform the round or oval window. As the two windows have differing elasticities, movement of the endolymph results in movement of the basilar membrane.

The second mechanism is based on the fact that movement of the ossicles induces movement in the scala vestibuli only. In this mechanism, movement of the basilar membrane results from the translational movement produced by inertia.

Bone conduction is normally 30-50 dB lower than air conduction—as is readily apparent when both ears are blocked. This is only true, however, for air-mediated stimuli, direct bone stimulation being attenuated to a different degree.

Sensitivity range

Mechanical vibration induces potential changes in the cells of the inner ear, conduction pathways and higher centres. Only frequencies of 16 Hz–25,000 Hz and sound pressures (these can be expressed in pascals, Pa) of 20 μPa to 20 Pa can be perceived. The range of sound pressures which can be perceived is remarkable—a 1-million-fold range! The detection thresholds of sound pressure are frequency-dependent, lowest at 1,000-6,000 Hz and increasing at both higher and lower frequencies.

For practical purposes, the sound pressure level is expressed in decibels (dB), a logarithmic measurement scale corresponding to perceived sound intensity relative to the auditory threshold. Thus, 20 μPa is equivalent to 0 dB. As the sound pressure increases tenfold, the decibel level increases by 20 dB, in accordance with the following formula:

Lx = 20 log10 (Px/P0)

where:

Lx = sound pressure in dB

Px = sound pressure in pascals

P0 = reference sound pressure (2×10⁻⁵ Pa, the auditory threshold)
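A short sketch applying this formula (the helper name is ours) confirms the values quoted above:

```python
import math

def spl_db(pressure_pa: float, p0: float = 2e-5) -> float:
    """Sound pressure level in dB re 20 uPa: Lx = 20 log10(Px/P0)."""
    return 20 * math.log10(pressure_pa / p0)

print(spl_db(2e-5))  # 0 dB: the auditory threshold
print(spl_db(2e-4))  # 20 dB: a tenfold pressure increase adds 20 dB
print(spl_db(20.0))  # 120 dB: the top of the 20 uPa - 20 Pa range
```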

The frequency-discrimination threshold, that is the minimal detectable difference in frequency, is 1.5 Hz up to 500 Hz, and 0.3% of the stimulus frequency at higher frequencies. At sound pressures near the auditory threshold, the sound-pressure-discrimination threshold is approximately 20%, although differences of as little as 2% may be detected at high sound pressures.
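These discrimination thresholds can be stated compactly in code. The following is a minimal sketch, assuming a sharp transition at 500 Hz where the text gives only approximate figures:

```python
def frequency_difference_limen(f_hz: float) -> float:
    """Minimal detectable frequency difference, per the values quoted above:
    about 1.5 Hz up to 500 Hz, about 0.3% of the stimulus frequency beyond."""
    return 1.5 if f_hz <= 500 else 0.003 * f_hz

for f in (250, 500, 1000, 4000):
    print(f"{f} Hz -> {frequency_difference_limen(f):.1f} Hz")
```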

If two sounds differ in frequency by a sufficiently small amount, only one tone will be heard. The perceived frequency of the tone will be midway between the two source tones, but its sound pressure level is variable. If two acoustic stimuli have similar frequencies but differing intensities, a masking effect occurs. If the difference in sound pressure is large enough, masking will be complete, with only the loudest sound perceived.

Localization of acoustic stimuli depends on the detection of the time lag between the arrival of the stimulus at each ear and, as such, requires intact bilateral hearing. The smallest detectable time lag is 3 × 10⁻⁵ seconds. Localization is facilitated by the head’s screening effect, which results in differences in stimulus intensity at each ear.

The remarkable ability of human beings to resolve acoustic stimuli is a result of frequency decomposition by the inner ear and frequency analysis by the brain. These are the mechanisms that allow individual sound sources such as individual musical instruments to be detected and identified in the complex acoustic signals that make up the music of a full symphony orchestra.

Physiopathology

Ciliary damage

The ciliary motion induced by intense acoustic stimuli may exceed the mechanical resistance of the cilia and cause mechanical destruction of hair cells. As these cells are limited in number and incapable of regeneration, any cell loss is permanent, and if exposure to the harmful sound stimulus continues, progressive. In general, the ultimate effect of ciliary damage is the development of a hearing deficit.

Outer hair cells are the cells most sensitive to sound and to toxic agents such as anoxia, ototoxic medications and chemicals (e.g., quinine derivatives, streptomycin and some other antibiotics, and some anti-tumour preparations), and are thus the first to be lost. Only passive hydromechanical phenomena remain operative in outer hair cells which are damaged or have damaged stereocilia. Under these conditions, only gross analysis of acoustic vibration is possible. In very rough terms, cilia destruction in outer hair cells results in a 40 dB increase in hearing threshold.

Cellular damage

Exposure to noise, especially if it is repetitive or prolonged, may also affect the metabolism of cells of the organ of Corti, and afferent synapses located beneath the inner hair cells. Reported extraciliary effects include modification of cell ultrastructure (reticulum, mitochondria, lysosomes) and, postsynaptically, swelling of afferent dendrites. Dendritic swelling is probably due to the toxic accumulation of neurotransmitters as a result of excessive activity by inner hair cells. Nevertheless, the extent of stereociliary damage appears to determine whether hearing loss is temporary or permanent.

Noise-induced Hearing Loss

Noise is a serious hazard to hearing in today’s increasingly complex industrial societies. For example, noise exposure accounts for approximately one-third of the 28 million cases of hearing loss in the United States, and NIOSH (the National Institute for Occupational Safety and Health) reports that 14% of American workers are exposed to potentially dangerous sound levels, that is, levels exceeding 90 dB. Noise exposure is the most widespread harmful occupational exposure and is the second leading cause, after age-related effects, of hearing loss. Finally, the contribution of non-occupational noise exposure must not be forgotten, such as home workshops, over-amplified music (especially with the use of earphones) and the use of firearms.

Acute noise-induced damage. The immediate effects of exposure to high-intensity sound stimuli (for example, explosions) include elevation of the hearing threshold, rupture of the eardrum, and traumatic damage to the middle and inner ears (dislocation of ossicles, cochlear injury or fistulas).

Temporary threshold shift. Noise exposure results in a decrease in the sensitivity of auditory sensory cells which is proportional to the duration and intensity of exposure. In its early stages, this increase in auditory threshold, known as auditory fatigue or temporary threshold shift (TTS), is entirely reversible but persists for some time after the cessation of exposure.

Studies of the recovery of auditory sensitivity have identified several types of auditory fatigue. Short-term fatigue dissipates in less than two minutes and results in a maximum threshold shift at the exposure frequency. Long-term fatigue is characterized by recovery in more than two minutes but less than 16 hours, an arbitrary limit derived from studies of industrial noise exposure. In general, auditory fatigue is a function of stimulus intensity, duration, frequency, and continuity. Thus, for a given dose of noise, obtained by integration of intensity and duration, intermittent exposure patterns are less harmful than continuous ones.

The severity of the TTS increases by approximately 6 dB for every doubling of stimulus intensity. Above a specific exposure intensity (the critical level), this rate increases, particularly if exposure is to impulse noise. The TTS increases asymptotically with exposure duration; the asymptote itself increases with stimulus intensity. Due to the characteristics of the outer and middle ears’ transfer function, low frequencies are tolerated the best.
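As a rough illustration of the 6 dB-per-doubling relationship (an approximation that, as noted, holds only below the critical level; the function name is ours):

```python
import math

def tts_growth_db(intensity_ratio: float, db_per_doubling: float = 6.0) -> float:
    """Approximate extra temporary threshold shift for a given increase in
    stimulus intensity, at ~6 dB per doubling (below the critical level)."""
    return db_per_doubling * math.log2(intensity_ratio)

print(tts_growth_db(2.0))  # one doubling    -> ~6 dB more TTS
print(tts_growth_db(8.0))  # three doublings -> ~18 dB more TTS
```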

Studies on exposure to pure tones indicate that as the stimulus intensity increases, the frequency at which the TTS is the greatest progressively shifts towards frequencies above that of the stimulus. Subjects exposed to a pure tone of 2,000 Hz develop TTS which is maximal at approximately 3,000 Hz (a shift of a semi-octave). The noise’s effect on the outer hair cells is believed to be responsible for this phenomenon.

The worker who shows TTS recovers to baseline hearing values within hours after removal from noise. However, repeated noise exposure results in progressively less complete recovery and, ultimately, permanent hearing loss.

Permanent threshold shift. Exposure to high-intensity sound stimuli over several years may lead to permanent hearing loss. This is referred to as permanent threshold shift (PTS). Anatomically, PTS is characterized by degeneration of the hair cells, starting with slight histological modifications but eventually culminating in complete cell destruction. Hearing loss is most likely to involve frequencies to which the ear is most sensitive, as it is at these frequencies that the transmission of acoustic energy from the external environment to the inner ear is optimal. This explains why hearing loss at 4,000 Hz is the first sign of occupationally induced hearing loss (figure 3). Interaction has been observed between stimulus intensity and duration, and international standards assume the degree of hearing loss to be a function of the total acoustic energy received by the ear (the dose of noise).

Figure 3. Audiogram showing bilateral noise-induced hearing loss.

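The equal-energy idea can be illustrated with a small sketch. The 85 dB(A)/8-hour reference and the 3 dB exchange rate used below are assumptions consistent with an equal-energy rule, chosen for illustration rather than taken from this article:

```python
def noise_dose(level_dba: float, hours: float,
               ref_level_dba: float = 85.0, ref_hours: float = 8.0) -> float:
    """Fraction of a daily acoustic-energy allowance (1.0 = 100% dose).
    Dose is proportional to intensity x time, so +3 dB halves the allowed duration."""
    return (hours / ref_hours) * 10 ** ((level_dba - ref_level_dba) / 10)

print(noise_dose(85, 8))  # 1.0: the full reference dose
print(noise_dose(88, 4))  # ~1.0: +3 dB at half the duration, same energy
print(noise_dose(94, 8))  # ~7.9: a heavy overexposure
```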

The development of noise-induced hearing loss shows individual susceptibility. Various potentially important variables have been examined to explain this susceptibility, such as age, gender, race, cardiovascular disease, smoking, etc. The data were inconclusive.

An interesting question is whether the amount of TTS could be used to predict the risk of PTS. As noted above, there is a progressive shift of the TTS to frequencies above the stimulation frequency. On the other hand, most of the ciliary damage occurring at high stimulus intensities involves cells that are sensitive to the stimulus frequency. Should exposure persist, the difference between the frequency at which the PTS is maximal and the stimulation frequency progressively decreases. Ciliary damage and cell loss consequently occur in the cells most sensitive to the stimulus frequencies. It thus appears that TTS and PTS involve different mechanisms, and that it is therefore impossible to predict an individual’s PTS on the basis of the observed TTS.

Individuals with PTS are usually asymptomatic initially. As the hearing loss progresses, they begin to have difficulty following conversations in noisy settings such as parties or restaurants. The progression, which usually affects the ability to perceive high-pitched sounds first, is usually painless and relatively slow.

Examination of individuals suffering from hearing loss

Clinical examination

In addition to the history of the date when the hearing loss was first detected (if any) and how it has evolved, including any asymmetry of hearing, the medical questionnaire should elicit information on the patient’s age, family history, use of ototoxic medications or exposure to other ototoxic chemicals, the presence of tinnitus (i.e., buzzing, whistling or ringing sounds in one or both ears), dizziness or any problems with balance, and any history of ear infections with pain or discharge from the outer ear canal. Of critical importance is a detailed life-long history of exposures to high sound levels (note that, to the layperson, not all sounds are “noise”) on the job, in previous jobs and off-the-job. A history of episodes of TTS would confirm prior toxic exposures to noise.

Physical examination should include evaluation of the function of the other cranial nerves, tests of balance, and ophthalmoscopy to detect any evidence of increased cranial pressure. Visual examination of the external auditory canal will detect any impacted cerumen and, after it has been cautiously removed (no sharp object!), any evidence of scarring or perforation of the tympanic membrane. Hearing loss can be determined very crudely by testing the patient’s ability to repeat words and phrases spoken softly or whispered by the examiner when positioned behind and out of the sight of the patient. The Weber test (placing a vibrating tuning fork in the centre of the forehead to determine whether the sound is “heard” in one or both ears) and the Rinne test (placing a vibrating tuning fork on the mastoid process until the patient can no longer hear the sound, then quickly placing the fork near the ear canal; normally the sound can be heard longer through air than through bone) will allow classification of the hearing loss as transmission or neurosensory.

The audiogram is the standard test to detect and evaluate hearing loss (see below). Specialized studies to complement the audiogram may be necessary in some patients. These include tympanometry, word discrimination tests, evaluation of the attenuation reflex, electrophysiological studies (electrocochleogram, auditory evoked potentials) and radiological studies (routine skull x rays complemented by CAT scan or MRI).

Audiometry

This crucial component of the medical evaluation uses a device known as an audiometer to determine the auditory threshold of individuals to pure tones of 250-8,000 Hz and sound levels between –10 dB (the hearing threshold of intact ears) and 110 dB (maximal damage). To eliminate the effects of TTS, patients should not have been exposed to noise during the previous 16 hours. Air conduction is measured by earphones placed on the ears, while bone conduction is measured by placing a vibrator in contact with the skull behind the ear. Each ear’s hearing is measured separately and the test results are reported on a graph known as an audiogram (figure 3). The threshold of intelligibility, that is, the sound intensity at which speech becomes intelligible, is determined by a complementary test method known as vocal audiometry, based on the ability to understand words composed of two syllables of equal intensity (for instance, shepherd, dinner, stunning).

Comparison of air and bone conduction allows classification of hearing losses as transmission loss (involving the external auditory canal or middle ear) or neurosensory loss (involving the inner ear or auditory nerve) (figures 3 and 4). The audiogram observed in cases of noise-induced hearing loss is characterized by an onset of hearing loss at 4,000 Hz, visible as a dip in the audiogram (figure 3). As exposure to excessive noise levels continues, neighbouring frequencies are progressively affected and the dip broadens, encroaching, at approximately 3,000 Hz, on frequencies essential for the comprehension of conversation. Noise-induced hearing loss is usually bilateral and shows a similar pattern in both ears; that is, the difference between the two ears does not exceed 15 dB at 500, 1,000 and 2,000 Hz, and 30 dB at 3,000, 4,000 and 6,000 Hz. Asymmetric damage may, however, be present in cases of non-uniform exposure, for example with marksmen, in whom hearing loss is higher on the side opposite the trigger finger (the left side in a right-handed person). In hearing loss unrelated to noise exposure, the audiogram does not exhibit the characteristic 4,000 Hz dip (figure 4).
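These symmetry criteria lend themselves to a simple programmatic check. The helper below is a hypothetical sketch (the names are ours); the sample thresholds happen to be those of table 1 further on:

```python
# Interaural difference limits from the text: <=15 dB at 500/1,000/2,000 Hz,
# <=30 dB at 3,000/4,000/6,000 Hz.
LIMITS_DB = {500: 15, 1000: 15, 2000: 15, 3000: 30, 4000: 30, 6000: 30}

def is_symmetric(right: dict, left: dict) -> bool:
    """True if two audiograms (frequency -> threshold in dB) meet the criteria."""
    return all(abs(right[f] - left[f]) <= limit for f, limit in LIMITS_DB.items())

right = {500: 25, 1000: 35, 2000: 35, 3000: 45, 4000: 50, 6000: 60}
left = {500: 25, 1000: 35, 2000: 40, 3000: 50, 4000: 60, 6000: 70}
print(is_symmetric(right, left))  # True: a pattern consistent with noise-induced loss
```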

Figure 4. Examples of right-ear audiograms. The circles represent air-conduction hearing loss, the “<” symbols bone conduction.


There are two types of audiometric examinations: screening and diagnostic. Screening audiometry is used for the rapid examination of groups of individuals in the workplace, in schools or elsewhere in the community to identify those who appear to have some hearing loss. Often, electronic audiometers that permit self-testing are used and, as a rule, screening audiograms are obtained in a quiet area but not necessarily in a sound-proof, vibration-free chamber. The latter is considered to be a prerequisite for diagnostic audiometry, which is intended to measure hearing loss with reproducible precision and accuracy. The diagnostic examination is properly performed by a trained audiologist (in some circumstances, formal certification of the competence of the audiologist is required). The accuracy of both types of audiometry depends on periodic testing and recalibration of the equipment being used.

In many jurisdictions, individuals with job-related, noise-induced hearing loss are eligible for workers’ compensation benefits. Accordingly, many employers are including audiometry in their preplacement medical examinations to detect any existing hearing loss that may be the responsibility of a previous employer or represent a non-occupational exposure.

Hearing thresholds progressively increase with age, with higher frequencies being more affected (figure 3). The characteristic 4,000 Hz dip observed in noise-induced hearing loss is not seen with this type of hearing loss.

Calculation of hearing loss

In the United States the most widely accepted formula for calculating functional limitation related to hearing loss is the one proposed in 1979 by the American Academy of Otolaryngology (AAO) and adopted by the American Medical Association. It is based on the average of the values obtained at 500, 1,000, 2,000 and 3,000 Hz (table 1), with the lower limit for functional limitation set at 25 dB.

Table 1. Typical calculation of functional loss from an audiogram

Frequency (Hz)    500    1,000    2,000    3,000    4,000    6,000    8,000
Right ear (dB)     25     35       35       45       50       60       45
Left ear (dB)      25     35       40       50       60       70       50

 

Unilateral loss
Percentage unilateral loss = (average at 500, 1,000, 2,000 and 3,000 Hz – 25 dB (lower limit)) × 1.5
Example:
Right ear: ([25 + 35 + 35 + 45]/4 – 25) × 1.5 = 15 (per cent)
Left ear: ([25 + 35 + 40 + 50]/4 – 25) × 1.5 = 18.8 (per cent)

 

Bilateral loss
Percentage bilateral loss = [(percentage unilateral loss of the better ear × 5) + (percentage unilateral loss of the worse ear)]/6
Example: [(15 × 5) + 18.8]/6 = 15.6 (per cent)

Source: Rees and Duckert 1994.
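The calculation in table 1 can be reproduced with a short sketch (the function names are ours):

```python
# The 1979 AAO formula described above, applied to the audiogram in table 1.

def unilateral_loss_pct(thresholds_db) -> float:
    """Average the 500/1,000/2,000/3,000 Hz thresholds, subtract the
    25 dB lower limit, and multiply by 1.5 (floored at zero)."""
    avg = sum(thresholds_db) / len(thresholds_db)
    return max(0.0, (avg - 25.0) * 1.5)

def bilateral_loss_pct(better_pct: float, worse_pct: float) -> float:
    """Weight the better ear five times: (5 x better + worse) / 6."""
    return (5 * better_pct + worse_pct) / 6

right = unilateral_loss_pct([25, 35, 35, 45])  # 15.0 per cent
left = unilateral_loss_pct([25, 35, 40, 50])   # 18.75 per cent (18.8 in the text)
print(round(bilateral_loss_pct(min(right, left), max(right, left)), 1))  # 15.6
```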

Presbycusis

Presbycusis or age-related hearing loss generally begins at about age 40 and progresses gradually with increasing age. It is usually bilateral. The characteristic 4,000 Hz dip observed in noise-induced hearing loss is not seen with presbycusis. However, it is possible to have the effects of ageing superimposed on noise-related hearing loss.

Treatment

The first essential of treatment is avoidance of any further exposure to potentially toxic levels of noise (see “Prevention” below). It is generally believed that no more subsequent hearing loss occurs after the removal from noise exposure than would be expected from the normal ageing process.

While conduction losses, for example, those related to acute traumatic noise-induced damage, are amenable to medical treatment or surgery, chronic noise-induced hearing loss cannot be corrected by treatment. The use of a hearing aid is the sole “remedy” possible, and is only indicated when hearing loss affects the frequencies critical for speech comprehension (500 to 3,000 Hz). Other types of support, for example lip-reading and sound amplifiers (on telephones, for example), may, however, be possible.

Prevention

Because noise-induced hearing loss is permanent, it is essential to apply any measure likely to reduce exposure. This includes reduction at the source (quieter machines and equipment or encasing them in sound-proof enclosures) or the use of individual protective devices such as ear plugs and/or ear muffs. If reliance is placed on the latter, it is imperative to verify that their manufacturers’ claims for effectiveness are valid and that exposed workers are using them properly at all times.

The designation of 85 dB(A) as the highest permissible occupational exposure limit was intended to protect the greatest number of people. However, since there is significant interpersonal variation, strenuous efforts to keep exposures well below that level are indicated. Periodic audiometry should be instituted as part of the medical surveillance programme to detect as early as possible any effects that may indicate noise toxicity.


A formal Environmental Management System (EMS), using the International Organization for Standardization (ISO) standard 14001 as the performance specification, has been developed and is being implemented in one of the largest teaching health care complexes in Canada. The Health Sciences Centre (HSC) consists of five hospitals and associated clinical and research laboratories, occupying a 32-acre site in central Winnipeg. Of the 32 segregated solid waste streams at the facility, hazardous wastes account for seven. This summary focuses on the hazardous waste disposal aspect of the hospital’s operations.

ISO 14000

The ISO 14000 standards system is a typical continuous improvement model based on a controlled management system. The ISO 14001 standard addresses the environmental management system structure exclusively. To conform with the standard, an organization must have processes in place for:

  • adopting an environmental policy that sets environmental protection as a high priority
  • identifying environmental impacts and setting performance goals
  • identifying and complying with legal requirements
  • assigning environmental accountability and responsibility throughout the organization
  • applying controls to achieve performance goals and legal requirements
  • monitoring and reporting environmental performance; auditing the EMS system
  • conducting management reviews and identifying opportunities for improvement.

 

The hierarchy for carrying out these processes in the HSC is presented in table 1.

Table 1. HSC EMS documentation hierarchy

Governance document (mission/strategic plan): Includes the Board’s expectations on each core performance category and its requirements for corporate competency in each category.

Level 1, output requirements: Prescribes the outputs that will be delivered in response to customer and stakeholder (C/S) needs, including government regulatory requirements.

Level 2, corporate policy: Prescribes the methodologies, systems, processes and resources to be used for achieving C/S requirements, and the goals, objectives and performance standards essential for confirming that the C/S requirements have been met (e.g., a schedule of required systems and processes, including the responsibility centre for each).

Level 3, system descriptions: Prescribes the design of each business system or process that will be operated to achieve the C/S requirements (e.g., criteria and boundaries for system operation; each information collection and data reporting point; the position responsible for the system and for each component of the process).

Level 4, work instructions: Prescribes detailed task instructions (specific methods and techniques) for each work activity (e.g., describes the task to be done; identifies the position responsible for completing the task; states the skills required for the task; prescribes the education or training methodology to achieve the required skills; identifies task completion and conformance data).

Level 5, records of work and process compliance: Organizes and records measurable outcome data on the operation of systems, processes and tasks, designed to verify completion according to specification (e.g., measures of system or process compliance; resource allocation and budget compliance; effectiveness, efficiency, quality, risk, ethics).

Level 6, performance reports: Analyses records and processes to establish corporate performance in relation to the standards set for each output requirement (Level 1) related to C/S needs (e.g., compliance, quality, effectiveness, risk, utilization), and to financial and staff resources.

 

ISO standards encourage businesses to integrate all environmental considerations into mainstream business decisions and not restrict attention to concerns that are regulated. Since the ISO standards are not technical documents, the function of specifying numerical standards remains the responsibility of governments or independent expert bodies.

Management System Approach

Applying the generic ISO framework in a health care facility requires the adoption of management systems along the lines of those in table 1, which describes how this has been addressed by the HSC. Each level in the system is supported by appropriate documentation to confirm diligence in the process. While the volume of work is substantial, it is offset by the resulting performance consistency and by the “expert” information that remains within the corporation when experienced persons leave.

The main objective of the EMS is to establish consistent, controlled and repeatable processes for addressing the environmental aspects of the corporation’s operations. To facilitate management review of the hospital’s performance, an EMS Score Card was conceived based on the ISO 14001 standard. The Score Card closely follows the requirements in the ISO 14001 standard and, with use, will be developed into the hospital’s audit protocol.

Application of the EMS to the Hazardous Waste Process

Facility hazardous waste process

The HSC hazardous waste process currently consists of the following elements:

  • procedure statement assigning responsibilities
  • process description, in both text and flowchart formats
  • Disposal Guide for Hazardous Waste for staff
  • education programme for staff
  • performance tracking system
  • continuous improvement through multidisciplinary team process
  • a process for seeking external partners.

 

The roles and responsibilities of the four main organizational units involved in the hazardous waste process are listed in table 2.

Table 2. Role and responsibilities

S&DS, Supply and Distribution Services: Operates the process, is the process owner/leader, and arranges responsible disposal of waste.

UD, User Departments (the source of waste materials): Identify waste, select packaging, initiate disposal activities.

DOEM, Department of Occupational and Environmental Medicine: Provides specialist technical support in identifying risks and protective measures associated with materials used by HSC and identifies improvement opportunities.

EPE, Environmental Protection Engineer: Provides specialist support in process performance monitoring and reporting, identifies emerging regulatory trends and compliance requirements, and identifies improvement opportunities.

ALL, all participants: Share responsibility for process development activities.

Process description

The initial step in preparing a process description is to identify the inputs (see table 3).

Table 3. Process inputs

S&DS: Maintain stock of Hazardous Waste Disposal Requisition forms and labels; order requisition forms and labels.

S&DS (with UD, DOEM and EPE): Maintain a supply of packaging containers in the warehouse for UDs; determine appropriate packaging for each waste class; build an adequate stock of containers for requisitioning by UDs.

DOEM: Produce the SYMBAS Classification Decision Chart.

EPE: Produce the list of materials for which HSC is registered as a waste generator with the regulatory department.

S&DS: Produce a database of SYMBAS classifications, packaging requirements, TDG classifications and tracking information for each material disposed of by HSC.

The next process component is the list of specific activities required for the proper disposal of waste (see table 4).

Table 4. List of activities

UD: Order Hazardous Waste Disposal Requisition, label and packaging from S&DS as per the standard stock ordering procedure.

S&DS: Deliver Requisition, label and packaging to the UD.

UD: Determine whether a waste material is hazardous (check the MSDS, consult DOEM, and consider such factors as dilution, mixture with other chemicals, etc.).

UD: Assign the classification to the waste material using the SYMBAS Chemical Decision Chart and WHMIS information. The classification can be checked against the S&DS database of materials previously disposed of by HSC. Call first S&DS and then DOEM for assistance if required.

UD: Determine appropriate packaging requirements from WHMIS information using professional judgement, or from the S&DS database of materials previously disposed of by HSC. Call first S&DS and then DOEM for assistance if required.
 

Communication

To support the process description, the hospital produced a Disposal Guide for Hazardous Waste to assist staff in the proper disposal of hazardous waste materials. The guide contains information on the specific steps to follow in identifying hazardous waste and preparing it for disposal. Supplemental information is also provided on legislation, the Workplace Hazardous Materials Information System (WHMIS) and key contacts for assistance.

A database was developed to track all relevant information pertaining to each hazardous waste event from originating source to final disposal. In addition to waste data, information is also collected on the performance of the process (e.g., source and frequency of phone calls for assistance to identify areas which may require further training; source, type, quantity and frequency of disposal requests from each user department; consumption of containers and packaging). Any deviations from the process are recorded on the corporate incident reporting form. Results from performance monitoring are reported to the executive and the board of directors. To support effective implementation of the process, a staff education programme was developed to elaborate on the information in the guide. Each of the core participants in the process carries specific responsibilities on staff education.
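To make this concrete, a record in such a tracking database might resemble the following sketch; the field names are purely illustrative and do not reproduce the actual HSC schema:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class HazardousWasteEvent:
    """One hazardous waste event, tracked from originating source to final disposal."""
    requisition_id: str                # Hazardous Waste Disposal Requisition number
    source_department: str             # the originating user department (UD)
    material: str
    symbas_class: str                  # SYMBAS classification assigned by the UD
    tdg_class: str                     # Transportation of Dangerous Goods class
    packaging: str
    quantity_kg: float
    date_generated: date
    date_disposed: Optional[date] = None        # completed at final disposal
    incident_report_id: Optional[str] = None    # incident report, if any deviation
```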

Continuous Improvement

To explore continuous improvement opportunities, the HSC established a multidisciplinary Waste Process Improvement Team. The Team’s mandate is to address all issues pertaining to waste management. Further, to encourage continuous improvement, the hazardous waste process includes specific triggers to initiate process revisions. Typical improvement ideas generated to date include:

  • prepare list of high hazard materials to be tracked from time of procurement
  • develop material “shelf life” information, where appropriate, for inclusion in the materials classification database
  • review shelving for physical integrity
  • purchase spill containing trays
  • examine potential for spills entering sewer system
  • determine whether present storage rooms are adequate for anticipated waste volume
  • produce a procedure for disposing of old, incorrectly identified materials.

 

The ISO standards require regulatory issues to be addressed and state that business processes must be in place for this purpose. Under the ISO standards, the existence of corporate commitments, performance measurement and documentation provides a more visible and more convenient trail for regulators checking compliance. It is conceivable that the consistency provided by the ISO documents could make it possible to automate the reporting of key environmental performance factors to government authorities.

 


Wednesday, 02 March 2011 16:38

Hospital Waste Management

An adaptation of current guidelines on the disposal of hospital wastes, as well as improvements in internal safety and hygiene, must be part of an overall plan of hospital waste management that establishes the procedures to follow. This should be done through properly coordinating internal and external services, as well as defining responsibilities in each of the management phases. The main goal of this plan is to protect the health of health care personnel, patients, visitors and the general public both in the hospital and beyond.

At the same time, the health of the people who come in contact with the waste once it leaves the medical centre should not be overlooked, and the risks to them should also be minimized.

Such a plan should be promoted and applied according to a global strategy that always keeps in mind the realities of the workplace, as well as the knowledge and training of the personnel involved.

Stages followed in the implementation of a waste management plan are:

  • informing the management of the medical centre
  • designating those responsible at the executive level
  • creating a committee on hospital wastes made up of personnel from the general services, nursing and medical departments that is chaired by the medical centre’s waste manager.

 

The medical centre’s waste manager should coordinate the committee by:

  • putting together a report on the present performance of the centre’s waste management
  • putting together an internal plan for advanced management
  • creating a training programme for the entire staff of the medical centre, with the collaboration of the human resources department
  • launching the plan, with follow-up and control by the waste management committee.

 

Classification of hospital wastes

Until 1992, following the classical waste management system, the practice was to classify most hospital wastes as hazardous. Since then, the tendency has been to adopt an advanced management technique, which starts from the baseline assumption that only a very small percentage of the large volume of wastes generated is hazardous.

Wastes should always be classified at the point where they are generated. According to the nature of the wastes and their source, they are classified as follows:

  • Group I: those wastes that can be assimilated into urban refuse
  • Group II: non-specific hospital wastes
  • Group III: specific hospital wastes or hazardous wastes
  • Group IV: cytostatic wastes (surplus antineoplastic drugs that are not fit for therapeutic use, as well as the single-use materials that have been in contact with them, e.g., needles, syringes, catheters, gloves and IV set-ups).

 

According to their physical state, wastes can be classified as follows:

  • solids: wastes that contain less than 10% liquid
  • liquids: wastes that contain more than 10% liquid

 

Gaseous wastes, such as CFCs from freezers and refrigerators, are not normally captured (see article “Waste anaesthetic gases”).
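For illustration only, the four groups and the 10% physical-state rule described above can be encoded as follows; the names are hypothetical, and the guidelines leave the handling of the exact 10% boundary unspecified:

    from enum import Enum

    class WasteGroup(Enum):
        # Hypothetical encoding of the four-group scheme described above.
        I = "assimilable into urban refuse"
        II = "non-specific hospital waste"
        III = "specific hospital (hazardous) waste"
        IV = "cytostatic waste"

    def physical_state(liquid_fraction: float) -> str:
        # Solids contain less than 10% liquid; liquids contain more than 10%.
        # (The exact 10% boundary is unspecified in the text; treated as solid here.)
        return "liquid" if liquid_fraction > 0.10 else "solid"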

By definition, the following wastes are not considered sanitary wastes:

  • radioactive wastes that, because of their very nature, are already managed in a specific way by the radiological protection service
  • human cadavers and large anatomical parts which are cremated or incinerated according to regulations
  • waste water.

 

Group I Wastes

All wastes generated within the medical centre that are not directly related to sanitary activities are considered solid urban wastes (SUW). According to the local ordinances in Cataluna, Spain, as in most communities, the municipalities must remove these wastes selectively, and it is therefore convenient to facilitate this task for them. The following are considered wastes that can be assimilated to urban refuse according to their point of origin:

Kitchen wastes:

  • food wastes
  • wastes from leftovers or single-use items
  • containers.

 

Wastes generated by people treated in the hospital and non-medical personnel:

  • wastes from cleaning products
  • wastes left behind in the rooms (e.g., newspapers, magazines and flowers)
  • wastes from gardening and renovations.

 

Wastes from administrative activities:

  • paper and cardboard
  • plastics.

 

Other wastes:

  • glass containers
  • plastic containers
  • packing cartons and other packaging materials
  • dated single-use items.

 

So long as they are not included in other selective removal plans, SUW will be placed in white polyethylene bags that will be removed by janitorial personnel.

Group II Wastes

Group II wastes include all those wastes generated as a by-product of medical activities that do not pose a risk to health or the environment. For reasons of safety and industrial hygiene the type of internal management recommended for this group is different from that recommended for Group I wastes. Depending on where they originate, Group II wastes include:

Wastes derived from hospital activities, such as:

  • blood-stained materials
  • gauze and materials used in treating non-infectious patients
  • used medical equipment
  • mattresses
  • dead animals or parts thereof, from rearing stables or experimental laboratories, so long as they have not been inoculated with infectious agents.

 

Group II wastes will be deposited in yellow polyethylene bags that will be removed by janitorial personnel.

Group III Wastes

Group III includes hospital wastes which, due to their nature or their point of origin, could pose risks to health or the environment if special precautions are not observed during handling and removal.

Group III wastes can be classified in the following way:

Sharp and pointed instruments:

  • needles
  • scalpels.

 

Infectious wastes. Group III wastes (including single-use items) are those generated by the diagnosis and treatment of patients suffering from one of the infectious diseases listed in table 1.

Table 1. Infectious diseases and Group III wastes

Infection | Wastes contaminated with
Viral haemorrhagic fevers (Congo-Crimean fever, Lassa fever, Marburg virus, Ebola, Junin fever, Machupo fever; arboviruses: Absettarov, Hanzalova, Hypr, Kumlinge, Kyasanur Forest disease, Omsk fever, Russian spring-summer encephalitis) | All wastes
Brucellosis | Pus
Diphtheria | Pharyngeal diphtheria: respiratory secretions; cutaneous diphtheria: secretions from skin lesions
Cholera | Stools
Creutzfeldt-Jakob disease | Stools
Glanders | Secretions from skin lesions
Tularaemia | Pulmonary tularaemia: respiratory secretions; cutaneous tularaemia: pus
Anthrax | Cutaneous anthrax: pus; respiratory anthrax: respiratory secretions
Plague | Bubonic plague: pus; pneumonic plague: respiratory secretions
Rabies | Respiratory secretions
Q fever | Respiratory secretions
Active tuberculosis | Respiratory secretions

 

Laboratory wastes:

  • material contaminated with biological wastes
  • waste from work with animals inoculated with biohazardous substances.

 

Wastes of the Group III type will be placed in single-use, rigid, colour-coded polyethylene containers that are hermetically sealed (in Cataluna, black containers are required). The containers should be clearly labelled as “Hazardous hospital wastes” and kept in the room until collected by janitorial personnel. Group III wastes should never be compacted.

To facilitate their removal and reduce risks to a minimum, containers should not be filled to capacity so that they can be closed easily. Wastes should never be handled once they are placed in these rigid containers. It is forbidden to dispose of biohazardous wastes by dumping them into the drainage system.

Group IV Wastes

Group IV wastes are surplus antineoplastic drugs that are not fit for therapeutic use, as well as all single-use material that has been in contact with the same (needles, syringes, catheters, gloves, IV set-ups and so on).

Given the danger they pose to persons and the environment, Group IV hospital wastes must be collected in rigid, watertight, sealable single-use, colour-coded containers (in Cataluna, they are blue) which should be clearly labelled “Chemically contaminated material: Cytostatic agents”.
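Taken together, the container conventions quoted above (colours as used in Cataluna) amount to a simple lookup; a minimal sketch, with hypothetical names:

    # Container/colour conventions described in the text (Cataluna); the helper is hypothetical.
    CONTAINER_FOR_GROUP = {
        "I":   "white polyethylene bag (solid urban waste)",
        "II":  "yellow polyethylene bag (non-specific hospital waste)",
        "III": "black rigid single-use container, hermetically sealed ('Hazardous hospital wastes')",
        "IV":  "blue rigid watertight single-use container ('Chemically contaminated material: Cytostatic agents')",
    }

    def container_for(group: str) -> str:
        """Return the prescribed container for a waste group ('I' to 'IV')."""
        return CONTAINER_FOR_GROUP[group]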

Other Wastes

Guided by environmental concerns and the need to enhance waste management for the community, medical centres, with the cooperation of all personnel, staff and visitors, should encourage and facilitate the selective disposal (i.e., in special containers designated for specific materials) of recyclable materials such as:

  • paper and cardboard
  • glass
  • used oils
  • batteries and power cells
  • toner cartridges for laser printers
  • plastic containers.

 

The protocol established by the local sanitation department for the collection, transport and disposal of each of these types of materials should be followed.

Disposal of large pieces of equipment, furniture and other materials not covered in these guidelines should follow the directions recommended by the appropriate environmental authorities.

Internal transport and storage of wastes

Internal transport of all the wastes generated within the hospital building should be done by the janitorial personnel, according to established schedules. It is important that the following recommendations be observed when transporting wastes within the hospital:

  • The containers and the bags will always be closed during transport.
  • The carts used for this purpose will have smooth surfaces and be easy to clean.
  • The carts will be used exclusively for transporting waste.
  • The carts will be washed daily with water, soap and bleach.
  • The waste bags or containers should never be dragged on the floor.
  • Waste should never be transferred from one receptacle to another.

 

The hospital must have an area specifically for the storage of wastes; it should conform to current guidelines and fulfil, in particular, the following conditions:

  • It should be covered.
  • It should be clearly marked by signs.
  • It should be built with smooth surfaces that are easy to clean.
  • It should have running water.
  • It should have drains to remove the possible spillage of waste liquids and the water used for cleaning the storage area.
  • It should be provided with a system to protect it against animal pests.
  • It should be located far away from windows and from the intake ducts of the ventilation system.
  • It should be provided with fire extinguishing systems.
  • It should have restricted access.
  • It should be used exclusively for the storage of wastes.

 

All the transport and storage operations that involve hospital wastes must be conducted under conditions of maximum safety and hygiene. In particular, one must remember:

  • Direct contact with the wastes must be avoided.
  • Bags should not be overfilled so that they may be closed easily.
  • Bags should not be emptied into other bags.

 

Liquid Wastes: Biological and Chemical

Liquid wastes can be classified as biological or chemical.

Liquid biological wastes

Liquid biological wastes can usually be poured directly into the hospital’s drainage system since they do not require any treatment before disposal. The exceptions are the liquid wastes of patients with infectious diseases and the liquid cultures of microbiology laboratories. These should be collected in specific containers and treated before being dumped.

It is important that the waste be dumped directly into the drainage system with no splashing or spraying. If this is not possible and the waste is gathered in disposable containers that are difficult to open, the containers should not be forced open; instead, the entire container should be disposed of, as with Group III solid wastes. When liquid waste is eliminated in this way, it should be remembered that the conditions of disinfection differ for solid and liquid wastes; this must be allowed for to ensure the effectiveness of the treatment.
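The routing logic just described could be sketched as follows; the function and its parameters are hypothetical simplifications of the text:

    def route_liquid_biological_waste(infectious: bool, container_opens_safely: bool) -> str:
        """Hypothetical routing helper following the rules described above."""
        if not infectious:
            return "pour directly into the drainage system, avoiding splashing or spraying"
        if container_opens_safely:
            return "collect in a specific container and treat before dumping"
        # Containers that are difficult to open must never be forced.
        return "dispose of the entire container as Group III solid waste"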

Liquid chemical wastes

Liquid wastes generated in the hospital (generally in the laboratories) can be classified in three groups:

  • liquid wastes that should not be dumped into the drains
  • liquid wastes that can be dumped into the drains after being treated
  • liquid wastes that can be dumped into the drains without being previously treated.

 

This classification is based on considerations related to the health and quality of life of the entire community. These include:

  • protection of the water supply
  • protection of the sewer system
  • protection of the waste water purification stations.

 

Liquid wastes that can pose a serious threat to people or to the environment because they are toxic, noxious, flammable, corrosive or carcinogenic should be separated and collected so that they can subsequently be recovered or destroyed. They should be collected as follows:

  • Each type of liquid waste should go into a separate container.
  • The container should be labelled with the name of the product, or of the major component of the waste by volume.
  • Each laboratory, except the pathological anatomy laboratory, should provide its own individual receptacles to collect liquid wastes that are correctly labelled with the material or family of materials it contains. Periodically (at the end of each work day would be most desirable), these should be emptied into specifically labelled containers which are held in the room until collected at appropriate intervals by the assigned waste removal subcontractor.
  • Once each receptacle is correctly labelled with the product or the family of products it contains, it should be placed in specific containers in the labs.
  • The person responsible for the laboratory, or someone directly delegated by that person, will sign and stamp a control ticket. The subcontractor will then be responsible for delivering the control ticket to the department that supervises safety, hygiene and the environment.

 

Mixtures of chemical and biological liquid wastes

Treatment of chemical wastes is more aggressive than treatment of biological wastes. Mixtures of these two wastes should be treated using the steps indicated for liquid chemical wastes. Labels on containers should note the presence of biological wastes.

Any liquid or solid materials that are carcinogenic, mutagenic or teratogenic should be disposed of in rigid colour-coded containers specifically designed and labelled for this type of waste.

Dead animals that have been inoculated with biohazardous substances will be disposed of in closed rigid containers, which will be sterilized before being reused.

Disposal of Sharp and Pointed Instruments

Sharp and pointed instruments (e.g., needles and lancets), once used, must be placed in specifically designed, rigid “sharps” containers that have been strategically placed throughout the hospital. These wastes will be disposed of as hazardous wastes even if used on uninfected patients. They must never be disposed of except in the rigid sharps container.

All HCWs must be repeatedly reminded of the danger of accidental cuts or punctures with this type of material, and instructed to report them when they occur, so that appropriate preventive measures may be instituted. They should be specifically instructed not to attempt to recap used hypodermic needles before dropping them into the sharps container.

Whenever possible, needles may be separated from their syringes without recapping and placed in the sharps container; the syringes, without needles, can generally be disposed of as Group II waste. Many sharps containers have a special fitting for separating the syringe without risk of a needlestick to the worker; this saves space in the sharps containers for more needles. The sharps containers, which should never be opened by hospital personnel, should be removed by designated janitorial personnel and forwarded for appropriate disposal of their contents.

If it is not possible to separate the needle in adequately safe conditions, the whole needle-syringe combination must be considered as biohazardous and must be placed in the rigid sharps containers.

These sharps containers will be removed by the janitorial personnel.

Staff Training

There must be an ongoing training programme in waste management for all hospital personnel, aimed at indoctrinating staff at all levels with the imperative of always following the established guidelines for collecting, storing and disposing of wastes of all kinds. It is particularly important that the housekeeping and janitorial staffs be trained in the details of the protocols for recognizing and dealing with the various categories of hazardous waste. The janitorial, security and fire-fighting staff must also be drilled in the correct course of action in the event of an emergency.

It is also important for the janitorial personnel to be informed and trained on the correct course of action in case of an accident.

Particularly when the programme is first launched, the janitorial staff should be instructed to report any problems that may hinder their performance of these assigned duties. They may be given special cards or forms on which to record such findings.

Waste Management Committee

To monitor the performance of the waste management programme and resolve any problems that may arise as it is implemented, a permanent waste management committee should be created and meet regularly, quarterly at a minimum. The committee should be accessible to any member of the hospital staff with a waste disposal problem or concern and should have access as needed to top management.

Implementing the Plan

The way the waste management programme is implemented may well determine whether it succeeds or not.

Since the support and cooperation of the various hospital committees and departments is essential, details of the programme should be presented to such groups as the administrative teams of the hospital, the health and safety committee and the infection control committee. It is necessary also to obtain validation of the programme from such community agencies as the departments of health, environmental protection and sanitation. Each of these may have helpful modifications to suggest, particularly with respect to the way the programme impinges on their areas of responsibility.

Once the programme design has been finalized, a pilot test in a selected area or department should permit rough edges to be polished and any unforeseen problems resolved. When this has been completed and its results analysed, the programme may be implemented progressively throughout the entire medical centre. A presentation, with audio-visual supports and distribution of descriptive literature, can be delivered in each unit or department, followed by delivery of bags and/or containers as required. Following the start-up of the programme, the department or unit should be visited so that any needed revisions may be instituted. In this manner, the participation and support of the entire hospital staff, without which the programme would never succeed, can be earned.

 


A hospital is not an isolated social environment; it has, given its mission, very serious intrinsic social responsibilities. A hospital needs to be integrated with its surroundings and should minimize its impact upon them, thus contributing to the welfare of the people who live near it.

From a regulatory perspective, the health industry has never been considered to be on the same level as other industries when they are ranked according to the health risks they pose. The result is that specific legislation in this sphere has been non-existent until recently, although in the last few years this deficiency has been addressed. While in many other kinds of industrial activities, health and safety is an integral part of the organization, most health centres still pay little or no attention to it.

One reason for this could be the attitudes of HCWs themselves, who may be preoccupied more with research and the acquisition of the latest technologies and diagnostic and treatment techniques than with looking into the effects that these advances could have on their own health and on the environment.

New developments in science and health care must be combined with environmental protection, because environmental policies in a hospital affect the quality of life of HCWs within the hospital and those who live outside it.

Integrated Health, Safety and Environmental Programmes

HCWs represent a major group, comparable in size to the large enterprises of the private sector. The number of people who pass through a hospital every day is very large: visitors, inpatients, outpatients, medical and commercial representatives, subcontractors and so on. All of them, to a greater or lesser degree, are exposed to the potential risks posed by the activities of the medical centre and, at the same time, contribute on a certain level to the improvement or the worsening of the safety and the care of the centre’s surroundings.

Strict measures are needed in order to safeguard HCWs, the general public and the surrounding environment from the deleterious effects that may stem from hospital activities. These activities include the use of ever more sophisticated technology, the more frequent use of extremely powerful drugs (the effects of which can have a profound and irreparable impact on the people who prepare or administer them), the too-often uncontrolled use of chemical products and the incidence of infectious diseases, some of which are incurable.

The risks of working in a hospital are many. Some are easy to identify, while others are very hard to detect; the measures to be taken should therefore always be rigorous.

Different groups of health professionals are particularly exposed to risks common to the health care industry in general, as well as to specific risks related to their profession and/or to the activities they perform in the course of their work.

The concept of prevention, therefore, must of necessity be incorporated into the health care field and encompass:

  • safety in the broadest sense, including psychosociology and ergonomics as part of the programmes to improve the quality of life in the workplace
  • hygiene, minimizing as much as possible any physical, chemical or biological factor that may affect the health of people in the work environment
  • environment, following policies to protect nature and people in the surrounding community and decreasing the impact on the environment.

 

We should be aware that the environment is directly and intimately related to the safety and hygiene in the workplace, because natural resources are consumed at work, and because these resources are later reincorporated into our surroundings. Our quality of life will be good or bad depending on whether we make correct use of these resources and use appropriate technologies.

Everyone’s involvement is necessary in order to further:

  • nature conservancy policies, designed to guarantee the survival of the natural heritage that surrounds us
  • environmental improvement policies as well as policies to control indoor and environmental pollution in order to integrate human activity with the environment
  • environmental research and training policies to improve working conditions as well as to reduce environmental impact
  • planning organizational policies designed to set goals and develop norms and methodology for workers’ health and the environment.

 

Goals

Such a programme should endeavour to:

  • change the culture and habits of health professionals in order to stimulate behaviour more conducive to safeguarding their health
  • set goals and develop internal safety, hygiene and environmental guidelines through adequate planning and organization
  • improve the methods of work to avoid a negative impact on health and the environment through environmental research and education
  • increase the involvement of all personnel and have them take responsibility for health in the workplace
  • create an adequate programme to establish and publicize the guidelines as well as to monitor their continued implementation
  • correctly classify and manage the waste generated
  • optimize costs, avoiding added expenditures that cannot be justified by the increased levels of safety and health or environmental quality.

 

Plan

A hospital should be conceived as a system that, through a number of processes, generates services. These services are the main goal of the activities performed in a hospital.

For the process to begin, certain commitments of energy, investment and technology are needed; these in turn generate their own emissions and wastes. Their only aim is to provide service.

In addition to these prerequisites, consideration should be given to the conditions of the areas of the building where these activities will take place, since they have been designed a certain way and built with basic construction materials.

Control, planning and coordination are all necessary for an integrated safety, health and environmental project to succeed.

Methodology

Because of the complexity and the variety of risks in the health care field, multidisciplinary groups are required if solutions to each particular problem are to be found.

It is important for health care workers to be able to collaborate in safety studies, participating in the decisions that will be made to improve their working conditions. In this way, changes will be viewed with a better attitude and the guidelines will be more readily accepted.

The safety, hygiene and environmental service should advise, stimulate and coordinate the programmes developed at the health centre. Responsibility for their implementation should fall upon whoever heads up the service where this programme will be followed. This is the only way to involve the entire organization.

In each particular case, the following will be selected:

  • the system involved
  • the parameters of the study
  • the time needed to carry it out.

 

The study will consist of:

  • an initial diagnosis
  • analysis of the risk
  • deciding on the course of action.

 

In order to implement the plan successfully it will always be necessary to:

  • educate and inform people of the risks
  • improve the management of human resources
  • improve the channels of communication.

 

This type of study may be a global one encompassing the centre as a whole (e.g., internal plan for the disposal of hospital wastes) or partial, encompassing only one concrete area (e.g., where cancer chemotherapeutic drugs are prepared).

The study of these factors will give an idea of the degree to which safety measures are disregarded, as much from the legal as from the scientific point of view. The concept of “legal” here encompasses advances in science and technology as they occur, which requires the constant revision and modification of established norms and guidelines.

It would be convenient indeed if the regulations and the laws by which safety, hygiene and the environment are regulated were the same in all countries, something that would make the installation, management and use of technology or products from other countries much easier.

Results

The following examples show some of the measures that can be taken while following the aforementioned methodology.

Laboratories

An advisory service can be developed involving professionals of the various laboratories and coordinated by the safety and hygiene service of the medical centre. The main goal would be to improve the safety and health of the occupants of all the labs, involving and giving responsibility to the entire professional staff of each and trying at the same time to make sure that these activities do not have a negative impact on public health and the environment.

The measures taken should include:

  • instituting the sharing of materials, products and equipment among the different laboratories, in order to optimize resources
  • reducing the stocks of chemical products in laboratories
  • creating a manual of basic norms of safety and hygiene
  • planning courses to educate all laboratory workers on these matters
  • training for emergencies.

 

Mercury

Thermometers, when broken, release mercury into the environment. A pilot project has been started with “unbreakable” thermometers to consider eventually substituting them for the glass thermometers. In some countries, such as the United States, electronic thermometers have replaced mercury thermometers to a very great extent.

Training the workers

The training and the commitment of the workers is the most important part of an integrated safety, health and environment programme. Given enough resources and time, the technicalities of almost any problem can be solved, but a complete solution will not be achieved without informing the workers of the risks and training them to avoid or control them. The training and education must be continuous, integrating health and safety techniques into all the other training programmes in the hospital.

Conclusions

The results that have been achieved so far in applying this work model allow us to be optimistic. They have shown that when people are informed about the whys and wherefores, their attitude toward change is very positive.

The response of health care personnel has been very good. They feel more motivated in their work and more valued when they have participated directly in the study and in the decision-making process. This participation, in turn, helps to educate the individual health care worker and to increase the degree of responsibility he or she is willing to accept.

The attainment of the goals of this project is a long-term objective, but the positive effects it generates more than compensate for the effort and the energy invested in it.

 


Wednesday, 02 March 2011 16:30

Buildings for Health Care Facilities

The maintenance and enhancement of health, as well as the safety and comfort of people in health care facilities, are seriously affected if specific building requirements are not met. Health care facilities are rather unique buildings, in which heterogeneous environments coexist. Different people, several activities in each environment and many risk factors are involved in the pathogenesis of a broad spectrum of diseases. Functional organization criteria classify health care facility environments as follows: nursing units, operating theatres, diagnostic facilities (radiology unit, laboratory units and so on), outpatients’ departments, administration area (offices), dietary facilities, linen services, engineering services and equipment areas, corridors and passages.

The people who attend a hospital comprise health personnel, staff personnel, patients (long-stay inpatients, acute inpatients and outpatients) and visitors. The processes include health care specific activities—diagnostic activities, therapeutic activities, nursing activities—and activities common to many public buildings—office work, technological maintenance, food preparation and so on. The risk factors are physical agents (ionizing and non-ionizing radiation, noise, lighting and microclimatic factors), chemicals (e.g., organic solvents and disinfectants), biological agents (viruses, bacteria, fungi and so on), ergonomic factors (postures, lifting and so on) and psychological and organizational factors (e.g., environmental perceptions and work hours). The illnesses related to these factors range from environmental annoyance or discomfort (e.g., thermal discomfort or irritative symptoms) to severe diseases (e.g., hospital-acquired infections and traumatic accidents).

In this perspective, risk assessment and control require an interdisciplinary approach involving physicians, hygienists, engineers, architects, economists and so on, and the fulfilment of preventive measures in building planning, design, construction and management tasks. Specific building requirements are extremely important among these preventive measures and, according to the guidelines for healthy buildings introduced by Levin (1992), they should be classified as follows:

  • site planning requirements
  • architectural design requirements
  • requirements for building materials and furnishings
  • requirements for heating, ventilation and air-conditioning systems and for microclimatic conditions.

 

This article focuses on general hospital buildings. Obviously, adaptations would be required for specialty hospitals (e.g., orthopaedic centres, eye and ear hospitals, maternity centres, psychiatric institutions, long-term care facilities and rehabilitation institutes), for ambulatory care clinics, emergency/urgent care facilities and offices for individual and group practices. These will be determined by the numbers and types of patients (including their physical and mental status) and by the number of HCWs and the tasks they perform. Considerations promoting the safety and well-being of both patients and staff that are common to all health care facilities include:

  • ambience, including not only decoration, lighting and noise control but also partitioning and placement of furniture and equipment that avoid entrapment of workers with potentially violent patients and visitors
  • ventilation systems that minimize exposure to infectious agents and potentially toxic chemicals and gases
  • storage facilities for clothing and effects of patients and their visitors that minimize potential contamination
  • lockers, changing rooms, wash-up facilities and rest rooms for staff
  • conveniently-located hand-washing facilities in each room and treatment area
  • doorways, elevators and toilets that accommodate wheel chairs and stretchers
  • storage and filing areas designed to minimize workers’ stooping, bending, reaching and heavy lifting
  • automatic and worker-controlled communication and alarm systems
  • mechanisms for collection, storage and disposition of toxic wastes, contaminated linens and clothing and so on.

 

Site Planning Requirements

The health care facility site must be chosen following four main criteria (Catananti and Cambieri 1990; Klein and Platt 1989; Decree of the President of Ministers Council 1986; Commission of the European Communities 1990; NHS 1991a, 1991b):

  1. Environmental factors. The terrain should be as level as possible. Ramps, escalators and elevators can offset sides of hills, but they hinder the access of elderly and handicapped people, adding both a higher cost to the project and an extra burden to fire departments and evacuation teams. Heavy wind sites should be avoided, and the area should be far from sources which create pollution and noise (especially factories and landfills). Radon and radon daughters levels should be assessed, and measures to reduce exposure should be taken. In colder climates, consideration should be given to embedding snow-melting coils in sidewalks, entrance ways and parking areas to minimize falls and other accidents. 
  2.   Geological configuration. Earthquake-prone areas should be avoided, or at least anti-seismic construction criteria must be followed. The site must be chosen following a hydrogeological assessment, to avoid water infiltration into the foundations. 
  3. Urbanistic factors. The site should be easily accessible to potential users, ambulances and service vehicles for goods supply and waste disposal. Public transportation and utilities (water, gas, electricity and sewers) should be available. Fire departments should be nearby, and fire-fighters and their apparatus should find ready access to all parts of the facility. 
  4. Space availability. The site should allow some scope for expansion and provision of adequate car parking.

 

Architectural Design

Health care facilities architectural design usually follows several criteria:

  • class of the health care facility: hospital (acute-care hospital, community hospital, rural hospital), large or small health care centre, nursing homes (extended care facilities, skilled nursing homes, residential care homes), general medical practice premise (NHS 1991a; NHS 1991b; Kleczkowski, Montoya-Aguilar and Nilsson 1985; ASHRAE 1987)
  • catchment area dimensions
  • management issues: costs, flexibility (susceptibility to adaptation)
  • ventilation provided: an air-conditioned building is compact and deep, with as little external wall area as possible, to reduce heat transfer between outside and inside; a naturally ventilated building is long and thin, to maximize exposure to breezes and to minimize internal distances from windows (Llewelyn-Davies and Wecks 1979)
  • building/area ratio
  • environmental quality: safety and comfort are extremely relevant targets.

 

The listed criteria lead health care facility planners to choose the best building shape for each situation, ranging essentially from an extended horizontal hospital with scattered buildings to a monolithic vertical or horizontal building (Llewelyn-Davies and Wecks 1979). The first case (a preferable format for low-density buildings) is normally used for hospitals of up to 300 beds, because of its low construction and management costs. It is particularly considered for small rural hospitals and community hospitals (Llewelyn-Davies and Wecks 1979). The second case (usually preferred for high-density buildings) becomes cost-effective for hospitals with more than 300 beds, and it is advisable for acute-care hospitals (Llewelyn-Davies and Wecks 1979). The internal space dimensions and distribution have to cope with many variables, among which one can consider: functions, processes, circulation and connections to other areas, equipment, predicted workload, costs, and flexibility, convertibility and susceptibility to shared use. Compartments, exits, fire alarms, automatic extinction systems and other fire prevention and protection measures should follow local regulations. Furthermore, several specific requirements have been defined for each area in health care facilities:

1.       Nursing units. Internal layout of nursing units usually follows one of three basic models (Llewelyn-Davies and Wecks 1979):

  • an open ward (or “Nightingale” ward): a broad room with 20 to 30 beds, heads to the windows, ranged along both walls
  • the “Rigs” layout: beds placed parallel to the windows; at first they were in open bays on either side of a central corridor (as at Rigs Hospital in Copenhagen), and in later hospitals the bays were often enclosed, becoming rooms with 6 to 10 beds
  • small rooms, with 1 to 4 beds.

Four variables should lead the planner to the best layout: bed need (if high, an open ward is advisable), budget (if low, an open ward is the cheapest), privacy needs (if considered high, small rooms are unavoidable) and intensive-care level (if high, the open ward or the Rigs layout with 6 to 10 beds is advisable). The space requirements should be at least 6 to 8 square metres (sqm) per bed for open wards, inclusive of circulation and ancillary rooms (Llewelyn-Davies and Wecks 1979); 5 to 7 sqm per bed for multiple bedrooms; and 9 sqm for single bedrooms (Decree of the President of Ministers Council 1986; American Institute of Architects Committee on Architecture for Health 1987). In open wards, toilet facilities should be close to patients’ beds (Llewelyn-Davies and Wecks 1979). For single and multiple bedrooms, handwashing facilities should be provided in each room; lavatories may be omitted where a toilet room is provided to serve one single-bed room or one two-bed room (American Institute of Architects Committee on Architecture for Health 1987). Nursing stations should be large enough to accommodate desks and chairs for record keeping, tables and cabinets for the preparation of drugs, instruments and supplies, chairs for sit-down conferences with physicians and other staff members, a wash-up sink and access to a staff toilet.
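As a rough illustration of the space standards just quoted, the sketch below estimates a minimum ward area; the helper names are hypothetical and the per-bed figures are the lower bounds from the sources cited:

    # Lower-bound space standards quoted above (sqm per bed).
    MIN_SQM_PER_BED = {
        "open ward": 6.0,         # 6-8 sqm/bed, inclusive of circulation and ancillary rooms
        "multiple bedroom": 5.0,  # 5-7 sqm/bed
        "single bedroom": 9.0,
    }

    def minimum_ward_area(layout: str, beds: int) -> float:
        """Indicative minimum floor area (sqm) for a nursing unit (hypothetical helper)."""
        return MIN_SQM_PER_BED[layout] * beds

    # Example: a 24-bed open ward needs at least 6 x 24 = 144 sqm.
    print(minimum_ward_area("open ward", 24))  # 144.0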

2.       Operating theatres. Two main classes of elements should be considered: operating rooms and service areas (American Institute of Architects Committee on Architecture for Health 1987). Operating rooms should be classified as follows:

  • general operating room, needing a minimum clear area of 33.5 sqm
  • room for orthopaedic surgery (optional), needing enclosed storage space for splints and traction equipment
  • room for cardiovascular surgery (optional), needing a minimum clear area of 44 sqm; in the clear area of the surgical suite, near the operating room, an additional pump room should be designed, where extracorporeal pump supplies and accessories are stored and serviced
  • room for endoscopic procedures, needing a minimum clear area of 23 sqm
  • rooms for waiting patients, induction of anaesthesia and recovery from anaesthesia.

 

Service areas should include: sterilizing facility with high-speed autoclave, scrub facilities, medical gas storage facilities and staff clothing change areas.

3.       Diagnostic facilities: Each radiology unit should include (Llewelyn-Davies and Wecks 1979; American Institute of Architects Committee on Architecture for Health 1987):

  • appointment desk and waiting areas
  • diagnostic radiographic rooms, needing 23 sqm for fluoroscopic procedures and about 16 sqm for radiographic ones, plus a shielded control area, and rigid support structures for ceiling-mounted equipment (where necessary)
  • dark room (where necessary), needing about 5 sqm and appropriate ventilation for the developer
  • contrast media preparation area, clean-up facilities, film quality control area, computer area and film storage area
  • viewing area where films can be read and reports dictated.

 

The wall thickness in a radiology unit should be 8 to 12 cm (poured concrete) or 12 to 15 cm (cinder block or bricks). The diagnostic activities in health care facilities may require tests in haematology, clinical chemistry, microbiology, pathology and cytology. Each laboratory area should be provided with work areas, sample and material storage facilities (refrigerated or not), specimen collection facilities, facilities and equipment for terminal sterilization and waste disposal, and a special facility for radioactive material storage (where necessary) (American Institute of Architects Committee on Architecture for Health 1987).

4.       Outpatient departments. Clinical facilities should include (American Institute of Architects Committee on Architecture for Health 1987): general-purpose examination rooms (7.4 sqm), special-purpose examination rooms (varying with the specific equipment needed) and treatment rooms (11 sqm). In addition, administrative facilities are needed for the admittance of outpatients.

5.       Administration area (offices). Facilities such as common office building areas are needed. These include a loading dock and storage areas for receiving supplies and equipment and dispatching materials not disposed of by the separate waste removal system.

6.       Dietary facilities (optional). Where present, these should provide the following elements (American Institute of Architects Committee on Architecture for Health 1987): a control station for receiving and controlling food supplies, storage spaces (including cold storage), food preparation facilities, handwashing facilities, facility for assembling and distributing patients’ meals, dining space, dishwashing space (located in a room or an alcove separated from the food preparation and serving area), waste storage facilities and toilets for dietary staff.

7.       Linen services (optional). Where present, these should provide the following elements: a room for receiving and holding soiled linen, a clean-linen storage area, a clean-linen inspection and mending area and handwashing facilities (American Institute of Architects Committee on Architecture for Health 1987).

8.       Engineering services and equipment areas. Adequate areas, varying in size and characteristics for each health care facility, have to be provided for: boiler plant (and fuel storage, if necessary), electrical supply, emergency generator, maintenance workshops and stores, cold-water storage, plant rooms (for centralized or local ventilation) and medical gases (NHS 1991a).

9.       Corridors and passages. These have to be organized to avoid confusion for visitors and disruptions in the work of hospital personnel; circulation of clean and dirty goods should be strictly separated. Minimum corridor width should be 2 m (Decree of the President of Ministers Council 1986). Doorways and elevators must be large enough to allow easy passage of stretchers and wheelchairs.

Requirements for Building Materials and Furnishings

The choice of materials in modern health care facilities often aims to reduce the risk of accidents and fires: materials must be non-flammable and must not produce noxious gases or smoke when burnt (American Institute of Architects Committee on Architecture for Health 1987). Trends in hospital floor-covering materials have shown a shift from stone materials and linoleum to polyvinyl chloride (PVC). In operating rooms, in particular, PVC is considered the best choice for avoiding the electrostatic effects that might cause flammable anaesthetic gases to explode. Until a few years ago, walls were painted; today, PVC coverings and fibreglass wallpaper are the most widely used wall finishes. False ceilings are today built mainly from mineral fibres instead of gypsum board; a new trend appears to be the use of stainless steel ceilings (Catananti et al. 1993). However, a more complete approach should recognize that each material and furnishing may have effects on the outdoor and indoor environmental systems. Accurately chosen building materials may reduce environmental pollution and high social costs and improve the safety and comfort of building occupants. At the same time, internal materials and finishes may influence the functional performance of the building and its management. Moreover, the choice of materials in hospitals should also consider specific criteria, such as ease of cleaning, washing and disinfecting procedures and susceptibility to becoming a habitat for living organisms. A more detailed classification of criteria to be considered in this task, derived from the European Community Council Directive No. 89/106 (Council of the European Communities 1988), is shown in table 1.

Table 1. Criteria and variables to be considered in the choice of materials

Criteria | Variables
Functional performance | Static load, transit load, impact load, durability, construction requirements
Safety | Collapse risk; fire risk (reaction to fire, fire resistance, flammability); static electric charge (explosion risk); dispersed electric power (electric shock risk); sharp surfaces (wound risk); poisoning risk (hazardous chemical emission); slip risk; radioactivity
Comfort and pleasantness | Acoustic comfort (noise-related features), optical and visual comfort (light-related features), tactile comfort (consistency, surface), hygrothermal comfort (heat-related features), aesthetics, odour emissions, perceived indoor air quality
Hygienicity | Habitat for living organisms (insects, moulds, bacteria), susceptibility to stains, susceptibility to dust, ease of cleaning, washing and disinfecting, maintenance procedures
Flexibility | Susceptibility to modifications, conformational factors (tile or panel dimensions and morphology)
Environmental impact | Raw materials, industrial manufacturing, waste management
Cost | Material cost, installation cost, maintenance cost

Source: Catananti et al. 1994.

On the matter of odour emissions, it should be noted that correct ventilation after the installation or renovation of floor or wall coverings reduces the exposure of personnel and patients to indoor pollutants (especially volatile organic compounds (VOCs)) emitted by building materials and furnishings.

Requirements for Heating, Ventilation and Air-Conditioning Systems and for Microclimatic Conditions

The control of microclimatic conditions in health care facility areas may be carried out by heating, ventilation and/or air-conditioning systems (Catananti and Cambieri 1990). Heating systems (e.g., radiators) permit only temperature regulation and may be sufficient for common nursing units. Ventilation, which induces changes in air speed, may be natural (e.g., through porous building materials), supplementary (by windows) or artificial (by mechanical systems). Artificial ventilation is especially recommended for kitchens, laundries and engineering services. Air-conditioning systems, particularly recommended for some health care facility areas such as operating rooms and intensive-care units, should guarantee:

  • the control of all microclimatic factors (temperature, relative humidity and air speed)
  • the control of air purity and concentration of micro-organisms and chemicals (e.g., anaesthetic gases, volatile solvents, odours and so on). This target may be achieved by adequate air filtration and air changes, right pressure relationships among adjacent areas and laminar airflow.

 

General requirements of air-conditioning systems include outdoor intake locations, air filter features and air supply outlets (ASHRAE 1987). Outdoor intake locations should be far enough (at least 9.1 m) from pollution sources such as exhaust outlets of combustion equipment stacks, medical-surgical vacuum systems, ventilation exhaust outlets from the hospital or adjoining buildings, areas that may collect vehicular exhaust and other noxious fumes, and plumbing vent stacks. In addition, intakes should be at least 1.8 m above ground level; where these components are installed above the roof, they should be at least 0.9 m above roof level.
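These siting distances lend themselves to a simple check; a minimal sketch, with hypothetical names, of the ASHRAE minima quoted above:

    from typing import Optional

    def intake_location_ok(dist_to_pollution_source_m: float,
                           height_above_ground_m: float,
                           height_above_roof_m: Optional[float] = None) -> bool:
        """Check an outdoor air intake against the minimum distances quoted above."""
        if dist_to_pollution_source_m < 9.1:   # exhaust outlets, vacuum systems, vehicle fumes, vent stacks
            return False
        if height_above_ground_m < 1.8:        # minimum height above ground level
            return False
        if height_above_roof_m is not None and height_above_roof_m < 0.9:
            return False                       # minimum height above roof level, where roof-mounted
        return True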

The number and efficiency of filters should be adequate for the specific areas supplied by the air-conditioning systems. For example, two filter beds of 25% and 90% efficiency should be used in operating rooms, intensive-care units and organ transplant rooms. Installation and maintenance of filters follow several criteria: absence of leakage between filter segments, and between the filter bed and its supporting frame; installation of a manometer in the filter system to provide a reading of the pressure drop, so that expired filters can be identified; and provision of adequate facilities for maintenance without introducing contamination into the air flow. Air supply outlets should be located on the ceiling, with perimeter or several exhaust inlets near the floor (ASHRAE 1987).

Ventilation rates for health care facility areas, permitting air purity and the comfort of occupants, are listed in table 2.

Table 2. Ventilation requirements in health care facilities areas

Area | Pressure relationship to adjacent areas | Minimum air changes of outdoor air per hour | Minimum total air changes per hour | All air exhausted directly to outdoors | Recirculated within room units

Nursing units
Patient room | +/– | 2 | 2 | Optional | Optional
Intensive care | P | 2 | 6 | Optional | No
Patient corridor | +/– | 2 | 4 | Optional | Optional

Operating theatres
Operating room (all-outdoor system) | P | 15 | 15 | Yes1 | No
Operating room (recirculating system) | P | 5 | 25 | Optional | No2

Diagnostic facilities
X ray | +/– | 2 | 6 | Optional | Optional

Laboratories
Bacteriology | N | 2 | 6 | Yes | No
Clinical chemistry | P | 2 | 6 | Optional | No
Pathology | N | 2 | 6 | Yes | No
Serology | P | 2 | 6 | Optional | No
Sterilizing | N | Optional | 10 | Yes | No
Glasswashing | N | 2 | 10 | Yes | Optional

Dietary facilities
Food preparation centres3 | +/– | 2 | 10 | Yes | No
Dishwashing | N | Optional | 10 | Yes | No

Linen service
Laundry (general) | +/– | 2 | 10 | Yes | No
Soiled linen sorting and storage | N | Optional | 10 | Yes | No
Clean linen storage | P | 2 (Optional) | 2 | Optional | Optional

P = Positive. N = Negative. +/– = Continuous directional control not required.

1 For operating rooms, use of 100% outside air should be limited to those cases where local codes require it, and only if heat-recovery devices are used. 2 Recirculating room units meeting the filtering requirement for the space may be used. 3 Food preparation centres shall have ventilation systems that have an excess of air supply for positive pressure when hoods are not in operation. The number of air changes may be varied to any extent required for odour control when the space is not in use.

Source: ASHRAE 1987.
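For checking measured air-change rates against table 2, the tabulated values can be encoded as a lookup; a minimal sketch with hypothetical names (rows abridged):

    # A few rows of table 2 (ASHRAE 1987), encoded for programmatic checks.
    # Tuple: (pressure, min outdoor ACH, min total ACH, all air exhausted outdoors, recirculated in room units)
    VENTILATION = {
        "patient room":                   ("+/-", 2,  2,  "Optional", "Optional"),
        "intensive care":                 ("P",   2,  6,  "Optional", "No"),
        "operating room (all outdoor)":   ("P",   15, 15, "Yes",      "No"),
        "operating room (recirculating)": ("P",   5,  25, "Optional", "No"),
        "bacteriology":                   ("N",   2,  6,  "Yes",      "No"),
    }

    def meets_total_ach(area: str, measured_ach: float) -> bool:
        """True if a measured total air-change rate meets the tabulated minimum."""
        return measured_ach >= VENTILATION[area][2]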

Specific requirements of air-conditioning systems and microclimatic conditions regarding several hospital areas are reported as follows (ASHRAE 1987):

Nursing units. In common patient rooms, a temperature (T) of 24 °C with 30% relative humidity (RH) is recommended for winter, and a T of 24 °C with 50% RH for summer. In intensive-care units, a variable temperature capability of 24 to 27 °C and an RH of 30% minimum and 60% maximum, with positive air pressure, are recommended. In immunosuppressed patient units, a positive pressure should be maintained between the patient room and the adjacent area, and HEPA filters should be used.

In the full-term nursery, a T of 24 °C with RH from 30% minimum to 60% maximum is recommended. The same microclimatic conditions as in intensive-care units are required in the special-care nursery.

Operating theatres. Variable temperature range capability of 20 to 24 °C with RH of 50% minimum and 60% maximum and positive air pressure are recommended in operating rooms. A separate air-exhaust system or special vacuum system should be provided in order to remove anaesthetic gas traces (see “Waste anaesthetic gases” in this chapter).

Diagnostic facilities. In the radiology unit, fluoroscopic and radiographic rooms require T of 24 to 27 °C and RH of 40 to 50%. Laboratory units should be supplied with adequate hood exhaust systems to remove dangerous fumes, vapours and bioaerosols. The exhaust air from the hoods of the units of clinical chemistry, bacteriology and pathology should be discharged to the outdoors with no recirculation. Also, the exhaust air from infectious disease and virology laboratories requires sterilization before being exhausted to the outdoors.

Dietary facilities. These should be provided with hoods over the cooking equipment for removal of heat, odours and vapours.

Linen services. The sorting room should be maintained at a negative pressure in relation to adjoining areas. In the laundry processing area, washers, flatwork ironers, tumblers, and so on should have direct overhead exhaust to reduce humidity.

Engineering services and equipment areas. At work stations, the ventilation system should limit temperature to 32 °C.
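A compliance check against the temperature and humidity recommendations above could look like the following sketch; the area keys and helper names are hypothetical:

    # Recommended ranges quoted above (ASHRAE 1987): (T min C, T max C, RH min %, RH max %).
    RECOMMENDED = {
        "intensive care":    (24, 27, 30, 60),
        "operating room":    (20, 24, 50, 60),
        "radiology room":    (24, 27, 40, 50),
        "full-term nursery": (24, 24, 30, 60),
    }

    def within_recommended(area: str, temp_c: float, rh_pct: float) -> bool:
        """True if measured temperature and relative humidity fall within the recommended range."""
        t_lo, t_hi, rh_lo, rh_hi = RECOMMENDED[area]
        return t_lo <= temp_c <= t_hi and rh_lo <= rh_pct <= rh_hi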

Conclusion

The essence of specific building requirements for health care facilities is the accommodation of external standard-based regulations to subjective index-based guidelines. In fact, subjective indices, such as the Predicted Mean Vote (PMV) (Fanger 1973) and the olf, a measure of odour (Fanger 1992), can predict the comfort levels of patients and personnel without neglecting the differences related to their clothing, metabolism and physical status. Finally, the planners and architects of hospitals should follow the theory of “building ecology” (Levin 1992), which describes dwellings as a complex series of interactions among buildings, their occupants and the environment. Health facilities, accordingly, should be planned and built focusing on the whole “system” rather than on any particular partial frames of reference.

 


Wednesday, 02 March 2011 16:27

Health Care Workers and Latex Allergy

With the advent of the universal precautions against bloodborne infections, which dictate the use of gloves whenever HCWs are exposed to patients or materials that might be infected with hepatitis B or HIV, the frequency and severity of allergic reactions to natural rubber latex (NRL) have risen sharply. For example, the Department of Dermatology at the Erlangen-Nuremberg University in Germany reported a 12-fold increase in the number of patients with latex allergy between 1989 and 1995. More serious systemic manifestations increased from 10.7% in 1989 to 44% in 1994-1995 (Hesse et al. 1996).

It seems ironic that so much difficulty is attributable to rubber gloves, since they were originally introduced toward the end of the nineteenth century to protect the hands of nurses and other HCWs. This was the era of antiseptic surgery, in which instruments and operative sites were bathed in caustic solutions of carbolic acid and bichloride of mercury that not only killed germs but also macerated the hands of the surgical team. According to what has become a romantic legend, William Stewart Halsted, one of the surgical “giants” of the time who is credited with a host of contributions to the techniques of surgery, is said to have “invented” rubber gloves around 1890 to make it more pleasant to hold hands with Caroline Hampton, his scrub nurse, whom he later married (Townsend 1994). Although Halsted may be credited with introducing and popularizing the use of rubber surgical gloves in the United States, many others had a hand in it, according to Miller (1982), who cited a report of their use in the United Kingdom published a half century earlier (Acton 1848).

Latex Allergy

Allergy to NRL is succinctly described by Taylor and Leow (see the article “Rubber contact dermatitis and latex allergy” in the chapter Rubber industry) as “an immunoglobulin E-mediated, immediate, Type I allergic reaction, almost always due to NRL proteins present in medical and non-medical latex devices. The spectrum of clinical signs ranges from contact urticaria, generalized urticaria, allergic rhinitis, allergic conjunctivitis, angioedema (severe swelling) and asthma (wheezing) to anaphylaxis (severe, life-threatening allergic reaction)”. Symptoms may result from direct contact of normal or inflamed skin with gloves or other latex-containing materials, or indirectly from mucosal contact with, or inhalation of, aerosolized NRL proteins or talcum powder particles to which NRL proteins have adhered. Contact with the rubber accelerators can also cause a Type IV reaction. (Approximately 80% of “latex glove allergy” is actually a Type IV reaction to the accelerators.) The diagnosis is confirmed by patch, prick, scratch or other skin sensitivity tests, or by serological studies for the immunoglobulin. In some individuals, latex allergy is associated with allergy to certain foods (e.g., banana, chestnuts, avocado, kiwi and papaya).

While most common among health care workers, latex allergy is also found among employees in rubber manufacturing plants, other workers who habitually use rubber gloves (e.g., greenhouse workers (Carillo et al. 1995)) and in patients with a history of multiple surgical procedures (e.g., spina bifida, congenital urogenital abnormalities, etc.) (Blaycock 1995). Cases of allergic reactions after the use of latex condoms have been reported (Jonasson, Holm and Leegard 1993), and in one case, a potential reaction was averted by eliciting a history of an allergic reaction to a rubber swimming cap (Burke, Wilson and McCord 1995). Reactions have occurred in sensitive patients when hypodermic needles used to prepare doses of parenteral medications picked up NRL protein as they were pushed through the rubber caps on the vials.

According to a recent study of 63 patients with NRL allergy, it took an average of 5 years of working with latex products for the first symptoms, usually contact urticaria, to develop. Some also had rhinitis or dyspnoea. On average, lower respiratory tract symptoms appeared after an additional 2 years (Allmeers et al. 1996).

Frequency of latex allergy

To determine the frequency of NRL allergy, allergy tests were performed on 224 employees at the University of Cincinnati College of Medicine, including nurses, laboratory technicians, physicians, respiratory therapists, housekeeping and clerical workers (Yassin et al. 1994). Of these, 38 (17%) tested positive to latex extracts; the prevalence ranged from 0% among housekeeping workers to 38% among dental staff. Exposure of these sensitized individuals to latex caused itching in 84%, a skin rash in 68%, urticaria in 55%, lachrymation and ocular itching in 45%, nasal congestion in 39% and sneezing in 34%. Anaphylaxis occurred in 10.5%.

In a similar study at the University of Oulu in Finland, 56% of 534 hospital employees who used protective latex or vinyl gloves on a daily basis had skin disorders related to glove use (Kujala and Reilula 1995). Rhinorrhoea or nasal congestion was present in 13% of workers who used powdered gloves. The prevalence of both skin and respiratory symptoms was significantly higher among those who used the gloves for more than 2 hours a day.

Valentino and colleagues (1994) reported latex-induced asthma in four health care workers in an Italian regional hospital. At the Mayo Medical Center in Rochester, Minnesota, where 342 employees who reported symptoms suggestive of latex allergy were evaluated, 16 episodes of latex-related anaphylaxis were recorded in 12 subjects (six episodes occurred after skin testing) (Hunt et al. 1995). The Mayo researchers also reported respiratory symptoms in workers who did not wear gloves but worked in areas where large numbers of gloves were in use, presumably due to airborne talcum powder/latex protein particles.

Control and Prevention

The most effective preventive measure is modification of standard procedures to replace gloves and equipment made with NRL with similar items made of vinyl or other non-rubber materials. This requires the involvement of the purchasing and supply departments, which should also mandate the labelling of all latex-containing items so that they may be avoided by individuals with latex sensitivity. This is important not only for the staff but also for patients with a history suggestive of latex allergy. Aerosolized latex from glove powder is also problematic: HCWs who are allergic to latex and who do not use latex gloves may still be affected by the powdered latex gloves used by co-workers. A significant problem is presented by the wide variation in latex allergen content among gloves from different manufacturers and, indeed, among different lots of gloves from the same manufacturer.

Glove manufacturers are experimenting with gloves using formulations with smaller amounts of NRL as well as coatings that will obviate the need for talcum powder to make the gloves easy to put on and take off. The goal is to provide comfortable, easy to wear, non-allergenic gloves that still provide effective barriers to the transmission of the hepatitis B virus, HIV and other pathogens.

A careful medical history, with particular emphasis on prior latex exposures, should be elicited from all health care workers who present with symptoms suggestive of latex allergy. In suspect cases, latex sensitivity may be confirmed by skin or serological testing. Since skin testing carries a risk of provoking an anaphylactic reaction, it should be performed only by experienced medical personnel.

At present, allergens for desensitization are not available, so the only remedy is avoidance of exposure to products containing NRL. In some instances, this may require a change of job. Weido and Sim (1995) at the University of Texas Medical Branch at Galveston suggest advising individuals in high-risk groups to carry self-injectable epinephrine for use in the event of a systemic reaction.

Following the appearance of several clusters of latex allergy cases in 1990, the Mayo Medical Center in Rochester, Minnesota, formed a multidisciplinary work group to address the problem (Hunt et al. 1996). Subsequently, this was formalized in a Latex Allergy Task Force with members from the departments of allergy, preventive medicine, dermatology and surgery, as well as the Director of Purchasing, the Surgical Nursing Clinical Director and the Director of Employee Health. Articles on latex allergy were published in staff newsletters and information bulletins to educate the 20,000-member workforce about the problem and to encourage those with suggestive symptoms to seek medical consultation. A standardized approach to testing for latex sensitivity was developed, along with techniques for quantifying the amount of latex allergen in manufactured products and the amount and particle size of airborne latex allergen. The latter proved to be sufficiently sensitive to measure the exposure of individual workers while performing particular high-risk tasks. Steps were initiated to monitor a gradual transition to low-allergen gloves (an incidental effect was a lowering of their cost by concentrating glove purchases among the fewer vendors able to meet the low-allergen requirements) and to minimize exposures of staff and patients with known sensitivity to NRL.

To alert the public to the risks of NRL allergy, a consumer group, the Delaware Valley Latex Allergy Support Network, has been formed. This group has created an Internet website (http://www.latex.org) and maintains a toll-free telephone line (1-800 LATEXNO) to provide up-to-date factual information about latex allergy to persons with this problem and those who care for them. This organization, which has a Medical Advisory Group, maintains a Literature Library and a Product Center and encourages the exchange of experiences among those who have had allergic reactions.

Conclusion

Latex allergies are becoming an increasingly important problem among health care workers. The solution lies in minimizing contact with latex allergen in their work environment, especially by substituting non-latex surgical gloves and appliances.

 


Wednesday, 02 March 2011 16:24

Waste Anaesthetic Gases

The use of inhaled anaesthetics was introduced in the 1840s. The first compounds to be used were diethyl ether, nitrous oxide and chloroform. Cyclopropane and trichloroethylene were introduced many years later (circa 1930-1940), and the use of fluroxene, halothane and methoxyflurane began in the 1950s. By the end of the 1960s enflurane was in use and, finally, isoflurane was introduced in the 1980s. Isoflurane is now considered the most widely used inhalation anaesthetic, even though it is more expensive than the others. A summary of the physical and chemical characteristics of methoxyflurane, enflurane, halothane, isoflurane and nitrous oxide, the most commonly used anaesthetics, is shown in table 1 (Wade and Stevens 1981).

Table 1. Properties of inhaled anaesthetics

 

| | Isoflurane, Forane | Enflurane, Ethrane | Halothane, Fluothane | Methoxyflurane, Penthrane | Dinitrogen oxide, Nitrous oxide |
|---|---|---|---|---|---|
| Molecular weight | 184.0 | 184.5 | 197.4 | 165.0 | 44.0 |
| Boiling point (°C) | 48.5 | 56.5 | 50.2 | 104.7 | – |
| Density (g/ml) | 1.50 | 1.52 (25 °C) | 1.86 (22 °C) | 1.41 (25 °C) | – |
| Vapour pressure at 20 °C (mm Hg) | 250.0 | 175.0 | 243.0 | 25.0 | – |
| Smell | Pleasant, sharp | Pleasant, like ether | Pleasant, sweet | Pleasant, fruity | Pleasant, sweet |
| Partition coefficients: | | | | | |
| Blood/gas | 1.40 | 1.9 | 2.3 | 13.0 | 0.47 |
| Brain/gas | 3.65 | 2.6 | 4.1 | 22.1 | 0.50 |
| Fat/gas | 94.50 | 105.0 | 185.0 | 890.0 | 1.22 |
| Liver/gas | 3.50 | 3.8 | 7.2 | 24.8 | 0.38 |
| Muscle/gas | 5.60 | 3.0 | 6.0 | 20.0 | 0.54 |
| Oil/gas | 97.80 | 98.5 | 224.0 | 930.0 | 1.4 |
| Water/gas | 0.61 | 0.8 | 0.7 | 4.5 | 0.47 |
| Rubber/gas | 0.62 | 74.0 | 120.0 | 630.0 | 1.2 |
| Metabolic rate (%) | 0.20 | 2.4 | 15–20 | 50.0 | – |

(– = value not given in the source.)

 

All of them, with the exception of nitrous oxide (N2O), are hydrocarbons or chlorofluorinated liquid ethers administered by vapourization. Isoflurane is the most volatile of these compounds; it is metabolized at the lowest rate and is the least soluble in blood, fat and liver.
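One practical reading of table 1: the blood/gas partition coefficient indexes how soluble each agent is in blood, and hence how quickly it is taken up and washed out. The following minimal sketch uses the values transcribed from table 1; the dictionary and the ranking logic are purely illustrative.

```python
# Blood/gas partition coefficients transcribed from table 1.
# Lower values mean lower blood solubility, i.e., faster uptake
# and elimination of the agent.
blood_gas = {
    "nitrous oxide": 0.47,
    "isoflurane": 1.40,
    "enflurane": 1.9,
    "halothane": 2.3,
    "methoxyflurane": 13.0,
}

# Rank from least to most blood-soluble.
for agent, k in sorted(blood_gas.items(), key=lambda kv: kv[1]):
    print(f"{agent:15s} {k:5.2f}")
# Nitrous oxide is the least blood-soluble; methoxyflurane by far the most.
```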

Normally, N2O, a gas, is mixed with a halogenated anaesthetic, although they are sometimes used separately, depending on the type of anaesthesia that is required, the characteristics of the patient and the work habits of the anaesthetist. The normally used concentrations are 50 to 66% N2O and up to 2 or 3% of the halogenated anaesthetic (the rest is usually oxygen).

Anaesthesia is usually induced by injection of a sedative drug, followed by an inhaled anaesthetic. The volumes given to the patient are on the order of 4 or 5 litres/minute. Part of the oxygen and anaesthetic gases in the mixture is retained by the patient, while the remainder is exhaled directly into the atmosphere or recycled into the respirator, depending, among other things, on the type of mask used, on whether the patient is intubated and on whether a recycling system is available. Where recycling is available, exhaled air can be reused after it is cleaned, or it can be vented to the atmosphere, expelled from the operating room or removed by a vacuum. Recycling (closed circuit) is not a common procedure, and many respirators do not have exhaust systems; all the air exhaled by the patient, including the waste anaesthetic gases, therefore ends up in the air of the operating room.

The number of workers occupationally exposed to waste anaesthetic gases is high, since exposure affects not only anaesthetists and their assistants but also all the other people who spend time in operating rooms (surgeons, nurses and support staff), dentists who perform odontological surgery, personnel in delivery rooms and intensive-care units where patients may be under inhaled anaesthesia, and veterinary surgeons. Waste anaesthetic gases are likewise detected in recovery rooms, where they are exhaled by patients recovering from surgery, and in areas adjacent to operating rooms because, for reasons of asepsis, operating rooms are kept at positive pressure, which favours contamination of the surrounding areas.

Health Effects

Problems due to the toxicity of anaesthetic gases were not seriously studied until the 1960s, even though a few years after inhaled anaesthetics came into common use, a relationship was already suspected between the illnesses (asthma, nephritis) affecting some of the first professional anaesthetists and their work (Ginesta 1989). In this regard the appearance of an epidemiological survey of more than 300 anaesthetists in the Soviet Union (Vaisman 1967) was the starting point for several other epidemiological and toxicological studies. These studies, mostly during the 1970s and the first half of the 1980s, focused on the effects of anaesthetic gases, in most cases nitrous oxide and halothane, on people occupationally exposed to them.

The effects observed in most of these studies were an increase in spontaneous abortions among women exposed during or before pregnancy, and among women partners of exposed men; an increase in congenital malformations in children of exposed mothers; and the occurrence of hepatic, renal and neurological problems and of some types of cancer in both men and women (Bruce et al. 1968, 1974; Bruce and Bach 1976). Even though the toxic effects of nitrous oxide and of halothane (and probably its substitutes as well) on the body are not exactly the same, they are commonly studied together, given that exposure generally occurs simultaneously.

It appears likely that there is a correlation between these exposures and an increased risk, particularly for spontaneous abortions and congenital malformations in children of women exposed during pregnancy (Stoklov et al. 1983; Spence 1987; Johnson, Buchan and Reif 1987). As a result, many of the people exposed have expressed great concern. Rigorous statistical analysis of these data, however, casts doubt on the existence of such a relationship. More recent studies reinforce these doubts while chromosomal studies yield ambiguous results.

The studies published by Cohen and colleagues (1971, 1974, 1975, 1980), who carried out extensive surveys for the American Society of Anesthesiologists (ASA), constitute a fairly extensive series of observations. Follow-up publications criticized some technical aspects of the earlier studies, particularly the sampling methodology and, especially, the selection of a proper control group. Other deficiencies included the lack of reliable information on the concentrations to which the subjects had been exposed, the methodology for dealing with false positives and the lack of controls for factors such as tobacco and alcohol use, prior reproductive histories and voluntary infertility. Consequently, some of these studies are now even considered invalid (Edling 1980; Buring et al. 1985; Tannenbaum and Goldberg 1985).

Laboratory studies have shown that exposure of animals to ambient concentrations of anaesthetic gases equivalent to those found in operating rooms does cause deterioration in their development, growth and adaptive behaviour (Ferstandig 1978; ACGIH 1991). These are not conclusive, however, since some of these experimental exposures involved anaesthetic or subanaesthetic levels, concentrations significantly higher than the levels of waste gases usually found in operating room air (Saurel-Cubizolles et al. 1994; Tran et al. 1994).

Nevertheless, even acknowledging that a relationship between the deleterious effects and exposures to waste anaesthetic gases has not been definitively established, the fact is that the presence of these gases and their metabolites is readily detected in the air of operating rooms, in exhaled air and in biological fluids. Accordingly, since there is concern about their potential toxicity, and because it is technically feasible to do so without inordinate effort or expense, it would be prudent to take steps to eliminate or reduce to a minimum the concentrations of waste anaesthetic gases in operating rooms and nearby areas (Rosell, Luna and Guardino 1989; NIOSH 1994).

Maximum Allowable Exposure Levels

The American Conference of Governmental Industrial Hygienists (ACGIH) has adopted a threshold limit value-time weighted average (TLV-TWA) of 50 ppm for nitrous oxide and halothane (ACGIH 1994). The TLV-TWA is the guideline for settings where the compound is manufactured; for operating rooms, the recommendation is that its concentration be kept lower, at a level below 1 ppm (ACGIH 1991). NIOSH sets a limit of 25 ppm for nitrous oxide and of 1 ppm for halogenated anaesthetics, with the additional recommendation that when they are used together, the concentration of halogenated compounds be reduced to a limit of 0.5 ppm (NIOSH 1977b).

With regard to values in biological fluids, the recommended limit for nitrous oxide in urine after 4 hours of exposure at average ambient concentrations of 25 ppm ranges from 13 to 19 μg/L, and after 4 hours of exposure at average ambient concentrations of 50 ppm, from 21 to 39 μg/L (Guardino and Rosell 1995). If the exposure is to a mixture of a halogenated anaesthetic and nitrous oxide, measurement of nitrous oxide is used as the basis for controlling exposure, because its higher concentrations make quantification easier.
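Compliance with these limits is normally judged on a time-weighted average. The following minimal sketch shows the arithmetic of an 8-hour TWA check against the NIOSH figures quoted above; the sample durations and concentrations are invented for illustration, and averaging over the full 8-hour shift is a simplification of the actual sampling conventions.

```python
# Minimal sketch of an 8-hour time-weighted average (TWA) check against
# the NIOSH figures quoted above (25 ppm N2O; 1 ppm halogenated agents,
# tightened to 0.5 ppm when used together with N2O). Sample data invented.

def twa_8h(samples):
    """samples: list of (duration_h, concentration_ppm) -> 8-h TWA in ppm.
    Unsampled time is assumed to carry zero exposure."""
    return sum(d * c for d, c in samples) / 8.0

n2o_samples = [(2.0, 40.0), (3.0, 15.0), (3.0, 5.0)]
halothane_samples = [(2.0, 1.2), (3.0, 0.3), (3.0, 0.1)]

print(f"N2O TWA: {twa_8h(n2o_samples):.1f} ppm (limit 25 ppm)")   # 17.5
print(f"Halothane TWA: {twa_8h(halothane_samples):.2f} ppm "      # 0.45
      f"(limit 0.5 ppm when mixed with N2O)")
```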

Analytical Measurement

Most of the procedures described for measuring residual anaesthetics in air are based on capturing these compounds by adsorption, or in an inert bag or container, and later analysing them by gas chromatography or infrared spectroscopy (Guardino and Rosell 1985). Gas chromatography is also employed to measure nitrous oxide in urine (Rosell, Luna and Guardino 1989), while isoflurane, which is not readily metabolized, is seldom measured.

Common Levels of Residual Concentrations in the Air of Operating Rooms

In the absence of preventive measures, such as the extraction of residual gases and/or an adequate supply of fresh air to the operating suite, personal concentrations of more than 6,000 ppm of nitrous oxide and 85 ppm of halothane have been measured (NIOSH 1977). Concentrations of up to 3,500 ppm and 20 ppm, respectively, have been measured in the ambient air of operating rooms. Implementation of corrective measures can reduce these concentrations to values below the environmental limits cited earlier (Rosell, Luna and Guardino 1989).
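The improvement achievable with corrective measures can be roughed out with the complete-mixing dilution relation used in industrial hygiene, C (ppm) = 10^6 · G/Q, where G is the rate at which waste gas escapes into the room and Q the ventilation rate. The sketch below is a back-of-the-envelope illustration only; the leak rate, room volume and renewal rate are assumptions, not values from the cited studies.

```python
# Back-of-the-envelope sketch: with complete mixing at steady state,
# C (ppm) = 1e6 * G / Q, where G is the rate at which waste gas escapes
# into the room and Q the ventilation rate, in the same volume units per
# hour. All numbers below are assumed for illustration only.

G = 30.0                  # L/h of N2O escaping to room air (assumed leak)
Q = 100.0 * 15.0 * 1000   # L/h: a 100 m3 room at 15 air changes/hour

c_ppm = 1e6 * G / Q
print(f"steady-state N2O concentration ~ {c_ppm:.0f} ppm")   # ~ 20 ppm

# Halving the escape rate (better scavenging) halves the steady-state
# concentration; doubling the ventilation rate does the same.
```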

Factors that Affect the Concentration of Waste Anaesthetic Gases

The factors which most directly affect the presence of waste anaesthetic gases in the environment of the operating room are the following.

Method of anaesthesia. The first question to consider is the method of anaesthesia, for example, whether or not the patient is intubated and the type of face mask being used. In dental, laryngeal or other forms of surgery in which intubation is precluded, the patient’s expired air would be an important source of emissions of waste gases, unless equipment specifically designed to trap these exhalations is properly placed near the patient’s breathing zone. Accordingly, dental and oral surgeons are considered to be particularly at risk (Cohen, Belville and Brown 1975; NIOSH 1977a), as are veterinary surgeons (Cohen, Belville and Brown 1974; Moore, Davis and Kaczmarek 1993).

Proximity to the focus of emission. As is usual in industrial hygiene, when a known point of emission of a contaminant exists, proximity to the source is the first factor to consider in personal exposure. In this case, the anaesthetists and their assistants are the persons most directly affected by the emission of waste anaesthetic gases, and personal concentrations have been measured on the order of twice the average levels found in the air of operating rooms (Guardino and Rosell 1985).

Type of circuit. In the few cases in which closed circuits are used, with rebreathing after cleansing of the air and resupply of oxygen and the necessary anaesthetics, there will be no emissions except in the case of equipment malfunction or a leak. In other cases, emissions depend on the characteristics of the system used, as well as on whether or not it is possible to add an extraction system to the circuit.

The concentration of anaesthetic gases. Another factor to take into account is the concentration of the anaesthetics used since, obviously, that concentration and the amounts found in the air of the operating room are directly related (Guardino and Rosell 1985). This factor is especially important in surgical procedures of long duration.

Type of surgical procedures. The duration of the operations, the time elapsed between procedures done in the same operating room and the specific characteristics of each procedure—which often determine which anaesthetics are used—are other factors to consider. The duration of the operation directly affects the residual concentration of anaesthetics in the air. In operating rooms where procedures are scheduled successively, the time elapsed between them also affects the presence of residual gases. Studies done in large hospitals with uninterrupted use of the operating rooms or with emergency operating rooms that are used beyond standard work schedules, or in operating rooms used for prolonged procedures (transplants, laryngotomies), show that substantial levels of waste gases are detected even before the first procedure of the day. This contributes to increased levels of waste gases in subsequent procedures. On the other hand, there are procedures that require temporary interruptions of inhalation anaesthesia (where extracorporeal circulation is needed, for example), and this also interrupts the emission of waste anaesthetic gases into the environment (Guardino and Rosell 1985).

Characteristics specific to the operating room. Studies done in operating rooms of different sizes, designs and ventilation (Rosell, Luna and Guardino 1989) have demonstrated that these characteristics greatly influence the concentration of waste anaesthetic gases in the room. Large, non-partitioned operating rooms tend to have the lowest measured concentrations of waste anaesthetic gases, while in small operating rooms (e.g., paediatric operating rooms) the measured concentrations are usually higher. The general ventilation system of the operating room and its proper operation is a fundamental factor in reducing the concentration of waste anaesthetics; the design of the ventilation system also affects the circulation of waste gases within the operating room and the concentrations at different locations and heights, something that can easily be verified by careful sampling.

Characteristics specific to the anaesthesia equipment. The emission of gases into the environment of the operating room depends directly on the characteristics of the anaesthesia equipment used. The design of the system, whether it includes a system for the return of excess gases, whether it can be attached to a vacuum or vented out of the operating room, and whether it has leaks or disconnected lines are all factors to consider when determining the presence of waste anaesthetic gases in the operating room.

Factors specific to the anaesthetist and his or her team. The anaesthetist and his or her team are the last element to consider, but not necessarily the least important. Knowledge of the anaesthesia equipment, of its potential problems and of the level of maintenance it receives (both by the team and by the hospital maintenance staff) very directly affects the emission of waste gases into the air of the operating room (Guardino and Rosell 1995). It has been clearly shown that, even with adequate technology, reduction of the ambient concentrations of anaesthetic gases cannot be achieved unless a preventive philosophy is part of the work routines of anaesthetists and their assistants (Guardino and Rosell 1992).

Preventive Measures

The basic preventive actions required to reduce occupational exposure to waste anaesthetic gases effectively can be summarized in the following six points:

  1. Anaesthetic gases should be thought of as occupational hazards. Even if from a scientific standpoint it has not been conclusively shown that anaesthetic gases have a serious deleterious effect on the health of people who are occupationally exposed, there is a high probability that some of the effects mentioned here are directly related to the exposure to waste anaesthetic gases. For that reason it is a good idea to consider them toxic occupational hazards.
  2. Scavenger systems should be used for waste gases. Scavenger systems are the most effective technical means of reducing waste gases in the air of the operating room (NIOSH 1975). These systems must fulfil two basic principles: they must store and/or adequately eliminate the entire volume of air expired by the patient, and they must be designed to guarantee that neither the patient’s respiration nor the proper functioning of the anaesthesia equipment will be affected, with separate safety devices for each function. The techniques most commonly employed are: a direct connection to a vacuum outlet with a flexible regulating chamber that allows for the discontinuous emission of gases over the respiratory cycle; directing the flow of the gases exhaled by the patient to the vacuum without a direct connection; and directing the flow of gases coming from the patient to the return of the ventilation system installed in the operating room, expelling these gases from the operating room and from the building. All these systems are technically easy to implement and very cost-effective; their incorporation into the design of installed respirators is recommended. In cases where systems that eliminate waste gases directly cannot be used because of the special characteristics of a procedure, localized extraction can be employed near the source of emission, as long as it does not affect the general ventilation system or the positive pressure in the operating room.
  3. General ventilation with a minimum of 15 renewals/hour in the operating room should be guaranteed. The general ventilation of the operating room should be properly regulated. It should not only maintain positive pressure and respond to the thermohygrometric characteristics of the ambient air, but should also provide a minimum of 15 to 18 renewals per hour; the effect of this renewal rate on residual concentrations is illustrated in the sketch after this list. Also, a monitoring procedure should be in place to ensure its proper functioning.
  4. Preventive maintenance of the anaesthesia circuit should be planned and regular. Preventive maintenance procedures should be set up that include regular inspections of the respirators. Verifying that no gases are being emitted to the ambient air should be part of the protocol followed when the equipment is first turned on, and its proper functioning with regard to the safety of the patient should be checked. The proper functioning of the anaesthesia circuit should be verified by checking for leaks, periodically replacing filters and checking the safety valves.
  5. Environmental and biological controls should be used. The implementation of environmental and biological controls provides information not only about the correct functioning of the various technical elements (extraction of gases, general ventilation) but also about whether the working procedures are adequate for curtailing the emission of waste gases into the air. Today these controls do not present technical problems and they can be implemented economically, which is why they are recommended.
  6. Education and training of the exposed personnel is crucial. Achieving an effective reduction of occupational exposure to waste anaesthetic gases requires educating all operating room personnel about the potential risks and training them in the required procedures. This is particularly applicable to anaesthetists and their assistants who are most directly involved and those responsible for the maintenance of the anaesthesia and air-conditioning equipment.
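As a rough illustration of point 3 above: with the emission source stopped and complete mixing assumed, the residual concentration decays as C(t) = C0·e^(−ACH·t), where ACH is the number of air renewals per hour. The sketch below applies this to the worst-case nitrous oxide level cited earlier; the decay model is a standard idealization, not a measured result.

```python
# Minimal sketch of why the 15 renewals/hour in point 3 matters. With the
# source switched off and complete mixing, concentration decays as
# C(t) = C0 * exp(-ACH * t). The starting value is the worst-case level
# cited earlier; the calculation itself is only illustrative.
import math

c0, target = 3500.0, 25.0    # ppm: measured worst case -> recommended ceiling
ach = 15.0                   # air changes per hour

t_hours = math.log(c0 / target) / ach
print(f"purge time ~ {t_hours * 60:.0f} minutes")   # ~ 20 minutes
```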

 

Conclusion

Although not definitively proven, there is enough evidence to suggest that exposure to waste anaesthetic gases may be harmful to HCWs. Stillbirths and congenital malformations in infants born to female workers and to the spouses of male workers represent the major forms of toxicity. Since it is technically feasible at low cost, reducing the concentration of these gases to a minimum in the ambient air of operating rooms and adjacent areas is desirable. This requires not only the use and correct maintenance of anaesthesia equipment and ventilation/air-conditioning systems, but also the education and training of all personnel involved, especially anaesthetists and their assistants, who are generally exposed to higher concentrations. Given the working conditions peculiar to operating rooms, training in correct work habits and procedures is very important for reducing the amounts of waste anaesthetic gases in the air to a minimum.

 

