97. Health Care Facilities and Services
Chapter Editor: Annalee Yassi
Table of Contents
Health Care: Its Nature and Its Occupational Health Problems
Annalee Yassi and Leon J. Warshaw
Social Services
Susan Nobel
Home Care Workers: The New York City Experience
Lenora Colbert
Occupational Health and Safety Practice: The Russian Experience
Valery P. Kaptsov and Lyudmila P. Korotich
Ergonomics and Health Care
Hospital Ergonomics: A Review
Madeleine R. Estryn-Béhar
Strain in Health Care Work
Madeleine R. Estryn-Béhar
Case Study: Human Error and Critical Tasks: Approaches for Improved System Performance
Work Schedules and Night Work in Health Care
Madeleine R. Estryn-Béhar
The Physical Environment and Health Care
Exposure to Physical Agents
Robert M. Lewy
Ergonomics of the Physical Work Environment
Madeleine R. Estryn-Béhar
Prevention and Management of Back Pain in Nurses
Ulrich Stössel
Case Study: Treatment of Back Pain
Leon J. Warshaw
Health Care Workers and Infectious Disease
Overview of Infectious Diseases
Friedrich Hofmann
Prevention of Occupational Transmission of Bloodborne Pathogens
Linda S. Martin, Robert J. Mullan and David M. Bell
Tuberculosis Prevention, Control and Surveillance
Robert J. Mullan
Chemicals in the Health Care Environment
Overview of Chemical Hazards in Health Care
Jeanne Mager Stellman
Managing Chemical Hazards in Hospitals
Annalee Yassi
Waste Anaesthetic Gases
Xavier Guardino Solá
Health Care Workers and Latex Allergy
Leon J. Warshaw
The Hospital Environment
Buildings for Health Care Facilities
Cesare Catananti, Gianfranco Damiani and Giovanni Capelli
Hospitals: Environmental and Public Health Issues
M.P. Arias
Hospital Waste Management
M.P. Arias
Managing Hazardous Waste Disposal Under ISO 14000
Jerry Spiegel and John Reimer
Tables
1. Examples of health care functions
2. 1995 integrated sound levels
3. Ergonomic noise reduction options
4. Total number of injuries (one hospital)
5. Distribution of nurses’ time
6. Number of separate nursing tasks
7. Distribution of nurses' time
8. Cognitive & affective strain & burn-out
9. Prevalence of work complaints by shift
10. Congenital abnormalities following rubella
11. Indications for vaccinations
12. Post-exposure prophylaxis
13. US Public Health Service recommendations
14. Chemicals’ categories used in health care
15. Chemicals cited in HSDB
16. Properties of inhaled anaesthetics
17. Choice of materials: criteria & variables
18. Ventilation requirements
19. Infectious diseases & Group III wastes
20. HSC EMS documentation hierarchy
21. Role & responsibilities
22. Process inputs
23. List of activities
Hospital Ergonomics: A Review
Author: Madeleine R. Estryn-Béhar
Ergonomics is an applied science that deals with the adaptation of work and the workplace to the characteristics and capabilities of the worker so that he or she may perform the duties of the job effectively and safely. It addresses the worker’s physical capacities in relation to the physical requirements of the job (e.g., strength, endurance, dexterity, flexibility, ability to tolerate positions and postures, visual and auditory acuity) as well as his or her mental and emotional status in relation to the way the work is organized (e.g., work schedules, workload and work-related stress). Ideally, adaptations are made to the furniture, equipment and tools used by the worker and to the work environment to enable the worker to perform adequately without risk to himself/herself, co-workers and the public. Occasionally, it is necessary to improve the worker’s adaptation to the job through, for example, special training and the use of personal protective equipment.
Since the mid 1970s, the application of ergonomics to hospital workers has broadened. It is directed now at those involved in direct patient care (e.g., physicians and nurses), those involved in ancillary services (e.g., technicians, laboratory staff, pharmacists and social workers) and those providing support services (e.g., administrative and clerical personnel, food service staff, housekeeping staff, maintenance workers and security staff).
Extensive research has been conducted into the ergonomics of hospitalization, with most studies attempting to identify the extent to which hospital administrators should allow hospital personnel latitude in developing strategies to reconcile an acceptable workload with good quality of care. Participatory ergonomics has become increasingly widespread in hospitals in recent years. More specifically, wards have been reorganized on the basis of ergonomic analyses of activity undertaken in collaboration with medical and paramedical personnel, and participatory ergonomics has been used as the basis for the adaptation of equipment for use in health care.
In studies of hospital ergonomics, workstation analysis must extend at least to the departmental level—the distance between rooms and the amount and location of equipment are all crucial considerations.
Physical strain is one of the primary determinants of the health of HCWs and the quality of care that they dispense. This being said, the frequent interruptions that hinder care-giving and the effect of psychological factors associated with confrontations with serious illness, ageing and death must also be addressed. Accounting for all these factors is a difficult task, but approaches focusing only on single factors will fail to improve either working conditions or the quality of care. Similarly, patients’ perception of the quality of their hospital stay is determined by the effectiveness of the care they receive, their relationship with physicians and other personnel, the food and the architectural environment.
Basic to hospital ergonomics is study of the sum and interaction of personal factors (e.g., fatigue, fitness, age and training) and circumstantial factors (e.g., work organization, schedule, floor layout, furniture, equipment, communication and psychological support within the work team), which combine to affect the performance of work. Precise identification of the actual work performed by health care workers depends on ergonomic observation of entire workdays and collection of valid and objective information on the movements, postures, cognitive performance and emotional control called upon to satisfy work requirements. This helps to detect factors that may interfere with effective, safe, comfortable and healthy work. This approach also sheds light on the potential for workers’ suffering or taking pleasure in their work. Final recommendations must take the interdependence of the various professional and ancillary personnel attending the same patient into account.
These considerations lay the groundwork for further, specific research. Analysis of strain related to the use of basic equipment (e.g., beds, meal carts and mobile x-ray equipment) may help clarify the conditions of acceptable use. Measurements of lighting levels may be complemented by information on the size and contrast of medication labels, for example. Where alarms emitted by different intensive-care-unit equipment can be confused, analysis of their acoustic spectrum may prove useful. Computerization of patient charts should not be undertaken unless the formal and informal information-support structures have been analysed. The interdependence of the various elements of the work environment of any given caregiver should therefore always be borne in mind when analysing isolated factors.
Analysis of the interaction of different factors influencing care—physical strain, cognitive strain, affective strain, scheduling, ambience, architecture and hygiene protocols—is essential. It is important to adapt schedules and common work areas to the needs of the work team when attempting to improve overall patient management. Participatory ergonomics is a way of using specific information to bring about wide-ranging and relevant improvements to the quality of care and to working life. Involving all categories of personnel in key stages of the search for solutions helps ensure that the modifications finally adopted will have their full support.
Working Postures
Epidemiological studies of joint and musculoskeletal disorders. Several epidemiological studies have indicated that inappropriate postures and handling techniques are associated with a doubling of the number of back, joint and muscle problems requiring treatment and time off the job. This phenomenon, discussed in greater detail elsewhere in this chapter and Encyclopaedia, is related to physical and cognitive strain.
Working conditions differ from country to country. Siegel et al. (1993) compared conditions in Germany and Norway and found that 51% of German nurses, but only 24% of Norwegian nurses, suffered lower-back pain on any given day. Working conditions in the two countries differed: in German hospitals, the patient-nurse ratio was twice as high and the number of adjustable-height beds half that in Norwegian hospitals, and fewer nurses had patient-handling equipment (78% versus 87% in Norwegian hospitals).
Epidemiological studies of pregnancy and its outcome. Because the hospital workforce is usually predominantly female, the influence of work on pregnancy often becomes an important issue (see articles on pregnancy and work elsewhere in this Encyclopaedia). Saurel-Cubizolles et al. (1985) in France, for example, studied 621 women who returned to hospital work after giving birth and found that a higher rate of premature births was associated with heavy housekeeping chores (e.g., cleaning windows and floors), carrying heavy loads and long periods of standing. When these tasks were combined, the rate of premature births increased: 6% when only one of these factors was involved and up to 21% when two or three were involved. These differences remained significant after adjustment for seniority, social and demographic characteristics and professional level. These factors were also associated with a higher frequency of contractions, more hospital admissions during pregnancy and, on average, longer sick leave.
In Sri Lanka, Senevirane and Fernando (1994) compared 130 pregnancies borne by 100 nursing officers and 126 by clerical workers whose jobs presumably were more sedentary; socio-economic backgrounds and use of prenatal care were similar for both groups. Odds ratios for complications of pregnancy (2.18) and preterm delivery (5.64) were high among nursing officers.
Ergonomic Observation of Workdays
The effect of physical strain on health care workers has been demonstrated through continuous observation of workdays. Research in Belgium (Malchaire 1992), France (Estryn-Béhar and Fouillot 1990a) and Czechoslovakia (Hubacova, Borsky and Strelka 1992) has shown that health care workers spend 60 to 80% of their workday standing (see table 1). Belgian nurses were observed to spend approximately 10% of their workday bent over; Czechoslovakian nurses spent 11% of their workday positioning patients; and French nurses spent 16 to 24% of their workday in uncomfortable positions, such as stooping or squatting, or with their arms raised or loaded.
Table 1. Distribution of nurses’ time in three studies

Average time for the main postures and total distance walked by nurses:

|  | Czechoslovakia | Belgium | France |
|---|---|---|---|
| Authors | Hubacova, Borsky and Strelka 1992* | Malchaire 1992** | Estryn-Béhar and Fouillot 1990a*** |
| Departments | 5 medical and surgical departments | Cardiovascular surgery | 10 medical and surgical departments |
| Per cent of working time standing | 76% | Morning 61% | Morning 74% |
| Including stooping, squatting, arms raised or loaded | 11% |  | Morning 16% |
| Standing flexed |  | Morning 11% |  |
| Distance walked |  | Morning 4 km | Morning 7 km |
| Per cent of working time with patients | Three shifts: 47% | Morning 38% | Morning 24% |

Number of observations per shift: * 74 observations on 3 shifts. ** Morning: 10 observations (8 h); afternoon: 10 observations (8 h); night: 10 observations (11 h). *** Morning: 8 observations (8 h); afternoon: 10 observations (8 h); night: 9 observations (10-12 h).
In France, night-shift nurses spent somewhat more time sitting, but they ended their shift making beds and dispensing care, both of which involve work in uncomfortable positions. A nurses’ aide assisted them in this, but this should be contrasted with the morning shift, during which these tasks were usually performed by two nurses’ aides. In general, nurses working day shifts spent less time in uncomfortable positions. Nurses’ aides were on their feet constantly, and uncomfortable positions, due largely to inadequate equipment, accounted for 31% (afternoon shift) to 46% (morning shift) of their time. Patient facilities in these French and Belgian teaching hospitals were spread out over large areas and consisted of rooms containing one to three beds. Nurses in these wards walked an average of 4 to 7 km per day.
Detailed ergonomic observation of entire workdays (Estryn-Béhar and Hakim-Serfaty 1990) is useful in revealing the interaction of the factors that determine quality of care and the manner in which work is performed. Consider the very different situations in a paediatric intensive care unit and a rheumatology ward. In paediatric resuscitation units, the nurse spends 71% of her time in patients’ rooms, and each patient’s equipment is kept on individual carts stocked by nurses’ aides. The nurses in this ward change location only 32 times per shift, walking a total of 2.5 km. They are able to communicate with physicians and other nurses in the adjoining lounge or nurses’ station through intercoms which have been installed in all the patients’ rooms.
By contrast, the nursing station in the rheumatology ward is very far from patients’ rooms, and care preparation is lengthy (38% of shift time). As a result, the nurses spend only 21% of their time in patients’ rooms and change location 128 times per shift, walking a total of 17 km. This clearly illustrates the interrelationship between physical strain, back problems and organizational and psychological factors. Because they need to move rapidly and get equipment and information, nurses only have time for hallway consultations—there is no time to sit while dispensing care, listen to patients and give patients personalized and integrated responses.
Continuous observation of 18 Dutch nurses in long-term-stay wards revealed that they spent 60% of their time performing physically demanding work with no direct contact with their patients (Engels, Senden and Hertog 1993). Housekeeping and preparation account for most of the 20% of the time described as spent in “slightly hazardous” activities. In all, 0.2% of shift time was spent in postures requiring immediate modification and 1.5% of shift time in postures requiring rapid modification. Contact with patients was the type of activity most frequently associated with these hazardous postures. The authors recommend modifying patient-handling practices and other less hazardous but more frequent tasks.
Given the physiological strain of the work of nurses’ aides, continuous measurement of heart rate is a useful complement to observation. Raffray (1994) used this technique to identify arduous housekeeping tasks and recommended not restricting personnel to this type of task for the whole day.
Electromyographic (EMG) fatigue analysis is also useful when body posture must remain more or less static, for example during operations using an endoscope (Luttman et al. 1996).
Influence of architecture, equipment and organization
The inadequacy of nursing equipment, particularly beds, in 40 Japanese hospitals was demonstrated by Shindo (1992). In addition, patients’ rooms, both those lodging six to eight patients and single rooms reserved for the very ill, were poorly laid out and extremely small. Matsuda (1992) reported that these observations should lead to improvements in the comfort, safety and efficiency of nursing work.
In a French study (Saurel 1993), the size of patient rooms was problematic in 45 of 75 medium- and long-term-stay wards.
The mean available area per bed for patients and nurses is at the root of these problems and decreases as the number of beds per room increases: 12.98 m2, 9.84 m2, 9.60 m2, 8.49 m2 and 7.25 m2 for rooms with one, two, three, four and more than four beds. A more accurate index of the useful area available to personnel is obtained by subtracting the area occupied by the beds themselves (1.8 to 2.0 m2) and by other equipment. The French Department of Health prescribes a useful surface area of 16 m2 for single rooms and 22 m2 for double rooms. The Quebec Department of Health recommends 17.8 m2 and 36 m2, respectively.
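The arithmetic behind these figures can be sketched briefly. The following Python fragment (an illustrative calculation, not part of the original study) derives the useful area per bed by subtracting a nominal bed footprint from the mean per-bed figures reported by Saurel (1993):

```python
# Illustrative sketch: useful area per bed left for personnel, computed from
# the mean area-per-bed figures reported by Saurel (1993) and a nominal bed
# footprint taken as the midpoint of the 1.8-2.0 m2 range cited in the text.

MEAN_AREA_PER_BED_M2 = {1: 12.98, 2: 9.84, 3: 9.60, 4: 8.49, 5: 7.25}  # 5 = "more than four beds"
BED_FOOTPRINT_M2 = 1.9  # assumed midpoint of the 1.8-2.0 m2 range

def useful_area_per_bed(beds: int) -> float:
    """Mean area per bed minus the area occupied by the bed itself."""
    return MEAN_AREA_PER_BED_M2[beds] - BED_FOOTPRINT_M2

for beds in sorted(MEAN_AREA_PER_BED_M2):
    print(f"{beds} bed(s): {useful_area_per_bed(beds):.2f} m2 free per bed")
```

For a four-bed room, for example, this leaves roughly 6.6 m2 of working space per bed before other equipment is accounted for, illustrating how crowding worsens as the number of beds per room rises.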
Turning to factors favouring the development of back problems, variable-height mechanisms were present on 55.1% of the 7,237 beds examined; of these, only 10.3% had electric controls. Patient-transfer systems, which reduce lifting, were rare. These systems were systematically used by 18.2% of the 55 responding wards, with over half the wards reporting using them “rarely” or “never”. “Poor” or “rather poor” manoeuvrability of meal carts was reported by 58.5% of 65 responding wards. There was no periodic maintenance of mobile equipment in 73.3% of 72 responding wards.
In almost half the responding wards, there were no rooms with seats that nurses could use. In many cases, this appears to have been due to the small size of the patient rooms. Sitting was usually possible only in the lounges—in 10 units, the nursing station itself had no seats. However, 13 units reported having no lounge and 4 units used the pantry for this purpose. In 30 wards, there were no seats in this room.
According to 1992 statistics from the Confederation of Health Service Employees (COHSE) in the United Kingdom, 68.2% of nurses felt that there were not enough mechanical patient lifts and handling aids and 74.5% felt that they were expected to accept back problems as a normal part of their work.
In Quebec, the Joint Sectoral Association, Social Affairs Sector (Association pour la santé et la sécurité du travail, secteur affaires sociales, ASSTAS), initiated its “Prevention-Planning-Renovation-Construction” project in 1993 (Villeneuve 1994). Over 18 months, funding was requested for almost 100 bipartite projects, some costing several million dollars. This programme’s goal is to maximize investments in prevention by addressing health and safety concerns early in the design stage of planning, renovation and construction projects.
The association completed the modification of the design specifications for patient rooms in long-term-care units in 1995. After noting that three-quarters of occupational accidents involving nurses occur in patient rooms, the association proposed new dimensions for patients’ rooms, and new rooms must now provide a minimum amount of free space around beds and accommodate patient lifts. Measuring 4.05 by 4.95 m, the rooms are more square than the older, rectangular rooms. To improve performance, ceiling-mounted patient lifts were installed, in collaboration with the manufacturer.
The association is also working on the modification of construction standards for washrooms, where many occupational accidents also occur, although to a lesser extent than in the rooms themselves. Finally, the feasibility of applying anti-skid coatings (with a coefficient of friction above the minimum standard of 0.50) on floors is being studied, since patient autonomy is best promoted by providing a non-skid surface on which neither they nor nurses can slip.
Evaluation of equipment that reduces physical strain
Proposals for improving beds (Teyssier-Cotte, Rocher and Mereau 1987) and meal carts (Bouhnik et al. 1989) have been formulated, but their impact is too limited. Tintori et al. (1994) studied adjustable-height beds with electric trunk-lifts and mechanical mattress-lifts. The trunk-lifts were judged satisfactory by the staff and patients, but the mattress-lifts were very unsatisfactory, since adjusting the beds required more than eight pedal strokes, each of which exceeded standards for foot force. Pushing a button located close to the patient’s head while talking to her or him is clearly preferable to pumping a pedal eight times from the foot of the bed (see figure 1). Because of time constraints, the mattress lift was often simply not used.
Figure 1. Electronically-operated trunk-lifts on beds effectively reduce lifting accidents
B. Floret
Van der Star and Voogd (1992) studied health care workers caring for 30 patients in a new prototype of bed over a period of six weeks. Observations of the workers’ positions, the height of work surfaces, physical interaction between nurses and patients and the size of the work space were compared to data collected on the same ward over a seven-week period prior to the introduction of the prototype. Use of the prototypes reduced the total time spent in uncomfortable positions while washing patients from 40% to 20%; for bed-making the figures were 35% and 5%. Patients also enjoyed greater autonomy and often changed positions on their own, raising their trunks or legs by means of electric control buttons.
In Swedish hospitals, each double room is equipped with ceiling-mounted patient lifts (Ljungberg, Kilbom and Goran 1989). Rigorous programmes such as the April Project evaluate the interrelation of working conditions, work organization, the establishment of a back school and the improvement of physical fitness (Öhling and Estlund 1995).
In Quebec, ASSTAS developed a global approach to the analysis of working conditions causing back problems in hospitals (Villeneuve 1992). Between 1988 and 1991, this approach led to modifications of the work environment and equipment used in 120 wards and a 30% reduction in the frequency and severity of occupational injuries. In 1994, a cost-benefit analysis performed by the association demonstrated that the systematic implementation of ceiling-mounted patient lifts would reduce occupational accidents and increase productivity, compared to the continued use of mobile, ground-based lifts (see figure 2).
Figure 2. Using ceiling-mounted patient lifts to reduce lifting accidents
Accounting for individual variation and facilitating activity
The female population in France is generally not very physically active. Of 1,505 nurses studied by Estryn-Béhar et al. (1992), 68% participated in no athletic activity, with inactivity more pronounced among mothers and unskilled personnel. In Sweden, fitness programmes for hospital personnel have been reported to be useful (Wigaeus Hjelm, Hagberg and Hellstrom 1993), but are feasible only if potential participants do not end their work day too tired to participate.
The adoption of better work postures is also conditioned by the possibility of wearing appropriate clothing (Lempereur 1992). The quality of shoes is particularly important. Hard soles are to be avoided. Anti-skid soles prevent occupational accidents caused by slips and falls, which in many countries are the second-leading cause of accidents leading to work absence. Ill-fitting overshoes or boots worn by operating room personnel to minimize the build-up of static electricity may be a hazard for falls.
Slips on level floors can be prevented by using low-slip floor surfaces that require no waxing. The risk of slips, particularly at doorways, can also be reduced by using techniques that do not leave the floor wet for long. The use of one mop per room, recommended by hygiene departments, is one such technique and has the additional advantage of reducing the handling of buckets of water.
In Vasteras County (Sweden), the implementation of several practical measures reduced painful syndromes and absenteeism by at least 25% (Modig 1992). In the archives (e.g., record or file rooms), ground- and ceiling-level shelves were eliminated, and an adjustable sliding board on which personnel can take notes while consulting the archives was installed. A reception office equipped with movable filing units, a computer and a telephone was also constructed. The height of the filing units is adjustable, allowing employees to adjust them to their own needs and facilitating the transition from sitting to standing during work.
Importance of “anti-lifting”
Manual patient-handling techniques designed to prevent back injuries have been proposed in many countries. Given the poor results of these techniques that have been reported to date (Dehlin et al. 1981; Stubbs, Buckle and Hudson 1983), more work in this area is needed.
The department of kinesiology of the University of Groningen (Netherlands) has developed an integrated patient-handling programme (Landewe and Schröer 1993).
In the “anti-lifting” approach, the resolution of problems associated with patient transfers is based on the systematic analysis of all aspects of transfers, especially those related to patients, nurses, transfer equipment, teamwork, general working conditions and environmental and psychological barriers to the use of patient lifts (Friele and Knibbe 1993).
The application of Council Directive 90/269/EEC of 29 May 1990 on the manual handling of loads is an example of an excellent starting point for this approach. Besides requiring employers to implement appropriate work organization structures or other appropriate means, particularly mechanical equipment, to avoid manual handling of loads by workers, it also emphasizes the importance of “no-risk” handling policies that incorporate training. In practice, the adoption of appropriate postures and handling practices depends on the amount of functional space, the presence of appropriate furniture and equipment, good collaboration on work organization and quality of care, good physical fitness and comfortable work clothing. The net effect of these factors is improved prevention of back problems.
Cognitive Strain
Continuous observation has revealed that nurses’ workdays are characterized by continual reorganization of their work schedules and frequent interruptions.
Belgian (Malchaire 1992) and French (Gadbois et al. 1992; Estryn-Béhar and Fouillot 1990b) studies have revealed that nurses perform 120 to 323 separate tasks during their workday (see table 2). Work interruptions are very frequent throughout the day, ranging from 28 to 78 per workday. Many of the units studied were large, short-term-stay units in which the nurses’ work consisted of a long series of spatially dispersed, short-duration tasks. Planning of work schedules was complicated by incessant technical innovation, the close interdependence of the work of the various staff members and a generally haphazard approach to work organization.

Table 2. Number of separate tasks undertaken by nurses, and interruptions during each shift

|  | Belgium | France | France |
|---|---|---|---|
| Authors | Malchaire 1992* | Gadbois et al. 1992** | Estryn-Béhar and Fouillot 1990b*** |
| Departments | Cardiovascular surgery | Surgery (S) and medicine (M) | Ten medical and surgical departments |
| Number of separate tasks | Morning 120/8 h | S (day) 276/12 h | Morning 323/8 h |
| Number of interruptions |  | S (day) 36/12 h | Morning 78/8 h |

Number of hours of observation: * Morning: 80 h; afternoon: 80 h; night: 110 h. ** Surgery: 238 h; medicine: 220 h. *** Morning: 64 h; afternoon: 80 h; night: 90 h.
Gadbois et al. (1992) observed an average of 40 interruptions per workday, of which 5% were caused by patients, 40% by inadequate transmission of information, 15% by telephone calls and 25% by equipment. Ollagnier and Lamarche (1993) systematically observed nurses in a Swiss hospital and recorded 8 to 32 interruptions per day, depending on the ward. On average, these interruptions represented 7.8% of the workday.
Work interruptions such as these, caused by inappropriate information supply and transmission structures, prevent workers from completing all their tasks and lead to worker dissatisfaction. The most serious consequence of this organizational deficiency is the reduction of time spent with patients (see table 1). In the first three studies cited above, nurses spent at most 30% of their time with patients on average. In Czechoslovakia, where multiple-bed rooms were common, nurses needed to change rooms less frequently and spent 47% of their shift time with patients (Hubacova, Borsky and Strelka 1992). This clearly demonstrates how architecture, staffing levels and mental strain are all interrelated.
Estryn-Béhar et al. (1994) observed seven occupations and schedules in two specialized medical wards with similar spatial organization and located in the same high-rise building. While work in one ward was highly sectorized, with two teams of a nurse and a nurses’ aide attending half of the patients, there were no sectors in the other ward, and basic care for all patients was dispensed by two nurses’ aides. There were no differences in the frequency of patient-related interruptions in the two wards, but team-related interruptions were clearly more frequent in the ward without sectors (35 to 55 interruptions compared to 23 to 36 interruptions). Nurses’ aides, morning-shift nurses and afternoon-shift nurses in the non-sectorized ward suffered 50, 70 and 30% more interruptions than did their colleagues in the sectorized one.
Sectorization thus appears to reduce the number of interruptions and the fracturing of work shifts. These results were used to plan the reorganization of the ward, in collaboration with the medical and paramedical staff, so as to facilitate sectorization of the office and the preparation area. The new office space is modular and easily divided into three offices (one for physicians and one for each of the two nursing teams), each separated by sliding glass partitions and furnished with at least six seats. Installation of two counters facing each other in the common preparation area means that nurses who are interrupted during preparation can return and find their materials in the same position and state, unaffected by their colleagues’ activities.
Reorganization of work schedules and technical services
Professional activity in technical departments is much more than the mere sum of tasks associated with each test. A study conducted in several nuclear medicine departments (Favrot-Laurens 1992) revealed that nuclear medicine technicians spend very little of their time performing technical tasks. In fact, a significant part of technicians’ time was spent coordinating activity and workload at the various workstations, transmitting information and making unavoidable adjustments. These responsibilities stem from technicians’ obligation to be knowledgeable about each test and to possess essential technical and administrative information in addition to test-specific information such as time and injection site.
Information processing necessary for the delivery of care
Roquelaure, Pottier and Pottier (1992) were asked by a manufacturer of electroencephalography (EEG) equipment to simplify the use of the equipment. They responded by facilitating the reading of visual information on controls which were excessively complicated or simply unclear. As they point out, “third-generation” machines present unique difficulties, due in part to the use of visual display units packed with barely legible information. Deciphering these screens requires complex work strategies.
On the whole, however, little attention has been paid to the need to present information in a manner that facilitates rapid decision-making in health care departments. For example, the legibility of information on medicine labels still leaves much to be desired, according to one study of 240 dry oral and 364 injectable medications (Ott et al. 1991). Ideally, labels for dry oral medications administered by nurses, who are frequently interrupted and attend to several patients, should have a matte surface, characters at least 2.5 mm high and comprehensive information on the medication in question. Only 36% of the 240 medications examined satisfied the first two criteria, and only 6% satisfied all three. Similarly, print smaller than 2.5 mm was used on the labels of 63% of the 364 injectable medications.
In many countries where English is not spoken, machine control panels are still labelled in English. Patient-chart software is being developed in many countries. In France, this type of software development is often motivated by a desire to improve hospital management and undertaken without adequate study of the software’s compatibility with actual working procedures (Estryn-Béhar 1991). As a result, the software may actually increase the complexity of nursing, rather than reduce cognitive strain. Requiring nurses to page through multiple screens of information to obtain the information they need to fill a prescription may increase the number of errors they make and memory lapses they suffer.
While Scandinavian and North American countries have computerized much of their patient records, it must be borne in mind that hospitals in these countries benefit from a high staff-to-patient ratio, and work interruptions and constant reshuffling of priorities are therefore less problematic there. In contrast, patient-chart software designed for use in countries with lower staff-to-patient ratios must be able to easily produce summaries and facilitate reorganization of priorities.
Human error in anaesthesia
Cooper, Newbower and Kitz (1984), in their study of the factors underlying errors during anaesthesia in the United States, found equipment design to be crucial. The 538 errors studied, largely drug administration and equipment problems, were related to the distribution of activities and the systems involved. According to Cooper, better design of equipment and monitoring apparatus would lead to a 22% reduction in errors, while complementary training of anaesthesiologists, using new technologies such as anaesthesia simulators, would lead to a 25% reduction. Other recommended strategies focus on work organization, supervision and communications.
Acoustic alarms in operating theatres and intensive-care units
Several studies have shown that too many types of alarms are used in operating theatres and intensive-care units. In one study, anaesthetists identified only 33% of alarms correctly, and only two monitors had recognition rates exceeding 50% (Finley and Cohen 1991). In another study, anaesthetists and anaesthesia nurses correctly identified alarms in only 34% of cases (Loeb et al. 1990). Retrospective analysis showed that 26% of nurses’ errors were due to similarities in alarm sounds and 20% to similarities in alarm functions. Momtahan and Tansley (1989) reported that recovery-room nurses and anaesthetists correctly identified alarms in only 35% and 22% of cases respectively. In another study by Momtahan, Hétu and Tansley (1993), 18 physicians and technicians were able to identify only 10 to 15 of 26 operating-theatre alarms, while 15 intensive-care nurses were able to identify only 8 to 14 of 23 alarms used in their unit.
De Chambost (1994) studied the acoustic alarms of 22 types of machines used in an intensive-care unit in the Paris region. Only the cardiogram alarms and those of one of the two types of automated-plunger syringes were readily identified. The others were not immediately recognized and required personnel first to investigate the source of the alarm in the patient’s room and then return with the appropriate equipment. Spectral analysis of the sound emitted by eight machines revealed significant similarities and suggests the existence of a masking effect between alarms.
The unacceptably high number of unjustifiable alarms has been the object of particular criticism. O’Carroll (1986) characterized the origin and frequency of alarms in a general intensive-care unit over three weeks. Only eight of 1,455 alarms were related to a potentially fatal situation. There were many false alarms from monitors and perfusion pumps. There was little difference between the frequency of alarms during the day and night.
Similar results have been reported for alarms used in anaesthesiology. Kestin, Miller and Lockhart (1988), in a study of 50 patients and five commonly used anaesthesia monitors, reported that only 3% of alarms indicated a real risk for the patient and that 75% of alarms were unfounded (caused by patient movement, interference and mechanical problems). On average, ten alarms were triggered per patient, equivalent to one alarm every 4.5 minutes.
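The clinical yield implied by these alarm counts can be checked with simple arithmetic. The following sketch uses only the figures reported in the two studies cited above (O'Carroll 1986; Kestin, Miller and Lockhart 1988); the helper function name is ours:

```python
# Sketch: clinical yield of intensive-care and anaesthesia alarms,
# using the counts reported in the studies cited in the text.

def alarm_yield(relevant: int, total: int) -> float:
    """Fraction of alarms signalling a genuine patient risk."""
    return relevant / total

# O'Carroll (1986): 8 of 1,455 ICU alarms related to a potentially
# fatal situation over three weeks.
icu_yield = alarm_yield(8, 1455)  # roughly 0.55% of alarms

# Kestin et al. (1988): about ten alarms per patient, one every
# 4.5 minutes, implying roughly 45 monitored minutes per patient.
alarms_per_patient = 10
minutes_per_alarm = 4.5
monitored_minutes = alarms_per_patient * minutes_per_alarm

print(f"ICU alarms with clinical meaning: {icu_yield:.1%}")
print(f"Implied monitored time per patient: {monitored_minutes:.0f} min")
```

Figures of well under 1% clinically meaningful alarms make the observed habit of disabling alarms (McIntyre 1985) easier to understand, if no less dangerous.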
A common response to false alarms is simply to disable them. McIntyre (1985) reported that 57% of Canadian anaesthetists admitted deliberately inactivating an alarm. Obviously, this could lead to serious accidents.
These studies underscore the poor design of hospital alarms and the need for alarm standardization based on cognitive ergonomics. Both Kestin, Miller and Lockhart (1988) and Kerr (1985) have proposed alarm modifications that take into account risk and the expected corrective responses of hospital personnel. As de Keyser and Nyssen (1993) have shown, the prevention of human error in anaesthesia integrates different measures—technological, ergonomic, social, organizational and training.
Technology, human error, patient safety and perceived psychological strain
Rigorous analysis of the error process is very useful. Sundström-Frisk and Hellström (1995) reported that equipment deficiencies and/or human error were responsible for 57 deaths and 284 injuries in Sweden between 1977 and 1986. The authors interviewed 63 intensive-care-unit teams involved in 155 incidents (“near-accidents”) involving advanced medical equipment; most of these incidents had not been reported to authorities. Seventy typical “near-accident” scenarios were developed. Causal factors identified included inadequate technical equipment and documentation, the physical environment, procedures, staffing levels and stress. The introduction of new equipment may lead to accidents if the equipment is poorly adapted to users’ needs and is introduced in the absence of basic changes in training and work organization.
In order to cope with forgetfulness, nurses develop several strategies for remembering, anticipating and avoiding incidents. Incidents still occur, however, and even when patients are unaware of errors, near-accidents cause personnel to feel guilty. The article "Case Study: Human Error and Critical Tasks" deals with some aspects of the problem.
Emotional or Affective Strain
Nursing work, especially if it forces nurses to confront serious illness and death, can be a significant source of affective strain and may lead to burn-out, which is discussed more fully elsewhere in this Encyclopaedia. Nurses’ ability to cope with this stress depends on the extent of their support network and on their opportunities to discuss and improve patients’ quality of life. The following section summarizes the principal findings of Leppanen and Olkinuora’s (1987) review of Finnish and Swedish studies on stress.
In Sweden, the main motivations reported by health professionals for entering their profession were the “moral calling” of the work, its usefulness and the opportunity to exercise competence. However, almost half of nurses’ aides rated their knowledge as inadequate for their work, and one-quarter of nurses, one-fifth of registered nurses, one-seventh of physicians and one-tenth of head nurses considered themselves incompetent at managing some types of patients. Incompetence in managing psychological problems was the most commonly cited problem and was particularly prevalent among nurses’ aides, although it was also cited by nurses and head nurses. Physicians, on the other hand, considered themselves competent in this area. The authors focus on the difficult situation of nurses’ aides, who spend more time with patients than the others but are, paradoxically, unable to inform patients about their illness or treatment.
Several studies reveal the lack of clarity in delineating responsibilities. Pöyhönen and Jokinen (1980) reported that only 20% of Helsinki nurses were always informed of their tasks and the goals of their work. In a study conducted in a paediatric ward and an institute for disabled persons, Leppanen showed that the distribution of tasks did not allow nurses enough time to plan and prepare their work, perform office work and collaborate with team members.
Responsibility in the absence of decision-making power appears to be a stress factor. Thus, 57% of operating-room nurses felt that ambiguities concerning their responsibilities aggravated their cognitive strain; 47% of surgical nurses reported being unfamiliar with some of their tasks and felt that patients’ and nurses’ conflicting expectations were a source of stress. Further, 47% reported increased stress when problems occurred and physicians were not present.
According to three European epidemiological studies, burn-out affects approximately 25% of nurses (Landau 1992; Saint-Arnaud et al. 1992; Estryn-Béhar et al. 1990) (see table 3). Estryn-Béhar et al. studied 1,505 female health care workers, using a cognitive strain index that integrates information on work interruptions and reorganization and an affective strain index that integrates information on work ambience, teamwork, congruity of qualification and work, time spent talking to patients and the frequency of hesitant or uncertain responses to patients. Burn-out was observed in 12% of nurses with low, 25% of those with moderate and 39% of those with high cognitive strain. The relationship between burn-out and affective strain was even stronger: burn-out was observed in 16% of nurses with low, 25% of those with moderate and 64% of those with high affective strain. After adjustment by logistic multivariate regression analysis for social and demographic factors, women with a high affective strain index had an odds ratio for burn-out of 6.88 compared to those with a low index.
Table 3. Cognitive and affective strain and burn-out among health workers
                     | Germany*                   | Canada**                         | France***
Number of subjects   | 24                         | 868                              | 1,505
Method               | Maslach Burn-out Inventory | Ilfeld Psychiatric Symptom Index | Goldberg General Health Questionnaire
High emotional       | 33%                        | 20%                              | 26%
Degree of burn-out,  | Morning 2.0;               | Morning 25%;                     |
Percentage suffering | Cognitive and              | Cognitive strain:                |

* Landau 1992. ** Saint-Arnaud et al. 1992. *** Estryn-Béhar et al. 1990.
Saint-Arnaud et al. reported a correlation between the frequency of burn-out and the score on their composite cognitive and affective strain index. Landau’s results support these findings.
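Odds ratios of the kind quoted above can be illustrated by computing a crude (unadjusted) ratio directly from two prevalences. A minimal sketch, using the 16% and 64% burn-out rates reported by Estryn-Béhar et al. for low and high affective strain; note that the crude figure it yields is not the published value of 6.88, which was adjusted by logistic regression for social and demographic factors:

```python
def odds(p: float) -> float:
    """Convert a prevalence (proportion) to odds."""
    return p / (1.0 - p)

def odds_ratio(p_exposed: float, p_reference: float) -> float:
    """Crude odds ratio comparing two prevalences."""
    return odds(p_exposed) / odds(p_reference)

# Burn-out: 64% at high affective strain vs. 16% at low affective strain.
crude_or = odds_ratio(0.64, 0.16)
print(f"Crude odds ratio: {crude_or:.2f}")  # ~9.33 before adjustment
```

The gap between the crude ratio and the adjusted 6.88 shows how much of the raw association is accounted for by the social and demographic covariates.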
Finally, 25% of 520 nurses working in a cancer treatment centre and a general hospital in France were reported to exhibit high burn-out scores (Rodary and Gauvain-Piquard 1993). High scores were most closely associated with a lack of support. Nurses with high scores more frequently reported feeling that their department did not regard them highly, did not take their knowledge of patients into account and did not place the highest value on their patients’ quality of life. They were also more likely to report being physically afraid of their patients and being unable to organize their work schedules as they wished. In light of these results, it is interesting to note that Katz (1983) observed a high suicide rate among nurses.
Impact of workload, autonomy and support networks
A study of 900 Canadian nurses revealed an association between workload and five indices of cognitive strain measured by the Ilfeld questionnaire: the global score, aggression, anxiety, cognitive problems and depression (Boulard 1993). Four groups were identified. Nurses with a high workload, high autonomy and good social support (11.76%) exhibited several stress-related symptoms. Nurses with a low workload, high autonomy and good social support (35.75%) exhibited the lowest stress. Nurses with high workload, little autonomy and little social support (42.09%) had a high prevalence of stress-related symptoms, while nurses with a low workload, little autonomy and little social support (10.40%) had low stress, but the authors suggest that these nurses may experience some frustration.
These results also demonstrate that autonomy and support, rather than moderating the relationship between workload and mental health, act directly on workload.
Role of head nurses
Classically, employee satisfaction with supervision has been considered to depend on the clear definition of responsibilities and on good communication and feedback. Kivimäki and Lindström (1995) administered a questionnaire to nurses in 12 wards of four medical departments and interviewed the wards’ head nurses. Wards were classified into two groups on the basis of the reported level of satisfaction with supervision (six satisfied wards and six dissatisfied wards). Scores for communication, feedback, participation in decision-making and the presence of a work climate that favours innovation were higher in “satisfied” wards. With one exception, head nurses of “satisfied” wards reported conducting at least one confidential conversation lasting one to two hours with each employee annually. In contrast, only one of the head nurses of the “dissatisfied” wards reported this behaviour.
Head nurses of the “satisfied” wards reported encouraging team members to express their opinions and ideas, discouraging team members from censuring or ridiculing nurses who made suggestions, and consistently attempting to give positive feedback to nurses expressing different or new opinions. Finally, all the head nurses in “satisfied” wards, but none of the ones in “dissatisfied” ones, emphasized their own role in creating a climate favourable to constructive criticism.
Psychological roles, relationships and organization
The structure of nurses’ affective relationships varies from team to team. A study of 1,387 nurses working regular night shifts and 1,252 nurses working regular morning or afternoon shifts revealed that shifts were extended more frequently during night shifts (Estryn-Béhar et al. 1989a). Early shift starts and late shift ends were more prevalent among night-shift nurses. Reports of a “good” or “very good” work ambience were more prevalent at night, but a “good relationship with physicians” was less prevalent. Finally, night-shift nurses reported having more time to talk to patients, although that meant that worries and uncertainties about the appropriate response to give patients, also more frequent at night, were harder to bear.
Büssing (1993) revealed that depersonalization was greater for nurses working abnormal hours.
Stress in physicians
Denial and suppression of stress are common defence mechanisms. Physicians may attempt to repress their problems by working harder, distancing themselves from their emotions or adopting the role of a martyr (Rhoads 1977; Gardner and Hall 1981; Vaillant, Sorbowale and McArthur 1972). As these barriers become more fragile and adaptive strategies break down, bouts of anguish and frustration become more and more frequent.
Valko and Clayton (1975) found that one-third of interns suffered severe and frequent episodes of emotional distress or depression, and that one-quarter of them entertained suicidal thoughts. McCue (1982) believed that a better understanding of both stress and reactions to stress would facilitate physician training and personal development and modify societal expectations. The net effect of these changes would be an improvement in care.
Avoidance behaviours may develop, often accompanied by a deterioration of interpersonal and professional relationships. At some point, the physician finally crosses the line into a frank deterioration of mental health, with symptoms which may include substance abuse, mental illness or suicide. In yet other cases, patient care may be compromised, resulting in inappropriate examinations and treatment, sexual abuse or pathological behaviour (Shapiro, Pinsker and Shale 1975).
A study of 530 physician suicides identified by the American Medical Association over a five-year period found that 40% of suicides by female physicians and less than 20% of suicides by male physicians occurred in individuals younger than 40 years (Steppacher and Mausner 1974). A Swedish study of suicide rates from 1976 to 1979 found the highest rates among some of the health professions, compared to the overall active population (Toomingas 1993). The standardized mortality ratio (SMR) for female physicians was 3.41, the highest value observed, while that for nurses was 2.13.
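The standardized mortality ratio (SMR) quoted here is simply observed deaths divided by the deaths expected if the group had experienced the reference population’s age-specific rates. A minimal sketch of the calculation; the age strata, reference rates and person-years below are hypothetical, invented purely for illustration, and only the SMR formula itself reflects the measure used in the studies cited:

```python
# Sketch: standardized mortality ratio (SMR) = observed / expected deaths.
# All stratum data below are hypothetical illustration values.

reference_rates = {  # reference-population deaths per person-year, by age
    "25-34": 0.00010,
    "35-44": 0.00020,
    "45-54": 0.00040,
}
person_years = {  # person-years of follow-up in the study group, by age
    "25-34": 5000,
    "35-44": 4000,
    "45-54": 3000,
}
observed_deaths = 9

# Expected deaths: apply each reference rate to the group's person-years.
expected = sum(reference_rates[age] * person_years[age] for age in reference_rates)
smr = observed_deaths / expected
print(f"Expected deaths: {expected:.2f}, SMR: {smr:.2f}")
```

An SMR above 1 indicates excess mortality relative to the reference population; the value of 3.41 for female physicians in the Swedish data thus represents more than a three-fold excess.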
Unfortunately, health professionals with impaired mental health are often ignored and may even be rejected by their colleagues, who attempt to deny these tendencies in themselves (Bissel and Jones 1975). In fact, slight or moderate stress is much more prevalent among health professionals than are frank psychiatric disorders (McCue 1982). A good prognosis in these cases depends on early diagnosis and peer support (Bitker 1976).
Discussion groups
Studies on the effect of discussion groups on burn-out have been undertaken in the United States. Although positive results have been demonstrated (Jacobson and MacGrath 1983), it should be noted that these have been in institutions where there was sufficient time for regular discussions in quiet and appropriate settings (i.e., hospitals with high staff-patient ratios).
A literature review of the success of discussion groups has shown these groups to be valuable tools in wards where a high proportion of patients are left with permanent sequelae and must learn to accept modifications in their lifestyle (Estryn-Béhar 1990).
Kempe, Sauter and Lindner (1992) evaluated the merits of two support techniques for nurses near burn-out in geriatrics wards: a six-month course of 13 professional counselling sessions and a 12-month course of 35 “Balint group” sessions. The clarification and reassurance provided by the Balint group sessions were effective only if there was also significant institutional change. In the absence of such change, conflicts may even intensify and dissatisfaction increase. Despite their impending burn-out, these nurses remained very professional and sought ways of carrying on with their work. These compensatory strategies had to take into account extremely high workloads: 30% of nurses worked more than 20 hours of overtime per month, 42% had to cope with understaffing during more than two-thirds of their working hours and 83% were often left alone with unqualified personnel.
The experience of these geriatrics nurses was compared to that of nurses in oncology wards. Burn-out scores were high in young oncology nurses and decreased with seniority. In contrast, burn-out scores among geriatrics nurses increased with seniority, attaining levels much higher than those observed in oncology nurses. This absence of a decrease with seniority is attributable to the characteristics of the workload in geriatrics wards.
The need to act on multiple determinants
Some authors have extended their study of effective stress management to organizational factors related to affective strain.
For example, analysis of psychological and sociological factors was part of Theorell’s attempt to implement case-specific improvements in emergency, paediatric and juvenile psychiatry wards (Theorell 1993). Affective strain before and after the implementation of changes was measured through the use of questionnaires and the measurement of plasma prolactin levels, shown to mirror feelings of powerlessness in crisis situations.
Emergency-ward personnel experienced high levels of affective strain and frequently enjoyed little decisional latitude. This was attributed to their frequent confrontation with life-and-death situations, the intense concentration demanded by their work, the high number of patients they frequently attended and the impossibility of controlling the type and number of patients. On the other hand, because their contact with patients was usually short and superficial, they were exposed to less suffering.
The situation was more amenable to control in paediatric and juvenile psychiatry wards, where schedules for diagnostic procedures and therapeutic procedures were established in advance. This was reflected by a lower risk of overwork compared to emergency wards. However, personnel in these wards were confronted with children suffering from serious physical and mental disease.
Desirable organizational changes were identified through discussion groups in each ward. In emergency wards, personnel were very interested in organizational changes, and recommendations concerning training and routine procedures were formulated, such as how to treat rape victims and elderly patients with no relations, how to evaluate work and what to do if an on-call physician does not arrive. This was followed by the implementation of concrete changes, including the creation of a head-physician position and measures ensuring the constant availability of an internist.
The personnel in juvenile psychiatry were primarily interested in personal growth. Reorganization of resources by the head physician and the county allowed one-third of the personnel to undergo psychotherapy.
In paediatrics, meetings were organized for all the personnel every 15 days. After six months, social support networks, decisional latitude and work content all had improved.
The factors identified by these detailed ergonomic, psychological and epidemiological studies are valuable indices of work organization. Studies which focus on them are quite different from in-depth studies of multi-factor interactions and instead revolve around the pragmatic characterization of specific factors.
Tintori and Estryn-Béhar (1994) identified some of these factors in 57 wards of a large hospital in the Paris region in 1993. Shift overlap of more than 10 minutes was present in 46 wards, although there was no official overlap between the night and morning shifts in 41 wards. In half the cases, these information communication sessions included nurses’ aides in all three shifts. In 12 wards, physicians participated in the morning-afternoon sessions. In the three months preceding the study, only 35 wards had held meetings to discuss patients’ prognoses, discharges and patients’ understanding of and reaction to their illnesses. In the year preceding the study, day-shift workers in 18 wards had received no training and only 16 wards had dispensed training to their night-shift workers.
Some new lounges were not used, since they were 50 to 85 metres from some of the patients’ rooms. Instead, the personnel preferred holding their informal discussions around a cup of coffee in a smaller but closer room. Physicians participated in coffee breaks in 45 day-shift wards. Nurses’ complaints of frequent work interruptions and feelings of being overwhelmed by their work are no doubt attributable in part to the dearth of seats (less than four in 42 of the 57 wards) and cramped quarters of the nursing stations, where more than nine people must spend a good part of their day.
The interaction of stress, work organization and support networks is clear in studies of the home-care unit of the hospital in Motala, Sweden (Beck-Friis, Strang and Sjöden 1991; Hasselhorn and Seidler 1993). The risk of burn-out, generally considered high in palliative care units, was not significant in these studies, which in fact revealed more occupational satisfaction than occupational stress. Turnover and work stoppages in these units were low, and personnel had a positive self-image. This was attributed to selection criteria for personnel, good teamwork, positive feedback and continuing education. Personnel and equipment costs for terminal-stage cancer hospital care are typically 167 to 350% higher than for hospital-based home care. There were more than 20 units of this type in Sweden in 1993.
For a long time, nurses and nursing assistants were among the only women working at night in many countries (Gadbois 1981; Estryn-Béhar and Poinsignon 1989). In addition to the problems already documented among men, these women suffer additional problems related to their family responsibilities. Sleep deprivation has been convincingly demonstrated among these women, and there is concern about the quality of care they are able to dispense in the absence of appropriate rest.
Organization of Schedules and Family Obligations
It appears that personal feelings about social and family life are at least partially responsible for the decision to accept or refuse night work. These feelings, in turn, lead workers to minimize or exaggerate their health problems (Lert, Marne and Gueguen 1993; Ramaciotti et al. 1990). Among non-professional personnel, financial compensation is the main determinant of the acceptance or refusal of night work.
Other work schedules may also pose problems. Morning-shift workers sometimes must rise before 05:00 and so lose some of the sleep that is essential for their recovery. Afternoon shifts finish between 21:00 and 23:00, limiting social and family life. Thus, often only 20% of women working in large university hospitals have work schedules in synchrony with the rest of society (Cristofari et al. 1989).
Complaints related to work schedules are more frequent among health care workers than among other employees (62% versus 39%) and indeed are among the complaints most frequently voiced by nurses (Lahaye et al. 1993).
One study demonstrated the interaction of work satisfaction with social factors, even in the presence of sleep deprivation (Verhaegen et al. 1987). In this study, nurses working only night shifts were more satisfied with their work than nurses working rotating shifts. These differences were attributed to the fact that all the night-shift nurses chose to work at night and organized their family life accordingly, while rotating-shift nurses found even rare night-shift work a disturbance of their personal and family lives. However, Estryn-Béhar et al. (1989b) reported that mothers working only night shifts were more tired and went out less frequently compared with male night-shift nurses.
In the Netherlands, the prevalence of work complaints was higher among nurses working rotating shifts than among those working only day shifts (Van Deursen et al. 1993) (see table 1).
Table 1. Prevalence of work complaints according to shift
Complaint                              | Rotating shifts (%) | Day shifts (%)
Arduous physical work                  | 55.5                | 31.3
Arduous mental work                    | 80.2                | 61.9
Work often too tiring                  | 46.8                | 24.8
Under-staffing                         | 74.8                | 43.8
Insufficient time for breaks           | 78.4                | 56.6
Interference of work with private life | 52.8                | 31.0
Dissatisfaction with schedules         | 36.9                | 2.7
Frequent lack of sleep                 | 34.9                | 19.5
Frequent fatigue on rising             | 31.3                | 17.3

Source: Van Deursen et al. 1993.
Sleep disturbances
On workdays, night-shift nurses sleep an average of two hours less than other nurses (Escribà Agüir et al. 1992; Estryn-Béhar et al. 1978; Estryn-Béhar et al. 1990; Nyman and Knutsson 1995). According to several studies, their quality of sleep is also poor (Schroër et al. 1993; Lee 1992; Gold et al. 1992; Estryn-Béhar and Fonchain 1986).
In their interview study of 635 Massachusetts nurses, Gold et al. (1992) found that 92.2% of nurses working alternating morning and afternoon shifts were able to maintain a nocturnal “anchor” sleep of four hours at the same schedule throughout the month, compared to only 6.3% of night-shift nurses and none of the nurses working alternating day and night shifts. The age- and seniority-adjusted odds ratio for “poor sleep” was 1.8 for night-shift nurses and 2.8 for rotating-shift nurses with night work, compared to morning- and afternoon-shift nurses. The odds ratio for taking sleep medication was 2.0 for night- and rotating-shift nurses, compared to morning- and afternoon-shift nurses.
Affective Problems and Fatigue
The prevalence of stress-related symptoms and of reports of having stopped enjoying their work was higher among Finnish nurses working rotating shifts than among other nurses (Kandolin 1993). Estryn-Béhar et al. (1990) showed that, on the General Health Questionnaire used to evaluate mental health, night-shift nurses had poorer general health than day-shift nurses (odds ratio of 1.6).
In another study, Estryn-Béhar et al. (1989b) interviewed a representative sample of one-quarter of night-shift employees (1,496 individuals) in 39 Paris-area hospitals. Differences appeared according to sex and qualification (“qualified” = head nurses and nurses; “unqualified” = nurses’ aides and orderlies). Excessive fatigue was reported by 40% of qualified women, 37% of unqualified women, 29% of qualified men and 20% of unqualified men. Fatigue on rising was reported by 42% of qualified women, 35% of unqualified women, 28% of qualified men and 24% of unqualified men. Frequent irritability was reported by one-third of night-shift workers and by a significantly greater proportion of women. Women with no children were twice as likely as comparable men to report excessive fatigue, fatigue on rising and frequent irritability. The increase relative to single men with no children was even more marked for women with one or two children, and greater still (a four-fold increase) for women with at least three children.
Fatigue on rising was reported by 58% of night-shift hospital workers and 42% of day-shift workers in a Swedish study using a stratified sample of 310 hospital workers (Nyman and Knutsson 1995). Intense fatigue at work was reported by 15% of day-shift workers and 30% of night-shift workers. Almost one-quarter of night-shift workers reported falling asleep at work. Memory problems were reported by 20% of night-shift workers and 9% of day-shift workers.
In Japan, the health and safety association publishes the results of medical examinations of all the country’s salaried employees. This report includes the results of 600,000 employees in the health and hygiene sector. Nurses generally work rotating shifts. Complaints concerning fatigue are highest in night-shift nurses, followed in order by evening- and morning-shift nurses (Makino 1995). Symptoms reported by night-shift nurses include sleepiness, sadness and difficulty concentrating, with numerous complaints about accumulated fatigue and disturbed social life (Akinori and Hiroshi 1985).
Sleep and Affective Disorders among Physicians
The effect of work content and duration on young physicians’ private lives, and the attendant risk of depression, has been noted. Valko and Clayton (1975) found that 30% of young residents suffered a bout of depression lasting an average of five months during their first year of residency. Of the 53 residents studied, four had suicidal thoughts and three made concrete suicide plans. Similar rates of depression have been reported by Reuben (1985) and Clark et al. (1984).
In a questionnaire study, Friedman, Kornfeld and Bigger (1971) showed that interns suffering from sleep deprivation reported more sadness, selfishness and modification of their social life than did more-rested interns. During interviews following the tests, interns suffering from sleep deprivation reported symptoms such as difficulty reasoning, depression, irritability, depersonalization, inappropriate reactions and short-term memory deficits.
In a one-year longitudinal study, Ford and Wentz (1984) evaluated 27 interns four times during their internship. During this period, four interns suffered at least one major bout of depression meeting standard criteria and 11 others reported clinical depression. Anger, fatigue and mood swings increased throughout the year and were inversely correlated with the amount of sleep the preceding week.
A literature review has identified six studies in which interns having spent one sleepless night exhibited deteriorations of mood, motivation and reasoning ability and increased fatigue and anxiety (Samkoff and Jacques 1991).
Devienne et al. (1995) interviewed a stratified sample of 220 general practitioners in the Paris area. Of these, 70 were on call at night. Most of the on-call physicians reported having had their sleep disturbed while on call and finding it particularly difficult to get back to sleep after having been awakened (men: 65%; women: 88%). Waking up in the middle of the night for reasons unrelated to service calls was reported by 22% of men and 44% of women. Having or almost having a car accident due to sleepiness related to being on call was reported by 15% of men and 19% of women. This risk was greater among physicians who were on call more than four times per month (30%) than in those on call three or four times per month (22%) or one to three times per month (10%). The day after being on call, 69% of women and 46% of men reported having difficulty concentrating and feeling less effective, while 37% of men and 31% of women reported experiencing mood swings. Accumulated sleep deficits were not recovered the day following on-call work.
Family and Social Life
A survey of 848 night-shift nurses found that over the previous month one-quarter had not gone out and had entertained no guests, and half had participated in such activities only once (Gadbois 1981). One-third reported refusing an invitation because of fatigue, and two-thirds reported going out only once, with this proportion rising to 80% among mothers.
Kurumatani et al. (1994) reviewed the time sheets of 239 Japanese nurses working rotating shifts over a total of 1,016 days and found that nurses with young children slept less and spent less time on leisure activities than did nurses without young children.
Estryn-Béhar et al. (1989b) observed that women were significantly less likely than men to spend at least one hour per week participating in team or individual sports (48% of qualified women, 29% of unqualified women, 65% of qualified men and 61% of unqualified men). Women were also less likely to frequently (at least four times per month) attend shows (13% of qualified women, 6% of unqualified women, 20% of qualified men and 13% of unqualified men). On the other hand, similar proportions of women and men practised home-based activities such as watching television and reading. Multivariate analysis showed that men with no children were twice as likely as comparable women to spend at least one hour per week on athletic activities. This gap increases with the number of children. Child care, and not gender, influences reading habits. A significant proportion of the subjects in this study were single parents. This was very rare among qualified men (1%), less rare among unqualified men (4.5%), common in qualified women (9%) and extremely frequent in unqualified women (24.5%).
In Escribà Agüir’s (1992) study of Spanish hospital workers, incompatibility of rotating shifts with social and family life was the leading source of dissatisfaction. In addition, night-shift work (either permanent or rotating) disturbed the synchronization of their schedules with those of their spouses.
Lack of free time interferes severely with the private life of interns and residents. Landau et al. (1986) found that 40% of residents reported major conjugal problems. Of these residents, 72% attributed the problems to their work. McCall (1988) noted that residents have little time to spend on their personal relationships; this problem is particularly serious for women nearing the end of their low-risk-pregnancy years.
Irregular Shift Work and Pregnancy
Axelsson, Rylander and Molin (1989) distributed a questionnaire to 807 women employed at the hospital in Mölna, Sweden. The birth weights of children born to non-smoking women working irregular shifts were significantly lower than that of children born to non-smoking women who only worked day shifts. The difference was greatest for infants of at least grade 2 (3,489 g versus 3,793 g). Similar differences were also found for infants of at least grade 2 born to women working afternoon shifts (3,073 g) and shifts alternating every 24 hours (3,481 g).
Vigilance and Quality of Work among Night-Shift Nurses
Englade, Badet and Becque (1994) performed Holter EEGs on two groups of nine nurses. An experimental group practised polyphasic sleep in an attempt to recover a little sleep during work hours, while the control group was not allowed any sleep recovery. The recordings showed that the group not allowed to sleep had attention deficits characterized by sleepiness, and in some cases even episodes of sleep of which they were unaware.
These results are similar to those reported by a survey of 760 California nurses (Lee 1992), in which 4.0% of night-shift nurses and 4.3% of nurses working rotating shifts reported suffering frequent attention deficits; no nurses from the other shifts mentioned lack of vigilance as a problem. Occasional attention deficits were reported by 48.9% of night-shift nurses, 39.2% of rotating-shift nurses, 18.5% of day-shift nurses and 17.5% of evening-shift nurses. Struggling to stay awake while dispensing care during the month preceding the survey was reported by 19.3% of night-shift and rotating-shift nurses, compared to 3.8% of day- and evening-shift nurses. Similarly, 44% of night-shift and rotating-shift nurses reported having had to struggle to stay awake while driving during the preceding month, compared to 19% of day-shift nurses and 25% of evening-shift nurses.
Smith et al. (1979) studied 1,228 nurses in 12 American hospitals. The incidence of occupational accidents was 23.3 for nurses working rotating shifts, 18.0 for night-shift nurses, 16.8 for day-shift nurses and 15.7 for afternoon-shift nurses.
In an attempt to better characterize problems related to attention deficits among night-shift nurses, Blanchard et al. (1992) observed activity and incidents throughout a series of night shifts. Six wards, ranging from intensive care to chronic care, were studied. In each ward, one continuous observation of a nurse was performed on the second night (of night work) and two observations on the third or fourth nights (depending on the wards’ schedule). Incidents were not associated with serious outcomes. On the second night, the number of incidents rose from 8 in the first half of the night to 18 in the second half. On the third or fourth night, the increase was from 13 to 33 in one case and from 11 to 35 in another. The authors emphasized the role of sleep breaks in limiting risks.
Gold et al. (1992) collected information from 635 Massachusetts nurses on the frequency and consequences of attention deficits. Experiencing at least one episode of sleepiness at work per week was reported by 35.5% of rotating-shift nurses with night work, 32.4% of night-shift nurses and 20.7% of morning-shift and afternoon-shift nurses working exceptionally at night. Less than 3% of nurses working the morning and afternoon shifts reported such incidents.
The odds ratio for sleepiness while driving to and from work was 3.9 for rotating-shift nurses with night work and 3.6 for night-shift nurses, compared to morning- and afternoon-shift nurses. The odds ratio for total accidents and errors over the past year (car accidents driving to and from work, errors in medication or work procedures, occupational accidents related to sleepiness) was almost 2.00 for rotating-shift nurses with night work compared to morning- and afternoon-shift nurses.
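The odds ratios reported in studies such as Gold et al. are computed from a standard 2×2 exposure-outcome table. A minimal sketch of the calculation is shown below; the counts used are hypothetical illustrations, not figures from any of the studies cited here.

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 table: (a/b) / (c/d) = (a*d) / (b*c)."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# Hypothetical counts (not from Gold et al.): 40 of 100 night-work nurses
# and 15 of 100 day-shift nurses report at least one error in the past year.
print(odds_ratio(40, 60, 15, 85))  # (40*85)/(60*15) = 3400/900, about 3.78
```

An odds ratio near 2.0, as reported for rotating-shift nurses with night work, means the odds of an accident or error in the exposed group are roughly double those in the reference group.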
Effect of Fatigue and Sleepiness on the Performance of Physicians
Several studies have shown that the fatigue and sleeplessness induced by night-shift and on-call work lead to deteriorations in physician performance.
Wilkinson, Tyler and Varey (1975) conducted a postal questionnaire survey of 6,500 British hospital physicians. Of the 2,452 who responded, 37% reported suffering a degradation of their effectiveness due to excessively long work hours. In response to open-ended questions, 141 residents reported committing errors due to overwork and lack of sleep. In a study performed in Ontario, Canada, 70% of 1,806 hospital physicians reported often worrying about the effect the quantity of their work had on its quality (Lewittes and Marshall 1989). More specifically, 6% of the sample—and 10% of interns—reported often worrying about fatigue affecting the quality of care they dispensed.
Given the difficulty in performing real-time evaluations of clinical performance, several studies on the effects of sleep deprivation on physicians have relied upon neuropsychological tests.
In the majority of studies reviewed by Samkoff and Jacques (1991), residents deprived of sleep for one night exhibited little deterioration in their performance of rapid tests of manual dexterity, reaction time and memory. Fourteen of these studies used extensive test batteries. According to five tests, the effect on performance was ambiguous; according to six, a performance deficit was observed; but according to eight other tests, no deficit was observed.
Rubin et al. (1991) tested 63 medical-ward residents before and after an on-call period of 36 hours and a subsequent full day of work, using a battery of self-administered computerized behavioural tests. Physicians tested after being on call exhibited significant performance deficits in tests of visual attention, coding speed and accuracy and short-term memory. The duration of sleep enjoyed by the residents while on call was as follows: two hours at most in 27 subjects, four hours at most in 29 subjects, six hours at most in four subjects and seven hours in three subjects. Lurie et al. (1989) reported similarly brief sleep durations.
Virtually no difference has been observed in the performance of actual or simulated short-duration clinical tasks—including filling out a laboratory requisition (Poulton et al. 1978; Reznick and Folse 1987), simulated suturing (Reznick and Folse 1987), endotracheal intubation (Storer et al. 1989) and venous and arterial catheterization (Storer et al. 1989)—by sleep-deprived and control groups. The only difference observed was a slight lengthening of the time required by sleep-deprived residents to perform arterial catheterization.
On the other hand, several studies have demonstrated significant differences for tasks requiring continuous vigilance or intense concentration. For example, sleep-deprived interns committed twice as many errors when reading 20-minute ECGs as did rested interns (Friedman et al. 1971). Two studies, one relying on 50-minute VDU-based simulations (Beatty, Ahern and Katz 1977), the other on 30-minute video simulations (Denisco, Drummond and Gravenstein 1987), have reported poorer performance by anaesthesiologists deprived of sleep for one night. Another study has reported significantly poorer performance by sleep-deprived residents on a four-hour test exam (Jacques, Lynch and Samkoff 1990). Goldman, McDonough and Rosemond (1972) used closed-circuit filming to study 33 surgical procedures. Surgeons with less than two hours of sleep were reported to perform “worse” than more-rested surgeons. The duration of surgical inefficiency or indecision (i.e., of poorly planned manoeuvres) was over 30% of the total duration of the operation.
Bertram (1988) examined the charts of emergency admissions by second-year residents over a one-month period. For a given diagnosis, less information on medical histories and the results of clinical examinations was gathered as the number of hours worked and patients seen increased.
Smith-Coggins et al. (1994) analysed the EEG, mood, cognitive performance and motor performance of six emergency-ward physicians over two 24-hour periods, one with diurnal work and nocturnal sleep, the other with nocturnal work and diurnal sleep.
Physicians working at night slept significantly less (328.5 versus 496.6 minutes) and performed significantly less well. This poorer motor performance was reflected in the increased time required to perform a simulated intubation (42.2 versus 31.56 seconds) and an increased number of protocol errors.
Their cognitive performance was evaluated at five test periods throughout their shift. For each test, physicians were required to review four charts drawn from a pool of 40, rank them and list the initial procedures, the treatments and the appropriate laboratory tests. Performance deteriorated as the shift progressed for both night-shift and day-shift physicians. Night-shift physicians were less successful at providing correct responses than day-shift physicians.
Physicians working during the day rated themselves as less sleepy, more satisfied and more lucid than did night-shift physicians.
Recommendations in English-speaking countries concerning the work schedules of physicians-in-training have tended to take these results into account and now call for work-weeks of at most 70 hours and the provision of recovery periods following on-call work. In the US, following the death of a patient attributed to errors by an overworked, poorly supervised resident physician which received much media attention, New York State enacted legislation limiting work hours for hospital staff physicians and defining the role of attending physicians in supervising their activities.
Content of Night Work in Hospitals
Night work has long been undervalued. In France, nurses used to be seen as guardians, a term rooted in a vision of nurses’ work as the mere monitoring of sleeping patients, with no delivery of care. The inaccuracy of this vision became increasingly obvious as the length of hospitalization decreased and patients’ uncertainty about their hospitalization increased. Hospital stays require frequent technical interventions during the night, precisely when the nurse:patient ratio is lowest.
The importance of the amount of time spent by nurses in patients’ rooms is demonstrated by the results of a study based on continuous observation of the ergonomics of nurses’ work in each of three shifts in ten wards (Estryn-Béhar and Bonnet 1992). The time spent in rooms accounted for an average of 27% of the day and night shifts and 30% of the afternoon shift. In four of the ten wards, nurses spent more time in the rooms during the night than during the day. Blood samples were of course taken less frequently during the night, but other technical interventions such as monitoring vital signs and medication, and administering, adjusting and monitoring intravenous drips and transfusions were more frequent during the night in six of seven wards where detailed analysis was performed. The total number of technical and non-technical direct-care interventions was higher during the night in six of seven wards.
Nurses’ work postures varied from shift to shift. The percentage of time spent seated (preparation, writing, consultations, time spent with patients, breaks) was higher at night in seven of ten wards, and exceeded 40% of shift time in six wards. However, the time spent in painful postures (bent over, crouched, arms extended, carrying loads) exceeded 10% of shift time in all wards and 20% of shift time in six wards at night; in five wards the percentage of time spent in painful positions was higher at night. In fact, night-shift nurses also make beds and perform tasks related to hygiene, comfort and voiding, tasks which are all normally performed by nurses’ aides during the day.
Night-shift nurses may be obliged to change location very frequently. Night-shift nurses in all the wards changed location over 100 times per shift; in six wards, the number of changes of location was higher at night. However, because rounds were scheduled at 00:00, 02:00, 04:00 and 06:00, nurses did not travel greater distances, except in juvenile intensive-care wards. Nonetheless, nurses walked over six kilometres in three of the seven wards where pedometry was performed.
Conversations with patients were frequent at night, exceeding 30 per shift in all wards; in five wards these conversations were more frequent at night. Conversations with physicians were much rarer and almost always brief.
Leslie et al. (1990) conducted continuous observation of 12 of 16 interns in the medical ward of a 340-bed Edinburgh (Scotland) hospital over 15 consecutive winter days. Each ward cared for approximately 60 patients. In all, 22 day shifts (08:00 to 18:00) and 18 on-call shifts (18:00 to 08:00), equivalent to 472 hours of work, were observed. The nominal duration of the interns’ work week was 83 to 101 hours, depending on whether or not they were on call during the weekends. However, in addition to the official work schedule, each intern also spent an average of 7.3 hours each week on miscellaneous hospital activities. Information on the time spent performing each of 17 activities, on a minute-by-minute basis, was collected by trained observers assigned to each intern.
The longest continuous work period observed was 58 hours (08:00 Saturday to 18:00 Monday) and the longest work period was 60.5 hours. Calculations showed that a one-week sickness leave of one intern would require the other two interns in the ward to increase their workload by 20 hours.
In practice, in wards admitting patients during on-call shifts, interns working consecutive day, on-call and night shifts worked all but 4.6 of the 34 elapsed hours. These 4.6 hours were devoted to meals and rest, but interns remained on call and available during this time. In wards that did not admit new patients during on-call shifts, interns’ workload abated only after midnight.
Due to the on-call schedules in other wards, interns spent approximately 25 minutes outside their home ward each shift. On average, they walked 3 kilometres and spent 85 minutes (32 to 171 minutes) in other wards each night shift.
In addition, filling out requests for examinations and updating charts is often performed outside normal work hours. Non-systematic observation of this extra work over several days revealed that it accounts for approximately 40 minutes of additional work at the end of each shift (18:00).
During the day, 51 to 71% of interns’ time was spent on patient-oriented duties, compared to 20 to 50% at night. Another study, conducted in the United States, reported that 15 to 26% of work time was spent on patient-oriented duties (Lurie et al. 1989).
The study concluded that more interns were needed and that interns should no longer be required to attend other wards while on call. Three additional interns were hired. This reduced interns’ work week to an average of 72 hours, with no work, excepting on-call shifts, after 18:00. Interns also obtained a free half-day following an on-call shift and preceding a weekend when they were to be on call. Two secretaries were hired on a trial basis by two wards. Working 10 hours per week, the secretaries were able to fill out 700 to 750 documents per ward. In the opinion of both senior physicians and nurses, this resulted in more efficient rounds, since all the information had been entered correctly.
Health care workers (HCWs) confront numerous physical hazards.
Electrical Hazards
Failure to meet standards for electrical equipment and its use is the most frequently cited violation in all industries. In hospitals, electrical malfunctions are the second leading cause of fires. Additionally, hospitals require that a wide variety of electrical equipment be used in hazardous environments (i.e., in wet or damp locations or adjacent to flammables or combustibles).
Recognition of these facts and the danger they may pose to patients has led most hospitals to put great effort into electrical safety promotion in patient-care areas. However, non-patient areas are sometimes neglected and employee- or hospital-owned appliances may be found with:
Prevention and control
It is critical that all electrical installations be in accordance with prescribed safety standards and regulations. Measures that can be taken to prevent fires and avoid shocks to employees include the following:
Employees should be instructed:
Heat
Although heat-related health effects on hospital workers can include heat stroke, exhaustion, cramps and fainting, these are rare. More common are the milder effects of increased fatigue, discomfort and inability to concentrate. These are important because they may increase the risk of accidents.
Heat exposure can be measured with wet bulb and globe thermometers, expressed as the Wet Bulb Globe Temperature (WBGT) Index, which combines the effects of radiant heat and humidity with the dry bulb temperature. This testing should only be done by a skilled individual.
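The WBGT index combines the separate thermometer readings as a fixed weighted average. A minimal sketch of the two standard weightings (indoor/no solar load and outdoor with solar load, per ISO 7243) follows; temperatures are in degrees Celsius and the example readings are illustrative, not measurements from any hospital.

```python
def wbgt_indoor(t_nwb, t_globe):
    """WBGT without solar load: 0.7 * natural wet bulb + 0.3 * globe."""
    return 0.7 * t_nwb + 0.3 * t_globe

def wbgt_outdoor(t_nwb, t_globe, t_dry):
    """WBGT with solar load: 0.7 * natural wet bulb + 0.2 * globe + 0.1 * dry bulb."""
    return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_dry

# Illustrative laundry-room readings: 24 C natural wet bulb, 32 C globe.
print(wbgt_indoor(24.0, 32.0))  # 0.7*24 + 0.3*32 = 16.8 + 9.6 = 26.4
```

The heavy weighting of the natural wet bulb term reflects the dominant role of humidity in limiting evaporative cooling; interpreting the resulting index against work-rest limits still requires a trained evaluator, as the text notes.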
The boiler room, laundry and kitchen are the most common high-temperature environments in the hospital. However, in old buildings with inadequate ventilation and cooling systems heat may be a problem in many locations in summer months. Heat exposure may also be a problem where ambient temperatures are elevated and health care personnel are required to wear occlusive gowns, caps, masks and gloves.
Prevention and control
Although it may be impossible to keep some hospital settings at a comfortable temperature, there are measures to keep temperatures at acceptable levels and to ameliorate the effects of heat upon workers, including:
Noise
Exposure to high levels of noise in the workplace is a common job hazard. The “quiet” image of hospitals notwithstanding, they can be noisy places to work.
Exposure to loud noises can cause a loss in hearing acuity. Short-term exposure to loud noises can cause a decrease in hearing called a “temporary threshold shift” (TTS). While these TTSs can be reversed with sufficient rest from high noise levels, the nerve damage resulting from long-term exposure to loud noises cannot.
The US Occupational Safety and Health Administration (OSHA) has set 90 dBA as the permissible limit per 8 hours of work. For 8-hour average exposures in excess of 85 dBA, a hearing conservation programme is mandated. (Sound level meters, the basic noise measuring instrument, are provided with three weighting networks. OSHA standards use the A scale, expressed as dBA.)
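The OSHA limit is enforced through a daily noise dose: each exposure level has a permissible duration (8 hours at 90 dBA, halved for every 5 dBA increase), and the fractions of permitted time actually used are summed. A short sketch of that calculation, using the exchange-rate formula from the OSHA noise standard:

```python
def permissible_hours(level_dba):
    """OSHA permissible exposure time: 8 h at 90 dBA, 5-dB exchange rate."""
    return 8.0 / 2 ** ((level_dba - 90.0) / 5.0)

def noise_dose(exposures):
    """Daily noise dose (%) from (hours, dBA) pairs: D = 100 * sum(C_i / T_i)."""
    return 100.0 * sum(hours / permissible_hours(level) for hours, level in exposures)

# Example shift: 4 h at 90 dBA plus 4 h at 85 dBA.
# T(90) = 8 h and T(85) = 16 h, so D = 100 * (4/8 + 4/16) = 75%
print(noise_dose([(4, 90), (4, 85)]))
```

A dose above 100% exceeds the permissible limit; a dose of 50% or more (equivalent to an 8-hour average of 85 dBA) triggers the hearing conservation programme.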
The effects of noise at the 70-dB level are reported by the National Institute of Environmental Health Sciences to be:
Food service areas, laboratories, engineering areas (which usually includes the boiler room), business office and medical records and nursing units can be so noisy that productivity is reduced. Other departments where noise levels are sometimes quite high are laundries, print shops and construction areas.
Prevention and control
If a noise survey of the facility shows that employees’ noise exposure is in excess of the OSHA standard, a noise abatement programme is required. Such a programme should include:
In addition to abatement measures, a hearing conservation programme should be established that provides for:
Inadequate Ventilation
The specific ventilation requirements for various types of equipment are engineering matters and will not be discussed here. However, both old and new facilities present general ventilation problems that warrant mentioning.
In older facilities built before central heating and cooling systems were common, ventilation problems must often be solved on a location-by-location basis. Frequently, the problem rests in achieving uniform temperatures and correct circulation.
In newer facilities that are hermetically sealed, a phenomenon called “tight-building syndrome” or “sick building syndrome” is sometimes experienced. When the circulation system does not exchange the air rapidly enough, concentrations of irritants may build up to the extent that employees may experience such reactions as sore throat, runny nose and watery eyes. This situation can provoke severe reaction in sensitized individuals. It can be exacerbated by various chemicals emitted from such sources as foam insulation, carpeting, adhesives and cleaning agents.
Prevention and control
While careful attention is paid to ventilation in sensitive areas such as surgical suites, less attention is given to general-purpose areas. It is important to alert employees to report irritant reactions that appear only in the workplace. If local air quality cannot be improved with venting, it may be necessary to transfer individuals who have become sensitized to some irritant in their workstation.
Laser Smoke
During surgical procedures using a laser or electrosurgical unit, the thermal destruction of tissue creates smoke as a by-product. NIOSH has confirmed studies showing that this smoke plume can contain toxic gases and vapours such as benzene, hydrogen cyanide and formaldehyde, bioaerosols, dead and live cellular material (including blood fragments) and viruses. At high concentrations, the smoke causes ocular and upper respiratory tract irritation in health care personnel and may create visual problems for the surgeon. The smoke has an unpleasant odour and has been shown to contain mutagenic material.
Prevention and control
Exposure to airborne contaminants in such smoke can be effectively controlled by proper ventilation of the treatment room, supplemented by local exhaust ventilation (LEV) using a high-efficiency suction unit (i.e., a vacuum pump with an inlet nozzle held within 2 inches of the surgical site) that is activated throughout the procedure. Both the room ventilation system and the local exhaust ventilator should be equipped with filters and absorbers that capture particulates and absorb or inactivate airborne gases and vapours. These filters and absorbers require monitoring and replacement on a regular basis and are considered a possible biohazard requiring proper disposal.
Radiation
Ionizing radiation
When ionizing radiation strikes cells in living tissue, it may either kill the cell directly (i.e., cause burns or hair loss) or it may alter the genetic material of the cell (i.e., cause cancer or reproductive damage). Standards involving ionizing radiation may refer to exposure (the amount of radiation the body is exposed to) or dose (the amount of radiation the body absorbs) and may be expressed in terms of millirem (mrem), the usual measure of radiation, or rems (1 rem = 1,000 mrem).
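Because regulations may state limits in rem, millirem or the SI unit sievert, conversions are routine when comparing standards across jurisdictions. A minimal sketch using the standard factors (1 rem = 1,000 mrem; 1 rem = 10 mSv):

```python
def mrem_to_rem(mrem):
    """1 rem = 1,000 millirem."""
    return mrem / 1000.0

def rem_to_millisievert(rem):
    """SI equivalent dose: 1 rem = 0.01 sievert = 10 millisievert."""
    return rem * 10.0

# Example: an annual occupational limit stated as 5,000 mrem.
print(mrem_to_rem(5000))         # 5.0 rem
print(rem_to_millisievert(5.0))  # 50.0 mSv
```

The 5,000 mrem figure is used only as an illustration; the applicable limit depends on the jurisdiction and, in some places, on the part of the body exposed, as the following paragraph notes.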
Various jurisdictions have developed regulations governing the procurement, use, transportation and disposal of radioactive materials, as well as established limits for exposure (and in some places specific limits for dosage to various parts of the body), providing a strong measure of protection for radiation workers. In addition, institutions using radioactive materials in treatment and research generally develop their own internal controls in addition to those prescribed by law.
The greatest dangers to hospital workers are from scatter, the small amount of radiation that is deflected or reflected from the beam into the immediate vicinity, and from unexpected exposure, either because they are inadvertently exposed in an area not defined as a radiation area or because the equipment is not well maintained.
Radiation workers in diagnostic radiology (including x ray, fluoroscopy and angiography for diagnostic purposes, dental radiography and computerized axial tomography (CAT) scanners), in therapeutic radiology, in nuclear medicine for diagnostic and therapeutic procedures, and in radiopharmaceutical laboratories are carefully followed and checked for exposure, and radiation safety is usually well managed in their workstations, although there are many localities in which control is inadequate.
There are other areas not usually designated as “radiation areas”, where careful monitoring is needed to ensure that appropriate precautions are being taken by staff and that correct safeguards are provided for patients who might be exposed. These include angiography, emergency rooms, intensive care units, locations where portable x rays are being taken and operating rooms.
Prevention and control
The following protective measures are strongly recommended for ionizing radiation (x rays and radioisotopes):
Lead aprons, gloves and goggles must be worn by employees working in the direct field or where scatter radiation levels are high. All such protective equipment should be checked annually for cracks in the lead.
Dosimeters must be worn by all personnel exposed to ionizing radiation sources. Dosimeter badges should be regularly analysed by a laboratory with good quality control, and the results should be recorded. Records must be kept not only of each employee’s personal radiation exposure but also of the receipt and disposition of all radioisotopes.
In therapeutic radiology settings, periodic dose checks should be done using lithium fluoride (LiF) solid-state dosimeters to check on system calibration. Treatment rooms should be equipped with radiation monitor-door interlock and visual-alarm systems.
During internal or intravenous treatment with radioactive sources, the patient should be housed in a room located to minimize exposure to other patients and staff and signs posted warning others not to enter. Staff contact time should be limited, and staff should be careful in handling bedding, dressings and wastes from these patients.
During fluoroscopy and angiography, the following measures can minimize unnecessary exposure:
Full protective equipment should also be used by operating-room personnel during radiation procedures, and, when possible, personnel should stand 2 m or more from the patient.
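The recommendation to stand 2 m or more from the patient follows from the inverse-square law: for an approximately point-like source, dose rate falls off with the square of distance. A minimal sketch of that relationship, with an illustrative (not measured) reference dose rate:

```python
def dose_rate_at(distance_m, rate_at_1m):
    """Point-source approximation: dose rate falls off as 1 / distance**2."""
    return rate_at_1m / distance_m ** 2

# Illustrative scatter dose rate of 100 units at 1 m from the patient:
print(dose_rate_at(2.0, rate_at_1m=100.0))  # 25.0 -- doubling distance quarters the rate
```

This is why distance, together with time and shielding, is one of the three classical controls for external radiation exposure.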
Non-ionizing radiation
Ultraviolet radiation, lasers and microwaves are non-ionizing radiation sources. They are generally far less hazardous than ionizing radiation but nevertheless require special care to prevent injury.
Ultraviolet radiation is used in germicidal lamps, in certain dermatology treatments and in air filters in some hospitals. It is also produced in welding operations. Exposure of the skin to ultraviolet light causes sunburn, ages the skin and increases the risk of skin cancer. Eye exposure can result in temporary but extremely painful conjunctivitis. Long-term exposure can lead to partial loss of vision.
Standards regarding exposure to ultraviolet radiation are not widely applicable. The best approach to prevention is education and wearing shaded protective eyeglasses.
The Bureau of Radiological Health of the US Food and Drug Administration regulates lasers and classifies them into four classes, I to IV. The laser used to position patients in radiology is considered Class I and represents minimal risk. Surgical lasers, however, can pose a significant hazard to the retina of the eye where the intense beam can cause total loss of vision. Because of the high voltage supply required, all lasers present the risk of electrical shock. The accidental reflection of the laser beam during surgical procedures can result in injury to the staff. Guidelines for laser use have been developed by the American National Standards Institute and the US Army; for example, laser users should wear protective goggles specifically designed for each type of laser and take care not to focus the beam on reflecting surfaces.
The primary concern regarding exposure to microwaves, which are used in hospitals chiefly for cooking and heating food and for diathermy treatments, is the heating effect they have on the body. The eye lens and gonads, having fewer blood vessels with which to remove heat, are most vulnerable to damage. The long-term effects of low-level exposure have not been established, but there is some evidence that nervous system effects, decreased sperm count, sperm malformations (at least partially reversible after exposure ceases) and cataracts may result.
Prevention and control
The OSHA standard for exposure to microwaves is 10 milliwatts per square centimetre (10 mW/cm2). This is the level established to protect against the thermal effects of microwaves. In other countries where levels have been established to protect against reproductive and nervous system damage, the standards are as much as two orders of magnitude lower, that is, 0.01 mW/cm2 at 1.2 m.
To ensure the safety of workers, microwave ovens should be kept clean to protect the integrity of the door seals and should be checked for leakage at least every three months. Leakage from diathermy equipment should be monitored in the vicinity of the therapist before each treatment.
Hospital workers should be aware of the radiation hazards of ultraviolet exposure and of infrared heat used for therapy. They should have appropriate eye protection when using or repairing ultraviolet equipment, such as germicidal lamps and air purifiers or infrared instruments and equipment.
Conclusion
Physical agents represent an important class of hazards to workers in hospitals, clinics and private offices where diagnostic and therapeutic procedures are performed. These agents are discussed in more detail elsewhere in this Encyclopaedia. Their control requires education and training of all health professionals and support staff who may be involved, constant vigilance and systematic monitoring of both the equipment and the way it is used.
Several countries have established recommended noise, temperature and lighting levels for hospitals. These recommendations are, however, rarely included in the specifications given to hospital designers. Further, the few studies examining these variables have reported disquieting levels.
Noise
In hospitals, it is important to distinguish between machine-generated noise capable of impairing hearing (above 85 dBA) and noise which is associated with a degradation of ambiance, administrative work and care (65 to 85 dBA).
Machine-generated noise capable of impairing hearing
Prior to the 1980s, a few publications had already drawn attention to this problem. Van Wagoner and Maguire (1977) evaluated the incidence of hearing loss among 100 employees in an urban hospital in Canada. They identified five zones in which noise levels were between 85 and 115 dBA: the electrical plant, laundry, dish-washing station and printing department and areas where maintenance workers used hand or power tools. Hearing loss was observed in 48% of the 50 workers active in these noisy areas, compared to 6% of workers active in quieter areas.
Yassi et al. (1992) conducted a preliminary survey to identify zones with dangerously high noise levels in a large Canadian hospital. Integrated dosimetry and mapping were subsequently used to study these high-risk areas in detail. Noise levels exceeding 80 dBA were common. The laundry, central processing, nutrition department, rehabilitation unit, stores and electrical plant were all studied in detail. Integrated dosimetry revealed levels of up to 110 dBA at some of these locations.
Noise levels in a Spanish hospital’s laundry exceeded 85 dBA at all workstations and reached 97 dBA in some zones (Montoliu et al. 1992). Noise levels of 85 to 94 dBA were measured at some workstations in a French hospital’s laundry (Cabal et al. 1986). Although machine re-engineering reduced the noise generated by pressing machines to 78 dBA, this process was not applicable to other machines, due to their inherent design.
A study in the United States reported that electrical surgical instruments generate noise levels of 90 to 100 dBA (Willet 1991). In the same study, 11 of 24 orthopaedic surgeons were reported to suffer from significant hearing loss. The need for better instrument design was emphasized. Vacuum and monitor alarms have been reported to generate noise levels of up to 108 dBA (Hodge and Thompson 1990).
Noise associated with a degradation of ambiance, administrative work and care
A systematic review of noise levels in six Egyptian hospitals revealed the presence of excessive levels in offices, waiting rooms and corridors (Noweir and al-Jiffry 1991). This was attributed to the characteristics of hospital construction and of some of the machines. The authors recommended the use of more appropriate building materials and equipment and the implementation of good maintenance practices.
Work in the first computerized facilities was hindered by the poor quality of printers and the inadequate acoustics of offices. In the Paris region, groups of cashiers talked to their clients and processed invoices and payments in a crowded room whose low plaster ceiling had no acoustic absorption capacity. Noise levels with only one printer active (in practice, all four usually were) were 78 dBA for payments and 82 dBA for invoices.
In a 1992 study of a rehabilitation gymnasium consisting of 8 cardiac rehabilitation bicycles surrounded by four private patient areas, noise levels of 75 to 80 dBA and 65 to 75 dBA were measured near cardiac rehabilitation bicycles and in the neighbouring kinesiology area, respectively. Levels such as these render personalized care difficult.
Shapiro and Berland (1972) viewed noise in operating theatres as the “third pollution”, since it increases the fatigue of the surgeons, exerts physiological and psychological effects and influences the accuracy of movements. Noise levels were measured during a cholecystectomy and during tubal ligation. Irritating noises were associated with the opening of a package of gloves (86 dBA), the installation of a platform on the floor (85 dBA), platform adjustment (75 to 80 dBA), placing surgical instruments upon each other (80 dBA), suctioning of the patient’s trachea (78 dBA), the continuous suction bottle (75 to 85 dBA) and the heels of nurses’ shoes (68 dBA). The authors recommended the use of heat-resistant plastic, less noisy instruments and, to minimize reverberation, easily cleaned materials other than ceramic or glass for walls, tiles and ceilings.
Noise levels of 51 to 82 dBA and 54 to 73 dBA have been measured in the centrifuge room and automated analyser room of a medical analytical laboratory. The Leq (reflecting full-shift exposure) at the control station was 70.44 dBA, with 3 hours over 70 dBA. At the technical station, the Leq was 72.63 dBA, with 7 hours over 70 dBA. The following improvements were recommended: installing telephones with adjustable ring levels, grouping centrifuges in a closed room, moving photocopiers and printers and installing hutches around the printers.
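The Leq figures cited above combine fluctuating sound levels into a single energy-equivalent value. A minimal sketch of the standard formula, assuming equal-duration samples (the sample values below are invented for illustration):

```python
import math

def leq(levels_dba):
    """Energy-equivalent continuous sound level for equal-duration samples:
    Leq = 10 * log10(mean(10^(Li/10)))."""
    mean_energy = sum(10 ** (l / 10) for l in levels_dba) / len(levels_dba)
    return 10 * math.log10(mean_energy)

# Hypothetical hourly dBA readings over a full shift
samples = [68, 72, 75, 70, 66, 73, 71, 69]
print(f"Leq = {leq(samples):.1f} dBA")
```

Because the averaging is done on sound energy rather than on decibel values, a few loud periods dominate the result: this is why a shift with several hours above 70 dBA yields an Leq above 70 dBA even if quieter hours are also present.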
Patient Care and Comfort
In several countries, recommended noise limits for care units are 35 dBA at night and 40 dBA during the day (Turner, King and Craddock 1975). Falk and Woods (1973) were the first to draw attention to this point, in their study of noise levels and sources in neonatology incubators, recovery rooms and two rooms in an intensive-care unit. The following mean levels were measured over a 24-hour period: 57.7 dBA (74.5 dB) in the incubators, 65.5 dBA (80 dB linear) at the head of patients in the recovery room, 60.1 dBA (73.3 dB) in the intensive care unit and 55.8 dBA (68.1 dB) in one patient room. Noise levels in the recovery room and intensive-care unit were correlated with the number of nurses. The authors emphasized the probable stimulation of patients’ hypophyseal-corticoadrenal system by these noise levels, and the resultant increase in peripheral vasoconstriction. There was also some concern about the hearing of patients receiving aminoglycoside antibiotics. These noise levels were considered incompatible with sleep.
Several studies, most of which have been conducted by nurses, have shown that noise control improves patient recovery and quality of life. Reports of research conducted in neonatology wards caring for low-birth-weight babies emphasized the need to reduce the noise caused by personnel, equipment and radiology activities (Green 1992; Wahlen 1992; Williams and Murphy 1991; Oëler 1993; Lotas 1992; Halm and Alpen 1993). Halm and Alpen (1993) have studied the relationship between noise levels in intensive-care units and the psychological well-being of patients and their families (and in extreme cases, even of post-resuscitation psychosis). The effect of ambient noise on the quality of sleep has been rigorously evaluated under experimental conditions (Topf 1992). In intensive care units, the playing of pre-recorded sounds was associated with a deterioration of several sleep parameters.
A multi-ward study reported peak noise levels at the head of patients in excess of 80 dBA, especially in intensive- and respiratory-care units (Meyer et al. 1994). Lighting and noise levels were recorded continuously over seven consecutive days in a medical intensive-care unit, one-bed and multi-bed rooms in a respiratory-care unit and a private room. Noise levels were very high in all cases. The number of peaks exceeding 80 dBA was particularly high in the intensive- and respiratory-care units, with a maximum observed between 12:00 and 18:00 and a minimum between 00:00 and 06:00. Sleep deprivation and fragmentation were considered to have a negative impact on the respiratory system of patients and impair the weaning of patients from mechanical ventilation.
Blanpain and Estryn-Béhar (1990) found few noisy machines such as waxers, ice machines and hotplates in their study of ten Paris-area wards. However, the size and surfaces of the rooms could either reduce or amplify the noise generated by these machines, as well as that (albeit lower) generated by passing cars, ventilation systems and alarms. Noise levels in excess of 45 dBA (observed in 7 of 10 wards) did not promote patient rest. Furthermore, noise disturbed hospital personnel performing very precise tasks requiring close attention. In five of 10 wards, noise levels at the nursing station reached 65 dBA; in two wards, levels of 73 dBA were measured. Levels in excess of 65 dBA were measured in three pantries.
In some cases, architectural decorative effects were instituted with no thought to their effect on acoustics. For example, glass walls and ceilings have been in fashion since the 1970s and have been used in patient admission open-space offices. The resultant noise levels do not contribute to the creation of a calm environment in which patients about to enter the hospital can fill out forms. Fountains in this type of hall generated a background noise level of 73 dBA at the reception desk, requiring receptionists to ask one-third of people requesting information to repeat themselves.
Heat stress
Costa, Trinco and Schallenberg (1992) studied the effect of installing a laminar flow system, which maintained air sterility, on heat stress in an orthopaedic operating theatre. Temperature in the operating theatre increased by approximately 3 °C on average and could reach 30.2 °C. This was associated with a deterioration of the thermal comfort of operating-room personnel, who must wear very bulky clothes that favour heat retention.
Cabal et al. (1986) analysed heat stress in a hospital laundry in central France prior to its renovation. They noted that the relative humidity at the hottest workstation, the “gown-dummy”, was 30%, and radiant temperature reached 41 °C. Following installation of double-pane glass and reflective outside walls, and implementation of 10 to 15 air changes per hour, thermal comfort parameters fell within standard levels at all workstations, regardless of the weather outside. A study of a Spanish hospital laundry has shown that high wet-bulb temperatures result in oppressive work environments, especially in ironing areas, where temperatures may exceed 30 °C (Montoliu et al. 1992).
Blanpain and Estryn-Béhar (1990) characterized the physical work environment in ten wards whose work content they had already studied. Temperature was measured twice in each of ten wards. The nocturnal temperature in patient rooms may be below 22 °C, as patients use covers. During the day, as long as patients are relatively inactive, a temperature of 24 °C is acceptable but should not be exceeded, since some nursing interventions require significant exertion.
The following temperatures were observed between 07:00 and 07:30: 21.5 °C in geriatric wards, 26 °C in a non-sterile room in the haematology ward. At 14:30 on a sunny day, the temperatures were as follows: 23.5 °C in the emergency room and 29 °C in the haematology ward. Afternoon temperatures exceeded 24 °C in 9 of 19 cases. The relative humidity in four out of five wards with general air-conditioning was below 45% and was below 35% in two wards.
Afternoon temperature also exceeded 22 °C at all nine care preparation stations and 26 °C at three care stations. The relative humidity was below 45% in all five stations of wards with air-conditioning. In the pantries, temperatures ranged between 18 °C and 28.5 °C.
Temperatures of 22 °C to 25 °C were measured at the urine drains, where there were also odour problems and where dirty laundry was sometimes stored. Temperatures of 23 °C to 25 °C were measured in the two dirty-laundry closets; a temperature of 18 °C would be more appropriate.
Complaints concerning thermal comfort were frequent in a survey of 2,892 women working in Paris-area wards (Estryn-Béhar et al. 1989a). Complaints of being often or always hot were reported by 47% of morning- and afternoon-shift nurses and 37% of night-shift nurses. Although nurses were sometimes obliged to perform physically strenuous work, such as making several beds, the temperature in the various rooms was too high to perform these activities comfortably while wearing polyester-cotton clothes, which hinder evaporation, or gowns and masks necessary for the prevention of nosocomial infections.
On the other hand, 46% of night-shift nurses and 26% of morning- and afternoon-shift nurses reported being often or always cold. The proportions reporting never suffering from the cold were 11% and 26%.
To conserve energy, the heating in hospitals was often lowered during the night, when patients are under covers. However, nurses, who must remain alert despite chronobiologically mediated drops in core body temperature, were required to put on jackets (not always very hygienic ones) around 04:00. At the end of the study, some wards installed adjustable space-heating at nursing stations.
Studies of 1,505 women in 26 units conducted by occupational physicians revealed that rhinitis and eye irritation were more frequent among nurses working in air-conditioned rooms (Estryn-Béhar and Poinsignon 1989) and that work in air-conditioned environments was related to an almost twofold increase in dermatoses likely to be occupational in origin (adjusted odds ratio of 2) (Delaporte et al. 1990).
Lighting
Several studies have shown that the importance of good lighting is still underestimated in administrative and general departments of hospitals.
Cabal et al. (1986) observed that lighting levels at half of the workstations in a hospital laundry were no higher than 100 lux. Lighting levels following renovations were 300 lux at all workstations, 800 lux at the darning station and 150 lux between the washing tunnels.
Blanpain and Estryn-Béhar (1990) observed maximum night lighting levels below 500 lux in 9 out of 10 wards. Lighting levels were below 250 lux in five pharmacies with no natural lighting and were below 90 lux in three pharmacies. It should be recalled that the difficulty in reading small lettering on labels experienced by older persons may be mitigated by increasing the level of illumination.
Building orientation can result in high day-time lighting levels that disturb patients’ rest. For example, in geriatric wards, beds furthest from the windows received 1,200 lux, while those nearest the windows received 5,000 lux. The only window shading available in these rooms was solid window blinds, and nurses were unable to dispense care in four-bed rooms when these were drawn. In some cases, nurses stuck paper on the windows to provide patients with some relief.
The lighting in some intensive-care units is too intense to allow patients to rest (Meyer et al. 1994). The effect of lighting on patients’ sleep has been studied in neonatology wards by North American and German nurses (Oëler 1993; Boehm and Bollinger 1990).
In one hospital, surgeons disturbed by reflections from white tiles requested the renovation of the operating theatre. Lighting levels outside the shadow-free zone (15,000 to 80,000 lux) were reduced. However, this resulted in levels of only 100 lux at the instrument nurses’ work surface, 50 to 150 lux at the wall unit used for equipment storage, 70 lux at the patients’ head and 150 lux at the anaesthetists’ work surface. To avoid generating glare capable of affecting the accuracy of surgeons’ movements, lamps were installed outside of surgeons’ sight-lines. Rheostats were installed to control lighting levels at the nurses’ work surface between 300 and 1,000 lux and general levels between 100 and 300 lux.
Construction of a hospital with extensive natural lighting
In 1981, planning for the construction of Saint Mary’s Hospital on the Isle of Wight began with a goal of halving energy costs (Burton 1990). The final design called for extensive use of natural lighting and incorporated double-pane windows that could be opened in the summer. Even the operating theatre has an outside view and paediatric wards are located on the ground floor to allow access to play areas. The other wards, on the second and third (top) floors, are equipped with windows and ceiling lighting. This design is quite suitable for temperate climates but may be problematic where ice and snow inhibit overhead lighting or where high temperatures may lead to a significant greenhouse effect.
Architecture and Working Conditions
Flexible design is not multi-functionality
Prevailing concepts from 1945 to 1985, in particular the fear of instant obsolescence, were reflected in the construction of multi-purpose hospitals composed of identical modules (Games and Taton-Braen 1987). In the United Kingdom this trend led to the development of the “Harness system”, whose first product was the Dudley Hospital, built in 1974. Seventy other hospitals were later built on the same principles. In France, several hospitals were constructed on the “Fontenoy” model.
Building design should not prevent modifications necessitated by the rapid evolution of therapeutic practice and technology. For example, partitions, fluid circulation subsystems and technical duct-work should all be capable of being easily moved. However, this flexibility should not be construed as an endorsement of the goal of complete multi-functionality—a design goal which leads to the construction of facilities poorly suited to any speciality. For example, the surface area needed to store machines, bottles, disposable equipment and medication is different in surgical, cardiology and geriatric wards. Failure to recognize this will lead to rooms being used for purposes they were not designed for (e.g., bathrooms being used for bottle storage).
The Loma Linda Hospital in California (United States) is an example of better hospital design and has been copied elsewhere. Here, nursing and technical medicine departments are located above and below technical floors; this “sandwich” structure permits easy maintenance and adjustment of fluid circulation.
Unfortunately, hospital architecture does not always reflect the needs of those who work there, and multi-functional design has been responsible for reported problems related to physical and cognitive strain. Consider a 30-bed ward composed of one- and two-bed rooms, in which there is only one functional area of each type (nursing station, pantry, storage of disposable materials, linen or medication), all based on the same all-purpose design. In this ward, the management and dispensation of care obliges nurses to change location extremely frequently, and work is greatly fragmented. A comparative study of ten wards has shown that the distance from the nurses’ station to the farthest room is an important determinant of both nurses’ fatigue (a function of the distance walked) and the quality of care (a function of the time spent in patients’ rooms) (Estryn-Béhar and Hakim-Serfaty 1990).
This discrepancy between the architectural design of spaces, corridors and materials, on the one hand, and the realities of hospital work, on the other, has been characterized by Patkin (1992), in a review of Australian hospitals, as an ergonomic “debacle”.
Preliminary analysis of the spatial organization in nursing areas
The first mathematical model of the nature, purposes and frequency of staff movements, based on the Yale Traffic Index, appeared in 1960 and was refined by Lippert in 1971. However, attention to one problem in isolation may in fact aggravate others. For example, locating a nurses’ station in the centre of the building, in order to reduce the distances walked, may worsen working conditions if nurses must spend over 30% of their time in such windowless surroundings, known to be a source of problems related to lighting, ventilation and psychological factors (Estryn-Béhar and Milanini 1992).
The distance of the preparation and storage areas from patients is less problematic in settings with a high staff-patient ratio and where the existence of a centralized preparation area facilitates the delivery of supplies several times per day, even on holidays. In addition, long waits for elevators are less common in high-rise hospitals with over 600 beds, where the number of elevators is not limited by financial constraints.
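Models in the tradition of the Yale Traffic Index and Lippert’s refinement weight the distances between functional areas by the frequency of staff trips between them. A simplified sketch of that idea (the layout distances and trip counts below are invented, purely for illustration):

```python
# Hypothetical ward layout: one-way distances (m) from the nursing
# station to each functional area, and trips per shift to each area.
distances = {"pantry": 18, "linen": 25, "medication": 8, "farthest_room": 40}
trips_per_shift = {"pantry": 12, "linen": 6, "medication": 30, "farthest_room": 10}

# Total walking load per shift, counting each trip as a round trip
total_m = sum(2 * distances[a] * trips_per_shift[a] for a in distances)
print(f"walking load: {total_m} m per shift")
```

Even this toy model shows why a frequently visited area (here, medication storage) contributes heavily to the walking load despite being close by, and why optimizing one distance in isolation, as noted above, may simply shift the burden elsewhere.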
Research on the design of specific but flexible hospital units
In the United Kingdom in the late 1970s, the Health Ministry created a team of ergonomists to compile a database on ergonomics training and on the ergonomic layout of hospital work areas (Haigh 1992). Noteworthy examples of the success of this programme include the modification of the dimensions of laboratory furniture to take into account the demands of microscopy work and the redesign of maternity rooms to take into account nurses’ work and mothers’ preferences.
Cammock (1981) emphasized the need to provide distinct nursing, public and common areas, with separate entrances for nursing and public areas, and separate connections between these areas and the common area. Furthermore, there should be no direct contact between the public and nursing areas.
The Krankenanstalt Rudolfsstiftung is the first pilot hospital of the “European Healthy Hospitals” project. The Viennese pilot project consists of eight sub-projects, one of which, the “Service Reorganization” project, is an attempt, in collaboration with ergonomists, to promote functional reorganization of available space (Pelikan 1993). For example, all the rooms in an intensive care unit were renovated and rails for patient lifts installed in the ceilings of each room.
A comparative analysis of 90 Dutch hospitals suggests that small units (floors of less than 1,500 m²) are the most efficient, as they allow nurses to tailor their care to the specifics of patients’ occupational therapy and family dynamics (Van Hogdalem 1990). This design also increases the time nurses can spend with patients, since they waste less time in changes of location and are less subject to uncertainty. Finally, the use of small units reduces the number of windowless work areas.
A study carried out in the health administration sector in Sweden reported better employee performance in buildings incorporating individual offices and conference rooms, as opposed to an open plan (Ahlin 1992). The existence in Sweden of an institute dedicated to the study of working conditions in hospitals, and of legislation requiring consultation with employee representatives both before and during all construction or renovation projects, has resulted in the regular recourse to participatory design based on ergonomic training and intervention (Tornquist and Ullmark 1992).
Architectural design based on participatory ergonomics
Workers must be involved in the planning of the behavioural and organizational changes associated with the occupation of a new work space. The adequate organization and equipping of a workplace requires taking into account the organizational elements that require modification or emphasis. Two detailed examples taken from two hospitals illustrate this.
Estryn-Béhar et al. (1994) report the results of the renovation of the common areas of a medical ward and a cardiology ward of the same hospital. The ergonomics of the work performed by each profession in each ward was observed over seven entire workdays and discussed over a two-day period with each group. The groups included representatives of all occupations (department heads, supervisors, interns, nurses, nurses’ aides, orderlies) from all the shifts. One entire day was spent developing architectural and organizational proposals for each problem noted. Two more days were spent on the simulation of characteristic activities by the entire group, in collaboration with an architect and an ergonomist, using modular cardboard mock-ups and scale models of objects and people. Through this simulation, representatives of the various occupations were able to agree on distances and the distribution of space within each ward. Only after this process was concluded was the design specification drawn up.
The same participatory method was used in a cardiac intensive-care unit in another hospital (Estryn-Béhar et al. 1995a, 1995b). It was found that four types of virtually incompatible activities, each occupying its own zone, were conducted at the nursing station.
These zones overlapped, and nurses had to cross the meeting-writing-monitoring area to reach the other areas. Because of the position of the furniture, nurses had to change direction three times to get to the drain-board. Patient rooms were laid out along a corridor, both for regular intensive care and highly intensive care. The storage units were located at the far end of the ward from the nursing station.
In the new layout, the station’s longitudinal orientation of functions and traffic is replaced with a lateral one which allows direct and central circulation in a furniture-free area. The meeting-writing-monitoring area is now located at the end of the room, where it offers a calm space near windows, while remaining accessible. The clean and dirty preparation areas are located by the entrance to the room and are separated from each other by a large circulation area. The highly intensive care rooms are large enough to accommodate emergency equipment, a preparation counter and a deep washbasin. A glass wall installed between the preparation areas and the highly intensive care rooms ensures that patients in these rooms are always visible. The main storage area was rationalized and reorganized. Plans are available for each work and storage area.
Architecture, ergonomics and developing countries
These problems are also found in developing countries; in particular, renovations there frequently involve the elimination of common rooms. The performance of ergonomic analysis would identify existing problems and help avoid new ones. For example, the construction of wards comprised of only one- or two-bed rooms increases the distances that personnel must travel. Inadequate attention to staffing levels and the layout of nursing stations, satellite kitchens, satellite pharmacies and storage areas may lead to significant reductions in the amount of time nurses spend with patients and may render work organization more complex.
Furthermore, the application in developing countries of the multi-functional hospital model of developed countries does not take into account different cultures’ attitudes toward space utilization. Manuaba (1992) has pointed out that the layout of developed countries’ hospital rooms and the type of medical equipment used is poorly suited to developing countries, and that the rooms are too small to comfortably accommodate visitors, essential partners in the curative process.
Hygiene and Ergonomics
In hospital settings, many breaches of asepsis can be understood and corrected only by reference to work organization and work space. Effective implementation of the necessary modifications requires detailed ergonomic analysis. This analysis serves to characterize the interdependencies of team tasks, rather than their individual characteristics, and identify discrepancies between real and nominal work, especially nominal work described in official protocols.
Hand-mediated contamination was one of the first targets in the fight against nosocomial infections. In theory, hands should be systematically washed on entering and leaving patients’ rooms. Although initial and ongoing training of nurses emphasizes the results of descriptive epidemiological studies, research indicates persistent problems associated with hand-washing. In a study conducted in 1987 and involving continuous observation of entire 8-hour shifts in 10 wards, Delaporte et al. (1990) observed an average of 17 hand-washings by morning-shift nurses, 13 by afternoon-shift nurses and 21 by night-shift nurses.
Nurses washed their hands one-half to one-third as often as is recommended for their number of patient contacts (without even considering care-preparation activities); for nurses’ aides, the ratio was one-third to one-fifth. Hand-washing before and after each activity is, however, clearly impossible, in terms of both time and skin damage, given the atomization of activity, the number of technical interventions and the frequency of interruptions and attendant repetition of care that personnel must cope with. Reduction of work interruptions is thus essential and should take precedence over simply reaffirming the importance of hand-washing, which, in any event, cannot be performed more than 25 to 30 times per day.
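The shortfall described can be made concrete with a small calculation; the observed count comes from Delaporte et al., while the number of patient contacts per shift is a hypothetical figure chosen only to illustrate the ratio:

```python
observed_washes = 17     # morning-shift average (Delaporte et al. 1990)
patient_contacts = 45    # hypothetical contacts per shift, for illustration
recommended = patient_contacts  # one wash per contact, ignoring preparation work

# Fraction of the recommended hand-washing actually performed
ratio = observed_washes / recommended
print(f"washing performed at {ratio:.0%} of the recommended rate")
```

With these assumed figures the ratio falls between one-half and one-third, consistent with the observations reported above.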
Similar patterns of hand-washing were found in a study based on observations collected over 14 entire workdays in 1994 during the reorganization of the common areas of two university hospital wards (Estryn-Béhar et al. 1994). In every case, nurses would have been incapable of dispensing the required care if they had returned to the nursing station to wash their hands. In short-term-stay units, for example, almost all the patients have blood samples drawn and subsequently receive oral and intravenous medication at virtually the same time. The density of activities at certain times also renders appropriate hand-washing impossible: in one case, an afternoon-shift nurse responsible for 13 patients in a medical ward entered patients’ rooms 21 times in one hour. Poorly organized information provision and transmission structures contributed to the number of visits he was obliged to perform. Given the impossibility of washing his hands 21 times in one hour, the nurse washed them only when dealing with the most fragile patients (i.e., those suffering from pulmonary failure).
Ergonomically based architectural design takes several factors affecting hand-washing into account, especially those concerning the location and access to wash-basins, but also the implementation of truly functional “dirty” and “clean” circuits. Reduction of interruptions through participatory analysis of organization helps to make hand-washing possible.
Epidemiology
Back pain accounts for a growing share of disease in developed industrial societies. According to data provided by the National Center for Health Statistics in the United States, chronic diseases of the back and of the vertebral column make up the dominant group among disorders affecting employable individuals under 45 in the US population. Countries such as Sweden, which traditionally maintain good occupational accident statistics, show that musculoskeletal injuries occur twice as frequently in the health services as in all other fields (Lagerlöf and Broberg 1989).
In an analysis of accident frequency in a 450-bed hospital in the United States, Kaplan and Deyo (1988) were able to demonstrate an 8 to 9% yearly incidence of injury to lumbar vertebrae in nurses, leading on average to 4.7 days of absence from work. Thus of all employee groups in hospitals, nurses were the one most afflicted by this condition.
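The figures from Kaplan and Deyo imply a rough expected burden per ward; a back-of-the-envelope sketch, assuming a hypothetical staff of 100 nurses and taking the midpoint of the 8 to 9% yearly incidence:

```python
nurses = 100             # hypothetical staff size, for illustration
incidence = 0.085        # midpoint of the 8-9% yearly incidence cited
mean_absence_days = 4.7  # average absence per injury (Kaplan and Deyo 1988)

# Expected annual cases and lost workdays under these assumptions
expected_cases = nurses * incidence
expected_lost_days = expected_cases * mean_absence_days
print(f"{expected_cases:.1f} cases/year, ~{expected_lost_days:.0f} lost days")
```

Under these assumptions, a 100-nurse staff would lose on the order of 40 workdays per year to lumbar injury alone, before counting reduced productivity or turnover.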
As is clear from a survey of studies done in the last 20 years (Hofmann and Stössel 1995), this disorder has become the object of intensive epidemiological research. All the same, such research—particularly when it aims at furnishing internationally comparable results—is subject to a variety of methodological difficulties. Sometimes all employee categories in the hospital are investigated, sometimes simply nurses. Some studies have suggested that it would make sense to differentiate, within the group “nurses”, between registered nurses and nursing aides. Since nurses are predominantly women (about 80% in Germany), and since reported incidence and prevalence rates regarding this disorder do not differ significantly for male nurses, gender-related differentiation would seem to be of less importance to epidemiological analyses.
More important is the question of what investigative tools should be used to research back pain conditions and their gradations. Along with the interpretation of accident, compensation and treatment statistics, one frequently finds, in the international literature, a retrospectively applied standardized questionnaire, to be filled out by the person tested. Other investigative approaches operate with clinical investigative procedures such as orthopaedic function studies or radiological screening procedures. Finally, the more recent investigative approaches also use biomechanical modelling and direct or video-taped observation to study the pathophysiology of work performance, particularly as it involves the lumbo-sacral area (see Hagberg et al. 1993 and 1995).
An epidemiological determination of the extent of the problem based on self-reported incidence and prevalence rates, however, poses difficulties as well. Cultural-anthropological studies and comparisons of health systems have shown that perceptions of pain differ not only between members of different societies but also within societies (Payer 1988). Also, there is the difficulty of objectively grading the intensity of pain, a subjective experience. Finally, the prevailing perception among nurses that “back pain goes with the job” leads to under-reporting.
International comparisons based on analyses of governmental statistics on occupational disorders are unreliable for scientific evaluation of this disorder because of variations in the laws and regulations related to occupational disorders among different countries. Further, within a single country, there is the truism that such data are only as reliable as the reports upon which they are based.
In summary, many studies have determined that 60 to 80% of all nursing staff (with an average age of 30 to 40 years) have had at least one episode of back pain during their working lives. Reported incidence rates usually do not exceed 10%. In classifying back pain, it has proved helpful to follow the suggestion of Nachemson and Anderson (1982) to distinguish between back pain alone and back pain with sciatica. In an as-yet unpublished study, a subjective complaint of sciatica was found useful in classifying the results of subsequent computer-assisted tomography (CAT) scans and magnetic resonance imaging (MRI).
Economic Costs
Estimates of the economic costs differ greatly, depending, in part, on the possibilities and conditions of diagnosis, treatment and compensation available at the particular time and/or place. Thus, in the US for 1976, Snook (1988b) estimated that the costs of back pain totalled US$14 billion, while a total cost of US$25 billion was calculated for 1983. The calculations of Holbrook et al. (1984), which estimated 1984 costs to total just under US$16 billion, appear to be most reliable. In the United Kingdom, costs were estimated to have risen by US$2 billion between 1987 and 1989 according to Ernst and Fialka (1994). Estimates of direct and indirect costs for 1990 reported by Cats-Baril and Frymoyer (1991) indicate that the costs of back pain have continued to increase. In 1988 the US Bureau of National Affairs reported that chronic back pain generated costs of US$80,000 per chronic case per year.
In Germany, the two largest workers’ accident insurance funds (Berufsgenossenschaften) developed statistics showing that, in 1987, about 15 million work days were lost because of back pain. This corresponds to roughly one-third of all missed work days annually. These losses appear to be increasing at a current average cost of DM 800 per lost day.
It may therefore be said, independently of national differences and vocational groups, that back disorders and their treatment represent not simply a human and a medical problem, but also an enormous economic burden. Accordingly, it seems advisable to pay special attention to the prevention of these disorders in particularly burdened vocational groups such as nursing.
In research on the causes of work-related disorders of the lower back in nurses, one should in principle differentiate between those attributed to a particular incident or accident and those whose genesis lacks such specificity. Both may give rise to chronic back pain if not properly treated. Reflecting their presumed medical knowledge, nurses are much more prone than other groups in the working population to use self-medication and self-treatment without consulting a physician. This is not always a disadvantage, since many physicians either do not know how to treat back problems or give them short shrift, simply prescribing sedatives and advising heat applications to the area. The latter reflects the oft-repeated truism that “backaches come with the job”, or the tendency to regard workers with chronic back complaints as malingerers.
Detailed analyses of work accident occurrences in the area of spinal disorders have only just begun to be made (see Hagberg et al. 1995). This is also true of the analysis of so-called near-accidents, which can provide a particular sort of information concerning the precursor conditions of a given work accident.
The cause of low back disorders has been attributed by the majority of the studies to the physical demands of the work of nursing, i.e., lifting, supporting and moving of patients and handling heavy and/or bulky equipment and materials, often without ergonomic aids or the help of additional personnel. These activities are often conducted in awkward body positions, where footing is uncertain, and when, out of wilfulness or dementia, the nurse’s efforts are resisted by the patient. Trying to keep a patient from falling often results in injury to the nurse or the attendant. Current research, however, is characterized by a strong tendency to speak in terms of multicausality, whereby both the biomechanical basis of demands made upon the body and the anatomical preconditions are discussed.
In addition to faulty biomechanics, injury in such situations can be pre-conditioned by fatigue, muscular weakness (especially of the abdominals, back extensors and quadriceps), diminished flexibility of joints and ligaments and various forms of arthritis. Excessive psychosocial stress can contribute in two ways: (1) prolonged unconscious muscular tension and spasm leading to muscular fatigue and proneness to injury, and (2) irritation and impatience which prompts injudicious attempts to work hurriedly and without waiting for assistance. Enhanced ability to cope with stress and the availability of social support in the workplace are helpful (Theorell 1989; Bongers et al. 1992) when work-related stressors cannot be eliminated or controlled.
Diagnosis
To the risk factors deriving from the biomechanics of the forces acting on the spine and from the anatomy of the support and movement apparatus may be added certain risk situations and dispositions attributable to the work environment. Even though current research is not clear on this point, there is some indication that the increased and recurrent incidence of psychosocial stress factors in nursing work can reduce the threshold of sensitivity to physically burdensome activities, thus contributing to an increased level of vulnerability. In any case, whether such stress factors exist appears to be less decisive in this connection than how nursing staff manage them in a demanding situation and whether they can count on social support in the workplace (Theorell 1989; Bongers et al. 1992).
The proper diagnosis of low back pain requires a complete medical and a detailed occupational history including accidents resulting in injury or near-misses and prior episodes of back pain. The physical examination should include evaluation of gait and posture, palpation for areas of tenderness and evaluation of muscle strength, range of motion and joint flexibility. Complaints of weakness in the leg, areas of numbness and pain that radiate below the knee are indications for neurological examination to seek evidence of spinal cord and/or peripheral nerve involvement. Psychosocial problems may be disclosed through judicious probing of emotional status, attitudes and pain tolerance.
Radiological studies and scans are rarely helpful since, in the vast majority of cases, the problem lies in the muscles and ligaments rather than the bony structures. In fact, bony abnormalities are found in many individuals who have never had back pain; ascribing the back pain to such radiological findings as disc space narrowing or spondylosis may lead to needlessly heroic treatment. Myelography should not be undertaken unless spinal surgery is contemplated.
Clinical laboratory tests are useful in assessing general medical status and may be helpful in disclosing systemic diseases such as arthritis.
Treatment
Various modes of management are indicated depending on the nature of the disorder. Besides ergonomic interventions to enable the return of injured workers to the workplace, surgical, invasive-radiological, pharmacological, physical, physiotherapeutic and also psychotherapeutic management approaches may be necessary—sometimes in combination (Hofmann et al. 1994). Again, however, the vast majority of cases resolve regardless of the therapy offered. Treatment is discussed further in the Case Study: Treatment of Back Pain.
Prevention in the Work Environment
Primary prevention of back pain in the workplace involves the application of ergonomic principles and the use of technical aids, coupled with physical conditioning and training of the workers.
Despite the reservations frequently held by nursing staff regarding the use of technical aids for the lifting, positioning and moving of patients, the importance of ergonomic approaches to prevention is increasing (see Estryn-Béhar, Kaminski and Peigné 1990; Hofmann et al. 1994).
In addition to the major systems (permanently installed ceiling lifters, mobile floor lifters), a series of small and simple aids has found its way into nursing practice (turntables, walking girdles, lifting cushions, slide boards, bed ladders, anti-slide mats and so on). When using these aids it is important that their actual use fits in well with the care concept of the particular area of nursing in which they are used. Wherever the use of such lifting aids contradicts the care concept practised, acceptance of these aids by nursing staff tends to be low.
Even where technical aids are employed, training in techniques of lifting, carrying and supporting is essential. Lidström and Zachrisson (1973) describe a Swedish “Back School” in which physiotherapists trained in communication conduct classes explaining the structure of the spine and its muscles, how they work in different positions and movements and what can go wrong with them, and demonstrating appropriate lifting and handling techniques that will prevent injury. Klaber Moffet et al. (1986) describe the success of a similar programme in the UK. Such training in lifting and carrying is particularly important where, for one reason or another, use of technical aids is not possible. Numerous studies have shown that training in such techniques must constantly be reviewed; knowledge gained through instruction is frequently “unlearned” in practice.
Unfortunately, the physical demands presented by patients’ size, weight, illness and positioning are not always amenable to nurses’ control and they are not always able to modify the physical environment and the way their duties are structured. Accordingly, it is important for institutional managers and nursing supervisors to be included in the educational programme so that, when making decisions about work environments, equipment and job assignments, factors making for “back friendly” working conditions can be considered. At the same time, deployment of staff, with particular reference to nurse-patient ratios and the availability of “helping hands”, must be appropriate to the nurses’ well-being as well as consistent with the care concept, as hospitals in the Scandinavian countries seem to have managed to do in exemplary fashion. This is becoming ever more important where fiscal constraints dictate staff reductions and cut-backs in equipment procurement and maintenance.
Recently developed holistic concepts, which see such training not simply as instruction in bedside lifting and carrying techniques but rather as movement programmes for both nurses and patients, could take the lead in future developments in this area. Approaches to “participatory ergonomics” and programmes of health advancement in hospitals (understood as organizational development) must also be more intensively discussed and researched as future strategies (see article “Hospital ergonomics: A review”).
Since psychosocial stress factors also exercise a moderating function in the perception and mastering of the physical demands made by work, prevention programmes should also ensure that colleagues and superiors work to ensure satisfaction with work, avoid making excessive demands on the mental and physical capacities of workers and provide an appropriate level of social support.
Preventive measures should extend beyond professional life to include work in the home (housekeeping and caring for small children who have to be lifted and carried are particular hazards) as well as in sports and other recreational activities. Individuals with persistent or recurrent back pain, however it is acquired, should be no less diligent in following an appropriate preventive regimen.
Rehabilitation
The key to a rapid recovery is early mobilization and a prompt resumption of activities within the limits of tolerance and comfort. Most patients with acute back injuries recover fully and return to their usual work without incident. Resumption of an unrestricted range of activity should not be undertaken until exercises have fully restored muscle strength and flexibility and banished the fear and timidity that make for recurrent injury. Many individuals exhibit a tendency to recurrences and chronicity; for these, physiotherapy coupled with exercise and control of psychosocial factors will often be helpful. It is important that they return to some form of work as quickly as possible. Temporary elimination of more strenuous tasks and limitation of hours with a graduated return to unrestricted activity will promote a more complete recovery in these cases.
Fitness for work
The professional literature attributes only a very limited prognostic value to screening done before employees start work (US Preventive Services Task Force 1989). Ethical considerations and laws such as the Americans with Disabilities Act militate against pre-employment screening. It is generally agreed that pre-employment back x rays have no value, particularly when one considers their cost and the needless exposure to radiation. Newly-hired nurses and other health workers and those returning from an episode of disability due to back pain should be evaluated to detect any predisposition to this problem and provided with access to educational and physical conditioning programmes that will prevent it.
Conclusion
The social and economic impact of back pain, a problem particularly prevalent among nurses, can be minimized by the application of ergonomic principles and technology in the organization of their work and its environment, by physical conditioning that enhances the strength and flexibility of the postural muscles, by education and training in the performance of problematic activities and, when episodes of back pain do occur, by treatment that emphasizes a minimum of medical intervention and a prompt return to activity.
Most episodes of acute back pain respond promptly to several days of rest followed by the gradual resumption of activities within the limits of pain. Non-narcotic analgesics and non-steroidal anti-inflammatory drugs may be helpful in relieving pain but do not shorten the course. (Since some of these drugs affect alertness and reaction time, they should be used with caution by individuals who drive vehicles or have assignments where momentary lapses may result in harm to patients.) A variety of forms of physiotherapy (e.g., local applications of heat or cold, diathermy, massage, manipulation, etc.) often provide short periods of transient relief; they are particularly useful as a prelude to graded exercises that will promote the restoration of muscle strength and relaxation as well as flexibility. Prolonged bed rest, traction and the use of lumbar corsets tend to delay recovery and often lengthen the period of disability (Blow and Jayson 1988).
Chronic, recurrent back pain is best treated by a secondary prevention regimen. Getting enough rest, sleeping on a firm mattress, sitting in straight chairs, wearing comfortable, well-fitted shoes, maintaining good posture and avoiding long periods of standing in one position are important adjuncts. Excessive or prolonged use of medications increases the risk of side effects and should be avoided. Some cases are helped by the injection of “trigger points”, localized tender nodules in muscles and ligaments, as originally advocated in the seminal report by Lange (1931).
Exercise of key postural muscles (upper and lower abdominal, back, gluteal and thigh muscles) is the mainstay of both chronic care and prevention of back pain. Kraus (1970) has formulated a regimen that features strengthening exercises to correct muscle weakness, relaxing exercises to relieve tension, spasticity and rigidity, stretching exercises to minimize contractures and exercises to improve balance and coordination. These exercises, he cautions, should be individualized on the basis of examination of the patient and functional tests of muscle strength, holding power and elasticity (e.g., the Kraus-Weber tests (Kraus 1970)). To avoid adverse effects of exercise, each session should include warm-up and cool-down exercises as well as limbering and relaxing exercises, and the number, duration and intensity of the exercises should be increased gradually as conditioning improves. Simply giving the patient a printed exercise sheet or booklet is not enough; initially, he or she should be given individual instruction and observed to be sure that the exercises are being done correctly.
In 1974, the YMCA in New York introduced the “Y’s Way to a Healthy Back Program”, a low-cost course of exercise training based on the Kraus exercises; in 1976 it became a national programme in the US and, later, it was established in Australia and in several European countries (Melleby 1988). The twice-a-week, six-week programme is given by specially-trained YMCA exercise instructors and volunteers, mainly in urban YMCAs (arrangements for courses at the worksite have been made by a number of employers), and it emphasizes the indefinite continuation of the exercises at home. Approximately 80% of the thousands of individuals with chronic or recurrent back pain who have participated in this programme have reported elimination or improvement in their pain.
Infectious diseases play a significant part in worldwide occurrences of occupational disease in HCWs. Since reporting procedures vary from country to country, and since diseases considered job-related in one country may be classified as non-occupational elsewhere, accurate data concerning their frequency and their proportion of the overall number of occupational diseases among HCWs are difficult to obtain. The proportions range from about 10% in Sweden (Lagerlöf and Broberg 1989), to about 33% in Germany (BGW 1993) and nearly 40% in France (Estryn-Béhar 1991).
The prevalence of infectious diseases is directly related to the efficacy of preventive measures such as vaccines and post-exposure prophylaxis. For example, during the 1980s in France, the proportion of all viral hepatitides fell to 12.7% of its original level thanks to the introduction of vaccination against hepatitis B (Estryn-Béhar 1991). This was noted even before hepatitis A vaccine became available.
Similarly, it may be presumed that, with the declining immunization rates in many countries (e.g., in the Russian Federation and Ukraine in the former Soviet Union during 1994-1995), cases of diphtheria and poliomyelitis among HCWs will increase.
Finally, occasional infections with streptococci, staphylococci and Salmonella typhi are being reported among health care workers.
Epidemiological Studies
The most important infectious diseases in worldwide occurrences of occupational infectious diseases in health care workers—discussed below in approximate order of frequency—are hepatitis B, tuberculosis, hepatitis A and hepatitis C. Also important, though not in order of frequency, are varicella, mumps, measles, rubella and HIV/AIDS.
It is very doubtful that the very many cases of enteric infection (e.g., salmonella, shigella, etc.) often included in the statistics are, in fact, job-related, since these infections are transmitted faecally/orally as a rule.
Much data is available concerning the epidemiological significance of these job-related infections, mostly in relation to hepatitis B and its prevention, but also in relation to tuberculosis, hepatitis A and hepatitis C. Epidemiological studies have also dealt with measles, mumps, rubella, varicella and erythema infectiosum (Ringelröteln). In using them, however, care must be taken to distinguish between incidence studies (e.g., determination of annual hepatitis B infection rates), sero-epidemiological prevalence studies and other types of prevalence studies (e.g., tuberculin tests).
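The incidence/prevalence distinction drawn above matters whenever figures from different studies are compared. A minimal sketch, using invented numbers purely for illustration, of how the two measures differ:

```python
# Toy illustration of the incidence vs. prevalence distinction discussed above.
# All numbers are invented for the example and correspond to no real study.
staff_at_risk = 1_000

# Incidence study: NEW infections observed in the cohort over one year.
new_infections_this_year = 8
annual_incidence = new_infections_this_year / staff_at_risk   # 0.8% per year

# Sero-epidemiological prevalence study: staff carrying serological markers
# of past or present infection at a single point in time, regardless of
# when the infection occurred.
seropositive_staff = 150
seroprevalence = seropositive_staff / staff_at_risk           # 15%

print(f"annual incidence: {annual_incidence:.1%}")
print(f"seroprevalence:   {seroprevalence:.1%}")
```

A high seroprevalence can thus reflect decades of accumulated exposure (or prior vaccination) even where the current annual incidence is low, which is why the two study types cannot be compared directly.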
Hepatitis B
The risk of hepatitis B infection among HCWs, which is primarily transmitted through contact with blood via needlestick injuries, depends on the frequency of the disease in the population they serve. In northern, central and western Europe, Australia and North America it is found in about 2% of the population. It is encountered in about 7% of the population in southern and south-eastern Europe and most parts of Asia. In Africa, the northern parts of South America and in eastern and south-eastern Asia, rates as high as 20% have been observed (Hollinger 1990).
A Belgian study found that 500 HCWs in northern Europe became infected with hepatitis B each year while the figure for southern Europe was 5,000 (Van Damme and Tormanns 1993). The authors calculated that the annual case rate for western Europe is about 18,200 health care workers. Of these, about 2,275 ultimately develop chronic hepatitis, of whom some 220 will develop cirrhosis of the liver and 44 will develop hepatic carcinoma.
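The chain of estimates quoted above can be reproduced with simple arithmetic. The progression fractions in the sketch below are back-calculated from the figures in the text (they are not taken from the original paper) and are shown only to make the relationships explicit:

```python
# Illustrative reconstruction of the Van Damme and Tormanns (1993) projection.
# The fractions are inferred from the numbers quoted in the text, not from
# the original publication.
annual_cases = 18_200                 # estimated HBV infections per year, western European HCWs

chronic_fraction = 2_275 / 18_200     # ~12.5% progress to chronic hepatitis
cirrhosis_fraction = 220 / 2_275      # ~9.7% of chronic cases develop cirrhosis
carcinoma_fraction = 44 / 2_275       # ~1.9% of chronic cases develop hepatic carcinoma

chronic = annual_cases * chronic_fraction
cirrhosis = chronic * cirrhosis_fraction
carcinoma = chronic * carcinoma_fraction

print(f"chronic hepatitis: {chronic:.0f}")   # 2275
print(f"cirrhosis:         {cirrhosis:.0f}") # 220
print(f"carcinoma:         {carcinoma:.0f}") # 44
```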
A large study involving 4,218 HCWs in Germany, where about 1% of the population is positive for hepatitis B surface antigen (HBsAg), found that the risk of contracting hepatitis B is approximately 2.5 times greater among HCWs than in the general population (Hofmann and Berthold 1989). The largest study to date, involving 85,985 HCWs worldwide, demonstrated that those in dialysis, anaesthesiology and dermatology departments were at greatest risk of hepatitis B (Maruna 1990).
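A relative risk of the kind reported by Hofmann and Berthold is simply the ratio of two incidence rates. A minimal sketch, with hypothetical input numbers chosen to yield the 2.5 figure:

```python
# Illustrative relative-risk calculation; the cohort sizes and case counts
# below are invented and do not come from the cited study.
hcw_infections, hcw_population = 25, 10_000     # hypothetical HCW cohort
gen_infections, gen_population = 100, 100_000   # hypothetical general population

risk_hcw = hcw_infections / hcw_population      # 0.25% per year
risk_general = gen_infections / gen_population  # 0.10% per year

relative_risk = risk_hcw / risk_general
print(f"relative risk: {relative_risk:.1f}")    # 2.5
```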
A commonly overlooked source of concern is the HCW who has a chronic hepatitis B infection. More than 100 instances have been recorded worldwide in which the source of the infection was not the patient but the doctor. The most spectacular instance was the Swiss doctor who infected 41 patients (Grob et al. 1987).
While the most important mechanism for transmitting the hepatitis B virus is an injury by a blood-contaminated needle (Hofmann and Berthold 1989), the virus has been detected in a number of other body fluids (e.g., male semen, vaginal secretions, cerebrospinal fluid and pleural exudate) (CDC 1989).
Tuberculosis
In most countries around the world, tuberculosis continues to rank first or second in importance of work-related infections among HCWs (see the article “Tuberculosis prevention, control and surveillance”). Many studies have demonstrated that although the risk is present throughout the professional life, it is greatest during the period of training. For example, a Canadian study in the 1970s demonstrated the tuberculosis rate among female nurses to be double that of women in other professions (Burhill et al. 1985). And, in Germany, where the tuberculosis incidence ranges around 18 per 100,000 for the general population, it is about 26 per 100,000 among health care workers (BGW 1993).
A more accurate estimate of the risk of tuberculosis may be obtained from epidemiological studies based on the tuberculin test. A positive reaction is an indicator of infection by Mycobacterium tuberculosis or other mycobacteria or a prior inoculation with the BCG vaccine. If that inoculation was received 20 or more years earlier, it is presumed that the positive test indicates at least one contact with tubercle bacilli.
Today, tuberculin testing is done by means of the patch test in which the response is read within five to seven days after the application of the “stamp”. A large-scale German study based on such skin tests showed a rate of positives among health professionals that was only moderately higher than that among the general population (Hofmann et al. 1993), but long-range studies demonstrate that a greatly heightened risk of tuberculosis does exist in some areas of health care services.
More recently, anxiety has been generated by the increasing number of cases infected with drug-resistant organisms. This is a matter of particular concern in designing a prophylactic regimen for apparently healthy health care workers whose tuberculin tests “converted” to positive after exposure to patients with tuberculosis.
Hepatitis A
Since the hepatitis A virus is transmitted almost exclusively through faeces, the number of HCWs at risk is substantially smaller than for hepatitis B. An early study conducted in West Berlin showed that paediatric personnel were at greatest risk of this infection (Lange and Masihi 1986). These results were subsequently confirmed by a similar study in Belgium (Van Damme et al. 1989). Similarly, studies in Southwest Germany showed increased risk to nurses, paediatric nurses and cleaning women (Hofmann et al. 1992; Hofmann, Berthold and Wehrle 1992). A study undertaken in Cologne, Germany, revealed no risk to geriatric nurses in contrast to higher prevalence rates among the personnel of child care centres. Another study showed increased risk of hepatitis A among paediatric nurses in Ireland, Germany and France; in the last of these, greater risk was found in workers in psychiatric units treating children and youngsters. Finally, a study of infection rates among handicapped people disclosed higher levels of risk for the patients as well as the workers caring for them (Clemens et al. 1992).
Hepatitis C
Hepatitis C, discovered in 1989, like hepatitis B, is primarily transmitted through blood introduced via needle puncture wounds. Until recently, however, data relating to its threat to HCWs have been limited. A 1991 New York study of 456 dentists and 723 controls showed an infection rate of 1.75% among the dentists compared with 0.14% among the controls (Klein et al. 1991). A German research group demonstrated the prevalence of hepatitis C in prisons and attributed it to the large number of intravenous drug users among the inmates (Gaube et al. 1993). An Austrian study found 2.0% of 294 health care personnel to be seropositive for hepatitis C antibodies, a figure thought to be much higher than that among the general population (Hofmann and Kunz 1990). This was confirmed by another study of HCWs conducted in Cologne, Germany (Chriske and Rossa 1991).
A study in Freiburg, Germany, found that contact with handicapped residents of nursing homes, particularly those with infantile cerebral paresis and trisomy 21, patients with haemophilia and those dependent on drugs administered intravenously presented a particular risk of hepatitis C to workers involved in their care. A significantly increased prevalence rate was found in dialysis personnel, and the relative risk to all health care workers was estimated to be 2.5 (admittedly calculated from a relatively small sample).
A possible alternative path of infection was demonstrated in 1993 when a case of hepatitis C was shown to have developed after a splash into the eye (Sartori et al. 1993).
Varicella
Studies of the prevalence of varicella, an illness particularly grave in adults, have consisted of tests for varicella antibodies (anti VZV) conducted in Anglo-Saxon countries. Thus, a seronegative rate of 2.9% was found among 241 hospital employees aged 24 to 62, but the rate was 7.5% for those under the age of 35 (McKinney, Horowitz and Baxtiola 1989). Another study in a paediatric clinic yielded a negative rate of 5% among 2,730 individuals tested in the clinic, but these data become less impressive when it is noted that the serological tests were performed only on persons without a history of having had varicella. A significantly increased risk of varicella infection for paediatric hospital personnel, however, was demonstrated by a study conducted in Freiburg, which found that, in a group of 533 individuals working in hospital care, paediatric hospital care and administration, evidence of varicella immunity was present in 85% of persons younger than 20 years.
Mumps
In considering risk levels of mumps infection, a distinction must be made between countries in which mumps immunization is mandatory and those in which these inoculations are voluntary. In the former, nearly all children and young people will have been immunized and, therefore, mumps poses little risk to health care workers. In the latter, which includes Germany, cases of mumps are becoming more frequent. As a result of lack of immunity, the complications of mumps have been increasing, particularly among adults. A report of an epidemic in a non-immune Inuit population on St. Lawrence Island (located between Siberia and Alaska) demonstrated the frequency of such complications of mumps as orchitis in men, mastitis in women and pancreatitis in both sexes (Philip, Reinhard and Lackman 1959).
Unfortunately, epidemiological data on mumps among HCWs are very sparse. A 1986 study in Germany showed that the rate of mumps immunity among 10- to 15-year-olds was 84% but, with voluntary rather than mandatory inoculation, one may presume that this rate has been declining. A 1994 study involving 774 individuals in Freiburg indicated a significantly increased risk to employees in paediatric hospitals (Hofmann, Sydow and Michaelis 1994).
Measles
The situation with measles is similar to that with mumps. Reflecting its high degree of contagiousness, risks of infection among adults emerge as their immunization rates fall. A US study reported an immunity rate of over 99% (Chou, Weil and Arnmow 1986) and two years later 98% of a cohort of 163 nursing students were found to have immunity (Wigand and Grenner 1988). A study in Freiburg yielded rates of 96 to 98% among nurses and paediatric nurses while the rates of immunity among non-medical personnel were only 87 to 90% (Sydow and Hofman 1994). Such data would support a recommendation that immunization be made mandatory for the general population.
Rubella
Rubella falls between measles and mumps with respect to its contagiousness. Studies have shown that about 10% of HCWs are not immune (Ehrengut and Klett 1981; Sydow and Hofmann 1994) and, therefore, at high risk of infection when exposed. Although generally not a serious illness among adults, rubella may be responsible for devastating effects on the foetus during the first 18 weeks of pregnancy: abortion, stillbirth or congenital defects (see table 1) (South, Sever and Teratogen 1985; Miller, Vurdien and Farrington 1993). Since these may be produced even before the woman knows that she is pregnant and, since health care workers, particularly those in contact with paediatric patients, are likely to be exposed, it is especially important that inoculation be urged (and perhaps even required) for all female health care workers of child-bearing age who are not immune.
Table 1. Congenital abnormalities following rubella infection in pregnancy

Studies by South, Sever and Teratogen (1985)

| Week of pregnancy  | <4 | 5–8 | 9–12 | 13–16 | >17 |
|--------------------|----|-----|------|-------|-----|
| Deformity rate (%) | 70 | 40  | 25   | 40    | 8   |

Studies by Miller, Vurdien and Farrington (1993)

| Week of pregnancy  | <10 | 11–12 | 13–14 | 15–16 | >17 |
|--------------------|-----|-------|-------|-------|-----|
| Deformity rate (%) | 90  | 33    | 11    | 24    | 0   |
HIV/AIDS
During the 1980s and 1990s, HIV seroconversions (i.e., a positive reaction in an individual previously found to have been negative) became a minor occupational risk among HCWs, although clearly not one to be ignored. By early 1994, some 24 reliably documented cases and 35 possible cases had been reported in Europe (Pérez et al. 1994), with an additional 43 documented cases and 43 possible cases reported in the US (CDC 1994a). Unfortunately, except for avoiding needlesticks and other contacts with infected blood or body fluids, there are no effective preventive measures. Some prophylactic regimens for individuals who have been exposed are recommended and described in the article “Prevention of occupational transmission of bloodborne pathogens”.
Other infectious diseases
The other infectious diseases listed earlier in this article have not yet emerged as significant hazards to HCWs, either because they have not been recognized and reported or because their epidemiology has not yet been studied. Sporadic reports of single cases and small clusters suggest that the identification and testing of serological markers should be explored. For example, a 33-month study of typhoid conducted by the Centers for Disease Control (CDC) revealed that 11.2% of all sporadic cases not associated with outbreaks occurred in laboratory workers who had examined stool specimens (Blazer et al. 1980).
The future is clouded by two simultaneous problems: the emergence of new pathogens (e.g., new strains such as hepatitis G and new organisms such as the Ebola virus and the equine morbillivirus recently discovered to be fatal to both horses and humans in Australia) and the continuing development of drug resistance by well-recognized organisms such as the tubercle bacillus. HCWs are likely to be the first to be systematically exposed. This makes their prompt and accurate identification and the epidemiological study of their patterns of susceptibility and transmission of the utmost importance.
Prevention of Infectious Diseases among Health Care Workers
The first essential in the prevention of infectious disease is instilling in all HCWs, support staff as well as health professionals, the understanding that health care facilities are “hotbeds” of infection, with every patient representing a potential risk. This is important not only for those directly involved in diagnostic or therapeutic procedures, but also for those who collect and handle blood, faeces and other biological materials and those who come in contact with dressings, linens, dishes and other fomites. In some instances, even breathing the same air may be a possible hazard. Each health care facility, therefore, must develop a detailed procedure manual identifying these potential risks and the steps needed to eliminate, avoid or control them. Then, all personnel must be drilled in following these procedures and monitored to ensure that they are being properly performed. Finally, all failures of these protective measures must be recorded and reported so that revision and/or retraining may be undertaken.
Important secondary measures are the labelling of areas and materials which may be especially infectious and the provision of gloves, gowns, masks, forceps and other protective equipment. Washing the hands with germicidal soap and running water (wherever possible) will not only protect the health care worker but also will minimize the risk of his or her transmitting the infection to co-workers and other patients.
All blood and body fluid specimens or splashes and materials stained with them must be handled as though they are infected. The use of rigid plastic containers for the disposal of needles and other sharp instruments and diligence in the proper disposal of potentially infectious wastes are important preventive measures.
Careful medical histories, serological testing and patch testing should be performed prior to or as soon as health care workers report for duty. Where advisable (and there are no contraindications), appropriate vaccines should be administered (hepatitis B, hepatitis A and rubella appear to be the most important) (see table 2). In any case, seroconversion may indicate an acquired infection and the advisability of prophylactic treatment.
Table 2. Indications for vaccinations in health service employees.

| Disease       | Complications | Who should be vaccinated? |
|---------------|---------------|---------------------------|
| Diphtheria    |               | In the event of an epidemic, all employees without |
| Hepatitis A   |               | Employees in the paediatric field as well as in infection |
| Hepatitis B   |               | All seronegative employees with possibility of contact |
| Influenza     |               | Regularly offered to all employees |
| Measles       | Encephalitis  | Seronegative employees in the paediatric field |
| Mumps         | Meningitis    | Seronegative employees in the paediatric field |
| Rubella       | Embryopathy   | Seronegative employees in paediatry/midwifery/ |
| Poliomyelitis |               | All employees, e.g., those involved in vaccination |
| Tetanus       |               | Employees in gardening and technical fields obligatory, |
| Tuberculosis  |               | In all events employees in pulmonology and lung surgery |
| Varicella     | Foetal risks  | Seronegative employees in paediatry or at least in the |
Prophylactic therapy
In some exposures, when it is known that the worker is not immune and has been exposed to a proven or highly suspected risk of infection, prophylactic therapy may be instituted. Especially if the worker presents any evidence of possible immunodeficiency, human immunoglobulin may be administered. Where specific “hyperimmune” serum is available, as for mumps and hepatitis B, it is preferable. In infections which, like hepatitis B, may be slow to develop, or for which “booster” doses are advisable, as with tetanus, a vaccine may be administered. When vaccines are not available, as for meningococcal infections and plague, prophylactic antibiotics may be used either alone or as a supplement to immune globulin. Prophylactic regimens of other drugs have been developed for tuberculosis and, more recently, for potential HIV infections, as discussed elsewhere in this chapter.
Prevention of occupational transmission of bloodborne pathogens (BBP) including the human immunodeficiency virus (HIV), hepatitis B virus (HBV) and more recently hepatitis C virus (HCV), has received significant attention. Although HCWs are the primary occupational group at risk of acquisition of infection, any worker who is exposed to blood or other potentially infectious body fluids during the performance of job duties is at risk. Populations at risk for occupational exposure to BBP include workers in health care delivery, public safety and emergency response workers and others such as laboratory researchers and morticians. The potential for occupational transmission of bloodborne pathogens including HIV will continue to increase as the number of persons who have HIV and other bloodborne infections and require medical care increases.
In the US, the Centers for Disease Control and Prevention (CDC) recommended in 1982 and 1983 that patients with the acquired immunodeficiency syndrome (AIDS) be treated according to the (now obsolete) category of “blood and body fluid precautions” (CDC 1982; CDC 1983). Documentation that HIV, the causative agent of AIDS, had been transmitted to HCWs by percutaneous and mucocutaneous exposures to HIV-infected blood, as well as the realization that the HIV infection status of most patients or blood specimens encountered by HCWs would be unknown at the time of the encounter, led CDC to recommend that blood and body fluid precautions be applied to all patients, a concept known as “universal precautions” (CDC 1987a, 1987b). The use of universal precautions eliminates the need to identify patients with bloodborne infections, but is not intended to replace general infection control practices. Universal precautions include the use of handwashing, protective barriers (e.g., goggles, gloves, gowns and face protection) when blood contact is anticipated and care in the use and disposal of needles and other sharp instruments in all health care settings. Also, instruments and other reusable equipment used in performing invasive procedures should be appropriately disinfected or sterilized (CDC 1988a, 1988b). Subsequent CDC recommendations have addressed prevention of transmission of HIV and HBV to public safety and emergency responders (CDC 1988b), management of occupational exposure to HIV, including the recommendations for the use of zidovudine (CDC 1990), immunization against HBV and management of HBV exposure (CDC 1991a), infection control in dentistry (CDC 1993) and the prevention of HIV transmission from HCWs to patients during invasive procedures (CDC 1991b).
In the US, CDC recommendations do not have the force of law, but have often served as the foundation for government regulations and voluntary actions by industry. The Occupational Safety and Health Administration (OSHA), a federal regulatory agency, promulgated a standard in 1991 on Occupational Exposure to Bloodborne Pathogens (OSHA 1991). OSHA concluded that a combination of engineering and work practice controls, personal protective clothing and equipment, training, medical surveillance, signs and labels and other provisions can help to minimize or eliminate exposure to bloodborne pathogens. The standard also mandated that employers make available hepatitis B vaccination to their employees.
The World Health Organization (WHO) has also published guidelines and recommendations pertaining to AIDS and the workplace (WHO 1990, 1991). In 1990, the European Economic Community (EEC) issued a Council Directive (90/679/EEC) on the protection of workers from risks related to exposure to biological agents at work. The directive requires employers to conduct an assessment of the risks to the health and safety of the worker. A distinction is drawn between activities where there is a deliberate intention to work with or use biological agents (e.g., laboratories) and activities where exposure is incidental (e.g., patient care). Control of risk is based on a hierarchical system of procedures. Special containment measures, according to the classification of the agents, are set out for certain types of health facilities and laboratories (McCloy 1994). In the US, CDC and the National Institutes of Health also have specific recommendations for laboratories (CDC 1993b).
Since the identification of HIV as a BBP, knowledge about HBV transmission has been helpful as a model for understanding modes of transmission of HIV. Both viruses are transmitted via sexual, perinatal and bloodborne routes. HBV is present in the blood of individuals positive for hepatitis B e antigen (HBeAg, a marker for high infectivity) at a concentration of approximately 10^8 to 10^9 viral particles per millilitre (ml) of blood (CDC 1988b). HIV is present in blood at much lower concentrations: 10^3 to 10^4 viral particles/ml for a person with AIDS and 10 to 100/ml for a person with asymptomatic HIV infection (Ho, Moudgil and Alam 1989). The risk of HBV transmission to a HCW after percutaneous exposure to HBeAg-positive blood is approximately 100-fold higher than the risk of HIV transmission after percutaneous exposure to HIV-infected blood (i.e., 30% versus 0.3%) (CDC 1989).
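The per-exposure figures above also determine how risk accumulates over repeated incidents. The sketch below is illustrative rather than from the source; it applies the standard independence assumption that each exposure carries the same per-exposure risk:

```python
# Cumulative infection risk over repeated, independent exposures:
#   P(at least one transmission) = 1 - (1 - p)^n

def cumulative_risk(per_exposure_risk: float, n_exposures: int) -> float:
    """Probability of at least one transmission after n independent exposures."""
    return 1.0 - (1.0 - per_exposure_risk) ** n_exposures

# Per-exposure percutaneous risks cited in the text:
hbv = 0.30   # HBeAg-positive blood, ~30%
hiv = 0.003  # HIV-infected blood, ~0.3%

for label, p in [("HBV", hbv), ("HIV", hiv)]:
    print(f"{label}: 1 exposure {p:.1%}, 10 exposures {cumulative_risk(p, 10):.1%}")
```

Ten needlesticks with HBeAg-positive blood push the cumulative risk above 97%, which is why the hundredfold difference in per-exposure risk matters so much in practice.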
Hepatitis
Hepatitis, or inflammation of the liver, can be caused by a variety of agents, including toxins, drugs, autoimmune disease and infectious agents. Viruses are the most common cause of hepatitis (Benenson 1990). Three types of bloodborne viral hepatitis have been recognized: hepatitis B, formerly called serum hepatitis, the major risk to HCWs; hepatitis C, the major cause of parenterally transmitted non-A, non-B hepatitis; and hepatitis D, or delta hepatitis.
Hepatitis B. The major infectious bloodborne occupational hazard to HCWs is HBV. Among US HCWs with frequent exposure to blood, the prevalence of serological evidence of HBV infection ranges between approximately 15 and 30%. In contrast, the prevalence in the general population averages 5%. The cost-effectiveness of serological screening to detect susceptible individuals among HCWs depends on the prevalence of infection, the cost of testing and the vaccine costs. Vaccination of persons who already have antibodies to HBV has not been shown to cause adverse effects. Hepatitis B vaccine provides protection against hepatitis B for at least 12 years after vaccination; booster doses currently are not recommended. The CDC estimated that in 1991 there were approximately 5,100 occupationally acquired HBV infections in HCWs in the United States, causing 1,275 to 2,550 cases of clinical acute hepatitis, 250 hospitalizations and about 100 deaths (unpublished CDC data). In 1991, approximately 500 HCWs became HBV carriers. These individuals are at risk of long-term sequelae, including disabling chronic liver disease, cirrhosis and liver cancer.
The HBV vaccine is recommended for use in HCWs and public safety workers who may be exposed to blood in the workplace (CDC 1991b). Following a percutaneous exposure to blood, the decision to provide prophylaxis must include considerations of several factors: whether the source of the blood is available, the HBsAg status of the source and the hepatitis B vaccination and vaccine-response status of the exposed person. For any exposure of a person not previously vaccinated, hepatitis B vaccination is recommended. When indicated, hepatitis B immune globulin (HBIG) should be administered as soon as possible after exposure since its value beyond 7 days after exposure is unclear. Specific CDC recommendations are indicated in table 1 (CDC 1991b).
Table 1. Recommendations for post-exposure prophylaxis for percutaneous or permucosal exposure to hepatitis B virus, United States

| Exposed person | Source HBsAg1 positive | Source HBsAg negative | Source not tested or unknown |
|----------------|------------------------|-----------------------|------------------------------|
| Unvaccinated | HBIG2 ×1 and initiate HB vaccine3 | Initiate HB vaccine | Initiate HB vaccine |
| Previously vaccinated, known responder | No treatment | No treatment | No treatment |
| Known non-responder | HBIG ×2, or HBIG ×1 and initiate revaccination | No treatment | If known high-risk source, treat as if source were HBsAg positive |
| Response unknown | Test exposed for anti-HBs4 | No treatment | Test exposed for anti-HBs |
1 HBsAg = Hepatitis B surface antigen. 2 HBIG = Hepatitis B immune globulin; dose 0.06 mL/kg IM. 3 HB vaccine = hepatitis B vaccine. 4 Anti-HBs = antibody to hepatitis B surface antigen. 5 Adequate anti-HBs is ≥10 mIU/mL.
Table 2. Provisional US Public Health Service recommendations for chemoprophylaxis after occupational exposure to HIV, by type of exposure and source of material, 1996

| Type of exposure | Source material1 | Antiretroviral prophylaxis2 | Antiretroviral regimen3 |
|------------------|------------------|-----------------------------|-------------------------|
| Percutaneous | Blood: highest risk4 | Recommend | ZDV plus 3TC plus IDV |
| | Blood: increased risk4 | Recommend | ZDV plus 3TC, ± IDV5 |
| | Blood: no increased risk4 | Offer | ZDV plus 3TC |
| Mucous membrane | Blood | Offer | ZDV plus 3TC, ± IDV5 |
| Skin, increased risk7 | Blood | Offer | ZDV plus 3TC, ± IDV5 |

1 Any exposure to concentrated HIV (e.g., in a research laboratory or production facility) is treated as percutaneous exposure to blood with highest risk.
2 Recommend: postexposure prophylaxis (PEP) should be recommended to the exposed worker with counselling. Offer: PEP should be offered to the exposed worker with counselling. Not offer: PEP should not be offered because these are not occupational exposures to HIV.
3 Regimens: zidovudine (ZDV), 200 mg three times a day; lamivudine (3TC), 150 mg two times a day; indinavir (IDV), 800 mg three times a day (if IDV is not available, saquinavir may be used, 600 mg three times a day). Prophylaxis is given for 4 weeks. For full prescribing information, see package inserts.
4 Risk definitions for percutaneous blood exposure: Highest risk: BOTH larger volume of blood (e.g., deep injury with large-diameter hollow needle previously in source patient’s vein or artery, especially involving an injection of source patient’s blood) AND blood containing a high titre of HIV (e.g., source with acute retroviral illness or end-stage AIDS; viral load measurement may be considered, but its use in relation to PEP has not been evaluated). Increased risk: EITHER exposure to larger volume of blood OR blood with a high titre of HIV. No increased risk: NEITHER exposure to larger volume of blood NOR blood with a high titre of HIV (e.g., solid suture needle injury from source patient with asymptomatic HIV infection).
5 Possible toxicity of additional drug may not be warranted.
6 Includes semen; vaginal secretions; cerebrospinal, synovial, pleural, peritoneal, pericardial and amniotic fluids.
7 For skin, risk is increased for exposures involving a high titre of HIV, prolonged contact, an extensive area, or an area in which skin integrity is visibly compromised. For skin exposures without increased risk, the risk for drug toxicity outweighs the benefit of PEP.
Article 14(3) of EEC Directive 89/391/EEC on vaccination required only that effective vaccines, where they exist, be made available for exposed workers who are not already immune. There was an amending Directive 93/88/EEC which contained a recommended code of practice requiring that workers at risk be offered vaccination free of charge, informed of the benefits and disadvantages of vaccination and non-vaccination, and be provided a certificate of vaccination (WHO 1990).
The use of hepatitis B vaccine and appropriate environmental controls will prevent almost all occupational HBV infections. Reducing blood exposure and minimizing puncture injuries in the health care setting will also reduce the risk of transmission of other bloodborne viruses.
Hepatitis C. Transmission of HCV is similar to that of HBV, but infection persists in most patients indefinitely and more frequently progresses to long-term sequelae (Alter et al. 1992). The prevalence of anti-HCV among US hospital-based health care workers averages 1 to 2% (Alter 1993). HCWs who sustain accidental injuries from needlesticks contaminated with anti-HCV-positive blood have a 5 to 10% risk of acquiring HCV infection (Lampher et al. 1994; Mitsui et al. 1992). There has been one report of HCV transmission after a blood splash to the conjunctiva (Sartori et al. 1993). Prevention measures again consist of adherence to universal precautions and percutaneous injury prevention, since no vaccine is available and immune globulin does not appear to be effective.
Hepatitis D. Hepatitis D virus requires the presence of hepatitis B virus for replication; thus, HDV can infect persons only as a coinfection with acute HBV or as a superinfection of chronic HBV infection. HDV infection can increase the severity of liver disease; one case of occupationally acquired HDV hepatitis has been reported (Lettau et al. 1986). Hepatitis B vaccination of HBV-susceptible persons will also prevent HDV infection; however, there is no vaccine to prevent HDV superinfection of an HBV carrier. Other prevention measures consist of adherence to universal precautions and percutaneous injury prevention.
HIV
The first cases of AIDS were recognized in June of 1981. Initially, over 92% of the cases reported in the United States were in homosexual or bisexual men. However, by the end of 1982, AIDS cases were identified among injection drug users, blood transfusion recipients, haemophilia patients treated with clotting factor concentrates, children and Haitians. AIDS is the result of infection with HIV, which was isolated in 1985. HIV has spread rapidly. In the United States, for example, the first 100,000 AIDS cases occurred between 1981 and 1989; the second 100,000 cases occurred between 1989 and 1991. As of June 1994, 401,749 cases of AIDS had been reported in the United States (CDC 1994b).
Globally, HIV has affected many countries including those in Africa, Asia and Europe. As of 31 December 1994, 1,025,073 cumulative cases of AIDS in adults and children had been reported to the WHO. This represented a 20% increase from the 851,628 cases reported through December 1993. It was estimated that 18 million adults and about 1.5 million children have been infected with HIV since the beginning of the pandemic (late 1970s to early 1980s) (WHO 1995).
Although HIV has been isolated from human blood, breast milk, vaginal secretions, semen, saliva, tears, urine, cerebrospinal fluid and amniotic fluid, epidemiological evidence has implicated only blood, semen, vaginal secretions and breast milk in the transmission of the virus. The CDC has also reported on the transmission of HIV as the result of contact with blood or other body secretions or excretions from an HIV-infected person in the household (CDC 1994c). Documented modes of occupational HIV transmission include having percutaneous or mucocutaneous contact with HIV-infected blood. Exposure by the percutaneous route is more likely to result in infection transmission than is mucocutaneous contact.
There are a number of factors which may influence the likelihood of occupational bloodborne pathogen transmission, including: the volume of fluid in the exposure, the virus titre, the length of time of the exposure and the immune status of the worker. Additional data are needed to determine precisely the importance of these factors. Preliminary data from a CDC case-control study indicate that for percutaneous exposures to HIV-infected blood, HIV transmission is more likely if the source patient has advanced HIV disease and if the exposure involves a larger inoculum of blood (e.g., injury due to a large-bore hollow needle) (Cardo et al. 1995). Virus titre can vary between individuals and over time within a single individual. Also, blood from persons with AIDS, particularly in the terminal stages, may be more infectious than blood from persons in earlier stages of HIV infection, except possibly during the illness associated with acute infection (Cardo et al. 1995).
Occupational exposure and HIV infection
As of December 1996, CDC reported 52 HCWs in the United States who have seroconverted to HIV following a documented occupational exposure to HIV, including 19 laboratory workers, 21 nurses, six physicians and six in other occupations. Forty-five of the 52 HCWs sustained percutaneous exposures, five had mucocutaneous exposures, one had both a percutaneous and a mucocutaneous exposure and one had an unknown route of exposure. In addition, 111 possible cases of occupationally acquired infection have been reported. These possible cases have been investigated and are without identifiable non-occupational or transfusion risks; each reported percutaneous or mucocutaneous occupational exposures to blood or body fluids, or laboratory solutions containing HIV, but HIV seroconversion specifically resulting from an occupational exposure was not documented (CDC 1996a).
In 1993, the AIDS Centre at the Communicable Disease Surveillance Centre (UK) summarized reports of cases of occupational HIV transmission including 37 in the United States, four in the UK and 23 from other countries (France, Italy, Spain, Australia, South Africa, Germany and Belgium) for a total of 64 documented seroconversions after a specific occupational exposure. In the possible or presumed category there were 78 in the United States, six in the UK and 35 from other countries (France, Italy, Spain, Australia, South Africa, Germany, Mexico, Denmark, Netherlands, Canada and Belgium) for a total of 118 (Heptonstall, Porter and Gill 1993). The number of reported occupationally acquired HIV infections is likely to represent only a portion of the actual number due to under-reporting and other factors.
HIV post-exposure management
Employers should make available to workers a system for promptly initiating evaluation, counselling and follow-up after a reported occupational exposure that may place a worker at risk of acquiring HIV infection. Workers should be educated and encouraged to report exposures immediately after they occur so that appropriate interventions can be implemented (CDC 1990).
If an exposure occurs, the circumstances should be recorded in the worker’s confidential medical record. Relevant information includes the following: date and time of exposure; job duty or task being performed at the time of exposure; details of exposure; description of source of exposure, including, if known, whether the source material contained HIV or HBV; and details about counselling, post-exposure management and follow-up. The source individual should be informed of the incident and, if consent is obtained, tested for serological evidence of HIV infection. If consent cannot be obtained, policies should be developed for testing source individuals in compliance with applicable regulations. Confidentiality of the source individual should be maintained at all times.
If the source individual has AIDS, is known to be HIV seropositive, refuses testing or has an unknown HIV status, the worker should be evaluated clinically and serologically for evidence of HIV infection as soon as possible after the exposure (baseline) and, if seronegative, should be retested periodically for a minimum of six months after exposure (e.g., six weeks, 12 weeks and six months after exposure) to determine whether HIV infection has occurred. The worker should be advised to report and seek medical evaluation for any acute illness that occurs during the follow-up period. During the follow-up period, especially the first six to 12 weeks after the exposure, exposed workers should be advised to refrain from donating blood, semen or organs and to abstain from, or use measures to prevent HIV transmission during, sexual intercourse.
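The follow-up testing schedule described here can be generated mechanically from the exposure date. A minimal sketch, assuming six months is approximated as 26 weeks (the source gives the intervals but not an exact day count):

```python
from datetime import date, timedelta

# Serological follow-up intervals from the text: baseline, then
# 6 weeks, 12 weeks and 6 months (approximated here as 26 weeks).
FOLLOW_UP_WEEKS = [0, 6, 12, 26]

def follow_up_dates(exposure: date) -> list[date]:
    """Return the scheduled serology dates for a given exposure date."""
    return [exposure + timedelta(weeks=w) for w in FOLLOW_UP_WEEKS]

for d in follow_up_dates(date(1996, 7, 1)):
    print(d.isoformat())
```

In practice such a schedule would feed the confidential medical record described above, so that each retest is triggered rather than left to the worker's memory.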
In 1990, CDC published a statement on the management of exposure to HIV including considerations regarding zidovudine (ZDV) post-exposure use. After a careful review of the available data, CDC stated that the efficacy of zidovudine could not be assessed due to insufficient data, including available animal and human data (CDC 1990).
In 1996, information suggesting that ZDV post-exposure prophylaxis (PEP) may reduce the risk of HIV transmission after occupational exposure to HIV-infected blood (CDC 1996a) prompted the US Public Health Service (PHS) to update a previous PHS statement on management of occupational exposure to HIV with the following findings and recommendations on PEP (CDC 1996b). Although failures of ZDV PEP have occurred (Tokars et al. 1993), ZDV PEP was associated with a decrease of approximately 79% in the risk of HIV seroconversion after percutaneous exposure to HIV-infected blood in a case-control study among HCWs (CDC 1995).
Although information about the potency and toxicity of antiretroviral drugs is available from studies of HIV-infected patients, it is uncertain to what extent this information can be applied to uninfected persons receiving PEP. In HIV-infected patients, combination therapy with the nucleosides ZDV and lamivudine (3TC) has greater antiretroviral activity than ZDV alone and is active against many ZDV-resistant HIV strains without significantly increased toxicity (Anon. 1996). Adding a protease inhibitor provides even greater increases in antiretroviral activity; among protease inhibitors, indinavir (IDV) is more potent than saquinavir at currently recommended doses and appears to have fewer drug interactions and short-term adverse effects than ritonavir (Niu, Stein and Schnittmann 1993). Few data exist to assess possible long-term (i.e., delayed) toxicity resulting from use of these drugs in persons not infected with HIV.
The following PHS recommendations are provisional because they are based on limited data regarding efficacy and toxicity of PEP and risk for HIV infection after different types of exposure. Because most occupational exposures to HIV do not result in infection transmission, potential toxicity must be carefully considered when prescribing PEP. Changes in drug regimens may be appropriate, based on factors such as the probable antiretroviral drug resistance profile of HIV from the source patient, local availability of drugs and medical conditions, concurrent drug therapy and drug toxicity in the exposed worker. If PEP is used, drug-toxicity monitoring should include a complete blood count and renal and hepatic chemical function tests at baseline and two weeks after starting PEP. If subjective or objective toxicity is noted, drug reduction or drug substitution should be considered, and further diagnostic studies may be indicated.
Chemoprophylaxis should be recommended to exposed workers after occupational exposures associated with the highest risk of HIV transmission. For exposures with a lower, but non-negligible, risk, PEP should be offered, balancing the lower risk against the use of drugs having uncertain efficacy and toxicity. For exposures with negligible risk, PEP is not justified (see table 2). Exposed workers should be informed that knowledge about the efficacy and toxicity of PEP is limited, that for agents other than ZDV data are limited regarding toxicity in persons without HIV infection or who are pregnant, and that any or all drugs for PEP may be declined by the exposed worker.
PEP should be initiated promptly, preferably within 1 to 2 hours post-exposure. Although animal studies suggest that PEP probably is not effective when started later than 24 to 36 hours post-exposure (Niu, Stein and Schnittmann 1993; Gerberding 1995), the interval after which there is no benefit from PEP for humans is undefined. Initiating therapy after a longer interval (e.g., 1 to 2 weeks) may be considered for the highest-risk exposures; even if infection is not prevented, early treatment of acute HIV infection may be beneficial (Kinloch-de-los et al. 1995).
If the source patient is unknown, or the patient’s HIV status is unknown, the decision to initiate PEP should be made on a case-by-case basis, based on the exposure risk and the likelihood of infection in known or possible source patients.
Other Bloodborne Pathogens
Syphilis, malaria, babesiosis, brucellosis, leptospirosis, arboviral infections, relapsing fever, Creutzfeldt-Jakob disease, human T-lymphotropic virus type 1 and viral haemorrhagic fever have also been transmitted by the bloodborne route (CDC 1988a; Benenson 1990). Occupational transmission of these agents has rarely, if ever, been recorded.
Prevention of Transmission of Bloodborne Pathogens
There are several basic strategies which relate to the prevention of occupational transmission of bloodborne pathogens. Exposure prevention, the mainstay of occupational health, can be accomplished by substitution (e.g., replacing an unsafe device with a safer one), engineering controls (i.e., controls that isolate or remove the hazard), administrative controls (e.g., prohibiting recapping of needles by a two-handed technique) and use of personal protective equipment. The first choice is to “engineer out the problem”.
In order to reduce exposures to bloodborne pathogens, adherence to general infection control principles, as well as strict compliance with universal precaution guidelines, is required. Important components of universal precautions include the use of appropriate personal protective equipment, such as gloves, gowns and eye protection, when exposure to potentially infectious body fluids is anticipated. Gloves are one of the most important barriers between the worker and the infectious material. While they do not prevent needlesticks, they do provide protection for the skin. Gloves should be worn whenever contact with blood or body fluids is anticipated. Washing of gloves is not recommended. Recommendations also advise workers to take precautions to prevent injuries by needles, scalpels and other sharp instruments or devices during procedures; when cleaning used instruments; during disposal of used needles; and when handling sharp instruments after procedures.
Percutaneous exposures to blood
Since the major risk of infection results from parenteral exposure from sharp instruments such as syringe needles, engineering controls such as resheathing needles, needleless IV systems, blunt suture needles and appropriate selection and use of sharps disposal containers to minimize exposures to percutaneous injuries are critical components of universal precautions.
The most common type of percutaneous inoculation occurs through inadvertent needlestick injury, many of which are associated with recapping of needles. Workers have cited the following reasons for recapping: inability to dispose of needles properly and immediately, sharps disposal containers located too far away, lack of time, dexterity problems and patient interaction.
Needles and other sharp devices can be redesigned to prevent a significant proportion of percutaneous exposures. A fixed barrier should be provided between hands and the needle after use. Workers’ hands should remain behind the needle. Any safety feature should be an integral part of the device. The design should be simple, and little or no training should be required (Jagger et al. 1988).
Implementing safer needle devices must be accompanied by evaluation. In 1992, the American Hospital Association (AHA) published a briefing to assist hospitals with the selection, evaluation and adoption of safer needle devices (AHA 1992). The briefing stated that “because safer needle devices, unlike drugs and other therapies, do not undergo clinical testing for safety and efficacy before they are marketed, hospitals are essentially ‘on their own’ when it comes to selecting appropriate products for their specific institutional needs”. Included in the AHA document are guidance for the evaluation and adoption of safer needle devices, case studies of the use of safety devices, evaluation forms and listing of some, but not all, products on the US market.
Prior to implementation of a new device, health care institutions must ensure that there is an appropriate needlestick surveillance system in place. In order to accurately assess the efficacy of new devices, the number of reported exposures should be expressed as an incidence rate.
Possible denominators for reporting the number of needlestick injuries include patient days, hours worked, number of devices purchased, number of devices used and number of procedures performed. The collection of specific information on device-related injuries is an important component of the evaluation of the effectiveness of a new device. Factors to be considered in collecting information on needlestick injuries include: new product distribution, stocking and tracking; identification of users; removal of other devices; compatibility with other devices (especially IV equipment); ease of use; and mechanical failure. Factors which may contribute to bias include compliance, subject selection, procedures, recall, contamination, reporting and follow-up. Possible outcome measures include rates of needlestick injuries, HCW compliance, patient care complications and cost.
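Expressing reported exposures as an incidence rate, as the text recommends, is simple arithmetic: injuries divided by the chosen denominator, scaled to a convenient base. The sketch below is illustrative only; the counts, the denominator (devices used) and the scaling base are hypothetical, not figures from the source.

```python
# Illustrative sketch (hypothetical numbers): computing a needlestick
# injury incidence rate with one of the denominators the text lists
# (devices used), before and after introducing a safer device.

def incidence_rate(injuries, denominator, per=100_000):
    """Injuries per `per` units of the chosen denominator."""
    return injuries / denominator * per

# Before the safer device: 30 injuries over 150,000 devices used
before = incidence_rate(30, 150_000)
# After: 12 injuries over 140,000 devices used
after = incidence_rate(12, 140_000)

print(f"before: {before:.1f}, after: {after:.1f} per 100,000 devices used")
```

The same function works with any of the other denominators mentioned (patient days, hours worked, procedures performed); what matters for comparing devices is that the same denominator is used before and after.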
Finally, training and feedback from workers are important components of any successful needlestick prevention programme. User acceptance is a critical factor, but one that seldom receives enough attention.
Elimination or reduction of percutaneous injuries should result if adequate engineering controls are available. If HCWs, product evaluation committees, administrators and purchasing departments all work together to identify where and what safer devices are needed, safety and cost effectiveness can be combined. Occupational transmission of bloodborne pathogens is costly, both in terms of money and the impact on the employee. Every needlestick injury causes undue stress on the employee and may affect job performance. Referral to mental health professionals for supportive counselling may be required.
In summary, a comprehensive approach to prevention is essential to maintaining a safe and healthy environment in which to provide health care services. Prevention strategies include the use of vaccines, post-exposure prophylaxis and prevention or reduction of needlestick injuries. Prevention of needlestick injuries can be accomplished by improvement in the safety of needle-bearing devices, development of procedures for safer use and disposal and adherence to infection control recommendations.
Acknowledgements: The authors thank Mariam Alter, Lawrence Reed and Barbara Gooch for their manuscript review.
Transmission of Mycobacterium tuberculosis is a recognized risk in health care facilities. The magnitude of the risk to HCWs varies considerably by the type of health care facility, the prevalence of TB in the community, the patient population served, the HCW’s occupational group, the area of the health care facility in which the HCW works and the effectiveness of TB infection-control interventions. The risk may be higher in areas where patients with TB are provided care before diagnosis and initiation of TB treatment and isolation precautions (e.g., in clinic waiting areas and emergency departments) or where diagnostic or treatment procedures that stimulate coughing are performed. Nosocomial transmission of M. tuberculosis has been associated with close contact with persons who have infectious TB and with the performance of certain procedures (e.g., bronchoscopy, endotracheal intubation and suctioning, open abscess irrigation and autopsy). Sputum induction and aerosol treatments that induce coughing may also increase the potential for transmission of M. tuberculosis. Personnel in health care facilities should be particularly alert to the need for preventing transmission of M. tuberculosis in those facilities in which immunocompromised persons (e.g., HIV-infected persons) work or receive care—especially if cough-inducing procedures, such as sputum induction and aerosolized pentamidine treatments, are being performed.
Transmission and Pathogenesis
M. tuberculosis is carried in airborne particles, or droplet nuclei, that can be generated when persons who have pulmonary or laryngeal TB sneeze, cough, speak or sing. The particles are an estimated 1 to 5 μm in size and normal air currents can keep them airborne for prolonged time periods and spread them throughout a room or building. Infection occurs when a susceptible person inhales droplet nuclei containing M. tuberculosis and these droplet nuclei traverse the mouth or nasal passages, upper respiratory tract and bronchi to reach the alveoli of the lungs. Once in the alveoli, the organisms are taken up by alveolar macrophages and spread throughout the body. Usually within two to ten weeks after initial infection with M. tuberculosis, the immune response limits further multiplication and spread of the tubercle bacilli; however, some of the bacilli remain dormant and viable for many years. This condition is referred to as latent TB infection. Persons with latent TB infection usually have positive purified protein derivative (PPD)-tuberculin skin-test results, but they do not have symptoms of active TB, and they are not infectious.
In general, persons who become infected with M. tuberculosis have approximately a 10% risk for developing active TB during their lifetimes. This risk is greatest during the first two years after infection. Immunocompromised persons have a greater risk for the progression of latent TB infection to active TB disease; HIV infection is the strongest known risk factor for this progression. Persons with latent TB infection who become co-infected with HIV have approximately an 8 to 10% risk per year for developing active TB. HIV-infected persons who are already severely immunosuppressed and who become newly infected with M. tuberculosis have an even greater risk for developing active TB.
The probability that a person who is exposed to M. tuberculosis will become infected depends primarily on the concentration of infectious droplet nuclei in the air and the duration of exposure. Characteristics of the TB patient that enhance transmission include:
Environmental factors that enhance the likelihood of transmission include:
Characteristics of the persons exposed to M. tuberculosis that may affect the risk for becoming infected are not as well defined. In general, persons who have been infected previously with M. tuberculosis may be less susceptible to subsequent infection. However, reinfection can occur among previously infected persons, especially if they are severely immunocompromised. Vaccination with Bacille of Calmette and Guérin (BCG) probably does not affect the risk for infection; rather, it decreases the risk for progressing from latent TB infection to active TB. Finally, although it is well established that HIV infection increases the likelihood of progressing from latent TB infection to active TB, it is unknown whether HIV infection increases the risk for becoming infected if exposed to M. tuberculosis.
Epidemiology
Several TB outbreaks among persons in health care facilities have been reported recently in the United States. Many of these outbreaks involved transmission of multidrug-resistant strains of M. tuberculosis to both patients and HCWs. Most of the patients and some of the HCWs were HIV-infected persons in whom new infection progressed rapidly to active disease. Mortality associated with those outbreaks was high (with a range of 43 to 93%). Furthermore, the interval between diagnosis and death was brief (with a range of median intervals of 4 to 16 weeks). Factors contributing to these outbreaks included: delayed diagnosis of TB, delayed recognition of drug resistance and delayed initiation of effective therapy, all of which resulted in prolonged infectiousness; delayed initiation and inadequate duration of TB isolation; inadequate ventilation in TB isolation rooms; lapses in TB isolation practices; inadequate precautions for cough-inducing procedures; and lack of adequate respiratory protection.
Fundamentals of TB infection control
An effective TB infection-control programme requires early identification, isolation and effective treatment of persons who have active TB. The primary emphasis of the TB infection-control plan should be on achieving these three goals. In all health care facilities, particularly those in which persons who are at high risk for TB work or receive care, policies and procedures for TB control should be developed, reviewed periodically and evaluated for effectiveness to determine the actions necessary to minimize the risk for transmission of M. tuberculosis.
The TB infection-control programme should be based on a hierarchy of control measures. The first level of the hierarchy, and that which affects the largest number of persons, is using administrative measures intended primarily to reduce the risk for exposing uninfected persons to persons who have infectious TB. These measures include:
The second level of the hierarchy is the use of engineering controls to prevent the spread and reduce the concentration of infectious droplet nuclei. These controls include:
The first two levels of the hierarchy minimize the number of areas in the health care facility where exposure to infectious TB may occur, and they reduce, but do not eliminate, the risk in those few areas where exposure to M. tuberculosis can still occur (e.g., rooms in which patients with known or suspected infectious TB are being isolated and treatment rooms in which cough-inducing or aerosol-generating procedures are performed on such patients). Because persons entering such rooms may be exposed to M. tuberculosis, the third level of the hierarchy is the use of personal respiratory protective equipment in these and certain other situations in which the risk for infection with M. tuberculosis may be relatively higher.
Specific measures to reduce the risk for transmission of M. tuberculosis include the following:
1. Assigning to specific persons in the health care facility the supervisory responsibility for designing, implementing, evaluating and maintaining the TB infection-control programme.
2. Conducting a risk assessment to evaluate the risk for transmission of M. tuberculosis in all areas of the health care facility, developing a written TB infection-control programme based on the risk assessment and periodically repeating the risk assessment to evaluate the effectiveness of the TB infection-control programme. TB infection-control measures for each health care facility should be based on a careful assessment of the risk for transmission of M. tuberculosis in that particular setting. The first step in developing the TB infection-control programme should be to conduct a baseline risk assessment to evaluate the risk for transmission of M. tuberculosis in each area and occupational group in the facility. Appropriate infection-control interventions can then be developed on the basis of actual risk. Risk assessments should be performed for all inpatient and outpatient settings (e.g., medical and dental offices). Classification of risk for a facility, for a specific area and for a specific occupational group should be based on the profile of TB in the community, the number of infectious TB patients admitted to the area or ward, or the estimated number of infectious TB patients to whom HCWs in an occupational group may be exposed and the results of analysis of HCW PPD test conversions (where applicable) and possible person-to-person transmission of M. tuberculosis. Regardless of risk level, the management of patients with known or suspected infectious TB should not vary. However, the index of suspicion for infectious TB among patients, the frequency of HCW PPD skin testing, the number of TB isolation rooms and other factors will depend on the level of risk for transmission of M. tuberculosis in the facility, area or occupational group.
3. Developing, implementing and enforcing policies and protocols to ensure early identification, diagnostic evaluation and effective treatment of patients who may have infectious TB. A diagnosis of TB may be considered for any patient who has a persistent cough (i.e., a cough lasting for longer than 3 weeks) or other signs or symptoms compatible with active TB (e.g., bloody sputum, night sweats, weight loss, anorexia or fever). However, the index of suspicion for TB will vary in different geographic areas and will depend on the prevalence of TB and other characteristics of the population served by the facility. The index of suspicion for TB should be very high in geographic areas or among groups of patients in which the prevalence of TB is high. Appropriate diagnostic measures should be conducted and TB precautions implemented for patients in whom active TB is suspected.
4. Providing prompt triage for and appropriate management of patients in the outpatient setting who may have infectious TB. Triage of patients in ambulatory-care settings and emergency departments should include vigorous efforts to identify promptly patients who have active TB. HCWs who are the first points of contact in facilities that serve populations at risk for TB should be trained to ask questions that will facilitate identification of patients with signs and symptoms suggestive of TB. Patients with signs or symptoms suggestive of TB should be evaluated promptly to minimize the amount of time they are in ambulatory-care areas. TB precautions should be followed while the diagnostic evaluation is being conducted for these patients. TB precautions in the ambulatory-care setting should include placing these patients in a separate area apart from other patients and not in open waiting areas (ideally, in a room or enclosure meeting TB isolation requirements), giving these patients surgical masks to wear and instructing them to keep their masks on and giving these patients tissues and instructing them to cover their mouths and noses with the tissues when coughing or sneezing. Surgical masks are designed to prevent the respiratory secretions of the person wearing the mask from entering the air. When not in a TB isolation room, patients suspected of having TB should wear surgical masks to reduce the expulsion of droplet nuclei into the air. These patients do not need to wear particulate respirators, which are designed to filter the air before it is inhaled by the person wearing the mask. Patients suspected of having or known to have TB should never wear a respirator that has an exhalation valve, because the device would provide no barrier to the expulsion of droplet nuclei into the air.
5. Promptly initiating and maintaining TB isolation for persons who may have infectious TB and who are admitted to the inpatient setting. In hospitals and other inpatient facilities, any patient suspected of having or known to have infectious TB should be placed in a TB isolation room that has currently recommended ventilation characteristics (see below). Written policies for initiating isolation should specify the indications for isolation, the person(s) authorized to initiate and discontinue isolation, the isolation practices to follow, the monitoring of isolation, the management of patients who do not adhere to isolation practices and the criteria for discontinuing isolation.
6. Effectively planning arrangements for discharge. Before a TB patient is discharged from the health care facility, the facility’s staff and public health authorities should collaborate to ensure continuation of therapy. Discharge planning in the health care facility should include, at a minimum, a confirmed outpatient appointment with the provider who will manage the patient until the patient is cured, sufficient medication to take until the outpatient appointment and placement into case management (e.g., directly observed therapy (DOT)) or outreach programmes of the public health department. These plans should be initiated and in place before the patient’s discharge.
7. Developing, installing, maintaining and evaluating ventilation and other engineering controls to reduce the potential for airborne exposure to M. tuberculosis. Local exhaust ventilation is a preferred source control technique, and it is often the most efficient way to contain airborne contaminants because it captures these contaminants near their source before they can disperse. Therefore, the technique should be used, if feasible, wherever aerosol-generating procedures are performed. Two basic types of local exhaust devices use hoods: the enclosing type, in which the hood either partially or fully encloses the infectious source, and the exterior type, in which the infectious source is near but outside the hood. Fully enclosed hoods, booths or tents are always preferable to exterior types because of their superior ability to prevent contaminants from escaping into the HCW’s breathing zone. General ventilation can be used for several purposes, including diluting and removing contaminated air, controlling airflow patterns within rooms and controlling the direction of airflow throughout a facility. General ventilation maintains air quality by two processes: dilution and removal of airborne contaminants. Uncontaminated supply air mixes with the contaminated room air (i.e., dilution), which is subsequently removed from the room by the exhaust system. These processes reduce the concentration of droplet nuclei in the room air. Recommended general ventilation rates for health care facilities are usually expressed in number of air changes per hour (ACH).
This number is the ratio of the volume of air entering the room per hour to the room volume and is equal to the exhaust airflow (Q, in cubic feet per minute) divided by the room volume (V, in cubic feet), multiplied by 60 (i.e., ACH = (Q / V) x 60). For the purposes of reducing the concentration of droplet nuclei, TB isolation and treatment rooms in existing health care facilities should have an airflow of greater than 6 ACH. Where feasible, this airflow rate should be increased to at least 12 ACH by adjusting or modifying the ventilation system or by using auxiliary means (e.g., recirculation of air through fixed HEPA filtration systems or portable air cleaners). New construction or renovation of existing health care facilities should be designed so that TB isolation rooms achieve an airflow of at least 12 ACH. The general ventilation system should be designed and balanced so that air flows from less contaminated (i.e., more clean) to more contaminated (less clean) areas. For example, air should flow from corridors into TB isolation rooms to prevent spread of contaminants to other areas. In some special treatment rooms in which operative and invasive procedures are performed, the direction of airflow is from the room to the hallway to provide cleaner air during these procedures. Cough-inducing or aerosol-generating procedures (e.g., bronchoscopy and irrigation of tuberculous abscesses) should not be performed in rooms with this type of airflow on patients who may have infectious TB. HEPA filters may be used in a number of ways to reduce or eliminate infectious droplet nuclei from room air or exhaust.
These methods include placement of HEPA filters in exhaust ducts discharging air from booths or enclosures into the surrounding room, in ducts or in ceiling- or wall-mounted units, for recirculation of air within an individual room (fixed recirculation systems), in portable air cleaners, in exhaust ducts to remove droplet nuclei from air being discharged to the outside, either directly or through ventilation equipment, and in ducts discharging air from the TB isolation room into the general ventilation system. In any application, HEPA filters should be installed carefully and maintained meticulously to ensure adequate functioning. For general use areas in which the risk for transmission of M. tuberculosis is relatively high, ultraviolet lamps (UVGI) may be used as an adjunct to ventilation for reducing the concentration of infectious droplet nuclei, although the effectiveness of such units has not been evaluated adequately. Ultraviolet (UV) units can be installed in a room or corridor to irradiate the air in the upper portion of the room, or they can be installed in ducts to irradiate air passing through the ducts.
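The air-change arithmetic above is easy to verify directly. The sketch below implements the stated relation ACH = (Q / V) × 60 and its rearrangement to find the exhaust airflow needed for a target ACH; the room dimensions are hypothetical, and the 12 ACH target is the figure recommended in the text.

```python
# Sketch of the ventilation arithmetic in the text: ACH = (Q / V) * 60,
# where Q is exhaust airflow in cubic feet per minute (cfm) and V is
# room volume in cubic feet. Room dimensions below are illustrative.

def air_changes_per_hour(q_cfm, volume_ft3):
    """Air changes per hour delivered by an exhaust airflow of q_cfm."""
    return q_cfm * 60 / volume_ft3

def required_airflow(target_ach, volume_ft3):
    """Exhaust airflow (cfm) needed to reach a target ACH."""
    return target_ach * volume_ft3 / 60

room = 10 * 12 * 8  # a 10 ft x 12 ft room with an 8 ft ceiling = 960 ft^3
print(air_changes_per_hour(160, room))  # 160 cfm gives 10.0 ACH
print(required_airflow(12, room))       # 192.0 cfm needed for 12 ACH
```

For this hypothetical room, 160 cfm falls short of the 12 ACH recommended for new or renovated isolation rooms, which is the kind of gap that auxiliary measures such as fixed HEPA recirculation or portable air cleaners are meant to close.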
8. Developing, implementing, maintaining and evaluating a respiratory protection programme. Personal respiratory protection (i.e., respirators) should be used by persons entering rooms in which patients with known or suspected infectious TB are being isolated, persons present during cough-inducing or aerosol-generating procedures performed on such patients and persons in other settings where administrative and engineering controls are not likely to protect them from inhaling infectious airborne droplet nuclei. These other settings include transporting patients who may have infectious TB in emergency transport vehicles and providing urgent surgical or dental care to patients who may have infectious TB before a determination has been made that the patient is non-infectious.
9. Educating and training HCWs about TB, effective methods for preventing transmission of M. tuberculosis and the benefits of medical screening programmes. All HCWs, including physicians, should receive education regarding TB that is relevant to persons in their particular occupational group. Ideally, training should be conducted before initial assignment and the need for additional training should be re-evaluated periodically (e.g., once a year). The level and detail of this education will vary according to the HCW’s work responsibilities and the level of risk in the facility (or area of the facility) in which the HCW works. However, the programme may include the following elements:
10. Developing and implementing a programme for routine periodic counselling and screening of HCWs for active TB and latent TB infection. A TB counselling, screening and prevention programme for HCWs should be established to protect both HCWs and patients. HCWs who have positive PPD test results, PPD test conversions or symptoms suggestive of TB should be identified, evaluated to rule out a diagnosis of active TB and started on therapy or preventive therapy if indicated. In addition, the results of the HCW PPD screening programme will contribute to evaluation of the effectiveness of current infection-control practices. Because of the increased risk for rapid progression from latent TB infection to active TB in human immunodeficiency virus (HIV)-infected or otherwise severely immunocompromised persons, all HCWs should know if they have a medical condition or are receiving a medical treatment that may lead to severely impaired cell-mediated immunity. HCWs who may be at risk for HIV infection should know their HIV status (i.e., they should be encouraged to voluntarily seek counselling and testing for HIV antibody status). Existing guidelines for counselling and testing should be followed routinely. Knowledge of these conditions allows the HCW to seek the appropriate preventive measures and to consider voluntary work reassignments.
11. All HCWs should be informed about the need to follow existing recommendations for infection control to minimize the risk for exposure to infectious agents; implementation of these recommendations will greatly reduce the risk for occupational infections among HCWs. All HCWs should also be informed about the potential risks to severely immunocompromised persons associated with caring for patients who have some infectious diseases, including TB. It should be emphasized that limiting exposure to TB patients is the most protective measure that severely immunosuppressed HCWs can take to avoid becoming infected with M. tuberculosis. HCWs who have severely impaired cell-mediated immunity and who may be exposed to M. tuberculosis may consider a change in job-setting to avoid such exposure. HCWs should be advised of the legal option in many jurisdictions that severely immunocompromised HCWs can choose to transfer voluntarily to areas and work activities in which there is the lowest possible risk for exposure to M. tuberculosis. This choice should be a personal decision for HCWs after they have been informed of the risks to their health.
12. Employers should make reasonable accommodations (e.g., alternative job assignments) for employees who have a health condition that compromises cell-mediated immunity and who work in settings where they may be exposed to M. tuberculosis. HCWs who are known to be immunocompromised should be referred to employee health professionals who can individually counsel the employees regarding their risk for TB. Upon the request of the immunocompromised HCW, employers should offer, but not compel, a work setting in which the HCW would have the lowest possible risk for occupational exposure to M. tuberculosis.
13. All HCWs should be informed that immunosuppressed HCWs should have appropriate follow-up and screening for infectious diseases, including TB, provided by their medical practitioner. HCWs who are known to be HIV-infected or otherwise severely immunosuppressed should be tested for cutaneous anergy at the time of PPD testing. Consideration should be given to retesting, at least every 6 months, those immunocompromised HCWs who are potentially exposed to M. tuberculosis because of the high risk for rapid progression to active TB if they become infected.
14. Information provided by HCWs regarding their immune status should be treated confidentially. If the HCW requests voluntary job reassignment, the privacy of the HCW should be maintained. Facilities should have written procedures on confidential handling of such information.
15. Promptly evaluating possible episodes of M. tuberculosis transmission in health care facilities, including PPD skin-test conversions among HCWs, epidemiologically associated cases among HCWs or patients and contacts of patients or HCWs who have TB and who were not promptly identified and isolated. Epidemiological investigations may be indicated for several situations. These include, but are not limited to, the occurrence of PPD test conversions or active TB in HCWs, the occurrence of possible person-to-person transmission of M. tuberculosis and situations in which patients or HCWs with active TB are not promptly identified and isolated, thus exposing other persons in the facility to M. tuberculosis. The general objectives of the epidemiological investigations in these situations are as follows:
16. Coordinating activities with the local public health department, emphasizing reporting and ensuring adequate discharge follow-up and the continuation and completion of therapy. As soon as a patient or HCW is known or suspected to have active TB, the patient or HCW should be reported to the public health department so that appropriate follow-up can be arranged and a community contact investigation can be performed. The health department should be notified well before patient discharge to facilitate follow-up and continuation of therapy. A discharge plan coordinated with the patient or HCW, the health department and the inpatient facility should be implemented.
Health care is a labour intensive industry and, in most countries, health care workers (HCWs) constitute a major sector of the workforce. They comprise a wide range of professional, technical and support personnel working in a large variety of settings. In addition to health professionals, laboratory technicians, pharmacists, social workers and others involved in clinical services, they include administrative and clerical personnel, housekeeping and dietary staff, laundry workers, engineers, electricians, painters and maintenance workers who repair and refurbish the building and the equipment it contains. In contrast with those providing direct care, these support workers usually have only casual, incidental contact with patients.
HCWs come from diverse educational, social and ethnic backgrounds and are usually predominantly female. Many, particularly in home care, are employed in entry-level positions and require considerable basic training. Table 1 lists examples of health care functions and associated occupations.
Table 1. Examples of health care functions and associated occupations
Functions | Occupational category* | Specific occupations
Direct patient care | Health-diagnosing occupations | Physicians
Technical support | Health technicians | Clinical laboratory technicians
Services | Health services | Dental assistants
Administrative support | Clerical services | Billing clerks
Research | Scientific occupations | Scientists and researchers
* Occupational categories are, in part, adapted from those used by the US Department of Labor, Bureau of Labor Statistics.
A segment of the health sector (unfortunately, often too small and under-resourced in most communities) is devoted to direct and indirect preventive services. The major focus of the health care industry, however, is the diagnosis, treatment and care of the sick. This creates a special set of dynamics, for the sick exhibit varying levels of physical and emotional dependencies that set them apart from the customers in such personal services industries as, for example, retail trade, restaurants and hotels. They require, and traditionally receive, special services and considerations, often on an emergency basis, provided frequently at the expense of the HCWs’ personal comfort and safety.
Reflecting their size and numbers of employees, acute and long-term care facilities constitute perhaps the most prominent elements in the health care industry. They are supplemented by outpatient clinics, “surgicenters” (facilities for outpatient surgery), clinical and pathological laboratories, pharmacies, x-ray and imaging centres, ambulance and emergency care services, individual and group offices, and home care services. These may be located within a hospital or operated elsewhere under its aegis, or they may be free-standing and operated independently. It should be noted that there are profound differences in the way health services are delivered, ranging from the well-organized, “high tech” care available in urban centres in developed countries to the underserved areas in rural communities, in developing countries and in inner-city enclaves in many large cities.
Superimposed on the health care system is a massive educational and research establishment in which students, faculty, researchers and support staff often come in direct contact with patients and participate in their care. This comprises schools of medicine, dentistry, nursing, public health, social work and the variety of technical disciplines involved in health care.
The health care industry has been undergoing profound changes during the past few decades. Ageing of the population, especially in developed countries, has amplified the use of nursing homes, domiciliary facilities and home care services. Scientific and technological developments have not only led to the creation of new types of facilities staffed by new classes of specially-trained personnel, but they have also de-emphasized the role of the acute care hospital. Now, many services requiring inpatient care are being provided on an ambulatory basis. Finally, fiscal constraints dictated by the continuing escalation of health care costs have been reconfiguring the health care industry, at least in developed countries, resulting in pressure for cost-containment to be achieved through changes in the organization of health care services.
HCWs who are in direct contact with the sick, wherever they work, are exposed to a number of unique hazards. They face the risk of acquiring infections from the patients they serve, as well as the risk of musculoskeletal injuries when lifting, transferring or restraining them. Support staff not directly involved in patient care (e.g., laundry, housekeeping and materials handling workers) are not only routinely exposed to chemicals, such as cleaning agents and disinfectants of industrial strength, but are also exposed to biological hazards from contaminated linens and wastes (see figure 1). There is also the ethos of health care which, especially in emergency situations, requires HCWs to put the safety and comfort of their patients above their own. Coping with the stress of therapeutic failures, death and dying often takes its toll in worker burnout. All this is compounded by shift work, deliberate or inadvertent understaffing and the necessity of catering to the sometimes unreasonable demands of patients and their families. Finally, there is the threat of abuse and violence from patients, particularly when the job requires HCWs to work alone or takes them into unsafe areas. All these are described in greater detail in other articles in this chapter and elsewhere in this Encyclopaedia.
Figure 1. Handling contaminated biological material
Health Sciences Centre, Winnipeg, Manitoba, Canada
The US National Institute for Occupational Safety and Health (NIOSH) reported that needle punctures, musculoskeletal sprains and back injuries were probably the most common injuries in the health care industry (Wugofski 1995). The World Health Organization (WHO) Conference on Occupational Hazards in 1981 identified five main areas of concern:
Are they health care workers, too?
Often overlooked when considering the safety and well-being of health care workers are students attending medical, dental, nursing and other schools for health professionals, and volunteers serving pro bono in health care facilities. Since they are not “employees” in the technical or legal sense of the term, they are ineligible for workers’ compensation and employment-based health insurance in many jurisdictions. Health care administrators have only a moral obligation to be concerned about their health and safety.
The clinical segments of their training bring medical, nursing and dental students into direct contact with patients who may have infectious diseases. They perform or assist in a variety of invasive procedures, including taking blood samples, and often do laboratory work involving body fluids and specimens of urine and faeces. They are usually free to wander about the facility, often entering areas containing potential hazards without any awareness of their presence, since such hazards are rarely posted. They are usually supervised very loosely, if at all, and their instructors are often not very knowledgeable about, or even interested in, matters of safety and health protection.
Volunteers are rarely permitted to participate in clinical care but they do have social contacts with patients and they usually have few restrictions with respect to areas of the facility they may visit.
Under normal circumstances, students and volunteers share with health care workers the risks of exposure to potentially harmful hazards. These risks are exacerbated at times of crisis and in emergencies when they step into or are ordered into the breach. Clearly, even though it may not be spelled out in laws and regulations or in organizational procedure manuals, they are more than entitled to the concern and protection extended to “regular” health care workers.
Leon Warshaw
Biological Hazards
Biological hazards, which pose a risk for infectious disease, are common throughout the world, but they are particularly problematic in developing countries. While the hepatitis B virus (HBV) is a nearly universal threat to HCWs, it is particularly important in African and Asian countries where this virus is endemic. As discussed later in this chapter, the risk of HBV transmission after percutaneous exposure to hepatitis B surface antigen (HBsAg) positive blood is approximately 100-fold higher than the risk of transmitting the human immunodeficiency virus (HIV) through percutaneous exposure to HIV-infected blood (i.e., 30% versus 0.3%). Nonetheless, there has indeed been an evolution of concern regarding parenteral exposure to blood and body fluids from the pre-HIV to the AIDS era. McCormick et al. (1991) found that the annual reported incidents of injuries from sharp instruments increased more than threefold during a 14-year period and among medical house officers the reported incidents increased ninefold. Overall, nurses incur approximately two-thirds of the needlestick injuries reported. Yassi and McGill (1991) also noted that nursing staff, particularly nursing students, are at highest risk for needlestick injuries, but they also found that approximately 7.5% of medical personnel reported exposures to blood and body fluids, a figure that is probably low because of underreporting. These data were consistent with other reports which indicated that, while there is increased reporting of needlesticks reflecting concerns about HIV and AIDS, certain groups continue to underreport. Sterling (1994) concludes that underreporting of needlestick injuries ranges from 40 to 60%.
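The underreporting estimates cited above imply a simple correction that can be made concrete. The sketch below is an illustration only: the helper function is hypothetical, and the reported count of 181 needlesticks is taken from the one-hospital data in table 4 of this article, while the 40 to 60% range is Sterling's (1994) estimate.

```python
def estimate_true_count(reported, underreporting_fraction):
    """If a fraction f of injuries goes unreported, the reported count
    represents (1 - f) of the true total, so the true total is
    reported / (1 - f)."""
    return reported / (1.0 - underreporting_fraction)

reported_needlesticks = 181  # one hospital, one year (table 4)

low = estimate_true_count(reported_needlesticks, 0.40)   # ~300 if 40% unreported
high = estimate_true_count(reported_needlesticks, 0.60)  # ~450 if 60% unreported
```

Even the conservative end of the range suggests the true injury burden is well over half again the reported figure, which is why surveillance programmes treat reported counts as a lower bound.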
Certain risk factors clearly enhance the likelihood of transmission of bloodborne diseases; these are discussed in the article “Prevention of occupational transmission of bloodborne pathogens”. Frequent exposure has been associated with high seroprevalence rates of hepatitis B among laboratory workers, surgeons and pathologists, and the risk of hepatitis C is similarly increased. The trend towards greater attention to prevention of needlestick injuries is, however, also noteworthy. The adoption of universal precautions is an important advance: under universal precautions, all blood-containing fluid is assumed to be potentially infectious, and appropriate safeguards are always invoked. Safe disposal containers for needles and other sharp instruments are increasingly being placed in conveniently accessible locations in treatment areas, as illustrated in figure 2. The use of new devices, such as needle-less access systems for intravenous treatment and/or blood sampling, has been shown to be a cost-effective method of reducing needlestick injuries (Yassi and McGill 1995).
Figure 2. Disposal container for sharp instruments and devices
Health Sciences Centre, Winnipeg, Manitoba, Canada
Blood and body fluids are not the only source of infection for HCWs. Tuberculosis (TB) is also on the rise again in parts of the world where previously its spread had been curtailed and, as discussed later in this chapter, is a growing occupational health concern. In this, as in other nosocomial infections, such concern is heightened by the fact that so many of the organisms involved have become drug-resistant. There is also the problem of new outbreaks of deadly infectious agents, such as the Ebola virus. The article “Overview of infectious diseases” summarizes the major infectious disease risks for HCWs.
Chemical Hazards
HCWs are exposed to a wide variety of chemicals, including disinfectants, sterilants, laboratory reagents, drugs and anaesthetic agents, to name just a few of the categories. Figure 3 shows a storage cabinet in an area of a large hospital where prosthetics are fabricated and clearly illustrates the vast array of chemicals present in health care facilities. Some of these substances are highly irritating and may also be sensitizing. Some disinfectants and antiseptics are quite toxic as well, with irritating and sensitizing propensities that may induce skin or respiratory tract disease. Some, like formaldehyde and ethylene oxide, are classified as mutagens, teratogens and human carcinogens. Prevention depends on the nature of the chemical, the maintenance of the apparatus in which it is used or applied, environmental controls, worker training and, in some instances, the availability of correct personal protective equipment. Often such control is straightforward and not very expensive. For example, Elias et al. (1993) showed how ethylene oxide exposure was controlled in one health care facility. Other articles in this chapter address chemical hazards and their management.
Figure 3. Storage cabinet for hazardous chemicals
Health Sciences Centre, Winnipeg, Manitoba, Canada
Physical Hazards and the Building Environment
In addition to the specific environmental contaminants faced by HCWs, many health care facilities also have documented indoor air quality problems. Tran et al. (1994), in studying symptoms experienced by operating room personnel, noted the presence of “sick building syndrome” in one hospital. Building design and maintenance decisions are, therefore, extremely important in health care facilities. Particular attention must be paid to correct ventilation in specific areas such as laboratories, operating rooms and pharmacies, to the availability of hoods and to avoiding the venting of chemical-laden fumes into the general air-conditioning system. Controlling the recirculation of air and using special equipment (e.g., appropriate filters and ultraviolet lamps) are needed to prevent the transmission of airborne infectious agents. Aspects of the construction and planning of health care facilities are discussed in the article “Buildings for health care facilities”.
Physical hazards are also ubiquitous in hospitals (see “Exposure to physical agents” in this chapter). The wide variety of electrical equipment used in hospitals can present an electrocution hazard to patients and staff if not properly maintained and grounded (see figure 4). Especially in hot and humid environments, heat exposure may present a problem to workers in such areas as laundries, kitchens and boiler rooms. Ionizing radiation is a special concern for staff in diagnostic radiology (e.g., x ray, angiography, dental radiography and computerized axial tomography (CAT) scans) as well as for those in therapeutic radiology. Controlling such radiation exposures is a routine matter in designated departments where there is careful supervision, well-trained technicians and properly shielded and maintained equipment, but it can be a problem when portable equipment is used in emergency rooms, intensive care units and operating rooms. It can also be a problem for housekeeping and other support staff whose duties take them into areas of potential exposure. In many jurisdictions these workers have not been properly trained to avoid this hazard. Exposure to ionizing radiation may also present a problem in diagnostic and therapeutic nuclear medicine units and in preparing and distributing doses of radioactive pharmaceuticals. In some cases, however, radiation exposure remains a serious problem (see the article “Occupational health and safety practice: The Russian experience” in this chapter).
Figure 4. Electrical equipment in hospital
Health Sciences Centre, Winnipeg, Manitoba, Canada
Contradicting the prevailing impression of hospitals as quiet workplaces, Yassi et al. (1991) have documented the surprising extent of noise-induced hearing loss among hospital workers (see table 2). The article “Ergonomics of the physical work environment” in this chapter offers useful recommendations for controlling this hazard, as does table 3.
Table 2. 1995 integrated sound levels
Area monitored | dBA (Lex) range
Cast room | 76.32 to 81.9
Central energy | 82.4 to 110.4
Nutrition and food services (main kitchen) |
Housekeeping |
Laundry |
Linen service | 76.3 to 91.0
Mailroom |
Maintenance |
Materials handling |
Print shop |
Rehabilitation engineering |
Note: “Lex” denotes the equivalent sound exposure level: the steady sound level in dBA which, if present in a workplace for 8 hours, would contain the same acoustic energy as the actual, time-varying level.
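The equal-energy principle behind Lex can be made concrete. The following sketch (a hypothetical helper, not from the source) computes an 8-hour equivalent level from a series of (level in dBA, duration in hours) measurements, using the standard relation that acoustic energy is proportional to duration times 10 raised to (level/10):

```python
import math

def lex_8h(measurements, reference_hours=8.0):
    """8-hour equivalent sound exposure level (Lex) in dBA.

    measurements: iterable of (level_dBA, duration_hours) pairs.
    The total acoustic energy of the samples is spread over the
    8-hour reference period and converted back to decibels.
    """
    energy = sum(hours * 10 ** (level / 10.0) for level, hours in measurements)
    return 10.0 * math.log10(energy / reference_hours)

# A steady 85 dBA for a full 8-hour shift is, by definition, Lex = 85.
print(round(lex_8h([(85.0, 8.0)]), 1))  # 85.0

# 4 hours at 88 dBA plus 4 quiet hours also gives roughly 85 dBA Lex:
# halving the exposure time trades off against about 3 dB.
print(round(lex_8h([(88.0, 4.0)]), 1))  # 85.0
```

This is why short periods at the high end of the ranges in table 2 (e.g., 110.4 dBA in central energy) can dominate a worker's daily exposure even if the rest of the shift is quiet.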
Table 3. Ergonomic noise reduction options
Work area | Process | Control options
Central energy | General area | Enclose the source
Dietetics | Pot washer | Automate process
Housekeeping | Burnishing | Purchasing criteria
Laundry | Dryer/washer | Isolate and reduce vibration
Mailroom | Tuberoom | Purchasing criteria
Maintenance | Various equipment | Purchasing criteria
Materiel handling | Carts | Maintenance
Print shop | Press operator | Maintenance
Rehabilitation engineering | Orthotics | Purchasing criteria
By far the most common and most costly type of injury faced by HCWs is back injury. Nurses and attendants are at greatest risk of musculoskeletal injuries because of the large amount of patient lifting and transferring their jobs require. The epidemiology of back injury in nurses in one hospital was summarized by Yassi et al. (1995a); the pattern they observed mirrors those reported universally. Hospitals are increasingly turning to preventive measures, which may include staff training and the use of mechanical lifting devices. Many are also providing up-to-date diagnostic, therapeutic and rehabilitation health services that minimize lost time and disability and are cost-effective (Yassi et al. 1995b). Hospital ergonomics has taken on increasing importance and is, therefore, the subject of a review article in this chapter. The specific problem of the prevention and management of back pain in nurses, one of the most important problems for this cohort of HCWs, is also discussed in the article “Prevention and management of back pain in nurses” in this chapter. Table 4 lists the total number of injuries in a one-year period.
Table 4. Total number of injuries, by mechanism and nature of injury (one hospital, all departments), 1 April 1994 to 31 March 1995

Mechanism of injury | Total
Exertion |
  Transferring | 105
  Lifting | 83
  Assisting | 4
  Turning | 27
  Breaking fall | 28
  Pushing | 26
  Lifting | 54
  Pulling | 14
  Combination | 38
  Other | 74
Fall | 119
Struck by/against | 152
Caught in/between | 91
Exp. | 55
Staff abuse |
  Patient | 120
Spill/splashes | 81
Drug | 2
Exp. | 10
Needlesticks | 181
Scalpel cuts | 48
Other5 | 40
Unknown (no report)4 | 8
Total | 1,360

Nature of injury sustained | Total
Blood/body fluid | 289
Cut/laceration | 136
Bruise/contusion | 243
Sprain/strain | 558
Fracture | 5
Burn | 33
Human bite | 8
Broken … | 7
Headache | 19
Occupational illness2 | 25
Other3 | 29
Unknown | 8
Total | 1,360

1 No blood/body fluid. 2 Includes rashes, dermatitis, work-related illness and burning or irritated eyes. 3 Exposure to chemical or physical agents but with no documented injurious effects. 4 Accident not reported. 5 Exposure to cold/heat, unknown.
In discussing musculoskeletal and ergonomic problems, it is important to note that while those engaged in direct patient care may be at greatest risk (see figure 5) many of the support personnel in hospital must contend with similar ergonomic burdens (see figure 6 and figure 7). The ergonomic problems facing hospital laundry workers have been well-documented (Wands and Yassi 1993) (see figure 8, figure 9 and figure 10) and they also are common among dentists, otologists, surgeons and especially microsurgeons, obstetricians, gynaecologists and other health personnel who often must work in awkward postures.
Figure 5. Patient lifting is an ergonomic hazard in most hospitals
Health Sciences Centre, Winnipeg, Manitoba, Canada
Figure 6. Overhead painting: A typical ergonomic hazard for a tradesworker
Health Sciences Centre, Winnipeg, Manitoba, Canada
Figure 7. Cast-making involves many ergonomic stresses
Health Sciences Centre, Winnipeg, Manitoba, Canada
Figure 8. Laundry work such as this can cause repetitive stress injury to the upper limbs
Health Sciences Centre, Winnipeg, Manitoba, Canada
Figure 9. This laundry task requires working in an awkward position
Health Sciences Centre, Winnipeg, Manitoba, Canada
Figure 10. A poorly designed laundry operation can cause back strain
Health Sciences Centre, Winnipeg, Manitoba, Canada
Organizational Problems
The article “Strain in health care work” contains a discussion of some of the organizational problems in hospitals and a summary of the principal findings of Leppanen and Olkinuora (1987), who reviewed Finnish and Swedish studies of stress among HCWs. With the rapid changes currently under way in this industry, the extent of alienation, frustration and burnout among HCWs is considerable. Added to that is the prevalence of staff abuse, an increasingly troublesome problem in many facilities (Yassi 1994). While it is often thought that the most difficult psychosocial problem faced by HCWs is dealing with death and dying, it is being recognized increasingly that the nature of the industry itself, with its hierarchical structure, its growing job insecurity and the high demands unsupported by adequate resources, is the cause of the variety of stress-related illness faced by HCWs.
The Nature of the Health Care Sector
In 1976, Stellman wrote, “If you ever wondered how people can manage to work with the sick and always stay healthy themselves, the answer is that they can’t” (Stellman 1976). The answer has not changed, but the potential hazards have clearly expanded from infectious diseases, back and other injuries, stress and burnout to include a large variety of potentially toxic environmental, physical and psychosocial exposures. The world of the HCW continues to be largely unmonitored and largely unregulated. None the less, progress is being made in addressing occupational health and safety hazards in hospitals. The International Commission on Occupational Health (ICOH) has a sub-committee addressing this problem, and several international conferences have been held with published proceedings that offer useful information (Hagberg et al. 1995). The US Centers for Disease Control and Prevention (CDC) and NIOSH have proposed guidelines to address many of the problems of the health care industry discussed in this article (e.g., see NIOSH 1988). The number of articles and books addressing health and safety issues for HCWs has been growing rapidly, and good overviews of health and safety in the US health care industry have been published (e.g., Charney 1994; Lewy 1990; Sterling 1994). The need for systematic data collection, study and analysis regarding hazards in the health care industry and the desirability of assembling interdisciplinary occupational health teams to address them have become increasingly evident.
When considering occupational health and safety in the health care industry, it is crucial to appreciate the enormous changes currently taking place in it. Health care “reform”, being instituted in most of the developed countries of the world, is creating extraordinary turbulence and uncertainty for HCWs, who are being asked to absorb rapid changes in their work tasks, often with greater exposure to risks. The transformation of health care is spurred, in part, by advances in medical and scientific knowledge, the development of innovative technological procedures and the acquisition of new skills. It is also being driven, however, and perhaps to an even greater extent, by concepts of cost-effectiveness and organizational efficiency, in which “downsizing” and “cost control” have often seemed to become goals in themselves. New institutional incentives are being introduced at different organizational levels in different countries. The contracting out of jobs and services that had traditionally been carried out by a large stable workforce is now increasingly becoming the norm. Such contracting out of work is reported to have helped health administrators and politicians achieve their long-term goal of making the process of health care more flexible and more accountable. These changes have also brought changes in roles that were previously rather well-defined, undermining the traditional hierarchical relationships among planners, administrators, physicians and other health professionals. The rise of investor-owned health care organizations in many countries has introduced a new dynamic in the financing and management of health services. In many situations, HCWs have been forced into new working relationships that involve such changes as the downgrading of services so that they can be performed by less-skilled workers at lower pay, reduced staffing levels and staff redeployments involving split shifts and part-time assignments.
At the same time, there has been a slow but steady growth in the numbers of such physician surrogates as physician assistants, nurse practitioners, midwives and psychiatric social workers, who command lower rates of pay than the physicians they are replacing. (The ultimate social and health costs, both to HCWs and to the public as patients and payers, are still to be determined.)
A growing trend in the US that is also emerging in the UK and northern European countries is “managed care”. This generally involves the creation of organizations paid on a per capita basis by insurance companies or government agencies to provide or contract for the provision of a comprehensive range of health services to a voluntarily-enrolled population of subscribers. Their aim is to reduce the costs of health care by “managing” the process: using administrative procedures and primary care physicians as “gatekeepers” to control the utilization of expensive in-patient hospital days, reducing referrals to high-priced specialists and use of costly diagnostic procedures, and denying coverage for expensive new forms of “experimental” treatment. The growing popularity of these managed care systems, fuelled by aggressive marketing to employer- and government-sponsored groups and individuals, has made it difficult for physicians and other health care providers to resist becoming involved. Once engaged, there is a variety of financial incentives and disincentives to influence their judgement and condition their behaviour. The loss of their traditional autonomy has been particularly painful for many medical practitioners and has had a profound influence on their patterns of practice and their relationships with other HCWs.
These rapid changes in the organization of the health care industry are having profound direct and indirect effects on the health and safety of HCWs. They affect the ways health services are organized, managed, delivered and paid for. They affect the ways HCWs are trained, assigned and supervised and the extent to which considerations of their health and safety are addressed. This should be kept in mind as the various occupational health hazards faced by HCWs are discussed in this chapter. Finally, although it may not appear to be directly relevant to the content of this chapter, thought should be given to the implications of the well-being and performance of HCWs to the quality and effectiveness of the services they provide to their patients.
Exposure to potentially hazardous chemicals is a fact of life for health care workers. They are encountered in the course of diagnostic and therapeutic procedures, in laboratory work, in preparation and clean-up activities and even in emanations from patients, to say nothing of the “infrastructure” activities common to all worksites such as cleaning and housekeeping, laundry, painting, plumbing and maintenance work. Despite the constant threat of such exposures and the large numbers of workers involved—health care is invariably one of the most labour-intensive industries—this problem has received scant attention from those involved in occupational health and safety research and regulation. The great majority of chemicals in common use in hospitals and other health care settings are not specifically covered under national and international occupational exposure standards. In fact, very little effort has been made to date to identify the chemicals most frequently used, much less to study the mechanisms and intensity of exposures to them and the epidemiology of the effects on the health care workers involved.
This may be changing in the many jurisdictions in which right-to-know laws, such as the Canadian Workplace Hazardous Materials Information System (WHMIS), are being legislated and enforced. These laws require that workers be informed of the name and nature of the chemicals to which they may be exposed on the job. They have introduced a daunting challenge to administrators in the health care industry, who must now turn to occupational health and safety professionals to undertake a de novo inventory of the identity and location of the thousands of chemicals to which their workers may be exposed.
The wide range of professions and jobs and the complexity of their interplay in the health care workplace require unique diligence and astuteness on the part of those charged with such occupational safety and health responsibilities. A significant complication is the traditional altruistic focus on the care and well-being of the patients, even at the expense of the health and well-being of those providing the services. Another complication is the fact that these services are often required at times of great urgency when important preventive and protective measures may be forgotten or deliberately disregarded.
Categories of Chemical Exposures in the Health Care Setting
Table 1 lists the categories of chemicals encountered in the health care workplace. Laboratory workers are exposed to the broad range of chemical reagents they employ, histology technicians to dyes and stains, and pathologists to fixative and preservative solutions (formaldehyde is a potent sensitizer); asbestos is a hazard to workers making repairs or renovations in older health care facilities.
Table 1. Categories of chemicals used in health care
Types of chemicals | Locations most likely to be found
Disinfectants | Patient areas
Sterilants | Central supply
Medicines | Patient areas
Laboratory reagents | Laboratories
Housekeeping/maintenance chemicals | Hospital-wide
Food ingredients and products | Kitchen
Pesticides | Hospital-wide
Even when liberally applied in combating and preventing the spread of infectious agents, detergents, disinfectants and sterilants pose relatively little danger to patients, whose exposure is usually of brief duration. Even though individual doses at any one time may be relatively low, however, their cumulative effect over the course of a working lifetime may constitute a significant risk to health care workers.
Occupational exposures to drugs can cause allergic reactions, such as have been reported over many years among workers administering penicillin and other antibiotics, or much more serious problems with such highly carcinogenic agents as the antineoplastic drugs. The contacts may occur during the preparation or administration of the dose for injection or in cleaning up after it has been administered. Although the danger of this mechanism of exposure had been known for many years, it was fully appreciated only after mutagenic activity was detected in the urine of nurses administering antineoplastic agents.
Another mechanism of exposure is the administration of drugs as aerosols for inhalation. The use of antineoplastic agents, pentamidine and ribavirin by this route has been studied in some detail, but there has been, as of this writing, no report of a systematic study of aerosols as a source of toxicity among health care workers.
Anaesthetic gases represent another class of drugs to which many health care workers are exposed. These chemicals are associated with a variety of biological effects, the most obvious of which are on the nervous system. Recently, there have been reports suggesting that repeated exposures to anaesthetic gases may, over time, have adverse reproductive effects among both male and female workers. It should be recognized that appreciable amounts of waste anaesthetic gases may accumulate in the air in recovery rooms as the gases retained in the blood and other tissues of patients are eliminated by exhalation.
Chemical disinfecting and sterilizing agents are another important category of potentially hazardous chemical exposures for health care workers. Used primarily in the sterilization of non-disposable equipment, such as surgical instruments and respiratory therapy apparatus, chemical sterilants such as ethylene oxide are effective because they interact with infectious agents and destroy them. Alkylation, whereby methyl or other alkyl groups bind chemically with protein-rich entities such as the amino groups in haemoglobiin and DNA, is a powerful biological effect. In intact organisms, this may not cause direct toxicity but should be considered potentially carcinogenic until proven otherwise. Ethylene oxide itself, however, is a known carcinogen and is associated with a variety of adverse health effects, as discussed elsewhere in the Encyclopaedia. The potent alkylation capability of ethylene oxide, probably the most widely-used sterilant for heat-sensitive materials, has led to its use as a classic probe in studying molecular structure.
For years, the methods used in the chemical sterilization of instruments and other surgical materials have carelessly and needlessly put many health care workers at risk. Not even rudimentary precautions were taken to prevent or limit exposures. For example, it was the common practice to leave the door of the sterilizer partially open to allow the escape of excess ethylene oxide, or to leave freshly-sterilized materials uncovered and open to the room air until enough had been assembled to make efficient use of the aerator unit.
The fixation of metallic or ceramic replacement parts, so common in dentistry and orthopaedic surgery, may be a source of potentially hazardous chemical exposure such as silica. These parts and the acrylic resins often used to glue them in place are usually biologically inert, but health care workers may be exposed to the monomers and other chemical reactants used during the preparation and application process. These chemicals are often sensitizing agents and have been associated with chronic effects in animals. The preparation of mercury amalgam fillings can lead to mercury exposure. Spills and the spread of mercury droplets are a particular concern, since these may linger unnoticed in the work environment for many years. The acute exposure of patients to them appears to be entirely safe, but the long-term health implications of the repeated exposure of health care workers have not been adequately studied.
Finally, such medical techniques as laser surgery, electro-cauterization and use of other radiofrequency and high-energy devices can lead to the thermal degradation of tissues and other substances resulting in the formation of potentially toxic smoke and fumes. For example, the cutting of “plaster” casts made of polyester resin impregnated bandages has been shown to release potentially toxic fumes.
The hospital as a “mini-municipality”
A listing of the varied jobs and tasks performed by the personnel of hospitals and other large health care facilities might well serve as a table of contents for the commercial listings of a telephone directory for a sizeable municipality. All of these entail chemical exposures intrinsic to the particular work activity in addition to those that are peculiar to the health care environment. Thus, painters and maintenance workers are exposed to solvents and lubricants. Plumbers and others engaged in soldering are exposed to fumes of lead and flux. Housekeeping workers are exposed to soaps, detergents and other cleansing agents, pesticides and other household chemicals. Cooks may be exposed to potentially carcinogenic fumes in broiling or frying foods and to oxides of nitrogen from the use of natural gas as fuel. Even clerical workers may be exposed to the toners used in copiers and printers. The occurrence and effects of such chemical exposures are detailed elsewhere in this Encyclopaedia.
One chemical exposure that is diminishing in importance as more and more HCWs quit smoking and more health care facilities become “smoke-free” is “second hand” tobacco smoke.
Unusual chemical exposures in health care
Table 2 presents a partial listing of the chemicals most commonly encountered in health care workplaces. Whether or not they will be toxic will depend on the nature of the chemical and its biological proclivities, the manner, intensity and duration of the exposure, the susceptibilities of the exposed worker, and the speed and effectiveness of any countermeasures that may have been attempted. Unfortunately, a compendium of the nature, mechanisms, effects and treatment of chemical exposures of health care workers has not yet been published.
There are some unique exposures in the health care workplace that substantiate the dictum that a high level of vigilance is necessary to protect workers fully from such risks. For example, it was recently reported that health care workers had been overcome by toxic fumes emanating from a patient being treated for a massive chemical exposure. Cases of cyanide poisoning arising from patient emissions have also been reported. In addition to the direct toxicity of waste anaesthetic gases to anaesthetists and other personnel in operating theatres, there is the often unrecognized problem created by the frequent use in such areas of high-energy sources which can transform the anaesthetic gases to free radicals, a form in which they are potentially carcinogenic.
Table 2. Chemicals cited in the Hazardous Substances Database (HSDB)
The following chemicals are listed in the HSDB as being used in some area of the health care environment. The HSDB is produced by the US National Library of Medicine and is a compilation of more than 4,200 chemicals with known toxic effects in commercial use. Absence of a chemical from the list does not imply that it is not toxic, only that it is not present in the HSDB.
Use list in the HSDB | Chemical name | CAS number*
Disinfectants; antiseptics | benzalkonium chloride | 8001-54-5
Sterilants | beta-propiolactone | 57-57-8
Laboratory reagents | 2,4-xylidine (magenta-base) | 3248-93-9
* Chemical Abstracts identification number.
Often overlooked when considering the safety and well-being of health care workers are students attending medical, dental, nursing and other schools for health professionals and volunteers serving pro bono in health care facilities. Since they are not “employees” in the technical or legal sense of the term, they are ineligible for workers’ compensation and employment-based health insurance in many jurisdictions. Health care administrators have only a moral obligation to be concerned about their health and safety.
The clinical segments of their training bring medical, nursing and dental students into direct contact with patients who may have infectious diseases. They perform or assist in a variety of invasive procedures, including taking blood samples, and often do laboratory work involving body fluids and specimens of urine and faeces. They are usually free to wander about the facility, often entering areas containing potential hazards without an awareness of their presence, since such hazards are rarely posted. They are usually supervised very loosely, if at all, while their instructors are often not very knowledgeable about, or even interested in, matters of safety and health protection.
Volunteers are rarely permitted to participate in clinical care but they do have social contacts with patients and they usually have few restrictions with respect to areas of the facility they may visit.
Under normal circumstances, students and volunteers share with health care workers the risks of exposure to potentially harmful hazards. These risks are exacerbated at times of crisis and in emergencies when they step into or are ordered into the breach. Clearly, even though it may not be spelled out in laws and regulations or in organizational procedure manuals, they are more than entitled to the concern and protection extended to “regular” health care workers.
The vast array of chemicals in hospitals, and the multitude of settings in which they occur, call for a systematic approach to their control. A chemical-by-chemical approach to prevention of exposures and their deleterious outcome is simply too inefficient to handle a problem of this scope. Moreover, as noted in the article “Overview of chemical hazards in health care”, many chemicals in the hospital environment have been inadequately studied; new chemicals are constantly being introduced and for others, even some that have become quite familiar (e.g., gloves made of latex), new hazardous effects are only now becoming manifest. Thus, while it is useful to follow chemical-specific control guidelines, a more comprehensive approach is needed whereby individual chemical control policies and practices are superimposed on a strong foundation of general chemical hazard control.
The control of chemical hazards in hospitals must be based on classic principles of good occupational health practice. Because health care facilities are accustomed to approaching health through the medical model, which focuses on the individual patient and treatment rather than on prevention, special effort is required to ensure that the orientation for handling chemicals is indeed preventive and that measures are principally focused on the workplace rather than on the worker.
Environmental (or engineering) control measures are the key to prevention of deleterious exposures. However, it is necessary to train each worker correctly in appropriate exposure prevention techniques. In fact, right-to-know legislation, as described below, requires that workers be informed of the hazards with which they work, as well as of the appropriate safety precautions. Secondary prevention at the level of the worker is the domain of medical services, which may include medical monitoring to ascertain whether health effects of exposure can be medically detected; it also consists of prompt and appropriate medical intervention in the event of accidental exposure. Chemicals that are less toxic must replace more toxic ones, processes should be enclosed wherever possible and good ventilation is essential.
While all means to prevent or minimize exposures should be implemented, if exposure does occur (e.g., a chemical is spilled), procedures must be in place to ensure prompt and appropriate response to prevent further exposure.
Applying the General Principles of Chemical Hazard Control in the Hospital Environment
The first step in hazard control is hazard identification. This, in turn, requires a knowledge of the physical properties, chemical constituents and toxicological properties of the chemicals in question. Material safety data sheets (MSDSs), which are becoming increasingly available by legal requirement in many countries, list such properties. The vigilant occupational health practitioner, however, should recognize that the MSDS may be incomplete, particularly with respect to long-term effects or effects of low-dose chronic exposure. Hence, a literature search may be contemplated to supplement the MSDS material, when appropriate.
The second step in controlling a hazard is characterizing the risk. Does the chemical pose a carcinogenic risk? Is it an allergen? A teratogen? Or are short-term irritant effects the main concern? The answers to these questions will influence the way in which exposure is assessed.
The third step in chemical hazard control is to assess the actual exposure. Discussion with the health care workers who use the product in question is the most important element in this endeavour. Monitoring methods are necessary in some situations to ascertain that exposure controls are functioning properly. These may be area sampling, either grab sample or integrated, depending on the nature of the exposure; it may be personal sampling; in some cases, as discussed below, medical monitoring may be contemplated, but usually as a last resort and only as back-up to other means of exposure assessment.
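Where integrated area or personal samples are collected, results are commonly reduced to a time-weighted average (TWA) concentration for comparison against an applicable exposure limit. The sketch below illustrates only the arithmetic; the function name, the 8-hour reference period and the sample values are assumptions for illustration, not part of this article:

```python
def time_weighted_average(samples, reference_hours=8.0):
    """Compute a TWA concentration from (concentration, duration_hours) pairs.

    Unsampled time within the reference period is treated as zero
    exposure -- a simplification that must be justified case by case.
    """
    total_dose = sum(conc * hours for conc, hours in samples)
    return total_dose / reference_hours

# Three consecutive samples over an 8-hour shift (ppm, hours):
samples = [(2.0, 3.0), (0.5, 4.0), (4.0, 1.0)]
twa = time_weighted_average(samples)  # (6.0 + 2.0 + 4.0) / 8 = 1.5 ppm
```

The same computation applies whether the pairs come from grab samples or from longer integrated samples; only the durations change.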
Once the properties of the chemical product in question are known, and the nature and extent of exposure are assessed, a determination can be made as to the degree of risk. This generally requires that at least some dose-response information be available.
After evaluating the risk, the next series of steps is, of course, to control the exposure, so as to eliminate or at least minimize the risk. This, first and foremost, involves applying the general principles of exposure control.
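The sequence described above (identify the hazard, characterize the risk, assess exposure, determine the degree of risk, control the exposure) can be pictured as a simple record-keeping workflow. The sketch below is purely illustrative: the data structure, field names and screening thresholds are this example's assumptions, not an implementation of any regulatory scheme:

```python
from dataclasses import dataclass, field

@dataclass
class HazardRecord:
    chemical: str
    # Steps 1-2: identification and risk characterization, drawn from the
    # MSDS and, where needed, a supplementary literature search.
    hazard_classes: list = field(default_factory=list)  # e.g., "carcinogen"
    # Step 3: assessed exposure, expressed as a fraction of the
    # applicable occupational exposure limit.
    exposure_fraction_of_limit: float = 0.0
    controls: list = field(default_factory=list)

    def needs_action(self):
        # Step 4: a crude screening rule -- any measurable exposure to a
        # carcinogen, or any chemical above half its limit, is flagged
        # for step 5 (exposure control).
        if "carcinogen" in self.hazard_classes and self.exposure_fraction_of_limit > 0:
            return True
        return self.exposure_fraction_of_limit > 0.5

record = HazardRecord(
    chemical="ethylene oxide",
    hazard_classes=["carcinogen", "sensitizer"],
    exposure_fraction_of_limit=0.2,
)
# record.needs_action() is True: flagged despite being below half the limit,
# because no exposure to a carcinogen is treated as acceptable by default.
```

In practice the screening rule would come from the jurisdiction's legislation and the facility's own policies, not from a hard-coded threshold.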
Organizing a Chemical Control Programme in Hospitals
The traditional obstacles
The implementation of adequate occupational health programmes in health care facilities has lagged behind the recognition of the hazards. Labour relations are increasingly forcing hospital management to look at all aspects of their benefits and services to employees, as hospitals are no longer tacitly exempt by custom or privilege. Legislative changes are now compelling hospitals in many jurisdictions to implement control programmes.
However, obstacles remain. The preoccupation of the hospital with patient care, emphasizing treatment rather than prevention, and the staff’s ready access to informal “corridor consultation”, have hindered the rapid implementation of control programmes. The fact that laboratory chemists, pharmacists and a host of medical scientists with considerable toxicological expertise are heavily represented in management has, in general, not served to hasten the development of programmes. The question may be asked, “Why do we need an occupational hygienist when we have all these toxicology experts?” To the extent that changes in procedures threaten to have an impact on the tasks and services provided by these highly skilled personnel, the situation may be made worse: “We cannot eliminate the use of Substance X as it is the best bactericide around.” Or, “If we follow the procedure that you are recommending, patient care will suffer.” Moreover, the “we don’t need training” attitude is commonplace among the health care professions and hinders the implementation of the essential components of chemical hazard control. Internationally, the climate of cost constraint in health care is clearly also an obstacle.
Another problem of particular concern in hospitals is preserving the confidentiality of personal information about health care workers. While occupational health professionals should need only to indicate that Ms. X cannot work with chemical Z and needs to be transferred, curious clinicians are often more prone to push for the clinical explanation than their non-health care counterparts. Ms. X may have liver disease and the substance is a liver toxin; she may be allergic to the chemical; or she may be pregnant and the substance has potential teratogenic properties. While the need to alter the work assignment of particular individuals should not be routine, the confidentiality of the medical details should be protected if it is necessary.
Right-to-know legislation
Many jurisdictions around the world have implemented right-to-know legislation. In Canada, for example, the Workplace Hazardous Materials Information System (WHMIS) has revolutionized the handling of chemicals in industry. This country-wide system has three components: (1) the labelling of all hazardous substances with standardized labels indicating the nature of the hazard; (2) the provision of MSDSs with the constituents, hazards and control measures for each substance; and (3) the training of workers to understand the labels and the MSDSs and to use the product safely.
Under WHMIS in Canada and OSHA’s Hazard Communication Standard in the United States, hospitals have been required to construct inventories of all chemicals on the premises so that those that are “controlled substances” can be identified and addressed according to the legislation. In the process of complying with the training requirements of these regulations, hospitals have had to engage occupational health professionals with appropriate expertise; the spin-off benefits, particularly when bipartite train-the-trainer programmes were conducted, have included a new spirit of working cooperatively to address other health and safety concerns.
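Constructing such an inventory amounts to matching the facility-wide list of products against the set of substances controlled under the legislation. A minimal sketch, in which the product names, constituents and "controlled" set are all invented for illustration:

```python
# Facility-wide product inventory: product -> constituent substances.
inventory = {
    "Lab Reagent A": ["beta-propiolactone", "water"],
    "Floor Cleaner B": ["sodium lauryl sulphate", "water"],
    "Sterilant C": ["ethylene oxide"],
}

# Substances controlled under the applicable legislation (illustrative).
controlled = {"beta-propiolactone", "ethylene oxide"}

# Products that must carry standardized labels, have MSDSs on hand and
# be covered by worker training.
flagged = {
    product: sorted(set(ingredients) & controlled)
    for product, ingredients in inventory.items()
    if set(ingredients) & controlled
}
# flagged == {"Lab Reagent A": ["beta-propiolactone"],
#             "Sterilant C": ["ethylene oxide"]}
```

A real inventory would also record locations, quantities and MSDS availability, so that the training requirement can be mapped to the departments actually handling each product.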
Corporate commitment and the role of joint health and safety committees
The most important element in the success of any occupational health and safety programme is corporate commitment to ensure its successful implementation. Policies and procedures regarding the safe handling of chemicals in hospitals must be written, discussed at all levels within the organization and adopted and enforced as corporate policy. Chemical hazard control in hospitals should be addressed by general as well as specific policies. For example, there should be a policy on responsibility for the implementation of right-to-know legislation that clearly outlines each party’s obligations and the procedures to be followed by individuals at each level of the organization (e.g., who chooses the trainers, how much work time is allowed for preparation and provision of training, to whom non-attendance should be reported and so on). There should be a generic spill clean-up policy indicating the responsibility of the worker and the department where the spill occurred, the indications and protocol for notifying the emergency response team, including the appropriate in-hospital and external authorities and experts, follow-up provisions for exposed workers and so on. Specific policies should also exist regarding the handling, storage and disposal of specific classes of toxic chemicals.
Not only is it essential that management be strongly committed to these programmes; the workforce, through its representatives, must also be actively involved in the development and implementation of policies and procedures. Some jurisdictions have legislatively mandated joint (labour-management) health and safety committees that meet at a minimum prescribed interval (bimonthly in the case of Manitoba hospitals), have written operating procedures and keep detailed minutes. Indeed in recognizing the importance of these committees, the Manitoba Workers’ Compensation Board (WCB) provides a rebate on WCB premiums paid by employers based on the successful functioning of these committees. To be effective, the members must be appropriately chosen—specifically, they must be elected by their peers, knowledgeable about the legislation, have appropriate education and training and be allotted sufficient time to conduct not only incident investigations but regular inspections. With respect to chemical control, the joint committee has both a pro-active and a re-active role: assisting in setting priorities and developing preventive policies, as well as serving as a sounding board for workers who are not satisfied that all appropriate controls are being implemented.
The multidisciplinary team
As noted above, the control of chemical hazards in hospitals requires a multidisciplinary endeavour. At a minimum, it requires occupational hygiene expertise. Generally hospitals have maintenance departments that have within them the engineering and physical plant expertise to assist a hygienist in determining whether workplace alterations are necessary. Occupational health nurses also play a prominent role in evaluating the nature of concerns and complaints, and in assisting an occupational physician in ascertaining whether clinical intervention is warranted. In hospitals, it is important to recognize that numerous health care professionals have expertise that is quite relevant to the control of chemical hazards. It would be unthinkable to develop policies and procedures for the control of laboratory chemicals without the involvement of lab chemists, for example, or procedures for handling anti-neoplastic drugs without the involvement of the oncology and pharmacology staff. While it is wise for occupational health professionals in all industries to consult with line staff prior to implementing control measures, it would be an unforgivable error to fail to do so in health care settings.
Data collection
As in all industries, and with all hazards, data need to be compiled both to help in priority setting and in evaluating the success of programmes. With respect to data collection on chemical hazards in hospitals, minimally, data need to be kept regarding accidental exposures and spills (so that these areas can receive special attention to prevent recurrences); the nature of concerns and complaints should be recorded (e.g., unusual odours); and clinical cases need to be tabulated, so that, for example, an increase in dermatitis from a given area or occupational group could be identified.
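Tabulating clinical cases by area and occupational group, as suggested above, is a simple aggregation. The sketch below uses invented records; the departments, complaint labels and the baseline rule are illustrative assumptions, not reported data:

```python
from collections import Counter

# Each record: (department, occupational_group, complaint)
reports = [
    ("oncology pharmacy", "pharmacist", "dermatitis"),
    ("oncology pharmacy", "technician", "dermatitis"),
    ("laundry", "attendant", "unusual odour"),
    ("oncology pharmacy", "pharmacist", "dermatitis"),
]

dermatitis_by_area = Counter(
    dept for dept, _group, complaint in reports if complaint == "dermatitis"
)

# Flag any area whose dermatitis count exceeds an agreed baseline, so it
# can receive special attention to prevent recurrences.
baseline = 1
flagged_areas = [d for d, n in dermatitis_by_area.items() if n > baseline]
# flagged_areas == ["oncology pharmacy"]
```

The same tally, keyed on occupational group instead of department, would surface a cluster among (say) one category of staff across several locations.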
Cradle-to-grave approach
Increasingly, hospitals are becoming cognizant of their obligation to protect the environment. Not only the workplace hazardous properties, but the environmental properties of chemicals are being taken into consideration. Moreover, it is no longer acceptable to pour hazardous chemicals down the drain or release noxious fumes into the air. A chemical control programme in hospitals must, therefore, be capable of tracking chemicals from their purchase and acquisition (or, in some cases, synthesis on site), through the work handling, safe storage and finally to their ultimate disposal.
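Tracking a chemical "from cradle to grave" implies that each container passes through a fixed sequence of recorded states, with no step skipped. The sketch below encodes that idea; the state names and the class are this example's assumptions, not a standard:

```python
# Ordered lifecycle states for a tracked container.
LIFECYCLE = ["purchased", "stored", "in_use", "awaiting_disposal", "disposed"]

class TrackedContainer:
    def __init__(self, chemical):
        self.chemical = chemical
        self.history = [LIFECYCLE[0]]  # every container starts as "purchased"

    def advance(self, new_state):
        # Only the next state in the sequence is allowed, so a container
        # cannot be recorded as "disposed" without the intermediate steps.
        expected = LIFECYCLE[len(self.history)]
        if new_state != expected:
            raise ValueError(f"expected {expected!r}, got {new_state!r}")
        self.history.append(new_state)

drum = TrackedContainer("glutaraldehyde")
for state in ("stored", "in_use", "awaiting_disposal", "disposed"):
    drum.advance(state)
# drum.history now records the full chain from purchase to disposal.
```

Chemicals synthesized on site would simply enter the chain at a different initial state; the point is that disposal is reached only through a complete, auditable record.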
Conclusion
It is now recognized that there are thousands of potentially very toxic chemicals in the work environment of health care facilities; all occupational groups may be exposed; and the nature of the exposures are varied and complex. Nonetheless, with a systematic and comprehensive approach, with strong corporate commitment and a fully informed and involved workforce, chemical hazards can be managed and the risks associated with these chemicals controlled.