Wednesday, 02 March 2011 16:21

Managing Chemical Hazards in Hospitals

The vast array of chemicals in hospitals, and the multitude of settings in which they occur, call for a systematic approach to their control. A chemical-by-chemical approach to the prevention of exposures and their deleterious outcomes is simply too inefficient to handle a problem of this scope. Moreover, as noted in the article “Overview of chemical hazards in health care”, many chemicals in the hospital environment have been inadequately studied; new chemicals are constantly being introduced and for others, even some that have become quite familiar (e.g., gloves made of latex), new hazardous effects are only now becoming manifest. Thus, while it is useful to follow chemical-specific control guidelines, a more comprehensive approach is needed whereby individual chemical control policies and practices are superimposed on a strong foundation of general chemical hazard control.

The control of chemical hazards in hospitals must be based on classic principles of good occupational health practice. Because health care facilities are accustomed to approaching health through the medical model, which focuses on the individual patient and treatment rather than on prevention, special effort is required to ensure that the orientation for handling chemicals is indeed preventive and that measures are principally focused on the workplace rather than on the worker.

Environmental (or engineering) control measures are the key to the prevention of deleterious exposures: chemicals that are less toxic should replace more toxic ones, processes should be enclosed wherever possible and good ventilation is essential. In addition, each worker must be correctly trained in appropriate exposure prevention techniques. In fact, right-to-know legislation, as described below, requires that workers be informed of the hazards with which they work, as well as of the appropriate safety precautions. Secondary prevention at the level of the worker is the domain of medical services, which may include medical monitoring to ascertain whether health effects of exposure can be medically detected; it also consists of prompt and appropriate medical intervention in the event of accidental exposure.

While all means to prevent or minimize exposures should be implemented, if exposure does occur (e.g., a chemical is spilled), procedures must be in place to ensure prompt and appropriate response to prevent further exposure.

Applying the General Principles of Chemical Hazard Control in the Hospital Environment

The first step in hazard control is hazard identification. This, in turn, requires a knowledge of the physical properties, chemical constituents and toxicological properties of the chemicals in question. Material safety data sheets (MSDSs), which are becoming increasingly available by legal requirement in many countries, list such properties. The vigilant occupational health practitioner, however, should recognize that the MSDS may be incomplete, particularly with respect to long-term effects or effects of low-dose chronic exposure. Hence, a literature search may be contemplated to supplement the MSDS material, when appropriate.

The second step in controlling a hazard is characterizing the risk. Does the chemical pose a carcinogenic risk? Is it an allergen? A teratogen? Is it mainly short-term irritancy effects that are of concern? The answer to these questions will influence the way in which exposure is assessed.

The third step in chemical hazard control is to assess the actual exposure. Discussion with the health care workers who use the product in question is the most important element in this endeavour. Monitoring methods are necessary in some situations to ascertain that exposure controls are functioning properly. These may involve area sampling, with either grab samples or integrated samples, depending on the nature of the exposure; personal sampling; or, in some cases, as discussed below, medical monitoring, although the last is usually contemplated only as a last resort and as back-up to other means of exposure assessment.
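Where integrated or personal sampling is used, results are conventionally reduced to an 8-hour time-weighted average (TWA) for comparison with the applicable occupational exposure limit. A minimal sketch of that calculation follows; the sampling values and the limit shown are hypothetical:

```python
# Hypothetical personal-sampling results for one worker's shift:
# (concentration in ppm, duration in hours) for each sampling period.
samples = [(0.8, 2.0), (2.5, 1.5), (0.3, 4.5)]

# 8-hour time-weighted average: sum of concentration x duration,
# divided by the standard 8-hour reference period.
twa = sum(c * t for c, t in samples) / 8.0

# Hypothetical occupational exposure limit for the substance, in ppm.
exposure_limit = 1.0

print(f"8-h TWA: {twa:.2f} ppm")
print("Exposure controls warrant review" if twa > exposure_limit
      else "Below the exposure limit")
```

Note that a TWA below the limit does not rule out concern: the short high-concentration period in this example might still exceed a short-term or ceiling limit, which is why the nature of the exposure determines the sampling strategy.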

Once the properties of the chemical product in question are known, and the nature and extent of exposure are assessed, a determination can be made as to the degree of risk. This generally requires that at least some dose-response information be available.

After evaluating the risk, the next series of steps is, of course, to control the exposure, so as to eliminate or at least minimize the risk. This, first and foremost, involves applying the general principles of exposure control.

Organizing a Chemical Control Programme in Hospitals

The traditional obstacles

The implementation of adequate occupational health programmes in health care facilities has lagged behind the recognition of the hazards. Labour relations are increasingly forcing hospital management to look at all aspects of their benefits and services to employees, as hospitals are no longer tacitly exempt by custom or privilege. Legislative changes are now compelling hospitals in many jurisdictions to implement control programmes.

However, obstacles remain. The preoccupation of the hospital with patient care, emphasizing treatment rather than prevention, and the staff’s ready access to informal “corridor consultation”, have hindered the rapid implementation of control programmes. The fact that laboratory chemists, pharmacists and a host of medical scientists with considerable toxicological expertise are heavily represented in management has, in general, not served to hasten the development of programmes. The question may be asked, “Why do we need an occupational hygienist when we have all these toxicology experts?” To the extent that changes in procedures threaten to have an impact on the tasks and services provided by these highly skilled personnel, the situation may be made worse: “We cannot eliminate the use of Substance X as it is the best bactericide around.” Or, “If we follow the procedure that you are recommending, patient care will suffer.” Moreover, the “we don’t need training” attitude is commonplace among the health care professions and hinders the implementation of the essential components of chemical hazard control. Internationally, the climate of cost constraint in health care is clearly also an obstacle.

Another problem of particular concern in hospitals is preserving the confidentiality of personal information about health care workers. While occupational health professionals should need only to indicate that Ms. X cannot work with chemical Z and needs to be transferred, curious clinicians are often more prone than their non-health care counterparts to press for the clinical explanation. Ms. X may have liver disease and the substance is a liver toxin; she may be allergic to the chemical; or she may be pregnant and the substance has potential teratogenic properties. While the need to alter the work assignment of particular individuals should not become routine, the confidentiality of the medical details should be protected whenever such an alteration is necessary.

Right-to-know legislation

Many jurisdictions around the world have implemented right-to-know legislation. In Canada, for example, the Workplace Hazardous Materials Information System (WHMIS) has revolutionized the handling of chemicals in industry. This country-wide system has three components: (1) the labelling of all hazardous substances with standardized labels indicating the nature of the hazard; (2) the provision of MSDSs listing the constituents, hazards and control measures for each substance; and (3) the training of workers to understand the labels and the MSDSs and to use the product safely.

Under WHMIS in Canada and OSHA’s Hazard Communication Standard in the United States, hospitals have been required to construct inventories of all chemicals on the premises so that those that are “controlled substances” can be identified and addressed according to the legislation. In the process of complying with the training requirements of these regulations, hospitals have had to engage occupational health professionals with appropriate expertise, and the spin-off benefits, particularly when bipartite train-the-trainer programmes were conducted, have included a new spirit of cooperation in addressing other health and safety concerns.
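The inventory requirement lends itself to simple tooling. The sketch below is purely illustrative (all product names, fields and records are hypothetical) of how a hospital-wide inventory might be screened to flag the controlled products that must carry standardized labels, have an MSDS on file and be covered by worker training:

```python
# Hypothetical hospital chemical inventory; every record and field
# name here is illustrative only.
inventory = [
    {"product": "glutaraldehyde solution", "department": "central supply", "controlled": True},
    {"product": "normal saline", "department": "patient areas", "controlled": False},
    {"product": "xylene", "department": "histology laboratory", "controlled": True},
]

# Controlled products are those the legislation requires to carry
# standardized labels, have an MSDS on file and be covered by training.
needs_action = [r for r in inventory if r["controlled"]]

for r in needs_action:
    print(f'{r["product"]} ({r["department"]}): label, MSDS and training required')
```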

Corporate commitment and the role of joint health and safety committees

The most important element in the success of any occupational health and safety programme is corporate commitment to ensure its successful implementation. Policies and procedures regarding the safe handling of chemicals in hospitals must be written, discussed at all levels within the organization and adopted and enforced as corporate policy. Chemical hazard control in hospitals should be addressed by general as well as specific policies. For example, there should be a policy on responsibility for the implementation of right-to-know legislation that clearly outlines each party’s obligations and the procedures to be followed by individuals at each level of the organization (e.g., who chooses the trainers, how much work time is allowed for preparation and provision of training, to whom non-attendance should be reported and so on). There should be a generic spill clean-up policy indicating the responsibility of the worker and the department where the spill occurred, the indications and protocol for notifying the emergency response team (including the appropriate in-hospital and external authorities and experts), follow-up provisions for exposed workers and so on. Specific policies should also exist regarding the handling, storage and disposal of specific classes of toxic chemicals.

Not only is it essential that management be strongly committed to these programmes; the workforce, through its representatives, must also be actively involved in the development and implementation of policies and procedures. Some jurisdictions have legislatively mandated joint (labour-management) health and safety committees that meet at a minimum prescribed interval (bimonthly in the case of Manitoba hospitals), have written operating procedures and keep detailed minutes. Indeed, in recognition of the importance of these committees, the Manitoba Workers’ Compensation Board (WCB) provides a rebate on WCB premiums paid by employers based on the successful functioning of these committees. To be effective, the members must be appropriately chosen: specifically, they must be elected by their peers, be knowledgeable about the legislation, have appropriate education and training and be allotted sufficient time to conduct not only incident investigations but also regular inspections. With respect to chemical control, the joint committee has both a proactive and a reactive role: assisting in setting priorities and developing preventive policies, as well as serving as a sounding board for workers who are not satisfied that all appropriate controls are being implemented.

The multidisciplinary team

As noted above, the control of chemical hazards in hospitals requires a multidisciplinary endeavour. At a minimum, it requires occupational hygiene expertise. Generally hospitals have maintenance departments that have within them the engineering and physical plant expertise to assist a hygienist in determining whether workplace alterations are necessary. Occupational health nurses also play a prominent role in evaluating the nature of concerns and complaints, and in assisting an occupational physician in ascertaining whether clinical intervention is warranted. In hospitals, it is important to recognize that numerous health care professionals have expertise that is quite relevant to the control of chemical hazards. It would be unthinkable to develop policies and procedures for the control of laboratory chemicals without the involvement of lab chemists, for example, or procedures for handling anti-neoplastic drugs without the involvement of the oncology and pharmacology staff. While it is wise for occupational health professionals in all industries to consult with line staff prior to implementing control measures, it would be an unforgivable error to fail to do so in health care settings.

Data collection

As in all industries, and with all hazards, data need to be compiled both to help in setting priorities and to evaluate the success of programmes. With respect to data collection on chemical hazards in hospitals, minimally, data need to be kept regarding accidental exposures and spills (so that these areas can receive special attention to prevent recurrences); the nature of concerns and complaints should be recorded (e.g., unusual odours); and clinical cases need to be tabulated, so that, for example, an increase in dermatitis from a given area or occupational group could be identified.
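A simple tabulation is often all that is needed to make such a cluster visible. The sketch below (all records are hypothetical) counts clinical reports by department so that, for instance, a run of dermatitis cases in one area stands out:

```python
from collections import Counter

# Hypothetical incident log: (department, type of report) records
# such as a hospital occupational health service might keep.
reports = [
    ("laboratory", "spill"),
    ("oncology pharmacy", "accidental exposure"),
    ("central supply", "unusual odour"),
    ("laboratory", "dermatitis"),
    ("laboratory", "dermatitis"),
    ("housekeeping", "dermatitis"),
]

# Tabulate clinical cases by department so that a cluster
# (e.g., dermatitis concentrated in one area) stands out.
dermatitis_by_dept = Counter(
    dept for dept, kind in reports if kind == "dermatitis")

for dept, n in dermatitis_by_dept.most_common():
    print(f"{dept}: {n} dermatitis report(s)")
```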

Cradle-to-grave approach

Increasingly, hospitals are becoming cognizant of their obligation to protect the environment. Not only the workplace hazardous properties, but also the environmental properties of chemicals are being taken into consideration. Moreover, it is no longer acceptable to pour hazardous chemicals down the drain or release noxious fumes into the air. A chemical control programme in hospitals must, therefore, be capable of tracking chemicals from their purchase and acquisition (or, in some cases, synthesis on site), through workplace handling and safe storage, to their ultimate disposal.

Conclusion

It is now recognized that there are thousands of potentially very toxic chemicals in the work environment of health care facilities; all occupational groups may be exposed; and the nature of the exposures is varied and complex. Nonetheless, with a systematic and comprehensive approach, strong corporate commitment and a fully informed and involved workforce, chemical hazards can be managed and the risks associated with these chemicals controlled.

 


Overview of Chemical Hazards in Health Care

Exposure to potentially hazardous chemicals is a fact of life for health care workers. They are encountered in the course of diagnostic and therapeutic procedures, in laboratory work, in preparation and clean-up activities and even in emanations from patients, to say nothing of the “infrastructure” activities common to all worksites such as cleaning and housekeeping, laundry, painting, plumbing and maintenance work. Despite the constant threat of such exposures and the large numbers of workers involved—in most countries, health care invariably is one of the most labour-intensive industries—this problem has received scant attention from those involved in occupational health and safety research and regulation. The great majority of chemicals in common use in hospitals and other health care settings are not specifically covered under national and international occupational exposure standards. In fact, very little effort has been made to date to identify the chemicals most frequently used, much less to study the mechanisms and intensity of exposures to them and the epidemiology of the effects on the health care workers involved.

This may be changing in the many jurisdictions in which right-to-know laws, such as the Canadian Workplace Hazardous Materials Information System (WHMIS), are being legislated and enforced. These laws require that workers be informed of the name and nature of the chemicals to which they may be exposed on the job. They have introduced a daunting challenge to administrators in the health care industry, who must now turn to occupational health and safety professionals to undertake a de novo inventory of the identity and location of the thousands of chemicals to which their workers may be exposed.

The wide range of professions and jobs and the complexity of their interplay in the health care workplace require unique diligence and astuteness on the part of those charged with such occupational safety and health responsibilities. A significant complication is the traditional altruistic focus on the care and well-being of the patients, even at the expense of the health and well-being of those providing the services. Another complication is the fact that these services are often required at times of great urgency when important preventive and protective measures may be forgotten or deliberately disregarded.

Categories of Chemical Exposures in the Health Care Setting

Table 1 lists the categories of chemicals encountered in the health care workplace. Laboratory workers are exposed to the broad range of chemical reagents they employ, histology technicians to dyes and stains, pathologists to fixative and preservative solutions (formaldehyde is a potent sensitizer), and asbestos is a hazard to workers making repairs or renovations in older health care facilities.

Table 1. Categories of chemicals used in health care

Types of chemicals and the locations where they are most likely to be found:

Disinfectants: patient areas
Sterilants: central supply; operating theatres; physician offices; rehabilitation centres
Medicines: patient areas; pharmacy
Laboratory reagents: laboratories
Housekeeping/maintenance chemicals: hospital-wide
Food ingredients and products: kitchen; cafeteria
Pesticides: hospital-wide

Even when liberally applied in combating and preventing the spread of infectious agents, detergents, disinfectants and sterilants offer relatively little danger to patients, whose exposure is usually of brief duration. Although individual doses at any one time may be relatively low, their cumulative effect over the course of a working lifetime may nevertheless constitute a significant risk to health care workers.

Occupational exposures to drugs can cause allergic reactions, such as have been reported over many years among workers administering penicillin and other antibiotics, or much more serious problems with such highly carcinogenic agents as the antineoplastic drugs. The contacts may occur during the preparation or administration of the dose for injection or in cleaning up after it has been administered. Although the danger of this mechanism of exposure had been known for many years, it was fully appreciated only after mutagenic activity was detected in the urine of nurses administering antineoplastic agents.

Another mechanism of exposure is the administration of drugs as aerosols for inhalation. The use of antineoplastic agents, pentamidine and ribavirin by this route has been studied in some detail, but there has been, as of this writing, no report of a systematic study of aerosols as a source of toxicity among health care workers.

Anaesthetic gases represent another class of drugs to which many health care workers are exposed. These chemicals are associated with a variety of biological effects, the most obvious of which are on the nervous system. Recently, there have been reports suggesting that repeated exposures to anaesthetic gases may, over time, have adverse reproductive effects among both male and female workers. It should be recognized that appreciable amounts of waste anaesthetic gases may accumulate in the air in recovery rooms as the gases retained in the blood and other tissues of patients are eliminated by exhalation.

Chemical disinfecting and sterilizing agents are another important category of potentially hazardous chemical exposures for health care workers. Used primarily in the sterilization of non-disposable equipment, such as surgical instruments and respiratory therapy apparatus, chemical sterilants such as ethylene oxide are effective because they interact with infectious agents and destroy them. Alkylation, whereby methyl or other alkyl groups bind chemically with protein-rich entities such as the amino groups in haemoglobin and DNA, is a powerful biological effect. In intact organisms, this may not cause direct toxicity but should be considered potentially carcinogenic until proven otherwise. Ethylene oxide itself, however, is a known carcinogen and is associated with a variety of adverse health effects, as discussed elsewhere in the Encyclopaedia. The potent alkylation capability of ethylene oxide, probably the most widely used sterilant for heat-sensitive materials, has led to its use as a classic probe in studying molecular structure.

For years, the methods used in the chemical sterilization of instruments and other surgical materials have carelessly and needlessly put many health care workers at risk. Not even rudimentary precautions were taken to prevent or limit exposures. For example, it was the common practice to leave the door of the sterilizer partially open to allow the escape of excess ethylene oxide, or to leave freshly-sterilized materials uncovered and open to the room air until enough had been assembled to make efficient use of the aerator unit.

The fixation of metallic or ceramic replacement parts, so common in dentistry and orthopaedic surgery, may be a source of potentially hazardous chemical exposures, such as to silica. These parts and the acrylic resins often used to glue them in place are usually biologically inert, but health care workers may be exposed to the monomers and other chemical reactants used during the preparation and application process. These chemicals are often sensitizing agents and have been associated with chronic effects in animals. The preparation of mercury amalgam fillings can lead to mercury exposure. Spills and the spread of mercury droplets are a particular concern, since these may linger unnoticed in the work environment for many years. The acute exposure of patients to them appears to be entirely safe, but the long-term health implications of the repeated exposure of health care workers have not been adequately studied.

Finally, such medical techniques as laser surgery, electro-cauterization and use of other radiofrequency and high-energy devices can lead to the thermal degradation of tissues and other substances resulting in the formation of potentially toxic smoke and fumes. For example, the cutting of “plaster” casts made of polyester resin impregnated bandages has been shown to release potentially toxic fumes.

The hospital as a “mini-municipality”

A listing of the varied jobs and tasks performed by the personnel of hospitals and other large health care facilities might well serve as a table of contents for the commercial listings of a telephone directory for a sizeable municipality. All of these entail chemical exposures intrinsic to the particular work activity in addition to those that are peculiar to the health care environment. Thus, painters and maintenance workers are exposed to solvents and lubricants. Plumbers and others engaged in soldering are exposed to fumes of lead and flux. Housekeeping workers are exposed to soaps, detergents and other cleansing agents, pesticides and other household chemicals. Cooks may be exposed to potentially carcinogenic fumes in broiling or frying foods and to oxides of nitrogen from the use of natural gas as fuel. Even clerical workers may be exposed to the toners used in copiers and printers. The occurrence and effects of such chemical exposures are detailed elsewhere in this Encyclopaedia.

One chemical exposure that is diminishing in importance as more and more health care workers (HCWs) quit smoking and more health care facilities become “smoke-free” is “second-hand” tobacco smoke.

Unusual chemical exposures in health care

Table 2 presents a partial listing of the chemicals most commonly encountered in health care workplaces. Whether or not they will be toxic will depend on the nature of the chemical and its biological proclivities, the manner, intensity and duration of the exposure, the susceptibilities of the exposed worker, and the speed and effectiveness of any countermeasures that may have been attempted. Unfortunately, a compendium of the nature, mechanisms, effects and treatment of chemical exposures of health care workers has not yet been published.

There are some unique exposures in the health care workplace that substantiate the dictum that a high level of vigilance is necessary to protect workers fully from such risks. For example, it was recently reported that health care workers had been overcome by toxic fumes emanating from a patient being treated for a massive chemical exposure. Cases of cyanide poisoning arising from patient emissions have also been reported. In addition to the direct toxicity of waste anaesthetic gases to anaesthetists and other personnel in operating theatres, there is the often unrecognized problem created by the frequent use in such areas of high-energy sources, which can transform the anaesthetic gases into free radicals, a form in which they are potentially carcinogenic.

Table 2. Chemicals cited in the Hazardous Substances Data Bank (HSDB)

The following chemicals are listed in the HSDB as being used in some area of the health care environment. The HSDB is produced by the US National Library of Medicine and is a compilation of more than 4,200 chemicals with known toxic effects in commercial use. Absence of a chemical from the list does not imply that it is not toxic; it means only that the chemical is not present in the HSDB.

Use listed in the HSDB, with chemical name and CAS number*:

Disinfectants; antiseptics:
benzalkonium chloride (8001-54-5)
borax (1303-96-4)
boric acid (10043-35-3)
cetyl pyridinium chloride (123-03-5)
m-cresol (108-39-4)
2-chlorophenol (95-57-8)
4-chlorophenol (106-48-9)
hexachlorophene (70-30-4)
methyl ethyl ketone (78-93-3)
phenol (108-95-2)
tri-m-cresyl phosphate (lysol) (563-04-2)

Sterilants:
beta-propiolactone (57-57-8)
crotonaldehyde (4170-30-3)
ethylene oxide (75-21-8)
formaldehyde (50-00-0)
glutaraldehyde (111-30-8)

Laboratory reagents: biological stains:
2,4-xylidine (magenta-base) (3248-93-9)
acridine red (2465-29-4)
basic parafuchsine (569-61-9)
basic magenta (3248-93-9)
CI acid blue 9 (129-17-9)
CI acid green 3 (4680-78-8)
CI acid red 14 (3567-69-9)
CI direct blue 1 (2429-74-5)
CI direct red 28 (573-58-0)
CI direct yellow 11 (1325-37-7)
curcumin (458-37-7)
haematoxylin (517-28-2)
hexamethyl-p-rosaniline chloride (violet) (548-62-9)
malachite green (569-64-2)
osmium tetroxide (20816-12-0)
ponceau 3R (3564-09-8)

* Chemical Abstracts identification number.

 


Transmission of Mycobacterium tuberculosis is a recognized risk in health care facilities. The magnitude of the risk to HCWs varies considerably by the type of health care facility, the prevalence of TB in the community, the patient population served, the HCW’s occupational group, the area of the health care facility in which the HCW works and the effectiveness of TB infection-control interventions. The risk may be higher in areas where patients with TB are provided care before diagnosis and initiation of TB treatment and isolation precautions (e.g., in clinic waiting areas and emergency departments) or where diagnostic or treatment procedures that stimulate coughing are performed. Nosocomial transmission of M. tuberculosis has been associated with close contact with persons who have infectious TB and with the performance of certain procedures (e.g., bronchoscopy, endotracheal intubation and suctioning, open abscess irrigation and autopsy). Sputum induction and aerosol treatments that induce coughing may also increase the potential for transmission of M. tuberculosis. Personnel in health care facilities should be particularly alert to the need for preventing transmission of M. tuberculosis in those facilities in which immunocompromised persons (e.g., HIV-infected persons) work or receive care—especially if cough-inducing procedures, such as sputum induction and aerosolized pentamidine treatments, are being performed.

Transmission and Pathogenesis

M. tuberculosis is carried in airborne particles, or droplet nuclei, that can be generated when persons who have pulmonary or laryngeal TB sneeze, cough, speak or sing. The particles are an estimated 1 to 5 μm in size and normal air currents can keep them airborne for prolonged time periods and spread them throughout a room or building. Infection occurs when a susceptible person inhales droplet nuclei containing M. tuberculosis and these droplet nuclei traverse the mouth or nasal passages, upper respiratory tract and bronchi to reach the alveoli of the lungs. Once in the alveoli, the organisms are taken up by alveolar macrophages and spread throughout the body. Usually within two to ten weeks after initial infection with M. tuberculosis, the immune response limits further multiplication and spread of the tubercle bacilli; however, some of the bacilli remain dormant and viable for many years. This condition is referred to as latent TB infection. Persons with latent TB infection usually have positive purified protein derivative (PPD)-tuberculin skin-test results, but they do not have symptoms of active TB, and they are not infectious.

In general, persons who become infected with M. tuberculosis have approximately a 10% risk for developing active TB during their lifetimes. This risk is greatest during the first two years after infection. Immunocompromised persons have a greater risk for the progression of latent TB infection to active TB disease; HIV infection is the strongest known risk factor for this progression. Persons with latent TB infection who become co-infected with HIV have approximately an 8 to 10% risk per year for developing active TB. HIV-infected persons who are already severely immunosuppressed and who become newly infected with M. tuberculosis have an even greater risk for developing active TB.
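Annual risks of this magnitude compound substantially over time. As a rough illustration of the 8 to 10% per-year figures just quoted (a sketch only, assuming a constant and independent annual risk, which is a simplification):

```python
def cumulative_risk(annual_risk: float, years: int) -> float:
    """Probability of developing active TB within `years`,
    assuming a constant, independent risk each year."""
    return 1.0 - (1.0 - annual_risk) ** years

# 8 to 10% per-year risk quoted for persons with latent TB
# infection who become co-infected with HIV.
for p in (0.08, 0.10):
    print(f"annual risk {p:.0%}: "
          f"5-year cumulative risk {cumulative_risk(p, 5):.0%}")
```

Under these simplified assumptions, a 10% annual risk compounds to roughly a 41% cumulative risk over five years, which helps explain the rapid progression seen in the outbreaks described below among HIV-infected persons.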

The probability that a person who is exposed to M. tuberculosis will become infected depends primarily on the concentration of infectious droplet nuclei in the air and the duration of exposure. Characteristics of the TB patient that enhance transmission include:

  • disease in the lungs, airways or larynx
  • presence of cough or other forceful expiratory manoeuvres
  • presence of acid-fast bacilli (AFB) in the sputum
  • failure of the patient to cover the mouth and nose when coughing or sneezing
  • presence of cavitation on chest radiograph
  • inappropriate or short duration of chemotherapy
  • administration of procedures that can induce coughing or cause aerosolization of M. tuberculosis (e.g., sputum induction).

 

Environmental factors that enhance the likelihood of transmission include:

  • exposure in relatively small, enclosed spaces
  • inadequate local or general ventilation that results in insufficient dilution and/or removal of infectious droplet nuclei
  • recirculation of air containing infectious droplet nuclei.

 

Characteristics of the persons exposed to M. tuberculosis that may affect the risk for becoming infected are not as well defined. In general, persons who have been infected previously with M. tuberculosis may be less susceptible to subsequent infection. However, reinfection can occur among previously infected persons, especially if they are severely immunocompromised. Vaccination with bacille Calmette-Guérin (BCG) probably does not affect the risk for infection; rather, it decreases the risk for progressing from latent TB infection to active TB. Finally, although it is well established that HIV infection increases the likelihood of progressing from latent TB infection to active TB, it is unknown whether HIV infection increases the risk for becoming infected if exposed to M. tuberculosis.

Epidemiology

Several TB outbreaks among persons in health care facilities have been reported recently in the United States. Many of these outbreaks involved transmission of multidrug-resistant strains of M. tuberculosis to both patients and HCWs. Most of the patients and some of the HCWs were HIV-infected persons in whom new infection progressed rapidly to active disease. Mortality associated with those outbreaks was high (ranging from 43 to 93%). Furthermore, the interval between diagnosis and death was brief (median intervals ranged from 4 to 16 weeks). Factors contributing to these outbreaks included delayed diagnosis of TB, delayed recognition of drug resistance and delayed initiation of effective therapy (all of which resulted in prolonged infectiousness), delayed initiation and inadequate duration of TB isolation, inadequate ventilation in TB isolation rooms, lapses in TB isolation practices, inadequate precautions for cough-inducing procedures and lack of adequate respiratory protection.

Fundamentals of TB infection control

An effective TB infection-control programme requires early identification, isolation and effective treatment of persons who have active TB. The primary emphasis of the TB infection-control plan should be on achieving these three goals. In all health care facilities, particularly those in which persons who are at high risk for TB work or receive care, policies and procedures for TB control should be developed, reviewed periodically and evaluated for effectiveness to determine the actions necessary to minimize the risk for transmission of M. tuberculosis.

The TB infection-control programme should be based on a hierarchy of control measures. The first level of the hierarchy, and that which affects the largest number of persons, is using administrative measures intended primarily to reduce the risk for exposing uninfected persons to persons who have infectious TB. These measures include:

  • developing and implementing effective written policies and protocols to ensure the rapid identification, isolation, diagnostic evaluation and treatment of persons likely to have TB
  • implementing effective work practices among HCWs in the health care facility (e.g., correctly wearing respiratory protection and keeping doors to isolation rooms closed)
  • educating, training and counselling HCWs about TB
  • screening HCWs for TB infection and disease.

 

The second level of the hierarchy is the use of engineering controls to prevent the spread and reduce the concentration of infectious droplet nuclei. These controls include:

  • direct source control using local exhaust ventilation
  • controlling direction of airflow to prevent contamination of air in areas adjacent to the infectious source
  • diluting and removing contaminated air via general ventilation
  • air cleaning via air filtration or ultraviolet germicidal irradiation (UVGI).

 

The first two levels of the hierarchy minimize the number of areas in the health care facility where exposure to infectious TB may occur, and they reduce, but do not eliminate, the risk in those few areas where exposure to M. tuberculosis can still occur (e.g., rooms in which patients with known or suspected infectious TB are being isolated and treatment rooms in which cough-inducing or aerosol-generating procedures are performed on such patients). Because persons entering such rooms may be exposed to M. tuberculosis, the third level of the hierarchy is the use of personal respiratory protective equipment in these and certain other situations in which the risk for infection with M. tuberculosis may be relatively higher.

Specific measures to reduce the risk for transmission of M. tuberculosis include the following:

1.    Assigning to specific persons in the health care facility the supervisory responsibility for designing, implementing, evaluating and maintaining the TB infection-control programme.

2.    Conducting a risk assessment to evaluate the risk for transmission of M. tuberculosis in all areas of the health care facility, developing a written TB infection-control programme based on the risk assessment and periodically repeating the risk assessment to evaluate the effectiveness of the TB infection-control programme. TB infection-control measures for each health care facility should be based on a careful assessment of the risk for transmission of M. tuberculosis in that particular setting. The first step in developing the TB infection-control programme should be to conduct a baseline risk assessment to evaluate the risk for transmission of M. tuberculosis in each area and occupational group in the facility. Appropriate infection-control interventions can then be developed on the basis of actual risk. Risk assessments should be performed for all inpatient and outpatient settings (e.g., medical and dental offices). Classification of risk for a facility, for a specific area and for a specific occupational group should be based on the profile of TB in the community, the number of infectious TB patients admitted to the area or ward, or the estimated number of infectious TB patients to whom HCWs in an occupational group may be exposed and the results of analysis of HCW PPD test conversions (where applicable) and possible person-to-person transmission of M. tuberculosis. Regardless of risk level, the management of patients with known or suspected infectious TB should not vary. However, the index of suspicion for infectious TB among patients, the frequency of HCW PPD skin testing, the number of TB isolation rooms and other factors will depend on the level of risk for transmission of M. tuberculosis in the facility, area or occupational group.

3.    Developing, implementing and enforcing policies and protocols to ensure early identification, diagnostic evaluation and effective treatment of patients who may have infectious TB. A diagnosis of TB may be considered for any patient who has a persistent cough (i.e., a cough lasting for longer than 3 weeks) or other signs or symptoms compatible with active TB (e.g., bloody sputum, night sweats, weight loss, anorexia or fever). However, the index of suspicion for TB will vary in different geographic areas and will depend on the prevalence of TB and other characteristics of the population served by the facility. The index of suspicion for TB should be very high in geographic areas or among groups of patients in which the prevalence of TB is high. Appropriate diagnostic measures should be conducted and TB precautions implemented for patients in whom active TB is suspected.

4.    Providing prompt triage for and appropriate management of patients in the outpatient setting who may have infectious TB. Triage of patients in ambulatory-care settings and emergency departments should include vigorous efforts to promptly identify patients who have active TB. HCWs who are the first points of contact in facilities that serve populations at risk for TB should be trained to ask questions that will facilitate identification of patients with signs and symptoms suggestive of TB. Patients with signs or symptoms suggestive of TB should be evaluated promptly to minimize the amount of time they are in ambulatory-care areas. TB precautions should be followed while the diagnostic evaluation is being conducted for these patients. TB precautions in the ambulatory-care setting should include placing these patients in a separate area apart from other patients and not in open waiting areas (ideally, in a room or enclosure meeting TB isolation requirements); giving these patients surgical masks to wear and instructing them to keep their masks on; and giving them tissues and instructing them to cover their mouths and noses with the tissues when coughing or sneezing. Surgical masks are designed to prevent the respiratory secretions of the person wearing the mask from entering the air. When not in a TB isolation room, patients suspected of having TB should wear surgical masks to reduce the expulsion of droplet nuclei into the air. These patients do not need to wear particulate respirators, which are designed to filter the air before it is inhaled by the person wearing the mask. Patients suspected of having or known to have TB should never wear a respirator that has an exhalation valve, because the device would provide no barrier to the expulsion of droplet nuclei into the air.

5.    Promptly initiating and maintaining TB isolation for persons who may have infectious TB and who are admitted to the inpatient setting. In hospitals and other inpatient facilities, any patient suspected of having or known to have infectious TB should be placed in a TB isolation room that has currently recommended ventilation characteristics (see below). Written policies for initiating isolation should specify the indications for isolation, the person(s) authorized to initiate and discontinue isolation, the isolation practices to follow, the monitoring of isolation, the management of patients who do not adhere to isolation practices and the criteria for discontinuing isolation.

6.    Effectively planning arrangements for discharge. Before a TB patient is discharged from the health care facility, the facility’s staff and public health authorities should collaborate to ensure continuation of therapy. Discharge planning in the health care facility should include, at a minimum, a confirmed outpatient appointment with the provider who will manage the patient until the patient is cured, sufficient medication to take until the outpatient appointment and placement into case management (e.g., directly observed therapy (DOT)) or outreach programmes of the public health department. These plans should be initiated and in place before the patient’s discharge.

7.    Developing, installing, maintaining and evaluating ventilation and other engineering controls to reduce the potential for airborne exposure to M. tuberculosis. Local exhaust ventilation is a preferred source control technique, and it is often the most efficient way to contain airborne contaminants because it captures these contaminants near their source before they can disperse. Therefore, the technique should be used, if feasible, wherever aerosol-generating procedures are performed. Two basic types of local exhaust devices use hoods: the enclosing type, in which the hood either partially or fully encloses the infectious source, and the exterior type, in which the infectious source is near but outside the hood. Fully enclosed hoods, booths or tents are always preferable to exterior types because of their superior ability to prevent contaminants from escaping into the HCW’s breathing zone. General ventilation can be used for several purposes, including diluting and removing contaminated air, controlling airflow patterns within rooms and controlling the direction of airflow throughout a facility. General ventilation maintains air quality by two processes: dilution and removal of airborne contaminants. Uncontaminated supply air mixes with the contaminated room air (i.e., dilution), which is subsequently removed from the room by the exhaust system. These processes reduce the concentration of droplet nuclei in the room air. Recommended general ventilation rates for health care facilities are usually expressed in number of air changes per hour (ACH).

This number is the ratio of the volume of air entering the room per hour to the room volume and is equal to the exhaust airflow (Q, in cubic feet per minute) multiplied by 60 and divided by the room volume (V, in cubic feet) (i.e., ACH = (Q × 60) / V). For the purposes of reducing the concentration of droplet nuclei, TB isolation and treatment rooms in existing health care facilities should have an airflow of greater than 6 ACH. Where feasible, this airflow rate should be increased to at least 12 ACH by adjusting or modifying the ventilation system or by using auxiliary means (e.g., recirculation of air through fixed HEPA filtration systems or portable air cleaners). New construction or renovation of existing health care facilities should be designed so that TB isolation rooms achieve an airflow of at least 12 ACH. The general ventilation system should be designed and balanced so that air flows from less contaminated (i.e., more clean) to more contaminated (less clean) areas. For example, air should flow from corridors into TB isolation rooms to prevent spread of contaminants to other areas. In some special treatment rooms in which operative and invasive procedures are performed, the direction of airflow is from the room to the hallway to provide cleaner air during these procedures. Cough-inducing or aerosol-generating procedures (e.g., bronchoscopy and irrigation of tuberculous abscesses) should not be performed in rooms with this type of airflow on patients who may have infectious TB. HEPA filters may be used in a number of ways to reduce or eliminate infectious droplet nuclei from room air or exhaust.
These methods include placement of HEPA filters:

  • in exhaust ducts discharging air from booths or enclosures into the surrounding room
  • in ducts or in ceiling- or wall-mounted units for recirculation of air within an individual room (fixed recirculation systems)
  • in portable air cleaners
  • in exhaust ducts to remove droplet nuclei from air being discharged to the outside, either directly or through ventilation equipment
  • in ducts discharging air from the TB isolation room into the general ventilation system.

In any application, HEPA filters should be installed carefully and maintained meticulously to ensure adequate functioning. For general use areas in which the risk for transmission of M. tuberculosis is relatively high, ultraviolet germicidal irradiation (UVGI) lamps may be used as an adjunct to ventilation for reducing the concentration of infectious droplet nuclei, although the effectiveness of such units has not been evaluated adequately. Ultraviolet (UV) units can be installed in a room or corridor to irradiate the air in the upper portion of the room, or they can be installed in ducts to irradiate air passing through the ducts.
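The ACH arithmetic above can be checked with a short script. The room dimensions and exhaust airflow used here are hypothetical values chosen for illustration, not recommendations:

```python
def air_changes_per_hour(exhaust_cfm: float, room_volume_ft3: float) -> float:
    """ACH = (Q x 60) / V, where Q is the exhaust airflow in cubic feet
    per minute and V is the room volume in cubic feet."""
    return exhaust_cfm * 60 / room_volume_ft3

# Hypothetical isolation room: 10 ft x 12 ft floor area, 8 ft ceiling
room_volume = 10 * 12 * 8  # 960 cubic feet

ach = air_changes_per_hour(exhaust_cfm=160, room_volume_ft3=room_volume)
print(ach)  # 10.0: above the 6 ACH minimum for existing rooms, but below
            # the 12 ACH target for new construction or renovation
```

For this hypothetical room, reaching the 12 ACH target would require raising the exhaust flow to 192 cfm or supplementing the system with auxiliary means such as recirculating HEPA filtration.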

8.    Developing, implementing, maintaining and evaluating a respiratory protection programme. Personal respiratory protection (i.e., respirators) should be used by persons entering rooms in which patients with known or suspected infectious TB are being isolated, persons present during cough-inducing or aerosol-generating procedures performed on such patients and persons in other settings where administrative and engineering controls are not likely to protect them from inhaling infectious airborne droplet nuclei. These other settings include transporting patients who may have infectious TB in emergency transport vehicles and providing urgent surgical or dental care to patients who may have infectious TB before a determination has been made that the patient is non-infectious.

9.    Educating and training HCWs about TB, effective methods for preventing transmission of M. tuberculosis and the benefits of medical screening programmes. All HCWs, including physicians, should receive education regarding TB that is relevant to persons in their particular occupational group. Ideally, training should be conducted before initial assignment and the need for additional training should be re-evaluated periodically (e.g., once a year). The level and detail of this education will vary according to the HCW’s work responsibilities and the level of risk in the facility (or area of the facility) in which the HCW works. However, the programme may include the following elements:

  • the basic concepts of M. tuberculosis transmission, pathogenesis and diagnosis,
    including information concerning the difference between latent TB infection and active
    TB disease, the signs and symptoms of TB and the possibility of reinfection
  • the potential for occupational exposure to persons who have infectious TB in the
    health care facility, including information concerning the prevalence of TB in the
    community and facility, the ability of the facility to properly isolate patients who have
    active TB, and situations with increased risk for exposure to M. tuberculosis
  • the principles and practices of infection control that reduce the risk for transmission of
    M. tuberculosis, including information concerning the hierarchy of TB infection-control
    measures and the written policies and procedures of the facility. Site-specific control
    measures should be provided to HCWs working in areas that require control
    measures in addition to those of the basic TB infection-control programme.
  • the importance of proper maintenance for engineering controls (e.g., cleaning UVGI lamps and ensuring negative pressure in TB isolation rooms)
  • the purpose of PPD skin testing, the significance of a positive PPD test result and the importance of participating in the skin-test programme
  • the principles of preventive therapy for latent TB infection; these principles include the indications, use, effectiveness and the potential adverse effects of the drugs
  • the HCW’s responsibility to seek prompt medical evaluation if a PPD test conversion
    occurs or if symptoms develop that could be caused by TB. Medical evaluation will
    enable HCWs who have TB to receive appropriate therapy and will help to prevent
    transmission of M. tuberculosis to patients and other HCWs.
  • the principles of drug therapy for active TB
  • the importance of notifying the facility if the HCW is diagnosed with active TB so that contact investigation procedures can be initiated
  • the responsibilities of the facility to maintain the confidentiality of the HCW while
    ensuring that the HCW who has TB receives appropriate therapy and is non-
    infectious before returning to duty
  • the higher risks associated with TB infection in persons who have HIV infection or
    other causes of severely impaired cell-mediated immunity, including (a) the more
    frequent and rapid development of clinical TB after infection with M. tuberculosis, (b)
    the differences in the clinical presentation of disease and (c) the high mortality rate associated with multiple drug resistant-TB in such persons
  • the potential development of cutaneous anergy as immune function (as measured by CD4+ T-lymphocyte counts) declines
  • information regarding the efficacy and safety of BCG vaccination and the principles of PPD screening among BCG recipients
  • the facility’s policy on voluntary work reassignment options for immunocompromised HCWs.

 

10.    Developing and implementing a programme for routine periodic counselling and screening of HCWs for active TB and latent TB infection. A TB counselling, screening and prevention programme for HCWs should be established to protect both HCWs and patients. HCWs who have positive PPD test results, PPD test conversions or symptoms suggestive of TB should be identified, evaluated to rule out a diagnosis of active TB and started on therapy or preventive therapy if indicated. In addition, the results of the HCW PPD screening programme will contribute to evaluation of the effectiveness of current infection-control practices. Because of the increased risk for rapid progression from latent TB infection to active TB in HIV-infected or otherwise severely immunocompromised persons, all HCWs should know if they have a medical condition or are receiving a medical treatment that may lead to severely impaired cell-mediated immunity. HCWs who may be at risk for HIV infection should know their HIV status (i.e., they should be encouraged to voluntarily seek counselling and testing for HIV antibody status). Existing guidelines for counselling and testing should be followed routinely. Knowledge of these conditions allows the HCW to seek the appropriate preventive measures and to consider voluntary work reassignments.

11.    All HCWs should be informed about the need to follow existing recommendations for infection control to minimize the risk for exposure to infectious agents; implementation of these recommendations will greatly reduce the risk for occupational infections among HCWs. All HCWs should also be informed about the potential risks to severely immunocompromised persons associated with caring for patients who have some infectious diseases, including TB. It should be emphasized that limiting exposure to TB patients is the most protective measure that severely immunosuppressed HCWs can take to avoid becoming infected with M. tuberculosis. HCWs who have severely impaired cell-mediated immunity and who may be exposed to M. tuberculosis may consider a change in job setting to avoid such exposure. HCWs should be advised that in many jurisdictions severely immunocompromised HCWs have the legal option to transfer voluntarily to areas and work activities in which there is the lowest possible risk for exposure to M. tuberculosis. This choice should be a personal decision for HCWs after they have been informed of the risks to their health.

12.    Employers should make reasonable accommodations (e.g., alternative job assignments) for employees who have a health condition that compromises cell-mediated immunity and who work in settings where they may be exposed to M. tuberculosis. HCWs who are known to be immunocompromised should be referred to employee health professionals who can individually counsel the employees regarding their risk for TB. Upon the request of the immunocompromised HCW, employers should offer, but not compel, a work setting in which the HCW would have the lowest possible risk for occupational exposure to M. tuberculosis.

13.    All HCWs should be informed that immunosuppressed HCWs should have appropriate follow-up and screening for infectious diseases, including TB, provided by their medical practitioner. HCWs who are known to be HIV-infected or otherwise severely immunosuppressed should be tested for cutaneous anergy at the time of PPD testing. Consideration should be given to retesting, at least every 6 months, those immunocompromised HCWs who are potentially exposed to M. tuberculosis because of the high risk for rapid progression to active TB if they become infected.

14.    Information provided by HCWs regarding their immune status should be treated confidentially. If the HCW requests voluntary job reassignment, the privacy of the HCW should be maintained. Facilities should have written procedures on confidential handling of such information.

15.    Promptly evaluating possible episodes of M. tuberculosis transmission in health care facilities, including PPD skin-test conversions among HCWs, epidemiologically associated cases among HCWs or patients and contacts of patients or HCWs who have TB and who were not promptly identified and isolated. Epidemiological investigations may be indicated for several situations. These include, but are not limited to, the occurrence of PPD test conversions or active TB in HCWs, the occurrence of possible person-to-person transmission of M. tuberculosis and situations in which patients or HCWs with active TB are not promptly identified and isolated, thus exposing other persons in the facility to M. tuberculosis. The general objectives of the epidemiological investigations in these situations are as follows:

  • to determine the likelihood that transmission of and infection with M. tuberculosis has occurred in the facility
  • to determine the extent to which M. tuberculosis has been transmitted
  • to identify those persons who have been exposed and infected, enabling them to receive appropriate clinical management
  • to identify factors that could have contributed to transmission and infection and to implement appropriate interventions
  • to evaluate the effectiveness of any interventions that are implemented and to ensure that exposure to and transmission of M. tuberculosis have been terminated.

 

16.    Coordinating activities with the local public health department, emphasizing reporting and ensuring adequate discharge follow-up and the continuation and completion of therapy. As soon as a patient or HCW is known or suspected to have active TB, the patient or HCW should be reported to the public health department so that appropriate follow-up can be arranged and a community contact investigation can be performed. The health department should be notified well before patient discharge to facilitate follow-up and continuation of therapy. A discharge plan coordinated with the patient or HCW, the health department and the inpatient facility should be implemented.

 


Prevention of occupational transmission of bloodborne pathogens (BBP) including the human immunodeficiency virus (HIV), hepatitis B virus (HBV) and more recently hepatitis C virus (HCV), has received significant attention. Although HCWs are the primary occupational group at risk of acquisition of infection, any worker who is exposed to blood or other potentially infectious body fluids during the performance of job duties is at risk. Populations at risk for occupational exposure to BBP include workers in health care delivery, public safety and emergency response workers and others such as laboratory researchers and morticians. The potential for occupational transmission of bloodborne pathogens including HIV will continue to increase as the number of persons who have HIV and other bloodborne infections and require medical care increases.

In the US, the Centers for Disease Control and Prevention (CDC) recommended in 1982 and 1983 that patients with the acquired immunodeficiency syndrome (AIDS) be treated according to the (now obsolete) category of “blood and body fluid precautions” (CDC 1982; CDC 1983). Documentation that HIV, the causative agent of AIDS, had been transmitted to HCWs by percutaneous and mucocutaneous exposures to HIV-infected blood, as well as the realization that the HIV infection status of most patients or blood specimens encountered by HCWs would be unknown at the time of the encounter, led CDC to recommend that blood and body fluid precautions be applied to all patients, a concept known as “universal precautions” (CDC 1987a, 1987b). The use of universal precautions eliminates the need to identify patients with bloodborne infections, but is not intended to replace general infection control practices. Universal precautions include the use of handwashing, protective barriers (e.g., goggles, gloves, gowns and face protection) when blood contact is anticipated and care in the use and disposal of needles and other sharp instruments in all health care settings. Also, instruments and other reusable equipment used in performing invasive procedures should be appropriately disinfected or sterilized (CDC 1988a, 1988b). Subsequent CDC recommendations have addressed prevention of transmission of HIV and HBV to public safety and emergency responders (CDC 1988b), management of occupational exposure to HIV, including the recommendations for the use of zidovudine (CDC 1990), immunization against HBV and management of HBV exposure (CDC 1991a), infection control in dentistry (CDC 1993) and the prevention of HIV transmission from HCWs to patients during invasive procedures (CDC 1991b).

In the US, CDC recommendations do not have the force of law, but have often served as the foundation for government regulations and voluntary actions by industry. The Occupational Safety and Health Administration (OSHA), a federal regulatory agency, promulgated a standard in 1991 on Occupational Exposure to Bloodborne Pathogens (OSHA 1991). OSHA concluded that a combination of engineering and work practice controls, personal protective clothing and equipment, training, medical surveillance, signs and labels and other provisions can help to minimize or eliminate exposure to bloodborne pathogens. The standard also mandated that employers make hepatitis B vaccination available to their employees.

The World Health Organization (WHO) has also published guidelines and recommendations pertaining to AIDS and the workplace (WHO 1990, 1991). In 1990, the European Economic Community (EEC) issued a council directive (90/679/EEC) on protection of workers from risks related to exposure to biological agents at work. The directive requires employers to conduct an assessment of the risks to the health and safety of the worker. A distinction is drawn between activities where there is a deliberate intention to work with or use biological agents (e.g., laboratories) and activities where exposure is incidental (e.g., patient care). Control of risk is based on a hierarchical system of procedures. Special containment measures, according to the classification of the agents, are set out for certain types of health facilities and laboratories (McCloy 1994). In the US, CDC and the National Institutes of Health also have specific recommendations for laboratories (CDC 1993b).

Since the identification of HIV as a BBP, knowledge about HBV transmission has been helpful as a model for understanding modes of transmission of HIV. Both viruses are transmitted via sexual, perinatal and bloodborne routes. HBV is present in the blood of individuals positive for hepatitis B e antigen (HBeAg, a marker for high infectivity) at a concentration of approximately 10⁸ to 10⁹ viral particles per millilitre (ml) of blood (CDC 1988b). HIV is present in blood at much lower concentrations: 10³ to 10⁴ viral particles/ml for a person with AIDS and 10 to 100/ml for a person with asymptomatic HIV infection (Ho, Moudgil and Alam 1989). The risk of HBV transmission to an HCW after percutaneous exposure to HBeAg-positive blood is approximately 100-fold higher than the risk of HIV transmission after percutaneous exposure to HIV-infected blood (i.e., 30% versus 0.3%) (CDC 1989).
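The figures above can be put side by side in a short sketch; the numbers are simply the values quoted in the text (using the lower bounds of the reported concentration ranges), not new data:

```python
# Approximate viral concentrations from the text (lower bounds of the quoted ranges)
hbv_particles_per_ml = 1e8   # HBeAg-positive blood (10^8 to 10^9 particles/ml)
hiv_particles_per_ml = 1e3   # person with AIDS (10^3 to 10^4 particles/ml)

# Per-needlestick transmission risks quoted from CDC 1989
hbv_risk = 0.30   # percutaneous exposure to HBeAg-positive blood
hiv_risk = 0.003  # percutaneous exposure to HIV-infected blood

print(f"concentration ratio: {hbv_particles_per_ml / hiv_particles_per_ml:.0f}x")
print(f"transmission risk ratio: {hbv_risk / hiv_risk:.0f}x")  # the 100-fold difference
```

The roughly 100,000-fold difference in circulating virus concentration helps explain why the per-exposure transmission risk for HBV is so much higher than for HIV.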

Hepatitis

Hepatitis, or inflammation of the liver, can be caused by a variety of agents, including toxins, drugs, autoimmune disease and infectious agents. Viruses are the most common cause of hepatitis (Benenson 1990). Three types of bloodborne viral hepatitis have been recognized: hepatitis B, formerly called serum hepatitis, the major risk to HCWs; hepatitis C, the major cause of parenterally transmitted non-A, non-B hepatitis; and hepatitis D, or delta hepatitis.

Hepatitis B. The major infectious bloodborne occupational hazard to HCWs is HBV. Among US HCWs with frequent exposure to blood, the prevalence of serological evidence of HBV infection ranges between approximately 15 and 30%. In contrast, the prevalence in the general population averages 5%. The cost-effectiveness of serological screening to detect susceptible individuals among HCWs depends on the prevalence of infection, the cost of testing and the vaccine costs. Vaccination of persons who already have antibodies to HBV has not been shown to cause adverse effects. Hepatitis B vaccine provides protection against hepatitis B for at least 12 years after vaccination; booster doses currently are not recommended. The CDC estimated that in 1991 there were approximately 5,100 occupationally acquired HBV infections in HCWs in the United States, causing 1,275 to 2,550 cases of clinical acute hepatitis, 250 hospitalizations and about 100 deaths (unpublished CDC data). In 1991, approximately 500 HCWs became HBV carriers. These individuals are at risk of long-term sequelae, including disabling chronic liver disease, cirrhosis and liver cancer.

The HBV vaccine is recommended for use in HCWs and public safety workers who may be exposed to blood in the workplace (CDC 1991b). Following a percutaneous exposure to blood, the decision to provide prophylaxis must include considerations of several factors: whether the source of the blood is available, the HBsAg status of the source and the hepatitis B vaccination and vaccine-response status of the exposed person. For any exposure of a person not previously vaccinated, hepatitis B vaccination is recommended. When indicated, hepatitis B immune globulin (HBIG) should be administered as soon as possible after exposure since its value beyond 7 days after exposure is unclear. Specific CDC recommendations are indicated in table 1 (CDC 1991b).

Table 1. Recommendation for post-exposure prophylaxis for percutaneous or permucosal exposure to hepatitis B virus, United States

Exposed person: unvaccinated

  • Source HBsAg1 positive: HBIG2 ×1 and initiate HB vaccine3.
  • Source HBsAg negative: initiate HB vaccine.
  • Source not tested or unknown: initiate HB vaccine.

Exposed person: previously vaccinated, known responder

  • Source HBsAg positive: no treatment.
  • Source HBsAg negative: no treatment.
  • Source not tested or unknown: no treatment.

Exposed person: previously vaccinated, known non-responder

  • Source HBsAg positive: HBIG ×2, or HBIG ×1 and initiate revaccination.
  • Source HBsAg negative: no treatment.
  • Source not tested or unknown: if known high-risk source, treat as if source were HBsAg positive.

Exposed person: previously vaccinated, response unknown

  • Source HBsAg positive: test exposed person for anti-HBs4; if adequate5, no treatment; if inadequate, HBIG ×1 and vaccine booster.
  • Source HBsAg negative: no treatment.
  • Source not tested or unknown: test exposed person for anti-HBs; if adequate, no treatment; if inadequate, vaccine booster.

1 HBsAg = hepatitis B surface antigen. 2 HBIG = hepatitis B immune globulin; dose 0.06 mL/kg IM. 3 HB vaccine = hepatitis B vaccine. 4 Anti-HBs = antibody to hepatitis B surface antigen. 5 Adequate anti-HBs is ≥10 mIU/mL.
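For illustration, the decision logic of table 1 can be encoded as a lookup keyed on the exposed person's vaccination and response status and on the source's HBsAg status. The function name, category labels and recommendation strings below are this sketch's own paraphrase of the table, not CDC wording, and are no substitute for the guideline itself.

```python
# Sketch of the table 1 decision logic (CDC 1991b). All identifiers and
# strings are illustrative paraphrases; consult the actual guideline
# before any clinical use.

HBV_PEP = {
    # (exposed person's status, source HBsAg status): recommendation
    ("unvaccinated", "positive"): "HBIG x1 and initiate HB vaccine",
    ("unvaccinated", "negative"): "initiate HB vaccine",
    ("unvaccinated", "unknown"): "initiate HB vaccine",
    ("responder", "positive"): "no treatment",
    ("responder", "negative"): "no treatment",
    ("responder", "unknown"): "no treatment",
    ("non-responder", "positive"): "HBIG x2, or HBIG x1 and initiate revaccination",
    ("non-responder", "negative"): "no treatment",
    ("non-responder", "unknown"): "if known high-risk source, treat as if source were HBsAg positive",
    ("response unknown", "positive"): "test for anti-HBs; if inadequate, HBIG x1 and vaccine booster",
    ("response unknown", "negative"): "no treatment",
    ("response unknown", "unknown"): "test for anti-HBs; if inadequate, vaccine booster",
}

def hbv_recommendation(exposed_status: str, source_status: str) -> str:
    """Return the table 1 recommendation for a percutaneous exposure."""
    return HBV_PEP[(exposed_status, source_status)]
```

For example, `hbv_recommendation("unvaccinated", "positive")` returns the HBIG-plus-vaccine recommendation of the table's first row.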

Table 2. Provisional US Public Health Service recommendations for chemoprophylaxis after occupational exposure to HIV, by type of exposure and source of material, 1996

Percutaneous exposure, by source material1

  • Blood, highest risk4: recommend2 PEP; regimen3 ZDV plus 3TC plus IDV.
  • Blood, increased risk4: recommend PEP; ZDV plus 3TC, ± IDV5.
  • Blood, no increased risk4: offer PEP; ZDV plus 3TC.
  • Fluid containing visible blood, other potentially infectious fluid6 or tissue: offer PEP; ZDV plus 3TC.
  • Other body fluid (e.g., urine): do not offer PEP.

Mucous-membrane exposure

  • Blood: offer PEP; ZDV plus 3TC, ± IDV5.
  • Fluid containing visible blood, other potentially infectious fluid6 or tissue: offer PEP; ZDV, ± 3TC5.
  • Other body fluid (e.g., urine): do not offer PEP.

Skin exposure, increased risk7

  • Blood: offer PEP; ZDV plus 3TC, ± IDV5.
  • Fluid containing visible blood, other potentially infectious fluid6 or tissue: offer PEP; ZDV, ± 3TC5.
  • Other body fluid (e.g., urine): do not offer PEP.

1 Any exposure to concentrated HIV (e.g., in a research laboratory or production facility) is treated as percutaneous exposure to blood with highest risk. 2 Recommend—Postexposure prophylaxis (PEP) should be recommended to the exposed worker with counselling. Offer—PEP should be offered to the exposed worker with counselling. Not offer—PEP should not be offered because these are not occupational exposures to HIV. 3 Regimens: zidovudine (ZDV), 200 mg three times a day; lamivudine (3TC), 150 mg two times a day; indinavir (IDV), 800 mg three times a day (if IDV is not available, saquinavir may be used, 600 mg three times a day). Prophylaxis is given for 4 weeks. For full prescribing information, see package inserts. 4 Risk definitions for percutaneous blood exposure: Highest risk—BOTH larger volume of blood (e.g., deep injury with large diameter hollow needle previously in source patient’s vein or artery, especially involving an injection of source-patient’s blood) AND blood containing a high titre of HIV (e.g., source with acute retroviral illness or end-stage AIDS; viral load measurement may be considered, but its use in relation to PEP has not been evaluated). Increased risk—EITHER exposure to larger volume of blood OR blood with a high titre of HIV. No increased risk—NEITHER exposure to larger volume of blood NOR blood with a high titre of HIV (e.g., solid suture needle injury from source patient with asymptomatic HIV infection). 5 Possible toxicity of additional drug may not be warranted. 6 Includes semen; vaginal secretions; cerebrospinal, synovial, pleural, peritoneal, pericardial and amniotic fluids. 7 For skin, risk is increased for exposures involving a high titre of HIV, prolonged contact, an extensive area, or an area in which skin integrity is visibly compromised. For skin exposures without increased risk, the risk for drug toxicity outweighs the benefit of PEP.

Article 14(3) of EEC Directive 89/391/EEC, on vaccination, required only that effective vaccines, where they exist, be made available to exposed workers who are not already immune. The amending Directive 93/88/EEC contained a recommended code of practice requiring that workers at risk be offered vaccination free of charge, be informed of the benefits and disadvantages of vaccination and non-vaccination, and be provided with a certificate of vaccination (WHO 1990).

The use of hepatitis B vaccine and appropriate environmental controls will prevent almost all occupational HBV infections. Reducing blood exposure and minimizing puncture injuries in the health care setting will also reduce the risk of transmission of other bloodborne viruses.

Hepatitis C. Transmission of HCV is similar to that of HBV, but infection persists in most patients indefinitely and more frequently progresses to long-term sequelae (Alter et al. 1992). The prevalence of anti-HCV among US hospital-based health care workers averages 1 to 2% (Alter 1993). HCWs who sustain accidental injuries from needlesticks contaminated with anti-HCV-positive blood have a 5 to 10% risk of acquiring HCV infection (Lampher et al. 1994; Mitsui et al. 1992). There has been one report of HCV transmission after a blood splash to the conjunctiva (Sartori et al. 1993). Prevention measures again consist of adherence to universal precautions and percutaneous injury prevention, since no vaccine is available and immune globulin does not appear to be effective.

Hepatitis D. Hepatitis D virus requires the presence of hepatitis B virus for replication; thus, HDV can infect persons only as a coinfection with acute HBV or as a superinfection of chronic HBV infection. HDV infection can increase the severity of liver disease; one case of occupationally acquired HDV hepatitis has been reported (Lettau et al. 1986). Hepatitis B vaccination of HBV-susceptible persons will also prevent HDV infection; however, there is no vaccine to prevent HDV superinfection of an HBV carrier. Other prevention measures consist of adherence to universal precautions and percutaneous injury prevention.

HIV

The first cases of AIDS were recognized in June of 1981. Initially, over 92% of the cases reported in the United States were in homosexual or bisexual men. However, by the end of 1982, AIDS cases were identified among injection drug users, blood transfusion recipients, haemophilia patients treated with clotting factor concentrates, children and Haitians. AIDS is the result of infection with HIV, which was isolated in 1985. HIV has spread rapidly. In the United States, for example, the first 100,000 AIDS cases occurred between 1981 and 1989; the second 100,000 cases occurred between 1989 and 1991. As of June 1994, 401,749 cases of AIDS had been reported in the United States (CDC 1994b).

Globally, HIV has affected many countries including those in Africa, Asia and Europe. As of 31 December 1994, 1,025,073 cumulative cases of AIDS in adults and children had been reported to the WHO. This represented a 20% increase from the 851,628 cases reported through December 1993. It was estimated that 18 million adults and about 1.5 million children have been infected with HIV since the beginning of the pandemic (late 1970s to early 1980s) (WHO 1995).

Although HIV has been isolated from human blood, breast milk, vaginal secretions, semen, saliva, tears, urine, cerebrospinal fluid and amniotic fluid, epidemiological evidence has implicated only blood, semen, vaginal secretions and breast milk in the transmission of the virus. The CDC has also reported on the transmission of HIV as the result of contact with blood or other body secretions or excretions from an HIV-infected person in the household (CDC 1994c). Documented modes of occupational HIV transmission include having percutaneous or mucocutaneous contact with HIV-infected blood. Exposure by the percutaneous route is more likely to result in infection transmission than is mucocutaneous contact.

There are a number of factors which may influence the likelihood of occupational bloodborne pathogen transmission, including: the volume of fluid in the exposure, the virus titre, the length of the exposure and the immune status of the worker. Additional data are needed to determine precisely the importance of these factors. Preliminary data from a CDC case-control study indicate that for percutaneous exposures to HIV-infected blood, HIV transmission is more likely if the source patient has advanced HIV disease and if the exposure involves a larger inoculum of blood (e.g., injury due to a large-bore hollow needle) (Cardo et al. 1995). Virus titre can vary between individuals and over time within a single individual. Blood from persons with AIDS, particularly in the terminal stages, may be more infectious than blood from persons in earlier stages of HIV infection, with the illness associated with acute infection being a possible exception (Cardo et al. 1995).
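The stratification of percutaneous blood exposures used in table 2 (footnote 4) combines the volume and titre factors above with simple AND/OR logic. The following sketch encodes that logic; the function name, parameter names and labels are this example's own, not CDC wording.

```python
# Sketch of the percutaneous blood-exposure risk categories defined in
# table 2, footnote 4 (CDC 1996b). Identifiers are illustrative.

def percutaneous_risk(larger_blood_volume: bool, high_hiv_titre: bool) -> str:
    """Classify a percutaneous exposure to HIV-infected blood.

    Highest risk:      BOTH a larger blood volume AND a high HIV titre.
    Increased risk:    EITHER factor alone.
    No increased risk: NEITHER factor.
    """
    if larger_blood_volume and high_hiv_titre:
        return "highest risk"       # PEP recommended: ZDV plus 3TC plus IDV
    if larger_blood_volume or high_hiv_titre:
        return "increased risk"     # PEP recommended: ZDV plus 3TC, +/- IDV
    return "no increased risk"      # PEP offered: ZDV plus 3TC
```

A deep injury with a large-bore hollow needle from a source with end-stage AIDS would fall into the "highest risk" branch; a solid suture-needle injury from an asymptomatic source falls into "no increased risk".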

Occupational exposure and HIV infection

As of December 1996, CDC reported 52 HCWs in the United States who have seroconverted to HIV following a documented occupational exposure to HIV, including 19 laboratory workers, 21 nurses, six physicians and six in other occupations. Forty-five of the 52 HCWs sustained percutaneous exposures, five had mucocutaneous exposures, one had both a percutaneous and a mucocutaneous exposure and one had an unknown route of exposure. In addition, 111 possible cases of occupationally acquired infection have been reported. These possible cases have been investigated and are without identifiable non-occupational or transfusion risks; each reported percutaneous or mucocutaneous occupational exposures to blood or body fluids, or laboratory solutions containing HIV, but HIV seroconversion specifically resulting from an occupational exposure was not documented (CDC 1996a).

In 1993, the AIDS Centre at the Communicable Disease Surveillance Centre (UK) summarized reports of cases of occupational HIV transmission including 37 in the United States, four in the UK and 23 from other countries (France, Italy, Spain, Australia, South Africa, Germany and Belgium) for a total of 64 documented seroconversions after a specific occupational exposure. In the possible or presumed category there were 78 in the United States, six in the UK and 35 from other countries (France, Italy, Spain, Australia, South Africa, Germany, Mexico, Denmark, Netherlands, Canada and Belgium) for a total of 118 (Heptonstall, Porter and Gill 1993). The number of reported occupationally acquired HIV infections is likely to represent only a portion of the actual number due to under-reporting and other factors.

HIV post-exposure management

Employers should make available to workers a system for promptly initiating evaluation, counselling and follow-up after a reported occupational exposure that may place a worker at risk of acquiring HIV infection. Workers should be educated and encouraged to report exposures immediately after they occur so that appropriate interventions can be implemented (CDC 1990).

If an exposure occurs, the circumstances should be recorded in the worker’s confidential medical record. Relevant information includes the following: date and time of exposure; job duty or task being performed at the time of exposure; details of exposure; description of source of exposure, including, if known, whether the source material contained HIV or HBV; and details about counselling, post-exposure management and follow-up. The source individual should be informed of the incident and, if consent is obtained, tested for serological evidence of HIV infection. If consent cannot be obtained, policies should be developed for testing source individuals in compliance with applicable regulations. Confidentiality of the source individual should be maintained at all times.

If the source individual has AIDS, is known to be HIV seropositive, refuses testing or is of unknown HIV status, the worker should be evaluated clinically and serologically for evidence of HIV infection as soon as possible after the exposure (baseline) and, if seronegative, should be retested periodically for a minimum of six months after exposure (e.g., at six weeks, 12 weeks and six months after exposure) to determine whether HIV infection has occurred. The worker should be advised to report and seek medical evaluation for any acute illness that occurs during the follow-up period. During the follow-up period, especially the first six to 12 weeks after the exposure, exposed workers should be advised to refrain from blood, semen or organ donation and to abstain from sexual intercourse or use measures to prevent HIV transmission during intercourse.
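The follow-up schedule described above (baseline, then roughly six weeks, 12 weeks and six months after exposure) can be generated mechanically from the exposure date. The intervals below are one reading of the text, with "six months" approximated as 26 weeks; they are not a prescribed protocol.

```python
# Sketch: generate HIV serological follow-up dates from the exposure
# date. Intervals follow the text above; "6 months" is approximated as
# 26 weeks for simplicity.

from datetime import date, timedelta

def followup_dates(exposure: date) -> dict:
    """Map each follow-up label to its calendar date."""
    intervals = {"baseline": 0, "6 weeks": 6, "12 weeks": 12, "6 months": 26}
    return {label: exposure + timedelta(weeks=w) for label, w in intervals.items()}
```

For an exposure on 1 January, `followup_dates(date(1996, 1, 1))` yields the baseline date itself plus the three retest dates.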

In 1990, CDC published a statement on the management of exposure to HIV including considerations regarding zidovudine (ZDV) post-exposure use. After a careful review of the available data, CDC stated that the efficacy of zidovudine could not be assessed due to insufficient data, including available animal and human data (CDC 1990).

In 1996, information suggesting that ZDV post-exposure prophylaxis (PEP) may reduce the risk of HIV transmission after occupational exposure to HIV-infected blood (CDC 1996a) prompted the US Public Health Service (PHS) to update a previous PHS statement on the management of occupational exposure to HIV with the following findings and recommendations on PEP (CDC 1996b). Although failures of ZDV PEP have occurred (Tokars et al. 1993), ZDV PEP was associated with a decrease of approximately 79% in the risk of HIV seroconversion after percutaneous exposure to HIV-infected blood in a case-control study among HCWs (CDC 1995).

Although information about the potency and toxicity of antiretroviral drugs is available from studies of HIV-infected patients, it is uncertain to what extent this information can be applied to uninfected persons receiving PEP. In HIV-infected patients, combination therapy with the nucleosides ZDV and lamivudine (3TC) has greater antiretroviral activity than ZDV alone and is active against many ZDV-resistant HIV strains without significantly increased toxicity (Anon. 1996). Adding a protease inhibitor provides even greater increases in antiretroviral activity; among protease inhibitors, indinavir (IDV) is more potent than saquinavir at currently recommended doses and appears to have fewer drug interactions and short-term adverse effects than ritonavir (Niu, Stein and Schnittmann 1993). Few data exist to assess possible long-term (i.e., delayed) toxicity resulting from use of these drugs in persons not infected with HIV.

The following PHS recommendations are provisional because they are based on limited data regarding efficacy and toxicity of PEP and risk for HIV infection after different types of exposure. Because most occupational exposures to HIV do not result in infection transmission, potential toxicity must be carefully considered when prescribing PEP. Changes in drug regimens may be appropriate, based on factors such as the probable antiretroviral drug resistance profile of HIV from the source patient, local availability of drugs and medical conditions, concurrent drug therapy and drug toxicity in the exposed worker. If PEP is used, drug-toxicity monitoring should include a complete blood count and renal and hepatic chemical function tests at baseline and two weeks after starting PEP. If subjective or objective toxicity is noted, drug reduction or drug substitution should be considered, and further diagnostic studies may be indicated.

Chemoprophylaxis should be recommended to exposed workers after occupational exposures associated with the highest risk for HIV transmission. For exposures with a lower, but non-negligible risk, PEP should be offered, balancing the lower risk against the use of drugs having uncertain efficacy and toxicity. For exposures with negligible risk, PEP is not justified (see table 2). Exposed workers should be informed that knowledge about the efficacy and toxicity of PEP is limited, that for agents other than ZDV, data are limited regarding toxicity in persons without HIV infection or who are pregnant and that any or all drugs for PEP may be declined by the exposed worker.

PEP should be initiated promptly, preferably within 1 to 2 hours post-exposure. Although animal studies suggest that PEP probably is not effective when started later than 24 to 36 hours post-exposure (Niu, Stein and Schnittmann 1993; Gerberding 1995), the interval after which there is no benefit from PEP in humans is undefined. Initiating therapy after a longer interval (e.g., 1 to 2 weeks) may be considered for the highest-risk exposures; even if infection is not prevented, early treatment of acute HIV infection may be beneficial (Kinloch-de-los et al. 1995).

If the source patient is unknown, or the patient’s HIV status is unknown, the decision to initiate PEP should be made on a case-by-case basis, weighing the exposure risk and the likelihood of HIV infection in known or possible source patients.

Other Bloodborne Pathogens

Syphilis, malaria, babesiosis, brucellosis, leptospirosis, arboviral infections, relapsing fever, Creutzfeldt-Jakob disease, human T-lymphotropic virus type 1 and viral haemorrhagic fever have also been transmitted by the bloodborne route (CDC 1988a; Benenson 1990). Occupational transmission of these agents has only rarely been recorded, if ever.

Prevention of Transmission of Bloodborne Pathogens

There are several basic strategies which relate to the prevention of occupational transmission of bloodborne pathogens. Exposure prevention, the mainstay of occupational health, can be accomplished by substitution (e.g., replacing an unsafe device with a safer one), engineering controls (i.e., controls that isolate or remove the hazard), administrative controls (e.g., prohibiting recapping of needles by a two-handed technique) and use of personal protective equipment. The first choice is to “engineer out the problem”.

In order to reduce exposures to bloodborne pathogens, adherence to general infection control principles, as well as strict compliance with universal precaution guidelines, is required. Important components of universal precautions include the use of appropriate personal protective equipment, such as gloves, gowns and eye protection, when exposure to potentially infectious body fluids is anticipated. Gloves are one of the most important barriers between the worker and the infectious material. While they do not prevent needlesticks, they do protect the skin. Gloves should be worn whenever contact with blood or body fluids is anticipated; washing of gloves is not recommended. Recommendations also advise workers to take precautions to prevent injuries by needles, scalpels and other sharp instruments or devices during procedures; when cleaning used instruments; during disposal of used needles; and when handling sharp instruments after procedures.

Percutaneous exposures to blood

Since the major risk of infection results from parenteral exposure to sharp instruments such as syringe needles, engineering controls that minimize percutaneous injuries (e.g., self-resheathing needles, needleless IV systems, blunt suture needles and appropriate selection and use of sharps disposal containers) are critical components of universal precautions.

The most common type of percutaneous inoculation occurs through inadvertent needlestick injury, many such injuries being associated with the recapping of needles. Workers have cited the following reasons for recapping: inability to dispose of needles properly and immediately, sharps disposal containers located too far away, lack of time, dexterity problems and patient interaction.

Needles and other sharp devices can be redesigned to prevent a significant proportion of percutaneous exposures. A fixed barrier should be provided between the hands and the needle after use, and the worker’s hands should remain behind the needle. Any safety feature should be an integral part of the device, and the design should be simple, requiring little or no training (Jagger et al. 1988).

Implementing safer needle devices must be accompanied by evaluation. In 1992, the American Hospital Association (AHA) published a briefing to assist hospitals with the selection, evaluation and adoption of safer needle devices (AHA 1992). The briefing stated that “because safer needle devices, unlike drugs and other therapies, do not undergo clinical testing for safety and efficacy before they are marketed, hospitals are essentially ‘on their own’ when it comes to selecting appropriate products for their specific institutional needs”. Included in the AHA document are guidance for the evaluation and adoption of safer needle devices, case studies of the use of safety devices, evaluation forms and listing of some, but not all, products on the US market.

Prior to implementation of a new device, health care institutions must ensure that there is an appropriate needlestick surveillance system in place. In order to accurately assess the efficacy of new devices, the number of reported exposures should be expressed as an incidence rate.

Possible denominators for reporting the number of needlestick injuries include patient days, hours worked, number of devices purchased, number of devices used and number of procedures performed. The collection of specific information on device-related injuries is an important component of the evaluation of the effectiveness of a new device. Factors to be considered in collecting information on needlestick injuries include: new product distribution, stocking and tracking; identification of users; removal of other devices; compatibility with other devices (especially IV equipment); ease of use; and mechanical failure. Factors which may contribute to bias include compliance, subject selection, procedures, recall, contamination, reporting and follow-up. Possible outcome measures include rates of needlestick injuries, HCW compliance, patient care complications and cost.
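Expressing reported exposures as a rate rather than a raw count, as suggested above, is straightforward once a denominator is chosen. In the sketch below the denominator (devices used) and the scale (per 100,000) are arbitrary illustrative choices; the counts are invented for the example.

```python
# Sketch: needlestick injuries expressed as an incidence rate so that
# device evaluations are comparable across periods or products. The
# denominator (devices used) and scale are illustrative choices, and
# the counts below are invented example data.

def injury_rate(injuries: int, denominator: float, per: float = 100_000) -> float:
    """Rate of reported injuries per `per` units of the chosen denominator."""
    return injuries * per / denominator

# Example: compare a conventional device with a safety device, each
# used 250,000 times during the surveillance period.
conventional = injury_rate(injuries=30, denominator=250_000)  # 12.0 per 100,000
safety = injury_rate(injuries=8, denominator=250_000)         # 3.2 per 100,000
```

The same function works with any of the denominators listed above (patient days, hours worked, procedures performed), as long as the comparison uses the same one throughout.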

Finally, training and feedback from workers are important components of any successful needlestick prevention programme. User acceptance is a critical factor, but one that seldom receives enough attention.

Elimination or reduction of percutaneous injuries should result if adequate engineering controls are available. If HCWs, product evaluation committees, administrators and purchasing departments all work together to identify where and what safer devices are needed, safety and cost effectiveness can be combined. Occupational transmission of bloodborne pathogens is costly, both in terms of money and the impact on the employee. Every needlestick injury causes undue stress on the employee and may affect job performance. Referral to mental health professionals for supportive counselling may be required.

In summary, a comprehensive approach to prevention is essential to maintaining a safe and healthy environment in which to provide health care services. Prevention strategies include the use of vaccines, post-exposure prophylaxis and prevention or reduction of needlestick injuries. Prevention of needlestick injuries can be accomplished by improvement in the safety of needle-bearing devices, development of procedures for safer use and disposal and adherence to infection control recommendations.

Acknowledgements: The authors thank Mariam Alter, Lawrence Reed and Barbara Gooch for their manuscript review.

 


Wednesday, 02 March 2011 15:51

Overview of Infectious Diseases

Infectious diseases play a significant part in worldwide occurrences of occupational disease in HCWs. Since reporting procedures vary from country to country, and since diseases considered job-related in one country may be classified as non-occupational elsewhere, accurate data concerning their frequency and their proportion of the overall number of occupational diseases among HCWs are difficult to obtain. The proportions range from about 10% in Sweden (Lagerlöf and Broberg 1989), to about 33% in Germany (BGW 1993) and nearly 40% in France (Estryn-Béhar 1991).

The prevalence of infectious diseases is directly related to the efficacy of preventive measures such as vaccines and post-exposure prophylaxis. For example, during the 1980s in France, the proportion of all viral hepatitides fell to 12.7% of its original level thanks to the introduction of vaccination against hepatitis B (Estryn-Béhar 1991). This was noted even before hepatitis A vaccine became available.

Similarly, it may be presumed that, with the declining immunization rates in many countries (e.g., in the Russian Federation and Ukraine in the former Soviet Union during 1994-1995), cases of diphtheria and poliomyelitis among HCWs will increase.

Finally, occasional infections with streptococci, staphylococci and Salmonella typhi are being reported among health care workers.

Epidemiological Studies

The following infectious diseases—listed in order of frequency—are the most important in worldwide occurrences of occupational infectious diseases in health care workers:

  • hepatitis B
  • tuberculosis
  • hepatitis C
  • hepatitis A
  • hepatitis non-A-E.

 

Also important are the following (not in order of frequency):

  • varicella
  • measles
  • mumps
  • rubella
  • erythema infectiosum (“Ringelröteln”; parvovirus B19 infection)
  • HIV/AIDS
  • hepatitis D
  • EBV hepatitis
  • CMV hepatitis.

 

It is doubtful that the many cases of enteric infection (e.g., salmonella, shigella) often included in the statistics are in fact job-related, since these infections are, as a rule, transmitted faecally-orally.

Considerable data are available concerning the epidemiological significance of these job-related infections, mostly in relation to hepatitis B and its prevention, but also in relation to tuberculosis, hepatitis A and hepatitis C. Epidemiological studies have also dealt with measles, mumps, rubella, varicella and Ringelröteln. In using them, however, care must be taken to distinguish between incidence studies (e.g., determination of annual hepatitis B infection rates), sero-epidemiological prevalence studies and other types of prevalence studies (e.g., tuberculin tests).

Hepatitis B

Hepatitis B is primarily transmitted through contact with blood during needlestick injuries; the risk it poses to HCWs depends on the frequency of the disease in the population they serve. In northern, central and western Europe, Australia and North America, it is found in about 2% of the population. It is encountered in about 7% of the population in southern and south-eastern Europe and most parts of Asia. In Africa, the northern parts of South America and in eastern and south-eastern Asia, rates as high as 20% have been observed (Hollinger 1990).

A Belgian study found that 500 HCWs in northern Europe became infected with hepatitis B each year while the figure for southern Europe was 5,000 (Van Damme and Tormanns 1993). The authors calculated that the annual case rate for western Europe is about 18,200 health care workers. Of these, about 2,275 ultimately develop chronic hepatitis, of whom some 220 will develop cirrhosis of the liver and 44 will develop hepatic carcinoma.
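The progression implied by these figures can be checked with simple arithmetic. The fractions below are back-calculated from the numbers quoted in the text; they are this sketch's reading of the cited estimates, not values taken from the original paper.

```python
# Back-of-the-envelope check of the Van Damme and Tormanns figures
# quoted above. The fractions are back-calculated from the cited
# numbers, not taken from the original study.

annual_infections = 18_200   # HBV infections per year, western European HCWs
chronic_cases = 2_275        # progress to chronic hepatitis
cirrhosis_cases = 220        # of those, develop cirrhosis of the liver
carcinoma_cases = 44         # of those, develop hepatic carcinoma

chronic_frac = chronic_cases / annual_infections   # ~12.5% of infections
cirrhosis_frac = cirrhosis_cases / chronic_cases   # ~9.7% of chronic cases
carcinoma_frac = carcinoma_cases / chronic_cases   # ~1.9% of chronic cases
```

In other words, the cited estimates assume roughly one in eight infections becomes chronic, with about a tenth of chronic cases progressing to cirrhosis and about a fiftieth to carcinoma.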

A large study involving 4,218 HCWs in Germany, where about 1% of the population is positive for hepatitis B surface antigen (HBsAg), found that the risk of contracting hepatitis B is approximately 2.5 times greater among HCWs than in the general population (Hofmann and Berthold 1989). The largest study to date, involving 85,985 HCWs worldwide, demonstrated that those in dialysis, anaesthesiology and dermatology departments were at greatest risk of hepatitis B (Maruna 1990).

A commonly overlooked source of concern is the HCW who has a chronic hepatitis B infection. More than 100 instances have been recorded worldwide in which the source of the infection was not the patient but the doctor. The most spectacular instance was the Swiss doctor who infected 41 patients (Grob et al. 1987).

While the most important mechanism for transmitting the hepatitis B virus is an injury by a blood-contaminated needle (Hofmann and Berthold 1989), the virus has been detected in a number of other body fluids (e.g., male semen, vaginal secretions, cerebrospinal fluid and pleural exudate) (CDC 1989).

Tuberculosis

In most countries around the world, tuberculosis continues to rank first or second in importance of work-related infections among HCWs (see the article “Tuberculosis prevention, control and surveillance”). Many studies have demonstrated that although the risk is present throughout the professional life, it is greatest during the period of training. For example, a Canadian study in the 1970s demonstrated the tuberculosis rate among female nurses to be double that of women in other professions (Burhill et al. 1985). And, in Germany, where the tuberculosis incidence ranges around 18 per 100,000 for the general population, it is about 26 per 100,000 among health care workers (BGW 1993).

A more accurate estimate of the risk of tuberculosis may be obtained from epidemiological studies based on the tuberculin test. A positive reaction is an indicator of infection by Mycobacterium tuberculosis or other mycobacteria or a prior inoculation with the BCG vaccine. If that inoculation was received 20 or more years earlier, it is presumed that the positive test indicates at least one contact with tubercle bacilli.

Today, tuberculin testing is done by means of the patch test in which the response is read within five to seven days after the application of the “stamp”. A large-scale German study based on such skin tests showed a rate of positives among health professionals that was only moderately higher than that among the general population (Hofmann et al. 1993), but long-range studies demonstrate that a greatly heightened risk of tuberculosis does exist in some areas of health care services.

More recently, anxiety has been generated by the increasing number of cases infected with drug-resistant organisms. This is a matter of particular concern in designing a prophylactic regimen for apparently healthy health care workers whose tuberculin tests “converted” to positive after exposure to patients with tuberculosis.

Hepatitis A

Since the hepatitis A virus is transmitted almost exclusively through faeces, the number of HCWs at risk is substantially smaller than for hepatitis B. An early study conducted in West Berlin showed that paediatric personnel were at greatest risk of this infection (Lange and Masihi 1986). These results were subsequently confirmed by a similar study in Belgium (Van Damme et al. 1989). Similarly, studies in south-west Germany showed increased risk to nurses, paediatric nurses and cleaning women (Hofmann et al. 1992; Hofmann, Berthold and Wehrle 1992). A study undertaken in Cologne, Germany, revealed no risk to geriatric nurses, in contrast to higher prevalence rates among the personnel of child care centres. Another study showed increased risk of hepatitis A among paediatric nurses in Ireland, Germany and France; in the last of these, greater risk was found in workers in psychiatric units treating children and youngsters. Finally, a study of infection rates among handicapped people disclosed higher levels of risk for the patients as well as the workers caring for them (Clemens et al. 1992).

Hepatitis C

Hepatitis C, discovered in 1989, is, like hepatitis B, primarily transmitted through blood introduced via needle-puncture wounds. Until recently, however, data relating to its threat to HCWs have been limited. A 1991 New York study of 456 dentists and 723 controls showed an infection rate of 1.75% among the dentists compared with 0.14% among the controls (Klein et al. 1991). A German research group demonstrated the prevalence of hepatitis C in prisons and attributed it to the large number of intravenous drug users among the inmates (Gaube et al. 1993). An Austrian study found 2.0% of 294 health care personnel to be seropositive for hepatitis C antibodies, a figure thought to be much higher than that among the general population (Hofmann and Kunz 1990). This was confirmed by another study of HCWs conducted in Cologne, Germany (Chriske and Rossa 1991).

A study in Freiburg, Germany, found that contact with handicapped residents of nursing homes, particularly those with infantile cerebral palsy or trisomy 21, with patients with haemophilia and with those dependent on drugs administered intravenously presented a particular risk of hepatitis C to the workers involved in their care. A significantly increased prevalence rate was found in dialysis personnel, and the relative risk for all health care workers was estimated to be 2.5 (admittedly calculated from a relatively small sample).

A possible alternative path of infection was demonstrated in 1993 when a case of hepatitis C was shown to have developed after a splash into the eye (Sartori et al. 1993).

Varicella

Studies of the prevalence of varicella, an illness particularly grave in adults, have consisted of tests for varicella antibodies (anti-VZV) conducted in Anglo-Saxon countries. Thus, a seronegative rate of 2.9% was found among 241 hospital employees aged 24 to 62, but the rate was 7.5% for those under the age of 35 (McKinney, Horowitz and Baxtiola 1989). Another study in a paediatric clinic yielded a negative rate of 5% among 2,730 individuals tested in the clinic, but these data become less impressive when it is noted that the serological tests were performed only on persons without a history of having had varicella. A significantly increased risk of varicella infection for paediatric hospital personnel, however, was demonstrated by a study conducted in Freiburg, which found that, in a group of 533 individuals working in hospital care, paediatric hospital care and administration, evidence of varicella immunity was present in only 85% of persons younger than 20 years.

Mumps

In considering risk levels of mumps infection, a distinction must be made between countries in which mumps immunization is mandatory and those in which these inoculations are voluntary. In the former, nearly all children and young people will have been immunized and, therefore, mumps poses little risk to health care workers. In the latter, which include Germany, cases of mumps are becoming more frequent. As a result of lack of immunity, the complications of mumps have been increasing, particularly among adults. A report of an epidemic in a non-immune Inuit population on St. Lawrence Island (located between Siberia and Alaska) demonstrated the frequency of such complications of mumps as orchitis in men, mastitis in women and pancreatitis in both sexes (Philip, Reinhard and Lackman 1959).

Unfortunately, epidemiological data on mumps among HCWs are very sparse. A 1986 study in Germany showed that the rate of mumps immunity among 10 to 15 year-olds was 84% but, with voluntary rather than mandatory inoculation, one may presume that this rate has been declining. A 1994 study involving 774 individuals in Freiburg indicated a significantly increased risk to employees in paediatric hospitals (Hofmann, Sydow and Michaelis 1994).

Measles

The situation with measles is similar to that with mumps. Reflecting its high degree of contagiousness, risks of infection among adults emerge as immunization rates fall. A US study reported an immunity rate of over 99% (Chou, Weil and Arnmow 1986) and, two years later, 98% of a cohort of 163 nursing students were found to have immunity (Wigand and Grenner 1988). A study in Freiburg yielded rates of 96 to 98% among nurses and paediatric nurses, while the rates of immunity among non-medical personnel were only 87 to 90% (Sydow and Hofmann 1994). Such data would support a recommendation that immunization be made mandatory for the general population.

Rubella

Rubella falls between measles and mumps with respect to its contagiousness. Studies have shown that about 10% of HCWs are not immune (Ehrengut and Klett 1981; Sydow and Hofmann 1994) and are, therefore, at high risk of infection when exposed. Although generally not a serious illness among adults, rubella may be responsible for devastating effects on the foetus during the first 18 weeks of pregnancy: abortion, stillbirth or congenital defects (see table 1) (South and Sever 1985; Miller, Vurdien and Farrington 1993). Since these may be produced even before the woman knows that she is pregnant and, since health care workers, particularly those in contact with paediatric patients, are likely to be exposed, it is especially important that inoculation be urged (and perhaps even required) for all female health care workers of child-bearing age who are not immune.

Table 1. Congenital abnormalities following rubella infection in pregnancy

Studies by South and Sever (1985)

Week of pregnancy     <4    5–8   9–12   13–16   >17
Deformity rate (%)    70    40    25     40      8

Studies by Miller, Vurdien and Farrington (1993)

Week of pregnancy     <10   11–12  13–14  15–16   >17
Deformity rate (%)    90    33     11     24      0

HIV/AIDS

During the 1980s and 1990s, HIV seroconversions (i.e., a positive reaction in an individual previously found to have been negative) became a minor occupational risk among HCWs, although clearly not one to be ignored. By early 1994, reports of some 24 reliably documented cases and 35 possible cases had been collected in Europe (Pérez et al. 1994), with an additional 43 documented cases and 43 possible cases reported in the US (CDC 1994a). Unfortunately, except for avoiding needlesticks and other contacts with infected blood or body fluids, there are no effective preventive measures. Some prophylactic regimens for individuals who have been exposed are recommended and described in the article “Prevention of occupational transmission of bloodborne pathogens”.

Other infectious diseases

The other infectious diseases listed earlier in this article have not yet emerged as significant hazards to HCWs, either because they have not been recognized and reported or because their epidemiology has not yet been studied. Sporadic reports of single and small clusters of cases suggest that the identification and testing of serological markers should be explored. For example, a 33-month study of typhoid conducted by the Centers for Disease Control (CDC) revealed that 11.2% of all sporadic cases not associated with outbreaks occurred in laboratory workers who had examined stool specimens (Blaser et al. 1980).

The future is clouded by two simultaneous problems: the emergence of new pathogens (e.g., new strains such as hepatitis G and new organisms such as the Ebola virus and the equine morbillivirus recently discovered to be fatal to both horses and humans in Australia) and the continuing development of drug resistance by well-recognized organisms such as the tubercle bacillus. HCWs are likely to be the first to be systematically exposed. This makes their prompt and accurate identification and the epidemiological study of their patterns of susceptibility and transmission of the utmost importance.

Prevention of Infectious Diseases among Health Care Workers

The first essential in the prevention of infectious disease is the indoctrination of all HCWs, support staff as well as health professionals, in the fact that health care facilities are “hotbeds” of infection, with every patient representing a potential risk. This is important not only for those directly involved in diagnostic or therapeutic procedures, but also for those who collect and handle blood, faeces and other biological materials and those who come in contact with dressings, linens, dishes and other fomites. In some instances, even breathing the same air may be a possible hazard. Each health care facility, therefore, must develop a detailed procedure manual identifying these potential risks and the steps needed to eliminate, avoid or control them. Then, all personnel must be drilled in following these procedures and monitored to ensure that they are being properly performed. Finally, all failures of these protective measures must be recorded and reported so that revision and/or retraining may be undertaken.

Important secondary measures are the labelling of areas and materials which may be especially infectious and the provision of gloves, gowns, masks, forceps and other protective equipment. Washing the hands with germicidal soap and running water (wherever possible) will not only protect the health care worker but also will minimize the risk of his or her transmitting the infection to co-workers and other patients.

All blood and body fluid specimens or splashes and materials stained with them must be handled as though they are infected. The use of rigid plastic containers for the disposal of needles and other sharp instruments and diligence in the proper disposal of potentially infectious wastes are important preventive measures.

Careful medical histories, serological testing and patch testing should be performed prior to or as soon as health care workers report for duty. Where advisable (and there are no contraindications), appropriate vaccines should be administered (hepatitis B, hepatitis A and rubella appear to be the most important) (see table 2). In any case, seroconversion may indicate an acquired infection and the advisability of prophylactic treatment.

Table 2. Indications for vaccinations in health service employees.

Disease (complications) — Who should be vaccinated?

Diphtheria — In the event of an epidemic, all employees without demonstrable immunity; beyond this, vaccination recommended (Td combination vaccine used); if threat of epidemic, all employees.

Hepatitis A — Employees in the paediatric field as well as in infection wards, in microbiological laboratories and in kitchens; cleaning women.

Hepatitis B — All seronegative employees with possibility of contact with blood or bodily fluids.

Influenza — Regularly offered to all employees.

Measles (encephalitis) — Seronegative employees in the paediatric field.

Mumps (meningitis, otitis, pancreatitis) — Seronegative employees in the paediatric field.

Rubella (embryopathy) — Seronegative employees in paediatrics/midwifery/ambulance services; seronegative women capable of giving birth.

Poliomyelitis — All employees, e.g., those involved in vaccination campaigns.

Tetanus — Obligatory for employees in gardening and technical fields; offered to all employees (Td combination vaccine used).

Tuberculosis — In all events, employees in pulmonology and lung surgery, on a voluntary basis (BCG).

Varicella (foetal risks, encephalomyelitis) — Seronegative employees in paediatrics, or at least in paediatric oncology (protection of the patient) and oncological wards.

Prophylactic therapy

In some exposures, when it is known that the worker is not immune and has been exposed to a proven or highly suspected risk of infection, prophylactic therapy may be instituted. Especially if the worker presents any evidence of possible immunodeficiency, human immunoglobulin may be administered. Where specific “hyperimmune” serum is available, as in mumps and hepatitis B, it is preferable. In infections which, like hepatitis B, may be slow to develop, or where “booster” doses are advisable, as in tetanus, a vaccine may be administered. When vaccines are not available, as in meningococcus infections and plague, prophylactic antibiotics may be used either alone or as a supplement to immune globulin. Prophylactic regimens of other drugs have been developed for tuberculosis and, more recently, for potential HIV infections, as discussed elsewhere in this chapter.

 


Wednesday, 02 March 2011 15:50

Case Study: Treatment of Back Pain

Most episodes of acute back pain respond promptly to several days of rest followed by the gradual resumption of activities within the limits of pain. Non-narcotic analgesics and non-steroidal anti-inflammatory drugs may be helpful in relieving pain but do not shorten the course. (Since some of these drugs affect alertness and reaction time, they should be used with caution by individuals who drive vehicles or have assignments where momentary lapses may result in harm to patients.) A variety of forms of physiotherapy (e.g., local applications of heat or cold, diathermy, massage, manipulation, etc.) often provide short periods of transient relief; they are particularly useful as a prelude to graded exercises that will promote the restoration of muscle strength and relaxation as well as flexibility. Prolonged bed rest, traction and the use of lumbar corsets tend to delay recovery and often lengthen the period of disability (Blow and Jayson 1988).

Chronic, recurrent back pain is best treated by a secondary prevention regimen. Getting enough rest, sleeping on a firm mattress, sitting in straight chairs, wearing comfortable, well-fitted shoes, maintaining good posture and avoiding long periods of standing in one position are important adjuncts. Excessive or prolonged use of medications increases the risk of side effects and should be avoided. Some cases are helped by the injection of “trigger points”, localized tender nodules in muscles and ligaments, as originally advocated in the seminal report by Lange (1931).

Exercise of key postural muscles (upper and lower abdominal, back, gluteal and thigh muscles) is the mainstay of both chronic care and prevention of back pain. Kraus (1970) has formulated a regimen that features strengthening exercises to correct muscle weakness, relaxing exercises to relieve tension, spasticity and rigidity, stretching exercises to minimize contractures and exercises to improve balance and coordination. These exercises, he cautions, should be individualized on the basis of examination of the patient and functional tests of muscle strength, holding power and elasticity (e.g., the Kraus-Weber tests (Kraus 1970)). To avoid adverse effects of exercise, each session should include warm-up and cool-down exercises as well as limbering and relaxing exercises, and the number, duration and intensity of the exercises should be increased gradually as conditioning improves. Simply giving the patient a printed exercise sheet or booklet is not enough; initially, he or she should be given individual instruction and observed to be sure that the exercises are being done correctly.

In 1974, the YMCA in New York introduced the “Y’s Way to a Healthy Back Program”, a low-cost course of exercise training based on the Kraus exercises; in 1976 it became a national programme in the US and, later, it was established in Australia and in several European countries (Melleby 1988). The twice-a-week, six-week programme is given by specially-trained YMCA exercise instructors and volunteers, mainly in urban YMCAs (arrangements for courses at the worksite have been made by a number of employers), and it emphasizes the indefinite continuation of the exercises at home. Approximately 80% of the thousands of individuals with chronic or recurrent back pain who have participated in this programme have reported elimination or improvement in their pain.

 


Epidemiology

The significance of back pain among the diseases of developed industrial societies is on the rise. According to data provided by the National Center for Health Statistics in the United States, chronic diseases of the back and of the vertebral column constitute the dominant group of disorders affecting employable individuals under 45 in the US population. Countries such as Sweden, which traditionally have good occupational accident statistics, show that musculoskeletal injuries occur twice as frequently in the health services as in all other fields (Lagerlöf and Broberg 1989).

In an analysis of accident frequency in a 450-bed hospital in the United States, Kaplan and Deyo (1988) were able to demonstrate an 8 to 9% yearly incidence of injury to lumbar vertebrae in nurses, leading on average to 4.7 days of absence from work. Thus, of all employee groups in hospitals, nurses were the one most afflicted by this condition.

As is clear from a survey of studies done in the last 20 years (Hofmann and Stössel 1995), this disorder has become the object of intensive epidemiological research. All the same, such research—particularly when it aims at furnishing internationally comparable results—is subject to a variety of methodological difficulties. Sometimes all employee categories in the hospital are investigated, sometimes simply nurses. Some studies have suggested that it would make sense to differentiate, within the group “nurses”, between registered nurses and nursing aides. Since nurses are predominantly women (about 80% in Germany), and since reported incidence and prevalence rates regarding this disorder do not differ significantly for male nurses, gender-related differentiation would seem to be of less importance to epidemiological analyses.

More important is the question of what investigative tools should be used to research back pain conditions and their gradations. Along with the interpretation of accident, compensation and treatment statistics, one frequently finds, in the international literature, a retrospectively applied standardized questionnaire, to be filled out by the person tested. Other investigative approaches operate with clinical investigative procedures such as orthopaedic function studies or radiological screening procedures. Finally, the more recent investigative approaches also use biomechanical modelling and direct or video-taped observation to study the pathophysiology of work performance, particularly as it involves the lumbo-sacral area (see Hagberg et al. 1993 and 1995).

An epidemiological determination of the extent of the problem based on self-reported incidence and prevalence rates, however, poses difficulties as well. Cultural-anthropological studies and comparisons of health systems have shown that perceptions of pain differ not only between members of different societies but also within societies (Payer 1988). Also, there is the difficulty of objectively grading the intensity of pain, a subjective experience. Finally, the prevailing perception among nurses that “back pain goes with the job” leads to under-reporting.

International comparisons based on analyses of governmental statistics on occupational disorders are unreliable for scientific evaluation of this disorder because of variations in the laws and regulations related to occupational disorders among different countries. Further, within a single country, there is the truism that such data are only as reliable as the reports upon which they are based.

In summary, many studies have determined that 60 to 80% of all nursing staff (averaging 30 to 40 years in age) have had at least one episode of back pain during their working lives. The reported incidence rates usually do not exceed 10%. When classifying back pain, it has been helpful to follow the suggestion of Nachemson and Anderson (1982) to distinguish between back pain and back pain with sciatica. In an as-yet-unpublished study, a subjective complaint of sciatica was found to be useful in classifying the results of subsequent computer-assisted tomography (CAT) and magnetic resonance imaging (MRI) scans.
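The measures quoted throughout this discussion (lifetime prevalence, annual incidence, relative risk) are related by simple ratios. As a minimal sketch, the following uses entirely hypothetical counts, not data from any of the cited studies:

```python
# Hypothetical illustration of the epidemiological measures used above.
# None of these counts come from the studies cited in the text.

def prevalence(cases: int, population: int) -> float:
    """Proportion of the population with the condition at, or up to, a given time."""
    return cases / population

def incidence_rate(new_cases: int, population: int) -> float:
    """New cases per person over the observation period (e.g., one year)."""
    return new_cases / population

def relative_risk(exposed_cases: int, exposed_n: int,
                  unexposed_cases: int, unexposed_n: int) -> float:
    """Risk in the exposed group divided by risk in the unexposed group."""
    return (exposed_cases / exposed_n) / (unexposed_cases / unexposed_n)

# 70 of 100 nurses report at least one episode: lifetime prevalence 0.7.
print(prevalence(70, 100))
# 9 new episodes among 100 nurses in a year: annual incidence 0.09.
print(incidence_rate(9, 100))
# 18/100 injured among nurses vs. 9/100 among clerical staff: relative risk 2.
print(relative_risk(18, 100, 9, 100))
```

A relative risk of 2 corresponds to the Swedish finding quoted earlier that musculoskeletal injuries occur twice as frequently in the health services as in other fields.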

Economic Costs

Estimates of the economic costs differ greatly, depending, in part, on the possibilities and conditions of diagnosis, treatment and compensation available at the particular time and/or place. Thus, in the US for 1976, Snook (1988b) estimated that the costs of back pain totalled US$14 billion, while a total cost of US$25 billion was calculated for 1983. The calculations of Holbrook et al. (1984), which estimated 1984 costs to total just under US$16 billion, appear to be most reliable. In the United Kingdom, costs were estimated to have risen by US$2 billion between 1987 and 1989 according to Ernst and Fialka (1994). Estimates of direct and indirect costs for 1990 reported by Cats-Baril and Frymoyer (1991) indicate that the costs of back pain have continued to increase. In 1988 the US Bureau of National Affairs reported that chronic back pain generated costs of US$80,000 per chronic case per year.

In Germany, the two largest workers’ accident insurance funds (Berufsgenossenschaften) developed statistics showing that, in 1987, about 15 million work days were lost because of back pain. This corresponds to roughly one-third of all missed work days annually. These losses appear to be increasing at a current average cost of DM 800 per lost day.

It may therefore be said, independently of national differences and vocational groups, that back disorders and their treatment represent not simply a human and a medical problem, but also an enormous economic burden. Accordingly, it seems advisable to pay special attention to the prevention of these disorders in particularly burdened vocational groups such as nursing.

In principle one should differentiate, in research concerning the causes of work-related disorders of the lower back in nurses, between those attributed to a particular incident or accident and those whose genesis lacks such specificity. Both may give rise to chronic back pain if not properly treated. Reflecting their presumed medical knowledge, nurses are much more prone to use self-medication and self-treatment, without consulting a physician, than other groups in the working population. This is not always a disadvantage, since many physicians either do not know how to treat back problems or give them short shrift, simply prescribing sedatives and advising heat applications to the area. The latter reflects the oft-repeated truism that “backaches come with the job”, or the tendency to regard workers with chronic back complaints as malingerers.

Detailed analyses of work accident occurrences in the area of spinal disorders have only just begun to be made (see Hagberg et al. 1995). This is also true of the analysis of so-called near-accidents, which can provide a particular sort of information concerning the precursor conditions of a given work accident.

The cause of low back disorders has been attributed by the majority of the studies to the physical demands of the work of nursing, i.e., lifting, supporting and moving of patients and handling heavy and/or bulky equipment and materials, often without ergonomic aids or the help of additional personnel. These activities are often conducted in awkward body positions, where footing is uncertain, and when, out of wilfulness or dementia, the nurse’s efforts are resisted by the patient. Trying to keep a patient from falling often results in injury to the nurse or the attendant. Current research, however, is characterized by a strong tendency to speak in terms of multicausality, whereby both the biomechanical basis of demands made upon the body and the anatomical preconditions are discussed.

In addition to faulty biomechanics, injury in such situations can be pre-conditioned by fatigue, muscular weakness (especially of the abdominals, back extensors and quadriceps), diminished flexibility of joints and ligaments and various forms of arthritis. Excessive psychosocial stress can contribute in two ways: (1) prolonged unconscious muscular tension and spasm leading to muscular fatigue and proneness to injury, and (2) irritation and impatience which prompts injudicious attempts to work hurriedly and without waiting for assistance. Enhanced ability to cope with stress and the availability of social support in the workplace are helpful (Theorell 1989; Bongers et al. 1992) when work-related stressors cannot be eliminated or controlled.

Diagnosis

To the risk factors deriving from the biomechanics of the forces acting on the spine, and from the anatomy of the support and movement apparatus, may be added certain risk situations and dispositions attributable to the work environment. Even though current research is not clear on this point, there is some indication that the increased and recurrent incidence of psychosocial stress factors in nursing work can reduce the threshold of sensitivity to physically burdensome activities, thus contributing to an increased level of vulnerability. In any case, whether such stress factors exist appears to be less decisive in this connection than how nursing staff manage them in a demanding situation and whether they can count on social support in the workplace (Theorell 1989; Bongers et al. 1992).

The proper diagnosis of low back pain requires a complete medical and a detailed occupational history including accidents resulting in injury or near-misses and prior episodes of back pain. The physical examination should include evaluation of gait and posture, palpation for areas of tenderness and evaluation of muscle strength, range of motion and joint flexibility. Complaints of weakness in the leg, areas of numbness and pain that radiate below the knee are indications for neurological examination to seek evidence of spinal cord and/or peripheral nerve involvement. Psychosocial problems may be disclosed through judicious probing of emotional status, attitudes and pain tolerance.

Radiological studies and scans are rarely helpful since, in the vast majority of cases, the problem lies in the muscles and ligaments rather than the bony structures. In fact, bony abnormalities are found in many individuals who have never had back pain; ascribing the back pain to such radiological findings as disc space narrowing or spondylosis may lead to needlessly heroic treatment. Myelography should not be undertaken unless spinal surgery is contemplated.

Clinical laboratory tests are useful in assessing general medical status and may be helpful in disclosing systemic diseases such as arthritis.

Treatment

Various modes of management are indicated depending on the nature of the disorder. Besides ergonomic interventions to enable the return of injured workers to the workplace, surgical, invasive-radiological, pharmacological, physical, physiotherapeutic and also psychotherapeutic management approaches may be necessary—sometimes in combination (Hofmann et al. 1994). Again, however, the vast majority of cases resolve regardless of the therapy offered. Treatment is discussed further in the Case Study: Treatment of Back Pain.

Prevention in the Work Environment

Primary prevention of back pain in the workplace involves the application of ergonomic principles and the use of technical aids, coupled with physical conditioning and training of the workers.

Despite the reservations frequently held by nursing staff regarding the use of technical aids for the lifting, positioning and moving of patients, the importance of ergonomic approaches to prevention is increasing (see Estryn-Béhar, Kaminski and Peigné 1990; Hofmann et al. 1994).

In addition to the major systems (permanently installed ceiling lifters, mobile floor lifters), a series of small and simple aids has increasingly been introduced into nursing practice (turntables, walking girdles, lifting cushions, slide boards, bed ladders, anti-slide mats and so on). When using these aids it is important that their actual use fit in well with the care concept of the particular area of nursing in which they are used. Wherever the use of such lifting aids contradicts the care concept practised, acceptance of these technical aids by nursing staff tends to be low.

Even where technical aids are employed, training in techniques of lifting, carrying and supporting is essential. Lidström and Zachrisson (1973) describe a Swedish “Back School” in which physiotherapists trained in communication conduct classes explaining the structure of the spine and its muscles, how they work in different positions and movements and what can go wrong with them, and demonstrating appropriate lifting and handling techniques that will prevent injury. Klaber Moffet et al. (1986) describe the success of a similar programme in the UK. Such training in lifting and carrying is particularly important where, for one reason or another, use of technical aids is not possible. Numerous studies have shown that training in such techniques must constantly be reviewed; knowledge gained through instruction is frequently “unlearned” in practice.

Unfortunately, the physical demands presented by patients’ size, weight, illness and positioning are not always amenable to nurses’ control and they are not always able to modify the physical environment and the way their duties are structured. Accordingly, it is important for institutional managers and nursing supervisors to be included in the educational programme so that, when making decisions about work environments, equipment and job assignments, factors making for “back friendly” working conditions can be considered. At the same time, deployment of staff, with particular reference to nurse-patient ratios and the availability of “helping hands”, must be appropriate to the nurses’ well-being as well as consistent with the care concept, as hospitals in the Scandinavian countries seem to have managed to do in exemplary fashion. This is becoming ever more important where fiscal constraints dictate staff reductions and cut-backs in equipment procurement and maintenance.

Recently developed holistic concepts, which see such training not simply as instruction in bedside lifting and carrying techniques but rather as movement programmes for both nurses and patients, could take the lead in future developments in this area. Approaches to “participatory ergonomics” and programmes of health advancement in hospitals (understood as organizational development) must also be more intensively discussed and researched as future strategies (see article “Hospital ergonomics: A review”).

Since psychosocial stress factors also exercise a moderating function in the perception and mastering of the physical demands made by work, prevention programmes should also ensure that colleagues and superiors work to ensure satisfaction with work, avoid making excessive demands on the mental and physical capacities of workers and provide an appropriate level of social support.

Preventive measures should extend beyond professional life to include work in the home (housekeeping and caring for small children who have to be lifted and carried are particular hazards) as well as in sports and other recreational activities. Individuals with persistent or recurrent back pain, however it is acquired, should be no less diligent in following an appropriate preventive regimen.

Rehabilitation

The key to a rapid recovery is early mobilization and a prompt resumption of activities within the limits of tolerance and comfort. Most patients with acute back injuries recover fully and return to their usual work without incident. Resumption of an unrestricted range of activity should not be undertaken until exercises have fully restored muscle strength and flexibility and banished the fear and temerity that make for recurrent injury. Many individuals exhibit a tendency to recurrences and chronicity; for these, physiotherapy coupled with exercise and control of psychosocial factors will often be helpful. It is important that they return to some form of work as quickly as possible. Temporary elimination of more strenuous tasks and limitation of hours, with a graduated return to unrestricted activity, will promote a more complete recovery in these cases.

Fitness for work

The professional literature attributes only a very limited prognostic value to screening done before employees start work (US Preventive Services Task Force 1989). Ethical considerations and laws such as the Americans with Disabilities Act militate against pre-employment screening. It is generally agreed that pre-employment back x rays have no value, particularly when one considers their cost and the needless exposure to radiation. Newly-hired nurses and other health workers, and those returning from an episode of disability due to back pain, should be evaluated to detect any predisposition to this problem and provided with access to educational and physical conditioning programmes that will prevent it.

Conclusion

The social and economic impact of back pain, a problem particularly prevalent among nurses, can be minimized by the application of ergonomic principles and technology in the organization of their work and its environment, by physical conditioning that enhances the strength and flexibility of the postural muscles, by education and training in the performance of problematic activities and, when episodes of back pain do occur, by treatment that emphasizes a minimum of medical intervention and a prompt return to activity.

 


Wednesday, 02 March 2011 15:40

Ergonomics of the Physical Work Environment

Several countries have established recommended noise, temperature and lighting levels for hospitals. These recommendations are, however, rarely included in the specifications given to hospital designers. Further, the few studies examining these variables have reported disquieting levels.

Noise

In hospitals, it is important to distinguish between machine-generated noise capable of impairing hearing (above 85 dBA) and noise which is associated with a degradation of ambiance, administrative work and care (65 to 85 dBA).

Machine-generated noise capable of impairing hearing

Prior to the 1980s, a few publications had already drawn attention to this problem. Van Wagoner and Maguire (1977) evaluated the incidence of hearing loss among 100 employees in an urban hospital in Canada. They identified five zones in which noise levels were between 85 and 115 dBA: the electrical plant, the laundry, the dish-washing station, the printing department and areas where maintenance workers used hand or power tools. Hearing loss was observed in 48% of the 50 workers active in these noisy areas, compared to 6% of workers active in quieter areas.

Yassi et al. (1992) conducted a preliminary survey to identify zones with dangerously high noise levels in a large Canadian hospital. Integrated dosimetry and mapping were subsequently used to study these high-risk areas in detail. Noise levels exceeding 80 dBA were common. The laundry, central processing, nutrition department, rehabilitation unit, stores and electrical plant were all studied in detail. Integrated dosimetry revealed levels of up to 110 dBA at some of these locations.

Noise levels in a Spanish hospital’s laundry exceeded 85 dBA at all workstations and reached 97 dBA in some zones (Montoliu et al. 1992). Noise levels of 85 to 94 dBA were measured at some workstations in a French hospital’s laundry (Cabal et al. 1986). Although machine re-engineering reduced the noise generated by pressing machines to 78 dBA, this process was not applicable to other machines, due to their inherent design.

A study in the United States reported that electrical surgical instruments generate noise levels of 90 to 100 dBA (Willet 1991). In the same study, 11 of 24 orthopaedic surgeons were reported to suffer from significant hearing loss. The need for better instrument design was emphasized. Vacuum and monitor alarms have been reported to generate noise levels of up to 108 dBA (Hodge and Thompson 1990).

Noise associated with a degradation of ambiance, administrative work and care

A systematic review of noise levels in six Egyptian hospitals revealed the presence of excessive levels in offices, waiting rooms and corridors (Noweir and al-Jiffry 1991). This was attributed to the characteristics of hospital construction and of some of the machines. The authors recommended the use of more appropriate building materials and equipment and the implementation of good maintenance practices.

Work in the first computerized facilities was hindered by the poor quality of printers and the inadequate acoustics of offices. In the Paris region, groups of cashiers talked to their clients and processed invoices and payments in a crowded room whose low plaster ceiling had no acoustic absorption capacity. Noise levels with only one printer active (in practice, all four usually were) were 78 dBA for payments and 82 dBA for invoices.

In a 1992 study of a rehabilitation gymnasium consisting of 8 cardiac rehabilitation bicycles surrounded by four private patient areas, noise levels of 75 to 80 dBA and 65 to 75 dBA were measured near cardiac rehabilitation bicycles and in the neighbouring kinesiology area, respectively. Levels such as these render personalized care difficult.

Shapiro and Berland (1972) viewed noise in operating theatres as the “third pollution”, since it increases the fatigue of the surgeons, exerts physiological and psychological effects and influences the accuracy of movements. Noise levels were measured during a cholecystectomy and during tubal ligation. Irritating noises were associated with the opening of a package of gloves (86 dBA), the installation of a platform on the floor (85 dBA), platform adjustment (75 to 80 dBA), placing surgical instruments upon each other (80 dBA), suctioning of the patient’s trachea (78 dBA), continuous suction bottle (75 to 85 dBA) and the heels of nurses’ shoes (68 dBA). The authors recommended the use of heat-resistant plastic, less noisy instruments and, to minimize reverberation, easily cleaned materials other than ceramic or glass for walls, tiles and ceilings.

Noise levels of 51 to 82 dBA and 54 to 73 dBA have been measured in the centrifuge room and automated analyser room of a medical analytical laboratory. The Leq (reflecting full-shift exposure) at the control station was 70.44 dBA, with 3 hours over 70 dBA. At the technical station, the Leq was 72.63 dBA, with 7 hours over 70 dBA. The following improvements were recommended: installing telephones with adjustable ring levels, grouping centrifuges in a closed room, moving photocopiers and printers and installing acoustic enclosures around the printers.
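An Leq of this kind is obtained by energy-averaging the measured A-weighted levels over the shift, since decibels cannot simply be averaged arithmetically. The calculation can be sketched as follows (the function name and the sample values are illustrative, not the laboratory measurements reported in the study):

```python
import math

def leq(levels_dba, durations_h):
    """Equivalent continuous sound level (Leq) over a work period.

    Energy-averages A-weighted levels (dBA), each held for the given
    duration in hours: Leq = 10 log10((1/T) * sum(t_i * 10^(L_i/10))).
    """
    total_h = sum(durations_h)
    energy = sum(t * 10 ** (l / 10) for l, t in zip(levels_dba, durations_h))
    return 10 * math.log10(energy / total_h)

# Hypothetical 8-hour shift: 3 h at 72 dBA, 5 h at 68 dBA.
print(round(leq([72, 68], [3, 5]), 2))  # 69.95
```

Note that the louder interval dominates the result: the 3 hours at 72 dBA contribute more acoustic energy than the 5 hours at 68 dBA.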

Patient care and comfort

In several countries, recommended noise limits for care units are 35 dBA at night and 40 dBA during the day (Turner, King and Craddock 1975). Falk and Woods (1973) were the first to draw attention to this point, in their study of noise levels and sources in neonatology incubators, recovery rooms and two rooms in an intensive-care unit. The following mean levels were measured over a 24-hour period: 57.7 dBA (74.5 dB linear) in the incubators, 65.5 dBA (80 dB linear) at the head of patients in the recovery room, 60.1 dBA (73.3 dB linear) in the intensive-care unit and 55.8 dBA (68.1 dB linear) in one patient room. Noise levels in the recovery room and intensive-care unit were correlated with the number of nurses. The authors emphasized the probable stimulation of patients’ hypophyseal-corticoadrenal system by these noise levels, and the resultant increase in peripheral vasoconstriction. There was also some concern about the hearing of patients receiving aminoglycoside antibiotics. These noise levels were considered incompatible with sleep.

Several studies, most of which have been conducted by nurses, have shown that noise control improves patient recovery and quality of life. Reports of research conducted in neonatology wards caring for low-birth-weight babies emphasized the need to reduce the noise caused by personnel, equipment and radiology activities (Green 1992; Wahlen 1992; Williams and Murphy 1991; Oëler 1993; Lotas 1992; Halm and Alpen 1993). Halm and Alpen (1993) have studied the relationship between noise levels in intensive-care units and the psychological well-being of patients and their families, including, in extreme cases, post-resuscitation psychosis. The effect of ambient noise on the quality of sleep has been rigorously evaluated under experimental conditions (Topf 1992). In intensive-care units, the playing of pre-recorded sounds was associated with a deterioration of several sleep parameters.

A multi-ward study reported peak noise levels at the head of patients in excess of 80 dBA, especially in intensive- and respiratory-care units (Meyer et al. 1994). Lighting and noise levels were recorded continuously over seven consecutive days in a medical intensive-care unit, one-bed and multi-bed rooms in a respiratory-care unit and a private room. Noise levels were very high in all cases. The number of peaks exceeding 80 dBA was particularly high in the intensive- and respiratory-care units, with a maximum observed between 12:00 and 18:00 and a minimum between 00:00 and 06:00. Sleep deprivation and fragmentation were considered to have a negative impact on the respiratory system of patients and impair the weaning of patients from mechanical ventilation.

Blanpain and Estryn-Béhar (1990) found few noisy machines such as waxers, ice machines and hotplates in their study of ten Paris-area wards. However, the size and surfaces of the rooms could either reduce or amplify the noise generated by these machines, as well as that (albeit lower) generated by passing cars, ventilation systems and alarms. Noise levels in excess of 45 dBA (observed in 7 of 10 wards) did not promote patient rest. Furthermore, noise disturbed hospital personnel performing very precise tasks requiring close attention. In five of 10 wards, noise levels at the nursing station reached 65 dBA; in two wards, levels of 73 dBA were measured. Levels in excess of 65 dBA were measured in three pantries.

In some cases, architectural decorative effects were instituted with no thought to their effect on acoustics. For example, glass walls and ceilings have been in fashion since the 1970s and have been used in patient admission open-space offices. The resultant noise levels do not contribute to the creation of a calm environment in which patients about to enter the hospital can fill out forms. Fountains in this type of hall generated a background noise level of 73 dBA at the reception desk, requiring receptionists to ask one-third of people requesting information to repeat themselves.

Heat stress

Costa, Trinco and Schallenberg (1992) studied the effect of installing a laminar flow system, which maintained air sterility, on heat stress in an orthopaedic operating theatre. Temperature in the operating theatre increased by approximately 3 °C on average and could reach 30.2 °C. This was associated with a deterioration of the thermal comfort of operating-room personnel, who must wear very bulky clothes that favour heat retention.

Cabal et al. (1986) analysed heat stress in a hospital laundry in central France prior to its renovation. They noted that the relative humidity at the hottest workstation, the “gown-dummy”, was 30%, and radiant temperature reached 41 °C. Following installation of double-pane glass and reflective outside walls, and implementation of 10 to 15 air changes per hour, thermal comfort parameters fell within standard levels at all workstations, regardless of the weather outside. A study of a Spanish hospital laundry has shown that high wet-bulb temperatures result in oppressive work environments, especially in ironing areas, where temperatures may exceed 30 °C (Montoliu et al. 1992).

Blanpain and Estryn-Béhar (1990) characterized the physical work environment in ten wards whose work content they had already studied. Temperature was measured twice in each of ten wards. The nocturnal temperature in patient rooms may be below 22 °C, as patients use covers. During the day, as long as patients are relatively inactive, a temperature of 24 °C is acceptable but should not be exceeded, since some nursing interventions require significant exertion.

The following temperatures were observed between 07:00 and 07:30: 21.5 °C in geriatric wards, 26 °C in a non-sterile room in the haematology ward. At 14:30 on a sunny day, the temperatures were as follows: 23.5 °C in the emergency room and 29 °C in the haematology ward. Afternoon temperatures exceeded 24 °C in 9 of 19 cases. The relative humidity in four out of five wards with general air-conditioning was below 45% and was below 35% in two wards.

Afternoon temperature also exceeded 22 °C at all nine care preparation stations and 26 °C at three care stations. The relative humidity was below 45% in all five stations of wards with air-conditioning. In the pantries, temperatures ranged between 18 °C and 28.5 °C.

Temperatures of 22 °C to 25 °C were measured at the urine drains, where there were also odour problems and where dirty laundry was sometimes stored. Temperatures of 23 °C to 25 °C were measured in the two dirty-laundry closets; a temperature of 18 °C would be more appropriate.

Complaints concerning thermal comfort were frequent in a survey of 2,892 women working in Paris-area wards (Estryn-Béhar et al. 1989a). Complaints of being often or always hot were reported by 47% of morning- and afternoon-shift nurses and 37% of night-shift nurses. Although nurses were sometimes obliged to perform physically strenuous work, such as making several beds, the temperature in the various rooms was too high to perform these activities comfortably while wearing polyester-cotton clothes, which hinder evaporation, or gowns and masks necessary for the prevention of nosocomial infections.

On the other hand, 46% of night-shift nurses and 26% of morning- and afternoon-shift nurses reported being often or always cold. The proportions reporting never suffering from the cold were 11% of night-shift nurses and 26% of morning- and afternoon-shift nurses.

To conserve energy, the heating in hospitals was often lowered during the night, when patients are under covers. However, nurses, who must remain alert despite chronobiologically mediated drops in core body temperature, were required to put on jackets (not always very hygienic ones) around 04:00. At the end of the study, some wards installed adjustable space-heating at nursing stations.

Studies of 1,505 women in 26 units conducted by occupational physicians revealed that rhinitis and eye irritation were more frequent among nurses working in air-conditioned rooms (Estryn-Béhar and Poinsignon 1989) and that work in air-conditioned environments was related to an almost twofold increase in dermatoses likely to be occupational in origin (adjusted odds ratio of 2) (Delaporte et al. 1990).

Lighting

Several studies have shown that the importance of good lighting is still underestimated in administrative and general departments of hospitals.

Cabal et al. (1986) observed that lighting levels at half of the workstations in a hospital laundry were no higher than 100 lux. Lighting levels following renovations were 300 lux at all workstations, 800 lux at the darning station and 150 lux between the washing tunnels.

Blanpain and Estryn-Béhar (1990) observed maximum night lighting levels below 500 lux in 9 out of 10 wards. Lighting levels were below 250 lux in five pharmacies with no natural lighting and were below 90 lux in three pharmacies. It should be recalled that the difficulty in reading small lettering on labels experienced by older persons may be mitigated by increasing the level of illumination.

Building orientation can result in high day-time lighting levels that disturb patients’ rest. For example, in geriatric wards, beds furthest from the windows received 1,200 lux, while those nearest the windows received 5,000 lux. The only window shading available in these rooms was solid blinds, and nurses were unable to dispense care in four-bed rooms when these were drawn. In some cases, nurses stuck paper on the windows to provide patients with some relief.

The lighting in some intensive-care units is too intense to allow patients to rest (Meyer et al. 1994). The effect of lighting on patients’ sleep has been studied in neonatology wards by North American and German nurses (Oëler 1993; Boehm and Bollinger 1990).

In one hospital, surgeons disturbed by reflections from white tiles requested the renovation of the operating theatre. Lighting levels outside the shadow-free zone (15,000 to 80,000 lux) were reduced. However, this resulted in levels of only 100 lux at the instrument nurses’ work surface, 50 to 150 lux at the wall unit used for equipment storage, 70 lux at the patients’ head and 150 lux at the anaesthetists’ work surface. To avoid generating glare capable of affecting the accuracy of surgeons’ movements, lamps were installed outside of surgeons’ sight-lines. Rheostats were installed to control lighting levels at the nurses’ work surface between 300 and 1,000 lux and general levels between 100 and 300 lux.

Construction of a hospital with extensive natural lighting

In 1981, planning for the construction of Saint Mary’s Hospital on the Isle of Wight began with a goal of halving energy costs (Burton 1990). The final design called for extensive use of natural lighting and incorporated double-pane windows that could be opened in the summer. Even the operating theatre has an outside view and paediatric wards are located on the ground floor to allow access to play areas. The other wards, on the second and third (top) floors, are equipped with windows and ceiling lighting. This design is quite suitable for temperate climates but may be problematic where ice and snow inhibit overhead lighting or where high temperatures may lead to a significant greenhouse effect.

Architecture and Working Conditions

Flexible design is not multi-functionality

Prevailing concepts from 1945 to 1985, in particular the fear of instant obsolescence, were reflected in the construction of multi-purpose hospitals composed of identical modules (Games and Taton-Braen 1987). In the United Kingdom this trend led to the development of the “Harness system”, whose first product was the Dudley Hospital, built in 1974. Seventy other hospitals were later built on the same principles. In France, several hospitals were constructed on the “Fontenoy” model.

Building design should not prevent modifications necessitated by the rapid evolution of therapeutic practice and technology. For example, partitions, fluid circulation subsystems and technical duct-work should all be capable of being easily moved. However, this flexibility should not be construed as an endorsement of the goal of complete multi-functionality—a design goal which leads to the construction of facilities poorly suited to any speciality. For example, the surface area needed to store machines, bottles, disposable equipment and medication is different in surgical, cardiology and geriatric wards. Failure to recognize this will lead to rooms being used for purposes they were not designed for (e.g., bathrooms being used for bottle storage).

The Loma Linda Hospital in California (United States) is an example of better hospital design and has been copied elsewhere. Here, nursing and technical medicine departments are located above and below technical floors; this “sandwich” structure permits easy maintenance and adjustment of fluid circulation.

Unfortunately, hospital architecture does not always reflect the needs of those who work there, and multi-functional design has been responsible for reported problems related to physical and cognitive strain. Consider a 30-bed ward composed of one- and two-bed rooms, in which there is only one functional area of each type (nursing station, pantry, storage of disposable materials, linen or medication), all based on the same all-purpose design. In this ward, the management and dispensation of care obliges nurses to change location extremely frequently, and work is greatly fragmented. A comparative study of ten wards has shown that the distance from the nurses’ station to the farthest room is an important determinant of both nurses’ fatigue (a function of the distance walked) and the quality of care (a function of the time spent in patients’ rooms) (Estryn-Béhar and Hakim-Serfaty 1990).

This discrepancy between the architectural design of spaces, corridors and materials, on the one hand, and the realities of hospital work, on the other, has been characterized by Patkin (1992), in a review of Australian hospitals, as an ergonomic “debacle”.

Preliminary analysis of the spatial organization in nursing areas

The first mathematical model of the nature, purposes and frequency of staff movements, based on the Yale Traffic Index, appeared in 1960 and was refined by Lippert in 1971. However, attention to one problem in isolation may in fact aggravate others. For example, locating a nurses’ station in the centre of the building, in order to reduce the distances walked, may worsen working conditions if nurses must spend over 30% of their time in such windowless surroundings, known to be a source of problems related to lighting, ventilation and psychological factors (Estryn-Béhar and Milanini 1992).
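Models of this kind weight the frequency of each staff movement by the distance it covers, so that alternative station placements can be compared numerically. A minimal illustrative sketch (all figures hypothetical, not drawn from the Yale or Lippert data) shows how relocating the nurses’ station changes the total distance walked per shift:

```python
def traffic_index(trips):
    """Distance-weighted traffic index: sum over origin-destination
    pairs of (trips per shift) x (one-way distance in metres).
    A simplified, illustrative version of a staff-movement model."""
    return sum(n * d for n, d in trips)

# Hypothetical 30-bed ward with two wings, 40 round trips to each:
central_station = traffic_index([(40, 15), (40, 15)])  # station mid-ward
end_station = traffic_index([(40, 5), (40, 35)])       # station at one end
print(central_station, end_station)  # 1200 1600
```

Even this toy comparison shows why a central station minimizes walking, while the text’s caveat stands: the optimum for distance may place nurses in windowless surroundings, so the index cannot be the sole design criterion.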

The distance of the preparation and storage areas from patients is less problematic in settings with a high staff-patient ratio and where the existence of a centralized preparation area facilitates the delivery of supplies several times per day, even on holidays. In addition, long waits for elevators are less common in high-rise hospitals with over 600 beds, where the number of elevators is not limited by financial constraints.

Research on the design of specific but flexible hospital units

In the United Kingdom in the late 1970s, the Health Ministry created a team of ergonomists to compile a database on ergonomics training and on the ergonomic layout of hospital work areas (Haigh 1992). Noteworthy examples of the success of this programme include the modification of the dimensions of laboratory furniture to take into account the demands of microscopy work and the redesign of maternity rooms to take into account nurses’ work and mothers’ preferences.

Cammock (1981) emphasized the need to provide distinct nursing, public and common areas, with separate entrances for nursing and public areas, and separate connections between these areas and the common area. Furthermore, there should be no direct contact between the public and nursing areas.

The Krankenanstalt Rudolfsstiftung is the first pilot hospital of the “European Healthy Hospitals” project. The Viennese pilot project consists of eight sub-projects, one of which, the “Service Reorganization” project, is an attempt, in collaboration with ergonomists, to promote functional reorganization of available space (Pelikan 1993). For example, all the rooms in an intensive care unit were renovated and rails for patient lifts installed in the ceilings of each room.

A comparative analysis of 90 Dutch hospitals suggests that small units (floors of less than 1,500 m2) are the most efficient, as they allow nurses to tailor their care to the specifics of patients’ occupational therapy and family dynamics (Van Hogdalem 1990). This design also increases the time nurses can spend with patients, since they waste less time in changes of location and are less subject to uncertainty. Finally, the use of small units reduces the number of windowless work areas.

A study carried out in the health administration sector in Sweden reported better employee performance in buildings incorporating individual offices and conference rooms, as opposed to an open plan (Ahlin 1992). The existence in Sweden of an institute dedicated to the study of working conditions in hospitals, and of legislation requiring consultation with employee representatives both before and during all construction or renovation projects, has resulted in the regular recourse to participatory design based on ergonomic training and intervention (Tornquist and Ullmark 1992).

Architectural design based on participatory ergonomics

Workers must be involved in the planning of the behavioural and organizational changes associated with the occupation of a new work space. The adequate organization and equipping of a workplace requires taking into account the organizational elements that require modification or emphasis. Two detailed examples taken from two hospitals illustrate this.

Estryn-Béhar et al. (1994) report the results of the renovation of the common areas of a medical ward and a cardiology ward of the same hospital. The ergonomics of the work performed by each profession in each ward was observed over seven entire workdays and discussed over a two-day period with each group. The groups included representatives of all occupations (department heads, supervisors, interns, nurses, nurses’ aides, orderlies) from all the shifts. One entire day was spent developing architectural and organizational proposals for each problem noted. Two more days were spent on the simulation of characteristic activities by the entire group, in collaboration with an architect and an ergonomist, using modular cardboard mock-ups and scale models of objects and people. Through this simulation, representatives of the various occupations were able to agree on distances and the distribution of space within each ward. Only after this process was concluded was the design specification drawn up.

The same participatory method was used in a cardiac intensive-care unit in another hospital (Estryn-Béhar et al. 1995a, 1995b). It was found that four types of virtually incompatible activities were conducted at the nursing station:

  • care preparation, requiring the use of a drain-board and sink
  • decontamination, which also used the sink
  • meeting, writing and monitoring; the area used for these activities was also sometimes used for the preparation of care
  • clean-equipment storage (three units) and waste storage (one unit).

 

These zones overlapped, and nurses had to cross the meeting-writing-monitoring area to reach the other areas. Because of the position of the furniture, nurses had to change direction three times to get to the drain-board. Patient rooms were laid out along a corridor, both for regular intensive care and highly intensive care. The storage units were located at the far end of the ward from the nursing station.

In the new layout, the station’s longitudinal orientation of functions and traffic is replaced with a lateral one which allows direct and central circulation in a furniture-free area. The meeting-writing-monitoring area is now located at the end of the room, where it offers a calm space near windows, while remaining accessible. The clean and dirty preparation areas are located by the entrance to the room and are separated from each other by a large circulation area. The highly intensive care rooms are large enough to accommodate emergency equipment, a preparation counter and a deep washbasin. A glass wall installed between the preparation areas and the highly intensive care rooms ensures that patients in these rooms are always visible. The main storage area was rationalized and reorganized. Plans are available for each work and storage area.

Architecture, ergonomics and developing countries

These problems are also found in developing countries; in particular, renovations there frequently involve the elimination of common rooms. The performance of ergonomic analysis would identify existing problems and help avoid new ones. For example, the construction of wards comprised of only one- or two-bed rooms increases the distances that personnel must travel. Inadequate attention to staffing levels and the layout of nursing stations, satellite kitchens, satellite pharmacies and storage areas may lead to significant reductions in the amount of time nurses spend with patients and may render work organization more complex.

Furthermore, the application in developing countries of the multi-functional hospital model of developed countries does not take into account different cultures’ attitudes toward space utilization. Manuaba (1992) has pointed out that the layout of developed countries’ hospital rooms and the type of medical equipment used is poorly suited to developing countries, and that the rooms are too small to comfortably accommodate visitors, essential partners in the curative process.

Hygiene and Ergonomics

In hospital settings, many breaches of asepsis can be understood and corrected only by reference to work organization and work space. Effective implementation of the necessary modifications requires detailed ergonomic analysis. This analysis serves to characterize the interdependencies of team tasks, rather than their individual characteristics, and identify discrepancies between real and nominal work, especially nominal work described in official protocols.

Hand-mediated contamination was one of the first targets in the fight against nosocomial infections. In theory, hands should be systematically washed on entering and leaving patients’ rooms. Although initial and ongoing training of nurses emphasizes the results of descriptive epidemiological studies, research indicates persistent problems associated with hand-washing. In a study conducted in 1987 and involving continuous observation of entire 8-hour shifts in 10 wards, Delaporte et al. (1990) observed an average of 17 hand-washings by morning-shift nurses, 13 by afternoon-shift nurses and 21 by night-shift nurses.

Nurses washed their hands one-half to one-third as often as is recommended for their number of patient contacts (without even considering care-preparation activities); for nurses’ aides, the ratio was one-third to one-fifth. Hand-washing before and after each activity is, however, clearly impossible, in terms of both time and skin damage, given the atomization of activity, number of technical interventions and frequency of interruptions and attendant repetition of care that personnel must cope with. Reduction of work interruptions is thus essential and should take precedence over simply reaffirming the importance of hand-washing, which, in any event, cannot be performed over 25 to 30 times per day.

Similar patterns of hand-washing were found in a study based on observations collected over 14 entire workdays in 1994 during the reorganization of the common areas of two university hospital wards (Estryn-Béhar et al. 1994). In every case, nurses would have been incapable of dispensing the required care if they had returned to the nursing station to wash their hands. In short-term-stay units, for example, almost all the patients have blood samples drawn and subsequently receive oral and intravenous medication at virtually the same time. The density of activities at certain times also renders appropriate hand-washing impossible: in one case, an afternoon-shift nurse responsible for 13 patients in a medical ward entered patients’ rooms 21 times in one hour. Poorly organized information provision and transmission structures contributed to the number of visits he was obliged to perform. Given the impossibility of washing his hands 21 times in one hour, the nurse washed them only when dealing with the most fragile patients (i.e., those suffering from pulmonary failure).

Ergonomically based architectural design takes several factors affecting hand-washing into account, especially those concerning the location and access to wash-basins, but also the implementation of truly functional “dirty” and “clean” circuits. Reduction of interruptions through participatory analysis of organization helps to make hand-washing possible.

 


Wednesday, 02 March 2011 15:37

Exposure to Physical Agents

Health care workers (HCWs) confront numerous physical hazards.

Electrical Hazards

Failure to meet standards for electrical equipment and its use is the most frequently cited violation in all industries. In hospitals, electrical malfunctions are the second leading cause of fires. Additionally, hospitals require that a wide variety of electrical equipment be used in hazardous environments (i.e., in wet or damp locations or adjacent to flammables or combustibles).

Recognition of these facts and the danger they may pose to patients has led most hospitals to put great effort into electrical safety promotion in patient-care areas. However, non-patient areas are sometimes neglected and employee- or hospital-owned appliances may be found with:

  • three-wire (grounded) plugs attached to two-wire (ungrounded) cords
  • ground prongs bent or cut off
  • ungrounded appliances attached to ungrounded multiple-plug “spiders”
  • extension cords with improper grounding
  • moulded cord-and-plug assemblies that are improperly wired (25% of the x-ray equipment in one hospital study was incorrectly wired).

 

Prevention and control

It is critical that all electrical installations be in accordance with prescribed safety standards and regulations. Measures that can be taken to prevent fires and avoid shocks to employees include the following:

  • provision for regular inspection of all employee work areas by an electrical engineer to discover and correct hazardous conditions such as ungrounded or poorly maintained appliances or tools
  • inclusion of electrical safety in both orientation and in-service training programmes.

 

Employees should be instructed:

  • not to use electrical equipment with wet hands, on wet surfaces or when standing on wet floors
  • not to use devices that blow a fuse or trip a circuit breaker until they have been inspected
  • not to use any appliance, equipment or wall receptacle that appears to be damaged or in poor repair
  • to use extension cords only temporarily and only in emergency situations
  • to use extension cords designed to carry the voltage required
  • to turn off equipment before unplugging it
  • to report all shocks immediately (including small tingles) and not to use equipment again until it has been inspected.

 

Heat

Although heat-related health effects on hospital workers can include heat stroke, exhaustion, cramps and fainting, these are rare. More common are the milder effects of increased fatigue, discomfort and inability to concentrate. These are important because they may increase the risk of accidents.

Heat exposure can be measured with wet bulb and globe thermometers, expressed as the Wet Bulb Globe Temperature (WBGT) Index, which combines the effects of radiant heat and humidity with the dry bulb temperature. This testing should only be done by a skilled individual.
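The WBGT arithmetic itself is a simple weighted sum; the following sketch uses the standard indoor (no solar load) and outdoor weightings from ISO 7243, with hypothetical thermometer readings for illustration:

```python
def wbgt_indoor(t_nwb: float, t_globe: float) -> float:
    """WBGT without direct solar load: 0.7*T(natural wet bulb) + 0.3*T(globe)."""
    return 0.7 * t_nwb + 0.3 * t_globe

def wbgt_outdoor(t_nwb: float, t_globe: float, t_dry: float) -> float:
    """WBGT with solar load adds a dry-bulb term:
    0.7*T(natural wet bulb) + 0.2*T(globe) + 0.1*T(dry bulb)."""
    return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_dry

# Hypothetical laundry-room readings, in degrees Celsius
print(round(wbgt_indoor(25.0, 35.0), 1))  # 28.0
```

Note that interpreting the resulting index against work-rest limits still requires the skilled judgement mentioned above; the formula is only the first step.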

The boiler room, laundry and kitchen are the most common high-temperature environments in the hospital. However, in old buildings with inadequate ventilation and cooling systems heat may be a problem in many locations in summer months. Heat exposure may also be a problem where ambient temperatures are elevated and health care personnel are required to wear occlusive gowns, caps, masks and gloves.

Prevention and control

Although it may be impossible to keep some hospital settings at a comfortable temperature, there are measures to keep temperatures at acceptable levels and to ameliorate the effects of heat upon workers, including:

  • provision of adequate ventilation. Central air-conditioning systems may need to be supplemented by floor fans, for example.
  • making cool drinking water easily accessible
  • rotating employees so that periodic relief is scheduled
  • scheduling frequent breaks in cool areas.

 

Noise

Exposure to high levels of noise in the workplace is a common job hazard. The “quiet” image of hospitals notwithstanding, they can be noisy places to work.

Exposure to loud noises can cause a loss in hearing acuity. Short-term exposure to loud noises can cause a decrease in hearing called a “temporary threshold shift” (TTS). While these TTSs can be reversed with sufficient rest from high noise levels, the nerve damage resulting from long-term exposure to loud noises cannot.

The US Occupational Safety and Health Administration (OSHA) has set 90 dBA as the permissible limit for an 8-hour workday. For 8-hour average exposures in excess of 85 dBA, a hearing conservation programme is mandated. (Sound level meters, the basic noise-measuring instruments, are provided with three weighting networks. OSHA standards use the A scale, expressed as dBA.)
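The OSHA limit works with a 5-dB exchange rate: each 5-dBA increase halves the permissible exposure time. A minimal sketch of that arithmetic and of the resulting daily noise dose follows (the shift pattern in the example is hypothetical):

```python
def permissible_hours(level_dba: float) -> float:
    """Permissible exposure time at a given A-weighted level under the
    90-dBA criterion with a 5-dB exchange rate: T = 8 / 2**((L - 90) / 5)."""
    return 8.0 / 2.0 ** ((level_dba - 90.0) / 5.0)

def daily_noise_dose(exposures) -> float:
    """Daily dose in percent for (level_dBA, hours) pairs; over 100% exceeds the limit."""
    return 100.0 * sum(hours / permissible_hours(level) for level, hours in exposures)

print(permissible_hours(95.0))                       # 4.0 hours allowed at 95 dBA
print(daily_noise_dose([(90.0, 4.0), (95.0, 2.0)]))  # 4/8 + 2/4 of the limit -> 100.0
```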

The effects of noise at the 70-dB level are reported by the National Institute of Environmental Health Sciences to be:

  • blood vessel constriction that can lead to higher blood pressure and decreased circulation in the hands and feet (perceived as coldness)
  • headaches
  • increased irritability
  • difficulty in communicating with co-workers
  • reduced ability to work
  • more difficulty with tasks that require alertness, concentration and attention to detail.

 

Food service areas, laboratories, engineering areas (which usually include the boiler room), the business office, medical records and nursing units can be so noisy that productivity is reduced. Other departments where noise levels are sometimes quite high are laundries, print shops and construction areas.

Prevention and control

If a noise survey of the facility shows that employees’ noise exposure is in excess of the OSHA standard, a noise abatement programme is required. Such a programme should include:

  • periodic measurement
  • engineering controls such as isolating noisy equipment, installing mufflers and acoustic ceilings and carpets
  • administrative controls limiting workers’ exposure time to excessive noise.

 

In addition to abatement measures, a hearing conservation programme should be established that provides for:

  • hearing tests for new employees to provide baselines for future testing
  • annual audiometric testing
  • hearing protection for use while controls are being implemented and for situations where levels cannot be brought within approved limits.

 

Inadequate Ventilation

The specific ventilation requirements for various types of equipment are engineering matters and will not be discussed here. However, both old and new facilities present general ventilation problems that warrant mentioning.

In older facilities built before central heating and cooling systems were common, ventilation problems must often be solved on a location-by-location basis. Frequently, the problem rests in achieving uniform temperatures and correct circulation.

In newer facilities that are hermetically sealed, a phenomenon called “tight-building syndrome” or “sick building syndrome” is sometimes experienced. When the circulation system does not exchange the air rapidly enough, concentrations of irritants may build up to the extent that employees may experience such reactions as sore throat, runny nose and watery eyes. This situation can provoke severe reaction in sensitized individuals. It can be exacerbated by various chemicals emitted from such sources as foam insulation, carpeting, adhesives and cleaning agents.

Prevention and control

While careful attention is paid to ventilation in sensitive areas such as surgical suites, less attention is given to general-purpose areas. It is important to alert employees to report irritant reactions that appear only in the workplace. If local air quality cannot be improved with venting, it may be necessary to transfer individuals who have become sensitized to some irritant in their workstation.

Laser Smoke

During surgical procedures using a laser or electrosurgical unit, the thermal destruction of tissue creates smoke as a by-product. NIOSH has confirmed studies showing that this smoke plume can contain toxic gases and vapours such as benzene, hydrogen cyanide and formaldehyde, bioaerosols, dead and live cellular material (including blood fragments) and viruses. At high concentrations, the smoke causes ocular and upper respiratory tract irritation in health care personnel and may create visual problems for the surgeon. The smoke has an unpleasant odour and has been shown to have mutagenic material.

Prevention and control

Exposure to airborne contaminants in such smoke can be effectively controlled by proper ventilation of the treatment room, supplemented by local exhaust ventilation (LEV) using a high-efficiency suction unit (i.e., a vacuum pump with an inlet nozzle held within 2 inches of the surgical site) that is activated throughout the procedure. Both the room ventilation system and the local exhaust ventilator should be equipped with filters and absorbers that capture particulates and absorb or inactivate airborne gases and vapours. These filters and absorbers require monitoring and replacement on a regular basis and are considered a possible biohazard requiring proper disposal.

Radiation

Ionizing radiation

When ionizing radiation strikes cells in living tissue, it may either kill the cell directly (i.e., cause burns or hair loss) or it may alter the genetic material of the cell (i.e., cause cancer or reproductive damage). Standards involving ionizing radiation may refer to exposure (the amount of radiation the body is exposed to) or dose (the amount of radiation the body absorbs) and are usually expressed in millirems (mrem), the customary measure, or in rems (1 rem = 1,000 mrem).
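Since dosimeter records accumulate readings in these units, the conversions are worth keeping straight. A small sketch follows; the monthly badge readings and the 5,000-mrem annual figure are illustrative assumptions, not a statement of any particular jurisdiction's limit:

```python
MREM_PER_REM = 1_000   # 1 rem = 1,000 millirem, as in the text
MSV_PER_REM = 10.0     # SI equivalent: 1 rem = 10 millisievert

def annual_dose_rem(monthly_badge_mrem) -> float:
    """Sum a year of dosimeter-badge readings (in mrem) and convert to rem."""
    return sum(monthly_badge_mrem) / MREM_PER_REM

# Hypothetical badge record: 120 mrem recorded each month
dose = annual_dose_rem([120.0] * 12)
print(dose, "rem =", round(dose * MSV_PER_REM, 1), "mSv")  # 1.44 rem = 14.4 mSv
print(dose * MREM_PER_REM <= 5_000)  # within the illustrative 5,000 mrem/yr figure
```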

Various jurisdictions have developed regulations governing the procurement, use, transportation and disposal of radioactive materials, as well as established limits for exposure (and in some places specific limits for dosage to various parts of the body), providing a strong measure of protection for radiation workers. In addition, institutions using radioactive materials in treatment and research generally develop their own internal controls in addition to those prescribed by law.

The greatest dangers to hospital workers are from scatter, the small amount of radiation that is deflected or reflected from the beam into the immediate vicinity, and from unexpected exposure, either because they are inadvertently exposed in an area not defined as a radiation area or because the equipment is not well maintained.

Radiation workers in diagnostic radiology (including x ray, fluoroscopy and angiography for diagnostic purposes, dental radiography and computerized axial tomography (CAT) scanners), in therapeutic radiology, in nuclear medicine for diagnostic and therapeutic procedures, and in radiopharmaceutical laboratories are carefully followed and checked for exposure, and radiation safety is usually well managed in their workstations, although there are many localities in which control is inadequate.

There are other areas not usually designated as “radiation areas”, where careful monitoring is needed to ensure that appropriate precautions are being taken by staff and that correct safeguards are provided for patients who might be exposed. These include angiography, emergency rooms, intensive care units, locations where portable x rays are being taken and operating rooms.

Prevention and control

The following protective measures are strongly recommended for ionizing radiation (x rays and radioisotopes):

  • Rooms that house radiation sources should be properly marked and entered only by authorized personnel.
  • Films should be held in place by the patient or a member of the patient’s family, not by staff. If the patient must be held, a family member should do so. When staff must hold films or patients, the task should be rotated among the staff to minimize each individual’s cumulative dose.
  • Where portable x-ray units and radioisotopes are used, only the patient and trained personnel should be allowed in the room.
  • Adequate warning should be given to nearby workers when x rays using portable units are about to be taken.
  • X-ray controls should be located to prevent the unintentional energizing of the unit.
  • X-ray room doors should be kept closed when equipment is in use.
  • All x-ray machines should be checked before each use to ensure that the secondary radiation cones and filters are in place.
  • Patients who have received radioactive implants or other therapeutic radiology procedures should be clearly identified. Bedding, dressings, wastes and so forth from such patients should be so labelled.

 

Lead aprons, gloves and goggles must be worn by employees working in the direct field or where scatter radiation levels are high. All such protective equipment should be checked annually for cracks in the lead.

Dosimeters must be worn by all personnel exposed to ionizing radiation sources. Dosimeter badges should be regularly analysed by a laboratory with good quality control, and the results should be recorded. Records must be kept not only of each employee’s personal radiation exposure but also of the receipt and disposition of all radioisotopes.

In therapeutic radiology settings, periodic dose checks should be done using lithium fluoride (LiF) solid-state dosimeters to check on system calibration. Treatment rooms should be equipped with radiation monitor-door interlock and visual-alarm systems.

During internal or intravenous treatment with radioactive sources, the patient should be housed in a room located to minimize exposure to other patients and staff and signs posted warning others not to enter. Staff contact time should be limited, and staff should be careful in handling bedding, dressings and wastes from these patients.

During fluoroscopy and angiography, the following measures can minimize unnecessary exposure:

  • full protective equipment
  • minimal number of personnel in the room
  • “dead-man” switches (must have active operator control)
  • minimal beam size and energy
  • careful shielding to reduce scatter.

 

Full protective equipment should also be used by operating-room personnel during radiation procedures, and, when possible, personnel should stand 2 m or more from the patient.

Non-ionizing radiation

Ultraviolet radiation, lasers and microwaves are non-ionizing radiation sources. They are generally far less hazardous than ionizing radiation but nevertheless require special care to prevent injury.

Ultraviolet radiation is used in germicidal lamps, in certain dermatology treatments and in air filters in some hospitals. It is also produced in welding operations. Exposure of the skin to ultraviolet light causes sunburn, ages the skin and increases the risk of skin cancer. Eye exposure can result in temporary but extremely painful conjunctivitis. Long-term exposure can lead to partial loss of vision.

Standards regarding exposure to ultraviolet radiation are not widely applicable. The best approach to prevention is education and wearing shaded protective eyeglasses.

The Bureau of Radiological Health of the US Food and Drug Administration regulates lasers and classifies them into four classes, I to IV. The laser used to position patients in radiology is considered Class I and represents minimal risk. Surgical lasers, however, can pose a significant hazard to the retina of the eye where the intense beam can cause total loss of vision. Because of the high voltage supply required, all lasers present the risk of electrical shock. The accidental reflection of the laser beam during surgical procedures can result in injury to the staff. Guidelines for laser use have been developed by the American National Standards Institute and the US Army; for example, laser users should wear protective goggles specifically designed for each type of laser and take care not to focus the beam on reflecting surfaces.

The primary concern regarding exposure to microwaves, which are used in hospitals chiefly for cooking and heating food and for diathermy treatments, is the heating effect they have on the body. The eye lens and gonads, having fewer vessels with which to remove heat, are most vulnerable to damage. The long-term effects of low-level exposure have not been established, but there is some evidence that nervous system effects, decreased sperm count, sperm malformations (at least partially reversible after exposure ceases) and cataracts may result.

Prevention and control

The OSHA standard for exposure to microwaves is 10 milliwatts per square centimetre (10 mW/cm²). This level was established to protect against the thermal effects of microwaves. In other countries, where levels have been set to protect against reproductive and nervous system damage, the standards are as much as three orders of magnitude lower, that is, 0.01 mW/cm² at 1.2 m.
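As a quick check on how far apart these standards are, using only the two figures quoted above:

```python
import math

osha_limit = 10.0      # mW/cm2, the thermal-effects standard cited in the text
stricter_limit = 0.01  # mW/cm2, the stricter standard cited for some countries

ratio = osha_limit / stricter_limit
print(ratio, math.log10(ratio))  # a factor of 1000, i.e., three orders of magnitude
```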

To ensure the safety of workers, microwave ovens should be kept clean to protect the integrity of the door seals and should be checked for leakage at least every three months. Leakage from diathermy equipment should be monitored in the vicinity of the therapist before each treatment.

Hospital workers should be aware of the radiation hazards of ultraviolet exposure and of infrared heat used for therapy. They should have appropriate eye protection when using or repairing ultraviolet equipment, such as germicidal lamps and air purifiers or infrared instruments and equipment.

Conclusion

Physical agents represent an important class of hazards to workers in hospitals, clinics and private offices where diagnostic and therapeutic procedures are performed. These agents are discussed in more detail elsewhere in this Encyclopaedia. Their control requires education and training of all health professionals and support staff who may be involved, as well as constant vigilance and systematic monitoring of both the equipment and the way it is used.

 


Wednesday, 02 March 2011 15:30

Work Schedules and Night Work in Health Care

For a long time, nurses and nursing assistants were virtually the only women working at night in many countries (Gadbois 1981; Estryn-Béhar and Poinsignon 1989). In addition to the problems already documented among men, these women suffer additional problems related to their family responsibilities. Sleep deprivation has been convincingly demonstrated among these women, and there is concern about the quality of care they are able to dispense in the absence of appropriate rest.

Organization of Schedules and Family Obligations

It appears that personal feelings about social and family life are at least partially responsible for the decision to accept or refuse night work. These feelings, in turn, lead workers to minimize or exaggerate their health problems (Lert, Marne and Gueguen 1993; Ramaciotti et al. 1990). Among non-professional personnel, financial compensation is the main determinant of the acceptance or refusal of night work.

Other work schedules may also pose problems. Morning-shift workers sometimes must rise before 05:00 and so lose some of the sleep that is essential for their recovery. Afternoon shifts finish between 21:00 and 23:00, limiting social and family life. Thus, often only 20% of women working in large university hospitals have work schedules in synchrony with the rest of society (Cristofari et al. 1989).

Complaints related to work schedules are more frequent among health care workers than among other employees (62% versus 39%) and indeed are among the complaints most frequently voiced by nurses (Lahaye et al. 1993).

One study demonstrated the interaction of work satisfaction with social factors, even in the presence of sleep deprivation (Verhaegen et al. 1987). In this study, nurses working only night shifts were more satisfied with their work than nurses working rotating shifts. These differences were attributed to the fact that all the night-shift nurses chose to work at night and organized their family life accordingly, while rotating-shift nurses found even rare night-shift work a disturbance of their personal and family lives. However, Estryn-Béhar et al. (1989b) reported that mothers working only night shifts were more tired and went out less frequently compared with male night-shift nurses.

In the Netherlands, the prevalence of work complaints was higher among nurses working rotating shifts than among those working only day shifts (Van Deursen et al. 1993) (see table 1).

Table 1. Prevalence of work complaints according to shift

Complaint                                 Rotating shifts (%)   Day shifts (%)
Arduous physical work                            55.5                31.3
Arduous mental work                              80.2                61.9
Work often too tiring                            46.8                24.8
Under-staffing                                   74.8                43.8
Insufficient time for breaks                     78.4                56.6
Interference of work with private life           52.8                31.0
Dissatisfaction with schedules                   36.9                 2.7
Frequent lack of sleep                           34.9                19.5
Frequent fatigue on rising                       31.3                17.3

Source: Van Deursen et al. 1993.

Sleep disturbances

On workdays, night-shift nurses sleep an average of two hours less than other nurses (Escribà Agüir et al. 1992; Estryn-Béhar et al. 1978; Estryn-Béhar et al. 1990; Nyman and Knutsson 1995). According to several studies, their quality of sleep is also poor (Schroër et al. 1993; Lee 1992; Gold et al. 1992; Estryn-Béhar and Fonchain 1986).

In their interview study of 635 Massachusetts nurses, Gold et al. (1992) found that 92.2% of nurses working alternating morning and afternoon shifts were able to maintain a nocturnal “anchor” sleep of four hours at the same schedule throughout the month, compared to only 6.3% of night-shift nurses and none of the nurses working alternating day and night shifts. The age- and seniority-adjusted odds ratio for “poor sleep” was 1.8 for night-shift nurses and 2.8 for rotating-shift nurses with night work, compared to morning- and afternoon-shift nurses. The odds ratio for taking sleep medication was 2.0 for night- and rotating-shift nurses, compared to morning- and afternoon-shift nurses.
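The odds ratios quoted throughout this section come from 2 × 2 tables comparing one shift group with another. As a reminder of the arithmetic, the sketch below uses made-up counts, not the Gold et al. data:

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 table:
        a = exposed with outcome,   b = exposed without outcome,
        c = unexposed with outcome, d = unexposed without outcome.
    OR = (a/b) / (c/d) = (a*d) / (b*c)."""
    return (a * d) / (b * c)

# Hypothetical counts only: 40 of 100 night-shift nurses report poor sleep
# versus 20 of 100 day-shift nurses
print(round(odds_ratio(40, 60, 20, 80), 2))  # 2.67
```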

Affective Problems and Fatigue

The prevalence of stress-related symptoms and of reports of having stopped enjoying their work was higher among Finnish nurses working rotating shifts than among other nurses (Kandolin 1993). Estryn-Béhar et al. (1990) showed that night-shift nurses had poorer scores than day-shift nurses on the General Health Questionnaire, an instrument used to evaluate mental health (odds ratio of 1.6).

In another study, Estryn-Béhar et al. (1989b) interviewed a representative sample of one-quarter of night-shift employees (1,496 individuals) in 39 Paris-area hospitals. Differences appeared according to sex and qualification (“qualified” = head nurses and nurses; “unqualified” = nurses’ aides and orderlies). Excessive fatigue was reported by 40% of qualified women, 37% of unqualified women, 29% of qualified men and 20% of unqualified men. Fatigue on rising was reported by 42% of qualified women, 35% of unqualified women, 28% of qualified men and 24% of unqualified men. Frequent irritability was reported by one-third of night-shift workers and by a significantly greater proportion of women. Women with no children were twice as likely as comparable men to report excessive fatigue, fatigue on rising and frequent irritability. The increase compared to single men with no children was even more marked for women with one or two children, and greater still (a four-fold increase) for women with at least three children.

Fatigue on rising was reported by 58% of night-shift hospital workers and 42% of day-shift workers in a Swedish study using a stratified sample of 310 hospital workers (Nyman and Knutsson 1995). Intense fatigue at work was reported by 15% of day-shift workers and 30% of night-shift workers. Almost one-quarter of night-shift workers reported falling asleep at work. Memory problems were reported by 20% of night-shift workers and 9% of day-shift workers.

In Japan, the health and safety association publishes the results of medical examinations of all the country’s salaried employees. This report includes the results of 600,000 employees in the health and hygiene sector. Nurses generally work rotating shifts. Complaints concerning fatigue are highest in night-shift nurses, followed in order by evening- and morning-shift nurses (Makino 1995). Symptoms reported by night-shift nurses include sleepiness, sadness and difficulty concentrating, with numerous complaints about accumulated fatigue and disturbed social life (Akinori and Hiroshi 1985).

Sleep and Affective Disorders among Physicians

The effect of work content and duration on young physicians’ private lives, and the attendant risk of depression, has been noted. Valko and Clayton (1975) found that 30% of young residents suffered a bout of depression lasting an average of five months during their first year of residency. Of the 53 residents studied, four had suicidal thoughts and three made concrete suicide plans. Similar rates of depression have been reported by Reuben (1985) and Clark et al. (1984).

In a questionnaire study, Friedman, Kornfeld and Bigger (1971) showed that interns suffering from sleep deprivation reported more sadness, selfishness and modification of their social life than did more-rested interns. During interviews following the tests, interns suffering from sleep deprivation reported symptoms such as difficulty reasoning, depression, irritability, depersonalization, inappropriate reactions and short-term memory deficits.

In a one-year longitudinal study, Ford and Wentz (1984) evaluated 27 interns four times during their internship. During this period, four interns suffered at least one major bout of depression meeting standard criteria and 11 others reported clinical depression. Anger, fatigue and mood swings increased throughout the year and were inversely correlated with the amount of sleep the preceding week.

A literature review has identified six studies in which interns having spent one sleepless night exhibited deteriorations of mood, motivation and reasoning ability and increased fatigue and anxiety (Samkoff and Jacques 1991).

Devienne et al. (1995) interviewed a stratified sample of 220 general practitioners in the Paris area. Of these, 70 were on call at night. Most of the on-call physicians reported having had their sleep disturbed while on call and finding it particularly difficult to get back to sleep after being awakened (men: 65%; women: 88%). Waking up in the middle of the night for reasons unrelated to service calls was reported by 22% of men and 44% of women. Having or almost having a car accident due to sleepiness related to being on call was reported by 15% of men and 19% of women. This risk was greater among physicians who were on call more than four times per month (30%) than among those on call three or four times per month (22%) or one to three times per month (10%). The day after being on call, 69% of women and 46% of men reported having difficulty concentrating and feeling less effective, while 37% of men and 31% of women reported experiencing mood swings. Accumulated sleep deficits were not recovered the day following on-call work.

Family and Social Life

A survey of 848 night-shift nurses found that over the previous month one-quarter had not gone out and had entertained no guests, and half had participated in such activities only once (Gadbois 1981). One-third reported refusing an invitation because of fatigue, and two-thirds reported going out only once, with this proportion rising to 80% among mothers.

Kurumatani et al. (1994) reviewed the time sheets of 239 Japanese nurses working rotating shifts over a total of 1,016 days and found that nurses with young children slept less and spent less time on leisure activities than did nurses without young children.

Estryn-Béhar et al. (1989b) observed that women were significantly less likely than men to spend at least one hour per week participating in team or individual sports (48% of qualified women, 29% of unqualified women, 65% of qualified men and 61% of unqualified men). Women were also less likely to attend shows frequently (at least four times per month): 13% of qualified women, 6% of unqualified women, 20% of qualified men and 13% of unqualified men. On the other hand, similar proportions of women and men practised home-based activities such as watching television and reading. Multivariate analysis showed that men with no children were twice as likely as comparable women to spend at least one hour per week on athletic activities. This gap increased with the number of children. Child care, not gender, influenced reading habits. A significant proportion of the subjects in this study were single parents: single parenthood was very rare among qualified men (1%), less rare among unqualified men (4.5%), common among qualified women (9%) and extremely frequent among unqualified women (24.5%).

In Escribà Agüir’s (1992) study of Spanish hospital workers, incompatibility of rotating shifts with social and family life was the leading source of dissatisfaction. In addition, night-shift work (either permanent or rotating) disturbed the synchronization of their schedules with those of their spouses.

Lack of free time interferes severely with the private life of interns and residents. Landau et al. (1986) found that 40% of residents reported major conjugal problems. Of these residents, 72% attributed the problems to their work. McCall (1988) noted that residents have little time to spend on their personal relationships; this problem is particularly serious for women nearing the end of their low-risk-pregnancy years.

Irregular Shift Work and Pregnancy

Axelsson, Rylander and Molin (1989) distributed a questionnaire to 807 women employed at the hospital in Mölna, Sweden. The birth weights of children born to non-smoking women working irregular shifts were significantly lower than those of children born to non-smoking women who worked only day shifts. The difference was greatest for infants of at least grade 2 (3,489 g versus 3,793 g). Similar differences were also found for infants of at least grade 2 born to women working afternoon shifts (3,073 g) and shifts alternating every 24 hours (3,481 g).

Vigilance and Quality of Work among Night-Shift Nurses

Englade, Badet and Becque (1994) performed Holter EEGs on two groups of nine nurses. An experimental group practised polyphasic sleep in an attempt to recover a little sleep during work hours, while the control group was not allowed any sleep recovery. The EEGs showed that the group not allowed to sleep had attention deficits characterized by sleepiness, and in some cases even episodes of sleep of which they were unaware.

These results are similar to those of a survey of 760 California nurses (Lee 1992), in which 4.0% of night-shift nurses and 4.3% of rotating-shift nurses reported suffering frequent attention deficits; no nurses from the other shifts mentioned lack of vigilance as a problem. Occasional attention deficits were reported by 48.9% of night-shift nurses, 39.2% of rotating-shift nurses, 18.5% of day-shift nurses and 17.5% of evening-shift nurses. Struggling to stay awake while dispensing care during the month preceding the survey was reported by 19.3% of night-shift and rotating-shift nurses, compared to 3.8% of day- and evening-shift nurses. Similarly, 44% of night-shift and rotating-shift nurses reported having had to struggle to stay awake while driving during the preceding month, compared to 19% of day-shift nurses and 25% of evening-shift nurses.

Smith et al. (1979) studied 1,228 nurses in 12 American hospitals. The incidence of occupational accidents was 23.3 for nurses working rotating shifts, 18.0 for night-shift nurses, 16.8 for day-shift nurses and 15.7 for afternoon-shift nurses.

In an attempt to better characterize problems related to attention deficits among night-shift nurses, Blanchard et al. (1992) observed activity and incidents throughout a series of night shifts. Six wards, ranging from intensive care to chronic care, were studied. In each ward, one continuous observation of a nurse was performed on the second night (of night work) and two observations on the third or fourth nights (depending on the wards’ schedule). Incidents were not associated with serious outcomes. On the second night, the number of incidents rose from 8 in the first half of the night to 18 in the second half. On the third or fourth night, the increase was from 13 to 33 in one case and from 11 to 35 in another. The authors emphasized the role of sleep breaks in limiting risks.

Gold et al. (1992) collected information from 635 Massachusetts nurses on the frequency and consequences of attention deficits. Experiencing at least one episode of sleepiness at work per week was reported by 35.5% of rotating-shift nurses with night work, 32.4% of night-shift nurses and 20.7% of morning-shift and afternoon-shift nurses working exceptionally at night. Less than 3% of nurses working the morning and afternoon shifts reported such incidents.

The odds ratio for sleepiness while driving to and from work was 3.9 for rotating-shift nurses with night work and 3.6 for night-shift nurses, compared to morning- and afternoon-shift nurses. The odds ratio for total accidents and errors over the past year (car accidents driving to and from work, errors in medication or work procedures, occupational accidents related to sleepiness) was almost 2.00 for rotating-shift nurses with night work compared to morning- and afternoon-shift nurses.

Effect of Fatigue and Sleepiness on the Performance of Physicians

Several studies have shown that the fatigue and sleep deprivation induced by night-shift and on-call work lead to deterioration of physician performance.

Wilkinson, Tyler and Varey (1975) conducted a postal questionnaire survey of 6,500 British hospital physicians. Of the 2,452 who responded, 37% reported suffering a degradation of their effectiveness due to excessively long work hours. In response to open-ended questions, 141 residents reported committing errors due to overwork and lack of sleep. In a study performed in Ontario, Canada, 70% of 1,806 hospital physicians reported often worrying about the effect that the quantity of their work had on its quality (Lewittes and Marshall 1989). More specifically, 6% of the sample—and 10% of interns—reported often worrying about fatigue affecting the quality of care they dispensed.

Given the difficulty in performing real-time evaluations of clinical performance, several studies on the effects of sleep deprivation on physicians have relied upon neuropsychological tests.

In the majority of studies reviewed by Samkoff and Jacques (1991), residents deprived of sleep for one night exhibited little deterioration in their performance of rapid tests of manual dexterity, reaction time and memory. Fourteen of these studies used extensive test batteries. According to five tests, the effect on performance was ambiguous; according to six, a performance deficit was observed; but according to eight other tests, no deficit was observed.

Rubin et al. (1991) tested 63 medical-ward residents before and after an on-call period of 36 hours and a subsequent full day of work, using a battery of self-administered computerized behavioural tests. Physicians tested after being on call exhibited significant performance deficits in tests of visual attention, coding speed and accuracy and short-term memory. The duration of sleep enjoyed by the residents while on call was as follows: two hours at most in 27 subjects, four hours at most in 29 subjects, six hours at most in four subjects and seven hours in three subjects. Lurie et al. (1989) reported similarly brief sleep durations.

Virtually no difference has been observed in the performance of actual or simulated short-duration clinical tasks—including filling out a laboratory requisition (Poulton et al. 1978; Reznick and Folse 1987), simulated suturing (Reznick and Folse 1987), endotracheal intubation (Storer et al. 1989) and venous and arterial catheterization (Storer et al. 1989)—by sleep-deprived and control groups. The only difference observed was a slight lengthening of the time required by sleep-deprived residents to perform arterial catheterization.

On the other hand, several studies have demonstrated significant differences for tasks requiring continuous vigilance or intense concentration. For example, sleep-deprived interns committed twice as many errors when reading 20-minute ECGs as did rested interns (Friedman et al. 1971). Two studies, one relying on 50-minute VDU-based simulations (Beatty, Ahern and Katz 1977), the other on 30-minute video simulations (Denisco, Drummond and Gravenstein 1987), have reported poorer performance by anaesthesiologists deprived of sleep for one night. Another study has reported significantly poorer performance by sleep-deprived residents on a four-hour test exam (Jacques, Lynch and Samkoff 1990). Goldman, McDonough and Rosemond (1972) used closed-circuit filming to study 33 surgical procedures. Surgeons with less than two hours of sleep were reported to perform “worse” than more-rested surgeons. The duration of surgical inefficiency or indecision (i.e., of poorly planned manoeuvres) was over 30% of the total duration of the operation.

Bertram (1988) examined the charts of emergency admissions by second-year residents over a one-month period. For a given diagnosis, less information on medical histories and the results of clinical examinations was gathered as the number of hours worked and patients seen increased.

Smith-Coggins et al. (1994) analysed the EEG, mood, cognitive performance and motor performance of six emergency-ward physicians over two 24-hour periods, one with diurnal work and nocturnal sleep, the other with nocturnal work and diurnal sleep.

Physicians working at night slept significantly less (328.5 versus 496.6 minutes) and performed significantly less well. This poorer motor performance was reflected in the increased time required to perform a simulated intubation (42.2 versus 31.56 seconds) and an increased number of protocol errors.

Their cognitive performance was evaluated at five test periods throughout their shift. For each test, physicians were required to review four charts drawn from a pool of 40, rank them and list the initial procedures, the treatments and the appropriate laboratory tests. Performance deteriorated as the shift progressed for both night-shift and day-shift physicians. Night-shift physicians were less successful at providing correct responses than day-shift physicians.

Physicians working during the day rated themselves as less sleepy, more satisfied and more lucid than did night-shift physicians.

Recommendations in English-speaking countries concerning the work schedules of physicians-in-training have tended to take these results into account and now call for work weeks of at most 70 hours and the provision of recovery periods following on-call work. In the US, following the widely publicized death of a patient attributed to errors by an overworked, poorly supervised resident physician, New York State enacted legislation limiting the work hours of hospital staff physicians and defining the role of attending physicians in supervising their activities.

Content of Night Work in Hospitals

Night work has long been undervalued. In France, nurses used to be seen as guardians, a term rooted in a vision of nurses’ work as the mere monitoring of sleeping patients, with no delivery of care. The inaccuracy of this vision became increasingly obvious as the length of hospitalization decreased and patients’ uncertainty about their hospitalization increased. Hospital stays require frequent technical interventions during the night, precisely when the nurse:patient ratio is lowest.

The importance of the amount of time spent by nurses in patients’ rooms is demonstrated by the results of a study based on continuous observation of the ergonomics of nurses’ work in each of three shifts in ten wards (Estryn-Béhar and Bonnet 1992). The time spent in rooms accounted for an average of 27% of the day and night shifts and 30% of the afternoon shift. In four of the ten wards, nurses spent more time in the rooms during the night than during the day. Blood samples were of course taken less frequently during the night, but other technical interventions such as monitoring vital signs and medication, and administering, adjusting and monitoring intravenous drips and transfusions were more frequent during the night in six of seven wards where detailed analysis was performed. The total number of technical and non-technical direct-care interventions was higher during the night in six of seven wards.

Nurses’ work postures varied from shift to shift. The percentage of time spent seated (preparation, writing, consultations, time spent with patients, breaks) was higher at night in seven of ten wards, and exceeded 40% of shift time in six wards. However, the time spent in painful postures (bent over, crouched, arms extended, carrying loads) exceeded 10% of shift time in all wards and 20% of shift time in six wards at night; in five wards the percentage of time spent in painful positions was higher at night. In fact, night-shift nurses also make beds and perform tasks related to hygiene, comfort and voiding, tasks which are all normally performed by nurses’ aides during the day.

Night-shift nurses may be obliged to change location very frequently. Night-shift nurses in all the wards changed location over 100 times per shift; in six wards, the number of changes of location was higher at night. However, because rounds were scheduled at 00:00, 02:00, 04:00 and 06:00, nurses did not travel greater distances, except in juvenile intensive-care wards. Nonetheless, nurses walked over six kilometres in three of the seven wards where pedometry was performed.

Conversations with patients were frequent at night, exceeding 30 per shift in all wards; in five wards these conversations were more frequent at night. Conversations with physicians were much rarer and almost always brief.

Leslie et al. (1990) conducted continuous observation of 12 of 16 interns in the medical ward of a 340-bed Edinburgh (Scotland) hospital over 15 consecutive winter days. Each ward cared for approximately 60 patients. In all, 22 day shifts (08:00 to 18:00) and 18 on-call shifts (18:00 to 08:00), equivalent to 472 hours of work, were observed. The nominal duration of the interns’ work week was 83 to 101 hours, depending on whether or not they were on call during the weekends. However, in addition to the official work schedule, each intern also spent an average of 7.3 hours each week on miscellaneous hospital activities. Information on the time spent performing each of 17 activities, on a minute-by-minute basis, was collected by trained observers assigned to each intern.

The longest continuous work period observed was 58 hours (08:00 Saturday to 18:00 Monday) and the longest work period was 60.5 hours. Calculations showed that a one-week sickness leave of one intern would require the other two interns in the ward to increase their workload by 20 hours.
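The coverage arithmetic behind that last estimate can be made explicit. This is a hypothetical sketch — the assumed 40 hours of weekly cover is an illustration, not a figure stated in the study:

```python
# Hypothetical sketch: if an absent intern's shifts amount to 40 hours of
# cover in a week, splitting them evenly between the two remaining interns
# adds 20 hours to each of their workloads.
absent_cover_hours = 40.0   # assumed weekly hours needing cover
remaining_interns = 2
extra_per_intern = absent_cover_hours / remaining_interns
print(extra_per_intern)  # → 20.0
```

On top of nominal work weeks already running 83 to 101 hours, an increase of this size illustrates how fragile such rosters were to a single absence.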

In practice, in wards admitting patients during on-call shifts, interns working consecutive day, on-call and night shifts worked all but 4.6 of the 34 elapsed hours. These 4.6 hours were devoted to meals and rest, but interns remained on call and available during this time. In wards that did not admit new patients during on-call shifts, interns’ workload abated only after midnight.

Due to the on-call schedules in other wards, interns spent approximately 25 minutes outside their home ward each shift. On average, they walked 3 kilometres and spent 85 minutes (32 to 171 minutes) in other wards each night shift.

In addition, the time spent filling out examination requests and charts often falls outside normal work hours. Non-systematic observation of this additional work over several days revealed that it accounts for approximately 40 minutes of extra work at the end of each shift (18:00).

During the day, 51 to 71% of interns’ time was spent on patient-oriented duties, compared to 20 to 50% at night. Another study, conducted in the United States, reported that 15 to 26% of work time was spent on patient-oriented duties (Lurie et al. 1989).

The study concluded that more interns were needed and that interns should no longer be required to attend other wards while on call. Three additional interns were hired. This reduced interns’ work week to an average of 72 hours, with no work, excepting on-call shifts, after 18:00. Interns also obtained a free half-day following an on-call shift and preceding a weekend when they were to be on call. Two secretaries were hired on a trial basis by two wards. Working 10 hours per week, the secretaries were able to fill out 700 to 750 documents per ward. In the opinion of both senior physicians and nurses, this resulted in more efficient rounds, since all the information had been entered correctly.

 


DISCLAIMER: The ILO does not take responsibility for content presented on this web portal that is presented in any language other than English, which is the language used for the initial production and peer-review of original content. Certain statistics have not been updated since the production of the 4th edition of the Encyclopaedia (1998).
