A formal Environmental Management System (EMS), using the International Organization for Standardization (ISO) standard 14001 as the performance specification, has been developed and is being implemented in one of the largest teaching health care complexes in Canada. The Health Sciences Centre (HSC) consists of five hospitals and associated clinical and research laboratories, occupying a 32-acre site in central Winnipeg. Of the 32 segregated solid waste streams at the facility, hazardous wastes account for seven. This summary focuses on the hazardous waste disposal aspect of the hospital’s operations.
ISO 14000
The ISO 14000 standards system is a typical continuous improvement model based on a controlled management system. The ISO 14001 standard addresses the environmental management system structure exclusively. To conform with the standard, an organization must have processes in place for:
The hierarchy for carrying out these processes in the HSC is presented in table 1.
Table 1. HSC EMS documentation hierarchy
EMS level | Purpose
Governance document | Includes the Board’s expectations on each core performance category and its requirements for corporate competency in each category.
Level 1 | Prescribes the outputs that will be delivered in response to customer and stakeholder (C/S) needs (including government regulatory requirements).
Level 2 | Prescribes the methodologies, systems, processes and resources to be used for achieving C/S requirements; the goals, objectives and performance standards essential for confirming that the C/S requirements have been met (e.g., a schedule of required systems and processes including the responsibility centre for each).
Level 3 | Prescribes the design of each business system or process that will be operated to achieve the C/S requirements (e.g., criteria and boundaries for system operation; each information collection and data reporting point; position responsible for the system and for each component of the process, etc.).
Level 4 | Prescribes detailed task instructions (specific methods and techniques) for each work activity (e.g., describe the task to be done; identify the position responsible for completing the task; state skills required for the task; prescribe education or training methodology to achieve required skills; identify task completion and conformance data, etc.).
Level 5 | Organizes and records measurable outcome data on the operation of systems, processes and tasks designed to verify completion according to specification (e.g., measures for system or process compliance; resource allocation and budget compliance; effectiveness, efficiency, quality, risk, ethics, etc.).
Level 6 | Analyses records and processes to establish corporate performance in relation to standards set for each output requirement (Level 1) related to C/S needs (e.g., compliance, quality, effectiveness, risk, utilization, etc.) and to financial and staff resources.
ISO standards encourage businesses to integrate all environmental considerations into mainstream business decisions and not restrict attention to concerns that are regulated. Since the ISO standards are not technical documents, the function of specifying numerical standards remains the responsibility of governments or independent expert bodies.
Management System Approach
Applying the generic ISO framework in a health care facility requires the adoption of management systems along the lines of those in table 1, which describes how this has been addressed by the HSC. Each level in the system is supported by appropriate documentation to confirm diligence in the process. While the volume of work is substantial, it is offset by the resulting performance consistency and by the “expert” knowledge that remains within the corporation when experienced persons leave.
The main objective of the EMS is to establish consistent, controlled and repeatable processes for addressing the environmental aspects of the corporation’s operations. To facilitate management review of the hospital’s performance, an EMS Score Card was conceived based on the ISO 14001 standard. The Score Card closely follows the requirements in the ISO 14001 standard and, with use, will be developed into the hospital’s audit protocol.
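To make the Score Card idea concrete, the following minimal sketch in Python shows one way such an instrument could be structured. The clause list, the 0-to-4 scoring scale and all names are illustrative assumptions, not the HSC’s actual tool.

```python
# Illustrative sketch of an ISO 14001-style EMS score card.
# Clause list and scoring scale are assumptions for illustration only.
from dataclasses import dataclass

# Major ISO 14001 system elements, abbreviated.
CLAUSES = [
    "Environmental policy",
    "Planning",
    "Implementation and operation",
    "Checking and corrective action",
    "Management review",
]

@dataclass
class ClauseScore:
    clause: str
    score: int       # e.g., 0 = absent ... 4 = fully implemented and audited
    evidence: str    # reference to supporting Level 5/6 documentation

def overall(scores: list[ClauseScore]) -> float:
    """Average implementation level across all clauses."""
    return sum(s.score for s in scores) / len(scores)

card = [ClauseScore(c, 0, "") for c in CLAUSES]
card[0] = ClauseScore("Environmental policy", 3, "Governance document, s. 2.1")
print(f"EMS implementation index: {overall(card):.2f} / 4")
```

Used over successive management reviews, a card like this would also provide the trend data that the text suggests could evolve into the hospital’s audit protocol.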
Application of the EMS to the Hazardous Waste Process
Facility hazardous waste process
The HSC hazardous waste process currently consists of the following elements:
The roles and responsibilities of the four main organizational units involved in the hazardous waste process are listed in table 2.
Table 2. Role and responsibilities
Organizational unit | Responsibility
S&DS | Operates the process, is the process owner/leader and arranges responsible disposal of waste.
UD–User Departments | Identifies waste, selects packaging, initiates disposal activities.
DOEM | Provides specialist technical support in identifying risks and protective measures associated with materials used by HSC and identifies improvement opportunities.
EPE | Provides specialist support in process performance monitoring and reporting, identifies emerging regulatory trends and compliance requirements, and identifies improvement opportunities.
ALL–All participants | Share responsibility for process development activities.
Process description
The initial step in preparing a process description is to identify the inputs (see table 3).
Table 3. Process inputs
Organizational unit | Examples of process inputs and supporting inputs
S&DS (S&DS) | Maintain stock of Hazardous Waste Disposal Requisition forms and labels.
S&DS (UD, DOEM, EPE) | Maintain supply of packaging containers in warehouse for UDs.
DOEM | Produce SYMBAS Classification Decision Chart.
EPE | Produce the list of materials for which HSC is registered as a waste generator with the regulatory department.
S&DS | Produce a database of SYMBAS classifications, packaging requirements, TDG classifications and tracking information for each material disposed of by HSC.
The next process component is the list of specific activities required for proper disposal of waste (see table 4).
Table 4. List of activities
Unit | Examples of activities required
UD | Order Hazardous Waste Disposal Requisition, label and packaging from S&DS as per standard stock ordering procedure.
S&DS | Deliver Requisition, label and packaging to UD.
UD | Determine whether a waste material is hazardous (check MSDS, DOEM, and such considerations as dilution, mixture with other chemicals, etc.).
UD | Assign the classification to the waste material using the SYMBAS Chemical Decision Chart and WHMIS information. The classification can be checked against the S&DS database of materials previously disposed of by HSC. Call S&DS first and DOEM second for assistance if required.
UD | Determine appropriate packaging requirements from WHMIS information using professional judgement or from the S&DS database of materials previously disposed of by HSC. Call S&DS first and DOEM second for assistance if required.
Communication
To support the process description, the hospital produced a Disposal Guide for Hazardous Waste to assist staff in the proper disposal of hazardous waste materials. The guide contains information on the specific steps to follow in identifying hazardous waste and preparing it for disposal. Supplemental information is also provided on legislation, the Workplace Hazardous Materials Information System (WHMIS) and key contacts for assistance.
A database was developed to track all relevant information pertaining to each hazardous waste event from originating source to final disposal. In addition to waste data, information is also collected on the performance of the process (e.g., source and frequency of phone calls for assistance to identify areas which may require further training; source, type, quantity and frequency of disposal requests from each user department; consumption of containers and packaging). Any deviations from the process are recorded on the corporate incident reporting form. Results from performance monitoring are reported to the executive and the board of directors. To support effective implementation of the process, a staff education programme was developed to elaborate on the information in the guide. Each of the core participants in the process carries specific responsibilities on staff education.
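As an illustration of the kind of record such a tracking database might hold, here is a minimal sketch in Python. All field names, the helper function and the training threshold are hypothetical; the article does not publish the HSC schema.

```python
# Minimal sketch of a waste-event tracking record; all names are
# hypothetical illustrations, not the HSC's actual schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class HazardousWasteEvent:
    event_id: int
    source_department: str          # originating user department (UD)
    material: str
    symbas_class: str               # SYMBAS classification assigned by the UD
    tdg_class: str                  # Transportation of Dangerous Goods class
    packaging: str
    quantity_kg: float
    requisition_date: date
    final_disposal_date: date | None = None
    assistance_calls: int = 0       # process-performance metric (see text)
    deviations: list[str] = field(default_factory=list)  # incident reports

def departments_needing_training(events, threshold=3):
    """Flag departments whose assistance-call volume suggests a training need."""
    calls: dict[str, int] = {}
    for e in events:
        calls[e.source_department] = calls.get(e.source_department, 0) + e.assistance_calls
    return [d for d, n in calls.items() if n >= threshold]
```

A record of this shape supports both halves of the monitoring described above: tracing each waste event from source to final disposal, and aggregating performance signals (assistance calls, deviations) for executive reporting.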
Continuous Improvement
To explore continuous improvement opportunities, the HSC established a multidisciplinary Waste Process Improvement Team. The Team’s mandate is to address all issues pertaining to waste management. To further encourage continuous improvement, the hazardous waste process includes specific triggers to initiate process revisions. Typical improvement ideas generated to date include:
The ISO standards require regulatory issues to be addressed and state that business processes must be in place for this purpose. Under the ISO standards, the existence of corporate commitments, performance measurement and documentation provides a more visible and more convenient trail for regulators to check for compliance. It is conceivable that the consistency provided by the ISO documents could enable automated reporting of key environmental performance factors to government authorities.
An adaptation of current guidelines on the disposal of hospital wastes, as well as improvements in internal safety and hygiene, must be part of an overall plan of hospital waste management that establishes the procedures to follow. This should be done through properly coordinating internal and external services, as well as defining responsibilities in each of the management phases. The main goal of this plan is to protect the health of health care personnel, patients, visitors and the general public both in the hospital and beyond.
At the same time, the health of the people who come in contact with the waste once it leaves the medical centre should not be overlooked, and the risks to them should also be minimized.
Such a plan should be promoted and applied according to a global strategy that always keeps in mind the realities of the workplace, as well as the knowledge and training of the personnel involved.
Stages followed in the implementation of a waste management plan are:
The group should include personnel from the general services department, personnel from the nursing department and personnel from the medical department. The medical centre’s waste manager should coordinate the committee by:
Classification of hospital wastes
Until 1992, following the classical waste management system, the practice was to classify most hospital wastes as hazardous. Since then, the tendency has been to adopt an advanced management technique, which classifies wastes starting from the baseline assumption that only a very small percentage of the volume of wastes generated is hazardous.
Wastes should always be classified at the point where they are generated. According to the nature of the wastes and their source, they are classified as follows:
According to their physical state, wastes can be classified as follows:
Gaseous wastes, such as CFCs from freezers and refrigerators, are not normally captured (see article “Waste anaesthetic gases”).
By definition, the following wastes are not considered sanitary wastes:
Group I Wastes
All wastes generated within the medical centre that are not directly related to sanitary activities are considered solid urban wastes (SUW). According to the local ordinances in Catalonia, Spain, as in most communities, the municipalities must remove these wastes selectively, and it is therefore convenient to facilitate this task for them. The following are considered wastes that can be assimilated to urban refuse according to their point of origin:
Kitchen wastes:
Wastes generated by people treated in the hospital and non-medical personnel:
Wastes from administrative activities:
Other wastes:
So long as they are not included in other selective removal plans, SUW will be placed in white polyethylene bags that will be removed by janitorial personnel.
Group II Wastes
Group II wastes include all those wastes generated as a by-product of medical activities that do not pose a risk to health or the environment. For reasons of safety and industrial hygiene the type of internal management recommended for this group is different from that recommended for Group I wastes. Depending on where they originate, Group II wastes include:
Wastes derived from hospital activities, such as:
Group II wastes will be deposited in yellow polyethylene bags that will be removed by janitorial personnel.
Group III Wastes
Group III includes hospital wastes which, due to their nature or their point of origin, could pose risks to health or the environment unless special precautions are observed during handling and removal.
Group III wastes can be classified in the following way:
Sharp and pointed instruments:
Infectious wastes. Group III wastes (including single-use items) generated by the diagnosis and treatment of patients suffering from one of the infectious diseases listed in table 1.
Table 1. Infectious diseases and Group III wastes
Infections | Wastes contaminated with
Viral haemorrhagic fevers | All wastes
Brucellosis | Pus
Diphtheria | Pharyngeal diphtheria: respiratory secretions
Cholera | Stools
Creutzfeldt-Jakob encephalitis | Stools
Borm | Secretions from skin lesions
Tularaemia | Pulmonary tularaemia: respiratory secretions
Anthrax | Cutaneous anthrax: pus
Plague | Bubonic plague: pus
Rabies | Respiratory secretions
Q fever | Respiratory secretions
Active tuberculosis | Respiratory secretions
Laboratory wastes:
Wastes of the Group III type will be placed in rigid, colour-coded, hermetically sealed, single-use polyethylene containers (in Catalonia, black containers are required). The containers should be clearly labelled as “Hazardous hospital wastes” and kept in the room until collected by janitorial personnel. Group III wastes should never be compacted.
To facilitate their removal and reduce risks to a minimum, containers should not be filled to capacity so that they can be closed easily. Wastes should never be handled once they are placed in these rigid containers. It is forbidden to dispose of biohazardous wastes by dumping them into the drainage system.
Group IV Wastes
Group IV wastes are surplus antineoplastic drugs that are not fit for therapeutic use, as well as all single-use material that has been in contact with them (needles, syringes, catheters, gloves, IV set-ups and so on).
Given the danger they pose to persons and the environment, Group IV hospital wastes must be collected in rigid, watertight, sealable, single-use, colour-coded containers (in Catalonia, they are blue), which should be clearly labelled “Chemically contaminated material: Cytostatic agents”.
Other Wastes
Guided by environmental concerns and the need to enhance waste management for the community, medical centres, with the cooperation of all personnel, staff and visitors, should encourage and facilitate the selective disposal (i.e., in special containers designated for specific materials) of recyclable materials such as:
The protocol established by the local sanitation department for the collection, transport and disposal of each of these types of materials should be followed.
Disposal of large pieces of equipment, furniture and other materials not covered in these guidelines should follow the directions recommended by the appropriate environmental authorities.
Internal transport and storage of wastes
Internal transport of all the wastes generated within the hospital building should be done by the janitorial personnel, according to established schedules. It is important that the following recommendations be observed when transporting wastes within the hospital:
The hospital must have an area specifically for the storage of wastes; it should conform to current guidelines and fulfil, in particular, the following conditions:
All the transport and storage operations that involve hospital wastes must be conducted under conditions of maximum safety and hygiene. In particular, one must remember:
Liquid Wastes: Biological and Chemical
Liquid wastes can be classified as biological or chemical.
Liquid biological wastes
Liquid biological wastes can usually be poured directly into the hospital’s drainage system since they do not require any treatment before disposal. The exceptions are the liquid wastes of patients with infectious diseases and the liquid cultures of microbiology laboratories. These should be collected in specific containers and treated before being dumped.
It is important that the waste be dumped directly into the drainage system with no splashing or spraying. If this is not possible and the wastes are gathered in disposable containers that are difficult to open, the containers should not be forced open. Instead, the entire container should be disposed of, as with Group III solid wastes. When liquid waste is eliminated in this way, it should be remembered that the conditions required for effective disinfection differ between solid and liquid wastes; the treatment must be designed accordingly.
Liquid chemical wastes
Liquid wastes generated in the hospital (generally in the laboratories) can be classified in three groups:
This classification is based on considerations related to the health and quality of life of the entire community. These include:
Liquid wastes that can pose a serious threat to people or to the environment because they are toxic, noxious, flammable, corrosive or carcinogenic should be separated and collected so that they can subsequently be recovered or destroyed. They should be collected as follows:
Mixtures of chemical and biological liquid wastes
Treatment of chemical wastes is more aggressive than treatment of biological wastes. Mixtures of these two wastes should be treated using the steps indicated for liquid chemical wastes. Labels on containers should note the presence of biological wastes.
Any liquid or solid materials that are carcinogenic, mutagenic or teratogenic should be disposed of in rigid colour-coded containers specifically designed and labelled for this type of waste.
Dead animals that have been inoculated with biohazardous substances will be disposed of in closed rigid containers, which will be sterilized before being reused.
Disposal of Sharp and Pointed Instruments
Sharp and pointed instruments (e.g., needles and lancets), once used, must be placed in specifically designed, rigid “sharps” containers that have been strategically placed throughout the hospital. These wastes will be disposed of as hazardous wastes even if used on uninfected patients. They must never be disposed of except in the rigid sharps container.
All HCWs must be repeatedly reminded of the danger of accidental cuts or punctures with this type of material, and instructed to report them when they occur, so that appropriate preventive measures may be instituted. They should be specifically instructed not to attempt to recap used hypodermic needles before dropping them into the sharps container.
Whenever possible, needles may be separated from their syringes without recapping and placed in the sharps container; the syringe, without the needle, can generally be disposed of as Group II waste. Many sharps containers have a special fitting for separating the syringe without risk of a needlestick to the worker; this saves space in the sharps container for more needles. The sharps containers, which should never be opened by hospital personnel, should be removed by designated janitorial personnel and forwarded for appropriate disposal of their contents.
If it is not possible to separate the needle in adequately safe conditions, the whole needle-syringe combination must be considered as biohazardous and must be placed in the rigid sharps containers.
These sharps containers will be removed by the janitorial personnel.
Staff Training
There must be an ongoing training programme in waste management for all hospital personnel, aimed at instilling in staff at all levels the imperative of always following the established guidelines for collecting, storing and disposing of wastes of all kinds. It is particularly important that the housekeeping and janitorial staff be trained in the details of the protocols for recognizing and dealing with the various categories of hazardous waste. The janitorial, security and fire-fighting staff must also be drilled in the correct course of action in the event of an emergency.
It is also important for the janitorial personnel to be informed and trained on the correct course of action in case of an accident.
Particularly when the programme is first launched, the janitorial staff should be instructed to report any problems that may hinder their performance of these assigned duties. They may be given special cards or forms on which to record such findings.
Waste Management Committee
To monitor the performance of the waste management programme and resolve any problems that may arise as it is implemented, a permanent waste management committee should be created and meet regularly, quarterly at a minimum. The committee should be accessible to any member of the hospital staff with a waste disposal problem or concern and should have access as needed to top management.
Implementing the Plan
The way the waste management programme is implemented may well determine whether it succeeds or not.
Since the support and cooperation of the various hospital committees and departments is essential, details of the programme should be presented to such groups as the administrative teams of the hospital, the health and safety committee and the infection control committee. It is necessary also to obtain validation of the programme from such community agencies as the departments of health, environmental protection and sanitation. Each of these may have helpful modifications to suggest, particularly with respect to the way the programme impinges on their areas of responsibility.
Once the programme design has been finalized, a pilot test in a selected area or department should permit rough edges to be polished and any unforeseen problems resolved. When this has been completed and its results analysed, the programme may be implemented progressively throughout the entire medical centre. A presentation, with audio-visual supports and distribution of descriptive literature, can be delivered in each unit or department, followed by delivery of bags and/or containers as required. Following the start-up of the programme, the department or unit should be visited so that any needed revisions may be instituted. In this manner, the participation and support of the entire hospital staff, without which the programme would never succeed, can be earned.
A hospital is not an isolated social environment; it has, given its mission, very serious intrinsic social responsibilities. A hospital needs to be integrated with its surroundings and should minimize its impact upon them, thus contributing to the welfare of the people who live near it.
From a regulatory perspective, the health industry has never been considered to be on the same level as other industries when they are ranked according to the health risks they pose. The result is that specific legislation in this sphere was non-existent until recently, although in the last few years this deficiency has been addressed. While health and safety is an integral part of the organization in many other kinds of industrial activity, most health centres still pay little or no attention to it.
One reason for this could be the attitudes of HCWs themselves, who may be preoccupied more with research and the acquisition of the latest technologies and diagnostic and treatment techniques than with looking into the effects that these advances could have on their own health and on the environment.
New developments in science and health care must be combined with environmental protection, because environmental policies in a hospital affect the quality of life of HCWs within the hospital and those who live outside it.
Integrated Health, Safety and Environmental Programmes
HCWs represent a major group, comparable in size to the large enterprises of the private sector. The number of people who pass through a hospital every day is very large: visitors, inpatients, outpatients, medical and commercial representatives, subcontractors and so on. All of them, to a greater or lesser degree, are exposed to the potential risks posed by the activities of the medical centre and, at the same time, contribute on a certain level to the improvement or the worsening of the safety and the care of the centre’s surroundings.
Strict measures are needed in order to safeguard HCWs, the general public and the surrounding environment from the deleterious effects that may stem from hospital activities. These activities include the use of ever more sophisticated technology, the more frequent use of extremely powerful drugs (the effects of which can have a profound and irreparable impact on the people who prepare or administer them), the too-often uncontrolled use of chemical products and the incidence of infectious diseases, some of which are incurable.
The risks of working in a hospital are many. Some are easy to identify, while others are very hard to detect; the measures to be taken should therefore always be rigorous.
Different groups of health professionals are particularly exposed to risks common to the health care industry in general, as well as to specific risks related to their profession and/or to the activities they perform in the course of their work.
The concept of prevention, therefore, must of necessity be incorporated into the health care field and encompass:
We should be aware that the environment is directly and intimately related to the safety and hygiene in the workplace, because natural resources are consumed at work, and because these resources are later reincorporated into our surroundings. Our quality of life will be good or bad depending on whether we make correct use of these resources and use appropriate technologies.
Everyone’s involvement is necessary in order to further:
Goals
Such a programme should endeavour to:
Plan
A hospital should be conceived as a system that, through a number of processes, generates services. These services are the main goal of the activities performed in a hospital.
For the process to begin, certain commitments of energy, investment and technology are needed, and these in turn generate their own emissions and wastes; their sole justification is the provision of service.
In addition to these prerequisites, consideration should be given to the conditions of the areas of the building where these activities will take place, since they have been designed a certain way and built with basic construction materials.
Control, planning and coordination are all necessary for an integrated safety, health and environmental project to succeed.
Methodology
Because of the complexity and the variety of risks in the health care field, multidisciplinary groups are required if solutions to each particular problem are to be found.
It is important for health care workers to be able to collaborate in safety studies, participating in the decisions made to improve their working conditions. In this way, changes will be viewed with a better attitude and the guidelines will be more readily accepted.
The safety, hygiene and environmental service should advise, stimulate and coordinate the programmes developed at the health centre. Responsibility for their implementation should fall upon whoever heads up the service where this programme will be followed. This is the only way to involve the entire organization.
In each particular case, the following will be selected:
The study will consist of:
In order to implement the plan successfully it will always be necessary to:
This type of study may be a global one encompassing the centre as a whole (e.g., internal plan for the disposal of hospital wastes) or partial, encompassing only one concrete area (e.g., where cancer chemotherapeutic drugs are prepared).
The study of these factors will give an idea of the degree to which safety measures are disregarded, as much from the legal as from the scientific point of view. The concept of “legal” here encompasses advances in science and technology as they occur, which requires the constant revision and modification of established norms and guidelines.
It would be convenient indeed if the regulations and the laws by which safety, hygiene and the environment are regulated were the same in all countries, something that would make the installation, management and use of technology or products from other countries much easier.
Results
The following examples show some of the measures that can be taken while following the aforementioned methodology.
Laboratories
An advisory service can be developed involving professionals of the various laboratories and coordinated by the safety and hygiene service of the medical centre. The main goal would be to improve the safety and health of the occupants of all the labs, involving and giving responsibility to the entire professional staff of each and trying at the same time to make sure that these activities do not have a negative impact on public health and the environment.
The measures taken should include:
Mercury
Thermometers, when broken, release mercury into the environment. A pilot project with “unbreakable” thermometers has been started with a view to eventually substituting them for glass thermometers. In some countries, such as the United States, electronic thermometers have very largely replaced mercury thermometers.
Training the workers
The training and the commitment of the workers is the most important part of an integrated safety, health and environment programme. Given enough resources and time, the technicalities of almost any problem can be solved, but a complete solution will not be achieved without informing the workers of the risks and training them to avoid or control them. The training and education must be continuous, integrating health and safety techniques into all the other training programmes in the hospital.
Conclusions
The results that have been achieved so far in applying this work model allow us to be optimistic. They have shown that when people are informed about the whys and wherefores, their attitude toward change is very positive.
The response of health care personnel has been very good. They feel more motivated in their work and more valued when they have participated directly in the study and in the decision-making process. This participation, in turn, helps to educate the individual health care worker and to increase the degree of responsibility he or she is willing to accept.
The attainment of the goals of this project is a long-term objective, but the positive effects it generates more than compensate for the effort and the energy invested in it.
The maintenance and enhancement of health, and the safety and comfort of people in health care facilities, are seriously affected if specific building requirements are not met. Health care facilities are rather unique buildings, in which heterogeneous environments coexist. Different people, several activities in each environment and many risk factors are involved in the pathogenesis of a broad spectrum of diseases. Functional organization criteria classify health care facility environments as follows: nursing units, operating theatres, diagnostic facilities (radiology unit, laboratory units and so on), outpatients’ departments, administration area (offices), dietary facilities, linen services, engineering services and equipment areas, and corridors and passages.

The people who attend a hospital comprise health personnel, staff personnel, patients (long-stay inpatients, acute inpatients and outpatients) and visitors. The processes include activities specific to health care (diagnostic, therapeutic and nursing activities) and activities common to many public buildings (office work, technological maintenance, food preparation and so on). The risk factors are physical agents (ionizing and non-ionizing radiation, noise, lighting and microclimatic factors), chemicals (e.g., organic solvents and disinfectants), biological agents (viruses, bacteria, fungi and so on), ergonomic factors (postures, lifting and so on) and psychological and organizational factors (e.g., environmental perceptions and work hours). The illnesses related to these factors range from environmental annoyance or discomfort (e.g., thermal discomfort or irritative symptoms) to severe diseases (e.g., hospital-acquired infections and traumatic accidents).

In this perspective, risk assessment and control require an interdisciplinary approach involving physicians, hygienists, engineers, architects, economists and so on, and the fulfilment of preventive measures in building planning, design, construction and management. Specific building requirements are extremely important among these preventive measures and, according to the guidelines for healthy buildings introduced by Levin (1992), they should be classified as follows:
This article focuses on general hospital buildings. Obviously, adaptations would be required for specialty hospitals (e.g., orthopaedic centres, eye and ear hospitals, maternity centres, psychiatric institutions, long-term care facilities and rehabilitation institutes), for ambulatory care clinics, emergency/urgent care facilities and offices for individual and group practices. These will be determined by the numbers and types of patients (including their physical and mental status) and by the number of HCWs and the tasks they perform. Considerations promoting the safety and well-being of both patients and staff that are common to all health care facilities include:
Site Planning Requirements
The health care facility site must be chosen following four main criteria (Catananti and Cambieri 1990; Klein and Platt 1989; Decree of the President of Ministers Council 1986; Commission of the European Communities 1990; NHS 1991a, 1991b):
Architectural Design
Health care facilities architectural design usually follows several criteria:
The listed criteria lead health care facility planners to choose the best building shape for each situation, ranging essentially from an extended horizontal hospital with scattered buildings to a monolithic vertical or horizontal building (Llewelyn-Davies and Wecks 1979). The first case (a preferable format for low-density buildings) is normally used for hospitals of up to 300 beds, because of its low construction and management costs. It is particularly suited to small rural hospitals and community hospitals (Llewelyn-Davies and Wecks 1979). The second case (usually preferred for high-density buildings) becomes cost-effective for hospitals with more than 300 beds, and it is advisable for acute-care hospitals (Llewelyn-Davies and Wecks 1979). The internal space dimensions and distribution have to cope with many variables, among which one can consider: functions, processes, circulation and connections to other areas, equipment, predicted workload, costs, and flexibility, convertibility and susceptibility to shared use. Compartments, exits, fire alarms, automatic extinction systems and other fire prevention and protection measures should follow local regulations. Furthermore, several specific requirements have been defined for each area in health care facilities:
1. Nursing units. The internal layout of nursing units usually follows one of three basic models (Llewelyn-Davies and Wecks 1979): an open ward (or “Nightingale” ward), a broad room with 20 to 30 beds, heads to the windows, ranged along both walls; the “Rigs” layout, in which beds were placed parallel to the windows, at first in open bays on either side of a central corridor (as at Rigs Hospital in Copenhagen) and in later hospitals in bays that were often enclosed, becoming rooms with 6 to 10 beds; and small rooms with 1 to 4 beds. Four variables should lead the planner to choose the best layout: bed need (if high, an open ward is advisable), budget (if low, an open ward is the cheapest), privacy needs (if considered high, small rooms are unavoidable) and intensive-care level (if high, the open ward or a Rigs layout with 6 to 10 beds is advisable). The space requirements should be at least: 6 to 8 square metres (sqm) per bed for open wards, inclusive of circulation and ancillary rooms (Llewelyn-Davies and Wecks 1979); 5 to 7 sqm per bed for multiple bedrooms and 9 sqm for single bedrooms (Decree of the President of Ministers Council 1986; American Institute of Architects Committee on Architecture for Health 1987); a worked area example follows this list. In open wards, toilet facilities should be close to patients’ beds (Llewelyn-Davies and Wecks 1979). For single and multiple bedrooms, handwashing facilities should be provided in each room; lavatories may be omitted where a toilet room is provided to serve one single-bed room or one two-bed room (American Institute of Architects Committee on Architecture for Health 1987). Nursing stations should be large enough to accommodate desks and chairs for record keeping, tables and cabinets for preparation of drugs, instruments and supplies, chairs for sit-down conferences with physicians and other staff members, a wash-up sink and access to a staff toilet.
2. Operating theatres. Two main classes of elements should be considered: operating rooms and service areas (American Institute of Architects Committee on Architecture for Health 1987). Operating rooms should be classified as follows:
Service areas should include: sterilizing facility with high-speed autoclave, scrub facilities, medical gas storage facilities and staff clothing change areas.
3. Diagnostic facilities: Each radiology unit should include (Llewelyn-Davies and Wecks 1979; American Institute of Architects Committee on Architecture for Health 1987):
The wall thickness in a radiology unit should be 8 to 12 cm (poured concrete) or 12 to 15 cm (cinder block or bricks). The diagnostic activities in health care facilities may require tests in haematology, clinical chemistry, microbiology, pathology and cytology. Each laboratory area should be provided with work areas, sample and material storage facilities (refrigerated or not), specimen collection facilities, facilities and equipment for terminal sterilization and waste disposal, and a special facility for radioactive material storage (where necessary) (American Institute of Architects Committee on Architecture for Health 1987).
4. Outpatient departments. Clinical facilities should include (American Institute of Architects Committee on Architecture for Health 1987): general-purpose examination rooms (7.4 sqm), special-purpose examination rooms (varying with the specific equipment needed) and treatment rooms (11 sqm). In addition, administrative facilities are needed for the admittance of outpatients.
5. Administration area (offices). Facilities such as common office building areas are needed. These include a loading dock and storage areas for receiving supplies and equipment and dispatching materials not disposed of by the separate waste removal system.
6. Dietary facilities (optional). Where present, these should provide the following elements (American Institute of Architects Committee on Architecture for Health 1987): a control station for receiving and controlling food supplies, storage spaces (including cold storage), food preparation facilities, handwashing facilities, facility for assembling and distributing patients’ meals, dining space, dishwashing space (located in a room or an alcove separated from the food preparation and serving area), waste storage facilities and toilets for dietary staff.
7. Linen services (optional). Where present, these should provide the following elements: a room for receiving and holding soiled linen, a clean-linen storage area, a clean-linen inspection and mending area and handwashing facilities (American Institute of Architects Committee on Architecture for Health 1987).
8. Engineering services and equipment areas. Adequate areas, varying in size and characteristics for each health care facility, have to be provided for: boiler plant (and fuel storage, if necessary), electrical supply, emergency generator, maintenance workshops and stores, cold-water storage, plant rooms (for centralized or local ventilation) and medical gases (NHS 1991a).
9. Corridors and passages. These have to be organized to avoid confusion for visitors and disruptions in the work of hospital personnel; circulation of clean and dirty goods should be strictly separated. Minimum corridor width should be 2 m (Decree of the President of Ministers Council 1986). Doorways and elevators must be large enough to allow easy passage of stretchers and wheelchairs.
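Returning to the nursing-unit space standards in item 1, the following sketch in Python shows the simple arithmetic those per-bed minimums imply. The bed count, layout names and function are illustrative only.

```python
# Rough ward-area estimate from the per-bed minimums quoted in item 1.
# Layout names and the 24-bed example are illustrative assumptions.
SPACE_PER_BED_SQM = {
    "open ward": (6, 8),          # inclusive of circulation and ancillary rooms
    "multiple bedroom": (5, 7),
    "single bedroom": (9, 9),
}

def ward_area(layout: str, beds: int) -> tuple[float, float]:
    """Return the (minimum, maximum) floor area implied by the standards."""
    lo, hi = SPACE_PER_BED_SQM[layout]
    return beds * lo, beds * hi

# Example: a 24-bed open ward needs roughly 144-192 sqm.
lo, hi = ward_area("open ward", 24)
print(f"24-bed open ward: {lo:.0f}-{hi:.0f} sqm")
```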
Requirements for Building Materials and Furnishings
The choice of materials in modern health care facilities is often aimed at reducing the risk of accidents and fire: materials must be non-flammable and must not produce noxious gases or smoke when burnt (American Institute of Architects Committee on Architecture for Health 1987). Trends in hospital floor-covering materials have shown a shift from stone materials and linoleum to polyvinyl chloride (PVC). In operating rooms, in particular, PVC is considered the best choice to avoid the electrostatic effects that may cause explosion of flammable anaesthetic gases. Until a few years ago, walls were painted; today, PVC coverings and fibreglass wallpaper are the most widely used wall finishes. False ceilings are today built mainly from mineral fibres instead of gypsum board; a new trend appears to be the use of stainless steel ceilings (Catananti et al. 1993). However, a more complete approach should consider that each material and furnishing may affect the outdoor and indoor environmental systems. Carefully chosen building materials may reduce environmental pollution and high social costs and improve the safety and comfort of building occupants. At the same time, internal materials and finishes may influence the functional performance of the building and its management. Moreover, the choice of materials in hospitals should also consider specific criteria, such as ease of cleaning, washing and disinfecting procedures and susceptibility to becoming a habitat for living beings. A more detailed classification of criteria to be considered in this task, derived from the European Community Council Directive No. 89/106 (Council of the European Communities 1988), is shown in table 1.
Table 1. Criteria and variables to be considered in the choice of materials
Criteria | Variables
Functional performance | Static load, transit load, impact load, durability, construction requirements
Safety | Collapse risk; fire risk (reaction to fire, fire resistance, flammability); static electric charge (explosion risk); disperse electric power (electric shock risk); sharp surfaces (wound risk); poisoning risk (hazardous chemical emission); slip risk; radioactivity
Comfort and pleasantness | Acoustic comfort (features related to noise), optical and visual comfort (features related to light), tactile comfort (consistency, surface), hygrothermal comfort (features related to heat), aesthetics, odour emissions, indoor air quality perception
Hygienicity | Habitat for living beings (insects, moulds, bacteria); susceptibility to stains; susceptibility to dust; ease of cleaning, washing and disinfecting; maintenance procedures
Flexibility | Susceptibility to modifications, conformational factors (tile or panel dimensions and morphology)
Environmental impact | Raw material, industrial manufacturing, waste management
Cost | Material cost, installation cost, maintenance cost
Source: Catananti et al. 1994.
On the matter of odour emissions, it should be observed that correct ventilation after the installation of floor or wall coverings, or after renovation work, reduces the exposure of personnel and patients to indoor pollutants (especially volatile organic compounds (VOCs)) emitted by building materials and furnishings.
Requirements for Heating, Ventilation and Air-Conditioning Systems and for Microclimatic Conditions
The control of microclimatic conditions in health care facilities areas may be carried out by heating, ventilation and/or air-conditioning systems (Catananti and Cambieri 1990). Heating systems (e.g., radiators) permit only temperature regulation and may be sufficient for common nursing units. Ventilation, which induces changes of air speed, may be natural (e.g., by porous building materials), supplementary (by windows) or artificial (by mechanical systems). The artificial ventilation is especially recommended for kitchens, laundries and engineering services. Air-conditioning systems, particularly recommended for some health care facility areas such as operating rooms and intensive-care units, should guarantee:
General requirements of air-conditioning systems include outdoor intake locations, air filter features and air supply outlets (ASHRAE 1987). Outdoor intake locations should be at least 9.1 m from pollution sources such as exhaust outlets of combustion equipment stacks, medical-surgical vacuum systems, ventilation exhaust outlets from the hospital or adjoining buildings, areas that may collect vehicular exhaust and other noxious fumes, and plumbing vent stacks. In addition, they should be at least 1.8 m above ground level; where these components are installed above the roof, they should be at least 0.9 m above roof level.
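These siting distances are simple numeric constraints, so a design check is easy to express in code. The sketch below in Python restates the ASHRAE figures quoted above; the function and field names are illustrative assumptions.

```python
# Sketch of a check against the ASHRAE intake-siting distances quoted above.
# Function and parameter names are illustrative assumptions.
MIN_SOURCE_DISTANCE_M = 9.1   # from pollution sources
MIN_ABOVE_GROUND_M = 1.8
MIN_ABOVE_ROOF_M = 0.9

def intake_ok(dist_to_nearest_source_m: float,
              height_m: float,
              on_roof: bool) -> bool:
    """True if a proposed outdoor-air intake meets the distances above."""
    if dist_to_nearest_source_m < MIN_SOURCE_DISTANCE_M:
        return False
    min_height = MIN_ABOVE_ROOF_M if on_roof else MIN_ABOVE_GROUND_M
    return height_m >= min_height

print(intake_ok(12.0, 2.5, on_roof=False))  # True
print(intake_ok(7.5, 2.5, on_roof=False))   # False: too close to a source
```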
The number and efficiency of filters should be adequate for the specific areas supplied by the air-conditioning systems. For example, two filter beds of 25% and 90% efficiency should be used in operating rooms, intensive-care units and organ transplant rooms. Installation and maintenance of filters should follow several criteria: no leakage between filter segments or between the filter bed and its supporting frame; installation of a manometer in the filter system to read the pressure drop, so that expired filters can be identified; and provision of adequate facilities for maintenance without introducing contamination into the air flow. Air supply outlets should be located on the ceiling, with perimeter or several exhaust inlets near the floor (ASHRAE 1987).
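A minimal sketch of the two checks just described, in Python; the threshold values and names are illustrative assumptions, since real systems use the filter manufacturer's rated final pressure drop.

```python
# Sketch of two checks described above: the two-bed filter configuration
# for critical areas, and the manometer-based end-of-life test.
CRITICAL_AREA_FILTER_BEDS = (0.25, 0.90)  # efficiencies for ORs, ICUs, transplant rooms

def filter_config_ok(efficiencies: tuple[float, ...]) -> bool:
    """True if the installed beds meet or exceed the required efficiencies, in order."""
    required = CRITICAL_AREA_FILTER_BEDS
    return (len(efficiencies) >= len(required)
            and all(e >= r for e, r in zip(efficiencies, required)))

def filter_expired(pressure_drop_pa: float, rated_final_drop_pa: float) -> bool:
    """Flag a filter for replacement once its measured pressure drop
    (from the in-line manometer) reaches the rated final value."""
    return pressure_drop_pa >= rated_final_drop_pa

print(filter_config_ok((0.30, 0.95)))   # True
print(filter_expired(260.0, 250.0))     # True: replace filter
```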
Ventilation rates for health care facility areas that permit air purity and comfort of occupants are listed in table 2.
Table 2. Ventilation requirements in health care facilities areas
Areas | Pressure relationship to adjacent areas | Minimum air changes of outdoor air per hour supplied to room | Minimum total air changes per hour supplied to room | All air exhausted directly to outdoors | Recirculated within room units
Nursing units | | | | |
Patient room | +/– | 2 | 2 | Optional | Optional
Intensive care | P | 2 | 6 | Optional | No
Patient corridor | +/– | 2 | 4 | Optional | Optional
Operating theatres | | | | |
Operating room (all-outdoor system) | P | 15 | 15 | Yes1 | No
Operating room (recirculating system) | P | 5 | 25 | Optional | No2
Diagnostic facilities | | | | |
X ray | +/– | 2 | 6 | Optional | Optional
Laboratories | | | | |
Bacteriology | N | 2 | 6 | Yes | No
Clinical chemistry | P | 2 | 6 | Optional | No
Pathology | N | 2 | 6 | Yes | No
Serology | P | 2 | 6 | Optional | No
Sterilizing | N | Optional | 10 | Yes | No
Glasswashing | N | 2 | 10 | Yes | Optional
Dietary facilities | | | | |
Food preparation centres3 | +/– | 2 | 10 | Yes | No
Dishwashing | N | Optional | 10 | Yes | No
Linen service | | | | |
Laundry (general) | +/– | 2 | 10 | Yes | No
Soiled linen sorting and storage | N | Optional | 10 | Yes | No
Clean linen storage | P | 2 (Optional) | 2 | Optional | Optional
P = Positive. N = Negative. +/– = Continuous directional control not required.
1 For operating rooms, the use of 100% outside air should be limited to those cases where local codes require it, and then only if heat-recovery devices are used; 2 recirculating room units meeting the filtering requirement for the space may be used; 3 food preparation centres shall have ventilation systems with an excess of air supply for positive pressure when hoods are not in operation. The number of air changes may be varied to any extent required for odour control when the space is not in use.
Source: ASHRAE 1987.
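To turn the air-change rates in table 2 into a design airflow, multiply by the room volume: Q (m3/h) = ACH x room volume (m3). The sketch below in Python illustrates the arithmetic; the room dimensions are chosen arbitrarily.

```python
# Airflow implied by an air-change requirement: Q (m3/h) = ACH * volume (m3).
# Room dimensions and the selected row of table 2 are illustrative assumptions.
def required_airflow_m3h(air_changes_per_hour: float,
                         length_m: float, width_m: float, height_m: float) -> float:
    """Supply airflow needed to achieve the given air changes per hour."""
    volume_m3 = length_m * width_m * height_m
    return air_changes_per_hour * volume_m3

# Example: an operating room of 6 m x 6 m x 3 m on an all-outdoor system
# (15 total air changes per hour, per table 2) needs 1,620 m3/h.
print(required_airflow_m3h(15, 6, 6, 3))
```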
Specific requirements of air-conditioning systems and microclimatic conditions regarding several hospital areas are reported as follows (ASHRAE 1987):
Nursing units. In common patient rooms a temperature (T) of 24 °C with 30% relative humidity (RH) in winter, and a T of 24 °C with 50% RH in summer, are recommended. In intensive-care units, a variable temperature range capability of 24 to 27 °C, an RH of 30% minimum and 60% maximum, and positive air pressure are recommended. In units for immunosuppressed patients, positive pressure should be maintained between the patient room and the adjacent area, and HEPA filters should be used.
In the full-term nursery, a T of 24 °C with RH from 30% minimum to 60% maximum is recommended. The special-care nursery requires the same microclimatic conditions as intensive-care units.
Operating theatres. A variable temperature range capability of 20 to 24 °C, with RH of 50% minimum and 60% maximum, and positive air pressure are recommended in operating rooms. A separate air-exhaust system or special vacuum system should be provided to remove trace anaesthetic gases (see “Waste anaesthetic gases” in this chapter).
Diagnostic facilities. In the radiology unit, fluoroscopic and radiographic rooms require T of 24 to 27 °C and RH of 40 to 50%. Laboratory units should be supplied with adequate hood exhaust systems to remove dangerous fumes, vapours and bioaerosols. The exhaust air from the hoods of the units of clinical chemistry, bacteriology and pathology should be discharged to the outdoors with no recirculation. Also, the exhaust air from infectious disease and virology laboratories requires sterilization before being exhausted to the outdoors.
Dietary facilities. These should be provided with hoods over the cooking equipment for removal of heat, odours and vapours.
Linen services. The sorting room should be maintained at a negative pressure in relation to adjoining areas. In the laundry processing area, washers, flatwork ironers, tumblers, and so on should have direct overhead exhaust to reduce humidity.
Engineering services and equipment areas. At work stations, the ventilation system should limit temperature to 32 °C.
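The area-specific setpoints quoted above lend themselves to a simple compliance check. In the sketch below in Python, the dictionary of limits merely restates the ranges from this section, and all names and structure are illustrative assumptions.

```python
# Sketch of a setpoint check against the ASHRAE ranges quoted above.
# Dictionary keys and structure are illustrative assumptions.
AREA_LIMITS = {
    # area: (T min C, T max C, RH min %, RH max %)
    "patient room (winter)": (24, 24, 30, 30),
    "intensive care": (24, 27, 30, 60),
    "operating room": (20, 24, 50, 60),
    "radiology (fluoroscopy)": (24, 27, 40, 50),
}

def in_range(area: str, temp_c: float, rh_pct: float) -> bool:
    """True if measured temperature and humidity fall within the recommended range."""
    t_lo, t_hi, rh_lo, rh_hi = AREA_LIMITS[area]
    return t_lo <= temp_c <= t_hi and rh_lo <= rh_pct <= rh_hi

print(in_range("operating room", 22.0, 55.0))  # True
print(in_range("operating room", 26.0, 55.0))  # False: too warm
```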
Conclusion
The essence of specific building requirements for health care facilities is the accommodation of external standard-based regulations to subjective index-based guidelines. In fact, subjective indices, such as the Predicted Mean Vote (PMV) (Fanger 1973) and the olf, a measure of odour (Fanger 1992), can predict the comfort levels of patients and personnel without neglecting the differences related to their clothing, metabolism and physical status. Finally, the planners and architects of hospitals should follow the theory of “building ecology” (Levin 1992), which describes dwellings as a complex series of interactions among buildings, their occupants and the environment. Health facilities, accordingly, should be planned and built focusing on the whole “system” rather than on any particular partial frame of reference.
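For reference, PMV is computed from Fanger's heat-balance equation. The sketch below in Python follows the widely reproduced ISO 7730 reference algorithm, with PPD (Predicted Percentage Dissatisfied) derived from PMV; it is a standard implementation sketch, not code from the cited sources, and the example inputs are arbitrary.

```python
# PMV/PPD per Fanger's comfort equation, following the widely reproduced
# ISO 7730 reference algorithm. Inputs in the example are arbitrary.
import math

def pmv_ppd(ta, tr, vel, rh, met, clo, wme=0.0):
    """ta: air temp (C); tr: mean radiant temp (C); vel: air speed (m/s);
    rh: relative humidity (%); met: metabolic rate (met);
    clo: clothing insulation (clo); wme: external work (met), usually 0."""
    pa = rh * 10.0 * math.exp(16.6536 - 4030.183 / (ta + 235.0))  # vapour pressure, Pa
    icl = 0.155 * clo              # clothing insulation, m2K/W
    m = met * 58.15                # metabolic rate, W/m2
    mw = m - wme * 58.15
    fcl = 1.05 + 0.645 * icl if icl > 0.078 else 1.0 + 1.29 * icl
    hcf = 12.1 * math.sqrt(vel)    # forced-convection coefficient
    taa, tra = ta + 273.0, tr + 273.0
    # Iterate for the clothing surface temperature.
    tcla = taa + (35.5 - ta) / (3.5 * icl + 0.1)
    p1 = icl * fcl
    p2 = p1 * 3.96
    p3 = p1 * 100.0
    p4 = p1 * taa
    p5 = 308.7 - 0.028 * mw + p2 * (tra / 100.0) ** 4
    xn, xf = tcla / 100.0, tcla / 50.0
    for _ in range(150):
        if abs(xn - xf) <= 0.00015:
            break
        xf = (xf + xn) / 2.0
        hcn = 2.38 * abs(100.0 * xf - taa) ** 0.25  # natural convection
        hc = max(hcf, hcn)
        xn = (p5 + p4 * hc - p2 * xf ** 4) / (100.0 + p3 * hc)
    tcl = 100.0 * xn - 273.0
    # Heat losses: skin diffusion, sweating, respiration (latent, dry),
    # radiation and convection.
    hl1 = 3.05e-3 * (5733.0 - 6.99 * mw - pa)
    hl2 = 0.42 * (mw - 58.15) if mw > 58.15 else 0.0
    hl3 = 1.7e-5 * m * (5867.0 - pa)
    hl4 = 0.0014 * m * (34.0 - ta)
    hl5 = 3.96 * fcl * (xn ** 4 - (tra / 100.0) ** 4)
    hl6 = fcl * hc * (tcl - ta)
    ts = 0.303 * math.exp(-0.036 * m) + 0.028
    pmv = ts * (mw - hl1 - hl2 - hl3 - hl4 - hl5 - hl6)
    ppd = 100.0 - 95.0 * math.exp(-0.03353 * pmv ** 4 - 0.2179 * pmv ** 2)
    return pmv, ppd

# Example: 23 C air and radiant temperature, still air, 50% RH,
# light activity (1.2 met), light indoor clothing (0.5 clo).
print(pmv_ppd(23.0, 23.0, 0.1, 50.0, 1.2, 0.5))
```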
With the advent of universal precautions against bloodborne infections, which dictate the use of gloves whenever HCWs are exposed to patients or materials that might be infected with hepatitis B or HIV, the frequency and severity of allergic reactions to natural rubber latex (NRL) have risen sharply. For example, the Department of Dermatology at Erlangen-Nuremberg University in Germany reported a 12-fold increase in the number of patients with latex allergy between 1989 and 1995. More serious systemic manifestations increased from 10.7% in 1989 to 44% in 1994-1995 (Hesse et al. 1996).
It seems ironic that so much difficulty is attributable to rubber gloves when they were intended to protect the hands of nurses and other HCWs when they were originally introduced toward the end of the nineteenth century. This was the era of antiseptic surgery in which instruments and operative sites were bathed in caustic solutions of carbolic acid and bichloride of mercury. These not only killed germs but they also macerated the hands of the surgical team. According to what has become a romantic legend, William Stewart Halsted, one of the surgical “giants” of the time who is credited with a host of contributions to the techniques of surgery, is said to have “invented” rubber gloves around 1890 to make it more pleasant to hold hands with Caroline Hampton, his scrub nurse, whom he later married (Townsend 1994). Although Halsted may be credited with introducing and popularizing the use of rubber surgical gloves in the United States, many others had a hand in it, according to Miller (1982) who cited a report of their use in the United Kingdom published a half century earlier (Acton 1848).
Latex Allergy
Allergy to NRL is succinctly described by Taylor and Leow (see the article “Rubber contact dermatitis and latex allergy” in the chapter Rubber industry) as “an immunoglobulin E-mediated, immediate, Type I allergic reaction, most always due to NRL proteins present in medical and non-medical latex devices. The spectrum of clinical signs ranges from contact urticaria, generalized urticaria, allergic rhinitis, allergic conjunctivitis, angioedema (severe swelling) and asthma (wheezing) to anaphylaxis (severe, life-threatening allergic reaction)”. Symptoms may result from direct contact of normal or inflamed skin with gloves or other latex-containing materials, or indirectly from mucosal contact with, or inhalation of, aerosolized NRL proteins or talcum powder particles to which NRL proteins have adhered. Such indirect contact can cause a Type IV reaction to the rubber accelerators. (Approximately 80% of “latex glove allergy” is actually a Type IV reaction to the accelerators.) The diagnosis is confirmed by patch, prick, scratch or other skin sensitivity tests or by serological studies for the immunoglobulin. In some individuals, the latex allergy is associated with allergy to certain foods (e.g., banana, chestnuts, avocado, kiwi and papaya).
While most common among health care workers, latex allergy is also found among employees in rubber manufacturing plants, other workers who habitually use rubber gloves (e.g., greenhouse workers (Carillo et al. 1995)) and in patients with a history of multiple surgical procedures (e.g., spina bifida, congenital urogenital abnormalities, etc.) (Blaycock 1995). Cases of allergic reactions after the use of latex condoms have been reported (Jonasson, Holm and Leegard 1993), and in one case, a potential reaction was averted by eliciting a history of an allergic reaction to a rubber swimming cap (Burke, Wilson and McCord 1995). Reactions have occurred in sensitive patients when hypodermic needles used to prepare doses of parenteral medications picked up NRL protein as they were pushed through the rubber caps on the vials.
According to a recent study of 63 patients with NRL allergy, it took an average of 5 years of working with latex products for the first symptoms, usually a contact urticaria, to develop. Some also had rhinitis or dyspnoea. It took, on average, an additional 2 years for the appearance of lower respiratory tract symptoms (Allmeers et al. 1996).
Frequency of latex allergy
To determine the frequency of NRL allergy, allergy tests were performed on 224 employees of the University of Cincinnati College of Medicine, including nurses, laboratory technicians, physicians, respiratory therapists, housekeeping and clerical workers (Yassin et al. 1994). Of these, 38 (17%) tested positive to latex extracts; the prevalence ranged from 0% among housekeeping workers to 38% among dental staff. Exposure of these sensitized individuals to latex caused itching in 84%, a skin rash in 68%, urticaria in 55%, lachrymation and ocular itching in 45%, nasal congestion in 39% and sneezing in 34%. Anaphylaxis occurred in 10.5%.
In a similar study at the University of Oulu in Finland, 56% of 534 hospital employees who used protective latex or vinyl gloves on a daily basis had skin disorders related to the use of the gloves (Kujala and Reilula 1995). Rhinorrhoea or nasal congestion was present in 13% of workers who used powdered gloves. The prevalence of both skin and respiratory symptoms was significantly higher among those who used the gloves for more than 2 hours a day.
Valentino and colleagues (1994) reported latex-induced asthma in four health care workers in an Italian regional hospital. At the Mayo Medical Center in Rochester, Minnesota, where 342 employees who reported symptoms suggestive of latex allergy were evaluated, 16 episodes of latex-related anaphylaxis were recorded in 12 subjects (six episodes occurred after skin testing) (Hunt et al. 1995). The Mayo researchers also reported respiratory symptoms in workers who did not wear gloves but worked in areas where large numbers of gloves were being used, presumably due to airborne talcum powder/latex protein particles.
Control and Prevention
The most effective preventive measure is modification of standard procedures to replace gloves and equipment made with NRL with similar items made of vinyl or other non-rubber materials. This requires the involvement of the purchasing and supply departments, which should also mandate the labelling of all latex-containing items so that they can be avoided by individuals with latex sensitivity. This is important not only for staff but also for patients who may have a history suggestive of latex allergy. Aerosolized latex protein carried on glove powder is also problematic: HCWs who are allergic to latex and who do not use latex gloves may still be affected by the powdered latex gloves used by co-workers. A significant problem is presented by the wide variation in latex allergen content among gloves from different manufacturers and, indeed, among different lots of gloves from the same manufacturer.
Glove manufacturers are experimenting with gloves using formulations with smaller amounts of NRL as well as coatings that will obviate the need for talcum powder to make the gloves easy to put on and take off. The goal is to provide comfortable, easy to wear, non-allergenic gloves that still provide effective barriers to the transmission of the hepatitis B virus, HIV and other pathogens.
A careful medical history with a particular emphasis on prior latex exposures should be elicited from all health care workers who present symptoms suggestive of latex allergy. In suspect cases, evidence of latex sensitivity may be confirmed by skin or serological testing. Since there is evidently a risk of provoking an anaphylactic reaction, the skin testing should only be performed by experienced medical personnel.
At the present time, allergens for desensitization are not available so that the only remedy is avoidance of exposure to products containing NRL. In some instances, this may require a change of job. Weido and Sim (1995) at the University of Texas Medical Branch at Galveston suggest advising individuals in high-risk groups to carry self-injectable epinephrine to use in the event of a systemic reaction.
Following the appearance of several clusters of latex allergy cases in 1990, the Mayo Medical Center in Rochester, Minnesota, formed a multidisciplinary work group to address the problem (Hunt et al. 1996). Subsequently, this was formalized in a Latex Allergy Task Force with members from the departments of allergy, preventive medicine, dermatology and surgery as well as the Director of Purchasing, the Surgical Nursing Clinical Director and the Director of Employee Health. Articles on latex allergy were published in staff newsletters and information bulletins to educate the 20,000-member workforce about the problem and to encourage those with suggestive symptoms to seek medical consultation. A standardized approach to testing for latex sensitivity was developed, along with techniques for quantifying the amount of latex allergen in manufactured products and the amount and particle size of airborne latex allergen. The latter proved to be sufficiently sensitive to measure the exposure of individual workers performing particular high-risk tasks. Steps were initiated to monitor a gradual transition to low-allergen gloves (an incidental effect was a lowering of their cost by concentrating glove purchases among the fewer vendors who could meet the low-allergen requirements) and to minimize exposures of staff and patients with known sensitivity to NRL.
To alert the public to the risks of NRL allergy, a consumer group, the Delaware Valley Latex Allergy Support Network, has been formed. This group has created an Internet website (http://www.latex.org) and maintains a toll-free telephone line (1-800 LATEXNO) to provide up-to-date factual information about latex allergy to persons with this problem and those who care for them. This organization, which has a Medical Advisory Group, maintains a Literature Library and a Product Center and encourages the exchange of experiences among those who have had allergic reactions.
Conclusion
Latex allergies are becoming an increasingly important problem among health care workers. The solution lies in minimizing contact with latex allergen in their work environment, especially by substituting non-latex surgical gloves and appliances.
The use of inhaled anaesthetics was introduced in the decade of 1840 to 1850. The first compounds to be used were diethyl ether, nitrous oxide and chloroform. Cyclopropane and trichloroethylene were introduced many years later (circa 1930-1940), and the use of fluoroxene, halothane and methoxyflurane began in the decade of the 1950s. By the end of the 1960s enflurane was being used and, finally, isoflurane was introduced in the 1980s. Isoflurane is now considered the most widely used inhalation anaesthetic even though it is more expensive than the others. A summary of the physical and chemical characteristics of methoxyflurane, enflurane, halothane, isoflurane and nitrous oxide, the most commonly used anaesthetics, is shown in table 1 (Wade and Stevens 1981).
Table 1. Properties of inhaled anaesthetics
| Property | Isoflurane | Enflurane | Halothane | Methoxyflurane | Dinitrogen oxide (N2O) |
|---|---|---|---|---|---|
| Molecular weight | 184.0 | 184.5 | 197.4 | 165.0 | 44.0 |
| Boiling point | 48.5 °C | 56.5 °C | 50.2 °C | 104.7 °C | — |
| Density (g/ml) | 1.50 | 1.52 (25 °C) | 1.86 (22 °C) | 1.41 (25 °C) | — |
| Vapour pressure at 20 °C (mm Hg) | 250.0 | 175.0 | 243.0 | 25.0 | — |
| Smell | Pleasant, sharp | Pleasant, like ether | Pleasant, sweet | Pleasant, fruity | Pleasant, sweet |
| Partition coefficients: |  |  |  |  |  |
| Blood/gas | 1.40 | 1.9 | 2.3 | 13.0 | 0.47 |
| Brain/gas | 3.65 | 2.6 | 4.1 | 22.1 | 0.50 |
| Fat/gas | 94.50 | 105.0 | 185.0 | 890.0 | 1.22 |
| Liver/gas | 3.50 | 3.8 | 7.2 | 24.8 | 0.38 |
| Muscle/gas | 5.60 | 3.0 | 6.0 | 20.0 | 0.54 |
| Oil/gas | 97.80 | 98.5 | 224.0 | 930.0 | 1.4 |
| Water/gas | 0.61 | 0.8 | 0.7 | 4.5 | 0.47 |
| Rubber/gas | 0.62 | 74.0 | 120.0 | 630.0 | 1.2 |
| Metabolic rate (%) | 0.20 | 2.4 | 15–20 | 50.0 | — |
All of them, with the exception of nitrous oxide (N2O), are hydrocarbons or chlorofluorinated liquid ethers that are administered by vaporization. Isoflurane is the most volatile of these compounds; it is metabolized at the lowest rate and is the least soluble in blood, in fats and in the liver.
Normally, N2O, a gas, is mixed with a halogenated anaesthetic, although they are sometimes used separately, depending on the type of anaesthesia that is required, the characteristics of the patient and the work habits of the anaesthetist. The normally used concentrations are 50 to 66% N2O and up to 2 or 3% of the halogenated anaesthetic (the rest is usually oxygen).
Anaesthesia of the patient is usually started by the injection of a sedative drug followed by an inhaled anaesthetic. The volumes given to the patient are on the order of 4 or 5 litres/minute. Part of the oxygen and anaesthetic gases in the mixture is retained by the patient, while the remainder is exhaled directly into the atmosphere or is recycled into the respirator, depending, among other things, on the type of mask used, on whether the patient is intubated and on whether a recycling system is available. If recycling is available, exhaled air can be recycled after it is cleaned, or it can be vented to the atmosphere, expelled from the operating room or aspirated by a vacuum. Recycling (closed circuit) is not a common procedure, and many respirators do not have exhaust systems; all the air exhaled by the patient, including the waste anaesthetic gases, therefore ends up in the air of the operating room.
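The arithmetic behind these delivery figures is simple to make concrete. The following minimal sketch (in Python; the numbers are illustrative assumptions drawn from the ranges quoted above, not a clinical recommendation) splits a total fresh-gas flow into its N2O, halogenated-agent and oxygen components.

```python
# Minimal sketch: component flows in a fresh-gas mixture, using the
# illustrative figures quoted above. Values are assumptions for the
# example, not a clinical recommendation.

def component_flows(total_flow_l_min: float,
                    n2o_fraction: float,
                    agent_fraction: float) -> dict:
    """Split a total fresh-gas flow (litres/minute) into N2O,
    halogenated agent and oxygen, given volume fractions."""
    if n2o_fraction + agent_fraction >= 1.0:
        raise ValueError("fractions must leave room for oxygen")
    return {
        "N2O": total_flow_l_min * n2o_fraction,
        "halogenated agent": total_flow_l_min * agent_fraction,
        "O2 (balance)": total_flow_l_min * (1.0 - n2o_fraction - agent_fraction),
    }

# Example: 5 L/min total, 66% N2O, 2% agent -> 3.3, 0.1 and 1.6 L/min
print(component_flows(5.0, 0.66, 0.02))
```

At 5 litres/minute with 66% N2O, roughly 3.3 litres of N2O are delivered each minute, which gives a sense of how quickly unscavenged exhaled air can load the room air with waste gas.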
The number of workers occupationally exposed to waste anaesthetic gases is high, because it is not only anaesthetists and their assistants who are exposed, but all the other people who spend time in operating rooms (surgeons, nurses and support staff), dentists who perform oral surgery, personnel in delivery rooms and intensive care units where patients may be under inhaled anaesthesia, and veterinary surgeons. Waste anaesthetic gases are also detected in recovery rooms, where they are exhaled by patients recovering from surgery, and in other areas adjacent to operating rooms because, for reasons of asepsis, operating rooms are kept at positive pressure, which favours the contamination of surrounding areas.
Health Effects
Problems due to the toxicity of anaesthetic gases were not seriously studied until the 1960s, even though a relationship between the illnesses (asthma, nephritis) affecting some of the first professional anaesthetists and their occupation was suspected within a few years of inhaled anaesthetics coming into common use (Ginesta 1989). The appearance of an epidemiological study of more than 300 anaesthetists in the Soviet Union, the Vaisman (1967) survey, was the starting point for several other epidemiological and toxicological studies. These studies, mostly conducted during the 1970s and the first half of the 1980s, focused on the effects of anaesthetic gases, in most cases nitrous oxide and halothane, on people occupationally exposed to them.
The effects observed in most of these studies were an increase in spontaneous abortions among women exposed during or before pregnancy, and among women partners of exposed men; an increase in congenital malformations in children of exposed mothers; and the occurrence of hepatic, renal and neurological problems and of some types of cancer in both men and women (Bruce et al. 1968, 1974; Bruce and Bach 1976). Even though the toxic effects of nitrous oxide and of halothane (and probably its substitutes as well) on the body are not exactly the same, they are commonly studied together, given that exposure generally occurs simultaneously.
It appears likely that there is a correlation between these exposures and an increased risk, particularly for spontaneous abortions and congenital malformations in children of women exposed during pregnancy (Stoklov et al. 1983; Spence 1987; Johnson, Buchan and Reif 1987). As a result, many of the people exposed have expressed great concern. Rigorous statistical analysis of these data, however, casts doubt on the existence of such a relationship. More recent studies reinforce these doubts while chromosomal studies yield ambiguous results.
The works published by Cohen and colleagues (1971, 1974, 1975, 1980), who carried out extensive studies for the American Society of Anaesthetists (ASA), constitute a fairly extensive series of observations. Follow-up publications criticized some of the technical aspects of the earlier studies, particularly with respect to the sampling methodology and, especially, the proper selection of a control group. Other deficiencies included lack of reliable information on the concentrations to which the subjects had been exposed, the methodology for dealing with false positives and the lack of controls for factors such as tobacco and alcohol use, prior reproductive histories and voluntary infertility. Consequently, some of the studies are now even considered invalid (Edling 1980; Buring et al. 1985; Tannenbaum and Goldberg 1985).
Laboratory studies have shown that exposure of animals to ambient concentrations of anaesthetic gases equivalent to those found in operating rooms does cause deterioration in their development, growth and adaptive behaviour (Ferstandig 1978; ACGIH 1991). These are not conclusive, however, since some of these experimental exposures involved anaesthetic or subanaesthetic levels, concentrations significantly higher than the levels of waste gases usually found in operating room air (Saurel-Cubizolles et al. 1994; Tran et al. 1994).
Nevertheless, even acknowledging that a relationship between the deleterious effects and exposures to waste anaesthetic gases has not been definitively established, the fact is that the presence of these gases and their metabolites is readily detected in the air of operating rooms, in exhaled air and in biological fluids. Accordingly, since there is concern about their potential toxicity, and because it is technically feasible to do so without inordinate effort or expense, it would be prudent to take steps to eliminate or reduce to a minimum the concentrations of waste anaesthetic gases in operating rooms and nearby areas (Rosell, Luna and Guardino 1989; NIOSH 1994).
Maximum Allowable Exposure Levels
The American Conference of Governmental Industrial Hygienists (ACGIH) has adopted a threshold limit value-time weighted average (TLV-TWA) of 50 ppm for nitrous oxide and halothane (ACGIH 1994). The TLV-TWA is the guideline for the production of the compound, and the recommendations for operating rooms are that its concentration be kept lower, at a level below 1 ppm (ACGIH 1991). NIOSH sets a limit of 25 ppm for nitrous oxide and of 1 ppm for halogenated anaesthetics, with the additional recommendation that when they are used together, the concentration of halogenated compounds be reduced to a limit of 0.5 ppm (NIOSH 1977b).
With regard to values in biological fluids, the recommended limit for nitrous oxide in urine after 4 hours of exposure at average ambient concentrations of 25 ppm ranges from 13 to 19 μg/L; after 4 hours of exposure at average ambient concentrations of 50 ppm, the range is 21 to 39 μg/L (Guardino and Rosell 1995). If exposure is to a mixture of a halogenated anaesthetic and nitrous oxide, the measurement of nitrous oxide is used as the basis for controlling exposure because, being present at higher concentrations, it is easier to quantify.
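To make the use of these limits concrete, the sketch below (a minimal illustration with hypothetical sample data) computes an 8-hour time-weighted average from a set of timed samples and compares it with the 25 ppm NIOSH figure for nitrous oxide cited above.

```python
# Minimal sketch: an 8-hour time-weighted average (TWA) built from a
# series of area or personal samples. The sample data are hypothetical.

def twa_ppm(samples) -> float:
    """samples: iterable of (concentration_ppm, duration_hours).
    Returns the TWA over an 8-hour reference period; any unsampled
    time is assumed to be at zero exposure."""
    total_dose = sum(c * t for c, t in samples)
    return total_dose / 8.0

n2o_samples = [(120.0, 3.0), (40.0, 2.0), (0.0, 3.0)]  # (ppm, hours)
twa = twa_ppm(n2o_samples)
print(f"N2O TWA = {twa:.1f} ppm")                 # 55.0 ppm
print("exceeds NIOSH 25 ppm limit:", twa > 25.0)  # True
```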
Analytical Measurement
Most of the procedures described for measuring residual anaesthetics in air are based on the capture of these compounds by adsorption or in an inert bag or container, later to be analysed by gas chromatography or infrared spectroscopy (Guardino and Rosell 1985). Gas chromatography is also employed to measure nitrous oxide in urine (Rosell, Luna and Guardino 1989), while isoflurane is not readily metabolized and is therefore seldom measured.
Common Levels of Residual Concentrations in the Air of Operating Rooms
In the absence of preventive measures, such as the extraction of residual gases and/or introducing an adequate supply of new air into the operating suite, personal concentrations of more than 6,000 ppm of nitrous oxide and 85 ppm of halothane have been measured (NIOSH 1977). Concentrations of up to 3,500 ppm and 20 ppm, respectively, in the ambient air of operating rooms, have been measured. The implementation of corrective measures can reduce these concentrations to values below the environmental limits cited earlier (Rosell, Luna and Guardino 1989).
Factors that Affect the Concentration of Waste Anaesthetic Gases
The factors which most directly affect the presence of waste anaesthetic gases in the environment of the operating room are the following.
Method of anaesthesia. The first question to consider is the method of anaesthesia, for example, whether or not the patient is intubated and the type of face mask being used. In dental, laryngeal or other forms of surgery in which intubation is precluded, the patient’s expired air would be an important source of emissions of waste gases, unless equipment specifically designed to trap these exhalations is properly placed near the patient’s breathing zone. Accordingly, dental and oral surgeons are considered to be particularly at risk (Cohen, Belville and Brown 1975; NIOSH 1977a), as are veterinary surgeons (Cohen, Belville and Brown 1974; Moore, Davis and Kaczmarek 1993).
Proximity to the focus of emission. As is usual in industrial hygiene, when the known point of emission of a contaminant exists, proximity to the source is the first factor to consider when dealing with personal exposure. In this case, the anaesthetists and their assistants are the persons most directly affected by the emission of waste anaesthetic gases, and personal concentrations have been measured in the order of two times the average levels found in the air of operating rooms (Guardino and Rosell 1985).
Type of circuit. In the few cases in which closed circuits are used, with rebreathing after the exhaled air is cleaned and resupplied with oxygen and the necessary anaesthetics, there will be no emissions except in the case of equipment malfunction or a leak. In other cases, emissions will depend on the characteristics of the system used, as well as on whether or not it is possible to add an extraction system to the circuit.
The concentration of anaesthetic gases. Another factor to take into account is the concentrations of the anaesthetics used since, obviously, those concentrations and the amounts found in the air of the operating room are directly related (Guardino and Rosell 1985). This factor is especially important when it comes to surgical procedures of long duration.
Type of surgical procedures. The duration of the operations, the time elapsed between procedures done in the same operating room and the specific characteristics of each procedure—which often determine which anaesthetics are used—are other factors to consider. The duration of the operation directly affects the residual concentration of anaesthetics in the air. In operating rooms where procedures are scheduled successively, the time elapsed between them also affects the presence of residual gases. Studies done in large hospitals with uninterrupted use of the operating rooms or with emergency operating rooms that are used beyond standard work schedules, or in operating rooms used for prolonged procedures (transplants, laryngotomies), show that substantial levels of waste gases are detected even before the first procedure of the day. This contributes to increased levels of waste gases in subsequent procedures. On the other hand, there are procedures that require temporary interruptions of inhalation anaesthesia (where extracorporeal circulation is needed, for example), and this also interrupts the emission of waste anaesthetic gases into the environment (Guardino and Rosell 1985).
Characteristics specific to the operating room. Studies done in operating rooms of different sizes, design and ventilation (Rosell, Luna and Guardino 1989) have demonstrated that these characteristics greatly influence the concentration of waste anaesthetic gases in the room. Large and non-partitioned operating rooms tend to have the lowest measured concentrations of waste anaesthetic gases, while in small operating rooms (e.g., paediatric operating rooms) the measured concentrations of waste gases are usually higher. The general ventilation system of the operating room and its proper operation is a fundamental factor for the reduction of the concentration of waste anaesthetics; the design of the ventilation system also affects the circulation of waste gases within the operating room and the concentrations in different locations and at various heights, something that can be easily verified by carefully taking samples.
Characteristics specific to the anaesthesia equipment. The emission of gases into the environment of the operating room depends directly on the characteristics of the anaesthesia equipment used. The design of the system, whether it includes a system for the return of excess gases, whether it can be attached to a vacuum or vented out of the operating room, whether it has leaks, disconnected lines and so on are always to be considered when determining the presence of waste anaesthetic gases in the operating room.
Factors specific to the anaesthetist and his or her team. The anaesthetist and his or her team are the last element to consider, but not necessarily the least important. Knowledge of the anaesthesia equipment, of its potential problems and of the level of maintenance it receives, both by the team and by the hospital's maintenance staff, are factors that very directly affect the emission of waste gases into the air of the operating room (Guardino and Rosell 1995). It has been clearly shown that, even when adequate technology is used, the reduction of ambient concentrations of anaesthetic gases cannot be achieved if a preventive philosophy is absent from the work routines of anaesthetists and their assistants (Guardino and Rosell 1992).
Preventive Measures
The basic preventive actions required to reduce occupational exposure to waste anaesthetic gases effectively can be summarized in the following six points:
Conclusion
Although not definitively proven, there is enough evidence to suggest that exposures to waste anaesthetic gases may be harmful to HCWs. Stillbirths and congenital malformations in infants born to female workers and to the spouses of male workers represent the major forms of toxicity. Since it is technically feasible at a low cost, it is desirable to reduce the concentration of these gases in the ambient air in operating rooms and adjacent areas to a minimum. This requires not only the use and correct maintenance of anaesthesia equipment and ventilation/air conditioning systems but also the education and training of all personnel involved, especially anaesthetists and their assistants, who generally are exposed to higher concentrations. Given the work conditions peculiar to operating rooms, indoctrination in the correct work habits and procedures is very important in trying to reduce the amounts of anaesthetic waste gases in the air to a minimum.
The vast array of chemicals in hospitals, and the multitude of settings in which they occur, call for a systematic approach to their control. A chemical-by-chemical approach to prevention of exposures and their deleterious outcome is simply too inefficient to handle a problem of this scope. Moreover, as noted in the article “Overview of chemical hazards in health care”, many chemicals in the hospital environment have been inadequately studied; new chemicals are constantly being introduced and for others, even some that have become quite familiar (e.g., gloves made of latex), new hazardous effects are only now becoming manifest. Thus, while it is useful to follow chemical-specific control guidelines, a more comprehensive approach is needed whereby individual chemical control policies and practices are superimposed on a strong foundation of general chemical hazard control.
The control of chemical hazards in hospitals must be based on classic principles of good occupational health practice. Because health care facilities are accustomed to approaching health through the medical model, which focuses on the individual patient and treatment rather than on prevention, special effort is required to ensure that the orientation for handling chemicals is indeed preventive and that measures are principally focused on the workplace rather than on the worker.
Environmental (or engineering) control measures are the key to the prevention of deleterious exposures: less toxic chemicals should replace more toxic ones, processes should be enclosed wherever possible and good ventilation is essential. It is also necessary, however, to train each worker correctly in appropriate exposure prevention techniques. In fact, right-to-know legislation, as described below, requires that workers be informed of the hazards with which they work, as well as of the appropriate safety precautions. Secondary prevention at the level of the worker is the domain of medical services, which may include medical monitoring to ascertain whether health effects of exposure can be medically detected, as well as prompt and appropriate medical intervention in the event of accidental exposure.
While all means to prevent or minimize exposures should be implemented, if exposure does occur (e.g., a chemical is spilled), procedures must be in place to ensure prompt and appropriate response to prevent further exposure.
Applying the General Principles of Chemical Hazard Control in the Hospital Environment
The first step in hazard control is hazard identification. This, in turn, requires a knowledge of the physical properties, chemical constituents and toxicological properties of the chemicals in question. Material safety data sheets (MSDSs), which are becoming increasingly available by legal requirement in many countries, list such properties. The vigilant occupational health practitioner, however, should recognize that the MSDS may be incomplete, particularly with respect to long-term effects or effects of low-dose chronic exposure. Hence, a literature search may be contemplated to supplement the MSDS material, when appropriate.
The second step in controlling a hazard is characterizing the risk. Does the chemical pose a carcinogenic risk? Is it an allergen? A teratogen? Or are short-term irritant effects the main concern? The answers to these questions will influence the way in which exposure is assessed.
The third step in chemical hazard control is to assess the actual exposure. Discussion with the health care workers who use the product in question is the most important element in this endeavour. Monitoring methods are necessary in some situations to ascertain that exposure controls are functioning properly. These may include area sampling (either grab samples or integrated samples, depending on the nature of the exposure) or personal sampling; in some cases, as discussed below, medical monitoring may be contemplated, but usually as a last resort and only as back-up to other means of exposure assessment.
Once the properties of the chemical product in question are known, and the nature and extent of exposure have been assessed, a determination can be made as to the degree of risk. This generally requires that at least some dose-response information be available.
After evaluating the risk, the next series of steps is, of course, to control the exposure, so as to eliminate or at least minimize the risk. This, first and foremost, involves applying the general principles of exposure control.
Organizing a Chemical Control Programme in Hospitals
The traditional obstacles
The implementation of adequate occupational health programmes in health care facilities has lagged behind the recognition of the hazards. Labour relations are increasingly forcing hospital management to look at all aspects of their benefits and services to employees, as hospitals are no longer tacitly exempt by custom or privilege. Legislative changes are now compelling hospitals in many jurisdictions to implement control programmes.
However, obstacles remain. The preoccupation of the hospital with patient care, emphasizing treatment rather than prevention, and the staff’s ready access to informal “corridor consultation”, have hindered the rapid implementation of control programmes. The fact that laboratory chemists, pharmacists and a host of medical scientists with considerable toxicological expertise are heavily represented in management has, in general, not served to hasten the development of programmes. The question may be asked, “Why do we need an occupational hygienist when we have all these toxicology experts?” To the extent that changes in procedures threaten to have an impact on the tasks and services provided by these highly skilled personnel, the situation may be made worse: “We cannot eliminate the use of Substance X as it is the best bactericide around.” Or, “If we follow the procedure that you are recommending, patient care will suffer.” Moreover, the “we don’t need training” attitude is commonplace among the health care professions and hinders the implementation of the essential components of chemical hazard control. Internationally, the climate of cost constraint in health care is clearly also an obstacle.
Another problem of particular concern in hospitals is preserving the confidentiality of personal information about health care workers. While occupational health professionals should only need to indicate that Ms. X cannot work with chemical Z and needs to be transferred, curious clinicians are often more prone to press for the clinical explanation than their non-health care counterparts. Ms. X may have liver disease and the substance may be a liver toxin; she may be allergic to the chemical; or she may be pregnant and the substance may have potential teratogenic properties. While altering the work assignment of particular individuals should not become routine, the confidentiality of the medical details should be protected whenever such a change is necessary.
Right-to-know legislation
Many jurisdictions around the world have implemented right-to-know legislation. In Canada, for example, the Workplace Hazardous Materials Information System (WHMIS) has revolutionized the handling of chemicals in industry. This country-wide system has three components: (1) the labelling of all hazardous substances with standardized labels indicating the nature of the hazard; (2) the provision of MSDSs listing the constituents, hazards and control measures for each substance; and (3) the training of workers to understand the labels and the MSDSs and to use the product safely.
Under WHMIS in Canada and OSHA's Hazard Communication requirements in the United States, hospitals have been required to construct inventories of all chemicals on the premises so that those that are "controlled substances" can be identified and addressed according to the legislation. In the process of complying with the training requirements of these regulations, hospitals have had to engage occupational health professionals with appropriate expertise, and the spin-off benefits, particularly when bipartite train-the-trainer programmes were conducted, have included a new spirit of working cooperatively to address other health and safety concerns.
Corporate commitment and the role of joint health and safety committees
The most important element in the success of any occupational health and safety programme is corporate commitment to its successful implementation. Policies and procedures regarding the safe handling of chemicals in hospitals must be written, discussed at all levels within the organization and adopted and enforced as corporate policy. Chemical hazard control in hospitals should be addressed by general as well as specific policies. For example, there should be a policy on responsibility for the implementation of right-to-know legislation that clearly outlines each party's obligations and the procedures to be followed by individuals at each level of the organization (e.g., who chooses the trainers, how much work time is allowed for the preparation and provision of training, to whom non-attendance should be reported and so on). There should be a generic spill clean-up policy indicating the responsibility of the worker and of the department where the spill occurred, the indications and protocol for notifying the emergency response team (including the appropriate in-hospital and external authorities and experts), and follow-up provisions for exposed workers. Specific policies should also exist regarding the handling, storage and disposal of specific classes of toxic chemicals.
Not only is it essential that management be strongly committed to these programmes; the workforce, through its representatives, must also be actively involved in the development and implementation of policies and procedures. Some jurisdictions have legislatively mandated joint (labour-management) health and safety committees that meet at a minimum prescribed interval (bimonthly in the case of Manitoba hospitals), have written operating procedures and keep detailed minutes. Indeed, in recognition of the importance of these committees, the Manitoba Workers' Compensation Board (WCB) provides a rebate on the WCB premiums paid by employers based on the successful functioning of these committees. To be effective, the members must be appropriately chosen; specifically, they must be elected by their peers, be knowledgeable about the legislation, have appropriate education and training and be allotted sufficient time to conduct not only incident investigations but also regular inspections. With respect to chemical control, the joint committee has both a pro-active and a re-active role: assisting in setting priorities and developing preventive policies, as well as serving as a sounding board for workers who are not satisfied that all appropriate controls are being implemented.
The multidisciplinary team
As noted above, the control of chemical hazards in hospitals requires a multidisciplinary endeavour. At a minimum, it requires occupational hygiene expertise. Generally hospitals have maintenance departments that have within them the engineering and physical plant expertise to assist a hygienist in determining whether workplace alterations are necessary. Occupational health nurses also play a prominent role in evaluating the nature of concerns and complaints, and in assisting an occupational physician in ascertaining whether clinical intervention is warranted. In hospitals, it is important to recognize that numerous health care professionals have expertise that is quite relevant to the control of chemical hazards. It would be unthinkable to develop policies and procedures for the control of laboratory chemicals without the involvement of lab chemists, for example, or procedures for handling anti-neoplastic drugs without the involvement of the oncology and pharmacology staff. While it is wise for occupational health professionals in all industries to consult with line staff prior to implementing control measures, it would be an unforgivable error to fail to do so in health care settings.
Data collection
As in all industries, and with all hazards, data need to be compiled both to help in priority setting and in evaluating the success of programmes. With respect to data collection on chemical hazards in hospitals, minimally, data need to be kept regarding accidental exposures and spills (so that these areas can receive special attention to prevent recurrences); the nature of concerns and complaints should be recorded (e.g., unusual odours); and clinical cases need to be tabulated, so that, for example, an increase in dermatitis from a given area or occupational group could be identified.
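As a purely hypothetical illustration of such record-keeping, the following sketch logs incidents in a simple structure and tabulates them by department and chemical so that clusters (e.g., repeated formaldehyde complaints from histology) stand out. All field names and entries are invented for the example.

```python
# Minimal sketch of the record-keeping described above: logging
# accidental exposures, spills, complaints and clinical cases so that
# clusters can be spotted. All entries are hypothetical.

from collections import Counter
from dataclasses import dataclass

@dataclass
class ChemicalIncident:
    date: str            # ISO date
    department: str
    chemical: str
    kind: str            # "spill", "exposure", "complaint", "clinical case"
    note: str = ""

log = [
    ChemicalIncident("1996-03-02", "oncology pharmacy", "cyclophosphamide", "spill"),
    ChemicalIncident("1996-03-11", "histology", "formaldehyde", "complaint", "odour"),
    ChemicalIncident("1996-04-05", "histology", "formaldehyde", "clinical case", "dermatitis"),
]

# Tabulate incidents by department and chemical to flag areas needing
# special attention, as suggested in the text.
print(Counter((i.department, i.chemical) for i in log))
```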
Cradle-to-grave approach
Increasingly, hospitals are becoming cognizant of their obligation to protect the environment. Not only the workplace hazards of chemicals but also their environmental properties are being taken into consideration. Moreover, it is no longer acceptable to pour hazardous chemicals down the drain or to release noxious fumes into the air. A chemical control programme in hospitals must, therefore, be capable of tracking chemicals from their purchase and acquisition (or, in some cases, synthesis on site), through their handling at work and safe storage, to their ultimate disposal.
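One way to picture such cradle-to-grave tracking is as a small state machine in which each container of a hazardous chemical advances through a fixed sequence of lifecycle stages and cannot silently skip disposal. The stages and allowed transitions below are illustrative assumptions, not a prescribed scheme.

```python
# Illustrative sketch of cradle-to-grave tracking: each container of a
# hazardous chemical moves through lifecycle stages from acquisition to
# disposal. The stages and transitions are assumptions for the example.

from enum import Enum, auto

class Stage(Enum):
    ACQUIRED = auto()
    IN_STORAGE = auto()
    IN_USE = auto()
    AWAITING_DISPOSAL = auto()
    DISPOSED = auto()

# Allowed transitions only; a container cannot skip the disposal step.
ALLOWED = {
    Stage.ACQUIRED: {Stage.IN_STORAGE},
    Stage.IN_STORAGE: {Stage.IN_USE, Stage.AWAITING_DISPOSAL},
    Stage.IN_USE: {Stage.IN_STORAGE, Stage.AWAITING_DISPOSAL},
    Stage.AWAITING_DISPOSAL: {Stage.DISPOSED},
    Stage.DISPOSED: set(),
}

def advance(current: Stage, new: Stage) -> Stage:
    if new not in ALLOWED[current]:
        raise ValueError(f"illegal transition {current.name} -> {new.name}")
    return new

stage = Stage.ACQUIRED
for nxt in (Stage.IN_STORAGE, Stage.IN_USE, Stage.AWAITING_DISPOSAL, Stage.DISPOSED):
    stage = advance(stage, nxt)
print(stage.name)  # DISPOSED
```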
Conclusion
It is now recognized that there are thousands of potentially very toxic chemicals in the work environment of health care facilities; all occupational groups may be exposed; and the nature of the exposures are varied and complex. Nonetheless, with a systematic and comprehensive approach, with strong corporate commitment and a fully informed and involved workforce, chemical hazards can be managed and the risks associated with these chemicals controlled.
Exposure to potentially hazardous chemicals is a fact of life for health care workers. They are encountered in the course of diagnostic and therapeutic procedures, in laboratory work, in preparation and clean-up activities and even in emanations from patients, to say nothing of the "infrastructure" activities common to all worksites, such as cleaning and housekeeping, laundry, painting, plumbing and maintenance work. Despite the constant threat of such exposures and the large numbers of workers involved (in most countries, health care is one of the most labour-intensive industries), this problem has received scant attention from those involved in occupational health and safety research and regulation. The great majority of chemicals in common use in hospitals and other health care settings are not specifically covered under national and international occupational exposure standards. In fact, very little effort has been made to date to identify the chemicals most frequently used, much less to study the mechanisms and intensity of exposures to them and the epidemiology of the effects on the health care workers involved.
This may be changing in the many jurisdictions in which right-to-know laws, such as the Canadian Workplace Hazardous Materials Information System (WHMIS), are being legislated and enforced. These laws require that workers be informed of the name and nature of the chemicals to which they may be exposed on the job. They have introduced a daunting challenge to administrators in the health care industry, who must now turn to occupational health and safety professionals to undertake a de novo inventory of the identity and location of the thousands of chemicals to which their workers may be exposed.
The wide range of professions and jobs and the complexity of their interplay in the health care workplace require unique diligence and astuteness on the part of those charged with such occupational safety and health responsibilities. A significant complication is the traditional altruistic focus on the care and well-being of the patients, even at the expense of the health and well-being of those providing the services. Another complication is the fact that these services are often required at times of great urgency when important preventive and protective measures may be forgotten or deliberately disregarded.
Categories of Chemical Exposures in the Health Care Setting
Table 1 lists the categories of chemicals encountered in the health care workplace. Laboratory workers are exposed to the broad range of chemical reagents they employ, histology technicians to dyes and stains, and pathologists to fixative and preservative solutions (formaldehyde is a potent sensitizer); asbestos is a hazard to workers making repairs or renovations in older health care facilities.
Table 1. Categories of chemicals used in health care
| Types of chemicals | Locations most likely to be found |
|---|---|
| Disinfectants | Patient areas |
| Sterilants | Central supply |
| Medicines | Patient areas |
| Laboratory reagents | Laboratories |
| Housekeeping/maintenance chemicals | Hospital-wide |
| Food ingredients and products | Kitchen |
| Pesticides | Hospital-wide |
Even when liberally applied in combating and preventing the spread of infectious agents, detergents, disinfectants and sterilants offer relatively little danger to patients, whose exposure is usually brief. For health care workers, however, even though individual doses at any one time may be relatively low, the cumulative effect over the course of a working lifetime may constitute a significant risk.
Occupational exposures to drugs can cause allergic reactions, such as those reported over many years among workers administering penicillin and other antibiotics, or much more serious problems with highly carcinogenic agents such as the antineoplastic drugs. Contact may occur during the preparation or administration of a dose for injection or in cleaning up after it has been administered. Although the danger of this mechanism of exposure had been known for many years, it was fully appreciated only after mutagenic activity was detected in the urine of nurses administering antineoplastic agents.
Another mechanism of exposure is the administration of drugs as aerosols for inhalation. The use of antineoplastic agents, pentamidine and ribavirin by this route has been studied in some detail, but there has been, as of this writing, no report of a systematic study of aerosols as a source of toxicity among health care workers.
Anaesthetic gases represent another class of drugs to which many health care workers are exposed. These chemicals are associated with a variety of biological effects, the most obvious of which are on the nervous system. Recently, there have been reports suggesting that repeated exposures to anaesthetic gases may, over time, have adverse reproductive effects among both male and female workers. It should be recognized that appreciable amounts of waste anaesthetic gases may accumulate in the air in recovery rooms as the gases retained in the blood and other tissues of patients are eliminated by exhalation.
Chemical disinfecting and sterilizing agents are another important category of potentially hazardous chemical exposures for health care workers. Used primarily in the sterilization of non-disposable equipment, such as surgical instruments and respiratory therapy apparatus, chemical sterilants such as ethylene oxide are effective because they interact with infectious agents and destroy them. Alkylation, whereby methyl or other alkyl groups bind chemically with protein-rich entities such as the amino groups in haemoglobin and DNA, is a powerful biological effect. In intact organisms, this may not cause direct toxicity but should be considered potentially carcinogenic until proven otherwise. Ethylene oxide itself, however, is a known carcinogen and is associated with a variety of adverse health effects, as discussed elsewhere in the Encyclopaedia. The potent alkylation capability of ethylene oxide, probably the most widely used sterilant for heat-sensitive materials, has led to its use as a classic probe in studying molecular structure.
For years, the methods used in the chemical sterilization of instruments and other surgical materials have carelessly and needlessly put many health care workers at risk. Not even rudimentary precautions were taken to prevent or limit exposures. For example, it was the common practice to leave the door of the sterilizer partially open to allow the escape of excess ethylene oxide, or to leave freshly-sterilized materials uncovered and open to the room air until enough had been assembled to make efficient use of the aerator unit.
The fixation of metallic or ceramic replacement parts, so common in dentistry and orthopaedic surgery, may be a source of potentially hazardous chemical exposures such as silica. These parts and the acrylic resins often used to glue them in place are usually biologically inert, but health care workers may be exposed to the monomers and other chemical reactants used during the preparation and application process. These chemicals are often sensitizing agents and have been associated with chronic effects in animals. The preparation of mercury amalgam fillings can lead to mercury exposure. Spills and the spread of mercury droplets are a particular concern, since droplets may linger unnoticed in the work environment for many years. The acute exposure of patients appears to be entirely safe, but the long-term health implications of the repeated exposure of health care workers have not been adequately studied.
Finally, such medical techniques as laser surgery, electro-cauterization and use of other radiofrequency and high-energy devices can lead to the thermal degradation of tissues and other substances resulting in the formation of potentially toxic smoke and fumes. For example, the cutting of “plaster” casts made of polyester resin impregnated bandages has been shown to release potentially toxic fumes.
The hospital as a “mini-municipality”
A listing of the varied jobs and tasks performed by the personnel of hospitals and other large health care facilities might well serve as a table of contents for the commercial listings of a telephone directory for a sizeable municipality. All of these entail chemical exposures intrinsic to the particular work activity in addition to those that are peculiar to the health care environment. Thus, painters and maintenance workers are exposed to solvents and lubricants. Plumbers and others engaged in soldering are exposed to fumes of lead and flux. Housekeeping workers are exposed to soaps, detergents and other cleansing agents, pesticides and other household chemicals. Cooks may be exposed to potentially carcinogenic fumes in broiling or frying foods and to oxides of nitrogen from the use of natural gas as fuel. Even clerical workers may be exposed to the toners used in copiers and printers. The occurrence and effects of such chemical exposures are detailed elsewhere in this Encyclopaedia.
One chemical exposure that is diminishing in importance as more and more HCWs quit smoking and more health care facilities become “smoke-free” is “second hand” tobacco smoke.
Unusual chemical exposures in health care
Table 2 presents a partial listing of the chemicals most commonly encountered in health care workplaces. Whether or not they will be toxic will depend on the nature of the chemical and its biological proclivities, the manner, intensity and duration of the exposure, the susceptibilities of the exposed worker, and the speed and effectiveness of any countermeasures that may have been attempted. Unfortunately, a compendium of the nature, mechanisms, effects and treatment of chemical exposures of health care workers has not yet been published.
There are some unique exposures in the health care workplace that substantiate the dictum that a high level of vigilance is necessary to protect workers fully from such risks. For example, health care workers have reportedly been overcome by toxic fumes emanating from a patient being treated after a massive chemical exposure. Cases of cyanide poisoning arising from patient emissions have also been reported. In addition to the direct toxicity of waste anaesthetic gases to anaesthetists and other personnel in operating theatres, there is the often unrecognized problem created by the frequent use in such areas of high-energy sources, which can transform the anaesthetic gases into free radicals, a form in which they are potentially carcinogenic.
Table 2. Chemicals cited in the Hazardous Substances Database (HSDB)
The following chemicals are listed in the HSDB as being used in some area of the health care environment. The HSDB is produced by the US National Library of Medicine and is a compilation of more than 4,200 chemicals with known toxic effects in commercial use. Absence of a chemical from the list does not imply that it is not toxic, but that it is not present in the HSDB.
| Use listed in the HSDB | Chemical name | CAS number* |
|---|---|---|
| Disinfectants; antiseptics | benzalkonium chloride | 8001-54-5 |
| Sterilants | beta-propiolactone | 57-57-8 |
| Laboratory reagents | 2,4-xylidine (magenta-base) | 3248-93-9 |

* Chemical Abstracts Service identification number.
Transmission of Mycobacterium tuberculosis is a recognized risk in health care facilities. The magnitude of the risk to HCWs varies considerably by the type of health care facility, the prevalence of TB in the community, the patient population served, the HCW’s occupational group, the area of the health care facility in which the HCW works and the effectiveness of TB infection-control interventions. The risk may be higher in areas where patients with TB are provided care before diagnosis and initiation of TB treatment and isolation precautions (e.g., in clinic waiting areas and emergency departments) or where diagnostic or treatment procedures that stimulate coughing are performed. Nosocomial transmission of M. tuberculosis has been associated with close contact with persons who have infectious TB and with the performance of certain procedures (e.g., bronchoscopy, endotracheal intubation and suctioning, open abscess irrigation and autopsy). Sputum induction and aerosol treatments that induce coughing may also increase the potential for transmission of M. tuberculosis. Personnel in health care facilities should be particularly alert to the need for preventing transmission of M. tuberculosis in those facilities in which immunocompromised persons (e.g., HIV-infected persons) work or receive care—especially if cough-inducing procedures, such as sputum induction and aerosolized pentamidine treatments, are being performed.
Transmission and Pathogenesis
M. tuberculosis is carried in airborne particles, or droplet nuclei, that can be generated when persons who have pulmonary or laryngeal TB sneeze, cough, speak or sing. The particles are an estimated 1 to 5 μm in size and normal air currents can keep them airborne for prolonged time periods and spread them throughout a room or building. Infection occurs when a susceptible person inhales droplet nuclei containing M. tuberculosis and these droplet nuclei traverse the mouth or nasal passages, upper respiratory tract and bronchi to reach the alveoli of the lungs. Once in the alveoli, the organisms are taken up by alveolar macrophages and spread throughout the body. Usually within two to ten weeks after initial infection with M. tuberculosis, the immune response limits further multiplication and spread of the tubercle bacilli; however, some of the bacilli remain dormant and viable for many years. This condition is referred to as latent TB infection. Persons with latent TB infection usually have positive purified protein derivative (PPD)-tuberculin skin-test results, but they do not have symptoms of active TB, and they are not infectious.
In general, persons who become infected with M. tuberculosis have approximately a 10% risk for developing active TB during their lifetimes. This risk is greatest during the first two years after infection. Immunocompromised persons have a greater risk for the progression of latent TB infection to active TB disease; HIV infection is the strongest known risk factor for this progression. Persons with latent TB infection who become co-infected with HIV have approximately an 8 to 10% risk per year for developing active TB. HIV-infected persons who are already severely immunosuppressed and who become newly infected with M. tuberculosis have an even greater risk for developing active TB.
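For readers who want to translate these annual figures into a multi-year perspective, the sketch below compounds a constant annual activation risk. This is a deliberate simplification of the real epidemiology (the risk is not in fact constant over time), offered only to make the arithmetic concrete.

```python
# Minimal sketch: cumulative risk over several years from a constant,
# independent annual activation risk. A simplification for illustration.

def cumulative_risk(annual_risk: float, years: int) -> float:
    """P(at least one activation within `years`) at constant annual risk."""
    return 1.0 - (1.0 - annual_risk) ** years

# HIV co-infection, ~8-10% per year: over 5 years the cumulative risk
# works out to roughly 34-41%.
for p in (0.08, 0.10):
    print(f"annual {p:.0%} -> 5-year {cumulative_risk(p, 5):.0%}")
```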
The probability that a person who is exposed to M. tuberculosis will become infected depends primarily on the concentration of infectious droplet nuclei in the air and the duration of exposure. Characteristics of the TB patient that enhance transmission include:
Environmental factors that enhance the likelihood of transmission include:
Characteristics of the persons exposed to M. tuberculosis that may affect the risk for becoming infected are not as well defined. In general, persons who have been infected previously with M. tuberculosis may be less susceptible to subsequent infection. However, reinfection can occur among previously infected persons, especially if they are severely immunocompromised. Vaccination with bacille Calmette-Guérin (BCG) probably does not affect the risk for infection; rather, it decreases the risk for progressing from latent TB infection to active TB. Finally, although it is well established that HIV infection increases the likelihood of progressing from latent TB infection to active TB, it is unknown whether HIV infection increases the risk for becoming infected if exposed to M. tuberculosis.
Epidemiology
Several TB outbreaks among persons in health care facilities have been reported recently in the United States. Many of these outbreaks involved transmission of multidrug-resistant strains of M. tuberculosis to both patients and HCWs. Most of the patients and some of the HCWs were HIV-infected persons in whom new infection progressed rapidly to active disease. Mortality associated with those outbreaks was high (ranging from 43 to 93%). Furthermore, the interval between diagnosis and death was brief (with median intervals ranging from 4 to 16 weeks). Factors contributing to these outbreaks included delayed diagnosis of TB, delayed recognition of drug resistance and delayed initiation of effective therapy, all of which resulted in prolonged infectiousness; delayed initiation and inadequate duration of TB isolation; inadequate ventilation in TB isolation rooms; lapses in TB isolation practices; inadequate precautions for cough-inducing procedures; and lack of adequate respiratory protection.
Fundamentals of TB infection control
An effective TB infection-control programme requires early identification, isolation and effective treatment of persons who have active TB. The primary emphasis of the TB infection-control plan should be on achieving these three goals. In all health care facilities, particularly those in which persons who are at high risk for TB work or receive care, policies and procedures for TB control should be developed, reviewed periodically and evaluated for effectiveness to determine the actions necessary to minimize the risk for transmission of M. tuberculosis.
The TB infection-control programme should be based on a hierarchy of control measures. The first level of the hierarchy, and that which affects the largest number of persons, is using administrative measures intended primarily to reduce the risk for exposing uninfected persons to persons who have infectious TB. These measures include:
The second level of the hierarchy is the use of engineering controls to prevent the spread and reduce the concentration of infectious droplet nuclei. These controls include:
The first two levels of the hierarchy minimize the number of areas in the health care facility where exposure to infectious TB may occur, and they reduce, but do not eliminate, the risk in those few areas where exposure to M. tuberculosis can still occur (e.g., rooms in which patients with known or suspected infectious TB are being isolated and treatment rooms in which cough-inducing or aerosol-generating procedures are performed on such patients). Because persons entering such rooms may be exposed to M. tuberculosis, the third level of the hierarchy is the use of personal respiratory protective equipment in these and certain other situations in which the risk for infection with M. tuberculosis may be relatively higher.
Specific measures to reduce the risk for transmission of M. tuberculosis include the following:
1. Assigning to specific persons in the health care facility the supervisory responsibility for designing, implementing, evaluating and maintaining the TB infection-control programme.
2. Conducting a risk assessment to evaluate the risk for transmission of M. tuberculosis in all areas of the health care facility, developing a written TB infection-control programme based on the risk assessment and periodically repeating the risk assessment to evaluate the effectiveness of the TB infection-control programme. TB infection-control measures for each health care facility should be based on a careful assessment of the risk for transmission of M. tuberculosis in that particular setting. The first step in developing the TB infection-control programme should be to conduct a baseline risk assessment to evaluate the risk for transmission of M. tuberculosis in each area and occupational group in the facility. Appropriate infection-control interventions can then be developed on the basis of actual risk. Risk assessments should be performed for all inpatient and outpatient settings (e.g., medical and dental offices). Classification of risk for a facility, for a specific area and for a specific occupational group should be based on the profile of TB in the community, the number of infectious TB patients admitted to the area or ward, or the estimated number of infectious TB patients to whom HCWs in an occupational group may be exposed and the results of analysis of HCW PPD test conversions (where applicable) and possible person-to-person transmission of M. tuberculosis. Regardless of risk level, the management of patients with known or suspected infectious TB should not vary. However, the index of suspicion for infectious TB among patients, the frequency of HCW PPD skin testing, the number of TB isolation rooms and other factors will depend on the level of risk for transmission of M. tuberculosis in the facility, area or occupational group.
3. Developing, implementing and enforcing policies and protocols to ensure early identification, diagnostic evaluation and effective treatment of patients who may have infectious TB. A diagnosis of TB may be considered for any patient who has a persistent cough (i.e., a cough lasting for longer than 3 weeks) or other signs or symptoms compatible with active TB (e.g., bloody sputum, night sweats, weight loss, anorexia or fever). However, the index of suspicion for TB will vary in different geographic areas and will depend on the prevalence of TB and other characteristics of the population served by the facility. The index of suspicion for TB should be very high in geographic areas or among groups of patients in which the prevalence of TB is high. Appropriate diagnostic measures should be conducted and TB precautions implemented for patients in whom active TB is suspected.
4. Providing prompt triage for and appropriate management of patients in the outpatient setting who may have infectious TB. Triage of patients in ambulatory-care settings and emergency departments should include vigorous efforts to identify promptly any patients who have active TB. HCWs who are the first points of contact in facilities that serve populations at risk for TB should be trained to ask questions that will facilitate identification of patients with signs and symptoms suggestive of TB. Patients with signs or symptoms suggestive of TB should be evaluated promptly to minimize the amount of time they spend in ambulatory-care areas. TB precautions should be followed while the diagnostic evaluation is being conducted for these patients. TB precautions in the ambulatory-care setting should include: placing these patients in a separate area apart from other patients, not in open waiting areas (ideally, in a room or enclosure meeting TB isolation requirements); giving them surgical masks to wear and instructing them to keep the masks on; and giving them tissues and instructing them to cover their mouths and noses with the tissues when coughing or sneezing. Surgical masks are designed to prevent the respiratory secretions of the person wearing the mask from entering the air. When not in a TB isolation room, patients suspected of having TB should wear surgical masks to reduce the expulsion of droplet nuclei into the air. These patients do not need to wear particulate respirators, which are designed to filter the air before it is inhaled by the person wearing the mask. Patients suspected of having or known to have TB should never wear a respirator that has an exhalation valve, because the device would provide no barrier to the expulsion of droplet nuclei into the air.
5. Promptly initiating and maintaining TB isolation for persons who may have infectious TB and who are admitted to the inpatient setting. In hospitals and other inpatient facilities, any patient suspected of having or known to have infectious TB should be placed in a TB isolation room that has currently recommended ventilation characteristics (see below). Written policies for initiating isolation should specify the indications for isolation, the person(s) authorized to initiate and discontinue isolation, the isolation practices to follow, the monitoring of isolation, the management of patients who do not adhere to isolation practices and the criteria for discontinuing isolation.
6. Effectively planning arrangements for discharge. Before a TB patient is discharged from the health care facility, the facility’s staff and public health authorities should collaborate to ensure continuation of therapy. Discharge planning in the health care facility should include, at a minimum, a confirmed outpatient appointment with the provider who will manage the patient until the patient is cured, sufficient medication to take until the outpatient appointment and placement into case management (e.g., directly observed therapy (DOT)) or outreach programmes of the public health department. These plans should be initiated and in place before the patient’s discharge.
7. Developing, installing, maintaining and evaluating ventilation and other engineering controls to reduce the potential for airborne exposure to M. tuberculosis. Local exhaust ventilation is a preferred source control technique, and it is often the most efficient way to contain airborne contaminants because it captures these contaminants near their source before they can disperse. Therefore, the technique should be used, if feasible, wherever aerosol-generating procedures are performed. Two basic types of local exhaust devices use hoods: the enclosing type, in which the hood either partially or fully encloses the infectious source, and the exterior type, in which the infectious source is near but outside the hood. Fully enclosed hoods, booths or tents are always preferable to exterior types because of their superior ability to prevent contaminants from escaping into the HCW’s breathing zone. General ventilation can be used for several purposes, including diluting and removing contaminated air, controlling airflow patterns within rooms and controlling the direction of airflow throughout a facility. General ventilation maintains air quality by two processes: dilution and removal of airborne contaminants. Uncontaminated supply air mixes with the contaminated room air (i.e., dilution), which is subsequently removed from the room by the exhaust system. These processes reduce the concentration of droplet nuclei in the room air. Recommended general ventilation rates for health care facilities are usually expressed in number of air changes per hour (ACH).
This number is the ratio of the volume of air entering the room per hour to the room volume: the exhaust airflow (Q, in cubic feet per minute) divided by the room volume (V, in cubic feet) and multiplied by 60 (i.e., ACH = (Q × 60) / V; a worked example follows this list). For the purposes of reducing the concentration of droplet nuclei, TB isolation and treatment rooms in existing health care facilities should have an airflow of greater than 6 ACH. Where feasible, this airflow rate should be increased to at least 12 ACH by adjusting or modifying the ventilation system or by using auxiliary means (e.g., recirculation of air through fixed HEPA filtration systems or portable air cleaners). New construction or renovation of existing health care facilities should be designed so that TB isolation rooms achieve an airflow of at least 12 ACH. The general ventilation system should be designed and balanced so that air flows from less contaminated (i.e., cleaner) to more contaminated (less clean) areas. For example, air should flow from corridors into TB isolation rooms to prevent the spread of contaminants to other areas. In some special treatment rooms in which operative and invasive procedures are performed, the direction of airflow is from the room to the hallway to provide cleaner air during these procedures. Cough-inducing or aerosol-generating procedures (e.g., bronchoscopy and irrigation of tuberculous abscesses) should not be performed in rooms with this type of airflow on patients who may have infectious TB. HEPA filters may be used in a number of ways to reduce or eliminate infectious droplet nuclei from room air or exhaust: in exhaust ducts discharging air from booths or enclosures into the surrounding room; in ducts or in ceiling- or wall-mounted units for recirculation of air within an individual room (fixed recirculation systems); in portable air cleaners; in exhaust ducts to remove droplet nuclei from air being discharged to the outside, either directly or through ventilation equipment; and in ducts discharging air from the TB isolation room into the general ventilation system. In any application, HEPA filters should be installed carefully and maintained meticulously to ensure adequate functioning. For general use areas in which the risk for transmission of M. tuberculosis is relatively high, ultraviolet germicidal irradiation (UVGI) lamps may be used as an adjunct to ventilation for reducing the concentration of infectious droplet nuclei, although the effectiveness of such units has not been evaluated adequately. Ultraviolet (UV) units can be installed in a room or corridor to irradiate the air in the upper portion of the room, or they can be installed in ducts to irradiate air passing through the ducts.
8. Developing, implementing, maintaining and evaluating a respiratory protection programme. Personal respiratory protection (i.e., respirators) should be used by persons entering rooms in which patients with known or suspected infectious TB are being isolated, persons present during cough-inducing or aerosol-generating procedures performed on such patients and persons in other settings where administrative and engineering controls are not likely to protect them from inhaling infectious airborne droplet nuclei. These other settings include transporting patients who may have infectious TB in emergency transport vehicles and providing urgent surgical or dental care to patients who may have infectious TB before a determination has been made that the patient is non-infectious.
9. Educating and training HCWs about TB, effective methods for preventing transmission of M. tuberculosis and the benefits of medical screening programmes. All HCWs, including physicians, should receive education regarding TB that is relevant to persons in their particular occupational group. Ideally, training should be conducted before initial assignment and the need for additional training should be re-evaluated periodically (e.g., once a year). The level and detail of this education will vary according to the HCW’s work responsibilities and the level of risk in the facility (or area of the facility) in which the HCW works. However, the programme may include the following elements:
10. Developing and implementing a programme for routine periodic counselling and screening of HCWs for active TB and latent TB infection. A TB counselling, screening and prevention programme for HCWs should be established to protect both HCWs and patients. HCWs who have positive PPD test results, PPD test conversions or symptoms suggestive of TB should be identified, evaluated to rule out a diagnosis of active TB and started on therapy or preventive therapy if indicated. In addition, the results of the HCW PPD screening programme will contribute to evaluation of the effectiveness of current infection-control practices. Because of the increased risk for rapid progression from latent TB infection to active TB in human immunodeficiency virus (HIV)-infected or otherwise severely immunocompromised persons, all HCWs should know whether they have a medical condition or are receiving a medical treatment that may lead to severely impaired cell-mediated immunity. HCWs who may be at risk for HIV infection should know their HIV status (i.e., they should be encouraged to voluntarily seek counselling and testing for HIV antibody status). Existing guidelines for counselling and testing should be followed routinely. Knowledge of these conditions allows the HCW to seek the appropriate preventive measures and to consider voluntary work reassignments.
11. All HCWs should be informed about the need to follow existing recommendations for infection control to minimize the risk for exposure to infectious agents; implementation of these recommendations will greatly reduce the risk for occupational infections among HCWs. All HCWs should also be informed about the potential risks to severely immunocompromised persons associated with caring for patients who have some infectious diseases, including TB. It should be emphasized that limiting exposure to TB patients is the most protective measure that severely immunosuppressed HCWs can take to avoid becoming infected with M. tuberculosis. HCWs who have severely impaired cell-mediated immunity and who may be exposed to M. tuberculosis may consider a change in job setting to avoid such exposure. HCWs should be advised that, in many jurisdictions, severely immunocompromised HCWs have the legal option to transfer voluntarily to areas and work activities in which there is the lowest possible risk for exposure to M. tuberculosis. This choice should be a personal decision for HCWs after they have been informed of the risks to their health.
12. Employers should make reasonable accommodations (e.g., alternative job assignments) for employees who have a health condition that compromises cell-mediated immunity and who work in settings where they may be exposed to M. tuberculosis. HCWs who are known to be immunocompromised should be referred to employee health professionals who can individually counsel the employees regarding their risk for TB. Upon the request of the immunocompromised HCW, employers should offer, but not compel, a work setting in which the HCW would have the lowest possible risk for occupational exposure to M. tuberculosis.
13. All HCWs should be informed that immunosuppressed HCWs should have appropriate follow-up and screening for infectious diseases, including TB, provided by their medical practitioner. HCWs who are known to be HIV-infected or otherwise severely immunosuppressed should be tested for cutaneous anergy at the time of PPD testing. Consideration should be given to retesting, at least every 6 months, those immunocompromised HCWs who are potentially exposed to M. tuberculosis because of the high risk for rapid progression to active TB if they become infected.
14. Information provided by HCWs regarding their immune status should be treated confidentially. If the HCW requests voluntary job reassignment, the privacy of the HCW should be maintained. Facilities should have written procedures on confidential handling of such information.
15. Promptly evaluating possible episodes of M. tuberculosis transmission in health care facilities, including PPD skin-test conversions among HCWs, epidemiologically associated cases among HCWs or patients and contacts of patients or HCWs who have TB and who were not promptly identified and isolated. Epidemiological investigations may be indicated for several situations. These include, but are not limited to, the occurrence of PPD test conversions or active TB in HCWs, the occurrence of possible person-to-person transmission of M. tuberculosis and situations in which patients or HCWs with active TB are not promptly identified and isolated, thus exposing other persons in the facility to M. tuberculosis. The general objectives of the epidemiological investigations in these situations are as follows:
16. Coordinating activities with the local public health department, emphasizing reporting and ensuring adequate discharge follow-up and the continuation and completion of therapy. As soon as a patient or HCW is known or suspected to have active TB, the patient or HCW should be reported to the public health department so that appropriate follow-up can be arranged and a community contact investigation can be performed. The health department should be notified well before patient discharge to facilitate follow-up and continuation of therapy. A discharge plan coordinated with the patient or HCW, the health department and the inpatient facility should be implemented.
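The ACH formula in measure 7 lends itself to a quick worked example. The following sketch is purely illustrative (the room dimensions, airflow figures and function names are assumptions invented for this example), but it shows how the 6 ACH and 12 ACH recommendations translate into exhaust airflow requirements:

```python
def air_changes_per_hour(exhaust_cfm: float, room_volume_cuft: float) -> float:
    """ACH = (Q x 60) / V, with Q in cubic feet per minute and V in cubic feet."""
    return exhaust_cfm * 60.0 / room_volume_cuft

def required_exhaust_cfm(target_ach: float, room_volume_cuft: float) -> float:
    """Invert the formula to find the exhaust flow needed for a target ACH."""
    return target_ach * room_volume_cuft / 60.0

# Hypothetical isolation room: 12 ft x 10 ft floor, 10 ft ceiling = 1,200 cubic feet.
volume = 12 * 10 * 10
print(air_changes_per_hour(150, volume))  # 150 cfm gives 7.5 ACH (above the 6 ACH minimum)
print(required_exhaust_cfm(12, volume))   # 240 cfm would be needed to reach 12 ACH
```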
Prevention of occupational transmission of bloodborne pathogens (BBP), including human immunodeficiency virus (HIV), hepatitis B virus (HBV) and, more recently, hepatitis C virus (HCV), has received significant attention. Although HCWs are the primary occupational group at risk of acquiring infection, any worker who is exposed to blood or other potentially infectious body fluids during the performance of job duties is at risk. Populations at risk for occupational exposure to BBP include workers in health care delivery, public safety and emergency response workers and others, such as laboratory researchers and morticians. The potential for occupational transmission of bloodborne pathogens, including HIV, will continue to increase as the number of persons who have HIV and other bloodborne infections and require medical care increases.
In the US, the Centers for Disease Control and Prevention (CDC) recommended in 1982 and 1983 that patients with the acquired immunodeficiency syndrome (AIDS) be treated according to the (now obsolete) category of “blood and body fluid precautions” (CDC 1982; CDC 1983). Documentation that HIV, the causative agent of AIDS, had been transmitted to HCWs by percutaneous and mucocutaneous exposures to HIV-infected blood, as well as the realization that the HIV infection status of most patients or blood specimens encountered by HCWs would be unknown at the time of the encounter, led CDC to recommend that blood and body fluid precautions be applied to all patients, a concept known as “universal precautions” (CDC 1987a, 1987b). The use of universal precautions eliminates the need to identify patients with bloodborne infections, but is not intended to replace general infection control practices. Universal precautions include the use of handwashing, protective barriers (e.g., goggles, gloves, gowns and face protection) when blood contact is anticipated and care in the use and disposal of needles and other sharp instruments in all health care settings. Also, instruments and other reusable equipment used in performing invasive procedures should be appropriately disinfected or sterilized (CDC 1988a, 1988b). Subsequent CDC recommendations have addressed prevention of transmission of HIV and HBV to public safety and emergency responders (CDC 1988b), management of occupational exposure to HIV, including the recommendations for the use of zidovudine (CDC 1990), immunization against HBV and management of HBV exposure (CDC 1991a), infection control in dentistry (CDC 1993) and the prevention of HIV transmission from HCWs to patients during invasive procedures (CDC 1991b).
In the US, CDC recommendations do not have the force of law, but have often served as the foundation for government regulations and voluntary actions by industry. The Occupational Safety and Health Administration (OSHA), a federal regulatory agency, promulgated a standard in 1991 on Occupational Exposure to Bloodborne Pathogens (OSHA 1991). OSHA concluded that a combination of engineering and work practice controls, personal protective clothing and equipment, training, medical surveillance, signs and labels and other provisions can help to minimize or eliminate exposure to bloodborne pathogens. The standard also mandated that employers make hepatitis B vaccination available to their employees.
The World Health Organization (WHO) has also published guidelines and recommendations pertaining to AIDS and the workplace (WHO 1990, 1991). In 1990, the European Economic Community (EEC) issued a council directive (90/679/EEC) on protection of workers from risks related to exposure to biological agents at work. The directive requires employers to conduct an assessment of the risks to the health and safety of the worker. A distinction is drawn between activities where there is a deliberate intention to work with or use biological agents (e.g., laboratories) and activities where exposure is incidental (e.g., patient care). Control of risk is based on a hierarchical system of procedures. Special containment measures, according to the classification of the agents, are set out for certain types of health facilities and laboratories (McCloy 1994). In the US, CDC and the National Institutes of Health also have specific recommendations for laboratories (CDC 1993b).
Since the identification of HIV as a BBP, knowledge about HBV transmission has been helpful as a model for understanding modes of transmission of HIV. Both viruses are transmitted via sexual, perinatal and bloodborne routes. HBV is present in the blood of individuals positive for hepatitis B e antigen (HBeAg, a marker for high infectivity) at a concentration of approximately 10⁸ to 10⁹ viral particles per millilitre (ml) of blood (CDC 1988b). HIV is present in blood at much lower concentrations: 10³ to 10⁴ viral particles/ml for a person with AIDS and 10 to 100/ml for a person with asymptomatic HIV infection (Ho, Moudgil and Alam 1989). The risk of HBV transmission to a HCW after percutaneous exposure to HBeAg-positive blood is approximately 100-fold higher than the risk of HIV transmission after percutaneous exposure to HIV-infected blood (i.e., 30% versus 0.3%) (CDC 1989).
Hepatitis
Hepatitis, or inflammation of the liver, can be caused by a variety of agents, including toxins, drugs, autoimmune disease and infectious agents. Viruses are the most common cause of hepatitis (Benenson 1990). Three types of bloodborne viral hepatitis have been recognized: hepatitis B, formerly called serum hepatitis, the major risk to HCWs; hepatitis C, the major cause of parenterally transmitted non-A, non-B hepatitis; and hepatitis D, or delta hepatitis.
Hepatitis B. The major infectious bloodborne occupational hazard to HCWs is HBV. Among US HCWs with frequent exposure to blood, the prevalence of serological evidence of HBV infection ranges between approximately 15 and 30%. In contrast, the prevalence in the general population averages 5%. The cost-effectiveness of serological screening to detect susceptible individuals among HCWs depends on the prevalence of infection, the cost of testing and the cost of the vaccine. Vaccination of persons who already have antibodies to HBV has not been shown to cause adverse effects. Hepatitis B vaccine provides protection against hepatitis B for at least 12 years after vaccination; booster doses are not currently recommended. The CDC estimated that in 1991 there were approximately 5,100 occupationally acquired HBV infections in HCWs in the United States, causing 1,275 to 2,550 cases of clinical acute hepatitis, 250 hospitalizations and about 100 deaths (unpublished CDC data). In 1991, approximately 500 HCWs became HBV carriers. These individuals are at risk of long-term sequelae, including disabling chronic liver disease, cirrhosis and liver cancer.
The HBV vaccine is recommended for use in HCWs and public safety workers who may be exposed to blood in the workplace (CDC 1991b). Following a percutaneous exposure to blood, the decision to provide prophylaxis must include considerations of several factors: whether the source of the blood is available, the HBsAg status of the source and the hepatitis B vaccination and vaccine-response status of the exposed person. For any exposure of a person not previously vaccinated, hepatitis B vaccination is recommended. When indicated, hepatitis B immune globulin (HBIG) should be administered as soon as possible after exposure since its value beyond 7 days after exposure is unclear. Specific CDC recommendations are indicated in table 1 (CDC 1991b).
Table 1. Recommendation for post-exposure prophylaxis for percutaneous or permucosal exposure to hepatitis B virus, United States
| Exposed person | Source HBsAg1-positive | Source HBsAg-negative | Source not tested or status unknown |
| --- | --- | --- | --- |
| Unvaccinated | HBIG2 ×1 and initiate HB vaccine3 | Initiate HB vaccine | Initiate HB vaccine |
| Previously vaccinated, known responder | No treatment | No treatment | No treatment |
| Previously vaccinated, known non-responder | HBIG ×2, or HBIG ×1 and initiate revaccination | No treatment | If known high-risk source, treat as if source were HBsAg-positive |
| Previously vaccinated, response unknown | Test exposed person for anti-HBs4; if adequate5, no treatment; if inadequate, HBIG ×1 and vaccine booster dose | No treatment | Test exposed person for anti-HBs; if adequate, no treatment; if inadequate, give vaccine booster dose |
1 HBsAg = Hepatitis B surface antigen. 2 HBIG = Hepatitis B immune globulin; dose 0.06 mL/kg IM. 3 HB vaccine = hepatitis B vaccine. 4 Anti-HBs = antibody to hepatitis B surface antigen. 5 Adequate anti-HBs is ≥10 mIU/mL.
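Because table 1 is a pure decision table, its logic can be expressed compactly as a lookup keyed on the exposed person's vaccination status and the source's HBsAg status. The sketch below is an illustrative encoding only, not clinical software; the status labels are assumptions chosen for this example.

```python
# Illustrative encoding of table 1 (CDC 1991b). Keys are
# (vaccination status of exposed person, HBsAg status of source).
HBV_PEP = {
    ("unvaccinated", "positive"): "HBIG x1 and initiate HB vaccine",
    ("unvaccinated", "negative"): "Initiate HB vaccine",
    ("unvaccinated", "unknown"):  "Initiate HB vaccine",
    ("known responder", "positive"): "No treatment",
    ("known responder", "negative"): "No treatment",
    ("known responder", "unknown"):  "No treatment",
    ("known non-responder", "positive"):
        "HBIG x2, or HBIG x1 and initiate revaccination",
    ("known non-responder", "negative"): "No treatment",
    ("known non-responder", "unknown"):
        "If known high-risk source, treat as if source were HBsAg-positive",
    ("response unknown", "positive"):
        "Test for anti-HBs: adequate -> no treatment; "
        "inadequate -> HBIG x1 and vaccine booster dose",
    ("response unknown", "negative"): "No treatment",
    ("response unknown", "unknown"):
        "Test for anti-HBs: adequate -> no treatment; "
        "inadequate -> vaccine booster dose",
}

def hbv_recommendation(exposed_status: str, source_hbsag: str) -> str:
    """Look up the table 1 recommendation for one exposure."""
    return HBV_PEP[(exposed_status, source_hbsag)]

print(hbv_recommendation("unvaccinated", "unknown"))  # Initiate HB vaccine
```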
Table 2. Provisional US Public Health Service recommendations for chemoprophylaxis after occupational exposure to HIV, by type of exposure and source of material, 1996
| Type of exposure | Source material1 | Antiretroviral prophylaxis2 | Antiretroviral regimen3 |
| --- | --- | --- | --- |
| Percutaneous | Blood: highest risk4 | Recommend | ZDV plus 3TC plus IDV |
| | Blood: increased risk4 | Recommend | ZDV plus 3TC, ± IDV5 |
| | Blood: no increased risk4 | Offer | ZDV plus 3TC |
| | Fluid containing visible blood, other potentially infectious fluid6 or tissue | Offer | ZDV plus 3TC |
| | Other body fluid (e.g., urine) | Not offer | |
| Mucous membrane | Blood | Offer | ZDV plus 3TC, ± IDV5 |
| | Fluid containing visible blood, other potentially infectious fluid6 or tissue | Offer | ZDV, ± 3TC |
| | Other body fluid (e.g., urine) | Not offer | |
| Skin, increased risk7 | Blood | Offer | ZDV plus 3TC, ± IDV5 |
| | Fluid containing visible blood, other potentially infectious fluid6 or tissue | Offer | ZDV, ± 3TC |
| | Other body fluid (e.g., urine) | Not offer | |
1 Any exposure to concentrated HIV (e.g., in a research laboratory or production facility) is treated as percutaneous exposure to blood with highest risk. 2 Recommend—Postexposure prophylaxis (PEP) should be recommended to the exposed worker with counselling. Offer—PEP should be offered to the exposed worker with counselling. Not offer—PEP should not be offered because these are not occupational exposures to HIV. 3 Regimens: zidovudine (ZDV), 200 mg three times a day; lamivudine (3TC), 150 mg two times a day; indinavir (IDV), 800 mg three times a day (if IDV is not available, saquinavir may be used, 600 mg three times a day). Prophylaxis is given for 4 weeks. For full prescribing information, see package inserts. 4 Risk definitions for percutaneous blood exposure: Highest risk—BOTH larger volume of blood (e.g., deep injury with large diameter hollow needle previously in source patient’s vein or artery, especially involving an injection of source-patient’s blood) AND blood containing a high titre of HIV (e.g., source with acute retroviral illness or end-stage AIDS; viral load measurement may be considered, but its use in relation to PEP has not been evaluated). Increased risk—EITHER exposure to larger volume of blood OR blood with a high titre of HIV. No increased risk—NEITHER exposure to larger volume of blood NOR blood with a high titre of HIV (e.g., solid suture needle injury from source patient with asymptomatic HIV infection). 5 Possible toxicity of additional drug may not be warranted. 6 Includes semen; vaginal secretions; cerebrospinal, synovial, pleural, peritoneal, pericardial and amniotic fluids. 7 For skin, risk is increased for exposures involving a high titre of HIV, prolonged contact, an extensive area, or an area in which skin integrity is visibly compromised. For skin exposures without increased risk, the risk for drug toxicity outweighs the benefit of PEP.
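Footnote 4 reduces the risk tiers for percutaneous blood exposures to two yes/no questions (larger volume of blood; blood with a high titre of HIV). The sketch below simply restates that logic in code; the function and parameter names are invented for illustration.

```python
def percutaneous_blood_risk(larger_volume: bool, high_titre: bool) -> str:
    """Classify a percutaneous blood exposure per footnote 4 of table 2.

    larger_volume: e.g., deep injury with a large-diameter hollow needle
                   previously in the source patient's vein or artery.
    high_titre:    e.g., source with acute retroviral illness or end-stage AIDS.
    """
    if larger_volume and high_titre:
        return "highest risk"      # PEP recommended: ZDV plus 3TC plus IDV
    if larger_volume or high_titre:
        return "increased risk"    # PEP recommended: ZDV plus 3TC, +/- IDV
    return "no increased risk"     # PEP offered: ZDV plus 3TC

print(percutaneous_blood_risk(larger_volume=True, high_titre=False))  # increased risk
```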
Article 14(3) of EEC Directive 89/391/EEC on vaccination required only that effective vaccines, where they exist, be made available for exposed workers who are not already immune. An amending directive, 93/88/EEC, contained a recommended code of practice requiring that workers at risk be offered vaccination free of charge, be informed of the benefits and disadvantages of vaccination and non-vaccination and be provided a certificate of vaccination (WHO 1990).
The use of hepatitis B vaccine and appropriate environmental controls will prevent almost all occupational HBV infections. Reducing blood exposure and minimizing puncture injuries in the health care setting will also reduce the risk of transmission of other bloodborne viruses.
Hepatitis C. Transmission of HCV is similar to that of HBV, but infection persists in most patients indefinitely and more frequently progresses to long-term sequelae (Alter et al. 1992). The prevalence of anti-HCV among US hospital-based health care workers averages 1 to 2% (Alter 1993). HCWs who sustain accidental injuries from needlesticks contaminated with anti-HCV-positive blood have a 5 to 10% risk of acquiring HCV infection (Lampher et al. 1994; Mitsui et al. 1992). There has been one report of HCV transmission after a blood splash to the conjunctiva (Sartori et al. 1993). Prevention measures again consist of adherence to universal precautions and percutaneous injury prevention, since no vaccine is available and immune globulin does not appear to be effective.
Hepatitis D. Hepatitis D virus requires the presence of hepatitis B virus for replication; thus, HDV can infect persons only as a coinfection with acute HBV or as a superinfection of chronic HBV infection. HDV infection can increase the severity of liver disease; one case of occupationally acquired HDV hepatitis has been reported (Lettau et al. 1986). Hepatitis B vaccination of HBV-susceptible persons will also prevent HDV infection; however, there is no vaccine to prevent HDV superinfection of an HBV carrier. Other prevention measures consist of adherence to universal precautions and percutaneous injury prevention.
HIV
The first cases of AIDS were recognized in June of 1981. Initially, over 92% of the cases reported in the United States were in homosexual or bisexual men. However, by the end of 1982, AIDS cases were identified among injection drug users, blood transfusion recipients, haemophilia patients treated with clotting factor concentrates, children and Haitians. AIDS is the result of infection with HIV, which was isolated in 1985. HIV has spread rapidly. In the United States, for example, the first 100,000 AIDS cases occurred between 1981 and 1989; the second 100,000 cases occurred between 1989 and 1991. As of June 1994, 401,749 cases of AIDS had been reported in the United States (CDC 1994b).
Globally, HIV has affected many countries including those in Africa, Asia and Europe. As of 31 December 1994, 1,025,073 cumulative cases of AIDS in adults and children had been reported to the WHO. This represented a 20% increase from the 851,628 cases reported through December 1993. It was estimated that 18 million adults and about 1.5 million children have been infected with HIV since the beginning of the pandemic (late 1970s to early 1980s) (WHO 1995).
Although HIV has been isolated from human blood, breast milk, vaginal secretions, semen, saliva, tears, urine, cerebrospinal fluid and amniotic fluid, epidemiological evidence has implicated only blood, semen, vaginal secretions and breast milk in the transmission of the virus. The CDC has also reported on the transmission of HIV as the result of contact with blood or other body secretions or excretions from an HIV-infected person in the household (CDC 1994c). Documented modes of occupational HIV transmission include having percutaneous or mucocutaneous contact with HIV-infected blood. Exposure by the percutaneous route is more likely to result in infection transmission than is mucocutaneous contact.
There are a number of factors which may influence the likelihood of occupational bloodborne pathogen transmission, including: the volume of fluid in the exposure, the virus titre, the length of time of the exposure and the immune status of the worker. Additional data are needed to determine precisely the importance of these factors. Preliminary data from a CDC case-control study indicate that for percutaneous exposures to HIV-infected blood, HIV transmission is more likely if the source patient has advanced HIV disease and if the exposure involves a larger inoculum of blood (e.g., injury due to a large-bore hollow needle) (Cardo et al. 1995). Virus titre can vary between individuals and over time within a single individual. Also, blood from persons with AIDS, particularly in the terminal stages, may be more infectious than blood from persons in earlier stages of HIV infection, except possibly during the illness associated with acute infection (Cardo et al. 1995).
Occupational exposure and HIV infection
As of December 1996, CDC reported 52 HCWs in the United States who have seroconverted to HIV following a documented occupational exposure to HIV, including 19 laboratory workers, 21 nurses, six physicians and six in other occupations. Forty-five of the 52 HCWs sustained percutaneous exposures, five had mucocutaneous exposures, one had both a percutaneous and a mucocutaneous exposure and one had an unknown route of exposure. In addition, 111 possible cases of occupationally acquired infection have been reported. These possible cases have been investigated and are without identifiable non-occupational or transfusion risks; each reported percutaneous or mucocutaneous occupational exposures to blood or body fluids, or laboratory solutions containing HIV, but HIV seroconversion specifically resulting from an occupational exposure was not documented (CDC 1996a).
In 1993, the AIDS Centre at the Communicable Disease Surveillance Centre (UK) summarized reports of cases of occupational HIV transmission including 37 in the United States, four in the UK and 23 from other countries (France, Italy, Spain, Australia, South Africa, Germany and Belgium) for a total of 64 documented seroconversions after a specific occupational exposure. In the possible or presumed category there were 78 in the United States, six in the UK and 35 from other countries (France, Italy, Spain, Australia, South Africa, Germany, Mexico, Denmark, Netherlands, Canada and Belgium) for a total of 118 (Heptonstall, Porter and Gill 1993). The number of reported occupationally acquired HIV infections is likely to represent only a portion of the actual number due to under-reporting and other factors.
HIV post-exposure management
Employers should make available to workers a system for promptly initiating evaluation, counselling and follow-up after a reported occupational exposure that may place a worker at risk of acquiring HIV infection. Workers should be educated and encouraged to report exposures immediately after they occur so that appropriate interventions can be implemented (CDC 1990).
If an exposure occurs, the circumstances should be recorded in the worker’s confidential medical record. Relevant information includes the following: date and time of exposure; job duty or task being performed at the time of exposure; details of exposure; description of source of exposure, including, if known, whether the source material contained HIV or HBV; and details about counselling, post-exposure management and follow-up. The source individual should be informed of the incident and, if consent is obtained, tested for serological evidence of HIV infection. If consent cannot be obtained, policies should be developed for testing source individuals in compliance with applicable regulations. Confidentiality of the source individual should be maintained at all times.
If the source individual has AIDS, is known to be HIV seropositive, refuses testing or has an unknown HIV status, the worker should be evaluated clinically and serologically for evidence of HIV infection as soon as possible after the exposure (baseline) and, if seronegative, should be retested periodically for a minimum of 6 months after exposure (e.g., at 6 weeks, 12 weeks and 6 months after exposure) to determine whether HIV infection has occurred. The worker should be advised to report and seek medical evaluation for any acute illness that occurs during the follow-up period. During the follow-up period, especially the first 6 to 12 weeks after the exposure, exposed workers should be advised to refrain from blood, semen or organ donation and to abstain from sexual intercourse or use measures to prevent HIV transmission during intercourse.
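A minimal sketch of how such a serological follow-up schedule might be generated from the exposure date, using the intervals cited above (baseline, then 6 weeks, 12 weeks and 6 months, the last approximated here as 26 weeks); the function name and the example date are hypothetical.

```python
from datetime import date, timedelta

def hiv_followup_schedule(exposure: date) -> dict:
    """Follow-up testing dates after an occupational HIV exposure.

    Intervals follow the text: baseline as soon as possible after exposure,
    then 6 weeks, 12 weeks and 6 months (approximated as 26 weeks).
    """
    return {
        "baseline": exposure,
        "6 weeks": exposure + timedelta(weeks=6),
        "12 weeks": exposure + timedelta(weeks=12),
        "6 months": exposure + timedelta(weeks=26),
    }

for label, when in hiv_followup_schedule(date(1996, 7, 1)).items():
    print(label, when)
```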
In 1990, CDC published a statement on the management of exposure to HIV including considerations regarding zidovudine (ZDV) post-exposure use. After a careful review of the available data, CDC stated that the efficacy of zidovudine could not be assessed due to insufficient data, including available animal and human data (CDC 1990).
In 1996, information suggesting that ZDV post-exposure prophylaxis (PEP) may reduce the risk for HIV transmission after occupational exposure to HIV-infected blood (CDC 1996a) prompted the US Public Health Service (PHS) to update an earlier PHS statement on management of occupational exposure to HIV with the following findings and recommendations on PEP (CDC 1996b). Although failures of ZDV PEP have occurred (Tokars et al. 1993), ZDV PEP was associated with a decrease of approximately 79% in the risk for HIV seroconversion after percutaneous exposure to HIV-infected blood in a case-control study among HCWs (CDC 1995).
Although information about the potency and toxicity of antiretroviral drugs is available from studies of HIV-infected patients, it is uncertain to what extent this information can be applied to uninfected persons receiving PEP. In HIV-infected patients, combination therapy with the nucleosides ZDV and lamivudine (3TC) has greater antiretroviral activity than ZDV alone and is active against many ZDV-resistant HIV strains without significantly increased toxicity (Anon. 1996). Adding a protease inhibitor provides even greater increases in antiretroviral activity; among protease inhibitors, indinavir (IDV) is more potent than saquinavir at currently recommended doses and appears to have fewer drug interactions and short-term adverse effects than ritonavir (Niu, Stein and Schnittmann 1993). Few data exist to assess possible long-term (i.e., delayed) toxicity resulting from use of these drugs in persons not infected with HIV.
The following PHS recommendations are provisional because they are based on limited data regarding efficacy and toxicity of PEP and risk for HIV infection after different types of exposure. Because most occupational exposures to HIV do not result in infection transmission, potential toxicity must be carefully considered when prescribing PEP. Changes in drug regimens may be appropriate, based on factors such as the probable antiretroviral drug resistance profile of HIV from the source patient, local availability of drugs and medical conditions, concurrent drug therapy and drug toxicity in the exposed worker. If PEP is used, drug-toxicity monitoring should include a complete blood count and renal and hepatic chemical function tests at baseline and two weeks after starting PEP. If subjective or objective toxicity is noted, drug reduction or drug substitution should be considered, and further diagnostic studies may be indicated.
Chemoprophylaxis should be recommended to exposed workers after occupational exposures associated with the highest risk for HIV transmission. For exposures with a lower, but non-negligible risk, PEP should be offered, balancing the lower risk against the use of drugs having uncertain efficacy and toxicity. For exposures with negligible risk, PEP is not justified (see table 2). Exposed workers should be informed that knowledge about the efficacy and toxicity of PEP is limited, that for agents other than ZDV, data are limited regarding toxicity in persons without HIV infection or who are pregnant and that any or all drugs for PEP may be declined by the exposed worker.
PEP should be initiated promptly, preferably within 1 to 2 hours of exposure. Although animal studies suggest that PEP probably is not effective when started later than 24 to 36 hours post-exposure (Niu, Stein and Schnittmann 1993; Gerberding 1995), the interval after which there is no benefit from PEP in humans is undefined. Initiating therapy after a longer interval (e.g., 1 to 2 weeks) may be considered for the highest-risk exposures; even if infection is not prevented, early treatment of acute HIV infection may be beneficial (Kinloch-de-los et al. 1995).
If the source patient is unknown or the patient's HIV status is unknown, the decision to initiate PEP should be made on a case-by-case basis, based on the exposure risk and the likelihood of infection in known or possible source patients.
Other Bloodborne Pathogens
Syphilis, malaria, babesiosis, brucellosis, leptospirosis, arboviral infections, relapsing fever, Creutzfeldt-Jakob disease, human T-lymphotropic virus type 1 and viral haemorrhagic fever have also been transmitted by the bloodborne route (CDC 1988a; Benenson 1990). Occupational transmission of these agents has only rarely been recorded, if ever.
Prevention of Transmission of Bloodborne Pathogens
There are several basic strategies which relate to the prevention of occupational transmission of bloodborne pathogens. Exposure prevention, the mainstay of occupational health, can be accomplished by substitution (e.g., replacing an unsafe device with a safer one), engineering controls (i.e., controls that isolate or remove the hazard), administrative controls (e.g., prohibiting recapping of needles by a two-handed technique) and use of personal protective equipment. The first choice is to “engineer out the problem”.
In order to reduce exposures to bloodborne pathogens, adherence to general infection control principles, as well as strict compliance with universal precaution guidelines, is required. Important components of universal precautions include the use of appropriate personal protective equipment, such as gloves, gowns and eye protection, when exposure to potentially infectious body fluids is anticipated. Gloves are one of the most important barriers between the worker and the infectious material. While they do not prevent needlesticks, they do protect the skin. Gloves should be worn whenever contact with blood or body fluids is anticipated. Washing of gloves is not recommended. Recommendations also advise workers to take precautions to prevent injuries by needles, scalpels and other sharp instruments or devices during procedures; when cleaning used instruments; during disposal of used needles; and when handling sharp instruments after procedures.
Percutaneous exposures to blood
Since the major risk of infection results from parenteral exposure to sharp instruments such as syringe needles, engineering controls, such as self-resheathing needles, needleless IV systems, blunt suture needles and the appropriate selection and use of sharps disposal containers, are critical components of universal precautions for minimizing percutaneous injuries.
The most common type of percutaneous inoculation occurs through inadvertent needlestick injury, many instances of which are associated with the recapping of needles. Workers have cited the following reasons for recapping: the inability to dispose of needles properly and immediately, sharps disposal containers that are too far away, lack of time, dexterity problems and patient interaction.
Needles and other sharp devices can be redesigned to prevent a significant proportion of percutaneous exposures. A fixed barrier should be provided between the hands and the needle after use; the worker's hands should remain behind the needle. Any safety feature should be an integral part of the device. The design should be simple, and little or no training should be required (Jagger et al. 1988).
Implementing safer needle devices must be accompanied by evaluation. In 1992, the American Hospital Association (AHA) published a briefing to assist hospitals with the selection, evaluation and adoption of safer needle devices (AHA 1992). The briefing stated that “because safer needle devices, unlike drugs and other therapies, do not undergo clinical testing for safety and efficacy before they are marketed, hospitals are essentially ‘on their own’ when it comes to selecting appropriate products for their specific institutional needs”. Included in the AHA document are guidance for the evaluation and adoption of safer needle devices, case studies of the use of safety devices, evaluation forms and listing of some, but not all, products on the US market.
Prior to implementation of a new device, health care institutions must ensure that there is an appropriate needlestick surveillance system in place. In order to accurately assess the efficacy of new devices, the number of reported exposures should be expressed as an incidence rate.
Possible denominators for reporting the number of needlestick injuries include patient days, hours worked, number of devices purchased, number of devices used and number of procedures performed. The collection of specific information on device-related injuries is an important component of the evaluation of the effectiveness of a new device. Factors to be considered in collecting information on needlestick injuries include: new product distribution, stocking and tracking; identification of users; removal of other devices; compatibility with other devices (especially IV equipment); ease of use; and mechanical failure. Factors which may contribute to bias include compliance, subject selection, procedures, recall, contamination, reporting and follow-up. Possible outcome measures include rates of needlestick injuries, HCW compliance, patient care complications and cost.
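To illustrate the point about incidence rates, the sketch below expresses reported injuries against one chosen denominator; the denominator choice (devices used) and all figures are hypothetical, and a real evaluation would compute the same rate for each candidate denominator listed above.

```python
def needlestick_rate(injuries: int, denominator: float, per: float = 100_000) -> float:
    """Incidence rate = injuries / denominator, scaled (here, per 100,000 units).

    The denominator may be patient days, hours worked, devices purchased,
    devices used or procedures performed, as listed in the text.
    """
    return injuries / denominator * per

# Hypothetical before/after comparison for a safer needle device:
print(needlestick_rate(30, 250_000))  # 12.0 injuries per 100,000 devices (old device)
print(needlestick_rate(9, 250_000))   #  3.6 injuries per 100,000 devices (new device)
```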
Finally, training and feedback from workers are important components of any successful needlestick prevention programme. User acceptance is a critical factor, but one that seldom receives enough attention.
Elimination or reduction of percutaneous injuries should result if adequate engineering controls are available. If HCWs, product evaluation committees, administrators and purchasing departments all work together to identify where and what safer devices are needed, safety and cost effectiveness can be combined. Occupational transmission of bloodborne pathogens is costly, both in terms of money and the impact on the employee. Every needlestick injury causes undue stress on the employee and may affect job performance. Referral to mental health professionals for supportive counselling may be required.
In summary, a comprehensive approach to prevention is essential to maintaining a safe and healthy environment in which to provide health care services. Prevention strategies include the use of vaccines, post-exposure prophylaxis and prevention or reduction of needlestick injuries. Prevention of needlestick injuries can be accomplished by improvement in the safety of needle-bearing devices, development of procedures for safer use and disposal and adherence to infection control recommendations.
Acknowledgements: The authors thank Mariam Alter, Lawrence Reed and Barbara Gooch for their manuscript review.
Infectious diseases play a significant part in worldwide occurrences of occupational disease in HCWs. Since reporting procedures vary from country to country, and since diseases considered job-related in one country may be classified as non-occupational elsewhere, accurate data concerning their frequency and their proportion of the overall number of occupational diseases among HCWs are difficult to obtain. The proportions range from about 10% in Sweden (Lagerlöf and Broberg 1989), to about 33% in Germany (BGW 1993) and nearly 40% in France (Estryn-Béhar 1991).
The prevalence of infectious diseases is directly related to the efficacy of preventive measures such as vaccines and post-exposure prophylaxis. For example, during the 1980s in France, the proportion of all viral hepatitides fell to 12.7% of its original level thanks to the introduction of vaccination against hepatitis B (Estryn-Béhar 1991). This was noted even before hepatitis A vaccine became available.
Similarly, it may be presumed that, with the declining immunization rates in many countries (e.g., in the Russian Federation and Ukraine in the former Soviet Union during 1994-1995), cases of diphtheria and poliomyelitis among HCWs will increase.
Finally, occasional infections with streptococci, staphylococci and Salmonella typhi are being reported among health care workers.
Epidemiological Studies
The following infectious diseases—listed in order of frequency—are the most important in worldwide occurrences of occupational infectious diseases in health care workers:
Also important are the following (not in order of frequency):
It is doubtful that the many cases of enteric infection (e.g., salmonella, shigella) often included in the statistics are, in fact, job-related, since these infections are, as a rule, transmitted by the faecal-oral route.
Considerable data are available concerning the epidemiological significance of these job-related infections, mostly in relation to hepatitis B and its prevention, but also in relation to tuberculosis, hepatitis A and hepatitis C. Epidemiological studies have also dealt with measles, mumps, rubella, varicella and erythema infectiosum ("Ringelröteln", fifth disease). In using them, however, care must be taken to distinguish between incidence studies (e.g., determination of annual hepatitis B infection rates), sero-epidemiological prevalence studies and other types of prevalence studies (e.g., tuberculin tests).
Hepatitis B
The risk of hepatitis B infection among HCWs, primarily transmitted through contact with blood during needlestick injuries, depends on the frequency of the disease in the population they serve. In northern, central and western Europe, Australia and North America it is found in about 2% of the population. It is encountered in about 7% of the population in southern and south-eastern Europe and most parts of Asia. In Africa, the northern parts of South America and in eastern and south-eastern Asia, rates as high as 20% have been observed (Hollinger 1990).
A Belgian study found that 500 HCWs in northern Europe became infected with hepatitis B each year while the figure for southern Europe was 5,000 (Van Damme and Tormanns 1993). The authors calculated that the annual case rate for western Europe is about 18,200 health care workers. Of these, about 2,275 ultimately develop chronic hepatitis, of whom some 220 will develop cirrhosis of the liver and 44 will develop hepatic carcinoma.
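Reading both outcome figures as fractions of the chronic cases, as the sentence suggests, the progression proportions implied by that calculation can be checked directly:

```python
infections = 18_200  # estimated annual HBV infections among western European HCWs
chronic    = 2_275   # ultimately develop chronic hepatitis
cirrhosis  = 220     # develop cirrhosis of the liver
carcinoma  = 44      # develop hepatic carcinoma

print(f"{chronic / infections:.1%}")  # 12.5% of infections become chronic
print(f"{cirrhosis / chronic:.1%}")   # 9.7% of chronic cases develop cirrhosis
print(f"{carcinoma / chronic:.1%}")   # 1.9% of chronic cases develop hepatic carcinoma
```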
A large study involving 4,218 HCWs in Germany, where about 1% of the population is positive for hepatitis B surface antigen (HBsAg), found that the risk of contracting hepatitis B is approximately 2.5 times greater among HCWs than in the general population (Hofmann and Berthold 1989). The largest study to date, involving 85,985 HCWs worldwide, demonstrated that those in dialysis, anaesthesiology and dermatology departments were at greatest risk of hepatitis B (Maruna 1990).
A commonly overlooked source of concern is the HCW who has a chronic hepatitis B infection. More than 100 instances have been recorded worldwide in which the source of the infection was not the patient but the doctor. The most spectacular instance was the Swiss doctor who infected 41 patients (Grob et al. 1987).
While the most important mechanism of transmission of the hepatitis B virus is injury by a blood-contaminated needle (Hofmann and Berthold 1989), the virus has been detected in a number of other body fluids (e.g., semen, vaginal secretions, cerebrospinal fluid and pleural exudate) (CDC 1989).
Tuberculosis
In most countries around the world, tuberculosis continues to rank first or second in importance of work-related infections among HCWs (see the article “Tuberculosis prevention, control and surveillance”). Many studies have demonstrated that although the risk is present throughout the professional life, it is greatest during the period of training. For example, a Canadian study in the 1970s demonstrated the tuberculosis rate among female nurses to be double that of women in other professions (Burhill et al. 1985). And, in Germany, where the tuberculosis incidence ranges around 18 per 100,000 for the general population, it is about 26 per 100,000 among health care workers (BGW 1993).
A more accurate estimate of the risk of tuberculosis may be obtained from epidemiological studies based on the tuberculin test. A positive reaction is an indicator of infection by Mycobacterium tuberculosis or other mycobacteria or a prior inoculation with the BCG vaccine. If that inoculation was received 20 or more years earlier, it is presumed that the positive test indicates at least one contact with tubercle bacilli.
Today, tuberculin testing is done by means of the patch test in which the response is read within five to seven days after the application of the “stamp”. A large-scale German study based on such skin tests showed a rate of positives among health professionals that was only moderately higher than that among the general population (Hofmann et al. 1993), but long-range studies demonstrate that a greatly heightened risk of tuberculosis does exist in some areas of health care services.
More recently, anxiety has been generated by the increasing number of infections with drug-resistant organisms. This is a matter of particular concern in designing a prophylactic regimen for apparently healthy health care workers whose tuberculin tests "converted" to positive after exposure to patients with tuberculosis.
Hepatitis A
Since the hepatitis A virus is transmitted almost exclusively through faeces, the number of HCWs at risk is substantially smaller than for hepatitis B. An early study conducted in West Berlin showed that paediatric personnel were at greatest risk of this infection (Lange and Masihi 1986). These results were subsequently confirmed by a similar study in Belgium (Van Damme et al. 1989). Similarly, studies in southwest Germany showed increased risk to nurses, paediatric nurses and cleaning women (Hofmann et al. 1992; Hofmann, Berthold and Wehrle 1992). A study undertaken in Cologne, Germany, revealed no risk to geriatric nurses, in contrast to higher prevalence rates among the personnel of child care centres. Another study showed increased risk of hepatitis A among paediatric nurses in Ireland, Germany and France; in the last of these, greater risk was found in workers in psychiatric units treating children and youngsters. Finally, a study of infection rates among handicapped people disclosed higher levels of risk for the patients as well as for the workers caring for them (Clemens et al. 1992).
Hepatitis C
Hepatitis C, discovered in 1989, is, like hepatitis B, primarily transmitted through blood introduced via needle puncture wounds. Until recently, however, data relating to its threat to HCWs have been limited. A 1991 New York study of 456 dentists and 723 controls showed an infection rate of 1.75% among the dentists, compared with 0.14% among the controls (Klein et al. 1991). A German research group demonstrated the prevalence of hepatitis C in prisons and attributed it to the large number of intravenous drug users among the inmates (Gaube et al. 1993). An Austrian study found 2.0% of 294 health care personnel to be seropositive for hepatitis C antibodies, a figure thought to be much higher than that among the general population (Hofmann and Kunz 1990). This was confirmed by another study of HCWs conducted in Cologne, Germany (Chriske and Rossa 1991).
A study in Freiburg, Germany, found that contact with handicapped residents of nursing homes, particularly those with cerebral palsy or trisomy 21, with patients with haemophilia and with those dependent on intravenously administered drugs presented a particular risk of hepatitis C to the workers involved in their care. A significantly increased prevalence rate was found in dialysis personnel, and the relative risk for all health care workers was estimated to be 2.5 (admittedly calculated from a relatively small sample).
A possible alternative path of infection was demonstrated in 1993 when a case of hepatitis C was shown to have developed after a splash into the eye (Sartori et al. 1993).
Varicella
Studies of the prevalence of varicella, an illness that is particularly grave in adults, have consisted of tests for varicella antibodies (anti-VZV) conducted in Anglo-Saxon countries. Thus, a seronegative rate of 2.9% was found among 241 hospital employees aged 24 to 62, but the rate was 7.5% for those under the age of 35 (McKinney, Horowitz and Baxtiola 1989). Another study in a paediatric clinic yielded a negative rate of 5% among 2,730 individuals tested in the clinic, but these data become less impressive when it is noted that the serological tests were performed only on persons without a history of having had varicella. A significantly increased risk of varicella infection for paediatric hospital personnel, however, was demonstrated by a study conducted in Freiburg, which found that, in a group of 533 individuals working in hospital care, paediatric hospital care and administration, evidence of varicella immunity was present in only 85% of persons younger than 20 years.
Mumps
In considering risk levels of mumps infection, a distinction must be made between countries in which mumps immunization is mandatory and those in which these inoculations are voluntary. In the former, nearly all children and young people will have been immunized and, therefore, mumps poses little risk to health care workers. In the latter, which include Germany, cases of mumps are becoming more frequent. As a result of the lack of immunity, the complications of mumps have been increasing, particularly among adults. A report of an epidemic in a non-immune Inuit population on St. Lawrence Island (located between Siberia and Alaska) demonstrated the frequency of such complications of mumps as orchitis in men, mastitis in women and pancreatitis in both sexes (Philip, Reinhard and Lackman 1959).
Unfortunately, epidemiological data on mumps among HCWs are very sparse. A 1986 study in Germany showed that the rate of mumps immunity among 10- to 15-year-olds was 84% but, with voluntary rather than mandatory inoculation, one may presume that this rate has been declining. A 1994 study involving 774 individuals in Freiburg indicated a significantly increased risk to employees in paediatric hospitals (Hofmann, Sydow and Michaelis 1994).
Measles
The situation with measles is similar to that with mumps. Reflecting its high degree of contagiousness, risks of infection among adults emerge as their immunization rates fall. A US study reported an immunity rate of over 99% (Chou, Weil and Arnmow 1986) and, two years later, 98% of a cohort of 163 nursing students were found to have immunity (Wigand and Grenner 1988). A study in Freiburg yielded rates of 96 to 98% among nurses and paediatric nurses, while the rates of immunity among non-medical personnel were only 87 to 90% (Sydow and Hofmann 1994). Such data would support a recommendation that immunization be made mandatory for the general population.
Rubella
Rubella falls between measles and mumps with respect to its contagiousness. Studies have shown that about 10% of HCWs are not immune (Ehrengut and Klett 1981; Sydow and Hofmann 1994) and, therefore, at high risk of infection when exposed. Although generally not a serious illness among adults, rubella may be responsible for devastating effects on the foetus during the first 18 weeks of pregnancy: abortion, stillbirth or congenital defects (see table 1) (South, Sever and Teratogen 1985; Miller, Vurdien and Farrington 1993). Since these may be produced even before the woman knows that she is pregnant and, since health care workers, particularly those in contact with paediatric patients, are likely to be exposed, it is especially important that inoculation be urged (and perhaps even required) for all female health care workers of child-bearing age who are not immune.
Table 1. Congenital abnormalities following rubella infection in pregnancy
Studies by South and Sever (1985)

| Week of pregnancy | <4 | 5–8 | 9–12 | 13–16 | >17 |
|---|---|---|---|---|---|
| Deformity rate (%) | 70 | 40 | 25 | 40 | 8 |

Studies by Miller, Vurdien and Farrington (1993)

| Week of pregnancy | <10 | 11–12 | 13–14 | 15–16 | >17 |
|---|---|---|---|---|---|
| Deformity rate (%) | 90 | 33 | 11 | 24 | 0 |
HIV/AIDS
During the 1980s and 1990s, HIV seroconversions (i.e., a positive reaction in an individual previously found to have been negative) became a minor occupational risk among HCWs, although clearly not one to be ignored. By early 1994, some 24 reliably documented cases and 35 possible cases had been reported in Europe (Pérez et al. 1994), with an additional 43 documented cases and 43 possible cases reported in the US (CDC 1994a). Unfortunately, except for avoiding needlesticks and other contacts with infected blood or body fluids, there are no effective preventive measures. Some prophylactic regimens for individuals who have been exposed are recommended and described in the article “Prevention of occupational transmission of bloodborne pathogens”.
Other infectious diseases
The other infectious diseases listed earlier in this article have not yet emerged as significant hazards to HCWs, either because they have not been recognized and reported or because their epidemiology has not yet been studied. Sporadic reports of single cases and small clusters of cases suggest that the identification and testing of serological markers should be explored. For example, a 33-month study of typhoid conducted by the Centers for Disease Control (CDC) revealed that 11.2% of all sporadic cases not associated with outbreaks occurred in laboratory workers who had examined stool specimens (Blaser et al. 1980).
The future is clouded by two simultaneous problems: the emergence of new pathogens (e.g., new strains such as hepatitis G and new organisms such as the Ebola virus and the equine morbillivirus recently discovered to be fatal to both horses and humans in Australia) and the continuing development of drug resistance by well-recognized organisms such as the tubercle bacillus. HCWs are likely to be the first to be systematically exposed. This makes their prompt and accurate identification and the epidemiological study of their patterns of susceptibility and transmission of the utmost importance.
Prevention of Infectious Diseases among Health Care Workers
The first essential in the prevention of infectious disease is the indoctrination of all HCWs, support staff as well as health professionals, in the fact that health care facilities are “hotbeds” of infection, with every patient representing a potential risk. This is important not only for those directly involved in diagnostic or therapeutic procedures, but also for those who collect and handle blood, faeces and other biological materials and for those who come in contact with dressings, linens, dishes and other fomites. In some instances, even breathing the same air may be a possible hazard. Each health care facility, therefore, must develop a detailed procedure manual identifying these potential risks and the steps needed to eliminate, avoid or control them. Then, all personnel must be drilled in following these procedures and monitored to ensure that they are being properly performed. Finally, all failures of these protective measures must be recorded and reported so that revision and/or retraining may be undertaken.
Important secondary measures are the labelling of areas and materials which may be especially infectious and the provision of gloves, gowns, masks, forceps and other protective equipment. Washing the hands with germicidal soap and running water (wherever possible) will not only protect the health care worker but also will minimize the risk of his or her transmitting the infection to co-workers and other patients.
All blood and body fluid specimens or splashes and materials stained with them must be handled as though they are infected. The use of rigid plastic containers for the disposal of needles and other sharp instruments and diligence in the proper disposal of potentially infectious wastes are important preventive measures.
Careful medical histories, serological testing and patch testing should be performed prior to or as soon as health care workers report for duty. Where advisable (and there are no contraindications), appropriate vaccines should be administered (hepatitis B, hepatitis A and rubella appear to be the most important) (see table 2). In any case, seroconversion may indicate an acquired infection and the advisability of prophylactic treatment.
Table 2. Indications for vaccinations in health service employees.
| Disease | Complications | Who should be vaccinated? |
|---|---|---|
| Diphtheria | | In the event of an epidemic, all employees without … |
| Hepatitis A | | Employees in the paediatric field as well as in infection … |
| Hepatitis B | | All seronegative employees with possibility of contact … |
| Influenza | | Regularly offered to all employees |
| Measles | Encephalitis | Seronegative employees in the paediatric field |
| Mumps | Meningitis | Seronegative employees in the paediatric field |
| Rubella | Embryopathy | Seronegative employees in paediatrics/midwifery/… |
| Poliomyelitis | | All employees, e.g., those involved in vaccination … |
| Tetanus | | Employees in gardening and technical fields obligatory, … |
| Tuberculosis | | In all events, employees in pulmonology and lung surgery … |
| Varicella | Foetal risks | Seronegative employees in paediatrics or at least in the … |
Prophylactic therapy
When it is known that the worker is not immune and has been exposed to a proven or highly suspected risk of infection, prophylactic therapy may be instituted. Especially if the worker presents any evidence of possible immunodeficiency, human immunoglobulin may be administered. Where specific “hyperimmune” serum is available, as for mumps and hepatitis B, it is preferable. In infections which, like hepatitis B, may be slow to develop, or for which “booster” doses are advisable, as in tetanus, a vaccine may be administered. When vaccines are not available, as in meningococcal infections and plague, prophylactic antibiotics may be used either alone or as a supplement to immune globulin. Prophylactic regimens of other drugs have been developed for tuberculosis and, more recently, for potential HIV infections, as discussed elsewhere in this chapter.
Treatment of Back Pain
Most episodes of acute back pain respond promptly to several days of rest followed by the gradual resumption of activities within the limits of pain. Non-narcotic analgesics and non-steroidal anti-inflammatory drugs may be helpful in relieving pain but do not shorten the course. (Since some of these drugs affect alertness and reaction time, they should be used with caution by individuals who drive vehicles or have assignments where momentary lapses may result in harm to patients.) A variety of forms of physiotherapy (e.g., local applications of heat or cold, diathermy, massage, manipulation, etc.) often provide transient relief; they are particularly useful as a prelude to graded exercises that will promote the restoration of muscle strength and relaxation as well as flexibility. Prolonged bed rest, traction and the use of lumbar corsets tend to delay recovery and often lengthen the period of disability (Blow and Jayson 1988).
Chronic, recurrent back pain is best treated by a secondary prevention regimen. Getting enough rest, sleeping on a firm mattress, sitting in straight chairs, wearing comfortable, well-fitted shoes, maintaining good posture and avoiding long periods of standing in one position are important adjuncts. Excessive or prolonged use of medications increases the risk of side effects and should be avoided. Some cases are helped by the injection of “trigger points”, localized tender nodules in muscles and ligaments, as originally advocated in the seminal report by Lange (1931).
Exercise of key postural muscles (upper and lower abdominal, back, gluteal and thigh muscles) is the mainstay of both chronic care and prevention of back pain. Kraus (1970) has formulated a regimen that features strengthening exercises to correct muscle weakness, relaxing exercises to relieve tension, spasticity and rigidity, stretching exercises to minimize contractures and exercises to improve balance and coordination. These exercises, he cautions, should be individualized on the basis of an examination of the patient and functional tests of muscle strength, holding power and elasticity (e.g., the Kraus-Weber tests (Kraus 1970)). To avoid adverse effects of exercise, each session should include warm-up and cool-down exercises as well as limbering and relaxing exercises, and the number, duration and intensity of the exercises should be increased gradually as conditioning improves. Simply giving the patient a printed exercise sheet or booklet is not enough; initially, he or she should be given individual instruction and observed to be sure that the exercises are being done correctly.
In 1974, the YMCA in New York introduced the “Y’s Way to a Healthy Back Program”, a low-cost course of exercise training based on the Kraus exercises; in 1976 it became a national programme in the US and, later, it was established in Australia and in several European countries (Melleby 1988). The twice-a-week, six-week programme is given by specially trained YMCA exercise instructors and volunteers, mainly in urban YMCAs (arrangements for courses at the worksite have been made by a number of employers), and it emphasizes the indefinite continuation of the exercises at home. Approximately 80% of the thousands of individuals with chronic or recurrent back pain who have participated in this programme have reported elimination or improvement in their pain.
Epidemiology
Back pain accounts for a growing share of illness in developed industrial societies. According to data provided by the National Center for Health Statistics in the United States, chronic diseases of the back and of the vertebral column constitute the dominant group among the disorders affecting employable individuals under 45 in the US population. Countries such as Sweden, which traditionally have good occupational accident statistics, show that musculoskeletal injuries occur twice as frequently in the health services as in all other fields (Lagerlöf and Broberg 1989).
In an analysis of accident frequency in a 450-bed hospital in the United States, Kaplan and Deyo (1988) were able to demonstrate an 8 to 9% yearly incidence of injury to lumbar vertebrae in nurses, leading on average to 4.7 days of absence from work. Thus of all employee groups in hospitals, nurses were the one most afflicted by this condition.
As is clear from a survey of studies done in the last 20 years (Hofmann and Stössel 1995), this disorder has become the object of intensive epidemiological research. All the same, such research—particularly when it aims at furnishing internationally comparable results—is subject to a variety of methodological difficulties. Sometimes all employee categories in the hospital are investigated, sometimes simply nurses. Some studies have suggested that it would make sense to differentiate, within the group “nurses”, between registered nurses and nursing aides. Since nurses are predominantly women (about 80% in Germany), and since reported incidence and prevalence rates regarding this disorder do not differ significantly for male nurses, gender-related differentiation would seem to be of less importance to epidemiological analyses.
More important is the question of what investigative tools should be used to research back pain conditions and their gradations. Along with the interpretation of accident, compensation and treatment statistics, one frequently finds, in the international literature, a retrospectively applied standardized questionnaire, to be filled out by the person tested. Other investigative approaches operate with clinical investigative procedures such as orthopaedic function studies or radiological screening procedures. Finally, the more recent investigative approaches also use biomechanical modelling and direct or video-taped observation to study the pathophysiology of work performance, particularly as it involves the lumbo-sacral area (see Hagberg et al. 1993 and 1995).
An epidemiological determination of the extent of the problem based on self-reported incidence and prevalence rates, however, poses difficulties as well. Cultural-anthropological studies and comparisons of health systems have shown that perceptions of pain differ not only between members of different societies but also within societies (Payer 1988). Also, there is the difficulty of objectively grading the intensity of pain, a subjective experience. Finally, the prevailing perception among nurses that “back pain goes with the job” leads to under-reporting.
International comparisons based on analyses of governmental statistics on occupational disorders are unreliable for scientific evaluation of this disorder because of variations in the laws and regulations related to occupational disorders among different countries. Further, within a single country, there is the truism that such data are only as reliable as the reports upon which they are based.
In summary, many studies have determined that 60 to 80% of all nursing staff (averaging 30 to 40 years in age) have had at least one episode of back pain during their working lives. The reported incidence rates usually do not exceed 10%. When classifying back pain, it has been helpful to follow the suggestion of Nachemson and Anderson (1982) to distinguish between back pain and back pain with sciatica. In an as-yet unpublished study, a subjective complaint of sciatica was found to be useful in classifying the results of subsequent CAT scans (computer-assisted tomography) and magnetic resonance imaging (MRI).
Economic Costs
Estimates of the economic costs differ greatly, depending, in part, on the possibilities and conditions of diagnosis, treatment and compensation available at the particular time and/or place. Thus, in the US for 1976, Snook (1988b) estimated that the costs of back pain totalled US$14 billion, while a total cost of US$25 billion was calculated for 1983. The calculations of Holbrook et al. (1984), which estimated 1984 costs to total just under US$16 billion, appear to be the most reliable. In the United Kingdom, costs were estimated to have risen by US$2 billion between 1987 and 1989, according to Ernst and Fialka (1994). Estimates of direct and indirect costs for 1990 reported by Cats-Baril and Frymoyer (1991) indicate that the costs of back pain have continued to increase. In 1988 the US Bureau of National Affairs reported that chronic back pain generated costs of US$80,000 per chronic case per year.
In Germany, the two largest workers’ accident insurance funds (Berufsgenossenschaften) developed statistics showing that, in 1987, about 15 million work days were lost because of back pain. This corresponds to roughly one-third of all missed work days annually. These losses appear to be increasing at a current average cost of DM 800 per lost day.
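As a rough, illustrative check on the scale of these figures (a calculation not made in the source, which quotes the DM 800 figure only as a current average cost per lost day), valuing the 15 million lost days at that rate gives

$$15 \times 10^{6}\ \text{days} \times \mathrm{DM}\,800/\text{day} = \mathrm{DM}\,1.2 \times 10^{10} \approx \mathrm{DM}\,12\ \text{billion per year}.$$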
It may therefore be said, independently of national differences and vocational groups, that back disorders and their treatment represent not simply a human and a medical problem, but also an enormous economic burden. Accordingly, it seems advisable to pay special attention to the prevention of these disorders in particularly burdened vocational groups such as nursing.
Causes
In research concerning the causes of work-related disorders of the lower back in nurses, one should in principle differentiate between disorders attributed to a particular incident or accident and those whose genesis lacks such specificity. Both may give rise to chronic back pain if not properly treated. Reflecting their presumed medical knowledge, nurses are much more prone than other groups in the working population to use self-medication and self-treatment without consulting a physician. This is not always a disadvantage, since many physicians either do not know how to treat back problems or give them short shrift, simply prescribing sedatives and advising heat applications to the area. The latter reflects the oft-repeated truism that “backaches come with the job”, or the tendency to regard workers with chronic back complaints as malingerers.
Detailed analyses of work accident occurrences in the area of spinal disorders have only just begun to be made (see Hagberg et al. 1995). This is also true of the analysis of so-called near-accidents, which can provide a particular sort of information concerning the precursor conditions of a given work accident.
The cause of low back disorders has been attributed by the majority of the studies to the physical demands of the work of nursing, i.e., lifting, supporting and moving of patients and handling heavy and/or bulky equipment and materials, often without ergonomic aids or the help of additional personnel. These activities are often conducted in awkward body positions, where footing is uncertain, and when, out of wilfulness or dementia, the nurse’s efforts are resisted by the patient. Trying to keep a patient from falling often results in injury to the nurse or the attendant. Current research, however, is characterized by a strong tendency to speak in terms of multicausality, whereby both the biomechanical basis of demands made upon the body and the anatomical preconditions are discussed.
In addition to faulty biomechanics, injury in such situations can be pre-conditioned by fatigue, muscular weakness (especially of the abdominals, back extensors and quadriceps), diminished flexibility of joints and ligaments and various forms of arthritis. Excessive psychosocial stress can contribute in two ways: (1) prolonged unconscious muscular tension and spasm leading to muscular fatigue and proneness to injury, and (2) irritation and impatience which prompts injudicious attempts to work hurriedly and without waiting for assistance. Enhanced ability to cope with stress and the availability of social support in the workplace are helpful (Theorell 1989; Bongers et al. 1992) when work-related stressors cannot be eliminated or controlled.
Diagnosis
To the risk factors deriving from the biomechanics of the forces acting on the spine and from the anatomy of the support and movement apparatus may be added certain risk situations and dispositions attributable to the work environment. Even though current research is not clear on this point, there is some indication that the increased and recurrent incidence of psychosocial stress factors in nursing work can reduce the threshold of sensitivity to physically burdensome activities, thus contributing to an increased level of vulnerability. In any case, whether such stress factors exist appears to be less decisive in this connection than how nursing staff manage them in a demanding situation and whether they can count on social support in the workplace (Theorell 1989; Bongers et al. 1992).
The proper diagnosis of low back pain requires a complete medical and a detailed occupational history including accidents resulting in injury or near-misses and prior episodes of back pain. The physical examination should include evaluation of gait and posture, palpation for areas of tenderness and evaluation of muscle strength, range of motion and joint flexibility. Complaints of weakness in the leg, areas of numbness and pain that radiate below the knee are indications for neurological examination to seek evidence of spinal cord and/or peripheral nerve involvement. Psychosocial problems may be disclosed through judicious probing of emotional status, attitudes and pain tolerance.
Radiological studies and scans are rarely helpful since, in the vast majority of cases, the problem lies in the muscles and ligaments rather than the bony structures. In fact, bony abnormalities are found in many individuals who have never had back pain; ascribing the back pain to such radiological findings as disc space narrowing or spondylosis may lead to needlessly heroic treatment. Myelography should not be undertaken unless spinal surgery is contemplated.
Clinical laboratory tests are useful in assessing general medical status and may be helpful in disclosing systemic diseases such as arthritis.
Treatment
Various modes of management are indicated depending on the nature of the disorder. Besides ergonomic interventions to enable the return of injured workers to the workplace, surgical, invasive-radiological, pharmacological, physical, physiotherapeutic and also psychotherapeutic management approaches may be necessary—sometimes in combination (Hofmann et al. 1994). Again, however, the vast majority of cases resolve regardless of the therapy offered. Treatment is discussed further in the Case Study: Treatment of Back Pain.
Prevention in the Work Environment
Primary prevention of back pain in the workplace involves the application of ergonomic principles and the use of technical aids, coupled with physical conditioning and training of the workers.
Despite the reservations frequently held by nursing staff regarding the use of technical aids for the lifting, positioning and moving of patients, the importance of ergonomic approaches to prevention is increasing (see Estryn-Béhar, Kaminski and Peigné 1990; Hofmann et al. 1994).
In addition to the major systems (permanently installed ceiling lifters, mobile floor lifters), a series of small and simple aids has been introduced into nursing practice (turntables, walking girdles, lifting cushions, slide boards, bed ladders, anti-slide mats and so on). When using these aids, it is important that their actual use fit in well with the care concept of the particular area of nursing in which they are used. Wherever the use of such lifting aids contradicts the care concept practised, acceptance of such technical aids by nursing staff tends to be low.
Even where technical aids are employed, training in techniques of lifting, carrying and supporting is essential. Lidström and Zachrisson (1973) describe a Swedish “Back School” in which physiotherapists trained in communication conduct classes explaining the structure of the spine and its muscles, how they work in different positions and movements and what can go wrong with them, and demonstrating appropriate lifting and handling techniques that will prevent injury. Klaber Moffet et al. (1986) describe the success of a similar programme in the UK. Such training in lifting and carrying is particularly important where, for one reason or another, use of technical aids is not possible. Numerous studies have shown that training in such techniques must constantly be reviewed; knowledge gained through instruction is frequently “unlearned” in practice.
Unfortunately, the physical demands presented by patients’ size, weight, illness and positioning are not always amenable to nurses’ control and they are not always able to modify the physical environment and the way their duties are structured. Accordingly, it is important for institutional managers and nursing supervisors to be included in the educational programme so that, when making decisions about work environments, equipment and job assignments, factors making for “back friendly” working conditions can be considered. At the same time, deployment of staff, with particular reference to nurse-patient ratios and the availability of “helping hands”, must be appropriate to the nurses’ well-being as well as consistent with the care concept, as hospitals in the Scandinavian countries seem to have managed to do in exemplary fashion. This is becoming ever more important where fiscal constraints dictate staff reductions and cut-backs in equipment procurement and maintenance.
Recently developed holistic concepts, which see such training not simply as instruction in bedside lifting and carrying techniques but rather as movement programmes for both nurses and patients, could take the lead in future developments in this area. Approaches to “participatory ergonomics” and programmes of health advancement in hospitals (understood as organizational development) must also be more intensively discussed and researched as future strategies (see article “Hospital ergonomics: A review”).
Since psychosocial stress factors also exercise a moderating function in the perception and mastery of the physical demands made by work, prevention programmes should also ensure that colleagues and superiors foster satisfaction with work, avoid making excessive demands on the mental and physical capacities of workers and provide an appropriate level of social support.
Preventive measures should extend beyond professional life to include work in the home (housekeeping and caring for small children who have to be lifted and carried are particular hazards) as well as in sports and other recreational activities. Individuals with persistent or recurrent back pain, however it is acquired, should be no less diligent in following an appropriate preventive regimen.
Rehabilitation
The key to a rapid recovery is early mobilization and a prompt resumption of activities within the limits of tolerance and comfort. Most patients with acute back injuries recover fully and return to their usual work without incident. Resumption of an unrestricted range of activity should not be undertaken until exercises have fully restored muscle strength and flexibility and banished the fear and temerity that make for recurrent injury. Many individuals exhibit a tendency to recurrences and chronicity; for these, physiotherapy coupled with exercise and control of psychosocial factors will often be helpful. It is important that they return to some form of work as quickly as possible. Temporary elimination of more strenuous tasks and limitation of hours, with a graduated return to unrestricted activity, will promote a more complete recovery in these cases.
Fitness for work
The professional literature attributes only a very limited prognostic value to screening done before employees start work (US Preventive Services Task Force 1989). Ethical considerations and laws such as the Americans with Disabilities Act militate against pre-employment screening. It is generally agreed that pre-employment back x rays have no value, particularly when one considers their cost and the needless exposure to radiation. Newly hired nurses and other health workers and those returning from an episode of disability due to back pain should be evaluated to detect any predisposition to this problem and provided with access to educational and physical conditioning programmes that will prevent it.
Conclusion
The social and economic impact of back pain, a problem particularly prevalent among nurses, can be minimized by the application of ergonomic principles and technology in the organization of their work and its environment, by physical conditioning that enhances the strength and flexibility of the postural muscles, by education and training in the performance of problematic activities and, when episodes of back pain do occur, by treatment that emphasizes a minimum of medical intervention and a prompt return to activity.
The Physical Work Environment
Several countries have established recommended noise, temperature and lighting levels for hospitals. These recommendations are, however, rarely included in the specifications given to hospital designers. Further, the few studies examining these variables have reported disquieting levels.
Noise
In hospitals, it is important to distinguish between machine-generated noise capable of impairing hearing (above 85 dBA) and noise which is associated with a degradation of ambiance, administrative work and care (65 to 85 dBA).
Machine-generated noise capable of impairing hearing
Prior to the 1980s, a few publications had already drawn attention to this problem. Van Wagoner and Maguire (1977) evaluated the incidence of hearing loss among 100 employees in an urban hospital in Canada. They identified five zones in which noise levels were between 85 and 115 dBA: the electrical plant, laundry, dish-washing station and printing department and areas where maintenance workers used hand or power tools. Hearing loss was observed in 48% of the 50 workers active in these noisy areas, compared to 6% of workers active in quieter areas.
Yassi et al. (1992) conducted a preliminary survey to identify zones with dangerously high noise levels in a large Canadian hospital. Integrated dosimetry and mapping were subsequently used to study these high-risk areas in detail. Noise levels exceeding 80 dBA were common. The laundry, central processing, nutrition department, rehabilitation unit, stores and electrical plant were all studied in detail. Integrated dosimetry revealed levels of up to 110 dBA at some of these locations.
Noise levels in a Spanish hospital’s laundry exceeded 85 dBA at all workstations and reached 97 dBA in some zones (Montoliu et al. 1992). Noise levels of 85 to 94 dBA were measured at some workstations in a French hospital’s laundry (Cabal et al. 1986). Although machine re-engineering reduced the noise generated by pressing machines to 78 dBA, this process was not applicable to other machines, due to their inherent design.
A study in the United States reported that electrical surgical instruments generate noise levels of 90 to 100 dBA (Willet 1991). In the same study, 11 of 24 orthopaedic surgeons were reported to suffer from significant hearing loss. The need for better instrument design was emphasized. Vacuum and monitor alarms have been reported to generate noise levels of up to 108 dBA (Hodge and Thompson 1990).
Noise associated with a degradation of ambiance, administrative work and care
A systematic review of noise levels in six Egyptian hospitals revealed the presence of excessive levels in offices, waiting rooms and corridors (Noweir and al-Jiffry 1991). This was attributed to the characteristics of hospital construction and of some of the machines. The authors recommended the use of more appropriate building materials and equipment and the implementation of good maintenance practices.
Work in the first computerized facilities was hindered by the poor quality of printers and the inadequate acoustics of offices. In the Paris region, groups of cashiers talked to their clients and processed invoices and payments in a crowded room whose low plaster ceiling had no acoustic absorption capacity. Noise levels with only one printer active (in practice, all four usually were) were 78 dBA for payments and 82 dBA for invoices.
In a 1992 study of a rehabilitation gymnasium consisting of 8 cardiac rehabilitation bicycles surrounded by four private patient areas, noise levels of 75 to 80 dBA and 65 to 75 dBA were measured near cardiac rehabilitation bicycles and in the neighbouring kinesiology area, respectively. Levels such as these render personalized care difficult.
Shapiro and Berland (1972) viewed noise in operating theatres as the “third pollution”, since it increases the fatigue of surgeons, exerts physiological and psychological effects and influences the accuracy of movements. Noise levels were measured during a cholecystectomy and during tubal ligation. Irritating noises were associated with the opening of a package of gloves (86 dBA), the installation of a platform on the floor (85 dBA), platform adjustment (75 to 80 dBA), the placing of surgical instruments upon each other (80 dBA), the suctioning of the patient’s trachea (78 dBA), the continuous suction bottle (75 to 85 dBA) and the heels of nurses’ shoes (68 dBA). The authors recommended the use of heat-resistant plastic, less noisy instruments and, to minimize reverberation, easily cleaned materials other than ceramic or glass for walls, tiles and ceilings.
Noise levels of 51 to 82 dBA and 54 to 73 dBA have been measured in the centrifuge room and automated analyser room of a medical analytical laboratory. The Leq (reflecting full-shift exposure) at the control station was 70.44 dBA, with 3 hours over 70 dBA. At the technical station, the Leq was 72.63 dBA, with 7 hours over 70 dBA. The following improvements were recommended: installing telephones with adjustable ring levels, grouping centrifuges in a closed room, moving photocopiers and printers and installing hutches around the printers.
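For readers unfamiliar with the measure, the Leq is the steady sound level that would deliver the same acoustic energy over a period T as the fluctuating level actually observed. In its standard form,

$$L_{eq} = 10 \log_{10}\!\left(\frac{1}{T}\int_{0}^{T}\frac{p^{2}(t)}{p_{0}^{2}}\,dt\right)\ \mathrm{dB}, \qquad p_{0} = 20\ \mu\mathrm{Pa},$$

where $p(t)$ is the (A-weighted, in the case of dBA) instantaneous sound pressure. Because the average is taken over energy rather than over decibel values, relatively brief loud events can dominate a full-shift Leq.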
Patient Care and Comfort
In several countries, recommended noise limits for care units are 35 dBA at night and 40 dBA during the day (Turner, King and Craddock 1975). Falk and Woods (1973) were the first to draw attention to this point, in their study of noise levels and sources in neonatology incubators, recovery rooms and two rooms in an intensive-care unit. The following mean levels were measured over a 24-hour period: 57.7 dBA (74.5 dB linear) in the incubators, 65.5 dBA (80 dB linear) at the head of patients in the recovery room, 60.1 dBA (73.3 dB linear) in the intensive-care unit and 55.8 dBA (68.1 dB linear) in one patient room. Noise levels in the recovery room and intensive-care unit were correlated with the number of nurses. The authors emphasized the probable stimulation of patients’ hypophyseal-corticoadrenal system by these noise levels, and the resultant increase in peripheral vasoconstriction. There was also some concern about the hearing of patients receiving aminoglycoside antibiotics. These noise levels were considered incompatible with sleep.
Several studies, most of which have been conducted by nurses, have shown that noise control improves patient recovery and quality of life. Reports of research conducted in neonatology wards caring for low-birth-weight babies emphasized the need to reduce the noise caused by personnel, equipment and radiology activities (Green 1992; Wahlen 1992; Williams and Murphy 1991; Oëler 1993; Lotas 1992; Halm and Alpen 1993). Halm and Alpen (1993) have studied the relationship between noise levels in intensive-care units and the psychological well-being of patients and their families (extending, in extreme cases, even to post-resuscitation psychosis). The effect of ambient noise on the quality of sleep has been rigorously evaluated under experimental conditions (Topf 1992). In intensive-care units, the playing of pre-recorded sounds was associated with a deterioration of several sleep parameters.
A multi-ward study reported peak noise levels at the head of patients in excess of 80 dBA, especially in intensive- and respiratory-care units (Meyer et al. 1994). Lighting and noise levels were recorded continuously over seven consecutive days in a medical intensive-care unit, one-bed and multi-bed rooms in a respiratory-care unit and a private room. Noise levels were very high in all cases. The number of peaks exceeding 80 dBA was particularly high in the intensive- and respiratory-care units, with a maximum observed between 12:00 and 18:00 and a minimum between 00:00 and 06:00. Sleep deprivation and fragmentation were considered to have a negative impact on the respiratory system of patients and impair the weaning of patients from mechanical ventilation.
Blanpain and Estryn-Béhar (1990) found few noisy machines such as waxers, ice machines and hotplates in their study of ten Paris-area wards. However, the size and surfaces of the rooms could either reduce or amplify the noise generated by these machines, as well as that (albeit lower) generated by passing cars, ventilation systems and alarms. Noise levels in excess of 45 dBA (observed in 7 of 10 wards) did not promote patient rest. Furthermore, noise disturbed hospital personnel performing very precise tasks requiring close attention. In five of 10 wards, noise levels at the nursing station reached 65 dBA; in two wards, levels of 73 dBA were measured. Levels in excess of 65 dBA were measured in three pantries.
In some cases, architectural decorative effects were instituted with no thought to their effect on acoustics. For example, glass walls and ceilings have been in fashion since the 1970s and have been used in patient admission open-space offices. The resultant noise levels do not contribute to the creation of a calm environment in which patients about to enter the hospital can fill out forms. Fountains in this type of hall generated a background noise level of 73 dBA at the reception desk, requiring receptionists to ask one-third of people requesting information to repeat themselves.
Heat stress
Costa, Trinco and Schallenberg (1992) studied the effect of installing a laminar flow system, which maintained air sterility, on heat stress in an orthopaedic operating theatre. Temperature in the operating theatre increased by approximately 3 °C on average and could reach 30.2 °C. This was associated with a deterioration of the thermal comfort of operating-room personnel, who must wear very bulky clothes that favour heat retention.
Cabal et al. (1986) analysed heat stress in a hospital laundry in central France prior to its renovation. They noted that the relative humidity at the hottest workstation, the “gown-dummy”, was 30%, and radiant temperature reached 41 °C. Following installation of double-pane glass and reflective outside walls, and implementation of 10 to 15 air changes per hour, thermal comfort parameters fell within standard levels at all workstations, regardless of the weather outside. A study of a Spanish hospital laundry has shown that high wet-bulb temperatures result in oppressive work environments, especially in ironing areas, where temperatures may exceed 30 °C (Montoliu et al. 1992).
Blanpain and Estryn-Béhar (1990) characterized the physical work environment in ten wards whose work content they had already studied. Temperature was measured twice in each of ten wards. The nocturnal temperature in patient rooms may be below 22 °C, as patients use covers. During the day, as long as patients are relatively inactive, a temperature of 24 °C is acceptable but should not be exceeded, since some nursing interventions require significant exertion.
The following temperatures were observed between 07:00 and 07:30: 21.5 °C in geriatric wards, 26 °C in a non-sterile room in the haematology ward. At 14:30 on a sunny day, the temperatures were as follows: 23.5 °C in the emergency room and 29 °C in the haematology ward. Afternoon temperatures exceeded 24 °C in 9 of 19 cases. The relative humidity in four out of five wards with general air-conditioning was below 45% and was below 35% in two wards.
Afternoon temperature also exceeded 22 °C at all nine care preparation stations and 26 °C at three care stations. The relative humidity was below 45% in all five stations of wards with air-conditioning. In the pantries, temperatures ranged between 18 °C and 28.5 °C.
Temperatures of 22 °C to 25 °C were measured at the urine drains, where there were also odour problems and where dirty laundry was sometimes stored. Temperatures of 23 °C to 25 °C were measured in the two dirty-laundry closets; a temperature of 18 °C would be more appropriate.
Complaints concerning thermal comfort were frequent in a survey of 2,892 women working in Paris-area wards (Estryn-Béhar et al. 1989a). Complaints of being often or always hot were reported by 47% of morning- and afternoon-shift nurses and 37% of night-shift nurses. Although nurses were sometimes obliged to perform physically strenuous work, such as making several beds, the temperature in the various rooms was too high to perform these activities comfortably while wearing polyester-cotton clothes, which hinder evaporation, or gowns and masks necessary for the prevention of nosocomial infections.
On the other hand, 46% of night-shift nurses and 26% of morning- and afternoon-shift nurses reported being often or always cold. The proportions reporting never suffering from the cold were 11% and 26%, respectively.
To conserve energy, the heating in hospitals was often lowered during the night, when patients are under covers. However, nurses, who must remain alert despite chronobiologically mediated drops in core body temperature, were required to put on jackets (not always very hygienic ones) around 04:00. At the end of the study, some wards installed adjustable space-heating at nursing stations.
Studies of 1,505 women in 26 units conducted by occupational physicians revealed that rhinitis and eye irritation were more frequent among nurses working in air-conditioned rooms (Estryn-Béhar and Poinsignon 1989) and that work in air-conditioned environments was related to an almost twofold increase in dermatoses likely to be occupational in origin (adjusted odds ratio of 2) (Delaporte et al. 1990).
Lighting
Several studies have shown that the importance of good lighting is still underestimated in administrative and general departments of hospitals.
Cabal et al. (1986) observed that lighting levels at half of the workstations in a hospital laundry were no higher than 100 lux. Lighting levels following renovations were 300 lux at all workstations, 800 lux at the darning station and 150 lux between the washing tunnels.
Blanpain and Estryn-Béhar (1990) observed maximum night lighting levels below 500 lux in 9 out of 10 wards. Lighting levels were below 250 lux in five pharmacies with no natural lighting and were below 90 lux in three pharmacies. It should be recalled that the difficulty in reading small lettering on labels experienced by older persons may be mitigated by increasing the level of illumination.
Building orientation can result in high day-time lighting levels that disturb patients’ rest. For example, in geriatric wards, beds furthest from the windows received 1,200 lux, while those nearest the windows received 5,000 lux. The only window shading available in these rooms was solid window blinds, and nurses were unable to dispense care in four-bed rooms when these were drawn. In some cases, nurses stuck paper on the windows to provide patients with some relief.
The lighting in some intensive-care units is too intense to allow patients to rest (Meyer et al. 1994). The effect of lighting on patients’ sleep has been studied in neonatology wards by North American and German nurses (Oëler 1993; Boehm and Bollinger 1990).
In one hospital, surgeons disturbed by reflections from white tiles requested the renovation of the operating theatre. Lighting levels outside the shadow-free zone (15,000 to 80,000 lux) were reduced. However, this resulted in levels of only 100 lux at the instrument nurses’ work surface, 50 to 150 lux at the wall unit used for equipment storage, 70 lux at the patients’ head and 150 lux at the anaesthetists’ work surface. To avoid generating glare capable of affecting the accuracy of surgeons’ movements, lamps were installed outside of surgeons’ sight-lines. Rheostats were installed to control lighting levels at the nurses’ work surface between 300 and 1,000 lux and general levels between 100 and 300 lux.
Construction of a hospital with extensive natural lighting
In 1981, planning for the construction of Saint Mary’s Hospital on the Isle of Wight began with a goal of halving energy costs (Burton 1990). The final design called for extensive use of natural lighting and incorporated double-pane windows that could be opened in the summer. Even the operating theatre has an outside view and paediatric wards are located on the ground floor to allow access to play areas. The other wards, on the second and third (top) floors, are equipped with windows and ceiling lighting. This design is quite suitable for temperate climates but may be problematic where ice and snow inhibit overhead lighting or where high temperatures may lead to a significant greenhouse effect.
Architecture and Working Conditions
Flexible design is not multi-functionality
Prevailing concepts from 1945 to 1985, in particular the fear of instant obsolescence, were reflected in the construction of multi-purpose hospitals composed of identical modules (Games and Taton-Braen 1987). In the United Kingdom this trend led to the development of the “Harness system”, whose first product was the Dudley Hospital, built in 1974. Seventy other hospitals were later built on the same principles. In France, several hospitals were constructed on the “Fontenoy” model.
Building design should not prevent modifications necessitated by the rapid evolution of therapeutic practice and technology. For example, partitions, fluid circulation subsystems and technical duct-work should all be capable of being easily moved. However, this flexibility should not be construed as an endorsement of the goal of complete multi-functionality—a design goal which leads to the construction of facilities poorly suited to any speciality. For example, the surface area needed to store machines, bottles, disposable equipment and medication is different in surgical, cardiology and geriatric wards. Failure to recognize this will lead to rooms being used for purposes they were not designed for (e.g., bathrooms being used for bottle storage).
The Loma Linda Hospital in California (United States) is an example of better hospital design and has been copied elsewhere. Here, nursing and technical medicine departments are located above and below technical floors; this “sandwich” structure permits easy maintenance and adjustment of fluid circulation.
Unfortunately, hospital architecture does not always reflect the needs of those who work there, and multi-functional design has been responsible for reported problems related to physical and cognitive strain. Consider a 30-bed ward composed of one- and two-bed rooms, in which there is only one functional area of each type (nursing station, pantry, storage of disposable materials, linen or medication), all based on the same all-purpose design. In this ward, the management and dispensation of care obliges nurses to change location extremely frequently, and work is greatly fragmented. A comparative study of ten wards has shown that the distance from the nurses’ station to the farthest room is an important determinant of both nurses’ fatigue (a function of the distance walked) and the quality of care (a function of the time spent in patients’ rooms) (Estryn-Béhar and Hakim-Serfaty 1990).
This discrepancy between the architectural design of spaces, corridors and materials, on the one hand, and the realities of hospital work, on the other, has been characterized by Patkin (1992), in a review of Australian hospitals, as an ergonomic “debacle”.
Preliminary analysis of the spatial organization in nursing areas
The first mathematical model of the nature, purposes and frequency of staff movements, based on the Yale Traffic Index, appeared in 1960 and was refined by Lippert in 1971. However, attention to one problem in isolation may in fact aggravate others. For example, locating a nurses’ station in the centre of the building, in order to reduce the distances walked, may worsen working conditions if nurses must spend over 30% of their time in such windowless surroundings, known to be a source of problems related to lighting, ventilation and psychological factors (Estryn-Béhar and Milanini 1992).
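To make the idea of such traffic modelling concrete, the sketch below computes a crude total walking distance per shift from trip frequencies and distances between a nurses’ station and a ward’s functional areas. It is a minimal illustration under assumed figures, not a reproduction of the Yale Traffic Index or of Lippert’s refinement; all the area names, distances and trip counts are hypothetical.

```python
# Illustrative sketch only: a simple walking-distance model in the spirit of
# (but not identical to) staff-traffic indices. All values are hypothetical,
# not measurements from any cited study.

# Distance (metres) from the nurses' station to each functional area.
distances = {
    "patient_room_near": 8,
    "patient_room_far": 35,
    "pantry": 12,
    "medication_store": 6,
    "linen_store": 20,
}

# Assumed number of trips per nurse per 8-hour shift.
trips_per_shift = {
    "patient_room_near": 40,
    "patient_room_far": 25,
    "pantry": 10,
    "medication_store": 30,
    "linen_store": 8,
}

def total_walking_distance(distances, trips):
    """Total metres walked per shift, counting each trip as a round trip."""
    return sum(2 * distances[area] * trips[area] for area in trips)

metres = total_walking_distance(distances, trips_per_shift)
print(f"Estimated walking distance per shift: {metres} m (~{metres / 1000:.1f} km)")
# Moving a heavily used area (e.g., the medication store) closer to the
# station reduces the total; the model makes such layout trade-offs explicit,
# but it ignores the lighting and psychological costs of windowless central
# stations discussed in the text.
```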
The distance of the preparation and storage areas from patients is less problematic in settings with a high staff-patient ratio and where the existence of a centralized preparation area facilitates the delivery of supplies several times per day, even on holidays. In addition, long waits for elevators are less common in high-rise hospitals with over 600 beds, where the number of elevators is not limited by financial constraints.
Research on the design of specific but flexible hospital units
In the United Kingdom in the late 1970s, the Health Ministry created a team of ergonomists to compile a database on ergonomics training and on the ergonomic layout of hospital work areas (Haigh 1992). Noteworthy examples of the success of this programme include the modification of the dimensions of laboratory furniture to take into account the demands of microscopy work and the redesign of maternity rooms to take into account nurses’ work and mothers’ preferences.
Cammock (1981) emphasized the need to provide distinct nursing, public and common areas, with separate entrances for nursing and public areas, and separate connections between these areas and the common area. Furthermore, there should be no direct contact between the public and nursing areas.
The Krankenanstalt Rudolfsstiftung is the first pilot hospital of the “European Healthy Hospitals” project. The Viennese pilot project consists of eight sub-projects, one of which, the “Service Reorganization” project, is an attempt, in collaboration with ergonomists, to promote functional reorganization of available space (Pelikan 1993). For example, all the rooms in an intensive care unit were renovated and rails for patient lifts installed in the ceilings of each room.
A comparative analysis of 90 Dutch hospitals suggests that small units (floors of less than 1,500 m²) are the most efficient, as they allow nurses to tailor their care to the specifics of patients’ occupational therapy and family dynamics (Van Hogdalem 1990). This design also increases the time nurses can spend with patients, since they waste less time in changes of location and are less subject to uncertainty. Finally, the use of small units reduces the number of windowless work areas.
A study carried out in the health administration sector in Sweden reported better employee performance in buildings incorporating individual offices and conference rooms, as opposed to an open plan (Ahlin 1992). The existence in Sweden of an institute dedicated to the study of working conditions in hospitals, and of legislation requiring consultation with employee representatives both before and during all construction or renovation projects, has resulted in the regular recourse to participatory design based on ergonomic training and intervention (Tornquist and Ullmark 1992).
Architectural design based on participatory ergonomics
Workers must be involved in the planning of the behavioural and organizational changes associated with the occupation of a new work space. The adequate organization and equipping of a workplace requires taking into account the organizational elements that require modification or emphasis. Two detailed examples taken from two hospitals illustrate this.
Estryn-Béhar et al. (1994) report the results of the renovation of the common areas of a medical ward and a cardiology ward of the same hospital. The ergonomics of the work performed by each profession in each ward was observed over seven entire workdays and discussed over a two-day period with each group. The groups included representatives of all occupations (department heads, supervisors, interns, nurses, nurses’ aides, orderlies) from all the shifts. One entire day was spent developing architectural and organizational proposals for each problem noted. Two more days were spent on the simulation of characteristic activities by the entire group, in collaboration with an architect and an ergonomist, using modular cardboard mock-ups and scale models of objects and people. Through this simulation, representatives of the various occupations were able to agree on distances and the distribution of space within each ward. Only after this process was concluded was the design specification drawn up.
The same participatory method was used in a cardiac intensive-care unit in another hospital (Estryn-Béhar et al. 1995a, 1995b). It was found that four types of virtually incompatible activities were conducted at the nursing station: meeting, writing and monitoring; the preparation of clean materials; the handling of dirty materials; and storage.
These zones overlapped, and nurses had to cross the meeting-writing-monitoring area to reach the other areas. Because of the position of the furniture, nurses had to change direction three times to get to the drain-board. Patient rooms were laid out along a corridor, both for regular intensive care and highly intensive care. The storage units were located at the far end of the ward from the nursing station.
In the new layout, the station’s longitudinal orientation of functions and traffic is replaced with a lateral one which allows direct and central circulation in a furniture-free area. The meeting-writing-monitoring area is now located at the end of the room, where it offers a calm space near windows, while remaining accessible. The clean and dirty preparation areas are located by the entrance to the room and are separated from each other by a large circulation area. The highly intensive care rooms are large enough to accommodate emergency equipment, a preparation counter and a deep washbasin. A glass wall installed between the preparation areas and the highly intensive care rooms ensures that patients in these rooms are always visible. The main storage area was rationalized and reorganized. Plans are available for each work and storage area.
Architecture, ergonomics and developing countries
These problems are also found in developing countries; in particular, renovations there frequently involve the elimination of common rooms. The performance of ergonomic analysis would identify existing problems and help avoid new ones. For example, the construction of wards comprised of only one- or two-bed rooms increases the distances that personnel must travel. Inadequate attention to staffing levels and the layout of nursing stations, satellite kitchens, satellite pharmacies and storage areas may lead to significant reductions in the amount of time nurses spend with patients and may render work organization more complex.
Furthermore, the application in developing countries of the multi-functional hospital model of developed countries does not take into account different cultures’ attitudes toward space utilization. Manuaba (1992) has pointed out that the layout of developed countries’ hospital rooms and the type of medical equipment used is poorly suited to developing countries, and that the rooms are too small to comfortably accommodate visitors, essential partners in the curative process.
Hygiene and Ergonomics
In hospital settings, many breaches of asepsis can be understood and corrected only by reference to work organization and work space. Effective implementation of the necessary modifications requires detailed ergonomic analysis. This analysis serves to characterize the interdependencies of team tasks, rather than their individual characteristics, and identify discrepancies between real and nominal work, especially nominal work described in official protocols.
Hand-mediated contamination was one of the first targets in the fight against nosocomial infections. In theory, hands should be systematically washed on entering and leaving patients’ rooms. Although initial and ongoing training of nurses emphasizes the results of descriptive epidemiological studies, research indicates persistent problems associated with hand-washing. In a study conducted in 1987 and involving continuous observation of entire 8-hour shifts in 10 wards, Delaporte et al. (1990) observed an average of 17 hand-washings by morning-shift nurses, 13 by afternoon-shift nurses and 21 by night-shift nurses.
Nurses washed their hands one-half to one-third as often as is recommended for their number of patient contacts (without even considering care-preparation activities); for nurses’ aides, the ratio was one-third to one-fifth. Hand-washing before and after each activity is, however, clearly impossible, in terms of both time and skin damage, given the atomization of activity, number of technical interventions and frequency of interruptions and attendant repetition of care that personnel must cope with. Reduction of work interruptions is thus essential and should take precedence over simply reaffirming the importance of hand-washing, which, in any event, cannot be performed over 25 to 30 times per day.
Similar patterns of hand-washing were found in a study based on observations collected over 14 entire workdays in 1994 during the reorganization of the common areas of two university hospital wards (Estryn-Béhar et al. 1994). In every case, nurses would have been incapable of dispensing the required care if they had returned to the nursing station to wash their hands. In short-term-stay units, for example, almost all the patients have blood samples drawn and subsequently receive oral and intravenous medication at virtually the same time. The density of activities at certain times also renders appropriate hand-washing impossible: in one case, an afternoon-shift nurse responsible for 13 patients in a medical ward entered patients’ rooms 21 times in one hour. Poorly organized information provision and transmission structures contributed to the number of visits he was obliged to perform. Given the impossibility of washing his hands 21 times in one hour, the nurse washed them only when dealing with the most fragile patients (i.e., those suffering from pulmonary failure).
Ergonomically based architectural design takes several factors affecting hand-washing into account, especially those concerning the location and access to wash-basins, but also the implementation of truly functional “dirty” and “clean” circuits. Reduction of interruptions through participatory analysis of organization helps to make hand-washing possible.
" DISCLAIMER: The ILO does not take responsibility for content presented on this web portal that is presented in any language other than English, which is the language used for the initial production and peer-review of original content. Certain statistics have not been updated since the production of the 4th edition of the Encyclopaedia (1998)."