
Goals, Definitions and General Information

Work is essential for life, development and personal fulfilment. Unfortunately, indispensable activities such as food production, extraction of raw materials, manufacturing of goods, energy production and services involve processes, operations and materials which can, to a greater or lesser extent, create hazards to the health of workers and those in nearby communities, as well as to the general environment.

However, the generation and release of harmful agents in the work environment can be prevented through adequate hazard control interventions, which not only protect workers’ health but also limit the damage to the environment often associated with industrialization. If a harmful chemical is eliminated from a work process, it will neither affect the workers nor go beyond the workplace to pollute the environment.

The profession that aims specifically at the prevention and control of hazards arising from work processes is occupational hygiene. The goals of occupational hygiene include the protection and promotion of workers’ health, the protection of the environment and contribution to a safe and sustainable development.

The need for occupational hygiene in the protection of workers’ health cannot be overemphasized. Even when feasible, the diagnosis and the cure of an occupational disease will not prevent further occurrences, if exposure to the aetiological agent does not cease. So long as the unhealthy work environment remains unchanged, its potential to impair health remains. Only the control of health hazards can break the vicious circle illustrated in figure 1.

Figure 1. Interactions between people and the environment


However, preventive action should start much earlier, not only before the manifestation of any health impairment but even before exposure actually occurs. The work environment should be under continuous surveillance so that hazardous agents and factors can be detected and removed, or controlled, before they cause any ill effects; this is the role of occupational hygiene.

Furthermore, occupational hygiene may also contribute to safe and sustainable development, that is, “to ensure that (development) meets the needs of the present without compromising the ability of future generations to meet their own needs” (World Commission on Environment and Development 1987). Meeting the needs of the present world population without depleting or damaging the global resource base, and without causing adverse health and environmental consequences, requires knowledge and the means to influence action (WHO 1992a); where work processes are concerned, this is closely linked to occupational hygiene practice.

Occupational health requires a multidisciplinary approach and involves fundamental disciplines, one of which is occupational hygiene, along with others which include occupational medicine and nursing, ergonomics and work psychology. A schematic representation of the scopes of action for occupational physicians and occupational hygienists is presented in figure 2.

Figure 2. Scopes of action for occupational physicians and occupational hygienists.


It is important that decision makers, managers and workers themselves, as well as all occupational health professionals, understand the essential role that occupational hygiene plays in the protection of workers’ health and of the environment, as well as the need for specialized professionals in this field. The close link between occupational and environmental health should also be kept in mind, since the prevention of pollution from industrial sources, through the adequate handling and disposal of hazardous effluents and waste, should be started at the workplace level. (See “Evaluation of the work environment”).

 

 

 

 

Concepts and Definitions

Occupational hygiene

Occupational hygiene is the science of the anticipation, recognition, evaluation and control of hazards arising in or from the workplace, and which could impair the health and well-being of workers, also taking into account the possible impact on the surrounding communities and the general environment.

Definitions of occupational hygiene may be presented in different ways; however, they all have essentially the same meaning and aim at the same fundamental goal of protecting and promoting the health and well-being of workers, as well as protecting the general environment, through preventive actions in the workplace.

Occupational hygiene is not yet universally recognized as a profession; however, in many countries, framework legislation is emerging that will lead to its establishment.


Occupational hygienist

 An occupational hygienist is a professional able to:

  • anticipate the health hazards that may result from work processes, operations and equipment, and accordingly advise on their planning and design
  • recognize and understand, in the work environment, the occurrence (real or potential) of chemical, physical and biological agents and other stresses, and their interactions with other factors, which may affect the health and well-being of workers
  • understand the possible routes of agent entry into the human body, and the effects that such agents and other factors may have on health
  • assess workers’ exposure to potentially harmful agents and factors and evaluate the results
  • evaluate work processes and methods, from the point of view of the possible generation and release/propagation of potentially harmful agents and other factors, with a view to eliminating exposures, or reducing them to acceptable levels
  • design, recommend for adoption, and evaluate the effectiveness of control strategies, alone or in collaboration with other professionals to ensure effective and economical control
  • participate in overall risk analysis and management of an agent, process or workplace, and contribute to the establishment of priorities for risk management
  • understand the legal framework for occupational hygiene practice in their own country
  • educate, train, inform and advise persons at all levels, in all aspects of hazard communication
  • work effectively in a multidisciplinary team involving other professionals
  • recognize agents and factors that may have environmental impact, and understand the need to integrate occupational hygiene practice with environmental protection.

 

It should be kept in mind that a profession consists not only of a body of knowledge, but also of a Code of Ethics; national occupational hygiene associations, as well as the International Occupational Hygiene Association (IOHA), have their own Codes of Ethics (WHO 1992b).  


 

Occupational hygiene technician

An occupational hygiene technician is “a person competent to carry out measurements of the work environment” but not “to make the interpretations, judgements, and recommendations required from an occupational hygienist”. The necessary level of competence may be obtained in a comprehensive or limited field (WHO 1992b).

International Occupational Hygiene Association (IOHA)

IOHA was formally established during a meeting in Montreal on June 2, 1987. At present, IOHA has the participation of 19 national occupational hygiene associations, with over 19,000 members from 17 countries.

The primary objective of IOHA is to promote and develop occupational hygiene throughout the world, at a high level of professional competence, through means that include the exchange of information among organizations and individuals, the further development of human resources and the promotion of a high standard of ethical practice. IOHA activities include scientific meetings and publication of a newsletter. Members of affiliated associations are automatically members of IOHA; it is also possible to join as an individual member, for those in countries where there is not yet a national association.

Certification

In addition to an accepted definition of occupational hygiene and of the role of the occupational hygienist, there is need for the establishment of certification schemes to ensure acceptable standards of occupational hygiene competence and practice. Certification refers to a formal scheme based on procedures for establishing and maintaining knowledge, skills and competence of professionals (Burdorf 1995).

IOHA has promoted a survey of existing national certification schemes (Burdorf 1995), together with recommendations for the promotion of international cooperation in assuring the quality of professional occupational hygienists, which include the following:

  • “the harmonization of standards on the competence and practice of professional occupational hygienists”
  • “the establishment of an international body of peers to review the quality of existing certification schemes”.

 

Other suggestions in this report include items such as: “reciprocity” and “cross-acceptance of national designations, ultimately aiming at an umbrella scheme with one internationally accepted designation”.

The Practice of Occupational Hygiene

The classical steps in occupational hygiene practice are:

  • the recognition of the possible health hazards in the work environment
  • the evaluation of hazards, which is the process of assessing exposure and reaching conclusions as to the level of risk to human health
  • prevention and control of hazards, which is the process of developing and implementing strategies to eliminate, or reduce to acceptable levels, the occurrence of harmful agents and factors in the workplace, while also accounting for environmental protection.

 

The ideal approach to hazard prevention is “anticipated and integrated preventive action”, which should include:

  • occupational health and environmental impact assessments, prior to the design and installation of any new workplace
  • selection of the safest, least hazardous and least polluting technology (“cleaner production”)
  • environmentally appropriate location
  • proper design, with adequate layout and appropriate control technology, including for the safe handling and disposal of the resulting effluents and waste
  • elaboration of guidelines and regulations for training on the correct operation of processes, including on safe work practices, maintenance and emergency procedures.

 

The importance of anticipating and preventing all types of environmental pollution cannot be overemphasized. There is, fortunately, an increasing tendency to consider new technologies from the point of view of the possible negative impacts and their prevention, from the design and installation of the process to the handling of the resulting effluents and waste, in the so-called cradle-to-grave approach. Environmental disasters, which have occurred in both developed and developing countries, could have been avoided by the application of appropriate control strategies and emergency procedures in the workplace.

Economic aspects should be viewed in broader terms than the usual initial cost consideration; more expensive options that offer good health and environmental protection may prove to be more economical in the long run. The protection of workers’ health and of the environment must start much earlier than it usually does. Technical information and advice on occupational and environmental hygiene should always be available to those designing new processes, machinery, equipment and workplaces. Unfortunately such information is often made available much too late, when the only solution is costly and difficult retrofitting, or worse, when consequences have already been disastrous.

Recognition of hazards

Recognition of hazards is a fundamental step in the practice of occupational hygiene, indispensable for the adequate planning of hazard evaluation and control strategies, as well as for the establishment of priorities for action. For the adequate design of control measures, it is also necessary to physically characterize contaminant sources and contaminant propagation paths.

The recognition of hazards leads to the determination of:

  • which agents may be present and under which circumstances
  • the nature and possible extent of associated adverse effects on health and well-being.

 

The identification of hazardous agents, their sources and the conditions of exposure requires extensive knowledge and careful study of work processes and operations, raw materials and chemicals used or generated, final products and eventual by-products, as well as of possibilities for the accidental formation of chemicals, decomposition of materials, combustion of fuels or the presence of impurities. The recognition of the nature and potential magnitude of the biological effects that such agents may cause if overexposure occurs requires knowledge of, and access to, toxicological information. International sources of information in this respect include the International Programme on Chemical Safety (IPCS), the International Agency for Research on Cancer (IARC) and the International Register of Potentially Toxic Chemicals of the United Nations Environment Programme (UNEP-IRPTC).

Agents which pose health hazards in the work environment include airborne contaminants; non-airborne chemicals; physical agents, such as heat and noise; biological agents; ergonomic factors, such as inadequate lifting procedures and working postures; and psychosocial stresses.

Occupational hygiene evaluations

Occupational hygiene evaluations are carried out to assess workers’ exposure, as well as to provide information for the design, or to test the efficiency, of control measures.

Evaluation of workers’ exposure to occupational hazards, such as airborne contaminants, physical and biological agents, is covered elsewhere in this chapter. Nevertheless, some general considerations are provided here for a better understanding of the field of occupational hygiene.

It is important to keep in mind that hazard evaluation is not an end in itself, but must be considered as part of a much broader procedure that starts with the realization that a certain agent, capable of causing health impairment, may be present in the work environment, and concludes with the control of this agent so that it will be prevented from causing harm. Hazard evaluation paves the way to, but does not replace, hazard prevention.

Exposure assessment

Exposure assessment aims at determining how much of an agent workers have been exposed to, how often and for how long. Guidelines in this respect have been established both at the national and international level—for example, EN 689, prepared by the Comité Européen de Normalisation (European Committee for Standardization) (CEN 1994).

In the evaluation of exposure to airborne contaminants, the most usual procedure is the assessment of inhalation exposure, which requires the determination of the air concentration of the agent to which workers are exposed (or, in the case of airborne particles, the air concentration of the relevant fraction, e.g., the “respirable fraction”) and the duration of the exposure. However, if routes other than inhalation contribute appreciably to the uptake of a chemical, an erroneous judgement may be made by looking only at the inhalation exposure. In such cases, total exposure has to be assessed, and a very useful tool for this is biological monitoring.
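
As a simple illustration of how inhalation exposure data are usually combined, the sketch below computes an 8-hour time-weighted average (TWA) concentration from measurements taken over successive task periods. It is a minimal sketch in Python; the concentrations, durations and reference period are assumed example values, not figures from any standard.

    # 8-hour time-weighted average (TWA) from task-period measurements.
    # All concentrations (mg/m3) and durations (hours) are assumed example values.
    measurements = [
        (2.0, 3.0),   # (concentration, duration): e.g., work at the main source
        (0.5, 4.0),   # lower-exposure work in the same hall
        (0.0, 1.0),   # break spent outside the exposed area
    ]
    reference_period_h = 8.0  # full-shift reference period

    twa = sum(c * t for c, t in measurements) / reference_period_h
    print(f"8-h TWA = {twa:.2f} mg/m3")  # (2.0*3 + 0.5*4 + 0.0*1) / 8 = 1.00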

The practice of occupational hygiene is concerned with three kinds of situations:

  • initial studies to assess workers’ exposure
  • follow-up monitoring/surveillance
  • exposure assessment for epidemiological studies.

 

A primary reason for determining whether there is overexposure to a hazardous agent in the work environment is to decide whether interventions are required. This often, but not necessarily, means establishing whether there is compliance with an adopted standard, which is usually expressed in terms of an occupational exposure limit. The determination of the “worst exposure” situation may be enough to fulfil this purpose. Indeed, if exposures are expected to be either very high or very low in relation to accepted limit values, the accuracy and precision of quantitative evaluations can be lower than when the exposures are expected to be closer to the limit values. In fact, when hazards are obvious, it may be wiser to invest resources initially on controls and to carry out more precise environmental evaluations after controls have been implemented.

Follow-up evaluations are often necessary, particularly if control measures had to be installed or improved, or if changes in the processes or materials used were foreseen. In these cases, quantitative evaluations have an important surveillance role in:

  • evaluating the adequacy, testing the efficiency or disclosing possible failures in the control systems
  • detecting whether alterations in the processes, such as operating temperature, or in the raw materials, have altered the exposure situation.

 

Whenever an occupational hygiene survey is carried out in connection with an epidemiological study in order to obtain quantitative data on relationships between exposure and health effects, the exposure must be characterized with a high level of accuracy and precision. In this case, all exposure levels must be adequately characterized, since it would not be enough, for example, to characterize only the worst case exposure situation. It would be ideal, although difficult in practice, to always keep precise and accurate exposure assessment records since there may be a future need to have historical exposure data.

In order to ensure that evaluation data is representative of workers’ exposure, and that resources are not wasted, an adequate sampling strategy, accounting for all possible sources of variability, must be designed and followed. Sampling strategies, as well as measurement techniques, are covered in “Evaluation of the work environment”.

Interpretation of results

The degree of uncertainty in the estimation of an exposure parameter, for example, the true average concentration of an airborne contaminant, is determined through statistical treatment of the results from measurements (e.g., sampling and analysis). The level of confidence in the results will depend on the coefficient of variation of the “measuring system” and on the number of measurements. Once there is acceptable confidence, the next step is to consider the health implications of the exposure: what does it mean for the health of the exposed workers: now? in the near future? in their working life? will there be an impact on future generations?
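
As a rough illustration of such a statistical treatment, the sketch below computes the mean, the coefficient of variation and a 95% confidence interval for a small set of measured concentrations. It assumes, for simplicity, that the results can be treated as approximately normally distributed; in practice exposure data are often log-normal and the same reasoning is applied to the log-transformed values. All numbers are assumed examples.

    # Confidence interval for the mean of a set of exposure measurements (illustrative).
    import math
    import statistics

    results = [0.8, 1.2, 0.9, 1.5, 1.1]   # measured concentrations, mg/m3 (assumed values)
    n = len(results)
    mean = statistics.mean(results)
    sd = statistics.stdev(results)         # sample standard deviation
    cv = sd / mean                         # coefficient of variation of the results
    t95 = 2.776                            # Student's t for 95% confidence, n - 1 = 4 degrees of freedom
    half_width = t95 * sd / math.sqrt(n)

    print(f"mean = {mean:.2f} mg/m3, CV = {cv:.2f}")
    print(f"95% CI: {mean - half_width:.2f} to {mean + half_width:.2f} mg/m3")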

The evaluation process is only completed when results from measurements are interpreted in view of data (sometimes referred to as “risk assessment data”) derived from experimental toxicology, epidemiological and clinical studies and, in certain cases, clinical trials. It should be clarified that the term risk assessment has been used in connection with two types of assessments—the assessment of the nature and extent of risk resulting from exposure to chemicals or other agents, in general, and the assessment of risk for a particular worker or group of workers, in a specific workplace situation.

In the practice of occupational hygiene, exposure assessment results are often compared with adopted occupational exposure limits which are intended to provide guidance for hazard evaluation and for setting target levels for control. Exposure in excess of these limits requires immediate remedial action by the improvement of existing control measures or implementation of new ones. In fact, preventive interventions should be made at the “action level”, which varies with the country (e.g., one-half or one-fifth of the occupational exposure limit). A low action level is the best assurance of avoiding future problems.

Comparison of exposure assessment results with occupational exposure limits is a simplification, since, among other limitations, many factors which influence the uptake of chemicals (e.g., individual susceptibilities, physical activity and body build) are not accounted for by this procedure. Furthermore, in most workplaces there is simultaneous exposure to many agents; hence a very important issue is that of combined exposures and agent interactions, because the health consequences of exposure to a certain agent alone may differ considerably from the consequences of exposure to this same agent in combination with others, particularly if there is synergism or potentiation of effects.
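
The sketch below illustrates both points: each measured exposure is screened against its exposure limit and against an assumed action level of one-half of that limit, and a simple additive mixture index (the sum of the ratios of each concentration to its own limit) is computed, a convention often applied as a first approximation when the agents are assumed to act on the same organ or system. Agent names, concentrations and limits are illustrative assumptions, not values from any published standard.

    # Screening of measured 8-h TWA exposures against assumed exposure limits (illustrative).
    exposures = {                 # agent: (measured 8-h TWA, exposure limit), both in mg/m3
        "solvent A": (20.0, 100.0),
        "solvent B": (45.0, 60.0),
    }
    action_fraction = 0.5         # assumed action level at one-half of the limit

    mixture_index = 0.0
    for agent, (conc, limit) in exposures.items():
        ratio = conc / limit
        mixture_index += ratio    # additive convention for agents assumed to act similarly
        if conc > limit:
            print(f"{agent}: above the exposure limit - immediate remedial action")
        elif conc > action_fraction * limit:
            print(f"{agent}: above the action level - preventive intervention advisable")
        else:
            print(f"{agent}: below the action level")

    print(f"Additive mixture index = {mixture_index:.2f} (values above 1 suggest overexposure)")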

Measurements for control

Measurements with the purpose of investigating the presence of agents and the patterns of exposure parameters in the work environment can be extremely useful for the planning and design of control measures and work practices. The objectives of such measurements include:

  • source identification and characterization
  • spotting of critical points in closed systems or enclosures (e.g., leaks)
  • determination of propagation paths in the work environment
  • comparison of different control interventions
  • verification that respirable dust has settled together with the coarse visible dust, when using water sprays
  • checking that contaminated air is not coming from an adjacent area.

 

Direct-reading instruments are extremely useful for control purposes, particularly those which can be used for continuous sampling and reflect what is happening in real time, thus disclosing exposure situations which might not otherwise be detected and which need to be controlled. Examples of such instruments include: photo-ionization detectors, infrared analysers, aerosol meters and detector tubes. When sampling to obtain a picture of the behaviour of contaminants, from the source throughout the work environment, accuracy and precision are not as critical as they would be for exposure assessment.

Recent developments in this type of measurement for control purposes include visualization techniques, one of which is the Picture Mix Exposure—PIMEX (Rosen 1993). This method combines a video image of the worker with a scale showing airborne contaminant concentrations, which are continuously measured, at the breathing zone, with a real-time monitoring instrument, thus making it possible to visualize how the concentration varies while the task is performed. This provides an excellent tool for comparing the relative efficacy of different control measures, such as ventilation and work practices, thus contributing to better design.

Measurements are also needed to assess the efficiency of control measures. In this case, source sampling or area sampling are convenient, alone or in addition to personal sampling, for the assessment of workers’ exposure. In order to assure validity, the locations for “before” and “after” sampling (or measurements) and the techniques used should be the same, or equivalent, in sensitivity, accuracy and precision.

Hazard prevention and control

The primary goal of occupational hygiene is the implementation of appropriate hazard prevention and control measures in the work environment. Standards and regulations, if not enforced, are meaningless for the protection of workers’ health, and enforcement usually requires both monitoring and control strategies. The absence of legally established standards should not be an obstacle to the implementation of the necessary measures to prevent harmful exposures or control them to the lowest level feasible. When serious hazards are obvious, control should be recommended, even before quantitative evaluations are carried out. It may sometimes be necessary to change the classical concept of “recognition-evaluation-control” to “recognition-control-evaluation”, or even to “recognition-control”, if capabilities for evaluation of hazards do not exist. Some examples of hazards in obvious need of action without the necessity of prior environmental sampling are electroplating carried out in an unventilated, small room, or using a jackhammer or sand-blasting equipment with no environmental controls or protective equipment. For such recognized health hazards, the immediate need is control, not quantitative evaluation.

Preventive action should in some way interrupt the chain by which the hazardous agent—a chemical, dust, a source of energy—is transmitted from the source to the worker. There are three major groups of control measures: engineering controls, work practices and personal measures.

The most efficient hazard prevention approach is the application of engineering control measures, which prevent occupational exposures by managing the work environment, thus decreasing the need for initiatives on the part of workers or potentially exposed persons. Engineering measures usually require some process modification or mechanical structure, and involve technical measures that eliminate or reduce the use, generation or release of hazardous agents at their source. When source elimination is not possible, engineering measures should be designed to prevent or reduce the spread of hazardous agents into the work environment by:

  • containing them
  • removing them immediately beyond the source
  • interfering with their propagation
  • reducing their concentration or intensity.

 

Control interventions which involve some modification of the source are the best approach because the harmful agent can be eliminated or reduced in concentration or intensity. Source reduction measures include substitution of materials, substitution/modification of processes or equipment and better maintenance of equipment.

When source modifications are not feasible, or are not sufficient to attain the desired level of control, then the release and dissemination of hazardous agents in the work environment should be prevented by interrupting their transmission path through measures such as isolation (e.g., closed systems, enclosures), local exhaust ventilation, barriers and shields, and isolation of workers.

Other measures aiming at reducing exposures in the work environment include adequate workplace design, dilution or displacement ventilation, good housekeeping and adequate storage. Labelling and warning signs can assist workers in safe work practices. Monitoring and alarm systems may be required in a control programme. Monitors for carbon monoxide around furnaces, for hydrogen sulphide in sewage work, and for oxygen deficiency in closed spaces are some examples.

Work practices are an important part of control—for example, jobs in which a worker’s work posture can affect exposure, such as whether a worker bends over his or her work. The position of the worker may affect the conditions of exposure (e.g., breathing zone in relation to contaminant source, possibility of skin absorption).

Lastly, occupational exposure can be avoided or reduced by placing a protective barrier on the worker, at the critical entry point for the harmful agent in question (mouth, nose, skin, ear)—that is, the use of personal protective devices. It should be pointed out that all other possibilities of control should be explored before considering the use of personal protective equipment, as this is the least satisfactory means for routine control of exposures, particularly to airborne contaminants.

Other personal preventive measures include education and training, personal hygiene and limitation of exposure time.

Continuous evaluations, through environmental monitoring and health surveillance, should be part of any hazard prevention and control strategy.

Appropriate control technology for the work environment must also encompass measures for the prevention of environmental pollution (air, water, soil), including adequate management of hazardous waste.

Although most of the control principles mentioned here apply to airborne contaminants, many are also applicable to other types of hazards. For example, a process can be modified to produce fewer air contaminants, or to produce less noise or less heat. An isolating barrier can separate workers from a source of noise, heat or radiation.

Far too often prevention dwells on the most widely known measures, such as local exhaust ventilation and personal protective equipment, without proper consideration of other valuable control options, such as alternative cleaner technologies, substitution of materials, modification of processes, and good work practices. It often happens that work processes are regarded as unchangeable when, in reality, changes can be made which effectively prevent or at least reduce the associated hazards.

Hazard prevention and control in the work environment requires knowledge and ingenuity. Effective control does not necessarily require very costly and complicated measures. In many cases, hazard control can be achieved through appropriate technology, which can be as simple as a piece of impervious material between the naked shoulder of a dock worker and a bag of toxic material that can be absorbed through the skin. It can also consist of simple improvements such as placing a movable barrier between an ultraviolet source and a worker, or training workers in safe work practices.

Aspects to be considered when selecting appropriate control strategies and technology, include the type of hazardous agent (nature, physical state, health effects, routes of entry into the body), type of source(s), magnitude and conditions of exposure, characteristics of the workplace and relative location of workstations.

The required skills and resources for the correct design, implementation, operation, evaluation and maintenance of control systems must be ensured. Systems such as local exhaust ventilation must be evaluated after installation and routinely checked thereafter. Only regular monitoring and maintenance can ensure continued efficiency, since even well-designed systems may lose their initial performance if neglected.

Control measures should be integrated into hazard prevention and control programmes, with clear objectives and efficient management, involving multidisciplinary teams made up of occupational hygienists and other occupational health and safety staff, production engineers, management and workers. Programmes must also include aspects such as hazard communication, education and training covering safe work practices and emergency procedures.

Health promotion aspects should also be included, since the workplace is an ideal setting for promoting healthy life-styles in general and for alerting workers to the dangers of hazardous non-occupational exposures, caused, for example, by shooting without adequate protection, or by smoking.

The Links among Occupational Hygiene, Risk Assessment and Risk Management

Risk assessment

Risk assessment is a methodology that aims at characterizing the types of health effects expected as a result of a certain exposure to a given agent, as well as providing estimates on the probability of occurrence of these health effects, at different levels of exposure. It is also used to characterize specific risk situations. It involves hazard identification, the establishment of exposure-effect relationships, and exposure assessment, leading to risk characterization.

The first step refers to the identification of an agent—for example, a chemical—as causing a harmful health effect (e.g., cancer or systemic poisoning). The second step establishes how much exposure causes how much of a given effect in how many of the exposed persons. This knowledge is essential for the interpretation of exposure assessment data.

Exposure assessment is part of risk assessment, both when obtaining data to characterize a risk situation and when obtaining data for the establishment of exposure-effect relationships from epidemiological studies. In the latter case, the exposure that led to a certain occupational or environmentally caused effect has to be accurately characterized to ensure the validity of the correlation.

Although risk assessment is fundamental to many decisions which are taken in the practice of occupational hygiene, it has limited effect in protecting workers’ health, unless translated into actual preventive action in the workplace.

Risk assessment is a dynamic process, as new knowledge often discloses harmful effects of substances until then considered relatively harmless; therefore the occupational hygienist must have, at all times, access to up-to-date toxicological information. Another implication is that exposures should always be controlled to the lowest feasible level.

Figure 3 is presented as an illustration of different elements of risk assessment.

Figure 3. Elements of risk assessment.


Risk management in the work environment

It is not always feasible to eliminate all agents that pose occupational health risks because some are inherent to work processes that are indispensable or desirable; however, risks can and must be managed.

Risk assessment provides a basis for risk management. However, while risk assessment is a scientific procedure, risk management is more pragmatic, involving decisions and actions that aim at preventing, or reducing to acceptable levels, the occurrence of agents which may pose hazards to the health of workers, surrounding communities and the environment, also accounting for the socio-economic and public health context.

Risk management takes place at different levels; decisions and actions taken at the national level pave the way for the practice of risk management at the workplace level.

Risk management at the workplace level requires information and knowledge on:

  • health hazards and their magnitude, identified and rated according to risk assessment findings
  • legal requirements and standards
  • technological feasibility, in terms of the available and applicable control technology
  • economic aspects, such as the costs to design, implement, operate and maintain control systems, and cost-benefit analysis (control costs versus the financial benefits gained by controlling occupational and environmental hazards)
  • human resources (available and required)
  • socio-economic and public health context

 

to serve as a basis for decisions which include:

  • establishment of a target for control
  • selection of adequate control strategies and technologies
  • establishment of priorities for action in view of the risk situation, as well as of the existing socio-economic and public health context (particularly important in developing countries)

 

and which should lead to actions such as:

  • identification/search of financial and human resources (if not yet available)
  • design of specific control measures, which should be appropriate for the protection of workers’ health and of the environment, as well as safeguarding as much as possible the natural resource base
  • implementation of control measures, including provisions for adequate operation, maintenance and emergency procedures
  • establishment of a hazard prevention and control programme with adequate management and including routine surveillance.

 

Traditionally, the profession responsible for most of these decisions and actions in the workplace is occupational hygiene.

One key decision in risk management, that of acceptable risk (what effect can be accepted, in what percentage of the working population, if any at all?), is usually, but not always, taken at the national policy-making level and followed by the adoption of occupational exposure limits and the promulgation of occupational health regulations and standards. This leads to the establishment of targets for control, usually at the workplace level by the occupational hygienist, who should have knowledge of the legal requirements. However, it may happen that decisions on acceptable risk have to be taken by the occupational hygienist at the workplace level—for example, in situations when standards are not available or do not cover all potential exposures.

All these decisions and actions must be integrated into a realistic plan, which requires multidisciplinary and multisectorial coordination and collaboration. Although risk management involves pragmatic approaches, its efficiency should be scientifically evaluated. Unfortunately risk management actions are, in most cases, a compromise between what should be done to avoid any risk and the best which can be done in practice, in view of financial and other limitations.

Risk management concerning the work environment and the general environment should be well coordinated; not only are there overlapping areas, but, in most situations, the success of one is interlinked with the success of the other.

Occupational Hygiene Programmes and Services

Political will and decision making at the national level will, directly or indirectly, influence the establishment of occupational hygiene programmes or services, either at the governmental or private level. It is beyond the scope of this article to provide detailed models for all types of occupational hygiene programmes and services; however, there are general principles that are applicable to many situations and may contribute to their efficient implementation and operation.

A comprehensive occupational hygiene service should have the capability to carry out adequate preliminary surveys, sampling, measurements and analysis for hazard evaluation and for control purposes, and to recommend control measures, if not to design them.

Key elements of a comprehensive occupational hygiene programme or service are human and financial resources, facilities, equipment and information systems, well organized and coordinated through careful planning, under efficient management, and also involving quality assurance and continuous programme evaluation. Successful occupational hygiene programmes require a policy basis and commitment from top management. The procurement of financial resources is beyond the scope of this article.

Human resources

Adequate human resources constitute the main asset of any programme and should be ensured as a priority. All staff should have clear job descriptions and responsibilities. If needed, provisions for training and education should be made. The basic requirements for occupational hygiene programmes include:

  • occupational hygienists—in addition to general knowledge on the recognition, evaluation and control of occupational hazards, occupational hygienists may be specialized in specific areas, such as analytical chemistry or industrial ventilation; the ideal situation is to have a team of well-trained professionals in the comprehensive practice of occupational hygiene and in all required areas of expertise
  • laboratory personnel, chemists (depending on the extent of analytical work)
  • technicians and assistants, for field surveys and for laboratories, as well as for instrument maintenance and repairs
  • information specialists and administrative support.

 

One important aspect is professional competence, which must not only be achieved but also maintained. Continuous education, in or outside the programme or service, should cover, for example, legislation updates, new advances and techniques, and gaps in knowledge. Participation in conferences, symposia and workshops also contributes to the maintenance of competence.

Health and safety for staff

Health and safety should be ensured for all staff in field surveys, laboratories and offices. Occupational hygienists may be exposed to serious hazards and should wear the required personal protective equipment. Depending on the type of work, immunization may be required. If rural work is involved, depending on the region, provisions such as antidote for snake bites should be made. Laboratory safety is a specialized field discussed elsewhere in this Encyclopaedia.

Occupational hazards in offices should not be overlooked—for example, work with visual display units and sources of indoor pollution such as laser printers, photocopying machines and air-conditioning systems. Ergonomic and psychosocial factors should also be considered.

Facilities

These include offices and meeting room(s), laboratories and equipment, information systems and library. Facilities should be well designed, accounting for future needs, as later moves and adaptations are usually more costly and time consuming.

Occupational hygiene laboratories and equipment

Occupational hygiene laboratories should have in principle the capability to carry out qualitative and quantitative assessment of exposure to airborne contaminants (chemicals and dusts), physical agents (noise, heat stress, radiation, illumination) and biological agents. In the case of most biological agents, qualitative assessments are enough to recommend controls, thus eliminating the need for the usually difficult quantitative evaluations.

Although some direct-reading instruments for airborne contaminants may have limitations for exposure assessment purposes, these are extremely useful for the recognition of hazards and identification of their sources, the determination of peaks in concentration, the gathering of data for control measures, and for checking on controls such as ventilation systems. In connection with the latter, instruments to check air velocity and static pressure are also needed.

One of the possible structures would comprise the following units:

  • field equipment (sampling, direct-reading)
  • analytical laboratory
  • particles laboratory
  • physical agents (noise, thermal environment, illumination and radiation)
  • workshop for maintenance and repairs of instrumentation.

 

Whenever selecting occupational hygiene equipment, in addition to performance characteristics, practical aspects have to be considered in view of the expected conditions of use—for example, available infrastructure, climate, location. These aspects include portability, required source of energy, calibration and maintenance requirements, and availability of the required expendable supplies.

Equipment should be purchased only if and when:

  • there is a real need
  • skills for the adequate operation, maintenance and repairs are available
  • the complete procedure has been developed, since it is of no use, for example, to purchase sampling pumps without a laboratory to analyse the samples (or an agreement with an outside laboratory).

 

Calibration of all types of occupational hygiene measuring, sampling and analytical equipment should be an integral part of any procedure, and the required calibration equipment should be available.

Maintenance and repairs are essential to prevent equipment from staying idle for long periods of time, and should be ensured by manufacturers, either by direct assistance or by providing training of staff.

If a completely new programme is being developed, only basic equipment should be initially purchased, more items being added as the needs are established and operational capabilities ensured. However, even before equipment and laboratories are available and operational, much can be achieved by inspecting workplaces to qualitatively assess health hazards, and by recommending control measures for recognized hazards. Lack of capability to carry out quantitative exposure assessments should never justify inaction concerning obviously hazardous exposures. This is particularly true for situations where workplace hazards are uncontrolled and heavy exposures are common.

Information

This includes library (books, periodicals and other publications), databases (e.g. on CD-ROM) and communications.

Whenever possible, personal computers and CD-ROM readers should be provided, as well as connections to the INTERNET. There are ever-increasing possibilities for on-line networked public information servers (World Wide Web and GOPHER sites), which provide access to a wealth of information sources relevant to workers’ health, therefore fully justifying investment in computers and communications. Such systems should include e-mail, which opens new horizons for communications and discussions, either individually or as groups, thus facilitating and promoting exchange of information throughout the world.

Planning

Timely and careful planning for the implementation, management and periodic evaluation of a programme is essential to ensure that the objectives and goals are achieved, while making the best use of the available resources.

Initially, the following information should be obtained and analysed:

  • nature and magnitude of prevailing hazards, in order to establish priorities
  • legal requirements (legislation, standards)
  • available resources
  • infrastructure and support services.

 

The planning and organization processes include:

  • establishment of the purpose of the programme or service, definition of objectives and the scope of the activities, in view of the expected demand and the available resources
  • allocation of resources
  • definition of the organizational structure
  • profile of the required human resources and plans for their development (if needed)
  • clear assignment of responsibilities to units, teams and individuals
  • design/adaptation of the facilities
  • selection of equipment
  • operational requirements
  • establishment of mechanisms for communication within and outside the service
  • timetable.

 

Operational costs should not be underestimated, since lack of resources may seriously hinder the continuity of a programme. Requirements which cannot be overlooked include:

  • purchase of expendable supplies (including items such as filters, detector tubes, charcoal tubes, reagents), spare parts for equipment, etc.
  • maintenance and repairs of equipment
  • transportation (vehicles, fuel, maintenance) and travel
  • information update.

 

Resources must be optimized through careful study of all elements which should be considered as integral parts of a comprehensive service. A well-balanced allocation of resources to the different units (field measurements, sampling, analytical laboratories, etc.) and all the components (facilities and equipment, personnel, operational aspects) is essential for a successful programme. Moreover, allocation of resources should allow for flexibility, because occupational hygiene services may have to undergo adaptations in order to respond to the real needs, which should be periodically assessed.

Communication, sharing and collaboration are key words for successful teamwork and enhanced individual capabilities. Effective mechanisms for communication, within and outside the programme, are needed to ensure the required multidisciplinary approach for the protection and promotion of workers’ health. There should be close interaction with other occupational health professionals, particularly occupational physicians and nurses, ergonomists and work psychologists, as well as safety professionals. At the workplace level, this should include workers, production personnel and managers.

The implementation of successful programmes is a gradual process. Therefore, at the planning stage, a realistic timetable should be prepared, according to well-established priorities and in view of the available resources.

Management

Management involves decision-making as to the goals to be achieved and actions required to efficiently achieve these goals, with participation of all concerned, as well as foreseeing and avoiding, or recognizing and solving, the problems which may create obstacles to the completion of the required tasks. It should be kept in mind that scientific knowledge is no assurance of the managerial competence required to run an efficient programme.

The importance of implementing and enforcing correct procedures and quality assurance cannot be overemphasized, since there is much difference between work done and work well done. Moreover, the real objectives, not the intermediate steps, should serve as a yardstick; the efficiency of an occupational hygiene programme should be measured not by the number of surveys carried out, but rather by the number of surveys that led to actual action to protect workers’ health.

Good management should be able to distinguish between what is impressive and what is important; very detailed surveys involving sampling and analysis, yielding very accurate and precise results, may be very impressive, but what is really important are the decisions and actions that will be taken afterwards.

Quality assurance

The concept of quality assurance, involving quality control and proficiency testing, refers primarily to activities which involve measurements. Although these concepts have been more often considered in connection with analytical laboratories, their scope has to be extended to also encompass sampling and measurements.

Whenever sampling and analysis are required, the complete procedure should be considered as one, from the point of view of quality. Since no chain is stronger than its weakest link, it is a waste of resources to use, for the different steps of the same evaluation procedure, instruments and techniques of unequal levels of quality. The accuracy and precision of a very good analytical balance cannot compensate for a pump sampling at the wrong flow rate.
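
A short worked example of why this matters: the concentration reported from a filter sample is the collected mass divided by the volume of air sampled, so an error in the pump flow rate passes straight through to the result, no matter how accurately the filter is weighed. The figures below are illustrative assumptions only.

    # Air concentration from a filter sample: mass collected / volume of air sampled.
    mass_ug = 250.0        # mass on the filter, micrograms (weighed very accurately)
    nominal_flow = 2.0     # flow rate assumed when calculating, litres per minute
    actual_flow = 1.8      # true flow rate if the pump ran 10% low
    duration_min = 480.0   # sampling time, minutes (a full shift)

    def concentration_mg_m3(mass_ug, flow_lpm, minutes):
        volume_m3 = flow_lpm * minutes / 1000.0   # litres to cubic metres
        return mass_ug / volume_m3 / 1000.0       # micrograms per m3 to mg per m3

    print(f"reported (nominal flow): {concentration_mg_m3(mass_ug, nominal_flow, duration_min):.3f} mg/m3")
    print(f"true (actual flow):      {concentration_mg_m3(mass_ug, actual_flow, duration_min):.3f} mg/m3")
    # The roughly 11% difference comes entirely from the flow-rate error, not from the weighing.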

The performance of laboratories has to be checked so that the sources of errors can be identified and corrected. There is need for a systematic approach in order to keep the numerous details involved under control. It is important to establish quality assurance programmes for occupational hygiene laboratories, and this refers both to internal quality control and to external quality assessments (often called “proficiency testing”).

Concerning sampling, or measurements with direct-reading instruments (including for measurement of physical agents), quality involves adequate and correct:

  • preliminary studies including the identification of possible hazards and the factors required for the design of the strategy
  • design of the sampling (or measurement) strategy
  • selection and utilization of methodologies and equipment for sampling or measurements, accounting both for the purpose of the investigation and for quality requirements
  • performance of the procedures, including time monitoring
  • handling, transport and storage of samples (where applicable).

 

Concerning the analytical laboratory, quality involves adequate and correct:

  • design and installation of the facilities
  • selection and utilization of validated analytical methods (or, if necessary, validation of analytical methods)
  • selection and installation of instrumentation
  • adequate supplies (reagents, reference samples, etc.).

 

For both, it is indispensable to have:

  • clear protocols, procedures and written instructions
  • routine calibration and maintenance of the equipment
  • training and motivation of the staff to adequately perform the required procedures
  • adequate management
  • internal quality control
  • external quality assessment or proficiency testing (if applicable).

 

Furthermore, it is essential to have a correct treatment of the obtained data and interpretation of results, as well as accurate reporting and record keeping.

Laboratory accreditation, defined by CEN (EN 45001) as “formal recognition that a testing laboratory is competent to carry out specific tests or specific types of tests”, is a very important control tool and should be promoted. It should cover both the sampling and the analytical procedures.

Programme evaluation

The concept of quality must be applied to all steps of occupational hygiene practice, from the recognition of hazards to the implementation of hazard prevention and control programmes. With this in mind, occupational hygiene programmes and services must be periodically and critically evaluated, aiming at continuous improvement.

Concluding Remarks

Occupational hygiene is essential for the protection of workers’ health and the environment. Its practice involves many steps, which are interlinked and which have no meaning by themselves but must be integrated into a comprehensive approach.

 


Recognition of Hazards

A workplace hazard can be defined as any condition that may adversely affect the well-being or health of exposed persons. Recognition of hazards in any occupational activity involves characterization of the workplace by identifying hazardous agents and groups of workers potentially exposed to these hazards. The hazards might be of chemical, biological or physical origin (see table 1). Some hazards in the work environment are easy to recognize—for example, irritants, which have an immediate irritating effect after skin exposure or inhalation. Others are not so easy to recognize—for example, chemicals which are accidentally formed and have no warning properties. Some agents, such as metals (e.g., lead, mercury, cadmium, manganese), which may cause injury after several years of exposure, might be easy to identify if one is aware of the risk. A toxic agent may not constitute a hazard at low concentrations or if no one is exposed. Basic to the recognition of hazards are the identification of possible agents at the workplace, knowledge of the health risks of these agents and awareness of possible exposure situations.

Table 1. Hazards of chemical, biological and physical agents (type of hazard, description, examples).

CHEMICAL HAZARDS

Chemicals enter the body principally through inhalation, skin absorption or ingestion. The toxic effect may be acute, chronic or both.

 

Corrosion. Corrosive chemicals cause tissue destruction at the site of contact. Skin, eyes and the digestive system are the most commonly affected parts of the body. Examples: concentrated acids and alkalis, phosphorus.

Irritation. Irritants cause inflammation of tissues where they are deposited. Skin irritants may cause reactions like eczema or dermatitis. Severe respiratory irritants might cause shortness of breath, inflammatory responses and oedema. Examples (skin): acids, alkalis, solvents, oils. Examples (respiratory): aldehydes, alkaline dusts, ammonia, nitrogen dioxide, phosgene, chlorine, bromine, ozone.

Allergic reactions. Chemical allergens or sensitizers can cause skin or respiratory allergic reactions. Examples (skin): colophony (rosin), formaldehyde, metals like chromium or nickel, some organic dyes, epoxy hardeners, turpentine. Examples (respiratory): isocyanates, fibre-reactive dyes, formaldehyde, many tropical wood dusts, nickel.

 

Asphyxiation. Asphyxiants exert their effects by interfering with the oxygenation of the tissues. Simple asphyxiants are inert gases that dilute the available atmospheric oxygen below the level required to support life; oxygen-deficient atmospheres may occur in tanks, holds of ships, silos or mines, and the oxygen concentration in air should never be below 19.5% by volume. Chemical asphyxiants prevent oxygen transport and the normal oxygenation of the blood or of the tissues. Examples (simple asphyxiants): methane, ethane, hydrogen, helium. Examples (chemical asphyxiants): carbon monoxide, nitrobenzene, hydrogen cyanide, hydrogen sulphide.

 

Cancer. Known human carcinogens are chemicals that have been clearly demonstrated to cause cancer in humans. Probable human carcinogens are chemicals that have been clearly demonstrated to cause cancer in animals, or for which the evidence in humans is not definite. Soot and coal tars were the first chemicals suspected of causing cancer. Examples (known): benzene (leukaemia); vinyl chloride (liver angiosarcoma); 2-naphthylamine, benzidine (bladder cancer); asbestos (lung cancer, mesothelioma); hardwood dust (nasal or nasal sinus adenocarcinoma). Examples (probable): formaldehyde, carbon tetrachloride, dichromates, beryllium.

Reproductive effects. Reproductive toxicants interfere with the reproductive or sexual functioning of an individual. Examples: manganese, carbon disulphide, monomethyl and ethyl ethers of ethylene glycol, mercury.

Developmental toxicants are agents that may cause an adverse effect in the offspring of exposed persons, for example, birth defects. Embryotoxic or foetotoxic chemicals can cause spontaneous abortions or miscarriages. Examples: organic mercury compounds, carbon monoxide, lead, thalidomide, solvents.

Systemic poisons. Systemic poisons are agents that cause injury to particular organs or body systems. Examples: brain (solvents, lead, mercury, manganese); peripheral nervous system (n-hexane, lead, arsenic, carbon disulphide); blood-forming system (benzene, ethylene glycol ethers); kidneys (cadmium, lead, mercury, chlorinated hydrocarbons); lungs (silica, asbestos, coal dust, causing pneumoconiosis).

BIOLOGICAL HAZARDS

Biological hazards can be defined as organic dusts originating from different sources of biological origin, such as viruses, bacteria, fungi, proteins from animals or substances from plants (e.g., degradation products of natural fibres). The aetiological agent might be derived from a viable organism or from contaminants, or constitute a specific component in the dust. Biological hazards are grouped into infectious and non-infectious agents. Non-infectious hazards can be further divided into viable organisms, biogenic toxins and biogenic allergens.

Infectious hazards

Occupational diseases caused by infectious agents are relatively uncommon. Workers at risk include hospital staff, laboratory workers, farmers, slaughterhouse workers, veterinarians, zoo keepers and cooks. Susceptibility varies widely (e.g., persons treated with immunosuppressive drugs are much more susceptible).

Examples: hepatitis B, tuberculosis, anthrax, brucellosis, tetanus, psittacosis (Chlamydia psittaci), salmonellosis

Viable organisms and biogenic toxins

Viable organisms include fungi and bacteria and their spores; biogenic toxins include endotoxins, mycotoxins and aflatoxins. The products of bacterial and fungal metabolism are complex and numerous, and are affected by temperature, humidity and the kind of substrate on which the organisms grow. Chemically they might consist of proteins, lipoproteins or mucopolysaccharides. Examples are Gram-positive and Gram-negative bacteria and moulds. Workers at risk include cotton mill workers, hemp and flax workers, sewage and sludge treatment workers, and grain silo workers.

Examples of effects: byssinosis, “grain fever”, Legionnaires’ disease

Biogenic allergens

Biogenic allergens include fungi, animal-derived proteins, terpenes, storage mites and enzymes. A considerable part of the biogenic allergens in agriculture comes from proteins in animal skin and hair (furs) and from faecal material and urine. Allergens may be found in many industrial environments, such as fermentation processes, drug production, bakeries, paper production and wood processing (saw mills, production, manufacturing), as well as in biotechnology (enzyme and vaccine production, tissue culture) and spice production. In sensitized persons, exposure to the allergenic agents may induce allergic symptoms such as allergic rhinitis, conjunctivitis or asthma. Allergic alveolitis is characterized by acute respiratory symptoms such as cough, chills, fever, headache and muscle pain, and may lead to chronic lung fibrosis.

Examples, occupational asthma: wool, furs, wheat grain, flour, red cedar, garlic powder

Examples, allergic alveolitis: farmer’s lung, bagassosis, “bird fancier’s disease”, humidifier fever, sequoiosis

 

PHYSICAL HAZARDS

 

 

Noise

Noise is any unwanted sound that may adversely affect the health and well-being of individuals or populations. Aspects of the noise hazard include the total energy of the sound, its frequency distribution, the duration of exposure and whether the noise is impulsive. Hearing acuity is generally affected first, with a loss or dip at 4,000 Hz followed by losses in the frequency range from 2,000 to 6,000 Hz. Noise may cause acute effects such as communication problems, decreased concentration and sleepiness and, as a consequence, interference with job performance. Exposure to high levels of noise (usually above 85 dBA) or to impulsive noise (about 140 dBC) over a significant period of time may cause both temporary and permanent hearing loss. Permanent hearing loss is the most common occupational disease in compensation claims.

Examples: foundries, woodworking, textile mills, metalworking

Vibration

Vibration has several parameters in common with noise: frequency, amplitude, duration of exposure and whether it is continuous or intermittent. The method of operation and the skill of the operator seem to play an important role in the development of harmful effects of vibration. Manual work using powered tools is associated with symptoms of peripheral circulatory disturbance known as “Raynaud’s phenomenon” or “vibration-induced white fingers” (VWF). Vibrating tools may also affect the peripheral nervous system and the musculo-skeletal system, with reduced grip strength, low back pain and degenerative back disorders.

Examples: contract machines, mining loaders, fork-lift trucks, pneumatic tools, chain saws

Ionizing radiation

The most important chronic effect of ionizing radiation is cancer, including leukaemia. Overexposure to comparatively low levels of radiation has been associated with dermatitis of the hands and with effects on the haematological system. Processes or activities that might give rise to excessive exposure to ionizing radiation are highly restricted and regulated.

Examples: nuclear reactors, medical and dental x-ray tubes, particle accelerators, radioisotopes

Non-ionizing radiation

Non-ionizing radiation includes ultraviolet radiation, visible light, infrared radiation, lasers, electromagnetic fields (microwaves and radio frequency) and extremely low frequency radiation. Infrared radiation might cause cataracts. High-powered lasers may cause eye and skin damage. There is increasing concern about exposure to low levels of electromagnetic fields as a cause of cancer and as a potential cause of adverse reproductive outcomes among women, especially from exposure to video display units (VDUs). The question of a causal link to cancer is not yet answered. Recent reviews of the available scientific knowledge generally conclude that there is no association between the use of VDUs and adverse reproductive outcomes.

Examples, ultraviolet radiation: arc welding and cutting; UV curing of inks, glues, paints, etc.; disinfection; product control

Examples, infrared radiation: furnaces, glassblowing

Examples, lasers: communications, surgery, construction

Identification and Classification of Hazards

Before any occupational hygiene investigation is performed, its purpose must be clearly defined. The purpose might be to identify possible hazards, to evaluate existing risks at the workplace, to demonstrate compliance with regulatory requirements, to evaluate control measures or to assess exposure for an epidemiological survey. This article is restricted to programmes aimed at the identification and classification of hazards at the workplace. Many models or techniques have been developed to identify and evaluate hazards in the working environment; they differ in complexity, from simple checklists, preliminary industrial hygiene surveys, job-exposure matrices and hazard and operability studies to job exposure profiles and work surveillance programmes (Renes 1978; Gressel and Gideon 1991; Holzner, Hirsh and Perper 1993; Goldberg et al. 1993; Bouyer and Hémon 1993; Panett, Coggon and Acheson 1985; Tait 1992). No single technique is the obvious choice in every situation, but each has elements that are useful in some investigations. The usefulness of the models also depends on the purpose of the investigation, the size of the workplace, the type of production and activity, and the complexity of operations.

Identification and classification of hazards can be divided into three basic elements: workplace characterization, exposure pattern and hazard evaluation.

Workplace characterization

A workplace might have from a few employees up to several thousands and have different activities (e.g., production plants, construction sites, office buildings, hospitals or farms). At a workplace different activities can be localized to special areas such as departments or sections. In an industrial process, different stages and operations can be identified as production is followed from raw materials to finished products.

Detailed information should be obtained about processes, operations or other activities of interest, to identify agents utilized, including raw materials, materials handled or added in the process, primary products, intermediates, final products, reaction products and by-products. Additives and catalysts in a process might also be of interest to identify. Raw material or added material which has been identified only by trade name must be evaluated by chemical composition. Information or safety data sheets should be available from manufacturer or supplier.

Some stages in a process might take place in a closed system without anyone exposed, except during maintenance work or process failure. These events should be recognized and precautions taken to prevent exposure to hazardous agents. Other processes take place in open systems, which are provided with or without local exhaust ventilation. A general description of the ventilation system should be provided, including local exhaust system.

When possible, hazards should be identified in the planning or design of new plants or processes, when changes can be made at an early stage and hazards might be anticipated and avoided. Conditions and procedures that may deviate from the intended design must be identified and evaluated in the process state. Recognition of hazards should also include emissions to the external environment and waste materials. Facility locations, operations, emission sources and agents should be grouped together in a systematic way to form recognizable units in the further analysis of potential exposure. In each unit, operations and agents should be grouped according to health effects of the agents and estimation of emitted amounts to the work environment.

Exposure patterns

The main exposure routes for chemical and biological agents are inhalation and dermal uptake and, incidentally, ingestion. The exposure pattern depends on the frequency of contact with the hazards, the intensity of exposure and the duration of exposure. Working tasks have to be systematically examined; it is important not only to study work manuals but to look at what actually happens at the workplace. Workers might be exposed directly as a result of performing their tasks, or indirectly because they are located in the same general area or location as the source of exposure. It might be necessary to start by focusing on working tasks with a high potential to cause harm, even if the exposure is of short duration. Non-routine and intermittent operations (e.g., maintenance, cleaning and changes in production cycles) have to be considered. Working tasks and situations might also vary throughout the year.

Within the same job title, exposure or uptake might differ because some workers wear protective equipment and others do not. In large plants, recognition of hazards or a qualitative hazard evaluation can very seldom be performed for every single worker; therefore, workers with similar working tasks have to be classified in the same exposure group. Differences in working tasks, work techniques and work time will result in considerably different exposures and have to be considered. Persons working outdoors and those working without local exhaust ventilation have been shown to have a larger day-to-day variability than groups working indoors with local exhaust ventilation (Kromhout, Symanski and Rappaport 1993). Work processes, the agents applied in a process or job, or different tasks within a job title might be used, instead of the job title, to characterize groups with similar exposure. Within the groups, potentially exposed workers must be identified and classified according to the hazardous agents, routes of exposure, health effects of the agents, frequency of contact with the hazards, and intensity and duration of exposure. Different exposure groups should be ranked according to the hazardous agents and estimated exposure in order to determine the workers at greatest risk.
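
The ranking step can be sketched, purely for illustration, as a simple scoring exercise. The group names, agents and the severity-times-exposure convention in the sketch below are hypothetical; a real programme would substitute whatever scoring scheme the investigator adopts.

    # Hypothetical sketch: rank exposure groups by combining a severity score
    # for the health effects of the agent with an estimated exposure score.
    # All names and scores are invented for illustration.

    groups = [
        # (exposure group, agent, severity 1-4, estimated exposure 1-4)
        ("degreasing operators", "trichloroethylene", 3, 3),
        ("paint-mixing room",    "isocyanates",       4, 2),
        ("warehouse staff",      "diesel exhaust",    2, 1),
    ]

    # One simple convention: rank = severity x estimated exposure.
    ranked = sorted(groups, key=lambda g: g[2] * g[3], reverse=True)

    for name, agent, severity, exposure in ranked:
        print(f"{name:22s} {agent:18s} rank = {severity * exposure}")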

Qualitative hazard evaluation

The assessment of possible health effects of the chemical, biological and physical agents present at the workplace should be based on an evaluation of available epidemiological, toxicological, clinical and environmental research. Up-to-date information about health hazards for products or agents used at the workplace should be obtained from health and safety journals, databases on toxicity and health effects, and the relevant scientific and technical literature.

Material Safety Data Sheets (MSDSs) should be updated as necessary. Data sheets document the percentages of hazardous ingredients, together with the Chemical Abstracts Service identifier (CAS number) and the threshold limit value (TLV), if any. They also contain information about health hazards, protective equipment, preventive actions, the manufacturer or supplier, and so on. Sometimes the ingredient information reported is rather rudimentary and has to be supplemented with more detailed information.

Monitored data and records of measurements should be studied. For agents with TLVs, these values provide general guidance in deciding whether the situation is acceptable, although allowance must be made for possible interactions when workers are exposed to several chemicals. Within and between different exposure groups, workers should be ranked according to the health effects of the agents present and the estimated exposure (e.g., from slight health effects and low exposure to severe health effects and estimated high exposure). Those with the highest ranks deserve the highest priority. Before any prevention activities start, it might be necessary to perform an exposure monitoring programme. All results should be documented and easily retrievable. A working scheme is illustrated in figure 1.

Figure 1. Elements of risk assessment

IHY010F3

In occupational hygiene investigations the hazards to the outdoor environment (e.g., pollution and greenhouse effects as well as effects on the ozone layer) might also be considered.

Chemical, Biological and Physical Agents

Hazards might be of chemical, biological or physical origin. In this section and in table 1 a brief description of the various hazards will be given together with examples of environments or activities where they will be found (Casarett 1980; International Congress on Occupational Health 1985; Jacobs 1992; Leidel, Busch and Lynch 1977; Olishifski 1988; Rylander 1994). More detailed information will be found elsewhere in this Encyclopaedia.

Chemical agents

Chemicals can be grouped into gases, vapours, liquids and aerosols (dusts, fumes, mists).

Gases

Gases are substances that can be changed to liquid or solid state only by the combined effects of increased pressure and decreased temperature. Handling gases always implies risk of exposure unless they are processed in closed systems. Gases in containers or distribution pipes might accidentally leak. In processes with high temperatures (e.g., welding operations and exhaust from engines) gases will be formed.

Vapours

Vapours are the gaseous form of substances that are normally in the liquid or solid state at room temperature and normal pressure. When a liquid evaporates it changes to a gas and mixes with the surrounding air. A vapour can be regarded as a gas whose maximum concentration depends on the temperature and the saturation pressure of the substance. Any process involving combustion will generate vapours or gases. Degreasing operations might be performed by vapour-phase degreasing or soak cleaning with solvents. Work activities such as charging and mixing liquids, painting, spraying, cleaning and dry cleaning might generate harmful vapours.
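
As an illustration of this point, the saturation (maximum) vapour concentration can be estimated from the saturation vapour pressure at the working temperature, assuming ideal-gas behaviour; the acetone value used below is an approximate handbook figure included only as an example.

    # Sketch: estimate the saturation concentration of a vapour in air (in ppm)
    # from its saturation vapour pressure, assuming ideal-gas behaviour.

    ATMOSPHERIC_PRESSURE_KPA = 101.325

    def saturation_ppm(vapour_pressure_kpa):
        """Maximum vapour concentration in air, as parts per million by volume."""
        return (vapour_pressure_kpa / ATMOSPHERIC_PRESSURE_KPA) * 1_000_000

    # Approximate vapour pressure of acetone near 20 degrees C (about 24 kPa),
    # used here only as an illustrative number.
    print(f"{saturation_ppm(24.0):,.0f} ppm")  # roughly 240,000 ppm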

Liquids

Liquids may consist of a pure substance or a solution of two or more substances (e.g., solvents, acids, alkalis). A liquid stored in an open container will partially evaporate into the gas phase. The concentration in the vapour phase at equilibrium depends on the vapour pressure of the substance, its concentration in the liquid phase, and the temperature. Operations or activities with liquids might give rise to splashes or other skin contact, besides harmful vapours.

Dusts

Dusts consist of inorganic and organic particles, which can be classified as inhalable, thoracic or respirable, depending on particle size. Most organic dusts have a biological origin. Inorganic dusts will be generated in mechanical processes like grinding, sawing, cutting, crushing, screening or sieving. Dusts may be dispersed when dusty material is handled or whirled up by air movements from traffic. Handling dry materials or powder by weighing, filling, charging, transporting and packing will generate dust, as will activities like insulation and cleaning work.

Fumes

Fumes are solid particles formed when material vaporized at high temperature condenses into small particles. The vaporization is often accompanied by a chemical reaction such as oxidation. The single particles that make up a fume are extremely fine, usually less than 0.1 μm, and often aggregate into larger units. Examples are fumes from welding, plasma cutting and similar operations.

Mists

Mists are suspended liquid droplets generated by condensation from the gaseous state to the liquid state or by breaking up a liquid into a dispersed state by splashing, foaming or atomizing. Examples are oil mists from cutting and grinding operations, acid mists from electroplating, acid or alkali mists from pickling operations or paint spray mists from spraying operations.

 


Thursday, 10 March 2011 17:16

Evaluation of the Work Environment

Hazard Surveillance and Survey Methods

Occupational surveillance involves active programmes to anticipate, observe, measure, evaluate and control exposures to potential health hazards in the workplace. Surveillance often involves a team of people that includes an occupational hygienist, occupational physician, occupational health nurse, safety officer, toxicologist and engineer. Depending upon the occupational environment and problem, three surveillance methods can be employed: medical, environmental and biological. Medical surveillance is used to detect the presence or absence of adverse health effects for an individual from occupational exposure to contaminants, by performing medical examinations and appropriate biological tests. Environmental surveillance is used to document potential exposure to contaminants for a group of employees, by measuring the concentration of contaminants in the air, in bulk samples of materials, and on surfaces. Biological surveillance is used to document the absorption of contaminants into the body and correlate with environmental contaminant levels, by measuring the concentration of hazardous substances or their metabolites in the blood, urine or exhaled breath of workers.

Medical Surveillance

Medical surveillance is performed because diseases can be caused or exacerbated by exposure to hazardous substances. It requires an active programme with professionals who are knowledgeable about occupational diseases, diagnoses and treatment. Medical surveillance programmes provide steps to protect, educate, monitor and, in some cases, compensate the employee. It can include pre-employment screening programmes, periodic medical examinations, specialized tests to detect early changes and impairment caused by hazardous substances, medical treatment and extensive record keeping. Pre-employment screening involves the evaluation of occupational and medical history questionnaires and results of physical examinations. Questionnaires provide information concerning past illnesses and chronic diseases (especially asthma, skin, lung and heart diseases) and past occupational exposures. There are ethical and legal implications of pre-employment screening programmes if they are used to determine employment eligibility. However, they are fundamentally important when used to (1) provide a record of previous employment and associated exposures, (2) establish a baseline of health for an employee and (3) test for hypersusceptibility. Medical examinations can include audiometric tests for hearing loss, vision tests, tests of organ function, evaluation of fitness for wearing respiratory protection equipment, and baseline urine and blood tests. Periodic medical examinations are essential for evaluating and detecting trends in the onset of adverse health effects and may include biological monitoring for specific contaminants and the use of other biomarkers.

Environmental and Biological Surveillance

Environmental and biological surveillance starts with an occupational hygiene survey of the work environment to identify potential hazards and contaminant sources, and determine the need for monitoring. For chemical agents, monitoring could involve air, bulk, surface and biological sampling. For physical agents, monitoring could include noise, temperature and radiation measurements. If monitoring is indicated, the occupational hygienist must develop a sampling strategy that includes which employees, processes, equipment or areas to sample, the number of samples, how long to sample, how often to sample, and the sampling method. Industrial hygiene surveys vary in complexity and focus depending upon the purpose of the investigation, type and size of establishment, and nature of the problem.

There are no rigid formulas for performing surveys; however, thorough preparation prior to the on-site inspection significantly increases effectiveness and efficiency. Investigations that are motivated by employee complaints and illnesses have an additional focus of identifying the cause of the health problems. Indoor air quality surveys focus on indoor as well as outdoor sources of contamination. Regardless of the occupational hazard, the overall approach to surveying and sampling workplaces is similar; therefore, this chapter will use chemical agents as a model for the methodology.

Routes of Exposure

The mere presence of occupational stresses in the workplace does not automatically imply that there is a significant potential for exposure; the agent must reach the worker. For chemicals, the liquid or vapour form of the agent must make contact with and/or be absorbed into the body to induce an adverse health effect. If the agent is isolated in an enclosure or captured by a local exhaust ventilation system, the exposure potential will be low, regardless of the chemical’s inherent toxicity.

The route of exposure can impact the type of monitoring performed as well as the hazard potential. For chemical and biological agents, workers are exposed through inhalation, skin contact, ingestion and injection; the most common routes of absorption in the occupational environment are through the respiratory tract and the skin. To assess inhalation, the occupational hygienist observes the potential for chemicals to become airborne as gases, vapours, dusts, fumes or mists.

Skin absorption of chemicals is important primarily when there is direct contact with the skin through splashing, spraying, wetting or immersion with fat-soluble hydrocarbons and other organic solvents. Immersion includes body contact with contaminated clothing, hand contact with contaminated gloves, and hand and arm contact with bulk liquids. For some substances, such as amines and phenols, skin absorption can be as rapid as absorption through the lungs for substances that are inhaled. For some contaminants, such as pesticides and benzidine dyes, skin absorption is the primary route of absorption and inhalation is a secondary route. Such chemicals can readily enter the body through the skin, increase the body burden and cause systemic damage. When allergic reactions or repeated washing dry and crack the skin, there is a dramatic increase in the number and type of chemicals that can be absorbed into the body. Ingestion, an uncommon route of absorption for gases and vapours, can be important for particulates such as lead. Ingestion can occur from eating contaminated food, eating or smoking with contaminated hands, and coughing and then swallowing previously inhaled particulates.

Injection of materials directly into the bloodstream can occur from hypodermic needles inadvertently puncturing the skin of health care workers in hospitals, and from high-velocity projectiles released from high-pressure sources and directly contacting the skin. Airless paint sprayers and hydraulic systems have pressures high enough to puncture the skin and introduce substances directly into the body.

The Walk-Through Inspection

The purpose of the initial survey, called the walk-through inspection, is to systematically gather information to judge whether a potentially hazardous situation exists and whether monitoring is indicated. An occupational hygienist begins the walk-through survey with an opening meeting that can include representatives of management, employees, supervisors, occupational health nurses and union representatives. The occupational hygienist can powerfully impact the success of the survey and any subsequent monitoring initiatives by creating a team of people who communicate openly and honestly with one another and understand the goals and scope of the inspection. Workers must be involved and informed from the beginning to ensure that cooperation, not fear, dominates the investigation.

During the meeting, requests are made for process flow diagrams, plant layout drawings, past environmental inspection reports, production schedules, equipment maintenance schedules, documentation of personal protection programmes, and statistics concerning the number of employees, shifts and health complaints. All hazardous materials used and produced by an operation are identified and quantified. A chemical inventory of products, by-products, intermediates and impurities is assembled and all associated Material Safety Data Sheets are obtained. Equipment maintenance schedules, age and condition are documented because the use of older equipment may result in higher exposures due to the lack of controls.

After the meeting, the occupational hygienist performs a visual walk-through survey of the workplace, scrutinizing the operations and work practices, with the goal of identifying potential occupational stresses, ranking the potential for exposure, identifying the route of exposure and estimating the duration and frequency of exposure. Examples of occupational stresses are given in figure 1. The occupational hygienist uses the walk-through inspection to observe the workplace and have questions answered. Examples of observations and questions are given in figure 2.

Figure 1.  Occupational stresses. 

IHY040T1

Figure 2.  Observations and questions to ask on a walk-through survey.

IHY040T2

In addition to the questions shown in figure 2, questions should be asked that uncover what is not immediately obvious. Questions could address:

  1. non-routine tasks and schedules for maintenance and cleaning activities
  2. recent process changes and chemical substitutions
  3. recent physical changes in the work environment
  4. changes in job functions
  5. recent renovations and repairs.

 

Non-routine tasks can result in significant peak exposures to chemicals that are difficult to predict and measure during a typical workday. Process changes and chemical substitutions may alter the release of substances into the air and affect subsequent exposure. Changes in the physical layout of a work area can alter the effectiveness of an existing ventilation system. Changes in job functions can result in tasks performed by inexperienced workers and increased exposures. Renovations and repairs may introduce new materials and chemicals into the work environment which off-gas volatile organic chemicals or are irritants.

Indoor Air Quality Surveys

Indoor air quality surveys are distinct from traditional occupational hygiene surveys because the problems they address are typically encountered in non-industrial workplaces and may involve exposures to mixtures of trace quantities of chemicals, none of which alone appears capable of causing illness (Ness 1991). The goal of indoor air quality surveys is similar to that of occupational hygiene surveys in terms of identifying sources of contamination and determining the need for monitoring. However, indoor air quality surveys are always motivated by employee health complaints. In many cases, the employees have a variety of symptoms, including headaches, throat irritation, lethargy, coughing, itching, nausea and non-specific hypersensitivity reactions, that disappear when they go home. When health complaints do not disappear after the employees leave work, non-occupational exposures should be considered as well. Non-occupational exposures include hobbies, other jobs, urban air pollution, passive smoking and indoor exposures in the home. Indoor air quality surveys frequently use questionnaires to document employee symptoms and complaints and link them to job location or job function within the building. The areas with the highest incidence of symptoms are then targeted for further inspection.

Sources of indoor air contaminants that have been documented in indoor air quality surveys include:

  • inadequate ventilation (52%)
  • contamination from inside of the building (17%)
  • contamination from outside of the building (11%)
  • microbial contamination (5%)
  • contamination from the building materials (3%)
  • unknown causes (12%).

 

For indoor air quality investigations, the walk-through inspection is essentially a building and environmental inspection to determine potential sources of contamination both inside and outside of the building. Inside building sources include:

  1. building construction materials such as insulation, particleboard, adhesives and paints
  2. human occupants that can release chemicals from metabolic activities
  3. human activities such as smoking
  4. equipment such as copy machines
  5. ventilation systems that can be contaminated with micro-organisms.

 

Observations and questions that can be asked during the survey are listed in figure 3.

Figure 3. Observations and questions for an indoor air quality walk-through survey.

IHY040T3

Sampling and Measurement Strategies

Occupational exposure limits

After the walk-through inspection is completed, the occupational hygienist must determine whether sampling is necessary; sampling should be performed only if the purpose is clear. The occupational hygienist must ask, “What will be made of the sampling results and what questions will the results answer?” It is relatively easy to sample and obtain numbers; it is far more difficult to interpret them.

Air and biological sampling data are usually compared to recommended or mandated occupational exposure limits (OELs). Occupational exposure limits have been developed in many countries for inhalation and biological exposure to chemical and physical agents. To date, out of a universe of over 60,000 commercially used chemicals, approximately 600 have been evaluated by a variety of organizations and countries. The philosophical bases for the limits are determined by the organizations that have developed them. The most widely used limits, called threshold limit values (TLVs), are those issued in the United States by the American Conference of Governmental Industrial Hygienists (ACGIH). Most of the OELs used by the Occupational Safety and Health Administration (OSHA) in the United States are based upon the TLVs. However, the National Institute for Occupational Safety and Health (NIOSH) of the US Department of Health and Human Services has suggested its own limits, called recommended exposure limits (RELs).

For airborne exposures, there are three types of TLVs: an eight-hour time-weighted-average exposure, TLV-TWA, to protect against chronic health effects; a fifteen-minute average short-term exposure limit, TLV-STEL, to protect against acute health effects; and an instantaneous ceiling value, TLV-C, to protect against asphyxiants or chemicals that are immediately irritating. Guidelines for biological exposure levels are called biological exposure indices (BEIs). These guidelines represent the concentration of chemicals in the body that would correspond to inhalation exposure of a healthy worker at a specific concentration in air. Outside of the United States as many as 50 countries or groups have established OELs, many of which are identical to the TLVs. In Britain, the limits are called the Health and Safety Executive Occupational Exposure Standards (OES), and in Germany OELs are called Maximum Workplace Concentrations (MAKs).

OELs have been set for airborne exposures to gases, vapours and particulates; they do not exist for airborne exposures to biological agents. Therefore, most investigations of bioaerosol exposure compare indoor with outdoor concentrations. If the indoor/outdoor profile and concentration of organisms is different, an exposure problem may exist. There are no OELs for skin and surface sampling, and each case must be evaluated separately. In the case of surface sampling, concentrations are usually compared with acceptable background concentrations that were measured in other studies or were determined in the current study. For skin sampling, acceptable concentrations are calculated based upon toxicity, rate of absorption, amount absorbed and total dose. In addition, biological monitoring of a worker may be used to investigate skin absorption.

Sampling strategy

An environmental and biological sampling strategy is an approach to obtaining exposure measurements that fulfils a purpose. A carefully designed and effective strategy is scientifically defensible, optimizes the number of samples obtained, is cost-effective and prioritizes needs. The goal of the sampling strategy guides decisions concerning what to sample (selection of chemical agents), where to sample (personal, area or source sample), whom to sample (which worker or group of workers), sample duration (real-time or integrated), how often to sample (how many days), how many samples, and how to sample (analytical method). Traditionally, sampling performed for regulatory purposes involves brief campaigns (one or two days) that concentrate on worst-case exposures. While this strategy requires a minimum expenditure of resources and time, it often captures the least amount of information and has little applicability to evaluating long-term occupational exposures. To evaluate chronic exposures so that they are useful for occupational physicians and epidemiological studies, sampling strategies must involve repeated sampling over time for large numbers of workers.

Purpose

The goal of environmental and biological sampling strategies is either to evaluate individual employee exposures or to evaluate contaminant sources. Employee monitoring may be performed to:

  • evaluate individual exposures to chronic or acute toxicants
  • respond to employee complaints about health and odours
  • create a baseline of exposures for a long-term monitoring programme
  • determine whether exposures comply with governmental regulations
  • evaluate the effectiveness of engineering or process controls
  • evaluate acute exposures for emergency response
  • evaluate exposures at hazardous waste sites
  • evaluate the impact of work practices on exposure
  • evaluate exposures for individual job tasks
  • investigate chronic illnesses such as lead and mercury poisoning
  • investigate the relationship between occupational exposure and disease
  • carry out an epidemiological study.

 

Source and ambient air monitoring may be performed to:

  • establish a need for engineering controls such as local exhaust ventilation systems and enclosures
  • evaluate the impact of equipment or process modifications
  • evaluate the effectiveness of engineering or process controls
  • evaluate emissions from equipment or processes
  • evaluate compliance after remediation activities such as asbestos and lead removal
  • respond to indoor air, community illness and odour complaints
  • evaluate emissions from hazardous waste sites
  • investigate an emergency response
  • carry out an epidemiological study.

 

When monitoring employees, air sampling provides surrogate measures of dose resulting from inhalation exposure. Biological monitoring can provide the actual dose of a chemical resulting from all absorption routes including inhalation, ingestion, injection and skin. Thus, biological monitoring can more accurately reflect an individual’s total body burden and dose than air monitoring. When the relationship between airborne exposure and internal dose is known, biological monitoring can be used to evaluate past and present chronic exposures.

Goals of biological monitoring are listed in figure 4.

Figure 4. Goals of biological monitoring.

IHY040T4

Biological monitoring has its limitations and should be performed only if it accomplishes goals that cannot be accomplished with air monitoring alone (Fiserova-Bergerova 1987). It is invasive, requiring samples to be taken directly from workers. Blood samples generally provide the most useful biological medium to monitor; however, blood is taken only if non-invasive tests such as urine or exhaled breath are not applicable. For most industrial chemicals, data concerning the fate of chemicals absorbed by the body are incomplete or non-existent; therefore, only a limited number of analytical measurement methods are available, and many are not sensitive or specific.

Biological monitoring results may be highly variable between individuals exposed to the same airborne concentrations of chemicals; age, health, weight, nutritional status, drugs, smoking, alcohol consumption, medication and pregnancy can impact uptake, absorption, distribution, metabolism and elimination of chemicals.

 

What to sample

Most occupational environments involve exposures to multiple contaminants. Chemical agents are evaluated both individually and as multiple simultaneous assaults on workers. Chemical agents can act independently within the body or interact in a way that increases the toxic effect. The question of what to measure and how to interpret the results depends upon the biological mechanism of action of the agents when they are within the body. Agents can be evaluated separately if they act independently on altogether different organ systems, such as an eye irritant and a neurotoxin. If they act on the same organ system, such as two respiratory irritants, their combined effect is important. If the toxic effect of the mixture is the sum of the separate effects of the individual components, it is termed additive. If the toxic effect of the mixture is greater than the sum of the effects of the separate agents, their combined effect is termed synergistic. Exposure to cigarette smoking together with inhalation of asbestos fibres, for example, gives rise to a much greater risk of lung cancer than a simple additive effect would predict.
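
For agents judged to act additively on the same organ system, a widely used convention (found, for example, in the ACGIH TLV documentation) is to sum the ratios of each measured concentration to its own exposure limit and to treat the mixture as excessive when the sum exceeds unity. The sketch below applies that rule; the agent names, concentrations and limits are illustrative assumptions, not values taken from this article.

    # Sketch: additive mixture rule for agents acting on the same organ system.
    # Exposure is considered excessive when sum(C_i / OEL_i) exceeds 1.
    # Concentrations and limits below are illustrative only.

    mixture = [
        # (agent, measured concentration, exposure limit) in the same units
        ("solvent A", 40.0, 100.0),
        ("solvent B", 25.0, 50.0),
        ("solvent C", 10.0, 200.0),
    ]

    index = sum(conc / limit for _, conc, limit in mixture)
    print(f"additive mixture index = {index:.2f}")
    print("within the combined limit" if index <= 1 else "combined limit exceeded")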

Sampling all the chemical agents in a workplace would be expensive and not necessarily defensible. The occupational hygienist must prioritize the list of potential agents by hazard or risk to determine which agents receive attention first.

Factors involved in ranking chemicals include:

  • whether the agents interact independently, additively or synergistically
  • inherent toxicity of the chemical agent
  • quantities used and generated
  • number of people potentially exposed
  • anticipated duration and concentration of the exposure
  • confidence in the engineering controls
  • anticipated changes in the processes or controls
  • occupational exposure limits and guidelines.

Where to sample

To provide the best estimate of employee exposure, air samples are taken in the breathing zone of the worker (within a 30 cm radius of the head), and are called personal samples. To obtain breathing zone samples, the sampling device is placed directly on the worker for the duration of the sampling. If air samples are taken near the worker, outside of the breathing zone, they are called area samples. Area samples tend to underestimate personal exposures and do not provide good estimates of inhalation exposure. However, area samples are useful for evaluating contaminant sources and measuring ambient levels of contaminants. Area samples can be taken while walking through the workplace with a portable instrument, or with fixed sampling stations. Area sampling is routinely used at asbestos abatement sites for clearance sampling and for indoor air investigations.

Whom to sample

Ideally, to evaluate occupational exposure, each worker would be individually sampled for multiple days over the course of weeks or months. However, unless the workplace is small (<10 employees), it is usually not feasible to sample all the workers. To minimize the sampling burden in terms of equipment and cost, and increase the effectiveness of the sampling programme, a subset of employees from the workplace is sampled, and their monitoring results are used to represent exposures for the larger work force.

To select employees who are representative of the larger work force, one approach is to classify employees into groups with similar expected exposures, called homogeneous exposure groups (HEGs) (Corn 1985). After the HEGs are formed, a subset of workers is randomly selected from each group for sampling. Methods for determining the appropriate sample sizes assume a lognormal distribution of exposures, an estimated mean exposure, and a geometric standard deviation of 2.2 to 2.5. Prior sampling data might allow a smaller geometric standard deviation to be used. To classify employees into distinct HEGs, most occupational hygienists observe workers at their jobs and qualitatively predict exposures.
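
As a sketch of the statistical assumptions mentioned above, the example below draws a random subset of workers from a hypothetical HEG and summarizes simulated lognormal exposures by their geometric mean and geometric standard deviation; the group size, subset size and a GSD of 2.3 are arbitrary choices within the quoted range, not values from this article.

    # Sketch: pick a random subset of a homogeneous exposure group (HEG) and
    # summarize lognormally distributed exposures by geometric mean (GM) and
    # geometric standard deviation (GSD). All numbers are illustrative.
    import math
    import random

    random.seed(1)

    heg = [f"worker_{i:02d}" for i in range(1, 21)]   # hypothetical 20-person HEG
    subset = random.sample(heg, 8)                    # workers actually monitored

    # Simulated full-shift exposures (mg/m3), lognormal with GM 0.5 and GSD 2.3.
    gm_true, gsd_true = 0.5, 2.3
    exposures = [random.lognormvariate(math.log(gm_true), math.log(gsd_true))
                 for _ in subset]

    logs = [math.log(x) for x in exposures]
    gm = math.exp(sum(logs) / len(logs))
    gsd = math.exp(math.sqrt(sum((v - math.log(gm)) ** 2 for v in logs)
                             / (len(logs) - 1)))
    print(f"sampled workers: {subset}")
    print(f"geometric mean = {gm:.2f} mg/m3, geometric SD = {gsd:.2f}")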

There are many approaches to forming HEGs; generally, workers may be classified by job task similarity or work area similarity. When both job and work area similarity are used, the method of classification is called zoning (see figure 5). Once airborne, chemical and biological agents can have complex and unpredictable spatial and temporal concentration patterns throughout the work environment. Therefore, proximity of the source relative to the employee may not be the best indicator of exposure similarity. Exposure measurements made on workers initially expected to have similar exposures may show that there is more variation between workers than predicted. In these cases, the exposure groups should be reconstructed into smaller sets of workers, and sampling should continue to verify that workers within each group actually have similar exposures (Rappaport 1995).

Figure 5.  Factors involved in creating HEGs using zoning.

IHY040T5

Exposures can be estimated for all the employees, regardless of job title or risk, or they can be estimated only for employees who are assumed to have the highest exposures; this is called worst-case sampling. The selection of worst-case sampling employees may be based upon production, proximity to the source, past sampling data, inventory and chemical toxicity. The worst-case method is used for regulatory purposes and does not provide a measure of long-term mean exposure or day-to-day variability. Task-related sampling involves selecting workers whose jobs include similar tasks that occur on a less than daily basis.

There are many factors that enter into exposure and can affect the success of HEG classification, including the following:

  1. Employees rarely perform the same work even when they have the same job description, and rarely have the same exposures.
  2. Employee work practices can significantly alter exposure.
  3. Workers who are mobile throughout the work area may be unpredictably exposed to several contaminant sources throughout the day.
  4. Air movement in a workplace can unpredictably increase the exposures of workers who are located a considerable distance from a source.
  5. Exposures may be determined not by the job tasks but by the work environment.

 

Sample duration

The concentrations of chemical agents in air samples are either measured directly in the field, obtaining immediate results (real-time or grab), or are collected over time in the field on sampling media or in sampling bags and are measured in a laboratory (integrated) (Lynch 1995). The advantage of real-time sampling is that results are obtained quickly onsite, and can capture measurements of short-term acute exposures. However, real-time methods are limited because they are not available for all contaminants of concern and they may not be analytically sensitive or accurate enough to quantify the targeted contaminants. Real-time sampling may not be applicable when the occupational hygienist is interested in chronic exposures and requires time-weighted-average measurements to compare with OELs.

Real-time sampling is used for emergency evaluations, obtaining crude estimates of concentration, leak detection, ambient air and source monitoring, evaluating engineering controls, monitoring short-term exposures of less than 15 minutes, monitoring episodic exposures, monitoring highly toxic chemicals (e.g., carbon monoxide) and explosive mixtures, and process monitoring. Real-time sampling methods can capture changing concentrations over time and provide immediate qualitative and quantitative information. Integrated air sampling is usually performed for personal monitoring, area sampling and for comparing concentrations to time-weighted-average OELs. The advantages of integrated sampling are that methods are available for a wide variety of contaminants; it can be used to identify unknowns; accuracy and specificity are high; and limits of detection are usually very low. Integrated samples that are analysed in a laboratory must contain enough contaminant to meet minimum detectable analytical requirements; therefore, samples are collected over a predetermined time period.

In addition to analytical requirements of a sampling method, sample duration should be matched to the sampling purpose. For source sampling, duration is based upon the process or cycle time, or when there are anticipated peaks of concentrations. For peak sampling, samples should be collected at regular intervals throughout the day to minimize bias and identify unpredictable peaks. The sampling period should be short enough to identify peaks while also providing a reflection of the actual exposure period.

For personal sampling, duration is matched to the occupational exposure limit, task duration or anticipated biological effect. Real-time sampling methods are used for assessing acute exposures to irritants, asphyxiants, sensitizers and allergenic agents. Chlorine, carbon monoxide and hydrogen sulphide are examples of chemicals that can exert their effects quickly and at relatively low concentrations.

Chronic disease agents such as lead and mercury are usually sampled for a full shift (seven hours or more per sample), using integrated sampling methods. To evaluate full-shift exposures, the occupational hygienist uses either a single sample or a series of consecutive samples that cover the entire shift. The sampling duration for exposures that occur for less than a full shift is usually tied to particular tasks or processes. Construction workers, indoor maintenance personnel and maintenance road crews are examples of jobs with exposures that are tied to tasks.
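
A minimal sketch of assembling a full-shift time-weighted average from a series of consecutive integrated samples follows; the concentrations and durations are invented for illustration.

    # Sketch: full-shift time-weighted average (TWA) from consecutive samples.
    # Each entry is (concentration in mg/m3, sampled minutes); values are illustrative.

    samples = [(0.12, 180), (0.30, 120), (0.08, 180)]   # 480 min = 8-hour shift

    total_minutes = sum(minutes for _, minutes in samples)
    twa = sum(conc * minutes for conc, minutes in samples) / total_minutes
    print(f"{total_minutes} min covered, 8-h TWA = {twa:.2f} mg/m3")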

How many samples and how often to sample?

Concentrations of contaminants can vary minute to minute, day to day and season to season, and variability can occur between individuals and within an individual. Exposure variability affects both the number of samples required and the accuracy of the results. Variations in exposure can arise from different work practices, changes in pollutant emissions, the volume of chemicals used, production quotas, ventilation, temperature changes, worker mobility and task assignments. Most sampling campaigns are performed on only a couple of days in a year; the measurements obtained are therefore not fully representative of long-term exposure. The period over which samples are collected is very short compared with the unsampled period, and the occupational hygienist must extrapolate from the sampled to the unsampled period. For long-term exposure monitoring, each worker selected from a HEG should be sampled multiple times over the course of weeks or months, and exposures should be characterized for all shifts. While the day shift may be the busiest, the night shift may have the least supervision and there may be lapses in work practices.

Measurement Techniques

Active and passive sampling

Contaminants are collected on sampling media either by actively pulling an air sample through the media, or by passively allowing the air to reach the media. Active sampling uses a battery-powered pump, and passive sampling uses diffusion or gravity to bring the contaminants to the sampling media. Gases, vapours, particulates and bioaerosols are all collected by active sampling methods; gases and vapours can also be collected by passive diffusion sampling.

For gases, vapours and most particulates, once the sample is collected the mass of the contaminant is measured, and concentration is calculated by dividing the mass by the volume of sampled air. For gases and vapours, concentration is expressed as parts per million (ppm) or mg/m3, and for particulates concentration is expressed as mg/m3 (Dinardi 1995).

In integrated sampling, air sampling pumps are critical components of the sampling system because concentration estimates require knowledge of the volume of sampled air. Pumps are selected based upon desired flowrate, ease of servicing and calibration, size, cost and suitability for hazardous environments. The primary selection criterion is flowrate: low-flow pumps (0.5 to 500 ml/min) are used for sampling gases and vapours; high-flow pumps (500 to 4,500 ml/min) are used for sampling particulates, bioaerosols, and gases and vapours. To ensure accurate sample volumes, pumps must be accurately calibrated. Calibration is performed using primary standards such as manual or electronic soap-bubble meters, which directly measure volume, or secondary methods such as wet test meters, dry gas meters and precision rotameters that are calibrated against primary methods.
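
The arithmetic implied here, from a calibrated flowrate and sampling time to a sampled air volume and then to a concentration, is sketched below. The conversion from mg/m3 to ppm assumes 25 °C and 1 atm (molar volume 24.45 litres per mole); the flowrate, collected mass and the toluene example are illustrative assumptions only.

    # Sketch: concentration from an integrated air sample.
    # Volume of air = pump flowrate x sampling time; concentration = mass / volume.
    # Values below (toluene-like example) are illustrative only.

    flowrate_l_per_min = 0.1        # low-flow pump, 100 ml/min
    duration_min = 240              # 4-hour sample
    mass_collected_mg = 0.50        # mass reported by the laboratory

    volume_m3 = flowrate_l_per_min * duration_min / 1000.0   # litres to m3
    conc_mg_m3 = mass_collected_mg / volume_m3

    # mg/m3 to ppm for a gas or vapour, using 24.45 l/mol at 25 degrees C, 1 atm.
    molecular_weight = 92.14        # toluene, g/mol
    conc_ppm = conc_mg_m3 * 24.45 / molecular_weight

    print(f"sampled volume = {volume_m3:.3f} m3")
    print(f"concentration  = {conc_mg_m3:.1f} mg/m3 = {conc_ppm:.1f} ppm")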

Gases and vapours: sampling media

Gases and vapours are collected using porous solid sorbent tubes, impingers, passive monitors and bags. Sorbent tubes are hollow glass tubes filled with a granular solid that adsorbs chemicals, unchanged, on its surface. Solid sorbents are specific for groups of compounds; commonly used sorbents include charcoal, silica gel and Tenax. Charcoal sorbent, an amorphous form of carbon, is electrically nonpolar and preferentially adsorbs organic gases and vapours. Silica gel, an amorphous form of silica, is used to collect polar organic compounds, amines and some inorganic compounds. Because of its affinity for polar compounds, it will adsorb water vapour; therefore, at elevated humidity, water can displace the less polar chemicals of interest from the silica gel. Tenax, a porous polymer, is used for sampling very low concentrations of nonpolar volatile organic compounds.

The ability to accurately capture the contaminants in air and avoid contaminant loss depends upon the sampling rate, sampling volume, and the volatility and concentration of the airborne contaminant. Collection efficiency of solid sorbents can be adversely affected by increased temperature, humidity, flowrate, concentration, sorbent particle size and number of competing chemicals. As collection efficiency decreases chemicals will be lost during sampling and concentrations will be underestimated. To detect chemical loss, or breakthrough, solid sorbent tubes have two sections of granular material separated by a foam plug. The front section is used for sample collection and the back section is used to determine breakthrough. Breakthrough has occurred when at least 20 to 25% of the contaminant is present in the back section of the tube. Analysis of contaminants from solid sorbents requires extraction of the contaminant from the medium using a solvent. For each batch of sorbent tubes and chemicals collected, the laboratory must determine the desorption efficiency, the efficiency of removal of chemicals from the sorbent by the solvent. For charcoal and silica gel, the most commonly used solvent is carbon disulphide. For Tenax, the chemicals are extracted using thermal desorption directly into a gas chromatograph.
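
The two laboratory checks mentioned here, the breakthrough test on the back section of the tube and the desorption-efficiency correction, can be sketched as follows; the masses and the efficiency value are invented for illustration.

    # Sketch: breakthrough check and desorption-efficiency correction for a
    # two-section solid sorbent tube. All masses and the efficiency are illustrative.

    front_section_mg = 0.80      # mass found on the front (collection) section
    back_section_mg = 0.05       # mass found on the back (breakthrough) section
    desorption_efficiency = 0.95 # fraction recovered from the sorbent by the solvent

    breakthrough_fraction = back_section_mg / (front_section_mg + back_section_mg)
    if breakthrough_fraction >= 0.20:
        print("breakthrough: sample may underestimate exposure, resample")
    else:
        print(f"breakthrough fraction = {breakthrough_fraction:.1%} (acceptable)")

    # Correct the total collected mass for incomplete desorption.
    corrected_mass_mg = (front_section_mg + back_section_mg) / desorption_efficiency
    print(f"corrected mass = {corrected_mass_mg:.2f} mg")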

Impingers are usually glass bottles with an inlet tube that allows air to be drawn into the bottle through a solution that collects the gases and vapours by absorption either unchanged in solution or by a chemical reaction. Impingers are used less and less in workplace monitoring, especially for personal sampling, because they can break, and the liquid media can spill onto the employee. There are a variety of types of impingers, including gas wash bottles, spiral absorbers, glass bead columns, midget impingers and fritted bubblers. All impingers can be used to collect area samples; the most commonly used impinger, the midget impinger, can be used for personal sampling as well.

Passive, or diffusion, monitors are small, have no moving parts and are available for both organic and inorganic contaminants. Most organic monitors use activated charcoal as the collection medium. In theory, any compound that can be sampled with a charcoal sorbent tube and pump can be sampled using a passive monitor. Each monitor has a uniquely designed geometry that gives an effective sampling rate. Sampling starts when the monitor cover is removed and ends when the cover is replaced. Most diffusion monitors are accurate for eight-hour time-weighted-average exposures and are not appropriate for short-term exposures.

Sampling bags can be used to collect integrated samples of gases and vapours. They have low permeability and low adsorptivity, which allow storage for a day with minimal loss. Bags are made of Teflon (polytetrafluoroethylene) or Tedlar (polyvinyl fluoride).

Sampling media: particulate materials

Occupational sampling for particulate materials, or aerosols, is currently in a state of flux; traditional sampling methods will eventually be replaced by particle size selective (PSS) sampling methods. Traditional sampling methods will be discussed first, followed by PSS methods.

The most commonly used media for collecting aerosols are fibre or membrane filters; aerosol removal from the air stream occurs by collision and attachment of the particles to the surface of the filters. The choice of filter medium depends upon the physical and chemical properties of the aerosols to be sampled, the type of sampler and the type of analysis. Candidate filters must be evaluated for collection efficiency, pressure drop, hygroscopicity, background contamination, strength and pore size, which can range from 0.01 to 10 μm. Membrane filters are manufactured in a variety of pore sizes and are usually made from cellulose ester, polyvinyl chloride or polytetrafluoroethylene. Particle collection occurs at the surface of the filter; therefore, membrane filters are usually used in applications where microscopy will be performed. Mixed cellulose ester filters can be easily dissolved with acid and are usually used for the collection of metals for analysis by atomic absorption. Nuclepore filters (polycarbonate) are very strong and thermally stable, and are used for sampling and analysing asbestos fibres using transmission electron microscopy. Fibre filters are usually made of fibreglass and are used to sample aerosols such as pesticides and lead.

For occupational exposures to aerosols, a known volume of air can be sampled through the filters, the total increase in mass (gravimetric analysis) can be measured (mg/m3 air), the total number of particles can be counted (fibres/cc) or the aerosols can be identified (chemical analysis). For mass calculations, the total dust that enters the sampler or only the respirable fraction can be measured. For total dust, the increase in mass represents exposure from deposition in all parts of the respiratory tract. Total dust samplers are subject to error due to high winds passing across the sampler and improper orientation of the sampler. High winds, and filters facing upright, can result in collection of extra particles and overestimation of exposure.

For respirable dust sampling, the increase in mass represents exposure from deposition in the gas exchange (alveolar) region of the respiratory tract. To collect only the respirable fraction, a preclassifier called a cyclone is used to alter the distribution of airborne dust presented to the filter. Aerosols are drawn into the cyclone, accelerated and whirled, causing the heavier particles to be thrown out to the edge of the air stream and dropped to a removal section at the bottom of the cyclone. The respirable particles that are less than 10 μm remain in the air stream and are drawn up and collected on the filter for subsequent gravimetric analysis.

Sampling errors encountered when performing total and respirable dust sampling result in measurements that do not accurately reflect exposure or relate to adverse health effects. Therefore, PSS has been proposed to redefine the relationship between particle size, adverse health impact and sampling method. In PSS sampling, the measurement of particles is related to the sizes that are associated with specific health effects. The International Organization for Standardization (ISO) and the ACGIH have proposed three particulate mass fractions: inhalable particulate mass (IPM), thoracic particulate mass (TPM) and respirable particulate mass (RPM). IPM refers to particles that can be expected to enter through the nose and mouth, and would replace the traditional total mass fraction. TPM refers to particles that can penetrate the upper respiratory system past the larynx. RPM refers to particles that are capable of depositing in the gas-exchange region of the lung, and would replace the current respirable mass fraction. The practical adoption of PSS sampling requires the development of new aerosol sampling methods and PSS-specific occupational exposure limits.

Sampling media: biological materials

There are few standardized methods for sampling biological material or bioaerosols. Although sampling methods are similar to those used for other airborne particulates, viability of most bioaerosols must be preserved to ensure laboratory culturability. Therefore, they are more difficult to collect, store and analyse. The strategy for sampling bioaerosols involves collection directly on semisolid nutrient agar or plating after collection in fluids, incubation for several days and identification and quantification of the cells that have grown. The mounds of cells that have multiplied on the agar can be counted as colony-forming units (CFU) for viable bacteria or fungi, and plaque-forming units (PFU) for active viruses. With the exception of spores, filters are not recommended for bioaerosol collection because dehydration causes cell damage.

Viable aerosolized micro-organisms are collected using all-glass impingers (AGI-30), slit samplers and inertial impactors. Impingers collect bioaerosols in liquid and the slit sampler collects bioaerosols on glass slides at high volumes and flowrates. The impactor is used with one to six stages, each containing a Petri dish, to allow for separation of particles by size.

Interpretation of sampling results must be done on a case-by-case basis because there are no occupational exposure limits. Evaluation criteria must be determined prior to sampling; for indoor air investigations, in particular, samples taken outside of the building are used as a background reference. A rule of thumb is that concentrations should be ten times background to suspect contamination. When using culture plating techniques, concentrations are probably underestimated because of losses of viability during sampling and incubation.
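
As an illustration of how plate counts are converted to airborne concentrations and compared with the background reference, the sketch below uses invented counts and a flow rate typical of a multi-stage impactor.

    # Hypothetical bioaerosol sample: colonies grown on one agar plate,
    # converted to CFU per cubic metre of sampled air and compared with an
    # outdoor background reference.

    colonies_counted = 120      # CFU on the plate after incubation
    flow_rate_lpm = 28.3        # impactor flow rate (litres per minute), a typical value
    sample_time_min = 2         # short sample to avoid overloading the plate

    air_volume_m3 = flow_rate_lpm * sample_time_min / 1000.0
    indoor_cfu_m3 = colonies_counted / air_volume_m3

    outdoor_cfu_m3 = 350.0      # background sample taken outside the building

    print(f"Indoor concentration: {indoor_cfu_m3:.0f} CFU/m3")
    print(f"Ratio to outdoor background: {indoor_cfu_m3 / outdoor_cfu_m3:.1f}")
    # by the rule of thumb above, a ratio approaching 10 would suggest contamination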

Skin and surface sampling

There are no standard methods for evaluating skin exposure to chemicals and predicting dose. Surface sampling is performed primarily to evaluate work practices and identify potential sources of skin absorption and ingestion. Two types of surface sampling methods are used to assess dermal and ingestion potential: direct methods, which involve sampling the skin of a worker, and indirect methods, which involve wipe sampling surfaces.

Direct skin sampling involves placing gauze pads on the skin to absorb chemicals, rinsing the skin with solvents to remove contaminants and using fluorescence to identify skin contamination. Gauze pads are placed on different parts of the body and are either left exposed or are placed under personal protective equipment. At the end of the workday the pads are removed and analysed in the laboratory; the distribution of concentrations from the different parts of the body is used to identify areas of skin exposure. This method is inexpensive and easy to perform; however, the results are limited because gauze pads are not good physical models of the absorption and retention properties of skin, and measured concentrations are not necessarily representative of the entire body.

Skin rinses involve wiping the skin with solvents or placing hands in plastic bags filled with solvents to measure the concentration of chemicals on the surface. This method can underestimate dose because only the unabsorbed fraction of chemicals is collected.

Fluorescence monitoring is used to identify skin exposure for chemicals that naturally fluoresce, such as polynuclear aromatics, and to identify exposures for chemicals in which fluorescent compounds have been intentionally added. The skin is scanned with an ultraviolet light to visualize contamination. This visualization provides workers with evidence of the effect of work practices on exposure; research is underway to quantify the fluorescence intensity and relate it to dose.

Indirect wipe sampling methods involve the use of gauze, glass fibre filters or cellulose paper filters, to wipe the insides of gloves or respirators, or the tops of surfaces. Solvents may be added to increase collection efficiency. The gauze or filters are then analysed in the laboratory. To standardize the results and enable comparison between samples, a square template is used to sample a 100 cm2 area.
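
A brief sketch of how wipe results are normalised to the template area so that surfaces can be compared; the locations and laboratory results below are invented.

    # Hypothetical wipe-sampling results normalised to the 100 cm2 template
    # area, allowing surface loadings from different locations to be compared.

    template_cm2 = 100.0
    results_ug = {"workbench": 450.0, "lunchroom table": 12.0}   # lab result per wipe

    for location, mass_ug in results_ug.items():
        loading = mass_ug / template_cm2 * 100.0   # micrograms per 100 cm2
        print(f"{location}: {loading:.0f} ug per 100 cm2")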

Biological media

Blood, urine and exhaled air samples are the most suitable specimens for routine biological monitoring, while hair, milk, saliva and nails are less frequently used. Biological monitoring is performed by collecting bulk blood and urine samples in the workplace and analysing them in the laboratory. Exhaled air samples are collected in Tedlar bags, specially designed glass pipettes or sorbent tubes, and are analysed in the field using direct-reading instruments, or in the laboratory. Blood, urine and exhaled air samples are primarily used to measure the unchanged parent compound (same chemical that is sampled in workplace air), its metabolite or a biochemical change (intermediate) that has been induced in the body. For example, the parent compound lead is measured in blood to evaluate lead exposure, the metabolite mandelic acid is measured in urine for both styrene and ethyl benzene, and carboxyhaemoglobin is the intermediate measured in blood for both carbon monoxide and methylene chloride exposure. For exposure monitoring, the concentration of an ideal determinant will be highly correlated with intensity of exposure. For medical monitoring, the concentration of an ideal determinant will be highly correlated with target organ concentration.

The timing of specimen collection can impact the usefulness of the measurements; samples should be collected at times which most accurately reflect exposure. Timing is related to the excretion biological half-life of a chemical, which reflects how quickly a chemical is eliminated from the body; this can vary from hours to years. Target organ concentrations of chemicals with short biological half-lives closely follow the environmental concentration; target organ concentrations of chemicals with long biological half-lives fluctuate very little in response to environmental exposures. For chemicals with short biological half-lives, less than three hours, a sample is taken immediately at the end of the workday, before concentrations rapidly decline, to reflect exposure on that day. Samples may be taken at any time for chemicals with long half-lives, such as polychlorinated biphenyls and lead.
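
The effect of half-life on sampling time can be illustrated with a simple first-order elimination calculation; the two half-lives below are assumed values, chosen only to contrast a short-lived determinant with a long-lived one such as lead.

    import math

    def fraction_remaining(hours_after_exposure, half_life_h):
        """First-order elimination: fraction of the determinant still present."""
        return 0.5 ** (hours_after_exposure / half_life_h)

    # Sampling the next morning, about 16 hours after the end of the shift:
    for label, half_life_h in (("short-lived determinant (3 h)", 3.0),
                               ("long-lived determinant (~700 h)", 700.0)):
        remaining = fraction_remaining(16.0, half_life_h)
        print(f"{label}: {remaining:.0%} of the end-of-shift level remains")
    # the short-lived determinant has largely disappeared, so it must be sampled
    # at the end of the workday; the long-lived one is essentially unchanged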

Real-time monitors

Direct-reading instruments provide real-time quantification of contaminants; the sample is analysed within the equipment and does not require off-site laboratory analysis (Maslansky and Maslansky 1993). Compounds can be measured without first collecting them on separate media, then shipping, storing and analysing them. Concentration is read directly from a meter, display, strip chart recorder or data logger, or from a colour change. Direct-reading instruments are primarily used for gases and vapours; a few instruments are available for monitoring particulates. Instruments vary in cost, complexity, reliability, size, sensitivity and specificity. They include simple devices, such as colorimetric tubes, that use a colour change to indicate concentration; dedicated instruments that are specific for a chemical, such as carbon monoxide indicators, combustible gas indicators (explosimeters) and mercury vapour meters; and survey instruments, such as infrared spectrometers, that screen large groups of chemicals. Direct-reading instruments use a variety of physical and chemical methods to analyse gases and vapours, including conductivity, ionization, potentiometry, photometry, radioactive tracers and combustion.

Commonly used portable direct-reading instruments include battery-powered gas chromatographs, organic vapour analysers and infrared spectrometers. Gas chromatographs and organic vapour monitors are primarily used for environmental monitoring at hazardous waste sites and for community ambient air monitoring. Gas chromatographs with appropriate detectors are specific and sensitive, and can quantify chemicals at very low concentrations. Organic vapour analysers are usually used to measure classes of compounds. Portable infrared spectrometers are primarily used for occupational monitoring and leak detection because they are sensitive and specific for a wide range of compounds.

Small direct-reading personal monitors are available for a few common gases (chlorine, hydrogen cyanide, hydrogen sulphide, hydrazine, oxygen, phosgene, sulphur dioxide, nitrogen dioxide and carbon monoxide). They accumulate concentration measurements over the course of the day and can provide a direct readout of time-weighted-average concentration as well as provide a detailed contaminant profile for the day.
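
The time-weighted-average readout such monitors provide is simply the sum of concentration multiplied by duration, divided by the total time; the sketch below performs that reduction on an invented set of logged readings.

    # Reduction performed by a datalogging personal monitor: concentration x
    # duration summed over the day, divided by total time. Readings are invented.

    readings = [(2.0, 120), (15.0, 30), (1.0, 300), (8.0, 30)]   # (ppm, minutes)

    total_minutes = sum(minutes for _, minutes in readings)
    twa_ppm = sum(ppm * minutes for ppm, minutes in readings) / total_minutes

    print(f"{total_minutes} min time-weighted average: {twa_ppm:.1f} ppm")   # 2.6 ppm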

Colorimetric tubes (detector tubes) are simple to use, cheap and available for a wide variety of chemicals. They can be used to quickly identify classes of air contaminants and to provide rough estimates of concentrations that can be used when determining pump flow rates and volumes. Colorimetric tubes are glass tubes filled with a solid granular material which has been impregnated with a chemical agent that can react with a contaminant and create a colour change. After the two sealed ends of a tube are broken open, one end of the tube is placed in a hand pump. The recommended volume of contaminated air is sampled through the tube by using the specified number of pump strokes for a particular chemical. A colour change or stain is produced on the tube, usually within two minutes, and the length of the stain is proportional to concentration. Some colorimetric tubes have been adapted for long-duration sampling and are used with battery-powered pumps that can run for at least eight hours; the colour change produced then represents a time-weighted-average concentration. Colorimetric tubes are useful for both qualitative and quantitative analysis; however, their specificity and accuracy are limited, and their accuracy is not as high as that of laboratory methods or many other real-time instruments. There are hundreds of tubes, many of which have cross-sensitivities and can detect more than one chemical; this can result in interferences that modify the measured concentrations.

Direct-reading aerosol monitors cannot distinguish between contaminants, are usually used for counting or sizing particles, and are primarily used for screening, not to determine TWA or acute exposures. Real-time instruments use optical or electrical properties to determine total and respirable mass, particle count and particle size. Light-scattering aerosol monitors, or aerosol photometers, detect the light scattered by particles as they pass through a volume in the equipment. As the number of particles increases, the amount of scattered light increases and is proportional to mass. Light-scattering aerosol monitors cannot be used to distinguish between particle types; however, if they are used in a workplace where there are a limited number of dusts present, the mass can be attributed to a particular material. Fibrous aerosol monitors are used to measure the airborne concentration of particles such as asbestos. Fibres are aligned in an oscillating electric field and are illuminated with a helium neon laser; the resulting pulses of light are detected by a photomultiplier tube. Light-attenuating photometers measure the extinction of light by particles; the ratio of incident light to measured light is proportional to concentration.

Analytical Techniques

There are many available methods for analysing laboratory samples for contaminants. Some of the more commonly used techniques for quantifying gases and vapours in air include gas chromatography, mass spectrometry, atomic absorption, infrared and UV spectroscopy and polarography.

Gas chromatography is a technique used to separate and concentrate chemicals in mixtures for subsequent quantitative analysis. There are three main components to the system: the sample injection system, a column and a detector. A liquid or gaseous sample is injected, using a syringe, into a carrier gas stream that carries the sample through a column where the components are separated. The column is packed with materials that interact differently with different chemicals and slow down their movement; this differential interaction causes each chemical to travel through the column at a different rate. After separation, the chemicals go directly into a detector, such as a flame ionization detector (FID), photo-ionization detector (PID) or electron capture detector (ECD); a signal proportional to concentration is registered on a chart recorder. The FID is used for almost all organics, including aromatics, straight-chain hydrocarbons, ketones and some chlorinated hydrocarbons. Concentration is measured by the increase in the number of ions produced as a volatile hydrocarbon is burned by a hydrogen flame. The PID is used for organics and some inorganics; it is especially useful for aromatic compounds such as benzene, and it can detect aliphatic, aromatic and halogenated hydrocarbons. Concentration is measured by the increase in the number of ions produced when the sample is bombarded by ultraviolet radiation. The ECD is primarily used for halogen-containing chemicals; it gives a minimal response to hydrocarbons, alcohols and ketones. Concentration is measured by the current flow between two electrodes caused by ionization of the gas by radioactivity.

The mass spectrometer is used to analyse complex mixtures of chemicals present in trace amounts. It is often coupled with a gas chromatograph for the separation and quantification of different contaminants.

Atomic absorption spectroscopy is primarily used for the quantification of metals such as mercury. Atomic absorption is the absorption of light of a particular wavelength by a free, ground-state atom; the quantity of light absorbed is related to concentration. The technique is highly specific, sensitive and fast, and is directly applicable to approximately 68 elements. Detection limits are in the sub-ppb to low-ppm range.

Infrared analysis is a powerful, sensitive, specific and versatile technique. It uses the absorption of infrared energy to measure many inorganic and organic chemicals; the amount of light absorbed is proportional to concentration. The absorption spectrum of a compound provides information enabling its identification and quantification.

UV absorption spectroscopy is used for analysis of aromatic hydrocarbons when interferences are known to be low. The amount of absorption of UV light is directly proportional to concentration.

Polarographic methods are based upon the electrolysis of a sample solution using an easily polarized electrode and a nonpolarizable electrode. They are used for qualitative and quantitative analysis of aldehydes, chlorinated hydrocarbons and metals.

 


After a hazard has been recognized and evaluated, the most appropriate interventions (methods of control) for a particular hazard must be determined. Control methods usually fall into three categories:

  1. engineering controls
  2. administrative controls
  3. personal protective equipment.

 

As with any change in work processes, training must be provided to ensure the success of the changes.

Engineering controls are changes to the process or equipment that reduce or eliminate exposures to an agent. Substituting a less toxic chemical in a process or installing exhaust ventilation to remove vapours generated during a process step are examples of engineering controls. In the case of noise control, installing sound-absorbing materials, building enclosures and installing mufflers on air exhaust outlets are engineering controls. Another type of engineering control is changing the process itself; an example would be the removal of one or more degreasing steps from a process that originally required three. By removing the need for the task that produced the exposure, the worker’s overall exposure has been controlled. The advantage of engineering controls is the relatively small involvement of the worker, who can go about the job in a more controlled environment when, for instance, contaminants are automatically removed from the air. Contrast this with the situation where the selected method of control is a respirator to be worn by the worker while performing the task in an “uncontrolled” workplace. In addition to the employer actively installing engineering controls on existing equipment, new equipment containing the controls, or other more effective controls, can be purchased. A combination approach has often been effective (e.g., installing some engineering controls now and requiring personal protective equipment until new equipment with more effective controls arrives and eliminates the need for personal protective equipment). Some common examples of engineering controls are:

  • ventilation (both general and local exhaust ventilation)
  • isolation (place a barrier between the worker and the agent)
  • substitution (substitute less toxic, less flammable material, etc.)
  • change the process (eliminate hazardous steps).

 

The occupational hygienist must be sensitive to the worker’s job tasks and must solicit worker participation when designing or selecting engineering controls. Placing barriers in the workplace, for example, could significantly impair a worker’s ability to perform the job and may encourage “work-arounds”. Engineering controls are the most effective methods of reducing exposures; they are also often the most expensive. Because they are both effective and expensive, it is important to maximize the involvement of workers in the selection and design of the controls, which increases the likelihood that the controls will in fact reduce exposures.

Administrative controls involve changes in how a worker accomplishes the necessary job tasks—for example, how long they work in an area where exposures occur, or changes in work practices such as improvements in body positioning to reduce exposures. Administrative controls can add to the effectiveness of an intervention but have several drawbacks:

  1. Rotation of workers may reduce overall average exposure for the workday but it provides periods of high short-term exposure for a larger number of workers. As more becomes known about toxicants and their modes of action, short-term peak exposures may represent a greater risk than would be calculated based on their contribution to average exposure.
  2. Changing work practices of workers can present a significant enforcement and monitoring challenge. How work practices are enforced and monitored determines whether or not they will be effective. This constant management attention is a significant cost of administrative controls.

 

Personal protective equipment consists of devices provided to the worker and required to be worn while performing certain (or all) job tasks. Examples include respirators, chemical goggles, protective gloves and faceshields. Personal protective equipment is commonly used in cases where engineering controls have not been effective in controlling the exposure to acceptable levels or where engineering controls have not been found to be feasible (for cost or operational reasons). Personal protective equipment can provide significant protection to workers if worn and used correctly. In the case of respiratory protection, protection factors (ratio of concentration outside the respirator to that inside) can be 1,000 or more for positive-pressure supplied air respirators or ten for half-face air-purifying respirators. Gloves (if selected appropriately) can protect hands for hours from solvents. Goggles can provide effective protection from chemical splashes.
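
A minimal sketch of what these protection factors mean in practice, assuming a hypothetical ambient concentration and exposure limit:

    # Protection factor (PF) = concentration outside the respirator / concentration
    # inside. The ambient level and exposure limit below are hypothetical.

    ambient_ppm = 600.0
    exposure_limit_ppm = 50.0

    for respirator, protection_factor in (("half-face air-purifying respirator", 10),
                                          ("positive-pressure supplied-air respirator", 1000)):
        inside_ppm = ambient_ppm / protection_factor
        verdict = "below" if inside_ppm < exposure_limit_ppm else "NOT below"
        print(f"{respirator}: about {inside_ppm:g} ppm inside the facepiece "
              f"({verdict} the limit)")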

Intervention: Factors to Consider

Often a combination of controls is used to reduce the exposures to acceptable levels. Whatever methods are selected, the intervention must reduce the exposure and resulting hazard to an acceptable level. There are, however, many other factors that need to be considered when selecting an intervention. For example:

  • effectiveness of the controls
  • ease of use by the employee
  • cost of the controls
  • adequacy of the warning properties of the material
  • acceptable level of exposure
  • frequency of exposure
  • route(s) of exposure
  • regulatory requirements for specific controls.

 

Effectiveness of controls

Effectiveness of controls is obviously a prime consideration when taking action to reduce exposures. When comparing one type of intervention to another, the level of protection required must be appropriate for the challenge; too much control is a waste of resources. Those resources could be used to reduce other exposures or exposures of other employees. On the other hand, too little control leaves the worker exposed to unhealthy conditions. A useful first step is to rank the interventions according to their effectiveness, then use this ranking to evaluate the significance of the other factors.

Ease of use

For any control to be effective the worker must be able to perform his or her job tasks with the control in place. For example, if the control method selected is substitution, then the worker must know the hazards of the new chemical, be trained in safe handling procedures, understand proper disposal procedures, and so on. If the control is isolation—placing an enclosure around the substance or the worker—the enclosure must allow the worker to do his or her job. If the control measures interfere with the tasks of the job, the worker will be reluctant to use them and may find ways to accomplish the tasks that could result in increased, not decreased, exposures.

Cost

Every organization has limits on its resources, and the challenge is to maximize the use of those resources. When hazardous exposures are identified and an intervention strategy is being developed, cost must be a factor. The “best buy” will often be neither the lowest- nor the highest-cost solution. Cost becomes a factor only after several viable methods of control have been identified; it can then be used to select the controls that will work best in that particular situation. If cost is the determining factor at the outset, poor or ineffective controls may be selected, or controls that interfere with the process in which the employee is working. It would be unwise to select an inexpensive set of controls that interfere with and slow down a manufacturing process: the process would then have a lower throughput and higher cost, and in a very short time the “real” costs of these “low-cost” controls would become enormous. Industrial engineers understand the layout and overall process; production engineers understand the manufacturing steps and processes; financial analysts understand the resource allocation problems. Occupational hygienists can provide a unique insight into these discussions because they understand the specific employee’s job tasks, the employee’s interaction with the manufacturing equipment and how the controls will work in a particular setting. This team approach increases the likelihood of selecting the most appropriate (from a variety of perspectives) control.

Adequacy of warning properties

When protecting a worker against an occupational health hazard, the warning properties of the material, such as odour or irritation, must be considered. For example, if a semiconductor worker is working in an area where arsine gas is used, the extreme toxicity of the gas poses a significant potential hazard. The situation is compounded by arsine’s very poor warning properties: workers cannot detect the gas by sight or smell until it is well above acceptable levels. Controls that are only marginally effective at keeping exposures below acceptable levels should therefore not be considered, because excursions above acceptable levels cannot be detected by the workers. Instead, engineering controls should be installed to isolate the worker from the material, and a continuous arsine gas monitor should be installed to warn workers of any failure of the engineering controls. In situations involving high toxicity and poor warning properties, preventive occupational hygiene is practised. The occupational hygienist must be flexible and thoughtful when approaching an exposure problem.

Acceptable level of exposure

If controls are being considered to protect a worker from a substance such as acetone, where the acceptable level of exposure may be in the range of 800 ppm, controlling to a level of 400 ppm or less may be achieved relatively easily. Contrast the example of acetone control to control of 2-ethoxyethanol, where the acceptable level of exposure may be in the range of 0.5 ppm. To obtain the same per cent reduction (0.5 ppm to 0.25 ppm) would probably require different controls. In fact, at these low levels of exposure, isolation of the material may become the primary means of control. At high levels of exposure, ventilation may provide the necessary reduction. Therefore, the acceptable level determined (by the government, company, etc.) for a substance can limit the selection of controls.

Frequency of exposure

When assessing toxicity the classic model uses the following relationship:

TIME x CONCENTRATION = DOSE 

Dose, in this case, is the amount of material being made available for absorption. The previous discussion focused on minimizing (lowering) the concentration portion of this relationship. One might also reduce the time spent being exposed (the underlying reason for administrative controls). This would similarly reduce the dose. The issue here is not the employee spending time in a room, but how often an operation (task) is performed. The distinction is important. In the first example, the exposure is controlled by removing the workers when they are exposed to a selected amount of toxicant; the intervention effort is not directed at controlling the amount of toxicant (in many situations there may be a combination approach). In the second case, the frequency of the operation is being used to provide the appropriate controls, not to determine a work schedule. For example, if an operation such as degreasing is performed routinely by an employee, the controls may include ventilation, substitution of a less toxic solvent or even automation of the process. If the operation is performed rarely (e.g., once per quarter) personal protective equipment may be an option (depending on many of the factors described in this section). As these two examples illustrate, the frequency with which an operation is performed can directly affect the selection of controls. Whatever the exposure situation, the frequency with which a worker performs the tasks must be considered and factored into the control selection.
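
A small sketch of this relationship, using invented numbers, shows why halving either the concentration (an engineering control) or the time spent on the task (an administrative control, or a lower task frequency) halves the nominal dose:

    # Illustrative only: dose surrogate = concentration x time (ppm-hours).

    def nominal_dose(concentration_ppm, hours):
        return concentration_ppm * hours

    baseline = nominal_dose(100.0, 8.0)            # 800 ppm-h over a full shift
    with_ventilation = nominal_dose(50.0, 8.0)     # concentration halved: 400 ppm-h
    with_less_time = nominal_dose(100.0, 4.0)      # exposure time halved: 400 ppm-h

    print(baseline, with_ventilation, with_less_time)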

Route(s) of exposure

The route of exposure will obviously affect the method of control. If a respiratory irritant is present, ventilation, respirators and so on would be considered. The challenge for the occupational hygienist is identifying all routes of exposure. For example, glycol ethers are used as a carrier solvent in printing operations. Breathing-zone air concentrations can be measured and controls implemented. Glycol ethers, however, are rapidly absorbed through intact skin; the skin represents a significant route of exposure and must be considered. In fact, if the wrong gloves are chosen, the skin exposure may continue long after the air exposures have decreased (because the employee continues to use gloves that have experienced breakthrough). The hygienist must evaluate the substance—its physical, chemical and toxicological properties, and so on—to determine which routes of exposure are possible and plausible (based on the tasks performed by the employee).

Regulatory requirements

In any discussion of controls, the regulatory requirements for controls must also be considered. There may well be codes of practice, regulations and so on that require a specific set of controls. The occupational hygienist has flexibility above and beyond the regulatory requirements, but the minimum mandated controls must be installed. Another aspect of the regulatory requirements is that the mandated controls may not work as well as, or may conflict with, the best judgement of the occupational hygienist. The hygienist must be creative in these situations and find solutions that satisfy the regulatory as well as the best-practice goals of the organization.

Training and Labelling

Regardless of what form of intervention is eventually selected, training and other forms of notification must be provided to ensure that the workers understand the interventions, why they were selected, what reductions in exposure are expected, and the role of the workers in achieving those reductions. Without the participation and understanding of the workforce, the interventions will likely fail or at least operate at reduced efficiency. Training builds hazard awareness in the workforce. This new awareness can be invaluable to the occupational hygienist in identifying and reducing previously unrecognized exposures or new exposures.

Training, labelling and related activities may be part of a regulatory compliance scheme. It would be prudent to check the local regulations to ensure that whatever type of training or labelling is undertaken satisfies the regulatory as well as operational requirements.

Conclusion

In this short discussion of interventions, some general considerations have been presented to stimulate thought. In practice, these decisions become very complex and often have significant ramifications for employee and company health. The occupational hygienist’s professional judgement is essential in selecting the best controls; “best” is a term with many different meanings here. The occupational hygienist must become adept at working in teams and at soliciting input from the workers, management and technical staff.

 


Workplace exposure assessment is concerned with identifying and evaluating agents with which a worker may come in contact, and exposure indices can be constructed to reflect the amount of an agent present in the general environment or in inhaled air, as well as to reflect the amount of agent that is actually inhaled, swallowed or otherwise absorbed (the intake). Other indices include the amount of agent that is resorbed (the uptake) and the exposure at the target organ. Dose is a pharmacological or toxicological term used to indicate the amount of a substance administered to a subject. Dose rate is the amount administered per unit of time. The dose of a workplace exposure is difficult to determine in a practical situation, since physical and biological processes, like inhalation, uptake and distribution of an agent in the human body cause exposure and dose to have complex, non-linear relationships. The uncertainty about the actual level of exposure to agents also makes it difficult to quantify relationships between exposure and health effects.

For many occupational exposures there exists a time window during which the exposure or dose is most relevant to the development of a particular health-related problem or symptom. Hence, the biologically relevant exposure, or dose, would be that exposure which occurs during the relevant time window. Some exposures to occupational carcinogens are believed to have such a relevant time window of exposure. Cancer is a disease with a long latency period, and hence it could be that the exposure which is related to the ultimate development of the disease took place many years before the cancer actually manifested itself. This phenomenon is counter-intuitive, since one would have expected that cumulative exposure over a working lifetime would have been the relevant parameter. The exposure at the time of manifestation of disease may not be of particular importance.

The pattern of exposure—continuous exposure, intermittent exposure and exposure with or without sharp peaks—may be relevant as well. Taking exposure patterns into account is important for both epidemiological studies and for environmental measurements which may be used to monitor compliance with health standards or for environmental control as part of control and prevention programmes. For example, if a health effect is caused by peak exposures, such peak levels must be monitorable in order to be controlled. Monitoring which provides data only about long-term average exposures is not useful since the peak excursion values may well be masked by averaging, and certainly cannot be controlled as they occur.

The biologically relevant exposure or dose for a certain endpoint is often not known because the patterns of intake, uptake, distribution and elimination, or the mechanisms of biotransformation, are not understood in sufficient detail. Both the rate at which an agent enters and leaves the body (the kinetics) and the biochemical processes for handling the substance (biotransformation) will help determine the relationships between exposure, dose and effect.

Environmental monitoring is the measurement and assessment of agents at the workplace to evaluate ambient exposure and related health risks. Biological monitoring is the measurement and assessment of workplace agents or their metabolites in tissue, secreta or excreta to evaluate exposure and assess health risks. Sometimes biomarkers, such as DNA-adducts, are used as measures of exposure. Biomarkers may also be indicative of the mechanisms of the disease process itself, but this is a complex subject, which is covered more fully in the chapter Biological Monitoring and later in the discussion here.

A simplification of the basic model in exposure-response modelling is as follows:

exposure → uptake → distribution, elimination, transformation → target dose → physiopathology → effect

Depending on the agent, exposure-uptake and exposure-intake relationships can be complex. For many gases, simple approximations can be made, based on the concentration of the agent in the air during the course of a working day and on the amount of air that is inhaled. For dust sampling, deposition patterns are also related to particle size. Size considerations may also lead to a more complex relationship. The chapter Respiratory System provides more detail on the aspect of respiratory toxicity.

Exposure and dose assessment are elements of quantitative risk assessment. Health risk assessment methods often form the basis upon which exposure limits are established for emission levels of toxic agents in the air for environmental as well as for occupational standards. Health risk analysis provides an estimate of the probability (risk) of occurrence of specific health effects or an estimate of the number of cases with these health effects. By means of health risk analysis an acceptable concentration of a toxicant in air, water or food can be provided, given an a priori chosen acceptable magnitude of risk. Quantitative risk analysis has found an application in cancer epidemiology, which explains the strong emphasis on retrospective exposure assessment. But applications of more elaborate exposure assessment strategies can be found in both retrospective as well as prospective exposure assessment, and exposure assessment principles have found applications in studies focused on other endpoints as well, such as benign respiratory disease (Wegman et al. 1992; Post et al. 1994). Two directions in research predominate at this moment. One uses dose estimates obtained from exposure monitoring information, and the other relies on biomarkers as measures of exposure.

Exposure Monitoring and Prediction of Dose

Unfortunately, for many exposures few quantitative data are available for predicting the risk for developing a certain endpoint. As early as 1924, Haber postulated that the severity of the health effect (H) is proportional to the product of exposure concentration (X) and time of exposure (T):

H = X × T

Haber’s law, as it is called, formed the basis for development of the concept that time-weighted average (TWA) exposure measurements—that is, measurements taken and averaged over a certain period of time—would be a useful measure for the exposure. This assumption about the adequacy of the time-weighted average has been questioned for many years. In 1952, Adams and co-workers stated that “there is no scientific basis for the use of the time-weighted average to integrate varying exposures …” (in Atherly 1985). The problem is that many relations are more complex than the relationship that Haber’s law represents. There are many examples of agents where the effect is more strongly determined by concentration than by length of time. For example, interesting evidence from laboratory studies has shown that in rats exposed to carbon tetrachloride, the pattern of exposure (continuous versus intermittent and with or without peaks) as well as the dose can modify the observed risk of the rats developing liver enzyme level changes (Bogers et al. 1987). Another example is bio-aerosols, such as α-amylase enzyme, a dough improver, which can cause allergic diseases in people who work in the bakery industry (Houba et al. 1996). It is unknown whether the risk of developing such a disease is mainly determined by peak exposures, average exposure or the cumulative level of exposure (Wong 1987; Checkoway and Rice 1992). Information on temporal patterns is not available for most agents, especially not for agents that have chronic effects.

The first attempts to model exposure patterns and estimate dose were published in the 1960s and 1970s by Roach (1966; 1977). He showed that the concentration of an agent reaches an equilibrium value at the receptor after an exposure of infinite duration because elimination counterbalances the uptake of the agent. In an eight-hour exposure, a value of 90% of this equilibrium level can be reached if the half-life of the agent at the target organ is smaller than approximately two-and-a-half hours. This illustrates that for agents with a short half-life, the dose at the target organ is determined by an exposure shorter than an eight-hour period. Dose at the target organ is a function of the product of exposure time and concentration for agents with a long half-life. A similar but more elaborate approach has been applied by Rappaport (1985). He showed that intra-day variability in exposure has a limited influence when dealing with agents with long half-lives. He introduced the term dampening at the receptor.
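
The 90% figure can be reproduced with a simple first-order uptake calculation, a minimal sketch assuming a one-compartment model whose elimination half-life equals the half-life at the target organ:

    import math

    def fraction_of_equilibrium(exposure_hours, half_life_hours):
        """One-compartment uptake under constant exposure:
        C(t)/C_equilibrium = 1 - exp(-k*t), with k = ln(2)/half-life."""
        k = math.log(2.0) / half_life_hours
        return 1.0 - math.exp(-k * exposure_hours)

    print(f"{fraction_of_equilibrium(8.0, 2.5):.0%}")   # about 89%, consistent with the ~90% cited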

The information presented above has mainly been used to draw conclusions on appropriate averaging times for exposure measurements for compliance purposes. Since Roach’s papers it is common knowledge that for irritants, grab samples with short averaging times have to be taken, while for agents with long half-lives, such as asbestos, long-term average of cumulative exposure has to be approximated. One should however realize that the dichotomization into grab sample strategies and eight-hour time average exposure strategies as adopted in many countries for compliance purposes is an extremely crude translation of the biological principles discussed above.

An example of improving an exposure assessment strategy based on pharmacokinetic principles in epidemiology can be found in a paper by Wegman et al. (1992). They applied an interesting exposure assessment strategy by using continuous monitoring devices to measure personal dust exposure peak levels and relating these to acute reversible respiratory symptoms occurring every 15 minutes. A conceptual problem in this kind of study, extensively discussed in their paper, is the definition of a health-relevant peak exposure. The definition of a peak will, again, depend on biological considerations. Rappaport (1991) gives two requirements for peak exposures to be of aetiological relevance in the disease process: (1) the agent is eliminated rapidly from the body and (2) there is a non-linear rate of biological damage during a peak exposure. Non-linear rates of biological damage may be related to changes in uptake, which in turn are related to exposure levels, host susceptibility, synergy with other exposures, involvement of other disease mechanisms at higher exposures or threshold levels for disease processes.

These examples also show that pharmacokinetic approaches can lead elsewhere than to dose estimates. The results of pharmacokinetic modelling can also be used to explore the biological relevance of existing indices of exposure and to design new health-relevant exposure assessment strategies.

Pharmacokinetic modelling of the exposure may also generate estimates of the actual dose at the target organ. For instance in the case of ozone, an acute irritant gas, models have been developed which predict the tissue concentration in the airways as a function of the average ozone concentration in the airspace of the lung at a certain distance from the trachea, the radius of the airways, the average air velocity, the effective dispersion, and the ozone flux from air to lung surface (Menzel 1987; Miller and Overton 1989). Such models can be used to predict ozone dose in a particular region of the airways, dependent on environmental ozone concentrations and breathing patterns.

In most cases estimates of target dose are based on information on the exposure pattern over time, job history and pharmacokinetic information on uptake, distribution, elimination and transformation of the agent. The whole process can be described by a set of equations which can be solved mathematically. Often information on pharmacokinetic parameters is not available for humans, and parameter estimates based on animal experiments have to be used. There are by now several examples of the use of pharmacokinetic modelling of exposure to generate dose estimates. The first references in the literature to the modelling of exposure data into dose estimates go back to the paper by Jahr (1974).
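
As a purely illustrative sketch of what such a set of equations can look like, the following one-compartment calculation integrates an assumed exposure pattern over a day; the uptake coefficient, half-life and hourly concentrations are invented and are not taken from any of the cited studies.

    import math

    # One-compartment sketch: body burden B follows dB/dt = a*C(t) - k*B, with
    # uptake proportional to the air concentration C(t) and first-order
    # elimination. All parameter values are invented.

    uptake_coeff = 1.0                       # uptake per unit air concentration per hour
    k_elim = math.log(2.0) / 4.0             # elimination rate for an assumed 4 h half-life

    # Hourly air concentrations for an 8-hour shift, then 16 unexposed hours.
    exposure_ppm = [50, 50, 120, 50, 0, 50, 50, 80] + [0] * 16

    burden = 0.0
    dt = 1.0 / 60.0                          # integrate in one-minute steps
    for hour_conc in exposure_ppm:
        for _ in range(60):
            burden += (uptake_coeff * hour_conc - k_elim * burden) * dt

    print(f"Body burden after 24 h: {burden:.1f} (arbitrary units)")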

Although dose estimates have generally not been validated and have found limited application in epidemiological studies, the new generation of exposure or dose indices is expected to result in optimal exposure-response analyses in epidemiological studies (Smith 1985, 1987). A problem not yet tackled in pharmacokinetic modelling is that large interspecies differences exist in kinetics of toxic agents, and therefore effects of intra-individual variation in pharmacokinetic parameters are of interest (Droz 1992).

Biomonitoring and Biomarkers of Exposure

Biological monitoring offers an estimate of dose and therefore is often considered superior to environmental monitoring. However, the intra-individual variability of biomonitoring indices can be considerable. In order to derive an acceptable estimate of a worker’s dose, repeated measurements have to be taken, and sometimes the measurement effort can become larger than for environmental monitoring.

This is illustrated by an interesting study on workers producing boats made of plastic reinforced with glass fibre (Rappaport et al. 1995). The variability of styrene exposure was assessed by measuring styrene in air repeatedly. Styrene in exhaled air of exposed workers was monitored, as well as sister chromatid exchanges (SCEs). They showed that an epidemiological study using styrene in the air as a measure of exposure would be more efficient, in terms of numbers of measurements required, than a study using the other indices of exposure. For styrene in air three repeats were required to estimate the long-term average exposure with a given precision. For styrene in exhaled air, four repeats per worker were necessary, while for the SCEs 20 repeats were necessary. The explanation for this observation is the signal-to-noise ratio, determined by the day-to-day and between-worker variability in exposure, which was more favourable for styrene in air than for the two biomarkers of exposure. Thus, although the biological relevance of a certain exposure surrogate might be optimal, the performance in an exposure-response analysis can still be poor because of a limited signal-to-noise ratio, leading to misclassification error.
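
Although the numbers of repeats above were derived from the study data, the underlying trade-off can be sketched with a standard variance-components argument: the reliability of the mean of n repeated measurements per worker is s2_between / (s2_between + s2_within / n), so more within-worker (day-to-day) noise demands more repeats. The variances below are invented for illustration only.

    def repeats_needed(s2_between, s2_within, target_reliability=0.8):
        """Smallest number of repeats per worker whose mean reaches the
        target reliability, given between- and within-worker variances."""
        n = 1
        while s2_between / (s2_between + s2_within / n) < target_reliability:
            n += 1
        return n

    print(repeats_needed(1.0, 0.8))   # favourable signal-to-noise ratio: 4 repeats
    print(repeats_needed(1.0, 6.0))   # noisy surrogate of exposure: 24 repeats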

Droz (1991) applied pharmacokinetic modelling to study advantages of exposure assessment strategies based on air sampling compared to biomonitoring strategies dependent on the half-life of the agent. He showed that biological monitoring is also greatly affected by biological variability, which is not related to variability of the toxicological test. He suggested that no statistical advantage exists in using biological indicators when the half-life of the agent considered is smaller than about ten hours.

Although one might tend to measure the environmental exposure rather than a biological indicator of effect because of the variability in the measured variable, additional arguments can be found for choosing a biomarker even when this would lead to a greater measurement effort, for example when considerable dermal exposure is present. For agents like pesticides and some organic solvents, dermal exposure can be of greater relevance than exposure through the air. A biomarker of exposure would include this route of exposure, while the measurement of dermal exposure is complex and the results are not easily interpretable (Boleij et al. 1995). Early studies among agricultural workers using “pads” to assess dermal exposure showed remarkable distributions of pesticides over the body surface, depending on the tasks of the worker. However, because little information is available on skin uptake, exposure profiles cannot yet be used to estimate an internal dose.

Biomarkers can also have considerable advantages in cancer epidemiology. When a biomarker is an early marker of the effect, its use could result in reduction of the follow-up period. Although validation studies are required, biomarkers of exposure or individual susceptibility could result in more powerful epidemiological studies and more precise risk estimates.

Time Window Analysis

Parallel to the development of pharmacokinetic modelling, epidemiologists have explored new approaches in the data analysis phase such as “time frame analysis” to relate relevant exposure periods to endpoints, and to implement effects of temporal patterns in the exposure or peak exposures in occupational cancer epidemiology (Checkoway and Rice 1992). Conceptually this technique is related to pharmacokinetic modelling since the relationship between exposure and outcome is optimized by putting weights on different exposure periods, exposure patterns and exposure levels. In pharmacokinetic modelling these weights are believed to have a physiological meaning and are estimated beforehand. In time frame analysis the weights are estimated from the data on the basis of statistical criteria. Examples of this approach are given by Hodgson and Jones (1990), who analysed the relationship between radon gas exposure and lung cancer in a cohort of UK tin miners, and by Seixas, Robins and Becker (1993), who analysed the relationship between dust exposure and respiratory health in a cohort of US coal miners. A very interesting study underlining the relevance of time window analysis is the one by Peto et al. (1982).

They showed that mesothelioma death rates appeared to be proportional to some function of time since first exposure and cumulative exposure in a cohort of insulation workers. Time since first exposure was of particular relevance because this variable was an approximation of the time required for a fibre to migrate from its place of deposition in the lungs to the pleura. This example shows how kinetics of deposition and migration determine the risk function to a large extent. A potential problem with time frame analysis is that it requires detailed information on exposure periods and exposure levels, which hampers its application in many studies of chronic disease outcomes.

Concluding Remarks

In conclusion, the underlying principles of pharmacokinetic modelling and time frame or time window analysis are widely recognized. Knowledge in this area has mainly been used to develop exposure assessment strategies. More elaborate use of these approaches, however, requires a considerable research effort and remains to be developed; the number of applications is therefore still limited. Relatively simple applications, such as the development of more optimal exposure assessment strategies dependent on the endpoint, have found wider use. An important issue in the development of biomarkers of exposure or effect is the validation of these indices. It is often assumed that a measurable biomarker can predict health risk better than traditional methods, but unfortunately very few validation studies substantiate this assumption.

 


Thursday, 10 March 2011 17:54

Occupational Exposure Limits

The History of Occupational Exposure Limits

Over the past 40 years, many organizations in numerous countries have proposed occupational exposure limits (OELs) for airborne contaminants. The limits or guidelines that have gradually become the most widely accepted both in the United States and in most other countries are those issued annually by the American Conference of Governmental Industrial Hygienists (ACGIH), which are termed threshold limit values (TLVs) (LaNier 1984; Cook 1986; ACGIH 1994).

The usefulness of establishing OELs for potentially harmful agents in the working environment has been demonstrated repeatedly since their inception (Stokinger 1970; Cook 1986; Doull 1994). The contribution of OELs to the prevention or minimization of disease is now widely accepted, but for many years such limits did not exist, and even when they did, they were often not observed (Cook 1945; Smyth 1956; Stokinger 1981; LaNier 1984; Cook 1986).

It was well understood as long ago as the fifteenth century that airborne dusts and chemicals could bring about illness and injury, but the concentrations and lengths of exposure at which this might be expected to occur were unclear (Ramazzini 1700).

As reported by Baetjer (1980), “early in this century when Dr. Alice Hamilton began her distinguished career in occupational disease, no air samples and no standards were available to her, nor indeed were they necessary. Simple observation of the working conditions and the illness and deaths of the workers readily proved that harmful exposures existed. Soon however, the need for determining standards for safe exposure became obvious.”

The earliest efforts to set an OEL were directed to carbon monoxide, the toxic gas to which more persons are occupationally exposed than to any other (for a chronology of the development of OELs, see figure 1). The work of Max Gruber at the Hygienic Institute at Munich was published in 1883. The paper described exposing two hens and twelve rabbits to known concentrations of carbon monoxide for up to 47 hours over three days; he stated that “the boundary of injurious action of carbon monoxide lies at a concentration in all probability of 500 parts per million, but certainly (not less than) 200 parts per million”. In arriving at this conclusion, Gruber had also inhaled carbon monoxide himself. He reported no symptoms or uncomfortable sensations after three hours on each of two consecutive days at concentrations of 210 parts per million and 240 parts per million (Cook 1986).

Figure 1. Chronology of occupational exposure limits (OELs).

IHY060T1

The earliest and most extensive series of animal experiments on exposure limits were those conducted by K.B. Lehmann and others under his direction. In a series of publications spanning 50 years they reported on studies on ammonia and hydrogen chloride gas, chlorinated hydrocarbons and a large number of other chemical substances (Lehmann 1886; Lehmann and Schmidt-Kehl 1936).

Kobert (1912) published one of the earlier tables of acute exposure limits. Concentrations for 20 substances were listed under the headings: (1) rapidly fatal to man and animals, (2) dangerous in 0.5 to one hour, (3) 0.5 to one hour without serious disturbances and (4) only minimal symptoms observed. In his paper “Interpretations of permissible limits”, Schrenk (1947) notes that the “values for hydrochloric acid, hydrogen cyanide, ammonia, chlorine and bromine as given under the heading ‘only minimal symptoms after several hours’ in the foregoing Kobert paper agree with values as usually accepted in present-day tables of MACs for reported exposures”. However, values for some of the more toxic organic solvents, such as benzene, carbon tetrachloride and carbon disulphide, far exceeded those currently in use (Cook 1986).

One of the first tables of exposure limits to originate in the United States was that published by the US Bureau of Mines (Fieldner, Katz and Kenney 1921). Although its title does not so indicate, the 33 substances listed are those encountered in workplaces. Cook (1986) also noted that most of the exposure limits through the 1930s, except for dusts, were based on rather short animal experiments. A notable exception was the study of chronic benzene exposure by Leonard Greenburg of the US Public Health Service, conducted under the direction of a committee of the National Safety Council (NSC 1926). An acceptable exposure for human beings based on long-term animal experiments was derived from this work.

According to Cook (1986), for dust exposures, permissible limits established before 1920 were based on exposures of workers in the South African gold mines, where the dust from drilling operations was high in crystalline free silica. In 1916, an exposure limit of 8.5 million particles per cubic foot of air (mppcf) for the dust with an 80 to 90% quartz content was set (Phthisis Prevention Committee 1916). Later, the level was lowered to 5 mppcf. Cook also reported that, in the United States, standards for dust, also based on exposure of workers, were recommended by Higgins and co-workers following a study at the south-western Missouri zinc and lead mines in 1917. The initial level established for high quartz dusts was ten mppcf, appreciably higher than was established by later dust studies conducted by the US Public Health Service. In 1930, the USSR Ministry of Labour issued a decree that included maximum allowable concentrations for 12 industrial toxic substances.

The most comprehensive list of occupational exposure limits up to 1926 was for 27 substances (Sayers 1927). In 1935 Sayers and Dalle Valle published physiological responses to five concentrations of 37 substances, the fifth being the maximum allowable concentration for prolonged exposure. Lehmann and Flury (1938) and Bowditch et al. (1940) published papers that presented tables with a single value for repeated exposures to each substance.

Many of the exposure limits developed by Lehmann were included in a monograph initially published in 1927 by Henderson and Haggard (1943), and a little later in Flury and Zernik’s Schadliche Gase (1931). According to Cook (1986), this book was considered the authoritative reference on effects of injurious gases, vapours and dusts in the workplace until Volume II of Patty’s Industrial Hygiene and Toxicology (1949) was published.

The first lists of standards for chemical exposures in industry, called maximum allowable concentrations (MACs), were prepared in 1939 and 1940 (Baetjer 1980). They represented a consensus of opinion of the American Standard Association and a number of industrial hygienists who had formed the ACGIH in 1938. These “suggested standards” were published in 1943 by James Sterner. A committee of the ACGIH met in early 1940 to begin the task of identifying safe levels of exposure to workplace chemicals, by assembling all the data which would relate the degree of exposure to a toxicant to the likelihood of producing an adverse effect (Stokinger 1981; LaNier 1984). The first set of values was released in 1941 by this committee, which was composed of Warren Cook, Manfred Bowditch (reportedly the first hygienist employed by industry in the United States), William Fredrick, Philip Drinker, Lawrence Fairhall and Alan Dooley (Stokinger 1981).

In 1941, a committee (designated as Z-37) of the American Standards Association, which later became the American National Standards Institute, developed its first standard of 100 ppm for carbon monoxide. By 1974 the committee had issued separate bulletins for 33 exposure standards for toxic dusts and gases.

At the annual meeting of the ACGIH in 1942, the newly appointed Subcommittee on Threshold Limits presented in its report a table of 63 toxic substances with the “maximum allowable concentrations of atmospheric contaminants” from lists furnished by the various state industrial hygiene units. The report contains the statement, “The table is not to be construed as recommended safe concentrations. The material is presented without comment” (Cook 1986).

In 1945 Cook published a list of 132 industrial atmospheric contaminants with maximum allowable concentrations, including the then-current values for six states, values presented as a guide for occupational disease control by federal agencies, and the maximum allowable concentrations that appeared best supported by references to original investigations (Cook 1986).

At the 1946 annual meeting of ACGIH, the Subcommittee on Threshold Limits presented their second report with the values of 131 gases, vapours, dusts, fumes and mists, and 13 mineral dusts. The values were compiled from the list reported by the subcommittee in 1942, from the list published by Warren Cook in Industrial Medicine (1945) and from published values of the Z-37 Committee of the American Standards Association. The committee emphasized that the “list of M.A.C. values is presented … with the definite understanding that it be subject to annual revision.”

Intended use of OELs

The ACGIH TLVs and most other OELs used in the United States and some other countries are limits which refer to airborne concentrations of substances and represent conditions under which “it is believed that nearly all workers may be repeatedly exposed day after day without adverse health effects” (ACGIH 1994). (See table 1).  In some countries the OEL is set at a concentration which will protect virtually everyone. It is important to recognize that unlike some exposure limits for ambient air pollutants, contaminated water, or food additives set by other professional groups or regulatory agencies, exposure to the TLV will not necessarily prevent discomfort or injury for everyone who is exposed (Adkins et al. 1990). The ACGIH recognized long ago that because of the wide range in individual susceptibility, a small percentage of workers may experience discomfort from some substances at concentrations at or below the threshold limit and that a smaller percentage may be affected more seriously by aggravation of a pre-existing condition or by development of an occupational illness (Cooper 1973; ACGIH 1994). This is clearly stated in the introduction to the ACGIH’s annual booklet Threshold Limit Values for Chemical Substances and Physical Agents and Biological Exposure Indices (ACGIH 1994).
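Because limits of this kind are expressed as full-shift time-weighted average (TWA) concentrations, judging a measured exposure against them reduces to a simple weighted average of the concentrations measured over the workday. The sketch below is a minimal illustration of that arithmetic only; the substance, sampling results and limit value are hypothetical, and the comparison of national approaches follows in table 1.

```python
# Illustrative only: hypothetical personal-sampling results compared with a
# hypothetical 8-hour TWA limit; not an actual TLV for any substance.

def eight_hour_twa(samples):
    """Compute an 8-hour TWA from (concentration mg/m3, duration h) pairs.

    Any unsampled time up to 8 hours is treated here as zero exposure,
    which is itself an assumption that must be justified in practice.
    """
    total_exposure = sum(c * t for c, t in samples)  # mg/m3 * h
    return total_exposure / 8.0                      # averaged over 8 h

# Hypothetical results from one shift of personal sampling.
samples = [(1.2, 3.0), (0.4, 2.5), (2.0, 2.5)]       # (mg/m3, hours)
twa = eight_hour_twa(samples)

oel_twa = 1.5                                        # hypothetical 8-h TWA limit, mg/m3
status = "exceeds" if twa > oel_twa else "is within"
print(f"8-h TWA = {twa:.2f} mg/m3; this {status} the hypothetical limit of {oel_twa} mg/m3")
```

In practice, sampling strategy, analytical error and day-to-day variability all complicate this comparison, which is one reason the limits are meant to be interpreted by trained hygienists.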

Table 1. Occupational exposure limits (OELs) in various countries (as of 1986)

Country/Province

Type of standard

Argentina

OELs are essentially the same as those of the 1978 ACGIH TLVs. The principal difference from the ACGIH list is that, for the 144 substances (of the total of 630) for which no STELs are listed by ACGIH, the values used for the Argentina TWAs are entered also under this heading.

Australia

The National Health and Medical Research Council (NHMRC) adopted a revised edition of the Occupational Health Guide Threshold Limit Values (1990-91) in 1992. The OELs have no legal status in Australia, except where specifically incorporated into law by reference. The ACGIH TLVs are published in Australia as an appendix to the occupational health guides, revised with the ACGIH revisions in odd-numbered years.

Austria

The values recommended by the Expert Committee of the Worker Protection Commission for Appraisal of MAC (maximal acceptable concentration) Values, in cooperation with the General Accident Prevention Institute of the Chemical Workers Trade Union, are considered obligatory by the Federal Ministry for Social Administration. They are applied by the Labour Inspectorate under the Labour Protection Law.

Belgium

The Administration of Hygiene and Occupational Medicine of the Ministry of Employment and of Labour uses the TLVs of the ACGIH as a guideline.

Brazil

The TLVs of the ACGIH have been used as the basis for the occupational health legislation of Brazil since 1978. As the Brazilian work week is usually 48 hours, the values of the ACGIH were adjusted in conformity with a formula developed for this purpose. The ACGIH list was adopted only for those air contaminants which at the time had nationwide application. The Ministry of Labour has brought the limits up to date with establishment of values for additional contaminants in accordance with recommendations from the Fundacentro Foundation of Occupational Safety and Medicine.

Canada (and Provinces)

Each province has its own regulations:

Alberta

OELs are under the Occupational Health and Safety Act, Chemical Hazard Regulation, which requires the employer to ensure that workers are not exposed above the limits.

British Columbia

The Industrial Health and Safety Regulations set legal requirements for most of British Columbia industry, which refer to the current schedule of TLVs for atmospheric contaminants published by the ACGIH.

Manitoba

The Department of Environment and Workplace Safety and Health is responsible for legislation and its administration concerning the OELs. The guidelines currently used to interpret risk to health are the ACGIH TLVs with the exception that carcinogens are given a zero exposure level “so far as is reasonably practicable”.

New Brunswick

The applicable standards are those published in the latest ACGIH issue and, in case of an infraction, it is the issue in publication at the time of infraction that dictates compliance.

Northwest Territories

The Northwest Territories Safety Division of the Justice and Service Department regulates workplace safety for non-federal employees under the latest edition of the ACGIH TLVs.

Nova Scotia

The list of OELs is the same as that of the ACGIH as published in 1976 and its subsequent amendments and revisions.

Ontario

Regulations for a number of hazardous substances are enforced under the Occupational Health and Safety Act, each published in a separate booklet that includes the permissible exposure level and codes for respiratory equipment, techniques for measuring airborne concentrations and medical surveillance approaches.

Quebec

Permissible exposure levels are similar to the ACGIH TLVs and compliance with the permissible exposure levels for workplace air contaminants is required.

Chile

The maximum concentrations of eleven substances capable of causing acute, severe or fatal effects may not be exceeded even momentarily. The values in the Chilean standard are those of the ACGIH TLVs to which a factor of 0.8 is applied in view of the 48-hour week.

Denmark

OELs include values for 542 chemical substances and 20 particulates. It is legally required that these not be exceeded as time-weighted averages. Data from the ACGIH are used in the preparation of the Danish standards. About 25 per cent of the values are different from those of ACGIH with nearly all of these being somewhat more stringent.

Ecuador

Ecuador does not have a list of permissible exposure levels incorporated in its legislation. The TLVs of the ACGIH are used as a guide for good industrial hygiene practice.

Finland

OELs are defined as concentrations that are deemed to be hazardous to at least some workers on long-term exposure. Whereas the ACGIH has as their philosophy that nearly all workers may be exposed to substances below the TLV without adverse effect, the viewpoint in Finland is that where exposures are above the limiting value, deleterious effects on health may occur.

Germany

The MAC value is “the maximum permissible concentration of a chemical compound present in the air within a working area (as gas, vapour, particulate matter) which, according to current knowledge, generally does not impair the health of the employee nor cause undue annoyance. Under these conditions, exposure can be repeated and of long duration over a daily period of eight hours, constituting an average work week of 40 hours (42 hours per week as averaged over four successive weeks for firms having four work shifts). Scientifically based criteria for health protection, rather than their technical or economical feasibility, are employed.”

Ireland

The latest TLVs of the ACGIH are normally used. However, the ACGIH list is not incorporated in the national laws or regulations.

Netherlands

MAC values are taken largely from the list of the ACGIH, as well as from the Federal Republic of Germany and NIOSH. The MAC is defined as “that concentration in the workplace air which, according to present knowledge, after repeated long-term exposure even up to a whole working life, in general does not harm the health of workers or their offspring.”

Philippines

The 1970 TLVs of the ACGIH are used, except that 50 ppm is used for vinyl chloride and 0.15 mg/m³ for lead (inorganic compounds, fume and dust).

Russian Federation

The former USSR established many of its limits with the goal of eliminating any possibility of even reversible effects. Limits set to prevent such subclinical and fully reversible responses to workplace exposures have, thus far, been considered too restrictive to be useful in the United States and in most other countries. In fact, because of the economic and engineering difficulties in achieving such low levels of air contaminants in the workplace, there is little indication that these limits have actually been achieved in the countries which have adopted them. Instead, the limits appear to serve as idealized goals rather than as limits which manufacturers are legally bound or morally committed to achieve.

United States

At least six groups recommend exposure limits for the workplace: the TLVs of the ACGIH, the Recommended Exposure Limits (RELs) suggested by the National Institute for Occupational Safety and Health (NIOSH), the Workplace Environmental Exposure Levels (WEELs) developed by the American Industrial Hygiene Association (AIHA), standards for workplace air contaminants suggested by the Z-37 Committee of the American National Standards Institute (ANSI), the proposed workplace guides of the American Public Health Association (APHA 1991), and recommendations by local, state or regional governments. In addition, permissible exposure limits (PELs), which are legally enforceable regulations, have been promulgated by the Department of Labor and are enforced by the Occupational Safety and Health Administration (OSHA).

Source: Cook 1986.

This limitation, although perhaps less than ideal, has been considered a practical one, since airborne concentrations low enough to protect hypersusceptible individuals have traditionally been judged infeasible because of engineering or economic limitations. Until about 1990, this shortcoming in the TLVs was not considered a serious one. In light of the dramatic improvements since the mid-1980s in analytical capabilities, personal monitoring/sampling devices, biological monitoring techniques and the use of robots as a plausible engineering control, we are now technologically able to consider more stringent occupational exposure limits.

The background information and rationale for each TLV are published periodically in the Documentation of the Threshold Limit Values (ACGIH 1995). Some type of documentation is occasionally available for OELs set in other countries. The rationale or documentation for a particular OEL should always be consulted before interpreting or adjusting an exposure limit, as well as the specific data that were considered in establishing it (ACGIH 1994).

TLVs are based on the best available information from industrial experience and human and animal experimental studies—when possible, from a combination of these sources (Smith and Olishifski 1988; ACGIH 1994). The rationale for choosing limiting values differs from substance to substance. For example, protection against impairment of health may be a guiding factor for some, whereas reasonable freedom from irritation, narcosis, nuisance or other forms of stress may form the basis for others. The age and completeness of the information available for establishing occupational exposure limits also varies from substance to substance; consequently, the precision of each TLV is different. The most recent TLV and its documentation (or its equivalent) should always be consulted in order to evaluate the quality of the data upon which that value was set.

Even though all of the publications which contain OELs emphasize that they were intended for use only in establishing safe levels of exposure for persons in the workplace, they have at times been used in other situations. It is for this reason that all exposure limits should be interpreted and applied only by someone knowledgeable in industrial hygiene and toxicology. The TLV Committee (ACGIH 1994) did not intend that they be used, or modified for use:

  • as a relative index of hazard or toxicity
  • in the evaluation of community air pollution
  • for estimating the hazards of continuous, uninterrupted exposures or other extended work periods
  • as proof or disproof of an existing disease or physical condition
  • for adoption by countries whose working conditions differ from those of the United States.


The TLV Committee and other groups which set OELs warn that these values should not be “directly used” or extrapolated to predict safe levels of exposure for other exposure settings. However, if one understands the scientific rationale for the guideline and the appropriate approaches for extrapolating data, they can be used to predict acceptable levels of exposure for many different kinds of exposure scenarios and work schedules (ACGIH 1994; Hickey and Reist 1979).
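One simple, widely cited example of such an extrapolation is the reduction-factor model of Brief and Scala, which scales a conventional 8-hour/40-hour limit downward for longer daily or weekly exposure periods on the basis of increased exposure time and reduced recovery time. The sketch below is a minimal illustration of that model only (not of the pharmacokinetic approach of Hickey and Reist), and the limit value used is hypothetical.

```python
# Minimal sketch of the Brief and Scala reduction-factor model for adjusting
# a conventional 8-h/40-h OEL to a longer work schedule.
# The limit value below is hypothetical, not an actual TLV.

def brief_scala_daily(hours_per_day):
    """Reduction factor for an extended workday (more exposure, less recovery)."""
    return (8.0 / hours_per_day) * (24.0 - hours_per_day) / 16.0

def brief_scala_weekly(hours_per_week):
    """Reduction factor for an extended workweek."""
    return (40.0 / hours_per_week) * (168.0 - hours_per_week) / 128.0

oel_8h = 50.0  # hypothetical 8-h TWA limit, ppm

# Example: a 12-hour-per-day, three-day workweek (36 h/week).
# This sketch applies the more conservative (smaller) of the two factors,
# capped at 1.0 so the adjusted limit never exceeds the original.
rf = min(brief_scala_daily(12), brief_scala_weekly(36), 1.0)
adjusted = oel_8h * rf
print(f"Reduction factor = {rf:.2f}; adjusted limit = {adjusted:.1f} ppm")
```

Taking the more conservative of the daily and weekly factors is a simplification made for this sketch; in practice the adjustment chosen should reflect the pharmacokinetics of the particular substance, as discussed in the section on nontraditional work schedules below.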

Philosophy and approaches in setting exposure limits

TLVs were originally prepared to serve only for the use of industrial hygienists, who could exercise their own judgement in applying these values. They were not to be used for legal purposes (Baetjer 1980). However, in 1968 the United States Walsh-Healey Public Contracts Act incorporated the 1968 TLV list, which covered about 400 chemicals. In the United States, when the Occupational Safety and Health Act was passed in 1970, it required all standards to be national consensus standards or established federal standards.

Exposure limits for workplace air contaminants are based on the premise that, although all chemical substances are toxic at some concentration when experienced for a period of time, a concentration (e.g., dose) does exist for all substances at which no injurious effect should result no matter how often the exposure is repeated. A similar premise applies to substances whose effects are limited to irritation, narcosis, nuisance or other forms of stress (Stokinger 1981; ACGIH 1994).

This philosophy thus differs from that applied to physical agents such as ionizing radiation, and to some chemical carcinogens, since it is possible that there may be no threshold, or no dose at which zero risk would be expected (Stokinger 1981). The issue of threshold effects is controversial, with reputable scientists arguing both for and against threshold theories (Seiler 1977; Watanabe et al. 1980; Stott et al. 1981; Butterworth and Slaga 1987; Bailer et al. 1988; Wilkinson 1988; Bus and Gibson 1994). With this in mind, some occupational exposure limits proposed by regulatory agencies in the early 1980s were set at levels which, although not completely without risk, posed risks that were no greater than classic occupational hazards such as electrocution and falls. Even in those settings which do not use industrial chemicals, the overall workplace risks of fatal injury are about one in one thousand. This is the rationale that has been used to justify selecting this theoretical cancer risk criterion for setting TLVs for chemical carcinogens (Rodricks, Brett and Wrenn 1987; Travis et al. 1987).
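As an illustration of how such a risk criterion can be turned into a candidate limit, the sketch below back-calculates the air concentration corresponding to a target working-lifetime excess risk under a simple linear (no-threshold) model. The unit risk, exposure assumptions and target risk are all hypothetical and are chosen only to show the form of the arithmetic.

```python
# Illustrative back-calculation: air concentration corresponding to a target
# working-lifetime excess cancer risk under a simple linear (no-threshold) model.
# The unit risk, exposure pattern and target risk are all hypothetical.

unit_risk = 1.0e-5        # hypothetical excess risk per ug/m3 of continuous lifetime exposure

# Fraction of continuous lifetime exposure represented by occupational exposure:
breathing_fraction = 10.0 / 20.0  # ~10 m3 inhaled at work vs ~20 m3 over a full day
day_fraction = 240.0 / 365.0      # working days per year
year_fraction = 45.0 / 70.0       # working years per lifetime
occupational_fraction = breathing_fraction * day_fraction * year_fraction

target_risk = 1.0e-3      # target excess risk over a working lifetime (see text)

# Linear model: risk = unit_risk * concentration * occupational_fraction
concentration = target_risk / (unit_risk * occupational_fraction)  # ug/m3
print(f"Occupational fraction of lifetime exposure = {occupational_fraction:.3f}")
print(f"Candidate concentration = {concentration:.0f} ug/m3 (illustrative only)")
```

Agency-specific exposure assumptions and dose-response models differ considerably, so the numbers above illustrate only the shape of such a calculation, not any particular standard.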

Occupational exposure limits established both in the United States and elsewhere are derived from a wide variety of sources. The 1968 TLVs (those adopted by OSHA in 1970 as federal regulations) were based largely on human experience. This may come as a surprise to many hygienists who have recently entered the profession, since it indicates that, in most cases, the setting of an exposure limit has come after a substance has been found to have toxic, irritant or otherwise undesirable effects on humans. As might be anticipated, many of the more recent exposure limits for systemic toxins, especially those internal limits set by manufacturers, have been based primarily on toxicology tests conducted on animals, rather than on waiting for observations of adverse effects in exposed workers (Paustenbach and Langner 1986). However, even as far back as 1945, animal tests were acknowledged by the TLV Committee to be very valuable, and they do, in fact, constitute the second most common source of information upon which these guidelines have been based (Stokinger 1970).

Several approaches for deriving OELs from animal data have been proposed and put into use over the past 40 years. The approach used by the TLV Committee and others is not markedly different from that which has been used by the US Food and Drug Administration (FDA) in establishing acceptable daily intakes (ADI) for food additives. An understanding of the FDA approach to setting exposure limits for food additives and contaminants can provide good insight to industrial hygienists who are involved in interpreting OELs (Dourson and Stara 1983).

Discussions of methodological approaches which can be used to establish workplace exposure limits based exclusively on animal data have also been presented (Weil 1972; WHO 1977; Zielhuis and van der Kreek 1979a, 1979b; Calabrese 1983; Dourson and Stara 1983; Leung and Paustenbach 1988a; Finley et al. 1992; Paustenbach 1995). Although these approaches have some degree of uncertainty, they seem to be much better than a qualitative extrapolation of animal test results to humans.
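A minimal sketch of the general logic shared by these approaches follows: an animal no-observed-adverse-effect level is converted to an equivalent airborne concentration for a worker and then divided by uncertainty (safety) factors. The NOAEL, body weight, breathing volume, absorption assumption and factors used here are hypothetical placeholders, not values taken from any of the cited schemes.

```python
# Minimal sketch of deriving a candidate OEL from animal data by applying
# uncertainty factors, in the general spirit of ADI-style approaches.
# All numerical values are hypothetical placeholders.

noael_mg_per_kg_day = 5.0        # hypothetical animal NOAEL (inhalation-equivalent dose)
body_weight_kg = 70.0            # reference worker body weight
air_inhaled_m3_per_shift = 10.0  # approximate volume inhaled during an 8-h shift
absorption_fraction = 1.0        # assume complete absorption (a conservative placeholder)

# Uncertainty factors (hypothetical choices):
uf_interspecies = 10.0           # animal-to-human extrapolation
uf_intraspecies = 10.0           # variability among workers
total_uf = uf_interspecies * uf_intraspecies

# Human-equivalent daily dose at the NOAEL, then the airborne concentration
# that would deliver that dose over one shift, reduced by the uncertainty factors.
daily_dose_mg = noael_mg_per_kg_day * body_weight_kg
candidate_oel_mg_per_m3 = daily_dose_mg / (air_inhaled_m3_per_shift * absorption_fraction * total_uf)

print(f"Candidate OEL ~ {candidate_oel_mg_per_m3:.2f} mg/m3 (illustrative only)")
```

The cited schemes differ mainly in which factors they apply and in how the magnitudes of those factors are justified.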

Approximately 50% of the 1968 TLVs were derived from human data, and approximately 30% from animal data. By 1992, almost 50% were derived primarily from animal data. The criteria used to develop the TLVs may be classified into four groups: morphological, functional, biochemical and miscellaneous (nuisance, cosmetic). Of those TLVs based on human data, most are derived from effects observed in workers who were exposed to the substance for many years. Consequently, most of the existing TLVs have been based on the results of workplace monitoring, combined with qualitative and quantitative observations of the human response (Stokinger 1970; Park and Snee 1983). In recent times, TLVs for new chemicals have been based primarily on the results of animal studies rather than on human experience (Leung and Paustenbach 1988b; Leung et al. 1988).

It is noteworthy that in 1968 only about 50% of the TLVs were intended primarily to prevent systemic toxic effects. Roughly 40% were based on irritation and about two per cent were intended to prevent cancer. By 1993, about 50% were meant to prevent systemic effects, 35% to prevent irritation, and five per cent to prevent cancer. Figure 2 provides a summary of the data often used in developing OELs. 

Figure 2. Data often used in developing an occupational exposure limit.

IHY060T3

Limits for irritants

Prior to 1975, OELs designed to prevent irritation were largely based on human experiments. Since then, several experimental animal models have been developed (Kane and Alarie 1977; Alarie 1981; Abraham et al. 1990; Nielsen 1991). Another model, based on chemical properties, has been used to set preliminary OELs for organic acids and bases (Leung and Paustenbach 1988a).

Limits for carcinogens

In 1972, the ACGIH Committee began to distinguish between human and animal carcinogens in its TLV list. According to Stokinger (1977), one reason for this distinction was to help stakeholders in discussions (union representatives, workers and the public) focus on those chemicals with more probable workplace exposures.

Do the TLVs Protect Enough Workers?

Beginning in 1988, concerns were raised by numerous persons regarding the adequacy or health protectiveness of TLVs. The key question raised was, what percentage of the working population is truly protected from adverse health effects when exposed to the TLV?

Castleman and Ziem (1988) and Ziem and Castleman (1989) argued both that the scientific basis of the standards was inadequate and that they were formulated by hygienists with vested interests in the industries being regulated.

These papers engendered an enormous amount of discussion, both supportive of and opposed to the work of the ACGIH (Finklea 1988; Paustenbach 1990a, 1990b, 1990c; Tarlau 1990).

A follow-up study by Roach and Rappaport (1990) attempted to quantify the safety margin and scientific validity of the TLVs. They concluded that there were serious inconsistencies between the scientific data available and the interpretation given in the 1976 Documentation by the TLV Committee. They also noted that the TLVs were probably reflective of what the Committee perceived to be realistic and attainable at the time. The ACGIH has responded to both the Roach and Rappaport and the Castleman and Ziem analyses, insisting that the criticisms are inaccurate.

Although the merit of the Roach and Rappaport analysis, or for that matter that of Ziem and Castleman, will be debated for a number of years, it is clear that the process by which TLVs and other OELs are set will probably never again be what it was between 1945 and 1990. It is likely that in coming years the rationale, as well as the degree of risk inherent in a TLV, will be more explicitly described in the documentation for each TLV. Also, it is certain that the definition of “virtually safe” or “insignificant risk” with respect to workplace exposure will change as the values of society change (Paustenbach 1995, 1997).

The degree of reduction in TLVs or other OELs that will undoubtedly occur in the coming years will vary depending on the type of adverse health effect to be prevented (central nervous system depression, acute toxicity, odour, irritation, developmental effects, or others). It is unclear to what degree the TLV committee will rely on various predictive toxicity models, or what risk criteria they will adopt, as we enter the next century.

Standards and Nontraditional Work Schedules

The degree to which shift work affects a worker’s capabilities, longevity, mortality, and overall well-being is still not well understood. So-called nontraditional work shifts and work schedules have been implemented in a number of industries in an attempt to eliminate, or at least reduce, some of the problems caused by normal shift work, which consists of three eight-hour work shifts per day. One kind of work schedule which is classified as nontraditional is the type involving work periods longer than eight hours and varying (compressing) the number of days worked per week (e.g., a 12-hours-per-day, three-day workweek). Another type of nontraditional work schedule involves a series of brief exposures to a chemical or physical agent during a given work schedule (e.g., a schedule where a person is exposed to a chemical for 30 minutes, five times per day with one hour between exposures). The last category of nontraditional schedule is that involving the “critical case” wherein persons are continuously exposed to an air contaminant (e.g., spacecraft, submarine).

Compressed workweeks are a type of nontraditional work schedule that has been used primarily in non-manufacturing settings. The term refers to full-time employment (virtually 40 hours per week) which is accomplished in fewer than five days per week. Many compressed schedules are currently in use, but the most common are: (a) four-day workweeks with ten-hour days; (b) three-day workweeks with 12-hour days; (c) four-and-a-half-day workweeks with four nine-hour days and one four-hour day (usually Friday); and (d) the five/four, nine plan of alternating five-day and four-day workweeks of nine-hour days (Nollen and Martin 1978; Nollen 1981).

Workers on nontraditional schedules represent only about 5% of the working population. Of this number, only about 50,000 to 200,000 Americans who work nontraditional schedules are employed in industries where there is routine exposure to significant levels of airborne chemicals. In Canada, the percentage of chemical workers on nontraditional schedules is thought to be greater (Paustenbach 1994).

One Approach to Setting International OELs

As noted by Lundberg (1994), a challenge facing all national committees is to identify a common scientific approach to setting OELs. Joint international ventures are advantageous to the parties involved, since writing criteria documents is both a time-consuming and costly process (Paustenbach 1995).

This was the idea when the Nordic Council of Ministers decided in 1977 to establish the Nordic Expert Group (NEG). The task of the NEG was to develop scientifically based criteria documents to be used as a common scientific basis for OELs by the regulatory authorities in the five Nordic countries (Denmark, Finland, Iceland, Norway and Sweden). The criteria documents from the NEG lead to the definition of a critical effect and of dose-response/dose-effect relationships. The critical effect is the adverse effect that occurs at the lowest exposure. There is no discussion of safety factors, and a numerical OEL is not proposed. Since 1987, the NEG has published criteria documents concurrently in English on a yearly basis.

Lundberg (1994) has suggested a standardized approach that each country would use. He suggested building a document with the following characteristics:

  • A standardized criteria document should reflect the up-to-date knowledge as presented in the scientific literature.
  • The literature used should preferably be peer-reviewed scientific papers, but should at least be publicly available. Personal communications should be avoided. Openness toward the general public, particularly workers, decreases the kind of suspicion that has recently been directed toward documentation from the ACGIH.
  • The scientific committee should consist of independent scientists from academia and government. If the committee should include scientific representatives from the labour market, both employers and employees should be represented.
  • All relevant epidemiological and experimental studies should be thoroughly scrutinized by the scientific committee, especially “key studies” that present data on the critical effect. All observed effects should be described.
  • Environmental and biological monitoring possibilities should be pointed out. It is also necessary to thoroughly scrutinize these data, including toxicokinetic data.
  • Where data permit, dose-response and dose-effect relationships should be established and stated. A no observable effect level (NOEL) or lowest observable effect level (LOEL) for each observed effect should be stated in the conclusion. If necessary, reasons should be given as to why a certain effect is the critical one. The toxicological significance of an effect is thereby considered.
  • Specifically, mutagenic, carcinogenic and teratogenic properties should be pointed out as well as allergic and immunological effects.
  • A reference list for all studies described should be given. If it is stated in the document that only relevant studies have been used, there is no need to list the references that were not used or to explain why. On the other hand, it could be of interest to list the databases used in the literature search.


There are in practice only minor differences in the way OELs are set in the various countries that develop them. It should, therefore, be relatively easy to agree upon the format of a standardized criteria document containing the key information. From this point, the decision as to the size of the margin of safety that is incorporated in the limit would then be a matter of national policy.


Occupational Hygiene References

Abraham, MH, GS Whiting, Y Alarie et al. 1990. Hydrogen bonding 12. A new QSAR for upper respiratory tract irritation by airborne chemicals in mice. Quant Struc Activity Relat 9:6-10.

Adkins, LE et al. 1990. Letter to the Editor. Appl Occup Environ Hyg 5(11):748-750.

Alarie, Y. 1981. Dose response analysis in animal studies: Prediction of human responses. Environ Health Persp 42:9-13.

American Conference of Governmental Industrial Hygienists (ACGIH). 1994. 1993-1994 Threshold Limit Values for Chemical Substances and Physical Agents and Biological Exposure Indices. Cincinnati: ACGIH.

—. 1995. Documentation of Threshold Limit Values. Cincinnati: ACGIH.

Baetjer, AM. 1980. The early days of industrial hygiene: Their contribution to current problems. Am Ind Hyg Assoc J 41:773-777.

Bailer, JC, EAC Crouch, R Shaikh, and D Spiegelman. 1988. One-hit models of carcinogenesis: Conservative or not? Risk Anal 8:485-490.

Bogers, M, LM Appelman, VJ Feron, et al. 1987. Effects of the exposure profile on the inhalation toxicity of carbon tetrachloride in male rats. J Appl Toxicol 7:185-191.

Boleij, JSM, E Buringh, D Heederik, and H Kromhout. 1995. Occupational Hygiene for Chemical and Biological Agents. Amsterdam: Elsevier.

Bouyer, J and D Hémon. 1993. Studying the performance of a job exposure matrix. Int J Epidemiol 22(6) Suppl. 2:S65-S71.

Bowditch, M, DK Drinker, P Drinker, HH Haggard, and A Hamilton. 1940. Code for safe concentrations of certain common toxic substances used in industry. J Ind Hyg Toxicol 22:251.

Burdorf, A. 1995. Certification of Occupational Hygienists—A Survey of Existing Schemes Throughout the World. Stockholm: International Occupational Hygiene Association (IOHA).

Bus, JS and JE Gibson. 1994. Body defense mechanisms to toxicant exposure. In Patty’s Industrial Hygiene and Toxicology, edited by RL Harris, L Cralley and LV Cralley. New York: Wiley.

Butterworth, BE and T Slaga. 1987. Nongenotoxic Mechanisms in Carcinogenesis: Banbury Report 25. Cold Spring Harbor, New York: Cold Spring Harbor Laboratory.

Calabrese, EJ. 1983. Principles of Animal Extrapolation. New York: Wiley.

Casarett, LJ. 1980. In Casarett and Doull’s Toxicology: The Basic Science of Poisons, edited by J Doull, CD Klaassen, and MO Amdur. New York: Macmillan.

Castleman, BI and GE Ziem. 1988. Corporate Influence on Threshold Limit Values. Am J Ind Med 13(5).

Checkoway, H and CH Rice. 1992. Time-weighted averages, peaks, and other indices of exposure in occupational epidemiology. Am J Ind Med 21:25-33.

Comité Européen de Normalisation (CEN). 1994. Workplace Atmospheres—Guidance for the Assessment of Exposure to Chemical Agents for Comparison With Limit Values and Measurement Strategy. EN 689, prepared by CEN Technical Committee 137. Brussels: CEN.

Cook, WA. 1945. Maximum allowable concentrations of industrial contaminants. Ind Med 14(11):936-946.

—. 1986. Occupational Exposure Limits—Worldwide. Akron, Ohio: American Industrial Hygiene Association (AIHA).

Cooper, WC. 1973. Indicators of susceptibility to industrial chemicals. J Occup Med 15(4):355-359.

Corn, M. 1985. Strategies for air sampling. Scand J Work Environ Health 11:173-180.

Dinardi, SR. 1995. Calculation Methods for Industrial Hygiene. New York: Van Nostrand Reinhold.

Doull, J. 1994. The ACGIH Approach and Practice. Appl Occup Environ Hyg 9(1):23-24.

Dourson, MJ and JF Stara. 1983. Regulatory history and experimental support of uncertainty (safety) factors. Regul Toxicol Pharmacol 3:224-238.

Droz, PO. 1991. Quantification of concomitant biological and air monitoring results. Appl Ind Hyg 6:465-474.

—. 1992. Quantification of biological variability. Ann Occup Health 36:295-306.

Fieldner, AC, SH Katz, and SP Kenney. 1921. Gas Masks for Gases Met in Fighting Fires. Bulletin No. 248. Pittsburgh: US Bureau of Mines.

Finklea, JA. 1988. Threshold limit values: A timely look. Am J Ind Med 14:211-212.

Finley, B, D Proctor, and DJ Paustenbach. 1992. An alternative to the USEPA’s proposed inhalation reference concentration for hexavalent and trivalent chromium. Regul Toxicol Pharmacol 16:161-176.

Fiserova-Bergerova, V. 1987. Development of using BEIs and their implementation. Appl Ind Hyg 2(2):87-92.

Flury, F and F Zernik. 1931. Schädliche Gase, Dämpfe, Nebel, Rauch- und Staubarten. Berlin: Springer.

Goldberg, M, H Kromhout, P Guénel, AC Fletcher, M Gérin, DC Glass, D Heederik, T Kauppinen, and A Ponti. 1993. Job exposures matrices in industry. Int J Epidemiol 22(6) Suppl. 2:S10-S15.

Gressel, MG and JA Gideon. 1991. An overview of process hazard evaluation techniques. Am Ind Hyg Assoc J 52(4):158-163.

Henderson, Y and HH Haggard. 1943. Noxious Gases and the Principles of Respiration Influencing their Action. New York: Reinhold.

Hickey, JLS and PC Reist. 1979. Adjusting occupational exposure limits for moonlighting, overtime, and environmental exposures. Am Ind Hyg Assoc J 40:727-734.

Hodgson, JT and RD Jones. 1990. Mortality of a cohort of tin miners 1941-1986. Br J Ind Med 47:665-676.

Holzner, CL, RB Hirsh, and JB Perper. 1993. Managing workplace exposure information. Am Ind Hyg Assoc J 54(1):15-21.

Houba, R, D Heederik, G Doekes, and PEM van Run. 1996. Exposure sensitization relationship for alpha-amylase allergens in the baking industry. Am J Resp Crit Care Med 154(1):130-136.

International Congress on Occupational Health (ICOH). 1985. Invited lectures of the XXI International Congress on Occupational Health, Dublin. Scand J Work Environ Health 11(3):199-206.

Jacobs, RJ. 1992. Strategies to recognize biological agents in the work environment and possibilities for setting standards for biological agents. IOHA first International Science Conference, Brussels, Belgium 7-9 Dec 1992.

Jahr, J. 1974. Dose-response basis for setting a quartz threshold limit value. Arch Environ Health 9:338-340.

Kane, LE and Y Alarie. 1977. Sensory irritation to formaldehyde and acrolein during single and repeated exposures in mice. Am Ind Hyg Assoc J 38:509-522.

Kobert, R. 1912. The smallest amounts of noxious industrial gases which are toxic and the amounts which may perhaps be endured. Comp Pract Toxicol 5:45.

Kromhout, H, E Symanski, and SM Rappaport. 1993. Comprehensive evaluation of within-and between-worker components of occupational exposure to chemical agents. Ann Occup Hyg 37:253-270.

LaNier, ME. 1984. Threshold Limit Values: Discussion and 35 Year Index with Recommendations (TLVs: 1946-81). Cincinnati: ACGIH.

Lehmann, KB. 1886. Experimentelle Studien über den Einfluss technisch und hygienisch wichtiger Gase und Dämpfe auf den Organismus: Ammoniak und Salzsäuregas. Arch Hyg 5:1-12.

Lehmann, KB and F Flury. 1938. Toxikologie und Hygiene der Technischen Losungsmittel. Berlin: Springer.

Lehmann, KB and L Schmidt-Kehl. 1936. Die 13 Wichtigsten Chlorkohlenwasserstoffe der Fettreihe vom Standpunkt der Gewerbehygiene. Arch Hyg Bakteriol 116:131-268.

Leidel, NA, KA Busch, and JR Lynch. 1977. NIOSH Occupational Exposure Sampling Strategy Manual. Washington, DC: NIOSH.

Leung, HW and DJ Paustenbach. 1988a. Setting occupational exposure limits for irritant organic acids and bases based on their equilibrium dissociation constants. Appl Ind Hyg 3:115-118.

—. 1988b. Application of pharmacokinetics to derive biological exposure indexes from threshold limit values. Amer Ind Hyg Assoc J 49:445-450.

Leung, HW, FJ Murray and DJ Paustenbach. 1988. A proposed occupational exposure limit for 2,3,7,8-TCDD. Amer Ind Hyg Assoc J 49:466-474.

Lundberg, P. 1994. National and international approaches to occupational standard setting within Europe. Appl Occup Environ Hyg 9:25-27.

Lynch, JR. 1995. Measurement of worker exposure. In Patty’s Industrial Hygiene and Toxicology, edited by RL Harris, L Cralley, and LV Cralley. New York: Wiley.

Maslansky, CJ and SP Maslansky. 1993. Air Monitoring Instrumentation. New York: Van Nostrand Reinhold.

Menzel, DB. 1987. Physiological pharmacokinetic modelling. Environ Sci Technol 21:944-950.

Miller, FJ and JH Overton. 1989. Critical issues in intra-and interspecies dosimetry of ozone. In Atmospheric Ozone Research and Its Policy Implications, edited by T Schneider, SD Lee, GJR Wolters, and LD Grant. Amsterdam: Elsevier.

National Academy of Sciences (NAS) and National Research Council (NRC). 1983. Risk Assessment in the Federal Government: Managing the Process. Washington, DC: NAS.

National Safety Council (NSC). 1926. Final Report of the Committee of the Chemical and Rubber Sector on Benzene. Washington, DC: National Bureau of Casualty and Surety Underwriters.

Ness, SA. 1991. Air Monitoring for Toxic Exposures. New York: Van Nostrand Reinhold.

Nielsen, GD. 1991. Mechanisms of activation of the sensory irritant receptor. CRC Rev Toxicol 21:183-208.

Nollen, SD. 1981. The compressed workweek: Is it worth the effort? Ing Eng :58-63.

Nollen, SD and VH Martin. 1978. Alternative Work Schedules. Part 3: The Compressed Workweek. New York: AMACOM.

Olishifski, JB. 1988. Administrative and clinical aspects in the chapter Industrial Hygiene. In Occupational Medicine: Principles and Practical Applications, edited by C Zenz. Chicago: Year Book Medical.

Panett, B, D Coggon, and ED Acheson. 1985. Job exposure matrix for use in population based studies in England and Wales. Br J Ind Med 42:777-783.

Park, C and R Snee. 1983. Quantitative risk assessment: State of the art for carcinogenesis. Fund Appl Toxicol 3:320-333.

Patty, FA. 1949. Industrial Hygiene and Toxicology. Vol. II. New York: Wiley.

Paustenbach, DJ. 1990a. Health risk assessment and the practice of industrial hygiene. Am Ind Hyg Assoc J 51:339-351.

—. 1990b. Occupational exposure limits: Their critical role in preventative medicine and risk management. Am Ind Hyg Assoc J 51:A332-A336.

—. 1990c. What Does the Risk Assessment Process Tell us about the TLVs? Presented at the 1990 Joint Conference on Industrial Hygiene. Vancouver, BC, 24 October.

—. 1994. Occupational exposure limits, pharmacokinetics, and unusual workshifts. In Patty’s Industrial Hygiene and Toxicology. Vol. IIIa (4th edn.). New York: Wiley.

—. 1995. The practice of health risk assessment in the United States (1975-1995): How the US and other countries can benefit from that experience. Hum Ecol Risk Assess 1:29-79.

—. 1997. OSHA’s program for updating the permissible exposure limits (PELs): Can risk assessment help “move the ball forward”? Risk in Perspectives 5(1):1-6. Harvard University School of Public Health.

Paustenbach, DJ and RR Langner. 1986. Setting corporate exposure limits: State of the art. Am Ind Hyg Assoc J 47:809-818.

Peto, J, H Seidman, and IJ Selikoff. 1982. Mesothelioma mortality in asbestos workers: implications for models of carcinogenesis and risk assessment. Br J Cancer 45:124-134.

Phthisis Prevention Committee. 1916. Report of Miners. Johannesburg: Phthisis Prevention Committee.

Post, WK, D Heederik, H Kromhout, and D Kromhout. 1994. Occupational exposures estimated by a population specific job-exposure matrix and 25-year incidence rate of chronic non-specific lung disease (CNSLD): The Zutphen Study. Eur Resp J 7:1048-1055.

Ramazzini, B. 1700. De Morbis Artificum Diatriba [Diseases of Workers]. Chicago: The Univ. of Chicago Press.

Rappaport, SM. 1985. Smoothing of exposure variability at the receptor: Implications for health standards. Ann Occup Hyg 29:201-214.

—. 1991. Assessment of long-term exposures to toxic substances in air. Ann Occup Hyg 35:61-121.

—. 1995. Interpreting levels of exposures to chemical agents. In Patty’s Industrial Hygiene and Toxicology, edited by RL Harris, L Cralley, and LV Cralley. New York: Wiley.

Rappaport, SM, E Symanski, JW Yager, and LL Kupper. 1995. The relationship between environmental monitoring and biological markers in exposure assessment. Environ Health Persp 103 Suppl. 3:49-53.

Renes, LE. 1978. The industrial hygiene survey and personnel. In Patty’s Industrial Hygiene and Toxicology, edited by GD Clayton and FE Clayton. New York: Wiley.

Roach, SA. 1966. A more rational basis for air sampling programmes. Am Ind Hyg Assoc J 27:1-12.

—. 1977. A most rational basis for air sampling programmes. Am Ind Hyg Assoc J 20:67-84.

Roach, SA and SM Rappaport. 1990. But they are not thresholds: A critical analysis of the documentation of threshold limit values. Am J Ind Med 17:727-753.

Rodricks, JV, A Brett, and G Wrenn. 1987. Significant risk decisions in federal regulatory agencies. Regul Toxicol Pharmacol 7:307-320.

Rosen, G. 1993. PIMEX-combined use of air sampling instruments and video filming: Experience and results during six years of use. Appl Occup Environ Hyg 8(4).

Rylander, R. 1994. Causative agents for organic dust related disease: Proceedings of an international workshop, Sweden. Am J Ind Med 25:1-11.

Sayers, RR. 1927. Toxicology of gases and vapors. In International Critical Tables of Numerical Data, Physics, Chemistry and Toxicology. New York: McGraw-Hill.

Schrenk, HH. 1947. Interpretation of permissible limits. Am Ind Hyg Assoc Q 8:55-60.

Seiler, JP. 1977. Apparent and real thresholds: A study of two mutagens. In Progress in Genetic Toxicology, edited by D Scott, BA Bridges, and FH Sobels. New York: Elsevier Biomedical.

Seixas, NS, TG Robins, and M Becker. 1993. A novel approach to the characterization of cumulative exposure for the study of chronic occupational disease. Am J Epidemiol 137:463-471.

Smith, RG and JB Olishifski. 1988. Industrial toxicology. In Fundamentals of Industrial Hygiene, edited by JB Olishifski. Chicago: National Safety Council.

Smith, TJ. 1985. Development and application of a model for estimating alveolar and interstitial dust levels. Ann Occup Hyg 29:495-516.

—. 1987. Exposure assessment for occupational epidemiology. Am J Ind Med 12:249-268.

Smyth, HF. 1956. Improved communication: Hygienic standard for daily inhalation. Am Ind Hyg Assoc Q 17:129-185.

Stokinger, HE. 1970. Criteria and procedures for assessing the toxic responses to industrial chemicals. In Permissible Levels of Toxic Substances in the Working Environment. Geneva: ILO.

—. 1977. The case for carcinogen TLV’s continues strong. Occup Health Safety 46 (March-April):54-58.

—. 1981. Threshold limit values: Part I. Dang Prop Ind Mater Rep (May-June):8-13.

Stott, WT, RH Reitz, AM Schumann, and PG Watanabe. 1981. Genetic and nongenetic events in neoplasia. Food Cosmet Toxicol 19:567-576.

Suter, AH. 1993. Noise and conservation of hearing. In Hearing Conservation Manual. Milwaukee, Wisc: Council for Accreditation in Occupational Hearing Conservation.

Tait, K. 1992. The Workplace Exposure Assessment Expert System (WORK SPERT). Am Ind Hyg Assoc J 53(2):84-98.

Tarlau, ES. 1990. Industrial hygiene with no limits. A guest editorial. Am Ind Hyg Assoc J 51:A9-A10.

Travis, CC, SA Richter, EA Crouch, R Wilson, and E Wilson. 1987. Cancer risk management: A review of 132 federal regulatory decisions. Environ Sci Technol 21(5):415-420.

Watanabe, PG, RH Reitz, AM Schumann, MJ McKenna, and PJ Gehring. 1980. Implications of the mechanisms of tumorigenicity for risk assessment. In The Scientific Basis of Toxicity Assessment, edited by M Witschi. Amsterdam: Elsevier.

Wegman, DH, EA Eisen, SR Woskie, and X Hu. 1992. Measuring exposure for the epidemiologic study of acute effects. Am J Ind Med 21:77-89.

Weil, CS. 1972. Statistics versus safety factors and scientific judgment in the evaluation of safety for man. Toxicol Appl Pharmacol 21:454-463.

Wilkinson, CF. 1988. Being more realistic about chemical carcinogenesis. Environ Sci Technol 9:843-848.

Wong, O. 1987. An industry wide mortality study of chemical workers occupationally exposed to benzene. II Dose-response analyses. Br J Ind Med 44:382-395.

World Commission on Environment and Development (WCED). 1987. Our Common Future. Brundtland Report. Oxford: OUP.

World Health Organization (WHO). 1977. Methods used in Establishing Permissible Levels in Occupational Exposure to Harmful Agents. Technical Report No. 601. Geneva: International Labour Organization (ILO).

—. 1992a. Our Planet, Our Health. Report of the WHO Commission on Health and Environment. Geneva: WHO.

—. 1992b. Occupational Hygiene in Europe: Development of the Profession. European Occupational Health Series No. 3. Copenhagen: WHO Regional Office for Europe.

Zielhuis, RL and FW van der Kreek. 1979a. Calculations of a safety factor in setting health based permissible levels for occupational exposure. A proposal. I. Int Arch Occup Environ Health 42:191-201.

Ziem, GE and BI Castleman. 1989. Threshold limit values: Historical perspective and current practice. J Occup Med 13:910-918.