Committee report - November 16-17, 2016

Chemicals Management Plan Science Committee

Background

“Health Canada (HC) and Environment and Climate Change Canada (ECCC) are developing a roadmap for integrating new approach methodologies (NAMs), including in silico approaches, in chemico and in vitro assays, with traditional risk assessment... The roadmap is anticipated to cover multiple aspects of chemical risk assessment, including priority-setting, hazard characterization, exposure characterization and risk characterization. The development of the roadmap will occur in stages and will contribute to both short-term [in other words, third phase of the Chemicals Management Plan (CMP)] and long-term (post 2020) initiatives”. (Excerpt from November 16-17, 2016 Meeting Objectives Paper.)

The CMP Science Committee (the Committee) was asked to provide input on this developing approach and to address the following Charge Questions:

  1. Does the Committee have input for HC and ECCC as they move forward with developing a roadmap for new approach methodologies and risk assessment modernization?
  2. Does the Committee have input for HC and ECCC for using systematic approaches within the process for the Identification of Risk Assessment Priorities (IRAP)? Are there specific computational approaches that the Committee is aware of that could be used to lessen the resources required to complete IRAP?
  3. Does the Committee have suggestions for NAM-based tools that can be considered for further exploration for estimating human and ecological exposure for the purposes of prioritization?
  4. a) Does the Committee have suggestions for NAM-based tools that can be considered for hazard identification for prioritization? How might these NAMs be incorporated in the decision points of the IRAP process?

    b) Given that biological pathways are often conserved across species, does the Committee have input on how human health-based NAMs can inform ecological IRAP approaches and vice versa?

    c) Considering the hazard NAMs identified by HC and ECCC and those identified by the Committee (Question 4a), what near-term opportunities and challenges are associated with the implementation of NAMs for the identification of new priorities for risk assessment?

  5. a) Does the Committee have suggestions for NAM-based tools that can be considered for developing risk metrics for prioritization (IRAP)? How might these NAMs be incorporated in the decision points of the IRAP process?

    b) Considering the risk metric NAMs identified by the departments and those identified by the Committee (Question 5a), what near-term opportunities and challenges are associated with the implementation of NAMs for the identification of new priorities for risk assessment?

Introduction

This meeting focused on reviewing the application of new and emerging tools and methodologies for IRAP. The objectives of HC/ECCC for this meeting were two-fold: (1) to seek general input on developing a roadmap for integrating NAMs as part of the risk assessment paradigm, and (2) to seek specific input on how to enhance current priority-setting approaches through the incorporation of NAMs. The Committee was requested to address a number of Charge Questions in the context of the existing substances risk assessment program of the Canadian Environmental Protection Act, 1999 (CEPA 1999). These questions are listed above and the Committee’s responses follow.

From an overview perspective, the Committee concluded that although these new assessment approaches are still evolving, they are, in many cases, mature enough to begin to be applied in priority setting, as supplemental lines of evidence in risk assessment, and as high-throughput risk approximation/classification (not assessment) tools (see Figure 1). The Committee endorses their use for these purposes.

1) Does the Committee have input for HC and ECCC as they move forward with developing a roadmap for new approach methodologies and risk assessment modernization?

Traditional risk assessment methods (exposure estimation/hazard characterization) have relied on the identification of direct or calculated measures of exposure (individual levels in human samples; population levels in the environment), and the description and assessment of apical effects on an organism or a population. However, in addition to the recent development of NAMs for estimating potential exposure (described further below), there is a growing movement towards, and acceptance of, in silico and molecular data to help inform decision making, similar to the recent increase in the utilization of quantitative structure-activity relationship (QSAR) data.

As one example, the Ecological Risk Classification (ERC) model is a tool designed by ECCC for prioritization and ecological risk assessment under the third phase of the CMP (ECCC, 2016). The ERC model relies on fish LC50, receptor binding, and QSAR data, which together represent a rather heterogeneous assemblage. Such practices are now being rationalized and decisions are being made on the basis of these data. The Committee felt that it was reasonable to use and integrate new methodologies into the risk assessment program, and that further improvements will occur in the future as the development and validation of NAMs increase. Practical and financial constraints were not considered by the Committee because the current program foresees using existing NAM data that are freely available. These considerations may be taken into account when considering a future (post-2020) strategy, when routine prospective generation of NAM data for the purposes of screening and assessment may become a reality.

Figure 1. Schematic representation of the possible roles of NAMs in CMP.

Text Equivalent - Figure 1

The terms “Social and Regulatory Input” are above a funnel.

At the top of the funnel, the following CMP Feeder Mechanisms are listed: Hazard, Exposure, Epidemiological Data, Physiologically Based Pharmacokinetic Modelling, New Approach Methodologies and Computational Modelling.

Moving further down the funnel, these mechanisms are used for prioritization (assessments and reassessments). After prioritization, there is an increasing use of New Approach Methodologies as a component of the risk assessment. At the bottom of the funnel, a risk decision or risk management action is taken based on the results of the risk assessment.

Once the risk decision or risk management action is taken, a “review and recycle” mechanism is in place to inform the CMP Feeder Mechanisms noted above (one directional arrow from the bottom of the funnel to the top).

The Committee offers input for HC and ECCC in the form of key considerations for moving forward with developing a roadmap for NAMs and risk assessment modernization in the context of recent developments in this arena. These key considerations are as follows:

1.1) Interrogation of the effects at the molecular and cellular levels and increased biological coverage

The Committee recognizes that the large number of substances to be evaluated in the CMP will require a shift towards the use of NAMs that may include a variety of in vitro and in silico approaches to interrogate the effects at the molecular and cellular levels. However, a full characterization of the potential effects of a substance will require the continued development and application of technologies with increased biological coverage (for example, transcriptomic technologies and other high-content approaches). Currently, one of the major tasks is to annotate the “biological space” occupied by the many NAMs in use as well as under development and validation (Zuang et al., 2016). The NAM portfolio should be “evergreen” (keeping pace with changing circumstances) and iterative to ensure adequate coverage of biological space.

1.2) Putting results in a dose/exposure context

The Committee emphasizes that the use of molecular data and in vitro systems for risk-based prioritization requires framing the results in a dose and exposure context. The potency values derived from these approaches should either be compared directly with blood concentrations (using biomonitoring equivalents approaches) or be converted into administered dose values (using reverse dosimetry methods) and compared with exposure estimates. The Committee recognizes that HC and ECCC have been actively considering high-throughput toxicokinetic (HTTK) assays and in vitro-to-in vivo extrapolation (IVIVE) models to put in vitro data into a dose context. However, the Committee understands that CMP has been primarily using traditional exposure data, models, and methods to provide the exposure context. In addition, the Objectives Paper outlining the roadmap for integrating NAMs with traditional risk assessment was heavily weighted towards hazard characterization. Given these considerations, the Committee suggests that HC and ECCC should consider the need to develop a NAM strategy for exposure assessment.
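
A minimal sketch of the first kind of comparison (an in vitro potency value set directly against a measured blood concentration, in the spirit of a biomonitoring equivalents comparison) is given below. The substance values, the simple unit conversion, and the function name are illustrative assumptions, not CMP values or an endorsed method.

```python
# Minimal sketch: comparing an in vitro potency value directly with a measured
# blood concentration. All numbers and names are illustrative assumptions.

def in_vitro_margin(ac50_uM: float, mol_weight_g_per_mol: float,
                    measured_blood_ug_per_L: float) -> float:
    """Ratio of the in vitro potency (expressed as ug/L) to a measured blood
    concentration (ug/L); larger ratios suggest lower priority."""
    # 1 uM = 1 umol/L, so ug/L = uM * molecular weight (g/mol)
    ac50_ug_per_L = ac50_uM * mol_weight_g_per_mol
    return ac50_ug_per_L / measured_blood_ug_per_L

# Example: a hypothetical substance with AC50 = 3 uM, MW = 250 g/mol,
# and a biomonitoring median of 0.5 ug/L in blood.
print(in_vitro_margin(3.0, 250.0, 0.5))  # 1500-fold margin in this toy case
```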

1.3) Customer-driven NAM development

The Committee was of the opinion that NAMs should be “pulled” by the end-users (regulators/clients) instead of being driven by the creators in the field. In other words, the end-users should describe what they want and need. What specific descriptors are needed to be able to make informed decisions? What “lamp posts” (described further under Charge Question 4a–c) do the end-users need to consider when making an informed judgment regarding screening prioritization? Based on these descriptors, NAM “creators” would then have the mandate to develop/modify NAMs to meet these needs. The Committee also encourages end-users to recognize that NAMs may afford opportunities to identify and answer questions previously obscured by a lack of methods and information for more holistic risk-based evaluations.

One longer-term consideration in chemical prioritization, and for the development of NAMs, may be expressed as the relative public health burden of an exposure in terms of burden of disease [for example, the disability-adjusted life years (DALYs) or quality-adjusted life years (QALYs)] associated with exposure to a substance or group of substances.

1.4) Characterization of uncertainty and variability

The uncertainties associated with NAMs are different from those associated with traditional animal models. The Committee suggests that HC and ECCC rigorously characterize the uncertainty and variability in NAM data and incorporate these in the decision-making process. It was also suggested that new tools/methods are needed to support high-throughput NAMs that assess exposure, to account for individual differences (genetic and other) in human populations, and to address species differences for ecological predictions.

1.5) Building scientific confidence and transparency

In the short term, the use of existing NAM data for chemical prioritization is particularly supportable; it builds experience and competence, and limits the consequences of false-negative decisions because a de-prioritization decision can be readily reversed. Over the mid-term, the Committee suggests that HC and ECCC explore developing the capabilities to collect NAM data that are specific to needs in the CMP, and evaluate the applicability of NAMs for higher-tier decisions as one input among multiple lines of evidence (as per Figure 1). The Committee further suggests that the use of NAMs in the decision process be transparent.

1.6) Collaboration with other international organizations

Collaborative agreements with international organizations may lessen the resources required for IRAP (see Charge Question 2), and speed up its integration into current programs. For example, the development of formal relationships/agreements with the European Chemicals Agency (ECHA) would facilitate the acquisition of their chemical use information and other data. Collaboration with the United States Environmental Protection Agency (U.S. EPA) will provide high-throughput in vitro screening and toxicokinetic data on substances of interest to the CMP. The Committee recognized that a number of such agreements are already in place; it may be a priority to extend the scope of such agreements in the future.

2) Does the Committee have input for HC and ECCC for using systematic approaches within the process for the Identification of Risk Assessment Priorities (IRAP)? Are there specific computational approaches that the Committee is aware of that could be used to lessen the resources required to complete IRAP?

In addition to the earlier suggestion to focus on collaborative approaches, the Committee also suggested that consideration be given to a number of systematic approaches within the prioritization process. Among these, there were several suggestions related to automated data collection, which include bibliometric analyses to identify specific information. Examples of automated data collection include: text mining; automated approaches for gathering, cataloguing, sorting, and prioritizing emerging evidence, possibly involving direct discussions with the owners of Web-based search tools and engines; systematic automatic updates of chemical inventories; and “crowd-sourcing”, or seeking submissions of papers from researchers and the public, together with a rationale for their importance. NAMs are available to facilitate a number of these approaches. For example, the extension of the Dragon Toolkit™ software for text mining is an approach that is being pursued by the U.S. EPA and the National Institute of Environmental Health Sciences – National Toxicology Program (NIEHS-NTP). Dragon descriptors can be used to evaluate molecular structure-activity or structure-property relationships, as well as for similarity analysis and high-throughput screening of molecular databases.
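
As a rough illustration of the kind of automated literature triage mentioned above, the sketch below scores abstracts by weighted keyword counts. The term list, weights, and abstracts are hypothetical; a production system would rely on curated vocabularies and dedicated text-mining software rather than this simple heuristic.

```python
# Illustrative sketch of automated literature triage by keyword scoring.
# Keywords, weights, and abstracts are hypothetical.

import re
from collections import Counter

PRIORITY_TERMS = {"carcinogen": 3, "endocrine": 3, "neurotoxic": 3,
                  "bioaccumulation": 2, "persistent": 2, "exposure": 1}

def score_abstract(text: str) -> int:
    """Sum weighted counts of priority terms found in an abstract."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    return sum(weight * words[term] for term, weight in PRIORITY_TERMS.items())

abstracts = {
    "paper_A": "Evidence of endocrine disruption and persistent exposure ...",
    "paper_B": "Analytical method development for water monitoring ...",
}
ranked = sorted(abstracts, key=lambda k: score_abstract(abstracts[k]), reverse=True)
print(ranked)  # papers ordered by weighted keyword score
```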

HC and ECCC are currently making use of (or considering the use of) several specific NAMs for use in the screening prioritization of substances. The U.S. EPA’s Chemistry Dashboard (available in English only) and ToxCast Dashboard (available in English only) provide access to chemical structures, curated physicochemical property data, high-throughput in vitro screening data, use information, and exposure data. The Toxicological Prioritization Index (ToxPi™) approach (Reif et al., 2010) enables the assembly of multiple sources of exposure and/or toxicological evidence and promotes their visualization in a manner that facilitates communication. The use of other NAM tools and outputs, such as the threshold of toxicological concern (TTC), the ecological TTC (ecoTTC), and the ERC approach, is supported by the Committee. We note that these are either in use now or are being tested in case studies within the CMP and are discussed in more detail elsewhere in this report.
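
A compact sketch of the ToxPi-style aggregation idea is given below: each evidence “slice” is scaled to a common range across substances, weighted, and summed into a single prioritization score. The slice names, weights, and values are invented for illustration, and the slices are assumed to be oriented so that larger values indicate greater concern; the ToxPi software itself implements the full method and visualization.

```python
# Sketch of ToxPi-style aggregation: scale each evidence "slice" to [0, 1]
# across substances, weight it, and sum into one score. All values invented;
# slices are assumed oriented so that larger values mean greater concern.

import numpy as np

substances = ["chem_1", "chem_2", "chem_3"]
slices = {
    "bioactivity_score": np.array([0.2, 5.0, 1.0]),
    "exposure_estimate": np.array([10.0, 0.1, 2.0]),
    "persistence":       np.array([1.0, 3.0, 0.5]),
}
weights = {"bioactivity_score": 2.0, "exposure_estimate": 2.0, "persistence": 1.0}

def toxpi_scores(slices, weights):
    total = np.zeros(len(substances))
    for name, values in slices.items():
        lo, hi = values.min(), values.max()
        scaled = (values - lo) / (hi - lo) if hi > lo else np.zeros_like(values)
        total += weights[name] * scaled
    return total / sum(weights.values())   # normalized overall score in [0, 1]

print(dict(zip(substances, toxpi_scores(slices, weights).round(2))))
```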

3) Does the Committee have suggestions for NAM-based tools that can be considered for further exploration for estimating human and ecological exposure for the purposes of prioritization?

As previously stated, it is the Committee’s understanding that the CMP has been primarily using traditional exposure data, models, and methods to provide the exposure context. The Committee suggests that traditional exposure data and models may be augmented using high-throughput empirical and mechanistic exposure models. An example of a high-throughput empirical exposure model is the U.S. EPA’s ExpoCast Systematic Empirical Evaluation of Models (SEEM) framework that currently uses production volume and binary descriptors of four use categories to provide exposure estimates for nearly 8,000 chemicals (Wambaugh et al., 2014). The empirical exposure estimates are calibrated using human biomonitoring data and have quantitative estimates of uncertainty. An example of a high-throughput mechanistic exposure model is Stochastic Human Exposure and Dose Simulation – High Throughput (SHEDS-HT) which uses an exposure pathway approach to provide exposure estimates for approximately 2,500 chemicals (Isaacs et al., 2014). It should be noted that both the SEEM and SHEDS-HT models are appropriate tools for prioritization, rather than serving as a substitute for individual and holistic exposure estimates in a risk assessment context. Finally (and separate from traditional CMP information gathering surveys for import, production, and use quantities for substances undergoing assessments), it might prove feasible to engage industry to provide production, import or use-based estimates from which it may be possible to model substance emissions and/or exposures in the context of developing new methodologies. To some extent, the above suggestions may be secondary to our suggestion under Charge Question #1 for HC and ECCC to create an overall NAM exposure strategy.
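
As a conceptual sketch of how simple exposure heuristics can be calibrated against biomonitoring-inferred exposures, in the spirit of the SEEM framework described above (but not the actual SEEM model), the example below fits an ordinary least-squares regression on invented predictors and data; a real calibration would use curated production-volume and use information and would carry uncertainty estimates.

```python
# Conceptual sketch of calibrating exposure heuristics against biomonitoring-
# inferred exposures. Predictors, coefficients, and data are invented.

import numpy as np

# Columns: intercept, log10 production volume, near-field use flag, pesticide flag
X = np.array([
    [1.0, 5.2, 1, 0],
    [1.0, 3.1, 0, 0],
    [1.0, 6.0, 1, 1],
    [1.0, 4.4, 0, 1],
])
# log10 exposure (mg/kg/day) inferred from biomonitoring for the same substances
y = np.array([-4.1, -6.5, -3.2, -5.0])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)   # ordinary least-squares fit

def predict_log10_exposure(log10_volume, near_field, pesticide):
    return coef @ np.array([1.0, log10_volume, near_field, pesticide])

# Prediction for a hypothetical data-poor substance
print(predict_log10_exposure(4.8, 1, 0))
```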

While they are not NAM-based tools, the Canadian Health Measures Survey (CHMS), the Canadian Human Activity Pattern Survey (CHAPS), newborn screening programs, and cord blood banks are possible sources of Canadian human biomonitoring/exposure data for some substances. These sources may be used to calibrate high-throughput empirical exposure models that are Canadian specific. Data from poison-control centres may also be available, although this information tends to be product-based, rather than substance-based.

4. a) Does the Committee have suggestions for NAM-based tools that can be considered for hazard identification for prioritization? How might these NAMs be incorporated in the decision points of the IRAP process?

4. b) Given that biological pathways are often conserved across species, does the Committee have input on how human health-based NAMs can inform ecological IRAP approaches and vice versa?

4. c) Considering the hazard NAMs identified by HC and ECCC and those identified by the Committee (Question 4a), what near-term opportunities and challenges are associated with the implementation of NAMs for the identification of new priorities for risk assessment?

It is envisaged that any NAM-based Hazard-Screen (HS) tools will be incorporated into the current tiered approach to chemical assessment in Canada by ECCC and HC, particularly in the short-term and for prioritization purposes. The main purpose of these tools will be to “flag” substances that require further in-depth assessment. Ideally, HS NAM data would not only flag substances of concern, but also inform the follow-up assessment to some extent. In this context, NAM data that indicate a possible association with a particular toxicological endpoint of regulatory concern, along with its severity and reversibility, are of interest but not essential.

CMP assessments are currently dependent mostly on the use of existing data from third-party sources and data that can be analysed using computational methods in-house. Thus, the resulting HSs will be limited in scope regarding the toxicological effects they can cover. However, it would be useful to attempt to specify and design an "ideal" HS from the start because it will help to put the available HSs in the short-term (dependent on available data) into perspective and to identify knowledge gaps and priorities for the future development of new NAMs. Considering short- and long-term priorities at the same time may be beneficial: in other words, what we can do now and, ideally, what we should do in the future.

Any NAM-based HS should be able to flag substances of concern to either human health or target species/genera in the environment. Although toxicokinetic-related aspects are not necessarily required in a qualitative HS, any risk-based prioritization activity, such as the bioactivity-to-exposure ratio (BER), will require an understanding of toxicokinetic-related aspects. At the highest level, the HS should be described in terms of the toxicological space it represents or covers. Alternatively, because it is anticipated that HSs will increasingly (over time) be based on upstream mechanistic profiling, rather than predictive of typical toxicological endpoints, one could describe the HS in terms of the “perturbable biological space” it represents (for example, in terms of biological pathways). However, the majority of environmental and industrial chemicals of interest in the CMP are highly promiscuous (in other words, they do not have a unique or selective pharmacologic specificity) and interact with many biological targets/pathways in a narrow dose range (Thomas et al., 2013b). As a result, the Committee suggests that the proposal to use the most sensitive assay considered to be reliable as the basis of the BER is reasonable as a default unless the substance appears to be relatively selective in its biological activity. For a selective substance, the BER calculation should be based on the relevant pathway(s).

At the next level, the HS can be described in terms of the NAMs that it incorporates and how they link to the toxicological/biological effects/pathways that they represent/recapitulate. To facilitate this, NAMs should be identified and annotated appropriately to serve this purpose; the technological basis of the NAM is of less concern. Thus, one could consider a NAM as a “lamp post” illuminating a defined sub-domain of toxicological space (or perturbable biological space); as more NAMs are assembled, more of the actual domain of interest can be covered within the HS, thus “lighting up” more space and increasing understanding. In this sense, it is important that the collection of NAMs comprehensively covers as much of the domain as possible in order to reduce false negatives, and that the boundary of the sub-domain that is covered is properly described. There is currently a considerable amount of research on NAMs and their components (for example, induced pluripotent stem cells, 3-dimensional (3-D) tissue models, transcriptomics, toxicokinetic models, toxicity pathway characterization) and thus it is likely that, in theory, an appreciable amount of biological/toxicological space can be covered. However, many of these NAMs are in a development phase and/or have been deployed to a very limited extent.

Although, at this point in time, the CMP relies on incorporating NAMs for which data are already available into its HS for substances of interest, there are many opportunities to cooperate with international partners to identify further sources of relevant NAMs and NAM data, and to explore the possibility of generating new NAM data. The initiative of the U.S. EPA to bring together international regulatory actors to "accelerate risk assessment" was mentioned as one potentially useful platform for such cooperation, as were some groups and projects of the Organisation for Economic Co-operation and Development (OECD).

5. a) Does the Committee have suggestions for NAM-based tools that can be considered for developing risk metrics for prioritization? How might these NAMs be incorporated in the decision points of the IRAP process?

Committee members supported the use of the BER as a cross-cutting approach that should be useful in both ecological and human health applications. Even today, in the absence of any other hazard flags, a large BER (for example, multiple orders of magnitude) will provide some support for de-prioritizing a particular substance. From an ecological perspective, the environmental analogue of the BER is critical body or tissue burden. HC is proposing to use the BER as a logical metric to prioritize/triage substances when Bioactivity and Exposure data are available. In the BER approach, the numerator generally relies on potency values derived from new approach methods (for example, high-throughput in vitro assays). In the majority of cases, the most sensitive in vitro assay is used to estimate the numerator of the BER. However, where the in vitro assays or NAMs provide sufficient evidence for the likely adverse outcome pathway (AOP) or mode of action (MOA), the Committee suggests that the numerator of the BER be based on potency values from the relevant assay or new approach method rather than on the most sensitive value.
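
The following minimal sketch illustrates the BER logic described above: by default, the lowest (most sensitive) bioactivity-derived dose is divided by the exposure estimate, while for a selective substance the calculation is restricted to pathway-relevant assays. Assay names, dose values, and the exposure estimate are hypothetical, and this is not an official CMP or departmental algorithm.

```python
# Minimal sketch of a bioactivity-to-exposure ratio (BER) calculation.
# All assay names, doses, and exposures are hypothetical.

def ber(oral_equivalent_doses_mg_kg_day, exposure_mg_kg_day, relevant_assays=None):
    """Return BER = bioactivity-derived dose / exposure estimate.

    oral_equivalent_doses_mg_kg_day: dict of assay name -> dose already
    converted from an in vitro potency via reverse dosimetry.
    relevant_assays: optional subset to use when the substance is selective.
    """
    doses = oral_equivalent_doses_mg_kg_day
    if relevant_assays:                      # selective substance: restrict to the pathway
        doses = {a: doses[a] for a in relevant_assays}
    most_sensitive = min(doses.values())     # default: most sensitive (lowest) dose
    return most_sensitive / exposure_mg_kg_day

doses = {"ER_agonism": 0.8, "mitochondrial_stress": 5.0, "cytotoxicity": 20.0}
print(ber(doses, exposure_mg_kg_day=1e-4))                 # promiscuous default
print(ber(doses, 1e-4, relevant_assays=["ER_agonism"]))    # selective case
```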

In order to develop risk metrics for prioritization, the potency values must be converted into administered-dose values, using reverse dosimetry methods. Researchers have started generating data for plasma protein binding and hepatic intrinsic clearance using HTTK assays to parameterize IVIVE models and provide a dose context (Wetmore et al., 2014). HC has generated some additional data on CMP chemicals using these same approaches. The Committee supports the continued development and application of these approaches and other pharmacokinetic approaches to enable risk-based prioritization.
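
A much-simplified reverse dosimetry sketch in the spirit of the HTTK/IVIVE approaches cited above is shown below: a steady-state plasma concentration per unit oral dose is estimated from fraction unbound and intrinsic hepatic clearance, and an in vitro potency value is converted into an oral equivalent administered dose. The physiological constants, the well-stirred liver assumption, and the example values are rough illustrations, not the parameterization used by HC, ECCC, or the published HTTK models.

```python
# Simplified reverse-dosimetry (IVIVE) sketch. Constants and values are
# rough illustrations only.

GFR_L_PER_H_PER_KG = 0.1      # approximate glomerular filtration rate
QH_L_PER_H_PER_KG = 1.2       # approximate hepatic blood flow

def css_per_unit_dose(fub: float, clint_L_per_h_per_kg: float) -> float:
    """Steady-state plasma conc. (mg/L) for a 1 mg/kg/day oral dose (100% absorbed)."""
    cl_renal = fub * GFR_L_PER_H_PER_KG
    cl_hepatic = (QH_L_PER_H_PER_KG * fub * clint_L_per_h_per_kg /
                  (QH_L_PER_H_PER_KG + fub * clint_L_per_h_per_kg))  # well-stirred model
    return 1.0 / (24.0 * (cl_renal + cl_hepatic))

def oral_equivalent_dose(ac50_uM: float, mol_weight: float,
                         fub: float, clint: float) -> float:
    """Administered dose (mg/kg/day) producing a Css equal to the in vitro AC50."""
    ac50_mg_per_L = ac50_uM * mol_weight / 1000.0
    return ac50_mg_per_L / css_per_unit_dose(fub, clint)

# Hypothetical substance: AC50 = 2 uM, MW = 300 g/mol, fub = 0.05, CLint = 0.5 L/h/kg
print(oral_equivalent_dose(2.0, 300.0, 0.05, 0.5))
```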

Dose-adjusted bioactivity measurements are compared with traditional or high-throughput exposure estimates to provide a BER. When available, Canadian-specific exposure data (or traditional exposure models based on Canadian-specific parameters) should be used; in cases where these are not available, the Committee suggests considering high-throughput exposure modeling approaches, such as the U.S. EPA’s ExpoCast (Wambaugh et al., 2013). The Committee also suggests consideration of coupling non-targeted analytical screening approaches with Canadian biomonitoring samples to provide experimental exposure data across a range of substances. One member of the Committee commented that while there is enthusiasm for non-targeted analytical screening of Canadian biomonitoring samples, the available resources thus far have been limited.

Machine learning approaches are also of interest. For a substance with no data, it may be possible to identify substances (for example, five) that have relevant/surrogate bioactivity (B) data and a similar number of substances that have surrogate exposure (E) data. The surrogate chemicals for B and E properties can be different. This approach could be tested using a panel of well-characterized chemicals (for example, ToxCast) or by purposefully removing B or E data for a given substance.
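
One way this surrogate idea could be prototyped is sketched below: for a data-poor substance, bioactivity (B) and exposure (E) values are borrowed from its most similar data-rich neighbours, where similarity is computed on precomputed chemical descriptors. The descriptors, similarity measure, neighbour count, and all data are illustrative assumptions rather than a recommended implementation.

```python
# Sketch of borrowing B and E values from similar substances. Descriptors,
# similarity measure, and data are illustrative.

import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def borrow_value(target_desc, library_desc, library_values, k=5):
    """Average the property over the k most similar substances that have data."""
    sims = {name: cosine_similarity(target_desc, desc)
            for name, desc in library_desc.items() if name in library_values}
    neighbours = sorted(sims, key=sims.get, reverse=True)[:k]
    return float(np.mean([library_values[n] for n in neighbours])), neighbours

rng = np.random.default_rng(0)
library_desc = {f"chem_{i}": rng.random(8) for i in range(20)}               # toy descriptors
bioactivity = {f"chem_{i}": rng.uniform(0.1, 10) for i in range(0, 20, 2)}   # B data subset
exposure = {f"chem_{i}": rng.uniform(1e-6, 1e-3) for i in range(1, 20, 2)}   # E data subset

target = rng.random(8)   # descriptor vector for the data-poor substance
b_est, b_neigh = borrow_value(target, library_desc, bioactivity, k=5)
e_est, e_neigh = borrow_value(target, library_desc, exposure, k=5)
print(b_est, e_est)      # surrogate B and E from (possibly different) neighbour sets
```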

The Committee previously noted that ERC is a tool designed by ECCC for ecological risk assessment under the third phase of the CMP. There was some interest expressed in using an approach similar to the ERC to inform human health risk-based prioritization. In addition, the utility of the ERC could be validated against a panel of test substances to determine the toxicological space/domain that the tool covers. Other new approaches of value are the TTC, ecoTTC, and short-term in vivo studies with transcriptomic characterization to estimate points-of-departure (Thomas et al., 2013a; Thomas et al., 2013b; Thomas et al., 2011). The application of transcriptomic technologies to risk assessment has been evaluated by HC (Bourdon-Lacombe et al., 2015).

5. b) Considering the risk metric NAMs identified by HC and ECCC and those identified by the Committee (Question 5a), what near-term opportunities and challenges are associated with the implementation of NAMs for the identification of new priorities for risk assessment?

There is a near-term opportunity for the CMP to partner with other international organizations on both the methods and common substances of concern in the application of NAMs, such as the BER for risk-based prioritization (see the Committee’s earlier suggestion).

The near-term challenges of the BER approach are the lack of direct linkage to apical endpoints, development of scientific confidence in the BER as a prioritization metric, and the need to incorporate uncertainty in both the numerator and denominator of the BER. Addressing the first and second challenges will require the continued pursuit of on-going and potentially new case studies that compare the BER with Margin of Exposure (MOE) values from traditional risk assessments using in vivo data. In addition, continued stakeholder dialogue will be required to ensure that researchers, industry scientists, assessors and the public understand the strengths and limitations of the BER approach. One Committee suggestion was to link the BER with a visual tool (such as ToxPi) to provide a graphic representation of the traditional in vivo study and NAM data that are available for a particular substance.

The BER approach may include confidence bands where the ratio of bioactivity to exposure is high, which will assist in communicating the uncertainties associated with the output; unbiased (in other words, natural) variability should not be an issue. False negatives (because of limitations in the screening NAM) are potentially serious and require constant expansion of the areas of biological activity covered, coupled with an “evergreen” philosophy of running all substances that were assessed previously through the expanded system. For example, the current in vitro screens do not cover some mechanisms of acute toxicity very well, although this has not been, and is not expected to be, a common issue in CMP assessments.

The incorporation of uncertainty in the BER will be necessary for application to regulatory decisions and to develop scientific confidence among stakeholders. The incorporation of uncertainty in the bioactivity potency estimates, HTTK modeling, and high-throughput exposure estimates has been investigated by the U.S. EPA. For example, the uncertainty in potency estimates can be assessed using bootstrapping approaches [for example, ToxBoot (available in English only)]; approaches to estimate uncertainty in HTTK assays and IVIVE modeling are under development. The uncertainty in high-throughput exposure estimates has already been incorporated in the ExpoCast models (Wambaugh et al., 2015). The Committee suggests that uncertainty be incorporated into both the numerator and denominator of the BER for regulatory application.
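
As a sketch of how such uncertainty could be propagated into the BER, the example below samples both the numerator (bioactivity-derived dose) and the denominator (exposure) from assumed lognormal distributions and reports a lower percentile of the resulting BER distribution rather than a point estimate. The distributions and parameters are illustrative and are not the U.S. EPA or CMP implementations.

```python
# Sketch of Monte Carlo propagation of uncertainty into the BER.
# Distributions and parameters are illustrative.

import numpy as np

rng = np.random.default_rng(42)
N = 10_000

# Lognormal uncertainty around a bioactivity-derived dose of ~1 mg/kg/day
bioactivity_dose = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=N)
# Lognormal uncertainty around an exposure estimate of ~1e-4 mg/kg/day
exposure = rng.lognormal(mean=np.log(1e-4), sigma=1.0, size=N)

ber_samples = bioactivity_dose / exposure
point_estimate = 1.0 / 1e-4
lower_5th = np.percentile(ber_samples, 5)    # conservative bound for prioritization

print(f"point BER ~ {point_estimate:.0f}, 5th percentile BER ~ {lower_5th:.0f}")
```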

Respectfully submitted on behalf of the Committee,
Barbara F. Hales and Geoff Granville, Co-chairs
23 December 2016

POSTSCRIPT

  1. The Committee wishes to recognize the three ad-hoc members, Niladri Basu (McGill University), Russell Thomas (U.S. EPA), and Maurice Whelan (European Commission, Directorate General Joint Research Centre), for their significant contributions to this report.
  2. Committee discussions in breakout groups mainly addressed the overall concept and design of a screening approach based on NAMs that would meet the needs of HC/ECCC. Suggestions and information relating to specific NAMs that HC/ECCC may wish to consider will be provided by individual Committee members after the meeting.

References

Bourdon-Lacombe JA, Moffat ID, Deveau M, Husain M, Auerbach S, Krewski D, Thomas RS, Bushel PR, Williams A, Yauk CL. 2015. Technical guide for applications of gene expression profiling in human health risk assessment of environmental chemicals. Regul Toxicol Pharmacol. 72:292-309.

ECCC, 2016. Science Approach Document: Ecological risk classification of organic substances.

Isaacs KK, Glen WG, Egeghy P, Goldsmith MR, Smith L, Vallero D, Brooks R, Grulke CM, Özkaynak H. 2014. SHEDS-HT: an integrated probabilistic exposure model for prioritizing exposures to chemicals with near-field and dietary sources. Environ Sci Technol. 48:12750-9.

Reif DM, Martin M, Tan SW, Houck KA, Judson RS, Richard AM, Knudsen TB, Dix DJ, Kavlock RJ. 2010. Endocrine profiling and prioritization of environmental chemicals using ToxCast data. Environ Health Perspect. 118:1714–1720.

Thomas RS, Clewell HJ III, Allen BC, Wesselkamper SC, Wang NC, Lambert JC, Hess-Wilson JK, Zhao QJ, Andersen ME. 2011. Application of transcriptional benchmark dose values in quantitative cancer and noncancer risk assessment. Toxicol Sci. 120:194-205.

Thomas RS, Philbert MA, Auerbach SS, Wetmore BA, Devito MJ, Cote I, Rowlands JC, Whelan MP, Hays SM, Andersen ME, Meek ME, Reiter LW, Lambert JC, Clewell HJ 3rd, Stephens ML, Zhao QJ, Wesselkamper SC, Flowers L, Carney EW, Pastoor TP, Petersen DD, Yauk CL, Nong A. 2013a. Incorporating new technologies into toxicity testing and risk assessment: moving from 21st century vision to a data-driven framework. Toxicol Sci. 136:4-18.

Thomas RS, Wesselkamper SC, Wang NC, Zhao QJ, Petersen DD, Lambert JC, Cote I, Yang L, Healy E, Black MB, Clewell HJ III, Allen BC, Andersen ME. 2013b. Temporal concordance between apical and transcriptional points of departure for chemical risk assessment. Toxicol Sci. 134:180-194.

Wambaugh JF, Setzer RW, Reif DM, Gangwal S, Mitchell-Blackwood J, Arnot JA, Jolliet O, Frame A, Rabinowitz J, Knudsen TB, Judson RS, Egeghy P, Vallero D, Cohen Hubal EA. 2013. High-throughput models for exposure-based chemical prioritization in the ExpoCast project. Environ Sci Technol. 47:8479-8488.

Wambaugh JF, Wang A, Dionisio KL, Frame A, Egeghy P, Judson R, Setzer RW. 2014. High throughput heuristics for prioritizing human exposure to environmental chemicals. Environ Sci Technol. 48:12760-12767.

Wambaugh JF, Wetmore BA, Pearce R, Strope C, Goldsmith R, Sluka JP, Sedykh A, Tropsha A, Bosgra S, Shah I, Judson R, Thomas RS, Setzer RW. 2015. Toxicokinetic triage for environmental chemicals. Toxicol Sci. 147:55-67.

Wetmore BA, Allen B, Clewell HJ III, Parker T, Wambaugh JF, Almond LM, Sochaski MA, Thomas RS. 2014. Incorporating population variability and susceptible subpopulations into dosimetry for high-throughput toxicity testing. Toxicol Sci. 142:210-224.

Zuang V, Asturiol Bofill D, Barroso J et al. 2016. EURL ECVAM status report on the development, validation and regulatory acceptance of alternative methods and approaches.
