Emerging technologies for food and drug safety

Emerging technologies are playing a major role in the generation of new approaches to assess the safety of food and medical products. The 2017 Global Summit on Regulatory Science (GSRS17) was held in Brasilia, Brazil on September 18-20, 2017 to discuss the role of new approaches in regulatory science, with a specific emphasis on applications in food and medical product safety. The global regulatory landscape concerning the application of new technologies was assessed for several countries worldwide. Challenges and issues were discussed in the context of developing an international consensus on objective criteria for the development, application and review of emerging technologies. The need for advanced approaches to allow for faster, less expensive and more predictive methodologies was elaborated, and the strengths and weaknesses of each new approach were discussed. Finally, the need for standards and reproducible approaches was reviewed to enhance the application of emerging technologies to improve food and drug safety. The overarching goal of GSRS17 was to provide a venue where regulators and researchers meet to develop collaborations addressing the most pressing scientific challenges and to facilitate the adoption of novel technical innovations that advance the field of regulatory science.
Dr. Shi described standards in precision medicine. His work focuses on pharmacogenomics, bioinformatics and cheminformatics, and aims to realize precision medicine by developing biomarkers for early cancer diagnosis, prognosis and personalized therapy. One important aspect of precision medicine is the intent to deliver the right medicine to the right patient at the right dose at the right time, thus maximizing drug efficacy and minimizing adverse effects. The implementation of precision medicine thus depends particularly on the availability of predictive biomarkers.
However, there currently is a lack of reliable predictive biomarkers for guiding treatment selection based on a patient's omic profiles alone, and a range of factors are involved. Many questions remain before precision medicine is attained.
Another effort has brought together key developers and consultants across many different sectors, initially with the intention of creating an overall strategy for in silico protocol development. Working subgroups will develop individual in silico toxicology protocols for major toxicological endpoints, including genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity and developmental toxicity.
Anil Patri addressed the issue of Reproducibility Considerations for Nanotechnology Products for Regulatory Review. Dr. Patri described several emerging technologies, standards and issues in reproducibility in nanotechnology, and how the global increase of nano-applications in drugs, devices and consumer products is paving the way for new advances in science, technology and medicine. Dr. Patri provided an overview that included points to consider for nanotechnology, naming some of its applications in medical products, providing a landscape of products submitted to FDA, discussing considerations on quality and reproducibility, and highlighting some of the needed standards. He also addressed the need for new standards for nanotechnology applications.
Another presenter provided important information on the topic of regulatory science and its contribution to GSRS/GCRSR, outlining ongoing research on key techniques for toxicity testing.
He also mentioned human-on-a-chip; novel biomarkers; in vitro 3D models; an in vitro cardiac toxicity evaluation model; a stem cell differentiation cell model; innovative biological products evaluation; traditional Chinese medicine evaluation techniques; evaluation of pediatric drugs and of special dosage forms; new phototoxicity models; study of a carcinogenicity replacement model; an in vitro reproductive and developmental toxicity model; multiple-endpoint genotoxicity detection; pharmacokinetic and toxicokinetic assays; early toxicity screening tests; molecular imaging techniques; pathological diagnostic techniques; drug dependency evaluation; and ophthalmological evaluation.
Dr. Tong provided an update from the technical working group. He explained that in early 2014 his group was asked to develop the GCRSR bioinformatics area as a working group, to establish common bioinformatics methodologies among the participating countries/agencies for dealing with complex data sets derived from emerging technologies for regulatory decision-making. The scope is the investigation, evaluation and development of bioinformatics approaches for analyzing and managing complex data sets to efficiently and effectively support the regulatory science applicable to food and medical product safety. The member countries/agencies are Australia (FSANZ and the Commonwealth Scientific and Industrial Research Organisation), Canada (CFIA and PHAC), the EU (EFSA and JRC) and the US (FDA).
The GCRSR Bioinformatics Working Group currently has two main activities: 1) engaging in international collaboration to establish international partnerships, particularly scientific collaboration and information exchange through the development of strategies for cross-training and exchanging regulatory scientists, and engaging research communities through workshops and conferences to understand standards for the reporting and analysis of data; and 2) fostering regulatory science research to assess research gaps in bioinformatics relevant to regulatory decision-making, particularly criteria and measures for data integrity, traceability, reliability and reproducibility; to emphasize fit-for-purpose metrics in applying bioinformatics to process complex data sets; and to implement horizon-scanning approaches for emerging bioinformatics methodologies and assess their potential impact on regulatory decision-making. Dr. Tong concluded by acknowledging the Bioinformatics Working Group members, particularly the Canadian Food Inspection Agency (CFIA) members Martine Primal Silva, Burton Blais, Dominic Lambert and Margaret Neuspiel; Gary Van Domselaar (Public Health Agency of Canada); and the Bioinformatics Working Group Technical Team.


Global Regulatory Landscape
To compare the status of regulatory science as applied to food and drug safety around the world, the Global Regulatory Landscape was assessed. To focus the presentations, presenters from Brazil, the European Union, Nigeria and Japan addressed several common questions. These included: 1) What are the common regulatory science issues and practices across governmental agencies? 2) Which infrastructure/mechanisms should be developed to address these common issues and practices? 3) How do you envision cross-training opportunities to facilitate the exchange of knowledge across agencies?
Dr. Meiruze Sousa Freitas, Deputy Director, Authorization and Health Registry, Brazilian Health Surveillance Agency (ANVISA), explained that ANVISA has a hierarchy based on legislation and the competency to enact legislation on health surveillance-related subjects. ANVISA is closely connected to the global community and the Good Regulatory Practices process. International approaches and information are collected through contact with authorities and discussions based on technical procedures with international cooperation groups.
In ANVISA, regulation is coordinated by two offices: the Drug General Management Office (GGMED) and the Food General Management Office (GGALI), which are subordinate to the Authorization and Health Registration Directory (DIARE). GGMED is composed of the Office of Safety and Efficacy Evaluation (GESEF), the Department of Therapeutic Equivalence (CETER), the Drug Registration Evaluation Office (GRMED), the Office of Specific and Herbal Drugs Registration Evaluation (GMESP), and the Office of Post-Approval Changes Evaluation (GEPRE). The staff consists of health regulation experts, regulation technicians and administrative personnel, covering a wide range of professionals, namely biologists, biomedical scientists, chemists, engineers, lawyers, pharmacists, physicians and statisticians. GGMED performs drug registration through the evaluation of drug applications, post-approval change applications and drug registration renewals. Other connected actions are "in situ" inspections related to the applications mentioned, audits of the application review process, inspections of pharmaceutical equivalence and bioequivalence/bioavailability centers, evaluation of safety and efficacy (clinical trials), and drafting/reviewing legislation. Recent improvements to ANVISA's review process include systematic standardized approaches to the benefit/risk assessment of medicines and a stepwise move towards implementation of Good Review Practices (GRevP).
ANVISA's other major component is the Food General Management Office (GGALI). It encompasses the Risk Assessment and Effectiveness Office (GEARE), the Food Registration Office (GEREG) and the Post-Registration Food Office (GEPRA). The professional review team at GGALI consists of health regulation experts and administrative personnel, including professionals such as food engineers, biologists, pharmacists, nutritionists and veterinarians. As an overview of the national food safety control system, primary production is the responsibility of the agricultural agencies, while responsibility for the subsequent stages of production is shared between the health and agriculture agencies. Actions of these agencies are largely co-dependent. In Brazil, food health control is a mixed model. In pre-marketing, assessment involves health promotion, registration and safety assessment, and food standards.
In summary, ANVISA uses best practices and international guidelines to create Brazil's standards. Information is frequently obtained through contact with other regulatory and international agencies, communication using existing technical knowledge bases available on various web sites, and confidential information exchange with other regulatory authorities.
Hans-Georg Eichler, MD, MSc, Senior Medical Officer, European Medicines Agency (EMA), United Kingdom (UK), EU, focused on regulatory science and the evolving role of pharmacovigilance. Pharmacovigilance stems from the 1970s and means "close watch". It is the practice of monitoring the effects of medical drugs following licensure and market entry, especially to identify and evaluate previously unreported adverse reactions. It is the science and activities related to detection, assessment, understanding and prevention of adverse effects or any other drug-related problem.
Dr. Eichler described some of the difficulties encountered when practicing precision medicine. Often the stratification criterion is not completely understood and patient numbers are smaller, yet there are advanced therapies (gene, cell, tissue-based), some of which are truly personalized. Personalized treatment combinations are even more complex in that scientists must deal with small patient numbers, and it is sometimes difficult to define clinical indications. Dr. Eichler also highlighted the difference(s) between efficacy and effectiveness in clinical practice. Efficacy is the extent to which an intervention does more good than harm under ideal circumstances (monitoring the patient taking the drug). Effectiveness is the extent to which an intervention does more good than harm when provided under the usual circumstances of health care practice. Accounting for heterogeneity will lead to more precise precision medicine and a different approach to drug development. Dr. Eichler also mentioned that it is very likely that regulators and the health delivery system will need to shift some health care practices in the future if heterogeneity is accounted for.
Dr. Eichler believes that the future evolution of pharmacovigilance will be facilitated by pay-for-performance schemes and pre-licensing real-world data. This will be the pinnacle of individualized pharmacovigilance. There will be a shift from the population to the patient level, and from prediction to observation-based decision-making. Treatment responders and non-responders will be identified, and the use of patient-level outcomes would enable outcome-based contracting and perhaps further improve patient safety and public health by focusing on patient experiences. Dr. Eichler concluded by emphasizing that the development of pharmacovigilance will likely bring shifts of focus from safety-only to benefits and harms, from pharmacovigilance to knowledge generation, and from a population focus to a patient-level focus. He also indicated that a continuum of knowledge generation spanning the pre- and post-licensing phases will emerge. Evidence will then be based on a diverse family of data sources and complementary methodologies.
Herbert Deluyker, PhD, Science Advisor, European Food Safety Authority (EFSA) described the establishment of EFSA in 2002 through the EU Food Law (Reg. EC 178/2002), which introduced the functional separation of risk assessment and risk management. EFSA's mission is to be the EU's independent risk assessment body for food and feed safety. The remit of EFSA concerns the entire food chain, covering aspects of human, animal and plant health, and sometimes environmental protection.
Scientific assessments can be viewed from the perspective of the conduct of a scientific experiment. This includes a hypothesis that is examined by scientific experts using existing evidence and employing agreed-upon assessment methods, the results of which are made public.
Dr. Deluyker described the EFSA risk assessment paradigm as "Risk = Seriousness x Vulnerability x Scale". In the paradigm, "seriousness" refers to what the effect is and how severe it is; "vulnerability" concerns the probability of being affected; and "scale" refers to who is affected, broken down by age, gender, etc. In food safety risk assessment, the aim is not to conclude that a chemical compound, microorganism or a mixture thereof is perfectly safe under any circumstances. Rather, the objective is to estimate the upper limit below which exposure to a hazard does not constitute a risk. The nature of the hypothesis may evolve from such a risk-only to a risk-risk or a risk-benefit assessment.
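The multiplicative paradigm above can be illustrated with a short sketch. The 1-5 scoring scale and the example values below are hypothetical, introduced only to show how the three factors combine; EFSA does not prescribe these numbers.

```python
# Illustrative sketch of the multiplicative risk paradigm described above.
# Scales (1-5 per factor) and values are hypothetical, for illustration only.

def risk_score(seriousness: float, vulnerability: float, scale: float) -> float:
    """Risk = Seriousness x Vulnerability x Scale."""
    for name, value in [("seriousness", seriousness),
                        ("vulnerability", vulnerability),
                        ("scale", scale)]:
        if value < 0:
            raise ValueError(f"{name} must be non-negative")
    return seriousness * vulnerability * scale

# Two hypothetical hazards scored on a 1-5 scale per factor:
contaminant_a = risk_score(seriousness=4, vulnerability=2, scale=3)
contaminant_b = risk_score(seriousness=2, vulnerability=1, scale=5)
assert contaminant_a > contaminant_b  # hazard A ranks higher in this example
```

Because the factors multiply, a hazard that is severe but affects almost no one can rank below a milder hazard with a very large affected population, which matches the intent of the paradigm.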
The scientific bodies ultimately responsible for EFSA's scientific advice are its (currently) 10 Scientific Panels, plus an overarching Scientific Committee. Over the years, 'cross-cutting' guidance has been developed by EFSA's Scientific Committee. These serve as harmonized scientific methodologies across multiple fields within EFSA's remit. For example, assessment of scientific evidence is carried out at several levels of aggregation: the level of an individual study, the combination of several studies on the same endpoint, combining different strands of evidence, integrating evidence from other compounds, and the overall risk assessment which brings together all the steps of the process. In addition, topic-specific guidance has been developed. The scientific assessment framework will continue to evolve. In particular, a hypothesis-driven approach is to be based on a prior elucidation of the mode of action through various screening systems. EFSA does not generally conduct primary research, but rather assesses existing scientific evidence. For scientific assessment advice to be delivered prior to market authorization, the data requirements to be fulfilled are typically stated in sector-specific legislation. To meet these data requirements, evidence is to be generated through studies funded by the applicant. New types of evidence may be collected and become available only after market authorization is granted, e.g. human and environmental exposure and health events.
For food chain contaminants, the burden for funding applied research assessing their impact on human safety and environmental protection reverts to public institutions. EFSA's budget does not allow the funding of research and development to fill information gaps, be it on specific compounds or on research underpinning the development and implementation of new assessment methods. Hence, data gaps are identified and highlighted to organizations that are responsible for funding such research.
The acceptability of a scientific assessment depends on several legitimizing characteristics being met: quality, consistency, independence and impartiality, as well as transparency and openness.
Other key considerations are relevance, evolving expectations and innovations, fitness-for-purpose and efficiency, along with sustainability of the system.
In conclusion, Dr. Deluyker described policy development as a cycle, not a linear process. As an example, he mentioned the development and dissemination of antibiotic resistance. By and large, the scientific assessment process in place at EFSA can be understood to mimic the conduct of a scientific experiment. Since its creation only 15 years ago, EFSA has very much delivered on its mission. Whatever the achievements, the EU cannot rest on its laurels.
Orish Ebere Orisakwe, Ph.D., Toxicology Unit, Department of Experimental Pharmacology, University of Port Harcourt, Nigeria described regulatory science and emerging technologies in Sub-Saharan Africa (SSA). He described the geography and economy of SSA, a situation that Nigeria shares, and stated that many foods and drugs in SSA are fundamentally unsafe.
SSA comprises Angola, Benin, Botswana, Burkina Faso, Burundi, Cabo Verde, Cameroon and the Central African Republic, among others, and represents a fast-growing economy with a middle class of more than 350 million people and increasing imports of goods. However, poor/inadequate infrastructure hinders SSA's economic growth, and poor management of existing assets generates waste and lost opportunities, compounded by the huge volume of goods imported from developing nations.
Food laws in Africa do exist, but enforcement is the challenge. To understand the contamination profiles of some foods, including heavy metal pollution, Dr. Orisakwe's team tests products purchased at markets for contamination. With regard to heavy metals, levels in some canned and uncanned beverages are significant (Maduabuchi et al., 2007). To better define the exposure routes and destination of electronic waste-related mixtures of toxicants in the environment, the team noted that electronic dumps contaminate irrigation water and that animals frequently feed near those dumps, a possible route of contamination risk to the human population (Orisakwe and Frazzoli, 2010; Orisakwe et al., 2012). Again, Nigerian food laws exist, but many are poorly enforced.
Moving to the pharmaceutical product area, Dr. Orisakwe stated that some of the sources of unsafe drugs in SSA include substandard or falsified ingredients (diethylene glycol used instead of propylene glycol in paracetamol syrup), inappropriate, unsafe and toxic preparations, expired drugs that are re-labelled (issued without complete manufacturing information), and drugs that are not registered with the National Agency for Food and Drug Administration and Control (NAFDAC). Current estimates suggest that as much as ten percent of prescription drugs sold around the world are substandard or falsified, and in some parts of Africa and Asia the figures may exceed 50% (Newton et al., 2001; Cockburn, 2002). Dr. Orisakwe also described the hazard of heavy metals in Nigerian herbal medicines:
- Herbal products are unregulated in Nigeria and other low-income countries, and the safety of herbal medicines is poorly understood.
- The public health hazards from ingestion of herbal medicines should be identified and disclosed through scientific risk assessment studies.
The lack of attention to food and drug regulatory systems in low- and middle-income countries is a significant gap that must be bridged given the reality of globalization. Some of his recommendations were:
- Sound regulatory policies predicated on science will be a useful tool in boosting predictive and preventive science.
- Preventive medicine is a hallmark of public health.
Toxicology (essential in predictive and preventive science and, by implication, regulatory science) remains an under-represented discipline in SSA, and bioinformatics remains largely unknown. Application of dry-lab bioinformatic tools such as NCTR's predictive Liver Toxicity Knowledge Base (LTKB) would be useful in SSA regulation.
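The market-survey testing described above amounts to screening measured concentrations against maximum permitted levels. A minimal sketch, with entirely hypothetical product names, limits and measurements (no values here come from the cited studies):

```python
# Hedged sketch of a market-survey screen: flag food samples whose measured
# heavy-metal concentrations exceed maximum limits. All limits and
# measurements below are hypothetical, for illustration only.

MAX_LIMITS_MG_PER_KG = {"Pb": 0.1, "Cd": 0.05}  # hypothetical maximum levels

samples = [
    {"product": "canned drink A", "Pb": 0.04, "Cd": 0.01},
    {"product": "canned drink B", "Pb": 0.25, "Cd": 0.02},  # Pb over the limit
]

def exceedances(sample):
    """Return the metals whose measured level exceeds the maximum limit."""
    return [m for m, limit in MAX_LIMITS_MG_PER_KG.items()
            if sample.get(m, 0.0) > limit]

flagged = {s["product"]: exceedances(s) for s in samples}
# Only "canned drink B" is flagged, and only for Pb, in this synthetic set
```

In practice the limits would come from the applicable food standard and the measurements from an accredited laboratory; the screen itself is just this comparison, repeated per sample and per analyte.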
In response to a question as to whether Nigeria's scientific technology can support sound decision-making, Dr. Orisakwe indicated that the biggest challenge is the lack of research equipment; and in some agencies where equipment is available, the necessary expertise and training are often lacking. He suggested that bridges between academia and the regulatory agencies may help address the problem.
Dr. Naoyishi Yasuda presented information focused on the role of the National Institute of Health Sciences (NIHS), Ministry of Health, Labor, and Welfare (MHLW), Japan. He introduced regulatory science and its infrastructure and implementation in Japan, where scientists believe that regulatory science combines knowledge of academic science from various fields to make the best judgment for society. It is therefore perceived as an intellectual science, which provides the criteria for making regulatory decisions and forms the backbone of implementing regulations. Two elements are associated, namely "Evaluation Science", which predicts outcomes from science and technology, and "Proper Regulatory Science", which realizes the coordination between the human being and society. In Japan, regulatory science is driven by four institutions, each with a specific role in promoting regulatory science: 1) the Ministry of Health, Labor and Welfare, which decides and organizes basic policy; 2) the Pharmaceuticals and Medical Devices Agency (PMDA), in charge of drug reviews and safety; 3) the National Institute of Health Sciences (NIHS), responsible for developing official evaluation methods and tests; and 4) the Japan Agency for Medical Research and Development (AMED, established in 2015), which provides grants to facilitate research and development.
The Clinical Innovation Network (CIN) is used in Japan to collect real-world evidence to develop pre- and post-marketing safety measures. By utilizing registered patients' information rather than traditional registries, it is hoped that CIN registries can be used for new drug development and post-marketing surveillance. To combine such initiatives, a regulatory science center is scheduled to be established within PMDA in 2018. This new center will provide the infrastructure for a consistent regulatory response based on a regulatory science perspective. Dr. Yasuda stated that the introduction of regulatory science into regulatory systems and into external engagement will be further strengthened in the coming years. He hopes that Japan can meet the needs of this trend and engage in information sharing with other countries.

Drug and Food Safety
Approaches and methods to assess drug and food safety were addressed by several presenters. A series of questions was used to drive the presentations forward, including: 1) What new approaches may contribute to faster, less expensive and higher quality predictions? 2) What information concerning these new approaches is essential to their appropriate application? 3) What are some of the strengths and limitations of these newer technologies?
Frank Sistare, Ph.D., Scientific Associate Vice President, Safety Assessment and Laboratory Animal Resources, Merck and Co., Inc., USA addressed these questions by focusing on the need to establish alignment among regulatory agencies and industry toward reducing the drug development costs, timelines and attrition associated with drug-induced liver injury and carcinogenicity testing. He pointed out that the 2-year rodent carcinogenicity study, involving the use of both rats and mice, is the longest, largest and costliest animal toxicology study conducted in pharmaceutical development, and that drug-induced liver injury is one of the most poorly predicted human risks based on the results of conventional animal toxicology studies. Such poor liver safety prediction has historically resulted in late-stage drug attrition or market withdrawals, with devastating effects on human safety, drug development costs and timelines.
Dr. Sistare first reviewed how pharmaceutical industry members and drug regulatory agencies had begun in 2010 to leverage decades of internal conventional study data to support a revised testing strategy that would allow earlier recognition of drug candidates devoid of human carcinogenic risk, eliminating the need for conducting a high percentage of 2-year rat carcinogenicity studies. Supported by these data, international negotiations were initiated and a prospective confirmatory testing period was launched to evaluate the reliability of a Carcinogenicity Assessment Document (CAD)-based approach for assimilating data to support a prediction by sponsors, and a structured dialog with regulatory scientists about the outcomes of new rat carcinogenicity studies conducted between 2013 and 2018. If confirmed, the approach is expected to eliminate 40% of 2-year rat carcinogenicity studies. Dr. Sistare explained that this new framework is also anticipated to encourage and reward practical application of innovative carcinogenicity assessment approaches going forward. He stated that the opportunity now presents itself to align on how best to leverage carcinogenomics, epigenetics, systems biology, stem/progenitor cell models, novel mechanistic endpoints, genetically engineered models, and humanized animal models to identify, explain, and/or predict human-relevant and human-irrelevant tumorigenic mechanisms. These tools promise to pragmatically inform, optimize, and accelerate this evolution, and to result in further significant reductions of 2-year rat carcinogenicity studies well beyond the 40% anticipated from conventional study data alone. For example, human-irrelevant tumorigenic mechanisms involving PPARa, PXR, and CAR for rat liver (and other tumor site) carcinogenicity can be identified using carcinogenomics.
Genomic signatures for a P53/DNA repair response, aryl hydrocarbon receptor activation, and estrogen receptor activation, on the other hand, may prove to identify more human-relevant mechanisms. But for both human-irrelevant and human-relevant carcinogenomic-based mechanistic signatures, thresholds of signature responses, gene signature details, and further performance evaluations are needed. Dr. Sistare indicated there is opportunity for "leveraging genomics in a pragmatic manner to inform dose-dependency of both human relevant and human irrelevant mechanisms of rat carcinogenicity, and support the Carcinogenicity Assessment Document-based waiver approach being considered by the International Council for Harmonisation, S1 Rodent Carcinogenicity Studies for Human Pharmaceuticals, Expert Working Group." Qualification and alignment will be needed to move this additional area forward.
Recently, improved in vivo and 3-D in vitro translational liver models and novel endpoints have been introduced and adopted within Merck & Co. Inc., and they are performing very well toward improving early de-risking for human liver safety. A mechanism-based strategy has been adopted using both in vivo and in vitro endpoints for improving prediction of unsafe doses for drugs associated with high drug-induced liver injury (DILI) potential resulting from: 1) reactive metabolite formation; 2) alteration of bile acid homeostasis; 3) mitochondrial toxicity; and 4) innate and/or acquired immune system activation. The company has built on published benchmarking results and has further benchmarked internally several new in vitro liver models based on a prioritized set of core liver attributes deemed critical for supporting these DILI mechanisms, e.g., drug metabolism activity, drug and endogenous biochemical transport, albumin synthesis, urea synthesis, and both basal and induced gene expression. Based on these evaluations, the rat and human HepatoPac systems have been adopted for routine use in conjunction with novel in vivo approaches for de-risking DILI potential, based on very favorable in vitro/in vivo correlation data encompassing hundreds of test compounds to evaluate drug metabolism, response to reactive metabolite formation, bile acid perturbation, and mitochondrial toxicity. Widespread alignment and adoption toward consistent implementation of any such improved in vivo/in vitro mechanism-based liver toxicity testing strategy will require similar data sharing, coordinated collaboration, and leadership as described above for driving alignment to the changes that are on the horizon in carcinogenicity testing.
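An in vitro/in vivo correlation across test compounds, of the kind invoked above, can be summarized numerically as a correlation between paired endpoints. The sketch below uses entirely synthetic compound data and a plain Pearson coefficient; it is one illustrative way such a comparison might be quantified, not the company's actual benchmarking method.

```python
# Minimal sketch: quantify agreement between an in vitro endpoint and an
# in vivo endpoint measured on the same compounds. All values are synthetic.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired endpoints for five compounds:
in_vitro = [0.2, 0.5, 1.1, 1.8, 2.4]   # e.g. fold-change in a stress marker
in_vivo  = [0.1, 0.6, 1.0, 2.0, 2.2]   # e.g. relative liver enzyme elevation

r = pearson(in_vitro, in_vivo)
assert r > 0.9  # strongly correlated in this synthetic example
```

A high coefficient across a large, diverse compound set is what would support relying on the in vitro model for early de-risking; real evaluations would also examine rank concordance and classification accuracy, not a single correlation number.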
Gary Van Domselaar, PhD, Chief, Bioinformatics, National Microbiology Laboratory, Public Health Agency of Canada described the application of new data streams generated from next-generation sequencing (NGS) and their utility for food microbiology, pathogen identification, and illness outbreak detection. Defining best practices for data integrity, reproducibility, and traceability will ensure reliable, auditable, and transparent processes underlying food microbiology risk management decisions. These points summarize some of the objectives and tasks that guide the Bioinformatics Working Group of the GCRSR. In his presentation, Dr. Van Domselaar outlined the general principles developed by the Bioinformatics Working Group to guide regulatory authorities in the transition toward the use of NGS technologies for food safety applications, and to facilitate interactions and the transfer of information in the interest of public health (Lambert et al., 2017). Dr. Van Domselaar pointed out that food authorities have a significant responsibility to ensure the safety of food for the consumer, and should strive to incorporate the best scientific methods available. These include activities such as source attribution, risk assessment (identification in gene tests), and a focus on developing rapid diagnostic testing. NGS methods are replacing the traditional molecular tests used to inform these activities, but this process presents several significant challenges that must be addressed before the new methods can be approved for application in a food regulatory environment. It is important to involve all stakeholders to improve the prospects of standardizing the organization of metadata. In support, a consortium was formed with a focus on ontologies such as GenEpiO (https://genepio.org) and FoodOn (https://foodontology.github.io/foodon/). Dr. Van Domselaar also described the consortium's work with the FDA and other international organizations.
He mentioned the challenges of building a viable and sustainable international structure, including the difficulty of adopting universal standard(s), as many organizations rely on their own vocabularies. Regarding good lab practices, standards already exist, e.g. ISO 17025, which encompasses general requirements for the competence to carry out tests and/or calibrations, and existing methods should be taken into consideration. This includes standards around information technologies, as labs must demonstrate uninterrupted competence despite heavy reliance on IT for data collection, analysis, transfer, storage, security, and integrity. Any software must be validated and fit for purpose. Finally, there is a need to return to relevant stakeholders to ensure applicability and efficiency. Standard server and desktop arrangements cannot keep pace with the growth of genomic data, and increasingly, high-performance computing infrastructures are being deployed within food safety institutions to provide the requisite computing power and scalability. In terms of reporting and interpreting results for source attribution, little historical data is available for the new tests, and guidelines for interpretation are being developed. To validate accuracy, one must be able to assess the proportion of false negative and false positive results generated by NGS methods, and determine the acceptable limits of these errors for the intended end-purposes for which those methods are being developed. Data quality assessment parameters and traceability information are also being discussed. Through coordination by a large international group and two years of focused effort, these good practices were developed.
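The accuracy-validation step described above, assessing false negative and false positive proportions against pre-set acceptable limits, can be sketched as follows. The sample counts and the acceptance limits are hypothetical, standing in for whatever a fit-for-purpose validation study would specify.

```python
# Hedged sketch: estimate error rates of a new NGS method against a
# reference ("gold standard") method on a validation panel, then compare
# them to acceptance limits. All counts and limits are hypothetical.

def error_rates(tp: int, fp: int, tn: int, fn: int):
    """Return (false_negative_rate, false_positive_rate) from a confusion matrix."""
    fnr = fn / (tp + fn)   # missed true positives
    fpr = fp / (fp + tn)   # false alarms among true negatives
    return fnr, fpr

# Hypothetical validation panel: 200 samples with known status
fnr, fpr = error_rates(tp=95, fp=3, tn=97, fn=5)

# Hypothetical fit-for-purpose acceptance limits for the intended use
MAX_FNR, MAX_FPR = 0.10, 0.05
acceptable = fnr <= MAX_FNR and fpr <= MAX_FPR
```

The acceptance limits would differ by end-purpose: a screening assay might tolerate a higher false positive rate (resolved by confirmatory testing) but require a very low false negative rate, while a confirmatory assay would weight the errors the other way.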
Davide Arcella, MS, Scientific Assistance Directorate, Evidence Management, European Food Safety Authority (EFSA), EU, emphasized EFSA's mission of providing 'trusted science for safe food' for the protection of consumers through independent scientific advice on risks in the food chain. In addition to providing updated and independent data to the general public, the regulatory EU agency works in a collaborative way with its member states, institutional partners and other interested parties with the objective of providing consistent, trusted advice on the quality of the EU food safety system. The range of domains encompassed by EFSA is broad, including multiple scientific fields from plant health and protection, genetically modified organisms, biological hazards and chemical contaminants to food additives, nutrition, and others. The use of common standards with other scientific and regulatory bodies is a precondition to improve data interoperability and facilitate data sharing and exchange. Therefore, Mr. Arcella emphasized the importance of open innovation and cooperation with other bodies for better outputs.

ACCEPTED MANUSCRIPT
While recalling EFSA's role as a central repository for pan-European data from diverse national control and monitoring programs, he explained who the main data providers of this system are: EU Member States, the European Commission, industry, consumer associations, universities and academia, etc. FoodEx2 is a food classification system that was created to support EFSA's strategy and values, aimed at enhancing the quality of its outputs and transparency by giving direct access to data and promoting the development of collaborative platforms in Europe and internationally. Another important aspect, and one of the key strategies involving FoodEx2 mentioned by Mr. Arcella, is to maintain daily data collection in Europe and internationally, as this allows for data enhancement and re-use, and fosters innovation. EFSA invested in this organization and standardization to enable sophisticated analyses and sharing of information. The need for a 'common language', a universal identification for food items, was reinforced. This common language is vital when working together and to ensure positive outcomes in food safety management and, consequently, in food security.
The food description system is composed of categories and subcategories in several domains, including Consumption, Production and Retail, Management of food safety and food security. Classification is done by grouping food objects based on similar properties.
FoodEx2 is embedded in the standard data models used by EU Member States to transmit data to EFSA in relation to several food safety domains, e.g. for analytical results, food consumption, etc. Currently, EFSA is working on expanding the use of FoodEx in the EU.
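As a rough illustration of how a FoodEx2-style food description works (a base term refined by descriptive facets, with classification done by grouping on shared properties), the sketch below uses invented terms and facet vocabularies; actual FoodEx2 codes and facets differ.

```python
# Hypothetical facet vocabularies; real FoodEx2 vocabularies are far larger
# and use controlled codes rather than plain words.
FACETS = {"process": {"raw", "canned", "dried"},
          "source": {"cow", "goat", "soy"}}

def describe(base_term, **facets):
    """Compose a food description from a base term and optional facets."""
    for name, value in facets.items():
        if name not in FACETS or value not in FACETS[name]:
            raise ValueError(f"unknown facet {name}={value}")
    return {"base": base_term, "facets": facets}

def same_group(a, b, facet):
    """Group food objects by a shared facet value (classification by property)."""
    return a["facets"].get(facet) == b["facets"].get(facet)
```

For example, `describe("milk", source="cow", process="raw")` and `describe("cheese", source="cow", process="dried")` fall in the same group under the `source` facet; this is the kind of 'common language' that makes pooled consumption data comparable across countries.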
The Food and Agriculture Organization (FAO) and the World Health Organization (WHO) are already using FoodEx2 and have endorsed it, as it facilitates the collection of food consumption data and food composition data worldwide while keeping the list of food terms updated.
To regulate dietary supplements within best international practices, ANVISA is preparing a regulatory proposal to create a food supplement category with the following objectives: contribute to access to safe and quality dietary supplements; reduce the asymmetry of information related to this market; facilitate regulatory control and risk management; eliminate unnecessary barriers to marketing and innovation; and simplify the regulatory stock. The initial list of authorized constituents contains 242 substances, and the initial list of permitted claims totals 57 claims for 36 constituents.
The proposal includes certain criteria for updating the lists of supplements, aimed at providing safe and healthful dietary supplementation. For example, some conditions are applicable to all constituents, i.e., safety assessment; provision of nutrients, enzymes, probiotics or bioactive substances; and specifications in recognized references. The difference between general and specific claims for probiotics was also explained, as they require different levels of evidence. For general claims, descriptions should include the potential effect, the use of probiotics and the contribution to a healthy gastrointestinal system. For specific claims, the effect must be proven. During the development of this regulation, some complementary actions were outlined, such as specific regulation of GMP for dietary supplements; the monitoring of adverse effects from the consumption of supplements; preparation of a Frequently Asked Questions document; a review of the guide on Food and Ingredients Safety Assessment; guidelines on the substantiation of claims in foods and supplements, and on overdose and stability of dietary supplements; and training for Brazilian National Health Surveillance System (SNVS) personnel.
Ms. de Souza Lima described the update of ANVISA's Good Practices Guidelines and how the Agency is addressing marketing issues. She highlighted the importance of technology and innovation in these processes, and emphasized a shift toward an online system for product registration.
Chia-Ding Liao, Ph.D., Food and Drug Administration, Ministry of Health and Welfare (MHW), Taiwan, addressed the use of advanced technology to detect adulteration of edible oils used in foods. Food safety is critical for public health, as foodborne diseases affect people's health and well-being. Food adulteration is a problem because informal food production and distribution systems are deeply entrenched at the community level worldwide. Increasing numbers of incidents have raised food safety concerns among consumers and policymakers globally.
Adulteration costs the global industry several billion dollars every year, and negatively impacts public confidence in food producers and regulators. Adulteration can also result in significant public health consequences. Given these and other factors, food fraud must be tackled globally if countries are to effectively address the potential financial and health-related consequences.
There are multiple ways of adulterating food products, including unapproved enhancements, dilution, substitution, mislabeling, non-disclosure and concealment. NCFPD and USF cite oils among the top 10 most adulterated foods/ingredients. Dr. Liao's presentation included data on different analytical techniques used to confirm food authenticity and detect fraud, including sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods.
Many tests were performed for compounds that could distinguish animal fats from vegetable oils. The contents of major phytosterols, including campesterol, β-sitosterol, and stigmasterol, are not significantly affected by the oil refining process (Kochhar, 1983). Phytosterols could therefore potentially serve as target compounds to identify animal fats that have been adulterated with refined cooking oils. The study aimed to (1) characterize phytosterol contents in homemade lards, commercial lards and lards that were inspected in the 2014 adulteration incident in Taiwan, and (2) evaluate the effectiveness of using phytosterol as a target compound to identify adulterated animal fats.
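The screening logic behind objective (2) can be sketched as follows: since genuine animal fats contain little phytosterol while refined vegetable oils retain it, a total phytosterol content above some empirically derived threshold flags possible adulteration. The threshold and concentrations below are invented for illustration, not values from the study.

```python
# Marker-based screening sketch: elevated phytosterol in a lard sample
# suggests adulteration with vegetable oil. Threshold is illustrative.

PHYTOSTEROLS = ("campesterol", "beta-sitosterol", "stigmasterol")

def total_phytosterol(sample_mg_per_100g):
    """sample_mg_per_100g: dict of phytosterol concentrations (mg/100 g)."""
    return sum(sample_mg_per_100g.get(p, 0.0) for p in PHYTOSTEROLS)

def flag_adulteration(sample_mg_per_100g, threshold_mg_per_100g=30.0):
    """Flag a sample whose total phytosterol exceeds the screening threshold."""
    return total_phytosterol(sample_mg_per_100g) > threshold_mg_per_100g
```

In practice the threshold would be calibrated against authentic homemade and commercial lards, which is exactly what objective (1) of the study provides.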

Emerging Methodologies
The need for advanced approaches to allow for faster, less expensive and more predictive methodologies was presented and the strengths and weaknesses of each new approach discussed. Questions centered around the gaps that the novel approaches would attempt to address, the advantages of the new approach(es) over existing technologies and the limitations of the new approaches once applied to assess food, drug or chemical safety.
Challenges were presented, as the chemicals to be addressed included those with a range of data availability, from data-rich to data-poor, thereby introducing the opportunity to explore and integrate emerging data and novel approaches. Dr. Barton-Maclaren indicated Canada was moving toward the integration of novel tools and methodologies to support screening, prioritization and risk assessment activities, consistent with the global efforts focused on the risk assessment paradigm shift. From 2007 to 2012, foundational reports, including reports by the National Academy of Sciences, recommended a shift in toxicity testing and risk assessment. To illustrate the ongoing applied research conducted with the goal of practical implementation in Canada's Chemicals Management Plan (CMP), Dr. Barton-Maclaren also presented a collaborative case study between HC and the EPA on a group of Substituted Phenols to be addressed under the third phase of the CMP.
The interest in this group of substances for the case study stems from a human health concern: phenols have the potential to be estrogenic. The objectives were 2-fold: (1) to investigate the utility of new approach methodologies to support priority setting and risk assessment, and (2) to investigate the utility of combining new approach methodologies in an Integrated Approaches to Testing and Assessment (IATA)-based hazard characterization to address data-poor substances. Dr. Barton-Maclaren outlined several key elements and explored the case study in a two-pronged approach relating to the types of approaches used for risk assessment under the CMP; additional details on the Risk Assessment Toolbox can be found at https://www.canada.ca/en/health-canada/services/chemical-substances/fact-sheets/chemicals-management-plan-risk-assessment-toolbox.html. In Part 1, systematic approaches were used to identify valid source analogues through the exploration of the utility of Quantitative Structure Activity Relationship (QSAR) predictions and High Throughput Screening (HTS) data to support IATA-based hazard characterization for de novo or more complex risk assessment of substances or groups of substances. The objective was to apply these data and approaches to substantiate analogue selection for in vivo data read-across for estrogenicity and, further, to support preliminary weight-of-evidence of estrogenicity activity for CMP Substituted Phenols. In Part 2, the goal of the analysis was to extend the application of the HTS data to priority setting and assessment through the comparison of the bioactivity-to-exposure ratio (BER), based on the concentration at 50% of maximum activity from the ToxCast HTS data, with the traditional margin of exposure (MOE). This approach examines the utility of HTS data to predict the potential level of concern for human health effects for priority setting and screening-level risk assessment. Together, the outcomes of this case study contribute to building confidence in the use of alternative methodologies in risk assessment, using estrogenicity as an example. Dr. Barton-Maclaren also highlighted the progress that Health Canada has made toward the integration of quantitative toxicogenomics for priority setting and assessment; this is an example of applied tools development. Currently, toxicogenomics complements conventional toxicological approaches. Applications include informing weight-of-evidence approaches, particularly in supporting linkages between exposure, mechanism/Mode of Action (MoA) and adverse effects (particularly for chemicals that are data-poor); supporting the establishment of chemical groups based on similar gene expression profiles (i.e., for read-across); and tiered approaches to assessment. Although research at Health Canada also focuses on predictive and mechanistic toxicogenomics, advances in quantitative toxicogenomics to determine the doses at which the system or specific toxicity pathways begin to be perturbed were specifically highlighted as an emerging risk assessment application. Quantitative toxicogenomics is a dose-response analysis that is analogous to conventional approaches. Using high-throughput analytical tools (US EPA's BMDExpress 2.0 software), benchmark dose modeling is conducted on every gene that exhibits a response. 
The genes are then assigned to their pathways or networks for Point of Departure (POD) derivation. The POD determined by the lowest pathway BMD provides a reasonable estimate of apical effects. This has been shown through a series of case studies demonstrating that the medians and modes of the pathway BMDs in an experiment are very close to the apical BMDs. The study highlighted in this presentation is one example, in which published microarray data for 6 chemicals that also had apical dose-response data were used to compare points of departure derived using 11 approaches for a genomics BMD(L) (Farmahin et al. 2017). Dr. Barton-Maclaren indicated researchers are reaching the point where a simplified pipeline to support integrating toxicogenomics into risk assessment activities is a reality. The development of a pipeline for data analysis allows the establishment of predictive TGx signatures linked to MoA or phenotype from the large gene lists, which will make it possible to identify the transcriptomic POD and, finally, to predict an acceptable level of exposure.
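The pathway-level POD derivation described above can be sketched as follows, assuming per-gene BMDs have already been fitted (e.g., with BMDExpress); the minimum-gene filter and the use of the median as the pathway summary are simplifying assumptions for illustration, and the gene/pathway names are hypothetical.

```python
# Sketch: per-gene benchmark doses (BMDs) are grouped into pathways,
# each pathway is summarized (here by the median of its member genes),
# and the POD is taken as the lowest pathway BMD.

from statistics import median

def pathway_pod(gene_bmds, pathways, min_genes=3):
    """gene_bmds: {gene: BMD}; pathways: {pathway: [genes]}.
    Returns (POD, {pathway: pathway_BMD})."""
    pathway_bmd = {}
    for name, genes in pathways.items():
        bmds = [gene_bmds[g] for g in genes if g in gene_bmds]
        if len(bmds) >= min_genes:  # require a minimum of responsive genes
            pathway_bmd[name] = median(bmds)
    if not pathway_bmd:
        return None, {}
    return min(pathway_bmd.values()), pathway_bmd
```

The interesting empirical finding reported above is that such pathway-level summaries track apical BMDs closely, which is what makes this a candidate surrogate POD.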
Dr. Barton-Maclaren recognized, as with any change or transition, that there are challenges to be worked through to increase information sharing and gain global acceptance for fit-for-purpose applications. Dr. Barton-Maclaren described several challenges: Challenge 1: The acceptance of novel approaches for regulatory application (validation). A consideration in the Canadian context is that there are currently no legislative or policy restrictions that explicitly preclude accepting alternative methods in lieu of traditional studies. Validation is an implicit barrier to acceptance, and "fit for purpose" approaches are based on legislative mandates. Currently, NAM data are used in a weight-of-evidence (WoE) approach to increase acceptability, such as through IATA.
Challenge 2: Lack of harmonized guidelines, strategies or frameworks. Considerations: There is a need for standardized methods and interpretation to increase transferability and transparency (e.g., as illustrated in the technical guide for applications of gene expression profiling in human health risk assessment of environmental chemicals by Bourdon-Lacombe et al. (2015)). This is further supported by the need for consistent and improved reporting through tools and templates, as well as analytical frameworks to guide application.
Challenge 3: Lack of high quality data and data accessibility. Considerations: In this case, there are limited chemical specific data and new approach(es) are needed to handle the large testing universe (reduced cost, higher throughput); multi-stakeholder engagement is required to enhance data sharing through agreements and further support the development of infrastructure and data science expertise and capabilities.
Challenge 4: Uncertainty. Considerations: It is necessary to capture and communicate new and different uncertainties, to continue to increase awareness and transparency, and to maintain multi-stakeholder engagement.
Challenge 5: Lack of expertise. Considerations: There is a need to strengthen the skill sets of risk assessors in specialized areas such as in data science and alternative methods, and enhance the provision of better data integration tools. It was recognized that success also depends on continued engagement across stakeholder groups.
Moving forward toward greater integration of emerging data and novel methodologies for chemicals risk assessment in Canada, there will be continuous efforts on capacity building. This will be accomplished through increased data accessibility and sharing, the maintenance and establishment of key partnerships, technical workshops and training sessions with international experts, and an ongoing focus on developing data analysis tools to address regulatory questions. It is also important to demonstrate proof of concept through various case studies and to work collaboratively on the interpretation and application of new data for use in regulatory applications. This is currently being done at an international level under the OECD and as the focus of the Accelerating the Pace for Chemical Risk Assessment initiative co-led by the US EPA, the European Chemicals Agency (ECHA) and Health Canada (Kavlock et al. 2018). Communication and consultation have been an important pillar in the past and will continue to be critical moving forward. To ensure a strong science foundation and obtain expert input related to key scientific considerations, consultations are conducted with the CMP Science Committee and the Health Canada Science Advisory Board. Further, there is leadership and participation on international committees and projects to promote consistency and alignment of approaches, as well as relevant and ongoing external expert review of reports and publications. It was acknowledged that there is strength in cooperation and a need for continued multi-stakeholder engagement to encourage openness to data sharing and collaboration with other governments and sectors to collectively increase expertise. This is critical as we continue to funnel the large cloud of information into well-defined, pragmatic and transparent approaches to guide the interpretation and translation of emerging data into relevant information for regulatory applications.
The Chemical and Food Ingredient Safety Program works actively with international regulatory and research agencies around the world to address key current gaps in chemical safety assessment and regulation. Dr. Lee highlighted ongoing collaborations with the US EPA, Health Canada, ECHA, and EFSA, including an EPA-led international case study to examine the utility of in vitro bioactivity data as a conservative estimate of the point of departure (POD) for chemical risk assessments, as well as collaborative projects with the EPA focused on the development of in vitro assays for organ-specific and developmental toxicity.
Dr. Loo expanded on the projects which leverage A*STAR's High-throughput In vitro Phenotypic Profiling (HIPPTox) platform. HIPPTox uses high-throughput cellular imaging and machine learning to automatically identify the most predictive phenotypic markers for bioactivity/toxicity. This allows for the estimation of POD based on in vitro bioactivity, as well as predictive assays for cell-type-specific toxicity to be built. Compared to existing approaches, HIPPTox has greater efficiency and broader biological coverage, making it feasible to assess the safety of large numbers of chemicals, including data-poor and new compounds.
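One common way to turn such concentration-response phenotypic data into an in vitro POD is to take the lowest tested concentration whose readout exceeds the vehicle-control mean by three standard deviations; the sketch below uses that convention as an assumption and is not a description of the actual HIPPTox implementation.

```python
# Sketch: estimate an in vitro POD from a phenotypic concentration-response.
# Decision rule (control mean + 3 SD) is a common convention, assumed here.

from statistics import mean, stdev

def in_vitro_pod(control_values, response_by_conc):
    """control_values: replicate readouts for the vehicle control.
    response_by_conc: {concentration: mean phenotype readout}.
    Returns the lowest active concentration, or None if inactive."""
    cutoff = mean(control_values) + 3 * stdev(control_values)
    active = sorted(c for c, r in response_by_conc.items() if r > cutoff)
    return active[0] if active else None
```

Such in vitro PODs are what the international case study mentioned above compares against PODs from traditional animal studies.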
Dr. Loo explained that in the international case study, A*STAR is working with partners to compare in vitro POD values derived from HIPPTox and ToxCast to POD values from traditional animal toxicity studies. Several highly relevant chemical classes are being studied, including parabens (preservatives), phthalates (plasticizers), and benzophenones (sunscreens). A highly predictive and low-cost in vitro assay for nephrotoxicity has also been developed based on HIPPTox, and A*STAR is working with the EPA to incorporate data generated from this nephrotoxicity assay into the ToxCast database.
Dr. Loo also highlighted several new initiatives under the Program, including the Toxicity Mode-of-Action Discovery (ToxMAD) platform which draws on several unique technologies at A*STAR to elucidate the molecular initiating events and key cellular events in toxicity pathways. Dr. Lee then provided examples of how A*STAR has successfully partnered with industry to accelerate innovation, such as the development of a 3D biophysical model to predict the allergenic potential of proteins from their sequence, efficient methods to image and characterize the effects of particulate compounds on human lung cells, and a novel machine learning approach to predict hormone receptor binding affinity.
To conclude, Dr. Lee shared Singapore's vision of establishing a Singapore Centre for Alternatives to Animal Methods. He reiterated that Singapore hopes to play a prominent role in Asia and globally in advancing safety science, and welcomed partners to join A*STAR in the endeavor.
Prof. Qasim Chaudhry, Ph.D., University of Chester, United Kingdom (UK) is also a member of the Scientific Committee on Consumer Safety (SCCS). The SCCS is an independent scientific committee of the European Commission charged with assessing safety of non-food consumer products (cosmetics and personal-care products, textiles, toys). Detailed risk assessment is based on data on physicochemical, toxicological, and exposure aspects in the form of dossiers submitted by a company or an industry consortium. Since 11 March 2013, a ban has been placed in Europe on animal testing of any cosmetic ingredient or a finished product, as well as marketing of any cosmetic ingredient/product that has been tested on animals under the EU Cosmetic Regulation (EC) No 1223/2009. This means that safety data on new cosmetic ingredients needs to be derived from alternative (non-animal) methods. In this context, Prof. Chaudhry focused on the use of in silico (computational) methods for chemical hazard assessment.
The in silico models and read-across tools stand out amongst the available alternative methods as a particularly useful and inexpensive non-testing means for rapid screening of chemicals for toxicity. These include predictive computational models based on Structure Activity Relationship (SAR), QSAR, read-across from data on already-tested chemical analogues, and integrated "expert" systems that can derive toxicity estimates from multiple in silico models and approaches. This field has seen many developments in the past few decades in terms of access to large chemical and toxicological databases, increasing computational power, versatile statistical algorithms for structure-activity modelling and powerful data-mining tools. This, coupled with moves toward the application of stringent standards for quality and reliability, has made in silico models and tools increasingly relevant for use in regulatory risk assessment of a wide range of chemical substances. The reliability of the results derived from in silico models and tools can be further enhanced when they are used as part of WoE drawn from more than one model/expert system, and combined with read-across from other structurally similar analogues. This requires integration of the results from different in silico models/expert systems and read-across, either mathematically (e.g. mean or median values), after weighting each model based on quality and reliability, through expert judgment, through the development of hybrid models that utilize multiple individual models, or through learning models that can generate, test and improve results in iterative stages. The current gaps identified as responsible for the slower uptake of in silico approaches for regulatory risk assessment include the need for an authoritative seal of validation for high-quality systems; a systematic way of assembling the WoE; training of risk evaluators and regulators; and the essential use of expert judgment, because in silico models and tools must not be used as a 'black box' routine.
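The mathematical integration options mentioned above can be sketched minimally as a reliability-weighted combination of several models' predictions; the model names, weights, and the use of a probability scale are all hypothetical assumptions for illustration.

```python
# Sketch: weight-of-evidence integration of in silico predictions, with each
# model weighted by its assessed quality/reliability. Names are hypothetical.

def weighted_woe(predictions, weights):
    """predictions: {model: estimated probability the chemical is toxic};
    weights: {model: reliability weight}. Returns the weighted mean."""
    total = sum(weights[m] for m in predictions)
    return sum(predictions[m] * weights[m] for m in predictions) / total
```

Expert judgment then enters in assigning the weights and in deciding whether the combined score, together with read-across evidence, supports a conclusion.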
William Slikker, Jr., Ph.D., Director, National Center for Toxicological Research (NCTR), US Food and Drug Administration (US FDA) addressed the use of stem cells to assess chemical and drug safety: three-dimensional (3D) culture, microphysiological systems and modeling. Dr. Slikker explained that scientists are pursuing the goal of simulating humans, at least in terms of chemical effects, safety evaluation, and the practice of regulatory science. A system of cells or tissues may be examined under strict criteria to reflect the human condition. These "human on a chip" and "human organ construct" microphysiological systems (known as MPS) are an emerging technology with the potential to correlate in vitro results with in vivo outcomes and to simulate human organ systems.
While the use of human cells may be a great advantage because there is no need to extrapolate across species, there is the requirement that different cell types interact in a three-dimensional relationship to provide prediction of the intact human condition. The immediate environment of the cells in culture, including cell types (single or multiple origins), the need for specialized media to keep them in a differentiated state, and the 3D configuration can all affect outcome measures. The inclusion of multiple cell types and cell layers has proven an important variable in cellular function. The model systems may also need to reflect an appropriate disease state or developmental stage to provide a valid prediction.
Additional features that must be considered include dose-response characterization and the influence of varying the developmental stage and sex of the selected cells. To reproduce the desired outcome, standard procedures need to be developed and widely accepted. A series of mutually agreed-upon positive and negative test agents will be useful to ascertain functional status and reproducibility. Multicenter trials will be necessary to confirm reproducibility, and the ability to recapitulate the underlying biology and toxicology provides many opportunities and challenges for application to alternative, more public-health-relevant and efficient chemical toxicity testing methods. Biologically inspired MPS models built from human induced pluripotent stem cell (iPS)-derived cells and synthetic matrices that recapitulate organ-specific physiologies and native tissue architectures offer exciting new research opportunities to advance chemical and drug assessments (Slikker, 2014).
Dr. Slikker also described the DARPA-FDA-NIH Microphysiological Systems Program, which was initiated in 2011 to support the development of human microsystems, or organ "chips," to screen for safe and effective drugs swiftly and efficiently (before human testing). Collaboration occurred through coordination of independent programs. DARPA contributed engineering platforms and biological proof-of-concept (DARPA-BAA-11-73: Microphysiological Systems); the NIH supported the underlying biology/pathology and mechanistic understanding (RFA-RM-12-001 and RFA-RM-11-022); and FDA provided advice on regulatory requirements, validation and qualification. Dr. Slikker also addressed the use of stem cells directed toward the central nervous system. The neural stem cell is a subclass of precursors that is self-renewing, capable of making additional copies of itself by division, either symmetric (both daughters are stem cells) or asymmetric (one daughter is a stem cell). Another characteristic of the neural stem cell is that it is multipotent, capable of making daughters other than itself (committed progenitors, neurons, astrocytes, oligodendrocytes, non-neural tissues), and it can generate all or part of neural tissue (normal development and functional reconstitution). Conclusions of the study showed that stem cells can be used to understand the effects of anesthetics on developing systems; that knowledge of the stage of development of the stem cell is critical to the interpretation of toxicity data; and that, under well-controlled conditions, stem cell data may be predictive of in vivo derived data (Slikker et al, 2015).
Maurice Whelan, Ph.D., Professor, Head, Chemical Safety and Alternative Methods, European Commission, Joint Research Centre, Italy, EU addressed the issue of incorporating novel methods into integrated approaches to testing and assessment of chemicals. Dr. Whelan stressed that, in the quest to protect human health and the environment, both scientists and regulators face significant challenges in designing and implementing solutions based on novel methods that effectively address existing and emerging needs. Dr. Whelan noted that in one sense the regulatory science community has never been in such a fortunate position, considering the plethora of new technologies and techniques available. However, how to actually exploit these tools to support regulatory decision-making is still very much an open question. He proposed that three specific aspects need more attention, namely: ensuring that novel methods are applicable to the safety questions that industry and regulators face (i.e. fit-for-purpose); identifying, characterizing and discussing sources of uncertainty associated with both conventional and novel methods; and ways to establish the scientific credibility of novel approaches so that decision-makers are more likely to use the information they provide.
Dr. Whelan highlighted the importance of having a practical, widely understood and accepted framework that facilitates the design, presentation and review of new approaches to safety assessment based on novel methods. He described the ongoing work at the OECD on IATA. An IATA integrates and weighs all relevant existing evidence and guides the targeted generation of new data, where required, to inform regulatory decisions. The IATA framework encompasses 'Defined Approaches', which consist of a fixed data interpretation procedure applied to data generated with a defined set of information sources (e.g. in vitro methods or computational models). The result can either be used on its own, or together with other information sources within an IATA. Six principles or properties underpin a Defined Approach, namely: a defined toxicological endpoint; a defined regulatory purpose; a description of the underlying rationale and mechanistic basis; a description of the individual information sources used; a description of how information is processed and combined; and a consideration of uncertainties. Dr. Whelan outlined a suitable reporting template that has been proposed and should be used to make it easier for potential end-users of the methods, or of the data they produce, to fully understand the salient features of the approach so that they can make informed decisions. The reporting template has been used to great effect by OECD member countries to report 12 different Defined Approaches based on in vitro and in silico methods for skin sensitization assessment of chemicals, and this has helped a great deal in their evaluation and uptake.
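As a concrete illustration of a Defined Approach's fixed data interpretation procedure, the sketch below implements a simplified "2 out of 3" majority call over three in vitro skin sensitization assays (DPRA, KeratinoSens and h-CLAT are the information sources used in an OECD defined approach of this kind); the rule as written here is a simplification for illustration, not the regulatory procedure.

```python
# Sketch of a fixed data interpretation procedure: a 2-out-of-3 majority
# call over exactly three predefined in vitro assay results.

def two_out_of_three(calls):
    """calls: {assay: True (positive) / False (negative)} for three assays.
    Returns 'sensitizer' or 'non-sensitizer'."""
    if len(calls) != 3:
        raise ValueError("exactly three assay results are required")
    positives = sum(calls.values())
    return "sensitizer" if positives >= 2 else "non-sensitizer"
```

The fixed rule is what makes the approach "defined": given the same inputs, any assessor obtains the same conclusion, which directly supports the transparency and reproducibility goals discussed above.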
Dr. Whelan concluded by presenting a framework to establish the scientific credibility of predictive toxicology approaches to facilitate and promote their use to support regulatory safety decisions. Here, credibility is understood as reflecting the willingness of end-users of novel methods to trust the data they deliver to inform their decisions. An essential tool within the framework is a 'credibility matrix' which provides a means of determining the extent to which a predictive approach is supported by relevant toxicological knowledge and by experimental data to which assumptions and performance can be compared. He proposed a set of seven credibility factors that both developers and decision-makers can use together as a basis to collectively and systematically make an objective assessment of an approach and to judge what needs to be done to show that it is fit-for-purpose. In relation to this, the credibility framework is also intended to inform validation strategies that aim to strengthen the evidence base underpinning the scientific credibility of predictive approaches when required.

Standards and Reproducibility
This session focused on Standards and Reproducibility, describing the uncertainty and current concerns surrounding the lack of reproducibility of scientific data, and how that issue needs exploration, practice, development and community discussion within the context of in silico methods. Questions considered included methods to assess the degree of uncertainty and methods under development to reduce uncertainty.
Weida Tong, PhD, Director, Division of Bioinformatics and Biostatistics, NCTR, US FDA described Reproducible Toxicogenomics for Regulatory Decision-Making. Dr. Tong elucidated important aspects of how toxicogenomics, an important field of science in the application of genomic technologies, can be useful in regulatory decision-making processes. Several characteristics must be understood, such as that toxicity is defined by the magnitude of the dose in addition to the individual characteristics of the chemical or therapeutic. For example, the decision-making process can be established through dose-response, point-of-departure and guilt-by-association methods, but how are "response", "point" and "guilt" defined in the context of genomics? How can the concept of toxicogenomics be applied to the safety assessment process? How are different points measured? Which genes and which point of departure can be used? Another aspect is the mechanism: Molecular Initiating Event (MIE), Mode of Action (MoA), Adverse Outcome Pathways (AOPs), pathways, functions, and networks. What do we use consistently for communication? Moreover, what about gene activities: individual genes, differentially expressed genes (DEGs), signatures, gene sets; are they reproducible? Dr. Tong presented results showing better reliability in pathway-based analysis compared to tracking the changes in individual genes. Adapting from Tolstoy, he quoted the insight that "Happy families are all alike; every unhappy family is unhappy in its own way" and noted that this insight also applies to gene responses. He also asked about the relation between in vitro and in vivo toxicogenomics studies: is in vitro sufficient? (Considerable modelling issues are involved in extrapolation.) Finally, regarding enabling technologies such as microarray, RNA-seq and qPCR: how should 'evolving technology' be addressed? Is it possible to obtain a list of reproducible DEGs? First, the list depends on both the cut-off method and the threshold used to select the DEGs, without considering the magnitude and direction of the expression changes.
However, cut-off approaches alone are not suitable to assess reproducibility.
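To illustrate the threshold dependence, the sketch below (with invented gene names and values, not data from the presentation) computes DEG lists from two hypothetical replicate runs under two threshold choices and scores their overlap; the apparent reproducibility changes with the cut-offs alone:

```python
def deg_list(results, fc_cutoff=2.0, p_cutoff=0.05):
    """Select DEGs: genes passing both a fold-change and a p-value threshold.
    `results` maps gene -> (absolute fold change, p-value)."""
    return {g for g, (fc, p) in results.items()
            if fc >= fc_cutoff and p <= p_cutoff}

def overlap(set_a, set_b):
    """Jaccard similarity between two DEG lists: a crude reproducibility score."""
    union = set_a | set_b
    return len(set_a & set_b) / len(union) if union else 0.0

# Hypothetical replicate runs of the same experiment
run1 = {"TP53": (2.5, 0.01), "EGFR": (2.1, 0.04), "MYC": (1.6, 0.02)}
run2 = {"TP53": (2.4, 0.02), "EGFR": (1.9, 0.06), "MYC": (1.7, 0.01)}

strict = overlap(deg_list(run1), deg_list(run2))                  # → 0.5
relaxed = overlap(deg_list(run1, 1.5, 0.1), deg_list(run2, 1.5, 0.1))  # → 1.0
```

Under the strict cut-offs only one gene survives in both runs, while the relaxed cut-offs make the two runs look perfectly concordant; this is why cut-off approaches alone cannot settle the reproducibility question.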
Dr. Tong detailed how the FDA-led, community-wide consortium effort to assess the technical performance and application of genomics technologies (microarrays, GWAS and next-generation sequencing) in clinical and safety evaluation concluded that microarrays are not fully reliable. The Microarray Quality Control (MAQC) project has gone through four phases, and during each phase several issues addressing uncertainty were examined.

Dr. Leming Shi, Professor and Director, Center for Pharmacogenomics at Fudan University in Shanghai described Standards in Precision Medicine – China's Perspective. Dr. Shi established and directs the Center for Pharmacogenomics. His work focuses on pharmacogenomics, bioinformatics, and cheminformatics, and aims to realize precision medicine by developing biomarkers for early cancer diagnosis, prognosis, and personalized therapy.
One important aspect of precision medicine intends to deliver the right medicine to the right patient at the right dose at the right time, thus maximizing drug efficacy and minimizing adverse effects. The implementation of precision medicine thus depends particularly on the availability of predictive biomarkers. However, there currently is a lack of reliable predictive biomarkers for guiding treatment selection based on a patient's omic profiles alone, and a range of factors are involved. Many questions remain to attain precision medicine.
China's Precision Medicine Initiative was founded in 2016, with core project funding from the government to identify promising new and existing technologies and to use samples and data to identify precision biomarkers. In total, 98 projects have been funded to date at top institutions, including Fudan University. Ongoing research covers different types of cancer, metabolic diseases such as type 2 diabetes, rare diseases, etc. Factors such as biospecimens, omics technologies, big data analytics, and phenotypes must be highlighted as key elements of precision medicine, which is primarily aimed at precision diagnosis, prognosis, therapy, monitoring, and prevention. Dr. Shi emphasized the importance of standardization, highlighting quality control as an enabler of precision medicine.
Dr. Barry Hardy, Managing Director at Douglas Connect, addressed the topic of Towards Reproducible in silico Practice via OpenTox. Dr. Hardy described OpenTox as a leading global community platform with the potential to benefit emerging regulatory frameworks. He focused on demonstrating how new technologies are essential for regulatory science, specifically by highlighting reproducible in silico practice via OpenTox, and discussed themes such as reproducibility, predictive modelling and related topics. The process is not only about building predictive models but also about observing how predictive uses are constantly changing. In addition to building an application from a set of principles, another concern was the implementation of best practices based on quality, reliability, robustness, interoperability, reproducibility, harmonization, completeness, openness and confidence.
He provided examples with regard to building validated, reproducible QSARs; integrated testing strategies (ITS) using Bayesian networks to support weight-of-evidence (WoE) assessments and provide confidence in predictions; and best practices in data management and processing, including providing trust in workflows using blockchain technology. Another current initiative is the in silico toxicology (IST) protocol consortium, organized by Glenn Myatt, a founder of Leadscope. This international consortium includes regulators, government agencies, industry, academics, model developers, and consultants across many different sectors, initially with the intention of creating the overall strategy for in silico protocol development. Working subgroups will develop individual in silico toxicology protocols for major toxicological endpoints, including genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity, etc.
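The Bayesian combination of evidence behind such integrated testing strategies can be illustrated with a much simpler stand-in: a naive-Bayes update of a prior belief as successive assay results arrive. This is a sketch of the general idea only, not the OpenTox or IST methodology; the sensitivities, specificities and prior are invented:

```python
def update_belief(prior, sensitivity, specificity, positive):
    """One Bayesian update of P(toxic) given a single test result,
    assuming conditional independence between tests (naive Bayes)."""
    if positive:
        likelihood_toxic = sensitivity        # P(+ | toxic)
        likelihood_safe = 1.0 - specificity   # P(+ | safe)
    else:
        likelihood_toxic = 1.0 - sensitivity  # P(- | toxic)
        likelihood_safe = specificity         # P(- | safe)
    numer = prior * likelihood_toxic
    denom = numer + (1.0 - prior) * likelihood_safe
    return numer / denom

p = 0.2                                          # prior belief in toxicity
p = update_belief(p, 0.9, 0.8, positive=True)    # first assay positive
p = update_belief(p, 0.7, 0.9, positive=False)   # second assay negative
# each line of evidence shifts the belief up or down
```

A full Bayesian network additionally models dependencies between lines of evidence; the independence assumption here is exactly what a WoE framework must scrutinize.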
Anil Patri, PhD, Director, NCTR-Office of Regulatory Affairs (ORA) Nanotechnology Core Facility, NCTR, US FDA addressed the issue of Reproducibility Considerations for Nanotechnology Products for Regulatory Review. Dr. Patri described several emerging technologies, standards and issues in reproducibility in nanotechnology and how the global increase of nano-applications in drugs, devices and consumer products is paving the way for new advances in science, technology and medicine. Dr. Patri provided an overview that included points to consider for nanotechnology, naming some of its applications in medical products, providing a landscape of products submitted to FDA, and discussing considerations on quality and reproducibility and highlighting some of the needed standards. Nanotechnology applications have been a recurring theme of many conferences, scientific meetings and studies on novel technologies, but few have addressed the need for new standards.
Different standards that exist in nanotechnology were analyzed across several agencies and international stakeholders. Examining these existing standards in a pre-meeting of the Global Summit on Regulatory Science (GSRS15) in 2015, participants from various regulatory and standards agencies prepared a priority list of standards that are needed and should be developed. At the follow-up GSRS16 summit in 2016, joint efforts by European, North and South American, and Asian regulators identified the most urgent needs in nanotechnology standards relevant to medical products, food, personal care and consumer products. Because standards development is a time-consuming process, collaboration among Standards Developing Organizations (SDOs) to develop highly relevant standards that industry and regulatory agencies could use, based on the products being developed, would make the most diligent use of limited resources. Through this process, Dr. Patri indicated, significant progress can be made so that the regulatory submission process becomes much smoother, with fewer iterations for review, bringing much-needed products to market sooner.
Anna Zhao-Wong, MD, PhD, Deputy Director, MedDRA Maintenance and Support Services Organization (MSSO) addressed the issue of Medical Dictionary for Regulatory Activities (MedDRA). MedDRA is a rich and highly specific standard medical terminology developed by the International Council for Harmonization (ICH) to facilitate sharing of regulatory information internationally for medical products used by humans.
Such regulatory information includes data used in the pharmacovigilance process, e.g., data from adverse event reporting, monitoring, and evaluation. MedDRA is used for registration, documentation and safety monitoring of medical products before and after a product has been authorized for sale. MedDRA's scope includes pharmaceuticals, vaccines and drug-device combination products. MedDRA continues to evolve and develop to meet changing regulatory needs and the emergence of new technologies. Dr. Zhao-Wong is responsible for leading several MedDRA development projects, such as the expansion of medication error, pharmacogenetics/pharmacogenomics, and medical device adverse event terms in MedDRA, and the mapping of CTCAE and International Classification of Diseases and Related Health Problems (ICD-9-CM) terms to MedDRA. Although access to MedDRA and user support is by subscription, there are free options for universities, regulators, non-profits, academics, and healthcare providers. Approximately 5,000 organizations use MedDRA in 106 countries. Support and free training are offered to all subscribers. To enhance the discussions and focus attention on some of the most important issues, a series of questions was posed, which included: 1) Which emerging technologies have been, and will be, established as novel tools in regulatory decision-making for drug and food safety? 2) Lessons learned – What is our experience with them and how much do we understand about their limitations? 3) Clarification – Are the emerging technologies transformative, i.e., do they replace previous technologies, or do they instead supplement existing technologies, and are the novel technologies more "efficient" in terms of cost, speed, accuracy or some other criteria? How are you currently accessing knowledge regarding emerging technologies of regulatory relevance? 4) Opportunities – How can the experiences with these technologies be shared and exchanged between regulatory agencies around the globe? 5) Challenges – What are the primary challenges and how can we overcome them? 6) Future perspective – How best to develop emerging technologies to meet needs in your part of the global regulatory landscape? Are you currently employing techniques such as horizon scanning/foresighting or other approaches to systematically survey the field?

Reflections and Discussion
Dr. Primal Silva, Acting Vice President, Canadian Food Inspection Agency (CFIA), Canada and Dr. Anil Patri, Director, NCTR ORA Nanotechnology Core Facility, NCTR, US FDA opened the discussion session, introduced the speakers and noted the challenges they faced given the many exciting topics on emerging technologies discussed over the past few days. Panelists considered it a privilege to share their perspectives and the developments in their respective fields. Dr. Patri noted that many technologies have emerged on the regulatory front. He mentioned prospects for nanotechnology and for emerging 3D printing, in which the US FDA has already approved one drug in the form of a 3D-printed tablet.
In the foods area, there are many new technologies and concerns, including the issues of food adulteration and counterfeit drugs. Drs. Silva and Patri also explained that the main objective of the panel was to help address concerns raised during the previous presentations and to facilitate the scientific debate for the development of regulatory science. The first issue raised concerned the roles of the food and pharmaceutical industries in the current regulatory setting, and their possible contributions to science. The participants argued that the health sector stands out in the scope of technological innovation, with the pharmaceutical industry, economically, as its main segment. This is due in large part to the incorporation of new technologies to produce drugs that meet the world's diverse needs.
The issue of in silico studies and the future of this approach was raised during the discussion. Participants indicated that in silico studies have been highly valued because, in most cases, they help optimize processes, reducing time and costs. Another positive factor is the decrease or elimination of the use of animals in tests for the evaluation of drugs, cosmetics and food, which becomes possible with the development of new protocols. In silico models will be widely used to predict the hazard and risk of a chemical according to its molecular structure. In this context, the theoretical scientists mentioned QSAR models, which can be used to predict the physicochemical, biological and environmental properties of compounds from knowledge of their molecular structure. In terms of innovation and emerging technologies, it is useful to understand the perspective of the regulators. As scientists move forward in capturing innovation, the global community should become involved, as a global commitment is necessary to implement the best of the new technologies.
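A QSAR model in its simplest form is a regression from molecular descriptors to an endpoint. The sketch below fits a one-descriptor linear model by ordinary least squares; the descriptor choice, endpoint and training values are invented for illustration and bear no relation to any model discussed at the summit:

```python
def fit_qsar(descriptors, activities):
    """Fit y = a*x + b by ordinary least squares: the simplest
    one-descriptor QSAR relating structure (x) to activity (y)."""
    n = len(descriptors)
    mx = sum(descriptors) / n
    my = sum(activities) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(descriptors, activities))
    var = sum((x - mx) ** 2 for x in descriptors)
    a = cov / var
    b = my - a * mx
    return a, b

# Hypothetical training set: molecular weight as the sole descriptor
mw = [100.0, 150.0, 200.0, 250.0]
endpoint = [1.0, 1.5, 2.0, 2.5]          # made-up activity values
a, b = fit_qsar(mw, endpoint)
predicted = a * 175.0 + b                # predict for an unseen compound
```

Real QSAR models use many descriptors and nonlinear learners, and their regulatory acceptability hinges on a defined applicability domain and validation, which is precisely where the reproducibility concerns of this session arise.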
Dr. Slikker, Director, NCTR, US FDA addressed the past, present and future of regulatory science and praised the excellent discussions to improve global collaboration. The GCRSR is a group of agencies that works in conjunction with many partners around the world, offering great insight into the contribution of science and research. Dr. Slikker outlined the definition of regulatory science, concept development, meeting summaries, and the progress and establishment of plans concerning the Coalition. Dr. Slikker explained that regulatory science is the science of developing new tools, standards and approaches to assess the safety, efficacy, quality and performance of regulated products. Dr. Slikker quoted Margaret A. Hamburg's article ("Advancing Regulatory Science", Hamburg, 2011): "Today, we are neither effectively translating scientific discoveries into therapies nor fully applying knowledge to ensure the safety of food and medical products. We must bring 21st century approaches to 21st century products and problems. We need better predictive models to identify concerns earlier in the product development process to reduce time and costs. But this will require collaborations and partnerships with academia, industry, and other government agencies." The annual Global Summit on Regulatory Science (GSRS) provides an opportunity for international scientists to objectively assess the regulatory implications of emerging technologies such as nanotechnology, omics, next-generation sequencing, bio-imaging, and the like; and a platform where regulators and bench scientists from various countries can exchange views on how to develop, apply, and implement innovative methodologies into regulatory assessments in their respective countries, as well as harmonize strategies via global collaboration.
The third GSRS (GSRS13) established the GCRSR. The first meeting took place on September 10, 2013. The US FDA Commissioner sent invitation letters to regulatory colleagues in seventeen countries, and nineteen members from nine countries joined.
Concerning the GCRSR, Dr. Slikker identified representatives from Singapore. Dr. Sciacchitano summarized that only through effective global engagement with international partners can FDA weave a safety net that benefits public health in the US and around the world. Working in close collaboration with FDA Centers and Offices, OIP has staff in strategic locations around the globe, including China, India, Belgium, the United Kingdom, Mexico, Costa Rica and Chile. OIP works with governments, industry and academia in its foreign posts, as well as with multilateral organizations such as the World Bank and World Health Organization, to help ensure that food and medical products exported to the US meet the required standards. Presently, there are 300,000 facilities in more than 150 countries that export FDA-regulated products to the US. Such a range necessitates that FDA work well beyond its borders to ensure that products entering US markets are safe and effective for patients and consumers.

Anil Patri, PhD, Director, NCTR ORA Nanotechnology Core Facility, NCTR, US FDA provided an update from the GCRSR nanotechnology working group. Dr. Patri indicated that nanotechnology is a focal point of research, development and innovation activities in several countries. The major motivation for the development of nanometric objects lies in the possibility of producing materials with different and unusual physical and chemical properties. Nanotechnology must take advantage of these new properties to develop products with different types of technological applications.
Dr. Patri concluded that, with the rapid global advances in nanotechnology research and the proliferation of new medical and food products containing nanomaterials, it is challenging to maintain the pace of science needed to enact appropriate regulations. Emerging products contain nanomaterials of different attributes and properties, for example type, composition and functionality, and therefore require consideration of new or modified methods for nanomaterial measurements. Although regulatory agencies already participate in the review and approval of such new products, significant challenges remain in using advances in the science of nanomaterial evaluation to develop practical standards that ensure confidence in results.

Weida Tong, PhD, Director, Division of Bioinformatics and Biostatistics, NCTR, US FDA provided an update from the GCRSR bioinformatics technical working group. Dr. Tong explained that his group was asked, in early 2014, to develop the GCRSR bioinformatics area as a working group, to establish common bioinformatics methodologies among the participating countries/agencies for dealing with complex data sets derived from emerging technologies for regulatory decision-making. The scope is the investigation, evaluation and development of bioinformatics approaches for analyzing and managing complex data sets to efficiently and effectively support regulatory science applicable to food and medical product safety. The member countries/agencies are Australia (FSANZ and the Commonwealth Scientific and Industrial Research Organization), Canada (CFIA and PHAC), the EU (EFSA and JRC) and the US (FDA). The GCRSR Bioinformatics Working Group currently has two main activities: 1) engaging in international collaboration to establish international partnerships, particularly scientific collaboration and information exchange through the development of strategies for cross-training and exchanging regulatory scientists, and engaging research communities through workshops and conferences to understand standards for reporting and analysis of data; and 2) fostering regulatory science research to assess research gaps in bioinformatics relevant to regulatory decision-making, particularly criteria and measures for data integrity, traceability, reliability and reproducibility; to emphasize fit-for-purpose metrics for applying bioinformatics to complex data sets; and to implement horizon-scanning approaches for emerging bioinformatics methodologies and assess their potential impact on regulatory decision-making.
Primal Silva, PhD, Acting Vice President, Canadian Food Inspection Agency (CFIA) provided information on the GCRSR cross-training working group (CTWG) for regulatory science. Dr. Silva described the GCRSR cross-training workgroup as the newest of the global coalition working groups. He explained that the previous panel discussions underlined the need for training, especially for innovative research approaches, and the necessity to analyze and draw on information to advance regulatory outcomes. Regulatory agencies are in continuous need of training and competency building, and it is important to build capacity with a view toward integrated systems. Dr. Silva explained that the GCRSR's mission is to facilitate, communicate and share current advances in science and technology with the potential to affect the global regulatory process. The formation of the GCRSR's cross-training workgroup reflects this goal.
Dr. Silva outlined the progress to date and mentioned that the inaugural meeting was held on June 9, 2017. Current participants include representatives from Canada (CFIA), the US (FDA), the EU (European Commission) and Argentina (ANMAT). The group's objectives were reaffirmed based on discussions in Brasilia at GSRS17, and members are actively encouraging other countries to join the working group. Dr. Silva emphasized the group's outreach to elicit others' needs, inquire about existing organization-supported training and, based on these, compile a catalog of existing training (most organizations have in-house training). The workgroup is planning to perform a similar analysis of current training opportunities. Potential training projects include regulatory science, competency and lab capacity. Dr. Silva stated that the group's productivity depends on how members view their individual goals in relation to the goals of the group. Future steps will be discussion of the scope (narrow vs. broad topics), identification of new participants for the working group, and identification of and contribution to training opportunities.
Weida Tong, PhD, Director, Division of Bioinformatics and Biostatistics, NCTR, US FDA provided summary remarks. He thanked the local host, Dr. Danitza Passamai Rojas Buvinich, and her team from ANVISA for the exceptional conference organization. Dr. Tong emphasized that this Summit provided participants with an important opportunity to become more acquainted with the state of regulatory science in various regions of the world, such as the Brazilian National Health Surveillance Agency, the European Food Safety Authority, regulatory science in Japan, regulatory science in Sub-Saharan Africa, and Singapore. Important studies are under way in various fields, such as omics technologies (genomics, proteomics and metabolomics), next-generation sequencing (NGS, GWAS, RNA-seq), microphysiological systems (organ-on-chip), nanotechnology, high-throughput and high-content screening methodologies, in silico methodologies and bioinformatics.
To better facilitate translation across agencies around the globe, a number of issues could be considered, including facilitation of data sharing among regulatory bodies; communication of lessons learned across agencies in implementing and applying emerging technologies; promotion of common standards and best practices; definition of fit-for-purpose uses in different regulatory bodies; and enhancement of cross-training. Dr. Tong emphasized the need to facilitate performance metrics (quality, transparency, accessibility, reproducibility, traceability, security, accountability and integrity) as the GSRS activities progress.

Disclaimer
The views expressed in this article are the personal views of the authors and may not be understood or quoted as being made on behalf of or reflecting the position of the agencies or organizations with which the authors are affiliated.