The green chemistry movement is scrutinized for marks of tangible success in this short perspective. Beginning with the easily identified harm of the Union Carbide disaster in Bhopal, India, and the concerns raised by the Love Canal site in Niagara Falls, NY, the public could begin to discern the environmental problems of the worldwide chemical industry. The waste associated with chemical production became a major issue because the established mode of disposal was landfilling, and cleanup of poorly performing landfills became a major focus owing to the importance of protecting human health. In the early 1990s, the chemical industry moved towards less contaminating technology in response to increasingly stringent environmental regulation, and this new direction came to be formulated as sustainable industrial chemistry and green chemistry. Green chemistry practice calls for enhancing the safety of industrial chemistry, becoming cleaner and more energy efficient, and understanding where a chemical travels in the environment through life cycle assessment. The directions for green chemistry were codified into 12 principles, which have since been extended with engineering considerations. The environmental factor (E-factor), the mass of waste generated per kilogram of product, permits direct comparison between competing synthetic technologies. The drug-maker Pfizer has shown the utility of green chemistry approaches in the manufacture of sildenafil citrate (Viagra), the anticonvulsant pregabalin (Lyrica), the antidepressant sertraline, and the non-steroidal anti-inflammatory celecoxib. The greener processes for the last three alone accounted for the elimination of more than 0.5 million metric tons of chemical waste. A recent in silico research effort has developed a tool that permits the assessment of toxicity for new chemicals without established animal testing, at considerable cost savings. Despite these examples of success, green chemistry is still dismissed by some skeptics as a trendy buzz phrase.
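
A worked illustration of the E-factor metric may be helpful (the two routes and their waste figures below are invented for the example, not drawn from the source): because the E-factor is simply total waste mass divided by product mass, competing routes to the same product can be compared directly.

$$ E = \frac{\text{total waste (kg)}}{\text{product (kg)}}, \qquad E_{\text{route A}} = \frac{100\ \text{kg}}{1\ \text{kg}} = 100, \qquad E_{\text{route B}} = \frac{25\ \text{kg}}{1\ \text{kg}} = 25 $$

On this metric, hypothetical route B generates one quarter of the waste of route A for each kilogram of product.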

(Nature 2011, 469, 18–20)

Victory for sustainable synthetic methodology

The synthesis of sitagliptin, an oral antihyperglycemic for the treatment of type 2 diabetes with a global market of more than $1 billion US per annum, has been the goal of an extensive research effort. Three separate processes of increasing environmental merit have been used to synthesize sitagliptin. The first generation process comprised eight steps with an overall yield of 52% but was generally recognized as an inefficient route to manufacture. The second generation process comprised three steps with an overall yield of 65%; total waste was reduced from 350 kg to 50 kg per kg of final product, and the aqueous waste stream was eliminated entirely. The second generation process received the 2006 US Presidential Green Chemistry Challenge Award for Greener Synthetic Pathways and the 2005 IChemE AstraZeneca Award for Excellence in Green Chemistry and Engineering. Despite the obvious success of the second generation process, there were significant areas for improvement owing to the inherent drawbacks of a hydrogenation step. Several private entities entered into a cooperative research effort that resulted in a third generation process built around a highly evolved biocatalyst carrying 27 mutations in the active site and elsewhere in the enzyme. The third generation process showed a yield enhanced by 13% over the second generation process; productivity was increased by 53% and total waste was reduced by a further 19% compared with the second generation process. The third generation process was awarded the 2010 US Presidential Green Chemistry Challenge Award for Greener Reaction Conditions.
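
Restated as E-factors, a back-of-the-envelope sketch using only the figures quoted above (reading 350 kg as the waste of the earlier route and applying the 19% further reduction to the second-generation value) looks like this:

# E-factor (kg of waste per kg of sitagliptin) comparison built only from the
# figures quoted in this summary.
waste_per_kg_product = {
    "first generation": 350.0,
    "second generation": 50.0,
}
# Third generation: 19% less waste than the second generation process.
waste_per_kg_product["third generation"] = waste_per_kg_product["second generation"] * (1 - 0.19)

for process, e_factor in waste_per_kg_product.items():
    print(f"{process}: E-factor ~ {e_factor:.1f}")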

(Angew Chem Int Ed 2011, 50, 1974–1976)

Safer chemical design

This review assembles information from toxicology, medicinal chemistry, drug design, and chemodynamics into a framework enabling the molecular design of safer chemicals. The properties of new molecules can be assessed by a host of new techniques such as toxicogenomics. Using the new framework, the synthetic chemist or engineer can systematically evaluate the properties of virtual molecules to discern candidates that are attractive for commercialization. At the early stages of discovery, established synthetic considerations can be supplemented by designs attuned to chemical safety and toxicology. Quantitative structure–activity relationships (QSARs) relate chemical structure to biological activity. The approach grew out of drug discovery efforts to evaluate changes in structure against a measure of potency directed toward some therapeutic goal, and the same analysis can be used to evaluate the toxicity of new chemical structures. Such an approach has been used to identify effects using whole-organism models and, more recently, in silico. Knowledge-based systems draw on information from diverse sources to provide an expert evaluation of new chemicals. Together, these considerations enable the chemist to begin the molecular design of new chemicals with a more comprehensive evaluation of each feature deemed necessary for new chemical selection.
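
A minimal sketch of the QSAR idea (entirely synthetic data and invented descriptors, intended only to illustrate the workflow, not any model from the review): fit a model that maps molecular descriptors to a measured toxicity endpoint, then use it to rank untested virtual candidates.

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical descriptor table: each row is a molecule, each column a
# calculated descriptor (e.g., logP, molecular weight, polar surface area).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 3))            # 40 molecules with measured toxicity
true_weights = np.array([1.5, -0.8, 0.3])     # invented structure-toxicity relationship
y_train = X_train @ true_weights + rng.normal(scale=0.1, size=40)

qsar_model = LinearRegression().fit(X_train, y_train)

# Rank untested (virtual) candidates by predicted toxicity, lowest first.
X_candidates = rng.normal(size=(5, 3))
predicted = qsar_model.predict(X_candidates)
for index in np.argsort(predicted):
    print(f"candidate {index}: predicted toxicity {predicted[index]:.2f}")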

Chem Rev 2010, 110, 5845–5882

Biorefinery future

The World Economic Forum published a 40-page white paper entitled The Future of Industrial Biorefineries, which briefly discusses the technology encompassing biofuel production and its extension to bulk and fine chemicals. Feedstocks and conversion technologies are evaluated for their roles in the long-term trends affecting the adoption of biofuels and other industrial chemicals. The market for biobased products is expected to expand aggressively across the globe, and the value chain connected with biorefineries could be worth $300 billion US by 2030.

http://www3.weforum.org/docs/WEF_FutureIndustrialBiorefineries_Report_2010.pdf.

Solar energy conversion

Consideration of the annual incident solar energy on a given land area has been used to estimate the efficiencies of the several secondary energy technologies capable of utilizing that energy. For a representative 1 m² of land area in the United States, the average annual incident solar energy is estimated at 6307 MJ/m² per year (1752 kWh/m² per year). How much of this incident energy can be captured by energy conversion technologies? Demonstrated recovery efficiencies are as much as 70% for heat, 10–42% for electricity, 5–15% for hydrogen, and roughly 0.3–1.2% for energy crops. Biomass conversion technologies have been estimated to give sun-to-fuel yields 1.5–3.0 times lower than augmented processes in which heat, electricity, and hydrogen supplement the biomass conversion. This analysis offers new perspectives to aid the optimization of conversion technologies that harness incident solar energy more effectively.
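
The arithmetic behind these figures is straightforward; the sketch below simply multiplies the quoted incident energy by each demonstrated efficiency range (the energy-crop range is treated as approximate).

# Energy captured per square metre per year for each conversion route,
# using the incident energy and efficiency ranges quoted in this summary.
incident_mj_per_m2_year = 6307.0

efficiency_ranges = {
    "heat recovery": (0.70, 0.70),   # quoted as "as much as 70%"
    "electricity": (0.10, 0.42),
    "hydrogen": (0.05, 0.15),
    "energy crops": (0.003, 0.012),  # approximate range
}

for route, (low, high) in efficiency_ranges.items():
    captured_low = incident_mj_per_m2_year * low
    captured_high = incident_mj_per_m2_year * high
    print(f"{route}: {captured_low:.0f}-{captured_high:.0f} MJ/m2 per year")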

(AICHE J 2010, 56, 2762–2768; Annu Rev Chem Biomol Eng 2010, 1, 343–364; Environ Sci Technol 2010, 44, 5298–5305)

Pesticides and Parkinson’s disease

A case–control study of participants in the Farming and Movement Evaluation (FAME), nested within the Agricultural Health Study, assessed lifetime use of pesticides selected according to their mechanisms of toxicity in humans. Parkinson’s disease was positively associated with two groups of pesticides acting by mechanisms that impair mitochondrial function or increase oxidative stress in humans. Rotenone use was strongly associated with Parkinson’s disease through the mechanism of mitochondrial complex I inhibition, and paraquat use was likewise associated, consistent with its ability to increase oxidative stress and produce the subcellular changes associated with the disease. The test group for this study was drawn from a population of pesticide applicators. The researchers acknowledge limitations, including the unknown effects of other pesticides used by the study population. Clearly, the study points to the possibility of household exposure through applications and to as-yet untested consequences.

Environ Health Perspect 2011, 119, 866–872

Selective synthetic design

Functional groups in organic molecules are the points at which chemists involved in fine and pharmaceutical intermediate synthesis develop their strategy for converting a starting substrate into the desired synthetic target. The reactivity of the functional groups of a reaction substrate towards a given synthetic step can determine the success or failure of a synthetic endeavor, and where multiple products are formed, issues of separation and waste arise. In designing a chemoselective synthetic step, the chemist exploits established knowledge of differential reaction rates at different functionalities; discrimination by functional group reactivity may thus channel the substrate down one of two competing reaction pathways. When the stereochemical outcome of a reaction can be controlled, the term stereoselective is used to describe the process. When bond formation or breaking occurs preferentially at one position over others, the process is regioselective. Chemoselectivity describes preferential reaction at one functional group in the presence of others. This review elucidates new chemistries leading to chemoselective transformations at saturated carbon–heteroatom bonds and at unsaturated carbon atoms, to carbon–carbon bond formation, and to metal-catalyzed C–H activation. The selection of reaction conditions leading to the desired selectivity can significantly affect the outcome of a synthetic reaction in a way supportive of green chemistry and sustainability.

Angew Chem Int Ed 2010, 49, 262–310

Biocatalytic reaction credentials

The expanding use of biocatalytic reactions underscores the need for guidelines that allow reported results to be reproduced and understood. A committee of the European Federation of Biotechnology Section on Applied Biocatalysis offers a starting point for specifying the information required when reporting biocatalytic reactions. Checklists for the complete description of biocatalytic reactions and for reporting results from biocatalytic experiments are presented. An earlier initiative, Standards for Reporting Enzymology Data (STRENDA), provided a basis for these new recommended guidelines. Issues such as selectivity, specificity, productivity, and biocatalyst stability are underscored for their importance, as is methodology enabling reactions in multiphase systems.

Trends Biotechnol 2010, 28, 171–180

Biocatalyst performance improvement

Biocatalyst (enzyme) performance is becoming a major focus for research. These investigations are driven by the remarkable ability of biocatalysts to conduct synthetic transformations efficiently and with high specificity under aqueous, physiological conditions. This research targeted toluene 4-monooxygenase (T4MO) for improvements in the synthesis of hydroxytyrosol, a phytochemical derived from olives and olive oil. Directed evolution and rational design were selected to generate variants of the monooxygenase with higher activity for the transformation of 2-phenylethanol to hydroxytyrosol. An enzyme variant with markedly higher activity than the wild-type enzyme was identified by screening a subset of only 16 variants from a field of ~13,000 possible combinations, with a corresponding saving of time and effort. In this research, directed evolution, rational design, and statistical methods were combined to improve the T4MO enzyme for production of the antioxidant hydroxytyrosol in keeping with green chemistry principles.
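
A schematic sketch of the underlying combinatorial arithmetic may help (the mutation sites and residue options below are invented, and plain random sampling stands in for the statistical design actually used): a handful of positions with a few substitutions each quickly yields thousands of variant combinations, of which only a small, deliberately chosen subset need be screened.

import itertools
import random

# Hypothetical library: six positions in an enzyme, each with a few candidate
# substitutions (keeping the wild-type residue is one option at every site).
candidate_options = {
    "site1": ["wt", "A", "S"],
    "site2": ["wt", "V", "L", "I"],
    "site3": ["wt", "F", "Y"],
    "site4": ["wt", "G", "A", "T"],
    "site5": ["wt", "M", "Q"],
    "site6": ["wt", "E", "D", "N"],
}

all_combinations = list(itertools.product(*candidate_options.values()))
print(f"library size: {len(all_combinations)} variant combinations")

# Screen only a small subset; a real statistical design spreads these choices
# over the space, but random sampling illustrates the scale of the saving.
random.seed(1)
screened_subset = random.sample(all_combinations, 16)
for variant in screened_subset[:3]:
    print(dict(zip(candidate_options, variant)))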

Appl Environ Microbiol 2010, 76, 6397–6403

Computational biocatalyst design

The use of computer-aided design as applied to enzymes can generate active catalysts. Computer-designed enzymes have shown modest levels of catalytic activity, but the majority of designs have no detectable activity. The de novo design of enzyme catalysts is based on models of the active site, and the success of computer-based design pivots on how correctly a given model depicts that active site; the complexity of enzyme structure contributes to the difficulty. Despite these difficulties, the tremendous potential of computer-based enzyme design is expected to extend to a wide range of important applications, and information from all the supporting disciplines is expected to enhance the activities found with de novo enzymes.

(Protein Sci 2010, 19, 1817–1819)

Cheap coal’s end

According to recent analyses of coal reserves, coal may soon cease to be a cheap and abundant pillar of the global energy equation. The Energy Information Administration pins current global recoverable coal reserves at 930 billion short tons, equal to 4,116 billion barrels of oil equivalent. The prevailing idea that coal will remain a cheap energy source for decades to come appears to be unsubstantiated by new investigations; according to one study, the global energy derived from coal could peak in 2011. Global demand for coal is strongly affected by Chinese needs, and the short-term effects may be seen in rising coal prices leading to potential economic shocks. It is estimated that China has 62 years of coal production remaining at the 2009 consumption rate of 3 billion metric tons per year, and more than 90% of China’s coal is mined from operations as deep as 1,000 meters. The term “proven recoverable reserves” can conceal extensive inaccuracies, depending on the desire of industry and politicians to convey a false sense of security. New mining techniques could provide access to currently unreachable coal deposits. The proposed response to these dire prospects is to limit consumption, since global energy supplies are expected to fall short of demand by 2020.
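
A back-of-the-envelope reserve-to-production check, using only the figures quoted above (a sketch for scale, not an analysis from the cited studies):

# Reserve-to-production arithmetic from the figures quoted in this summary.
china_consumption_bt_per_year = 3.0    # billion metric tons per year (2009 rate)
china_years_remaining = 62
implied_china_reserves_bt = china_consumption_bt_per_year * china_years_remaining
print(f"implied Chinese reserves: ~{implied_china_reserves_bt:.0f} billion metric tons")

# Global figure for comparison, converting short tons to metric tons.
short_ton_in_metric_tons = 0.9072
global_reserves_bt = 930 * short_ton_in_metric_tons
print(f"global recoverable reserves: ~{global_reserves_bt:.0f} billion metric tons")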

(Nature 2010, 468, 367–369; Energy 2010, 35, 3109–3122; http://en.wikipedia.org/wiki/Coal)

Carbon dioxide to fuel

The availability of highly concentrated carbon dioxide gas streams from steam power stations fueled with nonrenewable resources could be an asset for the formation of carbon-based fuels and chemicals. The developing carbon capture and sequestration effort could be redirected into a carbon capture and conversion strategy that uses these high-concentration gas streams as starting material for hydrocarbon products via chemistries related to the “dry reforming” process illustrated as follows:

$$ \text{CH}_{4} + \text{CO}_{2} \to 2\,\text{CO} + 2\,\text{H}_{2}, \qquad \Delta H^{\circ} = +247.3\ \text{kJ}\,\text{mol}^{-1} $$

About 20% more energy is required for dry reforming of CH4 than for steam reforming. Low molecular weight hydrocarbons and alcohols are potential product streams. The hydrogen required for the transformations is slated to come from water electrolysis powered by renewable energy at times when renewable generation provides a surplus to the electrical grid. Fischer–Tropsch chemistry can also be targeted as a technology to intercept the carbon dioxide asset from sequestration efforts.
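
For comparison, the standard enthalpy of steam reforming of methane is about +206 kJ/mol (a textbook value, not quoted in the source), which is consistent with the roughly 20% higher energy demand cited for dry reforming:

$$ \text{CH}_{4} + \text{H}_{2}\text{O} \to \text{CO} + 3\,\text{H}_{2}, \qquad \Delta H^{\circ} \approx +206\ \text{kJ}\,\text{mol}^{-1}, \qquad \frac{247.3}{206} \approx 1.2 $$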

Phil Trans Royal Soc A 2010, 368, 3343–3364

Implementation of new materials to assist the control of CO2 emissions from large point sources such as power plants is beginning to take shape. Promising new materials span a diverse range, including physical absorbents, membranes, and gas hydrate formation, and the development strategy builds on the carbon capture and storage schemes now emerging. Carbon emissions to the atmosphere from modern power plants could be reduced by 80–90% through capture with new materials and technology. These advancements depend on improved materials that separate CO2 from other gaseous emissions, and the field of metal–organic frameworks is expected to provide materials enhancing gas separation.

Angew Chem Int Ed 2010, 49, 6058–6082

Water and organic synthesis: can they mix?

The hydrophobic nature of many organic substrates leads to a general antipathy among synthetic chemists toward water as a reaction solvent, owing to poor substrate solubility and the difficulty of efficiently mixing multiphase reaction mixtures. While these characteristics have been used quite successfully, as in the case of interfacial polymerization, water has been avoided as a solvent by most organic chemists. New concepts such as “in-water” and “on-water” chemistry present strategies to be exploited; the “on-water” concept turns the insolubility of reactants in water to advantage. An early notable example of the hydrophobic effect in organic reactions was the water enhancement of Diels–Alder reactions, and “on-water” reactions were found to occur when insoluble reactants, stirred vigorously in aqueous suspension, gave high product yields. In each of these categories, the rate enhancements were traced to hydrogen-bonding effects. The catalogue of reactions susceptible to water effects extends from pericyclic reactions to catalyzed processes, both metal-catalyzed and organocatalyzed. These remarkable findings enhance the role of water as a clean, renewable, and green solvent.

Chem Rev 2010, 110, 6302–6337

Energy efficiency evaluations

Reducing U.S. energy bills, preventing pollution, and addressing climate change can all be advanced simply through the evaluation of energy efficiency. Optimizing the services delivered while minimizing energy input offers a lowest-cost means of energy conservation. The US federal government is one of the largest energy consumers in the United States and has taken a leading role by promoting energy efficiency. This 234-page reference, Energy Efficiency Reference for Environmental Reviewers, is intended for reviewers of documents related to responsibilities for environmental review under the Clean Air Act. Private and public groups may find the reference useful to their interests in environmental legislation, regulations, projects, documentation, or permit applications. The document provides background information on the use of energy evaluations but does not constitute regulation, policy, or guidance.

http://www.epa.gov/compliance/resources/policies/nepa/energy-efficiency-reference-for-environmental-reviewers-pg.pdf.

Indirect land use and energy crops

The land-intensive nature of biofuel crop cultivation has consequences for the greenhouse gas (GHG) emissions balance. As greater landscapes are devoted to biofuel crop production, direct and indirect land use changes become important to GHG emissions. Concerns about the competition of biofuel feedstocks with food, feed, and fiber production, as well as with nature protection and conservation, are central to the development of current policies. This review examines the current state of the art used to quantify GHG emissions related to direct land use change and indirect land use change (iLUC). Different approaches to iLUC calculations suggest that the associated GHG emissions can be quantified, and with better information and global monitoring, regulations and policies should become easier to implement and improve. Land competition and the risks associated with both direct and indirect LUC can be reduced by improvements in biomass production and conversion efficiency.

Biofuels Bioprod Bioref 2010, 4, 692–704

Wind can outproduce offshore oil and gas as an energy source for the US

A report released by the non-governmental organization Oceana, entitled Untapped Wealth: Offshore Wind Can Deliver Cleaner, More Affordable Energy and More Jobs Than Offshore Oil, evaluated the Atlantic coast of the US and found that offshore wind could outperform energy generation from economically recoverable offshore oil and gas resources by 30%. Oil and gas prices were set for the study at $110 per barrel and $11.74 per thousand cubic feet, respectively. The report asserts that some 127 gigawatts of wind power could be installed on the US Atlantic coast with about a third of the spatial footprint required for comparable oil and gas production. The wind resource is estimated to provide more energy for less money than offshore oil and gas production, and these developments would require a significant workforce, a highly desirable side benefit.
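
For a rough sense of scale (a sketch, not a calculation from the report), the installed capacity can be converted to annual energy output; the 127 GW figure is quoted above, while the 0.35–0.40 capacity factor range is an assumption typical of offshore wind rather than a report value.

# Convert installed offshore wind capacity to approximate annual energy output.
installed_gw = 127.0          # capacity quoted in the Oceana report summary
hours_per_year = 8760

# Capacity factors below are assumed values typical of offshore wind.
for capacity_factor in (0.35, 0.40):
    energy_twh = installed_gw * hours_per_year * capacity_factor / 1000.0
    print(f"capacity factor {capacity_factor:.2f}: ~{energy_twh:.0f} TWh per year")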

http://na.oceana.org/sites/default/files/Offshore_Wind_Report.pdf.

Food security and transgenic plants

The Pontifical Academy of Sciences of the Vatican held a study week on the topic of Transgenic Plants for Food Security in the Context of Development, 15–19 May 2009. The meeting agenda shows 37 presentations by 38 participants. The meeting was divided into eight subsections on topics such as Contribution from Transgenic Plants and State of Application of the Technology. The proceedings were published in New Biotechnology and are freely accessible at the journal URL; the entire issue and the meeting program are also available from the Vatican URLs cited below. Among the papers in the proceedings collection is one on the competition between biofuels and food.

(New Biotechnol 2010, 27, 445–718)

http://www.vatican.va/roman_curia/pontifical_academies/acdscien/2010/newbiotechnologynov2010.pdf.

http://www.vatican.va/roman_curia/pontifical_academies/acdscien/2009/booklet_transgenic_34.pdf.

The Deepwater Horizon event

Global society’s unquenchable thirst for petroleum-based fuel was a contributing agent in the oil disaster in the Gulf of Mexico. The 33,000-ton, $560 million Deepwater Horizon platform was centered over the British Petroleum (BP) Macondo well, which was drilled through 5,000 ft of water column and another 13,000 ft below the seafloor. The oil loss from the BP spill has been pegged at 4.9 million barrels, an energy resource loss of almost $0.5 billion at current petroleum prices. The environmental features of this disaster are well portrayed through a compelling pictorial display of the damage. A separate insert to this issue of National Geographic, entitled “Layers of Life,” offers an exceptional graphic depiction of the coastal and marine ecosystems along with a geography of offshore oil in the Gulf of Mexico. The ecosystems proximate to the disaster are rich habitats encompassing one of the most ecologically and economically productive areas of the globe, and the detail given to each ecosystem depiction offers a quick understanding of the environmental insult delivered by the spill to the surrounding area.
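
The dollar figure follows from simple arithmetic; the price of roughly $100 per barrel is an implied assumption consistent with “current petroleum prices” at the time, not a number stated in the source:

$$ 4.9 \times 10^{6}\ \text{barrels} \times \$100\ \text{per barrel} \approx \$4.9 \times 10^{8} \approx \$0.5\ \text{billion} $$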

Natl Geogr 2010, 218(4), 28–77

The National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling has developed a 398-page report entitled Deep Water: The Gulf Oil Disaster and the Future of Offshore Drilling, Report to the President, which provides an extensive analysis of the platform operations and how the disaster came into being. The web site accompanying the report offers extensive analytical tools and analyses, including animations of certain aspects of the drilling activities that are both instructive and entertaining. Anyone wishing to gain a better understanding of the disaster will be generously rewarded by scrutiny of these analytical aids.

http://www.oilspillcommission.gov/final-report.

The Chief Counsel’s team of the Oil Spill Commission has published a separate 371-page report entitled Macondo: The Gulf Oil Disaster, which provides introductory and background material, a timeline of events, and the technical causes of the blowout.

http://www.oilspillcommission.gov/chief-counsels-report.

The Norwegian firm Det Norske Veritas conducted an analysis of the blowout preventer on behalf of the US Department of the Interior and the Coast Guard. Despite the claim that the blowout preventer was designed and tested according to industry standards, the analysis found possible systematic design issues that could lead to ineffective operation even when the device was properly deployed. The report called for further investigation of blowout preventer use within the industry and recommended that the industry conduct a study to ascertain the cause of failure with the current design.

The Forensic Examination of Deepwater Horizon Blowout Preventer, a report for the Interior Department and the Coast Guard, is currently unavailable but is expected to be posted at the URL of the US Department of the Interior, Bureau of Ocean Energy Management, Regulation and Enforcement.

BP has offered its interpretation of the deepwater blowout and rig explosion that released almost 5 million barrels of oil and caused the deaths of 11 people. The 193-page Deepwater Horizon Accident Investigation report presents BP’s internal investigation of the causes of the blowout. Dispersants were effective, according to BP, and in situ burning was found to be effective, but mechanical recovery methods such as booms and skimmers require further development to meet the current needs of oil spill response technology. A separate 46-page report entitled Deepwater Horizon Containment and Response: Harnessing Capabilities and Lessons Learned points out that the company’s plan, while conforming to regulatory requirements, was not written to address every development and contingency. Clearly, no human effort is omniscient, but it is somewhat lame to assert that the response plan was constructed from any significant effort to explore the developments and contingencies that a greater degree of due diligence regarding the Deepwater Horizon platform operation would have revealed.

Deepwater Horizon Containment and Response: Harnessing Capabilities and Lessons Learned.

http://www.boemre.gov/ooc/PDFs/NarrativeFinal.pdf.

Deepwater Horizon Accident Investigation report.

http://www.bp.com/liveassets/bp_internet/globalbp/globalbp_uk_english/gom_response/STAGING/local_assets/downloads_pdfs/Deepwater_Horizon_Accident_Investigation_Report.pdf.