Industry-Nominated Technology Breakthroughs of NSF Industry/University Cooperative Research Centers 2012
National Science Foundation
Directorate for Engineering
Division of Industrial Innovation and Partnerships
Industry/University Cooperative Research Centers Program
4201 Wilson Boulevard, Suite 585, Arlington, Virginia 22230
http://www.nsf.gov/
http://www.nsf.gov/eng/iip/iucrc/

Craig S. Scott, Editor, University of Washington School of Medicine
Marita Stevens Graube, Designer, Pixel Theory Inc.

Front cover credits: Living Earth HD real-time weather image over smaller stock photo of fiber optic rendition. Component of a DC-DC converter: Center for Electromagnetic Compatibility (CEMC). A naturally-regenerated mixed-species forest: Center for Advanced Forestry Systems (CAFS). Bill Arbegast working with students: Center for Friction Stir Processing (CFSP).

Preface

The National Science Foundation (NSF) Industry/University Cooperative Research Center (I/UCRC) Program is housed in the NSF Directorate for Engineering's Division of Industrial Innovation and Partnerships (IIP). A primary IIP strategic goal is to foster the development of sustained partnerships based upon common research interests, trusted networks, and dynamic technical interrelationships. For almost four decades the I/UCRC Program has catalyzed such partnerships among industry sectors, academe, and government. Currently these include over 750 industry partners plus more than 100 member universities, state governments, national laboratories, and federal agencies.

Each I/UCRC conducts pre-competitive research that is of interest to its industry sponsors and to the university(ies) with which they are involved. A unique feature of I/UCRCs is that they do not conduct industry's research. Rather, industry joins and supports centers whose research areas and missions interest it. Faculty present proposals annually. After listening to the proposals, industry representatives offer their industrial perspectives; in this way the center sponsors attempt to improve the quality of the proposed work. Industry then advises the center leadership regarding which projects it would most like to see funded. Final funding decisions are made by the university-based center director(s). The majority of center research is then conducted by graduate students under the supervision and guidance of university faculty.

The NSF provides I/UCRCs with a modest amount of funding as base support. The I/UCRC partnership model, which is at the heart of the program, requires that the majority of center funds be contributed by corporate sponsors. In 2010, the NSF invested approximately $8 million in the program while industry invested over $85 million in I/UCRCs, primarily in the form of center memberships. The majority of these funds were used for direct research support, a multiplier effect of over 8 to 1.

Many I/UCRCs have become recognized for innovative, cutting-edge research. They have amassed an impressive track record of exploring great ideas through leveraged collaboration and team science. This 4th-edition compendium is intended to acquaint readers with how center knowledge and technology are being translated into commercial and industrial advances and applications. It catalogues technological breakthroughs and advances that industry representatives believe are attributable to specific I/UCRCs. In this edition, for the first time, statements of economic impact have been added.
These are designed to help convey how center research is contributing to center sponsors' bottom lines, to the vitality of industry sectors, to the nation's research infrastructure, and to the national economy.

To obtain entries for the compendium, each center director was asked to identify industrial advisory board (IAB) member scientists whom they view as particularly knowledgeable about the nature of their center's research program and its impacts on science and technology. These IAB members were then approached by the editor to determine whether any of the center's research endeavors met the following definition of a technological advance or breakthrough:

TECHNOLOGY BREAKTHROUGH DEFINITION: A technological breakthrough or advance may include significant process improvements, new processes or techniques, and new or improved products or services that resulted either directly from, or were indirectly stimulated by, the center's research program.

If an IAB member viewed any of their center's work as meeting this definition, they were asked whether they wanted to nominate the work for possible inclusion in the compendium. All entries in this compendium were nominated by industry scientists. The editor then worked with them and with the involved university scientists to write the individual entries contained herein. The entries exemplify the countless collaborative efforts of university and industry scientists and the economic impacts of the resulting research.

Berkeley Sensor and Actuator Center (BSAC)
University of California, Berkeley, John Huggins, Executive Director, 510.643.5663, jhuggins@eecs.berkeley.edu
University of California, Berkeley, Richard Muller, Director, 510.642.0614, muller@eecs.berkeley.edu
University of California, Davis, David Horsley, 530.752.1178, dahorsley@ucdavis.edu
Center website: http://www-bsac.eecs.berkeley.edu/
MEMS-Based Timing Components
For most of the past century, electronic systems have depended upon quartz crystals for the generation of basic timing signals. That is about to change. Microelectromechanical systems (MEMS) include an important class of devices that "resonate" at high frequencies and that can be used to create precise electronic timing and frequency-selective systems. These promise to change the way electronic systems derive their timing.

image of MEMS resonator-based timing components, which will most likely be embedded in computing and mobile devices

Several BSAC-inspired startup companies have been introducing quartz-replacement technology based on these MEMS resonators, including SiTime (co-founded by BSAC co-Director Bernhard Boser), Harmonic Devices (acquired by Qualcomm), and Silicon Clocks (founded by former BSAC co-Director Roger Howe and BSAC post-doctoral researcher Emmanuel Quevy, and acquired by Silicon Labs), along with current and former BSAC industrial member companies including NDK of Japan and the University of Michigan startup Discera Corporation (founded by BSAC co-Director Clark Nguyen).

Economic Impact: The current US timing-devices market of nearly $2B/year represents only the initial target market for MEMS timing components (source: BCC Research Inc). In 2006, this market was 99% served by quartz crystal devices. Private estimates by both crystal and MEMS technology companies suggest a MEMS resonator penetration of approximately 5% in 2012 and 10% to 50% of a $5B/year worldwide market by 2020. Before the end of this decade, it is nearly certain that the high-frequency filtering required for all mobile devices, including cellular telephones and mobile computers as well as communications systems for wireless sensor networks, will depend upon integrated components with thousands of interconnected MEMS resonators. These will perform most of the radio-frequency filtering functions currently done with external discrete surface acoustic wave (SAW) and film bulk acoustic resonator (FBAR) devices. At that time, more than 2 billion portable computers and mobile (cellular) telephones will make use of this technology, pioneered in large part at BSAC. For more information, contact Bernhard Boser, 510.643.8350, boser@eecs.berkeley.edu.

Silicon Carbide Materials and Processes for Rotary Internal Combustion Engine on a Chip
Researchers at the NSF Berkeley Sensor and Actuator Center (BSAC) at the University of California-Berkeley designed and micro-fabricated engine components with features on the scale of tens of microns, an overall scale of millimeters, and etch depths as large as 900 µm. Evolving from the machined stainless steel millimeter-scale engine research of Professor Carlos Fernandez-Pello and BSAC researchers, the DARPA-funded, Al Pisano-led BSAC MEMS REPS (MEMS Rotary Engine Power System) Program required research and development of new materials and processes. These MEMS engines - much like conventional-sized gasoline-powered generators - have been shown to convert the stored chemical energy of liquid hydrocarbon fuels into usable electric power in the 10-100 mW range. The system was able to deliver specific power (W/kg) superior to conventional systems and to leverage the inherent advantages of liquid hydrocarbons: storage, safety and specific energy (W-hr/kg). Several BSAC NSF-Center member companies, such as Chevron Corporation, Textron Systems and Harris Corporation, have participated in the DARPA-funded research and testing of this device.

While federal research on such internal combustion, hydrocarbon-fueled systems has diminished, the by-products of the engine research have been substantial. Research efforts to develop the required auxiliary systems, similar to those found on a modern automotive hybrid engine (ignition, fuel delivery, integrated generator), have led to entirely new applications for the materials and processes originally developed for the micro-fabricated engine. In particular, silicon carbide-based sensors and actuators have led to a very important class of MEMS devices for harsh-environment sensing. This enables more efficient generation of clean power from geothermal wells and gas and steam turbines.

Image caption: MEMS rotary engines convert the stored chemical energy of liquid hydrocarbon fuels into usable electric power

Economic Impact: The micro-Wankel engine itself may not have the economic impact that was once envisioned, though it spawned the development of materials and processes with a larger and environmentally more important impact than envisioned under the original program. Downstream indirect impacts will be felt in increased energy efficiency in large power-generation systems. Embedded sensors operating at up to 600 degrees C will create performance improvements and condition-based maintenance for the myriad steam and gas turbines used in geothermal, nuclear and gas-fired plants, and even in internal combustion and aircraft turbine engines. For example, a 2% improvement in the efficiency of a single large gas or steam electric power generation turbine with an output of 200 MW can represent an additional 35 gigawatt-hours/year of energy, or $3.5M/year at retail pricing of $0.10/kWh.

image of a geothermal power system, oil and gas exploration, gas turbines, automotive engines

The Energy Information Administration (EIA) estimates that between 2009 and 2015, 21 GW of new gas-fired power plant electricity will become available. Each 1% improvement in the efficiency of the turbines generating only this power represents $160M/year in savings at retail pricing of $0.10/kWh. These numbers can be multiplied by similar efficiencies in geothermal generation. (A worked check of these figures follows this entry.)
A conservative estimate of the impact of the emerging harsh-environment sensors enabled by these new processes and devices, spawned in part by the micro-Wankel research and used to make these energy sources more efficient, is expected to exceed $300M/year. For more information, contact Albert Pisano, 510.643.7013, appisano@me.berkeley.edu.
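The turbine figures quoted above can be reproduced with simple arithmetic. In the short sketch below, the ~87% capacity factor used to reconcile the fleet-wide number is our assumption for illustration, not a figure from the source.

```python
# Back-of-envelope check of the turbine-efficiency savings quoted above.
# Assumption (not from the source): the single 200 MW turbine runs
# year-round; the 21 GW fleet runs at a ~87% capacity factor, which is
# what reconciles the published $160M/year figure.
HOURS_PER_YEAR = 8760
RETAIL = 0.10  # $ per kWh

# Single 200 MW turbine with a 2% efficiency gain.
single_kwh = 200_000 * 0.02 * HOURS_PER_YEAR           # kW * h = kWh/year
print(single_kwh / 1e6, "GWh/year,", single_kwh * RETAIL / 1e6, "M$/year")
# -> ~35 GWh/year and ~$3.5M/year, matching the text.

# 21 GW of new gas-fired capacity with a 1% efficiency gain.
fleet_kwh = 21_000_000 * 0.01 * HOURS_PER_YEAR * 0.87
print(round(fleet_kwh * RETAIL / 1e6), "M$/year")      # -> ~160
```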
Radio-Equipped Wireless Sensors called “Smart Dust”
Kris Pister of the Berkeley Sensor and Actuator Center popularized the term "Smart Dust" to help visualize his goal of an autonomous network of highly miniaturized "motes" containing microradios and microsensors. These motes can be deployed at random; wake up; identify who and where their neighbor motes are; and form a dynamic, ad hoc, self-organized mesh data network over which sensor data such as location, motion, light, pressure, and temperature are communicated wirelessly, reliably, and without human intervention.

images of computer chips

The Smart Dust story is really a story of collaborative "stone soup" in which Pister contributed the stone: a $25,000 industrial award from I/UCRC member company Hughes and a $10,000 California (state) MICRO industrial matching grant that eventually led to a $1.7M DARPA "Smart Dust" program. This work resulted in a groundswell of industrial and venture-capital investment in wireless sensor networks (WSN). UC Berkeley computer science collaborators developed an open-source, small-footprint (4 KB ROM, 256 bytes RAM) network operating system called TinyOS for the little "micromotes," which were built from off-the-shelf components and later miniaturized. The micromotes were dropped from UC Berkeley unmanned aerial vehicles and installed, at 1/100th the installation cost of wired sensors, in a structure of a sister I/UCRC, the Center for the Built Environment (CBE). This inspired academic and industrial collaborations that continue to this day. The technology was awarded the Alexander Schwarzkopf Prize for Technological Innovation by the I/UCRC Association in 2006.

Economic Impact: Market forecasts of more than $8B/year for wireless sensors and network components, made around 2003 by market analyst In-Stat* and enabled in large part by the "Smart Dust revolution," were about 8 years too early; but the technology-enabled promise to revolutionize homeland security, environmental control, power management, and infrastructure monitoring is now materializing into the multi-billion-dollar market envisioned. *Source: InStat/MDR 11/2003 (Wireless); Wireless Data Research Group 2003; InStat/MDR 7/2004 (Handsets). For more information, contact Kris Pister, 510.643.6690, pister@eecs.berkeley.edu.
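To make the "self-organized mesh" idea above concrete, here is a minimal, purely illustrative sketch (hypothetical code, not TinyOS or any BSAC implementation) of how motes can discover neighbors and relay readings toward a collection point over multiple hops.

```python
# Purely illustrative sketch (hypothetical code, not TinyOS or a BSAC
# implementation) of the Smart Dust idea: motes discover neighbors within
# radio range, flood a hop-count gradient outward from a collection point
# (the "sink"), then relay readings toward it hop by hop.
import math

RADIO_RANGE = 1.5  # assumed communication radius, arbitrary units

class Mote:
    def __init__(self, mote_id, x, y):
        self.id, self.x, self.y = mote_id, x, y
        self.neighbors = []   # peers discovered within radio range
        self.hops = math.inf  # hop distance to the sink (mote 0)

def in_range(a, b):
    return math.hypot(a.x - b.x, a.y - b.y) <= RADIO_RANGE

def build_mesh(motes):
    # Neighbor discovery: each mote "hears" beacons from motes in range.
    for m in motes:
        m.neighbors = [n for n in motes if n is not m and in_range(m, n)]
    # Gradient setup: the sink floods hop counts through the mesh.
    motes[0].hops, frontier = 0, [motes[0]]
    while frontier:
        nxt = []
        for m in frontier:
            for n in m.neighbors:
                if n.hops > m.hops + 1:
                    n.hops = m.hops + 1
                    nxt.append(n)
        frontier = nxt

def route_to_sink(mote):
    # Greedy forwarding: relay via the neighbor closest (in hops) to the sink.
    path = [mote.id]
    while mote.hops > 0:
        mote = min(mote.neighbors, key=lambda n: n.hops)
        path.append(mote.id)
    return path

motes = [Mote(i, x, y) for i, (x, y) in
         enumerate([(0, 0), (1, 0.5), (2, 0.8), (3, 0.2), (2.5, 1.6)])]
build_mesh(motes)
print(route_to_sink(motes[4]))  # -> [4, 2, 1, 0]: a multi-hop relay
```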
Surface Micromachining of Micro-Electro Mechanical Systems (MEMS)
image of car components

MEMS "surface micromachining of polysilicon" was perfected in the 1980s at the Berkeley Sensor and Actuator Center. Like the prior "bulk" micromachining, "surface" micromachining (SM) creates moving electro-mechanical structures (like clock pendulums or guitar strings on a microscale). But unlike bulk micromachining, SM uses polysilicon as the mechanical structure and is therefore compatible with standard CMOS microelectronic processing. This breakthrough put MEMS on a cost and complexity path similar to that of mainstream CMOS memory and microprocessors, which benefit from "Moore's Law," in which component complexity and value double every 24 months.

Economic Impact: The benefits from the resulting MEMS components are remarkable. Accelerometers used in automobile crash mitigation (airbag deployment) have been estimated to save more than 7,000 lives per year in the U.S. alone. Implantable inertial sensors are now used for early detection of heart pre-failure conditions, in time for intervention. Inertial MEMS will be in 300 million cellular phones sold in 2011. Fast-forward two and a half decades from the BSAC pioneering work in surface micromachining of polysilicon and you find that these "microelectronic inertial sensor" applications have grown into an international multibillion-dollar product category that creates even higher value in the products and systems in which they are employed. For more information, contact Richard Muller, 510.642.0614, muller@eecs.berkeley.edu.
Magnetic Immunosensor
drawing of magnetic bead

A portable device suitable for handheld field deployment by moderately trained personnel has been developed and demonstrated by researchers at the Berkeley Sensor and Actuator Center. This technology allows verified diagnostic assays for infectious diseases, including dengue, malaria, and HIV. The device has allowed dramatic simplification of the testing protocol compared to ELISA (the current optical immunoassay standard), with the advantage of allowing sophisticated assays in a point-of-care or at-home setting, where the facilities and advantages of a research laboratory are not available. A high level of system integration is necessary for replicating the functionality of a diagnostic immunoassay protocol in an inexpensive, palm-held device. Segregating labels that match the suspected disease from nonspecifically bound labels (those that do not match), and detecting the labels, present major obstacles to implementing an integrated immunoassay device. Magnetic bead labels are particularly attractive in this context since they can be electromagnetically detected and manipulated in opaque solutions such as blood, where the optical ELISA method may fail. Finally, the integration of differential magnetic sensors with local magnetic field generation for internally implemented magnetic washing represents system miniaturization and potential cost reduction (because of mass-producible CMOS) that is unprecedented for complex field- or home-deployable assays.

Economic Impact: The immunoassay market represents $15B a year in sales in the US, and over two thirds of that market consists of laboratory tests. Recently, emphasis on healthcare cost reduction in combination with the increasingly burdensome liabilities of running a clinical laboratory has incentivized point-of-care (POC) testing. Unfortunately, adoption of POC testing has been very slow due to the poor performance of current products on the market and to CLIA regulations that impose stringent quality-control requirements on providers. A BSAC-originated startup company, Silicon BioDevices, Inc., has begun commercializing this immunosensor assay technology through a product line that is easy to use, accurate and fully integrated. This approach has begun to catalyze the transition of immunoassays from the laboratory to the POC. For more information, contact Octavian Florescu, 510.292.6260, florescu.octavian@gmail.com or Bernhard Boser, 510.643.8350, boser@eecs.berkeley.edu.

Biomolecular Interaction Technologies Center (BITC)
University of New Hampshire, Thomas M. Laue, Director, 603.862.2459, tom.laue@unh.edu
Center website: http://www.bitc.unh.edu/
Making Biopharmaceuticals Safer
image of a person injecting medicine into their stomach

Protein drugs, also called biopharmaceuticals, are at the forefront of modern medicine. Of particular use are antibodies, proteins that are part of the body's immune system. Therapeutic monoclonal antibodies are used in the treatment of cancers, multiple sclerosis, asthma and other life-threatening diseases. Delivery of these drugs is currently achieved by intravenous injection. Pharmaceutical companies wish to develop high-concentration versions of the drugs that can be administered by patients subcutaneously at home, similar to how insulin is administered now, thus substantially reducing costs and making treatment easier. Development of high-concentration monoclonal antibody formulations is complicated by their tendency to stick to one another, forming insoluble aggregates and plugging up needles. However, the aggregates are more than just a nuisance. Aggregation can result in reduced effectiveness, drug resistance and life-threatening anaphylactic shock. Even current lower-dose versions of the drugs may suffer from aggregation. It is estimated that up to 30% of patients develop resistance to some of the monoclonal antibody cancer treatments due to aggregate formation. The analytical ultracentrifuge (AUC) is widely used in academic and industrial laboratories to characterize molecular interactions, including aggregate formation. BITC funding helped develop a fluorescence detection optical system (FDS) with unparalleled sensitivity and selectivity, which is now produced commercially by Aviv Biomedical, Inc. (the AU-FDS). Using the AU-FDS, it is now possible for the first time to detect antibodies and their aggregates in serum. BITC member companies are using this instrument to determine whether their therapeutic antibodies form aggregates in serum after injection: the drug development and formulation divisions of Genentech, Roche, Amgen, Abbott, Boehringer Ingelheim, Pfizer and Johnson and Johnson either have purchased an AU-FDS to conduct these studies, or are contracting with laboratories that have an AU-FDS to make the measurements. By developing formulations that prevent aggregation in serum, drugs will be safer and more effective.

Economic Impact: Costs associated with chronic monitoring of drug effectiveness run to billions of dollars annually. By detecting aggregate formation early in the drug development process, companies can modify or reformulate their drugs before beginning drug trials. This saves hundreds of millions of dollars annually. Lower drug development costs reduce medical costs. For example, one BITC member pharmaceutical company switched to another molecule when its original candidate molecule was discovered to aggregate in human serum. Since no clinical trials of the drug had begun, the switch to the new, non-aggregating molecule cost almost nothing and saved the company upwards of $100,000,000 in fruitless clinical trials. There are considerable cost savings to society from making non-aggregating drugs. First, by not aggregating, these molecules are far more likely to be tolerated by patients over a longer period of time. This means that patients require less medical supervision and do not need to be switched so frequently to new drugs. It is estimated that 30 to 60% of patients must switch drugs, often several times during treatment, to circumvent aggregate-related immune responses. Reduced aggregation results in a lower incidence of anaphylactic shock, a life-threatening condition.
The rate of severe anaphylaxis is 1 to 3 per 10,000 patients, with treatment costing an average of $10,000 per incident. For more information, contact Thomas M. Laue, 603.862.2459, tom.laue@unh.edu.
Making Biopharmaceuticals Less Expensive
Shown below are two therapeutic monoclonal antibody preparations. The sample in the top photo aggregates and forms a white precipitate as salt is increased to physiological concentrations. The lower photo shows the results for a different antibody, which does not aggregate as salt is increased. The antibody in the top photo had to be abandoned as a possible drug due to its aggregation. Unfortunately, over $100M already had been spent on its development.

image of test tube vials

The difference between the two proteins is their net charge. The more highly charged protein remains in solution, while the low-charge protein aggregates and precipitates. Just as NaCl dissolves in water to form Na+ and Cl- ions, proteins also may be charged. BITC-sponsored research has shown that only proteins that have a high net charge remain soluble. This research has demonstrated that the current methods for calculating the net charge may be exceedingly inaccurate, misleading researchers about their proteins' solubility. That was the case for the proteins shown here. BITC-sponsored research has developed methods for accurately determining protein charge. Workshops have been held to teach researchers how to make the measurements. Member companies now routinely measure the charge on candidate therapeutic monoclonal antibodies before embarking on multi-million-dollar drug development projects. All of the BITC member companies (Genentech, Roche, Johnson and Johnson, Pfizer, Amgen, Abbott, Boehringer Ingelheim) have instituted routine screening of candidate drug proteins early in the drug development process to determine which candidates carry sufficient charge to remain soluble at high concentrations. Only those antibodies with a high net charge are good candidates for formulation.

Economic Impact: In order for proteins to remain soluble they must carry a net charge that blocks aggregation. Prior to the work by BITC, pharmaceutical companies did not have a way to measure protein charge. Instead, they calculated charge estimates based on indirect measurements (a sketch of this kind of calculation follows this entry). The companies were repeatedly and unpleasantly surprised to find that their therapeutic proteins were not soluble at high concentrations. Because the proteins had to be diluted, patients had to undergo long and expensive infusions in the clinic. By developing ways to determine protein charge, BITC has reduced the cost of drug development substantially in two ways. First, companies now know that their charge estimates were too often incorrect, which is why the drugs were aggregating. By instituting charge determinations as part of the drug development process, they will now be able to prevent drug aggregation prior to expensive clinical testing. Second, the drugs will remain aggregate-free up to very high concentrations, making it feasible for patients to administer them at home, thus saving clinic costs. One other important impact of charge determination is longer-range and more abstract, but will have a significant impact on medicine. The biochemistry of cells and tissues takes place at very high protein concentrations. However, our current understanding of biochemistry is based on measurements made in very dilute solutions. The same erroneous assumptions about protein charge made by the pharmaceutical companies have been at the foundation of our understanding of high-concentration systems. It is clear that charge measurements must be made a routine part of all of biochemistry and cell biology if we are to have an accurate view of how cells function. This new knowledge will be at the heart of biological medicine, and will open up new horizons for how medicines are developed and used. For more information, contact Thomas M. Laue, 603.862.2459, tom.laue@unh.edu.
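For readers unfamiliar with "calculated charge estimates," the sketch below shows the classical sequence-based calculation: summing Henderson-Hasselbalch charges over ionizable groups using model pKa values. It is a hypothetical illustration, not BITC's method; the center's point is precisely that estimates of this kind can differ badly from the measured charge.

```python
# Hypothetical illustration (not BITC's method) of the classical
# sequence-based net-charge estimate the text warns about: summing
# Henderson-Hasselbalch charges over ionizable groups with model pKa
# values, ignoring the structural and solution effects that direct
# measurement captures.
PKA = {"D": 3.9, "E": 4.1, "H": 6.0, "C": 8.4, "Y": 10.5,
       "K": 10.5, "R": 12.5, "N_term": 8.0, "C_term": 3.1}
ACIDIC = {"D", "E", "C", "Y", "C_term"}  # deprotonate to charge -1
BASIC = {"H", "K", "R", "N_term"}        # protonate to charge +1

def net_charge(sequence, pH=7.4):
    q = 0.0
    for g in list(sequence) + ["N_term", "C_term"]:
        if g in BASIC:
            q += 1.0 / (1.0 + 10 ** (pH - PKA[g]))   # fraction protonated
        elif g in ACIDIC:
            q -= 1.0 / (1.0 + 10 ** (PKA[g] - pH))   # fraction deprotonated
    return q

# Toy peptide; a real antibody has ~1,300 residues, and this calculated
# value can differ markedly from the charge actually measured in solution.
print(round(net_charge("DEKKRHED"), 2))  # -> about -1.2
```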
Center for Advanced Forestry Systems (CAFS)
North Carolina State University (Center Headquarters), Barry Goldfarb, Director, 919.515.4471, barry_goldfarb@ncsu.edu
North Carolina State University, Jose Stape, 919.513.4041, jlstape@ncsu.edu
Purdue University, Charles Michler, 765.496.6016, michler@purdue.edu
University of Georgia, Michael Kane, 706.542.3009, mkane@warnell.uga.edu
University of Florida, Erik Jokela, 352.846.0890, ejokela@ufl.edu
University of Maine, Robert Wagner, 207.581.2903, bob_wagner@umenfa.maine.edu
Oregon State University, Glenn Howe, 541.737.9001, glenn.howe@oregonstate.edu
University of Washington, David Briggs, 206.543.1581, dbriggs@u.washington.edu
Virginia Tech, Thomas Fox, 540.231.8862, trfox@vt.edu
Center website: http://www.cnr.ncsu.edu/fer/cafs
Refinement of Growth and Yield Models for Naturally-Regenerated, Mixed-Species Stands
image of a naturally regenerated forest

Forest growth models are widely used by forest managers and researchers to forecast future growth, update forest inventory information, and assess alternative forest management strategies. The computer simulation models currently used across the Northeastern US typically show significant biases. Because of these biases, improved growth and yield tools have recently become a top research priority for Maine's industrial forest landowners. This CAFS project has developed a computer simulation tool that better reflects present-day forest conditions and can more accurately represent alternative forest management regimes across the Northeast. Compared to plantations, growth and yield models for naturally-regenerated, mixed-species stands have received relatively little attention. The project also provides a tool that can be used by forest managers in a variety of settings. First, the model uses data generated from diverse sets of stand conditions and forest management regimes (>3 million observations are typical). Second, it is specific enough to predict individual-tree growth in the complex mixed-species stands that comprise much of the Northeast; as a result, outputs are flexible enough to account for varying forest types and stand histories. Third, the project uses model-fitting techniques that are capable of flexibly accounting for dynamic growth patterns. At the same time, alternative measures of site productivity are being tested to find out which measures best correlate with and model forest growth. Finally, similar models are being developed to accurately represent common forest management practices such as thinning, an attribute that existing models lack. (A sketch of the general model form appears at the end of this entry.)

Economic Impact: This work will have significant economic impacts not only on the forestry industry generally, but on regional, state and national economies as well. A properly developed growth-characterization and forecasting tool such as the one this CAFS project has produced will yield more refined regional growth and yield estimates for determining future forest attributes. For forest managers, the net present value of carrying out various types of silviculture, such as pre-commercial and commercial thinning, can be quantitatively assessed prior to treating a stand. For the center's pulp and paper industry sponsors, wood supply analyses are being conducted on ownership and regional scales. For wildlife managers and those interested in biodiversity issues, suitable habitat is being more accurately assessed. Landowners can accrue significant savings and higher yields by calculating annual forest property carbon sequestration, which is becoming a valuable commodity itself. Finally, policy makers can assess the consequences of various forest policy measures. For more information, contact Aaron Weiskittel, 207.581.2857, aaron.weiskittel@maine.edu.
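As referenced above, the following is a hypothetical sketch of the general form of an individual-tree growth model of the kind described: a log-linear equation predicting annual diameter increment from tree size, site productivity, and competition. The coefficients are invented for illustration; the CAFS model is fit to millions of regional observations.

```python
# Hypothetical sketch of the general form of an individual-tree growth
# model like the one described: log-linear annual diameter increment as a
# function of tree size, site productivity, and competition. All
# coefficients are invented for illustration only.
import math

def diameter_increment(dbh_cm, site_index, basal_area_larger):
    """Predicted annual diameter growth (cm) for one tree; toy coefficients."""
    b0, b1, b2, b3, b4 = -2.8, 0.9, -0.015, 0.04, -0.025
    log_dg = (b0 + b1 * math.log(dbh_cm) + b2 * dbh_cm
              + b3 * site_index + b4 * basal_area_larger)
    return math.exp(log_dg)

# A 20 cm tree on a moderate site with 12 m^2/ha of larger competitors:
print(round(diameter_increment(20.0, 15.0, 12.0), 2), "cm/year")  # ~0.9
```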
Predicting the Quality Value of Fast-Grown Wood from Advanced Forestry Systems
Over the last two decades foresters have dramatically increased growth rates and yields of the South's forests through intensive plantation management (advanced forestry systems). What is the quality of this fast-grown wood compared to historic wood products? Until now that question has either been avoided or addressed with very limited data and a good bit of speculation. Forest landowners rely on long-term forecasting models of forest growth and yield as they plan these intensive management regimes. Traditional forecasting models predict timber volumes but have not taken into account the quality of the wood produced, such as lumber stiffness and strength. Yet wood quality is directly affected, both positively and negatively, by intensive plantation management. Wood growers currently do not have a good understanding of the quality of the wood they are growing. There has been no region-wide consensus among wood growers on how to manage for wood quality, nor is there a broad understanding among wood products manufacturers about the value of trees grown in different ways.

image of a managed plantation forest

CAFS researchers at the University of Georgia have been dissecting trees and analyzing wood grown throughout the South under the full range of forest management intensities, and have put their findings into prediction models that can forecast future wood density, stiffness and strength. Most importantly, they have integrated these wood quality predictions into the growth and yield models foresters use for evaluating volume gains from advanced genotypes and intensive silvicultural treatments such as weed control, thinning, and fertilization. This allows wood growers, for the first time, to incorporate the quality of wood in their planning along with the quantity of wood. Results are helping sponsors to better understand the impact of wood quality on forest product mix. Large timberland owners such as Plum Creek need to understand the value of the wood they grow as it relates to their customers' needs. This work is a breakthrough because it enables direct predictions of value along with volume, linking the tree grower with the wood buyer and the lumber manufacturer.

Economic Impact: The outcome of this project will improve how timber is grown and marketed, because it will allow wood quality and value to be factored into long-term forest management decisions. This in turn will help manufacturers better market their products to end-users. It will result in more competitive pricing for timber based on wood quality. It should also dispel myths about plantation-grown wood that have negative market impacts, and could improve sales of the products. By factoring wood quality into silvicultural decisions, growers may increase the proportion of high-quality sawtimber by 25% or more. This could mean gains in value at harvest of $1,000 or more per acre, which could easily reach $1 billion in new value annually across the South. For more information, contact Richard F. "Dick" Daniels, 706.542.7298, ddaniels@uga.edu.
Exponential Nutrient Loading
A new approach referred to as "exponential nutrient loading" has been developed by researchers at the Center for Advanced Forestry Systems (CAFS) to pre-condition black walnut grafts in the greenhouse for field planting. The technique increases the morphological and nutritional quality of grafted plants and stores nutrients in root plugs for later utilization, benefiting early plantation establishment. This protocol allows a higher growth rate of the grafts in their first year after planting in the field. Black walnut grafts that have been nutrient-loaded exponentially will be used in intensively cultivated plantings. In intensive cultivation this is important because the response to fertigation and weed control is higher and rotation age will be decreased, which brings substantial financial benefits. In extensive cultivation, the rapid growth and competitiveness exhibited by exponentially nutrient-loaded grafts will accelerate plantation growth to reach free-to-grow status sooner, which increases the chances of crops escaping damage from animal browsing and weed competition. Intensively cultivated clonal black walnut plantings are currently being offered as a financial opportunity for long-term investors.

Economic Impact: This method makes possible a two-year reduction in the long-term production cycle. Based on today's prices for black walnut timber, this new method of tree production is increasing the profitability of one CAFS corporate partner by an estimated 8 million USD over the expected production rotation period. The corporate partner employs over 50 staff annually, and this increased profitability allows the company to maintain its current level of employment. The method has also been adopted by public nurseries, allowing them to sell a higher-quality product and thus stay more competitive in the current recessionary market. For more information, contact Charles Michler, 765.496.6016, michler@purdue.edu.
Precocious Flowering in Populus
image of a forest at sunset

Trees have not been domesticated to the same extent as agronomic row-crops because of their extended juvenile periods. Moreover, after trees have undergone the transition to maturity, significant amounts of the sugars fixed through photosynthesis are diverted away from vegetative growth (e.g., stems, branches, roots, and leaves) to form reproductive structures (e.g., cones, flowers, and seeds). Federal regulators have made it clear that a transgene confinement system is likely to be needed before genetically engineered trees can be deployed commercially. CAFS researchers are attempting to genetically engineer flowering control as a way to satisfy this requirement. In order to test the efficacy of the genetic constructs inserted in the poplar genome for their ability to affect floral development, researchers must wait for plants to acquire the competence to produce flowers. The long delay before the onset of flowering in poplars (they have a juvenile period of five to seven years) and their resistance to various conventional flower-induction treatments have been serious impediments to engineering sterility. CAFS researchers obtained a genotype of Populus alba from the University of Tuscia (Viterbo, Italy) that flowered nine months from when the seed was sown. Vegetative propagules from this line remained true to type (i.e., they flowered in nine months). However, this genotype had to be regenerated in vitro and grown under aseptic conditions before it could be imported into the U.S. The regeneration process caused this genotype to lose its ability to flower early. CAFS researchers experimented with a variety of inductive treatments and discovered one that restored the early-flowering phenotype. Center researchers have also obtained a genotype of Juglans regia that is capable of producing flowers on nine-month-old plantlets, and have identified conditions required to induce flower formation on Prunus serotina grown in vitro. Thus, CAFS scientists now have a variety of effective model systems for testing flower-control constructs without having to conduct lengthy, expensive field trials.

Economic Impact: Through a better understanding of the process by which trees control the onset of flowering, it may be possible to switch flowering on or off at will through genetic engineering. Shortening breeding cycles will allow more rapid selection of trees that produce more biomass and are resistant to various biotic and abiotic stresses, thus minimizing economic losses. Preventing flowering will allow more photosynthate to be used for vegetative growth. Because there are so many tree species grown for a diverse range of products, it is difficult to quantify the benefits associated with this technological advance. Poplars are grown for fiber to manufacture paper, for window and door casings, moldings, pallets, and core stock for plywood, and, increasingly, as a feedstock for biofuel production; it is estimated that the yearly value of the U.S. poplar industry alone is about $300 million. For more information, contact Rick Meilan, 765.496.2287, rmeilan@purdue.edu.

Center for Advanced Knowledge Enablement (CAKE)
Florida International University, Naphtali Rishe, Director, 305.348.2025, rishe@fiu.edu
Florida Atlantic University, Borko Furht, 561.297.3180, borko@cse.fau.edu
Dubna International University (International Site)
Center website: http://cake.fiu.edu
TerraFly Maps Enable Monitoring of Airborne Cameras
Although video surveillance recording is the state of the practice, the video collected is normally used only after the fact - it cannot easily be accessed in real time, does not have accurate geolocation capabilities, and cannot be easily integrated with other forms of critical information. This lack of situational awareness will be overcome by the CARMEL-TerraFly system.

image of a surveillance photo from airborne cameras looking over land and water

The project integrates cutting-edge Context Aware Rich Media Extensible Middleware (CARMEL) technology from IBM Research - Haifa (http://www.haifa.ibm.com) with the TerraFly Geospatial System at the Center for Advanced Knowledge Enablement (CAKE). This integrated system offers innovative situational-awareness technology while helping expand the Center's international influence and connections. Combining IBM Haifa's Geographic Information Systems (GIS) and streaming-technology research, CARMEL is a geographically anchored, video-on-demand streaming infrastructure that provides: 1) scalable, end-to-end, low-delay and resilient streaming technologies; 2) on-demand bandwidth adaptation (transcoding); 3) highly accurate geographical searches; 4) real-time, geo-located notification; and 5) high-performance, service-oriented-architecture-enabled technologies. TerraFly is a technology and set of tools for the visualization and querying of geospatial data. It provides users with the experience of virtual "flight" over maps comprised of aerial and satellite imagery overlaid with geo-referenced data. The data drilling and querying component of the system allows users to easily explore geospatial data, create geospatial queries, and get instant answers supported by high-performance multidimensional search mechanisms. TerraFly's server farm ingests, geo-locates, cleanses, mosaics, and cross-references 40 TB of basemap data and user-specific data streams. The interface allows rapid deployment of interactive Web applications, and is accessible from anywhere via any standard Web browser, with no client software to install. The CARMEL-TerraFly project marries these two technologies, providing geographically anchored streaming services that can be combined with and accessed via the intuitive TerraFly user interface. This novel technology could transform public-safety assurance and the ability to respond quickly to situations. Users will be able to select a geographic area of interest, retrieve multimedia data from sensors in the area, and view streaming video of moving objects (e.g., vehicles, people, animals) in real time. Users will also be able to set temporal and geographic constraints to view the path traversed by a specific moving object or group of objects. There are numerous potential applications for this advanced technology, particularly for command and control operations such as homeland security, law enforcement and disaster response. For example, using the CARMEL-TerraFly system, law enforcement could be alerted to a situation such as a hit-and-run accident. Officers would be able to quickly pinpoint the geographic location, view streaming media of the current location to quickly assess the situation, and, through the use of additional sensors, track the offender's vehicle.
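As an illustration of the kind of query described above (hypothetical code, not the CARMEL or TerraFly API), the sketch below filters geo-tagged, time-stamped video streams by a bounding box and a time window:

```python
# Hypothetical sketch (not the CARMEL or TerraFly API) of the query the
# text describes: filter geo-tagged, time-stamped video streams by a
# geographic bounding box and a time window.
from dataclasses import dataclass

@dataclass
class StreamMeta:
    stream_id: str
    lat: float
    lon: float
    start: float  # epoch seconds
    end: float

def query_streams(streams, bbox, t0, t1):
    """Return streams whose sensor lies inside bbox and overlaps [t0, t1]."""
    lat_min, lat_max, lon_min, lon_max = bbox
    return [s for s in streams
            if lat_min <= s.lat <= lat_max
            and lon_min <= s.lon <= lon_max
            and s.start <= t1 and s.end >= t0]  # time intervals overlap

cams = [StreamMeta("cam-1", 25.77, -80.19, 0, 3600),
        StreamMeta("cam-2", 26.12, -80.14, 1800, 7200)]
# Area of interest around downtown Miami, first hour of an incident:
print(query_streams(cams, (25.7, 25.8, -80.3, -80.1), 0, 3600))  # -> cam-1
```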
Economic Impact: The potential economic impact of CARMEL-TerraFly is substantial because it can be a cost-effective public-safety tool that reduces law enforcement costs, increases the effectiveness of situational evaluation and response, and contributes to the economic improvement of the areas served. Litigation costs could also be decreased as more timely and accurate evidence becomes available for use in and out of the courtroom. In addition, the system could improve the effectiveness of situational evaluations and subsequent responses by providing tools for better resource allocation, thus improving the safety of responders and the public, and ultimately saving lives and property. Finally, use of this system could ultimately reduce crime, which, in turn, would lower the cost of doing business and contribute to local and national economic improvement. For more information, contact Naphtali Rishe, 305.348.2025, rishe@fiu.edu.
Business Continuity Information Network: Faster Community Driven Disaster Recovery
In coastal areas throughout the US, information sharing is critical for community resilience and protection of economic interests. Studies indicate that, following hurricanes, approximately 40% of companies that were closed for 3 or more days fail within 36 months. Years of meteorological data have demonstrated that South Florida is particularly prone to extensive damage from hurricanes. There are myriad toolkits, checklists, and other business continuity tools available that address how to prepare businesses for disaster. None of these stand-alone tools provides a means for business users to connect with local governments to monitor ongoing situations before, during and after natural disasters.

image of a satellite image of a hurricane over Florida

The Business Continuity Information Network (BCIN), at the Center for Advanced Knowledge Enablement (CAKE), provides a platform for public- and private-sector communities to work in a coordinated fashion, providing the right information, to the right person, at the right time, in the right format. Florida International University and its public- and private-sector partners, including Office Depot, Wal-Mart, IBM, the Greater Miami Chamber of Commerce, and county and city government agencies, have developed BCIN, a unique information-sharing, web-based software system that provides a means for at-risk local businesses to receive and share timely and vital preparedness, response, and recovery information. This information helps protect critical infrastructure and provide high-demand recovery resources. CAKE researchers have captured processes, workflow, and continuity "best practices" in an intuitive user interface that displays, queries and reports on over 26 different situational categories such as ports, roads, utilities, fuel, and other critical infrastructure and recovery resources. BCIN is available year-round as a service. This business-to-business community network provides participating companies with a powerful new tool to track their key employees and supply-chain status and to locate needed recovery goods and services. The system also helps government agencies assess damage and prioritize recovery needs.

Economic Impact: Based on training exercises, surveys and other feedback, participants report that they will significantly benefit from the system and its capabilities. Information sharing is critical for community resilience and overall economic well-being in coastal areas throughout the US. Since May 2009, the system has been operational in four South Florida counties: Miami-Dade, Broward, Palm Beach, and Monroe. The system was tested in response to storms Fay (see photo), Gustav, and Ike and has been used in numerous state and local hurricane- and terrorism-related disaster training exercises. Hundreds of individuals from government agencies, NGOs, and businesses have been acquainted with and trained on the system. Based on data from the Insurance Information Institute, if 5% of the companies in South Florida could gain the capability to speed up their hurricane recovery by one week, then $220M of non-property economic losses could be avoided. For more information, contact Shu-Ching Chen, 305.348.3480, chens@cs.fiu.edu.
Distributed Cloud Computing: 3-D Visualization Services for Climate Data on Demand
This study is a collaboration between CAKE and the Center for Hybrid Multicore Productivity Research (CHMPR) at UMBC.

image of a heat map of the world

Measuring the surface temperature of the entire Earth on a daily basis is a difficult challenge because 75% of the planet is covered with oceans and ice. Continuously determining, for several days to weeks, the vertical thermal field around a hurricane surrounded by dynamically rotating clouds is needed for more accurate landfall predictions. Thus, for applications ranging from climate change to hurricanes, satellites measure the Earth's emitted infrared radiation twice daily with sufficiently high spatial and spectral resolution to provide an estimate of vertical profiles of regional or global surface brightness temperature (BT). However, in order to assess global warming, these temperatures need to be measured to within an accuracy of 0.10 ˚C per year, since models indicate CO2 warming of roughly 2-3 ˚C over 100 years. Moreover, to resolve the structure around hurricanes, infrared data at resolutions of 1-5 km are needed. Not until 2002, when the Aqua satellite was launched, was there a single satellite with instruments that could meet both the accuracy and the spatial resolution required. In this multi-center collaborative project, researchers from the Center for Hybrid Multicore Productivity Research (CHMPR) at UMBC and the Center for Advanced Knowledge Enablement (CAKE) at Florida International University (FIU) and Florida Atlantic University (FAU) have developed a capability to deliver a decade of 3-D gridded arrays of animated visualizations of spectral IR satellite radiance data from instruments on Aqua. These animations render in 3-D the vertical structure of a decade of global and regional temperature trends occurring at the surface and in the lower troposphere. In addition, the gridding algorithm developed by CHMPR (sketched at the end of this entry) has been applied to providing CAKE with 3-D temperature profiles that specify the thermal structure around hurricanes in order to improve landfall prediction.

image of a computer generated model of atmospheric temperatures

CHMPR and CAKE have implemented a distributed cloud-computing web-based service, called SOAR, that makes this visualization capability publicly available on an advanced IBM-based server cluster. The system provides researchers and students with the ability to select regional and temporal periods and automatically transform IR orbital satellite data into spherical grid arrays of 3-D temperature profiles for viewing the continuously changing thermal structure of the atmosphere. The FIU site at CAKE augmented the satellite data visualization by providing spatiotemporal visualization and animation of the data (http://cake.fiu.edu/SOAR). The FAU site at CAKE has developed tools for 3-D visualization of the vertical temperature profiles. Coupled with the CHMPR gridding software, these tools render, for the past decade, the first integrated, scientifically validated multi-year infrared brightness temperature record.

Economic Impact: Fundamental decadal data records are highly desired products recommended by the National Academy of Science/National Research Council. The SOAR distributed cloud-computing web-based service enhances NASA's ACCESS program by providing fundamental brightness temperature records. This can go a long way towards improving scientific and public understanding of the nature of global and regional climate change. As a result, everyone can be better positioned to design any necessary policies and actions for mitigating negative impacts on the economy. For more information, contact Valerie Thomas, 410.455.2862, valeriet@umbc.edu or Naphtali Rishe, 305.672.6471, rishe@fiu.edu or Borko Furht, 561.297.3180, borko@cse.fau.edu.
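The gridding step referenced above can be illustrated with a minimal sketch (hypothetical, not the CHMPR algorithm): irregular satellite footprints are binned onto a regular latitude/longitude grid and cell-averaged.

```python
# Minimal sketch (hypothetical, not the CHMPR algorithm) of gridding:
# irregular satellite footprints (lat, lon, brightness temperature) are
# binned onto a regular lat/lon grid by cell-averaging.
import numpy as np

def grid_bt(lats, lons, bt, cell_deg=1.0):
    n_lat, n_lon = int(180 / cell_deg), int(360 / cell_deg)
    total = np.zeros((n_lat, n_lon))
    count = np.zeros((n_lat, n_lon))
    rows = np.clip(((lats + 90.0) / cell_deg).astype(int), 0, n_lat - 1)
    cols = np.clip(((lons + 180.0) / cell_deg).astype(int), 0, n_lon - 1)
    np.add.at(total, (rows, cols), bt)   # accumulate samples per cell
    np.add.at(count, (rows, cols), 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        return total / count             # NaN where a cell has no samples

# Toy swath: three footprints; the first two fall in the same 1-degree cell.
lats = np.array([25.3, 25.6, 40.0])
lons = np.array([-80.2, -80.4, -105.0])
bt = np.array([288.0, 290.0, 275.0])     # brightness temperature, Kelvin
grid = grid_bt(lats, lons, bt)
print(grid[115, 99])                     # cell near 25.5N, 80.5W -> 289.0
```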
Center for Advanced Polymer & Composite Engineering (CAPCE)
Ohio State University, L. James Lee, Director, 614.292.2408, lee.31@osu.edu
Florida State University, Ben Wang, 850.410.6339, indwang1@eng.fsu.edu
University of Wisconsin-Madison, Tom Turng, turng@engr.wisc.edu
Center website: http://www.capce.ohio-state.edu/
Software for Enhancing In-Mold Coating Processes of Plastic and Composite Products
CAPCE research led by Jose Castro has produced new software to enhance in-mold coating processes. The technique of in-mold coating has the potential to revolutionize the coating and paint processing industries because it allows coatings to be injected under high pressure directly inside the mold used, for instance, to create an automotive body panel, rather than running the part through long coating production lines that are expensive, energy-consuming, and release solvents into the environment. Software developed by the center provides the ability to predict the flow of in-mold coating processes, saving time and money compared to the previous approach. A center member company is paying the patent application costs for this technology. Applications extend beyond the automotive industry to include many kinds of plastic and composite products.

image of a car dashboard

For more information, contact Jose M. Castro, 614.688.8233, castro.38@osu.edu.
Nanocomposite Foam Breakthrough
The worldwide value of plastic foams was $2 billion in 2000. However, current applications are limited by the fact that foams have poor toughness, strength, and surface quality, have low thermal stability, lack fire retardance, and release environmentally harmful gases. Researchers at Ohio State's I/UCRC, the Center for Advanced Polymer and Composite Engineering (CAPCE), have developed a novel method with the potential to improve foam properties by a factor of 3 or 4. Such improvements are expected to dramatically increase the worldwide demand for plastic foams and increase the U.S. market share in the building and transportation industries, in packaging, and in absorbent materials for the health care industry. The method has attracted a great deal of interest from industry and the media. It involves mixing specially-treated clay nanoparticles with the materials to be foamed, then blowing the foams with carbon dioxide using supercritical fluid technology. The new process will have many environmental benefits, including reduced energy use when the material is applied as an insulator in building construction and the elimination of ozone-depleting materials in the foam-making process. In addition, the resulting plastic foam is fire retardant. Tests with Owens Corning and other companies have demonstrated the feasibility of cost-effective mass production. Scale-up activities for commercialization are being carried out through a $1.9 million NIST-ATP project with Owens Corning and a $2 million equipment award for Low Cost Nanocomposite Foams from State of Ohio Wright Center Capital Project Funds.

Economic Impact: This work is expected to increase worldwide demand for higher-quality and safer plastic foams and to increase the US market share in construction, transportation, and healthcare. For more information, contact L. James Lee, 614.292.2408, Lee.31@osu.edu or Roland Loh, Roland.loh@owenscorning.com.

Center for Advanced Processing and Packaging Studies (CAPPS)
The Ohio State University, Steven Schwartz, Director, 614.292.2934, schwartz.177@osu.edu
North Carolina State University, Kandiyan Sandeep, 919.515.2444, kp_sandeep@ncsu.edu
Center website: http://www.fst.ohio-state.edu/CAPPS/index.html
High Pressure pH Probe
image of a measuring device, looks like the tip of a small lightbulb attached to a metal handle

It has been thought for many years that the pH of various food systems may be reduced under pressure, yet to date there has been no means of studying this phenomenon. There is a wealth of literature reporting the effect of pressure on the inactivation of microorganisms in various buffer systems, yet it was not known to what degree the pH-lowering effect of high pressure is a factor in the inactivation kinetics measured in different buffer systems. The High Pressure pH Measuring Device, or probe, developed at the Center for Advanced Processing and Packaging Studies (CAPPS), can determine the pH of fluids at extremely high pressures, on the order of 600 MPa (87,000 psi). No such device had previously been available to study acid/base equilibrium phenomena under extreme pressure conditions. High pressure processing is a technology that uses extreme pressures, instead of heat, to pasteurize foods. The probe is beginning to be employed commercially in the food industry for a number of high-quality product applications such as processed meats, shellfish and the preservation of products containing heat-labile fruits and vegetables. The development of a high-pressure pH probe should finally enable a better understanding of the relationships among pressure, pH shift, and microbiological effects.

Economic Impact: With this new probe as a research tool, it is becoming possible to develop and select food acidulant systems that reach low pH levels under pressure (thus improving HPP antimicrobial effectiveness) yet allow for organoleptically acceptable products at 1 atm when they are consumed. This is a development that could save many millions of dollars annually for the food processing and distribution industries. For more information, contact Sudhir Sastry, 614.292.3508, sastry.2@osu.edu.
Continuous Microwave Sterilization of Fluid Foodstuffs
image of a man checking industrial equipment used for pasteurization

Research at the Center for Advanced Processing and Packaging Studies (CAPPS) utilized technology that allows fluids to be continuously and very rapidly heated, in a tube, by a focused microwave source. Aseptic processing of fluid foods has been practiced by industry for a fairly long time, but the quality of foods produced conventionally, by indirect heat transfer through the walls of a tube, has been limited by the rate at which the food can be heated to pasteurization/sterilization temperatures. To eliminate microorganisms, the food must be exposed to a certain target temperature for a defined period of time; slow heating degrades the quality of the food during heat-up (a worked illustration follows this entry). This is a particular problem with highly viscous fluids, which tend to have poor heat-transfer rates from a heated wall. By heating the fluid directly with microwaves, heating rates can be substantially increased, with dramatic improvement in quality, without the need for scraped-surface or large-surface-area heat exchangers. Continuous microwave processing may be further extensible to food systems with particulates.

Economic Impact: This innovation makes it possible for a number of viscous food products to be prepared with a significant improvement in quality, and should result in substantial economies in the food processing industry. For more information, contact Josip Simunovic, 919.513.3190, josip_simunovic@ncsu.edu or Ken Swartzel, 919.513.2063, ken_swartzel@ncsu.edu.
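The heat-up/quality trade-off referenced above follows from the standard thermal-lethality calculation. The sketch below, with textbook-style parameter values that are our assumption rather than CAPPS data, shows why a fast microwave ramp achieves comparable microbial kill with far less time at elevated temperature:

```python
# Illustrative sketch (assumed textbook-style parameters, not CAPPS data)
# of why rapid heat-up preserves quality: sterilization lethality follows
# the classical F-value, F = sum(10 ** ((T - T_ref) / z) * dt), so time
# spent ramping up contributes little microbial kill while still cooking
# the product.
def f_value(temps_c, dt_s=1.0, t_ref=121.1, z=10.0):
    """Accumulated lethality (equivalent seconds at t_ref) of a temperature history."""
    return sum(10 ** ((t - t_ref) / z) * dt_s for t in temps_c)

hold = [121.1] * 30                            # 30 s hold at target temperature
slow = [80 + i for i in range(42)] + hold      # ~42 s conventional ramp, 1 C/s
fast = [80 + 4 * i for i in range(11)] + hold  # ~11 s microwave ramp, 4 C/s
print(round(f_value(slow), 1), round(f_value(fast), 1))  # ~34.1 vs ~30.6
# Nearly identical lethality, but the slow ramp holds the product at
# elevated temperature roughly four times longer before the hold begins,
# degrading quality without meaningfully increasing microbial kill.
```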
Center for Advanced Studies in Novel Surfactants (CASNS)
Columbia University, Ponisseril Somasundaran, Director, 212.854.2926, ps24@columbia.edu
Center website: http://www.columbia.edu/cu/iucrc/
image of a horizontal line
Greener, More Sustainable Solutions for the Mining Industry
Mineral separations have become increasingly challenging due to the emergence of problematic ores in several existing operations. It is well known within the mining industry that, in the selective flotation separation of valuable minerals from complex ores, certain silicates and slime-forming minerals have significant detrimental effects. Until recently such effects were attributed to chemical factors such as hetero-coagulation between the silicates and valuable minerals, generally referred to as slime coating. Previous approaches to this slimes problem have been unsatisfactory because they consume too much energy and water and are not sustainable. Researchers at CASNS, in cooperation with Cytec Industries and center sponsor Vale-Inco, have developed new techniques to study the rheological properties of such ore pulps. These techniques are based on sedimentation and the determination of rheological parameters such as viscosity, yield stress and torque values at high shear rates. This research program, which was designed to develop a scientific understanding of the contributions of both physical and chemical factors of slimes to selective mineral separation and suspension rheology, has discovered the large role played by the morphology of certain silicate minerals, even when they are present in small amounts in a complex mixture of minerals. image of mining trucks in a rock mine When the CASNS research program was initiated, there were no established methods to monitor the rheological properties of ore pulps, which typically have wide size and specific-gravity distributions, because of difficulties caused by the rapid settling of coarse, heavy particles. The findings should have significant scientific and technological impacts, pointing to pathways that enhance the selective separation of valuable minerals from complex ores containing slimes and providing a robust solution to the long-standing slimes problem. Economic Impact: This advance will make smelting more energy efficient and enable the utilization of waste products. Most importantly, this work will result in the design of greener, sustainable solutions for the mining industry: new processes that consume less water and energy and use green reagents, thus significantly reducing the overall mining environmental footprint and making the industry more economically productive. It should extend the lifetime of existing mining operations, saving jobs and resources. For more information, contact P. Somasundaran, 212.854.2926, ps24@columbia.edu.
image of a horizontal line
Novel Technologies for Superior and More Sustainable Consumer Products
image of laundry detergent In the past few years, CASNS has studied complicated systems involving interactions among multiple ingredients, including surfactants, polymers, enzymes, solid particles, solvents and electrolytes in water, and with substrates such as skin and fabric materials. This work provides a knowledge base and framework of learning for industrial applications. In the personal care sector, applying this learning to products such as soaps and shampoos led to milder, less irritating skin care products. In the household cleaning and laundry care segments, it led to higher-performing products with a smaller environmental footprint and lower cost. The project examined industrial systems as the benchmark through the lens of sustainability. Favoring greener materials based on their profiles in the supply chain, the researchers examined novel materials and commercial ingredients and searched for synergies. By investigating the physical properties and microstructure of the surfactant systems and their correlation with performance, the center was able to provide solid learning toward a new generation of product formulations with higher performance and lower cost and footprint. This research provided systematic learning that helped the development of a newer generation of laundry detergent formulations with better textile skin feel, greater resistance to pH variation, a more robust profile of stain removal in the laundry washing process, and greater cost effectiveness. Another example is the use of cationic polymers in skin wash products. The research project led to enhanced skin mildness and more effective use of surfactants in cleansing. Economic Impact: The learning developed at CASNS provides further opportunities for member companies in their R&D efforts to develop products with higher performance and lower cost and environmental footprint. For more information, contact Ponisseril Somasundaran, 212.854.2926, ps24@columbia.edu.
image of a horizontal line
Enhanced Silicone Coatings: Improved Fabric Care
Interactions of silicones with surfactants are very important for better product performance in the cosmetic and personal care industries, but very little is known about the mechanisms by which these polymers interact with various substrates. Generally, improved silicone coating results in better softness, handle, feel and bounce. The results provide quantitative measures for qualitative properties such as the softness and bounciness of fibers, important considerations for the fabric care industry. image of a computer generated model One application is that enhanced silicone coatings enable industry to use a much wider variety of cotton while obtaining comparable end-product performance. Thus short-fiber cotton can be used with performance comparable to long-fiber cotton. This advance means that higher-yielding cotton and cellulosic fibers can be utilized for quality durable garments even if the fibers are of lower grade. It has enabled Center collaborators such as Elkay Chemicals to better understand and design silicone chemistries. The impact of this approach on the environment and ecology is significant, as much wider areas of farmland can be made available for cotton farming. Economic Impact: This research has produced significant economic impacts in a broad area of surfactant science through applications in the chemical, cosmetics, mineral, petroleum, and pharmaceutical industries. For more information, contact P. Somasundaran, 212.854.2926, ps24@columbia.edu.
image of a horizontal line
Advances in Basic Science of Skin Cleansing
Washing and cleansing can be a damaging process for skin, so the choice of surfactant is critical to minimizing damage. Insights into surfactant blends are critically important in deciding which blends to incorporate into product formulations. Fundamental research at the Center for Advanced Studies in Novel Surfactants (CASNS) on surfactant binding to proteins has benefited the development of skin cleansing and skin care products at a center sponsor's main R&D center for skin research. One of the key factors is surfactant micelle charge density, which affects the irritation potential of surfactants. The goal is to minimize charge density only up to a point, because with zero charge density, as in non-ionic surfactants, there is usually not enough lather and foam to satisfy customer demand. Formulation science focuses on how to blend components to achieve the desired balance. Researchers at CASNS have examined how surfactants bind to proteins. They have investigated how and when protein denaturation occurs, how the surfactant binds, how reversible the binding is, and how it may be affected by variables such as cleanser pH and temperature. Economic Impact: Insights gained from this research are being used by a center sponsor in skin cleansing and skin care products. For more information, contact P. Somasundaran, 212.854.2926, ps24@columbia.edu. image of 2 leaves, one treated with mild surfactant and left to dry for 48 hours. The treated leaf looks normal while the untreated leaf has dried up.
Center for Advanced Vehicle Electronics (CAVE3)
Auburn University, Pradeep Lall, Director, 334.844.3424, lall@auburn.edu
Center website: http://cave.auburn.edu/
image of a horizontal line
New Experimental Techniques to Study Solder Materials and Processes
a microscopic image of a molten solder ball Work in CAVE3 has led to the development of several innovative new experimental techniques to study solder alloys. A unique scanning electron microscope has been developed that allows real-time, in-situ studies of the melting, wetting, and spreading of solder alloys and pastes. The system allows microscopic observation of the advancing molten solder with simultaneous analysis of alloy-substrate chemical reactions during wetting. It is highly unusual to undertake studies of liquids in expensive, high-performance vacuum systems because of potentially high vapor pressures and flux outgassing. Results from this novel facility have especially benefited CAVE3 industrial sponsors who use solder materials and technology. In addition to the ability to study molten solders, CAVE3 is the first organization to develop a scanning electron microscope that measures strains in materials during repetitive temperature cycling, such as that common in under-the-hood electronics. A third unique apparatus at CAVE3 is a custom-made surface analysis system that enables in-situ studies of surface segregation during melting and wetting processes. Economic Impact: The ability to study fundamental properties of electronic materials in-situ has reduced the development costs associated with new electronic platforms. In the absence of these new experimental methods, significant system-level testing would have to be undertaken to validate material performance in electronic manufacturing processes. The reduced material development time is expected to result in faster time to market. For more information, contact Pradeep Lall, 334.844.3424, lall@auburn.edu.
image of a horizontal line
Lead-Free Solder Alloys for Harsh Environment Applications
Determining viable lead-free alternatives is necessary for long-life electronic systems such as defense and industrial electronics. Researchers at the Center for Advanced Vehicle and Extreme Environment Electronics (CAVE3) have developed a number of important methodologies related to electronics assembly with lead-free solder alloys. This research has international significance due to the bans on the element lead (Pb) enacted in the EU and Japan during 2006. The problem with lead-free electronics is not a dearth of alloys; rather, many alternative alloys have been proposed, most of which are not well understood in terms of their performance and reliability in extreme environment applications. Harsh-environment systems require long, extended periods of operation under extreme conditions. CAVE3 has become one of the first organizations to intensively study the materials science, mechanical behavior and solder joint reliability of the leading candidate lead-free solder alloys formulated from tin, silver and copper. a microscopic image of tin whiskers An innovative approach to lead-free solder prognostics has been established that allows users to estimate the remaining useful life of solder joints. Researchers at CAVE3 are leading efforts to characterize the aging effects in lead-free solder alloys that result in unexpected degradation of lead-free solder joints in extreme environments. This work has helped CAVE3's industrial partners stay ahead of their competition in their respective technological areas. The research has demonstrated not only what will work but, more importantly, what will not work. By not wasting time on dead-end research, CAVE3 has helped member companies narrow the options to cost-effective and reliable alternative solders that can be used in commercial, industrial, and military electronics for extended years of service. Economic Impact: Automotive systems often must survive 10 years and 100,000 miles. Defense systems require 15-20 years of operation with extended periods of storage. The transition to lead-free electronics will result in long-term obsolescence of leaded electronic components. Achieving longer-life electronic systems relies on lead-free alternatives and mixed-technology assemblies. Techniques developed at CAVE3 should reduce system sustainment costs to the benefit of industry and the nation.
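For readers unfamiliar with solder fatigue modeling, the sketch below shows the general form such life estimates often take. It uses the classical Coffin-Manson relation, not CAVE3's prognostic method, and the constants are assumed purely for illustration.

# Classical Coffin-Manson low-cycle fatigue relation (illustrative):
# N_f = C * (delta_eps_p) ** (-n), where delta_eps_p is the plastic
# strain range per thermal cycle; C and n are alloy-dependent and must
# be fitted from test data (values below are assumed, not measured).
def cycles_to_failure(delta_eps_p, c=0.3, n=2.0):
    return c * delta_eps_p ** (-n)

# Harsher thermal swings (larger plastic strain per cycle) cut life fast:
for strain in (0.005, 0.01, 0.02):
    print(strain, round(cycles_to_failure(strain)))  # 12000, 3000, 750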
Center for Autonomic Computing (CAC)
A CISE-funded Center
University of Florida, Jose Fortes, 352.392.9265, fortes@ufl.edu
Rutgers University, Manish Parashar, 732.445.5388, parashar@caip.rutgers.edu
University of Arizona, Salim Hariri, 520.621.4378, hariri@ece.arizona.edu
Mississippi State University, Ioana Banicescu, 662.325.7508, ioana@cse.msstate.edu
Center website: http://www.nsfcac.org/
image of a horizontal line
Autonomic Critical Infrastructure Protection System (ACIP)
A recent Forrester survey reported that 75% of organizations experienced Distributed Denial of Service (DDoS) attacks even though they had implemented cybersecurity solutions. One third of the organizations that were attacked experienced service disruption as a result. The problem is that many of these solutions are ineffective against novel and well-organized attacks. The nation's critical energy infrastructures (power, water, gas and oil) are modernizing their industrial control systems to build what are referred to as "Smart Grids," which use advanced computing and communications technologies so that the grids can operate far more robustly and efficiently. These developments have created serious cybersecurity problems because of the widespread use of Supervisory Control and Data Acquisition (SCADA) systems that were never designed with security in mind. Consequently, SCADA systems have become a prime target for cyberattacks because of the profound and catastrophic impacts such attacks can have on our economy and on all aspects of our lives. In fact, critical infrastructures have expanded to include not only the energy infrastructures but also many process control systems, networks and infrastructures, of which approximately 85% are privately owned. a drawing of the test bed configuration and communication between systems Motivated by this need, researchers at the Center for Autonomic Computing (CAC) and industrial center members (Raytheon and AVIRTEK) are collaborating to develop an innovative cybersecurity approach based on autonomic computing technology. It is analogous to the human nervous system: computing systems and applications can be self-configured, self-optimized, self-healed and self-protected with little involvement from users or system administrators. CAC has successfully developed and implemented an Autonomic Critical Infrastructure Protection (ACIP) appliance, which is currently being tested and evaluated. This involves evaluating the appliance's self-protection capabilities using an industrial process control test bed that offers multiple capabilities for both hardware and software experimentation. This breakthrough technology is validating the thesis that autonomic paradigms have the potential to detect and mitigate cyber threats launched against industrial control systems. Responding faster than a human operator could, it can effectively immunize SCADA systems and their associated control elements against cyber malware and mitigate the effects of control element failures until human operators can take control. Economic Impact: A study conducted by the same group, Forrester Consulting, indicated that organizations that provide online services as their core business stand to lose millions of dollars per hour when their services are down. The ACIP technology, when fully matured, can be exploited by Western societies to immunize critical infrastructures against being targeted by malcontents and terrorists. Enhancing the nation's ability to provide undisrupted electric power, clean potable water, transportation and other necessary societal support services saves lives, preserves domestic tranquility and helps protect industry's and the nation's economic vitality. For more information, contact Salim Hariri, 520.621.4378, hariri@ece.arizona.edu.
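The self-protection idea can be pictured as a monitor-analyze-plan-execute loop, the structure commonly used in autonomic computing. The Python sketch below is purely illustrative and is not ACIP code; the sensor names, baselines and tolerances are all assumed.

# One pass of a toy autonomic self-protection loop.
def autonomic_step(read_sensors, baseline, tolerance, isolate, alert):
    readings = read_sensors()                                # monitor
    anomalies = [name for name, value in readings.items()   # analyze
                 if abs(value - baseline[name]) > tolerance[name]]
    for channel in anomalies:                                # plan + execute:
        isolate(channel)                                     # quarantine the channel
    if anomalies:
        alert(anomalies)                                     # notify human operators

# Toy usage with hypothetical process-control channels:
readings = {"valve_7_pressure": 9.8, "pump_2_rpm": 3100}
autonomic_step(lambda: readings,
               baseline={"valve_7_pressure": 5.0, "pump_2_rpm": 3000},
               tolerance={"valve_7_pressure": 1.0, "pump_2_rpm": 500},
               isolate=lambda ch: print("isolating", ch),
               alert=lambda chs: print("operator alert:", chs))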
image of a horizontal line
Using CometCloud for Managing Scientific and Business Workflows on Multiple Clouds
Rutgers and Xerox collaborated to develop and deploy an innovative workflow management framework for federated cloud infrastructure using the CometCloud autonomic cloud engine. Public clouds have emerged as an important solution that enables the renting of resources on demand and supports a pay-as-you-go pricing policy. Furthermore, private clouds or data centers, which cater to a restricted set of users within an organizational domain, are exploring the possibility of scaling out to public clouds to respond to unanticipated resource requirements. As a result, dynamically federated, hybrid cloud infrastructures, such as those enabled by CometCloud, which integrate private clouds, enterprise data centers and grids, and public clouds, are becoming increasingly important. An enterprise workflow typically consists of an ordered set of heterogeneous applications, each of which may have specific constraints on resource requirements, performance, completion time, cost, privacy, etc. an image of clouds, representing cloud computing The breakthrough enabled Rutgers and Xerox to demonstrate the following capabilities for the first time in a single framework: 1) Dynamic cloud federation - managing resources across multiple private and public clouds in order to dynamically scale the execution of application workflows up, down, and/or out, according to high-level policies; 2) Programming management - resource scheduling and provisioning within the federated cloud infrastructure based on application requirements as well as system resource capabilities and availability, within cost, time, and performance constraints, and; 3) Workflow deployment - deployment of real-world enterprise application workflows from Xerox executing on a federated cloud infrastructure. Specifically, the hybrid infrastructure used in the demonstration dynamically integrated private clouds at Rutgers and ACS with the Amazon EC2 public cloud. Such an autonomic workflow framework can dynamically select an optimal mix of resource classes (cloud or grid provider, types of nodes, number of nodes, etc.) based on application QoS and resource requirements, user policies, and constraints. The workflow framework can also monitor the execution of the application services within the workflow, and can adapt both the resource provisioning and the services to ensure that the application requirements and user constraints continue to be satisfied. Adaptations may involve scaling resources up, down or out within the federated cloud infrastructure, and can allow the system to handle unanticipated situations such as workload bursts, system performance degradation, or resource failures. Economic Impact: This technology has the potential to reduce computational costs and improve the efficiency of cloud computing service centers by enabling the construction of hybrid cloud infrastructures. These infrastructures can support heterogeneous and dynamic workloads and on-demand cloud bridging. Federated cloud infrastructures also provide opportunities to improve application quality of service and lower cost by mapping the applications of scientific or business workflows to appropriate resource providers. For more information, contact Manish Parashar, 732.445.5388, parashar@rutgers.edu.
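To make the idea of selecting a resource class under cost, time and performance constraints concrete, here is a minimal sketch. It is not CometCloud's API, only the shape of the decision; the resource classes, prices and limits are invented for illustration.

# Pick the cheapest resource class that meets deadline and budget.
def pick_resource(classes, cpu_hours, deadline_h, budget):
    feasible = []
    for c in classes:
        runtime = cpu_hours / (c["nodes"] * c["speedup"])    # wall-clock hours
        cost = runtime * c["nodes"] * c["price_per_h"]       # total dollars
        if runtime <= deadline_h and cost <= budget:
            feasible.append((cost, runtime, c["name"]))
    return min(feasible) if feasible else None               # cheapest feasible

private = {"name": "private_cloud", "speedup": 1.0, "price_per_h": 0.02, "nodes": 32}
public = {"name": "public_ec2", "speedup": 1.5, "price_per_h": 0.40, "nodes": 256}
print(pick_resource([private, public], cpu_hours=2000, deadline_h=8, budget=600))
# -> scales out to the public cloud: the private cloud misses the deadline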
image of a horizontal line
Demand-driven Service and Power Management in Data Centers
Power consumption represents an increasingly significant percentage of the cost of operating large data centers. These data centers are used by banks, investment firms, IT service providers, and other large enterprises. One approach to reducing power consumption is to keep machines in standby or off modes except when the data center workload requires them to be fully on. This approach depends on being able to monitor performance, workload or resource demands and to anticipate the need for resources in order to meet the service-level agreements of the users who generate the workload. image of computer servers in a room The results of this project include: mechanisms to monitor, model and predict the workload associated with individual services, to model and predict global resource demand, and to dynamically allocate and de-allocate virtual machines to physical machines; management methods based on control theory and/or market-based approaches; mechanisms to minimize the cost of providing individual services while globally minimizing power consumption and delivering contracted service levels; and development and evaluation of software that implements these methods. Ongoing experimental evaluations on an IBM BladeCenter have shown that the proposed approach can efficiently and stably reduce thermal hotspots, power consumption and the performance degradation caused by virtual machine consolidation, while balancing conflicting objectives. Economic Impact: Annual energy and administration costs associated with today's data centers amount to billions of dollars: power and cooling costs are increasing alarmingly, by an estimated eightfold in recent years, and are becoming the dominant part of IT budgets. The high energy consumption of modern data centers also translates into excessive heat dissipation, which, in turn, increases cooling costs and server failure rates. One of CAC's main research thrusts aims to address this problem because doing so can lower the cost of ownership of data centers in all sectors of today's economy. It can also increase the reliability of the infrastructure that provides critical services. For more information, contact Jose Fortes, 352.392.9265, fortes@ufl.edu.
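As a toy illustration of demand-driven power management, the sketch below keeps powered-on capacity tracking predicted load. It is not the project's software; the per-machine capacity, hourly loads and headroom factor are assumed.

import math

# Keep only as many physical machines on as predicted demand requires.
def machines_needed(predicted_load, per_machine_capacity, headroom=0.2):
    """predicted_load and capacity in the same units (e.g., requests/s)."""
    return math.ceil(predicted_load * (1 + headroom) / per_machine_capacity)

# Simple demand-driven schedule over eight hourly load predictions:
hourly_load = [120, 90, 60, 80, 300, 520, 610, 480]
plan = [machines_needed(load, per_machine_capacity=100) for load in hourly_load]
print(plan)  # [2, 2, 1, 1, 4, 7, 8, 6] -- idle machines go to standby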
Center for Building Performance and Diagnostics (CBPD)
Carnegie Mellon University, Volker Hartkopf, Director, 412.268.2350, hartkopf@cmu.edu
Center website: http://www.arc.cmu.edu/cbpd/index.html
image of a horizontal line
The Robert L. Preger Intelligent Workplace (IW)
image of a building The Preger Intelligent Workplace (IW) is a living laboratory designed and engineered by the Center for Building Performance and Diagnostics (CBPD) in close cooperation with architects, engineers and the Advanced Building Systems Integration Consortium. The IW has functioned as a living (frequently adapted and updated to incorporate new materials, components, and systems) and lived-in laboratory that is occupied by CBPD faculty, staff and students. The IW pioneered a focus on hands-on integrated research involving robust innovative systems for multiple performance goals. It has partnered with over 50 industries to develop advanced integrated approaches to lighting, mechanical, structural, and interior systems. The IW has pioneered the concept of integrating horizontal load-bearing structure, mechanical ducting, cabling for power, communication and controls, and floor-based infrastructures to support ongoing spatial dynamics. This facility and work have resulted in unprecedented levels of user accessibility, organizational flexibility, and technological adaptability, while eliminating the concept of obsolescence and material waste. The IW testbed has led to the development of Air Conditioning, Heating and Refrigeration Institute publications and has influenced innovative buildings internationally and across the US. image of a building interior The IW living laboratory also demonstrates the advantages of and opportunities for hybrid conditioning. Hybrid conditioning integrates daylighting with artificial lighting, natural ventilation with mechanical conditioning, and passive with active heating and cooling strategies, all of which maximizes indoor environmental quality at the lowest energy cost with minimal material resource use. The energy value of hybrid conditioning represents over 50% of heating, cooling and lighting energy use in buildings. The capabilities of the lab have resulted in a series of ongoing research projects with industry, from Gartner facades to Zumtobel lighting, PPG glass and Alcoa aluminum, Steelcase furniture, Armstrong ceilings, Carrier mechanical systems and Johnson Controls. The most recent collaboration involves an American Recovery and Reinvestment Act (ARRA) project led by Siemens controls dedicated to profiling control systems to achieve 40% energy savings in existing buildings. Economic Impact: The economic impact of this innovation in building systems and systems integration for performance is multi-dimensional - impacting energy and operational costs, system reliability, product market share and the quality of indoor environments for building occupants. Owens Corning quantified the benefits of the floor-based infrastructures and flexible interiors introduced in its Toledo, Ohio headquarters at over 300,000 USD per year. In separate CBPD efforts focused on building enclosure and mechanical system innovations, the Beijing energy-efficient office building of the Ministry of Science and Technology recorded a 60 percent reduction in peak cooling load due to the design and engineering involvement of the Center. The impact of this "living laboratory" is worldwide. The IW led to the Adaptable Workplace Laboratory at GSA Headquarters, the Laboratory for the Design of Cognition at Electricite de France, and the Building Energy Research Center at Tsinghua University. It created the impetus for comparable labs at the University of British Columbia, Syracuse University and Purdue University.
The R&D that occurs in these labs fosters the development of advanced technologies and integrated systems and educates students and professionals to ensure the more rapid introduction of architectural building innovations into the marketplace. For more information, contact Volker Hartkopf, 412.268.2350, hartkopf@cmu.edu.
image of a horizontal line
National Environmental Assessment Toolkit (NEAT™)
image of NEAT. Looks like industrial equipment on wheels The National Environmental Assessment Toolkit (NEAT™) combines portable instrumentation with questionnaires and expert walkthroughs to create robust baseline assessments of thermal, visual, acoustic, and air quality in the workplace. Development of NEAT continues with direct support from the General Services Administration and corporate and industry partners for before-and-after field evaluation of cutting-edge buildings and federal facilities nationwide. Most post-occupancy evaluation (POE) is subjective only, with facility manager and user satisfaction questionnaires attempting to capture the perceived quality of the building. NEAT studies combine online and on-site user satisfaction questionnaires with objective measures of indoor environmental quality, through substantial on-site measurements, as well as capturing the technical attributes of building systems (TABS) to ensure that conclusions are linked to system design decisions. The CBPD team has developed robust data collection techniques, GIS-based data records, and innovative data analysis tools, from scatter plots to an environmental "EKG for buildings," that can be linked to the quality of building systems and facilities use and management. The NEAT field studies are central to informing critical investments for improving indoor environmental quality and to building the business case for high performance buildings by linking facility management costs, health, and productivity to indoor environmental quality. Economic Impact: For the General Services Administration, the National Environmental Assessment Toolkit (NEAT) has been used to evaluate the environmental quality of federal workplaces, the technical attributes of the building systems, and employee satisfaction with the quality of the work environment. Informing renovation programs, specifications and change management, the NEAT studies also contribute to before-and-after records of improvements in environmental conditions and user satisfaction that are critical to justifying investment in the quality of the built environment. Compelling quantitative and qualitative results have led utilities and corporate building owners to undertake NEAT studies of their existing facilities in preparation for renovation or new building construction. The value of investing in quality work environments can be measured in energy savings, with 50% average reductions; in decreased adverse human health outcomes, from asthma to headaches to eyestrain, with 20% average reductions; and in productivity and performance improvements ranging from 2 to 20%. For more information, contact Azizan Aziz, 412.268.6882, azizan@cmu.edu.
image of a horizontal line
Building Investment Decision Support Tool (BIDS™)
Building operations (heating, cooling, lighting and ventilation) consume almost 40% of U.S. primary energy and 67% of the nation's electricity. The EPA estimates that sick building syndrome (SBS) costs the US economy in excess of 60 billion USD per year. On top of all this, building waste is the largest contributor to landfills. BIDS™ is a case-based cost-benefit analysis tool to support investments in advanced and innovative building systems that improve environmental quality, health and productivity in buildings. The tool provides life-cycle and return-on-investment frameworks that take into account the energy conservation, productivity, human health, and organizational effectiveness results of best practices. Through industry and federal support, the CBPD continues to identify laboratory and field case studies demonstrating the relationship of high-performance components, flexible infrastructures and systems integration to a range of over 400 cost-benefit or productivity indices. The tool also relates quality indoor environments to major capital cost and benefit considerations, including productivity, health, and operations costs, with baseline data sets to support life-cycle decision-making. Through extensive national and international lecturing and the robust web-based tool, leading decision makers are better able to incorporate more efficient, high-performance HVAC systems and controls into their designs. The CBPD has established the technical and economic feasibility, as well as the environmental and social desirability, of win-win solutions that promote investments in building quality while simultaneously improving energy efficiency. image of a computer screenshot of BIDS Economic Impact: The importance of capturing the true cost of ownership and striving for long life cycles has never been higher. The 2009 McKinsey report "Unlocking Energy Efficiency in the US Economy" identifies buildings as THE most cost-effective investment for reducing our national carbon footprint, with the smallest investments yielding the highest reductions in comparison to new energy sources, industry or transportation investments. The built environment is a key factor for human health, with an average of $5,000 per worker per year spent by organizations on individual insurance to cover health care. The CBPD has identified close to $1,000 of this per year as directly tied to the quality of the indoor environment, including respiratory, musculoskeletal, headache and other chronic health concerns. Materials, component and systems choices, as well as building operations, can significantly reduce these health conditions, by 10-80%, while also reducing absenteeism and improving worker performance. The Building Investment Decision Support Tool (BIDS™) clearly identifies the relationships among building performance, occupant satisfaction, health and organizational productivity. The return on investment can be significant, at times reaching 100% in buildings that have lives of more than 30 years. Smart investments are critical to improved indoor environmental quality and to achieving the 50% energy savings attainable in the existing building market, as well as the 80% savings possible in new buildings. All three CBPD efforts - the Intelligent Workplace (IW), the National Environmental Assessment Toolkit (NEAT) and the Building Investment Decision Support Tool (BIDS) - are dedicated to this low-carbon future.
For more information, contact Vivian Loftness, 412.268.2350, loftness@cmu.edu.
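As a flavor of the cost-benefit arithmetic a tool like BIDS automates, here is a minimal sketch with entirely assumed numbers (not CBPD data): a hypothetical retrofit whose payback combines energy savings with avoided health costs.

# Simple payback: years for savings to repay the capital investment.
def simple_payback_years(capital_cost, annual_savings):
    return capital_cost / annual_savings

energy_savings = 45_000            # $/yr, assumed retrofit energy savings
health_savings = 200 * 300         # $/yr, assumed $300/occupant for 200 occupants
print(round(simple_payback_years(120_000, energy_savings + health_savings), 2))
# -> 1.14 years: health and productivity benefits can dominate the business case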
Center for Child Injury Prevention Studies (CChIPS)
University of Pennsylvania, Flaura Winston, Director, 215.590.3118, flaura@mail.med.upenn.edu
University of Pennsylvania, Kristy Arbogast, Co-Director, arbogast@email.chop.edu
Center website: www.chop.edu/cchips
image of a horizontal line
Child Injury Prevention: Enhancing Child Safety In Side-Impact Crashes
Side impact crashes account for 25% of motor vehicle crashes (MVCs) but represent more than 40% of MVC-related injury costs. Toward discovering better ways to protect children in side-impact crashes, a project at the Center for Child Injury Prevention Studies (CChIPS) by Kristy Arbogast sought to document the probable contact points in the vehicle interior for children in child restraint systems (CRS) during side-impact MVCs. Two in-depth crash investigation databases, the Crash Injury Research and Engineering Network and the Partners for Child Passenger Safety Study, were queried for rear-seated, CRS-restrained children ages 0 to 8 years in side impact crashes who sustained clinically important injuries. A multidisciplinary team of physicians and engineers reviewed the cases to describe injury patterns, injury causation, and the vehicle components contributing to the injuries; 41 occupants met the inclusion criteria (average age 2.6 years), with 24 seated near the side of the crash, 7 seated on the far side, and 10 seated in the center. The most common injuries were to the skull and brain, with a greater proportion of skull fractures occurring with increasing age. Lung contusions and spinal injuries were also reported. Near-side head and face contact points occurred along the rear vertical plane of the window and the horizontal plane of the windowsill. Head and face contact points for center- and far-side occupants were along the edges of the front seat back and front seat head restraint. image of a computer program designed to evaluate contact points in a vehicle's interior Economic Impact: Study results will inform innovations in vehicle occupant protection and child restraint designs, features and products. In particular, the findings will lead to new-generation child restraints with side wings and to energy management features on vehicle door interiors to reduce injuries from MVCs involving children placed in CRS. In addition, the findings could guide enhancements to the safety standards that govern restraint performance and occupant protection, ensuring that the resources invested by manufacturers in meeting such standards will lead to safer vehicles, child restraint systems and protection for young occupants in the event of a crash. The results of this project should ultimately lower health care and insurance costs, as side-impact crashes represent a significant portion of MVC-related injury costs. For more information, contact Flaura Winston or Kristy Arbogast, 215.590.3118.
image of a horizontal line
Collaborative Review of Children's Injuries in Motor Vehicle Crashes
Road traffic injury remains the leading cause of death and acquired disability among children. Multi-disciplinary teams of university-based engineers, scientists, and physicians must collaborate with peers in government and industry to investigate crashes, determine mechanisms of injury, and develop safety technology to prevent similar injuries in the future. Multi-disciplinary expertise is needed to review the circumstances of each crash, yet this expertise is seldom available at any one institution. The result is a need for remote collaboration, which means that sensitive information from multiple sources must be shared via secure transmission lines with strictly controlled access. With funding from the National Science Foundation and guidance from I/UCRC Industrial Advisory Boards, a team of researchers from two I/UCRCs, the Center for Child Injury Prevention Studies at The Children's Hospital of Philadelphia and University of Pennsylvania and the Center for Autonomic Computing at the University of Florida, developed a networked system for remote, collaborative review of the mechanisms of injury to children in motor vehicle crashes. Referred to as Telecenter, this innovative application of information technology enables: 1) distributed, asynchronous collection of the digital content needed for crash case reviews, with consistent organization of content across cases; 2) secure, Web-based, remote participation in review meetings with multi-media sharing of case content via visual images, real-time written and oral communication, and use of Web resources, and; 3) archiving for post-review access and follow-up involving statistics, search and networking. The Telecenter system's design supports conferencing and remote image-sharing. Its capabilities extend beyond existing solutions via: workflow and content organization that is well-suited to traffic injury reviews; spatio-temporal, role-based access control; distributed content management; and seamless integration of services. Economic Impact: Telecenter was pilot tested to enhance the quality and value of National Highway Traffic Safety Administration (NHTSA) crash injury case reviews through the inclusion of remotely located experts without the burden of additional travel costs. Further leveraging the investment in Telecenter, an adaptation was developed within another public sector: health. Telecenter was reconfigured to meet the needs of state-mandated Child Death Review teams. Similar to crash investigation reviews, Child Death Review teams require multidisciplinary expertise in order to determine how and why children die and to plan actions to prevent future child deaths, but this expertise might not be available locally. Initial real-world results demonstrated that Telecenter for Child Death Review could help states enhance the quality of reviews without the financial burden of travel for experts, while improving efficiency in the timely transfer of information to those who can implement actions to improve the health and safety of children. The collaborative nature of this project spurs innovation, as it promotes involving the appropriate assortment of people on specific projects. For more information, contact Flaura Winston or Kristy Arbogast, 215.590.3118.
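To illustrate the spatio-temporal, role-based access control mentioned above, here is a minimal Python sketch. It is not Telecenter code; every role, site, case identifier and time window in it is hypothetical, and a real system would add authentication, auditing and finer-grained permissions.

from datetime import datetime

# Grant access only if role, case, review time window and site all match.
def access_allowed(user, resource, now, policies):
    for p in policies:
        if (user["role"] in p["roles"]
                and resource["case_id"] in p["cases"]
                and p["start"] <= now <= p["end"]
                and user["site"] in p["sites"]):
            return True
    return False

policy = {"roles": {"physician", "engineer"}, "cases": {"case-0042"},
          "start": datetime(2012, 5, 1, 13), "end": datetime(2012, 5, 1, 15),
          "sites": {"CHOP", "UF"}}
user = {"role": "physician", "site": "CHOP"}
print(access_allowed(user, {"case_id": "case-0042"},
                     datetime(2012, 5, 1, 14), [policy]))  # True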
Center for the Design of Analog/Digital Integrated Circuits (CDADIC)
Washington State University, John Ringo, Director, 509.335.5595, ringo@wsu.edu
Washington State University, Joanne Buteau, University-Industry Corporate Relations, 509.335.5379, jbuteau@wsu.edu
University of Washington, Bruce Darling, 206.543.4703, darling@ee.washington.edu
Oregon State University, Pavan Hanumolu, 541.737.2178, hanumolu@eecs.oregonstate.edu
Center website: http://www.cdadic.com/
image of a horizontal line
Noise-Coupled Analog-to-Digital Data Converters
As wireless and wired communication and digital broadcasting proliferate, there is increasing demand for wideband analog-to-digital data converters (ADCs). The signal bandwidth requirement becomes more stringent in direct conversion receivers. Along with wide signal bandwidth, high dynamic range and linearity are also required in these applications. This performance should be achieved in a power-efficient way, since power dissipation determines battery life in mobile devices. Delta-sigma ADCs can deliver high performance with low power consumption over wide signal bandwidths, and hence this is the ADC architecture of choice in many wired and wireless receivers. Under a CDADIC-funded project, researchers developed a novel delta-sigma ADC based on noise coupling that provides excellent linearity and power efficiency for wideband communication devices and cell phones. image of a cell phone Economic Impact: Noise-coupled converters allow the translation of analog signals into digital form with less distortion and lower power requirements than earlier circuits, and will result in less expensive mixed-mode structures. Particularly strong economic impact may be expected in the cost-performance ratio of wideband battery-operated systems, including cellular telephones, digital radios, and other wireless devices. Since these products represent a significant percentage of the annual sales of electronic devices, the economic effect may be large. The integrated circuit industry will particularly benefit from this innovation. For more information, contact Gabor Temes, 541.737.2979, temes@eecs.oregonstate.edu.
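For readers new to delta-sigma conversion, the sketch below simulates a generic first-order delta-sigma loop to show the basic noise-shaping idea; it is illustrative only and is not the noise-coupled architecture developed under this project.

# A 1-bit, first-order delta-sigma modulator: the quantization error is
# fed back through the integrator, pushing it toward high frequencies.
def first_order_delta_sigma(samples):
    integrator, bits = 0.0, []
    for x in samples:                        # input x in [-1, 1]
        feedback = 1.0 if bits and bits[-1] else -1.0
        integrator += x - feedback           # integrate the error
        bits.append(integrator >= 0.0)       # 1-bit quantizer
    return bits

ramp = [i / 50.0 - 1.0 for i in range(100)]  # slow ramp from -1 toward +1
bits = first_order_delta_sigma(ramp)
print(sum(bits[:50]), sum(bits[50:]))        # density of 1s tracks the input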
image of a horizontal line
Low Voltage Analog Circuits in CMOS
Transistor and voltage scaling are important elements in the continuation of digital advancement. While such scaling brings significant advantages to digital circuit design, analog circuit designers face formidable low-voltage design challenges. This research seeks new ways to overcome these scaling and low-voltage challenges by exploring techniques that maximize signal swing and compensate for the inherent accuracy limitations of analog signal processing. Another important consideration is time and how to make the best use of it. Because of the fast transistors that come with scaling, time-domain information is becoming more precise. Thus, this research considers ways to trade time information for what has traditionally been handled as voltage. These efforts have led to novel circuit techniques, such as Correlated Level Shifting, which allows signal swing beyond the supply voltage, and the Integrating Quantizer, which makes use of time-domain information to provide an extra order of quantization noise shaping. image of a hearing aid Economic Impact: The current and future results from this CDADIC research effort will make higher-accuracy and lower-power analog/interface integrated circuits possible. With successful advancement in higher accuracy and lower power, meaningful progress will be made in medical applications (hearing aids, pacemakers, and other medical devices), emerging consumer markets (high-speed and long-battery-life wireless mobile devices), and other applications (smart sensors, aerospace, military). For more information, contact Un-Ku Moon, 541.737.2051, moon@eecs.oregonstate.edu.
image of a horizontal line
Low-power Data Converters
Data converters provide the translation between the analog signals of the real world and the digital signals processed by computers. Hence, they play a key role in cellular telephones, digital television, CD and DVD players, and many other telecommunication and consumer electronics applications. More recently, data converters are also playing increasingly complex and demanding roles in diagnostic medical devices used in EKG, EEG and ultrasound monitors, hearing aids, brain stimulators, and other applications. In such applications, in addition to the usual requirements for speed and accuracy, low power dissipation plays a crucial role, since power is often supplied by batteries or scavenged from the environment. This research has resulted in important technological advances in this field. image of a computer chip Economic Impact: CDADIC researchers have been successful in finding novel architectures and algorithms that provide excellent trade-offs between the accuracy, speed and power requirements of low-power data converters. Research results from this work are being incorporated into the design of multi-sensor networks used to monitor brain waves, heartbeats and other biomedical signals. For more information, contact Un-Ku Moon, 541.737.2051, moon@ece.orst.edu or Gabor Temes, 541.737.2979, temes@ece.orst.edu.
image of a horizontal line
Modeling and Design of Integrated Circuit Protection Systems
CDADIC research has led to the development of new circuit designs and new compact modeling methods for protecting integrated circuits against the effects of electrostatic discharge (ESD) and electrical overstress (EOS). These problems have most commonly been dealt with by trial and error, but new compact models and simulation tools can now predict the current pathways that an ESD or EOS pulse will take on a chip, and then evaluate the robustness of the design in dissipating the energy of the errant pulse. This can eliminate much of the guesswork in ESD/EOS design, and can help bring products to market faster by evaluating ESD/EOS robustness prior to fabrication. ESD protection is particularly difficult for sensitive RF circuits. New circuit protection systems are therefore also being designed for RF front-end circuits, such as those used in cell phones and other wireless systems. ESD/EOS protection is predicted to become an even greater challenge for the newest generations of 22 nm and 16 nm CMOS processes. Thermal effects in these devices are also being studied to better predict their energy dissipation tolerance and susceptibility to ESD/EOS transients. This will lead to improved simulation models and better ESD/EOS-tolerant designs. image of a drawing of computer circuits Economic Impact: Approximately 0.5 percent of all integrated circuits fail, and of these failures, approximately two-thirds are caused by ESD/EOS. For an annual worldwide semiconductor market of around $300B, ESD/EOS failures thus constitute roughly a $1B annual problem. The most demanding environments for ESD/EOS stresses involve analog, mixed-signal, and RF integrated circuits, for which simulation tools have the least well-developed circuit models. ESD/EOS protection systems for these types of ICs necessarily require a greater level of customization, and accurate ESD/EOS circuit simulation can play a significant role in producing more reliable IC designs. For more information, contact Bruce Darling, 206.543.4703, bruced@u.washington.edu.
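The market figure above follows directly from the cited rates; a one-line check with the same round, assumed figures:

market = 300e9        # annual worldwide semiconductor sales, dollars
failure_rate = 0.005  # ~0.5% of all ICs fail
esd_share = 2 / 3     # ~two-thirds of those failures are ESD/EOS
print(market * failure_rate * esd_share)  # ~1.0e9 dollars per year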
image of a horizontal line
PIN-Diode-Based Phase Shifter in Silicon Germanium
Phased array antennas (PAAs) are critical for next-generation satellite radios, broadband Internet, and GPS systems. A PAA consists of tens to thousands of individual, identical antenna elements. Each element consists of an antenna, or radiator, and associated electronics that amplify and phase-shift the signal at that element. The primary factor limiting broader use of PAAs has been their high cost, which is driven by the cost of the element electronics. This CDADIC research has successfully reduced the cost of an important electronic functional block used in each PAA element: the phase shifter. Working with one of CDADIC's aerospace partners, center researchers have developed and modeled a PIN diode switch in silicon germanium (SiGe) Bipolar/Complementary Metal Oxide Semiconductor (BiCMOS) technology. They have included this switch in an integrated, high-performance phase shifter. Over the years, this team has enhanced the linearity, isolation, and loss performance of the PIN diode and applied the improvements to implement multiple channels and multiple beams, resulting in an integrated, lightweight, low-power solution for PAAs. This has enabled the development of fully integrated PAA electronics in a single SiGe BiCMOS integrated circuit. The result is lower-cost PAAs with higher performance. image of a satellite circling the earth Economic Impact: Based on these research results, The Boeing Company is now using the developed technology to implement various beamformers in defense and commercial applications based on the PIN diode phase shifter for high linearity. Linear Signal, Inc. is also using this research for various high-performance beamformers based on PIN diode phase shifters in silicon technology and providing these results to its customers in the commercial and defense domains. These low-cost electronics will dramatically lower the cost and enhance the performance of mobile satellite television and Internet for laptops, automobiles, boats, aircraft, and more. They will also enable truly reliable, service-level-agreement (SLA) backed services from satellite providers by providing self-correcting antennas that can repoint to backup satellites during transponder saturation or even thunderstorms. For more information, contact Deuk Heo, 509.335.1302, dheo@eecs.wsu.edu.
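The per-element phase shifts that steer a beam follow standard array geometry, which shows why every element needs its own phase shifter. The sketch below is generic textbook math, not the CDADIC circuit; the frequency, spacing and steering angle are assumed for illustration.

import math

# For a uniform linear array, element n needs a phase of
# n * k * d * sin(theta), with k = 2*pi/wavelength and d the spacing.
def element_phases(n_elements, spacing_m, wavelength_m, steer_deg):
    k = 2 * math.pi / wavelength_m
    shift = k * spacing_m * math.sin(math.radians(steer_deg))
    return [math.degrees(n * shift) % 360 for n in range(n_elements)]

# Half-wavelength spacing at ~12 GHz (wavelength 25 mm), beam steered 20 deg:
print([round(p, 1) for p in element_phases(4, 0.0125, 0.025, 20)])
# -> [0.0, 61.6, 123.1, 184.7] degrees across the four elements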
image of a horizontal line
Coupling Suppression in Integrated Circuits using Dummy Metal Fill
image of an airplane flying by a control tower Metal fill is required by semiconductor foundries to achieve planarization of the dielectric and metal layers. The added metal fill can affect the electrical performance of an IC, especially in high-speed or RF applications. These impacts are not well characterized or captured in IC design tools. This project addresses these issues, enabling designers to accommodate, and even leverage, these effects to their advantage. RF designs become extremely sensitive to all parasitic effects related to the metal layers at frequencies above 5 GHz. In the past, metal fill rules were accommodated for some designs by adding chip area around the perimeter whose only purpose was to increase the percentage of area covered by metal to meet the fill rules. This wastes significant silicon, impacting cost and miniaturization. This research makes it easier for designers to plan metal fill within the boundaries of a minimum-sized chip while understanding the impact of the metal fill, and even to turn the associated parasitic effects to advantage in their circuit designs. RFICs and high-speed digital integrated circuits used in a wide range of end products will benefit through greater miniaturization, reduced design effort and improved performance. Phased array antennas for communications and radar applications should also be improved as a result of this work. Economic Impact: Semiconductor technology companies will benefit from this work through reduced design effort, improved performance, and greater miniaturization of their RF and high-performance mixed-signal integrated circuit products. The research is expected to strengthen the leadership of US semiconductor technology companies in RF and high-performance mixed-signal IC design and manufacturing, which directly impacts the large and growing economic sector of wireless communications. For more information, contact Andreas Weishaar, 541.737.3153, andreas@eecs.oregonstate.edu.
image of a horizontal line
Low-Cost MIMO Transceivers Using CMOS Technology
CDADIC researcher Dave Allstot is developing low-cost multiple-input multiple-output (MIMO) transmit/receive systems on monolithic microwave integrated circuit (MMIC) chips based on fine-line CMOS technology. Such systems traditionally have been implemented in gallium arsenide technology, which is more expensive and does not support putting the multiple transmitter, receiver, and control functions on the same integrated circuit. Phased array transceivers, used in aerospace and satellite communications, for example, use a radio channel for each element of the array. This cost limits how widely the technology is used. Moreover, extensions of basic MIMO techniques are attractive for emerging cognitive radio systems. This research should help dramatically increase the use of MIMO transceivers in applications that are critical to the military for DOD's next-generation communications. Economic Impact: MIMO transmit/receive systems have now become a mainstream production technique in CMOS radio frequency integrated circuit design and production. Many of the techniques associated with this research are used by high-technology companies throughout the world in CMOS radio chips and are having major economic impacts. For more information, contact Dave Allstot, 206.221.5764, allstot@ee.washington.edu.
Center for Dielectric Studies (CDS)
Pennsylvania State University, Clive Randall, Director, 814.863.1328, car4@psu.edu
Pennsylvania State University, Michael Lanagan, 814.865.6992, mxl46@psu.edu
Pennsylvania State University, Susan Trolier-McKinstry, 814.863.8348, tmckinstry@psu.edu
Missouri University of Science & Technology, Fatih Dogan, 573.341.7130, doganf@mst.edu
Center website: http://www.mri.psu.edu/Centers/cds/
image of a horizontal line
Thermoelectric Materials from Ferroelectrics
Thermoelectric materials have the potential to harvest waste heat and convert it to electrical energy. This is particularly attractive for automobile exhausts. Analysis of the automotive area alone, which represents only one possible application of this research, shows that automobiles lose up to 70% of their fuel energy as heat. It is feasible that state-of-the-art heat exchangers coupled with high-ZT thermoelectric generation could yield an 8 to 10% improvement in fuel consumption. image of a graph showing temperature versus figure of merit Thermoelectric performance requires a specific combination of properties: good electrical conduction, a high Seebeck coefficient and low thermal conductivity. Ferroelectric materials have low thermal conductivity. Using heavy defect doping, CDS researchers have converted traditional insulating oxide materials, such as (Sr,Ba)Nb2O6-δ, into thermoelectrics. Some center members are following this new research area. Economic Impact: At this time, the worldwide thermoelectrics business is approximately $60 million. However, there are roadmaps in Europe, the US and Asia to use thermoelectric generators to improve car fuel efficiency between 2014 and 2026. If successful, this will expand into a $20 billion industry. For more information, contact Clive Randall, 814.863.1328, car4@psu.edu.
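For reference, the dimensionless figure of merit ZT mentioned above is conventionally defined as

ZT = S^2 σ T / κ

where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity, and T the absolute temperature. High-ZT materials therefore need exactly the combination of properties described above: strong electrical conduction and Seebeck response with weak thermal conduction.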
image of a horizontal line
An Electrical Measuring Approach Towards High Reliability Electronics
Medical therapies that rely on embedded electronics require the highest levels of reliability in electronic systems. The reliability research at the CDS gives fundamental insights into the lifetime changes that occur in capacitor devices in use. The details of these mechanisms lie in the complex interplay between thermochemical and electrochemical processes that, at the atomistic level, limit long-term stability. The techniques the Center has developed, such as site-specific impedance spectroscopy and thermally stimulated depolarization current measurements, reveal the nature of these processes and introduce methodologies that can quantify the defects that promote electrochemical degradation in use. image of an implantable cardiac defibrillator Economic Impact: Reliability testing is a must in high-end electronics. Methods such as those being developed at CDS apply due diligence to all components in embedded electronic systems and protect manufacturers and customers in an essential industry that was worth about $10.7 billion worldwide in 2010. US companies, such as Boston Scientific, have adopted center-based reliability methods to screen the components inside cardiac defibrillators. Through these techniques, suppliers and end-users such as Boston Scientific can assess manufacturing opportunities that can provide the industry with the best-performing capacitors. For more information, contact Clive Randall, 814.863.1328, car4@psu.edu.
image of a horizontal line
Increasing the Performance in Electrolytic Capacitors
To enhance capacitance and voltage performance in electrolytic Ta capacitors, companies such as Cabot Corporation, KEMET Electronics Corporation, and Greatbatch, Inc. are interested in the CDS studies on the fundamentals of anodization and the nature of the dielectric produced under different processing conditions. The point defect model has been extended to address the problems of Ta anodization and provides unique insights into the relative roles of the thermal and anodized dielectric. With these new models, it is believed that more methodologies are at hand to extend the high capacitance obtainable from high-surface-area Ta powders. In addition, for higher-voltage Ta electrolytics, understanding the details of the conduction mechanisms, with trap densities and curvatures, is helping CDS member companies design state-of-the-art capacitors for new application areas in power electronics. Economic Impact: Electrolytic capacitors impact all of the electronics industry, ranging from computers, communications and power supplies to automobile, aerospace, medical, and military applications. The Ta-electrolytic market is approximately $4 billion. For more information, contact Clive Randall, 814.863.1328, car4@psu.edu.
image of a horizontal line
New Products and Process Improvements for Passive Electronic Components
Passive electronic components have not undergone the same miniaturization as other semiconductor components. This creates important constraints in terms of space consumption on circuit boards. Efforts have been made to make these components smaller, less expensive and generally more compatible with consumer electronics. Research at the Center for Dielectric Studies (CDS) has helped researchers at AVX Corporation better understand the materials and processes used to make electronic components. AVX is a passive electronic component manufacturer that makes capacitors, resistors, and inductors, parts that control the flow of current in circuits. Specifically, the center's work has led to the implementation of processes at AVX for the preparation and heat treatment of capacitors, innovations that led to yield improvements across product lines. Economic Impact: The world market for capacitor components is estimated to be $16 billion US dollars annually. Passive electronic components are constantly evolving to support system trends in functionality and miniaturization, such as in handheld electronics. image of a paint-like substance image of small electronic parts, looks like little fuses Thick-film paste made from a new high-permittivity, low-loss dielectric for microwave passive component integration (left); prototyped microwave filter components manufactured with new high-permittivity pyrochlore materials (right). For more information, contact Michael Lanagan, 814.865.6992, mxl46@psu.edu, Susan Trolier-McKinstry, 814.863.8348, stmckinstry@psu.edu, or Clive Randall, 814.863.1328, car4@psu.edu.
image of a horizontal line
Understanding Dielectric Materials
Research at the Center for Dielectric Studies (CDS) has furthered understanding of dielectric materials, including the requirements for raw materials and the properties that result from various compounds and processing approaches. One company benefiting from this work is Ferro Corporation, one of the largest manufacturers of barium titanate in the world. The Center's research has shed new light on the defect chemistry of barium titanate, a key ingredient of many dielectric powders. Related center research on the mechanisms of failure in multilayer ceramic capacitors, particularly capacitors with Ni electrodes, has helped improve yields using state-of-the-art microscopy techniques, as illustrated in the figure. image of nanometer-sized components, very abstract with overlapping stripes Economic Impact: The world market for capacitor components is estimated to be $16 billion annually. Passive electronic components are constantly evolving to support system trends in functionality and miniaturization, such as in handheld electronics, and reliability is an important part of their value. For more information, contact Clive Randall, 814.863.1328, car4@psu.edu or Elizabeth Dickey, 814.865.9067, ecd10@psu.edu.
Center for e-Design
A CISE-funded Center
Virginia Tech, Richard Goff, Director, 540.231.9537, richgoff@vt.edu
University of Central Florida, Christopher Geiger, 407.823.0221, cdgeiger@mail.ucf.edu
University of Massachusetts, Sundar Krishnamurty, 413.545.0297, skrishna@ecs.umass.edu
University at Buffalo, Kemper Lewis, 716.645.2685, kelewis@buffalo.edu
Brigham Young University, Gregory Jensen, 801.422.6540, cjensen@byu.edu
Carnegie Mellon University, James Antaki, 412.802.6431, antaki@andrew.cmu.edu
Center website: http://e-design.iems.ucf.edu/area.html
image of a horizontal line
Complexity Management Methodology: A Tool for Developing Product Families
Industry is continually challenged by the need to balance customer demands for increased product variety against the increased costs of meeting those demands, all while staying competitive. While great from the customer's perspective, increasing product variety can quickly lead to complexity that impacts the time, processes, and costs associated with the supply chain, inventory, manufacturing, field service, and training throughout the life of the product (from initial inception through product retirement, recycling, reuse, or disposal). One strategy for satisfying customer demand while managing costs is to design products using product families and platforms, wherein common components and subsystems are shared across a variety of products. The objective is to give customers variety and choices that are apparent to them, while incorporating commonality and reuse across products to save the company time and cost. image of a tire with a cut-out showing the tread underneath Researchers at the Center for e-Design have developed a collaborative Complexity Management Tool (CMT) that fills a gap in previous complexity research. CMT is a quantitative methodology that provides comprehensive evaluation and analysis of complexity for manufacturing integrative product lines and supports design decisions that link complexity with costs. It identifies and incorporates multiple measures and indicators of product complexity, spanning product design, development, manufacturing, assembly, and the supply chain. These metrics are then used to compute the costs associated with increased product complexity. CMT has been tested on three product lines at The Goodyear Tire & Rubber Company (Goodyear), where tire production includes a variety of manual and automated assembly and integrative processes. Economic Impact: CMT test results indicate that significant cost savings can be achieved with product family redesign. In the analysis of one specific product line, results demonstrated feasible changes that would help reduce production costs for two critical components by approximately 10% and 40%, respectively. For high-volume products, this can quickly result in hundreds of thousands of dollars of savings. Sponsors indicate that CMT has significant value. A patent is being sought in collaboration with Center for e-Design industry members. For more information, contact Janis Terpenny, 515.294.1287, terpenny@iastate.edu or Nihal Orfi, 540.808.9173, norfi@vt.edu.
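As a concrete illustration of the complexity-to-cost linkage that CMT formalizes, the short sketch below rolls several normalized complexity indicators into a weighted index and converts it into a unit-cost estimate. The indicator names, weights, and cost coefficients are invented for illustration; they are not the center's actual model.

```python
# Hypothetical sketch of a product-family complexity score in the spirit of CMT.
# All metric names, weights, and cost coefficients are illustrative assumptions.

def complexity_index(component, weights):
    """Weighted sum of normalized complexity indicators (each in [0, 1])."""
    return sum(weights[k] * component[k] for k in weights)

WEIGHTS = {"design": 0.3, "manufacturing": 0.3, "assembly": 0.2, "supply_chain": 0.2}

# Two variants of the same component: a unique design vs. a platform-shared one.
unique = {"design": 0.9, "manufacturing": 0.8, "assembly": 0.6, "supply_chain": 0.7}
shared = {"design": 0.3, "manufacturing": 0.4, "assembly": 0.5, "supply_chain": 0.2}

BASE_COST = 12.00          # assumed baseline unit cost in dollars
COST_PER_POINT = 10.00     # assumed cost penalty per unit of complexity index

for name, comp in [("unique", unique), ("platform-shared", shared)]:
    idx = complexity_index(comp, WEIGHTS)
    print(f"{name:16s} index={idx:.2f}  est. unit cost=${BASE_COST + COST_PER_POINT * idx:.2f}")
```

Comparing a unique design against a platform-shared variant this way makes the cost penalty of added complexity explicit, which is the decision support CMT provides at full scale.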
Center for Electromagnetic Compatibility (CEMC)
Missouri University of Science & Technology, Richard DuBroff, 573.341.4719, red@mst.edu
University of Houston, Ji Chen, 713.743.4423, jchen18@uh.edu
Clemson University, Todd Hubing, 864.656.7219, hubing@clemson.edu
University of Oklahoma, Floyd Grant, 405.325.2429, hgrant@ou.edu
Center website: http://www.emc-center.org/CEMC.aspx
image of a horizontal line
Reducing Emissions From DC-DC Converters Without Sacrificing Efficiency
High speed digital electronic devices such as computers and cell phones utilize signals that change rapidly, often on the order of nanoseconds or even picoseconds. These rapidly changing signals are generally transmitted from one integrated circuit to another, or to one or more peripheral devices, through a network of electrical conductors. The conductors in turn can take various forms, including cables, wires, circuit board traces, and circuit board planes. However, as the signal transitions become more rapid and occur more frequently, the voltages and currents associated with the transitions can create a weak electromagnetic field in the proximity of the device. The electromagnetic field produced by the device then adds to the ambient electromagnetic fields produced by both natural events (e.g., lightning) and man-made events (e.g., radio and television transmissions). In this way each device contributes to the electromagnetic environment. While the electromagnetic environment is generally imperceptible to the senses, it can be detected with suitable electronic equipment. More importantly, excessive noise introduced into the electromagnetic environment by one device can cause interference in other devices located nearby. The need is to determine how to design reliable electronic equipment that meets or exceeds regulatory constraints without significantly compromising other important design objectives. drawing of how the dc dc converter connects to the power grid and to mobile devices Researchers at the Center for Electromagnetic Compatibility (CEMC) have made a significant technological contribution with their work on reducing emissions from DC-DC converters without sacrificing efficiency. Conventional solutions to reduce noise often degrade the efficiency of the circuit. This breakthrough work conducted at CEMC provides a more thorough understanding of the noise radiation mechanisms in the circuit and proposes innovative solutions that eliminate noise while maintaining circuit performance. Economic Impact: The outcome of this study has been codified as a set of design guidelines, used during the product design stage to optimize DC-DC converters for minimal radio emissions. The guidelines allow designers to quickly optimize the EMC performance of DC-DC converters in various products, reducing the number of development cycles and thus substantially reducing the cost of development. The outcome of this work has been implemented at Apple, Inc. in a number of its products. It is helping to deliver better products at reduced cost to the consumer, and it will help industry realize reduced electromagnetic emissions from DC-DC converters, which are often used multiple times in each product. Since DC-DC converters are used in almost all kinds of electronic devices, the breakthrough has a profound impact on the electronics industry. Implications of this work for end-users include reduced cost, faster-to-market product development, and a product that is "quiet" to its environment while offering improved functionality and performance. For more information, contact Richard DuBroff, 573.341.4719, red@mst.edu.
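One classic relationship behind this kind of design guidance (a textbook EMC result, offered here as background rather than as the CEMC guidelines themselves) is the harmonic envelope of a trapezoidal switching waveform: slowing the edge rate sharply attenuates high-frequency harmonics while leaving the converter's switching behavior intact. The sketch below uses invented voltage and timing values.

```python
import math

def harmonic_amplitude(n, amp, period, duty, t_rise):
    """Amplitude of the n-th harmonic of a trapezoidal wave (standard envelope formula)."""
    def sinc(x):
        return 1.0 if x == 0 else math.sin(x) / x
    tau = duty * period                       # pulse width
    return (2 * amp * duty
            * abs(sinc(math.pi * n * tau / period))
            * abs(sinc(math.pi * n * t_rise / period)))

amp, period, duty, n = 3.3, 1e-6, 0.5, 101    # 3.3 V, 1 MHz switching; 101st harmonic ~ 101 MHz
for t_rise in (1e-9, 10e-9):                  # compare 1 ns vs. 10 ns edges
    a = harmonic_amplitude(n, amp, period, duty, t_rise)
    print(f"t_rise = {t_rise * 1e9:4.1f} ns -> {n}th harmonic ~ {20 * math.log10(a / 1e-6):5.1f} dBuV")
```

With these stand-in numbers, slowing the edge from 1 ns to 10 ns drops the 101 MHz harmonic by roughly 40 dB, which is why controlling edge rates (rather than sacrificing converter efficiency) is such an attractive lever for emissions.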
Center for Embedded Systems (CES)
A CISE-funded Center
Arizona State University, Sarma Vrudhula, Director, 480.965.4748, vrudhula@asu.edu
Southern Illinois - Carbondale, Spyros Tragoudas, 618.453.7027, spyros@engr.siu.edu
Center website: http://embedded.engineering.asu.edu/iucrc/index.php/Home
image of a horizontal line
Design Tool for Mobile Low-power Processors
Embedded smart devices such as cellular phones and tablets have emerged as the new technology drivers for the semiconductor industry. image of a mobile smart phone Mobile low-power processor chips have evolved from single-core processors into multi-core architectures that integrate 10-20 processor cores, 40-60 customized hardware units or accelerators, and many memory blocks. In other words, state-of-the-art mobile processors integrate upwards of a hundred fairly complex intellectual property (IP) blocks into a single chip, under constant pressure for higher performance, stringent low-power operation, and short time to market. The on-chip interconnection architectures that tie these IP blocks together into cohesive systems have emerged as a key determinant of mobile processor performance and power consumption. These interconnection architectures are implemented as a Network-on-Chip (NoC), which consists of interconnected routers and IP blocks. CES researchers have developed a computer-aided design (CAD) tool chain for developing the NoC architecture for future Qualcomm mobile processor chips. The NoC tool chain automatically generates high-performance, low-power on-chip interconnection architectures that successfully address multiple traffic classes, multiple use-cases, deadlock avoidance, multiple clock islands, and bit-width optimization. The tool chain automates, and completes in minutes, design tasks that can take several weeks of manual effort. Consequently, the synthesized interconnection architecture and the overall mobile processor exhibit better performance and lower power consumption, achieved in less design time. The next generation of smart phone products will have much higher performance requirements with the same or incrementally longer battery lifetimes. Consequently, future generations of mobile processor chips will integrate ever-increasing numbers of IP blocks on the same chip. The NoC design tool developed by the CES team is a key technology that will enable Qualcomm to maintain its dominant position in the mobile low-power processor market. Economic Impact: A center sponsor, Qualcomm Inc., is the market leader in mobile low-power processors aimed at smart phones and tablets. There are an estimated 5.2 billion cellular phone subscribers worldwide, and the number of low-power mobile processors utilized by such devices is expected to hit the 500 million mark by 2015. The increases in efficiency and development savings can be expected to have substantial economic impacts on the electronics industry and on the nation's competitive position. For more information, contact Karam Chatha, 480.727.7850, KChatha@asu.edu.
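To give a flavor of one subproblem such a tool chain automates, the toy sketch below exhaustively places four IP blocks onto a 2x2 router mesh to minimize traffic-weighted hop count, a simple stand-in for the performance- and power-driven objectives of NoC synthesis. The block names, traffic matrix, and brute-force search are invented for illustration and are not the CES tool chain.

```python
# Toy NoC mapping: place IP blocks on a mesh so heavy traffic travels few hops.
import itertools

traffic = {("cpu", "l2"): 90, ("cpu", "dsp"): 10, ("l2", "mem"): 80, ("dsp", "mem"): 20}
blocks = ["cpu", "l2", "dsp", "mem"]
slots = [(x, y) for x in range(2) for y in range(2)]    # 2x2 mesh of routers

def cost(placement):
    """Total traffic-weighted Manhattan hop count for a block -> slot placement."""
    total = 0
    for (a, b), load in traffic.items():
        (x1, y1), (x2, y2) = placement[a], placement[b]
        total += load * (abs(x1 - x2) + abs(y1 - y2))
    return total

best = min(({b: s for b, s in zip(blocks, perm)} for perm in itertools.permutations(slots)),
           key=cost)
print("best placement:", best, "cost:", cost(best))
```

A real synthesis tool must do this at the scale of a hundred blocks while also handling traffic classes, deadlock freedom, clock islands, and link widths, which is exactly why automating the search saves weeks of manual effort.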
image of a horizontal line
Safer Automotive and Aerospace In-Vehicle Control Systems
Current system design practices do not guarantee correct system functionality under the prevailing short development cycles. Since automotive and aerospace systems are safety-critical, design methodologies that can improve confidence in the overall system design are sought after by industry. drawing of the model-based development cycle One such design methodology is the Model Based Development (MBD) framework, where the design of the system starts with a model of the system. Such models are usually developed in a modeling environment that supports a block-diagram Graphical User Interface (GUI). GUIs enable modeling of both the physical components of a control system (e.g., the automobile engine) and the cyber components (e.g., the software that controls engine performance). CES researchers developed S-TALIRO, a software tool that systematically checks a given system model by searching for an input demonstrating that a functional requirement is not satisfied. Such a functional requirement could be that the engine never stalls while the vehicle is cruising. The process of discovering operating conditions that produce system behaviors violating functional requirements is referred to as falsification. Even if a falsifying behavior cannot be found, the system behaviors that came the "closest" to violating the desired property are returned to the user. The advantage of this technology becomes clear in comparison with most formal verification methods, which seek a mathematical proof that a property is satisfied. Unfortunately, those mathematical methods do not scale to the complexity of a typical industrial control application, nor can they mathematically "understand" the model's semantics due to proprietary modeling formats. Because S-TALIRO is a simulation-based approach, it is immune to these difficulties and thus offers a more robust form of model-based verification that, for the first time, is being made available to industry. The primary applications of this advance are in automotive and aerospace in-vehicle control systems such as powertrain control and avionics. Economic Impact: The complexity of modern automotive and aerospace systems is enormous. Sponsors consider this a breakthrough even though results are still at the research stage and have not yet been applied on actual development projects. Currently, both the automotive and the aerospace industries are moving to MBD practices, and there is a clear need for tools that verify the correctness of a system with respect to functional requirements. It is often said that verification represents 50% of the development cost of in-vehicle control systems. It is within the realm of possibility that S-TALIRO could reduce development costs by 5 to 10%. For more information, contact Georgios Fainekos, 480.965.8267, fainekos@asu.edu.
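The core falsification loop can be sketched in a few lines. In the toy below (an invented plant model and requirement, not S-TALIRO itself, which uses temporal-logic robustness and far smarter optimizers), random throttle profiles are simulated and scored by how close the engine-speed trace comes to violating a "never exceed the speed limit" requirement; the closest-to-violation input is reported even when no outright violation is found.

```python
import random

def simulate(throttle_profile):
    """Toy plant: engine-speed trace for a piecewise-constant throttle input."""
    speed, trace = 1000.0, []
    for u in throttle_profile:
        speed += 150.0 * u - 0.02 * (speed - 1000.0)   # crude first-order dynamics
        trace.append(speed)
    return trace

def robustness(trace, limit=4500.0):
    """Margin by which the trace stays below the limit; negative = requirement violated."""
    return limit - max(trace)

random.seed(0)
best_rob, best_input = float("inf"), None
for _ in range(2000):                                  # stochastic search over the input space
    profile = [random.uniform(0.0, 1.0) for _ in range(60)]
    rob = robustness(simulate(profile))
    if rob < best_rob:
        best_rob, best_input = rob, profile

verdict = "requirement falsified!" if best_rob < 0 else "no violation found; reporting closest behavior"
print(f"minimum robustness = {best_rob:.1f} rpm ({verdict})")
```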
Center for Engineering Logistics and Distribution (CELDi)
University of Arkansas, Russell Meller, Director, 479.575.6196, rmeller@uark.edu
Auburn University, Jeffrey Smith, Deputy Director, 334.844.1412, jsmith@auburn.edu
University of Oklahoma, Mustafa Pulat, 405.325.4532, bpulat@ou.edu
Oklahoma State University, Ricki Ingalls, 405.744.6055, ricki.ingalls@okstate.edu
University of Missouri, James S. Noble, 573.882.9561, noblej@missouri.edu
Texas Tech University, Tim Matis, 806.742.3543, timothy.matis@ttu.edu
Arizona State University, Rene Villalobos, 480.965.0437, rene.villalobos@asu.edu
Virginia Tech, Kimberly Ellis, 540.231.9073, kpellis@vt.edu
Clemson University, William Ferrell, 864.656.2724, fwillia@clemson.edu
University of California Berkeley, Philip Kaminsky, 501.642.4927, kaminsky@ieor.berkeley.edu
Center website: http://celdi.org/
image of a horizontal line
Networking Merchandise Logistics
Research at the Center for Engineering Logistics and Distribution (CELDi) has enabled Wal-Mart to identify opportunities to streamline some of the company's processes. Center researchers collected data and conducted in-depth analyses that the company could not otherwise have undertaken. The work provided an opportunity to rethink how the company allocates job activities and personnel hours in its stores and to enhance store productivity. images of many packages moving down a factory belt Last year, CELDi helped the company conduct a logistics analysis that caused the network designers to rethink how logistics networks (all the systems related to moving merchandise from vendor/supplier to the store) will be organized in the future. Currently, center researchers are working on a project that will change how the company maintains inventory accuracy. In a store with such a large flow of freight, it is critical to maintain accurate inventory records to avoid over- or under-stocking items. Center efforts have also led to the publication of research papers on these subjects: last year the network analysis was published in the literature, and the company anticipates that this year's inventory analysis will lead to another paper. Economic Impact: Research results suggest that applying scheduled and opportunistic counts frequently is quite beneficial for slower-moving, high-cost items. The estimated savings per store are $2,972,500 for low-demand items and $234,200 for high-demand items. These results are based on a simplified representation of the retail environment, so they may not reflect actual savings. Even with that caveat, based on the risk analysis and the process simulation modeling, there is strong evidence to suggest that significant savings can be obtained by adopting the recommended approach. For more information, contact Manuel Rossetti, 479.575.6756, rossetti@uark.edu, or Russell Meller, 479.575.6196, rmeller@uark.edu.
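The flavor of the underlying trade-off can be sketched with a toy cost model: counting more often costs labor but shortens the time an inventory-record error persists undetected. All parameters below are invented; the CELDi analysis rests on detailed risk analysis and process simulation rather than this simple formula.

```python
# Toy cycle-counting trade-off: counting labor vs. cost of undetected record errors.

def annual_policy_cost(n_counts, err_per_week, unit_cost, count_cost=25.0, loss_frac=0.1):
    """Counting labor plus the expected loss from inventory-record errors,
    which persist (on average) half the interval between counts."""
    avg_detection_delay_weeks = (52.0 / n_counts) / 2.0
    error_weeks_per_year = err_per_week * 52.0 * avg_detection_delay_weeks
    return n_counts * count_cost + error_weeks_per_year * loss_frac * unit_cost

for item, err, cost in [("slow-moving, high-cost", 0.05, 400.0),
                        ("fast-moving, low-cost",  0.50,   3.0)]:
    best_n = min(range(1, 53), key=lambda n: annual_policy_cost(n, err, cost))
    print(f"{item}: count ~{best_n}x/year, annual cost ${annual_policy_cost(best_n, err, cost):,.0f}")
```

Even with these stand-in numbers, the high-cost, slow-moving item warrants roughly three times as many counts per year as the cheap, fast-moving one, echoing the study's direction of effect.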
image of a horizontal line
UPS Integrad™ Training System for Generation Y Delivery Drivers
In 2006, the Department of Labor awarded funding to United Parcel Service (UPS) to address training issues and to develop a state-of-the-art training system that would serve the needs of diverse drivers, including Generation Y drivers. Safety for UPS Delivery Service Providers (DSPs) was given a greater focus in the development of this training system than in previous training efforts, and UPS wanted to target some of the most common and costly injuries. The UPS Integrad training system uses computer-based training that combines agent-based and interactive learning, presented through CGI animation, with kinetic learning modules. UPS collaborated with researchers from the Center for Engineering Logistics & Distribution (CELDi) to develop the system to enhance and support training, and ultimately to support workers' well-being and quality of life. image of a brown UPS truck with a clear plastic side to see the packages inside Newly hired UPS Delivery Service Providers enter a building for their first day of training in what is called the UPS Integrad system. They typically expect to spend most of their day listening to lectures on how to deliver packages efficiently and safely. For most of them it is hard to learn that way; they would rather do more hands-on activities. Most prefer to use computers, but only if the learning can be done in a way that is exciting--like the video games many are used to playing. The training site looks just like any UPS site where packages are loaded and package cars prepare to leave for their deliveries. Most trainees are pleasantly surprised to see package cars made of plexiglass so they can observe their "peers" practicing how to lift and position packages safely. They walk into a room full of computer workstations, surrounded by eye-catching posters with motivational safety messages and information. The training feels more like entering the job itself, yet plays much like an interactive video game. Most trainees do "about-faces" in attitude: they are actually excited about employment training. Economic Impact: Evaluations of the training system by UPS's Stephen Jones have documented a 79.3% greater reduction in injuries for UPS Delivery Service Providers trained with Integrad compared to those receiving traditional training, with a concurrent 61% greater reduction in accidents for the Integrad group. The system produces dramatic and earlier improvements in driver competence, safety, and job performance. Its innovative design integrates inclusive learning; procedural and motor learning; active, peer-assisted, and just-in-time training and assessment; and training-tool usability and iterative design. The system holds users' safety as the paramount design responsibility. For more information, contact Brian Kleiner at Virginia Tech, 540.231.4926, bkleiner@vt.edu.
Center for Experimental Research in Computer Systems (CERCS)
A CISE-funded Center
Georgia Institute of Technology, Karsten Schwan, Director, 404.894.2589, schwan@cc.gatech.edu
Georgia Institute of Technology, Calton Pu, 404.385.1106, calton@cc.gatech.edu
Georgia Institute of Technology, Doug Blough, 404.385.1271, doug.blough@ece.gatech.edu
Georgia Institute of Technology, Sudhakar Yalamanchili, 404.894.2940, sudha@ece.gatech.edu
Ohio State University, Jay Ramanathan, 614.565.4187, jaram@cse.ohio-state.edu
Ohio State University, Rajiv Ramnath, ramnath@cse.ohio-state.edu
Center website: http://www.cercs.gatech.edu/
image of a horizontal line
Scalable Management for Cloud Computing Environments: Monalytics
Cloud computing offers tremendous benefits to organizations by providing on-demand access to configurable computing resources, lowering costs and enabling entities to increase computing assets with minimal effort. Data centers are the backbone of cloud infrastructures; they must be able to adapt efficiently and rapidly to support the increasing demands of cloud clients. As a result, automation in facility management is a key challenge, particularly given the large scale of data center and cloud systems. Cloud providers have difficulty dealing with variability, both in the demands of client applications and in the resources available for the computation. By monitoring all aspects of hardware and software performance, service providers can detect and address performance problems. Unfortunately, for very large scale cloud environments this cannot be done centrally, since the amounts of monitoring data generated would be staggering. Cloud vendors are therefore hampered in extending and growing their facilities to meet future demand. image of cloud computing, showing devices interconnected to a drawing of a cloud Researchers at CERCS have developed Monalytics, manageability software that facilitates the development of next-generation scalable data center management products for very large cloud computing environments. It operates like the many "big data" applications companies use to mine data about customer preferences, but in contrast to those systems, its purpose is to flexibly perform management functions in a manner that scales to the size of the task. It enables cloud service providers to more effectively operate the ever-increasing numbers of hardware and software components in the data center, to make vast amounts of computing power available on demand, and to achieve cost savings and reduced energy demands. With this scalability, Monalytics will make it possible for future cloud data centers to provide levels of computing power equivalent to today's largest supercomputers, thereby realizing the full potential of cloud computing. It does so by processing monitoring data online, to rapidly extract data of interest, and in place, where the data is generated, so that data center networks are not overwhelmed. Monalytics has attracted the attention of multiple companies, such as HP and VMWare. These companies are particularly interested in the ability of the technology to enable significant performance improvements in large scale cloud environments while reducing system management costs. At HP, groups have used the approach to monitor utility data centers. At VMWare, online management methods like those enabled by Monalytics are routinely used to consolidate data center systems. Economic Impact: The manageability software market for clouds, in particular, is estimated by International Data Corporation to be a $2.5 billion market by 2015. Monalytics will directly contribute to this market and will accelerate the growth and scope of cloud computing by allowing cloud environments to grow ever larger. It will allow cloud service providers to better meet the challenges of system management for the millions of managed objects in future large cloud environments. Experimental evaluations have shown the Monalytics approach yields up to a 92% reduction in time to insight and 86% lower cost compared with traditional approaches to performance management. This advance will result in lower-cost, more efficient data centers run by public and private cloud providers.
It will also improve end-users' online experiences, ensuring consistent and reliable performance in their cloud applications and leading to greater adoption and use of clouds. This can lead to new business opportunities in clouds, such as improved efficiencies offered by online methods for managing data center power consumption, or better service offerings like "premium" services that provide improved service quality to end users. For more information, contact Karsten Schwan, 404.894.2589, schwan@cc.gatech.edu.
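The "online, in-place" idea can be conveyed in a few lines of code: each node reduces its raw samples to a tiny summary where they are produced, and aggregators combine summaries rather than raw streams. The metric names and two-level hierarchy below are invented; Monalytics itself supports far richer analytics and flexible topologies.

```python
# Conceptual sketch of in-place monitoring analytics: ship compact summaries
# upward instead of streaming every raw sample to a central collector.
import statistics

def local_summary(samples, cpu_alarm=0.95):
    """Runs on each node: reduce raw CPU samples to a tiny summary plus alarm count."""
    return {
        "n": len(samples),
        "mean": statistics.fmean(samples),
        "p_max": max(samples),
        "alarms": sum(1 for s in samples if s > cpu_alarm),
    }

def zone_rollup(summaries):
    """Runs on a mid-tier aggregator: combine node summaries for one zone."""
    n = sum(s["n"] for s in summaries)
    mean = sum(s["mean"] * s["n"] for s in summaries) / n
    return {"n": n, "mean": mean,
            "p_max": max(s["p_max"] for s in summaries),
            "alarms": sum(s["alarms"] for s in summaries)}

nodes = [[0.2, 0.4, 0.3], [0.9, 0.97, 0.99], [0.5, 0.6, 0.4]]   # raw CPU samples per node
zone = zone_rollup([local_summary(n) for n in nodes])
print(zone)   # nine raw samples collapsed into one four-field record
```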
image of a horizontal line
Attaining Predictably Fast Responses: The "Travelport Flight Shopping Engine"
image of a busy airport interior Researchers at the Center for Experimental Research in Computer Systems (CERCS) have been working with industry to improve the performance of, and to explore new services and functionality for, Travelport's shopping engine for making travel arrangements (flight shopping). Several prototypes have been developed to address topics that include early problem detection, traffic distribution, multi-core processing, and new services. One result of this work was a 35% improvement in the average response time of end-user requests for travel options (e.g., possible flights), and a 10% reduction in failures to meet response-time requirements for such requests. The economic impact of the breakthrough translated into 20% fewer hardware purchases by Travelport (the shopping engine provider) and a more competitive position for Travelport's shopping engine in the marketplace. These technology breakthroughs therefore improved an existing product for airline shopping, i.e., Travelport's flight shopping engine. They are relevant to all products that require extensive calculations, such as flight shopping, ticketing, and hotel reservations. More importantly, they are relevant to any application for which the amount of processing performed for each request can vary significantly depending on the nature of the request. Recent work with Travelport is exploring new services and service opportunities, the idea being to find new ways to monetize the rich shopping and booking information available to Travelport. Prototype software built by CERCS students has demonstrated that such monetization based on data analytics can even be done online, by observing and then mining customer traffic. Economic Impact: The flight shopping/booking industry is undergoing a rapid evolution, in part because of new players like Google, which recently acquired one of the three international flight shopping and booking services. Well-positioned to capitalize on increasing travel activities worldwide, Travelport operates both domestically and internationally. By helping local companies develop new services, the center helps protect and strengthen Travelport's financial position, creating jobs in the Atlanta area. For more information, contact Karsten Schwan, 404.894.2589, schwan@cc.gatech.edu.
image of a horizontal line
Power-Efficient Data Centers
In the U.S., data center facilities consume approximately 2% of all electricity, with an estimated growth rate of 12% per year. The majority of this data center power is consumed either by the computers themselves or by the HVAC units required to keep the computers from overheating. image of a data center with servers The use of virtual machines has allowed data center managers to conserve power by consolidating data processing onto fewer servers in times of low demand. However, that consolidation had not previously been performed while taking into account its impact on the cooling demands of the data center. The Georgia Tech CERCS project "CoolIT", developed in collaboration with the Mechanical Engineering department, allows the synergistic and cooperative management of IT and cooling-system resources, adjusting air velocity and the location of active computing simultaneously to minimize power consumption. Initial results from CoolIT highlight the interesting trade-offs faced by a coordinated management solution. For instance, at lower cooling-air velocities, for a homogeneous set of server systems, an awareness of hot spots in the data center permits the IT management system to operate at close to 100% of maximum performance load, whereas without such awareness there are situations in which only 40% load is achieved. These results illustrate the significant benefits of coordination between computer and facilities power management. Similar insights have been gained for the electrical distribution systems associated with machines, where, for instance, unequal loads can cause inefficiencies in power delivery. Economic Impact: CoolIT researchers have interacted with the major IT vendors in the US. IBM's data center technologies group, for instance, has worked with the research team to create new technology for data center thermal monitoring (a CoolIT student worked with the IBM team). There have also been interactions with smaller companies, one being OSISoft, a leading vendor of large-scale monitoring software, which has used its exposure to CoolIT to market its software in the domain of data center monitoring and management, a new domain for the company. Most recently, work on IT system management has affected VMWare's product offerings, through joint work of CoolIT researchers with VMWare's cloud computing and data center management teams. For more information, contact Karsten Schwan, 404.894.2589, schwan@cc.gatech.edu.
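A minimal sketch of the coordination idea follows: choose a workload placement and a cooling fan speed together, rejecting combinations that overheat a hot spot, and keep the cheapest feasible pair. The power and thermal models are one-line stand-ins invented for illustration, not CoolIT's models.

```python
# Toy coordinated IT + cooling management over a tiny search space.
import itertools

def total_power(loads, fan):
    it = sum(100 + 150 * u for u in loads if u > 0)       # active servers only
    cooling = 200 * fan ** 3                              # fan law: power ~ speed^3
    return it + cooling

def hottest(loads, fan):
    # made-up model: temperature rises with load, falls with airflow
    return max(25 + 30 * u / (0.2 + fan) for u in loads)

demand, options = 1.5, [0.0, 0.5, 1.0]
best = None
for loads in itertools.product(options, repeat=3):        # 3 servers
    if abs(sum(loads) - demand) > 1e-9:
        continue                                          # must meet total demand
    for fan in (0.4, 0.7, 1.0):
        if hottest(loads, fan) <= 55.0:                   # thermal limit in deg C
            p = total_power(loads, fan)
            if best is None or p < best[0]:
                best = (p, loads, fan)
print("best: %.0f W, loads=%s, fan=%.1f" % best)
```

In this toy, consolidating everything onto one server creates a hot spot that forces the fan to full speed, while a coordinated choice spreads the load and runs the fan slowly, which is the kind of trade-off CoolIT navigates at data center scale.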
Center for Friction Stir Processing (CFSP)
Brigham Young University, Tracy Nelson, 801.422.6233, nelsontw@byu.edu
Brigham Young University, Carl Sorensen, 801.361.6429, carl_sorensen@byu.edu
University of South Dakota, Michael West, 605.394.6924 ext. 1933, michael.west@sdsmt.edu
University of South Carolina, Tony Reynolds, 803.777.9548, reynolds@engr.sc.edu
Missouri University of Science and Technology, Rajiv S. Mishra, 573.341.6361, rsmishara@umr.edu
Wichita State University, Dwight Burford, 316.978.3204, dwight.burford@wichita.edu
Center website: http://cfsp.sdsmt.edu/
image of a horizontal line
Littoral Combat Ship with Improved Welding Technologies
image of a large ship The Center for Friction Stir Processing's (CFSP's) research on friction stir welding (FSW), friction stir spot welding, and friction stir structural designs and applications has resulted in significant improvements to weld strength and durability by, among other things, replacing fusion welds and rivets. Over 12 miles of friction stir welding was used to fabricate the littoral combat ship USS Freedom, launched in September 2006. The welded aluminum panels for the superstructure were fabricated by Friction Stir Link, Inc. and delivered to Marinette Marine for final assembly. "The collaboration between the CFSP and our company has contributed to the success of the implementation of FSW on the LCS Freedom," stated John F. Hinrichs, Founder and Vice President of Technology, FSL, Inc. FSL has opened a new production facility in Slidell, LA to support continued production of the LCS. This technology was awarded the Alexander Schwarzkopf Prize for Technological Innovation by the I/UCRC Association in 2009. Economic Impact: The use of the solid-state friction stir welding process has resulted in improved strength and fatigue life, reduced distortion, and an economical, robust, and repeatable process. For more information, contact Tracy Nelson, 801.422.6233, nelsontw@byu.edu.
Center for Fuel Cells (CFC)
University of South Carolina, John Van Zee, Director, 803.777.2285, vanzee@cec.sc.edu
Center website: http://174.143.170.127/iucrc/publicFactSheetServlet?centerId=63
image of a horizontal line
MacMullin Number for PEMFC Gas Diffusion Media
For proton exchange membrane fuel cells (PEMFCs), the porous medium of interest has commonly been referred to as gas-diffusion media (GDM) and used as gas-diffusion layers (GDL) in the assembly of the unit cell, even though this medium is critical for the transport of liquid water as well as gases. A simple technique consisting of a four-electrode system, which uses a square-wave current, was developed for measuring the MacMullin number of GDM. The MacMullin number relates the free-stream properties to the actual liquid and gas transport in the GDM. This ratio was successfully measured for different carbon-cloth and carbon-paper GDM, for which, in the absence of information for PEMFCs, the Bruggeman expression had commonly been used to correct the free-stream properties for the actual path length. image of a hybrid vehicle This technique helps in understanding critical properties of gas-diffusion media that directly impact liquid water and gas transport in fuel cells and electrolyzers, including the path length through which these phases travel. Previously, only measurements of porosity were used to characterize the GDM, and product data sheets still list only porosity. Mathematical models for fuel cells and electrolyzers are also improved by the use of the actual path length, which leads to more accurate calculations of liquid and gas transport through the GDM. Economic Impact: This technique allows the scientific community to understand that the Bruggeman equation is not valid for carbon-paper GDM and that a different relationship exists as a result of the differences in path length created by the orientation of the fibers in each type of GDM. It provides industry the knowledge to improve the design of GDM and reduce their cost, and it is leading the industry to consider the path length for liquid and gas transport as part of its research. This benefits the development of fuel cell and electrolyzer technologies by enabling optimized designs that improve the efficiency of these devices, a significant step toward moving these technologies from a niche market into a broader market. For more information, contact John Van Zee, 803.777.2285, vanzee@cec.sc.edu.
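Numerically, the measurement reduces to a simple calculation: convert the four-electrode square-wave response into an effective resistivity, and divide by the free-stream resistivity to get the MacMullin number, which can then be compared against the Bruggeman estimate. The sample values below are invented for illustration.

```python
# Sketch of extracting a MacMullin number from a four-electrode measurement
# and comparing it with the Bruggeman estimate. Sample values are invented.

def macmullin(delta_v, current, area_cm2, thickness_cm, rho_free_ohm_cm):
    """N_M = effective resistivity of the electrolyte-filled GDM / free-stream resistivity."""
    r_eff = delta_v / current                      # ohms, from the square-wave V/I response
    rho_eff = r_eff * area_cm2 / thickness_cm      # ohm*cm
    return rho_eff / rho_free_ohm_cm

def bruggeman_estimate(porosity):
    """Classic correction: kappa_eff = kappa_0 * eps**1.5, so N_M = eps**-1.5."""
    return porosity ** -1.5

n_measured = macmullin(delta_v=0.015, current=0.10, area_cm2=5.0,
                       thickness_cm=0.02, rho_free_ohm_cm=10.0)
print(f"measured N_M = {n_measured:.1f}; Bruggeman (eps=0.75) = {bruggeman_estimate(0.75):.1f}")
```

A measured value well above the Bruggeman prediction, as in this invented example, is exactly the kind of discrepancy the center's work exposed for carbon-paper GDM, where fiber orientation lengthens the transport path.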
Center for Glass Research (CGR)
Alfred University, Harrie Stevens, Director, 607.871.2432, stevenshj@alfred.edu
University of Missouri-Rolla, Jeffrey Smith, 573.341.4447, jsmith@umr.edu
Penn State University, Carlo Pantano, 814.863.2071, pantano@ems.psu.edu
Center website: http://cgr.alfred.edu/
image of a horizontal line
Glass for Toxic Waste Encapsulation
Specialized glasses and glass-melting processes are at the heart of toxic waste vitrification, particularly of low-level and high-level radioactive waste, for long-term storage. Collaborative CGR research at Alfred University, the Virginia Military Institute, and the University of Washington's Center for Process Analytical Chemistry (CPAC) resulted in major insights into the oxidation state of such glass melts, including the degree and mechanism of mutual interactions (oxidation-reduction reactions) among the many multivalent elements present. This work is at the heart of understanding and predicting the chemical durability of the glass, which is essential for assuring long-term stability during underground storage. The results have been found extremely valuable by at least one member company, the Westinghouse Savannah River Company, and by several national laboratories involved with nuclear waste vitrification. three images of vitrified glass Economic Impact: This technological advance contributed essential technical knowledge that has helped the nation and the world deal with the nearly intractable dilemma of what to do with the vast amounts of nuclear waste produced by modern society. The economic impacts of this innovative technology are undoubtedly large but almost impossible to quantify. For more information, contact Tom Seward, 607.871.2432, seward@alfred.edu.
Center for Health Organization & Transformation (CHOT)
Texas A&M University, Larry Gamm, 979.458.2244, gamm@srph.tamhsc.edu
Georgia Institute of Technology, Eva K. Lee, 404.894.4962, eva.lee@gatech.edu
Northeastern University, James Benneyan, 617.373.2975, benneyan@coe.neu.edu
Pennsylvania State University, Harriet Black Nembhard, 814.865.4210, hbn2@psu.edu
Center websites: http://www.srph.tamhsc.edu/research/CHOT/index.html and http://www2.isye.gatech.edu/nsf-chot/
image of a horizontal line
Advancing Clinic Workflow and Operations
An Advancing Clinic Workflow and Operations project has resulted in a patient-flow optimization model that improves the operations of emergency departments, in terms of both efficiency and quality of care. The large-scale computerized system model developed by CHOT researchers at Georgia Institute of Technology models emergency department (ED) operations with greater realism and accuracy than was heretofore possible. image of a patient being taken to a hospital helicopter The model takes into account major elements of emergency departments, including patient flow, clinic workflow, staffing, equipment, and beds, and seeks to optimize the system to arrive at the best results for patient outcomes. It allows for systems optimization and global interventions that affect both the quality of care and the efficiency of delivery. The model is helping organizations deal with critical issues within emergency rooms. It addresses overcrowding, in which the presence of over 40% of patients with non-urgent medical conditions results in long wait times. Misalignment of services also results in unnecessarily long lengths of stay and, at times, decreased quality of care and patient satisfaction. This work is impacting operations within the emergency department at Grady Memorial Hospital in Atlanta and should be applicable in any ED setting. Technically, the model uses an extensive time-motion study of patient arrival patterns and service process distributions that is more comprehensive than previous studies. The results are important both for understanding bottlenecks and as input for the optimization system model. Economic Impact: The work has economic impacts across the nation. It optimizes resource allocation and improves scheduling and workflow efficiency. It also improves patient flow within EDs, reduces the number of non-urgent patients and the consequent redirects, and reduces delays in care, which in turn improves quality of care and patient satisfaction. Redirecting 50% of the non-urgent cases results in potential savings of 10 to 14 million USD in health costs per year for Grady patients, with millions more in subsequent reductions as lengths of stay and wait times for other ER patients improve. The work is also helping to create a new business model for alternative ER systems for non-urgent and walk-in patients that should reduce overcrowding in emergency rooms. This offers an improved revenue model for hospital patient care. For more information, contact Eva Lee, 404.894.4962, eva.lee@gatech.edu.
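The mechanism behind the redirection savings can be illustrated with a standard queueing calculation (an M/M/c Erlang-C model with invented arrival and service rates, far simpler than the center's simulation-optimization model): removing a modest share of non-urgent arrivals cuts mean waits disproportionately.

```python
import math

def erlang_c_wait(lam, mu, c):
    """Mean wait in queue (hours) for an M/M/c system (Erlang-C formula)."""
    rho, a = lam / (c * mu), lam / mu
    p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                + a**c / (math.factorial(c) * (1 - rho)))
    p_wait = (a**c / (math.factorial(c) * (1 - rho))) * p0
    return p_wait / (c * mu - lam)

mu, c = 0.5, 10    # assumed: each provider treats 0.5 patients/hour; 10 providers on shift
for label, lam in [("all arrivals", 4.5), ("50% of non-urgent redirected", 3.6)]:
    print(f"{label}: mean wait = {erlang_c_wait(lam, mu, c) * 60:.0f} min")
```

Because waits blow up as utilization approaches 100%, trimming roughly 20% of arrivals (half of a 40% non-urgent share) shrinks the mean wait far more than 20%, which is why targeted redirection is so effective.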
image of a horizontal line
Optimizing Electronic Medical Record (EMR) Usage - Beyond Adoption
image of a stethoscope on a computer Fundamental difficulties with efficient usage of EMR-based clinical information systems include their continual evolution and the inability to analyze large-scale distributed data sets to uncover information important for medical decisions. Medication orders generate about 50,000-55,000 alerts monthly. Such high frequency leads to alert fatigue: users' almost total disregard of alerts in general as a result of frequent superfluous pop-ups. Prolonged alert fatigue can negatively impact patient care, as true alerts may be ignored. Clinical data from electronic medical records (EMRs), including laboratory and imaging systems, provide a wealth of information for advancing diagnoses, optimizing health care delivery operations, and improving patient care. This project works with alerts generated from the EPIC EMR system. Prior to this work there were few specialized strategic procedures for reducing alerts beyond the standard filters provided by the EMR companies. The Center for Health Organization & Transformation (CHOT) has developed information technology (IT) approaches to better characterize types of alerts, using specialized filters for alert reduction. These automate the characterization and filtering processes so that decision support systems can be implemented on top of existing EMR systems for more effective alert management. The EMR medical-alert filtering concept and decision support tools should improve quality of care for patients and improve the work environment and morale of health care workers by reducing alert fatigue. An invention disclosure will be filed on the technology. The system can be implemented as a stand-alone decision support system for use in health-system environments. It can also be licensed to health information technology companies and incorporated within commercial EMR systems. Because design and technological development were guided by actual clinical data from EMR systems, the automatic filtering and decision support system resulting from this work should be applicable to any clinic or hospital with EMR systems. Further, it can be adopted by health information technology companies and incorporated into existing commercial EMR systems for national distribution. Economic Impact: This breakthrough can reduce the frequency of inconsequential alerts and provide an IT foundation for improving alert management strategies. It is helping set national standards for shaping the development of health information technology to enhance clinical information management. It should have positive economic impacts by reducing costs associated with missing critical alerts and with responding to unnecessary alerts. For more information, contact Eva K. Lee, 404.894.4962, eva.lee@gatech.edu.
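A stripped-down sketch of contextual alert filtering follows: each alert is classified by type and ordering context, and classes that have been vetted as superfluous in that context are suppressed while everything else escalates. The alert types, contexts, and rules are invented examples, not the CHOT filters.

```python
# Toy rule-based alert triage layered on top of an EMR alert stream.

SUPPRESS = {
    ("duplicate-therapy", "topical"),      # e.g., two topical creams rarely interact
    ("dose-range", "renewal"),             # unchanged renewal of a tolerated dose
}

def triage(alert):
    """Return 'suppress' or 'escalate' for one alert record."""
    key = (alert["type"], alert["context"])
    return "suppress" if key in SUPPRESS else "escalate"

alerts = [
    {"type": "duplicate-therapy", "context": "topical",   "drug": "hydrocortisone"},
    {"type": "dose-range",        "context": "new-order", "drug": "warfarin"},
    {"type": "dose-range",        "context": "renewal",   "drug": "lisinopril"},
]
for a in alerts:
    print(a["drug"], "->", triage(a))
```

In practice such rules would be derived from, and audited against, historical alert outcomes, so that reducing alert fatigue never hides a truly critical warning.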
image of a horizontal line
Multi-Project Interdependency Mapping
Health care organizations are frequently faced with the problem of simultaneous projects, initiatives, implementations, or transformations, without always having a clear understanding of how these efforts interrelate or support each other. Multi-project interdependency mapping is a tool applied to multiple transformations in order to increase the absorptive capacity of health care organizations to effectively implement and sustain innovations. The project relies upon both narrative and numeric responses to standard interview items in detailed interviews with dozens of leaders in each health system. Subsequent mapping is based upon the leaders' discussions of interdependencies or linkages among projects, the prioritization assigned by leaders to each transformational effort, and these leaders' perceptions of the relative reliance of each effort upon each of four organizational technologies: administrative, information, clinical/work, and social technologies. The organizational technologies framework is derived from theories of control and coordination and from socio-technical conceptualizations of organizations. The framework was developed by CHOT researchers, has been used to compare and contrast a number of major transformation efforts, and is currently being applied in two studies of organizational change. image of two doctors talking at a table Economic Impact: Top leadership in two large health systems identified this study as being of critical importance to their organizations' learning and bottom lines. The health systems studied were engaged in numerous transformational programs such as EMR implementation, Six Sigma, culture change, physician engagement, Baldrige review, and ongoing initiatives around quality, patient safety, and cost-effectiveness. For more information, contact Bita Kash, 979.845.0652, bakash@srph.tamhsc.edu.
Center for High-Performance & Reconfigurable Computing (CHREC)
A CISE-funded Center
University of Florida, Alan George, 352.392.5225, george@hcs.ufl.edu
George Washington Univ., Tarek El-Ghazawi, 202.994.2607, tarek@gwu.edu
Brigham Young University, Brent Nelson, 801.422.6455, nelson@ee.byu.edu
Virginia Tech, Peter Athanas, 540.392.7250, athanas@vt.edu
Center website: http://www.chrec.org/
image of a horizontal line
Novo-G: Most Powerful Reconfigurable Supercomputer
The supercomputer Novo-G, developed at the Center for High-Performance Reconfigurable Computing (CHREC), may be the world's fastest reconfigurable supercomputer, likely faster than the Japanese supercomputer touted as the world's most powerful. This innovative new form of supercomputing has a dramatically different internal structure than existing supercomputers. Most of the world's other computers feature microprocessors with fixed-logic hardware structures, and all software applications for these systems must conform to those fixed structures, which can lead to a significant loss of speed and increased energy consumption. By contrast, reconfigurable computers can adapt to match the unique needs of each application, which can lead to much faster speeds and less wasted energy. Later in 2011, CHREC researchers will double the reconfigurable capacity of Novo-G, an upgrade requiring only a modest increase in size, power, and cooling, unlike upgrades to conventional supercomputers. Novo-G offers computing advantages in areas such as genome research, cancer diagnosis, and plant science; indeed, in any area that requires analysis of large data sets. This technology was awarded the Alexander Schwarzkopf Prize for Technological Innovation by the I/UCRC Association in 2012. image of Alan George and Herman Lam standing at their supercomputer image of a computer chip Economic Impact: Novo-G's 192 reconfigurable processors rival the speed of the world's largest supercomputers at a tiny fraction of their cost, size, power, and cooling. Conventional supercomputers, some the size of a large building, can consume millions of watts of electrical power and generate massive amounts of heat, whereas Novo-G is about the size of two home refrigerators and consumes less than 8,000 watts. Much of science depends on computers; most depends on computers doing many computations extremely fast. Reconfigurable computing in general and Novo-G in particular are bringing innovative new concepts and technologies that are expected to have a major positive economic impact. Computing and related information technologies are key drivers of economic development. If scientists can do complex study calculations in 30 seconds instead of hours, days, or weeks, it changes the way science is done. The benefits of such advances to science and to the nation are extremely difficult to quantify, but they will undoubtedly be substantial. For more information, contact Alan George, 352.392.5225, george@hcs.ufl.edu or Herman Lam, 352.392.2689, hlam@ufl.edu.
Center for Hybrid Multicore Productivity Research (CHMPR)
A CISE-funded Center
University of Maryland, Baltimore County, Milton Halem, Director, 410.455.3140, halem@umbc.edu
Georgia Institute of Technology, David Bader, 404.385.0004, bader@cc.gatech.edu
University of California-San Diego, Sheldon Brown, 858.534.2423, sgbrown@ucsd.edu
Center website: http://chmpr.ucsd.edu/
image of a horizontal line
Distributed Cloud Computing: 3-D Visualization Services for Climate Data on Demand
This study is a collaboration between the Center for Hybrid Multicore Productivity Research (CHMPR) at UMBC and the Center for Advanced Knowledge Enablement (CAKE) at FIU and FAU. image of a heat map of the Earth Measuring the surface temperature of the entire Earth on a daily basis is a difficult challenge because 75% of the planet is covered with oceans and ice. Continuously determining, for several days to weeks, the vertical thermal field around a hurricane surrounded by dynamically rotating clouds is needed for more accurate landfall predictions. Thus, for applications ranging from climate change to hurricanes, satellites measure the Earth's emitted infrared radiation twice daily with sufficiently high spatial and spectral resolution to provide an estimate of vertical profiles of regional or global surface brightness temperature (BT). However, in order to assess global warming, these temperatures need to be measured to within an accuracy of 0.10 ˚C per year, since models indicate CO2 warming of roughly 2-3 ˚C over 100 years. Moreover, to resolve the structure around hurricanes, infrared data at resolutions of 1-5 km are needed. Not until 2002, when the Aqua satellite was launched, had there been a single satellite with instruments that could meet both the accuracy and the spatial resolution required. In this multi-center collaborative project, researchers from CHMPR at UMBC and CAKE at Florida International University (FIU) and Florida Atlantic University (FAU) have developed a capability to deliver a decade of 3-D gridded arrays of animated visualizations of spectral IR satellite radiance data from instruments on Aqua. These animations render in 3-D the vertical structure of a decade of global and regional temperature trends occurring at the surface and in the lower troposphere. In addition, the gridding algorithm developed by CHMPR has been applied to providing CAKE with 3-D temperature profiles that specify the thermal structure around hurricanes in order to improve landfall prediction. computer image of atmospheric temperatures CHMPR and CAKE have implemented a distributed cloud-computing web-based service, called SOAR, that makes this visualization capability available as a public service on an advanced IBM-based server cluster. The system provides researchers and students with the ability to select regions and time periods and automatically transform IR orbital satellite data into spherical grid arrays of 3-D temperature profiles for viewing the continuously changing thermal structure of the atmosphere. The FIU site at CAKE augmented the satellite data visualization by providing spatiotemporal visualization and animation of the data (http://cake.fiu.edu/SOAR), and the FAU site at CAKE has developed tools for 3-D visualization of the vertical temperature profiles. Coupled with the CHMPR gridding software, these tools render, for the past decade, the first integrated, scientifically validated multi-year infrared brightness temperature record. Economic Impact: Fundamental decadal data records are highly desired products recommended by the National Academy of Sciences/National Research Council. The SOAR distributed cloud-computing web-based service enhances NASA's ACCESS program by providing fundamental brightness temperature records. This can go a long way toward improving scientific and public understanding of the nature of global and regional climate change.
As a result, everyone can be better positioned to design any necessary policies and actions for mitigating negative impacts on the economy. For more information, contact Valerie Thomas, 410.455.2862, valeriet@umbc.edu or Naphtali Rishe, 305.672.6471, rishe@fiu.edu or Borko Furht, 561.297.3180, borko@cse.fau.edu.
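The heart of the gridding step can be sketched compactly: bin irregular orbital footprints into a regular latitude/longitude grid by averaging. The toy below uses invented data; the CHMPR algorithm also handles quality control, vertical levels, and time.

```python
# Toy gridding of orbital (lat, lon, brightness temperature) samples.
import numpy as np

def grid_brightness(lats, lons, bt, res_deg=1.0):
    """Average swath samples into a (180/res x 360/res) global grid; NaN where empty."""
    ny, nx = int(180 / res_deg), int(360 / res_deg)
    iy = np.clip(((lats + 90) / res_deg).astype(int), 0, ny - 1)
    ix = np.clip(((lons + 180) / res_deg).astype(int), 0, nx - 1)
    total = np.zeros((ny, nx))
    count = np.zeros((ny, nx))
    np.add.at(total, (iy, ix), bt)    # accumulate samples into their grid cells
    np.add.at(count, (iy, ix), 1)
    with np.errstate(invalid="ignore"):
        return np.where(count > 0, total / count, np.nan)

rng = np.random.default_rng(1)
lats, lons = rng.uniform(-90, 90, 10000), rng.uniform(-180, 180, 10000)
bt = 288 - 0.7 * np.abs(lats) + rng.normal(0, 2, lats.size)   # toy BT field, in kelvin
grid = grid_brightness(lats, lons, bt)
print(grid.shape, round(float(np.nanmean(grid)), 1))
```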
image of a horizontal line
Specialized Graphics Processors
Until this year, supercomputers were based on tens of thousands of commodity processors like the Intel and AMD multicore chips, with 2 to 8 processors, found in ordinary personal computers (PCs). These PCs contain specialized graphics cards that use hundreds of processors on their chips to render animations for games, simulations, and videos, and that are very fast and cheap. These graphics chips (GPUs) have evolved software and hardware that can do more than graphics rendering: they can also perform complex floating-point arithmetic. Lockheed Martin, a CHMPR member, supported a project at UMBC to study and test the performance of these GPUs when added to commodity-based clusters. The company wanted to know whether such GPUs can accelerate the solution of a system of equations with more than a million unknowns. Such problems lead to enormous matrices of 1 million by 1 million terms, or more than 30 terabytes (32x10^12 bytes, i.e., 32 million million), well beyond the capability of any computer to hold all the data in memory. This data-intensive problem therefore requires continuously moving data from disks into and out of memory so that the processors can compute on the data and then store results back on disk for future operations, and all of these operations need to work in parallel. The method chosen for solving such equations is Gaussian elimination, implemented by transforming the matrix into lower and upper triangular forms for direct and very fast solutions. Such problems are common in economics, chemistry, computer science, physics, and engineering. diagram of how the computer chips interconnect The application here performs LU decomposition on a disk-based, out-of-core, 64-bit precision, full complex-valued, non-symmetric matrix. Even with high-speed interconnects, disks, and CPUs, the solution time for 1 million unknowns exceeds 25 days on a single multicore commodity chip. As a test case for Lockheed Martin, this project used two systems to perform timing tests: one based on the company's Cray computing node with an AMD chip and an Nvidia GPU, the other based on the CHMPR computing node with an Intel chip and also an Nvidia GPU. A key result: adding the graphics co-processor (the Nvidia GPU) to the system reduced the wall-clock time for solving a problem of 40,000 unknowns from 5 hours of solid computing to 40 minutes. Further studies show the potential to reduce this time to less than 2 minutes when more recent GPUs are combined with solid-state disks. Other government sponsors, such as the NOAA National Centers for Environmental Prediction (responsible for operational weather and climate forecasting) and the NSA Laboratory for Physical Sciences, are supporting research into the resiliency of such hardware configurations when scaling to hundreds of millions of such processors. Economic Impact: Enormous cost savings and new performance studies become possible for such critical problems when using the capabilities developed at CHMPR for capitalizing on the parallel nature of the architecture. This work made general accelerator technologies more feasible for solving large 64-bit complex-valued matrices exceeding 1M unknowns. It is general enough that any type of accelerator with C or Fortran subroutine interfaces can be used to great effect, reducing design turnaround times.
For more information, contact Shujia Zhou, szhou@umbc.edu or Milton Halem, 410.455.3140, halem@umbc.edu.
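For readers curious about the structure of the computation, the sketch below shows a right-looking blocked LU factorization, the organization that makes both out-of-core operation and GPU acceleration natural, since most of the work lands in the large trailing-matrix multiply that accelerators excel at. This toy version is in-memory, real-valued, and unpivoted (it assumes a diagonally dominant matrix), unlike the production out-of-core complex solver.

```python
import numpy as np

def blocked_lu(A, nb=64):
    """In-place LU (L unit-lower, U upper) of a square matrix, block size nb."""
    n = A.shape[0]
    for k in range(0, n, nb):
        e = min(k + nb, n)
        # factor the diagonal block with simple unblocked elimination
        for j in range(k, e):
            A[j+1:e, j] /= A[j, j]
            A[j+1:e, j+1:e] -= np.outer(A[j+1:e, j], A[j, j+1:e])
        # panel updates: U row panel and L column panel
        L11 = np.tril(A[k:e, k:e], -1) + np.eye(e - k)
        A[k:e, e:] = np.linalg.solve(L11, A[k:e, e:])
        A[e:, k:e] = A[e:, k:e] @ np.linalg.inv(np.triu(A[k:e, k:e]))
        # trailing update: the big matrix-matrix multiply a GPU accelerates
        A[e:, e:] -= A[e:, k:e] @ A[k:e, e:]
    return A

rng = np.random.default_rng(0)
M = rng.normal(size=(256, 256)) + 256 * np.eye(256)   # diagonally dominant: safe w/o pivoting
A = blocked_lu(M.copy(), nb=64)
L, U = np.tril(A, -1) + np.eye(256), np.triu(A)
print("max factorization error:", np.abs(L @ U - M).max())
```

In an out-of-core solver, each panel and trailing block is staged between disk and memory, and the trailing update (a dense GEMM) is precisely the operation handed to the GPU, which is why the measured speedups are so large.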
Center for Identification Technology Research (CITeR)
A CISE-funded Center
Clarkson University, Stephanie Schuckers, Director, 315.268.6536, sschucke@clarkson.edu
West Virginia University, Bojan Cukic, 304.293.9686, bojan.cukic@mail.wvu.edu
University of Arizona, Judee Burgoon, 520.621.5818, jburgoon@cmi.arizona.edu
Center website: http://www.citer.wvu.edu/
image of a horizontal line
Automated Detection of Altered Fingerprints
For over 100 years, fingerprint identification has been used successfully to identify suspects and victims, primarily in law enforcement and forensics. It has now become the backbone of broad security applications at border crossings, in civil registration, and in access control to secure buildings or computer logins. With the widespread deployment of Automated Fingerprint Identification Systems (AFIS), there have been growing instances worldwide of individuals (particularly criminals who wish to conceal their true identities and illegal aliens who wish to enter another country) altering (mutilating or destroying) their fingerprint patterns by abrading, cutting, burning, or performing plastic surgery on their fingertips in order to evade AFIS. image of finger prints melted off and altered by cutting image of fingerprints before and after alteration One of the urgent tasks faced by law enforcement and border control agencies worldwide is to detect altered fingerprints automatically, so that individuals with altered fingerprints can go through a secondary inspection to establish their true identities. Because law enforcement agencies and systems such as the Department of Homeland Security's US-VISIT and the FBI's IAFIS handle millions of fingerprints every day, this detection needs to be extremely fast and reliable, meaning very few false alarms. Research supported by the Center for Identification Technology Research (CITeR) has led to the development of an innovative approach for automatically detecting altered fingerprints based on pattern analysis techniques and mathematical modeling of fingerprints. Altered fingerprints are detected by observing abnormality in two fundamental fingerprint features: the orientation field (fingerprint ridge flow) and minutiae (ridge bifurcation and ending points). image of fingerprint detection software finding patterns in images With CITeR funding, Anil Jain and his students at Michigan State University (MSU) have developed algorithms for automatic detection of altered fingerprints. The resulting software has been licensed to Morpho (Safran Group), one of the world's leading suppliers of identification, detection, and e-document solutions. Morpho customers include the Federal Bureau of Investigation (FBI) and more than 450 government agencies in over 100 countries. The technology for automatic detection of altered fingerprints, developed by the MSU team through CITeR funding, will be integrated into Morpho products to prevent criminals and asylum seekers worldwide from evading identification through AFIS. This is an example of a successful transition from university research to a proof-of-concept to a commercial product. Economic Impact: The expected economic benefits of this breakthrough technology stem from the fact that it will foil most attempts by criminals and terrorists to alter their fingerprints. The advancement is expected to have major, albeit hard to quantify, positive economic impacts, mostly in avoided security breaches and the associated, often incalculable costs. Additional economic benefits will result from: 1) welfare programs secured by fingerprint recognition, effectively preventing fraud through fingerprint alteration; 2) prevention of criminals and other undesirable individuals from crossing national borders; and 3) forestalling asylum seekers with a prior history of criminal conviction from gaining entry where they are not wanted.
For more information, contact Anil Jain, 517.355.9282, jain@msu.edu.
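To make the orientation-field idea concrete, the sketch below is an illustration under stated assumptions, not the CITeR/MSU algorithm: it estimates a blockwise ridge-flow field from image gradients and scores abnormality as the residual of a low-order polynomial fit. Natural ridge flow is smooth and fits such a model well, while alteration scars leave large residuals; the block size, polynomial degree, and scoring are illustrative choices.

# Minimal sketch of orientation-field abnormality scoring (illustrative only;
# not the CITeR/MSU algorithm).
import numpy as np

def orientation_field(img, block=16):
    """Blockwise ridge orientation from image gradients (doubled-angle method)."""
    gy, gx = np.gradient(img.astype(float))
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(theta.shape[0]):
        for j in range(theta.shape[1]):
            sx = gx[i*block:(i+1)*block, j*block:(j+1)*block]
            sy = gy[i*block:(i+1)*block, j*block:(j+1)*block]
            # Doubled-angle averaging avoids the 180-degree ambiguity of ridges.
            theta[i, j] = 0.5 * np.arctan2(2 * (sx * sy).sum(),
                                           ((sx**2) - (sy**2)).sum())
    return theta

def abnormality_score(theta, degree=4):
    """RMS residual of a polynomial fit to cos/sin of the doubled angles."""
    ii, jj = np.meshgrid(np.arange(theta.shape[0]), np.arange(theta.shape[1]),
                         indexing="ij")
    # Design matrix of 2-D monomials up to the given degree.
    A = np.column_stack([(ii.ravel()**p) * (jj.ravel()**q)
                         for p in range(degree + 1)
                         for q in range(degree + 1 - p)])
    resid = 0.0
    for target in (np.cos(2*theta).ravel(), np.sin(2*theta).ravel()):
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        resid += np.mean((A @ coef - target) ** 2)
    return float(np.sqrt(resid))  # a high score suggests an altered print

A threshold on this score, calibrated on labeled natural and altered prints, would then route suspect images to secondary inspection.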
image of a horizontal line
Fingerprint Liveness Detection
image of a hand with each fingerprint being electronically identified Researchers at CITeR have shown that fingerprint biometric scanners, used for secure authentication, can easily be deceived using simple, inexpensive techniques with fake or dismembered fingers, a practice called spoofing. In this CITeR breakthrough, it has been demonstrated that perspiration can be used as a measure of liveness for fingerprint biometric systems. As a result, the potential for spoofing biometric fingerprint devices, one of the industry's major vulnerabilities, is being minimized. Unlike cadaver or spoof fingers, live fingers demonstrate a distinctive spatial moisture pattern when in physical contact with the capturing surface of the fingerprint scanner. The work has considerable applications for homeland security. The pattern in the fingerprint images begins as 'patchy' areas of moisture around the pores that spread across the ridges over time. Image processing and pattern recognition algorithms have been developed to quantify this phenomenon using wavelet and statistical approaches. Previously, commercial biometric devices had no mechanism to prevent spoofing. Prior to the Fingerprint Liveness Detection (FLD) research, the main approach to spoofing prevention was to combine the biometric with additional hardware measuring liveness signals such as the electrocardiogram, pulse oximetry, or temperature. Disadvantages included the need for additional hardware that was bulky and inconvenient, and that was possibly spoofable by a live (unauthorized) finger presented in combination with the spoof finger. The advantage of the new CITeR approach is that the liveness measure is naturally integrated with the biometric itself, requiring only an additional software algorithm to protect against spoofing. This research has raised the visibility of these major security issues through presentations, publications, and mainstream media (Discovery Channel, New York Times, National Public Radio) featuring FLD. As a result, industry has moved toward developing biometric devices that incorporate liveness as well as other anti-spoofing measures. These CITeR-developed algorithms are being considered by major biometric companies internationally. Researchers have developed and applied for patents that represent the next generation of the original liveness algorithms. The center universities have licensed the intellectual property to a start-up company, NexID Biometrics, LLC, incorporated and owned by the researchers. The company is now developing the technology for licensing to the biometric industry and to system integrators for integration with their devices. image of a gloved hand fingerprinting a person Economic Impact: Research performed in CITeR has been followed by thorough evaluation and commercialization through a small company. The algorithm has been customized to provide liveness detection for a variety of fingerprint sensors. Its commercialization pathways have included collaboration with CITeR affiliates on high-security applications as well as integration into mass-market swipe fingerprint sensors built into laptops. At this time, well over 1,000,000 laptops sold worldwide include versions of fingerprint liveness detection approaches derived from CITeR research. For more information, contact Stephanie Schuckers, 315.268.6536, sschucke@clarkson.edu.
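A minimal sketch of the moisture-spread idea follows; it is an illustration only (the capture interval, wavelet family, and energy feature are assumed, and this is not the NexID product code). It scores liveness by the wavelet-detail energy of the change between two captures taken a few seconds apart, since a live finger's perspiration pattern evolves while a spoof stays essentially static.

# Illustrative liveness-feature sketch (assumptions as noted above).
import numpy as np
import pywt  # PyWavelets

def liveness_score(frame_t0, frame_t1):
    """Wavelet-domain energy of the temporal change between two captures."""
    diff = frame_t1.astype(float) - frame_t0.astype(float)
    # A 2-level wavelet decomposition; the detail bands capture the 'patchy'
    # pore-scale moisture changes described in the FLD research.
    coeffs = pywt.wavedec2(diff, "db4", level=2)
    detail_energy = sum(float((band**2).mean())
                        for level in coeffs[1:] for band in level)
    return detail_energy  # threshold on a labeled corpus to decide live vs. spoof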
image of a horizontal line
Multimodal Biometric (MUBI) Toolset
The design of multi-biometric systems has become significantly easier. Researchers at the Center for Identification Technology Research (CITeR) have developed the Multimodal Biometric (MUBI) Toolset. This toolset addresses the growing need to predict and evaluate the performance of systems that integrate multiple biometric devices and/or modalities. The toolset brings together more than a dozen algorithms from the research literature and includes an embedded tutorial on multimodal biometric systems and fusion techniques. These algorithms represent all major types of biometric score normalization and fusion. The toolkit offers performance curves representing each biometric device. It then calculates ranges of performance characteristics (genuine accept vs. false accept rates) for different multi-biometric system configurations, and assists users in selecting individual device performance characteristics that meet the desired application-specific performance goal. No such tool existed before MUBI became publicly available as an open-source software product, downloadable at no charge from CITeR's Web site. The toolset supports biometric system designers, system evaluators, students, and all others interested in the performance analysis and integration of biometric systems. For developers of multi-biometric systems, MUBI significantly reduces the time needed to analyze and define the most suitable combination of biometric devices and modalities. Center developers are receiving numerous inquiries about specific tool features from companies and federal agencies. Development of MUBI continues: adding features, improving the graphical user interface, and allowing users to integrate their own experimental fusion techniques into the tool. Economic Impact: The toolset has been downloaded hundreds of times, mostly by students studying information fusion techniques in biometrics, software engineering, and sensor networks. At the time MUBI was being developed by CITeR researchers, major biometric systems in the US government (the FBI's Next Generation Identification system, the Department of Defense's Automated Biometric Identification System, etc.) were moving toward adopting such multimodal identification techniques. Currently, MUBI is being used by CITeR members to investigate and develop optimal combinations of biometric modalities for clients. CITeR is committed to keeping MUBI available free of charge through an open-source software license. For more information, contact Bojan Cukic or Arun Ross, 304.293.9686, bojan.cukic@mail.wvu.edu, arun.ross@mail.wvu.edu.
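The core operation MUBI automates can be illustrated in miniature; the sketch below uses generic textbook methods (min-max normalization and weighted-sum fusion with hypothetical scores and weights), not the MUBI implementation itself.

# Minimal score-level fusion sketch (generic methods; not the MUBI code).
import numpy as np

def min_max_normalize(scores, lo, hi):
    """Map raw matcher scores to [0, 1] using reference min/max (lo, hi)."""
    return np.clip((np.asarray(scores, float) - lo) / (hi - lo), 0.0, 1.0)

def weighted_sum_fusion(norm_scores, weights):
    """Fuse normalized scores from several matchers into one score per sample."""
    w = np.asarray(weights, float)
    return np.asarray(norm_scores, float).T @ (w / w.sum())

# Hypothetical example: fingerprint and face matchers scoring two candidates.
finger = min_max_normalize([412, 380], lo=0, hi=500)   # raw matcher units
face = min_max_normalize([0.71, 0.55], lo=0.0, hi=1.0)
fused = weighted_sum_fusion([finger, face], weights=[0.6, 0.4])
print(fused)   # a higher fused score indicates a stronger genuine-match claim

Center for Information Protection (CIP) A CISE-funded Center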
Iowa State University, Doug Jacobson, Director, 515.294.8307, dougj@iastate.edu
Stony Brook University, R. Sekar, 631.632.5758, sekar@cs.sunysb.edu
University of California-Davis, Matt Bishop, 858.534.6898
image of a horizontal line
Identity Theft Awareness & Prevention
Fraud prevention is fundamental to the survival of any business (or government, for that matter), as is prevention of consumer/customer identity theft. Researchers at the Center for Information Protection have made advances in ID theft awareness research, evaluation, and education. They have developed company awareness policies and lists of behavioral traits that make employees prone to behavior that can lead to the loss of customers' sensitive personal data. Data collected from surveys and interviews of insurance professionals as well as of consumers provided the basis for this work. The literature in behavioral and workplace psychology and business was qualitatively scrutinized for insights into high-risk and careless behavior of employees and customers. Public policies and best practices for due diligence were analyzed to more precisely identify avenues for remediation that can reduce future identity theft losses. Deliverables of this work included books on ID theft, manager and employee training materials, and presentations for corporate and public education. image of a thief shining a flashlight on a laptop Economic Impact: This work is fundamental to enabling sustainable economic development. The theft of identities has become an epidemic and threatens to disrupt business at all levels, as well as the personal lives of businesses' employees. Prevention of this theft can only be accomplished through the systematic training and education of individuals. For more information, contact Steffen Schmidt, 515.294.3825 or 515.294.7256, sws@iastate.edu.
Center for Integration of Composites into Infrastructure (CICI)
(Formerly Repair of Buildings & Bridges with Composites - RB2C)
West Virginia University, Hota GangaRao, 304.293.9986, ghota@wvu.edu
Rutgers University, Perumalska Balaguru, 732.445.2232, balaguru@rci.rutgers.edu
University of Miami (Florida), Antonio Nanni, 305.284.3391, nanni@miami.edu
North Carolina State University, Sami Rizkalla, 919.513.4336, sami_rizkalla@ncsu.edu
Center website: http://www.cemr.wvu.edu/cfc/cici/index.php
image of a horizontal line
Extending the Service Life of Bridges with Composite Wraps
Bridge infrastructure in the U.S. is aging. Bridges show their age through crumbling concrete and rough roads, due in large part to foregone and/or poor maintenance. Repairs and upgrades are often necessitated by increasing traffic loads and heavier vehicles. Traditionally, the only way to remedy these problems has been to close all or part of a highway and replace the entire structure. This process results in huge replacement costs, complicated traffic re-routings, and long delays. Researchers at the Center for Integration of Composites into Infrastructure (CICI) have developed tools that allow highway departments to save costs by using innovative materials to rehabilitate, rather than replace, failing bridges. image of a pillar before and after composite wrapping Composite wraps, consisting of fiberglass or carbon fiber fabrics saturated with a resin and bonded to concrete members, are being used to repair bridge members that are deficient due to corrosion, decay, or even accidents. At CICI, these composite wraps have been demonstrated to be highly effective on a number of projects. CICI is working with the West Virginia Department of Transportation - Division of Highways to develop standard details and specifications that allow practicing engineers to design, install, and inspect composite wraps. Composite wraps can also increase the strength of an otherwise sound member. This can enable existing structures to carry higher and heavier loads than they were originally designed to accommodate. Installation of the wraps can be done very rapidly with minimal traffic interruption. Economic Impact: Using composite wraps can extend the lifespan of a structure by a decade or more. Rehabilitation time can be reduced from months to weeks. The result is reduced costs and increased structure life. The cost savings directly benefit taxpayers by allowing limited transportation funding to improve more structures at substantially reduced cost. For more information, contact Hota GangaRao, 304.293.9986, hota.gangarao@mail.wvu.edu.
image of a horizontal line
Design Guide for Reinforcing Bridge Decks and Railings with Longer Service Capability
Imagine bridge decks that could last 75 to 100 years in service rather than the current 25 to 40 years. Corrosion is one of the main causes of infrastructure decay. Researchers at the Center for Repair of Buildings & Bridges with Composites are enabling a whole new industry to fill the need for potentially high volumes of fiber-reinforced polymer (FRP) rebar. image of bridge construction The drafting of a design guide, and successful interface with American Association of State Highway and Transportation Officials committee T-6, will enable the use of FRP reinforcing bars in bridge decks in the USA. This recently adopted design guide is the culmination of many years of research. This work means that previously funded NSF research in the field of fiber-reinforced polymers can now be used to make the Nation's infrastructure more durable and longer lasting. The document will allow fiber-reinforced polymer producers, as well as state and federal departments of transportation, to safely apply well-researched design, testing, and implementation criteria to new materials for new construction and renovations. Before these guidelines were published and adopted, there was little or no incentive for state Department of Transportation (DOT) engineers to consider fiber-reinforced polymers. Economic Impact: A primary cause of bridge deck deterioration is corrosion of steel rebar. By using noncorrosive FRP rebar, bridge decks will last many years longer, delaying costly replacement by decades. This will save taxpayers money by deferring deck replacement and by reducing delays related to bridge reconstruction. By establishing these national standards for the use of FRP rebar, state highway departments and manufacturers will have the clear specifications needed for the widespread adoption of FRP rebar. For more information, contact Fabio Matta, 803.777.1917, fmatta@sc.edu.
image of a horizontal line
Preformed Fiberglass Grating Panel Systems (GRIDFORM)
GRIDFORM consists of fiberglass grating panel systems with fiber-reinforced polymer (FRP) plate for stay-in-place use. These FRP grating panels replace steel rebar in reinforced concrete bridge decks on vehicular bridges. The grating panels are shop-fabricated and shipped to the job site ready for installation on the steel bridge girders and for the concrete pour. Field installation time for the GRIDFORM panels, including the concrete pour, is approximately 25% of that for normal steel rebar installation and concrete pour. This reduced installation time results in lower field installation costs and less disruption of service for people needing access to the bridge for travel. Additionally, reduced field installation time translates into a lower rate of construction workplace injuries. image of men working to spread cement on a bridge GRIDFORM grating panels have become recognized as a viable alternative to traditional steel-reinforced concrete bridge decks. The use of GRIDFORM panels meets the Federal Highway Administration's initiative of "Get In and Get Out." The emphasis by FHWA is to reduce the amount of construction time, and the concurrent disruption to the traveling public, by utilizing new technologies and methods for rapid construction of bridges and roads. The new technology allows the FRP grating panels to be produced at the manufacturing site of the FRP grating. This breakthrough technology has resulted in a new product line for the Strongwell plant located in Chatfield, Minnesota. Strongwell is promoting this new product line to county and state transportation officials as a time-saving alternative to traditional construction materials. Economic Impact: By using GRIDFORM panels, contractors can cut field installation time to roughly 25% of that required by conventional methods, directly resulting in lower labor costs and saving taxpayers' money through more efficient use of transportation funding. This will also save the traveling public money directly by reducing the closure time needed to replace or repair existing bridge decks, along with the time and inconvenience related to associated lane closures and detours. For more information, contact Antonio Nanni, 305.248.3391.
Center for Integrative Materials Joining Science for Energy Applications (CIMJSEA)
The Ohio State University, Sudarsanam S. Babu, 614.247.0001, babu.13@osu.edu
University of Wisconsin-Madison, Sindo Kou, 608.262.0576, kou@engr.wisc.edu
Lehigh University, John DuPont, 610.758.3942, jnd1@lehigh.edu
Colorado School of Mines, Stephen Liu, 303.273.3796, sliu@mines.edu
Center website: http://materialsjoining.osu.edu/cimjsea/CIMJSEA/Welcome.html
image of a horizontal line
Very High Power Ultrasonic Additive Manufacturing for Energy Applications
Next-generation power plants are designed to operate at higher temperatures to improve efficiency. To further improve efficiency, advanced cooling methods are needed. This requires complex heat exchanger designs with unique thermal characteristics. A new manufacturing process, Very High Power Ultrasonic Additive Manufacturing (VHP UAM), has demonstrated the ability to fabricate these complex shapes. In addition, VHP UAM is an approach to manufacturing that is capable of embedding sensors into finished parts. These sensors can then be used to reduce costly downtime by monitoring process and structure parameters, adding condition-based maintenance capabilities. image of potential hybrid examples: embedded electronics, embedded fiber optics, and complex shapes Very high power ultrasonic additive manufacturing uses a unique combination of ultrasonic energy and force to create complex metal structures from dissimilar materials. The breakthrough is that thin strips of metal(s) can now be more easily bonded to create an engineered structural component with novel thermal, corrosion, and operational properties. Economic Impact: The benefits of VHP UAM cut across many industries, including EWI and other CIMJSEA sponsors, because the process overcomes cost and geometry constraints typically associated with conventional bonding methods such as explosion bonding, cladding, and brazing. The new process enables more cost-effective engineering solutions that are essential for the next generation of efficient power plants. The ability to create complex parts, as well as to embed sensors, will extend the lifecycle and significantly reduce the manufacturing costs of next-generation power plants. See http://www.techcolumbus.org/central-ohio-startup-launches-disruptive-materials-process-technology for additional details. A new business venture (Fabrisonics) has been launched to market this technology in collaboration with EWI and Solidica. For more information, contact Suresh Babu, 614.247.0001, babu.13@osu.edu.
Center for Lasers and Plasmas for Advanced Manufacturing (CLAM)
University of Virginia, Mool C. Gupta, Director, 757.325.6850, mgupta@virginia.edu
University of Michigan, Jyoti Mazumder, 734.647.6824, mazumder@engin.umich.edu
Southern Methodist University, Radovan Kovacevic, 214.768.4865, kovacevi@engr.smu.edu
Center website: http://www.nsf.gov/eng/I/UCRC/directory/lam.jsp
image of a horizontal line
Hybrid Laser - UltraLight Steel Auto Body (ULSAB) Project
Research at the Center for Lasers & Plasmas for Advanced Manufacturing (CLAM) has made it possible to reduce the weight of vehicles and improve fuel efficiency and safety. More and more galvanized high-strength steels are being used in the automotive industry. All (100%) of the ULSAB-AVC body structure is made of high-strength steel (HSS), with over 80 percent being advanced high-strength steels (AHSS); dual-phase steels account for 74% of the AHSS. This breakthrough impacts outer body panels, inner panels, underbodies, bumpers, impact beams, reinforcements, and more. Previous methods required pre-processing and post-processing steps and were too costly for use in practice. image of spatters and blowholes A new welding procedure, which combines laser welding with gas tungsten arc welding (GTAW) used as a preheating source, has been successfully developed to lap-join galvanized dual-phase steels in a gap-free configuration. The GTAW torch leads the laser beam at a specific distance to preheat the workpieces. Under the controlled heat input from the GTAW, the zinc coating at the top surface is burned off and metal oxides are generated on the top surface of the pieces. Under these welding conditions, a stable "keyhole" is produced, which provides a channel through which the highly pressurized zinc vapor is vented. Productivity is dramatically increased by this new welding procedure in comparison with other methods for welding galvanized steels in a gap-free lap-joint configuration. Economic Impact: Current practice for laser welding of Zn-coated steel sheet in the automotive industry is to provide a slight gap between the two sheets to be welded. The gap allows the zinc vapor to escape. Problems associated with this gap drive additional process costs for each and every such joint. Therefore, significant savings can be achieved by enabling a zero-gap laser weld condition. For more information, contact Shanglu (David) Yang, shangluy@mail.smu.edu.
Image of the experimental setup for hybrid GTAW preheating-laser welding of galvanized steels in a gap-free lap joint configuration. Image of a sample of the gap-free lap joints from hybrid GTAW preheating-laser welding in galvanized dual-phase steels.
image of a horizontal line
Field Portable Welding of Titanium Tubes
image of a titanium tube Welding titanium tubes in military aircraft is an exceedingly critical and very difficult task. Ordinarily this process is accomplished in super-clean manufacturing environments by highly skilled personnel. The U.S. military would not only like to develop the capability to weld titanium tubes aboard ships and in austere, remote areas, it would also like to avoid using some of the chemicals traditionally used to clean titanium tubes prior to welding. Working with the Center for Lasers and Plasmas for Advanced Manufacturing, the U.S. military has demonstrated that the surfaces of titanium tubes can be successfully cleaned using lasers instead of caustic and environmentally harmful chemicals, successfully removing the oxidation layer and any contaminants on the outside of the tube. This laser technology provides a very accurate method of controlling the depth of oxide removal in preparation for welding. Now that the feasibility of this approach has been demonstrated, work is underway to package the technology as a portable, maintainable system for deployment in the field. Economic Impact: This technology will have economic impact in two ways. First, it reduces the use of toxic chemicals for surface oxide removal. Second, it lowers the cost of sample preparation prior to welding. As a portable system, it has both military and commercial applications. For more information, contact Mool C. Gupta, 757.325.6850, mgupta@virginia.edu.
image of a horizontal line
Extending Damage Limits of Hydraulic Systems in Military Aircraft
image of a military helicopter Titanium tubing provides the critical arteries of hydraulic systems in military aircraft. The tubing is comprised of thin-walled tubes capable of withstanding high pressures, in the range of 5,000 psi. Research at the Center for Lasers and Plasmas for Advanced Manufacturing (LAM) at the University of Virginia has helped in assessing the ability to expand the damage limits of the tubing; that is, how much sustained damage can be safely tolerated. Expanding the damage limit can reduce maintenance man-hours and lower operational support costs. Research results demonstrated that there were additional margins in some areas, which translated into expanded damage limits. As a result of this work, aircraft are performing much better from a maintainability standpoint. This should result in considerable savings to the military over the next 15-20 years. Economic Impact: Knowing the structural damage limits under realistic operating conditions can avoid the premature failure of components. Premature failure can cause loss of aircraft and loss of life. For these reasons, this work enhances safety and is having large economic impacts. For more information, contact Mool C. Gupta, 757.325.6850, mgupta@virginia.edu. image of wiring
image of a horizontal line
Ultra Lightweight Structures Using Carbon Nanotubes
graph of carbon nanotube loading Ultra-lightweight materials capable of electronic conduction are needed by the National Aeronautics and Space Administration and the military. Ultra-lightweight electrically conducting materials would provide structures for electromagnetic interference (EMI) shielding in commercial and space applications, development of advanced sensors, lower-cost canopies for aircraft, lightning protection, electronic packaging, printed circuit boards, and more. Research at the Center for Lasers and Plasmas for Advanced Manufacturing (LAM) at the University of Virginia has shown that ultra-lightweight electrically conducting materials can be obtained by incorporating lightweight carbon nanotubes into polymeric materials. Research has demonstrated that the weight of the nanotube composites can be further reduced by conversion to foam structures; a density of 0.56 g/cm3 was obtained. These kinds of flexible conductive composites may be used for typical antenna systems, lightning-protected aircraft composite panels, avionics line replaceable unit (LRU) enclosures, connector gaskets, electrostatic and space charge dissipation materials, and different types of electronic pressure-sensitive switches or sensors. The University of Virginia has filed a patent application on this technology due to its large commercial and defense application potential. Economic Impact: Increasing amounts of electromagnetic signals are emitted from a variety of electronic components. If nearby equipment is not adequately shielded, these electromagnetic signals may cause interference. Electronic shielding of many components is therefore essential. Lightweight electrically conducting nanocomposites will find applications in the shielding of military components, biomedical instruments, and instruments used in daily life such as cell phones, computers, laptops, radios, and CD players. The economic impact of lightweight electrically conducting nanocomposites is substantial but difficult to quantify. For more information, contact Mool C. Gupta, 757.325.6850, mgupta@virginia.edu. SEM images of CNT nanocomposite.
image of a horizontal line
Laser Texturing of Surfaces and Commercial Applications
Laser processing provides a unique method of modifying material surfaces by depositing large amounts of energy onto the surface of a material in a tightly controlled manner. Research at the Center for Lasers and Plasmas for Advanced Manufacturing (CLAM) at the University of Virginia has helped to develop enhanced textured surfaces on metals and semiconductors. The laser treatment causes pillars to form on the treated surface. These pillars provide greater light absorption for solar energy conversion, enhanced light detection, improved tissue growth for body implants, higher catalytic activity, and better heat sinks. This research is leading to the formation of a new high-technology company for commercial products and defense applications. Because of its large commercial and defense application potential, the University of Virginia has filed an industry-supported patent application. This technology can be used in solar energy applications for efficient trapping of sunlight incident at different angles. Microtextured surfaces can also be used for anti-icing applications. Economic Impact: Ice buildup is a major problem for commercial and military aircraft, blades for wind energy generation, refrigeration systems, and outdoor antennas. For these reasons, the economic impacts of this technology for key industries and for the nation are substantial but difficult to quantify precisely. For more information, contact Mool C. Gupta, 757.325.6850, mgupta@virginia.edu.
Center for Metamaterials (CfM)
CUNY City College, Director, David Crouse, 212.650.5330, dcrouse@ccny.cuny.edu
Clarkson University, S.V. Babu, 315.268.2336, babu@clarkson.edu
University of North Carolina at Charlotte, Michael Fiddy, 704.687.8594, mafiddy@uncc.edu
Center website: http://174.143.170.127/iucrc/publicFactSheetServlet?centerId=63
image of a horizontal line
Fast, Flexible, and Accurate Algorithms for Metamaterial Device Design
Fast and accurate optical modeling tools are essential to the efficient development of new photonic, plasmonic, and metamaterial devices for a wide range of applications, including optical and infrared sensors, imaging systems, solar cells, and other renewable energy devices. These applications are important to almost all CfM member companies. Unfortunately, commercially available modeling and design programs have significant shortcomings: they are slow, not user-friendly, and prone to errors when used to model realistic metamaterial structures and devices. The commercially available programs lack the ability to simulate situations that would be encountered in real life and have limited ability to test metamaterials exposed to complex light and radiation patterns. Bottom line: with existing programs it is difficult and time-consuming to extract the desired information from the modeling results. image of solar panels Researchers at CfM are addressing these limitations by developing new programs and algorithms specifically designed to quickly and accurately model metamaterials. These algorithms improve the speed of modeling and simulation, provide better accuracy, and organize the data in the ways desired by photonics scientists and engineers. This CfM research project has received much sponsor interest. Soon the program will be at a state where it can be distributed to the other member companies of the CfM. This project is ongoing, and the program will be continually developed to add capabilities and improve its speed and ease of use. Image of the RCWA Modeling Program from the Center for Metamaterials: this application models and analyzes several types of optical materials and offers quick graphical construction and easy data extraction; the core algorithm is fast and accurate, and employs different approximations depending on the structure's nature and the desired speed and accuracy of the output. There are several advantages to this software. First, the program improves the speed of modeling metamaterial and optical devices by a factor of 10,000 relative to existing commercially available programs. Second, it can be more accurate for designing realistic optical materials. Third, it is easy to add on design optimization functions, scans of radiation patterns, and coupling of the optical modeling to electrical and thermal modeling programs. These advantages represent a breakthrough in metamaterials modeling and design. Early versions of the software are already being tested and used by one of the CfM member companies, Phoebus Optoelectronics, to model and design an optical sensor for NASA and other projects funded by the United States Army and DARPA. Phoebus is using the program's ability to scan hundreds of possible device structures to finalize device designs. Economic Impact: This breakthrough technology will reduce design and development time for new photonic, plasmonic, and metamaterial devices that can be applied in a variety of research, commercial, and defense applications. Examples of these applications include high-efficiency solar cells and other renewable energy devices (hydrogen and methanol generation devices), advanced optical sensors and imaging systems, ultra-high-bandwidth communication devices, and optical coatings for cloaking, stealth, and electromagnetic shielding. The resulting improvements in design and development efficiency should ultimately reduce product costs, thereby enhancing the competitiveness of American industry in the global marketplace. For more information, contact David Crouse, 212.650.5330, dcrouse@ccny.edu.
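For readers unfamiliar with this class of solvers, the toy below illustrates the semi-analytic style of computation involved. It is a plain 1-D transfer-matrix reflectance calculator for planar films at normal incidence (a standard textbook method, not the CfM RCWA code, which additionally handles periodic structures and complex illumination).

# Toy 1-D transfer-matrix solver for a planar layer stack (normal incidence).
import numpy as np

def stack_reflectance(n_layers, d_layers, wavelength, n_in=1.0, n_out=1.0):
    """Reflectance of a layer stack with refractive indices n_layers and
    thicknesses d_layers (same units as wavelength), at normal incidence."""
    def interface(n1, n2):
        # Field-continuity matrix across an interface between media n1 and n2.
        return np.array([[n1 + n2, n2 - n1],
                         [n2 - n1, n1 + n2]], dtype=complex) / (2 * n2)
    def propagate(n, d):
        # Phase accumulation through a homogeneous layer of thickness d.
        phi = 2 * np.pi * n * d / wavelength
        return np.array([[np.exp(-1j * phi), 0],
                         [0, np.exp(1j * phi)]], dtype=complex)
    M = np.eye(2, dtype=complex)
    n_prev = n_in
    for n, d in zip(n_layers, d_layers):
        M = propagate(n, d) @ interface(n_prev, n) @ M
        n_prev = n
    M = interface(n_prev, n_out) @ M
    r = -M[1, 0] / M[1, 1]          # no backward-traveling wave in the exit medium
    return float(abs(r) ** 2)

# Example: quarter-wave antireflection coating (n = 1.38) on glass at 550 nm.
print(stack_reflectance([1.38], [550 / (4 * 1.38)], 550, n_in=1.0, n_out=1.5))

Center for Microcontamination Control (CMC)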
Northeastern University, Ahmed A. Busnaina, Director, 617.373.2992, busnaina@coe.neu.edu
University of Arizona, HG "Skip" Parks, Director, 520.621.6180, parks@ece.arizona.edu
Center website: www.ece.arizona.edu/~cmc/ or www.cmc.neu.edu
image of a horizontal line
Physical Removal of Nanoscale Particles from Surfaces and Trenches
Researchers at the Center for Microcontamination Control (CMC) have developed a substrate-independent technique for the removal of nanoparticles (down to 26 nm) from large areas in a very short time (less than a minute). The technique uses high-frequency acoustic streaming in a tool specially designed by CMC researchers. The technique has also been shown to be capable of efficiently removing nanoparticles from deep trenches (as deep as 500 microns). This is the first time that such removal from trenches has been directly demonstrated. The technique has been applied to semiconductor wafers, hard disk media and heads, and flat panel displays, and is the first demonstration of substrate-independent removal of nanoparticles. These techniques enable companies to remove nanoscale particles without any effect on sensitive substrates. The techniques are based on reducing the boundary layer thickness from thousands of microns to submicron levels. This allows even low flow velocities to remove nanoparticles and opens the field to many other applications that require a high shear velocity near the surface. A patent has been filed, and a member company (PCT Systems) has already made two prototypes. Another member company (Seagate) is ordering an additional prototype to evaluate for its fabrication development, based on the center's results in removing manufacturing defects. For more information, contact Ahmed Busnaina, 617.373.2992, a.busnaina@neu.edu.
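The boundary-layer claim can be sanity-checked with the classical Stokes (oscillatory) boundary-layer formula, delta = sqrt(2*nu/omega). The short calculation below uses illustrative values only (not the CMC tool's design parameters) and shows why megasonic-frequency streaming confines the layer to submicron thickness.

# Back-of-envelope Stokes boundary-layer thickness at several frequencies.
import math

nu = 1.0e-6                      # kinematic viscosity of water, m^2/s
for f_hz in (1e3, 1e5, 1e6):     # kHz to megasonic frequencies (assumed values)
    omega = 2 * math.pi * f_hz
    delta = math.sqrt(2 * nu / omega)
    print(f"f = {f_hz:9.0f} Hz -> boundary layer ~ {delta*1e6:6.2f} um")
# At ~1 MHz the layer is ~0.56 um, thin enough that even modest streaming
# velocities produce the high near-wall shear needed to detach nanoparticles.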
image of a horizontal line
Observing Bacteria on the Inside Walls of Ultrapure Water Pipes
A decade ago, the Center for Microcontamination Control sponsored work that led to Polymerase Chain Reaction (PCR) amplification of DNA in ultrapure-water-borne bacteria. Eventually, a process was developed that could measure one bacterium in one liter of water. Subsequently, it became apparent that most of the bacteria reside on the walls of the ultrapure water piping, at concentrations 10,000 to 1,000,000 times greater. Semiconductor process contamination results when small areas of the bacterial colonies or biofilms are released from the surface at infrequent and random intervals. Once this was understood, it was realized that the primary issue is detecting the nucleation and growth of bacterial films on the piping materials used in the distribution of ultrapure water. The breakthrough was the development of a novel technology that can monitor and detect the growth of surface bacterial films. This technology can be made so sensitive that it can detect the protein substance that must deposit before the first layer of bacteria attaches to the piping walls. It can also be made less sensitive for less demanding applications. image of water pipes Economic Impact: This device will be of most use in the pharmaceutical industry, where bacterial monitoring is now coming under more scrutiny and is more serious than in the semiconductor industry. The CMC is currently working with a small business to develop and manufacture this detector. For more information, contact Jon Sjogren, jsjogren@ece.arizona.edu.
Center for Nondestructive Evaluation (CNDE)
Iowa State University, Lisa Brasche, Director, 515.294.5227, lbrasche@cnde.iastate.edu
Center website: http://www.cnde.iastate.edu/
image of a horizontal line
Generic Scanner to Image NDE Data
image of the scanner next to a laptop The Generic Scanner, or "GenScan," has demonstrated the ability to take off-the-shelf, relatively inexpensive nondestructive evaluation (NDE) flaw detectors and combine them with non-encumbering position encoding devices and newly developed software. This creates semi-automated NDE scanners with far greater capability than previously available scanners. This CNDE research was initially funded by the Federal Aviation Administration to develop improved methods for inspecting composite aircraft structures. The main advantage of the generic scanner is that it allows inspectors to create images that provide a more intuitive and thorough inspection of relatively large areas of commercial aircraft, e.g., composite control surfaces. The system has been designed to mate with a number of portable NDE devices used throughout the aviation industry. Image scans from the scanner can be readily saved and transmitted electronically for further off-site analysis. GenScan has been successfully coupled with several eddy current and ultrasonic flaw detectors. A number of scanner prototypes have been assembled and beta-site tested at several aviation maintenance facilities in the civil and military sectors. These GenScans increase the inspection capabilities of a variety of existing NDE devices, enable maintenance organizations to extend the use of instruments they already own, and mitigate the need to purchase more expensive, specialized NDE instruments with built-in imaging systems. Economic Impact: The generic scanner will provide more robust inspections, particularly of relatively large featureless areas such as those encountered on composite aircraft. It is being actively reviewed by the US Navy and US Air Force for its ability to increase existing inspection capabilities, and it can be easily adapted to new inspection applications. For more information, contact David K. Hsu, 515.294.2501, dkhsu@iastate.edu or Dan J. Barnard, 515.294.9998.
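Conceptually, the software's central task is to fuse position-encoder readings with flaw-detector output into an image. The sketch below shows that idea in miniature; the data format and gridding scheme are assumptions for illustration, not the CNDE software's actual interfaces.

# Minimal sketch of mapping encoded scan samples to a C-scan style image.
import numpy as np

def build_cscan(samples, x_range, y_range, resolution_mm=1.0):
    """samples: iterable of (x_mm, y_mm, amplitude) tuples from the scanner."""
    nx = int((x_range[1] - x_range[0]) / resolution_mm) + 1
    ny = int((y_range[1] - y_range[0]) / resolution_mm) + 1
    image = np.full((ny, nx), np.nan)        # NaN marks unscanned pixels
    for x, y, amp in samples:
        i = int(round((y - y_range[0]) / resolution_mm))
        j = int(round((x - x_range[0]) / resolution_mm))
        if 0 <= i < ny and 0 <= j < nx:
            # Keep the worst-case (maximum) indication seen in each pixel.
            image[i, j] = amp if np.isnan(image[i, j]) else max(image[i, j], amp)
    return image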
image of a horizontal line
Dripless Bubbler: Portable Scanner for Aircraft Inspection
Researchers at the Center for Nondestructive Evaluation (CNDE) have developed a fieldable ultrasonic scanning system for aircraft inspection. Developed by the center's Composite Group, the "Dripless Bubbler" is the first portable ultrasonic scanner to combine a closed-cycle water couplant with a high-frequency focused ultrasonic beam. It can be attached to the fuselage of an aircraft to inspect it for hidden corrosion. It uses a unique closed-cycle pump/vacuum water-handling system together with focused transducers. The focused ultrasonic beam leads to superior image resolution and more accurate determination of the metal loss due to corrosion. The Dripless Bubbler has the unique capability of scanning over protruding rivets on the aircraft skin, and its closed-cycle water handling makes it compatible with the safety requirements of maintenance hangars. Because the device performs ultrasonic inspection with a focused beam, it provides much-improved resolution and sensitivity compared to previous methods. The resolution afforded by the focused transducer makes it a useful tool for mapping out the depth profile of corrosion. Economic Impact: The Dripless Bubbler received an R&D 100 award. It was licensed to and commercialized by Sierra Matrix, Inc. of Fremont, California. The technology was also used in addressing the corrosion problem of KC-135 wing skins around fasteners. For more information, contact David K. Hsu, 515.294.2501, dhsu@cnde.iastate.edu.
image of a horizontal line
Time-Proven “Coin Tap” Test Automated
The hearing-based manual tap test, practiced widely by aircraft inspectors, was computerized and automated to give it quantitative and imaging capabilities and to take the "human factor" variation out of the inspection procedure. The tapping action was automated with the invention of a magnetic cam-action cart. Equally spaced, uniform taps are made as the cart is pushed over the part's surface. A simple encoding method gave the system a previously unavailable imaging capability. The computer-aided tap tester (CATT) has proven effective for the inspection of both composite structures and metal honeycomb structures on a wide variety of aircraft control surfaces. It also provides quantitative inspection results in the form of images that can be archived electronically. image of an airplane Economic Impact: The technology was patented and licensed to a start-up company, Advanced Structural Imaging, Inc., in 2001. Two of the original inventors of the CATT participated in the company. So far, Boeing and other aircraft manufacturers and NDE R&D organizations have purchased ten units from the company. For additional information, contact David K. Hsu, 515.294.2501, dhsu@iastate.edu.
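The quantitative basis of an instrumented tap test can be sketched with a simple spring model, an idealization consistent with the general tap-test literature rather than the licensed CATT code: treating the tap as half a period of a mass-spring oscillation gives a local contact stiffness k = m*(pi/tau)^2, so longer contact times map to softer, possibly damaged regions.

# Spring-model sketch of a computerized tap test (assumed values throughout).
import math

def local_stiffness(tap_mass_kg, contact_time_s):
    """Contact stiffness (N/m) inferred from one instrumented tap."""
    return tap_mass_kg * (math.pi / contact_time_s) ** 2

# Example with assumed numbers: a 20 g tapper, 600 us contact over sound
# structure vs. 900 us over a disbond.
print(local_stiffness(0.020, 600e-6))   # ~5.5e5 N/m
print(local_stiffness(0.020, 900e-6))   # ~2.4e5 N/m (softer -> possible damage)

Center for Particulate and Surfactant Systems (CPaSS)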
University of Florida, Brij Moudgil, Director, 352.846.1194, bmoudgil@erc.ufl.edu
Columbia University, Ponisseril Somasundaran, 212.854.2926, ps24@columbia.edu
Center website: http://iucrc.perc.ufl.edu/
image of a horizontal line
Prototype Greenness Index for Mineral Resource Development
A new Greenness Index, based on an integration of the Twelve Principles of Green Chemistry and ICMM's ten principles for Sustainable Mining Development, is in the prototype stage. This tool will bridge the existing gap between focused efforts in the chemical and mineral processing industries. The Greenness Index will allow mining companies to evaluate the multifarious aspects of sustainability in mineral processing operations. Supported by several major mining companies, this collaborative and comprehensive effort is the first of its kind. Over the next twelve months, the algorithm will be tested and refined for chemically assisted flotation operations. This will ultimately allow sustainable processes to be engineered with their economic and environmental impacts in view. image of a mine interior Economic Impact: This new initiative, and the progress made in the past year, has already spurred great interest in the mining and chemical industries by providing a framework and tools to evaluate and account for chemicals across the mining life cycle. The work is designed to provide the necessary metrics to evaluate current industry standards. With iterative refinements to the prototype, we plan to develop a robust tool that can suggest greener alternatives to old industry standards. Nationally, it promotes awareness of the use and impact of chemicals, and open accountability. The integration of the Twelve Principles of Green Chemistry and ICMM's ten principles for Sustainable Mining Development into the prototype is essential to promoting greener mineral processing. For more information, contact Ponisseril Somasundaran, 212.854.2926, ps24@columbia.edu.
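The published description gives the principle sets being integrated, not the scoring formula, so the sketch below is only one plausible shape such an index could take: a weighted average over sustainability criteria, with the criteria, weights, and 0-10 scale all assumed for illustration.

# Hypothetical weighted-average Greenness Index sketch (all values assumed).
def greenness_index(scores, weights):
    """scores: criterion -> 0..10 rating; weights: criterion -> relative weight."""
    total_w = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_w

# Hypothetical flotation-circuit assessment mixing green-chemistry and
# sustainable-mining criteria.
scores = {"reagent_toxicity": 6.0, "water_use": 4.5, "energy_use": 5.0,
          "waste_generation": 7.0}
weights = {"reagent_toxicity": 3, "water_use": 2, "energy_use": 2,
           "waste_generation": 3}
print(f"Greenness Index: {greenness_index(scores, weights):.2f} / 10")  # 5.80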
image of a horizontal line
Sustainable Mineral Resource Recovery
One of the major sustainability challenges facing the mineral industry today is the selective and efficient recovery of strategically important valuable minerals and metals from low-quality complex ores. Processing of such ores is characterized by poor mineral recovery and high water and energy consumption. Flotation separation, still the most widely used method, is severely hampered by the presence in ores of waste minerals, notably silicates. While the adverse effects of waste minerals are well recognized, the root causes of these effects remained elusive, impeding the development of robust solutions to the problem. image of network formation by fibrous minerals leading to poor mineral recovery and process efficiency In collaboration with Vale and Cytec Industries, CPaSS researchers have uncovered important root causes. The pioneering work has demonstrated that the shape and morphology of the waste silicates have large negative impacts on processing efficiency, in some cases far greater than those resulting from the chemistry of such silicates. Most notably, strong evidence was found for complex fibrous networks in suspensions of the ground ores. These networks dramatically increase slurry viscosity and reduce the efficiency of gas dispersion and bubble-particle attachment, all of which lead to poor separation and process efficiency. The prevailing belief in the scientific community was that the detrimental effect of waste silicates was due to hetero-coagulation between the silicates and the valuable minerals, generally referred to as slime coating. CPaSS research has demonstrated that platy, acicular, or fibrous particles interact and entangle in ore suspensions to form micro- and macro-networks, which result in dramatic changes in slurry rheology, gas dispersion, and bubble-particle attachment. Such network formation leads to several undesirable consequences: transport of large amounts of non-value silicates into the valuable mineral concentrate by bubble flux, reduction in valuable mineral recovery, and high water and energy consumption. Our initial discoveries were made in Ni ores. Economic Impact: This novel research will have a significant scientific and technological impact, allowing us to devise ways to enhance the selective separation of valuable minerals from complex, poor-quality ores while consuming less water and energy, thereby addressing the sustainability challenges facing U.S. industry for the foreseeable future. Plant operators will benefit by implementing our recommendations and solutions, particularly in diagnosing the problem before processing and in making better decisions on the selection of reagents and conditions. For more information, contact Ponisseril Somasundaran, 212.854.2926, ps24@columbia.edu.
Center for Pharmaceutical Development (CPD)
Georgia Institute of Technology, Andy Bommarius, Director, 404.385.1334, andreas.bommarius@chbe.gatech.edu
University of Kentucky, Eric Munson, 785.864.3319, eric.munson@uky.edu
Center websites: http://cpd.gatech.edu and cpd.uky.edu
image of a horizontal line
Better/Cheaper Drugs: New Routes to Active Pharmaceutical Ingredients
image of pharma manufacturing of pills Many active pharmaceutical ingredients, the part of a drug formulation responsible for its beneficial action, contain an amine group, similar to amino acids, the building blocks of life. Moreover, such amine groups have to be present as a single enantiomer and not a mixture; in other words, the amine groups have to have a very specific orientation in space, or else the drug most often is either ineffective or even detrimental (recall the case of Thalidomide, where the presence of the wrong enantiomer causes birth defects). Pure amines are difficult to synthesize, so difficult in fact that the Pharmaceutical Roundtable of the American Chemical Society Green Chemistry Institute listed the generation of such pure amines from easily accessible ketone precursors as the second-highest priority among novel, aspirational reactions. A team of CPD researchers has developed a novel protein biocatalyst that achieves just such a transformation of ketones to amines. They started from a known protein biocatalyst and engineered it to accept ketones and to synthesize the desired amines in great stereochemical purity. Being able to catalyze the conversion from ketone substrates to amine products is such an important addition to the toolbox that it stands to develop into a platform technology, applicable to the synthesis of a wide variety of targets in several therapeutic areas. Production of active pharmaceutical ingredients via biological routes stands to increase yields and shorten process routes through the enhanced selectivity of key steps. One CPD sponsor intends to make use of this new capability in its own drug synthesis development efforts. The pharmaceutical company will receive the first batch of protein in 2012, less than one year after development of the ultimately successful protein template began. The company has indicated that it has several potential targets to which it will apply this new technology. Economic Impact: This technology could impact the synthesis of drugs that contain chiral centers adjacent to nitrogen by providing more efficient methods for their manufacture. Not only is this process considerably more economical than existing processes that use traditional, wholly chemical routes, it also leaves a substantially reduced environmental footprint. The impact on the manufacture of a single blockbuster drug can be in excess of a billion dollars over the lifetime of such a drug. In a comparable case, Merck recently published an improved route to the active ingredient of Januvia® and Janumet®, a drug soon to reach blockbuster status; that route is said to be at least 25% cheaper than the current one. In addition, such biologically inspired manufacturing with a reduced environmental footprint is welcomed by most communities because the process of procuring permits is greatly facilitated. In summary, more efficient manufacture of pharmaceuticals will help drive down the cost of drugs to patients and help keep high-end pharmaceutical manufacturing jobs in the US. As a result, this innovation can be expected to have broad economic impact across the pharmaceutical and fine chemical industries. For more information, contact Andreas Bommarius, 404.385.1334, andreas.bommarius@chbe.gatech.edu.
Center for Precision Forming (CPF)
Ohio State University, Taylan Altan, 614.292.5063, altan.1@osu.edu
Virginia Commonwealth University, Muammer Koc, 804.827.7029, mkoc@umich.edu
Center website: http://nsm.eng.ohio-state.edu/cpf/
image of a horizontal line
Forming Advanced High Strength Steels (AHSS)
image of a car body along with a pie chart Technology to manufacture automotive components from AHSS assists the automotive industry and its suppliers in producing vehicles that are light in weight, use less fuel, and generate less pollution. CPF members such as Honda, General Motors, and Johnson Controls are benefiting from CPF's research on forming AHSS. Research at the Center for Precision Forming focuses on many aspects of the forming behavior of high-strength steels, including stamping, bending, edge cracking, die wear, lubrication, and hot stamping. Knowledge gained from this research will assist further implementation of advanced high-strength steel (AHSS) in the automotive industry, with the obvious benefit of lighter-weight vehicles yielding improved fuel economy and decreased carbon emissions. Mild steel has traditionally been used for vehicle structures and closure panels because of its low cost and good formability. Implementation of AHSS can enable the use of thinner-gage sheet (less volume and less mass) for comparable structural performance. The challenge is that AHSS grades are much more difficult to form and assemble; this research advances the forming technology. Due to their higher strength, automotive parts made from these steels exhibit considerable springback, so maintaining part shape and tolerances requires a deeper understanding of the mechanical characteristics of these materials. CPF is extending well-known tests and developing new methods for predicting springback and for forming these materials into defect-free components for automotive manufacturing. The direct result is mass reduction without sacrificing performance and safety. Economic Impact: This forming technology advancement will help proliferate AHSS in vehicle structures through improved stamping, trimming, and assembly procedures for automotive sheet metal components, thus improving the viability and competitiveness of the U.S. automotive industry. Because this technology assists in producing vehicles that are lighter, more fuel efficient, and less polluting, the economic impact on the nation should be significant. For more information, contact Taylan Altan, 614.292.5063, altan.1@osu.edu.
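The springback challenge can be illustrated with the textbook plane-strain pure-bending approximation R0/Rf = 4x^3 - 3x + 1, where x = Y*R0/(E*t). The sketch below is an elastic, perfectly plastic idealization with assumed material values, not CPF's predictive models (which account for hardening, tooling, and process effects); it shows why a high-yield-strength AHSS grade springs back several times more than mild steel for the same bend.

# Textbook springback estimate for plane-strain pure bending (idealized).
def springback_ratio(yield_mpa, radius_mm, thickness_mm, E_mpa=210_000):
    x = yield_mpa * radius_mm / (E_mpa * thickness_mm)
    return 4 * x**3 - 3 * x + 1          # R0/Rf; values below 1 mean the part opens up

# Same 10 mm bend radius, 1 mm sheet: mild steel vs. an AHSS grade
# (assumed yield strengths for illustration).
for name, ys in [("mild steel (~250 MPa)", 250), ("DP980 AHSS (~700 MPa)", 700)]:
    print(name, springback_ratio(ys, 10, 1))   # ~0.964 vs. ~0.900

Center for Process Analytical Chemistry (CPAC)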
University of Washington, Brian Marquardt, Director, 206.685.0112, marquardt@apl.washington.edu
University of Washington, Mel Koch, Marketing Director, 206.616.4869, mel@cpac.washington.edu
Center website: http://www.cpac.washington.edu/
image of a horizontal line
New Sampling and Sensor Initiative (NeSSI™)
image of NeSSI components Although process analyzers have undergone significant technological advances recently, the systems that deliver samples to them have hardly changed in the last fifty years. This initiative, launched in 2000 by the Center for Process Analytical Chemistry (CPAC), is primarily concerned with the treatment and continuous analysis of samples extracted from process equipment. NeSSI™ provides specifications, guidelines, and a forum for the ongoing development of a Lego®-style building-block platform for analytical systems. CPAC's leadership has developed NeSSI™ into a valuable global ad hoc initiative of end users and suppliers that is producing permanent changes within the process analysis field. The platform has now been commercialized by several hardware manufacturers and is being specified on major new-build projects. Several engineering companies now provide networked connectivity for all components of the NeSSI™ platform, opening the door to smart diagnostics and improved remote technical support. NeSSI™ is now used to enable improved process analytical measurements in the petrochemical, chemical, and oil refining industries, and is being studied for applications in the pharmaceutical and biotechnology industries. Economic Impact: These new NeSSI™ platforms are showing reduced costs (both the cost to build and the cost to own) together with increased reliability, and hence greater value delivered by process analyzers. The combined benefits to industry are growing and will soon be measured in tens of millions of dollars per year. image of a gas pipeline
image of a horizontal line
Non-Destructive Spectroscopic Measurement: Inline Octane Sensor
The Center for Process Analytical Chemistry (CPAC) pioneered a revolutionary approach to octane determination in oil refineries. This ground-breaking octane sensing method uses a non-destructive spectroscopic measurement followed by multivariate calibration techniques to predict diverse physical, chemical, and consumer properties of fuel. This industry-changing advance was possible because the spectrum of the material clearly reveals the number and types of functional groups (e.g., methyls, methylenes, olefins, aromatics), which in combination determine gasoline's physical, chemical, and consumer properties. As a bonus, the octane sensor can simultaneously predict a number of important properties of gasoline such as density, vapor pressure, and percent aromatics. All of these measurements are made nondestructively on one cc of sample, and results are available instantaneously. The approach has proven an invaluable adjunct to process analytical chemistry. image of a gas station Economic Impact: This method represents a vast improvement over previous octane determination methodologies, in which gasoline octane levels were determined by rather antiquated ASTM-CFR test engines that compared the sample's performance to reference fuel blends. The instrumentation required for that measurement was very expensive (over $100,000), required constant maintenance, needed frequent standardization, consumed approximately one pint of gasoline per test and, most importantly, required 20 minutes to produce results. The now commercially available, in-line, real-time "octane sensor" is used worldwide by oil refining companies because it quickly and accurately predicts octane levels in real time from the near-infrared vibrational spectrum of the inline sample. Worldwide, this in-line octane sensor saves the gasoline refining industry many millions of dollars per day; an estimated 1.5 billion dollars per year. image of NeSSI For more information, contact James Callis, 206.543.1208, callis@u.washington.edu or Mel Koch, 206.616.4869, mel@cpac.washington.edu.
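The multivariate calibration step can be sketched in a few lines. The example below uses synthetic stand-in data and generic partial-least-squares regression, not CPAC's proprietary calibration models; it simply shows the train, validate, and predict pattern such an inline sensor relies on.

# Generic PLS calibration sketch for spectrum-to-octane prediction.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# X: (n_samples, n_wavelengths) NIR absorbances; y: lab-measured octane numbers.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))               # synthetic stand-in for real spectra
y = X[:, 40] * 2.0 - X[:, 120] + rng.normal(scale=0.1, size=60) + 90.0

pls = PLSRegression(n_components=5)
print(cross_val_score(pls, X, y, cv=5, scoring="r2"))  # calibration quality check
pls.fit(X, y)
octane_pred = pls.predict(X[:1])             # 'inline' prediction for one sample
print(float(octane_pred.ravel()[0]))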
image of a horizontal line
Robust Vapochromic Sensors for O2 Sensing
Vapochromic materials are metallo-organic compounds whose absorption or emission spectrum changes in the presence of analytes (chemicals) of interest. This vapochromic sensor methodology allows selective and fast spectroscopic detection of a broad range of analytes (permanent gases, common solvents, moisture, small-molecular-weight organics, and halo-organics). To illustrate the breakthrough, the measurement of oxygen in gases, or as dissolved oxygen in water, can be done with such sensors. With suitable fiber optic connections, these sensors can be used for remote process or environmental measurements, many in hostile environments. The technology for this oxygen sensor has now been demonstrated in a variety of applications, and it has unique advantages over existing approaches. Sensor uses range from process measurements for production plant operations to medical and environmental sensing applications. Economic Impact: A variety of other potential vapochromic sensors are being studied that could have wide applications and economic impacts for a variety of industries. For more information, contact Brian Marquardt, 206.685.0112, marquardt@apl.washington.edu or Mel Koch, 206.616.4869, mel@cpac.washington.edu.
image of a horizontal line
Process Chemometrics
Through the efforts of the Center for Process Analytical Chemistry (CPAC), the tools of chemometrics were introduced to the chemical industry, allowing important process and product performance quantities to be obtained from indirect chemical measurements. An important example is the calculation of a fuel performance standard, such as the octane number of gasoline, from infrared and near-infrared spectroscopic data obtained on-line during the blending process. Another is the estimation of product performance at an early stage of the manufacturing process, such as the elongation strength of finished polymer fibers. For the first time, Multivariate Statistical Process Control (MSPC) allowed manufacturing processes to be controlled using all of the process measurements together, as opposed to the old methods of Statistical Process Control (SPC), which demanded the analysis of control charts for each process variable. Chemometric methods have allowed industrial chemists and engineers to extract all of the available information from data acquired during the manufacturing process. Additionally, chemometrics has provided tools to determine the actual value of process measurements and/or control parameters, leading to major cost savings by discontinuing the acquisition of useless information. image of a refinery Several new companies have been established to help chemical and material companies learn to use the chemometric tools developed at CPAC. Most chemical, material, pharmaceutical, food, and fuel companies depend on internal chemometric groups that do exploratory and routine analysis of process and product data. In addition, many analytical instruments now incorporate chemometrics in their operating systems. Recent efforts at CPAC to enhance the treatment of data, including the development of alignment algorithms for GC data and data fusion methods for multisensor analysis, will further add value to the field of chemometrics. Economic Impact: The impact of utilizing chemometrics amounts to many millions of dollars through its effect on process control and optimization across many industries. For more information, contact Brian Marquardt, 206.685.0112, marquardt@apl.washington.edu, or Mel Koch, 206.616.4869, mel@cpac.washington.edu.
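The jump from univariate SPC charts to MSPC is easy to see in code. The sketch below is a generic PCA-plus-Hotelling-T^2 monitor on synthetic data, illustrative rather than any CPAC deliverable: it watches all process variables jointly and raises one alarm statistic instead of one control chart per variable.

# Generic MSPC sketch: PCA model plus Hotelling's T^2 statistic.
import numpy as np

def fit_pca_monitor(X_ref, n_components=2):
    """Fit a PCA monitoring model on in-control reference data X_ref."""
    mu, sigma = X_ref.mean(axis=0), X_ref.std(axis=0)
    Z = (X_ref - mu) / sigma
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                             # loadings
    lam = (S[:n_components] ** 2) / (len(X_ref) - 1)    # score variances
    return mu, sigma, P, lam

def t2_statistic(x_new, mu, sigma, P, lam):
    """Hotelling T^2 for one new multivariate observation."""
    t = P.T @ ((x_new - mu) / sigma)                    # scores of the new sample
    return float(np.sum(t**2 / lam))                    # large T^2 -> out of control

# Example with synthetic in-control data and one disturbed sample.
rng = np.random.default_rng(1)
X_ref = rng.normal(size=(200, 6))
mu, sigma, P, lam = fit_pca_monitor(X_ref)
x_ok = X_ref[0]
x_bad = x_ok + 5.0 * sigma * P[:, 0]                    # disturbance along first PC
print(t2_statistic(x_ok, mu, sigma, P, lam))            # small: in control
print(t2_statistic(x_bad, mu, sigma, P, lam))           # large: raise the alarm

Center for Research on Information Technology & Organizations (CRITO)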
University of California - Irvine, Vijay Gurbaxani, Director, 949.824.5215, vgurbaxa@uci.edu
Center website: http://www.crito.uci.edu/
image of a horizontal line
The Net-Enabled Organization
The forces of globalization and networking technologies are affecting the ways that companies work and do business. The commercial airplane unit of Boeing, for example, is partnering with a large number of groups around the world to build the new 787 Dreamliner airplane. Behind that network of organizations is a backbone of infrastructure for an information-rich environment. In such an environment, information needs to be available anywhere and anytime to anyone who needs it. To make this happen most efficiently, hierarchical organization structures are needed in which decisions are made, information flows up and down, and needed approvals can be obtained. Simultaneously, everyone can look horizontally across multiple organizations to build teams, to complete tasks and to solve problems within their scope of responsibility. With the help of CRITO, Boeing is working to improve understanding of how to orchestrate networks with internal and external partners. image of the Boeing 787 Center research is providing the theoretical basis as Boeing transforms itself into a different kind of organization. The underlying hypothesis is that those companies that are best at orchestrating these networks and at using them to provide the most value to their customers will be the most successful. One important dimension of the solution is better understanding what a company needs to keep under tight internal control versus what it can or should safely network. Using the tools of social science and business research, including case studies, interviews, surveys, modeling and analysis, CRITO researchers are working on the conceptual aspects of the networked business model and helping to create the processes and tools to work more effectively in these new structures. Economic Impact: Benefits of this approach are less inventory, fewer facilities, and services that are market driven instead of internally driven. Profitability can be increased and companies can be more responsive. In the case of Boeing, airplanes can be built more efficiently. For more information, contact Vijay Gurbaxani, 949.824.5215, vgurbaxa@uci.edu.
Center for Resource Recovery and Recycling (CR3)
Worcester Polytechnic Institute, Diran Apelian, Director, 508.831.5992, dapelian@wpi.edu
Colorado School of Mines, Brajendra Mishra, Associate Director, 303.273.3893, bmishra@mines.edu
Katholieke Universiteit Leuven, Bart Blanpain, Associate Director, (32) 16.321216, bart.blanpain@mtm.kuleuven.ac.be
Center website: http://www.wpi.edu/academics/Research/CR3/index.html
image of a horizontal line
Recovering Clay and Seacoal from Waste Green Sand Foundry Dust
There was unanimous agreement at a recent International Sand Reclamation Conference that the low rate of reuse of foundry sand is one of the highest-priority problems confronting the industry. The annual generation of foundry waste (including dust and spent foundry sand) in the United States is thought to range from 9 to 13.6 million metric tons (10 to 15 million tons). Researchers from the NSF Center for Resource Recovery and Recycling (CR3) at Victaulic, a charter sponsor of the center, and at the Kroll Institute for Extractive Metallurgy at the Colorado School of Mines have developed a more efficient process to liberate beneficial products (clay and seacoal) from waste green sand foundry dust. The use of sand casting in U.S. foundries is well established. Sand casting is the least expensive of all casting processes. It economically produces rough metal parts. Raw castings are then machined to produce finished products. image of construction equipment Through this research a hydrocyclone process was developed that doesn't use harsh chemicals and reclaims approximately 80% of the clay and seacoal from the dust. The advantage of this work for industry is that it offers a more economical and more environmentally friendly process for recovering clay and seacoal from green sand foundry dust and from waste green sand than was previously available. This work improves the method for disposing of green sand foundry dust and recycling of binding materials. Binders are materials added to a sand mold to bond sand particles together (i.e., the glue that holds the mold together). Economic Impact: This breakthrough impacts original equipment manufacturers that use finished metal parts. It minimizes costs related to waste disposal of green sand foundry dust and the cost of buying bond. This CR3-based innovation is estimated to save 0.5 to 1 million USD per green sand foundry per year. It will also reduce or remove accumulations of waste materials. For more information, contact Corby Anderson, 303.273.3580, cganders@mines.edu or Jim Van Wert, 610.559.3389, JVanWert@victaulic.com.
Center for Virtual Proving Ground Simulation (CVPGS)
University of Iowa, L.D. Chen, Director, 319.335.5674, ldchen@engineering.uiowa.edu
University of Texas-Austin, Raul Longoria, 512.471.0530, r.longoria@mail.utexas.edu
Center website: http://www.nsf.gov/eng/iip/iucrc/directory/vpg.jsp
image of a horizontal line
National Advanced Driver Simulation Facility
Researchers at the Center for Virtual Proving Ground Simulation (CVPGS) at the University of Iowa and the University of Texas at Austin have developed enabling technologies that use the world-class National Advanced Driving Simulator (NADS) facility for operator/driver-in-the-loop simulation for vehicle design and for driving safety research. The NADS facility, completed in 2002, has been used by Deere & Co., Caterpillar Inc., Continental-Teves, Bosch and others for “virtual proving” experiments, and by the National Highway Traffic Safety Administration for highway safety studies. Economic Impact: The National Advanced Driving Simulator is the only driving simulator in the world in which such activities can be carried out in a full 360-degree immersive virtual environment with high-fidelity motion cues in all six degrees of freedom of vehicle motion. A new approach has been adopted that uses a commercially available multi-body dynamics code for real-time simulation. At this time it is not possible to offer a statement on the economic impact of this work. For more information, contact L. D. Chen, 319.335.5674, ldchen@engineering.uiowa.edu. computer-generated image of a driving simulation
Ceramics Composites & Optical Materials Center (CCOMC)
Clemson University, Philip Brown, Director, 864.656.6072, pjb@clemson.edu
Rutgers University, Rich Haber, 732.445.4931, rhaber1@rci.rutgers.edu
Center website: http://ccmc.rutgers.edu
image of a horizontal line
Ceramics - Normally Opaque - Made Highly Transparent
Researchers at the Ceramics, Composites, and Optical Materials Center have refined ceramics to a glass-like transparency that lets light pass through. For millennia ceramics have been commonly opaque and brittle. image of transparent ceramics This work, conducted at Clemson University and initially funded by the Department of Defense, has developed ceramics that are highly transparent and considerably tougher than conventional analogs. Such materials are of value to a wide variety of applications including high-power compact lasers, transparent armor, and radiological sensing technologies. Conventional ceramics are opaque because light is scattered from residual voids in the granular microstructure. Clemson researchers developed processes that fully remove voids and limit structural evolution so that the new ceramics possess features that are smaller than the wavelength of light. More specifically, a two-step temperature process was established that yields ceramics with granular structures that are only 300 nm in average diameter. For comparison, a human hair is about 300 times larger. The attainment of full density while maintaining sub-wavelength granular dimensions permits transparency equivalent to glass. The reduction in granular size scales also yields significant enhancements in the mechanical hardness and toughness of the ceramics, as required for armor and high-power laser applications. Economic Impact: As with many other important modern technologies, transparent ceramics were invented in the United States. However, for over a decade now, Japan has been the world leader in the production of transparent ceramics, largely due to a decline in US funding and science competitiveness. This work, originally supported to help regain domestic know-how in fabricating transparent ceramics, has additionally facilitated the establishment of a domestic education and industrial supply chain critical to the future use of these materials in various defense, security, and sensing applications. While there are a few commercial examples of transparent ceramics, they remain a fairly young technology. Accordingly, the exact economic impact is unclear, but ceramics certainly have the potential to displace crystals and glasses in numerous high-technology fields as development continues. To that end, lasers, ultra-hard materials, and sensors represent multi-billion-dollar domestic industries that will only become more important as defense and security threats expand. For more information, contact John Ballato at 864.656.1035, jballat@clemson.edu.
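The “smaller than the wavelength” criterion reflects classical Rayleigh scattering, cited here as standard background rather than as a center-specific result: for a scattering feature of diameter \(d\) much smaller than the wavelength \(\lambda\), the scattering cross-section falls off as

\[
\sigma_{s} \propto \frac{d^{6}}{\lambda^{4}}, \qquad d \ll \lambda,
\]

so eliminating pores and holding grain dimensions well below visible wavelengths (roughly 400-700 nm) suppresses scattering strongly enough for glass-like transparency.
Cooling Technologies Research Center (CTRC)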
Purdue University, Suresh Garimella, Director, 765.494.5621, sureshg@purdue.edu
Center website: https://engineering.purdue.edu/CTRC/
image of a horizontal line
Graphene-Based Thermal Interface Materials (TIM)
Efficient heat dissipation is critical in many applications in integrated electronic circuits and other similar applications. Thermal interface materials (TIMs) are necessary for heat dissipation because microscale gaps/voids between thermal transfer surfaces (e.g., between a CPU and heat sink) are unavoidable. The heat transfer efficiency across these interfaces can be dramatically enhanced by filling these gaps/voids with appropriate TIMs. Graphene, a single atomic layer of graphite with a honeycomb lattice structure, exhibits very high thermal conductivity (3000-5000 W/m-K). This makes it an outstanding candidate for TIM applications. Graphene flakes can be vertically grown on multiple substrates by either plasma enhanced chemical vapor deposition (PECVD) or chemical reduction from graphite oxide. A mixture of commercial thermal paste or polymer with graphene flakes is another candidate graphene-based TIM option. image of scanning electron micrograph of vertically aligned graphene petals grown by microwave plasma enhanced chemical vapor deposition and bonded to a substrate for use as low thermal resistance interfaces CTRC researchers are synthesizing and characterizing both graphene composites and vertically grown graphene. Researchers have synthesized vertical graphene sheets on silicon and copper substrates and measured their thermal interface resistance. The measured resistance is among the lowest values reported in the literature, indicating excellent promise for high-performance interface applications. Also, simulations of the vertical graphene properties have demonstrated that the synthesized graphene density on the substrate is a key factor determining composite thermal interface resistance. By isolating this trend, researchers are able to match simulation results with experimental data for the first time. Currently, the TIM industry primarily uses thermal greases and pastes. One of the common problems is that these may leave unwanted residues on the surface after the TIM is removed. Vertically grown graphene can be easily applied to a surface by growing graphene on thin supporting layers or directly on the surface of the products that need increased thermal contact. Even though excellent preliminary thermal performance has been measured, facilitating its use in industry applications requires further attention. Economic Impact: Companies that improve their high-performance thermal interface materials will improve their chip and device packaging strategies. Graphene-based TIMs can possibly reduce costs. One economic impact has been reduced headcount and overhead for member companies, because CTRC faculty and students perform this preliminary, foundational research. CTRC provides industry with the methods, modeling and data needed to implement graphene TIMs in their devices. It is difficult to precisely measure the impact of this work, but it has been estimated at between 1 and 10 million USD for at least one member company. For more information, contact Suresh Garimella, 765.494.5621, sureshg@purdue.edu or Xiulin Ruan, 765.494.5721, ruan@purdue.edu.
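A simple one-dimensional conduction estimate shows why TIM conductivity matters so much. The sketch below compares the thermal resistance of a fixed bond-line gap filled with air, a typical grease, and a higher-conductivity material; all numbers are illustrative assumptions, not CTRC measurements.

```python
def gap_resistance(thickness_m, k_w_per_m_k, area_m2):
    """One-dimensional conduction resistance R = t / (k * A), in K/W."""
    return thickness_m / (k_w_per_m_k * area_m2)

area = 1e-4   # 1 cm^2 die footprint (assumed)
gap = 25e-6   # 25 micrometer bond-line thickness (assumed)
for name, k in [("air-filled gap", 0.026),
                ("typical thermal grease", 4.0),
                ("high-conductivity TIM (assumed)", 40.0)]:
    r = gap_resistance(gap, k, area)
    # Temperature rise across the interface at a 100 W heat load
    print(f"{name:32s} R = {r:7.4f} K/W -> {100 * r:7.2f} K at 100 W")
```

Even this crude estimate reproduces the qualitative point in the text: the interface, not the bulk silicon or copper, dominates the thermal path unless it is filled with a high-conductivity material.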
image of a horizontal line
Miniature Piezofan Arrays for Cooling Electronics
Innovative, miniature piezoelectric fans have been developed in this CTRC project into a viable technology for meeting a variety of cooling needs in portable and small-scale electronic devices. Analytical tools have been developed for modeling the flow field, heat transfer, and fan structure; flow-structure interaction is currently being investigated to allow the design of optimal cooling systems. Studies have been done to test and characterize the thermal, electromechanical, fluid dynamic and acoustic performance of piezoelectric fans. Interactions between multiple fans are being studied; coupling effects between the fans can cause the amplitude to increase by up to 40% over that of a single fan. images of the fan and the heat transfer distributions brought about by two fans vibrating in front of a heated surface In smaller devices, where rotary fans are not practical and electronics are pushed to the limits of their heat dissipation capacities, piezoelectric fans offer the only realistic cooling solution that meets the noise and power constraints of portable devices. Piezofans have no bearings or wearing parts that cause cyclic breakdown or noise concerns. These fans are small, silent and very low-power devices. They present no electromagnetic interference, nor will they affect magnetic fields. They produce negligible heat and are reliable over a wide temperature range. Piezofans are cost-effective, highly efficient and lightweight, and use simple circuitry. In order to realize the potential of piezoelectric fans in industry, CTRC optimized the design of fan blades based on robust electromechanical and flow-structure modeling. They are well suited to providing supplemental cooling in hot spots and other stagnant areas in devices where rotary fan action is ineffective. Applications are in wireless devices, video game systems, automotive applications and multimedia systems where compact size, low noise and low power consumption are essential. Economic Impact: Center research has advanced the science to a stage where piezofans can now be used in applications where cooling requirements must be met in low-profile products such as laptops and cell phones. The impact of piezofans on many industries and markets is on the cusp of increased commercial use. Piezofans are a solution to many problems in industries where the traditional rotary fan has failed. They result in lower cost, lower noise, lower power consumption, better size capability and better reliability. Center members have new products aligned to use this technology in ways that will give them a marketing edge. For more information, contact Suresh Garimella, 765.494.5621, sureshg@purdue.edu.
image of a horizontal line
Validated Models for Particulate Thermal Interface Materials
Thermal interface materials (TIMs) continue to be a bottleneck for developing the next generation of microprocessors with smaller chip sizes and increased power. Development of better TIMs is imperative to ensure efficient heat removal from microelectronic systems, which in turn improves system reliability and performance. Accurate modeling of thermal interface materials requires either complex 3-D computational simulations or improved analytical models. Most existing models do not consider particle-particle interactions. Many fail when volume loading exceeds 30%. Numerical modeling of realistic three-dimensional microstructures (at high filler volume loadings) considering inter-particle interactions was performed using full-field meshless simulations and random particle network simulations. The developed models are validated with experiments on representative systems. images of Bingham fluid model prediction of polymeric thermal interface material squeezing force (top row) for multiple hierarchically nested channel designs (bottom row) The models can be efficiently used to accurately predict the effect of varying: 1) the filler particle conductivity; 2) the base polymer matrix conductivity; and 3) the size distribution and arrangement of the filler particles, on the composite thermal conductivity of TIMs. These models are expected to provide critical help in the design of high-performance TIMs. Economic Impact: Precise contact resistance values, studies of different materials, and analyses of TIM degradation are necessary for heat transfer models in the electronic cooling industry. Improved models will allow for the design of more efficient heat sinks for electronic applications and lead to lower energy use and higher performance. The thermal interface materials research that comes out of CTRC is immediately used by the member companies and has an impact in power electronics, telecommunications, cellular base stations and mobile phones, automotive electronics, portable/wearable electronics, pervasive computing devices, electric vehicle batteries, power distribution systems in computers, large-scale servers, military electronics and avionics. For more information, contact Suresh Garimella, 765.494.5621, sureshg@purdue.edu or Ganesh Subbarayan, 765.494.9770, ganeshs@purdue.edu.
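For context, the classical analytical baseline that such improved models move beyond is the Maxwell effective-medium relation for spherical fillers, which ignores particle-particle interactions and is known to degrade above roughly 30% loading. The sketch below implements that textbook baseline (not the CTRC models) with assumed property values.

```python
def maxwell_k_eff(k_matrix, k_filler, phi):
    """Maxwell effective-medium conductivity for spherical fillers at
    volume fraction phi; valid only for dilute, non-interacting particles."""
    num = k_filler + 2 * k_matrix + 2 * phi * (k_filler - k_matrix)
    den = k_filler + 2 * k_matrix - phi * (k_filler - k_matrix)
    return k_matrix * num / den

# Example: ceramic particles (k ~ 30 W/m-K, assumed) in a polymer
# matrix (k ~ 0.2 W/m-K, assumed) at increasing volume loadings.
for phi in (0.1, 0.3, 0.5):
    print(f"loading {phi:.0%}: k_eff = {maxwell_k_eff(0.2, 30.0, phi):.3f} W/m-K")
```

Because the predicted enhancement stays modest even at 50% loading, while real high-loading composites show inter-particle effects this formula cannot capture, the particle-network and meshless simulations described above are needed.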
image of a horizontal line
Two-Phase Transport in Microchannels
Researchers at the Cooling Technologies Research Center are exploring boiling and two-phase (liquid and vapor) flow in microchannels. Transport through microchannels that range in width from 100 to 400 micrometers in copper and silicon substrates has been characterized. A predictive model has also been formulated that aids in the design and optimization of microchannel heat sinks. image of state-of-the-art silicon thermal test chip used for investigation of two-phase flow in microchannels cut directly into the chip surface; the test chip is capable of producing uniform heat dissipation and local measurements of temperature This CTRC work has resulted in a better understanding of transport in microchannels, and hence in rendering microchannel heat sinks implementable in electronics cooling applications. Several novel experimental and modeling tools have been developed. Laser-induced fluorescence thermography is used to measure the liquid temperature during flow boiling heat transfer within microchannels. Infrared Particle Image Velocimetry (IR-PIV) is being developed as a tool to make measurements inside silicon microstructures (with no optical access), capitalizing on the transparency of silicon to infrared light. System-level analysis of microchannel cooling systems, with an emphasis on design for energy efficiency and manufacturability, is now possible through a software tool developed in the Center. CTRC research has developed models that enable the use of this technology in cost-sensitive, very high power electronic applications. Liquid cooling using microchannels enables high-power electronics in hybrid vehicles, avionics and spacecraft. This technology was awarded the Alexander Schwarzkopf Prize for Technological Innovation by the I/UCRC Association in 2011. Economic Impact: Microchannel heat sinks that employ two-phase liquid cooling are compact and require less pumping power for cooling, making them an attractive option from an economic perspective. The ease of integration of these devices into high-power electronic systems is an added advantage. The net cost associated with thermal management devices can be reduced by 60 to 70 percent. For more information, contact Suresh Garimella, 765.494.5621, sureshg@purdue.edu.
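The pumping-power advantage of two-phase cooling follows from a simple energy balance: vaporizing a coolant absorbs far more heat per unit mass than merely warming it. A hedged back-of-envelope comparison, using textbook water properties rather than CTRC data:

```python
Q = 100.0              # heat load to remove, W (assumed)
h_fg = 2257e3          # latent heat of vaporization of water, J/kg
cp, dT = 4180.0, 10.0  # specific heat (J/kg-K) and allowed temperature rise (K)

m_two_phase = Q / h_fg     # mass flow needed if the coolant fully vaporizes
m_single = Q / (cp * dT)   # mass flow needed for single-phase (sensible) cooling
print(f"two-phase flow:    {m_two_phase * 1e3:.3f} g/s")
print(f"single-phase flow: {m_single * 1e3:.3f} g/s "
      f"({m_single / m_two_phase:.0f}x more)")
```

The roughly fifty-fold reduction in required flow rate in this idealized comparison is what makes two-phase microchannel heat sinks attractive where pumping power and package size are constrained.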
image of a horizontal line
Enhancement of Heat Pipe Transport Properties and Thin Film Evaporation
One of the key limiting factors in electronics technology today is the ability to remove heat from the processor inside nearly every device. Heat pipes are commonly used in electronics cooling applications due to their ability to move large amounts of heat over reasonable distances with only small drops in temperature. This efficient heat transport is due to the phase change of an internal working fluid from liquid to vapor, and to porous structures embedded inside which passively drive the working fluid. Efficiency increases would allow a device to operate with less temperature drop across the heat pipe, thereby keeping electronic components relatively cooler at the same heat load. While typical devices in industry today use randomly packed particles as a porous wick, this work has shown how device performance may be dramatically improved by designing the wick structure at the microscale. image of prediction of static meniscus shape formed by water in an arranged packed bed of copper particles (left) and complex toroidal vortex formed due to Marangoni convection, obtained via three-dimensional simulation of evaporation from a wick pore (right) Economic Impact: As devices in industry are packaged into ever-decreasing sizes, microscale design techniques promise substantial impacts. By operating heat pipes at a lower temperature, the overall power consumption of the cooling solution is decreased because auxiliary cooling components, such as axial fans which dump heat to the ambient atmosphere, do not have to work as hard. Improvement of heat pipe devices may dramatically aid the semiconductor industry. Thermal management technology, such as heat pipes, which has the ability to dissipate higher heat loads, has the potential to spur technological and economic growth in these industries. For more information, contact Suresh Garimella, 765.494.5621, sureshg@purdue.edu or Jayathi Murthy, 765.494.5701, jmurthy@purdue.edu.
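One reason microscale wick design matters is the Young-Laplace relation: the capillary pressure a wick can develop to drive the working fluid grows as its pores shrink. A minimal sketch, assuming water-on-copper property values:

```python
import math

def capillary_pressure(sigma, theta_deg, r_pore):
    """Young-Laplace capillary pressure dP = 2*sigma*cos(theta)/r, in Pa."""
    return 2.0 * sigma * math.cos(math.radians(theta_deg)) / r_pore

sigma = 0.072   # surface tension of water, N/m
theta = 30.0    # contact angle, degrees (assumed wetting condition)
for r in (100e-6, 20e-6, 5e-6):
    dp = capillary_pressure(sigma, theta, r)
    print(f"pore radius {r * 1e6:5.0f} um -> capillary pressure {dp:8.0f} Pa")
```

Smaller pores pump harder but also raise viscous resistance to liquid return, so an engineered wick balances the two; that trade-off is exactly what the meniscus and evaporation simulations described above are used to explore.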
image of a horizontal line
Transport in Porous Structures and Metal Foams
Metal foams are novel heat transfer surfaces with potential use as heat sinks and heat exchangers. They have been successfully employed in the heat exchanger industry by companies such as Honeywell. Power electronics is another industry where this technology is increasingly finding use. Porous structures are also found as sintered particle beds in heat pipes, devices found today in technologies as diverse as laptops and satellites. Advanced military electronics often use heat pipes for their high effectiveness and reliability in harsh environments, in addition to being self-powered. images of advanced microtomography-based prediction of temperatures within (left) a metal foam and (right) a sintered copper porous bed A novel computational methodology for detailed modeling of open-cell foams and heat pipe wick structures using advanced imaging techniques such as microtomography (commonly known as CT scanning) has been developed at the CTRC. Applications include heat exchangers, energy absorbers, breather plugs, CO2 scrubbers, micrometeorite shields, heat shields, optics and mirrors, wind screens and baffles, cryogenic tanks, lam discs, missile baffles, anti-slosh baffles, air oil separators, and high temperature filters. Economic Impact: Devices such as solar inverters and uninterruptible power supplies, where the technology will be implemented, will benefit from advantages such as cooler electronic components and smaller heat sinks, thereby increasing the life of the equipment as well as shrinking the overall dimensions. With reliable property information, expensive experiments are avoided. The designer can simply choose samples based on the measured property values, thereby cutting down the design time significantly. The greater impact will come once the technology is fully optimized and realized. It can be employed to cut equipment size by half or more, thereby cutting significant material costs. The other savings are in pumping power: these novel structures use far lower pumping power (cost) compared to their conventional alternatives. For more information, contact Suresh Garimella, 765.494.5621, sureshg@purdue.edu or Jayathi Murthy, 765.494.5701, jmurthy@purdue.edu.
Connection One - Center for Communication Circuits & Systems Research Center (CCCS)
Arizona State University, Sayfe Kiaei, Director, 480.727.7761, sayfe.kiaei@asu.edu
University of Arizona, Marwan Krunz, 520.621.8731, krunz@ece.arizona.edu
University of Hawaii, Magdy Iskander, 808.956.3434, iskander@spectra.eng.hawaii.edu
Rensselaer Polytechnic Institute, Michael Shur, 518.253.6819, shurm@rpi.edu
Ohio State University, John Volakis, 614.292.5846, volakis.1@osu.edu
Center website: http://www.connectionone.org/
image of a horizontal line
Universal RF Transceiver and Sensors
The ultimate goal of communication and computing systems is ubiquity: wireless devices that can be used in many applications, ranging from biomedical and environmental sensors to wireless mobile phones and RFID tags. For example, in order to develop an RF wireless system that can be used as an implanted biosensor inside the body, transceivers must be small (less than a few millimeters) and capable of staying inside the body for 5-10 years without changing batteries. These needs also apply to mobile systems like multi-mode universal mobile and smart phones. Such systems must support multiple standards (like GSM, CDMA, and UMTS) and adhere to stringent power and size requirements - the entire transceiver, including the antenna, has to be integrated into a very small area. Current smart phones require integration of RF, antenna, signal processing, video and image processing, and microprocessors all in the same system. This has been a major roadblock for this technology. Connection One researchers have developed a multi-mode transceiver that is completely integrated on one chip. image of the earth In terms of power, the two major components in a transmitter architecture are the power amplifier (PA) and the PA modulator. The PA accounts for over 70% of power consumption in a handset and occupies a significant portion of the handset's volume. Therefore, altering the power amplifier topology to lower the demand on bulky passive filters while simultaneously increasing efficiency and linearity is essential when realizing high-efficiency monolithic transmitter architectures. A new method using a noise-shaping technique to modulate the controller integrated circuits in switched-mode converters and power amplifiers reduces the demand on the output filters of these structures. Economic Impact: This breakthrough technology has impacted communications generally and multi-mode phones specifically. As a result of this Connection One work it is now possible to reduce the overall size of communications transmitters and products. The new architectures are more efficient than other techniques. This research has resulted in over 50% overall improvement in efficiency (improved battery life by 30-40% and reduced transmitter power by over 40%). As a result of this work the overall complexity and size of transceivers, PAs, antennas and power management components have been reduced by 30-50%. This has been very important to a number of center sponsors, including Qualcomm, Texas Instruments and Broadcom, and to the communications industry generally as new efficient topologies for transceiver chip sets are developed. For more information, contact Sayfe Kiaei, 480.727.7761, sayfe.kiaei@asu.edu.
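To illustrate the noise-shaping idea mentioned above in its simplest textbook form, the sketch below implements a first-order delta-sigma modulator: it encodes a slowly varying input into a one-bit stream while pushing quantization error to high frequencies, where a small low-pass output filter can remove it. This is a generic illustration, not the Connection One circuit.

```python
import math

def delta_sigma(samples):
    """First-order delta-sigma modulation: quantize each sample to +/-1
    while feeding the quantization error back through an integrator."""
    integrator, out = 0.0, []
    for x in samples:
        integrator += x - (out[-1] if out else 0.0)
        out.append(1.0 if integrator >= 0.0 else -1.0)
    return out

# Encode a slow sine wave into a one-bit stream, then low-pass filter
# (a simple moving average) to approximately recover the waveform.
n, period, window = 256, 64, 16
sine = [0.5 * math.sin(2 * math.pi * i / period) for i in range(n)]
bits = delta_sigma(sine)
recovered = [sum(bits[i:i + window]) / window for i in range(n - window)]
for i in range(0, 64, 16):
    print(f"input {sine[i + window // 2]:+.3f}  recovered {recovered[i]:+.3f}")
```

Because the shaped quantization noise sits mostly above the signal band, even this crude averaging filter tracks the input; that is the property that lets the new architectures shrink their bulky passive output filters.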
image of a horizontal line
Meta-Ground Plane for Airborne Radar and Electronic Warfare Systems
Researchers at the Center for Communication Circuits & Systems Research Center (CCCS - Connection One) have developed a meta-ground plane for ultra-low-profile UHF wideband sensors used in airborne radar, electronic warfare systems and homeland security applications that require wideband, low-frequency (UHF) antennas for roadside mine detection. These sensors offer a small, lightweight and low-loss solution to a persistent problem. TV and cell-phone base station antennas can be glued to fuselages, rooftops, or the sidewalls of buildings. An application of this small UHF antenna is in sensors capable of locating targets concealed under trees and forests. The meta-ground plane will drastically reduce the profile of such bulky antenna systems on UAVs, thus increasing their ability to accomplish their missions. No prior art existed to solve this problem without penalties in weight, gain, and efficiency. The entire aerospace industry is interested in these novel conformal antennas because they are small and can operate when installed on an airframe surface without protrusion. Also, ground personnel can carry them in their backpacks and use them for communication across all frequency bands and for wideband information reception, including videos and high-resolution images, without relying on large and heavy reflectors that are not portable. image of various sensors Economic Impact: This antenna is a transformational technology as it provides extremely wide bandwidth in a very small antenna. It is already marketed by a small company in Virginia and has attracted the attention of the U.S. Air Force and Boeing for automated guidance systems. A potential and highly touted commercial application is that of wideband communications. Specifically, this small and very thin antenna can cover all television, cellular, satellite and ground-to-air communications. As it replaces the large whip antenna and other non-conformal antennas, it is well suited for automotive applications. Specifically, this simple, small but very wideband antenna can replace the multiple antennas required in modern automobiles. Currently, over 14 individual antennas are used to cover the automotive bands. This antenna can replace all of them using a small 6” aperture that is both light and small enough for inconspicuous placement. The cost reduction is tremendous. Further, because the antenna is small, several of them can be placed at different locations on the vehicle for ubiquitous coverage and for implementing “Multiple Input Multiple Output” scenarios for uninterrupted high-bandwidth (instantaneous) video communications. In short, the developed antenna is a breakthrough technology with high impact in all aspects of communications for commercial and defense applications. For more information, contact John Volakis, 614.292.5846, volakis@ece.osu.edu.
Industry/University Center for Biosurfaces (IUCB)
SUNY Buffalo, Robert Baier, Director, 716.829.3560, baier@buffalo.edu
SUNY Buffalo, Anne Meyer, 716.829.3560, aemeyer@buffalo.edu
University of Memphis, M. Shah Jahan, 901.678.2620, mjahan@memphis.edu
Center website: http://wings.buffalo.edu/iucb/
image of a horizontal line
Superior Relief From Dry Eye Problems
Many people suffer from “dry eye” problems, or a “gritty” sensation when blinking in a dusty environment. A significant improvement in the lubricity of in-the-eye comfort formulations has been achieved with the introduction of a borate-buffered solution of HP Guar containing the active demulcents polyethylene glycol and propylene glycol. Researchers at the Industry/University Center for Biosurfaces (IUCB) developed a new tissue-on-tissue testing protocol that demonstrated the superior reduction of “blinking” friction associated with addition of this novel solution compared to normal saline-wetted tissue surfaces. Previously available test methods did not reveal the clinically relevant superior lubricity of the borate-buffered HP Guar formulation. Synthetic materials articulated against preserved tissue surfaces did not exhibit the very low coefficients of friction actually obtainable in the tissue-on-tissue test system. The scientific lesson is that laboratory simulations of biological joints, and of other situations where bioadhesion is important, must adequately replicate the complex natural tissue surfaces involved. The new measurement technology provides information that correlates better with what really goes on in the eye, and fosters the development of products that reach the public more quickly than was possible with earlier techniques. image of a woman putting in eyedrops Economic Impact: Alcon Laboratories of Fort Worth, Texas has brought this and a newer formulation to market under the trade names SYSTANE and SYSTANE ULTRA. Annual commercial sales of the SYSTANE “artificial tears” solutions now exceed $100,000,000. As a result of this and other prominent successes among Alcon Laboratories’ new product introductions, Novartis Corporation has acquired Alcon and merged it with its CIBA Vision Division, Duluth, GA, continuing now in a joint partnership to develop superior contact lens lubricating solutions employing IUCB test methods. For more information, contact Robert Baier, 716.829.3560, baier@buffalo.edu.
image of a horizontal line
Allergy Friendly Room Program
Until recently, only individual products have been available to improve air quality in indoor environments. Pure Solutions LLC has developed a patented process that has been tested at the University at Buffalo’s Industry/University Center for Biosurfaces (IUCB). It received the 2008 Innovator Award for the best innovation in the hospitality industry from Cornell University’s Institute for Hospitality Entrepreneurship. The process provides pre-packaged allergy friendly rooms to the hospitality market, as well as commercial and residential markets. Pure Solutions’ allergy friendly rooms process offers multiple interventions to substantially improve an indoor environment, and a quarterly maintenance program has been added that continues the hygienic conditions over a 24-month period. The process substantially reduces particles, bacteria, mold spores and fungi in indoor environments. An added benefit is energy savings of 25% or more through the cleaning and sanitization of the heating and cooling coils in an air-handling unit. The company has developed a worldwide licensing program and has partnered with companies in the U.S., Canada, the Dutch Caribbean, Barbados, the United Arab Emirates, Singapore, Scandinavia, Malaysia and China to provide its allergy friendly room technology to markets around the world. The Hyatt national hotel chain agreed to convert 2,800 rooms over the next year. The contract, valued at over $10.0 million, had initial work start in July 2010 and will generate substantial royalty fees for the company. Extensions of the process are in development for the environmental control of large conference rooms, where participant comfort and attention can be maximized. image of a hotel room interior Economic Impact: This work is resulting in substantial new business opportunities. A sampling: PURE is in final negotiations with the owner of 153 five-star hotels in India for an initial trial conversion of 76 rooms at a price of $240K and a net profit of $150K; its licensee in Taiwan has been successful in having “PURE Allergy Friendly Rooms” become a brand standard for the Starwood Luxury Hotels; and the Grand Hyatt San Francisco has successfully tested a smoking room conversion program on a 40/60 revenue share basis with a US Master Licensee. For more information, contact Robert Baier, 716.829.3560, baier@buffalo.edu.
image of a horizontal line
Inadvertent Implants? Visualizing Lung Cell pH
Inhaled particles and pollution can stress lungs, causing asthma and other diseases. Indigestible fibers, too long to be engulfed, cause lung disease. Most difficult to remove are long, thin asbestos-like fibers. Researchers at the Industry/University Center for Biosurfaces (IUCB) have shown how the body protects itself against safe insulation glass fibers, and how to select formulations for new, safe building materials. A surprise has been the discovery of a new use for the insulation fibers: as scaffolds for regenerating body tissues. “Chemistry in action” is recorded and displayed using laser photonics combined with confocal microscopy to take “visual slices” through living cells. Living cells take in a dye that gives off fluorescent rays of two different colors, red for acid production and blue for alkalinity. Lung cells digest away respirable fiberglass by an acid attack that shortens the fibers, and by an engulfment into the cells that allows the fibers to be digested and carried away before disease processes can be triggered. image of a microscopic slide showing red stained cells against a black background These microscope views show that how one treats the surface of glass can control the degree of spreading and adhesion of living cells. The cells shown here are expected to make saliva to give comfort to “dry mouth” sufferers, from dissolving-glass implants replacing failed natural glands. In addition to discovering how the lung safely handles certain new forms of industrial fiberglass particles, this research demonstrated that some inhaled nanoparticles become agglomerated and are then processed by the lung tissue in the same way as larger pollutant particles. The result? A recent National Toxicology Program decision that only certain inhalable glass wool fibers are reasonably anticipated to be human carcinogens replaced the earlier decision (7th Annual Report, 1994) that all respirable glass wool is reasonably anticipated to be a human carcinogen. This change was made because not all glass wool or man-made fibers were found to be carcinogenic. This breakthrough allows some fiber compositions to be removed from the list; the distinction between those left on and those removed is based on their biopersistence. Economic Impact: This advance is a scientific triumph of collaboration by those who have worked in industry, academe and government. Removing some fibers from the list of carcinogens has resulted in substantial economic benefit to industry and consumers. IUCB is now exploring the use of some of the newly produced rock wool glass products as scaffolds for “tissue engineering” (see photo). For more information, contact Robert Baier, 716.829.3560, baier@buffalo.edu.
Institute for Next Generation IT Systems (ITng)
North Carolina State University, Dennis Kekas, Director, 919.515.5297, kekas@ncsu.edu
Duke University, Kishor Trivedi, 919.660.5269, kst@ee.duke.edu
Center websites: http://www.nsf.gov/eng/iip/iucrc/directory/cacc.jsp and http://www.itng.ncsu.edu/
image of a horizontal line
Software Rejuvenation
drawing of how the system connects to data collection, measurement-based model, estimates, and rejuvenation action and circles back to the system Researchers led by Kishor Trivedi at Duke University and the Center for Advanced Computing and Engineering (CACC) developed a method to detect the problems of memory leaks, data corruption, and fragmentation that have plagued a wide range of computer systems and networking components. These problems build up over time and lead to performance degradation, hanging or freezing, and other failures of computing systems. Such system failures and the resulting downtime cost billions of dollars in banking, telecommunications, military and other sectors. Such failures may also cause a loss of life in life-critical systems. A memory leak is a phenomenon in which memory resources in computing systems decrease over time and eventually cause system problems. The problem occurs because software programs request memory but sometimes don't release it. This unreleased memory accumulates over time. The researchers collected empirical data on these problems and developed a way to monitor the course of the deterioration and to predict when future problems would occur so that preventive measures could be taken. This software rejuvenation method has been adopted by IBM in its xSeries servers, and subsequently other companies including Oracle and Microsoft have adopted this technology. The technology has also been adopted in the telecommunications sector. The use of software rejuvenation is known to postpone or prevent disruptive system failures and hence reduce the cost of downtime. It has also been implemented in NASA’s space-based software systems. Economic Impact: The ramifications of computer system failures and associated downtime cost banking, telecommunications, healthcare, the armed forces and other like organizations billions of dollars each year. Several studies have analyzed the economic cost of IT systems’ downtime. In large systems, the direct cost of downtime has been calculated to average around $125,000 per hour (in data center environments this value can reach $335,000 per hour). However, there is a gap in these studies regarding the economic impact of software rejuvenation strategies specifically. Even so, useful observations can be extracted. According to an Aberdeen Group research report from June 2010, based on the analysis of 125 organizations, the “best in class” companies (top 20%) in terms of time to recover, number of downtime events and percentage of data availability are able to recover their systems 6.5 times faster than the laggard companies (bottom 30%). In absolute economic terms, “best in class” companies were losing 40 times less revenue than laggard organizations. By helping an organization improve simply from laggard to average (middle 50%), this CACC work can increase company revenue by about $1.3 million USD per year. So, any recovery mechanism (i.e., software rejuvenation) able to reduce outages and their consequences significantly reduces the outage cost while increasing the revenue and productivity of the organization. graph of time vs failure rates of aging software For more information, contact Kishor Trivedi, 919.660.5269, kst@ee.duke.edu.
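A minimal sketch of the measurement-based idea: fit a trend to monitored free-memory samples, project the time at which memory would be exhausted, and schedule proactive rejuvenation before that point. The leak rate, pool size and safety margin below are assumptions for illustration; the CACC/Duke estimators are statistically far more sophisticated.

```python
import numpy as np

def time_to_exhaustion(times, free_mem):
    """Least-squares linear trend on free memory; returns the projected
    time at which free memory reaches zero, or None if no leak trend."""
    slope, intercept = np.polyfit(times, free_mem, 1)
    if slope >= 0:
        return None                # memory is not decreasing
    return -intercept / slope      # t where intercept + slope * t == 0

# Simulated monitoring: a 1 MB/hour leak, plus noise, on a 512 MB pool.
t = np.arange(0.0, 48.0, 1.0)                          # hours of samples
rng = np.random.default_rng(1)
free = 512.0 - 1.0 * t + rng.normal(0.0, 2.0, t.size)  # MB free

eta = time_to_exhaustion(t, free)
if eta is not None:
    # Rejuvenate with a safety margin well before projected exhaustion.
    print(f"projected exhaustion near {eta:.0f} h; "
          f"schedule rejuvenation before {0.8 * eta:.0f} h")
```

Intelligent Maintenance Systems (IMS) - A CISE-funded Center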
University of Cincinnati, Jay Lee, Director, 513.556.2493, jay.lee@uc.edu
University of Michigan-Ann Arbor, Jun Ni, 734.936.2918, unni@umich.edu
Missouri University of Science & Technology, Jagannathan Sarangapani, 573.341.6775, sarangap@umr.edu
Center website: http://www.imscenter.net/
image of a horizontal line
Prediction and Prevention of Hydraulic Hose Failure
Researchers at the Intelligent Maintenance Systems (IMS) Center have developed a system for predicting and preventing failure of hydraulic hose systems. There are no commercial products for this purpose in the marketplace, and previously there had been only a few studies of hydraulic hose prognostics. IMS has developed a smart sensor with embedded prognostics algorithms that can be easily attached to the hose. It monitors the degradation of hose behavior, alerts users to potential failures, and can autonomously request early repair or replacement. image of a hydraulic hose system The technology has many critically important industrial applications. Across the board, end users of systems that rely on hydraulic hoses will be able to attain dramatic improvements in machine uptime and safety. There will also be important reductions in the environmental and human safety consequences of hose failure. Smarter, stronger and safer hydraulic hose systems will provide significant benefits to customers across many applications. Economic Impact: The developed IMS prognostics tools have been implemented in the production operations of many IMS sponsors, including P&G, GM, and Boeing. The economic impacts of this work exceed $100M. For more information, contact Jay Lee, 513.556.2493, jay.lee@uc.edu.
image of a horizontal line
Watchdog Agent for Assessing Equipment Performance
images of mining equipment and wind turbines The Watchdog Agent™ is deployable to embedded data acquisition devices and monitoring data acquisition platforms on industrial equipment. Since machine or process breakdowns severely limit equipment effectiveness, methods are needed to predict products’ life expectancy. Information about the remaining life of products and their components is crucial for their disassembly and reuse, which in turn leads to a more efficient and environmentally friendly usage of products and resources. Development of the Watchdog Agent by researchers at the Intelligent Maintenance Systems (IMS) Center answers these needs. The Watchdog Agent is essentially software that can be applied to just about any product or system, from a simple valve to a complex system, for which it would be beneficial to predict when and why the product or system is going to fail. The Watchdog Agent™ assesses and predicts process or equipment performance based on inputs from the sensors mounted on it. Performance-related information is extracted from multiple sensor inputs through signal processing, feature extraction and sensor fusion techniques. The historical behavior of process signatures is used to predict their future behavior and thus forecast process or machine performance. Researchers developed the Watchdog Agent toolbox to include tools for a wide variety of applications, including EV smart batteries, wind turbines, automotive systems, machine tools, etc. These tools can then be customized to meet the needs of the particular industry or processes involved. Most previous systems were limited to a few tools. In today’s competitive market, production costs, lead time and optimal machine utilization are crucial values for companies. The Watchdog Agent’s continuous assessment and prediction of a product’s performance enables collaborative product life-cycle management in which products are followed, assessed and improved throughout their life cycle. Watchdog software has been used in test beds for Harley-Davidson, Tongtai Machine and TechSolve, and it is in the process of being applied at Omron, Toyota, General Electric and other companies. In its now commercial form, prognostics engineers are able to leverage the research to enhance asset reliability and the management of wind turbines and mining equipment. Economic Impact: The combined economic impacts to IMS company members exceeded $100M. By improving the monitoring, diagnostic, and prognostic capabilities of industrial equipment, production processes become more reliable and more cost effective because the equipment has less downtime, lasts longer, and costs less to maintain. This benefits consumers, industry and the national economy. For more information, contact Jay Lee, 513.556.2493, jay.lee@uc.edu.
Membrane Science, Engineering and Technology Center (MAST)
University of Colorado Boulder, Alan Greenberg, Director, 303.492.6613, alan.greenberg@colorado.edu
New Jersey Institute of Technology, Kam Sirkar, Director, 973.596.8447
Center website: http://www.mastcenter.org/
image of a horizontal line
High-Recovery Desalination Process for Brackish Groundwater
Many regions of the world, including the Middle East, are plagued by a severe scarcity of fresh water sources, and so seawater has been the most commonly used raw water source for desalination. Desalination is a well-established process that uses reverse osmosis (RO) for the removal of salt from seawater or other brackish (salt-containing) water sources. RO is a pressure-driven process in which water is forced through a polymeric membrane while salts are retained. A major barrier to efficient desalination processes is the potential for precipitation of sparingly soluble salts on the surface of the membrane, a process termed scaling. image of bench-scale experimental system for testing ultrasonic sensor-controlled flow reversal The overall goal of this work is to develop and build demonstration desalination pilot plants, based on RO using brackish groundwater, that would operate in Jordan and Israel. The innovation in this work is to delay the onset of scaling by a unique combination of the flow-reversal technology developed at Ben Gurion University and the ultrasonic sensor technology developed by the NSF Center for Membrane Science, Engineering and Technology at the University of Colorado Boulder. To date, the project has received support from the NATO Science for Peace (SfP) Program, the Middle East Desalination Research Center (MEDRC), and ROTEC, a small start-up company specifically formed to commercialize this technology. Economic Impact: This new sensor-based separation process has the potential to significantly reduce costs associated with brackish water (BW) desalination, which is expected to have a market value of $30 billion by 2015. This technology enables higher recovery so that there is more fresh water and less salt produced per unit of feed water. For example, adding the technology to a 1,200 m³/day BW-RO plant (80% fresh water recovery) would enable operation at 92% recovery, thus generating 1,380 m³/day of fresh water. The total cost of adding the technology would be about $11,000/year, but the additional water produced is valued at about $31,000/year, a significant economic benefit, while the savings associated with reduced salt generation are estimated at about $62,000/year. In addition to the favorable economics, implementation of the technology will relieve pressure on existing water sources, thereby reducing friction and facilitating cooperation between affected countries as they attempt to cope with dwindling freshwater supplies. This technology should help make it possible for water scarcity to become less of a driver of future conflict. It is difficult to overestimate the economic and human costs that could be avoided by diminishing such future conflicts. For more information, contact Alan R. Greenberg, 303.492.6613, alan.greenberg@colorado.edu.
images of SEM micrographs showing representative membrane surfaces from ultrasonic sensor-controlled flow reversal (left) and no flow reversal (right); minimal scaling visible on left, but significant scaling on right
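As a check, the recovery figures quoted in the Economic Impact statement above follow from the definition of recovery as the ratio of fresh-water product to feed:

\[
Q_{\mathrm{feed}} = \frac{1200\ \mathrm{m^3/day}}{0.80} = 1500\ \mathrm{m^3/day},
\qquad
Q_{\mathrm{product}} = 0.92 \times 1500\ \mathrm{m^3/day} = 1380\ \mathrm{m^3/day},
\]

so raising recovery from 80% to 92% yields an additional 180 m³/day of fresh water from the same feed stream.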
image of a horizontal line
Reducing Membrane Fouling in Water Treatment Processes
The problem of membrane fouling by natural organic matter is critically important in the production of potable water from surface waters such as rivers and streams. Many such waters have a brown or tan appearance due to the natural organic matter they contain (humic and tannic substances). Those materials tend to bind to the surface of water treatment membranes in reverse osmosis and nanofiltration processes. Researchers at the Membrane Science, Engineering and Technology Center have evaluated and characterized several kinds of membranes to determine their fouling characteristics. In addition to published research, the Dow Chemical Company has evaluated the technology and is in the process of launching new products that benefited from the knowledge gained from this Center project. For more information, contact Alan R. Greenberg, 303.492.6613, alan.greenberg@colorado.edu. images of bench-scale experimental system for measuring membrane flux decline; bench-scale experimental system for measuring membrane rejection; SEM micrographs showing representative surfaces of clean (left) and fouled (right) membranes
Photopolymerizations Center (PC)
University of Iowa, Alec Scranton, Director, 319.335.1414, alec-scranton@uiowa.edu
University of Colorado, Christopher Bowman, 303.492.3247, Christopher.bowman@colorado.edu
Center website: http://css.engineering.uiowa.edu/~cfap/
image of a horizontal line
Ultra-Rapid Photopolymerization Method
Novel (meth)acrylate monomers for ultra-rapid photopolymerization have been developed by researchers at the Photopolymerizations Center (PC). This program has identified and characterized several new monomers that provide highly photosensitive acrylate compositions with excellent physical and mechanical properties. These materials have potential for the design of improved structural adhesives in engineering applications. One application noted by UCB Chemicals is that of inks used in printing on food packages. Fast-reacting monomers can reduce both cost and food contamination. The fast-reacting monomers result in inks that dry faster and in packaging that is not as slippery, thereby improving the ability to stack packages. These two effects help reduce packaging costs. These materials have also been demonstrated to improve properties when used as dental restorative materials. image of fiber optic coatings, one potential use for this technology An added benefit of the fast-drying ink is that it does not seep through the packaging, and therefore does not contaminate food contained in the package with chemicals. Economic Impact: The economic impact of this project is significant in several respects. First, the enhanced understanding of the formulation-structure-property relationships in monomers has been critical in designing formulations. This approach dramatically reduces the experimental evaluation necessary to develop photopolymerizable formulations for new applications, enhancing their penetration into new markets. Further, the new monomers developed by this project, with their enhanced characteristics, will improve coating performance. Through improved performance, the solventless photopolymerization process, which has improved economics and environmental compatibility, will be able to penetrate markets that could not be reached otherwise, particularly in automotive and other outdoor applications. For more information, contact Christopher Bowman, 303.492.3247, christopher.bowman@colorado.edu.
image of a horizontal line
Covalent Adaptable Networks (CANs)
PC researchers have developed Covalent Adaptable Networks (CANs), which are polymer networks that are adaptable and have reversible structures, with concomitant abilities to reduce stress and change shape after polymerization. These networks have the unique combination of being covalently bonded polymer networks that maintain an ability to change their bonded state. This capability enables materials to alleviate stress, change their shape, become adhesive (or debond), or even heal fractures and cracks. image of woman with mouth open for dentist to check her teeth Two different classes of CANs exist: those that utilize radical-mediated addition fragmentation and those that utilize thermoreversible Diels-Alder reactions as the activatable bond. In a series of papers, center researchers have demonstrated that the addition-fragmentation based CANs enable three critical advantages: reduction of polymerization shrinkage stress, photoactuation and light-induced shape changes, and a novel mechanically assisted photolithographic process that enables a single light exposure to achieve complex topography. The thermoreversible CANs have also been demonstrated to be of significant value through three key developments: the ability to heal cracks, the ability to be remotely actuated and manipulated through radiofrequency exposure, and the ability to form complex, custom 3D objects simply through a thiol-ene based photofixation process. Economic Impact: These materials present an entirely new and functional class of thermosets. Thermoset resins represent a multi-billion dollar market that has exclusively focused on polymer networks that are permanent and unalterable. For many applications, nearly all of the same advantages can be achieved with the CANs approach, with the added benefit of numerous desirable properties such as reduced stress, the ability to heal and mend cracks and defects, and the ability to be recycled more easily. Because the technology has such broad reach across industries, from adhesives to composites to 3D prototyping and photolithography, there has been significant interest from companies. Several invention disclosures have already been submitted, spanning applications from conventional composites to dental materials, 3D prototyping, photolithography and adhesives. Dental companies, companies involved in 3D prototyping, and adhesive companies have expressed interest in or have already optioned the technology in these fields. For more information, contact Christopher Bowman, 303.492.3247, christopher.bowman@colorado.edu.
image of a horizontal line
Improvement in Photo-Cured Acrylate Coatings
image of the technology At the University of Iowa's Photopolymerizations Center (PC), a novel photochemical method to eliminate oxygen inhibition in free-radical photopolymerizations has been developed. This work provides a unique and practical solution to a major problem with photo-cured acrylate coatings, namely inhibition by air at the coating surface. The advance involves the inclusion of two specially selected components in the reactive formulation: 1) a light-absorbing molecule that interacts with ground-state (triplet) oxygen to produce an excited (singlet) state of oxygen (Zn-ttp in the figure), and 2) a second compound that reacts with the singlet oxygen, thereby removing it from the system (dimethyl anthracene in the figure). By introducing near-IR illumination before UV curing, the combination of singlet oxygen generator and trapper can effectively remove the molecular oxygen dissolved in the system, significantly increasing the polymerization rate in air. Unlike traditional methods of mitigating oxygen inhibition, this new method decouples oxygen consumption from the polymerization process. The peroxide products formed from oxidation of the trapper have the potential to create new reactive centers upon UV illumination or heating. Economic Impact: Oxygen inhibition is widely regarded as the most important unsolved problem in acrylate polymerization. Existing methods of mitigating the problem are generally expensive, ineffective, or detrimental to the properties of the resulting polymer coating. This breakthrough provides an attractive new alternative for solving this important problem. Because the method is based upon the addition of trace quantities of specially selected additives, it can be applied to any acrylate system with no other modifications to the reactive formulation. Henkel Loctite Corporation expects this technology to be of significant commercial value. For more information, contact Alec Scranton, 319.335.1414, alec-scranton@uiowa.edu.
image of a horizontal line
Improved Understanding of Photopolymerization Using Photobleachable Dyes
Research at the Photopolymerizations Center (PC) has examined brightly colored photopolymerizable compositions; understanding the mechanism of this chemistry helps provide the basis for the essentially irreversible loss of color required in a variety of consumer products. The research is providing an improved fundamental understanding of the photo-induced electron transfer processes that determine the retention or loss of color, as well as the formation of the active centers that lead to polymer formation. The work provides direct evidence for the conditions required to achieve simultaneous photopolymerization, along with a mechanistic understanding of both reversible and irreversible photobleaching of colored compositions. The work is important for photocurable adhesives in dental and orthodontic materials, which provide easy visualization during placement (due to color) and then, upon light exposure, polymerize and become colorless. Economic Impact: A fundamental understanding of photopolymerizable systems that possess color has important economic implications for a variety of products. The work has played a key role in the dental industry in the form of photobleachable sealants, orthodontic bracket adhesives and electronic adhesives. It will have significant economic impact in industries that use photopolymers for an array of coatings, dental/orthodontic materials, electronic adhesives and encapsulants, and it has already impacted dental materials including photocurable composite fillings, sealants and orthodontic bracket adhesives. For more information, contact Alec Scranton, 319.335.1414, alec-scranton@uiowa.edu.
image of a horizontal line
Real Time Methods to Examine Photopolymerization Conversion
Researchers at the Photopolymerizations Center have developed real-time instrumentation and methodologies to examine and correlate photopolymerization degree of conversion with shrinkage stress and key mechanical properties. This work improves upon 25 years of previously unsuccessful attempts to understand critical relationships in dental materials and other photocurable, crosslinkable systems. Key questions relating shrinkage stress, degree of conversion and the associated mechanical properties have finally been definitively addressed; previous efforts failed to address all three relevant aspects, resulting in extensive speculation and hand-waving arguments. Economic Impact: A more thorough understanding of the relationships among degree of conversion, shrinkage stress and mechanical properties could impact any product based upon crosslinked polymer networks. The advance will have significant impact in any industry that uses photopolymers for coatings, dental/orthodontic materials, electronic adhesives and encapsulants, and it has already impacted the development of dental materials including photocurable composite fillings, sealants and orthodontic bracket adhesives. For more information, contact Christopher Bowman, 303.492.3247, christopher.bowman@colorado.edu or Alec Scranton, 319.335.1414, alec-scranton@uiowa.edu. Power Systems Engineering Research Center (PSERC)
Arizona State University, Vijay Vittal, Director, 480.965.1879, vijay.vittal@asu.edu and Jerry Heydt, Site Director, 480.965.8307
Cornell University, Lang Tong, 607.255.3900, ltong@ece.cornell.edu
University of California Berkeley, Shmuel Oren, 510.642.1836, oren@ieor.berkeley.edu
Colorado School of Mines, P. K. Sen, 303.384.2020, psen@mines.edu
Georgia Tech, A.P. Sakis Meliopoulos, 404.894.2926, sakis.m@gatech.edu
Howard University, James Momoh, 202.806.5350, jm@scs.howard.edu
University of Illinois at Urbana-Champaign, Peter W. Sauer, 217.333.0394, sauer@ece.uiuc.edu
Iowa State University, Venkataraman Ajjarapu, 515.294.7687, vajjarap@iastate.edu
Texas A&M, Mladen Kezunovic, 979.845.7509, kezunov@ece.tamu.edu
Washington State University, Anjan Bose, 509.335.5593, bose@wsu.edu
Wichita State University, Ward Jewell, 316.978.6340, ward.jewell@wichita.edu
University of Wisconsin-Madison, Christopher DeMarco, 608.262.5546, demarco@engr.wisc.edu
Center website: http://www.pserc.org/
image of a horizontal line
SuperCalibrator: Expanding Real-Time Information for Power System Operators
The significant penetration of renewable resources, demand response and distributed generation is causing a rapid increase in uncertainty in the control of electric power systems, requiring a completely new approach to the design of control systems for the future power grid. Current centralized approaches to providing power system operators with the information they need to be aware of their power system’s situation (or state) have become highly inefficient and, for all practical purposes, are now considered obsolete. To address this problem, researchers at the Georgia Institute of Technology have developed a game-changing technology called the “SuperCalibrator.” System visibility to operators has been identified as an important component of power grid reliability; the SuperCalibrator provides unprecedented visibility update rates of 60 times per second, thereby reducing the risk of blackouts. image of a power station The SuperCalibrator is a distributed state estimator that uses a detailed model of a substation and measurements from all devices in the substation (such as meters and protection relays) to extract a real-time model of the substation, identify and reject bad data, identify and correct topology errors, and verify the model parameters. In the presence of at least one valid Global Positioning System (GPS) synchronized measurement, the real-time model of the substation is valid for a specific time instant with a precision of one microsecond. Results from each substation (that is, the state of each substation) are then transmitted to the control center, where all substation real-time models are combined to synthesize the real-time model (operating state) of the entire system for that instant in time. Synthesis of the entire system model requires minimal computation. The most significant advantages of the SuperCalibrator are: a) high update rates for system-wide state estimation (with demonstrated operation at 60 times per second); b) the accuracy of the real-time model; c) scalability that keeps update rates high independent of system size, and; d) the ability to extract the real-time model during disturbance conditions. image of the SuperCalibrator screenshot The SuperCalibrator is a distributed state estimator that uses detailed models of substations and measurements from all substation devices to provide power operators with real-time information that can help them avoid blackouts and improve the efficiency of the grid. When these characteristics are compared to the present state of the art in state estimation, it is clear that the SuperCalibrator is a breakthrough that enables unprecedented speeds for achieving improved system visibility. The SuperCalibrator has been implemented and demonstrated at several substations (in the U.S. Virgin Islands and at the New York Power Authority and Pacific Gas and Electric). With the expected massive phasor measurement unit installations across the U.S., the SuperCalibrator will play an important role in the future for both control center applications and substation protection and control. Economic Impact: The potential economic impacts of the SuperCalibrator will result from: 1) elimination of the costly centralized state estimator; 2) operational savings resulting from an accurate real-time model that enables the avoidance of overly conservative or wrong decisions, and; 3) early (and fast) detection of the danger of cascading power outages (saving on the costs of blackouts).
At the individual utility level, this technology can replace the costly centralized state estimator, saving several million dollars for each utility. Since the nation’s power grid operational procedures and optimization are based on the system model, the more accurate real-time model provided by the SuperCalibrator will favorably impact practically all grid operations. A conservative 0.1% reduction in operating losses would translate into $200 million in annual savings for the nation’s power grid. That said, the true economic impact of a more visible power grid will remain a topic for speculation; the impacts are believed to be very high because avoidance of just one widespread blackout could yield difficult-to-document savings of billions of dollars. For more information, contact Sakis (Athanasios) Meliopoulos, 404.894.2926, sakis.m@gatech.edu.
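To make the two-stage idea concrete, the following Python sketch (an illustration only, not PSERC's implementation; the matrices, weights and sizes are invented) shows a classic weighted least-squares estimate computed locally per substation, with the control-center step reduced to simple assembly:

```python
import numpy as np

def substation_estimate(H, z, W):
    # classic weighted least squares: x = (H^T W H)^-1 H^T W z
    return np.linalg.solve(H.T @ W @ H, H.T @ W @ z)

def system_state(substations):
    # the control-center "synthesis" step is cheap: local states are
    # simply assembled, with no system-wide iterative solve
    return np.concatenate([substation_estimate(H, z, W)
                           for H, z, W in substations])

# two toy substations, each with 3 measurements of a 2-element local state
rng = np.random.default_rng(1)
substations = []
for _ in range(2):
    H = rng.normal(size=(3, 2))                             # measurement model
    z = H @ rng.normal(size=2) + 0.01 * rng.normal(size=3)  # noisy readings
    substations.append((H, z, np.eye(3)))                   # unit weights

print(system_state(substations))  # refreshed each cycle, e.g., 60 times/second
```

Because each substation's solve involves only its own small matrices, the update rate is independent of overall system size, which is the scalability property noted above.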
image of a horizontal line
Advanced Power System Visualization Tools
image of people working in front of a large screen showing the power grid connections Researchers at PSERC have integrated new visualization techniques with power system modeling methods to give users visual insight into the condition of power systems. Using visualization tools, industry can "see" what is happening without disrupting actual energy production. Using two- and three-dimensional plotting capabilities coupled with power system animation, the technology gives the user a picture of the power system that synthesizes thousands of pieces of information. The visualization technology shortens the time between observing power system problems and identifying appropriate corrective actions, thereby making power systems more reliable. It integrates visualization of economic and engineering data, informing decision-making for economic and reliable power system operation. It is widely used in operational and long-term planning analyses in the electric power industry, allowing engineers to efficiently run alternative scenarios and analyze their reliability and economic effects. It also enables power systems engineers and operators to better communicate with non-technical audiences, which often include business and regulatory policy-makers, and it serves as a training tool for technical and non-technical audiences alike. Available as a free download and on a CD included with a popular power system education book, the tool is also improving university education by giving students simulation experience that provides insight into the operation of real, very complex power systems. This technology was awarded the Alexander Schwarzkopf Prize for Technological Innovation by the I/UCRC Association in 2005. Economic Impact: Problems with situational awareness can lead to blackouts, such as the Blackout of 2003 that cost over ten billion dollars. Outages and transmission constraints impose some $100 billion in costs nationwide every year, so this tool directly addresses the need for improved power system reliability. The technology has been successfully commercialized via a new small business and is being incorporated into software that is sold worldwide. The visualization tool is a spin-off from university research that has demonstrably improved power system monitoring, control, analysis and education in the electric power industry. A small business has installed the tool in some 20 control centers across the U.S. to improve situational awareness for control room operators, and it is being used by over 700 engineers and policy-makers worldwide. The technology will be used by the North American Electric Reliability Corporation for nationwide reliability monitoring. For more information, contact Tom Overbye, 217.333.4463, overbye@ece.uiuc.edu. Queen's University Environmental Science and Technology Research Centre (QUESTOR)
The Queen's University of Belfast, Wilson McGarel, Director, 44 02.890335577, w.mcgarel@qub.ac.uk
Center website: http://questor.qub.ac.uk/newsite/index.htm
image of a horizontal line
Prevention of Bulking at Sludge Wastewater Treatment Plants
Sludge bulking is a serious problem for wastewater treatment operators and, when it occurs, is very expensive to eradicate. One of the prime causes of sludge bulking is the filamentous bacterium Microthrix parvicella, which exists at low levels in many treatment plants without causing problems. When concentrations rise above a threshold level, however, bulking and foaming are the all-too-common result. Bulking occurs in the settling tanks of activated sludge plants and reverses the settling process: solids float to the top of the tank rather than settling to the bottom. Researchers at The Queen's University Environmental Science and Technology Research Centre (QUESTOR) have developed a diagnostic test for the detection of Microthrix parvicella. The test helps prevent the occurrence of foaming or sludge bulking at industrial and municipal activated sludge wastewater treatment plants by enabling operators to monitor the concentration of Microthrix and take remedial action before the serious problem of sludge bulking or foaming occurs. image of sludge in a treatment plant and image of Microthrix parvicella under a microscope Previously there had been no effective means for operators to monitor the concentration of Microthrix parvicella; as a result, sludge bulking occurred randomly and without warning. This new technology allows operators of treatment plants to detect the conditions that lead to sludge bulking and to take more economical remedial action before the problem occurs. Economic Impact: This technology has resulted in a new commercial product known as SLUDGEGUARD, a powerful ELISA test kit that quantifies the concentration of Microthrix parvicella and allows small changes to be detected. A one-step test kit for rapid detection is also under development. For more information, contact Wilson McGarel, 44 02.890335577, w.mcgarel@qub.ac.uk.
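As a simple illustration of how such a test supports operations, the sketch below (hypothetical threshold, units and trend rule, not part of the SLUDGEGUARD product) flags readings that are approaching or exceeding an alert level so remedial action can start before bulking occurs:

```python
ALERT_THRESHOLD = 120.0  # assumed assay units; real limits are plant-specific

def check_trend(readings, threshold=ALERT_THRESHOLD):
    # act when the latest reading exceeds the threshold, or is rising
    # and already within 80% of it
    latest = readings[-1]
    rising = len(readings) >= 2 and readings[-1] > readings[-2]
    if latest > threshold or (rising and latest > 0.8 * threshold):
        return "start remedial action"
    return "continue routine monitoring"

print(check_trend([60.0, 85.0, 101.0]))  # rising and near threshold -> act
```

Safety, Security and Rescue Research Center (SSR-RC) A CISE-funded Center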
University of Minnesota, Nikos Papanikolopoulos, Director, 612.625.0163, npapas@cs.umn.edu
University of Minnesota, Vassilios Morellas, 612.624.4822, morellas@cs.umn.edu
University of Pennsylvania, Vijay Kumar, 215.898.3630, kumar@cis.upenn.edu
University of Denver, Richard Voyles, 303.871.2481, rvoyles@du.edu
Center website: http://www.ssrrc.dtc.umn.edu/
image of a horizontal line
Scout Robot Platform: Urban Search and Rescue
image of the scout robot next to a tree outside The development of large-scale robot teams has been hindered for a number of reasons. The complexity of such systems is hard to simulate, especially in the case of a many-to-one relationship between a marsupial robot and the robots it can deploy. Additionally, physical systems can be expensive to build and maintain. However, there are a number of scenarios in which large-scale distributed teams are advantageous, such as urban search and rescue, biological or chemical release monitoring, and distributed surveillance and reconnaissance. Distributed robot teams are often able to leverage the power, computational, and locomotive capabilities of a larger system to transport, coordinate, and control miniature robots, which may carry more specialized capabilities into areas that are spatially restrictive. Research at the Safety, Security and Rescue Research Center has resulted in the development of the Scout Robot Platform, currently being used by the U.S. Army and several police departments for search and rescue missions. The robot's cylindrical shape allows it to be deployed by launching it from an appropriate barreled device. Once deployed, these robots move using a unique combination of locomotion types. Each Scout carries a sensor suite that may vary with the Scout's mission; Scouts may contain some combination of a CMOS camera, a passive infrared sensor, a microphone, and other sensors. Economic Impact: The Scout Project has resulted in a start-up (ReconRobotics Inc.) that has sales of more than 20 million USD annually and employs 35 people. The Scout is projected to achieve sales of 100 million USD within a few years. More than 4,000 robots have been deployed by the U.S. Army and Navy, the FBI and various police forces in more than 50 countries. For more information, contact Sunil Saigal, 813.974.3780, saigal@eng.usf.edu.
image of a horizontal line
Plume Tracking with a Reconfigurable Computing Platform
Robotic teams are envisioned to assist or even replace humans in search and rescue operations, such as when dealing with chemical leaks. The objective of researchers at the Safety, Security and Rescue Research Center (SSR-RC) is to develop algorithms and reconfigurable hardware that will allow distributed groups of robots to search an area and determine the source, type and quantity of dangerous gases released into the atmosphere by an accident or malicious act. To achieve this goal, robots must be able to determine their position, create detailed representations of the area they search, and coordinate in their distributed detection and estimation task. They must also deal with mobility issues when navigating unstructured environments or climbing stairs. To this end, the center has designed adaptive sensing algorithms that allow robots to determine the optimal locations to which they need to move in order to receive the most informative measurements of the detected chemical. Additionally, stair-climbing estimation and control algorithms have been implemented that allow safe and precise navigation inside buildings. These dynamically reconfigurable processes allow re-tasking of hardware and software resources, making adaptation to varying operating conditions possible. For more information, contact Stergios Roumeliotis, 612.626.7507, stergios@cs.umn.edu or Richard Voyles, 303.871.2481, rvoyles@du.edu. Robot team searching for the source of a gas leak.
drawing of the scout robots connected to one another via signals
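The following Python sketch illustrates the flavor of such adaptive sensing (a simplified stand-in, not the center's algorithms; the plume model, noise level, and grid are invented): a particle cloud represents hypotheses about the source location, the robot greedily moves to the candidate position where the predicted measurement is most uncertain, and then reweights its hypotheses with the new reading:

```python
import numpy as np

rng = np.random.default_rng(0)

def plume(source, x):
    # toy isotropic concentration model; real plumes need wind and diffusion
    return 1.0 / (1.0 + np.sum((source - x) ** 2, axis=-1))

particles = rng.uniform(0.0, 10.0, size=(500, 2))  # source-location hypotheses
weights = np.full(500, 1.0 / 500)

def predictive_variance(x):
    # high variance across hypotheses = an informative place to measure
    c = plume(particles, x)
    mean = np.sum(weights * c)
    return np.sum(weights * (c - mean) ** 2)

candidates = np.array([[i, j] for i in range(11) for j in range(11)], float)
next_pos = max(candidates, key=predictive_variance)  # greedy next move

z = 0.3  # concentration measured at next_pos (simulated here)
likelihood = np.exp(-0.5 * ((z - plume(particles, next_pos)) / 0.05) ** 2)
weights *= likelihood
weights /= weights.sum()  # posterior over source locations sharpens
```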
image of a horizontal line
Distributed Decision Making for Large Scale Disaster Management (DDM-LSDM)
DDM-LSDM is an exciting area of work that deals with disaster management for homeland security applications. Disaster management is becoming increasingly complex due to uncertainty, limited resources, the difficulty of coordination among teams, the existence of multiple and at times conflicting objectives, the need to adapt continuously to changing situations, and the scale of the operations. Center researchers model disaster management with multiple autonomous agents that can sense, act, and make decisions at different time scales using the available information and communication channels. The project builds on a simulation tool developed by the RoboCup Rescue Project after the 1995 Kobe earthquake. The tool simulates civilians, road blockages, fires, and building collapse. Police, emergency and fire agents need to rescue civilians and extinguish fires before the civilians die and the fires spread; blocked roads hamper their movements, noisy sensors make assessing the situation hard, and loss of communications prevents effective team coordination. The work focuses on the decision processes and communication needs of the agents, addressing specifically the need to adapt rapidly to changing situations. The researchers study: 1) distributed decision-making algorithms that make the best use of available information, and; 2) multi-agent systems approaches to manage interactions and cooperation among large numbers of individual agents and teams of agents. Using this agent-based simulation tool, large cities with many people and emergency responders can be modeled to study how each decision made by each agent affects the global outcome of the disaster. The simulation tool works on real maps of real cities, giving decision makers ways of assessing how well their emergency plans will work under different circumstances, and it provides a training tool for emergency responders. Economic Impact: In the opinion of SSR-RC sponsors, when completed this simulation tool will be capable of facilitating disaster management operations by providing emergency responders and citizens with a tool that can be used not only to train emergency workers, but more importantly to better understand how disasters can affect them, to locate escape routes and to become better prepared when a disaster strikes. The long-term plan is to connect the simulator with software systems such as Eden (Emergency Development Environment), an open software system for rapid-deployment humanitarian response management (from the Sahana Software Foundation), so that the information used by the simulator can be updated with real-time data provided by citizens in the affected areas. For more information, contact Maria Gini, 612.625.5582, gini@cs.umn.edu. Security & Software Engineering Research Center (S2ERC)
A CISE-funded Center
Ball State University, Wayne Zage, Director, 765.285.8664, wmzage@bsu.edu
Iowa State University, Doug Jacobson, Co-Director, 515.294.8307, dougj@iastate.edu
Virginia Tech, T. Charles Clancy, 540.251.2090, tcc@vt.edu
Center websites: http://www.serc.net and http://www.cyber.vt.edu/s2erc
image of a horizontal line
More Efficient Access Control to System Development Data
Researchers at the Security and Software Engineering Research Center (S2ERC) have developed tools and methods that allow access to development data, information, and artifacts without sacrificing security and confidentiality. The context is one in which some participants in a development project must be granted unrestricted access to the project’s data but denied access to other data that may be present on the development network. For example, partners, customers, and subcontractors may all be co-located at the team lead’s facility, but their access to data via the host company’s corporate network must be limited to project-relevant development artifacts only, while project participants who are also employees of the host company must have access to information beyond project-specific artifacts. image of office workers in a building at night Current access control approaches tend to be binary and inflexible, with the result that developers are incorrectly denied access to data, information, and development artifacts that their work requires. In the case of a co-located team in which not all members are employees, or where access to information is on a need-to-know basis, this wastes time and money as developers wait to be added to access control lists and networked artifacts are moved from one protected directory to another, or from one server to another; this can occur, for example, when software components are reused from one project in another. The innovative approach taken by center researchers uses probability of risk rather than group membership to determine access privileges. The access control model dynamically calculates the risk of disclosing specific information based on probabilistic estimation; the access control decision can then be made by comparing the estimated risk with a pre-defined threshold. Economic Impact: The potential reduction in development costs and schedule is significant. Increasingly, development projects can no longer afford to develop internal expertise for all aspects of a product, so they must team with partners and hire subcontractors. Where access to networked information is determined by group membership, delays can be chronic, lengthy and costly, especially where access requires multiple approvals (e.g., where an artifact is to be reused from a previous or a concurrent project). Flexible access control has the potential to eliminate much of this delay and accelerate the development of the common knowledge and understanding required for successful development, saving significant dollars for many types of organizations. For more information, contact Robyn Lutz, 515.294.3654, rlutz@cs.iastate.edu.
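A minimal sketch of such a risk-based decision, assuming a per-artifact sensitivity score and a per-requester disclosure probability (all names, scores, and the threshold below are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Artifact:
    name: str
    sensitivity: float   # estimated cost of disclosure, scaled 0..1

@dataclass
class Requester:
    name: str
    p_disclosure: float  # estimated probability of improper disclosure

RISK_THRESHOLD = 0.15    # pre-defined policy threshold

def access_allowed(requester, artifact, threshold=RISK_THRESHOLD):
    # expected risk = probability of improper disclosure x cost of disclosure
    risk = requester.p_disclosure * artifact.sensitivity
    return risk <= threshold

spec = Artifact("interface-spec", sensitivity=0.2)
sub = Requester("subcontractor", p_disclosure=0.4)
print(access_allowed(sub, spec))  # True: risk 0.08 <= threshold 0.15
```

Unlike a group-membership check, the decision here adapts automatically as the risk estimates change, with no access-control-list churn.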
image of a horizontal line
Design Metrics Technology
Improvements in the software development process depend on the ability to collect and analyze data drawn from the various phases of the development life cycle. The Security and Software Engineering Research Center (S2ERC) Design Metrics Team has developed a metrics-guided methodology for maximizing and maintaining software reliability. This technology provides an unbiased framework for making efficient, cost-effective determinations about design improvements, code modifications and related testing and management strategies. Applying the methodology to software designs identifies and highlights stress points within the software, which helps improve overall design quality. Stress points are critical components in software: points where errors in coding and programming logic are likely to occur. image of satellites; satellite-related projects are one type of technology benefiting from design metrics Identifying such components in advance and applying mitigating approaches results in improved resource allocation. In the coding phase, the technology can identify stressful components and provide change impact analyses. In testing, the metrics can help determine where testing efforts should be focused and which test strategies are needed. In twenty years of metrics validation on a wide variety of projects, ranging from missile defense, satellite, accounting, and telecommunications systems to interactive games, the design metrics have identified at least 75 percent of error-prone components with very few false positives. Applying this design metrics technology is helping developers engineer higher quality and reliability into software products. The technology was awarded the Alexander Schwarzkopf Prize for Technological Innovation by the NSF I/UCRC Association in 2007, and the S2ERC Design Metrics Team continues to learn more about enhancing the reliability and dependability of critical software systems. Economic Impact: Software unreliability is often due to design faults. While software can fail for reasons other than faulty design, design mistakes occur in various forms, including design inconsistencies and semantic errors. Historically, identifying error-prone components early in the life cycle reduces software failures and their associated costs. For more information, contact Wayne M. Zage, 765.285.8664, wmzage@bsu.edu.
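The sketch below conveys the general shape of a metrics-guided stress-point screen (the metric, weights, and threshold are invented; the center's validated design metrics are more sophisticated):

```python
# flag "stress points": components whose design metrics exceed a threshold
components = {
    "telemetry": {"fan_in": 12, "fan_out": 9, "loc": 2400},
    "ui":        {"fan_in": 3,  "fan_out": 4, "loc": 800},
}

def stress_score(m):
    # toy weighted coupling/size score; real metrics are validated empirically
    return 2.0 * m["fan_in"] + 1.5 * m["fan_out"] + m["loc"] / 1000.0

THRESHOLD = 20.0
stress_points = [name for name, m in components.items()
                 if stress_score(m) > THRESHOLD]
print(stress_points)  # ['telemetry'] -> focus reviews and testing here
```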
image of a horizontal line
Formally Verifying Software Systems
This work addresses past limitations in secure software development by identifying, formalizing and automating the complex processes of identifying security-relevant data and its potential for leakage or corruption. In so doing, usable infrastructures are becoming available for defining and developing provably secure software. The work advances the state of the art in software systems development for homeland security by providing guaranteed compliance with security goals. Further efforts have identified new algorithms for important problems such as security level inference and credentials discovery. This advance is being used not only in security-typed languages, but is also helping the operating systems community define policies and services tailored to the security requirements of applications. It will significantly enhance the capability of commercial software developers to articulate and realize security goals. The researchers are working with Motorola to evaluate how the tools can be used to make applications on embedded devices such as cellular phones more secure. Economic Impact: This work was the genesis of a stream of research that sought to explore the practical use of language-based systems. It has led to several substantive research projects targeting formal verification of computing systems at organizations such as Cornell University, the University of Pennsylvania and Microsoft. The JifClipse integrated development environment tool created as part of this project has been used and adapted by several groups and has been distributed to industry and academia over the Internet. These projects support at least 10-20 paid student and professional researchers and will likely impact the quality of several commercial products; for example, elements of type-based security are being considered for several key Microsoft products. For more information, contact Patrick McDaniel, 814.863.3599, mcdaniel@cse.psu.edu.
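The core idea behind security-typed languages can be shown in a few lines. The sketch below (a toy two-level lattice in Python, not Jif itself) rejects assignments that would let secret data flow to a public sink:

```python
LEVELS = {"public": 0, "secret": 1}

def flows_to(src, dst):
    # information may flow only upward in the lattice (public -> secret)
    return LEVELS[src] <= LEVELS[dst]

def check_assignment(var_label, expr_label):
    # a compiler for a security-typed language performs this check statically
    if not flows_to(expr_label, var_label):
        raise TypeError(f"illegal flow: {expr_label} -> {var_label}")

check_assignment("secret", "public")  # ok: public data into a secret variable
check_assignment("public", "secret")  # raises: secret data would leak
```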
image of a horizontal line
Spotlighting the Code
In addition to designing, coding and testing software, software engineers need to “maintain” it. “Maintenance” consists of the innumerable adaptations, enhancements and “bug fixes” that are needed over the years that a system is in service. Numerous studies have shown that 50% or more of the life-cycle budget of most software systems is spent on maintenance, largely because maintenance software engineers struggle to understand unfamiliar code, often developed by others years before. image of screenshot of the software spotlighting the code The S2ERC “Spotlighting the Code” project’s “software reconnaissance” technique helps software engineers quickly locate the code that needs maintenance in large systems. The technique has been very influential in subsequent software maintenance research; the original 1992 paper was judged the “most influential” ten years later based on the number of citations it had received. The open source Recon and TraceGraph tools developed by the project are still downloaded hundreds of times each year. A screenshot of TraceGraph analyzing traces from an Apache web server is pictured. Economic Impact: According to the U.S. Bureau of Labor Statistics, in 2004 there were 760,840 software engineers working in the U.S. About half of these are involved in maintenance activities. If software reconnaissance enables software engineers to locate code quickly and thus decrease maintenance time by just 1%, this would represent annual salary savings of about 200 million USD. For more information, contact N. Wilde, 850.474.2548, nwilde@uwf.edu.
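The essence of software reconnaissance fits in a few lines: run the system once with the feature of interest exercised and once without, then difference the traces. The sketch below uses invented function names purely for illustration:

```python
# functions executed in two runs of the same program
trace_with_feature    = {"parse", "render", "apply_discount", "total"}
trace_without_feature = {"parse", "render", "total"}

# code executed only when the feature is exercised likely implements it
candidate_code = trace_with_feature - trace_without_feature
print(candidate_code)  # {'apply_discount'} -> start maintenance here
```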
image of a horizontal line
Safety in SmartHomes
With the increasing availability of low-cost wireless devices such as cellular phones, medical devices, and home networking devices, it is now possible to remotely and programmatically control various devices and events at remote locations (e.g., in homes, offices and in the field). However, while offering increased flexibility and convenience to users, these devices have also raised the possibility of serious malfunction due to proximity. For example, aircraft navigation systems and medical implants are adversely affected by cellular devices and by equipment that emits large amounts of radiation, such as an NMR machine. Despite stringent emission standards, the proximity of two or more devices, often of different kinds, can raise serious threats to human life. The SmartHome project has investigated, among other topics, issues of safety. A key contribution of this work is a universal software/hardware architecture for devices that radiate. Such an architecture, obtained with the help of Digital Device Manuals, allows controllers to be embedded in safety-critical environments such as hospitals and aircraft, helping to ensure safe operation of co-located mobile and other radiating devices. image of a man using a mobile phone on an airplane A collection of one or more devices, each described by its Digital Device Manual and reachable over a network, is a ConnectedSpace. The behavior of each device is expressed using an extended finite state machine. A set of policies may be enforced on the ConnectedSpace to ensure safe operation; such safety policies are monitored and enforced by safety controllers. Procedures were developed for the automatic synthesis of optimal safety controllers in ConnectedSpaces. The notions of policy relaxation and safety ranking are novel to this work. Economic Impact: The procedures developed in this work have the potential for substantial economic impact in the health care and air transportation industries through the avoidance of accidents due to interference from mobile devices. Passengers are asked to turn off their cell phones to bring the impact of cell phone communications on flight navigation systems to nearly zero; the ConnectedSpaces paradigm offers an automated mechanism for turning phones off once the pilot has indicated that they must be turned off. In hospitals, mobile phones are not allowed near critical areas because they might interfere with medical devices; ConnectedSpaces modeling, and the implementation of its algorithms, minimizes interference, reducing the risk of accidents due to inappropriate mobile device operation and avoiding the human and economic costs associated with such events. For more information, contact Aditya Mathur, 317.494.7823, apm@cs.purdue.edu.
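A minimal sketch of the enforcement idea, assuming invented device names, states, and policy (the center's ConnectedSpace controllers are synthesized automatically from Digital Device Manuals and extended finite state machines):

```python
class Device:
    # each device exposes a (drastically simplified) state machine
    def __init__(self, name, state="on"):
        self.name, self.state = name, state

def enforce(policy, devices):
    # a safety controller drives devices into the states the policy requires
    for name, required in policy.items():
        if devices[name].state != required:
            devices[name].state = required  # actuate to restore safety

devices = {"phone": Device("phone"), "navigation": Device("navigation")}
takeoff_policy = {"phone": "off"}           # phones must be off at takeoff
enforce(takeoff_policy, devices)
print(devices["phone"].state)               # 'off'
```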
image of a horizontal line
Visual Intrusion Detection System (VIDS)
Connectivity is the lifeline for many of the services we expect. Losing control of network nodes, even for the shortest period of time, can generate unpredictable consequences. Loss of connectivity can also provide an adversary with unexpected advantages, which may lead to life-threatening adverse events, injury, extended power outages, water contamination and subsequent losses of confidence in large portions of the economy. It is crucial, therefore, to mitigate network threats, and monitoring is an effective deterrent against misbehavior from both insiders and intruders. screenshot of the VIDS software The VIDS project combines research in visualization and network security to create a practical tool for network security analysis and monitoring. A visual approach offers a number of benefits over the traditional textual analysis of security data. Network-based attacks have become more sophisticated, and visualization can increase the speed at which security issues are identified; rapid identification of attacks can lead to more effective responses when decisions must be made quickly. Efficient security monitoring has always been complex, involving collecting, correlating and storing information from many sources, firewalls and intrusion detection systems. Visualization addresses the analysis of millions of log entries by distilling large amounts of data into something meaningful. Additionally, complex relationships can be hidden within the large amounts of data produced by security tools, whereas an image can convey these relationships in direct and concise form. These images can help security personnel decide which areas to investigate, and patterns that were not anticipated are often revealed when the data are graphed. The picture depicts a VIDS three-dimensional view of alerts with various priorities, shown in the graph as spheres of different colors. The aim is to provide security analysts with a tool to discover patterns, detect anomalies, identify correlations and communicate their findings. Those who wish others harm are no longer necessarily geographically distant; they can be just behind the firewall. The destructive potential of cyber-attacks is real, and as more high-technology products are designed to communicate directly without human involvement, attacks can cascade unpredictably. Economic Impact: Assisted by VIDS, analysts protect our essential digital infrastructure, identified by President Obama as “the backbone that underpins a prosperous economy and a strong military and an open and efficient government. Without that foundation we can't get the job done.” It is estimated that $1 trillion was lost to cybercrime in 2010, a figure considered low due to unreported incidents. If analysts using a system such as VIDS can avert just 1/1000 of the value of these cybercrimes, the savings would amount to $1 billion. [May 2009 remarks by the U.S. President on Securing Our Nation’s Cyber Infrastructure.] For more information, contact Dolores M. Zage, 765.285.8646, dmzage@bsu.edu. Silicon Solar Consortium (SiSoC)
North Carolina State University, George Rozgonyi, Director, 919.515.2934, rozgonyi@ncsu.edu
Georgia Institute of Technology, Ajeet Rohatgi, 404.894.7692, ajeet.rohatgi@ece.gatech.edu
Center website: http://www.nsf.gov/eng/iip/iucrc/directory/sisoc.jsp
image of a horizontal line
New Silicon Growth Techniques Lower Costs of Solar Photovoltaics
Crystalline silicon continues to dominate the photovoltaics (PV) industry in the renewable energy market. Within silicon-based solar, cast multicrystalline (mc-Si) and Czochralski (Cz) grown material account for the majority (~80%) of PV devices made, and each type has advantages and disadvantages when considering the total cost of production. image of solar panels Traditionally, much of the performance disadvantage of mc-Si materials derives from the growth methodology. Due to the nature of the solidification of the Si melt, the crystal segregates into smaller, randomly oriented crystals and suffers from many planar dislocations. These regions serve as sinks for impurities, along with crystallographic stress defects that reduce photodiode quality. This limitation of traditional as-grown mc-Si can only be overcome through advanced gettering techniques and supplemental processing that are currently not conducive to commercial application. SiSoC researchers at the Georgia Institute of Technology (GIT), along with commercial partners, have produced >18% efficient conventional cells through study of growth methodology and commercial process optimization. Collaborations with researchers at multiple companies have explored new growth techniques that seed the mc-Si casting crucible with a (100)-oriented Si crystal. With careful growth-rate and temperature control, they are able to grow nearly single-crystalline material over a large vertical and horizontal area of a casting that maintains the seed orientation. This material is called quasi-mono, cast-mono, or mono-cast (mcast-Si). Due to the crystal orientation of mcast-Si, anisotropic texturing methods normally used for Cz-Si can be applied to the wafers during cell processing. The net result is a >1% absolute boost in efficiency over isotropically textured mc-Si wafers (non-encapsulated). This material, when commercially processed, has obtained >18% efficiency, on par with Cz-Si material. However, much work remains to optimize the growth process. One issue is that only a limited percentage of an mcast-Si ingot is capable of achieving maximum efficiency; the same lifetime and contamination distributions found in traditional mc-Si remain in mcast-Si ingots. In addition, material near the edge and corner regions of the cast reverts to mc-Si, with its traditional material and efficiency limitations. image of man with gloves holding a silicon chip If processes can be optimized to increase the area of monocrystalline material, and if the material quality can be maintained at reduced cost, the advantages of the mcast-Si material would be multi-faceted. One advantage is the packing factor for wafers in a module. Mcast-Si wafers are 6x6 inch (~244 cm2) full squares like mc-Si wafers, whereas Cz-Si wafers are in most cases 6x6 inch (~239 cm2) pseudo-squares with rounded corners due to growth constraints. A module can hold the same number of mcast-Si cells as Cz-Si cells, so the mcast-Si material provides additional power by maximizing the active area of the PV module. A second advantage is that the material retains the flexibility of Cz-Si for the advanced cell structures needed to make the PV industry more competitive. Under application of one of GIT's more advanced structures, mcast-Si material has achieved >19% conversion efficiency on a full 244 cm2 substrate, a significant efficiency for full-scale cells based on materials grown using a casting methodology.
Economic Impact: A key cost-of-production metric for the PV industry is the total cost of production in terms of the power produced ($/Watt). If module efficiency is fixed at 16% and only the wafer cost is considered, mc-Si material is significantly cheaper to produce (~$0.35/Watt) than Cz-Si (~$0.50/Watt). The potential impact of this collaboratively developed mcast-Si material on the PV industry is clear: if its cost can be driven down to near mc-Si levels while maintaining performance on par with Cz-based cells, mcast-Si would provide a significant $/Watt cost advantage in the PV market. For more information, contact Ian Cooper, 404.894.4041, ian.cooper@gatech.edu.
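A back-of-the-envelope illustration of what that wafer-cost gap means at production scale (assuming the figures above are in dollars per watt, and an invented 100 MW line for scale):

```python
wafer_cost_mc = 0.35   # $/W, cast multicrystalline (assumed units)
wafer_cost_cz = 0.50   # $/W, Czochralski
gap = wafer_cost_cz - wafer_cost_mc  # $0.15/W advantage

annual_output_w = 100e6              # a hypothetical 100 MW production line
print(f"${gap * annual_output_w / 1e6:.0f}M saved per 100 MW produced")  # $15M
```

Smart Vehicle Concepts Center (SVC)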
Ohio State University, Raj Singh, 614.292.9044, singh.3@osu.edu
Texas A&M University, James Boyd, 979.458.0419, jboyd@aero.tamu.edu
Center website: http://www.SmartVehicleCenter.org
image of a horizontal line
Design Concept for Smart, Adaptive Seat Belts
Current seat belt systems use a compromise design for the median-sized individual in an average collision. These systems rely on mechanisms that have a limited capacity to adapt to varying conditions and are massive and complex. Under the direction of Marcelo Dapino at the Ohio State University, researchers at the Smart Vehicle Concepts (SVC) Center have developed an innovative design concept for a new generation of automotive seat belts. The research focuses on the enabling technologies for the adaptive seat belt concept: friction reduction via piezoelectrically induced ultrasonic vibrations and next-generation flexible sensors. By using "smart" materials, which respond to externally applied stimuli such as magnetic fields, electric fields, heat or light, these adaptive seat belts promise enhanced crash safety along with a reduction in the mass and complexity of the seat belt system. image of a crash test dummy in a research setting Crash data suggest that small changes in friction forces at the D-ring have a large effect on the chest force. Smart seat belts measure chest force using flexible smart polymer sensors woven into the seat belt webbing. Based on these measurements, small piezoelectric actuators embedded in the D-ring generate ultrasonic vibrations that automatically adjust the friction force and maintain the desired constant chest force during a crash. image and drawing of an adaptive seat belt concept The critical benefit over existing seat belts is that the system modulates the chest force independent of webbing displacement. The fundamental technologies investigated in this research, ultrasonic friction control and flexible polymer sensors, are directly applicable to adaptive seat belts and to numerous vehicle components such as suspension links, steering, powertrains and human-machine interfaces. image of the experiment developed to quantify and understand the fundamental mechanisms of Poisson-effect ultrasonic lubrication Economic Impact: Adaptive seat belts can change the economics of vehicle safety by greatly increasing the effectiveness of seat belts, with an associated reduction in injuries and insurance claims, while simultaneously improving fuel economy due to the lower bulk and mass of the overall seat belt system. Use of solid-state lubrication can eliminate the need for lubricants that can be expensive. Economic gains can also be achieved in hydraulic systems in agricultural and construction equipment, commercial vehicles, ships and aircraft, and in reducing the size of batteries needed for powering mobile devices. For more information, contact Marcelo Dapino, 614.688.3689, dapino.1@osu.edu.
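A minimal feedback-control sketch of the concept (not the center's design; the target force, gain, and sensor samples are invented): the controller raises the ultrasonic vibration amplitude, lowering D-ring friction, whenever the measured chest force exceeds the target:

```python
def friction_command(chest_force_n, u_prev, target_n=3000.0, gain=0.0002,
                     u_min=0.0, u_max=1.0):
    # u is the normalized ultrasonic amplitude; more vibration = less friction
    error = chest_force_n - target_n
    u = u_prev + gain * error   # too much chest force -> reduce friction
    return min(u_max, max(u_min, u))

u = 0.5
for force in [2500.0, 3200.0, 3600.0, 3100.0]:  # simulated sensor samples
    u = friction_command(force, u)
    print(f"force={force:.0f} N -> vibration amplitude u={u:.2f}")
```

Water and Environmental Technology (WET) Center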
Temple University, Rominder Suri, Director, 215.204.6937, rsuri@temple.edu
University of Arizona, Ian Pepper, Director, 520.626.3328, ipepper@ag.arizona.edu
Arizona State University, Morteza Abbaszadegan, 480.965.3868, morteza.abbaszadegan@asu.edu
Center website: http://wet.temple.edu/
image of a horizontal line
Advanced Oxidation Processes (AOPs) for Water and Wastewater Treatment
Advanced Oxidation Processes (AOPs) generate highly reactive species (e.g., hydroxyl radicals) for the oxidative destruction of target pollutants in water and wastewater. The WET Center has been researching a number of technologies that can be used to generate hydroxyl radicals. The Center's AOPs (ozone, UV, ultrasound, hydrogen peroxide and their combinations) have been successfully applied to the removal of a wide range of emerging contaminants (ECs), including steroid hormones, pesticides, pharmaceuticals, 1,4-dioxane and BPA. Comparative life cycle assessment (LCA) and life cycle cost (LCC) analysis are used to evaluate and optimize AOP technology and determine its environmental impacts. The best AOP is selected for a given application according to EC removal efficiency, technical feasibility, energy consumption and cost. image of wastewater treatment equipment in a laboratory setting A pilot-scale, skid-mounted water treatment system has been acquired for testing at several global locations. This highly automated AOP pilot plant, supplied by ITT (a Center member company), consists of an oxygen generator, an ozone generator, low-intensity UV lamps, and a peroxide feed pump. The unit is also equipped with inline ozone monitors to measure ozone in the air, water, and atmospheric phases, a UV sensor, a degassing unit, and a catalytic ozone destructor. The system uses three proven treatment technologies (ozone, UV, and hydrogen peroxide) in six different ways to eliminate organic pollutants, and it has the capacity to treat 2,500 to 25,000 gallons per day of water and wastewater. Economic Impact: AOPs are gaining attention in the market and have tremendous application potential for drinking water, municipal wastewater, industrial wastewater, and groundwater treatment. A member company estimates revenue generation of $30 million in five years, as well as the creation of new jobs, from the application of this technology. The global water industry is estimated at about $500 billion. For more information, contact Rominder Suri, 215.204.6937, rominder.suri@temple.edu.
image of a horizontal line
Real-time Detection of Contaminants in Potable Water Distribution Systems
The WET Center has developed a Real-Time Sensor Laboratory at the Water Village within the University of Arizona. Multiple sensors operating in parallel allow for instantaneous detection of both chemical and microbial contaminants. However, no real-time sensors are currently available for human pathogenic viruses, so center researchers took a complementary approach to ensuring the safety of potable water for consumers: advanced oxidation processes (AOPs) such as UV/H2O2 were used to oxidize both chemical and microbial contaminants, effectively destroying them. The validity of this process was evaluated with real-time sensors and, in the case of viruses, cell culture. Data showed that neither trace organic contaminants nor human pathogens survived AOP treatment. AOPs can therefore ensure the removal of contaminants that enter potable water through inadequate treatment, accidental intrusion events via broken distribution pipes, or deliberate intrusion through acts of bioterrorism. image of water treatment equipment in a laboratory setting Economic Impact: Since water is delivered to consumers via distribution systems in every town and city in the U.S., the economic impact of this proof of concept for the nation could involve billions of dollars annually. For more information, contact Ian Pepper, 520.626.3328, ipepper@ag.arizona.edu.
image of a horizontal line
Survival of Infectious Prions during Wastewater Treatment
Transmissible spongiform encephalopathies (TSEs) are a group of neurological prion diseases of mammals. In humans these include kuru, Creutzfeldt-Jakob disease (CJD), sporadic CJD, and variant CJD. In animals, TSEs include scrapie in sheep and goats, bovine spongiform encephalopathy (BSE) in cattle, and chronic wasting disease (CWD), which affects deer, elk, and moose. Normal prions found in humans have a tertiary structure involving the alpha helix; infectious prions, in contrast, have a beta-sheet structure. Of interest is the fact that when an infectious prion encounters a normal prion, it converts the normal prion to the infectious form, ultimately resulting in disease. One route of exposure to infectious prions is raw wastewater contaminated by animal rendering and meat processing operations that handle prion-infected cattle or sheep. Recently published research documented that prions can survive wastewater treatment. However, that study used Western blot technology, which only examines the amino acid sequence and does not distinguish between infectious and normal prions. image of a prion under a microscope Researchers at WET developed a new assay that detects only infectious prions. They used this assay to study the fate of prions during wastewater treatment. Data showed that prions are actually inactivated during mesophilic or thermophilic anaerobic digestion, negating the possibility of prions surviving wastewater treatment. This is significant in that, had prions survived wastewater treatment, they could have been present in biosolids and subsequently land-applied, with the potential to infect cattle. Economic Impact: The outbreak of “mad cow disease” in Britain in the 1990s resulted in the slaughter of hundreds of thousands of animals and many millions of dollars in damage. The new assay should make it possible to avoid, or at least substantially reduce, such losses in the future. For more information, contact Ian Pepper, 520.626.3328, ipepper@ag.arizona.edu. Wireless Internet Center for Advanced Technologies (WICAT) A CISE-funded Center
Polytechnic Institute of New York University, Shivendra Panwar, Director, 718.260.3740, panwar@catt.poly.edu
Auburn University, Prathima Agrawal, 334.844.8208, agrawpr@auburn.edu
University of Virginia, Barry Horowitz, 434.924.0306, bh8e@virginia.edu
Virginia Tech, Tamal Bose, 540.231.2964, tbose@vt.edu
University of Texas at Austin, Theodore Rappaport, 512.471.6500, wireless@mail.utexas.edu
Center website: http://wicat.poly.edu/
image of a horizontal line
Millimeter-Wave Propagation: Enhancing Wireless Technology
Wireless service providers today face a bandwidth crisis that will soon impede their growth unless solutions that provide more bandwidth, such as millimeter-wave technologies, are perfected and adopted within the next few years. As mobile data traffic continues to increase at an exponential rate, there is simply not enough bandwidth at lower carrier frequencies to accommodate global data traffic. While current wireless technologies provide data rates on the order of megabits per second, millimeter-wave devices will offer users gigabits per second, a 1,000-fold increase. Millimeter-wave technologies offer a solution to the bandwidth crisis because tens to hundreds of gigahertz of bandwidth are available at millimeter-wave frequencies. Successful development of millimeter-wave systems, however, requires accurate knowledge of how millimeter-wave signals propagate and are affected by their environment. Wireless Internet Center for Advanced Technologies (WICAT) researchers at the University of Texas at Austin are developing fundamental millimeter-wave propagation models for outdoor environments. The understanding gained through this research is enabling the development of improved millimeter-wave mobile broadband communication technologies and systems. The research is a breakthrough in the sense that this is the first time millimeter-wave channels have been studied in an outdoor mobility environment. image of testing wireless signals in outdoor environments
Applications of improved millimeter-wave wireless technologies are everywhere because the application space is broad and growing quickly. These include cellular phones that will provide data rates of tens to hundreds of gigabits per second; data rates in these ranges will enable mobile users to download entire libraries' worth of information in fractions of a second. Shorter-range applications, such as wireless home media centers, are already available and give consumers a hint of what is possible with data rates several orders of magnitude greater than what is currently achievable with most wireless technologies, including streaming of high-definition media content. Data centers, which are an increasingly large consumer of electricity throughout the globe, due largely to the energy required to cool large servers, will also benefit from millimeter-wave wireless technologies: data center designs will be more space and power efficient because cables will no longer prevent servers from being arranged for optimal cooling. Millimeter-wave technologies also have applications outside of traditional communications. Homeland security will likely benefit because the nature of millimeter-wave signals makes them well suited to detecting motion and identifying objects hidden under thin layers of clothing or tissue; in the near future, security personnel will likely have mobile hand-held millimeter-wave scanning devices that can detect hidden weapons. Doctors may soon have portable scanning devices that can detect tumors without the need for more expensive technologies. Economic Impact: It is clear that millimeter-wave systems will enable continued wireless technological and market growth. This WICAT research provides the fundamental knowledge to support the continued growth of the communications industry, which accounts for ~8% of the GDP of the United States, or more than $1 trillion of annual revenue. Leadership in the development of millimeter-wave technologies will help ensure the global competitiveness of our nation's communication industry. By enabling orders-of-magnitude higher data rates, it will enhance the nation's wireless capacity, productivity and competitiveness. For more information, contact Theodore Rappaport, 512.471.6500, wireless@mail.utexas.edu.
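A rough, idealized calculation shows why wider channels dominate the data-rate gain (Shannon capacity bound; the bandwidths and SNR below are assumptions for illustration):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    # C = B * log2(1 + SNR), the upper bound on error-free data rate
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr)

cellular = shannon_capacity_bps(20e6, 20)  # 20 MHz channel at 20 dB SNR
mmwave   = shannon_capacity_bps(1e9, 20)   # 1 GHz mmWave channel, same SNR
print(f"{cellular/1e9:.2f} Gbit/s vs {mmwave/1e9:.2f} Gbit/s")  # ~0.13 vs ~6.7
```

Holding SNR fixed, capacity scales linearly with bandwidth, which is why the tens of gigahertz available at millimeter-wave frequencies translate directly into order-of-magnitude rate gains.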
image of a horizontal line
Alleviating the Mobile Bandwidth Crunch
image of a wavy rainbow flowing out of a mobile phone (representing a bandwidth of data) The rapid growth in cellular wireless traffic, a result of the popularity of smart phones, has assumed crisis proportions as cellular carriers scramble to keep up. A widely quoted study from Cisco estimates that traffic will double every year or so for the next several years. This traffic will increasingly consist of popular video applications such as video streaming, which consumes about two orders of magnitude more bandwidth than a voice call. Researchers at WICAT are working on a variety of low-cost technologies to provide the required additional bandwidth. These include cognitive radio, which makes smart use of unused wireless spectrum. Other avenues include the use of 60 GHz radio technology, a part of the radio spectrum that has been shown to have much higher range, and therefore to be more usable, than previously reported. Advanced wireless channel-aware video compression and transmission technologies will also play a part. Finally, the notion of using relays to extend the range, coverage, and capacity of cellular networks has been pioneered by WICAT. All of these innovations are now at various stages of adoption or consideration by cellular network vendors, many of which are industry members of WICAT; for example, partner companies such as InterDigital and Samsung have incorporated WICAT research into their future plans. WICAT has also participated in standards bodies to influence their trajectory. Once deployed by cellular carriers, these technologies will enable new applications, many of them video based, that would have been infeasible without the additional bandwidth they unlock. The focus of WICAT is to facilitate the growth of low-cost, mobile access to the Internet using innovative technology. Economic Impact: The cellular business in the U.S. has annual revenue of approximately $200 billion and employs over 250,000 individuals. Billions are spent annually by the carriers to keep up with demand, most recently with the upgrade to the new 4G technology. WICAT estimates that the technologies it is pioneering will save carriers tens of millions of dollars over the next decade or so by providing innovative, efficiency-enhancing solutions to some major problems. This will result in lower costs for consumers and a more internationally competitive industry. For more information, contact Shiv Panwar, 718.260.3740, panwar@catt.poly.edu.