Article courtesy of Michelle Taylor, Editor-in-Chief | February 6, 2015 | Laboratory Equipment | Shared as educational material
Pittcon 2015 unveils new discoveries in equipment and lab solutions that accelerate scientific research.
Jazz music, purple, green and gold, Bourbon Street, the French Quarter, Louisiana Creole cuisine and science. While the last one may stick out like a sore thumb, it finds its place among New Orleans traditions when the Pittsburgh Conference on Analytical Chemistry and Applied Spectroscopy returns to town March 8 to 12. Pittcon 2015 has more than 900 exhibitors and is set to draw 18,000 attendees, exceeding the 16,200 mark set last year in Chicago.
There’s a reason Pittcon is held annually. All over the world, scientists are making new discoveries by the day, and companies are developing new equipment and solutions to help them do so. In its 65 years, Pittcon has been home to countless announcements that have revolutionized the industry and accelerated the evolution of scientific discovery.
A few themes emerge from Pittcon every year at the conclusion of the show—last year it was hybrid and correlative microscopy systems, food testing and compact spectrometers. This year, signs are already pointing to nanotechnology as a central motif, as evidenced by the selection of Naomi Halas for the Wallace H. Coulter Lecture. Halas, the Director of Rice Univ.’s Richard E. Smalley Institute for Nanoscale Science and Technology, is a pioneer in the field of plasmonics, creating the concept of the “tunable plasmon” and inventing a family of nanoparticles with resonances spanning the visible and infrared regions of the spectrum. Her lecture will focus on new types of metal-based nanoparticles and nanostructures whose varied shapes and compositions have given rise to new strategies for harvesting, controlling and manipulating light.
A recent Laboratory Equipment reader survey supports the notion of a growing nanotechnology industry, with 41 percent of respondents saying they expect nanotechnology to be the biggest driver of next-generation instrumentation in the next decade.
“Nano is becoming very pervasive,” Michael Meador, Director of the National Nanotechnology Coordination Office, told Laboratory Equipment. “Some of the next breakthroughs we will see in advanced technology will actually come as the result of marrying nanotechnology with other techniques, whether it is advanced manufacturing or biotechnology. Once you start to do that, you are going to create even more demand for instrumentation to support it.”
Pervasive is the perfect word to characterize current nanotechnology research. According to Meador, it’s affecting a multitude of industries, including:
Sensors: Wearable technology has the pulse of the industry right now, and it will only become more important from both a human performance and a health management standpoint. Autonomous sensor systems with communications and data-processing capabilities will allow for better monitoring and detection in everything from homeland security and defense to automobile performance.
Water: Nano advancements enable the use of nanoscale membranes to execute water purification on demand using a minimal amount of energy. Additional uses include the development of sensors to measure water quality, reduction in water consumption and contamination and the development of smart fertilizers to reduce agricultural runoff.
Natural gas and petroleum: Nanotech plays a role in fracking fluids to improve recovery, as well as in the design of coatings and insulation to reduce the losses that occur when natural gas is transferred and shipped.
Energy: Already improving energy storage and batteries, nanotech can also have an impact on energy distribution systems in the form of carbon nanotube-based cabling for power distribution lines. Currently, the nanotubes do not have the correct electrical conductivity and current-carrying capability to be a viable option, but researchers are working on improving the electrical properties. Meador believes there could be a breakthrough in the next five to 10 years that would allow these carbon nanotube-based wires to be used as power cables.
Materials: Nanotechnology is making strides in developing materials with higher strength and stiffness but lower weight as replacements for the conventional metals and composites currently used in aircraft, automobiles and other transportation systems. These new nano-composites have the potential to take weight out of a vehicle, enabling significant reductions in fuel consumption. For example, a study out of Rice Univ. suggests that the use of advanced composites in aircraft and automobiles could reduce U.S. petroleum consumption by 6 million barrels a day and reduce global CO2 emissions by 25 percent.
Medicine: There are currently 17 nanotechnology-based drugs in clinical trials, with opportunities for the development of more nanoparticle-based diagnostics and therapeutics. Current applications focus on delivering a drug payload to a specific target—for example, smart drugs that deliver anti-cancer treatment straight to a tumor, minimizing harmful effects on healthy cells.
Another example in this area of research comes from MIT researchers who recently developed a system that could deliver optical signals and drugs directly into the brain via complex multimodal fibers that are about the width of a human hair. In addition to transmitting different kinds of signals, the new fibers are made of polymers that closely resemble the characteristics of neural tissues, allowing them to stay in the body much longer without harming the delicate tissues around them. Essentially, the researchers built neural interfaces that will interact with tissues in a more organic way than devices that have been previously used.
That is just one example of the intersection of nanotechnology and modern medical research. The expectation of growth in coming years is large, and is fueled partly by President Barack Obama’s 2013 “BRAIN Initiative.” Modeled on the famous and successful Human Genome Project, the BRAIN Initiative is a research effort to revolutionize understanding of the human mind and uncover new ways to treat, prevent and cure brain disorders like Alzheimer’s, schizophrenia, autism, epilepsy and traumatic brain injury. Since the announcement in 2013, dozens of agencies, academic institutions, scientists and other contributors have made significant commitments to advancing the Initiative, including more than $300 million in public and private investments.
In terms of instrumentation, nanotechnology researchers rely most heavily on instruments that help them understand and characterize the chemical composition of materials. These include microscopy, both electron and probe techniques, as well as spectroscopy techniques like Raman.
Areas of growth for both instrumentation and research include doing more in situ, reviewing processes in real time and getting as close to the nanoscale as possible, says Meador.
“For instance, you can do measurements in situ during the synthesis of carbon nanotubes to better understand the mechanisms by which these nanotubes grow,” he explained. “Then, you can use that information to control the growth process to get better-quality nanotubes, or nanotubes with a specific chirality or size.”
Two years ago at Pittcon, big data emerged as one of the hot topics. Big data will still be big at this year’s conference, but companies and researchers are now trending toward leveraging smaller portions of said data.
“The instrumentation is developing faster than we can buy and assimilate it,” Nina Fedoroff, Professor of Biology at Penn State Univ., told Laboratory Equipment. “What needs improvement is mostly analytical approaches to large amounts of data. Storage is not as much the problem as making sense of all the data.”
Fedoroff, former science advisor to the Secretary of State and National Medal of Science winner, explains that scientists have not yet figured out how to harness differing data. For example, in molecular biology, one important area of research involves understanding everything that is present in nuclei using various kinds of labeling procedures. But different approaches yield different data—and that’s where researchers are getting stuck.
“If you think about the amount of information in making a molecular-level map of even a mouse’s brain, you are talking about petabytes of data,” Stephen Ross, Nikon’s General Manager of Product and Marketing, told Laboratory Equipment. “It’s so much information that it can be very unruly to work with—almost impossible and very costly to store, as well as very hard to access and use. I think that will get better, but I think currently, the majority of people would just like to be able to look at a specific area.”
As it stands now, data is hard to integrate, but as technology develops and manufacturers respond to users’ needs, that may change. Thirty percent of our survey respondents ranked high-performance computing as the emerging technology most likely to affect research in the next decade, behind only instrument miniaturization (43 percent) and data search and retrieval systems (36 percent). Data analysis was the first application of scientific software, and it’s still clearly a guiding force. Ever-increasing computing power and parallel computation platforms suggest that this is not going to slow down.
However, the scientific software industry has one big challenge to overcome before it sees complete integration—usability. Whether speaking about mass spectrometers, microscopes or software platforms, most scientists hold firm in their desire for easier-to-use solutions. Year after year, survey respondents have chosen ease-of-use as the element they would most like manufacturers to focus on in the next 10 years. This year is no different—ease-of-use was the top selection with 58 percent of the vote.
“It’s not just a slick user interface that is lacking,” Louis Culot, CEO of BioData, told Laboratory Equipment. “Too often scientific software has the power to solve a researcher’s problems, but the effort to learn the software gets in the way, and the software contains power that might be beyond the researcher’s expertise. There are two avenues to solving this and they both need to be addressed. First, the software has to be intuitive and simple to use, approaching a consumer software experience. But that, by itself, is not enough. For intense analytics or statistics, for example, the software has to do a better job of intuiting the nature of the data and problem the research is trying to solve, and be more of an expert system rather than a set of tools.”
In his solution to these software challenges, Culot touches on the influence that consumer communication technologies often have over new scientific software approaches. A combined 75 percent of survey respondents said they expect consumer technologies to affect their research somewhat (52 percent) or very much (23 percent). This has already begun with software enhancements to integral instrumentation, like balances, that resemble the popular pinch-to-zoom gesture on smartphones. Scientists are members of the larger consumer population, so it’s only natural for new communication and collaboration methodologies they are familiar with outside the lab to transition into their daily research activities. At the same time, Culot warns, it’s not always a good idea to try to force a product that was designed for consumer use into a scientific setting.
“Scientific search and retrieval will take more cues from consumer search in that we will see much better machine prediction and assistance as part of the search process,” concludes Culot. “Search software will have access to a broader set of databases, including many non-traditional sources, and it will be able to make its own connections between those databases above and beyond what the human researcher originally specifies. Software in general will continue to move to the cloud and off of the client. Predictive and artificial intelligence technology will continue to enhance and further inform human analysis. I expect software to begin to suggest fruitful avenues for research and collaboration, in addition to assisting with the research process.”
Instrumentation and purchasing outlook
According to Frost & Sullivan’s “Mid-Year Report & Analysis of the Lab Products Market” for the Laboratory Products Association, the global lab market is forecast to grow to $40.5 billion in 2015 and $41.7 billion in 2016. This equates to a $1.1 billion increase from 2014 to 2015. This year, instrumentation and equipment accounts for $14.0 billion (34.6 percent), chemicals, reagents and life science kits for $14.8 billion (36.7 percent) and consumables for $11.6 billion (28.6 percent).
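The segment figures above can be sanity-checked with a quick calculation (a hypothetical snippet, not part of the Frost & Sullivan report; small gaps versus the quoted percentages come down to rounding in the source):

```python
# Frost & Sullivan 2015 lab-market segments (USD billions), as quoted above
segments = {
    "instrumentation & equipment": 14.0,
    "chemicals, reagents & life science kits": 14.8,
    "consumables": 11.6,
}

# The segments sum to $40.4B, a rounding hair under the $40.5B headline figure
total = sum(segments.values())

# Each segment's share of the $40.5B headline total, as a percentage
shares = {name: round(100 * value / 40.5, 1) for name, value in segments.items()}
```

Run this way, instrumentation and consumables come out at 34.6 and 28.6 percent, matching the report; the middle segment computes to roughly 36.5 percent against the quoted 36.7, again a rounding artifact.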
In terms of global forecasted revenue, Europe is still slightly ahead of the U.S., totaling an expected $15.9 billion in 2015 compared with the U.S.’s $15.6 billion. However, the U.S. is projected to nearly close the gap come 2016, with $16.1 billion to Europe’s $16.2 billion. China is expected to stay the steady course, increasing by about $100 million each year to reach a total of $1.8 billion in 2016.
On the heels of an increase in global and U.S. revenue, most individual R&D budgets for this year are expected to meet product development goals. Sixty-seven percent of Laboratory Equipment survey respondents said their budgets would cover more than half of their research goals for the year. Unfortunately, 18 percent said it was unlikely, and another 15 percent said their goals would not be met.
Analytical instruments appear to be the most important tools in the lab, with 49 percent of respondents saying they expect to purchase new systems in the next 12 months. An additional 50 percent said that, given a hypothetical unlimited budget, they would spend most of it on analytical systems as well. Beyond analytical instruments, 2015’s purchasing plans include: software (26 percent), meters and monitors (25 percent), spectrometers (24 percent), physical testing equipment (20 percent) and imaging systems and microscopes (16 percent). Most of the equipment is a replacement for existing systems (51 percent) or an addition to existing systems for enhanced capabilities (25 percent). Biotechnology (43 percent), nanotechnology (41 percent), medical/diagnostics (36 percent), computers/electronics (30 percent) and automation/robotics (27 percent) are the top five expected drivers of instrumentation in the next 10 years.
Pittcon trends change annually because science changes daily. Ten years ago, not many predicted the rapid rise of nanotechnology, and while we knew computers and software would be important, the depth of their integration in our society was underestimated. While forecasts can be made and predictions published, there are always surprises when leaders, consumers, managers, executives and researchers get together to discuss the industry. Collaborations are sparked, ideas are fleshed out and the path to the future is set.