Betting on the Cloud: Life Science Data Management


Whether cloud computing emerged from J.C.R. Licklider’s vision of an “intergalactic computer network” or computer scientist John McCarthy’s notion of computation delivered as a public service, Computer Weekly’s “History of Cloud Computing” puts its genesis in the 1960s.

Although it didn’t gain traction until the 1990s, cloud computing has now passed the hype phase and is fully established – though more so in some industries than others. IHS Technology projected that enterprise spending on the cloud would reach $235.1 billion by 2017 – triple the 2011 investment.

Forbes reported that cloud computing is the strongest technology investment sector for the third year in a row, according to Deloitte's 2015 Global Venture Capital Confidence Survey. 

Life Science and the Cloud

Cloud computing is generally credited with three main benefits when it comes to business agility: increased collaboration, cost savings, and interoperability/connected operations. Given these advantages, what opportunities does the cloud present for life science? 

Nature reports that “biologists are joining the big-data club” and looking to the cloud to solve data access and volume challenges. “Much of the construction in big-data biology is virtual,” writes Vivien Marx.

Accenture sees opportunities for life science companies to “accelerate their adoption of the cloud across broader parts of their organization and value chain.” In its Outlook report, “Three Ways the Cloud is Revolutionizing Life Sciences,” Accenture projects:

1. Personalized medicine will benefit from cloud-based social ecosystems; 

2. New-product development improvements will draw from more external innovation and multi-sourced data; 

3. A more global operating model will drive new partnerships and expanded markets.

Cloud Security Concerns

Whenever cloud computing for data management comes up, so too does the topic of security. Companies question whether the cloud is safe for intellectual property and regulatory compliance. A 2012 survey by IDG Enterprise found that 70 percent of respondents – “a significant margin” – saw security as the primary barrier to cloud adoption.

By 2014, respondents had shifted to calling on public and private cloud service providers to “create and communicate security policies to be considered a valued partner.”

Interestingly, the 2014 survey (the most recent by IDG) also found that the cloud increases IT flexibility, innovation, and responsiveness, and that cloud investments continue to rise, with enterprise organizations investing in cloud solutions significantly more than SMBs.

CIO magazine responded to the cloud security issue with an article uncovering the “20 greatest myths.” Suffice it to say, “server huggers” don’t get much sympathy.

Jens Hoefkens, director of strategic marketing and research at PerkinElmer, adds that there are solid technology solutions for security concerns, from encryption of data both in flight and at rest to the use of local servers.

Given that companies are dealing with sensitive research data (and often patient data), PerkinElmer designed its cloud-based system with a strong emphasis on maintaining data safety and privacy. All data – whether at rest or in transit – is subject to strong encryption, and the system uses Virtual Private Cloud (VPC) architecture for secure access. The Signals system deploys in multiple geographic regions to help customers meet various regulatory and legal requirements regarding the physical location of the data.
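For readers who want a concrete picture of those two controls, here is a minimal sketch of how encryption at rest and in transit is commonly enforced on a public cloud. It uses AWS and boto3 purely as an illustration; the bucket name and policy are hypothetical stand-ins, not PerkinElmer’s actual configuration.

```python
# A minimal sketch (not PerkinElmer's actual configuration) of the two
# controls described above, using AWS as an example provider:
#   1. encryption at rest    - default server-side encryption on a bucket
#   2. encryption in transit - a policy that rejects any non-TLS request
# The bucket name is an illustrative placeholder.
import json
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
bucket = "example-research-data"  # hypothetical bucket

s3.create_bucket(Bucket=bucket)

# 1. Encrypt all objects at rest with a KMS-managed key by default.
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}]
    },
)

# 2. Refuse any connection that is not encrypted in transit (TLS).
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyInsecureTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }],
}
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
```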

Life Science & Pharma Trending Toward the Cloud

For life sciences customers, and the pharmaceutical industry in particular, cloud-enabled solutions meet the needs of two big trends:

1. Outsourcing: pharmaceutical organizations are doing more work outside their firewalls

2. High-content data: a growing list of technologies generating data that requires extremely large and costly computing and storage capabilities

While many life science organizations have moved certain parts of their operations to the cloud – Human Resource activities, e-mail, and even clinical trial data – for some reason R&D is the last holdout.

“Ninety percent of clinical trials are running on electronic data capture systems – all cloud-based,” says Daniel Weaver, senior product manager for translational medicine and clinical informatics at PerkinElmer. “So the most critical, valuable, and expensive data they generate is already on the cloud. The concept that research data can’t be on the cloud doesn’t hold up to scrutiny.”

In fact, researchers in need of the data – and of collaborating around it – seem to agree. A collective of genomics researchers recently called on funding agencies to help establish a “global genomic data commons in the cloud,” to be accessed by authorized researchers worldwide.

Cloud: Control and Flexibility

PerkinElmer’s leadership embraces a cloud-computing future, recognizing that data analytics is a business driver. The cloud is necessary to best meet the needs for data management, integration, and analysis. As a result, the company is creating informatics solutions that leverage the cloud’s benefits of lower costs, zero installation, Internet of Things connectivity, and collaboration: PerkinElmer Signals and Elements.

Nicolas Encina, vice president of the Innovation Lab at PerkinElmer, says the company is making a purposeful shift to “connect our instruments and make them smarter by virtue of linking them up to a cloud infrastructure.” Structuring, storing, and mining data in the cloud, and linking to a visualization platform enables informed decision-making, from managing instruments to downstream data processing. 

Perhaps we have not achieved an intergalactic computer network or computing as a public service. But the cloud does provide life science and other organizations with a secure, seamless, flexible, and low-cost means of effective data management.

What benefits are you deriving from the cloud? Are you leveraging industry best practices in cloud computing?

New Tools for the Translational Researcher


Developing treatments that take individual variability into account (“personalized medicine”) has given rise to a new discipline in science: translational research or translational medicine. Scientists in this field work to translate biological phenomena into targeted, evidence-based medicines that improve health and treat disease by more optimally matching drugs and individuals.

Currently, the field of translational research is accelerating, with 90 percent of pharmaceutical companies reportedly engaged in translational projects as a means of reducing costs while improving outcomes.

This translational revolution affects academic research as well. For instance, the National Institutes of Health created the National Center for Advancing Translational Science (NCATS) in 2012 to speed the translation of basic research into new treatments and cures for patients – moving more quickly from “bench to bedside.” The goal is to merge basic, preclinical, and clinical research with clinical implementation and public health data to develop new approaches and demonstrate usefulness. 

Moreover, translational medicine is the focus of many governmental initiatives around the world including:

• Genomics England, a company wholly owned and funded by the UK Department of Health, set up to deliver a flagship project to sequence 100,000 whole genomes from NHS patients by 2017.

• The Obama Precision Medicine Initiative, announced in early 2015.

• Collaborations between public and private institutions, like the human genetics initiative Regeneron launched last year with Geisinger Health System of Pennsylvania and the National Human Genome Research Institute.

Top Challenges for Translational Research

But research cost and complexity are among the top challenges for clinical research and translational projects, according to NCATS. Contributing to cost and complexity are the growing sources, types, and volumes of data stemming from newer high-content techniques in translational research, including: 

• Digital pathology

• Multiplexed flow cytometry

• Next-generation sequencing

• Proteomics

• Metabolomics

• Genomics

• Cellular assays

Translating Data Management & Analytics into Knowledge

Moore’s Law has propelled technical innovation, with faster and more precise systems generating vast amounts of data, so the challenge has become effective data management and data analytics. How do we make sense of the data and convert it into knowledge?

Current software solutions are ill-equipped to help translational researchers search, access, integrate, and analyze all the data that could help them make that next breakthrough. Therefore, as the field of translational medicine continues to grow, researchers need best-in-class solutions that lend speed and ease to their work. Self-serve access to a wide variety of data, using an informatics solution designed specifically for translational medicine workflows, will enable these researchers to more quickly and easily identify and manage the biomarkers that are essential to realizing the promise of personalized medicine.

“Unless you can start harnessing data and making sense of it, in an automated way, with systems that are engineered to solve big data problems, you’ll be overwhelmed by the data very quickly,” says Nicolas Encina, vice president of the Innovation Lab at PerkinElmer. “You can no longer effectively manage this data manually, and you certainly can’t analyze it or process it manually either.”

Introducing Signals™ for Translational

As a company dedicated to providing products and services that help researchers answer questions that improve life, PerkinElmer has built, from the ground up, a cloud-based data management and aggregation platform designed specifically to address the translational workflow.

PerkinElmer Signals™ for Translational offers out-of-the-box support for the complete precision medicine workflow – from data acquisition to biomarker discovery to validation. The purpose-built, Software-as-a-Service (SaaS) platform easily integrates experimental and clinical data, enabling translational scientists to search for, retrieve, and analyze relevant aggregated data from across internal and external sources. PerkinElmer Signals™ is designed with flexible, scalable data models that provide the agility and collaborative environment required to support modern life science research.

“Too often, people think about data oriented from the informaticist’s or technologist’s point of view,” says Daniel Weaver, senior product manager for translational medicine and clinical informatics at PerkinElmer. “Signals for Translational presents the data in a way a regular scientist will be able to understand. It’s organized around concepts a scientist gets, around the subjects of clinical trials, patient visits, samples collected, etc. We view it as the next generation of how users will interact with data – by connecting instruments to a global cloud environment and serving as a bridge from the laboratory to the Internet of Things.”

By connecting instruments and systems involved in translational research to the cloud, PerkinElmer offers researchers and project managers more insight into how the translational project is performing, when data is available, and what the data is telling them – in a sense, becoming a central nervous system for the connected research environment.
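To make the subject/visit/sample organization Weaver describes more concrete, here is a minimal, hypothetical sketch of joining clinical context to assay results. The tables and field names are invented stand-ins, not the actual Signals data model.

```python
# A minimal, hypothetical sketch of the data shape described above:
# organizing translational data around subjects, visits, and samples,
# then joining clinical attributes to assay results for analysis.
# Table contents are invented; this is not the Signals data model.
import pandas as pd

subjects = pd.DataFrame({
    "subject_id": ["S001", "S002"],
    "arm": ["treatment", "placebo"],
})
visits = pd.DataFrame({
    "subject_id": ["S001", "S001", "S002"],
    "visit": ["baseline", "week4", "baseline"],
    "sample_id": ["A1", "A2", "B1"],
})
assays = pd.DataFrame({
    "sample_id": ["A1", "A2", "B1"],
    "biomarker_x": [1.2, 3.8, 1.1],
})

# One aggregated view: clinical context plus experimental measurements.
merged = visits.merge(subjects, on="subject_id").merge(assays, on="sample_id")
print(merged)
```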

Get to Know PerkinElmer Signals™ for Translational

If you are interested in learning more about PerkinElmer Signals™ for Translational, we’re offering a webinar looking at examples of its use in areas as varied as Translational Medicine and High Content Screening, led by Jens Hoefkens, director of strategic marketing and research at PerkinElmer.

You will learn how PerkinElmer Signals™ will enable you to: 

• Access all your research and clinical data the way you want to, when you need to

• Develop and validate your hypothesis with integrated analysis solutions

• Enable effective collaboration within and across organizational boundaries

“Pharmaceutical companies are poised to generate very large volumes of complex datasets,” Hoefkens says. “The webinar will cover how Signals supports modern life science research with flexible and scalable data models.” 

How are cost and complexity affecting your translational research?

Big Data Analytics: Finding & Developing New Drugs Faster


Big Data’s reach is stretching farther and farther. Finance, marketing, healthcare, life sciences – virtually every industry is looking to gain a competitive advantage from using data analytics and business intelligence platforms to uncover more insights from data – faster. According to advisory firm EY, research by the Economist Intelligence Unit indicates that 77 percent of companies that maximize how they use data are also ahead in financial performance.

EY’s “Order from Chaos” report looks at Big Data from a good news/bad news perspective: powerful data and visual analytics, along with the maturing of open architecture, cloud computing, and predictive analytics, are helping organizations get better with data, but many organizations aren’t moving fast enough to keep up. One reason: the complexity of data and its myriad sources. PerkinElmer’s partner TIBCO recently published a post on the promises and pitfalls of Big Data, noting that the sheer volume of data, the speed with which it is generated, and the siloing of multiple data sources were overwhelming companies.

Data Analytics Driving Research & Development

According to McKinsey, the pharmaceutical industry is seeing a growth in data from multiple sources, including the R&D process, retailers, patients and caregivers. Breaking it out of data silos and putting it to immediate good use is the trick. “Effectively utilizing these data will help pharmaceutical companies better identify new potential drug candidates and develop them into effective, approved and reimbursed medicines more quickly,” state the McKinsey analysts.

When it comes to effective use of data, however, the benefits of a business intelligence platform are enhanced by domain expertise. Life science and pharmaceutical R&D data generated from multiple, disparate sources requires effective integration as well as powerful analytical and visualization technologies to draw conclusions that give businesses leverage. Purpose-built data analytics solutions – built from the ground up by people who understand the workflows, the studies and experiments, and the data needs of researchers and scientists – provide a more robust, relevant experience than one-size-fits-all solutions.

The Future of Life Science Data Analytics

In its report on Big Data in pharmaceutical R&D, McKinsey invites us to “imagine a future” where these things are possible:

• Predictive modeling – leveraging available molecular and clinical data – helping to identify new potential candidate molecules with a high probability of success as drugs that safely and effectively act on biological targets.

• Clinical trial patient profiles improving with significantly more information, such as genetic factors, to enable trials that are smaller, shorter, less expensive, and more powerful. 

• Real-time monitoring of clinical trials rapidly identifying safety or operational signals, leading to action that avoids delays or potentially costly issues.

• Electronically-captured data flowing easily between functions, instead of being trapped in silos – powering real-time and predictive analytics to generate business value.

These challenges – or opportunities – can be addressed by using advanced analytics and applying appropriate visualizations.
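As a concrete illustration of the first item on McKinsey’s list, here is a minimal sketch of predictive modeling for candidate prioritization. The data is synthetic, and the features and model choice are hypothetical stand-ins, not a validated discovery pipeline.

```python
# An illustrative sketch of predictive modeling for drug discovery:
# scoring candidate molecules by predicted probability of success.
# Features, data, and model choice are hypothetical stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for molecular descriptors (e.g., weight, logP,
# polar surface area) plus an outcome label from historical programs.
X = rng.normal(size=(500, 3))
y = (X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Rank unseen candidates by predicted probability of success so the
# most promising molecules can be prioritized first.
scores = model.predict_proba(X_test)[:, 1]
top = np.argsort(scores)[::-1][:5]
print("Top candidates (index, score):",
      [(int(i), round(float(scores[i]), 2)) for i in top])
```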

EY advises that in order to harness the power of big data and advanced analytics, companies should manage data and analytics projects as a portfolio of assets – similar to a financial investment portfolio – using an agile analytics approach to balance value. Here are some life sciences analytics from an “ideal portfolio”:

Functional Area: Research & Development

• Chemistry and Lead Discovery
• Genomics Data Analytics
• High Content Screening
• High Throughput Screening
• Quantitative Pathology
• Flow Cytometry
• Translational Research

Functional Area: Clinical Development

• Translational Medicine
• Project and Portfolio Management
• Clinical Trial Operations
• Clinical Trial Data Management
• Risk-Based Monitoring
• Clinical Trial Data Review
• Health Outcomes

Real-Time Data Analytics for Biopharmaceuticals

There is evidence that pharmaceutical companies are incorporating real-time analytic solutions to effectively analyze key data without the time lags associated with previous analysis methods.

• A top 10 global pharmaceutical company sought to reduce the time, cost, and risk of running its clinical trials, while accelerating time-to-market. Deploying a data visualization and analysis platform contributed a 20–40 percent productivity improvement in its clinical data review, saving three to four days per month in a 10-week study.

• Typical challenges in preclinical and clinical safety assessment are significantly minimized by interactive graphical data analysis. At Novartis, this approach has proved to be “efficient, powerful, and flexible” in improving both detection and systematic assessment of safety signals.

• As part of Roche’s “Fail Fast” strategy, the right analytics platform helps safety scientists and data scientists work collaboratively to solve queries from the safety science group. Data Provision Specialist Joel Allen says the ability to analyze and visualize data – to correctly answer queries in the most expeditious way – leads to better data-driven decisions.

EY’s report predicted great benefit “if life sciences organizations are able to apply their acumen with big data and analytics to drive decisions and engage smart collaboration.” Yet Gartner cautioned that through this year, 85 percent of Fortune 500 organizations would fail to effectively exploit big data for competitive advantage.

At PerkinElmer, we’re applying our expertise in the life sciences, pharmaceutical R&D, and clinical development to provide informatics solutions and services that help you take advantage of everything Big Data has to offer. You can learn more about PerkinElmer's informatics solutions powered by TIBCO Spotfire®, the leading data analytics and visualization platform at the heart of the three customer success stories above.

How is your company using all of its data to drive its competitive advantage?

High Content Screening Improves Pipelines, Advances Drug Discovery


From Screening Drug Compounds to Better Understanding Disease, Image-based HCS Maximizes Value of Data

Extracting Maximum Value from Phenotypic HCS Data

It’s well known that if you’re going to fail in drug discovery, it’s best to do it early. No one wants to waste energy, effort, and – more importantly – time and money taking a compound to clinical trials, only to see it fail late in the process and at great cost. For a more productive discovery pipeline, image-based high content screening (HCS) is one phenotypic drug discovery strategy that maximizes data value.

According to Nature, phenotypic screening is a strategy for the identification of molecules with particular biological effects in cell-based assays or animal models. The technique gives researchers data on hundreds of parameters – over hundreds of wells and hundreds of plates. Unlike target-based approaches, phenotypic image-based screens without a known target offer a “boundless number of opportunities when it comes to what the numbers will tell you,” says Dr. Philip Gribbon of the Fraunhofer Institute.

The goal in phenotypic screening is the detailed multiparametric characterization of the cellular phenotype. It’s important to capture all the information the cell is providing, rather than single parameters. With comprehensive phenotypic information, the proper tools can help you derive actionable biological conclusions.

Using physiologically relevant model systems and leveraging rich information from image-based, high content screens – at the beginning of compound testing – helps discern those phenotypic changes that are present without undesirable effects on the system of interest.

We know that phenotypic screening strategies were more productive than target-based drug discovery strategies in discovering first-in-class small-molecule drugs from 1999 to 2008. So while it’s not a new technology, phenotypic screening is enjoying renewed attention.

Reaching HCS’s Full Potential

In a 2014 Journal of Biomolecular Screening article, “Increasing the Content of High-Content Screening: An Overview,” authors Shantanu Singh, Anne E. Carpenter, and Auguste Genovesio “assess whether HCS is as high content as it can be.” They found that the vast majority (60–80 percent) of HCS studies used univariate analysis tools and only one or two image-based features in the analysis. Better analytical approaches were still needed, they said, but they concluded: “As data analysis approaches for these more complex types of HCS experiments become well developed and incorporated into easy-to-use software, we anticipate more powerful applications of HCS to blossom, the value of a typical experiment to increase, and ultimately the technology to be more widely adopted.”

That was music to our ears, coming at a time when we introduced High Content Profiler™ specifically to address the need for a more robust, interactive, easy-to-use, fast, and powerful solution for analyzing HCS images. Singh et al. argued that “Advanced data analysis methods that enable full multiparametric data to be harvested for entire cell populations will enable HCS to finally reach its potential.”
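To illustrate the multiparametric approach Singh et al. call for, here is a minimal sketch that combines all measured per-cell features – rather than thresholding one or two – before separating phenotypes. The feature matrix is synthetic, and the method shown (PCA plus k-means) is just one common choice, not the High Content Profiler algorithm.

```python
# A minimal sketch of multiparametric HCS analysis: instead of using
# one or two image features, combine all measured features per cell and
# let an unsupervised method separate phenotypes. Data is synthetic.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# 2,000 cells x 50 image-derived features (intensity, texture, shape...),
# drawn from two synthetic phenotype populations.
untreated = rng.normal(0.0, 1.0, size=(1000, 50))
treated = rng.normal(0.8, 1.2, size=(1000, 50))
features = np.vstack([untreated, treated])

# Standardize, reduce to the dominant axes of variation, then cluster.
scaled = StandardScaler().fit_transform(features)
embedding = PCA(n_components=10).fit_transform(scaled)
labels = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(embedding)

print("Cells per phenotype cluster:", np.bincount(labels))
```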

So How Can You Reach Your Full Potential in Phenotypic HCS?

If you’re looking to complement your target-based drug discovery with phenotypic screening and image analysis, make sure you’ve got the best solutions – namely, instruments with no-compromise speed and sensitivity, and the ability to manage, analyze, and visualize the mass of data from such screening.

For immediate phenotypic validation of your hits, you want:

• higher-throughput, extremely sensitive imaging

• automatic image transfer and powerful image segmentation and analysis

• advanced statistics, machine learning, and data analysis methods

From image acquisition to hit selection, PerkinElmer’s uniquely comprehensive phenotypic screening solutions help you generate rich, multi-parametric data with integrated downstream analysis, leading to better hypothesis generation and validation. In this brief video, customers discuss some of the advantages they’ve found using our HCS systems and informatics platforms to accelerate phenotypic screening workflows and better understand disease.

What you may not be aware of is that a new version of High Content Profiler, still powered by TIBCO Spotfire®, addresses more than just HCS data; it covers a wide variety of analytical needs for high content analysis, including:

• Drug profiling and clustering
• Cytometry analysis
• High throughput screening
• Curve fitting
• High dimensional data analysis
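
As one concrete example of the curve-fitting item above, here is a minimal sketch of a four-parameter logistic (4PL) dose-response fit, the standard model behind IC50 estimates. The data points are synthetic, and the sketch uses SciPy rather than High Content Profiler itself.

```python
# A minimal sketch of 4PL dose-response curve fitting, the standard
# model behind IC50 estimates. Doses and responses are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ic50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** hill)

# Synthetic responses over an 8-point, 10-fold dilution series.
doses = np.logspace(-3, 4, 8)  # concentrations in uM
responses = four_pl(doses, 5, 100, 1.0, 1.2) \
    + np.random.default_rng(2).normal(0, 3, 8)

params, _ = curve_fit(four_pl, doses, responses, p0=[0, 100, 1.0, 1.0])
print(f"Estimated IC50: {params[2]:.2f} uM")
```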

If your target-based approach isn’t delivering the results you need, you may want to consider phenotypic HCS. The technologies available today to capture stunning images and multiparametric data from each cell give you far more options to pursue – and as Dr. Gribbon says, boundless opportunities. Today, we are far better equipped to gain significant insights from all the data produced by HCS – maximizing the value of that data.

The right tools make all the difference. As leaders in the field of HCS, PerkinElmer provides the most comprehensive selection of integrated solutions for successful phenotypic-based drug discovery.

Are you using target-based or phenotypic imaging-based drug discovery strategies? If so, what tools are you using to maximize the impact of your drug discovery data?

Clinical Trials Management: Spot Problems Earlier, Easier with Visual Analysis


The number of clinical trials has grown more than 35-fold since the National Institutes of Health first made its ClinicalTrials.gov registry public in 2000. And the data incorporated in the 195,770 trials registered so far in 2015 has expanded to include not only traditional clinical measures, but also translational and outcomes data.

With that much data, and the proliferation of technologies to support its creation, how can Clinical Project Managers (CPMs) and Clinical Research Associates (CRAs) improve visibility into their trials? How can they spot problems and address issues earlier in the clinical process? And lastly – how can they do these things without attending coding bootcamp or becoming a computer scientist?

Go for Visual Analytics

What’s needed for effective clinical trial management are unifying solutions that enable analytically driven decision-making. Faster, dynamic solutions enable the user to quickly analyze disparate data from multiple sources to create a complete picture of what’s occurring in clinical development – as it happens. 

It’s no longer a clinical best practice to wait six months for data to be locked, cleaned, and analyzed. Traditional reporting methods are too slow and cumbersome. Trial data is needed from ‘First Patient In’ – not at discrete time points, but in an ongoing manner. Visual data analytics based on electronic data capture make this possible.

Incorporating visual analysis into clinical operations provides timely information and actionable insight to keep trials on track. Program, country, and study managers and monitors can make decisions based on live, interactive scorecards that track everything from planned vs. actual budgets to study milestones, including Institutional Review Board approvals and patient visits.

Choosing the right visual analytics platform can help companies achieve clinical operational excellence. But analytics platforms are only as effective as the underlying expertise available to platform users. At PerkinElmer, for example, our clinical analytics platform - powered by TIBCO Spotfire® - is backed by our years of experience:

• Building advanced analytics solutions to cover drug development workflow needs

• Breaking rigid data silos to power real-time and predictive analytics

• Offering value-added analytics consulting services to adapt the solutions to specific client needs

Empowering Risk-Based Monitoring

FDA, EMA, and PMDA all recommend risk-based monitoring (RBM) of clinical investigations to enhance patient safety, improve data quality, and drive efficiencies. Visual analytics provides valuable insight for RBM, accelerating data aggregation through continuous collection and automated consolidation.

To confidently identify and assess issues early enough to improve study safety and efficiency, technology platforms must enable continuous monitoring with near real-time intuitive visualizations, analytic dashboards, and applications.

Visual Leads to Virtual

Virtual biotechs – small companies with a few executives overseeing the outsourcing of biopharmaceutical R&D – have emerged in the last ten years in response to tightening capital markets. They rely on the leanest development teams and outsourcing to achieve clinical proof of concept for a drug candidate.

This new drug development model coincides with the emergence of faster, flexible visual analytics and business intelligence tools which assist small, nimble companies in their drug development efforts. Virtual biotechs use flexible platforms with real-time access to aggregated data and programs for trial management, monitoring, data analyses, and business operations. The right platform can take a virtual firm and its partners from initial study startup with applications for trial timelines and progression to Phase III project management. 

Spotting Outliers, Trends, and Problems

At its heart, effective clinical trial management means finding the things that are going wrong, or have the potential to go wrong, early. The right platforms for visual analytics – for both business and science intelligence – help the user more easily and rapidly spot the outliers, find the trends, and unearth the problems that are buried in the sea of clinical operations data. 
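As a simple illustration of that outlier-spotting idea, the sketch below flags trial sites whose data-query rate sits far from the norm, so monitors can investigate those first. The site names and figures are invented; a visual analytics platform would render this as an interactive view rather than a script.

```python
# A minimal sketch of outlier spotting in clinical operations data:
# flag sites whose data-query rate deviates sharply from the norm.
# Site names and figures are synthetic illustrations.
import pandas as pd

sites = pd.DataFrame({
    "site": ["US-01", "US-02", "DE-01", "JP-01", "JP-02", "UK-01"],
    "queries_per_100_pages": [3.1, 2.8, 3.4, 9.7, 2.9, 3.3],
})

mean = sites["queries_per_100_pages"].mean()
std = sites["queries_per_100_pages"].std()
sites["outlier"] = (sites["queries_per_100_pages"] - mean).abs() > 2 * std

print(sites[sites["outlier"]])  # JP-01 stands out for early follow-up
```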

Making the change from traditional reporting and query tools to a visual analytics platform means less time preparing data, and far more time acting on the insights from it. 

Are you using the right visual analytics platform for clinical trial management and data analysis?