ISO 9001:2008 Certification for Informatics R&D and Global Support


Do Standards Have a Place in Software Development?

Standards would seem to be anathema to software developers, who might protest that their use would stifle the creativity and flexibility required for agile or iterative development.

By definition, a standard is a model or example established by authority, custom, or general consent; something set by authority as a rule for the measure of, among other things, quality or value.

Is it possible to create standard requirements to improve the quality or value of software, without affecting the very creativity needed to achieve what the software or application is being designed to do? The simple answer is yes. ISO 9001:2008 certification can improve quality management systems for software. The International Organization for Standardization (ISO) establishes requirements for quality management systems in organizations that seek: 

• to demonstrate their “ability to consistently provide product that meets customer and applicable statutory and regulatory requirements” and

• to “enhance customer satisfaction through the effective application of the (quality management) system.” This includes establishing processes for continual improvement.

Because ISO 9001:2008 requirements are “intended to be applicable to all organizations, regardless of type, size, and product provided,” they apply well to quality management in software development and service.

This year, PerkinElmer received ISO 9001:2008 certification for its Informatics R&D and Global Support functions - both of which are important to customer satisfaction. Guido Lerch, the company’s executive director of quality control and assurance for informatics, and Lillian Metcalfe, quality system manager, say they saw in ISO 9001:2008 certification an opportunity to invest proactively in implementing standards and thereby improve the overall quality of the software delivered.

Measure Inputs and Outputs

Certification provides a framework under which individual companies create the processes and procedures that lead to quality products and services. ISO does not mandate what the certified organization does; rather, the organization seeks certification of the standards it devises and applies to its own processes.

For software development, PerkinElmer has not restricted what its development teams can do. Instead, the company structured the “inputs and outputs” – the precursors for starting development and the processes for evaluating what it releases.

“In the middle, we want our developers to go off and experiment and try lots of things,” Lerch says. 

Adherence to ISO 9001:2008 requirements ensures that R&D and global support processes are clear, documented, and monitored, and that people are trained. It creates checks and balances to monitor the effectiveness of the quality management system, which leads to improved product quality and more satisfied customers.

It can also reduce the duration or scope of customer audits, as customers gain confidence from the knowledge that standards and processes are in place to develop and build products in a consistent manner. “There are hundreds of questions they won’t need to ask us,” according to Lerch, since PerkinElmer can explain its processes.

ISO 9001:2008 gives customers confidence they know exactly what they are deploying because the software has gone through a thorough testing regimen that follows certified procedures. In addition to knowing there are quality standards in place, there are also two people – Lerch and Metcalfe – whose roles are dedicated to the quality management of all PerkinElmer software released.

Two Important Points

While PerkinElmer Informatics has received ISO 9001:2008 certification for its R&D and Global Support functions, the company has been following such guidelines in principle for many years. Certification formalizes the company’s efforts.

ISO 9001:2008 has now been updated to ISO 9001:2015. PerkinElmer has three years to recertify under the new standard, and is committed not only to achieving certification but to maintaining it.

How confident are you in the quality management of the software and applications you’re deploying? Does ISO 9001:2008 certification increase that confidence?

The Convergence of Search and Analytics: A 360-Degree Data View


The convergence of search and analytics provides greater visibility.

Business intelligence (BI) tools were born to optimize data-driven decision making, but BI tools need structured data on which to operate. All too often, relevant data isn’t nicely formatted, stored and curated but instead resides hidden in some less-structured format - such as text fields, technical reports or other longer descriptive prose. Extracting this hidden data has historically been too difficult, meaning that today’s data-driven decisions are actually based only on the most accessible, and therefore incomplete, information.

Enterprise search tools were introduced to help business and data analysts find more relevant content across disparate repositories. They provide greater classification and retrievability from document stores, file shares and other content silos within an organization. But the end user still has to open the content and read it to extract value, gain understanding, and see the relevant data. Any subsequent analysis requires the end user to transcribe information into a form suitable for incorporation into a BI tool - often correlating data across the multiple relevant pieces of content uncovered by enterprise search.

The Birth of Search-Based Analytics

Recently Gartner highlighted a trend that has been apparent to many industry observers: these two solutions are converging into a space it describes as ‘search-based analytics.’ Search-based analytics deliver a 360-degree view of all available and relevant information for both content access and analysis. As part of this process, Content Analytics can be applied to tease out the underlying data structures from otherwise unstructured content, easing the path from content access to data analytics. Not only does this unify data, but the layering of data visualization and statistical analysis over search helps identify and explain connections and provide insight.

Think of it this way: a Google search provides a long list of varied information assets related to a search. This data is presented as hyperlinks, providing the end user with a prioritized list of results they can read and explore further. However, Google still relies on the user to read and determine value from the sea of offerings in order to extract the underlying meaning. 

In contrast, Yahoo! Finance presents data plots, charts, and tables so that the end user can query and filter these structured values - enabling analysis of trends and patterns. The data is supplemented with links to analyst reports, press releases and pertinent content. This is a basic example of search-based analytics, where content links are provided to enhance the previously structured data.

Content Analytics ingests the content sources for you and extracts their deeper meaning. Content Analytics creates structured data from unstructured sources and supplements it with the previously structured data as well as the ability to quickly access the relevant content for further context.
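
To make that step concrete, here is a minimal, hypothetical Python sketch of turning one unstructured note into a structured record and merging it with previously structured data. The note text, field names, and warehouse record are all invented for illustration, and simple regular expressions stand in for the far richer natural language processing a real content-analytics engine would apply:

    import re

    # Hypothetical free-text note of the kind Content Analytics might ingest.
    NOTE = ("Compound PE-1042 was assayed on 2016-03-14; observed IC50 was "
            "12.5 nM against target JAK2.")

    def extract_fields(text):
        """Best-effort extraction of structured fields from unstructured text."""
        patterns = {
            "compound": r"Compound\s+(\S+)",
            "date":     r"(\d{4}-\d{2}-\d{2})",
            "ic50_nM":  r"IC50 was ([\d.]+)\s*nM",
            "target":   r"target\s+(\w+)",
        }
        record = {}
        for field, pattern in patterns.items():
            match = re.search(pattern, text)
            if match:
                record[field] = match.group(1)
        return record

    # Previously structured data, e.g. from an assay data warehouse (invented).
    warehouse = {"PE-1042": {"project": "Oncology-7", "batch": "B-88"}}

    record = extract_fields(NOTE)
    record.update(warehouse.get(record.get("compound"), {}))
    print(record)
    # -> {'compound': 'PE-1042', 'date': '2016-03-14', 'ic50_nM': '12.5',
    #     'target': 'JAK2', 'project': 'Oncology-7', 'batch': 'B-88'}

The point of the sketch is the shape of the workflow: once the hidden fields are lifted into a record, they can be joined with existing structured data and handed to any BI tool, while the original note remains one click away for context.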

To be truly successful, such a convergence must leverage natural language capabilities - both to interpret the underlying content and to make it easier to find the information users seek. People interpret concepts and findings in many ways, and search-based analytics solutions must accommodate the variance of language integral to human nature. After all, data is only useful if you can find it and draw insights from it.
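
As a toy illustration of that point, the hypothetical Python sketch below normalizes variant phrasings onto canonical terms before matching. The synonym table and documents are invented, and a production system would draw on real NLP and curated ontologies rather than a hand-built map:

    import re

    # Invented synonym map: many ways people express the same concept.
    SYNONYMS = {
        "heart attack":        "myocardial infarction",
        "mi":                  "myocardial infarction",
        "high blood pressure": "hypertension",
        "htn":                 "hypertension",
    }

    # Invented mini-corpus standing in for an indexed content store.
    DOCUMENTS = {
        1: "Patient presented with myocardial infarction and hypertension.",
        2: "No history of hypertension; lipid panel within normal limits.",
    }

    def normalize(query):
        """Rewrite informal or abbreviated phrasing as canonical terms."""
        query = query.lower()
        for variant, canonical in SYNONYMS.items():
            query = re.sub(r"\b" + re.escape(variant) + r"\b", canonical, query)
        return query

    def search(query):
        """Return IDs of documents containing the normalized query phrase."""
        phrase = normalize(query)
        return [doc_id for doc_id, text in DOCUMENTS.items()
                if phrase in text.lower()]

    print(search("Heart attack"))        # [1]
    print(search("high blood pressure")) # [1, 2]

In practice such mapping would run at both index time and query time, so an abbreviation in a clinician’s note and a lay phrase in a user’s query meet at the same canonical term.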

This enables workers at all levels – not just IT or super users – to gain a complete picture based on a search of the complete data set. You don’t have to know how to ‘speak database’ to get at the data you need.

Years ago, Gartner dubbed the notion of such comprehensive, user-friendly search ‘Biggle’ - a combination of BI and Google. But this vision is only now starting to be realized. Evidence suggests that businesses have begun to explore and invest in this convergence of search and analytics.

Challenges with Current Tools

Organizations continue to struggle with integrating disparate data – structured, unstructured, internal, and external. Search tools can help find documents, but they still rely on the user to generate insights from the data within. And valuable data is still locked away in unstructured or poorly structured forms.

Analytical tools still rely on data that is structured, rendering them useless for an organization’s unstructured data. Unless something changes, Gartner estimates that up to 90% of information assets in a typical enterprise will be inaccessible by 2017.

Holistic Data Initiatives

There are opportunities to unify the search and analytics experience to allow immediate access to both relevant content and analysis of structured and unstructured, public and private data.

For instance:

• Major hospitals are analyzing their electronic medical records, including unstructured text from doctors’ notes, to better understand hospital operations and clinical results.

• Pharmaceutical companies are combining sample, compound, or material data from technical documents and reports with assay and test data from data warehouses, experimental information from ELNs and safety assessments from clinical study reports to provide a 360° view of the sample, material or compound.

• Clinical development organizations are reducing the risk of clinical trial failures through improved study design and more targeted study populations by extracting and analyzing a correlated view of all patient information – from biomarkers to phenotypes to electronic health records – to drive patient stratification.

• Outcomes researchers are analyzing medical outcomes through access to public sources, health records and internal databases to provide a consolidated view of the effect of therapeutics in the broader population.

• Industry leaders are combining public and proprietary knowledge to see a consolidated and unified view of the current competitive landscape, thereby identifying business risks and opportunities through the aggregation and correlation of disparate data sources into a single holistic view. For instance, pharmaceutical companies are assessing drug patent lifespans and their impact on market dynamics. 

PerkinElmer Signals™ Perspectives, powered by Attivio®, delivers that 360-degree data view, addressing the challenge that all research organizations face. It gathers the information stored in many different places and formats, ranging from databases to reports to social media, and provides the analysis that enables decision making based on complete, relevant information.

Do you have a complete picture of your data? To learn how to find and correlate disparate data using natural language queries – without having to understand the underlying data structures – view our webinar here featuring PerkinElmer Signals™ Perspectives: Universal Adaptor for TIBCO Spotfire.

Clinical Informatics: So Much Data, So Little Time


Companies in the pharmaceutical and healthcare industries, overwhelmed by data, are turning to clinical informatics to solve their data challenges.

And there are plenty of challenges:

•  The industry is facing an ever-increasing number of clinical trials worldwide.

•  Every day, clinical and drug development professionals are generating more complex and disparate kinds of data.

•  Most companies are using a wide variety of methods to collect, sort and analyze the data.

•  Many of these data collection and sorting methods are unconnected and isolated from each other, leading to a need for manual data aggregation.

This new video on PerkinElmer Informatics for Clinical Development explains the importance of rapid, straightforward analysis to allow fast-paced decisions that are critical to the success of a clinical trial.

To be valuable, clinical informatics must deliver data structured in a way that allows it to be effectively retrieved and used.

Whether improving patient enrollment and retention, reviewing clinical data and operations, or handling critical risks in monitoring, the objective of clinical informatics is to provide data integration across the development lifecycle and deliver a complete picture of your pipeline - in real time.
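
As a simple, hypothetical sketch of what that integration means in practice - with all identifiers and values invented, and Python standing in for whatever platform actually does the work - consider joining enrollment data from a trial-management system with results from a lab system on a shared patient ID, instead of re-keying both by hand:

    # Hypothetical records from two disconnected systems: a trial-management
    # system and a central lab. All identifiers and values are invented.
    enrollment = [
        {"patient_id": "P-001", "site": "Boston",  "enrolled": "2016-01-12"},
        {"patient_id": "P-002", "site": "Chicago", "enrolled": "2016-02-03"},
    ]
    lab_results = [
        {"patient_id": "P-001", "test": "ALT", "value": 28, "units": "U/L"},
        {"patient_id": "P-002", "test": "ALT", "value": 61, "units": "U/L"},
    ]

    def integrate(enrollment, lab_results):
        """Join the two feeds on patient_id into one unified view."""
        by_patient = {row["patient_id"]: dict(row) for row in enrollment}
        for result in lab_results:
            record = by_patient.setdefault(result["patient_id"], {})
            record.setdefault("labs", []).append(
                {k: v for k, v in result.items() if k != "patient_id"})
        return by_patient

    for patient_id, record in integrate(enrollment, lab_results).items():
        print(patient_id, record)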

That complete picture empowers you to make meaningful, data-driven decisions, increases your company’s clinical development speed and agility while dramatically reducing costs, and helps you meet the challenges of the future of medicine head-on.

Learn more about PerkinElmer Informatics for Clinical Development here.