Getting Data in Order for a Data-driven Future


As data becomes more important than documents when it comes to regulatory submissions and operational decision-making, there is a renewed focus on data quality. Amplexor’s Agnes Cwienczek sets out the steps life sciences organisations must take to get their data in order.

In future, a whole range of interconnected business functions, and indeed external entities along the supply chain and regulatory process, should be able to rely on a comprehensive, approved master data source to support critical actions. Those could be submission- and compliance-related, or support internal analytics and management reporting.

Without complete confidence in the quality, integrity, currency and completeness of data, however, the business faces two main risks. One is that it will fall foul of the authorities by providing inaccurate or out-of-date information, and as a consequence incur delays to market. The other is that everything done from this point will be founded on false assumptions, so that problems are perpetuated and risks are magnified as documents are built from data, data is extracted from documents, and changes and updates are introduced to some or all of that content.

On top of the inevitable delays and additional expense incurred by reactive damage limitation, companies entering into a remedial data preparation exercise are likely to be investing new resources in a finite project with a single target use case. This is directly at odds with the higher, more strategic aim of creating a data-driven business.

Start with regulatory information

Regulatory information is a great starting point for proactive data transformation, because it serves so many other functions and use cases beyond its own. Transforming regulatory data quality warrants senior buy-in, since all commercial and reputational outcomes stem from it. From Manufacturing to Quality to Pharmacovigilance, accurate and consistent regulatory information has an integral role to play in maintaining standards, identifying anomalies, managing changes, keeping records, upholding patient safety and demonstrating continuous compliance. However, there are some critical considerations around regulatory data quality:

First, any process and any supporting IT system is only as good as the data it has access to and what it is able to reliably do with it. So however prestigious the (change) management consulting firm and however modern the chosen new IT system, unless there is defined, meaningful, dependable and up-to-date content, any investment in these powerful enablers will be compromised.

Improving the quality and future usability of data is something that needs to start now, outside of any specific departmental system project. It will take a long time, and it needs to happen continuously. For this reason, it is important to assess and scope the work involved, prioritising target data and critical use cases, so that transformation can happen in a staged way towards a long-term goal.

Second, being realistic and setting expectations will be important in ensuring that the quality, richness and overall value of regulatory data grows and benefits the organisation as a whole. Not all life sciences companies have the resources to overhaul all of their data in one go, and data transformation, governance and maintenance should be a continuous discipline rather than a one-off effort. So at some point, responsibility for data monitoring and upkeep needs to become part of everyone's day-to-day remit and routine.

Choose high impact data

It will be important to set some parameters and decide which data takes priority. Logically, it may make sense to target improvements by geography (starting with strategically important target markets), and then by the data with the greatest impact (for instance, data that’s most likely to be consumed by others, outside of the Regulatory operation).

That data’s likely to be:

  • Details of existing product registrations and their status
  • Details of the markets products are registered in
  • Related submissions/the latest documents and information used in these registrations
  • Information about safety-related changes and associated details.

Work on operational data, such as who made what changes as well as any data linked to the company’s internal planning and tracking objectives, can come later. Details of interactions with the various external authorities might fall within this tranche of data improvement/enrichment.
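As an illustration only, the high-priority registration data listed above might be modelled as a simple record with a completeness check. The field names here are hypothetical examples, not a defined company or IDMP standard; real regulatory master data models are far richer than this sketch.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical registration record for illustration; not a standard schema.
@dataclass
class ProductRegistration:
    product_name: str
    market: str                            # e.g. an ISO country code
    registration_number: Optional[str] = None
    status: Optional[str] = None           # e.g. "approved", "pending"
    latest_submission_id: Optional[str] = None

# Fields that must be populated before the record can be relied on.
REQUIRED_FIELDS = ["registration_number", "status", "latest_submission_id"]

def missing_fields(record: ProductRegistration) -> list[str]:
    """Return the names of required fields that are empty."""
    return [f for f in REQUIRED_FIELDS if not getattr(record, f)]

rec = ProductRegistration(product_name="ProductX", market="DE",
                          registration_number="DE-12345", status="approved")
print(missing_fields(rec))  # the record lacks its latest submission link
```

A check like this makes gaps visible record by record, which is what allows remediation to be prioritised by market and by impact rather than attempted wholesale.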

Give stakeholders a voice

Appointing cross-functional teams will help ensure that all users of regulatory data within the organisation have a voice and can play a part in determining information’s relative value and various use cases. This will help pinpoint which data is most critical and needs the most work, and how that data is currently collated, stored, maintained and used across processes and departments.

As with any initiative of strategic importance, transformation needs to be desired, championed and led from the top of the organisation. Better data and clearer process visibility needs to become part of the DNA of the business, which demands a shift in mind-set. Then a whole framework will need to be agreed, backed up by standard operating procedures (SOPs) and policies for everyone to follow.

In an ideal scenario, certainly if the company is big enough, dedicated data managers should be appointed or hired to oversee the data transformation journey and supervise execution of the new framework. There are two qualifications to this, however. First, everyone will need to take a degree of responsibility for adhering to future data management guidelines and maintaining ongoing data quality. Second, there is plenty of help available externally as already-overstretched teams strive to cope with all of this alongside their day jobs.

Get a grip on data quality

Unless companies start to allow for data analysis, preparation and quality maintenance as part of their project plans, they will continue to fall victim to scope creep and last-minute fire-fighting that triggers new ad-hoc remedial work. Until they get a grip on data quality, they will continue to be caught out by audits, missed registrations, or project disruption triggered by the discovery of poor-quality data. Technology plays a key role here, offering the flexibility and granularity to accommodate current and future data requirements and to drive good data practice.
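As a loose sketch of that continuous discipline (the rule names and checks here are invented for illustration, not drawn from any product or standard), ongoing data-quality monitoring can be expressed as a configurable set of rules run routinely against the master data, rather than as one-off clean-up exercises:

```python
import re

# Each rule is a (name, predicate) pair; a predicate returns True when a
# record passes. These rules are hypothetical examples only.
RULES = [
    ("has_status", lambda r: bool(r.get("status"))),
    ("valid_market_code", lambda r: bool(re.fullmatch(r"[A-Z]{2}", r.get("market", "")))),
    ("registration_number_present", lambda r: bool(r.get("registration_number"))),
]

def audit(records):
    """Run every rule over every record; return (record_id, failed_rule) pairs."""
    findings = []
    for rec in records:
        for name, check in RULES:
            if not check(rec):
                findings.append((rec.get("id"), name))
    return findings

records = [
    {"id": "R1", "market": "DE", "status": "approved", "registration_number": "DE-1"},
    {"id": "R2", "market": "de", "status": "", "registration_number": "DE-2"},
]
print(audit(records))  # only R2 produces findings
```

Because the rule set is just data, it can grow as requirements change, and the audit can be scheduled as a routine job, which is what turns quality maintenance into an everyday discipline rather than a fire-fight.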

About the author

Agnes Cwienczek is Head of Product Management and Consulting at Amplexor, with a remit including the provision of business process and data management expertise in the areas of Regulatory Information Management, Document Management, Submission Management and Labeling Management. Prior to joining Amplexor, Agnes worked at Merck in its Global Regulatory and Quality Assurance department, a milestone in a career spanning 15+ years at the frontline of regulatory information management.