Over time, Life Sciences regulatory processes will become increasingly data-focused, driven by cross-functional data connectivity requirements and the growing need for structured data submission to health authorities. This places new urgency on data quality governance – a term that carries connotations of complexity and cost. Here, Gens & Associates founder Steve Gens and Preeya Beczek, Director of Beczek.COM, dispel five common myths about what’s involved.
From the latest measures around safety and regulatory rigour to a renewed focus on agility, efficiency, and streamlined paths to market, everything in Life Sciences now seems to rely ever more heavily on companies’ effective organisation and handling of data. But what needs to happen so that users and process owners can fully trust the quality of that data?
The more critical data becomes to regulatory procedures, safety processes, clinical research, and manufacturing – and ultimately to connecting all of those parts of the value chain more seamlessly – the greater the need for formal, consistent data governance models and data management practices to support data across all internal and external touchpoints.
Gens & Associates’ most recent World-Class RIM survey (of 76 Life Sciences companies, in 2022) found that the top performers expected to have most of their systems connected and sharing data within the next 2-3 years – with electronic trial master file (eTMF) systems, quality management systems (QMSs), master data management (MDM), and enterprise resource planning (ERP) the highest priorities for investment. Yet without high trust in that data, underpinned by strong data governance, the risks become intolerable as companies’ dependency on the flow of good data broadens.
It can be tempting, given a lack of confidence in getting all of this right, to create a major initiative supported by a large consulting budget. Actually, it’s more important that work starts now, and challenging some common preconceptions can be very useful here.
Preconception 1: this will require a huge consultancy engagement
A first myth is that data quality governance will inevitably be an overwhelming programme. But all positive change has to start somewhere, and it’s important to decide whether a top-down or a function-by-function (with consistent practices) approach will produce the quickest wins and the greatest overall progress. What works for one company may not suit another, especially when considering the size of the product portfolio – and that’s okay.
Preconception 2: this will be complex & costly
The second myth is that complexity and high cost are unavoidable. The ‘data-driven’ agenda might feel fresh and new in Life Sciences, but digital process transformation is well advanced in other industries, and solid frameworks already exist that have been adapted for data quality governance in a Life Sciences ‘Regulatory+’ context. In other words, this needn’t mean a steep learning curve, or leave companies with huge holes in their transformation, organisational change, or IT budgets.
Much of what’s needed, in practice, involves nurturing the right culture, assembling the right teams or assigning key roles, communicating successes, and agreeing as a company on the goals of the whole undertaking.
Preconception 3: we must do this largely out of obligation
Another common myth is that companies should tackle data quality governance mainly because they have to. In reality, there are far bigger reasons to make this a priority – ranging from more tightly-run business operations to a safer, more convenient experience for patients as consumers of existing and new products.
The tighter the controls around data quality, the more companies can do with their trusted data – with use cases that could extend right out into the real world (such as faster access to real-time updates to patient advice).
Preconception 4: this is an IT/data management thing; it’s not our concern
Another myth is that data quality governance is first and foremost an IT or data management concern. Yet time and again, the key success factors for a data quality programme are found to have little to do with technology, and everything to do with culture, organisation, and mindset.
Specific contributors to progress, distilled from the most promising programmes being rolled out today, include a shared data quality vision, so that good data-related practice becomes second nature. Another is establishing ‘actionable governance’ – in the form of a data quality office and an assigned data quality officer, whose remit is to oversee efforts to clean up and maintain good data. Then there is ensuring that senior leaders advocate for a culture of data quality built into reward systems, and that executives drive a ‘right first time’ mindset around data as it is captured and first entered into a system.
Formal continuous improvement is important too – that is, continued rigour in raising the quality of data and making this consistent across the company over time. Underpinning all of this must be transparency of data quality performance, good communication about progress, and a plan for celebrating success as the quality and usability of data are seen to improve across the company.
Preconception 5: we’ve already nailed quality; we don’t need to formalise this
Companies might assume too that, because they are already fairly vigilant about data quality, they don’t need a formal governance programme. This is almost never the case.
Telltale signs that a company has challenges with its data quality (challenges which will deepen as data becomes increasingly fundamental to critical everyday processes) include data quality not being viewed as an organisational competency or linked to the organisational culture; the lack of a clear data quality vision, policy, or strategy; and data connectivity being prioritised ahead of the organisational support required to properly leverage the value of that interconnected data.
The critical elements of a good data quality governance programme
With a strong framework, any company can get started on the right track, however it decides to approach this (e.g. bottom up, function by function, or top down and enterprise-wide). In fact, programmes are most successful when they combine both. A practical framework comprises three phases.
The Establish/Launch Phase – an initial ground preparation phase – is about setting out a data quality vision and principles; establishing an ‘actionable’ data quality operating model with formal jobs (where needed) and defining/redefining roles and responsibilities; and conducting an awareness campaign.
The Operational Phase involves establishing optimal processes and capabilities – e.g. adjusting to learnings from the Establish phase; ensuring that all roles with a bearing on data quality have these responsibilities set out in job descriptions and covered as part of the annual review process; and establishing recognisable rewards for high-quality data.
Finally, in the Optimisation/Institutionalisation Phase, desirable behaviour is embedded and fostered within the organisational culture – ensuring that everyone gets, and stays, on board with maintaining and continuously improving data quality, to everyone’s benefit. Tools might include automated data quality dashboards to monitor KPIs (a simple illustration follows below); data integration and connectivity across functions and the wider organisation; and organisation-wide data quality reporting, supporting a culture of quality.
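To make the dashboard idea concrete, below is a minimal sketch, in Python, of how one such KPI – record completeness – might be calculated for organisation-wide reporting. The record fields (product_name, registration_id, market) are hypothetical illustrations, not a reference to any particular RIM data model.

    # Minimal sketch of one data quality KPI a dashboard might track.
    # Field names are hypothetical, not from any specific RIM system.

    REQUIRED_FIELDS = ["product_name", "registration_id", "market"]

    def completeness_kpi(records: list[dict]) -> float:
        """Percentage of records with every required field populated."""
        if not records:
            return 100.0
        complete = sum(
            1 for record in records
            if all(record.get(field) not in (None, "") for field in REQUIRED_FIELDS)
        )
        return 100.0 * complete / len(records)

    # Example: one of the three records is missing its registration ID.
    sample = [
        {"product_name": "Product A", "registration_id": "REG-001", "market": "EU"},
        {"product_name": "Product B", "registration_id": "", "market": "US"},
        {"product_name": "Product C", "registration_id": "REG-003", "market": "JP"},
    ]
    print(f"Completeness: {completeness_kpi(sample):.1f}%")  # Completeness: 66.7%

In practice, a dashboard would trend several such measures (completeness, validity, timeliness) by function over time, but the principle is the same: a transparent, repeatable measure of data health that everyone can see and act on.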
The key takeaway, though, is to make formalised data quality governance a priority, and to do so soon. A phased, bite-sized approach to systematic data quality governance paves the way for companies to move forward with their improvement efforts now. As progress becomes evident, momentum ought to gather organically.
About the authors
Steve Gens is the managing partner of Gens & Associates, a global life sciences advisory and benchmarking firm specialising in strategic planning, RIM programme development, industry benchmarking, and organisational performance.
Preeya Beczek, Director of Beczek.COM, is an independent regulatory affairs expert, providing teams with valuable insights, advice, and strategies for operational excellence by optimising processes, systems, roles, and operating models.