The options may be more sophisticated, and the technology more robust, but does hosting systems externally aid the data-driven agenda?
Cloud computing and companies’ attitudes toward it have come a long way over the last decade, as maturing and ever more sophisticated services seemingly address the concerns of old – security, continuous access to systems and so on.
Certainly in Life Sciences – so often behind the curve in technology adoption – there has been an awakening to the potential represented by the cloud for accessing new functionality more rapidly and in a cost-effective way (where legacy investments may be holding back progressive ambitions). The cloud is also a good place to do special processing, to support finite projects or peak workloads, because of the promise of resource elasticity and usage-based pricing, where companies only pay for the capacity they need at the time. So to run some big numbers, perform special analyses, or test out a new application, the cloud offers a good fit. And as a facilitator for innovation, more advanced analytics, and big-data activities such as social listening/adverse event tracking, it holds a lot of promise.
But there are limitations, even now, and it’s important that companies are aware of them. If the organisation is considering running core information systems in the cloud, for instance, it had better be sure that the data is tamper-proof and that it won’t ever be transferred out of the region in which it is governed. If regulatory criteria require that sensitive data cannot leave the EU, companies must be absolutely sure that it is never replicated to a data center outside that region. The complicating factor here is that different regions, even individual countries, each have their own information compliance demands.
The Cloud Won’t Cut Complexity
On a more fundamental and practical level, companies will need to be sure that they are not introducing new complexity by putting some of their computing infrastructure, applications and data into the cloud. In the context of regulatory information management, for instance, the Life Sciences industry now largely understands that the route to higher-quality data, smoother and faster processes, and greater cost efficiency is to standardize, simplify and amalgamate systems, so that there is data continuity and consistency from one side of the company’s operations to the other. For the cloud to play a part in this, it needs to facilitate this set-up – supporting a single instance of definitive master data which is correct, approved and up to date at source, and from which everything else springs. This is how companies will drive up quality, reduce risk, shorten timescales and create scope for at least some degree of process automation for routine, repetitive tasks.
On the face of it, the cloud ought to be ideal in paving the way for greater data fluidity, because it starts from a premise of standard functionality and presentation which suits the majority rather than the peculiar preferences of a small minority. The more everything conforms to a standard approach, the greater the chance of simplification and holistic treatment of data, and of data sharing. Provided, of course, that the systems – wherever they are hosted – are designed integrally to carry data right through from one end to the other, with complete visibility and traceability: a scenario that very few software vendors support today.
As a result, strategic decision-makers need to be thorough in their questioning of system vendors or potential cloud partners who promise the world. These are not decisions to be taken lightly. Simply putting systems out to the cloud will not drive transformational change, and it could make existing information and content management complexity worse if a wrong turn is taken.
While standardization and simplification are important to the flow of correct, definitive data beyond functional and departmental boundaries – so that it can be re-used in any number of ways without being recompiled or re-entered time and again – it is equally critical that the user experience forms part of the decision process. Bespoke and customized systems tend to have high user adoption rates because they have been adapted so well to users’ preferences. While this can produce its own problems – high lifetime costs, poor integration with other applications – users’ response and willingness to use systems remains an essential consideration as companies review the way they build and manage regulatory information.
Responsiveness, agility and innovation are all high on the Life Sciences strategic agenda, and the imperative for change is clear as the industry takes a fresh view of the role it will play in public health in the future. Smart use of data will play a fundamental part in enabling much of this – provided companies don’t trap themselves in new silos. And the cloud, for all of its wonderful potential, could become exactly that – another silo – if companies don’t ask the right questions or lay the right groundwork.
Developing a data-driven culture and data-driven ways of working should be a much higher priority, a shift that should precede any decisions about hosting systems or running analyses or other processes in the cloud.
About the author
Romuald has devoted his 25-year career to date to various roles related to compliance, document management, and content management in the Life Sciences industry. He has held leadership roles both on the client side and in consulting, spanning delivery, sales, and project and line management. His experience bridges on-premise and cloud environments in Europe and the US. Romuald holds a Master’s Degree in Drug Regulatory Affairs from the University of Bonn, Germany, and a diploma in data technology from the Technical University Darmstadt, Germany.