Today, the pharmaceutical industry, like many others, has its feet in both camps when it comes to Big Data. Some parts of the industry, such as genomics and drug discovery, were early adopters and today couldn't imagine life without Big Data technologies and approaches. Others are pushing their current approaches close to their limits, and are beginning to ask "what's next?"
Currently, the Big Data space is somewhat in flux. Large, established vendors now have their own Big Data offerings, and while these are attractive in some ways (technologies you know, a vendor you have experience with, processes similar to those you've used before), they are no universal remedy; the cost alone for many of them can be truly eye-watering. On the other hand, many of the original open source Big Data offerings now have well-established ecosystems around them, with plenty of (largely VC-backed) organisations offering support, training, certification and validation.
Technology does not stand still, and new projects, products and approaches are launched all the time. Some of these new offerings build upon older ones, delivering more complete solutions by filling in the technology stack, while others take the latest research from academia to revisit the core of current Big Data offerings and replace it with faster, more flexible and more scalable solutions.
For the industry's IT leaders, this can present a dilemma. Your existing vendors are constantly pitching Big Data solutions to you, normally requiring another Big Data system just to calculate the price! Newer Big Data companies are now pitching supported and validated offerings for more reasonable fees, albeit with steeper learning curves, while your technologists are off exploring the latest solutions, which do more but often lack the support and validation that our industry requires.
At Quanticate, we are taking a twin-track approach. On the one hand, through a mixture of training, innovation teams and best-practice sharing, we are happily processing ever-larger data sets with our existing technology stacks. While this cannot continue forever, it allows us to deliver using approaches well known to regulators and auditors alike, even as our clients send us ever-growing data sets to work on. On the other hand, we are actively investigating Big Data solutions for clinical trials, trying them out on training data sets and exploring the pros and cons of the various Big Data offerings out there.
As part of this, we are currently working with Medidata on ways to incorporate Wearables and mHealth into the design of future trials. These designs, if adopted, will make use of our Big Data infrastructure to analyse the large amounts of data generated from these devices. To learn more about our current thinking (and that of the regulators) on mHealth, Wearables and apps, please see our recent blog post: 'mHealth Apps and Wearables in Clinical Trials to Consider'. Beyond that, we continue to attend and speak at a wide range of Big Data events. Our CTO, Nick Burch, has recently spoken at events such as Berlin Buzzwords, Apache Big Data Europe, Apache Big Data North America, and the 4th Annual Clinical Data Integration and Management conference. You can watch his talk from the latter event, or read a transcript of his session. Otherwise, read more on Quanticate's technology and tech strategy.
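To give a flavour of what analysing wearable data at scale can involve, here is a minimal, hypothetical sketch using PySpark. It is not our actual pipeline; the storage paths, column names (subject_id, reading_ts, heart_rate) and the daily heart-rate summary are purely illustrative assumptions.

```python
# Illustrative only: summarising hypothetical wearable heart-rate readings
# per trial subject per day with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("wearable-hr-summary").getOrCreate()

# Assumed input layout: one row per sensor reading with columns
# subject_id, device_id, reading_ts (timestamp) and heart_rate (bpm).
readings = spark.read.parquet("s3://example-bucket/wearables/heart_rate/")

daily_summary = (
    readings
    .withColumn("reading_date", F.to_date("reading_ts"))
    .groupBy("subject_id", "reading_date")
    .agg(
        F.count("*").alias("n_readings"),
        F.avg("heart_rate").alias("mean_hr"),
        F.min("heart_rate").alias("min_hr"),
        F.max("heart_rate").alias("max_hr"),
    )
)

# Persist the per-subject daily summary for downstream statistical review.
daily_summary.write.mode("overwrite").parquet(
    "s3://example-bucket/wearables/hr_daily_summary/"
)
```

The point of a sketch like this is simply that device-level readings arrive in far greater volumes than traditional case report form data, so even a basic per-subject summary benefits from a distributed engine rather than a single-machine workflow.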
Author's note: This blog was originally published on 23/05/2014.