Across a wide range of disciplines, science faces a problem: our ability to detect, measure and quantify the world around us is expanding exponentially, in both scale and complexity. This brings exciting new opportunities, for example to develop safe new drugs faster and more cheaply, and to create “precision medicine” that tailors healthcare to the needs of specific patient groups. However, the data management tools available to those at the coalface of laboratory research are either entirely absent or designed around 20th-, or even 19th-, century paradigms and technologies. We have spent more than ten years in a world-leading biomedical research laboratory working to understand and solve these challenges. Starting from scratch, we completely redesigned the tools for data management to meet the needs of 21st-century laboratories. Our focus has been on combining the most powerful computational solutions now available with a deep understanding of user needs and the practical challenges of modern laboratory environments. We have designed software tools explicitly to “make big data easy” for researchers, empowering the people best positioned to understand their own data, lowering the barriers to entry and removing pain points.
This sponsored talk will tell the story of our journey to bring the practical advantages of a fully integrated, easy-to-use big data management and analysis ecosystem into 21st-century laboratories, and will highlight some of the many challenges we face: some we have solved, and some we have yet to conquer.