In this blog series, we’ve explored how data centres are transforming the way we live and work, enabling the IoT Age and driving the application economy. We’ve seen how the hybrid cloud and software-defined infrastructure (SDI) can help businesses define and create the data centre they need to handle the unprecedented volumes of data generated by our digital lifestyles. So today, as we come to the end of our jaunt through all things SDI, let’s take a look at the last step in the data centre journey – putting it to work to process all this new data and extract meaningful insights.
When we’re talking big data and insights, it’s a short hop to analytics. For many, it’s the Holy Grail of IT, but actually achieving it is still a challenge.
Intel is doing a lot of work in this space. We’re collaborating across a broad ecosystem of players to enable highly performant, secure, open solutions with lowered barriers to adoption.
In other words, our focus is on pulling together an overall platform strategy to help organisations get ahead in the race to realise value.
Our approach can be broken into four stages as follows:
- Infrastructure: Creating a platform of breakthrough technologies – multicore CPUs, solid-state drives, network fabric, security and so on – that enable vast amounts of information to be processed (often in real time) at an economical cost. Until recently, this was viable only for a minority of organisations.
- Big Data Platform: Working closely with the data platform ecosystem to optimise use of Intel technology features and enable easier deployment. As with data centre orchestration and networking, Intel is making significant contributions to open-source projects for big data platforms such as Hadoop and Spark.
- Tools and Utilities: Simplifying the tools (e.g. for data visualisation) and utilities used to broaden analysis and accelerate application development, making it easier and quicker for business users to derive insights from their data.
- Reference Use Cases and Architectures: Capturing and documenting applications which support vertical use cases and drive business value, as references to help organisations jump-start their analytics plans. An example of such a use case is Intel’s work with Cloudera to develop a solution that ingests socioeconomic data and electronic medical records to identify high-risk patients and reduce hospital re-admission rates; a minimal sketch of what such a pipeline might look like follows this list.
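To make that last use case concrete, here’s a minimal sketch, in PySpark, of a readmission-risk pipeline along those lines. It’s an illustration only, not Intel’s or Cloudera’s actual solution: the input paths, column names and choice of model are all hypothetical.

```python
# A hypothetical readmission-risk pipeline: join EMR and socioeconomic data,
# train a classifier on historical outcomes, and score patients for risk.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("readmission-risk").getOrCreate()

# Ingest electronic medical records and socioeconomic data (assumed paths),
# joined on an assumed shared patient identifier.
emr = spark.read.parquet("hdfs:///data/emr")
socio = spark.read.parquet("hdfs:///data/socioeconomic")
patients = emr.join(socio, on="patient_id")

# Assemble numeric features into the single vector column MLlib expects.
features = ["age", "prior_admissions", "income_band", "chronic_conditions"]
assembler = VectorAssembler(inputCols=features, outputCol="features")
dataset = assembler.transform(patients)

# Train a simple classifier on historical 30-day readmission outcomes, then
# score held-out patients so high-risk cases can be flagged for follow-up.
train, test = dataset.randomSplit([0.8, 0.2], seed=42)
model = LogisticRegression(labelCol="readmitted_30d").fit(train)
model.transform(test).select("patient_id", "probability", "prediction").show()
```

A production system would of course use a far richer feature set, model and validation regime; the point is how little scaffolding a platform like Spark demands before analysis can begin.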
Big Data in the Real World
With the Intel approach laid out, I want to turn to some of the more transformational real-world uses of big data analytics that I’ve come across. For me, these examples highlight how extracting insights from data leads to real-world change for the better, and they bring into focus the huge impact the humble data centre itself enables.
First up, the National Centre for Genomic Analysis (CNAG) in Spain sequences the equivalent of eight full human genomes per day to offer healthcare organisations deeper, more comprehensive insights that will help improve treatments. The analyses it provides to physicians have the potential to cut diagnosis times for rare diseases, and to identify the optimum combination of treatments for each individual patient for more common illnesses like cancer. This could be the beginning of a revolution in the way the medical industry operates – a move to personalised medicine and telemedicine. Rather than relying on medications and operations to make people better, doctors could use a patient’s own genomic data to identify the best way to treat them, reducing risk and uncertainty.
The great work being done by the Michael J. Fox Foundation for Parkinson’s Research is another example of big data analytics driving medical breakthroughs. Here, the organisation’s big data analytics platform detects patterns in participant data collected from wearable technologies that monitor symptoms, enabling researchers and physicians to measure the disease’s progression and accelerate drug development.
Meanwhile, we’re working with Ben Gurion University in Israel, which is using Hadoop clusters to power its Big Data Analytics Lab. Students use the platform to mine huge data sets and apply machine-learning algorithms, creating analytics solutions for all sorts of industries – from helping ISPs optimise page ranking to enabling holidaymakers to find the best hotel quickly and accurately through review sites.
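To give a flavour of the kind of exercise those students might tackle, here’s a toy Spark job for the hotel-review case; the dataset path and schema are assumed for illustration.

```python
# A toy review-mining job: rank hotels by average review score.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hotel-reviews").getOrCreate()

reviews = spark.read.json("hdfs:///data/hotel_reviews")  # hypothetical dataset

# Average each hotel's rating, require a minimum number of reviews so a
# single glowing review can't dominate, and surface the best-rated first.
ranked = (reviews
          .groupBy("hotel_id", "hotel_name")
          .agg(F.avg("rating").alias("avg_rating"),
               F.count("*").alias("n_reviews"))
          .filter(F.col("n_reviews") >= 50)
          .orderBy(F.col("avg_rating").desc()))

ranked.show(10)
```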
The Momentum of Moore’s Law
It’s a good time to be considering all these advances. 2015 marks fifty years since Intel co-founder Gordon Moore first observed that the number of transistors on a chip doubles at a steady cadence – roughly every two years – while the cost per transistor falls. This concept – Moore’s Law – has proved accurate and is now the driving force behind our industry. We’ve stuck to this cadence to date and, as a result, have seen the huge technological advances that have made the impossible possible and created the world we live in today. Had the cadence been three years instead, analyst house IHS estimates that technology would still be at 1998 levels: fifty years of doubling every three years amounts to only as much progress as thirty-three years at the two-year pace. Smartphones would still be nine years away, with the commercial Internet and social media in their infancy.
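That 1998 figure is easy to sanity-check with a simple doubling model – a back-of-the-envelope sketch, not IHS’s methodology:

```python
# Back-of-the-envelope check of the IHS comparison: how far would a
# three-year doubling cadence lag a two-year one after fifty years?
YEARS = 50  # 1965 (Moore's observation) to 2015

doublings_2yr = YEARS / 2   # 25 doublings on the historical cadence
doublings_3yr = YEARS / 3   # ~16.7 doublings on the slower cadence

# The year by which the two-year cadence had already achieved that progress:
equivalent_year = 1965 + doublings_3yr * 2
print(f"Two-year cadence: {2 ** doublings_2yr:.2e}x improvement")
print(f"Three-year cadence only reaches the level of {equivalent_year:.0f}")
# -> ~1998, matching the IHS estimate
```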
Looking at it like this underlines for me the vast potential that technology opens up for us. With Moore’s Law set to continue into the future, all the things we’ve talked about in this series that are just beginning now will soon be reality – they’ll be in your data centre, helping you achieve big business benefits while driving down time, cost and labour. What will you do to take advantage of them?
To explore the issues raised in this blog series, and other IT hot topics, visit Intel IT Center or follow us on Twitter.