
The new scale of compute demands a new scale of analytics


With the proliferation of popular software-as-a-service (SaaS) offerings, the scale of compute has changed dramatically. The boundaries of enterprise IT now extend far beyond the walls of the corporate data center.

 

You might even say those boundaries are disappearing altogether. Where we once had strictly on-premises IT, we now have a highly customized and complex IT ecosystem that blurs the lines between the data center and the outside world.

 

When your business units are taking advantage of cloud-based applications, you probably don’t know where your data is, what systems are running the workloads, and what sort of security is in place. You might not even have visibility into delivered application performance, or whether it meets your service-level requirements.

 

This lack of visibility, transparency, and control is both unsustainable and unacceptable. And this is where IT analytics enters the picture, on a massive scale.

 

To make a successful transition to the cloud while keeping pace with the evolving threat landscape, enterprise IT organizations need to leverage sophisticated data analytics platforms that can scale to hundreds of billions of events per day. That’s not a typo: we are talking about moving from analyzing tens of millions of IT events each day to analyzing hundreds of billions of events in the new enterprise IT ecosystem.
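
To put that number in perspective, here is a quick back-of-the-envelope calculation. The figures are illustrative assumptions (10 million and 100 billion events per day), not measured workloads:

```python
# Back-of-the-envelope: what "hundreds of billions of events per day" means per second.
# The event counts below are illustrative assumptions, not measured workloads.
EVENTS_PER_DAY_OLD = 10_000_000         # tens of millions of IT events per day
EVENTS_PER_DAY_NEW = 100_000_000_000    # hundreds of billions of events per day
SECONDS_PER_DAY = 24 * 60 * 60          # 86,400 seconds

print(f"Old scale: ~{EVENTS_PER_DAY_OLD / SECONDS_PER_DAY:,.0f} events/second")    # ~116
print(f"New scale: ~{EVENTS_PER_DAY_NEW / SECONDS_PER_DAY:,.0f} events/second")    # ~1,157,407
print(f"Growth:    ~{EVENTS_PER_DAY_NEW / EVENTS_PER_DAY_OLD:,.0f}x more events")  # ~10,000x
```

Sustaining on the order of a million events per second, around the clock, is what makes this a fundamentally different analytics problem.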

 

This isn’t just a vision; this is an inevitable change for the IT organization. To maintain control of data, to meet compliance and performance requirements, and to work proactively to defend the enterprise against security threats, we will need to gain actionable insight from an unfathomable amount of data. We’re talking about data stemming from event logs, network devices, servers, security and performance monitoring tools, and countless other sources.

 

Take the case of security. To defend the enterprise, IT organizations will need to collect and sift through two types of contextual information, both in enormous volumes:

 

  • “In the moment” information on devices, networks, operating systems, applications, and the locations where information is being accessed. The key here is to deliver near-real-time, actionable information to policy decision and enforcement points (think of the fraud-detection services run by credit card companies).
  • “After the fact” information from event logs, raw security-related events, NetFlow and packet data, and other indicators of compromise that can be correlated with additional observable or collectible information. (A minimal sketch of both approaches follows this list.)
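
To make the distinction concrete, here is a minimal sketch in Python of how these two analysis styles differ. It is an illustration only, assuming hypothetical event fields and indicator lists (TRUSTED_LOCATIONS, KNOWN_BAD_IPS); it does not describe any actual Intel tooling.

```python
# Minimal illustration of the two analysis styles described above.
# All names, fields, and thresholds are hypothetical placeholders.
from datetime import datetime, timedelta

# Hypothetical policy inputs and indicators of compromise (IoCs).
TRUSTED_LOCATIONS = {"US", "IE", "IL"}
KNOWN_BAD_IPS = {"203.0.113.7", "198.51.100.23"}  # RFC 5737 documentation addresses

def in_the_moment_decision(event: dict) -> str:
    """'In the moment': decide allow/deny/step-up at the policy enforcement point,
    using only the context available right now (device, location, source address)."""
    if event["source_ip"] in KNOWN_BAD_IPS:
        return "deny"
    if event["location"] not in TRUSTED_LOCATIONS or not event["device_managed"]:
        return "step_up_auth"  # e.g., require a second authentication factor
    return "allow"

def after_the_fact_correlation(event_log, window):
    """'After the fact': sweep the historical event log and flag records that
    correlate with known indicators of compromise within a time window."""
    cutoff = datetime.utcnow() - window
    return [e for e in event_log
            if e["timestamp"] >= cutoff and e["source_ip"] in KNOWN_BAD_IPS]

# Example usage with fabricated events.
access_event = {"source_ip": "192.0.2.10", "location": "BR", "device_managed": False}
print(in_the_moment_decision(access_event))  # -> "step_up_auth"

log = [{"timestamp": datetime.utcnow(), "source_ip": "203.0.113.7", "user": "user1"}]
print(after_the_fact_correlation(log, timedelta(days=7)))  # flags the matching record
```

In practice, the first function would sit in a streaming path that must answer in milliseconds, while the second would run over a historical store holding days or weeks of events, which is exactly why the platform has to handle both velocity and volume.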

 

As we enter this brave new world for IT, it’s clear that we will need an analytics platform that can store and process data at an unprecedented scale. We will also need new algorithms and new approaches for gleaning both near-real-time and historical insights from a constant flood of data.

 

In an upcoming post, I will look at some of the requirements for this new-era analytics platform. For now, let’s just say we’re gonna need a bigger boat.

Intel and the Intel logo are trademarks of Intel Corporation in the United States and other countries. * Other names and brands may be claimed as the property of others.
