22 Jun 2012

Dealing with Data Explosion


Data storage capacity requirements in today’s enterprises are increasing at an alarming rate. According to CSC research, average enterprise data capacity will need to grow by 650% in the next 5 years. This growth is driven by increased user connectivity and an organisation’s dependence on the information these users create and exchange. Couple this with today’s mobile users requiring ubiquitous access to their data from any platform, anywhere, anytime, and we can see why organisations are struggling to keep up with the storage requirements of this data explosion.

Security data storage has always had its own challenges. Firewalls, IDS/IPS and Vulnerability Assessment systems produce an ever-expanding amount of device log data that is invariably stored for a period of time (up to 3 months), which may stretch out to years to meet policy or regulatory requirements. But do these stored logs help you when you have a security incident? Will these reams of logs be enough information to understand the incident, the breach, the exposure? Would you be able to perform the necessary forensic analysis on these logs? How often have you witnessed security professionals logging only blocked traffic, when the traffic you are really interested in is what is actually being passed by your security devices into your environment? This is the traffic that contains the serious threats worth worrying about.

In my previous post on whether big data can solve the unfulfilled promise of network security, I discussed the traditional logging and reporting paradigm, and how it doesn’t allow you to reproduce incidents with enough fidelity to detail the breach, the time the intruder was inside, the systems they accessed and the data they stole. Device logging doesn’t give you the full range of options, and it may not even alert you to an incident. The only way you can truly assess the security of your network is to analyse full packet captures of your traffic, and that presents a new and interesting challenge: a single gigabit network can transport terabytes of traffic a day. How and where do you store weeks or months of full packet captures of your network traffic?
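
A quick back-of-envelope calculation shows why this volume is so daunting. The figures below are simple line-rate arithmetic, not measurements from any particular network:

```python
# How much raw traffic can a single gigabit link carry in a day?
bits_per_second = 1_000_000_000        # 1 Gb/s line rate
seconds_per_day = 86_400
bytes_per_day = bits_per_second / 8 * seconds_per_day
terabytes_per_day = bytes_per_day / 1e12
print(f"{terabytes_per_day:.1f} TB/day at full saturation")  # → 10.8 TB/day
```

Even at a modest 20% average utilisation, that is still more than 2 TB of captures per day, per link — far beyond what most on-premise security appliances are provisioned to retain.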

The only true representation of your data is the data itself. The only way you are able to play, pause and rewind attacks completely is to store an entire copy of all the traffic.
The Cloud offers extremely low-cost, high-capacity storage, which is perfect for short-term storage of this sort of data. It offers secure upload and encryption, and it can be replicated and distributed if required. You only pay for what you use, and for how long you use it. Coupling full packet captures with Cloud storage makes perfect sense. You capture the data, upload it, and let someone else store and process it for you.
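
As an illustration of the capture-then-upload workflow, here is a minimal sketch using Amazon S3 via boto3. The bucket name, key scheme and helper names are my own assumptions for the example, not Packetloop’s actual pipeline; it assumes boto3 is installed and AWS credentials are configured:

```python
# Hypothetical sketch: push a rotated capture file to S3 with server-side
# encryption, then free the local disk. Names here are illustrative only.
import os

def object_key(sensor: str, filename: str) -> str:
    """Namespace captures by sensor so multiple taps don't collide."""
    return f"captures/{sensor}/{filename}"

def upload_capture(path: str, bucket: str = "example-packet-captures") -> str:
    import boto3  # assumes boto3 and AWS credentials are available
    key = object_key(os.uname().nodename, os.path.basename(path))
    s3 = boto3.client("s3")
    # SSE-S3 encrypts the object at rest; swap in SSE-KMS for customer-managed keys
    s3.upload_file(path, bucket, key,
                   ExtraArgs={"ServerSideEncryption": "AES256"})
    os.remove(path)  # local disk only needs to hold each capture briefly
    return key
```

The design point is that the sensor only ever holds one rotation window of traffic locally; long-term retention, replication and processing all happen on the storage provider’s side.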
Packetloop accesses the full fidelity of the data. It gives you play, pause and rewind. It has access to all events and can replay them at any time with new insights to find blended and sophisticated attacks or exfiltration. It scales, and it's focused on providing executives with the metrics and overviews they are looking for (dashboards) while remaining powerful enough to track and trace incidents.

Packetloop is designed to leverage Big Data to perform analysis of terabytes of full packet captures, and to scale to handle the data on your network now and into the future. Shouldn't you be giving your organisation the best chance of detecting intruders, containing the incident and remediating with the best evidence and information?
