A Little Security for Big Data


Securing big data in the enterprise requires smart policy enforcement, thorough analytics and high-performance tools, says Rajesh Maurya

Bringing the issue of security into the big data discussion often produces two divergent schools of thought among IT professionals: categorical denial that big data should be treated any differently from existing network infrastructure, and the opposite response of over-engineering the solution relative to the actual (or perceived) value of the data involved.

Big data, defined by Gartner as high-volume, high-velocity and/or high-variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimisation, amplifies the routine security challenges already present in existing data networks.

These are the four facets, as defined by IDC, that give rise to challenges but also opportunities:

Volume: The amount of data is moving from terabytes to zettabytes (1 zettabyte is 1,000,000,000 terabytes) and beyond
Velocity: The speed of data (in and out), from static one-time datasets to ongoing streaming data
Variety: The range of data types and sources – structured, semi-structured, unstructured or raw
Value: The importance of the data in context
Yet, while big data presents new security challenges, the starting point for resolving them remains the same as for any other data security strategy: determine data confidentiality levels, identify and classify the most sensitive data, decide where critical data will be located, and establish secure access models for both the data and the analysis.
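As a minimal illustration of that starting point, the Python sketch below encodes confidentiality levels, dataset classifications and a clearance-based access check. The level names and dataset labels are hypothetical assumptions for the example, not a prescribed scheme.

```python
from enum import IntEnum

class Confidentiality(IntEnum):
    """Hypothetical confidentiality levels, ordered least to most sensitive."""
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

# Example classification of datasets by sensitivity (illustrative labels).
DATASET_LEVELS = {
    "web_clickstream": Confidentiality.INTERNAL,
    "customer_pii": Confidentiality.RESTRICTED,
    "public_price_feed": Confidentiality.PUBLIC,
}

def can_access(user_clearance: Confidentiality, dataset: str) -> bool:
    """A user may read a dataset only at or below their clearance level."""
    return user_clearance >= DATASET_LEVELS[dataset]

print(can_access(Confidentiality.CONFIDENTIAL, "customer_pii"))    # False
print(can_access(Confidentiality.CONFIDENTIAL, "web_clickstream")) # True
```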

Plan around the big data lifecycle
Properly defending big data requires defining specific security requirements around the big data lifecycle. Typically, this begins with securing the collection of data, followed by securing access to it. As with most security policies, assessing the threats to the organisation’s big data never ends; it revolves around ensuring the integrity of data both at rest and during analysis.

Performance is a key consideration when securing the collected data and the networks. Firewalls and other network security devices, such as those for encryption, must be of sufficiently high performance so they can handle the increased throughput, connections and application traffic. In a big data environment, policy creation and enforcement are more critical than usual because of the larger volumes of data and the number of people who will require access to it.
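To make policy creation and enforcement concrete, here is a minimal first-match rule evaluator in Python. The rule fields, source names and datasets are hypothetical, and a real firewall policy is far richer (ports, users, schedules, inspection profiles); this is only a sketch of the idea.

```python
# Minimal sketch of rule-based policy evaluation: first matching rule wins.
# All sources, datasets and rules below are illustrative assumptions.
POLICY = [
    {"source": "analytics_cluster", "dataset": "customer_pii", "action": "deny"},
    {"source": "analytics_cluster", "dataset": "*",            "action": "allow"},
    {"source": "*",                 "dataset": "*",            "action": "deny"},  # default deny
]

def evaluate(source: str, dataset: str) -> str:
    """Return the action of the first rule matching the request."""
    for rule in POLICY:
        if rule["source"] in (source, "*") and rule["dataset"] in (dataset, "*"):
            return rule["action"]
    return "deny"

print(evaluate("analytics_cluster", "customer_pii"))  # deny
print(evaluate("analytics_cluster", "web_logs"))      # allow
```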

The sheer amount of data also proportionately increases the need to prevent data leakage. Data Loss Prevention technologies should be employed to ensure that information is not being leaked to unauthorised parties. Internal intrusion detection and data integrity systems must be used to detect advanced targeted attacks that have bypassed traditional protection mechanisms, for example, anomaly detection in the collection and aggregation layers.
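As a toy illustration of data leakage prevention, the sketch below scans outbound records for sensitive patterns before they leave the collection layer. The regexes are simplified assumptions; production DLP systems rely on fingerprinting, checksums and contextual analysis rather than bare pattern matching.

```python
import re

# Illustrative patterns for sensitive data; these regexes are assumptions
# and deliberately simple, not production-grade detection.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_record(record: str) -> list:
    """Return the names of sensitive patterns found in an outbound record."""
    return [name for name, rx in PATTERNS.items() if rx.search(record)]

outbound = "invoice for jane.doe@example.com, card 4111 1111 1111 1111"
hits = scan_record(outbound)
if hits:
    print(f"blocked: matched {hits}")  # blocked: matched ['email', 'card_number']
```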

Packet data, flow data, sessions and transactions should all be inspected.

Because big data involves information residing over a wide area and drawn from multiple sources, organisations also need the ability to protect data wherever it exists. In this regard, virtualised security appliances providing a complete range of security functionality must be positioned at key locations throughout the public, private and hybrid cloud architectures frequently found in big data environments. Resources must be connected securely, and data transported from the sources to big data storage must also be secured, typically through an IPSec tunnel.
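IPSec protects the tunnel itself at the network layer. As a complementary, purely illustrative application-layer check, the Python sketch below has each source tag its records with an HMAC so the storage tier can detect tampering in transit; the shared key and its distribution are placeholder assumptions, not real key management.

```python
import hmac
import hashlib

SHARED_KEY = b"example-key-distributed-out-of-band"  # placeholder, not real key management

def tag(record: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag at the data source before transmission."""
    return hmac.new(SHARED_KEY, record, hashlib.sha256).digest()

def verify(record: bytes, received_tag: bytes) -> bool:
    """At the storage tier, recompute the tag and compare in constant time."""
    return hmac.compare_digest(tag(record), received_tag)

record = b'{"sensor": 42, "reading": 19.7}'
t = tag(record)
print(verify(record, t))                # True: record arrived intact
print(verify(record + b"tampered", t))  # False: modification detected
```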

Leveraging big data with the right tools
While big data presents challenges, it also offers opportunities. With the right tools, vast amounts of information can be analysed, allowing an organisation to understand and benchmark normal activity. If the organisation then monitors for users who stray from that norm, it can proactively get ahead of potential data and system breaches.
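A minimal sketch of that benchmarking idea, assuming hypothetical per-user daily access counts: flag any activity that strays several standard deviations above a user's historical norm. A real system would baseline many features (time of day, volume, destinations) over weeks of data.

```python
from statistics import mean, stdev

# Hypothetical daily record-access counts per user (illustrative data).
history = {
    "alice": [120, 130, 110, 125, 118],
    "bob": [40, 35, 45, 38, 42],
}

def is_anomalous(user: str, today: int, threshold: float = 3.0) -> bool:
    """Flag activity more than `threshold` standard deviations above the norm."""
    baseline = history[user]
    mu, sigma = mean(baseline), stdev(baseline)
    return sigma > 0 and (today - mu) / sigma > threshold

print(is_anomalous("bob", 41))   # False: within the normal band
print(is_anomalous("bob", 900))  # True: investigate before data leaves
```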

This effort is aided by competent IT staff and efficient deployment of the appropriate security tools. These tools include dedicated logging, analysis and reporting appliances that can securely aggregate log data from security and other syslog-compatible devices. These appliances also analyse, report on and archive security events, network traffic, Web content and messaging data. Policy compliance can then be measured and customised reports easily produced.
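As a toy illustration of such aggregation, the sketch below counts syslog-style events per device and severity for a simple compliance summary. The line format and sample messages are assumptions; real devices emit widely varying formats.

```python
import re
from collections import Counter

# Simplified sketch of aggregating syslog-style lines from security devices.
# The "host severity message" format is an assumption for this example.
LOG_LINE = re.compile(r"^(?P<host>\S+)\s+(?P<severity>\w+)\s+(?P<msg>.*)$")

def aggregate(lines):
    """Count events per (host, severity) pair for a compliance report."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m:
            counts[(m["host"], m["severity"])] += 1
    return counts

sample = [
    "fw01 alert blocked outbound transfer to unknown host",
    "fw01 info policy reload complete",
    "fw01 alert blocked outbound transfer to unknown host",
]
for (host, sev), n in aggregate(sample).items():
    print(f"{host} {sev}: {n}")  # fw01 alert: 2 / fw01 info: 1
```
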
The difficulty in capturing, managing and processing information quickly in big data environments will continue to make security an afterthought in many firms.

As portable storage and bandwidth continue to grow, the mobility of these larger datasets will also increase, raising the risk of breaches and disclosure of sensitive datasets. Threats will also likely come from intruders manipulating big data so that business analytics and business intelligence tools generate false results, leading to management decisions that profit the intruders.

Even small changes in big data can have a big impact on results, so organisations must not ignore the need to secure big data assets, whether for security reasons, business intelligence or otherwise. They must address big data’s main needs in terms of authentication, authorisation, role-based access control, auditing, monitoring, and backup and recovery. Going forward, big data analytics involving behavioural benchmarking and monitoring will become increasingly crucial in addressing next-generation information security challenges.
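As one final illustration, here is a minimal role-based access control check in Python; the roles and permission strings are hypothetical examples rather than a recommended scheme.

```python
# Illustrative role-to-permission table; roles and permission names
# are assumptions for this sketch, not a recommended design.
ROLE_PERMISSIONS = {
    "analyst":  {"read:aggregates"},
    "engineer": {"read:aggregates", "read:raw", "write:pipeline"},
    "auditor":  {"read:aggregates", "read:audit_log"},
}

def authorise(role: str, permission: str) -> bool:
    """Allow an action only if the user's role grants that permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(authorise("analyst", "read:raw"))   # False: raw data is off-limits
print(authorise("engineer", "read:raw"))  # True
```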

Rajesh Maurya is Country Manager, India & SAARC, Fortinet.


