AI and the Future of Network Security
According to a Cisco whitepaper examining the rapid expansion of the Internet of Things, there will be more than 50 billion internet-connected devices in the world by 2020, representing a hundredfold increase since 2003. This proliferation of connected devices is making our everyday lives and work easier, but the convenience comes with a number of risks.
Research from PwC indicates that the number of worldwide cybersecurity incidents rose by 38% in 2015, the largest increase in any of the 12 years the firm has conducted its Global State of Information Security Survey. Due in large part to the ever-increasing number of connected devices in the workplace, the average North American enterprise is inundated with alerts from its cybersecurity systems; 2014 research put the number at roughly 10,000 per day. The most active enterprise networks often receive an astounding 150,000 alerts per day.
Even the best infosec teams have a hard time managing this barrage of potential cybersecurity threats, which is why some IT innovators are turning to a new solution: artificial intelligence (AI).
The promise of AI in the cybersecurity space revolves around its ability to process and analyze millions of data points at incredibly high speeds. Maintaining a safe and secure network involves gathering, organizing, and cross-checking data from every device that connects to it -- a process that is becoming increasingly overwhelming for IT professionals as the number of those devices continues to grow.
To spot a potential threat, a cybersecurity team must have a deep, nuanced understanding of its organization’s standard IT protocols, including the behavior of privileged users, accounts, and access points, as well as the normal flow of authentication attempts. Simply put, a threat only appears as a threat if it deviates from standard practices.
An AI-fueled cybersecurity platform could do a great deal to minimize the number of false positives and enable IT teams to focus their energies on combating real threats. When an AI algorithm is given access to an organization’s internal log and monitoring systems, it can evaluate the usage patterns of each individual employee, create a series of baseline activity profiles, and keep an eye on all network activity 24 hours a day.
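The baseline-profiling idea described above can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation: the usernames, login counts, and three-standard-deviation threshold are all hypothetical, and a real system would profile many signals beyond login volume.

```python
from statistics import mean, stdev

# Hypothetical daily login counts per user, as might be parsed from auth logs.
history = {
    "alice": [12, 14, 11, 13, 12, 15, 13],
    "bob":   [3, 4, 2, 3, 5, 3, 4],
}

def build_baselines(history):
    """Compute a (mean, stdev) activity profile for each user."""
    return {user: (mean(counts), stdev(counts)) for user, counts in history.items()}

def is_anomalous(user, todays_count, baselines, threshold=3.0):
    """Flag activity more than `threshold` standard deviations from the user's norm."""
    mu, sigma = baselines[user]
    if sigma == 0:
        return todays_count != mu
    return abs(todays_count - mu) / sigma > threshold

baselines = build_baselines(history)
print(is_anomalous("alice", 13, baselines))  # within alice's normal range -> False
print(is_anomalous("bob", 40, baselines))    # far outside bob's baseline -> True
```

The key point is that the threshold is relative to each user's own history: 40 logins is unremarkable for a heavy user but a strong signal for someone who normally logs in a handful of times a day.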
AI is tremendously useful as this type of catch-all mechanism, but it becomes truly invaluable once it starts to recognize threats in micro-deviations that are all but invisible to the human eye. As an AI tool is fed more and more data over time, it becomes capable of maintaining a constantly moving standard by which to judge potential threats. Human monitors are ultimately limited by the amount of data they can process in a day, but an AI has no such limitations.
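The "constantly moving standard" can be sketched with an exponentially weighted moving average (EWMA), a common technique for adaptive baselines: each new observation is judged against the current baseline and then folded into it. The parameter values, warm-up period, and request-rate stream below are hypothetical choices for illustration.

```python
class MovingBaseline:
    """Adaptive anomaly detector: an EWMA baseline that updates with every sample."""

    def __init__(self, alpha=0.1, threshold=3.0, warmup=5):
        self.alpha = alpha          # how quickly the baseline adapts to new data
        self.threshold = threshold  # deviation multiple that triggers an alert
        self.warmup = warmup        # samples to absorb before flagging anything
        self.mean = None
        self.var = 0.0
        self.n = 0

    def update(self, value):
        """Return True if `value` deviates sharply from the current baseline,
        then fold it into the baseline so the standard keeps moving."""
        if self.mean is None:
            self.mean, self.n = value, 1
            return False
        deviation = value - self.mean
        std = self.var ** 0.5
        anomalous = (
            self.n >= self.warmup
            and std > 0
            and abs(deviation) > self.threshold * std
        )
        # Exponentially weighted updates for mean and variance.
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        self.n += 1
        return anomalous

baseline = MovingBaseline()
stream = [10, 11, 10, 12, 11, 10, 11, 50]  # hypothetical requests per minute
flags = [baseline.update(v) for v in stream]
print(flags)  # only the final spike (50) is flagged
```

Because the baseline itself drifts with the data, gradual changes in normal behavior are absorbed rather than alerted on, while sharp deviations still stand out, which is exactly the property that lets such a system keep pace where a fixed ruleset cannot.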
Moving forward, AI will be most effective as a cybersecurity mechanism when deployed alongside human experts. When it comes to processing data and spotting potential threats, the unparalleled “brain power” of an AI system gives it a distinct advantage over human cybersecurity monitors. That being said, most AI still struggles to find and execute appropriate fixes once a security breach has been confirmed.
If the AI “arms race” that many experts believe is on the horizon comes to be, understanding how and when to manage and deploy cybersecurity-oriented AI will become much more important than simply having an AI system in place and using it indiscriminately. Hackers and cybercriminals will soon have just as much access to AI technology as enterprise IT teams do, meaning the fate of network security will be determined first and foremost by which side knows how to use AI best. This begins with recognizing the respective strengths and weaknesses of humans and machines, and helping both achieve their full potential.