Can New Math Speed Up The Internet?
In 1977, Abraham Lempel and Jacob Ziv of the Israel Institute of Technology came up with the LZ1 and LZ2 (better known as LZ77 and LZ78) lossless data compression algorithms. At the time, the Internet was not even a dream: packet switching was in its infancy, and TCP/IP would not become the standard until five years later. Yet their research is the foundation of most of the data compression systems used today to transmit large quantities of data.
Now, a group of scientists from Denmark's Aalborg University, in collaboration with researchers from Caltech and MIT, wants to introduce a new way to transmit information over the Internet, one the group claims is 5-10 times faster: sending mathematical equations instead of individual packets.
Seem crazy? It looked like a hoax to me at first, but after reading the paper carefully, I came to the conclusion that the science is sound, and that the scientists have demonstrated its effectiveness.
Roughly 98% of telecommunicated information is carried over the Internet today, versus 1% in 1993 and 53% in 2000. According to the IEEE, although Internet data transmission speeds have grown from 300 bit/s in 1984 to 1 Gbit/s in 2014 (3.3 million times faster), the amount of data transmitted every day is billions of times larger than it was 30 years ago. In fact, Intel has said that more data was transmitted over the Internet in 2010 than in the network's entire history through 2009.
That is why new approaches such as this one from Denmark can give new life to the Internet and help deliver the services we demand every day.
The technology is based on Random Linear Network Coding (RLNC). Its main advantage is that the coded data stream carries enough information to reconstruct itself at the receiver even when parts are missing, because network coding stores and sends the signal in a different form. The technique processes data packets in coded chunks and, unlike traditional packet switching, allows receivers to collect them in any order.
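The idea can be sketched in a few lines of Python. This is a toy model, not the Aalborg group's implementation: it codes over GF(2) (simple XOR) rather than the larger finite fields used in practice, and all names are illustrative. Each coded packet is a random linear combination of the source packets; the receiver recovers the originals by Gaussian elimination once it has collected enough independent combinations, regardless of arrival order.

```python
import random

def encode(sources):
    """Emit one coded packet: a random GF(2) combination of the source packets."""
    k = len(sources)
    coeffs = [random.randint(0, 1) for _ in range(k)]
    if not any(coeffs):                      # skip the useless all-zero combination
        coeffs[random.randrange(k)] = 1
    payload = bytes(len(sources[0]))
    for c, src in zip(coeffs, sources):
        if c:
            payload = bytes(a ^ b for a, b in zip(payload, src))
    return coeffs, payload                   # coefficients travel with the payload

def decode(packets, k):
    """Gaussian elimination over GF(2); returns the sources once the coefficient
    matrix reaches rank k, or None if more packets are still needed."""
    rows = [(list(c), bytearray(p)) for c, p in packets]   # work on copies
    for col in range(k):
        pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
        if pivot is None:
            return None                      # not full rank yet
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:
                rows[r] = ([a ^ b for a, b in zip(rows[r][0], rows[col][0])],
                           bytearray(a ^ b for a, b in zip(rows[r][1], rows[col][1])))
    return [bytes(rows[i][1]) for i in range(k)]

# A sender keeps emitting coded packets; the receiver decodes as soon as it
# holds k independent ones. Order and individual losses do not matter.
sources = [b"packet-1", b"packet-2", b"packet-3"]
received = []
result = None
while result is None:
    received.append(encode(sources))
    result = decode(received, len(sources))
assert result == sources
```

In real deployments the coefficients are drawn from a larger field such as GF(2^8), so that random combinations are almost surely independent and far fewer redundant packets are needed.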
Network coding is especially helpful with multimedia content such as music and video streaming. Imagine Netflix being able to send Blu-ray quality video to its subscribers using only 2 Mbit/s.
Another advantage the technology offers is security. Data in a TCP connection typically travels along a single path, which makes it easier for hackers and government organizations to spy on communications. Network-coded data, by contrast, can travel over different paths and can be reconstructed only at its destination, and only by its intended recipient.
"With the old systems you would send packet 1, packet 2, packet 3 and so on. We replace that with a mathematical equation. We don't send packets. We send a mathematical equation," Frank Fitzek, professor of electronic systems at Aalborg University and a network coding pioneer, explained in an Aalborg news post. "You can compare it with cars on the road. Now we can do without red lights. We can send cars into the intersection from all directions without their having to stop for each other. This means that traffic flows much faster."
Fitzek claims that, in experiments with network coding of Internet traffic, equipment manufacturers experienced speeds 5-10 times faster than usual.
Network coding could be critical to implementing next-generation 5G wireless standards, new software-defined networks, and the explosion of Internet-connected devices known as the Internet of Things.
Because the protocols are new, information will have to be encoded and decoded using patented technology, which means new networking hardware and software will need to be developed. Network coding can, however, take advantage of multipath TCP, which Apple implemented in iOS 7.
We'll probably have to wait a few years to see network coding used to deliver data to our homes, but large organizations, especially cloud providers, could start using it internally to speed up transfers between their servers, saving money and resources.