MIT Working On System To Write Its Own Network Traffic Algorithms

Network congestion has always been an issue, and engineers have steadily improved Transmission Control Protocol (TCP) algorithms to handle it more effectively. Now MIT has devised a computer system that does the job itself.

TCP congestion-control algorithms have always been developed by human engineers, but researchers at MIT will be presenting a computer system, named "Remy", that automatically generates them based on variations in traffic and congestion.

In simulations, the system has shown that it can develop algorithms that are two to three times better at congestion control than those written by human engineers. The key is that the system can adapt its algorithms to varied loads by testing them against the traffic it sees.
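To make the idea concrete, here is a minimal sketch of that general approach: generate candidate congestion-control rules, evaluate each one against a simulated network, and keep the best performer. The simulation model, the candidate parameters (window increment and backoff factor), and the scoring function below are illustrative assumptions for demonstration only; they are not Remy's actual algorithm or MIT's code.

```python
import random

def simulate(increment, backoff, capacity=100.0, rounds=200):
    """Run a crude bottleneck-link simulation; return (throughput, delay)."""
    window = 1.0
    queue = 0.0
    delivered = 0.0
    total_delay = 0.0
    for _ in range(rounds):
        queue += window                      # sender pushes a window of packets
        sent = min(queue, capacity)          # link drains up to its capacity
        queue -= sent
        delivered += sent
        total_delay += queue / capacity      # leftover queue approximates delay
        if queue > capacity:                 # congestion signal: back off
            window = max(1.0, window * backoff)
        else:                                # otherwise grow the window
            window += increment
    return delivered / rounds, total_delay / rounds

def score(throughput, delay):
    """Reward higher throughput and penalize delay (weights are arbitrary)."""
    return throughput - 10.0 * delay

def search(trials=500, seed=0):
    """Randomly sample candidate rules and keep the best-scoring one."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        increment = rng.uniform(0.1, 10.0)
        backoff = rng.uniform(0.1, 0.9)
        s = score(*simulate(increment, backoff))
        if best is None or s > best[0]:
            best = (s, increment, backoff)
    return best

if __name__ == "__main__":
    s, inc, back = search()
    print(f"best score={s:.2f} with increment={inc:.2f}, backoff={back:.2f}")
```

The point of the sketch is the workflow rather than the numbers: an automated search explores many rule variants against traffic it observes, something a human engineer could not do exhaustively by hand.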

In tests that simulated a high-speed, wired network with consistent transmission rates across physical links, Remy's algorithms roughly doubled network throughput when compared to Compound TCP and TCP Cubic, while reducing delay by two-thirds. In another set of tests, which simulated Verizon's cellular data network, the gains were smaller but still significant: a 20 to 30 percent improvement in throughput, and a 25 to 40 percent reduction in delay.

It will be interesting to see what future developments come out of this research. While the debate over whether computers can do things better than humans continues, this may be one area where they take home the gold.

Source: MIT