MIT Researchers Find In-Band Full-Duplex Trick for Faster, More Efficient Wireless Networks

Capable of supporting 100 times more devices at 10 times the data rate of 4G LTE, could this be the future of wireless communications?

Gareth Halfacree
5 years ago • Communication

Researchers from the Lincoln Laboratory at the Massachusetts Institute of Technology (MIT) claim to have addressed the biggest problem with in-band full-duplex (IBFD) communications, where radios send and receive simultaneously on the same frequency: interference.

MIT researchers Kenneth Kolodziej, Bradley Perry, and Jeffrey Herd are among those who believe that adding in-band full-duplex capabilities to communications systems will improve their ability to operate in increasingly congested radio spectrum while making more efficient use of the allocations they have. The secret: where a traditional radio receives on one frequency and transmits on another, an IBFD system transmits and receives on the very same frequency at the same time, halving its spectrum requirement without harming performance.

At least, that's the theory. IBFD implementations suffer from self-interference, in which a radio's own transmission drowns out the signal it is trying to receive, and it's this that has kept IBFD from becoming the norm. The researchers, however, claim to have a fix: adaptive digital beamforming and a smart cancellation system that lets the radio ignore its own noise, allowing the technique to be used at higher power levels and on directional phased antenna arrays for the first time.

"Phased arrays can direct communication traffic to targeted areas, thereby expanding the distances that the RF signals reach and significantly increasing the number of devices that a single node can connect," explains Kolodziej in an interview with MIT News on the subject. "The self-interference elimination is particularly challenging within a phased array because the close proximity of the antennas results in higher interference levels. This interference becomes even more difficult as transmit powers exceed half of a watt because distortion and noise signals are generated and must also be removed for successful implementation."

The system works by coupling the output of an active transmit channel into an otherwise inactive receive channel on the same phased array antenna. Using a reference copy of the transmission, an adaptive cancellation algorithm filters out the transmitted signal, along with the distortion and noise it creates, leaving the receiver with a clean signal to process.
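For a rough sense of how reference-based cancellation works, here's a minimal Python sketch using a least-mean-squares (LMS) adaptive filter — a textbook stand-in rather than the team's actual algorithm. The leakage channel, signal levels, and step size are all assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative signals: a known transmit waveform and a much weaker signal of interest.
n = 20_000
tx = rng.standard_normal(n)                      # reference copy of the transmission
soi = 0.01 * rng.standard_normal(n)              # far weaker signal we want to receive

# Model the self-interference leakage path as a short FIR filter (assumed values).
leak = np.array([0.9, 0.35, -0.12, 0.05])
rx = np.convolve(tx, leak)[:n] + soi             # what the co-located receiver hears

# LMS adaptive canceller: learn the leakage path from the reference, then subtract it.
taps = 8
w = np.zeros(taps)
mu = 0.005                                       # small step size for stability
clean = np.zeros(n)
for i in range(taps, n):
    x = tx[i - taps + 1:i + 1][::-1]             # most recent reference samples
    est = w @ x                                  # estimated self-interference
    err = rx[i] - est                            # residual after cancellation
    clean[i] = err
    w += mu * err * x                            # LMS weight update

before = 10 * np.log10(np.mean(rx[taps:] ** 2) / np.mean(soi[taps:] ** 2))
after = 10 * np.log10(np.mean(clean[n // 2:] ** 2) / np.mean(soi[n // 2:] ** 2))
print(f"self-interference above signal: {before:.1f} dB before, {after:.1f} dB after")
```

In a real phased array the canceller would have to run per antenna element at RF sample rates and, as Kolodziej notes, also track the transmitter's own distortion and noise — which is where the bulk of the difficulty lies.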

According to the team's calculations, an implementation could support 100 times more devices and 10 times faster data rates than existing 4G Long Term Evolution (LTE) systems, while offering a boosted communication range of up to 60 miles. The only real trade-off: an antenna roughly half again as large as those used in 5G New Radio (5G NR) base stations.

The team's work has been published in the journal IEEE Transactions on Microwave Theory and Techniques.

Gareth Halfacree
Freelance journalist, technical author, hacker, tinkerer, erstwhile sysadmin. For hire: freelance@halfacree.co.uk.