Much of the world’s web traffic is routed through data centers, which also fuel power-guzzling artificial intelligence (AI) applications. In the U.S. alone, data centers consumed around 4 percent of the country’s total electricity in 2023, and that figure is projected to rise to as much as 12 percent by 2028.

Some researchers are thinking big, formulating innovative schemes to make data centers more sustainable. Others, like Martin Karsten, a professor of systems and networking at the University of Waterloo in Canada, are enhancing current methods in the tiniest of ways to reduce data center energy consumption.

Karsten and his former master’s student, Peter Cai, uncovered inefficiencies in how the Linux operating system (OS) processes network traffic at the kernel level. The Linux kernel is the “seed,” or core, of the OS, managing communication and resources such as memory between the hardware and running processes.

In the case of data centers, that processing involves figuring out where to send segments of information, called data packets, transmitted over the network. When a new packet arrives, the network hardware raises an interrupt request to the CPU, which puts its current task on hold so the kernel can process the packet.

The OS can also periodically poll the network to check for incoming packets before an interrupt request fires. This busy-polling mechanism minimizes delays in processing network traffic, but at the cost of CPU power consumption. During low-traffic periods in particular, polling continues even when there are no packets to process, leading to needless energy use.

Karsten decided to optimize this approach. “Instead of always waiting for a fixed period of time, we dynamically wait based on what we know is going on in the application,” he says. He devised a technique in which busy polling runs during periods of high network traffic, with unnecessary interrupt requests suspended; when traffic dies down, interrupt-based delivery resumes. “That makes the resul...
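Linux already exposes the busy polling described above to applications: a process can opt a socket into it with the SO_BUSY_POLL socket option (system-wide defaults can also be set through the net.core.busy_poll and net.core.busy_read sysctls). A minimal sketch, with the 50-microsecond poll budget chosen arbitrarily for illustration:

    /* Sketch: opt a socket into kernel busy polling on Linux.
     * The 50-microsecond budget is an arbitrary example value. */
    #include <stdio.h>
    #include <sys/socket.h>

    int enable_busy_poll(int fd) {
        int usecs = 50;  /* how long the kernel may busy-poll per read */
        if (setsockopt(fd, SOL_SOCKET, SO_BUSY_POLL, &usecs, sizeof usecs) < 0) {
            perror("setsockopt(SO_BUSY_POLL)");
            return -1;
        }
        return 0;
    }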
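The idea behind the adaptive scheme can be illustrated with a small user-space analogy. This is not Karsten and Cai’s kernel patch; the UDP socket, port, and idle threshold below are assumptions made purely for the example. The loop busy-polls the socket while packets keep arriving and falls back to a blocking, interrupt-style wait once it has come up empty often enough, so the CPU only spins when spinning is likely to pay off.

    /* Illustrative sketch of adaptive polling in user space:
     * busy-poll under load, block ("interrupt-style") when idle. */
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/epoll.h>
    #include <sys/socket.h>
    #include <sys/types.h>

    #define IDLE_LIMIT 1000  /* empty polls tolerated before blocking again */

    static void handle_packet(const char *buf, ssize_t len) {
        /* Application work on one datagram would go here. */
        (void)buf; (void)len;
    }

    int main(void) {
        /* UDP socket bound to an arbitrary example port. */
        int fd = socket(AF_INET, SOCK_DGRAM, 0);
        struct sockaddr_in addr = { .sin_family = AF_INET,
                                    .sin_addr.s_addr = htonl(INADDR_ANY),
                                    .sin_port = htons(9000) };
        if (fd < 0 || bind(fd, (struct sockaddr *)&addr, sizeof addr) < 0) {
            perror("socket/bind");
            return EXIT_FAILURE;
        }

        int epfd = epoll_create1(0);
        struct epoll_event ev = { .events = EPOLLIN, .data.fd = fd };
        epoll_ctl(epfd, EPOLL_CTL_ADD, fd, &ev);

        char buf[2048];
        int busy = 0;        /* currently in busy-polling mode? */
        int idle_polls = 0;  /* consecutive polls that found nothing */

        for (;;) {
            if (!busy) {
                /* Quiet period: block until data arrives, letting the CPU rest. */
                struct epoll_event out;
                if (epoll_wait(epfd, &out, 1, -1) < 1)
                    continue;
                busy = 1;
                idle_polls = 0;
            }

            /* Busy period: drain packets with non-blocking reads (polling). */
            ssize_t n = recv(fd, buf, sizeof buf, MSG_DONTWAIT);
            if (n >= 0) {
                handle_packet(buf, n);
                idle_polls = 0;
            } else if (++idle_polls > IDLE_LIMIT) {
                /* Traffic died down: stop spinning, return to blocking waits. */
                busy = 0;
            }
        }
    }

In the kernel-level work the equivalent switch happens inside the network stack itself, but the trade-off it navigates is the same one the article describes: polling buys low latency under load, while interrupt-driven delivery saves power when the link goes quiet.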