Techitup Middle East
Expert Opinion

Exclusive: Is your Data Centre a ‘Carbonivore’? 

Sustainability has become a boardroom priority for modern organisations. Alongside the pressure of the ongoing climate crisis, there are good business reasons for this to be a focus: 68 percent of prospective employees are more likely to apply for and accept positions from environmentally sustainable companies, and businesses making ESG-related claims see higher growth than their competitors. 

However, when sustainability goals are paired with ambitious digital transformation strategies, IT and security leaders are left with the challenging brief of building a secure, innovative IT infrastructure whilst limiting carbon emissions.

The data centre dilemma  

The first step in solving this problem is to address the worst culprit: data centres. Businesses generate near-exponentially growing volumes of data, with data centres facilitating the storage and near endless flow of this data between servers, tools, virtual machines, and applications, consuming vast amounts of energy in the process. In fact, data centres account for around two percent of all global carbon emissions each year, a carbon footprint larger than that of the airline industry.

These ‘carbonivores’ not only contribute huge amounts of carbon emissions to an organisation’s overall footprint; the electricity needed to run them also leads to hidden costs and operational challenges. Dublin is a prime example of this struggle: its fast-growing data centre industry gobbled up 18% of Ireland’s total electricity use in 2022, matching the energy consumption of all urban homes during the same period. This excessive energy consumption has led to national concerns, and the government has pushed trade associations and data centre operators to commit to the creation of a new data centre efficiency metric and to more flexible energy consumption.

Research across European data centre operators has shown a shared struggle to secure reliable power at good prices, and UK operators reported strong concerns around the cost of energy in maintaining their servers. Reducing energy consumption, and therefore making cost savings, is a no-brainer – but how can businesses place their data centres on a carbon diet? 

Reducing consumption 

Energy consumption is not a new challenge for the data centre industry. Many system and silicon vendors have introduced more energy-efficient hardware in recent years, offering servers and storage devices with lower power consumption and features such as variable fans. Data centre cooling has also been a source of innovation, optimised through tactics such as free cooling, hot aisle/cold aisle containment, and even constructing new hubs in colder climates for natural cooling effects. But businesses themselves can limit the consumption of their data centres by reconsidering their network management strategies altogether.  

All data centres use various security and monitoring tools to capture data communications through network traffic, each with hidden costs and carbon outputs of their own. One of the popular network analytics probes used across service providers and enterprises requires up to 586 W of power to process 16Gbps of network traffic. Monitoring 100Gbps of traffic would therefore require seven individual probes, consuming the equivalent of roughly 100 home refrigerators in just one year (35,934 kWh). 
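The figures above can be sanity-checked with simple arithmetic. The sketch below takes the article’s quoted numbers (586 W per probe, 16 Gbps of capacity per probe) as given and derives the probe count and annual energy use; everything else is plain calculation.

```python
# Back-of-envelope check of the probe energy figures quoted above.
import math

PROBE_WATTS = 586          # power draw of one analytics probe (per the article)
PROBE_CAPACITY_GBPS = 16   # traffic one probe can process (per the article)
TARGET_GBPS = 100          # traffic to be monitored
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

probes_needed = math.ceil(TARGET_GBPS / PROBE_CAPACITY_GBPS)  # 100/16 -> 7 probes
total_watts = probes_needed * PROBE_WATTS                     # 7 * 586 = 4,102 W
kwh_per_year = total_watts * HOURS_PER_YEAR / 1000            # ~35,934 kWh

print(probes_needed, total_watts, round(kwh_per_year))        # 7 4102 35934
```

The result, roughly 35,934 kWh per year, matches the refrigerator comparison in the text.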

Some of this traffic processing is not necessary. Streamlining this patchwork of separate tools, by determining what traffic is processed by which tools to reduce data duplication, allows data centres to run at a much greater data efficiency, cutting energy costs and reducing carbon output.  

Gaining true visibility into the network can guide businesses in making the right decisions to streamline their data flows. There are four key tactics for optimising tool usage and eradicating redundant or replicated data packets. When combined, they go a long way towards reducing irrelevant network traffic, in turn cutting energy consumption and building a more sustainable infrastructure.

Deduplication 

The structure of modern data centre networks prioritises resiliency and availability, but this approach creates duplicate network packets across a network, dramatically increasing the volume of traffic being processed by data centre tools. Each network packet only needs to be analysed once, and deduplication allows operators to identify and remove duplicates before sending network data to tools, reducing redundancies and thereby requiring far less energy. 
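In essence, deduplication fingerprints each packet and suppresses repeats seen within a short window. The sketch below is a minimal illustration of that idea, not any vendor’s implementation; the `window` size and the choice to hash the whole packet are simplifying assumptions (real deduplication hashes only invariant header fields plus payload, so router rewrites of TTL or checksums do not defeat the match).

```python
# Minimal sketch of packet deduplication: fingerprint each packet and drop
# any packet whose fingerprint was seen within the last `window` packets.
import hashlib

def dedupe(packets, window=1024):
    """Yield packets, suppressing duplicates seen among the last `window` packets."""
    seen = {}  # fingerprint -> sequence number when last seen
    for i, pkt in enumerate(packets):
        fp = hashlib.sha256(pkt).hexdigest()
        last = seen.get(fp)
        if last is not None and i - last <= window:
            continue  # duplicate within the window: drop it
        seen[fp] = i
        yield pkt

packets = [b"SYN", b"DATA-1", b"DATA-1", b"DATA-2", b"SYN"]
print(list(dedupe(packets)))  # [b'SYN', b'DATA-1', b'DATA-2']
```

Each unique packet reaches the tools exactly once, so downstream probes process strictly less traffic.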

Application Filtering 

Application filtering separates data based on traffic signature, distinguishing between high- and low-risk applications, even when data is encrypted. High-volume, trusted applications such as YouTube can be filtered out, allowing businesses to focus data centre tools where they are needed. This reduces the amount of data flowing across the network and limits the energy use of data centre tools.   
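A toy version of this logic can be sketched as a rule that drops flows matching known high-volume, trusted signatures before they reach the tool rail. The domain list and the `sni` field below are purely illustrative assumptions; production systems fingerprint encrypted traffic with far more sophisticated signatures.

```python
# Illustrative sketch of application filtering: classify each flow by a simple
# signature (here, the TLS server name) and filter out trusted high-volume apps.

TRUSTED_HIGH_VOLUME = {"youtube.com", "googlevideo.com"}  # hypothetical rule set

def should_forward(flow):
    """Return True if the flow should be sent on to the data centre tools."""
    sni = flow.get("sni", "")
    return not any(sni == d or sni.endswith("." + d) for d in TRUSTED_HIGH_VOLUME)

flows = [
    {"sni": "rr4.googlevideo.com", "bytes": 9_000_000},  # streaming video: drop
    {"sni": "intranet.example.com", "bytes": 12_000},    # keep for inspection
]
forwarded = [f for f in flows if should_forward(f)]
print([f["sni"] for f in forwarded])  # ['intranet.example.com']
```

Note that the high-volume flow carries the vast majority of the bytes, which is why filtering a handful of trusted applications can shrink tool traffic so sharply.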

Flow Mapping 

Flow mapping is the process of sending only the relevant network data to meet each tool’s needs. It drastically reduces network traffic and prevents tools from becoming overloaded with information from unnecessary subnets, protocols or VLANs.
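Conceptually, each tool declares the subnets and protocols it cares about, and every packet is forwarded only to the tools whose declaration it matches. The tool names and rules below are hypothetical, a minimal sketch of the idea rather than any product’s configuration language.

```python
# Sketch of flow mapping: forward each packet only to the tools whose
# declared subnet/protocol requirements it matches.
import ipaddress

TOOL_MAPS = {  # hypothetical tool rail
    "ids":        {"protocols": {"tcp", "udp"}, "subnet": "10.0.0.0/8"},
    "voip_probe": {"protocols": {"udp"},        "subnet": "192.168.50.0/24"},
}

def tools_for(packet):
    """Return the names of the tools that should receive this packet."""
    src = ipaddress.ip_address(packet["src"])
    return [name for name, rule in TOOL_MAPS.items()
            if packet["proto"] in rule["protocols"]
            and src in ipaddress.ip_network(rule["subnet"])]

print(tools_for({"src": "10.1.2.3", "proto": "tcp"}))      # ['ids']
print(tools_for({"src": "192.168.50.9", "proto": "udp"}))  # ['voip_probe']
print(tools_for({"src": "172.16.0.1", "proto": "icmp"}))   # [] -> dropped
```

Packets matching no map are dropped entirely, so no tool spends energy processing traffic it was never designed to inspect.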

Flow Slicing 

This method of optimisation focuses on reducing the information shared via network packets in every user session. Much like flow mapping, flow slicing functions on the basis that inundating tools with non-essential information wastes valuable energy, and many tools only need to see the initial setup, header, and handshake information. Flow slicing is highly efficient and can carry a big impact on tool traffic and ultimately energy consumption: real-world deployments reduce tool traffic by 80 to 95 percent.  
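The mechanism can be sketched as truncating each flow after its first few packets, which typically cover setup, headers, and handshake. The flow key and `keep_first` threshold below are simplifying assumptions for illustration.

```python
# Sketch of flow slicing: forward only the first few packets of each flow
# (enough for setup/header/handshake analysis), then drop the bulk payload.
from collections import defaultdict

def slice_flows(packets, keep_first=4):
    """Yield packets, truncating each flow after `keep_first` packets."""
    counts = defaultdict(int)
    for pkt in packets:
        key = (pkt["src"], pkt["dst"], pkt["proto"])  # simplified flow key
        counts[key] += 1
        if counts[key] <= keep_first:
            yield pkt

flow = [{"src": "a", "dst": "b", "proto": "tcp", "seq": i} for i in range(100)]
sliced = list(slice_flows(flow))
print(len(sliced), "of", len(flow), "packets forwarded")  # 4 of 100
```

Even this toy example forwards only 4 of 100 packets, a 96 percent reduction, which is consistent with the 80 to 95 percent range cited above for real deployments.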

Achieving true efficiency with deep observability 

Efficiency, in spending, staff hours, and energy usage, has been the name of the game for IT and security leaders who are constantly tasked with doing more with less. Many cloud journeys began, or were accelerated, in response to the COVID-19 pandemic. As a result, inefficiency in infrastructure and spend is something that all modern businesses are now contending with, and a focus on carbon emissions is just the latest pressure for businesses to reduce these redundancies.

These tactics do more than reduce the energy requirements of data centres. By streamlining data tooling, they can enable organisations to remove unnecessary data centre tools and reduce sprawl, saving valuable IT spend for more innovation.  

The old sustainability mantra, “reduce, reuse, recycle”, reminds us that the first and best way to save resources is to not use them in the first place. Deep observability and better data management empowers enterprises to do just this. By applying a more strategic, considered approach, decision-makers can benefit their businesses’ budgets, network uptime, and even the planet.  
