Energy Consumption through AI: A new Challenge
The increasing energy requirements of AI pose major challenges for data centers. In our blog post, we shed light on how sustainable solutions and new technologies can help to minimize the ecological footprint and increase efficiency.
In our last blog post “Sustainability in Data Centers: A must in the Age of AI“, we highlighted the importance of sustainable practices in data centers. Today, we would like to take a closer look at a pressing issue that is becoming increasingly important in the world of artificial intelligence (AI): the rapid increase in energy consumption by AI systems.
Rapid Rise in Energy Consumption
With the exponential growth of AI technology, energy requirements are also increasing rapidly. Large technology companies are investing billions in AI accelerators quarter after quarter, leading to a surge in power consumption in data centers. In particular, the rise of generative AI and the increasing demand for graphics processing units (GPUs) have led to data centers having to scale from tens of thousands to over 100,000 accelerators.
Energy Requirement per Chip grows
The latest generations of AI accelerators launched by Nvidia, AMD and soon Intel have brought a significant increase in energy consumption per chip. Nvidia's A100, for example, has a maximum power consumption of 250W in the PCIe variant and 400W in the SXM variant. Its successor, the H100, consumes up to 75 percent more, with a peak power draw of up to 700W. This development shows that although each new generation is more powerful, it also requires more energy.
Challenges and Solutions
As energy consumption continues to rise with each new generation of GPUs, data centers are faced with the challenge of meeting this demand efficiently. This is where innovative cooling technologies such as liquid cooling come into play, enabling effective heat dissipation while maintaining high power density.
An important step in overcoming this challenge is the increased use of renewable energy sources. In addition, leading chip manufacturers such as Taiwan Semiconductor (TSMC) are working to improve the energy efficiency of their products. TSMC's latest manufacturing processes, such as the 3nm and the future 2nm process, promise to significantly reduce energy consumption while increasing performance.
Forecasts show that the energy requirements of AI will continue to increase in the coming years. Morgan Stanley estimates that the global energy consumption of data centers will rise to around 46 TWh in 2024, which is already a threefold increase compared to 2023. Other forecasts assume that data centers could account for up to 25 percent of total electricity consumption in the USA by 2030.
Conclusion
The rapid development of AI technology brings with it enormous challenges, particularly in terms of energy consumption. As a data center operator, we see it as our duty to promote and implement sustainable solutions. However, the gigantic challenges of the AI age can only be overcome together - by the united IT industry, from AI developers to chip manufacturers and data centers.
Source: Forbes
Sustainability in Data Centers: A must in the Age of AI
In the age of AI, sustainability is becoming increasingly important for data centers. Despite the enormous energy consumption of AI chips, operators must assume ecological responsibility. Now more than ever.
Nvidia’s rapid development cannot be overlooked. The media is full of reports about the technology company’s phenomenal growth, driven by the high demand for its powerful AI chips. Nvidia’s graphics cards (GPUs) are the backbone of many AI applications and have recently made it the third most valuable company in the world.
For operators of data centers, which are the digital backbone for AI and today’s economy, this is an exciting time. But let’s not be fooled: AI chips come at a high environmental cost, as they consume enormous amounts of energy and water.
It may seem that sustainability concerns are taking a back seat to the AI boom. But the opposite is true. For our industry, sustainable business practices are more important than ever and will only become more urgent in the coming years.
The environmental Costs of Data Centers
The figures speak for themselves: in the USA alone, the energy consumption of data centers could reach 35 gigawatts by 2030 - enough to power around 26 million households. Globally, AI servers could consume as much energy as Argentina, the Netherlands or Sweden by 2027.
As an industry, we must be under no illusions about the environmental costs of meeting this unprecedented demand. We need to focus all the more on sustainability - and data center operators who are not yet on board should urgently embark on the sustainable path as well.
History of Sustainability in Data Centers
The data center industry is no newcomer to sustainability. We have long had metrics for tracking energy and water consumption. Power Usage Effectiveness (PUE), which measures the energy consumption of a data center, was introduced in 2006. Water Usage Effectiveness (WUE) followed in 2011. These metrics encourage data centers to use resources more efficiently and at the same time offer the opportunity to save costs. Our own PUE value, for example, is 1.08, just above the ideal value of 1.0 - which means that we use almost all of our energy directly for our IT equipment and are therefore highly efficient in our use of this valuable resource.
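To make these figures more tangible, here is a minimal sketch of how the two metrics are calculated; the energy and water values are invented for illustration and are not centron measurements:

```python
# Minimal sketch: how PUE (and, analogously, WUE) is calculated.
# The figures below are made-up illustration values, not centron measurements.

def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power Usage Effectiveness = total facility energy / IT equipment energy."""
    return total_facility_energy_kwh / it_equipment_energy_kwh

def wue(annual_water_liters: float, it_equipment_energy_kwh: float) -> float:
    """Water Usage Effectiveness = annual water use (liters) / IT equipment energy (kWh)."""
    return annual_water_liters / it_equipment_energy_kwh

# Example: 1,080,000 kWh drawn by the whole facility, 1,000,000 kWh of it by IT equipment
print(pue(1_080_000, 1_000_000))  # 1.08 -> only ~8% overhead for cooling, lighting etc.
```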
However, reducing energy and water consumption are only two ways to achieve more sustainable data centers. In addition to operational processes, building with lower CO2 emissions can make the biggest difference - at least in the case of new facilities or upcoming renovations. In this area, we are also seeing more financing linked to sustainable outcomes.
Encouragingly, data center operators are increasingly turning to renewable energy. In the USA, over 40 gigawatts of wind and solar energy are already in use. We ourselves already operate our data center, including air conditioning, lighting and connected office space, exclusively with green electricity.
Innovative technologies also play a major role. Our sustainability page gives you an insight into the technologies we currently use in order to operate as sustainably as possible.
Sustainability: a Win for Everyone
Of course, sustainable efforts not only have a positive impact on the environmental footprint of data center operators. At a time when sustainability awareness is (thankfully!) on the rise, a strong commitment to sustainability can also provide a huge competitive advantage and have a promising impact on corporate image.
At the same time, there are exciting opportunities for innovation as our industry seeks a more sustainable path. Because AI chips require so much energy to operate and cool, the technological breakthrough that reduces their energy consumption will have a multiplier effect. Even a reduction in energy consumption of just 10 percent would be a huge saving.
Ultimately, however, making data centers more sustainable is a win-win for everyone - the providers, the businesses they serve and the environment. Our industry may not have all the answers yet, but together we will surely find a better way.
Source: Forbes
Snapshots remain free of Charge for centron Customers
centron customers can breathe a sigh of relief: snapshots will continue to be free of charge to ensure maximum reliability without additional costs.
In the world of web hosting and cloud services, news has recently caused a stir: Hetzner, one of Germany’s leading providers of web hosting and data center services, has changed its billing model for snapshots. Since the end of May, existing customers have had to pay for their snapshots - a significant departure from the previous practice, under which snapshots had been free of charge for five years.
Hetzner's new Invoicing Strategy
Hetzner is introducing a new billing model under which products previously charged at monthly rates will in future be billed by the hour. This is intended to distribute costs more precisely and fairly. However, one crucial detail was missing from the original announcements: existing customers who previously had a free allowance of 1,800 GB per month for cloud snapshots now have to pay for these snapshots as of May 31, 2024. According to Hetzner, this only affects a small number of customers, who were informed via a separate email.
(Source: Golem.de)
Comparison with other Providers
However, Hetzner is not the first provider to charge for snapshots. Amazon Web Services (AWS) and Microsoft Azure have long since established a fee-based model for their snapshot technologies. With AWS, snapshots are stored incrementally, which means that only the changed data blocks are backed up in order to save costs. Azure uses a similar technology, with costs based on unique blocks and pages.
(Source: NetApp)
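To illustrate the incremental approach described above, here is a simplified sketch of the underlying idea, assuming a volume is divided into fixed-size blocks; it is a conceptual model, not the actual AWS or Azure implementation:

```python
# Illustrative sketch of the incremental-snapshot idea: only blocks that changed
# since the last snapshot are stored again. Simplified model for illustration only.
from typing import Dict

def take_incremental_snapshot(volume: Dict[int, bytes],
                              previous: Dict[int, bytes]) -> Dict[int, bytes]:
    """Return only the blocks that differ from the previous snapshot."""
    return {
        block_id: data
        for block_id, data in volume.items()
        if previous.get(block_id) != data
    }

# Example: a 4-block volume where only block 2 changed
previous = {0: b"aaaa", 1: b"bbbb", 2: b"cccc", 3: b"dddd"}
current  = {0: b"aaaa", 1: b"bbbb", 2: b"XXXX", 3: b"dddd"}
delta = take_incremental_snapshot(current, previous)
print(delta)  # {2: b'XXXX'} -> only the changed block needs to be stored
```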
Good News for centron Customers
centron customers, however, have no reason to worry: at centron, snapshots will remain free of charge. Snapshots are considered crucial for reliability, which is why they are made available to customers at no additional cost. After all, data security and customer satisfaction are centron's top priorities - and the decision to keep snapshots free of charge underlines this commitment.
Free snapshots are a significant advantage, especially for companies that depend on reliable and free backup solutions. This enables data to be restored quickly in the event of corruption, infection or accidental deletion without incurring additional costs.
You can find more information about the snapshot service from centron here.
Data Centers: The Key to Digital Transformation
The “Data Center Impact Report Germany 2024” highlights the central role of data centers in the digital transformation. Find out how centron is contributing to this development with sustainable and secure IT infrastructures.
Germany is at a turning point: digitization is progressing steadily, and the demand for IT computing power has increased tenfold since 2010. High-performance data centers are the backbone of this development. The “Data Center Impact Report Germany 2024” by the German Datacenter Association (GDA) underlines the central role of data centers in this process. We at centron would like to take the publication of this study as an opportunity to present our contribution to this development in concrete terms.
Secure and reliable Operation: A Must for Digital Sovereignty
In a world where almost every application relies on digital infrastructure - from smartphone apps to critical infrastructure such as hospitals and financial services - the highly available and fail-safe operation of data centers is essential. At centron, we attach great importance to offering our customers precisely this security and reliability. Our data centers meet the highest security standards and guarantee compliance with German and European data protection laws to ensure the data sovereignty of our customers.
Sustainability as the Core of Our Actions
Another key aspect of the report is the role of data centers in promoting sustainability. Digitalization contributes significantly to the reduction of CO2 emissions by replacing analog processes and enabling more efficient solutions. At centron, we are proud to be pioneers in the use of renewable energy. Our data centers obtain the majority of their electricity from renewable sources. Across Germany, 88% of the electricity consumed by colocation data centers currently comes from renewable sources. At centron, we also rely on advanced cooling technologies to continuously improve our energy efficiency.
Growth and economic Contribution
The Data Center Impact Report Germany 2024 further states that the data center industry creates significant economic value and contributes to Germany's digital sovereignty and economic resilience. The demand for cloud services, big data analytics and AI technologies continues to drive growth. At centron, we are continuously investing in the expansion of our infrastructure to meet this demand and make our contribution to the digital transformation. The IT capacity of colocation data centers in Germany is expected to increase from the current 1.3 GW to 3.3 GW by 2029. This is also reflected in significant investments: according to the forecast, around EUR 24 billion will be invested in the expansion of colocation capacities by 2029.
Challenges and Opportunities
Despite the positive developments, the industry faces considerable challenges such as high electricity costs, a shortage of skilled workers and complex regulatory requirements. centron is actively committed to finding solutions, for example by promoting training and further education in the IT sector. The shortage of skilled workers is not just one of the biggest challenges in our eyes: in the Data Center Impact Report Germany 2024, 65% of the companies surveyed outside the Frankfurt am Main metropolitan region cited it as the biggest hurdle.
The Future: Sustainable and Regional Development
The GDA study impressively shows how important efficient and sustainable data centers are for Germany's digital future. Data centers are increasingly being recognized as drivers of regional development. The establishment of a data center brings numerous advantages, from fiber optic connections to the creation of new jobs and the use of waste heat for municipal heat supply. 28% of the colocation operators surveyed already reuse their waste heat, and a further 31% plan to invest in such technologies.
At centron, we see ourselves as an integral part of this development and are working to make our data centers even more sustainable and efficient. We are actively committed to promoting a sustainable digital infrastructure.
centron - your partner for a sustainable digital future!
Exchange Server Update: New Features and Licenses 2025
Microsoft's Exchange Server is changing. From 2025, there will be important updates and a new subscription model. We show you what the Subscription Edition will bring and how you can prepare for it.
Microsoft has updated its roadmap for the development of Exchange Server, which brings with it numerous innovations and a change to the licensing model from the end of 2025. The focus is on the new Subscription Edition (SE), which is the direct successor to the current Exchange Server 2019.
New Licensing: Subscription Edition from 2025
The introduction of SE from the third quarter of 2025 marks an important turning point: users must have a suitable subscription license or an active Software Assurance contract as part of volume licensing. This change follows the model of the SharePoint Server Subscription Edition and is part of Microsoft's Modern Lifecycle Policy, under which the product is updated continuously as required.
Technical Innovations and Updates
With CU15 for version 2019, which will be released later this year, Exchange Server will receive support for TLS 1.3 and regain certificate management in the Exchange Admin Center (EAC). These changes will allow administrators to work with certificates more efficiently: requesting new certificates, finalizing received certificates, and exporting and importing PFX files.
It is also interesting to note that the CU15 removes support for the Unified Communications Managed API 6.0 and the instant messaging feature in the web version of Outlook, indicating the prioritization of newer technologies.
The Switch to Kerberos and other important Changes
Shortly after the introduction of the SE, with CU1 in October 2025, Kerberos will be introduced as the standard protocol for server-to-server communication and will replace NTLMv2. This update will also introduce a new Admin API and remove Remote PowerShell, which was already discontinued at the end of 2022 for security reasons.
Strategy for the Upgrade
Microsoft sets out the recommended upgrade path in detail in its roadmap: users should ideally upgrade to version 2019 CU14 with Windows Server 2022 now, before switching to Exchange Server 2019 CU15 once the new Windows Server 2025 operating system is released. The direct switch to SE then takes place from CU14 or CU15. For Exchange 2016 users, there is no direct upgrade path to SE, which makes an earlier migration to the 2019 version necessary.
Conclusion
The upcoming changes to Exchange Server not only mean technical updates for users, but also a significant change in licensing. The new Subscription Edition promises continuous updates and adaptations to modern technologies, but also requires a switch to the subscription model, which could pose a challenge for some organizations. The switch should therefore be planned well in advance to ensure a seamless transition.
Source: heise online
API Strategies for sustainable Success
In the world of software development, APIs are essential building blocks. This blog post will guide you through the best practices of API design, from the correct use of HTTP methods to efficient data handling.
The development of APIs (Application Programming Interfaces) is a central challenge in software development. In order to create a powerful, maintainable and user-friendly API, there are best practices that both newcomers and experienced developers should follow.
Basic Principles of API Design
1. Using HTTP Methods correctly:
GET to read data.
POST to create resources.
PUT to update existing resources.
DELETE to delete resources.
Other methods such as PATCH, OPTIONS and HEAD should be used according to their specific use cases.
2. Descriptive URIs:
URIs (Uniform Resource Identifiers) should be descriptive and represent resources, not actions. Example: `/users` for user resources or `/products` for product resources.
3. Naming Resources with Nouns:
Plural nouns are standard, e.g. `/users`, `/products`.
4. Introduce Versioning:
By inserting the API version in the URI, e.g. `/api/v1/users`, changes can be implemented without affecting existing clients.
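As a rough sketch of how principles 1 to 4 can look in practice, the following example uses Python with the Flask framework; the framework choice and the in-memory store are illustrative assumptions, not a prescribed stack:

```python
# Minimal sketch of principles 1-4: plural noun resources, versioned URIs,
# and one HTTP method per operation. Flask and the in-memory dict are
# illustrative choices only.
from flask import Flask, jsonify, request

app = Flask(__name__)
users = {}          # in-memory store, purely for illustration
next_id = 1

@app.get("/api/v1/users")                   # GET: read data
def list_users():
    return jsonify(list(users.values()))

@app.post("/api/v1/users")                  # POST: create a resource
def create_user():
    global next_id
    user = {"id": next_id, **request.get_json()}
    users[next_id] = user
    next_id += 1
    return jsonify(user), 201

@app.put("/api/v1/users/<int:user_id>")     # PUT: update an existing resource
def update_user(user_id):
    users[user_id] = {"id": user_id, **request.get_json()}
    return jsonify(users[user_id])

@app.delete("/api/v1/users/<int:user_id>")  # DELETE: delete a resource
def delete_user(user_id):
    users.pop(user_id, None)
    return "", 204
```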
Efficient Data Management
Using HTTP status codes correctly: Suitable codes such as 200 OK, 201 Created, and 500 Internal Server Error signal the result of an API operation.
JSON as a data exchange format: JSON is lightweight, easy to parse and widely used.
Use HTTP headers: These are used to transfer metadata and control caching, authentication and content type.
Standardized response format: Consistent structures for success and error responses facilitate parsing by clients.
Pagination for large data sets: Pagination should be implemented to improve performance and reduce the load on the client and server.
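The following sketch combines several of these points - correct status codes, JSON, a consistent response envelope and pagination; the field names "data", "error" and "meta" are an illustrative convention, not a fixed standard:

```python
# Sketch of a consistent JSON response envelope with pagination.
# The envelope fields ("data", "error", "meta") are an illustrative convention.
from flask import Flask, jsonify, request

app = Flask(__name__)
PRODUCTS = [{"id": i, "name": f"product-{i}"} for i in range(1, 101)]

@app.get("/api/v1/products")
def list_products():
    page = request.args.get("page", default=1, type=int)
    per_page = request.args.get("per_page", default=10, type=int)
    if page < 1 or per_page < 1:
        # appropriate status code plus a uniform error structure
        return jsonify({"data": None,
                        "error": "page and per_page must be positive"}), 400
    start = (page - 1) * per_page
    items = PRODUCTS[start:start + per_page]
    return jsonify({
        "data": items,
        "error": None,
        "meta": {"page": page, "per_page": per_page, "total": len(PRODUCTS)},
    }), 200
```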
Security and Documentation
Authentication and authorization: Mechanisms such as OAuth and JWT (JSON Web Tokens) secure the API. Authorization mechanisms regulate access based on user roles and authorizations.
Error handling: Informative error messages and appropriate HTTP status codes are essential.
Comprehensive documentation: Tools such as Swagger or Redocly support the documentation of the API, including endpoints, request/response formats and authentication mechanisms.
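As a hedged example of token-based protection combined with informative error handling, the following sketch uses the PyJWT library; the secret, the claim layout and the scope name are placeholders for illustration only:

```python
# Sketch: protecting an endpoint with a JWT (PyJWT) and returning informative
# errors with matching status codes. SECRET_KEY, the claim layout and the
# "orders:read" scope are placeholders, not a real configuration.
import jwt
from flask import Flask, jsonify, request

app = Flask(__name__)
SECRET_KEY = "change-me"

@app.get("/api/v1/orders")
def list_orders():
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return jsonify({"error": "missing bearer token"}), 401
    try:
        claims = jwt.decode(auth.removeprefix("Bearer "),
                            SECRET_KEY, algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return jsonify({"error": "invalid or expired token"}), 401
    if "orders:read" not in claims.get("scopes", []):
        return jsonify({"error": "insufficient permissions"}), 403
    return jsonify({"orders": [], "user": claims.get("sub")}), 200
```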
Testing and increasing productivity
API testing: Thorough testing of the API in positive and negative scenarios is essential to ensure robustness.
Fast API development with low-code tools: Tools such as Linx enable rapid development thanks to ready-made specifications and drag-and-drop interfaces.
Conclusion
Adhering to these guidelines and using suitable tools enables the development of reliable APIs that are not only functional but also future-proof. Although technologies evolve, the basic principles of API development remain constant and form the foundation for successful software projects.
Monolith vs. Microservice: Which is the better Fit?
Developers are often faced with the choice between modular monoliths and microservices. This blog post compares the two approaches and helps you to find the best solution for your project.
In the world of software development, the topic of system architecture is of crucial importance. The discussion about modular monoliths and microservices is particularly intense. But which type of architecture is better? The answer is complex and depends heavily on the respective use case.
The Fascination for Microservices
Microservices are ubiquitous in the IT industry and are often seen as the modern standard for scalable and robust software solutions. Thanks to technologies such as cloud solutions and containerization, especially through Docker and Kubernetes, microservices have gained popularity. This architecture promises better scalability and flexibility by dividing large applications into smaller, independently manageable services. However, microservices also come with challenges:
- Complex service communication and API management
- Increased resource requirements
- Difficulties with debugging
- Potentially high costs
Many companies implement microservices without fully understanding the challenges involved, which can lead to inefficient and expensive solutions.
The modular Monolith as an Alternative
In contrast to this is the modular monolith, an architecture that offers advantages for small to medium-sized projects in particular. A modular monolith consists of independent modules, each of which contains everything necessary for its functionality and has clearly defined interfaces. This structure offers several advantages:
- Reduced complexity compared to microservices
- Easier refactoring and maintenance
- Lower operating costs
Especially when it comes to the development of simple CRUD systems or MVPs, a modular monolith often proves to be sufficient. Modularity also makes it possible to convert individual modules into microservices if required as soon as the system and requirements grow.
Domain-Driven Design and modular Monoliths
The integration of Domain-Driven Design (DDD) into a modular monolith can further improve its structure. DDD focuses on developing software around the core processes and rules of a specific business area. In a modular monolith, each module can correspond to a so-called "bounded context", which clearly structures the business logic and facilitates future extensions.
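A minimal sketch of this idea, assuming two invented bounded contexts ("orders" and "invoicing") living as modules inside one deployable application:

```python
# Sketch: two bounded contexts as modules of a single deployable application.
# Each module exposes a narrow interface; the domain names and functions are
# invented purely for illustration. In a real project these would be separate
# packages rather than sections of one file.
from dataclasses import dataclass

# --- orders module (bounded context 1) --------------------------------------
@dataclass
class Order:
    order_id: int
    amount_eur: float

def place_order(order_id: int, amount_eur: float) -> Order:
    order = Order(order_id, amount_eur)
    # the orders context talks to invoicing only via its public function
    create_invoice(order.order_id, order.amount_eur)
    return order

# --- invoicing module (bounded context 2) -----------------------------------
_invoices: dict[int, float] = {}

def create_invoice(order_id: int, amount_eur: float) -> None:
    _invoices[order_id] = amount_eur

# One process, one deployment - but clear seams along which individual modules
# could later be extracted into microservices if the system grows.
place_order(1, 49.90)
print(_invoices)  # {1: 49.9}
```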
Conclusion
Although microservices have their place in large-scale projects such as Netflix or Shopify, a modular monolith is often the better choice for most small to medium-sized companies. This architecture offers a balanced combination of modularity, simpler management and cost efficiency. The key is that each company chooses the right architecture form based on its specific requirements and resources.
Citizen Development: Innovation meets Demographics
Overcome demographic change by empowering your workforce to develop their own digital solutions. Our blog shows how Citizen Development can promote collaboration between generations and boost innovation.
In a world where the half-life of knowledge is rapidly decreasing and technological innovations are constantly changing the way we work, today's workforce is facing an unprecedented challenge: demographic change. With an ageing workforce on the one hand and a generation of digital natives on the other, companies are faced with the task of uniting these different talents and skills. This is where the concept of Citizen Development comes into play - an initiative that aims to empower tech-savvy business users to develop their own solutions using low-code platforms to drive digital transformation.
Seize the Opportunities
Inclusion through Innovation: Citizen Development offers a unique opportunity to utilize cross-generational knowledge and skills. Older employees bring valuable experience and a deep understanding of business processes, while younger colleagues use their technical affinity to develop agile solutions. This synergy not only promotes innovation, but also inclusion within the company.
Lifelong Learning as a Standard: At a time when lifelong learning is essential, Citizen Development provides a platform for all employees to learn new skills and continuously develop. This helps to bridge the digital divide between generations and create a culture of continuous improvement.
Meeting the Challenges
Ensuring Quality and Safety: Despite the positive aspects, citizen development also brings challenges, particularly in terms of software quality and the potential increase in shadow IT. Companies must ensure that the solutions developed not only meet the requirements of the specialist departments, but can also be integrated into the existing IT infrastructure and meet high security standards.
Targeted Training and Development: The successful implementation of Citizen Development requires tailored training programs that are adapted to the different prior knowledge and learning styles of employees. Older members of the workforce in particular, who may have less experience with digital technologies, need support to navigate the new environment with confidence.
Conclusion and Reading Recommendation
Citizen Development not only stands for the democratization of technology development within a company, but also symbolizes a cultural shift towards more agility, inclusion and lifelong learning. While demographic change undoubtedly brings challenges, Citizen Development offers a promising opportunity to overcome them and bridge the gap between generations. By fostering collaboration, accelerating digital transformation and reducing the burden on IT departments, Citizen Development can help future-proof organizations. The journey may be fraught with challenges, but the benefits of such cultural transformation are immeasurable.
We recommend the Study "IT Trends 2024" by Capgemini to anyone who would like to delve deeper into the topic. This comprehensive study not only sheds light on the concept of citizen development, but also offers valuable insights into other current trends and challenges in the IT sector. The study can serve as a guide for managers and IT professionals to fully exploit the potential of Citizen Development and create sustainable, agile and inclusive workplaces.
China strives for technological Independence
China is taking a new approach to its technology policy: Foreign hardware and software, including products from AMD, Intel and Microsoft, are being banned from government computers.
In an extraordinary move to strengthen the domestic technology industry and increase data security, the Chinese government has decided to ban foreign hardware and software from government computers. This particularly affects products from world-renowned giants such as AMD, Intel and Microsoft. The aim of this measure is, on the one hand, to reduce dependence on foreign technology. At the same time, local manufacturers are to be promoted.
Background of the Ban
A report by the Financial Times reveals: China is turning to its own secure technologies. This far-reaching step does not only include the work computers of civil servants. It also extends to the servers used by government departments above the municipal level. The message behind this is clear: Strengthening national security and technology independence.
Impact on the global Market
This decision could have a significant impact on the global technology market. AMD, Intel and Microsoft are among the world's leading providers of hardware and software. Exclusion from a market as large as China could not only lead to financial losses, but also shift the balance of power in the global technology industry.
Lists of permitted Technology
To facilitate the implementation of this new directive, lists of CPUs, operating systems and centralized databases classified as "secure and reliable" have been published, as reported by Reuters. Interestingly, all of the technologies listed are from Chinese companies. This underlines China's ambitions to turn its local technology industry into a global power.
Reactions
In response to inquiries from international news agencies such as Reuters and the Financial Times, the Chinese government and the hardware manufacturers concerned have not yet commented. This lack of communication raises questions about the potential long-term impact of this decision on international business and diplomatic relations.
Conclusion
China's decision to switch to domestic technologies marks a significant step towards technological self-reliance. It could serve as a model for other countries looking to strengthen their national security and local industries. It remains to be seen how this decision will affect the companies concerned and the global technology landscape.
Source: heise online
Cloud Reorganization: KKR acquires VMware's EUC Business
KKR acquires VMware's EUC division for four billion dollars. This strategic acquisition not only signals a new direction for VMware, but could also have a lasting impact on the cloud computing landscape.
Investment firm KKR has caused quite a stir in the cloud computing industry by acquiring VMware's End-User Computing (EUC) division for around four billion dollars. This sale is part of a larger restructuring process at VMware after the company was acquired by Broadcom. Broadcom's decision to divest parts of VMware that do not fit with its core strategy raises questions. What will the future of the software vendor and its position in the market look like?
VMware's EUC Division - New Era under KKR
VMware's EUC division, known for its advanced digital workspace solutions such as Horizon and Workspace ONE, is at the center of this transaction. These tools are essential for organizations that want to manage their applications, desktops and data seamlessly and securely across multiple devices and platforms. Under the leadership of Shankar Iyer, the EUC division will continue to operate as an independent company. The focus will be on supporting customer relationships through significant investment.
Industry Consequences of Restructuring
This development could have far-reaching consequences for the industry. Analysts are predicting a possible mass exodus of VMware customers, triggered by the takeover by Broadcom and the resulting changes. These concerns are not unfounded, as Broadcom has already started to reorganize the acquired divisions and has discontinued some products.
KKR's Strategic Moves in the Technology Market
KKR is no newcomer to the software business. Its successful acquisitions, such as that of BMC in 2018 and the subsequent acquisition of Compuware by BMC, demonstrate its expertise and ambitions in the technology sector. With the acquisition of VMware's EUC division, KKR could further strengthen its position in the cloud computing market and open up new opportunities for innovation and growth.
Impact on the Cloud Computing Landscape
The question that now arises is how these changes will affect the landscape of cloud computing. As the industry continues to grow and evolve rapidly, such strategic moves could be critical to meeting the needs of modern businesses while setting new standards for the delivery of digital workplace solutions.