It is projected that cloud computing will account for 13% of the world's electricity consumption by 2030. One prediction is that, in the future, computing power won't be the bottleneck or the parameter to optimize; energy consumption will be. Perhaps that will even become a benchmark for developing state-of-the-art AI/ML algorithms. With chips going down to 5 nm and server-grade hardware pushing its limits as well, the performance bottleneck will soon be insignificant compared to the energy tax each iteration of an algorithm takes.
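The idea of energy as a benchmark can be made concrete with a back-of-the-envelope metric: joules per iteration, i.e. average power draw times wall-clock time. Below is a minimal sketch of that idea, assuming a fixed average power figure; the `ASSUMED_POWER_WATTS` constant is hypothetical, and a real benchmark would read actual power from a meter or an interface such as Intel RAPL.

```python
import time

# Hypothetical average power draw of the machine while computing, in watts.
# A real benchmark would measure this (e.g. via a power meter or RAPL).
ASSUMED_POWER_WATTS = 150.0

def energy_per_iteration(func, iterations=1000):
    """Estimate the energy 'tax' of one iteration of an algorithm:
    joules = assumed power (W) x measured wall-clock time (s)."""
    start = time.perf_counter()
    for _ in range(iterations):
        func()
    elapsed = time.perf_counter() - start
    return ASSUMED_POWER_WATTS * elapsed / iterations

# Example: estimated energy cost of summing a million numbers once.
cost = energy_per_iteration(lambda: sum(range(1_000_000)), iterations=10)
print(f"~{cost:.4f} J per iteration")
```

Under this lens, two algorithms with similar runtimes could still be ranked by the energy each iteration consumes on a given machine.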
Energy will become the major operating cost for data centers and cloud servers, and it will be the one parameter that can make companies like AWS run leaner on a daily basis. Two possibilities, then. The first is a breakthrough in how we fundamentally store, retrieve, and erase data on a physical medium, or in how we carry out computations; the latter would prove more useful, as it is the more energy-heavy of the two. Logic-in-memory is a hybrid approach that combines both aspects and can save energy. The second possibility is that we figure out innovative ways to counter the energy problem itself. Microsoft's Project Natick has claimed that underwater data centers are a viable option.