Environment Variables
The Week in Green Software: Transparency in Emissions Reporting
February 27, 2025
For this episode of TWiGS, Chris and Asim discuss the latest developments in emissions reporting, AI energy efficiency, and green software initiatives. They explore the AI Energy Score project by Hugging Face, which aims to provide an efficiency benchmark for AI models, and compare it with other emissions measurement approaches, including the Software Carbon Intensity (SCI) for AI. The conversation also touches on key policy shifts, such as the U.S. executive order on AI data center energy sourcing, and the growing debate on regulating the data center industry. Plus, they dive into the Beginner's Guide to Power and Energy Measurement for Computing and Machine Learning, a must-read for anyone looking to understand energy efficiency in AI.

Learn more about our people:

Find out more about the GSF:

News:

Events:

Resources:

If you enjoyed this episode then please either:


TRANSCRIPT BELOW:

Asim Hussain:
There's this assumption out there that we're trying to hunt for the right, true essentialist value of measurement, and it really isn't like that 

Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.

I'm your host, Chris Adams. 

Hello, and welcome to another edition of This Week in Green Software. I'm your host, Chris Adams. Today, we're tackling an ongoing conversation in software: predicting, measuring, and accurately reporting emissions data, particularly in AI. And as AI adoption skyrockets, so does its energy footprint, putting pressure on data infrastructure and sustainability goals.

So today we'll be looking at a few new reports, what's going on, and generally doing a kind of roundup of the news and recent events along these lines. Because it's not all doom and gloom, although there is some. I'm also joined today by my friend and frequent collaborator, Asim Hussain.

Asim, can I give you some space to introduce yourself before we do our weekly, well, semi weekly, news roundup?

Asim Hussain: Not so weekly, anymore. Yeah. Hi. I'm Asim Hussain. I'm the Executive Director of the Green Software Foundation. So we are a standards organization and our mission is a future where software has zero harmful environmental impacts. And you might not be surprised to hear that we believe one of the best paths forwards is developing standards through consensus of multiple organizations.

Because through setting those standards, you can direct billions of dollars into the right places. And if you do it wrong, you can direct billions of dollars into the wrong places. So let's do it right.

Chris Adams: Okay. Thank you for that Asim. If you're new to this podcast, my name is Chris. I'm the director of technology and policy now at the Green Web Foundation, which is not the same as the Green Software Foundation. It's a small Dutch nonprofit, although we are members, founding members of the Green Software Foundation, along with a number of other much, much larger technology giants.

And I'm the host of this podcast and I'll also be doing my best to compile all the links and stories that we have so that if there's anything that has caught your interest as you listen to this, possibly whilst you're washing your dishes, you've got something to follow up with later. Alright!

Asim Hussain: Is it time for my yearly apology for naming it the Green Software Foundation and causing this constant confusion? 

Chris Adams: I think it might be, but sometimes it works in our favor as well, because when people speak to us, like a scrappy startup, a scrappy kind of wacky little non profit, then they say, "oh, we've heard a bunch about you folks. Oh, we thought you were bigger," you know, so it opens interesting doors. I have had the odd conversation where people thought I was the Green Software Foundation.

Yeah. So this is,

Asim Hussain: Yeah, let's wear the hats that benefit us at any given moment. 

Chris Adams: Pretty much, yeah. So this is what we're going to have and I think that we are doomed to have this mix up and the fact that we are speaking to each other on a regular basis probably doesn't help us, actually. Maybe we should, I don't know, have some big dramatic fallout or something.

Asim Hussain: Oh yeah, let's do like a fake fallout on the internet, yeah.

Chris Adams: We're not that keen for engagement, are we, mate? Let's not do that, alright? Okay. So, I was going to ask if you're sitting comfortably, Asim, but I can see that you're on a standing desk, so I think you're now standing comfortably, presumably, right?

Asim Hussain: At attention.

Chris Adams: All right, well in that case, shall we start and look at the first story and then see where we go from there?

All right, so the first story that's kind of shown up on the radar is the AI Energy Score from Hugging Space. Sorry, Hugging Face, not hugging space, god. Yeah, so this is, this is actually essentially a project that is being spearheaded by folks at Hugging Face, but with also involvement from companies you've heard of like Salesforce and so on, to essentially work out something that might be a little bit like an Energy Star for AI.

Now, you probably, it's probably not called Energy Star because Energy Star is a trademark, but the general idea is, essentially, if we're going to have various AI models and things, then we should be thinking about them being efficient, and there are tools available to make this possible, actually. Asim, I know you had a chance to look at some of this, and you've had quite a few conversations with Boris Gamazaychikov at Salesforce.

He's one of the AI leads there. I'm mentioning Boris because he's quite involved in the GSF. There are lots of other people involved with the Hugging Face project, but Boris is the person who we know, so that's why he gets named.

Asim Hussain: He's not, so just to be clear, he's not a member, Salesforce is not a member of the Green Software Foundation. But yeah, I've just been chatting to Boris, obviously, because one of the things we try and do is chat to everybody who's doing something in the AI measurement space, so that we can at least try and coordinate and have, like, a common voice.

That's kind of one of the things that we've been doing. Yeah

Chris Adams: Cool, and if I understand it correctly, we'll share a link to both the GitHub.io, the kind of public facing site with all this information about how the Energy Score project is working, plus the leaderboard, which has various closed and open source models. It's actually showing how efficient they are at performing particular tasks.

We'll also share a link to the GitHub repo, which actually shows how it's made because it's using tools that you may have heard of if you've ever messed around with AI models yourself. So it's using Code Carbon, which is pretty much the default tool that people use to work out the environmental footprint of a training run or anything like that.
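
For anyone who wants to see what that looks like in practice, here is a minimal sketch of wrapping a training run with CodeCarbon's EmissionsTracker. The training function and project name here are placeholders; only the tracker usage reflects the library.

```python
# Minimal sketch: using CodeCarbon to estimate the footprint of a training run.
# The "training" here is a placeholder workload; only the tracker calls matter.
from codecarbon import EmissionsTracker

def train_model():
    # stand-in for a real training loop
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2e for the tracked span

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")
```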

And I believe the other tool is Optimum, or Optimum Benchmark, I can never remember, but these two

Asim Hussain: Is that the actual benchmark tool? That's the thing that actually runs the benchmark, yeah.

Chris Adams: Exactly. So this is not like wacky stuff. This is stuff that you probably should have heard of, or that you are likely to come across. And there is actually a Docker container for people who aren't able to publish their entire open models, with the idea being that you can run some of this

behind the firewall, as it were, so you can then share some of the numbers back. And, Asim, while I've got you, I wanted to ask you about this, because I've been kind of tracking the AI Energy Score project for a few months, but I know there was some work inside the GSF to create a Software Carbon Intensity for AI

Asim Hussain: Oh yeah.

Chris Adams: these aren't competing, but they do overlap, and maybe you could actually share a little bit more to explain what these two things are, or even what is this SCI for AI in this context.

Asim Hussain: And there's also others as well. So, we're talking, Sir Joseph, the head of R&D, is also sitting in with meetings at the ITU, the International Telecommunication Union, and so they're working on work themselves. There's EcoLogits from Samuel Rincé. There's other ones as well. And I probably just want to preface this by saying something, and I'm going to try and put some words to these thoughts.

I've internalized a lot of how I think about measurement and through conversations with others, I just want to make sure, 

I want to try and get my point across, which is there isn't one true way of measuring everything. It's not like there's one winner and one loser. 

What it is, is that different measurement systems have different trade offs. They incentivize certain things, they disincentivize other things, they have broader scopes and narrower scopes. And one of the things I've realized is you, it's almost impossible to create a measurement system which ticks every single box. Like it's almost impossible to have a measurement system which has the ability to measure like a broad spectrum of stuff and yet still also be consistent and repeatable and all these other areas, all these trade offs.

So yeah, I love AI Energy Score, but there's also other ones as well. I just want to preface it by saying every single measure is designed for a particular audience and a particular problem. And I think that's kind of one of the ways I like to talk to people about it, because I do get concerned that,

there's this assumption out there that we're trying to hunt for the right, true essentialist value of measurement, and it really isn't like that,

so take all of my feedback on everything just with that context in mind. Yeah, I think that's kind of one of the ways that we look at it.

So what's really good about the AI Energy Score, do you want me to talk about it?

Chris Adams: please do. Yeah, 

Asim Hussain: I mean

Chris Adams: I'm listening for more, because I've got some things to share, but I haven't heard that much about this. And I know that the GSF had these workshops going on where people have been exploring this stuff. I haven't been in those, but I know you've been inside them.

And I suspect there've been some good, interesting conversations as a result.

Asim Hussain: I can't dive too deep into it, because we're still in progress and we had the agreement not to, you know, give too much information about in-progress stuff.

So if someone has a crazy idea, we're not going to publish it, and we'll allow people to have these private conversations. But I think there's some stuff I can share. One of the things that's come out from our conversations, one of the strongest feelings from the group, is for a measure that really has a broad scope for a lot of different AI systems, but also for the breadth of the AI life cycle as well.

So, you know, not just inference and also not just training, but also like, the model as it's deployed in an infrastructure. So it's an end to end computation that includes everything across the chain from edge devices all the way over to data preparation. And so there's various scores, so for instance, there's something called the green, the Green AI Index from the Responsible AI Institute, which is also another measure, and that kind of focuses on a pretty broad spectrum.

There's AI Energy Score, which is excellent because it is focusing on just the model itself. And so when you think of the life cycle, it's just focusing on the model. And they've done a great job of making it a type of measure which is consistent and repeatable. And they've done that by saying: you've got your model, here is the benchmark you run, you've got to run this benchmark. You also have to run it on this particular hardware, because you can't just get a better score by running on better hardware. You want to try and measure the model.

Like, you've got to turn variables into constants to kind of get some sort of measure from that perspective. And it's really interesting, related to the next thing I'm going to talk about, the beginner's guide, a report that's coming out, because I think they did a really good job of trying to summarize different types of measurements. They put it as: a system measurement is kind of very big picture, which is maybe where the SCI for AI is going to be. Then there are kind of job or application specific measurements, where you make more of those variables constants.

And then there's kind of what we call code measurements, which are: I want to measure, you know, the emissions of this piece of code. In order to do that, you really need to turn a lot of other variables into constants, so you can know, if you turned a for loop into a while loop, what the actual, like, impact would be.

And where I'd say AI Energy Score is, in terms of that taxonomy, it lands more on the code one. But I'm not saying that's a bad thing; I'm saying that is the only way you can get something that is consistent, where you can actually have a model that you can really give a score to. And it does incentivize a lot of things.

It incentivizes a lot of the almost code based patterns to improve model efficiency. But, because of the way it works, it won't incentivize other things. Like, it won't incentivize running compute in cleaner regions. Yeah, cause,

Chris Adams: different kinds of energy, or different cooling, for example, you're only looking at the, just the code part specifically.

Asim Hussain: And that's fine.

Yeah, that's fine. Because if you included that, then you wouldn't be able to have a measure that is going to tell you, okay, is Llama better than DeepSeek? They kind of just want to know that, so you need to turn these things into constants. So it's very good from that perspective. And I think it's one of the most advanced ones. It's the best one that does its job. It does do its job, and they admit this, by having, like, a narrow bandwidth.

Chris Adams: There's one card it uses, I think it's an NVIDIA H100. I'm, I believe it's that, but I'm not sure I would know an NVIDIA H100 if it was dropped on my feet, so I need to be very clear that I'm at the limits of my expertise when it comes to hardware there. Okay, and the other thing we should probably mention, though, that this was one of the projects that was announced at the AI Action Summit in Paris that happened earlier on, I believe this month, actually, which has all kinds of announcements, so, in Europe, there is a, I think two, I think it's a 200 billion, yeah, a 200 billion euro fund specifically for rolling out AI across Europe.

There was something that was kind of like a European take on this whole ridiculous Stargate thing. A ginormous French data center thing.

Asim Hussain: Yeah. 

 

Chris Adams: That was Macron giving it some "me too". And there was even, actually, for civil society, a 400 million euro fund to kind of try and get an idea of the unintended consequences, or talk about how you might rein in some of the worst excesses of this new technology that's being kind of deployed in all these places, sometimes where you're asking for it, sometimes where you might not be asking for it.

Asim Hussain: So 0.2 percent of the 200 billion is for

Chris Adams: Yeah. It's 

Asim Hussain: the question of whether this is a

Chris Adams: It does speak volumes about our priorities, about who we are serving here, basically, I suppose, or whose needs are being prioritized when you have something like that. But yes, this is some of the kind of ongoing conversations we actually have. There's just two things I want to check, because you mentioned a couple of projects that people might not be aware of that may be relevant for this conversation.

So you spoke about EcoLogits. As I understand it, this is, if you're using AI right now and you don't have a model, for example, I mean, you don't have, like, a whole training setup, you can use something like EcoLogits to get an idea of inference. So that's, is that the case?

Asim Hussain: Yeah, I think, it does have a methodology as well. So you can actually just take their methodology and, I think he actually asked us to use the word estimate, but like, cause it's all not, none of this is direct measurements, right? So estimate the emissions of a model, but they also have like an API.

So if you have a named model you can call the API and it will kind of give you information about the, I do believe it's only carbon, it might be carbon and water, I can't quite remember, but it kind of gives you

Chris Adams: They're French, they have, there's like five specific impact factors. There's, like, water, ADP, like abiotic depletion potential, something like that. There's basically five things, and one of them is carbon, and one of them is energy, I believe. And you don't need to, like, if you're already using Claude, or you're already using OpenAI, this is just one Python package that essentially wraps the function calls you make to that API, to get some of the numbers back.

So,

Asim Hussain: I don't think, I don't, I think EcoLogits is just for models itself, I don't think it's for,

Chris Adams: Oh no, it is for inference, because we put a funding bid to the European AI Act Implementation Fund, where they were basically looking for this stuff. And the thing we realized was that, if you're not doing any training, but you're just doing inference, this is one of the Python packages that will give you an idea about the numbers.

But it is very much, 

Asim Hussain: inference only, 

Chris Adams: yeah, exactly, inference

Asim Hussain: That's one of the conversations, yeah. Like, the biggest conversation we're having in the SCI for AI right now is: to include training or not to include training. And, like, one of the things about the AI Energy Score and EcoLogits is that they don't include training. The Green AI Index does include training. And, you know, it's a very, oh god, it's such a hard question, there's like so much nuance to it.

Chris Adams: Well, yeah, because if you're including training, then whose training are you including, right? So if I'm using, say, Llama, should I be saying, should some of Llama's footprint, which was training, and we know, should that be allocated to me, or should it not be? And like, we can point to existing protocols that like say maybe you should, but in this case maybe that isn't.

So yeah, this is an open question right now.
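
One allocation approach that comes up in this debate, though it is not a settled standard and not necessarily where the SCI for AI will land, is to amortize the one-off training footprint over an assumed number of lifetime inferences. A toy sketch, with every figure invented, just to show the arithmetic:

```python
# Toy illustration of amortizing training emissions over inference (all figures invented).
training_emissions_kg = 500_000             # one-off training footprint, kg CO2e (made up)
expected_lifetime_queries = 10_000_000_000  # assumed total inferences over the model's life
per_query_inference_kg = 0.000002           # operational emissions per query (made up)

amortized_training_kg = training_emissions_kg / expected_lifetime_queries
per_query_total_kg = amortized_training_kg + per_query_inference_kg

print(f"Training share per query: {amortized_training_kg:.9f} kg CO2e")
print(f"Total per query:          {per_query_total_kg:.9f} kg CO2e")
```

The debate in the conversation above is about whether that first number belongs in a user's figure at all, and who should count it when the model is open weight.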

Asim Hussain: Well, this is where my brain is so stuck in this area. Because if you include open source models' training in yours, it doesn't incentivize the reuse of models. If you don't include open source, if you're saying, "it's open source, I'm not going to include it," you can be a company that just goes, "I open sourced this model so I don't have any emissions." So there's, like, so many different ways it can be. This is a very hard question that we need to solve. I also think it's very interesting because, I think, the training question is,

I suspect us figuring out, or getting consensus on, the training question, a very nuanced discussion and conclusion to the training question, will actually help in many other areas of, like, how do you actually measure software? Because I think it's such a difficult question to answer.

I think the solution will inform so many other areas as well, which are kind of slightly simpler.

Chris Adams: It's almost as if using generally accepted accounting practices first developed hundreds of years ago might not be all that useful for thinking about how you use open source models and open weight models in

Asim Hussain: yeah, advanced technology systems. 

Chris Adams: Okay,

Asim Hussain: It's something to do with cloning. Like, if you can clone something at the click of a button, you can't clone a chip. I don't know. I haven't got fully refined thoughts on this yet. So, let's move on.

Chris Adams: We'll wait with bated breath for the outputs from the workshops as you do them. All right. So, that gave us a lot of time to chat about that stuff. The other thing I'll just quickly name check for the AI Action Summit was that there was a statement called the Within Bounds statement.

I'll share a link to that. This was something that, actually my organization worked with or the organization I'm part of. So, Michelle Thorne, who's my colleague and normally sits next to me, she was working with 120 different civil society groups to basically lay out a set of demands to say, look, if we're talking about AI and we're allocating literally hundreds of billions of euros or dollars to this stuff, can we talk about what it's for and who's benefiting from this stuff?

We'll share a link to that because it's actually, in my view, quite well written and it does a very good job of actually talking about some of the issues that we might not be talking about all the time as people in industry to see how the rest of the world is actually like having to respond to some of this, I suppose.

So we'll share a link to that. But the juicy one now, Asim, is the one that you wanted to talk about, and that we were both nerding out about a lot: A Beginner's Guide to Power and Energy Measurement and Estimation for Computing and Machine Learning. This is the next story that we have inside this, and I believe you've shared the arXiv link for this preprint, because it's a really cool looking paper, and it's publicly available for everyone right now. I think it's going to be going to some journal, but I'm not quite sure.

Asim Hussain: I thought it got published in an NREL journal. I don't know. Maybe it's not in a real journal, or maybe, now that I understand what journals are, maybe it doesn't really matter.

Chris Adams: So NREL here being the National Renewable Energy Labs of the United States of America. That's what NREL was in this case here. We've shared a link to it and, you did talk a little bit about why you like this, but can I give you a bit more space to talk about why you've enjoyed this? Because you don't need to be a beginner to actually appreciate this as far as I understand it, right?

Asim Hussain: No, it goes into a lot of detail. I mean, it says beginner's guide, and there's probably a little bit of imposter syndrome there, because I'd actually call it, like, it's very well written, so a beginner could start it, but I think it goes into very advanced topics that not many people know at all.

So, I think it goes from beginner to advanced. Yeah, I'm quite proud, Akshaya is the lead author of it, and Dawn Nafus is there, these are two people I worked very closely with at Intel. Very proud of this piece of work from them and the people, people over there. I share this with my team, so we're all working on kind of like thinking about how to measure energy.

And it's just exciting to see how she and everybody else kind of rationalize this all into a very easy to understand, you know, set of concepts. As I said before, the first thing they go through to try and come up with this taxonomy is, you know, are you measuring for a system?

Are you measuring for a job, or are you measuring for code? And I think they've done a really good job of trying to, like, explain the difference. They talk about: are you measuring directly versus are you measuring with proxies? I love the fact that she even goes down and says, and there's this idea I always have, that everything's a model, there's actually no such thing as direct measurement.

There's just a very advanced model. And she even goes down into, you know, even if you're using a watt meter at the wall, you've really got to consider many of these areas, because you've got to calibrate it, you calibrate it like a model, right?

If you don't calibrate it, it's not going to, like, you know, actually turn out the right numbers. And it gives you a lot of cautionary tales, you know, of what to think through. And it really just goes into a lot of these, I don't know if it's worthwhile going into all of it, but there's just a lot of detail about the things to consider, you know, idle power draw. And not only that, but, like, when you run things, we always knew that it was challenging to measure when you're on shared infrastructure, but then they go into other details, like, it gets even more challenging because the information you're getting from the socket might actually contain information from the energy draw from the memory, and it's hard to, like, disambiguate all of this stuff.

There's ways in which, if you're accessing memory, it increases the idle power of a CPU. There is so much great information here, and a lot of little tips as well.
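
To make one of those cautionary tales concrete: a naive way of attributing energy to a job on a shared machine is to subtract an assumed idle baseline from a metered total, which already bakes in several of the assumptions the paper warns about. A sketch with invented numbers:

```python
# Naive attribution: subtract an assumed idle baseline from a metered total.
# Every figure is invented; the paper explains why each assumption is shaky.
idle_power_w = 85.0        # assumed steady idle draw of the machine (W)
metered_energy_wh = 420.0  # what a (calibrated!) wall meter reported for the run (Wh)
duration_h = 2.0           # how long the job ran (hours)

idle_energy_wh = idle_power_w * duration_h
attributed_to_job_wh = metered_energy_wh - idle_energy_wh

print(f"Idle baseline:     {idle_energy_wh:.1f} Wh")
print(f"Attributed to job: {attributed_to_job_wh:.1f} Wh")
# Caveats from the paper: idle draw isn't constant (memory access can raise it),
# the socket reading may fold in components you didn't mean to count, and on
# shared infrastructure other tenants' work lands in the same total.
```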

Chris Adams: Yeah, I think I would agree. If you are a beginner, there is some stuff that you can take away, but there is a lot of depth inside this. I actually really enjoyed it too. I enjoyed reading it so much that, actually, Dawn emailed me at the beginning of this year saying, "hey Chris, check out this cool paper," and I really enjoyed reading it, and we were going to do an interview.

We've actually got an interview lined up with Dawn Nafus and one of the other authors, Charles Tripp, who was writing for this, and who I believe was at NREL and then has left NREL because,

Asim Hussain: because? 

Chris Adams: of yeah, basically, this was the way that we could actually get some people speaking about it.

Because since we've had a change in administration, if you're a federal employee it's much, much more difficult for you to talk about anything relating to, well, sustainability and technology, which is a real shame, especially when, like, it's useful to be able to draw upon the expertise of people who do this kind of stuff, right?

So, maybe that's a question we should ask ourselves: are we okay with the people we're asking these questions of not being able to talk to the public about this kind of stuff? But, to go back to the actual paper, I agree with you. I found it really useful, and this hierarchy of interventions was really useful, because one of the key things that it kind of highlighted was basically where you have some control and where you don't have some control, and it gives you a real chance to actually say, well, if I'm not able to do this, what are my options?

If I'm still trying to make a meaningful and measurable, yeah, change. Because in many cases, you do have to think about some of the trade offs. The things you might do at a data center level to make some parts maybe slightly more energy efficient or maybe more carbon efficient can have knock on effects elsewhere, for example, further down the kind of, the list,

like further down the chain, basically. And this is what they do talk about. It's a really fun read if you're interested in AI. There's so much depth, and one thing that's really quite nice about NREL specifically is that they've shared all the data to back up a bunch of this stuff.

So in the podcast interview that we have, where we dive into this a bit more, we'll be sharing some links to all the data sets that NREL was using when they were doing all these constant training runs to figure out what the footprint of x might be, and everything like that. So it's probably one of the most useful open data sets we've seen for people who are trying to get

an idea about what the environmental footprint of using, I mean, AI directly, what the direct footprint of this might actually be.

Asim Hussain: I'd argue this is like a seminal piece, and, you know, I imagine this is going to be, like, essential reading for green software courses around the world. If you really want to major in green software, you should read this paper.

Chris Adams: Awesome work. I don't work with Akshaya, but I guess, awesome work Akshaya and friends, for that, but probably not just for beginners. So please do not be turned off by the beginners part. It's definitely not just for beginners. There's loads there.

Asim Hussain: They probably put beginner's in to make sure the beginners read it, but advanced people might think, "I already know TDP, so I don't need to read this."

Chris Adams: Yes, by TDP, you're referring to the Thermal density. Oh, what does it stand for? But that's

Asim Hussain: I thought it was thermal design power.

Chris Adams: I think it might be actually you're right. This is the amount of power that gets used at certain amounts of utilization, right? So if I'm using the chip at maximum output, it's going to use this much power. But if it's only using half it's going to be something like that.

Asim Hussain: Yeah, but it's also, like, Akshaya that kind of opened my eyes to understanding how these power curves work. She goes into detail here about how those, you know, we hear about these power curves which tell you 10% utilization is this, 30% is this. I'm not going to go into details, but if you read the paper you realize how those power curves are made: they are very rough estimates of what it looks like. You don't really know, there's no register which is telling you "I'm at 50%", you're just seeing how much throughput, you're just seeing how much you, basically...

Should I go into it? You basically chuck, like, a benchmark at it and you keep on hitting it, you keep on going, like, okay, say it was a website benchmark. Okay, do one hit per second.

Okay, it's fine. You keep on doing it until the benchmark can't go any higher and it's now, like, 500,000 page views a second. "Okay, I can't seem to do more than 500,000. I must be at 100 percent utilization." That's how that calculation works. And then you think to yourself, "Okay, what does 90% utilization mean?"

If I did 500,000, I'm just going to do 450,000 requests. And that's, like, the approximated idea of what 90% utilization means. But what it really ends up meaning is that it depends on the benchmark, because an AI benchmark will have a different energy consumption at your pseudo 90% than a database benchmark, than this benchmark.

When you actually look at the big benchmark providers, like SPEC, SERT and all these other ones, they're collections of different types of applications. And the power curve is the average of those. Which is why, and that's what I think it's saying, if you're using a power curve based off of a SERT benchmark, and you're saying that's what your AI consumption is, it might not be.
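
To make the power curve point concrete: these curves usually get applied by linearly interpolating between a handful of published utilization-to-power points, which is exactly why a curve derived from one kind of workload can mislead you about another. A sketch with made-up curve points:

```python
# Interpolating a utilization -> power curve. The points are invented; the shape
# is only illustrative of what SERT-style benchmark sweeps tend to produce.
import bisect

# (utilization fraction, average power in watts) from some benchmark sweep
curve = [(0.0, 60.0), (0.1, 110.0), (0.5, 230.0), (0.9, 320.0), (1.0, 350.0)]

def power_at(utilization: float) -> float:
    """Linearly interpolate estimated power draw at a given utilization."""
    xs = [u for u, _ in curve]
    idx = bisect.bisect_left(xs, utilization)
    if idx == 0:
        return curve[0][1]
    if idx >= len(curve):
        return curve[-1][1]
    (u0, p0), (u1, p1) = curve[idx - 1], curve[idx]
    frac = (utilization - u0) / (u1 - u0)
    return p0 + frac * (p1 - p0)

print(power_at(0.75))  # estimated watts at 75% of the benchmark's maximum throughput
```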

You really want a power curve which has been generated just by running an AI workload, because the AI workload might just trigger different parts of the chip in different ways. It's very complicated. Yeah, and so, it's one of the things we were, like, talking about. It's actually one of the reasons I kind of really like the way Kepler works.

Because Kepler, 

Chris Adams: Sorry, I'm going to stop you there before you go on with this. The reason I'm quite happy to give some space for this is that people who have listened to this might not know that you were literally working at Intel trying to figure this stuff out when you were doing a bunch of the green software stuff, so it's, okay, listen, you know, you do have some prior art in this stuff, right?

Asim Hussain: Yeah. Yeah, we were basically diving into all this stuff. And I kind of learned so much while I was over there. How Kepler works is quite interesting. So Kepler is this Kubernetes based system which does a whole bunch of things, but one really intelligent thing it does is it tries to figure out what your energy consumption is from the actual stuff that's running on the chips that you're running on. So it has, like, a machine learning model, and I think, if you start off Kepler with nothing and it doesn't know anything, it will tell you energy numbers, but it kind of learns and improves and fine tunes itself based upon A, your actual chips, B, how your chips were configured, C, what you're actually running on your chips.

So you kind of get a more accurate power reading from Kepler. One of the things I think would be great for them to do is to kind of just take that out of Kubernetes. And, because that doesn't necessarily need to be a Kubernetes piece, but it's baked into that infrastructure. Because that would be generally useful everywhere. Yeah.
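
For anyone who wants to poke at Kepler's numbers: it publishes its estimates as Prometheus metrics, so you can pull per-container figures out of the standard Prometheus HTTP API. The metric name (kepler_container_joules_total), its labels, and the URL below are assumptions to check against your own cluster and Kepler version.

```python
# Sketch: querying Kepler's per-container energy estimates from Prometheus.
# The metric name, labels, and endpoint are assumptions; verify them for your deployment.
import requests

PROMETHEUS_URL = "http://localhost:9090"  # placeholder; point at your Prometheus
QUERY = "sum by (container_name) (rate(kepler_container_joules_total[5m]))"  # joules/sec, roughly watts

resp = requests.get(f"{PROMETHEUS_URL}/api/v1/query", params={"query": QUERY}, timeout=10)
resp.raise_for_status()

for series in resp.json()["data"]["result"]:
    container = series["metric"].get("container_name", "unknown")
    watts = float(series["value"][1])
    print(f"{container}: ~{watts:.1f} W (estimated)")
```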

Chris Adams: We will share links to both of those, and Asim, if you're able to find a link for some of this power curve nerdery, that would be very helpful, because I do know...

Asim Hussain: This paper's got it, yeah.

Chris Adams: Well, okay, in that case, we'll use that. Because I do know that, well, some of the work I'm doing outside of being on podcasts with you, for example, I'm aware of, like, there are people putting together procurement guidelines where they speak specifically about this kind of stuff, like, please tell us what the figures are going to be for this power curve, based on these ideas here. And being able to refer to some of the actual literature is actually very helpful for people to understand why a government buyer might be asking for this stuff, and why that's being used as one way to figure out some of the environmental footprints of the use of digital services.

All right, we'll add some links to that one and then we'll see what we're doing for time. Can I share one? I want to share a story from me. So this one, this is actually, it's not so much about, it kind of is about technology. This is actually an executive order from the USA called Advancing United States Leadership in Artificial Intelligence Infrastructure.

We've shared a link to this and the reason I shared this is because I think it's actually because I work in the policy working group inside the GSF and because we speak a lot about the carbon intensity of power and stuff like that. It's often quite rare to find really good, quite well written and detailed examples of kind of policy.

And this is one that, for a short, beautiful short period of days, was actually publicly available. So this was, I think,

Asim Hussain: I see the link is, oh, no, it's a real link. No, it is the Wayback Machine.

Chris Adams: It's web.archive.org, whitehouse.gov, briefing room, presidential actions, on the 14th of January. Just before the new guy came in, there was an executive order all about, essentially, deploying AI, and this was specifically about, if you're going to deploy AI on public land, and in the US

there's lots and lots of federally owned public land, what kind of criteria do you actually want to require as condition of people being able to put things on your land like this? So just the same way that people who have private land, they can say, you can run a datacenter here, as long as you do X, Y, and Z.

This pretty much lays out, okay, here's what you should be looking for. And this stuff includes a bunch of really, in my view, interesting and like very insightful and incisive policy, pieces of policy inside this. So when we talk about the carbon intensity of power, we've spoken before on this podcast multiple times about how in the hydrogen sector, we already have a very rigorous way of talking about how energy can really be green.

And we did a recent podcast interview with Killian Daly from EnergyTag talking about this idea of, like, three pillars: the idea that energy has to be timely, so you can't have power at night being greened with, like, solar, because they're two separate times of day. Deliverable, like, you need to have the generation on the same grid as you're consuming from, because otherwise it's not very convincing that it's really powering it. And additional, you need to have new power coming in. This literally is name checking every single one of these inside this. Like, the actual wording they use

Asim Hussain: in terms of power, in terms of more generally applying that

Chris Adams: This is specifically for data centers. So for AI data centers, like, I'll read some of the kind of quotes from this. Basically, like, as part of ongoing work, the Secretary of Defense and Secretary of Energy shall, blah, blah, blah, blah, blah, require that any AI data centers on a federal site will have procured sufficient new clean power generation with capacity value to meet the data center's needs.

And they've, literally explicitly said "has to be deliverable and has to be matched on an hourly basis." So those are the three things right there. They've actually been more explicit about additional elsewhere. So this is like the three things that already in place in other industries, for the first time, really laid out for how the, how you should be doing this for AI data centers.
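
To illustrate what "matched on an hourly basis" means in practice, here is a toy hourly matching calculation: for each hour, only clean generation procured in that hour counts against that hour's load, so surplus solar at noon cannot paper over a fossil-heavy night. All figures are invented.

```python
# Toy hourly matching score (all figures invented).
consumption_mwh = [10, 10, 10, 12, 15, 15, 14, 12]  # data center load, per hour (98 MWh total)
clean_gen_mwh   = [0,  0,  5, 25, 30, 25, 13,  0]   # procured clean generation (98 MWh, mostly midday)

hourly_matched = sum(min(c, g) for c, g in zip(consumption_mwh, clean_gen_mwh))
annual_matched = min(sum(clean_gen_mwh), sum(consumption_mwh))
total = sum(consumption_mwh)

print(f"Annually matched: {annual_matched / total:.0%}")  # looks like 100%
print(f"Hourly matched:   {hourly_matched / total:.0%}")  # only about 61%
```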

So if you're a policymaker outside the USA, just copy this link. This is probably some of the best stuff, particularly relating to policy, to energy policy. When

Asim Hussain: Does it, but does it say, by the way, shall? You know, the "shall" means, so just for everyone who is listening, "shall" is a very important term, shall in the standards space. I presume the policy

Chris Adams: You don't get to not do it. Basically, what they're saying

Asim Hussain: You gotta

Chris Adams: is mandatory if you want to do things on federal land. Elsewhere, yeah,

Asim Hussain: should is different. So, just to check, the reason you're talking about this, I presume, is that what's mandated is clean energy. Or is what's mandated,

Chris Adams: Yeah, "sufficient new clean power generation" is the wording they use. And later on, they actually talk about what counts as clean energy in this, because there's a bunch of stuff. It's quite a long executive order, and we've had this new guy come into power, who's basically rescinded every other executive order apart from this one, even though it's not visible, so there's some stuff inside this,

Asim Hussain: into this one. There's something which benefits, benefits something else.

Chris Adams: So there is the whole thing here about, for example, this does say, well, if we're going to have clean energy, we're going to call it carbon free, and we're going to talk about not just renewable, like wind and solar, they talk about, say, the deployment of nuclear, which America, in America, people tend to be more receptive to, or in some places at least. So there's a part, there's a part there. But they even talk about, say, if you're going to have fossil generation, it needs to be 90 percent carbon capture, right? Now, this is a very high bar to hit, because there, right now, there's basically nowhere in any kind of at scale operation which is hitting 90 percent capture of this.

So if you were to have gas, this is probably about as rigorous as you can reasonably ask. And if anyone, in the year 2025, when we know all the science available to us, is not saying something like this, you've got to ask, okay, who is captured here?

Because that is a really, like, there, there is just, it's, you need to have this if you're going to be talking about the use of fossil fuels inside this. And really, you probably shouldn't be using fossil fuels at all anyway. But like, this is examples of, yeah, this is what policy does look like.

If you're going to do this, do this properly.

Asim Hussain: Yeah, but at the same time I think what we're seeing is, I mean, it's interesting, I don't know if I've got time to go into it, but the Uptime report talks about

the increasing demand forcing organizations to, you know, for the utilities, there's so much demand from data centers.

It's not really a question of, you know, you've got to use clean energy. It's like, you don't have the energy, or you now have to, you know, go to demand response.

But that's also then driving up pressure on those organizations. They're kind of walking back a lot of the stuff from previously, and there's a lot of fossil fuel generation being thrown out.

Asim Hussain: I have not verified this at all, but today I saw something on my feed. Which said that, I don't like, anyway, which, which said that, Elon's, 

Chris Adams: You might be thinking about the x.ai data center, the one in Memphis, running

Asim Hussain: in Memphis, there's gonna be, there's, like 15

Chris Adams: yes!

Asim Hussain: to power it. Which, you know, probably is because the utility said to him, "You're not putting an unbelievable load on our grid. We do not have the capacity for you." And he probably went, "ah, I'll build my own gas generators without asking anybody."

Chris Adams: There is a bit of a story behind this. So essentially, the, there was a datacenter, the x.ai datacenter was built very quickly by datacenter standards. And

usually, if you want to have power for a data center, you're going to have to wait some time if it isn't already available.

And, the, basically the approach that was taken was to essentially deploy a bunch of mobile gas turbines to provide the extra megawatts of power such that you could power that. Now the problem is these are really bad for local air quality. So you're shortening the lives of all the people who live around there, for a start, for the sake of this.

And the other thing, one of the reasons you're able to do this, is because they count as mobile generators, so they're not covered by the same clean air laws. So you wouldn't be able to, yeah, exactly. So essentially this is stuff which has a real human cost, right? This is an already marginalized and kind of racialized community that already has very bad air and has, like, elevated cases of asthma and all the stuff like that.

So there is a real human cost being paid here. And the decision has been made: "We're going to use this because we've decided that's more important than the lives of people around here." So, like, that's essentially what it's coming down to.

Asim Hussain: But also, I mean that, I'm guessing from the fact that this was an active executive order as a, you know, a few months ago that, that wasn't on federal land and therefore, or something like that must be, or

Chris Adams: This is somewhat separate. I mean, for a start, this, the, for the things, for the, xAI case in particular,

 you don't, any of the local air guidelines or the local air kind of, laws about air, about air quality, don't apply to mobile providers.

Asim Hussain: Oh, 

Chris Adams: providers.

Yeah. Yeah.

Asim Hussain: Even with this executive order, you can always get around it by just claiming it's mobile?

Chris Adams: So this executive order came later. We'd had these things with xAI, that's something that we saw last summer. All right. This was only published in January, and then it was literally on the White House website for seven days before the new guy came in and took it down, while pointing to the previous one.

And it's also worth bearing in mind that executive orders are not law. So even though someone can say they need to do this, that doesn't mean that it overrules existing law, for example. So absent any other law, this is what you can ask for. And this is why they're able to say, for federal land, these are the things we'd be doing.

There's actually a bunch of other really good stuff inside this, in particular the air quality stuff. So, as a contrast to saying, "It's okay to use this stuff. Who cares whose lives are shortened?", on the environmental justice side there's a whole piece in this saying, if you're going to deploy data centers on public land, then you need to have constant monitoring, and have this visible for everyone else to see as well.

So these are the things that I think we don't see, that you could totally take as examples away from this. And they've also literally said: if you're going to deploy, you can't deploy in places which have had traditionally poor air quality, below this AirTox Screening threshold. So basically, places which have already been harmed, you don't get to deploy in these places anymore.

And like, this is why I think this is actually quite well written stuff, because it does take into account all these things which we've had, which have been coming up again and again. So if you were trying to come up with some policy for deciding how you deploy, there is so much you can lift from this yourself, for your own corporate policies or anything like that.

Asim Hussain: There's very few benefits to a local community from having a data center built near you. There's very few jobs, there's, like, a couple of people walking around this giant warehouse, and they've sucked all your electricity, and, you know. I don't know. The data center industry needs to, it was fascinating to me when I was chatting,

I was at an infrastructure conference last year and I was chatting to a gentleman, won't name his name, from the utility sector, and he was saying to me something very interesting. He was saying to me, he believes the data center industry, this is before, he who shall not be named apparently, entered office.

So, this is before that happened, but he was saying he thinks the data center industry is headed right towards full regulation the same way utilities are regulated. So if you want to do a power plant, you can't just go "Oh, it's gonna make me a lot of money. I'm gonna build a power plant here." You have to go through so many checks and balances.

Your profit is limited. Everything is limited. And he was saying, based upon the conversations that are happening, you know, you're claiming that this technology is so fundamental to life and existence that it therefore is a commodity, therefore it's something that's, you know, similar to energy. Energy utilities can't just say, "ah, we're going to rack up our prices 40, 43 percent because everybody wants it." They'd be regulated for that.

So he was really putting a very convincing argument to me that if the data center industry is not careful, it's going to get regulated that way, and they don't want to get regulated that way. It's not fun, apparently. And so I think things like this really matter.

Chris Adams: Yeah.

Asim Hussain: Really do matter.

Yeah, you've got to think about it. If you're a data center, you can't not think about the impacts on the region that you're in. You've got to really put effort in where you need to, to be a positive net benefit to the place you're being installed in, you know, locally as well.

Chris Adams: So this is actually one thing. I think the argument you're making is that if you're going to present yourself as a utility, something which is foundational to everything running on it, then maybe you should expect utility style profits rather than SaaS style profits. Because the margins that you might see from certain tech giant companies are like 30 percent, for example. That's not the same as utilities, which might be looking at around 10 to 15 percent, for example. And you have different kinds of oversight being introduced.

So yes, this is a conversation that we might have. I suspect it might be longer than we have given the time we have available, but yes, this is something we might point to. Just following on from this, there's a, you did mention this, uptime report, Uptime Institute Report. We'll share a link to that as well.

And I think we might be in a situation where we have a bit of a fight on our hands, or we might be seeing a fight taking place, because we do see, like in Europe, for example, which is probably, arguably, the place where you see fights around data center deployment the strongest, we've just seen new laws be published about what criteria you need to actually meet if you're going to connect data centers to the grid.

This was published, I think, last week, and we'll share a link to that. Where, in contrast to what we've just talked about here, where the US policy was very clear and was very good, we now see, essentially, a guideline saying you can connect data centers to the grid, but you need to have your own generation and you need to integrate nicely with the grid, but there's no mention of climate change, no mention of local environment or anything like that.

This is literally going to likely incentivize even more on site fossil based generation, absent any other criteria being in place. So we might see this being challenged, but I think I agree with you. We currently do have this case where, yes, you've got all this new technology being deployed, but there's almost zero regulation, and it doesn't feel like it's going to last. I can't see how

Asim Hussain: You don't think, you don't think the absence of regulation is gonna last?

Chris Adams: I think what's going to happen is that, if you continue to go through this stuff, you will probably end up with so much pushback that you will end up with much, much more heavy handed regulatory and legislative responses to this. Because, right now, there's been this push to kind of, essentially, neuter any kind of meaningful science based or data informed discussion around this.

All that does is play into the hands of a much, much more dramatic response later on. So I think, if you want to deploy stuff, then this does feel, kind of, long term, not very helpful for them. But then again, there's a question about, do we need this, how much do we actually need to be deployed?

There's probably a democratic discussion to actually have about that.

Asim Hussain: Well, we haven't even spoken about DeepSeek and its impact on this whole question, and, going back to that conversation, what that person from the utilities' big question was. Because

the data center providers, everybody's telling them we need a lot more energy in the future.

And they're going, "well, my God, do we actually put the effort in to try and roll out this new capacity, only to find out two years later, 'ah, we got it wrong, I'm sorry, we won't need that'?" They're asking the "is it BS?" question, because they really need to figure it out. And I was thinking, okay, they might have just been convinced. Then DeepSeek comes along, and now, you know, everybody's asking the question, "huh, will we need this capacity upgrade?" And as soon as DeepSeek came along, everybody said, "yeah, that's great. Now we are gonna do even more AI. We definitely need the capacity, but now we can do more with it." And you're like, well, hang on. Because there is oftentimes a thing that goes on where you have to create the hype to get the funding. If you want to convince, like, investors to invest in your organization, you have to create the hype. And what DeepSeek's done is it's just popped it, and I don't know how much it's popped it, because only the investors know how much it's been popped.

But it's popped it, and I wonder if it's really popped it quite significantly, and whether we are going to see, like, a significant pullback. Is Stargate really going to happen, or does it not really matter? Do they just want to hand money out, is it just a reason to hand out 500 billion, because, you know, why not?

Chris Adams: We can share a link to a good paper from Sasha Luccioni and friends, talking a little bit about Jevons Paradox. I've actually written a blog post about this as well, particularly for DeepSeek, to kind of make this accessible for people who are trying to understand: is this going to reduce the footprint, or is it going to increase the footprint?

Because there's a few different criteria you want to take into account. Just saying,

Asim Hussain: If it pops the bubble, it will decrease the footprint, I think.

Chris Adams: That's the thing we can look into and decide. Because the flip side is that, if this lowers the barrier so that more people are able to use it in more places, that can lead to an absolute increase.

So there are different ways and different takes on this, and it's very much a case of, okay, this is one thing we'll share a link to. Asim, I think we've gone down a bit of a rabbit hole, so we should probably look at the events, if there's anything in particular we have there.

Asim Hussain: So, there's a couple events coming up.

There is the Practical Advice for Responsible AI event on February the 27th at 6pm in London, so it's a UK event. It's gonna be held in person in the Adaptavist offices, and it's gonna cover green AI with Charles Humble and AI governance with Jovita Tam. There's also the GSF Oslo meetup happening, again on February 27th, at 5:00 PM. It is in person in the Accenture offices, from 5 to 8 PM.

And they're going to talk about how to leverage data and technology to drive sustainability initiatives and enhance security measures, dive into green AI, obviously. There's going to be talks from Abhishek Dewangan and Jonny Mauland. I do apologize.

Chris Adams: Sorry,

Asim Hussain: Read them. I'm sorry, Jonny.

Sorry, Abhishek. Details in the podcast notes. And I think that's it. I think I'll pass over to you, Chris.

Chris Adams: Yeah. Okay, then. I think that takes us to the end of what we have for this. Asim, if there's a particular free resource you would point people to right now on green software as a final thing, what would you point people to as a parting thought?

Asim Hussain: Oh, honestly, it's that beginner's guide. I don't know if it's, it is very good. Read the Beginner's Guide to Power and Energy Measurement and Estimation for Computing and Machine Learning, and that's the last word.

Chris Adams: Wow, Akshaya better be getting a promotion after this, man. This is just like, this is, so yes, this, I agree. It was a really fun read. If you want to basically sound knowledgeable about AI, this is probably the most useful thing to read. And that's as someone who's written a report all about the environmental impact of AI ourselves, where we work.

All right, Asim, it's really lovely to see you again, mate. Thank you so much for coming on. I hope the people who did listen to this were able to stay with us and we didn't get too self indulgent. And if we did, please do tell us and we'll make sure that we don't do it too much next time. And otherwise I'll see you in one of the future episodes of This Week in Green Software.

Thanks, mate.

Asim Hussain: See you later, mate.

Chris Adams: Toodle oo!  

Hey everyone, thanks for listening. Just a reminder to follow Environment Variables on Apple Podcasts, Spotify, or wherever you get your podcasts. And please do leave a rating and review if you like what we're doing. It helps other people discover the show, and of course, we'd love to have more listeners.

To find out more about the Green Software Foundation, please visit greensoftware.foundation. That's greensoftware.foundation in any browser. Thanks again, and see you in the next episode.