Environment Variables
Cloud Infrastructure, Efficiency and Sustainability
May 8, 2025
Host Anne Currie is joined by the esteemed Charles Humble, a figure in the world of sustainable technology. Charles Humble is a writer, podcaster, and former CTO with a decade's experience helping technologists build better systems, both technically and ethically. Together, they discuss how developers and companies can make smarter, greener choices in the cloud, as well as the trade-offs that should be considered. They discuss the road that led to the present state of generative AI, the effect it has had on the planet, as well as their hopes for a more sustainable future.

If you enjoyed this episode then please connect with us on Twitter, Github and LinkedIn!

TRANSCRIPT BELOW:

Charles Humble: In general, if you are working with vendors, whether they're AI vendors or whatever, it is entirely reasonable to go and say, "well, I want to know what your carbon story looks like." And if they won't tell you, go somewhere else. 

Chris Adams: Hello, and welcome to Environment Variables, brought to you by the Green Software Foundation. In each episode, we discuss the latest news and events surrounding green software. On our show, you can expect candid conversations with top experts in their field who have a passion for how to reduce the greenhouse gas emissions of software.

I'm your host, Chris Adams. 

Anne Currie: Hello and welcome to Environment Variables, where we bring you the latest news and updates from the world of sustainable software development. Today I'm your guest host Anne Currie, and we'll be zooming in on an increasingly important topic, cloud infrastructure, efficiency and sustainability.

Using the cloud well is about making some really clever choices, really difficult choices, upfront. And those choices have an enormous impact on our carbon footprint, but we often just don't make them. So our guest today is someone who's thought very deeply about this.

So Charles Humble is a writer, podcaster, and former CTO who has spent the past decade helping technologists build better systems, both technically and ethically. He's the author of The Developer's Guide to Cloud Infrastructure, Efficiency and Sustainability, a book that breaks down how cloud choices intersect with environmental impacts and performance.

So before we go on, Charles, please introduce yourself.

Charles Humble: Thank you. Yes, so as you said, I'm Charles Humble. I work mainly as a consultant and also an author and a technologist. My own business is a company called Conissaunce, which I run. And I'm very excited to be here. I speak a lot at conferences, most recently mainly about sustainability. I've written a bunch of stuff with O'Reilly, including a series of shortcut articles called Professional Skills for Software Engineers, and, as you mentioned, most recently this ebook, which I think is why you've invited me on.

Anne Currie: It is indeed. Yes. So, to introduce myself, my name is Anne Currie. I've been in the tech industry for a pretty long time, pretty much the same as Charles, about 30 years. And I am one of the authors of O'Reilly's new book, Building Green Software, which is entirely and completely aimed at the folks who will be listening to this podcast today.

So if you haven't read it, or listened to it, because it is available in an audio version as well, then please do so; you'll enjoy it. So, let's get on with the questions that we want to ask about today. So, Charles, you've written this great ebook, which is also something everybody who's listening to the podcast should be reading.

And we'll link to it in the show notes below. In fact, everything we'll be talking about today will be linked to in the show notes below. But let's start with one of the key insights from your book, which is that choices matter. Things like VM choices matter, but they're often overlooked when it comes to planning your cloud infrastructure.

What did you learn about that? What do you feel about that, Charles? 

Charles Humble: it's such an interesting place to start. So I think, when I was thinking about this book and how I was putting it together, my kind of starting point was, I wanted like a really easy on-ramp for people. And that came from, you know, speaking a lot at conferences and through some of the consulting work I've done and having people come up to me and say, "well, I kind of want to do the right thing, but I'm not very clear what the right thing is." 

And I think one of the things that's happened, we've been very good about talking about some of the carbon aware computing stuff, you know, demand shifting and shaping and those sorts of things. But that's quite a, quite an ambitious place to start. And oftentimes there are so many kind of easier wins, I think. And I kind of feel like I want to get us talking a little bit more about some of the easy stuff. 'Cause it's stuff that we can just do. The other thing is, you know, human beings, we make assumptions and we learn things and then we don't go back and reexamine those things later on. So I've occasionally thought to myself, I ought to write a work called something like Things That Were True But Aren't Anymore, or something like that

because we all have these things. Like my mental model of how a CPU works, until probably about two years ago, was basically a Pentium II. And CPUs haven't looked like a Pentium II for a very long time, and I have a feeling I'm not the only one. So, you were specifically asking about CPUs and VM choices, and I think a lot of the time, those of us, certainly those of us of a certain age, but I don't think it's just us, came through this era where Windows and Intel were totally dominant. And so we naturally default to, well, "Intel will be fine"

because it was right for a long time.

Anne Currie: Yeah, Intel...

Charles Humble: ...was the right choice.

Anne Currie: Who could ever have imagined that Intel would lose the data center? It's...

Charles Humble: Absolutely, it is extraordinary. I mean obviously they lost mobile mainly to ARM and that was very much a sort of power efficiency thing. Fair enough. But yes, the idea that they might be losing the data center or might have lost the data center is extraordinary. But you know, the reality is, first of all, if you are thinking about running your workloads: AMD processors are, more or less, cross compatible with Intel ones. It's not totally true, but it kind of is. They have an x86 compatible instruction set. So for the most part, your workloads that will run on Intel will run on AMD.

But not only will they run on AMD, they will probably run on AMD better.

Again, for the most part, there are places where Intel probably has an edge, I would think. If you're doing a lot of floating point maths, then, maybe they still have an edge. I'm not a hundred percent sure, but as a rule of thumb, AMD is going to be, you know, faster and cheaper. And the reason for that has a great deal to do with core density. So AMD has more cores per chip than Intel does, and what that means is you end up with more processing per server, which means you need fewer servers to run the same workload. I ran some tests for the ebook and that came out,

So I had a 2,000 VM scenario, and we needed 11 AMD powered servers, running the AMD EPYC chips, and 17 Intel powered servers to do the same job. Right? So that's roughly 35% fewer servers. It's not, by the way, 35% less power use. It's actually about 29%, something like that, less power use, 'cause the chips are quite power hungry, but still, that's a big saving, right? And it's also, by the way, a cost saving as well. So the other part of this is, you know, it is probably about 13% cheaper to be running your workload on AMD than Intel. Now obviously your mileage may vary and you need to verify everything I'm saying.
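For anyone who wants to sanity check that arithmetic, here's a rough sketch. The 11 versus 17 server counts are the figures Charles quotes from his ebook tests; the per-server wattages are placeholders assumed purely for illustration, so substitute your own measurements.

```python
# Rough illustration of the server-count arithmetic described above.
# The 11 vs 17 counts come from the episode; the wattages are assumed
# placeholders, not measured figures.

amd_servers = 11
intel_servers = 17
amd_watts_per_server = 450      # assumed: the denser EPYC boxes draw a bit more each
intel_watts_per_server = 400    # assumed

server_saving = 1 - amd_servers / intel_servers
power_saving = 1 - (amd_servers * amd_watts_per_server) / (intel_servers * intel_watts_per_server)

print(f"Fewer servers: {server_saving:.0%}")   # ~35%
print(f"Less power:    {power_saving:.0%}")    # ~27% with these assumed wattages
```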

Don't just assume, "well, Charles Humble said it's true, so it must be." 

It'd be a foolish thing to do, but as a rule of thumb, the chances are in most cases you're better off. And I'll wager that a lot of the time, when you are setting up your VMs on your cloud provider, your cloud provider probably defaults to Intel and you probably just think, "well, that'll be fine."

Right?

So kind of a case of trying to flip that script. So maybe you default to AMD, maybe you evaluate whether ARM processors will work. We are seeing another surge of ARM in data centers, though, as I said, that comes with some trade-offs. In mobile, the trade-offs with ARM versus anything else are pretty straightforward. In data centers it is a little bit more nuanced. But basically it's that, and I think it's this thing, as I say, of these assumptions that we've just built up over time, and we're not very good at going back and reexamining our opinions or our assumptions. And then the other thing that I think feeds into this is we build layers of abstractions, right? That's what computer science does, and we get more and more abstracted away from what the actual hardware is doing. I found myself this morning when I was thinking about coming on the show, thinking a bit about some of the stuff Martin Thompson's been talking about for years, about mechanical sympathy.

I'm sure you have experiences of this, and I know I have,

where, you know, I've been brought into a company that's having performance problems. And there's one that I actually remember vividly from decades ago: it was an internet banking app. So it was a new internet bank that was written in Visual Basic, weird choice, but anyway, go with me here. And it was all MQ Series, so IBM MQ Series under the hood, right? So basically you've got messages that were written in XML being passed around between little programs. It looks a bit like microservices, but 20 years ago, before we had the term, roughly. And what they were doing, so when you read a message off an MQ queue, you read it off essentially one byte at a time.

And what they were doing in a loop in Visual Basic was they were basically saying string equals string plus next byte. Does that make sense? So, string equals string plus new string. That kind of idea. Now under the covers, they're doing a deep string copy every single time they do that. But they had no idea, 'cause they were Visual Basic programmers and didn't know what a deep string copy even was.
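For readers who want to see the anti-pattern Charles is describing, here's a minimal sketch in Python rather than the original Visual Basic. The read_byte function is a hypothetical stand-in for reading the next byte off the queue, returning None at the end of the message.

```python
# Anti-pattern: building up a message by repeated string concatenation.
# Each concatenation on an immutable string copies everything accumulated
# so far, so reading an n-byte message this way does O(n^2) work.
def read_message_slow(read_byte):
    message = ""
    while (b := read_byte()) is not None:
        message = message + b          # copies the whole string each time
    return message

# Better: collect the pieces and join once at the end, which is O(n).
def read_message_fast(read_byte):
    parts = []
    while (b := read_byte()) is not None:
        parts.append(b)
    return "".join(parts)
```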

Fair enough. And then they were going, "why is our audit process grinding to a halt?"

And the reason is, well, 'cause, you know, we just needed an API. But what I'm getting at is we get very abstracted away from what the hardware is doing

because most of the time that's fine, right?

That's what we want, except that our abstractions leak in weird ways. And so sometimes you kind of need to be able to draw on this is what's actually happening to understand. So as I say, in the case of,

in the case of CPUs, if you haven't been paying attention to CPUs for a while, you probably think Intel still has the edge, but right now, sorry, Intel, they don't.

Hope that changes. Competition is always good. But you know, it's just a great example of, you probably don't even think about it. You probably haven't thought about it for years. I know, honestly I hadn't.

Anne Currie: Yeah.

Charles Humble: But then you start running these numbers and go, "gosh, that's, you know, like a 30% power saving."

That's, at any sort of scale, that's quite a big deal. And so a lot of the things that I was trying to do in the book was really that. It was just saying, well, what are some of the things that we can do that are easy things,

that make a massive difference?

Anne Currie: It's interesting. What you're saying there reminds me a little bit of somebody who was a big name in tech back in our early days, and you'll remember him very well: Joel Spolsky,

who used to write a thing about, you know, what would Joel do? He used to do a lot of work on usability, studying usability.

And he'd say, well, you're not looking to change the world and rewrite all these systems. You are often just looking for the truffles, the small changes that will have an outsized effect. And what you're saying is that, for example, moving from Intel to AMD is a small truffle that will have an outsized effect, if you do it at the right time.

It's not so much the case, as you say, with going to an ARM chip or, you know, Graviton servers, which are being pushed very heavily by AWS at the moment. Big improvement in energy use and reductions in cost. But that is not a lift, that's not an instant

oh, flick of a switch and you go over. You know, there are services that are no longer available, and you're gonna have to retest and recompile and do all the things, so it's not such an obvious truffle. But you are saying that the Intel to AMD switch might be a really easy win for you.

Charles Humble: Yeah, absolutely. Absolutely. It's funny you mentioned Joel Spolsky there, 'cause I read his User Interface Design for Programmers, I think the book is called, about 30 years ago probably. Everything I know about user interfaces, I swear, comes from that book.

It's such a brilliant book, and it's also hysterically funny. It's very wittily written and has some wonderful examples of, you know, just terrible bits of user interface. Like the Windows 95 start button, which is in the bottom left hand corner. Except that if you drag to the bottom left hand corner of the screen, which is one of the easiest places on a screen to hit, you miss the start button, because aesthetically it looked wrong without a border around it.

But then no one thought, well, maybe we should just make it so that if you miss but you're in that corner, it still works. You know, it's full of examples like that. It's very funny. And yeah, absolutely, this business of, as I say... We have, as an industry, been very profligate, right?

We've been quite casual about our energy use and our

hardware use. So there's another example, which is to do with infrastructure and right sizing.

Again, this is just one of those things, it's such an easy, quick win for people

and it's another thing that connects to this business of our old assumptions. So when I started in the industry, and probably when you started in the industry and we ran everything in our own data centers, procurement was very slow, right?

If I needed a new server, I probably had to fill in a form and 10 people had to sign it, and then it would go off to procurement and it would sit doing, heaven knows what for a couple of months, and then eventually someone might get around to buying a server and then they'd install the software on it and then it would get racked.

And you know, like six months of my life could have gone by, right.

And so what that meant was if I was putting a new app in, and at some point someone would come along to you and go, "we're putting this new app in. How many servers do you need?" And what you do is you'd run a bunch of load tests on, I dunno, load runner or something like that.

You'd work out what the maximum possible concurrent, oh, sorry, concurrent was a poor choice of word there. Simultaneous number of users on your system, rather. Yeah. Right. You'd simulate that load, and that would tell you how many boxes you needed. So suppose that said four servers, you go to procurement and you go "eight, please."

Anne Currie: Indeed. 

Charles Humble: Right. And no one would ever say, "why do you need eight?" Right? That's just what we do. And what's weird is we still do it, even though elastic compute on the cloud means surely we don't need to. We kind of have this mindset of, "well, I'll just add a bit more, just to be on the safe side, 'cause I'm not too confident about my numbers."

Anne Currie: There is a logic to it, because the thing that you fear is that you'll under provision and it'll fall over. So there's a big risk to that. Over provisioning, yes, it costs you more, but it's hard. It's really hard to get the provisioning perfect.

So we over provision and then you always intend to come back later and right size. And of course you never do because you never get a chance to come back and do things later. 

Charles Humble: Something I say a lot to the companies that I consult to is "well just run an audit."

Anne Currie: Yes, indeed. Yeah.

Charles Humble: Have a three month process or, you know, a three month or a six month mission where we are gonna do a right sizing exercise. We're gonna look for zombie machines. So those are machines that were, you know, once doing something useful but are doing nothing useful anymore. And also look for machines that are just sitting idle and get rid of them. You actually have an amazing story in your O'Reilly book, the Building Green Software book, from Martin Lippert. He was tools lead and sustainability lead at VMware, now Broadcom, part of the old Spring team.

He talks about, so in 2019, I think it was in VMware, they consolidated a datacenter in Singapore. They were moving the data center and basically they found that something like 66% of all the host machines were zombies. 66%.

Yeah. And that's not untypical.

Anne Currie: No, it's not.

Charles Humble: I've gone and done audits. 

50% plus is quite normal.

 So I have this like thing that I quite often say to people, I reckon you can halve your carbon emissions

in your IT practice just by running an audit and getting rid of things you don't need. 

And it may even be more than that. 
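As a very rough sketch of what that kind of audit can look like in practice, assuming you've already exported per-machine average CPU utilisation from your monitoring into a CSV. The file name, column names and the 5% threshold are all assumptions for illustration, not a recommendation.

```python
# Flag likely zombie or idle machines from exported utilisation data.
# Assumes a CSV with columns: instance_id, owner, avg_cpu_30d (percent).
# The threshold and column names are illustrative assumptions.
import csv

ZOMBIE_CPU_THRESHOLD = 5.0  # percent average CPU over the last 30 days (assumed)

def find_zombie_candidates(path="utilisation.csv"):
    candidates = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["avg_cpu_30d"]) < ZOMBIE_CPU_THRESHOLD:
                candidates.append((row["instance_id"], row["owner"]))
    return candidates

if __name__ == "__main__":
    for instance_id, owner in find_zombie_candidates():
        print(f"Review (possible zombie): {instance_id}, owner: {owner}")
```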

Anne Currie: Yeah, indeed. As VMware discovered, and people do it at a time when they move data centers. I often think this is probably a major reason why people go, "oh, you know, I repatriated, I moved away from the cloud, back in-house, and I saved a whole load of money."

Yeah, you would've saved that money doing that kind of exercise in the cloud as well. Probably more, because the cloud is both amazing, it has amazing potential for efficiency, because it has great services that are written to be very efficient and you wouldn't be able to write them that efficiently yourselves.

So there's amazing potential. Spot instances, burstable instance types, serverless, you know, there's loads of services that can really help you be efficient. But it's so easy to overprovision that inevitably everybody over provisions massively. And especially if you lift and shift into the cloud, you massively over provision.

Charles Humble: There's a related thing there as well, because it's so easy to spin something up

and then you just forget about it. Even on my own, you know, personal projects, I've suddenly got a bill from Google or something and I've been like, "oh hello, what's that then?"

And you know, it's something that I spun up three months ago for an article I was writing or something and I'd just totally forgotten about. And it's been sitting there running ever since, you know, like, and you could imagine how much worse that is as an enterprise, this is just like me on my own doing it.

And it's that kind of thing, I think. So thinking about things like autoscaling, you know,

scaling up and remembering to scale back down again. People often scale up and don't scale down again. There's some of the Holly Cummins stuff around LightSwitchOps. This idea of, you know, basically you want to be able to spin your systems back up again really easily.

That sort of stuff. Again, this is all stuff that's quite easy to do, relatively speaking.

Anne Currie: Relatively. So much easier than rewriting your systems in Rust or C, I can assure you of that.

Charles Humble: Well, a hundred percent, right? And, again, you know, I've made this, 

I've made this joke a few times on stage and it's absolutely true. We kind of, because we're programmers, we automatically think, "oh, I'll go and look at a benchmark that tells me what the most efficient language is," and it will be C or C++ or something.

And like "we will rewrite everything in C or C++ or Rust." Well that would be insane. And your company would go bust and nobody is gonna sponsor you to do that for very good reason. And

what you want to be doing is you want to be saying, "well, you know, what are the pragmatic things we can do that will make a huge difference?"

And a lot of those things are. You know, rightsizing. It's a really good example. 

Anne Currie: Yeah, I mean, clearly this is something that you and I have discussed many times, and it was one of the reasons why, at the end of Building Green Software, we devised the Green Software Maturity Matrix that we donated to the Green Software Foundation,

Charles Humble: Yes. 

Anne Currie: because what we found over and over again when we talked at conferences and went out and spoke to people is that they had a tendency to leap right to the end.

You know, they say, "well, we couldn't rewrite everything in C or Rust or we'd go outta business, so we won't do anything at all." And they step over all the most important things, they step over all the truffles, which are switching your CPU choice, switching your VM choice, doing right sizing audits, doing a basic audit of your systems and turning off stuff, doing a security audit, because a lot of these zombie systems actually should be turned off in a security audit, because if they're there and they're running and they're not being patched and nobody owns them anymore, nobody knows what they're doing anymore, they will get hacked.

They are the ways into your system. So sometimes the way to pitch this is as a security audit.

Charles Humble: Absolutely. Yes, and I do, I use the Maturity Matrix quite a lot in this ebook. Actually, it's one of the things that I reference all the way through it for exactly this reason, because it's, as I said, I think we tend to go to the end a lot. And actually a lot of the stuff is so much earlier on than that.

And I think it's just a, yeah, 

I think it's a really important thing to realize that there's a huge amount you can do. And actually as well, it's gonna save you an awful lot of money. And given the kind of very uncertain business environment that we're in, and people are very kind of worried about investing at the moment for all sorts of quite sensible reasons, this is one of those moments where actually if you're thinking about "I want to get my business onto a more, or my IT within my company onto a more sustainable footing," this is absolutely the right time to be having those conversations with your CFO, with your execs because, you know, this is the time where businesses need to be thinking, "well, how do I cut cost?" And there's a huge amount of waste. I guarantee you if you've not looked at this, there will be a huge amount of waste in your IT you can just get of

and be a bit of a hero and, you know, do good by the planet at the same time.

It's like, what's not to like?

Anne Currie: Yeah, because I mean, different companies, different enterprises, different entities have different roles in the energy transition. For most enterprises, your role is really to adopt modern DevOps practices. But you don't have to start there. You can start with, as you say, a manual audit.

Sometimes I've heard it called the thriftathon, where you just go through and you go, "do you know that machine? Turn it off." You know, you can use the scream test method of "you don't think anyone's using it? Turn it off. Find out if anybody was using it." And then you can use that to kind of step yourself up to the next level.

You and I both know Holly Cummins, who was a guest a couple of episodes back on this podcast. And she introduced the idea of LightSwitchOps, which is the first kind of automation. If you haven't done any automation up till now and you want to learn how to do it, a really good first bit of automation is the ability to turn machines off automatically, maybe for a period overnight, and you try that out on machines like your test suites, just to get yourself to the simplest form of automation. It can also save you money, if you're on the right pricing models and you're in the cloud potentially, or you have the right

infrastructure. It might not always save you money, because you have to have made the right infrastructure choices. It might just be that the machine sits on and doesn't really do anything, you've just turned off your application. But you really want to be turning things off to save power.

You know, and it's a really good way of getting you into the DevOps mindset, which is where everybody needs to be with so many payoffs.
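To make the LightSwitchOps idea concrete, here's a hypothetical sketch that stops non-production EC2 instances at the end of the day. The env=test tag, the region and the idea of running it from a scheduler each evening are all assumptions, and the same pattern applies on any cloud with equivalent APIs.

```python
# Hypothetical LightSwitchOps-style job: stop instances tagged env=test
# outside working hours, e.g. run this from a scheduler at 7pm each evening.
# Tag key/value, region and schedule are illustrative assumptions.
import boto3

def stop_test_instances(region="eu-west-1"):
    ec2 = boto3.client("ec2", region_name=region)
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:env", "Values": ["test"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    instance_ids = [
        inst["InstanceId"]
        for res in reservations
        for inst in res["Instances"]
    ]
    if instance_ids:
        ec2.stop_instances(InstanceIds=instance_ids)
    return instance_ids

if __name__ == "__main__":
    print("Stopped:", stop_test_instances())
```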

Charles Humble: Yes.

Anne Currie: But yes. So, let's go back to the questions. So one of your talks is about writing greener software even when you are stuck on prem, and you talk about the fact that not everybody has the option to move into the cloud.

So what, then? What do you do if you can't move into the cloud?

Charles Humble: Yeah, that's, it is such an interesting question, that. So obviously there are things you can't do or can't do very easily, and one of the most obvious of those is you can't choose green locations on the whole if you're running stuff in your own data centers. So again, going back to these easy wins, an easy win is to use something like Electricity Maps, which is a tool which basically tells you what the energy mix is in a given region.

Oh, and then you say, "I shall run my workloads there 'cause that looks good." There's a little bit more to it than that. You kind of want a location that not only has the greenest energy mix at the moment, but also has credible plans for that to keep improving.
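A tiny sketch of that selection step: rank candidate regions by grid carbon intensity and pick the greenest. The gCO2eq/kWh numbers below are made-up placeholders; in practice you'd pull current or historical values from something like Electricity Maps.

```python
# Pick the candidate region with the lowest grid carbon intensity.
# The gCO2eq/kWh numbers here are made-up placeholders for illustration;
# in practice you'd source them from Electricity Maps or similar.
candidate_regions = {
    "region-a": 450,   # assumed gCO2eq/kWh
    "region-b": 120,
    "region-c": 290,
}

greenest = min(candidate_regions, key=candidate_regions.get)
print(f"Greenest candidate right now: {greenest} "
      f"({candidate_regions[greenest]} gCO2eq/kWh)")
```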

Anne Currie: Yeah.

Charles Humble: Obviously that's really hard to do with your own data centers.

Anne Currie: Yeah.

Charles Humble: As a rule of thumb, you probably don't want to be building new data centers if you can help it because, pouring concrete is not great. There's a lot of costs associated. That said, you do have some advantages in your own data centers 'cause you have some things that you can control that people on cloud can't. I would say, I mean, you know, like being honest about it, if you can move things to public cloud, that's probably going to be better. But if you can't, there are still things you can do. So one of those things is you have control over the lifetime of your hardware. This gets a little bit complex, but it's basically down to, so hardware has an embodied carbon cost.

That's the cost that it takes to construct it, transport it, and dispose of it at the end of its useful lifetime. I mean, it also has the cost it takes to charge it. Now for your laptops, your mobile phones, your end user devices, the embodied carbon absolutely dwarfs the carbon cost used to charge it in its lifetime.

Anne Currie: Yeah.

Charles Humble: What we talk about with end user devices is basically extend the life. Say, you know, 10 years or something like that, keep it. We want to make less of them, is really the point. Servers and TPUs and GPUs and those sorts of things, it's a bit more complicated. The reason it's a bit more complicated is because we are getting an awful lot better at making more efficient servers, for all sorts of reasons. So what that means is the trade-offs with each new generation are more complicated. As an example, a lot of your energy use in your data center is actually gonna be cooling. So a CPU or a TPU that's running less hot requires less cooling. That's a big win. These sorts of things are sufficiently important that actually, until gen AI came along, so really three or four years ago, though we were adding massive amounts of compute, the emissions from our data centers were pretty flat. I mean, it was climbing, but not much. So the point here with your own data centers is you have control over that lifetime. So what you can do is you can do the calculations, assuming you can get the embodied carbon costs from your suppliers, and think about, "well, how long do I keep this piece of hardware going before I turn it over?" Now, I don't want to give you a heuristic on that because it's kind of dangerous, but it's probably not 10 years, right?

It's probably five years-ish. Maybe something like that, but run the maths. But it's absolutely something you can do. You can also take advantage of things like your servers will have power saving modes that you probably don't turn on because we used to worry about that kind of thing.
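As a rough "run the maths" sketch of the hardware turnover question Charles raises above: how long does a more efficient replacement take to pay back its own embodied carbon? All the figures here are assumed placeholders; use your suppliers' embodied carbon data and your own measured energy use.

```python
# Rough "run the maths" sketch: when does replacing a server with a more
# efficient one pay back its embodied carbon? All figures are assumed
# placeholders.
EMBODIED_NEW_KG = 1300          # assumed embodied carbon of the new server, kgCO2e
OLD_KWH_PER_YEAR = 3500         # assumed annual energy use of the old server
NEW_KWH_PER_YEAR = 2400         # assumed annual energy use of the replacement
GRID_INTENSITY = 0.35           # assumed kgCO2e per kWh for your region

annual_saving_kg = (OLD_KWH_PER_YEAR - NEW_KWH_PER_YEAR) * GRID_INTENSITY
payback_years = EMBODIED_NEW_KG / annual_saving_kg

print(f"Operational saving: {annual_saving_kg:.0f} kgCO2e/year")
print(f"Embodied carbon of replacement paid back in ~{payback_years:.1f} years")
```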

'Cause we have this like, again, one of our old assumptions. We used to imagine that if you power a server down, it might not come back quite the same. Actually that's kind of still true, but, you know, it's fixable, right? So enable power saving across your entire fleet, that will make a huge difference, particularly if you've over provisioned, like we were saying earlier, right? 50% of your servers are idle. Well, they can be asleep all the time, and that helps. It's not the same as turning 'em off, but helpful. You can also look at voltage ranges. So your hardware will have a supported voltage range, and you've probably never thought about it, and I'll admit I hadn't until quite recently.

But actually, again, if you're running at scale, if you set the lowest voltage that your servers will support, at a big scale that will make a considerable difference. And then again, some of the other things we talked about, your CPU choice, will make a difference. So think about, you know, "do I need to be buying Intel servers all the time, or could I be buying AMD ones or ARM ones?"

And also look at your cooling. But that's a whole, that's a whole nother complicated topic for all sorts of reasons. Often, well, in brief, some of the most energy efficient methods of cooling have their own set of problems, which make the trade offs really hard. So, like water-based cooling tends to be very efficient,

tends not to be great for local water tables.

Anne Currie: Yeah.

Charles Humble: It's complicated. But, yeah, as I say, there are a lot of things that are definitely harder. And if you're running in like a hybrid environment, chances are, if you have a choice of going public cloud or your own data center, public cloud is probably better. It's absolutely in Google and AWS and Microsoft's interests to run their data centers as efficiently as possible, 'cause that's where their cloud profit margin is, right?

Anne Currie: Absolutely.

Charles Humble: The less it's costing them to run the data center, while you are still paying the same amount, the more money they make.

Anne Currie: Well, I always laugh when I see the numbers on Graviton. So when AWS attempts to persuade you, quite correctly, to move your applications from Intel chips onto ARM chips, they say, "oh, this will save 40% on your hosting bill and 60% on your carbon emissions."

And you think, that suggests to me you've just pocketed quite a nice upgrade in your profitability. And I have no problem with that whatsoever; as things get better, I have no problem with people making profits out of it. So I'm gonna pick up on something. I think everything you've said there is very true.

And I'm gonna take a slightly different take on it, which is: remember that what Charles is saying there is quite detailed stuff. Not everybody here will be a hardware person, but you will have specialists within your organization who can make all these hardware judgements.

The interesting thing is that they can. And it is always the case that if you have specialists in your organization, the best way to do better is to persuade them that they want to do better. So, if you could persuade your specialists to actually take an interest in this and to find ways of improving the efficiency of your systems and cutting the carbon emissions, they will do better at it than you will.

Charles Humble: 100%.

Anne Currie: The best thing you could do is persuade them to focus their giant specialist brains on the subject, because the likelihood is that they probably aren't thinking about it, or, you know, it is not top of their mind. They maybe think they're not even allowed to start thinking about it.

If, at a high level, you can actually get your specialists to turn their attention to these efficiency issues, these carbon reduction issues, that's so much more effective than you going and reading up on it yourself. Get them involved. Go out and talk to people. Use your powers of persuasion, because what lots of people listening should take away from what Charles

just said is that there is a lot of stuff that can be done by your specialist teams that they might not be thinking about doing, or they might feel they don't have the time or focus to do. You can potentially help them by focusing them or giving them some budget or some time to work on it.

Charles Humble: Definitely. Absolutely. Yeah. No, I'm a big believer in specialization in our industry, and I think actually this idea that we can all know everything is not helpful. Like, absolutely, if you've got hardware people, go and tell the hardware people, and it's a thing of incentivizing.

It's like, you know, "we can save money by doing some of these things, or we can reduce our carbon by doing some of these things, and those are good things to do." Yeah, a hundred percent agree with all of that. No disagreements at all.

Anne Currie: Yeah, no, it's interesting isn't it, that most of human progress has come from the realization that specialists kick the butt of generalists. And I'm a generalist, so you know, I wish it wasn't true. My job is to kind of encourage specialists to be specialists and, you know, this is not new news.

It's the theme of Adam Smith's The Wealth of Nations, which he wrote in the 1770s, about why the industrial revolution was happening. It wasn't to do with any kind of technology or anything else. It was the discovery that specialists kick the butt of generalists.

Charles Humble: Hundred percent, yes.

Anne Currie: But now we're gonna get to the final tricky question that we have for you, Charles, one you've been thinking about. Your work often emphasizes the importance of transparency, knowing the carbon footprint of what we build. What tools and practices do you recommend for people to do that?

Charles Humble: Oh, that is a hard question. Yes. Frustratingly hard, actually. So the first thing is we often end up using proxies,

and the reason we end up using proxies is 'cause measurement is genuinely quite difficult. So cost is quite a good proxy. In Bill Gates' book, blanking on the name of the book, oh, How to Avoid a Climate Disaster,

Anne Currie: Oh yeah. Which is excellent. And again, everybody listening to this should be reading it. Yeah.

Charles Humble: Absolutely. So he, in that book, he does a bunch of calculations, which he calls green premiums and they're

basically the cost of going green.

Now, He doesn't do one for our industry, but I would wager, because we are also profligate, I would wager that our green premium, and I haven't worked this out, I will admit it, but I would think our green premium is probably a negative number.

So, that's to say,

going green is probably cheaper for us. Right.

Anne Currie: I agree.

Charles Humble: So cost is a very good proxy. It is an imperfect proxy. One of the reasons it's an imperfect proxy is because, for example, if you're running a green energy mix, that's not going to be reflected in your electricity bill at the moment. That may change, but at the

moment it doesn't happen.

Right. So it is imperfect, but

Anne Currie: Well, it doesn't happen in some places and in other places it does. So if you are on prem and you're in a country with dynamic pricing like Spain or zonal pricing, like talking about the UK having in future, that's still very up in the air, then it does. But if you're in the cloud, even in those areas, it doesn't at the moment.

Charles Humble: Absolutely. But nevertheless, 'cause as I was saying, you know, like probably half of your servers are doing nothing useful. So cost is a pretty good starting point. Another thing is CPU utilization. So there's something we haven't really talked about, which is this idea, Google calls it energy proportionality,

Anne Currie: Yeah.

Charles Humble: the observation that when you turn a machine on, you turn a server on, it has a static power draw, and that static power draw is quite a lot. How much depends on how efficient the server is, but it might be 50% or something like that. So when it's sitting idle, it's actually drawing a lot of power. The upshot of this is you'd usually have like an optimum envelope for a given server, and that might be somewhere between 50 and about 80%.

It may be a bit lower than that, depending on how good the chips are. Above about 80% you tend to get contention and those sorts of things going on. Not great. But around and about that operating window. So again, keeping your CPU utilization high, but not maxed out, is another good one.
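A back-of-the-envelope illustration of why energy proportionality makes utilisation matter: with a high static draw, two half-idle servers burn noticeably more energy than one busier one doing the same total work. The 50% idle figure echoes Charles's example; the peak wattage and the linear power model are assumptions.

```python
# Illustrative energy-proportionality arithmetic. Assumed figures:
# a server draws 50% of its peak power doing nothing, and peak is 400 W.
IDLE_FRACTION = 0.5
PEAK_WATTS = 400

def server_watts(utilisation):
    """Very crude linear model between idle draw and peak draw."""
    return PEAK_WATTS * (IDLE_FRACTION + (1 - IDLE_FRACTION) * utilisation)

# Same total work: two servers at 30% utilisation vs one at 60%.
two_low = 2 * server_watts(0.30)
one_high = 1 * server_watts(0.60)
print(f"Two servers at 30%: {two_low:.0f} W")
print(f"One server at 60%:  {one_high:.0f} W")
```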

Hardware utilization is another good one. Beyond that, so all of the cloud providers have tools of varying usefulness. Google's carbon footprint tool is probably best in class, at least in my experience. I think they take this stuff very seriously and they've done a lot of very good work.

Microsoft Azure tools are also pretty good. AWS's ones, so they have just released an update literally as we're recording this, and I hadn't had a chance to go and look at what's in the updated version. I'm going to say I think AWS is still a long way behind their competitors in terms of reporting.

Anne Currie: Yeah.

Charles Humble: With a slight proviso that I haven't looked at what's in the new tool properly. But again, there are things there that you can use. There's a tool called Cloud Carbon Footprint, which is an open source thing by ThoughtWorks, and that's quite good. It will work across different cloud providers, so that's kind of nice. You could probably adapt it for your own data centers, I would imagine. Of course the GSF has a formula for calculating carbon intensity as well. So that's more of a sort of product carbon footprint or lifecycle assessment type approach. It's not really suitable for corporate level accounting or reporting or that sort of thing, but that's quite a good tool as well. And there are a variety of other things you can use, but as I say, if we're talking the very beginnings, you probably start with the proxies. If you've got a choice of cloud provider, think about the cloud provider that gives you the tooling you need.

And you know, that might, again, going back to our assumptions, time was you would choose AWS. Maybe you shouldn't be choosing AWS now, or at least maybe you should be thinking about is AWS the right choice.

At least until they, you know, sort of put their house in order a bit more. These are questions that we can reasonably ask. And in general, if you are working with vendors, whether they're AI vendors or whatever, it is entirely reasonable to go and say, "well, I want to know what your carbon story looks like." And if they won't tell you, go somewhere else. In the case of AI, none of the AI companies will tell you. They absolutely won't. And so my advice, if you're looking at running generative AI: everything we just said applies to AI, like it applies to everything else. There are a bunch of very specific AI related techniques, distillation, quantization, pruning, those sorts of things. Fine. But really my advice is, well, use an open source model, and look at something like the ML.ENERGY leaderboard, which will give you an idea of what the carbon cost looks like. And don't use AI from a company that won't tell you, would be my advice. You know, and maybe we can embarrass some of these companies into doing the right things. You never know.

Anne Currie: It'd be nice, wouldn't it? It's interesting, this. So in April, Eric Schmidt got up in front of the US government in one of their committees and said, well, you know, at the current rates, AI is going to take up 99% of the grid electricity in the US.

And you think, "it's interesting, isn't it," because that's not a law of nature. There are plenty of countries that are looking at more efficient AI; China are certainly looking at more efficient AI. They want to compete. They wanna be able to run AI. Because in the end, the business that's going to collapse if AI requires 99% of the US grid is AI, because, you know, if something cannot go on, it will stop.

Charles Humble: It's a desperate source of frustration for me because it is completely unnecessary.

Anne Currie: Well, it's, you just have to be a bit efficient.

Charles Humble: Just in brief, 'cause again, this is like a whole separate podcast probably,

but just in brief, there are a bunch of things that you can do

Anne Currie: Absolutely.

Charles Humble: that make a huge difference, both when you are collecting your data, when you are training your models, and when you're running them in production afterwards. I have just done a piece of work for The New Stack on federated learning, and in the process of doing that, I talked to somebody called Professor Nick Lane, who is at Cambridge University. And he talked about, so one of the solutions to the data center cooling problem, which we touched on earlier, is basically what you do with the waste heat. And there are lots of companies in Europe that are looking at using it for things like heating homes or, you know, heating municipal swimming pools, that sort of thing, right? You can't do that with an Amazon or a Google or a Microsoft facility, because you have to construct the data center close to where the waste heat is gonna be used.

But there are lots of these small data centers, particularly in Europe. There are companies like T Loop that are doing a lot of this work. And he made the point that with federated learning, you can actually combine these smaller facilities together and then, you know, be training potentially very large models on much, much smaller data centers, which I thought was fascinating. There's a guy called, Chung is his surname, and apologies to him, I'm blanking on it, Jae-Won Chung. He's done some extraordinary work looking at, so when we split stuff across GPUs,

that has to be synchronized, right? So we divide the workload up because it's too big to fit in a GPU and we split it across a bunch of different GPUs and we run all of those GPUs at full tilt, but we don't have to. Because we can't divide the workloads up evenly.

So you have some workloads that are tiny, but this GPU is still running at full power. And what he worked out was, well, if we slow those GPUs down, the job will still end at the same point, but it'll use a lot less energy. So he's built something called Perseus, and on his tests with things like BLOOM and GPT-3, it's about 30% less energy use just from using that

for exactly the same throughput. So there's no throughput loss, there's no hardware modification. The end results are exactly the same, and you just save 30% of your energy bill, which is a big deal.
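As a rough sketch of the intuition behind that result, not the Perseus algorithm itself: if a GPU holding a small shard finishes early and then waits at the synchronisation barrier, running it slower at lower power costs no wall-clock time but saves energy. Every figure below is an assumption for illustration.

```python
# Intuition sketch only -- not the Perseus algorithm. Assumed figures:
# GPU A has the big shard (10 s of work), GPU B a small one (4 s), and
# the step can't finish until both are done (synchronisation barrier).
STEP_TIME = 10.0          # seconds, set by the slowest GPU
FULL_POWER = 700.0        # watts at full tilt (assumed)
SLOW_POWER = 250.0        # watts when clocked down (assumed)
IDLE_POWER = 80.0         # watts while waiting at the barrier (assumed)

# Naive: GPU B sprints for 4 s then idles for 6 s.
naive_b = 4.0 * FULL_POWER + 6.0 * IDLE_POWER
# Slowed: GPU B is clocked down so its 4 s of work stretches over 10 s.
slowed_b = STEP_TIME * SLOW_POWER

print(f"GPU B energy, sprint-then-wait: {naive_b:.0f} J")
print(f"GPU B energy, slowed to fit:    {slowed_b:.0f} J")
# Either way the step still takes 10 s, so throughput is unchanged.
```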

Then you go, as I say, things like distillation and quantizing and pruning and shrinking your model size, all of that stuff.

So it frustrates me because it's so unnecessary. I think we need a carbon tax and I think the carbon tax needs to be prohibitive. And I think, you know, bluntly, I think companies like OpenAI should be pushed outta business if they don't get their house in order in time.

Hannah Ritchie's book, Not the End of the World, is possibly my favorite book on climate. And again, if you haven't read it, go and read it. She has a wonderful quote in there where she says, "I've talked to lots of economists and all of the economists I've spoken to agree that we need some sort of carbon tax."

And then she goes on to say, "it's maybe the only thing that economists agree on," which I thought was a fine and excellent line.

Anne Currie: It is really interesting, 'cause we disagree slightly on this. You're not a huge AI fan; I'm a massive AI fan. I want AI and I also want a livable climate. And they are not mutually exclusive. They can both be done. I mean, you don't love AI as much as I love AI, but we are both in agreement that it is not physically impossible to have AI and effective control of climate change. Because, as you were saying about the federated learning and, you know, optimizing your GPUs towards the bottleneck tasks and things like that, as long as you have workloads that are time insensitive, that can be shifted in time and maybe delayed and maybe separated and then globbed together again,

they're very good workloads to run on renewable power, which is variably available. So in fact, AI is potentially incredibly alignable with the energy transition. The fact that we don't always do it is a travesty and it's so bad for AI as well as being bad for the planet.

Charles Humble: I want to push back slightly on you saying I'm not a fan of AI. So I have. Quite strong concerns specifically about generative AI that are ethical and moral as well as environmental.

Anne Currie: Which I can see.

Charles Humble: And in essence it comes down to the fact you are taking a bunch of other people's work and you are building a machine that plagiarizes that work and you are not compensating those people for it. And you are also, basically you have to do tuning of the model. So reinforcement learning with human feedback and the way that, that's done is pretty horrifying when you dig into it. It usually involves, you know, people in places like Kenya being paid $3 an hour to look at the worst contents of the internet for day after day.

I mean, one can imagine what that does to you. So I have quite specific reservations with generative AI, the way that we are doing it. As it goes, I think there are ways that we could build generative AI that I wouldn't have these ethical problems with, that we're not doing. More generally, I think generative AI is interesting. I don't know that it's useful, but I do think it's interesting. And more broadly, I'm not against AI at all. I'm like, you know,

I've done work with a company that, for example, is using AI to increase the window in which you can treat stroke patients, by like hours.

And it's amazing. Amazing work. So they're basically doing image processing to identify different types of stroke. And some stroke patients, the window is much wider. So, you know, we

think of it as being 4.5 hours but it's much bigger. Stuff like

that. And, as you say, grid balancing is gonna get more complicated with renewables, and AI probably has a role to play there.

And I'm not anti, I just think that there are things that we are doing as an industry which are reckless and ill-judged. And, you know, in my tiny little way... I mean, I'm aware that it's like blowing a kazoo in a thunderstorm: it's quite amusing, but it doesn't actually do much for anybody. But in my own little way, I want to be sort of beating the drum. As an industry, I think we need to get better, right? And part of the reason I think we need to get better is because the work that we do has a huge impact on the whole planet now, and on society and all sorts of things. And we are still acting like we're a little cottage industry and what we do is inconsequential, but it's not true.

 So my reservations with gen AI is, I think it's being done in a desperately irresponsible way, but that doesn't mean it has to be. It just means that's what we're doing. And hey, I might be wrong. You know, I'm not an ethicist. I just like, I just have reservations. Also, I am a writer. And a musician, right?

So, you know, like I do have skin in the game. I kind of want generative AI not to work. 'Cause otherwise I don't really have a living anymore, which is a bit of a worry. So, you know, I'm not a neutral observer on this at all, but I just think the way we're doing this is morally, ethically dubious, as well as being very bad for the climate. And I don't think it has to be any of those things.

Anne Currie: Yeah, so it's interesting, we have a slightly different take, 'cause I'm also a writer and a painter. But I've always been so rubbish at making money out of writing and painting that I don't really have anything to say on that. But that is my own fault, a little bit.

Charles Humble: The last question, I'm looking at your script now, sorry, 'cause it's a shared Google doc. And your last question is about, so I write music in my free time in a band called Twofish. And the question is, if you could score the soundtrack for a more sustainable future, what would it sound like?

Anne Currie: I forgot about the question. Yeah.

Charles Humble: Interesting. So we did the opposite thing actually. There's a piece on the last Twofish album called Floe, and, everything is written by the two of us, but I started that one. And when I started it, what I was trying to do was describe what climate breakdown might sound like in music.

That was kind of my starting point. Not sure anyone hearing it would get that, but what I did was I went and recorded a bunch of field recordings, so, you know, California wildfires and that sort of thing, tuned them all to A flat minor, as you do, and then wrote this very dark, scary piece

that gets a bit drum and bassy as it goes on. It's very black and industrial and dark and quite grim and I rather like it. So I think we just have to go the opposite, right? We'd have to go the other end of this. 

Anne Currie: So Twofish, what's the name of your last album? In fact, which album would you recommend? 

Charles Humble: It's called At Least a Hundred Fingers. That's the last album. And, yeah, Twofish is the band, Twofish as in the encryption algorithm, fellow nerds. So yeah, with this one, the sustainable future one rather than the climate breakdown one, I think some of my favorite composers, classical composers, would be like late 19th, early 20th century.

People like that. They were very inspired by the natural world, and they tended also to draw a lot on the folk tunes of the countries where they worked. So I think melodically my starting point might be to go to a folk tune, and then use very traditional instruments. So have like a, maybe a string section, you know, sort of violins, violas, cello. So try and get some of that lift and air and that sort of thing into it. And then have the more electronic stuff, the stuff that I typically do, be very kind of intricate, interconnected, kind of supporting lines, so that you have something melodic that is folk, quite traditional instruments, and then this kind of sense of interconnectedness and sort of mechanisms working, something like that. I might have a go at that actually. Perhaps there'll be a third Twofish album that has that on it. You never know. Yeah. If you want to look my stuff up, my website, my company, is Conissaunce, www.conissaunce.com. I'm Charles Humble on LinkedIn. I'm also...

Anne Currie: There will be, we'll have links below in the show notes.

Charles Humble: So yeah, you can find me on all of those. And you can find the music there as well.

Anne Currie: Excellent. And I really recommend the albums. I like them a lot. They're great. 

Charles Humble: Thank you.

Anne Currie: So thank you very much, and thank you to all the listeners today. As a reminder again, all the links that we've talked about today, we have slightly overrun, will be in the show notes below. So, until the next time, thank you very much for listening, and happy building Green Software.

Charles Humble: Thank you very much indeed for having me. It's been a pleasure. Thanks for listening and goodbye.

Anne Currie: Goodbye. 

Chris Skipper: Hey everyone, thanks for listening. As a special treat, we're going to play you out with the piece that Charles was talking about, Floe by Twofish. If you want to listen to more podcasts by the Green Software Foundation, head to podcast.greensoftware.foundation to listen to more.

Bye for now.