Here is the tension I sit with every single day: we built a company called Greensphere.ai to help the world make better environmental decisions – and the technology we use to do that consumes a meaningful amount of energy to run.
I am not going to paper over that with a press release.
When we started Greensphere.ai, the question that kept me up at night was not “how do we market ourselves as a green company?”
It was a harder one: **how do we actually become one?**
Because slapping a sustainability label on an AI product while quietly running inference jobs on carbon-heavy infrastructure is not building a green AI company. It is building a sustainability-themed one. There is a significant difference.
—
The Problem Is Real, and the Numbers Are Uncomfortable
Training large models has a measurable environmental cost – and it’s not trivial.
Inference, the part where the model actually answers your questions at scale, adds up quietly but consistently over time. For a company whose core promise is environmental impact, ignoring this would be a foundational contradiction.
So we made a set of architectural and operational choices early that shaped everything else.
First, we right-sized the models. The AI industry has a reflexive pull toward bigger: bigger parameters, bigger context windows, bigger compute budgets.
We pushed back on that.
For the specific tasks Greensphere.ai handles, a carefully fine-tuned smaller model consistently outperforms a generalist large one, and it does so at a fraction of the energy cost.
We benchmarked this internally: our domain-specific models run significantly more efficiently than the general-purpose models we started with, without sacrificing performance on our core use cases.
Second, we chose our infrastructure deliberately.
Where our AI runs matters. We prioritise regions with better renewable energy profiles and treat infrastructure as part of the product – not an afterthought.
It’s not perfect – the grid is never 100% clean – but it’s a meaningful lever.
We track our compute footprint and use internal estimates to understand, and challenge, its environmental impact over time. That number is on my dashboard. It influences decisions.
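The estimate itself is simple back-of-envelope arithmetic: energy consumed times the carbon intensity of the grid it ran on. A minimal sketch, with illustrative placeholder numbers rather than our actual measurements:

```python
def estimate_emissions_kg(gpu_hours, avg_power_kw, grid_intensity_kg_per_kwh):
    """Rough CO2e estimate: compute energy used, then multiply by grid
    carbon intensity. All parameter names and values here are
    illustrative, not Greensphere.ai's real figures."""
    energy_kwh = gpu_hours * avg_power_kw          # total energy drawn
    return energy_kwh * grid_intensity_kg_per_kwh  # kg CO2e


# Example: 100 GPU-hours at 0.5 kW on a 0.25 kg/kWh grid
footprint = estimate_emissions_kg(100, 0.5, 0.25)  # 12.5 kg CO2e
```

The point is less the precision of the number than having one number that moves when the infrastructure choices change.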
Third, we built efficiency into the product loop itself. Every feature we ship, we ask: does this require a model call, or can it be handled with a lighter-weight process?
Caching, retrieval-augmented generation over large model calls, and smart batching are not just engineering optimizations for us – they are part of what it means to practice what we preach.
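The caching side of that discipline is conceptually tiny: check for a previously computed answer before paying for inference. A minimal sketch, with hypothetical names (`cached_answer`, the in-memory dict) standing in for whatever cache layer a team actually runs:

```python
import hashlib

# Hypothetical in-memory cache keyed by a hash of the normalized prompt.
_cache = {}

def cached_answer(prompt, model_call):
    """Return a cached response when possible; only invoke the model on a miss."""
    key = hashlib.sha256(prompt.strip().lower().encode()).hexdigest()
    if key in _cache:
        return _cache[key]       # no model call, no extra compute
    result = model_call(prompt)  # pay the inference cost once
    _cache[key] = result
    return result
```

In production this would be a shared cache with eviction and invalidation rules, but the energy logic is the same: every cache hit is an inference job that never ran.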
—
Agentic AI Raised the Stakes Further
When we started building agentic AI capabilities – systems that can plan, take actions, and run multi-step workflows autonomously – the energy question got sharper.
An agent that loops through ten reasoning steps to answer a question that could have been answered in two is wasteful. We now treat agent step-count efficiency as a first-class metric alongside accuracy. Fewer steps, same outcome: that is the goal.
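Treating step count as a first-class metric just means reporting it next to accuracy whenever agent runs are evaluated. A minimal sketch, assuming a hypothetical log of `(steps_taken, succeeded)` pairs rather than our production schema:

```python
def step_efficiency(runs):
    """Report accuracy alongside average steps per successful run.

    `runs` is a list of (steps_taken, succeeded) tuples; the shape is
    illustrative, not a real evaluation harness.
    """
    successes = [steps for steps, ok in runs if ok]
    accuracy = len(successes) / len(runs)
    avg_steps = sum(successes) / len(successes) if successes else float("inf")
    return accuracy, avg_steps


# Two successful runs (2 and 10 steps) and one failure:
# accuracy 2/3, average 6 steps per success.
acc, steps = step_efficiency([(2, True), (10, True), (4, False)])
```

An agent change that keeps `acc` flat while lowering `steps` is a win, even though nothing a user sees has changed.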
This is what energy-efficient AI actually looks like in practice. It is not a marketing position. It is an engineering discipline.
—
What I Got Wrong Early
I assumed that choosing the right tools would be enough. It is not. The choices have to be active and ongoing. Cloud providers shift their energy mixes. Model usage patterns change as the product grows. The carbon math you did in year one does not automatically hold in year two.
We now do ongoing reviews of our infrastructure footprint – not just cost, but estimated emissions. It is imperfect because the data from providers is imperfect. But imperfect accountability is better than none, and it keeps the question alive inside the company rather than filed away in a one-time audit.
—
The Honest Position
Greensphere.ai is not a zero-impact company. No software company is. But we are trying to be a company where the gap between what we say and what we do is as small as we can make it – and where we are honest when that gap still exists.
Green sustainable AI is not a badge you earn once. It is a standard you keep re-earning through decisions that are sometimes slower, sometimes more expensive, and sometimes less impressive-sounding than the alternative.
I think that is worth saying plainly, because the AI industry right now has a lot of green language and not always a lot of green practice.
—
For other founders building at the intersection of AI and sustainability: what operational choices have you made – or avoided – to close the gap between your company’s mission and its actual footprint?
I am genuinely curious what you have found works.
#GreenAI #FounderStory #SustainableAI