Green AI by Default: Energy Reduction Techniques for LLMs in SE

Authors: Enrique Barba Roque

Published in International Conference on Software Engineering - Doctoral Symposium, Rio de Janeiro, Brazil, 2026

Large Language Models (LLMs) are increasingly being applied to Software Engineering (SE) tasks, achieving high accuracy on a wide range of problems. However, their high computational demands and energy consumption raise sustainability concerns and hinder their use on consumer hardware and resource-constrained platforms. Multiple optimization techniques exist, but they are often neglected due to the technical difficulty of applying them during model development. This research aims to improve the accessibility of optimization techniques by (1) making energy reporting of LLMs more accessible, (2) streamlining and automating optimization techniques, (3) providing guidelines to select appropriate techniques for different use cases, and (4) exploiting these techniques to design efficient Small Language Model (SLM) ensemble architectures for LLM-enabled applications. Expected contributions include methods for measuring and reporting energy usage, tools to automate model compression, guidelines that support developers through the optimization process, and the use of these results to explore alternative deployment setups for compressed LLMs.

Pre-print