Large language models (LLMs) are becoming an integral part of our daily work. In ecology, LLMs are already applied to a wide range of tasks, such as extracting georeferenced data or taxonomic entities from unstructured texts, synthesizing information, coding, and teaching. Further development and wider adoption of LLMs in ecology, as in science more broadly, are likely to intensify and accelerate the research process and increase publication output, thereby pressuring scientists to keep up with the elevated pace, which in turn creates a feedback loop that promotes even greater LLM use.

However, this all comes at a cost. Although these costs are not borne directly by end users, aside from occasional response delays, LLMs require considerable computational power and are energy-demanding both during their initial training phase and during subsequent operational use. Additional, partly externalized energy costs arise from the intensive web searching and processing of retrieved sources performed by features such as Deep Research. At present, the total energy costs of LLMs remain difficult to estimate, largely because the companies that develop them disclose little about their infrastructure and consumption.
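To make the scale of such costs concrete, the following minimal back-of-envelope sketch in Python estimates the annual inference energy of a hypothetical research community. Every parameter value is an illustrative assumption rather than a measured figure, precisely because vendors disclose so little; real per-query costs may differ by an order of magnitude, and agentic workflows such as Deep Research can multiply them further.

```python
# Back-of-envelope estimate of the operational (inference) energy of LLM use.
# All parameter values are illustrative assumptions, not measured data.

WH_PER_QUERY = 0.3     # assumed energy per chat query, in watt-hours (unverified)
QUERIES_PER_DAY = 40   # assumed queries per researcher per working day
WORKING_DAYS = 220     # assumed working days per year
RESEARCHERS = 1_000    # assumed size of a research community

annual_wh = WH_PER_QUERY * QUERIES_PER_DAY * WORKING_DAYS * RESEARCHERS
annual_kwh = annual_wh / 1_000

print(f"Assumed annual inference energy: {annual_kwh:,.0f} kWh")
# -> Assumed annual inference energy: 2,640 kWh
```

Even under these deliberately simple assumptions, the estimate excludes training, hardware manufacturing, and data-centre cooling, which illustrates why independent totals are so hard to derive without vendor transparency.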