Optimization of Adaptive Prompt Engineering for Large Language Models via Bayesian Inference in Low-Resource Settings

Authors

  • Arthur J. Williams — School of Computing and Information Systems, The University of Melbourne, Melbourne VIC 3010, Australia
  • Katherine L. McDough — School of Computing and Information Systems, The University of Melbourne, Melbourne VIC 3010, Australia
  • Thomas S. Halloway — School of Computing and Information Systems, The University of Melbourne, Melbourne VIC 3010, Australia

DOI:

https://doi.org/10.71465/fair601

Keywords:

Bayesian Optimization, Prompt Engineering, Large Language Models, Low-Resource NLP

Abstract

The rapid proliferation of Large Language Models (LLMs) has necessitated the development of effective prompt engineering strategies to harness their full potential. However, the stochastic nature of LLM outputs and the vast, discrete combinatorial space of natural language make manual prompt design a laborious and often suboptimal process. While automated prompt optimization techniques exist, they typically require significant computational resources, massive datasets, or access to model gradients, rendering them inaccessible for low-resource environments and edge computing applications. This paper proposes a novel framework for optimizing adaptive prompt engineering using Bayesian Inference. We introduce a sample-efficient methodology that treats prompt selection as a black-box optimization problem, utilizing Gaussian Processes to model the latent manifold of prompt effectiveness. By strategically balancing exploration and exploitation through acquisition functions, our approach converges on high-performing prompts with significantly fewer model queries than traditional grid search or reinforcement learning paradigms. We demonstrate that this method achieves state-of-the-art performance on reasoning benchmarks while reducing token consumption by an order of magnitude, making advanced prompt engineering feasible for academic laboratories and limited-budget applications.
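The abstract frames prompt selection as black-box optimization with a Gaussian Process surrogate and an acquisition function balancing exploration and exploitation. The following is a minimal, self-contained sketch of that loop; the 2-D "prompt features," the toy scoring function, the RBF kernel length scale, and the UCB coefficient are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in: 2-D feature vectors for 40 candidate prompts
# (in practice these would be prompt embeddings).
X = rng.uniform(0.0, 1.0, size=(40, 2))

def llm_score(x):
    # Stand-in for a noisy LLM benchmark score of a prompt; peaks near (0.6, 0.6).
    return np.exp(-8.0 * np.sum((x - 0.6) ** 2)) + 0.01 * rng.normal()

def rbf(A, B, length=0.3):
    # Squared-exponential kernel between two sets of feature vectors.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

tried, scores = [0], [llm_score(X[0])]           # seed with one query
for _ in range(10):                              # small query budget
    Xt = X[tried]
    K = rbf(Xt, Xt) + 1e-4 * np.eye(len(tried))  # GP Gram matrix + noise jitter
    Ks = rbf(X, Xt)
    mu = Ks @ np.linalg.solve(K, np.array(scores))        # posterior mean
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    ucb = mu + 2.0 * np.sqrt(np.clip(var, 0.0, None))     # UCB acquisition
    ucb[tried] = -np.inf                         # never re-query a prompt
    nxt = int(np.argmax(ucb))
    tried.append(nxt)
    scores.append(llm_score(X[nxt]))

best = X[tried[int(np.argmax(scores))]]
print("best prompt features:", best, "best score:", max(scores))
```

Each iteration spends exactly one model query, which is where the claimed sample efficiency over grid search comes from: the acquisition step directs the next query toward candidates that are either predicted to score well (exploitation) or remain highly uncertain (exploration).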


Published

2026-04-30