Nonlinear constrained optimization requires balancing exploration of the search space against minimization of the objective. This paper presents a Bayesian self-supervised multi-layer perceptron (MLP) model that prioritizes constraint satisfaction while minimizing the user's reliance on initialization. Unlike classical solvers, no initial guess is required: candidate solutions are drawn from a standard normal distribution and iteratively refined. Bayesian optimization plays a two-fold role, automating hyperparameter tuning and providing intelligent search-space exploration that helps the model escape local optima and converge to near-globally optimal feasible solutions. The framework embeds differentiable penalties for inequality, equality, and bound constraints directly in the loss, strongly steering training toward feasibility. Training is further stabilized by gradient clipping, regularization, and co-optimization of inputs and parameters. Benchmark evaluations on standard test functions and engineering design problems show that the model consistently finds feasible solutions with objective values competitive with or better than IPOPT, demonstrating its robustness as a feasibility-oriented surrogate solver.
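The core ideas summarized above (standard-normal initialization, differentiable constraint penalties in the loss, and gradient clipping) can be illustrated with a minimal sketch. The toy problem, penalty weight, and learning rate below are hypothetical choices for demonstration, not the paper's actual benchmark setup or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem for illustration:
#   minimize  f(x) = (x0 - 1)^2 + (x1 - 2)^2
#   subject to g(x) = x0 + x1 - 2 <= 0
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def ineq(x):  # feasible when ineq(x) <= 0
    return x[0] + x[1] - 2.0

def penalized_grad(x, rho=100.0):
    # Differentiable quadratic penalty rho * max(0, g(x))^2 added to f.
    viol = max(0.0, ineq(x))
    grad_f = np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] - 2.0)])
    grad_pen = 2.0 * rho * viol * np.array([1.0, 1.0])  # dg/dx = (1, 1)
    return grad_f + grad_pen

# Candidate drawn from a standard normal (no user-supplied initial guess),
# then refined by gradient descent with norm clipping for stability.
x = rng.standard_normal(2)
for _ in range(5000):
    grad = penalized_grad(x)
    norm = np.linalg.norm(grad)
    if norm > 1.0:           # clip the gradient norm to 1
        grad = grad / norm
    x = x - 0.002 * grad
```

For this toy instance the penalized minimizer sits near (0.5, 1.5), the projection of the unconstrained optimum onto the constraint boundary, with a small residual violation that shrinks as the penalty weight grows; this mirrors how a penalty-based loss trades a bounded feasibility gap for unconstrained differentiable training.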
Published in: 8th IEOM Bangladesh International Conference on Industrial Engineering and Operations Management, Dhaka, Bangladesh
Publisher: IEOM Society International
Date of Conference: December 20-21, 2025
ISBN: 979-8-3507-4441-5
ISSN/E-ISSN: 2169-8767