This is the home page of the Open Journal of Mathematical Optimization, an electronic journal of computer science and mathematics owned by its Editorial Board.

The Open Journal of Mathematical Optimization (OJMO) publishes original and high-quality articles dealing with every aspect of mathematical optimization, from numerical and computational questions to the theory of mathematical optimization problems. The topics covered by the journal are classified into four areas:

  1. Continuous Optimization
  2. Discrete Optimization
  3. Optimization under Uncertainty
  4. Computational aspects and applications

The journal publishes high-quality articles in open access free of charge: authors do not pay to publish and readers do not pay to access the published papers, in line with the principles of Fair Open Access. The journal supports open data and open code whenever possible, and authors are strongly encouraged to submit code and data sets along with their manuscript.


Indexing

Awards

The 2021 Beale–Orchard-Hays Prize of the Mathematical Optimization Society (MOS) was awarded to a paper published in OJMO:

Giacomo Nannicini. On the implementation of a global optimization method for mixed-variable problems. Open Journal of Mathematical Optimization, Volume 2 (2021), article no. 1, 25 p. doi: 10.5802/ojmo.3

e-ISSN: 2777-5860

New articles

Tight computationally efficient approximation of matrix norms with applications

We address the problems of computing operator norms of matrices induced by given norms on the argument and the image space. It is known that, aside from a handful of “solvable cases”, most notably the case when both given norms are Euclidean, computing the operator norm of a matrix is NP-hard. We specify rather general families of norms on the argument and the image space (“ellitopic” and “co-ellitopic”, respectively) allowing for reasonably tight, computationally efficient upper bounding of the associated operator norms. We extend these results to bounding the “robust operator norm of an uncertain matrix with box uncertainty”, that is, the maximum of the operator norms of matrices representable as a linear combination, with coefficients of magnitude 1, of a collection of given matrices. Finally, we consider some applications of norm bounding, in particular, (1) computationally efficient synthesis of affine non-anticipative finite-horizon control of discrete-time linear dynamical systems under bounds on the peak-to-peak gains, (2) signal recovery with uncertainties in the sensing matrix, and (3) identification of parameters of time-invariant discrete-time linear dynamical systems via noisy observations of states and inputs on a given time horizon, in the case of “uncertain-but-bounded” noise varying in a box.
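
A quick illustration of the Euclidean “solvable case” mentioned in the abstract (standard linear algebra, not the ellitopic bounding technique of the paper): the operator norm induced by Euclidean norms on the argument and the image space is exactly the largest singular value, so it can be computed directly, e.g. with NumPy.

    import numpy as np

    # The operator norm induced by Euclidean norms on argument and image space
    # equals the largest singular value of the matrix.
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

    spectral_norm = np.linalg.norm(A, 2)                 # largest singular value
    largest_sv = np.linalg.svd(A, compute_uv=False)[0]   # same quantity via SVD

    assert np.isclose(spectral_norm, largest_sv)
    print(spectral_norm)   # ~5.4650 for this A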

Available online: PDF

AC Optimal Power Flow: a Conic Programming relaxation and an iterative MILP scheme for Global Optimization

We address the issue of computing a global minimizer of the AC Optimal Power Flow problem. We introduce valid inequalities to strengthen the Semidefinite Programming relaxation, yielding a novel Conic Programming relaxation. Leveraging these Conic Programming constraints, we dynamically generate Mixed-Integer Linear Programming (MILP) relaxations whose solutions asymptotically converge to global minimizers of the AC Optimal Power Flow problem. We apply this iterative MILP scheme to the IEEE PES PGLib [2] benchmark and compare the results with two recent Global Optimization approaches.
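
For background only, here is the classical semidefinite relaxation of AC OPF on which such approaches build (textbook material, not the strengthened conic relaxation or the valid inequalities introduced in the paper): collecting the products of complex bus voltages in a Hermitian matrix $W$ makes power flows and voltage-magnitude limits linear in $W$, and the relaxation keeps the convex part of the constraint while dropping the rank-one condition,

\[
  W_{ij} = v_i \overline{v_j}
  \quad\Longleftrightarrow\quad
  W = v v^{*}
  \quad\text{relaxed to}\quad
  W \succeq 0, \ \operatorname{rank}(W) = 1 \ \text{dropped}.
\]

The valid inequalities mentioned in the abstract strengthen a relaxation of this type.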

Available online: PDF