From a mathematical perspective, optimization problems involve minimizing or maximizing a given function subject to constraints.

Convex Optimization studies the minimization of **convex** functions over **convex sets**. A convex problem in standard form is

$$ \begin{array}{ll} \mbox{minimize} & f_0(x) \\ \mbox{subject to} & f_i(x) \leq 0, \quad i = 1,\dots,m \\ & h_i(x) = 0, \quad i = 1,\dots,p \end{array} $$

where

- the variable is $x \in \mathbb{R}^n$
- the $f_i$ are all convex
- the $h_i$ are all affine

Linear Programming (LP) is a type of optimization problem in which all the functions involved are affine.

Any LP Problem can be formulated as

$$ \begin{array}{ll} \mbox{minimize} & c^Tx \\ \mbox{subject to} & Ax = b \\ & l \leq x \leq u\\ \end{array} $$

where $x, c, l, u \in \mathbb{R}^n$, $b \in \mathbb{R}^m$, and $A \in \mathbb{R}^{m \times n}$.
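To make the standard form concrete, here is a tiny instance solved with SciPy's `linprog`. This is an illustrative sketch only (the project itself targets Julia solvers); the data below is a made-up toy problem, not one from this post.

```python
# Solve: minimize c^T x  subject to  A x = b,  l <= x <= u,
# using SciPy's HiGHS-based LP solver on a toy instance.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])            # one equality constraint: x1 + x2 = 1
b = np.array([1.0])
bounds = [(0.0, None), (0.0, None)]   # l = 0, u = +inf for both variables

res = linprog(c, A_eq=A, b_eq=b, bounds=bounds, method="highs")
print(res.x)    # optimal point: puts all weight on the cheaper variable x1
print(res.fun)  # optimal value
```

Since $x_2$ costs twice as much as $x_1$, the optimum is $x = (1, 0)$ with objective value $1$.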

For example,

$$ \begin{array}{ll} \mbox{minimize} & x_1 + x_2 + x_3 + x_4 \\ \mbox{subject to} & x_1 = 1 \\ & 2x_2 + x_4 = 7 \\ & x_4 = 5 \\ & \forall i \hspace{0.1in} 0 \leq x_i \leq \infty \end{array} $$

In matrix formulation,

$$A = \begin{bmatrix} 1 & 0 & 0 & 0 \\[0.3em] 0 & 2 & 0 & 1 \\[0.3em] 0 & 0 & 0 & 1 \end{bmatrix} \hspace{0.1in} \mbox{and} \hspace{0.1in} b = \begin{bmatrix} 1 \\[0.3em] 7 \\[0.3em] 5 \end{bmatrix}$$

- $x_1$ is fixed to the value $1$
- $x_4$ is fixed to the value $5$
- $x_2$ is then fixed to $1$ by the second constraint, since $2x_2 = 7 - x_4 = 2$
- there are no constraints on $x_3$

So three of the four variables are already fixed to definite values.

Hence, an equivalent problem would be

$$ \begin{array}{ll} \mbox{minimize} & x_3\\ \mbox{subject to} & 0 \leq x_3 \leq \infty \end{array} $$

And trivially the solution is $x_3 = 0$.

The initial optimization problem often contains redundant information. We can exploit the structure and nature of the problem to find these redundancies and remove them.

We should also be able to efficiently recover a solution to the original problem from a solution to the reduced problem.

This has the twofold benefit of making the optimization process **faster** and more **reliable**.

**LP**

- CPLEX and MOSEK are proprietary.
- Other solvers have only *partially* implemented presolve, or implement it inefficiently.
- Many existing implementations are tied to particular solvers.

**SDP**

- SCS is the only solver that can solve *all* types of problems within Convex.jl; no other solver can handle both semidefinite cones and exponential cones.
- SCS has **no** presolve of any kind.
- The only presolve implementation for SDP is the `MATLAB` library `frlib`, by Frank Permenter and Pablo Parrilo.

**Our Goals**

- Solver-independent presolve
- A competitive open-source implementation

**Inspiration**: Erling Andersen and Knud Andersen, 1995 - "Presolving in linear programming"

Some of the redundancies in LP that can be presolved:

**Empty Rows / Columns**

When all entries of a row/column of the matrix $A$ are zero.

**Singleton Rows**

When a row has only one nonzero entry. The corresponding constraint fixes that variable.

**Free Singleton Columns**

When a column has only one nonzero entry. Depending on the bounds on the variable, we can remove both the variable and the constraint.
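These patterns can be detected cheaply from per-row and per-column nonzero counts. A minimal sketch in Python/SciPy (illustrative only; the project itself targets Julia), using the example matrix $A$ from above:

```python
# Detect empty and singleton rows/columns of a sparse constraint matrix.
import numpy as np
from scipy import sparse

A = sparse.csr_matrix(np.array([
    [1., 0., 0., 0.],   # singleton row: fixes x1
    [0., 2., 0., 1.],
    [0., 0., 0., 1.],   # singleton row: fixes x4
]))

row_nnz = np.diff(A.indptr)           # nonzeros per row (CSR layout)
col_nnz = np.diff(A.tocsc().indptr)   # nonzeros per column (CSC layout)

empty_rows     = np.where(row_nnz == 0)[0]
singleton_rows = np.where(row_nnz == 1)[0]
empty_cols     = np.where(col_nnz == 0)[0]
singleton_cols = np.where(col_nnz == 1)[0]

print(singleton_rows.tolist())   # rows fixing a single variable
print(empty_cols.tolist())       # x3 appears in no constraint
```

Rows 0 and 2 are singleton rows (fixing $x_1$ and $x_4$), and column 2 is empty ($x_3$ is unconstrained), matching the hand analysis of the example.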

Erling Andersen is the CEO of MOSEK ApS. MOSEK is one of the most powerful and reliable *proprietary* solvers currently available.

The theoretical task: developing the theory to identify redundancies and establishing equivalences between the optimization problems, including procedures to recover a solution to the original problem from the reduced one.

The engineering task: choosing appropriate data structures with reasonable constraints on time and space, and interfacing with solvers / optimization-modelling languages.

The ideas seem simple enough, but there are not many reliable implementations out there.

Some deeper issues need to be considered. We need data structures that exploit the sparsity of the two-dimensional data for space efficiency while also keeping operations efficient.

Tradeoffs between

- Indexing and Fast Access
- Ordered enumeration and BLAS operations
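These tradeoffs can be illustrated in Python with SciPy, whose `dok_matrix` (hash-based, dict-of-keys) and `csc_matrix` roughly mirror the first two columns of the comparison below. This is a sketch of the general pattern, not the project's code:

```python
# Hash-based (dict-of-keys) storage gives cheap incremental insertion;
# CSC storage gives fast ordered access and BLAS-style operations.
# Common pattern: build in DOK, convert once to CSC for the numerics.
import numpy as np
from scipy import sparse

A = sparse.dok_matrix((3, 4))
A[0, 0] = 1.0          # expected O(1) insertion into the hash table
A[1, 1] = 2.0
A[1, 3] = 1.0
A[2, 3] = 1.0

A_csc = A.tocsc()      # one-time conversion to the compressed format

x = np.ones(4)
print(A_csc @ x)       # fast matrix-vector product on CSC
```

Building incrementally in the hash-based format and converting once amortizes the cost: arbitrary insertions never touch the compressed arrays, and the conversion preserves ordered enumeration of the nonzeros.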

The Pre-solve Graph is a custom data structure we came up with that captures the optimization aspects of the problem - constraints and variables.

| Task | Hash-based | CRS/CSC | Pre-solve Graph |
|---|---|---|---|
| Data for matrix creation | Matrix dimensions | Matrix dimensions + $nnz$ | Matrix dimensions |
| Insertion | $O(1)$ | $O(1)$, with strict ordering, at initialisation only | $O(1)$ |
| Access | Expected $O(1)$ | $O(\log(nnz))$ | $O(\log(nnz))$ |
| Arbitrary modification | Expected $O(1)$ | $O(\log(nnz))$ | $O(\log(nnz))$ |
| Sequential modification | Expected $O(1)$ | $O(1)$ | $O(1)$ |
| Memory | $O(y(2\,\textrm{sizeof(int)}+\textrm{sizeof(float)}))$, $1 < y < 1.5$ | $O(\textrm{sizeof(int)}+\textrm{sizeof(float)})$ | $O(2(\textrm{sizeof(int)}+\textrm{sizeof(float)}))$ |
| Enumerate $nnz$ | $O(\textrm{TableSize})$, no ordering | $O(nnz)$, with ordering | $O(nnz)$ |
| Linear algebra support (BLAS levels 2, 3) | No | Yes | Yes (less efficient than CRS) |
| Implemented in Julia? | No | Yes | No |
| Future usability? | Yes | Yes | Probably not |
| Reliable implementation in another language? | Yes | Yes | No |
| Efficient conversion to CRS/CSC? | Yes | - | Yes |

A *very* brief outline: find *faces* of the cone that contain the feasible set, and optimize over these rather than over the entire cone. This procedure is called *partial facial reduction*.

SDP presolve is the second part of the GSoC project and is based on the recent paper by Frank Permenter and Pablo Parrilo (MIT). Their MATLAB implementation, `frlib`, will be our starting point.

We hope to bring efficient presolving capability for most (if not all) of the cases outlined in the two papers: Andersen & Andersen (1995) and Permenter & Parrilo (2014).

This would benefit all the optimization packages within Julia-opt handling LP and SDP.

Finally, thanks to my mentor **Madeleine Udell** and the Julia community.