Oleg Khamisov, Russian Academy of Sciences
Global optimization with nonlinear support functions
We consider nonlinear nonconvex optimization problems with equality and inequality constraints. It is assumed that the objective function as well as the constraint functions have so-called nonlinear support functions. The considered class of problems includes Lipschitz and d.c. optimization problems. We describe a methodology of global search with nonlinear support functions and show that this methodology is a flexible tool for global optimization, comparable with other well-known methodologies. Special attention is devoted to the advantages of using support functions in comparison with Lipschitz and d.c. optimization methods. The suggested methodology combines a support-function construction technique with cuts and branch-and-bound. Different algorithms realizing particular schemes of the methodology are provided, and corresponding convergence results are given. Numerical testing and applications to some operations research problems are presented.
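A classical instance of a nonlinear support function is the Lipschitz lower bound f(x) ≥ f(y) − L·|x − y|, which underlies Piyavskii–Shubert-style global search. The sketch below is an illustrative Python implementation of that textbook scheme under assumed names, not the methodology of the talk:

```python
def shubert_minimize(f, a, b, L, iters=300, tol=1e-8):
    """Piyavskii-Shubert global minimization of an L-Lipschitz f on [a, b].
    Uses the support (lower-bound) functions f(x) >= f(y) - L*|x - y|.
    Illustrative sketch only, not the algorithm from the talk."""
    xs, fs = [a, b], [f(a), f(b)]
    for _ in range(iters):
        pts = sorted(zip(xs, fs))
        best = None
        # On each subinterval the lower envelope of the two cone-shaped
        # support functions is minimized where the cones intersect.
        for (x0, f0), (x1, f1) in zip(pts, pts[1:]):
            xm = 0.5 * (x0 + x1) + (f0 - f1) / (2.0 * L)
            lb = 0.5 * (f0 + f1) - 0.5 * L * (x1 - x0)
            if best is None or lb < best[0]:
                best = (lb, xm)
        lb, xm = best
        if min(fs) - lb < tol:     # gap between upper and lower bound closed
            break
        xs.append(xm)              # sample where the lower bound is smallest
        fs.append(f(xm))
    return min(fs)
```

The next sample is always placed where the piecewise-linear support underestimate attains its minimum, so regions whose lower bound exceeds the incumbent are never refined.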
Panos Pardalos, University of Florida, Distinguished Professor of Industrial and Systems Engineering
Objective Function Representation in Global Optimization and Applications
The problem of representation (or decomposition) of a continuous function and its use in global optimization has been well studied. The best-known and most widely used methods include the representation of a function as the difference of two convex functions (DC optimization) or the difference of two monotonically increasing functions (Monotonic Optimization). Other techniques include reduction to separability (total or partial) and methods based on Kolmogorov's superposition theorem. After a summary of existing work, the talk will focus on DC discrete optimization. In particular, we will discuss details of the solution of the degree-constrained fault-tolerant spanning subgraph problem by DC optimization.
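As a concrete illustration of the DC representation (a standard textbook construction, not the decomposition discussed in the talk): any twice-differentiable function whose second derivative is bounded below by −ρ can be written as a difference of convex functions by adding and subtracting a quadratic:

```python
import math

# If f''(x) >= -rho everywhere, then f = g - h is a DC decomposition with
# g(x) = f(x) + (rho/2) x**2 and h(x) = (rho/2) x**2, both convex.
# Illustrative sketch: f = sin, where |sin''| <= 1, so rho = 1 works.
rho = 1.0
f = math.sin
g = lambda x: math.sin(x) + 0.5 * rho * x ** 2   # convex: g'' = rho - sin(x) >= 0
h = lambda x: 0.5 * rho * x ** 2                 # convex quadratic

# Numerical check on a grid: g - h reproduces f, and g passes a
# discrete convexity test (nonnegative second differences).
xs = [-3.0 + 0.01 * i for i in range(601)]
assert all(abs(g(x) - h(x) - f(x)) < 1e-12 for x in xs)
assert all(g(x - 0.01) - 2 * g(x) + g(x + 0.01) >= -1e-12 for x in xs)
```

The same shift works in several variables with (ρ/2)‖x‖² whenever the Hessian's smallest eigenvalue is bounded below.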
Leonidas Pitsoulis, Aristotle University of Thessaloniki, Associate Professor in the Department of Electrical and Computer Engineering
Optimization in Robust Statistics
Given a dataset, an outlier can be defined as an observation that does not follow the statistical properties of the majority of the data. Computing outliers is of fundamental importance in data analysis, and it is well known in statistics that classical methods, such as the sample mean or standard deviation, can be greatly affected by the presence of outliers in the data. Robust statistics is concerned with the design and analysis of estimators that are not affected by the presence of outliers, and nearly all of them are based on an underlying optimization problem. In this talk we will present a number of robust estimators that we have developed for multilinear regression and location estimation.
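A minimal toy illustration of this sensitivity (standard-library Python, not one of the estimators from the talk): a single gross outlier moves the sample mean arbitrarily far, while the median, a simple robust location estimator, barely moves.

```python
import statistics

# The sample mean has breakdown point 0: one bad observation can move it
# arbitrarily far.  The median tolerates up to 50% contamination.
clean = [9.8, 10.1, 9.9, 10.2, 10.0, 9.9, 10.1]
contaminated = clean + [1000.0]          # one gross outlier

mean_shift = statistics.mean(contaminated) - statistics.mean(clean)
median_shift = statistics.median(contaminated) - statistics.median(clean)

assert mean_shift > 100.0       # mean is dragged toward the outlier
assert abs(median_shift) < 0.1  # median barely moves
```

The median can itself be viewed as an optimization-based estimator: it minimizes the sum of absolute deviations, whereas the mean minimizes the sum of squared deviations, which is what amplifies the outlier's influence.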