Machine Learning and Optimization

Thursday 9 May 2019, 11:30 - 14:30
Lund (M1-18)
Van der Goot Building
Campus Woudestein

In the Big Data era, data analytics methods play an increasingly important role in decision making. The aim of this workshop is to bring together researchers in operations research and computer science to discuss how machine learning and data mining methods can be used to boost performance in solving optimization problems, as well as applications that integrate data analytics and optimization techniques.

Held in conjunction with Qing Chuan (Charlie) Ye's PhD defense


11:30-12:45 Registration and lunch

Patrick de Causmaecker (KU Leuven)
Optimization, where is the information?


Kevin Tierney (Bielefeld University)
Deep Learning assisted Heuristic Tree Search for the Container Pre-Marshalling Problem


Ilker Birbil (Erasmus University Rotterdam)
Data Privacy in Optimal Capacity Sharing 


Panel discussion (challenges and opportunities of combining machine learning and optimization)


Deadline for registration is 6 May 2019. Registration is closed.


Organisers:
  • Qing Chuan (Charlie) Ye (ESE/EUR)
  • Yingqian Zhang (TU Eindhoven)


Several sources of information are involved in the definition of an optimization problem. Among others, there are (mathematical) models, (hidden) knowledge in (historical) data and (not formalised) expert knowledge. Possibilities for utilizing each of these sources have been proposed. Each approach has its strengths and weaknesses, its opportunities and limits. Using some examples from recent work, we discuss how these sources can be accessed and set to work, and how combinations of techniques from the associated fields can bring extra power as well as new challenges.

The container pre-marshalling problem (CPMP) is concerned with the re-ordering of containers in container terminals during off-peak times so that containers can be quickly retrieved when the port is busy. The problem has received significant attention in the literature and is addressed by a large number of exact and heuristic methods. Existing methods for the CPMP rely heavily on problem-specific components (e.g., proven lower bounds) that need to be developed by domain experts with knowledge of optimization techniques and a deep understanding of the problem at hand. With the goal of automating the costly and time-intensive design of heuristics for the CPMP, we propose a new method called Deep Learning Heuristic Tree Search (DLTS). It uses deep neural networks to learn solution strategies and lower bounds customized to the CPMP solely through analyzing existing (near-)optimal solutions to CPMP instances. The networks are then integrated into a tree search procedure to decide which branch to choose next and to prune the search tree. DLTS produces the highest quality heuristic solutions to the CPMP to date, with gaps to optimality below 2% on real-world sized instances.
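The core loop described above, a tree search that uses predicted scores to order branches and predicted bounds to prune, can be sketched in miniature. The following is not the authors' DLTS implementation: the neural-network outputs are replaced by a hand-coded bound, and a toy knapsack instance stands in for the CPMP, purely to illustrate where learned components would plug in.

```python
def solve(values, weights, capacity):
    """Best-value search with branch ordering and bound-based pruning.

    In DLTS, `upper_bound` and the branch ordering would come from
    trained neural networks; here both are simple hand-coded stand-ins.
    """
    # order items by value density (stand-in for a learned branching policy)
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    best = [0]

    def upper_bound(idx, cap, val):
        # fractional relaxation bound (DLTS would predict this with a network)
        for i in order[idx:]:
            if weights[i] <= cap:
                cap -= weights[i]
                val += values[i]
            else:
                return val + values[i] * cap / weights[i]
        return val

    def search(idx, cap, val):
        best[0] = max(best[0], val)
        if idx == len(order):
            return
        if upper_bound(idx, cap, val) <= best[0]:
            return  # prune: the bound says this subtree cannot improve
        i = order[idx]
        if weights[i] <= cap:
            search(idx + 1, cap - weights[i], val + values[i])  # take item i
        search(idx + 1, cap, val)                               # skip item i

    search(0, capacity, 0)
    return best[0]

print(solve([60, 100, 120], [10, 20, 30], 50))  # → 220
```

Swapping the hand-coded `upper_bound` and ordering for network predictions, and the knapsack state for a pre-marshalling bay configuration, gives the shape of the approach the abstract describes.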

Capacity sharing is arguably one of the best approaches to obtain sustainable and cost-effective use of resources. There exist various mathematical programming tools for optimal resource allocation. However, we still need to convince multiple parties to agree upon sharing their capacities. Even if they give their consent for collaboration, they also rightfully raise concerns regarding the privacy of their sensitive data used in optimization models. Particularly for resource allocation, linear programming is one of the most frequently used optimization methods in practice. Therefore, in this talk I shall discuss two general ideas to obtain data privacy in linear programming: data masking and problem decomposition. The former idea also has ties to a recent research topic in machine learning known as differential privacy. Along with a presentation of these methodologies, I shall also illustrate their use on a revenue management application. The talk will end with a discussion on some open research questions.

More information

    For more information please contact Anneke Kop at
