**Wednesday 8/26:**

The site of ISMP 2012 was announced: it will be the Technical University of Berlin. It will be held under the auspices of Berlin’s DFG Research Center MATHEON. The local organizing committee will be composed of Martin Grötschel, Rolf Möhring and Martin Skutella. The symposium will take place in mid-August. I hope it’s really “mid” rather than the last week of August, when many people have to start teaching.

Wednesday’s morning plenary was given by Matteo Fischetti. Matteo is not only an amazing and prolific researcher, but also an excellent speaker. His talks are full of jokes and witty comments. He talked about “Pure Cutting Plane Methods for Integer Linear Programming: A Computational Perspective”. He and his co-authors (including Egon Balas) are taking a fresh look at Gomory’s method for solving ILPs, published in 1958. In theory, it is possible to solve an ILP using only Gomory Fractional Cuts (GFCs) and no branching. In practice, at least with the implementations used so far, the method tends to get stuck after some initial progress. He mentioned that branching is used as a symptomatic cure for these well-known problems, but we should try to understand their cause and do something about it. He proposes a framework in which GFCs are read directly from the tableau and cut separation is deeply entangled with LP re-optimization. Nevertheless, this closed-loop tableau-cut-tableau system turns out to be very unstable. They took an instance from the MIPLIB called stein15, which is easy for traditional branch-and-cut methods, and implemented a naïve GFC approach. The result was that it didn’t converge to the optimal solution even after 130 rounds of single cuts or 80 rounds of multiple cuts. After looking at some plots, they realized that the determinant of the optimal LP basis was increasing exponentially. In Matteo’s own words, this was happening because the solutions had some “weird” fractions. The idea then was to use a clever pivoting strategy to keep the tableau clean. Degeneracy is normally considered a negative thing, but in this case it could be used to their advantage: among all the equivalent optimal bases, choose the “cleanest” one. By design, LP solvers try to minimize the number of pivots during re-optimization after the addition of a cut. As a result, the next solution is very close to the old one (i.e. the change is small), which leads to a large determinant (weird fractions), which in turn leads to an unstable tableau, shallow cuts, and finally hopelessness.

Their solution was to use GFCs with lex optimization: let x0 = objective function, find the optimal objective value z*, fix x0 = z*, minimize x1 subject to that, fix x1 = x1*, minimize x2 subject to that, fix x2 = x2*, etc. This gives you a cleaner tableau, and the growth of the basis determinant is contained. That allowed them to solve stein15 with lex optimization in 65 single-cut rounds (the optimal solution was found after 15 rounds). He ended the talk by showing that GFC with lex optimization actually performs an implicit tree search (with branching and backtracking) if one looks at the sequence of solutions being found. The problem now is that the way this implicit tree is explored isn’t that smart, so there’s room for further research there. Perhaps a modified lex procedure can result in a better tree search.
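The two ingredients above can be sketched in a few lines of Python. This is only a toy illustration of the general idea, not the authors’ implementation: a dictionary of coefficients stands in for a tableau row, and brute force over an explicit list of candidate solutions stands in for LP re-optimization.

```python
import math

frac = lambda t: t - math.floor(t)  # fractional part; also correct for negatives

def gomory_fractional_cut(row, rhs):
    """Read a GFC off a tableau row  x_B + sum_j a_j x_j = b  with fractional b:
    the cut is  sum_j frac(a_j) * x_j >= frac(b)."""
    return {j: frac(a) for j, a in row.items()}, frac(rhs)

def lex_min_optimum(points, cost):
    """Toy version of lex optimization: among all equivalent optima, return the
    lexicographically smallest one (x1 minimized first, then x2, and so on)."""
    z_star = min(cost(p) for p in points)               # optimal value z*
    return min(p for p in points if cost(p) == z_star)  # tuple order = lex order

cut, f0 = gomory_fractional_cut({"x3": 0.25, "x4": -0.5}, 1.75)
# cut == {"x3": 0.25, "x4": 0.5} and f0 == 0.75

pts = [(x, y) for x in range(4) for y in range(4) if x + y >= 3]
best = lex_min_optimum(pts, cost=lambda p: p[0] + p[1])  # picks (0, 3)
```

The point of the lex tie-break is exactly the “choose the cleanest among equivalent optima” idea: tuple comparison in the last line is lexicographic order, so the minimization of x1, then x2, etc. happens implicitly.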

I attended a talk by H. P. Williams in the first morning session that was pretty interesting. He spoke about Chvátal Functions and the possibility of using them to play the role of shadow prices in integer programs. The topic is kind of fascinating, but the work is still in progress. He’s looking at how to represent such functions in a way that will make it easy for economists to work with the integer shadow prices as an aid to decision making. One interesting comment he made was that, for an integer program in n dimensions, it could be that more than n constraints matter in determining where the optimal solution lies. In linear programming, however, we only need to worry about at most n constraints.
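A toy illustration of the idea (my example, not one from the talk): for the one-variable IP min x subject to 2x >= b with x integer, the value function as b varies is the ceiling of b/2, a Chvátal function (a linear form composed with rounding up), whereas the LP value function is the plain linear b/2 given by the shadow price 1/2.

```python
import math

# Toy IP:  min x  s.t.  2x >= b,  x integer.
lp_value = lambda b: b / 2             # LP value function: linear, shadow price 1/2
ip_value = lambda b: math.ceil(b / 2)  # IP value function: a Chvátal function,
                                       # i.e. a linear form with rounding up

# The rounded function prices the right-hand side correctly for the IP:
# raising b from 4 to 5 costs a whole extra unit, not half of one.
```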

There were a few other talks that I enjoyed: Miguel Constantino’s talk on “A Strengthened Integer Programming Model for Conflict Minimization in Cartographic Label Placement” and Eduardo Uchoa’s talk on “Time-Dependent Traveling Salesman Problem: Polyhedra and Branch-and-Cut-and-Price”. Miguel managed to get some nice improvements using simple cuts added a priori to the IP model. Eduardo gave an introduction to the time-dependent TSP (TDTSP) and discussed polyhedral and computational results. Interestingly, many known TSP facets induce low-dimensional faces of the TDTSP. I’m curious to see whether the new families of facets that they found for the TDTSP will translate into new TSP facets. I also attended a talk by Andrea Qualizza on “Linear Programming Relaxations of Non-convex Mixed Integer Quadratically Constrained Problems”. The idea is to solve these kinds of problems using linear programming relaxations that approximate the cone of positive semi-definite matrices, sparsification of cutting planes, and linearization of convex quadratic inequalities. (In summary, make everything linear and see how far that takes you.) The afternoon semi-plenaries were both about approximation algorithms, but I was not able to attend any of them.
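One way to picture the PSD-approximation ingredient (a generic sketch of the standard outer-approximation idea, not necessarily the specific cuts from the talk): if a candidate symmetric matrix X is not PSD, an eigenvector v for a negative eigenvalue certifies that, and v'Xv >= 0 is a linear inequality in the entries of X that is violated by this X and can be added to the LP relaxation. For a 2x2 matrix this fits in a few lines:

```python
import math

def psd_cut_2x2(a, b, c):
    """For symmetric X = [[a, b], [b, c]]: if X is not PSD, return a unit
    eigenvector v of its most negative eigenvalue, so that v' X v >= 0 is a
    linear cut (in the entries of X) violated by this X.  Returns None if PSD."""
    lam = (a + c - math.sqrt((a - c) ** 2 + 4 * b * b)) / 2  # smallest eigenvalue
    if lam >= 0:
        return None  # already PSD: no cut needed
    # eigenvector for lam; handle the diagonal case b == 0 separately
    v = (lam - c, b) if b else ((1.0, 0.0) if a < c else (0.0, 1.0))
    n = math.hypot(*v)
    return (v[0] / n, v[1] / n)

v = psd_cut_2x2(1.0, 2.0, 1.0)  # [[1, 2], [2, 1]] has eigenvalue -1, so a cut exists
```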

The conference banquet took place in the Field Museum’s Stanley Field Hall, current home to “Sue”, the largest and most complete Tyrannosaurus rex specimen in the world. I did not attend the banquet, so I cannot attest to the quality of the food. However, I heard amazing things about the Field Museum and I’m sure it must have been an entertaining night.

**Thursday 8/27:**

I learned why Paul Tseng’s talk was cancelled on Tuesday. He has literally disappeared! He has been missing since August 13 in a scenic mountainous region of Yunnan Province, China. The University of Washington has issued a press release about it. The story has appeared in many places, including here and here.

Lars Peter Hansen gave the morning plenary on “Valuation in Dynamic Stochastic Economies”. I left the room a little early to run to session ThA14: the Tucker Prize session. I was curious to learn about Mohit Singh’s “Iterative Methods in Combinatorial Optimization”. He did a great job explaining the topic, and I was impressed at how simple and elegant the proof of Goemans’ conjecture (i.e. there exists a (1, B+1)-approximation for the degree-bounded minimum-cost spanning tree problem) turned out to be. Well done! Another merit of his technique is that it’s applicable to a large class of problems that, in their pure versions, are polynomially solvable, but become NP-hard with the addition of side constraints (e.g. bipartite matching). (It’s also applicable to other NP-hard problems such as the Steiner tree problem.) I even came away with an idea for a research paper that I want to explore in the near future. Thanks Mohit! After lunch I stopped by to see Gabor Pataki (“Basis Reduction and the Complexity of Branch-and-Bound”) and Timo Berthold (“Hybrid Branching”). Their presentations were very interesting and informative.

My presentation was in session ThC09. Armin Fuegenschuh, who spoke right before me, talked about “Solving Nonlinear Engineering Problems with Piecewise-linear Approximation Techniques”. He’s studying an optimal control problem related to the separation of substances in process engineering. More specifically, how to separate the bad stuff from the good stuff in paper recycling plants. Their model not only optimizes the setup of the separator, but also proposes a layout of separators (they are used in sequence with many possible ways of feeding back into each other). I was impressed when Armin mentioned that 75% of the paper used in Germany is recycled and, out of that, 67% of the fibers are re-used. That’s pretty amazing.
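The piecewise-linear idea behind such approaches can be illustrated quickly (a generic sketch, not Armin’s actual model): replace a nonlinear function by the piecewise-linear interpolant through a fixed set of breakpoints, which MILP solvers can then handle with standard formulations.

```python
def piecewise_linear(xs, f):
    """Return the piecewise-linear interpolant of f through breakpoints xs
    (assumed sorted in increasing order)."""
    ys = [f(x) for x in xs]
    def approx(x):
        for x0, y0, x1, y1 in zip(xs, ys, xs[1:], ys[1:]):
            if x0 <= x <= x1:
                t = (x - x0) / (x1 - x0)
                return (1 - t) * y0 + t * y1  # linear interpolation on the segment
        raise ValueError("x outside breakpoint range")
    return approx

g = piecewise_linear([0.0, 1.0, 2.0], lambda x: x * x)
# g(0.5) == 0.5, overestimating the true value 0.25; more breakpoints shrink the gap
```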

My talk was entitled “Valid Inequalities for a Piecewise Linear Objective with Knapsack and Cardinality Constraints”. We are looking at the interplay of piecewise-linear models and cardinality constraints. When these two conditions are combined, we are able to obtain some interesting new results that translate into previously unknown facets of the polyhedron. We don’t have computational results yet, but we’re very excited about the potential applications of this work, especially in the areas of portfolio optimization and bioinformatics. Both my master’s advisor and my PhD advisor attended the talk. Laurence Wolsey was also there, although I think he probably stopped by to listen to Simge Küçükyavuz (the first speaker) and ended up staying. I was happy to see Andrea Lodi there as well, since I admire and respect his work. All in all, I think the talk went well despite the heavy notation I had to use.

I went to dinner with John (Hooker) at India House (second time there for me). We had a very pleasant conversation about a myriad of things, including research and the future of SIMPL. I really enjoy talking to John and I’m happy we were able to spend that time together. This was also our SIMPL-accepted-into-Operations-Research celebration dinner, so I couldn’t help ordering a glass of Gewürztraminer. Hmmm…