Category Archives: Travel

Portuguese Pronunciation and Language Tips

I’ll be taking a group of 34 MBA students on an international business immersion trip to my native Brazil this Spring. We’ll be visiting about a dozen companies in the cities of São Paulo and Rio de Janeiro. This is an initiative created by the awesome Center for International Business Education and Research (CIBER) at the University of Miami.

I’d like my students to be able to pronounce some of the main sounds in Portuguese correctly because I know Brazilians pay attention and really enjoy when foreigners make an effort to say things properly. Therefore, I created a video in which I go over what I consider to be some of the most important things to know when speaking Portuguese (there are others, but I didn’t want the video to be too long).

You can access it on my YouTube channel here: https://www.youtube.com/watch?v=LzgoYFokBPk

Moreover, the 2016 Olympic Games are coming, so I figured these tips could be useful for a larger audience as well. I wish American sportscasters would watch this video because they murdered the pronunciation of everything during the World Cup in 2014.

Bonus material: My daughter, Lavinia Lilith, a.k.a. #LLCoolBaby, makes a short appearance at around the halfway mark.

Enjoy!

Filed under Brazil, People, Teaching, Tips and Tricks, Travel, Videos, YouTube

Can We Use Social Networks to Identify Poor Decision Making?

While suffering through the usual air travel woes recently, I felt compelled to tweet my feelings:

I was pleasantly surprised to see that Paul Rubin and Matthew Saltzman followed up with an interesting exchange:

Although I tend to agree with Paul that an outsider probably does not have enough information to decide whether or not the actions he/she sees are reasonably good given the situation (especially when it comes to the incredibly complex world of airline operations), I like Matthew’s general idea of an experiment to identify whether or not the outcome of a black-box decision-making process is “good”.

Can that be done? Can we observe a black-box situation (or process) long enough to be able to tell whether the analytic machine inside the box could do better?

A large number of people carry smartphones these days, with constant connectivity to the internet. Twitter, Facebook, and Foursquare (to name a few) can determine our location, and there are other apps that tell us which of our friends (and even non-friends) are close by. With all of this connectivity and location awareness, we can think of human beings as sophisticated sensors that collect and share information. We see a fire, a car accident, a traffic jam, an arrest, a fight, and immediately share that information with our network. In addition, human sensors are much better than electronic sensors because they can detect and interpret many other things, such as the mood in a room (after the airline changes your gate for the third time), the meaning of an image, and so on.

Consider a hypothetical situation in which a crowded venue has to be evacuated for whatever reason. Perhaps some exits will be blocked and people will be directed to certain places, or asked to act in a certain way. Human observers may notice a problem with the way security is handling the situation from multiple locations inside the venue, and from multiple points of view. The collection of such impressions (be they tweets, Facebook status updates, or something else) may contain clues to what’s wrong with the black-box evacuation procedure devised for that venue. For example, “avoid using the south exit because people exiting through there bump into those coming down the stairs from the second floor and everyone has to slow down quite a bit.”

In a world where Analytics and OR specialists struggle to convince companies to try new ideas, could this kind of evidence/data be used to foster collaboration? “I noticed that you did X when Y happened. It turns out that if you had done Z, you’d have achieved a better outcome, and here’s why…”

Is the airline example really too complicated to be amenable to this kind of analysis? I’m not sure. But even if it is, there may be other situations in which a social network of human sensors can collect enough information to motivate someone to open that black box and tinker with its inner workings a little bit. Those of you working in the area of social networks might be aware of something along the lines of what I (with inspiration from Matthew) have proposed above. If that’s the case, I’d love to read more about it. Please let me know in the comments.

Filed under Applications, INFORMS Monthly Blog Challenge, People, Social Networks, Travel

There and Back Again: A Thank You Note

There were sub-freezing temperatures, there were snow flurries, there was a hail storm, and there was a tornado watch. No, I’m not claiming that my visit to Pittsburgh last week was as full of adventures as Bilbo Baggins’s journey, but it was very nice indeed.

I had the great pleasure of being invited by John Hooker and Willem-Jan van Hoeve to give a talk at the Operations Research seminar at the Tepper School of Business. Since John, André Ciré, and I are working together on some interesting things, I took the opportunity to spend the entire week (Mon-Fri) at CMU; and what a joy it was.

The Tepper School was kind enough to have a limo service pick me up from, and take me back to, the airport. I guess this is how the top business schools roll. It’s a great way to make a speaker feel welcome. Besides, my driver turned out to be an extremely friendly and easy-to-talk-to fellow. Thanks to him (and his knowledge of off-the-beaten-path roads), I managed to catch my return flight. Otherwise, I’d have sat in a cab through miles of Friday rush-hour traffic and certainly missed the flight.

I walked to campus every day and actually enjoyed the few minutes of cold weather (wow! I can’t believe I just said that!). Stopping at the Kiva Han, right across the street from Starbucks, to grab an almond biscotto and a small coffee was a daily treat. Walking around campus brought back great memories from my PhD-student days. It’s nice to see all the improvements, and all the good things that remain good. I left Miami with the goal of having Indian food for 10 out of my 10 meals (excluding breakfast). Although I managed to do it only 4 times, I’m pretty happy with my gastronomic adventures in Pittsburgh. The delicious semolina gnocchi served at Eleven is definitely praiseworthy.

Work-wise, it was a very productive week. We had interesting ideas and conversations. I’m very grateful to all of those who took time off their busy schedules to meet with me, be it to catch up on life, talk about research (including some excellent feedback on my talk), or both. Thank you (in no particular order) to Alan Scheller-Wolf, Javier Peña, Michael Trick, Egon Balas, Sridhar Tayur, Masha Shunko, Valerie Tardif, Lawrence Rapp, and of course John and Willem. Many thanks also go to André, David, and all the other PhD students who joined me for lunch on Friday. I really enjoyed meeting all of you and learning a bit about your current projects.

I noticed that John got rid of his chalk board and painted two of his office walls with some kind of glossy white-board paint. It’s pretty cool because it allows you to literally write on your wall and erase everything with a regular white-board eraser. Now I want to do the same in my office! (My white board is pretty small.) But I’m not sure if they’ll let me. Gotta check on that!

Overall, it was an awesome week and I hope I can do this again some time.

Filed under People, Research, Travel

INFORMS 2010 Wrap-Up

I had a very productive and fun time at the INFORMS Annual Meeting in Austin, Texas. So I thought I’d share with you some of my observations about the meeting:

  • Analytics initiative by INFORMS: great idea! I believe that we (INFORMS members) have to jump onto the Analytics bandwagon and let everyone know that we can do Analytics too! It’s all about the power of words these days (see the country’s political arena for a perfect example). Starting today, I’ll tell everyone that I can do “Advanced Prescriptive Analytics” (optimization).
  • Panel on Social Networks: it was nice to hear from some of the most popular OR bloggers and learn about their motivations, fun stories, and blogging strategies. Great job Mike and Laura!
  • John Birge’s plenary (Omega Rho Distinguished Lecture): very informative and entertaining. The simple and powerful take-home message was: align incentives. What’s good for the employees has to be good for the company as well.
  • RAS Problem Solving Competition: Michael Trick and I (a.k.a. Team MATHY) received an honorable mention at the Railway Applications Section (RAS) 2010 Challenge (certificate + photo :-). After watching the three finalists’ presentations, we were happy to see that our solution value of $11,399,670.88 was equal to the best solution found by the winning team, and better than the solution found by the other two finalists. Nobody managed to prove optimality, though. Interestingly, none of the finalists thought about scaling the problem (thinking in terms of tanks of gasoline instead of gallons of gasoline), which made a huge difference in the performance of our model. Overall, it was a lot of fun to participate in this competition and I want to thank the organizers for putting it together. Here’s a picture of team MATHY with Juan Morales from BNSF Railway.

    And here’s a picture of the railroad network we had to deal with:

  • Technical sessions: I watched many interesting and inspiring talks (as a matter of fact, I had some great ideas for my own research while watching a number of presentations). It’s nice to see that *a lot* of people are using LaTeX Beamer these days. Let’s aim for a PowerPoint-free INFORMS by 2020!
  • Idea for an iPhone App? I applaud the going-green initiative of reducing the number of conference program booklets that have to be printed out. However, for this to work, each of us has to be quite organized: we have to go over the program in advance, select the talks we want, and print the appropriate pages. I don’t know about you, but I never manage to get this done. So I propose we create an iPhone (mobile) app to allow participants to browse the program on the go. It’s not convenient to browse the program PDF on a phone. We need an app. We need to be able to filter by author, by chair, by topic/keyword, etc. We want a time-sensitive app that tells you what’s next. We want an app that sends notifications to your phone reminding you that a talk/event is coming up, so that you can ask for the check in time. If we had something like that, I think a lot fewer people would ask for a printed program (myself included). (I sketch the kind of filtering I have in mind right after this list.)
  • Thumbs up for all the vegetarian food: being a vegetarian myself, I was impressed with the generous availability of vegetarian food (i.e. not only salad) at both the Sunday and Tuesday receptions. Well done!
  • Austin’s Convention Center: it is huge, and its numerous under-construction areas made it very hard to navigate from session to session. I always felt like I was taking the longest path from point A to point B.
  • Meeting old and new friends: it was great to make many new friends and to meet old friends from my PhD days at Carnegie Mellon at the Tepper Alumni reception. I also had some very productive research meetings with several colleagues.
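
To make the app idea above a bit more concrete, here is a rough sketch, in plain Python, of the filtering and “what’s next” logic I have in mind. Everything in it (the Session record, the sample entries, the function names) is made up for illustration; a real app would sit on top of the official program data and hook the “what’s next” query into push notifications.

```python
# Hypothetical sketch of a conference-program app: all names, fields, and the
# sample data below are made up; a real app would load the official program.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class Session:
    title: str
    chair: str
    authors: List[str]
    keywords: List[str]
    start: datetime

PROGRAM = [
    Session("Cutting Planes for MIP", "A. Chair", ["J. Doe"],
            ["integer programming"], datetime(2010, 11, 8, 13, 30)),
    Session("O.R. Blogging Panel", "B. Chair", ["A. Blogger", "B. Blogger"],
            ["social networks"], datetime(2010, 11, 8, 16, 30)),
]

def filter_sessions(program: List[Session], author: Optional[str] = None,
                    chair: Optional[str] = None,
                    keyword: Optional[str] = None) -> List[Session]:
    """Filter by any combination of author, chair, and topic/keyword."""
    hits = []
    for s in program:
        if author and not any(author.lower() in a.lower() for a in s.authors):
            continue
        if chair and chair.lower() not in s.chair.lower():
            continue
        if keyword and not any(keyword.lower() in k.lower() for k in s.keywords):
            continue
        hits.append(s)
    return hits

def whats_next(program: List[Session], now: datetime) -> Optional[Session]:
    """Time-sensitive view: the next session starting after 'now' (to drive reminders)."""
    upcoming = [s for s in program if s.start > now]
    return min(upcoming, key=lambda s: s.start) if upcoming else None

if __name__ == "__main__":
    print([s.title for s in filter_sessions(PROGRAM, keyword="integer")])
    nxt = whats_next(PROGRAM, datetime(2010, 11, 8, 14, 0))
    print("Next up:", nxt.title if nxt else "nothing scheduled")
```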

Last but not least, I’d like to thank John Hooker, Christopher Beck, and Willem-Jan van Hoeve for agreeing to give a talk in my session, and Willem for inviting me to present in his session.

Time to say bye-bye to weird Austin and fly back to Miami! Hence, I had to put on my “U” shirt:

Filed under Analytics, Challenge, INFORMS, iPhone, Travel

ISMP ’09: Friday (the end)

John and I attended the morning technical sessions on the COIN-OR project. I enjoyed the overview talk by Robert Fourer (“COIN-OR Triennial Update”). It helped me understand what is necessary to get a project into COIN (both technical and legal requirements). This made me wonder if that’s the right way to go with SIMPL. My first reaction is to say “yes”, but I’ll have to think about it and talk to my co-authors. The second talk was given by Laszlo Ladanyi, and it was entitled “Bigger, Better, Faster: Update on the World’s Fastest Open-Source MIP Solver”. Although the latest results I’ve seen place CBC behind SCIP, I guess it’s OK for them to say that because, although SCIP’s code is freely available, it’s only free for academic use. This means SCIP cannot call itself “open-source” (Ah…the world of legal terminology). The improvements to CBC have indeed been impressive: new heuristics, 2x increase in pre-processing speed, and 55% improvement on the geometric mean of solution times for problems in the Mittelmann benchmarks, to name a few. The final talk in the COIN-OR session was given by Bjarni Kristjansson of Maximal Software. He talked about new developments to CoinMP, which is a library that gives a high-level user easy access to many COIN solvers from both Windows and Linux applications.
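
On a related note, one thing the session drove home is how low the barrier to trying CBC has become. Here’s a tiny, self-contained example of calling it from Python through PuLP (which, as far as I know, bundles a CBC binary). The toy knapsack model is mine and has nothing to do with CoinMP or the new CBC features discussed in the talks; it’s just one common route to the open-source solver from a script.

```python
# Minimal MIP solved with CBC via PuLP; the model is a made-up toy knapsack.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

items = {"a": (10, 4), "b": (7, 3), "c": (5, 2)}  # name: (value, weight)
capacity = 6

prob = LpProblem("toy_knapsack", LpMaximize)
x = {i: LpVariable(f"x_{i}", cat="Binary") for i in items}

prob += lpSum(items[i][0] * x[i] for i in items)              # objective: total value
prob += lpSum(items[i][1] * x[i] for i in items) <= capacity  # knapsack constraint

prob.solve()  # PuLP's default solver is the bundled CBC
print([i for i in items if value(x[i]) > 0.5], value(prob.objective))
```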

An added (positive) side effect of attending the COIN-OR session was that I had a chance to visit Gleacher Center at the University of Chicago. Until Thursday, all the sessions I attended were held at the Marriott hotel. The talks were in what appeared to be an MBA classroom. Pretty neat room, nice tables and nice chairs, but no leather. If you want to get an excellent MBA degree, while attending classrooms with leather chairs, you will have to come to the University of Miami :-)

Unfortunately, there was also a negative side effect: I had to miss Michel Goemans’s talk on the “Smallest Compact Formulation of the Permutahedron”. The permutahedron in n dimensions is the convex hull of the points that represent all possible permutations of the numbers from 1 to n. Michel found a formulation with O(n log n) variables and O(n log n) constraints and proved that no smaller compact formulation exists (up to constant factors). This may have some implications in hybrid methods because the permutahedron is related to the alldifferent global constraint of Constraint Programming. I’ll have to see if I can find Michel’s paper online somewhere, or ask him for a copy. In his abstract, he mentions that this result generalizes easily to variants of the permutahedron, so I’m curious to see what those variants are.
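
For readers who haven’t run into the permutahedron before, here is its classical exponential-size description (standard material, not Michel’s new compact formulation), which is exactly what a compact formulation with O(n log n) variables and constraints sidesteps:

```latex
% Classical (exponential-size) description of the permutahedron
% P_n = conv{ all permutations of (1, 2, ..., n) }:
\begin{align*}
\sum_{i=1}^{n} x_i &= \frac{n(n+1)}{2}, \\
\sum_{i \in S} x_i &\ge \frac{|S|\,(|S|+1)}{2}
   \qquad \text{for every } \emptyset \ne S \subsetneq \{1,\dots,n\}.
\end{align*}
```

The second family has one inequality for every proper nonempty subset S, i.e. 2^n - 2 of them, so getting down to Theta(n log n) variables and constraints is a dramatic compression.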

During the lunch break I took some time off to go to Millennium Park. It’s gorgeous. My trip to Chicago wouldn’t have been complete without a visit to Cloud Gate (a.k.a. the Bean), the Crown Fountain, and the Jay Pritzker Pavilion. I’ve included some photos below. On my way back, I stopped for a quick lunch (panini + soup) and bought a 3-pack of Starbucks Via (their brand of instant coffee; only sold in Chicago, Seattle and London). I know what you’re going to say: instant coffee tastes terrible. However, there’s been so much talk about Via that I wanted to give it a try.

After lunch I went to François Soumis’s talk on “Dantzig-Wolfe Decomposition Putting in the Subproblem the Degenerated Constraints of a Linear Problem”. He explained the Improved Primal Simplex (IPS) algorithm and how it can be used in a column-generation style. The results are really impressive. For highly degenerate problems, the LPs solve 10x faster (compared to CPLEX’s primal simplex) when only lower bounds are taken into account, and 20x faster when both lower and upper bounds are used. For a class of degenerate set partitioning problems, LPs solve 20 to 40 times faster, while the IPs solve 50 to 100 times faster. They’re working on a number of extensions, including generalizations of the same technique to non-linear problems (through active set methods).

The last two talks I attended had one thing in common: both dealt with scheduling problems with long time horizons. Although the typical time-indexed MIP formulation for machine scheduling (x_jt = 1 if job j starts at time t) provides strong lower bounds for short time horizons, it is ineffective on long-horizon problems because the number of variables grows too large. In the first talk of that session, Christos Maravelias talked about “The Polyhedral Properties of a Discrete-Time MIP Formulation for Production Planning”. Christos’s models were suffering from the “curse of long time horizons” and they had tried many variants of the model to no avail. Their successful idea consists of looking at the problem as the intersection of two polyhedra P1 and P2 and trying to strengthen the individual P1 and P2 formulations. It turns out that the coefficient matrix of P1 was totally unimodular (hence no strengthening needed), but the matrix of P2 wasn’t. Nevertheless, by playing around with the basic time unit of their time horizon (and properly adjusting the right-hand side coefficients), they managed to rescale the coefficient matrix of P2 into a k-regular matrix (k-regularity is a generalization of total unimodularity). Interestingly, the rank-1 closure (w.r.t. mod-k cuts) can be obtained easily for k-regular matrices. In addition, that k-regularity had other positive effects on the way CPLEX treats the problem (I didn’t fully get that part of the talk). The result: problems that used to take hundreds of thousands of nodes to solve were being solved with just a few nodes, and sometimes at the root node.

The final talk, entitled “Single-Machine Scheduling over Long Time Horizons by Logic-based Benders Decomposition”, was given by Elvin Coban. She implemented three different models for the problem (MIP, CP and Benders) and looked at three objectives: feasibility only, min makespan, and min tardiness. The MIP and CP models were pretty simple and straightforward. The Benders decomposition model consisted of a MIP master problem that assigned groups of jobs to time intervals in the scheduling horizon, and a subproblem for each interval that took care of scheduling the jobs assigned to it. Both the master and the subproblems had their relaxations enriched with extra constraints. The Benders cuts were strengthened by looking at smaller-than-obvious causes of infeasibility (or bounds, for the makespan and tardiness objectives) through multiple solves of each subproblem. The results were interesting: for problems in which the jobs have tight time windows, CP is the best approach, followed by Benders and then (far, far away, 100 times slower) MIP. As the time horizon increases and/or job time windows become looser, the Benders approach remains robust and becomes better than CP, which goes all over the place (MIP is still far, far away). To give an idea of the sizes of their problems: 60 to 120 jobs, with time horizons of 170 to 4000 units. I was surprised to learn that they simply divided the time horizon into 10 equal intervals. I suspect that a more careful choice of intervals could make a significant difference (consider, for instance, the release dates and due dates as possible breakpoints).
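
For reference, the time-indexed formulation both speakers were fighting against looks, in its standard single-machine form (my notation, not necessarily theirs), like this; with n jobs and T periods it needs on the order of nT binary variables, which is exactly what hurts on long horizons:

```latex
% Time-indexed single-machine model: x_{jt} = 1 iff job j starts at time t.
% p_j = processing time of job j, c_{jt} = cost of starting job j at time t.
\begin{align*}
\min\; & \sum_{j=1}^{n} \sum_{t=1}^{T} c_{jt}\, x_{jt} \\
\text{s.t.}\; & \sum_{t=1}^{T} x_{jt} = 1
   && \text{for all jobs } j \quad \text{(each job starts exactly once)}, \\
& \sum_{j=1}^{n} \sum_{s=\max(1,\, t-p_j+1)}^{t} x_{js} \le 1
   && \text{for all periods } t \quad \text{(one job at a time)}, \\
& x_{jt} \in \{0,1\}.
\end{align*}
```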

That was the end of the conference. I headed back to my hotel, picked up my bag and took the Red Line -> Orange Line -> Midway. Total cost of ground transportation for the whole trip = $4.50. (The folks paying my reimbursement will be proud of me.) Unfortunately, upon arrival at the airport, I found out that my Southwest flight was delayed by 2 hours! Thankfully, I managed to arrive home safe and sound (and tired). I had a great time in Chicago, but I’m happy to be back home in sunny south Florida. I hope these ISMP blog posts have been useful to my blog readers. It was a pleasure to contribute my two cents to the O.R. community.

By the way, if you want a view of the ISMP more focused on Interior Point Methods, take a look at Nathan Brixius’s blog.

[Photos: Cloud Gate, Crown Fountain, and other Millennium Park sights]

Filed under ISMP, Research, Travel

ISMP ’09: Wednesday and Thursday

Wednesday 8/26:

The site of ISMP 2012 was announced: it will be the Technical University of Berlin. It will be held under the auspices of Berlin’s DFG Research center MATHEON. The local organizing committee will be composed of Martin Grötschel, Rolf Möhring and Martin Skutella. The symposium will take place in mid-August. I hope it’s really “mid” rather than the last week of August when many people have to start teaching.

Wednesday’s morning plenary was given by Matteo Fischetti. Matteo is not only an amazing and prolific researcher, but also an excellent speaker. His talks are full of jokes and witty comments. He talked about “Pure Cutting Plane Methods for Integer Linear Programming: A Computational Perspective”. He and his co-authors (including Egon Balas) are taking a fresh look at Gomory’s 1958 method for solving ILPs. In theory, it is possible to solve an ILP by using only Gomory Fractional Cuts (GFCs) and no branching. In practice, at least with the implementations used so far, the method tends to get stuck after some initial progress. He mentioned that branching is used as a symptomatic cure for these well-known problems, but we should try to understand their cause and do something about it. He proposes a framework in which GFCs are read directly from the tableau and cut separation is deeply entangled with LP re-optimization. Nevertheless, this closed-loop tableau-cut-tableau system turns out to be very unstable. They took an instance from the MIPLIB called stein15, which is easy for traditional branch-and-cut methods, and implemented a naïve GFC approach. The result was that it didn’t converge to the optimal solution even after 130 rounds of single cuts or 80 rounds of multiple cuts. After looking at some plots, they realized that the determinant of the optimal LP basis was increasing exponentially. In Matteo’s own words, this was happening because the solutions had some “weird” fractions. The idea then was to use a clever pivoting strategy to keep the tableau clean. Degeneracy is normally considered a negative thing, but in this case it could be used to their advantage: among all the equivalent optimal bases, choose the “cleanest” one. By design, LP solvers try to minimize the number of pivots during re-optimization after the addition of a cut. As a result, the next solution is very close to the old one (i.e. the change is small), which leads to a large determinant (weird fractions), which in turn leads to an unstable tableau, shallow cuts, and finally hopelessness.

Their solution was to use GFCs with lex optimization: let x0 = the objective function, find the optimal objective value z*, fix x0 = z*, minimize x1 subject to that, fix x1 = x1*, minimize x2 subject to that, fix x2 = x2*, and so on. This gives you a cleaner tableau, and the growth of the basis determinant is contained. That allowed them to solve stein15 with lex optimization in 65 single-cut rounds (the optimal solution was found after 15 rounds). He ended the talk by showing that GFC with lex optimization actually performs an implicit tree search (with branching and backtracking) if one looks at the sequence of solutions being found. The problem now is that the way this implicit tree is explored isn’t that smart, so there’s room for further research there. Perhaps a modified lex procedure can result in a better tree search.
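
As a quick refresher (textbook material, nothing new from the talk): a Gomory fractional cut is read off a single row of the optimal simplex tableau of a pure integer program as follows.

```latex
% Given a tableau row  x_{B_i} + \sum_{j \in N} \bar{a}_{ij} x_j = \bar{b}_i
% with \bar{b}_i fractional, and writing f(a) = a - \lfloor a \rfloor for the
% fractional part, the Gomory fractional cut derived from that row is:
\sum_{j \in N} f(\bar{a}_{ij})\, x_j \;\ge\; f(\bar{b}_i).
```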

I attended a talk by H. P. Williams in the first morning session that was pretty interesting. He spoke about Chvátal Functions and the possibility of using them to play the role of shadow prices in integer programs. The topic is kind of fascinating, but it’s still a work in progress. He’s looking at how to represent such functions in a way that will make it easy for economists to work with integer shadow prices as an aid to decision making. One interesting comment he made was that, for an integer program in n dimensions, more than n constraints may matter in determining where the optimal solution lies. In linear programming, however, we only need to worry about at most n constraints.
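
For readers meeting them for the first time: as I understand it, a Chvátal function is built from rational linear functions of the right-hand side by repeatedly taking nonnegative combinations and rounding up, so an integer “shadow price” would look something like the made-up example below rather than a single number.

```latex
% A made-up example of a Chvátal function of the right-hand side b = (b_1, b_2):
% rational linear pieces, combined with nonnegative weights and nested ceilings.
f(b_1, b_2) \;=\; \left\lceil \tfrac{1}{3} \left( 2\,b_1 + \left\lceil \tfrac{1}{2}\, b_2 \right\rceil \right) \right\rceil .
```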

There were a few other talks that I enjoyed: Miguel Constantino’s talk on “A Strengthened Integer Programming Model for Conflict Minimization in Cartographic Label Placement” and Eduardo Uchoa’s talk on “Time-Dependent Traveling Salesman Problem: Polyhedra and Branch-and-Cut-and-Price”. Miguel managed to get some nice improvements using simple cuts added a priori to the IP model. Eduardo gave an introduction to the time-dependent TSP (TDTSP) and discussed polyhedral and computational results. Interestingly, many known TSP facets induce low-dimensional faces of the TDTSP. I’m curious to see whether the new families of facets that they found for the TDTSP will translate into new TSP facets. I also attended a talk by Andrea Qualizza on “Linear Programming Relaxations of Non-convex Mixed Integer Quadratically Constrained Problems”. The idea is to solve these kinds of problems using linear programming relaxations that approximate the cone of positive semi-definite matrices, sparsification of cutting planes, and linearization of convex quadratic inequalities. (In summary, make everything linear and see how far that takes you.) The afternoon semi-plenaries were both about approximation algorithms, but I was not able to attend either of them.

The conference banquet took place in the Field Museum’s Stanley Field Hall, current home to “Sue”, the largest and most complete Tyrannosaurus rex specimen in the world. I did not attend the banquet, so I cannot attest to the quality of the food. However, I heard amazing things about the Field Museum and I’m sure it must have been an entertaining night.

Thursday 8/27:

I learned why Paul Tseng’s talk was cancelled on Tuesday. He has literally disappeared! He has been missing since August 13 in a scenic mountainous region of Yunnan Province, China. The University of Washington has issued a press release about that. The story has appeared in many places, including here and here.

Lars Peter Hansen gave the morning plenary on “Valuation in Dynamic Stochastic Economies”. I left the room a little early to run to session ThA14: the Tucker Prize session. I was curious to learn about Mohit Singh’s “Iterative Methods in Combinatorial Optimization”. He did a great job explaining the topic and I was impressed at how the proof of Goemans’s conjecture (i.e. there exists a (1, B+1) approximation to the degree-bounded minimum-cost spanning tree problem) was so simple and elegant. Well done! Another merit of his technique is that it’s applicable to a large class of problems that, in their pure versions, are polynomially solvable, but become NP-hard with the addition of side constraints (e.g. bipartite matching). (It’s also applicable to other NP-hard problems such as the Steiner tree problem.) I was even inspired with an idea for a research paper that I want to explore in the near future. Thanks Mohit! After lunch I stopped by to see Gabor Pataki (“Basis Reduction and the Complexity of Branch-and-Bound”) and Timo Berthold (“Hybrid Branching”). Their presentations were very interesting and informative.

My presentation was in session ThC09. Armin Fuegenschuh, who spoke right before me, talked about “Solving Nonlinear Engineering Problem with Piecewise-linear Approximation Techniques”. He’s studying an optimal control problem related to the separation of substances in process engineering. More specifically, how to separate the bad stuff from the good stuff in paper recycling plants. Their model not only optimizes the setup of the separator, but also proposes a layout of separators (they are used in sequence with many possible ways of feeding back into each other). I was impressed when Armin mentioned that 75% of the paper used in Germany is recycled and, out of that, 67% of the fibers are re-used. That’s pretty amazing.

My talk was entitled “Valid Inequalities for a Piecewise Linear Objective with Knapsack and Cardinality Constraints”. We are looking at the interplay of piecewise-linear models and cardinality constraints. When these two conditions are combined, we are able to obtain some interesting new results that translate into previously unknown facets of the polyhedron. We don’t have computational results yet, but we’re very excited about the potential applications of this work, especially in the areas of portfolio optimization and bioinformatics. Both my master’s advisor and my PhD advisor attended the talk. Laurence Wolsey was also there, although I think he probably stopped by to listen to Simge Küçükyavuz (the first speaker) and ended up staying. I was happy to see Andrea Lodi there as well, since I admire and respect his work. All in all, I think the talk went well despite the heavy notation I had to use.
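
To give a flavor of the setting, here is the standard textbook way (in my own notation) of writing a separable piecewise-linear objective over a knapsack together with a cardinality restriction; the new valid inequalities from the talk are not shown here.

```latex
% Standard lambda (convex-combination) model of a separable piecewise-linear
% objective over a knapsack, plus a cardinality restriction on the support.
% Breakpoints of x_i: b_{i0} < b_{i1} < ... < b_{iK}; f_i piecewise linear on them.
\begin{align*}
\max\; & \sum_{i=1}^{m} \sum_{k=0}^{K} f_i(b_{ik})\, \lambda_{ik} \\
\text{s.t.}\; & x_i = \sum_{k=0}^{K} b_{ik}\, \lambda_{ik}, \qquad
  \sum_{k=0}^{K} \lambda_{ik} = 1, \qquad (\lambda_{i0},\dots,\lambda_{iK}) \ \text{SOS2}
  && \text{for all } i, \\
& \sum_{i=1}^{m} a_i\, x_i \le d && \text{(knapsack)}, \\
& 0 \le x_i \le u_i\, z_i, \qquad z_i \in \{0,1\}, \qquad \sum_{i=1}^{m} z_i \le C
  && \text{(cardinality)}.
\end{align*}
```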

I went to dinner with John (Hooker) at India House (second time there for me). We had a very pleasant conversation about a myriad of things, including research and the future of SIMPL. I really enjoy talking to John and I’m happy we were able to spend that time together. This was also our SIMPL-accepted-into-Operations-Research celebration dinner, so I couldn’t help ordering a glass of Gewürztraminer. Hmmm…

Filed under ISMP, Research, Travel

Combined Wed+Thu ISMP ’09 Post Coming Later Tonight

Yesterday I didn’t write a blog post because I was very much in need of a restful night’s sleep (a good sign: a lot of work accomplished). But stay tuned for tonight’s post! In the meantime, here are a couple of cool photos for your enjoyment.

[Photos]

Filed under ISMP, Travel