I’ve been teaching the core prescriptive analytics course in our MBA program since 2005, and it was once again time for a major refresh. So this past December I went for it, and then proceeded to ask myself the following question over and over during the first half of the spring semester: “Why did you do this to yourself?” It was grueling. It was hard. It was overwhelming. However, when I look at the end result, I am really proud of it. It’s far from perfect, but I collected tons of feedback from my students (THANK YOU!) and I’m already polishing it for the next iteration (that one will be close to spotless). The goal of this post is to share a portion of the feedback I received, which contained some real surprises.
To give you some context, I improved the existing homework problems, Excel files, and lecture notes (everything is now in PDFs created with LaTeX; no PowerPoint anymore!). I also created a lot of new content (including a hands-on Lego exercise and a case study I developed from scratch), especially when it comes to decision-making under uncertainty. All in all, I’d say 50% of the course is new material, and the remaining 50% is a much better version of the previous material.
Here are the lecture themes (unless otherwise noted, each lecture is a 2-hour session; there are 2 sessions per week). An asterisk * indicates a brand new lecture:
- Lecture 1: Introduction to Optimization: hands-on Lego exercise, investment, scheduling.
- Lecture 2: Mixing and Carryovers: blending, cash flow, production and inventory planning.
- Lecture 3: Networks: assignment, transshipment, shortest path.
- Lecture 4: Integer and Binary Variables: budget allocation, logical conditions, covering, fixed costs.
- Lecture 5: Sensitivity Analysis: shadow prices and reduced costs.
- MIDTERM EXAM
- Lecture 6*: Optimization Under Uncertainty: two-stage stochastic linear programming.
- Lecture 7*: Decision Analysis and Decision Trees (incl. EVPI, EVSI, risk profiles, Bayes’ Rule).
- Lecture 8*: Sequential Decision Processes: stochastic dynamic programming.
- Lecture 9* (two sessions): Monte Carlo Simulation.
- Lecture 10*: Case Discussion.
- FINAL EXAM
On my final exam, I had a question asking students to name their favorite and least favorite lecture. I wanted to see where the home run was and where the big fail was. I had a total of 76 students, and here is the final tally:
You’ll notice that the columns don’t add up to 76 as they should. Some people provided two favorites, and some people did not indicate a least favorite lecture. Here’s what I infer from the table above. (Let me know if you see something I didn’t!)
- Networks as the clear “home run” lecture: I would never have guessed that in a million years! This was so surprising to me. Some of the comments mentioned the visual aspect of it as part of its appeal. I can see that, but still…
- Stochastic DP as the clear “big fail” lecture: I can actually see that because I wasn’t very fond of the lecture myself after I taught it. It looks OK on paper, but the classroom feeling I got from it was lukewarm. I’m thinking of replacing it with a second case study to take place after the first four lectures.
- Two-stage stochastic LP as pretty much another big fail: This one I attribute to a fail on my part. The material in there is pretty good. It’s just a matter of my doing a better job motivating and explaining it (the written feedback points to its being hard to follow). I’ll keep it for sure and refine the delivery. (Two people actually loved it! :-)
- Simulation as a bit of a polarizing topic: I honestly thought the simulation class was going to be a big home run. The 11 votes in the right column are likely my fault again (based on feedback). I fully believe in the quality of the content, but I need to do a better job connecting it with the statistics course that precedes mine (I assumed too much of the students).
- Beloved decision trees: I had a lot of fun putting together and delivering the decision tree lecture, but I didn’t expect it would get so many “fave” votes. Because it was a new lecture, there were some hiccups that I can easily fix (the same being true for lectures 6, 8, and 9), and seeing how successful it already was gives me even more motivation.
- Case study: I had never written a full-length case study before, but got excited about doing so after an instructional session organized for the faculty at our school. Despite liking the case on paper (it combines marketing and optimization aspects), I was a bit anxious before the actual discussion took place. My students did an amazing job and performed way beyond my expectations. After seeing where they went and what issues they brought up, I feel like I can polish the case to make it even better.
- Lego exercise: Another pleasant surprise. I was worried the Lego exercise might be seen as too simple or silly. I was wrong! The feedback indicated it worked well as a motivational tool, a visual aid, and a good ice breaker (it’s the very first thing I do in Lecture 1).
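A note for readers who haven’t seen network models: part of their visual appeal is how directly the picture maps to a computation. Here is a minimal shortest-path sketch in Python; the network and arc costs are invented for illustration and are not from the lecture.

```python
import heapq

# Hypothetical network: arc costs between nodes (invented for illustration).
arcs = {
    "A": {"B": 4, "C": 2},
    "B": {"D": 5},
    "C": {"B": 1, "D": 8},
    "D": {},
}

def shortest_path_cost(source, target):
    """Dijkstra's algorithm: cost of the cheapest route from source to target."""
    best = {source: 0}
    heap = [(0, source)]
    while heap:
        cost, node = heapq.heappop(heap)
        if node == target:
            return cost
        if cost > best.get(node, float("inf")):
            continue  # stale entry; a cheaper route to this node was found
        for nxt, arc_cost in arcs[node].items():
            new_cost = cost + arc_cost
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(heap, (new_cost, nxt))
    return float("inf")

print(shortest_path_cost("A", "D"))  # A -> C -> B -> D costs 2 + 1 + 5 = 8
```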
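For the curious, the backward-induction idea behind stochastic DP fits in a toy optimal-stopping problem (numbers invented, not from my lecture): sell an asset within 3 periods, where each period brings an offer of 1, 2, or 3 with equal probability, and the final offer must be accepted.

```python
# Toy optimal stopping solved by backward induction (invented numbers).
offers = [1, 2, 3]  # equally likely offers each period

def continuation_value(periods_left):
    """Expected value of behaving optimally with `periods_left` offers to come."""
    if periods_left == 1:
        return sum(offers) / len(offers)  # last period: must accept the offer
    wait = continuation_value(periods_left - 1)
    # For each possible offer, take the better of accepting now or waiting.
    return sum(max(o, wait) for o in offers) / len(offers)

print(continuation_value(3))  # about 2.56, versus 2.0 if forced to sell now
```

The value function shows why waiting has value: with three chances, you only accept an early offer when it beats the expected value of continuing.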
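The two-stage stochastic LP idea in one sentence: commit to a first-stage decision, watch the uncertainty resolve, then pay for recourse. Below is a brute-force Python sketch with invented numbers; a real two-stage model would hand this to an LP solver instead of grid-searching the first stage.

```python
# Hypothetical two-stage sketch (all numbers invented): choose production x
# now (first stage); after demand is revealed, shortages are covered at a
# premium and leftovers are salvaged (second-stage recourse).
scenarios = [(0.3, 60), (0.5, 100), (0.2, 140)]  # (probability, demand)
cost, premium, salvage = 4.0, 9.0, 1.0

def expected_cost(x):
    total = x * cost  # first-stage (here-and-now) cost
    for prob, demand in scenarios:
        shortage = max(demand - x, 0)   # units bought at the premium
        leftover = max(x - demand, 0)   # unsold units salvaged
        total += prob * (shortage * premium - leftover * salvage)
    return total

best_x = min(range(0, 201), key=expected_cost)
print(best_x, expected_cost(best_x))
```

Note how the optimal first-stage quantity hedges across scenarios rather than matching any single demand outcome exactly.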
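And the Monte Carlo idea in a few lines: simulate the uncertain inputs many times and count outcomes. The demand distribution and stock level below are invented for illustration, not taken from the course.

```python
import random

# Hypothetical question: what is the chance that total demand over 5 days
# exceeds a stock of 60 units, if daily demand is roughly N(10, 3), floored at 0?
random.seed(42)  # fix the seed so the run is reproducible

def stockout():
    total = sum(max(random.gauss(10, 3), 0) for _ in range(5))
    return total > 60

n_trials = 100_000
prob = sum(stockout() for _ in range(n_trials)) / n_trials
print(round(prob, 3))  # roughly 0.07 by the normal approximation
```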
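Finally, rolling back a decision tree, including EVPI, is just as compact. This hypothetical two-branch tree (invented payoffs and probability, not from the lecture) decides whether to launch a product in a market that turns out strong or weak.

```python
# Hypothetical decision tree (invented numbers): launch or skip a product;
# the market is strong with probability 0.4, weak otherwise.
p_strong = 0.4
payoffs = {
    "launch": {"strong": 100, "weak": -40},
    "skip":   {"strong": 0,   "weak": 0},
}

def expected_value(action):
    return (p_strong * payoffs[action]["strong"]
            + (1 - p_strong) * payoffs[action]["weak"])

# Roll back the tree: at the decision node, pick the best expected payoff.
ev_best = max(expected_value(a) for a in payoffs)

# EVPI: expected payoff if a perfect forecast let us pick per state,
# minus the best we can do without it.
ev_perfect = (p_strong * max(p["strong"] for p in payoffs.values())
              + (1 - p_strong) * max(p["weak"] for p in payoffs.values()))
print(ev_best, ev_perfect - ev_best)
```

Here the rollback says launch (expected payoff 16), and EVPI caps what a perfect market forecast would be worth.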
I’m pretty happy overall and really enjoyed what I learned from this fave/least fave question. It can be pretty informative. Do you have any other creative ways to elicit student feedback that you find particularly helpful? If so, please share them in the comments below.