Previous month:
November 2008
Next month:
January 2009

December 2008

MBAs and the financial crisis

In a December 3rd blog post, The Economist.com comments on the role of MBAs in the financial crisis. It is well-known that applications to graduate school soar when the economy falters, but Andrew Lo from MIT and Jay Lorsch and Rakesh Khurana from Harvard have debated on a BusinessWeek forum whether "business schools are actually to blame for our current turmoil." Lorsch and Khurana blame b-schools for their "failure to promote a higher cause", while Lo asserts that "the current crisis highlights the growing complexity of the financial system and underscores the sea change in business education from the generic to the specific".

He has a point; however, the move toward specific, detailed knowledge, required by the increasing complexity of financial instruments - which many on Wall Street did not understand, as the crisis has demonstrated - also suggests a coming separation between training for finance careers and training for general management ones. The case studies of b-school help students understand various issues and dilemmas before they face similar situations in real life; to be an effective CEO, there is, after all, no better training than doing. A thriving career in finance nowadays rests on an in-depth grasp of quantitative concepts, while many MBAs stop at standard deviation when they look for a risk measure. Of course, few MBAs become quants (part of the blame for the crisis has also been placed on the computer models the quants used, with assumptions that turned out to be very wrong) - but they supervise them.

According to the Economist.com, Lo believes "MBA students can learn the tools commonly used in modern finance while learning how to model and balance risk properly." That is hardly enough. MBA training could become stronger in quantitative methods (as things stand, some MBA programs offer better quantitative training than others), but the real question is: have MBA programs outlived their usefulness when it comes to training students for finance careers? The ascent of Master of Science programs in Financial Engineering suggests they have. These programs are, for the most part, relatively new, and as more and more graduates get promoted to supervisory positions, more management jobs on Wall Street will be occupied by math-savvy financial engineers. (This obviously won't solve all possible problems. The current crisis also suggests an upcoming boom in continuing education courses for financiers, since many concepts have emerged over the past decade.) MBA programs do offer valuable courses in the management of organizations. But when it comes to finance, the MBA degree has lost its cachet. It is time for more specialized programs to take over.

Happy New Year, everybody! Best wishes for 2009.


Fortune-Tellers, or Issues in Forecasting

CFO magazine has a great article about issues related to forecasting, full of anecdotes about industry practice. Former Fed chairman Alan Greenspan himself has acknowledged many of today's problems are due to poor forecasting, and the article's author, one Vincent Ryan, has done extensive research at a wide range of companies, noting for instance that at one business, the "method of forecasting raw-materials costs is to arbitrarily prognosticate that they will remain flat, pretty much ensuring the numbers will be wrong." Asked how far out they could reliably forecast revenues given current market conditions, 16% of respondents to a survey accompanying the article said they had zero visibility - they were completely in the dark - and 15% said less than two weeks.

Thankfully, companies realize they can't simply give up on forecasting, and the writer throws a couple of familiar names at his readers, such as "scenario modeling, sensitivity analysis, and contingency planning". The problem, the author explains, lies with "static spreadsheets"; a few paragraphs later comes my favorite quote of the whole article: "Whether companies think long- or short-term, the ability to react quickly to events is really all that CFOs can ask of forecasting, say experts. An all-out drive for pinpoint accuracy, especially in light of current events, can be less helpful." (It is my favorite quote because I so completely agree with it.)

The article then describes in some detail scenario planning and stochastic modeling, again using real-life examples. A company's approach "includes short- and long-term components and incorporates what [a company's CFO] calls "stochastic" modeling — generating a host of scenarios that follow a random distribution." (I always love how prudent writers are when they introduce mathematical terms in non-technical publications.) The idea is to develop forecasts that "have a set of conservative baseline assumptions" and then "introduc[e] variances to that baseline to understand the magnitude of the effect on [...] the key drivers." Scenario modeling, which I wrote about here, is a fuzzier, high-level approach ("a set of stories") that deals with big questions, such as "whether a firm should expand into China"; the author gives a few examples and best practices. Obviously I can't repeat them all here, but please do read the article - there is thoughtful information in every paragraph.
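The approach the CFO describes - a conservative baseline plus random variances on the key drivers - is easy to sketch in a few lines. The numbers below are invented for illustration, not taken from the article or any company's actual model:

```python
import random
import statistics

random.seed(42)

# Hypothetical baseline assumptions, in $ millions (invented numbers)
baseline_revenue = 100.0
baseline_cost = 80.0

def one_scenario():
    # Introduce random variances to the baseline drivers
    revenue = baseline_revenue * random.gauss(1.0, 0.10)  # revenue more volatile
    cost = baseline_cost * random.gauss(1.0, 0.05)        # costs less so
    return revenue - cost

# Generate a host of scenarios following a random distribution
profits = [one_scenario() for _ in range(10_000)]
print(f"mean profit: {statistics.mean(profits):.1f}")
print(f"worst 5% of scenarios fall below: {sorted(profits)[len(profits)//20]:.1f}")
```

The point is not pinpoint accuracy but understanding the magnitude of the effect on the outcome - here, how wide the spread of profits is around the baseline.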

The article even contains a mention of Value-at-Risk, called tail risk here. (The company "uses [a wide distribution of scenarios] to determine how much capital the company needs to hold in 99 percent of all possible outcomes." That's Value-at-Risk for me.) It also argues in favor of "driver-based forecasting" - which means "tracking the operational measures [...] that have a decided financial effect" - contingency planning and the ability of business leaders to use software to generate their own forecasts, instead of needing to rely on the finance department. The writer expands this last point into a broader comment on the flow of information, especially from operations to finance and not only from finance (creating the forecasts) to operations (receiving the forecasts).
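The "capital needed to hold in 99 percent of all possible outcomes" is essentially a percentile of simulated losses, which is exactly how Value-at-Risk is computed in practice. A minimal sketch, using a toy loss distribution rather than any company's actual model:

```python
import random

random.seed(0)

# Toy distribution: 100,000 simulated end-of-period losses, in $ millions
losses = sorted(random.gauss(0, 10) for _ in range(100_000))

# 99% Value-at-Risk: the loss level exceeded in only 1% of scenarios
var_99 = losses[int(0.99 * len(losses)) - 1]
print(f"capital covering 99% of outcomes: about {var_99:.1f}")
```

For a normal distribution with standard deviation 10, the figure lands near 23.3; with real scenario distributions the tail is usually fatter, which is the whole reason to simulate instead of relying on the textbook formula.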

While the article is rather long, it is a mine of information for practitioners interested in learning about current practices in forecasting - and academics curious to know what industry people do.


Expertise and Music

People love feel-good stories, and Gilbert Kaplan's has received more attention than most. Kaplan is a businessman who listened to Mahler's Second Symphony one day in 1965 and decided he would conduct it; indeed, he has ended up making a second career out of it, gaining wide (and generally positive) press coverage in the process. He only conducts that symphony - he did take a seven-month crash course in conducting at Juilliard at some point, but has expressed no interest in conducting other works, although he has also recorded a movement of Mahler's Fifth - and earlier this month, he conducted the New York Philharmonic for the 100th anniversary of the work's American premiere.

This set off a flurry of newspaper and magazine features, from The Economist's "Desperately Seeking Mahler" (November 27th 2008) to The New York Times's "Reimagining Mahler as He Imagined It" (December 5, 2008). Journalists at both publications wrote glowingly of Kaplan's career. His first performance of Mahler's Second, in 1982, was a "feat" that "reverberated throughout the music world", and "his recording, made in Cardiff in 1985, has outsold Bernstein, Pierre Boulez, Claudio Abbado and all other contenders", according to The Economist. The New York Times recalled that the few music critics who attended that first performance, a private event, had pledged not to review it, but that Leighton Kerner, of The Village Voice, heralded the performance in print as “one of the five or six most profoundly realized Mahler Seconds” he had heard in a quarter century.

Hence, I was very surprised when I read vehement attacks on Kaplan in the Times about a week after the performance - attacks made by orchestra members, no less, who called him "a talent-free conductor who brings little to the work" and who "complained about Mr Kaplan's conducting for an hour" in a meeting with the orchestra's president on the day of the performance ("Mahler Fan With Baton Cues Unrest In the Ranks," December 17, 2008). It just seems that, if Kaplan's conducting was so awful, it wouldn't have taken 25 years for audiences and critics (and members of other orchestras) to notice it. The issue is that Kaplan is not a professional conductor, and members of the New York Philharmonic felt it was beneath them to be led by him.

But Kaplan, who retired from business long ago, has spent countless hours analyzing the score, and arguably came better prepared than conductors who juggle many engagements. When it comes to music and orchestras, conductor is the job with the lowest barrier to entry. Kaplan applied business methods to realizing his dream - he recognized he needed instruction, identified someone who could serve as his mentor and give him the best possible training, and performed many times. The fact that he is not a professional does not make him a dilettante; he did, after all, take intensive lessons at Juilliard. (If he had been born a few decades later, he could have honed his conducting skills in video games; see "Keeping the Beat Just for Fun", Newsweek, October 29, 2001.)

But it was perhaps unavoidable that professionals would feel threatened, especially those at the top of the totem pole. Members of lesser-known orchestras, I suspect, connect with Kaplan's enthusiasm, in a field that is notoriously ultra-competitive and where many are talented but very few are chosen. Kaplan might have unwittingly summarized it best when he told The Economist: "I had a feeling that people in the audience were urging me to fulfil my dream. They were up with me on the podium that night, playing baseball for the Yankees, writing the book they never wrote or getting the girl they never got." In the music world, playing at the New York Philharmonic is like starting with the Yankees, publishing a book to rave reviews, marrying the prom queen. No wonder they don't care for Kaplan's story, and weren't a good match for him. It'd be like asking the high school jock to become best friends with the nerd with glasses. What did Kaplan think the jock was going to say?


Predicting Success

Yesterday, I came across a fascinating article by Malcolm Gladwell in the New Yorker's Annals of Education. The article, entitled "Most Likely to Succeed", discusses the difficulty in predicting who will make a great hire, when the information at one's disposal says very little about how people will behave in their future position.

The opening paragraphs focus on stand-out college quarterbacks who perform terribly when they move to the NFL - a topic that I find intriguing, and even heartbreaking for the players involved. Gladwell gives excellent, convincing explanations as to why someone who "set every record imaginable in his years at the University of Kentucky" then became "a flop in the pros", and the answer, in a nutshell, is that being a quarterback on a college team and in the NFL involves very different skills. To this day, NFL recruiters struggle with what Gladwell calls "the quarterback problem."

Gladwell then extends his analysis to teachers. He writes: "Your child is actually better off in a “bad” school with an excellent teacher than in an excellent school with a bad teacher. Teacher effects are also much stronger than class-size effects." According to Eric Hanushek, an economist at Stanford, "the U.S. could close [the] gap [about its schoolchildren's academic performance] simply by replacing the bottom six per cent to ten per cent of public-school teachers with teachers of average quality." The issue, of course, is that if schools could tell in advance who was going to make a great teacher and who wasn't, they would have moved to hire only the former a long time ago."The school system," Gladwell writes, "has a quarterback problem."

He also argues that the push for higher standards (a teaching certification or a master's degree) is meaningless, in the sense that holding such diplomas has little correlation with teaching effectiveness, at least according to researchers Thomas Kane from Harvard, Douglas Staiger from Dartmouth, and Robert Gordon from the Center for American Progress.

Many other fields face the "quarterback problem" - one could even argue, any field that hires college graduates. Doctoral admissions committees stare at grades and struggle to guess whether stellar performance on well-defined exam questions, (hopefully) about material already seen in class, can predict the creativity and originality required to do good research, where most problems aren't defined precisely and no one knows in which direction to look for the answer. Human resources employees stare at resumes or transcripts, and hope to divine whether the candidate graduating with highest honors will make valuable contributions to the organization - relying on interviews, and more and more frequently, on-site second-round interviews, which allow recruiters to remove students from their environment and gauge their people skills at a dinner or social event the night before the interview. Flying a group of candidates to the company headquarters is a small price to pay to avoid hiring the wrong person.  

This leaves us with the question: it is hard to guess who will be good at the job, sure, but what are bosses supposed to do? Here, Gladwell comes up with a satisfying example, although he doesn't provide any guidelines, except for the teaching profession. (His approach: get rid of tenure and pay great teachers more - the high risk, high reward method.) His example is about the financial-advice industry, and the intense training job applicants in a company called North Star Resource Group are put through. The company has a ratio of twenty interviewees to one candidate; candidates are not offered jobs but instead participation in a four-month training camp, where they behave like real financial advisers. People who make the cut after the training camp are hired as apprentice advisers, and “even with the top performers, it really takes three to four years to see whether someone can make it,” the co-president of the firm is quoted as saying.

This is the model Gladwell proposes to apply to education, minus the training camp. I felt it could apply just as well to many other professions, and I particularly liked the idea of the short trial period. In a way, businesses already have something similar at their disposal, through co-op programs at partner universities. Interestingly, I've heard many companies would like to have co-op students, but students are reluctant to enroll because they would have to take courses during the summer, or overload during regular semesters, to complete all requirements on time and graduate with the rest of their class. Maybe it's time companies decided to give recruiting priority to students who have successfully completed a co-op with them.

In short, this was a great read, and will be of interest to many people working in fields outside education.


Tired of Uncertainty, and of Spending

The New York Times ran an article yesterday entitled: "Private Colleges Worry About a Dip in Enrollment." The number of students applying for regular admission appears to be down at some schools, after double-digit gains in early applications, and the article reports that two-thirds of the administrators at private institutions who were surveyed worry about a decline in enrollment. Because names like Harvard and Yale jump to mind when people talk about private universities, it is easy to forget that a small liberal-arts college such as Beloit College in Wisconsin "gets three-quarters of its $55 million budget from tuition."

As the journalist points out, the decrease in applications might not translate into a decrease in enrollment - it could be because more students commit earlier through the early-decision program. Some administrators try to put on a brave face by musing that fewer applications might increase the universities' yield (the ratio of offers accepted to offers made). One gets the feeling that the students who do apply to such private schools might get more attractive aid offers this year, to encourage them to accept.

An admissions official at Gettysburg College also referred to the misinformation propagated by the media, a point that the journalist, who authored an article entitled "College May Become Unaffordable for Most in U.S." in the same newspaper earlier this month, does not expand upon. Both her articles are very well documented, with convincing numbers regarding the increase in students applying for financial aid. The obvious conclusion is that the share of tuition costs colleges have traditionally asked families - especially middle-class ones - to shoulder was unrealistic ("The middle class has been financing [college enrollment] through debt", someone in the first article is quoted as saying), and that families have reached that conclusion faster than admissions offices; hence the higher application rates to public universities and community colleges. College administrators might be in massive denial regarding what their financial aid packages can buy.

The real story should be why some private colleges, despite their heavy reliance on tuition, have failed to implement any kind of measure to protect themselves against a downturn. Some administrators' utterances don't exactly inspire faith in their abilities. For instance, Lebanon Valley College is taking steps to "lure more students, including adding lacrosse for men and women and hiring a prominent coach, which [an administrator] thinks will attract 20 to 25 students." The same person is then quoted as saying: “We’ve also increased our scholarship award to children of alums, from $500, which is a nice gesture, to $2,500 a year, which is more than a gesture."

Is a place where children of alums get such blatant preferential treatment a place where children of non-alums really want to study? Doesn't anyone feel outraged? It turns out that Lebanon Valley College has an extensive commitment to merit-based scholarships; students graduating in the top 10% of their high school class get a scholarship covering half the tuition at the college for all four years of study. (Students graduating in the top 20% and 30% also get generous, although smaller, scholarships.) Someone should get a crash course in talking to journalists.

In any case, tuition rates for 2009-2010 haven't been published yet, and we've been here before. The percentages announced in the Spring of 2002, after the dot-com bust, would send shivers down the spine of any parent (New York Times, "As Endowments Slip at Colleges, Big Tuition Increases Fill the Void," February 22, 2002). If families are indeed turning away from less-known private schools for their offspring, they might create a self-fulfilling prophecy, by making these colleges unaffordable. And proving the New York Times journalist right.


Book Trailers

The world changed while I wasn't looking. I found out a few days ago that there is now such a thing as a "book trailer", and that the term was trademarked as far back as 2002 by a company named Circle of Seven Productions. The product owes much of its popularity to MySpace (founded 2003) and especially YouTube (founded 2005); a search for "book trailer" on YouTube returned about 5,000 results. The San Francisco Chronicle wrote about the new trend as early as September 2006; in July 2008, a small publishing company announced a book trailer contest to be judged by Stephen King, and the December 2008 issue of the august Poets & Writers magazine published a lengthy feature on video advertisements.

The idea behind it all is to create an instrument of viral marketing that will "entertain the viewer, who will then e-mail [the book trailer] to a friend." Such a strategy appeals to a younger crowd and might even create new readers in the process. In these troubled times for newspapers ("Feeling the pinch", The Economist, December 4, 2008; "Tribune Co. files for bankruptcy protection", Chicago Tribune, December 9, 2008) and for the book industry ("How to publish without perishing", New York Times, November 29, 2008), it is refreshing to hear about a novel publishing idea in line with the times. With Amazon's Kindle, Google Books, and now video trailers, books are definitely adapting to a technology-driven world.

Whether the format can be adapted to newspapers or magazines is another story - after all, an article trailer would be nothing more than the video teases news channels broadcast before a commercial break, previewing upcoming segments to keep viewers from turning the TV off. James Surowiecki comments in the New Yorker ("News you can lose," December 22, 2008): "Many argue that if newspapers had understood they were in the information business, rather than the print business, they would have adapted more quickly and more successfully to the Net." He also argues that the real problem for newspapers is not the Internet but the attitude of consumers, who have come to expect high-quality news for free - a situation he deems unsustainable: "there are many possible futures one can imagine for [newspapers], from becoming foundation-run nonprofits to relying on reader donations to that old standby, the deep-pocketed patron." By contrast, the book industry almost looks like it's in good shape.


Ivory Towers Meet Real World

Harvard announced two weeks ago that its endowment had lost 22% (International Herald Tribune, December 4, 2008). As if that wasn't bad enough, the double-digit percentage at the Ivy League university translates into a staggering loss of $8 billion, which unsurprisingly represents the fund's worst performance in four decades, according to Bloomberg. I can't even visualize what $8 billion could buy - maybe demolishing the whole Harvard campus and rebuilding it from scratch with creations designed by architect Frank Gehry? (I do like the Harvard campus, though. Very much so. And I hope the Harvard students like their campus too, because construction freezes mean they won't get a new one for a while.)

You might say that $8 billion is barely more than 1% of the money thrown at the finance bailout, but it's also half the bailout amount carmakers had requested from Congress. It is also half of the whole endowment at Yale, which only lost $1 billion, according to this article dated December 16, 2008 in the New York Times. (You know there is something fundamentally wrong with the times when words like "only" and "billion" appear in the same sentence.) All the cheers in September when Harvard boasted of an 8.6% return on its endowment for the previous fiscal year (New York Times, September 12, 2008) are long gone.

Harvard is of course not alone in this situation, and many universities are revising their plans for the coming years; their actions include "institut[ing] a hiring freeze and a moratorium on all construction projects that are not already underway" (Boston Globe, November 11, 2008), for instance at Boston University, Dartmouth, Brown and Cornell. Harvard's endowment in particular "fund[s] more than a third of the university's annual $3.5 billion operating budget" - it is a common misconception among parents and students alike that undergraduate and graduate tuition fees fund universities' operations - which means that any big endowment loss can severely affect operating plans.

On the plus side, this all means that, even now, Harvard's endowment still totals about $30 billion - the university is not exactly destitute. At the rate of a contribution of $1 billion per year to its operating budget, Harvard will run out of money in 30 years. Let's be pessimistic and make it 20 years, and contrast this with the burn rates in the car industry. According to The Economist (November 13, 2008), "General Motors and Ford announced on November 7th that they had burnt their way through a total of nearly $15 billion of their precious spare cash in the third quarter. GM is on course to run out of money early next year; Ford a little later." Additional statistics full of numbers in billions are available in this New York Times article dated December 17, 2008.

The Bloomberg article mentioned above suggests that the performance of Harvard's endowment, in terms of percentage, is in range with that of similar institutions. Numbers found in the Wall Street Journal echo the same claim: "The University of Virginia Investment Management Co. said it lost nearly $1 billion, or 18%, of its endowment over the four-month period, reducing it to $4.2 billion. In Vermont, Middlebury College says its endowment fell 14.4%, to $724 million. In Iowa, Grinnell College's endowment dropped 25%, to $1.2 billion. In Massachusetts, Amherst College says its endowment, $1.7 billion as of June 30, also fell by 25%."

An excellent analysis of what went wrong is available in this article of The Economist, "Ivory-towering infernos," dated December 11th, 2008. It discusses the illiquidity premium that endowments' strategy, pioneered by David Swensen at Yale, was supposed to yield - and did yield up until this year: "By investing heavily in illiquid assets, rather than the publicly traded shares and bonds preferred by shorter-term investors, an institution with an unlimited time horizon would earn a substantial illiquidity premium." Among possible culprits for the current crisis: "The model may have been adopted by endowments that were too small for it", and "Spending is usually based on averaging the value of the endowment over three years. This method may have led them to spend too much when times were good." (The article also says that "universities typically spend around 4.5-5% of the value of their endowments each year" - a percentage of the endowment's value, not to be confused with the share of the operating budget - more than a third, in Harvard's case - that the endowment funds.)

Harvard's president sent two letters to the Harvard community, written in November and December, respectively, which are available here and here. She is correct in pointing out that other revenue streams will be affected - not only the endowment, but also donations and possibly tuition - which makes the economic crisis a "perfect storm" for higher education. Sadly, this doesn't only mean hiring, raise and construction freezes: according to a November 18, 2008 article in the New York Times, "the California State University system is planning to cut its enrollment by 10,000 students for the 2009-10 academic year, unless state lawmakers provide more money." A decrease in the number of college slots would dramatically increase competition for admission at a time when students are graduating from high school in record numbers.

And the students who do get in will probably ask for more financial aid, especially since credit conditions have tightened, making it more difficult for the kids or their parents to take out loans, while the universities will need to make up for the endowment fall by increasing their income somehow. Some public universities, such as Binghamton University, have already seen applications increase by 50% (New York Times, November 7, 2008); even in the California State system, "applications for fall 2009 are up almost 20 percent from last year, with a 36 percent increase in applications from community college transfer students." (New York Times, November 18, 2008) But public institutions are even more financially strained than their private counterparts, due to budget shortages at the state level, which might translate into steep tuition increases.

This might not be the best of times to graduate from college and enter the job market, but it's an even scarier time to be graduating from high school, facing unprecedented competition in the admission process and the sudden prospect of not being able to afford college. Maybe it's time for a higher-education bailout too - not to bail out the universities, but to bail out the parents.


Schantz Lecture, Part 2

This is a follow-up to my previous post. On Friday, December 5, Cynthia Barnhart of MIT gave the second of two lectures at Lehigh, this one for the general public. In her talk, which attracted such a large crowd that some students in the back had a hard time finding a seat (university faculty meetings are held in the same large auditorium - curiously, for those meetings, there is never a seating issue), she explained in broad terms some of the challenges faced by the airline industry, and some of the more questionable policies implemented by the airlines.

Something that stuck in my mind is the choice of performance metrics by the US Department of Transportation. Specifically, Barnhart explained that a flight only counts as delayed if it arrives more than fifteen minutes behind schedule. If one plane is already past that threshold and another is not, there is a strong incentive to game the system by giving landing priority to the plane that is still on time, further delaying the unlucky passengers in the plane already counted as late.

I also enjoyed hearing her analysis of the congestion at US airports. It turns out that average aircraft size has been decreasing, even though the number of passengers has not. On the face of it, flying smaller planes while airports are plagued by congestion does not seem like a smart decision. What is happening is that airlines compete with each other by offering customers more frequency, to make sure passengers don't find a more convenient flight on another airline. No single airline has an incentive to reduce its number of flights (by flying bigger planes), since the many flights of the other airlines would keep congestion problematic, and the sole airline trying to "do the right thing" would end up with customers upset about its fewer flight options.

Barnhart showed interesting graphs about capacity at US airports. For instance, Atlanta airport is scheduled as if the weather were always perfect, and even then, there are times during the day when the number of flights scheduled exceeds capacity. So some of these flights are certain to be delayed into the next block of time, which might push that block over capacity and propagate the delay. (This is when flights are held on the ground prior to departure at an airport far away, as it is safer to keep planes from taking off than to let them circle for hours while they wait to be cleared for landing.) At least the weather in Atlanta is usually good; think about Boston, which can suffer drastic decreases in capacity due to poor weather but is still scheduled as if all of its capacity were always available. This is in sharp contrast with European airports such as Amsterdam, which is scheduled at less than full capacity, allowing for some slack in operations.

Reining in the number of flights scheduled at US airports requires a more thoughtful approach to landing slots, which Barnhart suggests should be auctioned. Now, she also points out that airlines have invested a lot of money in having their own terminals at some airports, and obviously are not interested in seeing another company benefit from their expenses. The scheme for auctioning landing slots seems limited in scope so far, and despite its goal of reducing the record air delays that so frustrate passengers, it has met much opposition, for instance from the Port Authority. (See this article in the Washington Post, "N.Y. Airports to Fight Slot Auction Plan", and this one in the New York Times, "Court Order Delays Auction of Landing Slots at Airports".)

I am not sure how else the situation could be improved. Since the airlines as a whole need to reduce capacity, especially during times of high airport use, and have no individual incentive to do so, maybe the Department of Transportation should fine the airlines that arrive late. This is not a new idea - it was already voiced in this November 2007 USA Today article, the title of which, sadly, suggests DOT isn't terribly successful at its task ("DOT slammed for handling of airline complaints"). But at least that would leave the airlines with the responsibility for selecting which flights they should discontinue.

Another intriguing result of Barnhart's analysis is that the airlines whose flights spend the most time at the gate are not necessarily the ones with the least passenger disruption. It appears that small low-budget airlines have a great track record precisely because their aircraft turn around fast. Bigger airlines apparently operate on a different model, in which banks of planes arrive and then banks of other planes leave, so you are much more likely to miss your connection if your plane is delayed.

The statistics regarding passenger delay were very revealing: average passenger delay is small (I don't recall the exact number, but it was on the order of 10 minutes). This doesn't seem to match the growing number of headlines about catastrophic delays at US airports. What really happens is that a very small percentage of passengers is delayed for hours (an average of seven hours, if I remember correctly), but this barely affects the average delay because so few people are concerned. Those, however, are the passengers who become very disgruntled and create a public relations nightmare for the airline.

What is interesting about all this is that Barnhart has devised operating policies that would significantly decrease the delay suffered by these passengers, at the cost of delaying the other passengers a little - by something like five minutes on average. It appears that such a policy was strongly opposed by the top managers to whom she presented her results, because they did not want to increase the delay of the non-delayed passengers - who, frankly, would probably have remained unaware of it to begin with. Who pays attention to a five-minute delay? The extremely delayed passengers, though, will notice whether they wait four hours or seven at the airport. Some airline managers really are out of touch with reality.

All in all, I found the talk extremely enjoyable, as it shed new light on a topic everyone can relate to and has experience with.


Spencer Schantz Distinguished Lecture Series

Dr. Cynthia Barnhart gave the Spencer Schantz distinguished lecture at Lehigh last week. Or rather lectures, plural, because there were two of them: the first one on Thursday, December 4, was a technical talk geared toward the faculty and students in the Industrial and Systems Engineering department, while the second one, the following day, was intended for a broader audience. Both talks focused on the application of operations research to the airline industry, with different amounts of math in each.

Composite Variables.
(Severe geek alert ahead. If you don't care for math, wait for the next post tomorrow. I'm serious.)

While Barnhart discussed a wide range of topics in the technical talk, from robust scheduling to operations recovery and dynamic scheduling (wouldn't it be helpful if airlines reassigned a flight to a different plane when an aircraft is delayed, rather than sticking stubbornly to the original plan?), the concept that most caught my attention is that of composite variables, which Barnhart has discussed many times before and which never fails to amaze me, because it shows how much of an impact the initial modeling of a business problem can have on its tractability. This is especially true in logistics, where the large-scale nature of the problems makes it imperative to design smarter algorithms.

A review of the branch-and-bound algorithm. Many logistics problems use integer variables (number of trucks, yes/no assignment decisions modeled using 0/1 variables, etc.). Such problems are typically approached by first solving their linear relaxation (the problem where we forget that the variables have to be integer, which is much easier to solve than the original problem), and then gradually reintroducing the integrality requirements.

For instance, if the number of trucks used equals 3.2 in the linear relaxation, we can say that at optimality, either the number of trucks will be less than or equal to 3, or it will be greater than or equal to 4. We can say that because there is no integer strictly between 3 and 4, so the optimal integer solution cannot lie in the part we have just removed. Then we solve these two new linear problems, check which variables are not yet integer, and reiterate. This is called the branch-and-bound algorithm.

A good example is available in these OpenCourseWare notes of a course I TA'ed [was a Teaching Assistant for] many years ago, starting at the bottom of p.3. (Since the variables are binary, they can only take the values 0 or 1, so branching fixes the chosen variable to 0 in one subproblem and to 1 in the other.) If you keep adding constraints to a subproblem to prevent the variables from being fractional, at some point you will arrive at an integer solution and that subproblem will be solved. You will also have a feasible integer solution to the original problem.

What is appealing about branch-and-bound is that, once you have a feasible integer solution, you can sometimes figure out that analyzing other subproblems would be a waste of time, so you don't have to enumerate all the possibilities. This is because solving the linear relaxation gives a bound on the optimal objective of the integer subproblem. For instance, in a minimum-cost problem, the linear relaxation always has a cost lower than or equal to that of the original integer problem, because it has fewer constraints - we have removed the requirement that the variables be integer - so we have more latitude to decrease cost. If we happen to have a feasible integer solution with a cost of 100 and the linear relaxation of another subproblem gives us 103.4, we know that adding constraints to that subproblem to make the decision variables integer can only increase cost; we will never get below 103.4, so we don't need to study that subproblem in more detail.
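To make the mechanics concrete, here is a minimal branch-and-bound sketch in Python on a tiny 0/1 knapsack, an instance I made up (none of these numbers come from the talk). Knapsack is a maximization problem, so the relaxation bound is an upper bound, and a subproblem is pruned when its bound cannot beat the best integer solution found so far; conveniently, the linear relaxation of a knapsack can be solved greedily by value density, with at most one fractional item.

```python
def lp_bound(values, weights, capacity, fixed):
    """Upper bound from the LP relaxation of a 0/1 knapsack, honoring the
    branching decisions in `fixed` (item index -> forced 0 or 1). The
    relaxation is solved greedily by value density; at most one item is
    taken fractionally. Returns (bound, fractional item or None, items at 1)."""
    value, remaining = 0.0, capacity
    taken = [i for i in fixed if fixed[i] == 1]
    for i in taken:
        value += values[i]
        remaining -= weights[i]
    if remaining < 0:                      # branching decisions are infeasible
        return float("-inf"), None, taken
    free = sorted((i for i in range(len(values)) if i not in fixed),
                  key=lambda i: values[i] / weights[i], reverse=True)
    for i in free:
        if weights[i] <= remaining:        # item fits entirely
            value += values[i]
            remaining -= weights[i]
            taken.append(i)
        elif remaining > 0:                # take a fraction of this item, stop
            return value + values[i] * remaining / weights[i], i, taken
        else:
            break
    return value, None, taken              # relaxation happens to be integral


def branch_and_bound(values, weights, capacity):
    """Maximize a 0/1 knapsack by branch-and-bound, pruning any subproblem
    whose relaxation bound cannot beat the incumbent integer solution."""
    best_value, best_items = 0.0, []
    stack = [{}]                           # each node = dict of fixed decisions
    while stack:
        fixed = stack.pop()
        bound, frac, taken = lp_bound(values, weights, capacity, fixed)
        if bound <= best_value:            # prune: cannot beat the incumbent
            continue
        if frac is None:                   # integral relaxation -> incumbent
            best_value, best_items = bound, sorted(taken)
            continue
        stack.append({**fixed, frac: 0})   # branch on the fractional item
        stack.append({**fixed, frac: 1})
    return best_value, best_items


print(branch_and_bound([60, 100, 120], [10, 20, 30], 50))  # (220.0, [1, 2])
```

On this instance the root relaxation takes items 0 and 1 whole and 2/3 of item 2, giving the bound 240; branching on item 2 quickly produces the incumbent of value 220, after which every remaining subproblem has a bound at or below 220 and is pruned without further work.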

The better the bound, the more information we get and the faster we can eliminate subproblems and find the optimal solution. The issue with traditional models of logistics problems is that their linear relaxations typically give poor bounds. So we can't remove subproblems easily, and the computer takes forever to solve the original problem because we have to go through all the subproblems to find the optimal solution. This would be merely annoying for small instances, but it becomes a major issue in logistics, where some formulations have hundreds of thousands of variables.

Composite variables. The idea behind composite variables is to reformulate the problem in an equivalent way that gives better bounds at the linear relaxation level. Composite variables capture several decisions simultaneously. For instance (the following example is taken from these slides, p.15), if you have to deliver 6,000 packages to the sorting center and you can use either an aircraft with capacity 3,000 or an aircraft with capacity 8,000, you know right away that you will use either two small planes or one big plane to match the demand. So instead of writing 3000*x1+8000*x2>=6000, you can write 6000*x3+8000*x2>=6000. Initially, x1 (the number of small planes) could have been 0, 1 or 2 and x2 (the number of big planes) 0 or 1, but now both x3 (whether we use the two small planes) and x2 (whether we use the big plane) are 0 or 1. In other words, composite variables here capture the choice between pre-defined strategies, rather than letting the computer struggle by itself with the details of each strategy.

It gets better! (Really, it does.) Once you have written all that, you can replace the constraint 6000*x3+8000*x2>=6000 with the connectivity constraint x3+x2>=1, which says the sorting center must be supplied somehow (either using the big-aircraft strategy or the small-aircraft strategy). Before, the connectivity constraint x1+x2>=1 would not have guaranteed that the center received all the packages, since it allowed a single small aircraft to supply the sorting center (x1=1, x2=0), bringing only 3,000 packages out of 6,000.

The fact that, with this new formulation, we can drop the demand-capacity constraint 6000*x3+8000*x2>=6000 is good news because (1) we like having fewer constraints, and (2) connectivity constraints like x3+x2>=1 go through integer points (x3+x2=1 is the line through (x3=1, x2=0) and (x3=0, x2=1)). In contrast, 6000*x3+8000*x2>=6000 goes through (x3=1, x2=0), which we like because it's integer, but also through (x3=0, x2=0.75), which we don't like, because if the linear relaxation ends up picking that point as its optimal solution, we won't have integrality of the variables. That would mean more work ahead of us to get to an integer solution. Of course, the other constraints in the problem mean that, even after all this, we won't necessarily obtain an integer solution right away, but at least we'll be a lot closer.
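To see the difference in bounds numerically, here is a small Python sketch comparing the linear relaxations of the two formulations, with hypothetical plane costs I chose for illustration (4 per small plane, 7 for the big one; the slides give no costs). For a two-variable LP it is enough to evaluate the objective at the vertices of the feasible region, so no solver is needed.

```python
def lp_min(cost, a, rhs, ub):
    """Minimize cost[0]*x + cost[1]*y subject to a[0]*x + a[1]*y >= rhs,
    0 <= x <= ub[0], 0 <= y <= ub[1], by enumerating the vertices of the
    two-dimensional feasible region (box corners plus the points where the
    constraint line crosses the box edges)."""
    candidates = [(x, y) for x in (0, ub[0]) for y in (0, ub[1])]
    for x in (0, ub[0]):                       # constraint line meets x-edges
        if a[1]:
            candidates.append((x, (rhs - a[0] * x) / a[1]))
    for y in (0, ub[1]):                       # constraint line meets y-edges
        if a[0]:
            candidates.append(((rhs - a[1] * y) / a[0], y))
    feasible = [(x, y) for x, y in candidates
                if 0 <= x <= ub[0] and 0 <= y <= ub[1]
                and a[0] * x + a[1] * y >= rhs - 1e-9]
    return min((cost[0] * x + cost[1] * y, (x, y)) for x, y in feasible)

# Original formulation: x1 small planes (cost 4 each), x2 big planes (cost 7),
# demand constraint 3000*x1 + 8000*x2 >= 6000, x1 in [0,2], x2 in [0,1].
print(lp_min((4, 7), (3000, 8000), 6000, (2, 1)))   # (5.25, (0, 0.75))

# Composite formulation: x3 = "two small planes" strategy (cost 2*4 = 8),
# x2 = "one big plane" strategy (cost 7), connectivity x3 + x2 >= 1.
print(lp_min((8, 7), (1, 1), 1, (1, 1)))            # (7, (0, 1))
```

With these made-up costs, the original relaxation bottoms out at the fractional point (x1=0, x2=0.75) with bound 5.25, while the cheapest integer solution (one big plane) costs 7; the composite relaxation already attains 7 at an integer vertex, so branch-and-bound has nothing left to do on this constraint.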

This method resulted (according to p.20 of this presentation) in 5.3% savings in annual operating costs and a 7.69% reduction in the number of planes used, which also meant that the leases on those planes could be terminated, leading to additional savings. Logistics modeling isn't for the faint of heart (after all, that's why operations research PhDs are in high demand in industry), but all that hard work was definitely worth it for the company involved.