
March 2008

On "Redefining Global Strategy", Part 2

Here comes the second part of my review of "Redefining Global Strategy", by Pankaj Ghemawat. (Part one is here.) As much as part one was an enjoyable read, at least for a business book, part two was harder to plow through, and I stand by my previous comment that books written by academics are only successful when they are co-authored by journalists - or maybe when I am doing the writing. Ghemawat is actually far better than the average academic author, but I still felt frustrated with his book at times and wished for more examples. Let's just say the temptation to skip to the summary at the end of Chapters 5 and 6 was great.

The second part of the book focuses on three strategies for global value creation, dubbed AAA for Adaptation, Aggregation and Arbitrage. (And yes, the author indulges in AA and Aa abbreviations that would make Standard & Poor's and Moody's proud.) Adaptation strategies "adjust to differences in each country". Without some degree of adaptation, multinational companies attempting to enter new markets face an uphill battle. Even Wal-Mart made blunders such as "stocking US-style footballs in soccer-mad Brazil" (p.107). Ghemawat presents an excellent analysis of the home appliance industry starting on p.108. For instance, he describes the unexpected diversity in clothes washers (which created problems for Whirlpool as it tried to produce fewer models and create economies of scale) as follows: "In France, top-loading machines accounted for about 70 percent of the market [...] West German consumers preferred front-loaders with high spin-speeds of 800 rpm or more. Italian consumers preferred 600-800 rpm, front-loading machines. The British prefer 800 rpm front-loaders, but with a hot and cold water fill rather than cold-water-only supply." (p.111) And you thought selling clothes washers was boring. You can read more along those lines about fridges and ovens on p.111. Ghemawat provides a list of levers and sublevers for adaptation on p.116, for instance segment focus, as demonstrated by Zara (p.121), or partitioning, practiced by McDonald's (and its McSpaghetti, offered in the Philippines but not in Italy, p.125). The author also gives strategies to foster openness, knowledge and integration across countries on pp.134-135, and states in summary that the optimal degree of adaptation depends on industry characteristics.

Aggregation strategies are "all about using various grouping devices to create greater economies of scale than country-by-country adaptation can provide." (p.139) Ghemawat makes the surprising discovery that (semi-)globalization has increased the importance of geographic regions; specifically, "intraregional trade has had more influence than interregional trade on the large increases in international trade." (p.140 - and for those of you who always get confused between inter and intra, 'intraregional' means inside one region and 'interregional' means across regions.) "Of the 366 companies in the Fortune Global 500 for which such data were available, 88 percent derived at least 50 percent of their sales in 2001 from their home regions - with the share of sales in the home region averaging 80 percent for this subgroup." (p.141) This chapter presents regional strategy archetypes (regional focus, regional portfolios, regional hubs, regional platforms, regional mandates, and regional networks), illustrated by the evolution of Toyota. Zara is back on p.147 with the comment that "the decline of the dollar against the euro has inflated Zara's costs of production in Europe, relative to competitors that rely more on dollar-denominated imports from Asia."

Arbitrage is a way of exploiting differences. For instance, labor arbitrage has "played a key role" in Embraer's success (p.179); "Embraer's employment costs came to $26,000 per employee in 2002, versus an estimated $63,000 for the regional jet business of its archrival, Montreal-based Bombardier." This explains why Embraer "has focused its operations on final assembly, the most labor-intensive part of the production process, and has outsourced other operating activities." Ghemawat uses the example of Indian pharmaceuticals, starting on p.180, to illustrate the varieties of arbitrage. Arbitrage is also the reason why the Northern hemisphere can have fresh cut flowers in winter. Chapter 7, on "Playing the Differences", "examines the trade-offs among the AAA strategies and the extent to which it is possible and advisable to pursue more than one of the As at the same time." The author distinguishes between AAA awareness, one-A strategies, compound (AA) strategies and trifecta (AAA) strategies. He also introduces the AAA triangle and illustrates the concept with companies such as IBM, Procter & Gamble and Tata Consultancy Services (pp.205-208). Chapter 8, "Toward a Better Future", "concludes the book with a look at the future of globalization."

On the Engineering Pathway

Today's post is on the Engineering Pathway, a portal to teaching and learning resources in engineering, for use by students and teachers in K-12 and higher education. Initial funding for the portal was provided by the National Science Foundation, in order to (from the award summary) "develop a prototype national digital library for science, engineering and technology education", with Dr. Alice Agogino from U.C. Berkeley as the PI. (The National Science Digital Library can be found here.) The site has grown tremendously since its inception, with associate editors in each engineering discipline - I am the AE for the industrial engineering part of the website; please feel free to volunteer resources on industrial engineering.

My favorite feature is "Today in History"; it is so easy to forget that many accomplishments in engineering are quite recent. Among others:

  • The first vacuum cleaner was patented on February 18, 1901 by an English structural engineer. Unfortunately, it was unable to collect the dust. The first motorized vacuum cleaner was invented in 1907 by a janitor in Ohio who suffered from asthma; he received a patent in 1908 for his invention. A pillowcase served as the dust collector.
  • The first Ford Mustang rolled off the assembly lines on March 9, 1964. From the blog: "The Mustang was one of the most successful product launches in automotive history with over one million units sold in its first 18 months."
  • On November 18, 1963, "Bell Telephone introduce[d] the push button telephone, eventually to replace the rotary dial telephone that had dominated the market since its invention in 1891."

If you would like to write a post on an invention, please let me or another associate editor know.

Besides the "Today in History" feature, I also enjoy the more general Engineering Pathway blog posts, which have many guest authors. One of my favorite posts is by Patricia Galloway, who wrote in March about women civil engineers in history and her own career path. (March is Women's History Month!) You can find a list of posts on women engineers and their contributions here. We are always striving to improve the site and add resources; all comments, questions and suggestions are very welcome!


Laura McLay has a post on INFORMS's new online journal, Analytics, in her blog Punk-Rock Operations Research. (INFORMS is the Institute for Operations Research and Management Sciences.) The journal's editor states: "In the good old days, many 'savvy' corporate CEOs and other assorted head honchos in the public and private sector routinely made critical decisions by the seat of their pants. [...] Today, more and more of these C-level decision-makers are turning to analytics for help in the decision-making process. The stakes are just too high and the competition is just too fierce. [...] [According to the 2007 book "Competing on Analytics", analytics are defined as] 'the extensive use of data, statistical and quantitative analysis, explanatory and predictive models, and fact-based management to drive decisions and actions.'"

This reminds me of the success of Super Crunchers, which I still haven't read, but which The Economist selected as one of the best business books of 2007. Since part of my research is on data-driven decision-making, I am obviously pleased by the attention analytics are receiving. The key issue is to apply the right model rather than make wrong assumptions that will have dire consequences for the company - I have a few posts on real-life examples on my Research page. An appeal of data-driven techniques is that they are built around available data instead of requiring managers to make assumptions they can't check, for instance about demand or customer behavior. (The other part of my research focuses on robust optimization, and guess what the full-page advertisement opposite the editorial was about? Go Frontline Solvers! "Robust Optimization, stochastic programming and simulation optimization made easy to use.")

Analytics has many interesting articles, but since I am French, I'll indulge in a bit of pride for the motherland and mention the article on Renault's supply chain. Managers moved away from a build-to-inventory model (code for: make cars to put into dealerships, and if you produced the wrong car, i.e., one that doesn't match customers' tastes, the dealer will have to offer deep discounts to convince customers to buy it) toward a build-to-order approach (code for: build the exact car the customer wants). A build-to-order perspective is only possible when the time to delivery, also called the lead time, is short enough to convince the customer to wait for his perfect car rather than shop elsewhere. Renault felt a customer would be willing to wait three weeks, but not the six that were usual before the supply chain overhaul. The article provides a great example of information flow and information systems in a complex manufacturing company. Such clarity about company operations is lacking in many business articles, and it was refreshing to get a precise description of the problem at hand.

The article also gives insights into the difficulty of getting buy-in from other parts of the company, and into common business dilemmas such as "in-house developments versus ERP software". (The operations research experts were forced to work with an ERP software supplier, and midway everyone agreed this had been a mistake - as the O.R. people had been saying all along - because the "great specifics of Renault product range description" proved difficult to incorporate into commercial software. Ultimately, Renault had no choice but to "trash all the work done with the ERP software (a few men-years).") The article's author gives a good overview of the principles behind the algorithms, which any industrial engineering undergraduate should be able to follow, and convincingly describes the challenges in sales forecasting, production planning and the like. Towards the end, the author states: "We define with [end users] what the problem is. [...] Then we focus on what should be the characteristics of a "good" solution, so as to be sure to take into account all the business rules. Such questions may seem very basic from an O.R. viewpoint, but they brought a great clarification to business people. An important lesson was that the modeling of the objective function must be validated by business people, even though O.R. technicalities are not easy to grasp for them." I couldn't agree more. Involving business people (the end users) makes the difference between an expensive computer-based toy that will gather dust on a virtual shelf and an efficient tool that becomes widely used in the company. While consulting companies have long been eager to hire O.R. graduates for their modeling and data analysis skills, the article makes a great case for in-house O.R. in more traditional industries.

Is there a future for quantitative business research?

A few months ago, The Economist began "a look at noteworthy articles from the business journals" in its online edition. It is good to see business research getting some attention from the mainstream press; however, the articles covered are very qualitative in nature (read: fluff). The existential necessity of midlife change? (What's new in the journals, February 2008) Door-to-door sales: the forgotten channel? (February 2008) Cost analysis: the acquisition of the items listed in a popular Christmas song? (December 2007) And people get paid for this? (The only two really interesting articles mentioned were Michael Porter's update of his seminal "five forces" article for the Internet age, and an analysis of online retailers' lack of Internet security, i.e., lack of safeguards to protect customers' data.) Practitioners often deem the quantitative research done in business schools irrelevant, on the grounds that real life is hard to quantify, although mathematical models can help provide insights into efficient strategies that are not obvious to decision-makers.

The debate is years old - see this article in The Economist, and two posts of mine (post one and post two): should information flow from companies to schools (business school professors being relegated to the role of witnesses, writing case studies about strategies they played no role in implementing), or from schools to companies (professors helping companies improve by suggesting innovative techniques)? Part of the issue, it seems, is that academics prefer to view themselves as advisers to CEOs, and most CEOs could not care less about the nitty-gritty of implementation, although that is what distinguishes winning strategies from losing ones. It shouldn't come as a surprise that quantitative decision-making has found a new home in finance and is fighting an uphill battle in operations management; journals such as Management Science haven't kept much science in the papers they publish and have refocused instead on qualitative insights.

Operations researchers are the people in the trenches, who make systems run better and help companies save money. Of course, "making systems run better" doesn't quite have the same ring to it as "advising a CEO", but from a consumer's perspective, you can only view, for instance, the FAA's use of operations research techniques to reduce delays in inclement weather as a happy development. The FAA is one of this year's finalists for the Edelman prize, which is awarded by the Institute for Operations Research and Management Sciences for the best real-life application of operations research. Mike Trick explains in this post how competitive the award is (I learned the finalists had been announced from his blog). The press release issued by INFORMS can be found here. It states in part:

"The 2008 Franz Edelman finalists are:

  1. Federal Aviation Administration, for a project entitled “Airspace Flow Programs,” which gives the FAA greater ability to control the nation’s skies at times of peak consumer usage and flight congestion.
  2. Netherlands Railways, for “The New Dutch Timetable: The O.R. Revolution,” a solution that improved on-time performance and capacity for more than a million daily train passengers.
  3. StatoilHydro, one of the world’s largest gas producers, and Gassco, the independent Norwegian network operator, for “Optimizing the Offshore Pipeline System for Natural Gas in the North Sea.”
  4. The City of Stockholm, Sweden for “Operations Research (O.R.) Improves Quality and Efficiency in Social Care and Home Help,” a program that has brought improvements to the complex scheduling of more than 4,000 providers who help the sick and the elderly.
  5. U.S. Environmental Protection Agency, for “Reducing Security Risks in American Drinking Water Systems.”
  6. Xerox, for “LDP Lean Document Production® - Dramatic Productivity Improvements for the Printing Industry,” which has bettered production and reduced costs for print shops and document manufacturers. The total impact to date on Xerox profits from the utilization of the LDP is about $200M. Xerox has filed 48 patents on this methodology and so far 11 have issued."

It is too bad those kinds of accomplishments don't find their way into The Economist's "Journals" page. They would make for a more interesting read than the latest news on door-to-door salesmen.

Should College Last Four Years?

Last month, Princeton announced plans to offer admitted high school students the opportunity to spend one year abroad doing some kind of public service work before they begin college (New York Times, February 19, 2008). The Times adds: "Princeton's president, Shirley M. Tilghman, said in an interview that such a program would give students a more international perspective, add to their maturity and give them a break from academic pressures." Given how competitive college admissions have become, turning student coaching into a lucrative business, it makes sense to allow students a "change of scenery" before they start college - otherwise they'll take a break while registered for class, believing that they've done the hardest part and that the name of the university on their diploma will guarantee them a job. (They're wrong - many companies require students to have a GPA exceeding some threshold, typically in the 3.0-3.2 range, to be considered for interviews.) On the other hand, taking a year off also means graduating later.

This got me thinking about the length of college studies. College has always been four years long, even back in the days when it was reserved for the (male) elite and life expectancy was about ten years shorter than it is now (see the website of the CDC [U.S. Centers for Disease Control and Prevention]). But recent years have seen a shift in college teaching toward a more hands-on, project-oriented approach, which might require more time for students to realize what they are doing wrong if they are expected to take a more active role in their learning (Olin College is the poster child for project-based learning). Even in more traditional colleges, students are encouraged to spend a year abroad or a semester in co-op; in order to graduate with their friends, they often spend the summer before they leave or after they come back taking whichever courses in their curriculum are offered during the summer session, although research-active professors usually do not teach during the summer and some courses are assigned to the ever-mysterious "staff", which is often code for an adjunct or a senior graduate student. (In some respects, these people might be more motivated to do a good job than tenured or tenure-track faculty, but many of the courses are offered in the Summer I session rather than over the whole summer, which means that the instructor has to cram the same amount of material into half the time [in weeks, not time spent in class] by teaching four times a week, and students have little time to assimilate the material.)

On top of that, many companies make offers to students based on their performance during the summer internship between junior and senior years, i.e., one full year before they graduate, although students take many electives related to their major during their senior year (that's the reason why many students see their GPA improve steadily between their freshman and senior years, with the exception of the Spring semester before graduation, when some develop a severe case of "senioritis"). Students understandably prefer having a job offer in hand by the time they graduate. This is also better for the universities themselves, which are ranked by U.S. News and BusinessWeek, among others, in part on their ability to find their students jobs. (Am I the only one who finds it ironic that people cannot expect to spend their whole career in the same place anymore, and cannot hope that one company will "take care of them" in their old age, but that universities have to "take care" of their graduates by making sure they get jobs, lest they be penalized in the always-powerful college rankings? It seems that students are primarily paying tuition for a job at the end of the tunnel, rather than for a good education.) Companies want to snatch the best students away from their competitors - they cannot afford to single-handedly decide they will only recruit in the Spring, and since Spring is traditionally devoted to recruiting summer interns, it all works out very well for the human resources people who are sent to college campuses.

In effect, this means college lasts three years for post-graduation purposes. This system favors students who figured out early on what they wanted to do and don't "waste" time studying abroad, while the others are better off staying one more year in a graduate program to improve their GPA and show their motivation. (If you study abroad during the Fall semester of your senior year, you'd better be willing to stay in school for a master's degree, no matter how high your GPA, as you won't be able to go to job interviews.) With all the talk about 3+2 or 4+1 degrees in higher education, the push toward master's degrees at the same institution where students earn their bachelor's degrees really amounts to a lengthening of college education. Because it has also become critical to stay abreast of recent developments in one's field, it is not clear that college education in the States should last, say, five straight years, as opposed to allowing workers to go back to school after a few years in the workforce - but pretending that college lasts four years is a joke.

Graduates of the French engineering schools stay five years in school - they spend the first two (or three, for those who choose to retake the exams) preparing for the entrance examinations, then three years in the engineering school itself, with the last semester spent at a company doing the equivalent of a really intense, 12-credit senior project (it's closer to a co-op, in fact), which they present in a public session a few days before they graduate, in addition to writing a report on what they did. All job interviews are conducted in the Spring. Of course, in France people typically don't go back to school after they've left, so the goals of the training are different, but French engineers remain among the best in the world. Food for thought on this side of the Atlantic Ocean?

A Good PR Move

There has been much talk recently about top universities giving more financial aid to low-income families, or rather, announcing they will give more financial aid to such families. In December 2007, Harvard announced some loans would be replaced by grants (in other words, you don't have to pay the money back), followed in January by Yale (same link as above, second half of the article). Stanford jumped into the fray in February. MIT followed suit in March. Lehigh has also announced plans to make college more affordable for families in the lower range of the income scale. (Count Princeton in too.) While Harvard's move has been controversial because it also helps high-income families, the focus has been on making college more affordable for people who are not as well-off, and that is an extremely laudable goal at a time when not having a college degree drastically reduces your earning power.

But it's unlikely universities have adopted that stance just to do good, all of a sudden. I'm not saying altruism did not play a role (MIT in particular has a long-standing commitment to need-based fellowships; a few of the students who would not have been able to attend the 'Tute without such financial aid are profiled in the Winter 2008 issue of Spectrum, MIT's magazine for donors, under the headline "Nurturing a dream" - this actually plays a big role in my pride at having graduated from the place), but just around the same time, newspapers such as The New York Times and the Washington Post ran articles about the fact that college frenzy will soon ease at the Ivy League level. The wording in the Post is most telling: "Colleges and universities are anxiously taking steps to address a projected drop in the number of high school graduates in much of the nation starting next year and a dramatic change in the racial and ethnic makeup of the student population. [...] After years of being overwhelmed with applicants, higher education institutions will over the next decade recruit from a pool of public high school graduates that will experience: (i) A projected national decline of roughly 10 percent or more in non-Hispanic white students, the population that traditionally is most likely to attend four-year colleges. (ii) A double-digit rise in the proportion of minority students -- especially Hispanics -- who traditionally are less likely to attend college and to obtain loans to fund education."

In other words, the number of high school graduates will keep rising, but the number of college-bound high school graduates, based on today's observed behavior, will decrease - unless universities change the behavior of groups that traditionally have been less likely to send their offspring to college. The motivation is laudable, and I'm all for making college more affordable for deserving students. But don't tell me the timing is just a coincidence. What the universities are really doing is making themselves attractive to talented minority high school students who would not consider them otherwise, letting them know ahead of time that the Ivy League is an option, so that top-tier schools can remain as competitive as before.

Lehigh Valley Science and Engineering Fair

Yesterday I served as a judge for the Lehigh Valley Science and Engineering Fair, which was held at Lehigh University. Last year's fair was canceled for mysterious reasons that appeared unrelated to student participation (there used to be over 300 projects submitted to the fair), so this year students were understandably reluctant to do work that might be for nothing - only about 170 enrolled and about 120 ended up submitting a project, which at least meant the judges could spend more time talking with each student on the day of the competition. The Lehigh Valley has more than its fair share of thugs and nobodies desperate to feel powerful, so it was a fantastic opportunity to connect with the good people of the Valley or, as one of the organizers put it, in reference to a recent case in the news where X-rated cell phone pictures of local high school students circulated widely among teenagers, "Those are not the students we'll read about in the newspaper, but those are the ones we care about."

I was part of the team judging eighth- and ninth-grade physics (I am proud to report I still understand physics at the ninth-grade level), and I was very impressed by the quality of the projects. The first-prize winner in eighth grade was Kyle Michalski from Sacred Heart School in Nazareth. He studied how electromagnetic fields affect plant growth, because he was interested in recent statements by the World Health Organization about links between EMFs and childhood leukemia. In his experiments, EMFs caused irregular growth in all of the plants - some plants subjected to EMFs did not grow at all, and some grew a lot more than the plants in the control group, which were not subjected to EMFs. These fluctuations were not captured by the average growth in each group (obviously), and I was particularly impressed that an eighth-grader would realize that average numbers don't give the whole picture by themselves. He's ahead of many people in the real world. Second prize went to Eric Rau from Broughal Middle School in Bethlehem (right across the street from where I work - it was great to see what the local kids are up to!), who studied the speed of light in Jell-O of different colors, and established that the speed of light from a red laser was fastest in the red Jell-O. His experiment was very well done.
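
Kyle's point about averages deserves a quick illustration. The sketch below uses made-up numbers (not his actual measurements) to show how two groups can have identical average growth while one is far more erratic:

```python
from statistics import mean, stdev

# Hypothetical growth measurements (cm) -- illustrative only,
# not the actual data from the science-fair project.
control = [5.0, 5.2, 4.8, 5.1, 4.9]   # steady growth
emf     = [0.0, 9.8, 0.3, 10.1, 4.8]  # irregular: some plants stall, some shoot up

# The averages are identical...
print(round(mean(control), 1), round(mean(emf), 1))   # 5.0 5.0

# ...but the spread tells the real story.
print(round(stdev(control), 2), round(stdev(emf), 2))
```

Looking only at the means, the two groups are indistinguishable; the standard deviation (here about thirty times larger in the EMF group) is what reveals the irregular growth.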

In ninth grade, both first and second prizes went to students at Parkland High School in Allentown. Second prize went to Nicholas Dyszel, who studied properties of infrared light, and in particular, which materials do not block the infrared light of a TV remote control. That was great work, with very interesting observations. First prize went to Gilbert Jones for his study of the insulation properties of various materials. He tested the properties of five materials used in wall insulation and compared their performance with what was advertised by the manufacturer. I have to say, every single person on the judging team raved about that Gil Jones kid. (One even joked he wanted to hire him on the spot.) Not only did he brilliantly execute a project of breathtaking difficulty, but in the interview process it became obvious he really knew what he was talking about (why there was drywall at the bottom of the box he built in his basement, what convection was about, and so on). He is a brilliant kid and I hope he continues on the science and engineering career path. Way to go, Parkland!

Integrated Forecasting and Inventory Control

(This post is the introduction of a technical report I co-authored with my doctoral student Gokhan Metan, entitled: "Integrated Forecasting and Inventory Control for Seasonal Demand: a Comparison with the Holt-Winters Approach". Our work on data-driven revenue management is funded in part by the National Science Foundation under grant DMI-0540143.)

Forecasting and optimization have traditionally been approached as two distinct, sequential components of inventory management: first the random demand is estimated using historical data, and then this forecast (either a point forecast of the future demand or a forecast of the distribution) is used as input to the optimization module. In particular, the primary objective of time series analysis is to develop mathematical models that explain past data; these models are used in making forecasting decisions where the goal is to predict the next period's observation as precisely as possible. To achieve this goal, demand model parameters are estimated, or a distribution is fitted to the data, using a performance metric such as the Mean Square Error, which penalizes overestimating and underestimating the demand equally. In practice, however, the optimization model penalizes under- and over-predictions unequally; e.g., in inventory problems backorders are viewed as particularly undesirable while holding inventory is better tolerated. In such a setting, the decision-maker places an order in each time period based on the demand prediction coming from the forecasting model, but that prediction does not take into account the nature of the penalties in the optimization process and instead minimizes the (symmetric) error between the forecasts and the actual data points.
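
The mismatch between the forecasting metric and the inventory cost is the classical newsvendor effect. The sketch below, with hypothetical demand numbers and cost parameters (not taken from the paper), shows that the minimum-MSE point forecast is the sample mean, while the cost-minimizing order-up-to level is an empirical quantile driven by the backorder/holding cost ratio:

```python
import statistics

# Hypothetical demand history -- illustrative numbers, not from the paper.
demand = [80, 95, 100, 105, 120, 90, 110, 85, 115, 100]

# The forecast that minimizes mean squared error is simply the sample mean:
mse_forecast = statistics.mean(demand)

# But if backorders cost b per unit and holding costs h per unit, the
# cost-minimizing order-up-to level is the b/(b+h) empirical quantile
# (the classical newsvendor critical ratio), not the mean.
b, h = 9.0, 1.0                       # backorders 9x as painful as holding
critical_ratio = b / (b + h)          # 0.9
# Crude empirical quantile: the ceil(ratio * n)-th smallest observation.
order_up_to = sorted(demand)[int(critical_ratio * len(demand)) - 1]

print(mse_forecast, order_up_to)      # the quantile sits well above the mean
```

With backorders nine times as costly as held inventory, the cost-optimal order sits near the top of the demand distribution, well above the MSE-optimal forecast; a symmetric forecasting criterion cannot capture this asymmetry.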

In this paper, we investigate the integration of the forecasting and inventory control decisions; in particular, our focus is on comparing the performance of this approach with the traditional Holt-Winters algorithm for random demand with a seasonal trend. The goal is no longer to predict future observations as accurately as possible using a problem-independent metric, but to blend inventory control principles into the analysis to achieve superior inventory management. This work adds to the growing body of literature on data-driven inventory management by focusing on cyclical demand, and by comparing the performance of a novel algorithm developed by the authors, based on the clustering of data points, with that of the traditional Holt-Winters approach. Cluster creation and recombination allow the decision-maker to place his order in each time period based only on the most relevant data. To the best of our knowledge, we are the first authors to propose a clustering approach to the data-driven inventory management problem. Customer behavior exhibits cyclical trends in many logistics applications where the influence of exogenous drivers is difficult to quantify accurately, which makes the approach particularly appealing. The proposed methodology captures the tradeoff between the various cost drivers and provides the decision-maker with the optimal order-up-to levels, rather than the projected demand. A cost decrease of 2-5% in our experiments suggests that inventory managers could greatly benefit from implementing this approach.
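
To make the flavor of the approach concrete, here is a deliberately simplified sketch: it "clusters" historical demand by position in the seasonal cycle and sets the order-up-to level to the critical-ratio quantile of the relevant cluster. The numbers and the clustering rule are illustrative stand-ins; the actual algorithm in the report, based on cluster creation and recombination, is more sophisticated than grouping by cycle position:

```python
import math
from collections import defaultdict

# Hypothetical seasonal demand history with cycle length 4 -- illustrative
# numbers only, not taken from the report's experiments.
history = [20, 50, 80, 50, 22, 55, 78, 48, 18, 52, 82, 51]
cycle = 4

# "Cluster" the data: one group per position in the seasonal cycle,
# a crude stand-in for the report's cluster creation/recombination step.
clusters = defaultdict(list)
for t, d in enumerate(history):
    clusters[t % cycle].append(d)

def order_up_to(cluster, b=4.0, h=1.0):
    """Order-up-to level = b/(b+h) empirical quantile of the cluster."""
    data = sorted(cluster)
    k = math.ceil(b / (b + h) * len(data)) - 1
    return data[k]

# Order for the next period (period 12, position 0 in the cycle),
# using only the most relevant past observations.
print(order_up_to(clusters[12 % cycle]))
```

The key idea carried over from the report is that the output is an order-up-to level rather than a demand forecast: the inventory cost structure enters the computation directly, and only the historical observations deemed relevant to the upcoming period are used.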