
February 2008

Information Overload

Buried in The Economist's report on securitisation last week (February 16 issue) was a mention of investors clamoring for more information in order to better understand financial instruments. The report's authors noted: "Yet reams of information already accompany mortgage-backed securities sold in public markets. [...] So some interpret calls for greater disclosure as whimpering by investors who did not do their homework." That reminded me of a Buttonwood column I had set aside over the summer (Too much information, July 14, 2007) - about information vs noise in a news-intensive world and how to distinguish one from the other. A study by an investment bank even "found that the average forecasting error [on analysts' forecasts of company profits] was 43% over 12 months and 95% over two years." Also striking is the fact that "most of the statistics are revised in subsequent weeks but the revisions rarely have as much market impact as the original figures." Doesn't that make you think of headless chickens?

The column also suggests that increasing the amount of data available can decrease the quality of the decisions made (a new type of Braess' paradox, maybe), and it provides an example that will not surprise anyone but for which I did not expect to find any hard evidence: "An academic study has found that American mutual fund managers are more likely to weight their portfolios in favor of shares in companies in which one of the senior officers went to the same university as they did. The effect is strongest when they were there at the same time and on the same course." (And how am I going to convince my students to pay attention in class rather than make new friends now?)

Given the risk of information overload, I am curious to see how promising innovations such as RFID will evolve - in particular whether they will avoid overwhelming managers with data. Data-driven management can help businesspeople make superior decisions by giving them a fuller picture of customer behavior (for instance, quickly identifying which product is performing above expectations), but making it successful might not be a trivial matter. In the end, the answer might be to outsource data analysis to firms that specialize in it. This has indeed become a booming business, in particular in fields such as supermarket retail that don't sound cutting-edge but have those dear membership cards that keep track of everything you buy.

According to a December 2007 article (Watching as you shop), "Big shops are using elaborate technology to monitor and influence the behavior of their customers." In a project called PRISM, "sensors recorded data on customer-traffic patterns" for Coca-Cola, Procter & Gamble and Wal-Mart, and in yet another trial, a system called BehaviorIQ was "used by retailers to gather data on where their customers go, where and how long they stop, and how they react to different products." Some stores even record the conversations of their shop assistants to track "employee performance and customer behavior." The companies analyzing all this data insist that their algorithms work wonders, and maybe they do. (After all, they only have to work a bit better than the store's current strategy to yield substantial profits.) If so, the true future of super-computing might not lie in the finance industry after all, and data-driven management has a bright future ahead of it.


On Math and Elephants

I once read a story about how people train elephants. The trainer ties a rope (itself attached to a pole or something of that sort) to the elephant when it is a baby, too small to break the rope no matter how hard it tries. So the elephant learns the lesson that the rope cannot be broken, and remains peacefully attached by the puny rope even as an adult, when it could easily break free if it tried again. It just never realizes its circumstances have changed.

I am reminded of that tale when I look at research. Take the Black-Scholes model of option pricing in finance: it assumes Log-Normality of asset prices and leads to an elegant, closed-form formula. The thing is, asset prices aren't Log-Normal - they have fat tails (rare events occur more often than the Log-Normal model suggests). One of my former students, Mo al Najjab, studied the empirical distribution of stock prices in the S&P500 for his Master's thesis and, building on the fact that a Log-Logistic model seems to match real-life asset behavior more closely, priced European call options using simulation. His analysis suggested that call options are overpriced in the Log-Normal model. (You can read more about his work here.) In the early 1970s, when the Black-Scholes model was first developed, computers in the financial industry weren't powerful enough to handle large simulations, and traders came to rely on a (relatively) simple formula that gave them a quick answer. Quants championed the model then because of the difficulty of considering other distributions (although academics started questioning the Log-Normal assumption almost right away; the "Options, Futures and Other Derivatives" book by John Hull has good references on this). Few people, it seems, have noticed that the Black-Scholes model has outlived its usefulness. The fact that a problem couldn't be solved 35 years ago doesn't mean it still can't be.
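
To make the simulation idea concrete, here is a minimal sketch (my own toy example with illustrative parameters, not Mo al Najjab's actual model): it prices a European call by Monte Carlo under the standard Log-Normal assumption and checks the estimate against the closed-form Black-Scholes price. Swapping the terminal-price sampler for one fitted to real data - a Log-Logistic fit, say - is then a one-line change, which is exactly the kind of experiment that was out of reach in the early 1970s.

    import math
    import random
    import statistics

    S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0   # illustrative parameters, not market data

    def terminal_price_lognormal():
        # Risk-neutral Log-Normal terminal price: S_T = S0 * exp((r - sigma^2/2)*T + sigma*sqrt(T)*Z)
        z = random.gauss(0.0, 1.0)
        return S0 * math.exp((r - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)

    def mc_call_price(sampler, n=200_000):
        # Discounted average payoff over simulated terminal prices
        payoffs = [max(sampler() - K, 0.0) for _ in range(n)]
        return math.exp(-r * T) * statistics.fmean(payoffs)

    def black_scholes_call():
        # Closed-form benchmark under the same Log-Normal assumption
        d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
        d2 = d1 - sigma * math.sqrt(T)
        N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
        return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

    print("Monte Carlo (Log-Normal):", round(mc_call_price(terminal_price_lognormal), 2))
    print("Black-Scholes closed form:", round(black_scholes_call(), 2))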

Research is a field with a built-in bias towards following the leader (the first person to publish on a topic) without challenging his or her assumptions. People want to publish papers to get tenured or promoted, and it's easier to gain acceptance if you propose an improvement to an already published paper, as one of its authors is likely to review your work - and more often than not will not enjoy reading that you think his or her model has flaws. So let's take a moment to recognize the authors of two innovative works I recently came across. The first is due to Amanda Schmitt, who is currently completing her PhD dissertation here at Lehigh under the supervision of my colleague Larry Snyder (I'm on her thesis committee). Amanda is studying inventory management for supply chains subject to supply uncertainty. One thing that caught my attention when she presented her thesis proposal is that she used multi-period models. This seemed an obvious choice, but she explained that quite a bit of the literature has focused on single-period models. And you'll say: but isn't the whole point of studying disruptions to quantify their long-term impact on the system? How can you measure the speed of recovery if you only have one time period? Don't ask me. It's good that people take a step back once in a while.
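
To see why the number of periods matters, here is a toy multi-period sketch (my own illustration with made-up numbers, not Amanda's model): a base-stock inventory system with limited replenishment capacity gets hit by a supply disruption. A single-period model would only show the size of the shortfall; the multi-period trajectory is what reveals how long the system takes to recover.

    import random

    random.seed(0)
    base_stock, capacity, horizon = 50, 20, 30       # assumed policy and replenishment capacity
    disrupted = set(range(10, 15))                   # supplier unavailable in periods 10-14
    inventory, trajectory = base_stock, []

    for t in range(horizon):
        demand = random.randint(5, 15)               # illustrative demand, mean of 10 per period
        inventory -= demand                          # negative inventory stands for backorders
        if t not in disrupted:
            inventory += min(base_stock - inventory, capacity)   # order up to base stock, capped
        trajectory.append(inventory)

    print(trajectory)   # inventory dives during the disruption, then needs several periods to climb back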

The other work is due to Russell Meller and Kevin Gue, of the University of Arkansas and Auburn University respectively. I found out about it at the recent NSF Grantees Conference in Knoxville, Tennessee. Meller and Gue have studied how to improve warehouse design to minimize the distance traveled by the workers, and have come up with what they call a fishbone design. You can read more about Meller and Gue's approach here. Curiously, it hadn't occurred to anyone before them to check whether the traditional design (straight aisles with one large alley cutting through them at a perpendicular angle) was anywhere close to optimal; it turns out the "fishbone aisles" design reduces picking costs by 20 percent. That makes you wonder what else is being done very inefficiently out there because nobody challenged the assumptions.


The Media's Anti-Math Bias

The first few paragraphs of an article in yesterday's Washington Post ("Parents Rise Up Against A New Approach to Math") left me shaking my head. "The Prince William County third-grader did not stack the numbers and carry digits from one column to the next, the way generations have learned. [...] His computation amounted to an upside-down pyramid with numbers at the bottom." Doesn't that sound crazy to you? Doesn't it seem that yet another educational novelty is hurting students more than it helps them? (Integrated math, anyone?) "In Prince William and elsewhere in the country, a math textbook series has fomented upheaval among some parents and teachers who say its methods are convoluted and fail to help children master basic math skills and facts." It's hard to disagree. Then the rest of the article left me shaking my head too, but for the opposite reason.

Somewhere towards the bottom of the first page online, a math supervisor gave the example of 23 times 5, and that's when I understood what the students were probably being taught. The inverted pyramid was 20 on top of 3, and each of the numbers was multiplied by 5. This is called distribution, people. In (20+3)*5=20*5+3*5, we say that the 5 distributes over the terms inside the parentheses. (20+3)*5 is not the same as 20+3*5, because multiplication has priority over addition (does its thing first). You might find this obvious, but even some of the college seniors I teach, who are among the best in the nation (we're a Top 15 industrial engineering department), struggle with the concept - clearly someone failed to teach them along the way. And if they weren't taught parentheses and distribution, then there are a lot of people out there who weren't either.

This is not a trivial matter: if you enter 20+3*5 into the computer but mean (20+3)*5, it will still give you an answer. You just will never realize you didn't get the right one. This is precisely why third-graders should care. I did learn multiplication the old-fashioned way, and only learned distribution some years later, but the concept matters to young kids too, because today's education is geared towards "getting it right" and kids are not being taught to check whether their answers make sense (what I call "sanity checks" in my courses). I cannot emphasize enough how important this is, in particular when you talk to someone about your money or discuss financing with your car dealership. I once talked to a banker whose calculations were off by a factor of ten; this obviously did not inspire much confidence in his abilities. You need to be able to figure out quickly the order of magnitude of what people are telling you.
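
A two-line illustration of the point (any programming language or pocket calculator behaves the same way):

    print(20 + 3 * 5)    # 35  - multiplication binds first, so this is 20 + 15
    print((20 + 3) * 5)  # 115 - the parentheses let the 5 distribute: 20*5 + 3*5 = 100 + 15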

The idea behind the new textbook, I believe, is that (20+3)*5 is 20*5 plus something, so the answer should be higher than 100. Similarly, because 23 is less than 30, the answer is going to be less than 150, and because 23 is closer to 20 than to 30, the answer will be closer to 100 than to 150. Basically, you sandwich 23 between numbers that are easier to manipulate (read: have zeros at the end) and multiply those by 5. Maybe teaching kids how to get the exact answer using distribution rules is going a bit overboard, but the principle is sound, and I am sure it saves a lot of time on multiple-choice tests because you can rule out some answers quickly. The Post journalist, who obviously had no clue what he was writing about, might want to brush up on his math skills.
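
Spelled out with the same numbers (a small sketch of the sanity check, nothing more):

    lower = 20 * 5            # 100
    upper = 30 * 5            # 150
    exact = (20 + 3) * 5      # 115: closer to 100 than to 150, just as 23 is closer to 20 than to 30
    assert lower < exact < upper
    print(lower, exact, upper)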


The Joys of Modeling

Here are a few quotes from articles on models in business and finance published in The Economist over the past several months.

  • "Morgan Stanley uses no fewer than 13 models to value currencies. [Its] latest update offers a wide range for the euro's fair value against the dollar, from $1.02 to $1.29, with a median value of $1.15. [...] The yen might be anything between 18% overvalued and 29% undervalued, depending on which model you trust. [...] Morgan Stanley uses only four models to estimate the yuan's fair value, of which the median valuation suggests it is only 1% undervalued against the dollar - not the answer Congress wants." (Misleading misalignments, June 23, 2007) I loved the "only four" part.
  • "Even if house prices were to fall by 20% in real terms over the next two years, and even if the links between house prices and spending proved to be stronger than the central bank ordinarily assumes, [...] the economy's growth would never be dragged down by more than half a percentage point, while the unemployment rate would rise by only 0.2 of a percentage point. Reassuring stuff - except that [these] rosy figures assume that central bankers can quickly and pre-emptively loosen monetary policy." (Tangled reins, September 8, 2007)
  • "[According to Morgan Stanley,] the operating earnings of companies in the S&P 500 index will have grown at an annual rate of 9.4% in the second quarter, and by 8.5% over the year as a whole. [According to the Bureau of Economic Analysis,] domestic profits were lower in the second quarter than they were in the same period of 2006." So are profits strong or weak? It turns out "these numbers do not compare like with like." (The Profit Puzzle, September 15, 2007)
  • "The standard statistical approach to risk management is based on a bell curve or normal distribution, in which most results are in the middle and extremes are rare. [...] But financial history is littered with bubbles and crashes, demonstrating that extreme events or so-called "fat tails" occur far more often than the bell curve predicts. [...] Models should have been built on the assumption that liquidity can disappear overnight. But [...] in the short term, those using conventional models would take greater risks [assuming assets are liquid...] and earn higher returns for their clients. The cautious firm would lose business." (Financial Markets, October 27, 2007) The little sketch after this list puts a rough number on how badly the bell curve understates those extreme moves.
  • "According to Hedge Fund Research, the average return [of hedge funds] at the end of October was 3.3%. But do these figures represent the true scale of the problem? [...] The suspicion is that some credit-related funds have been valuing their assets on the basis of prices derived from models, rather than real markets. 'The mathematical models people have been using to value credit derivatives have been very naive,' says Paul Wilmott, a financial consultant." (Serial Crunching, November 22, 2007)
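
As promised, a toy comparison of the bell curve with a fat-tailed alternative (standard textbook distributions and illustrative parameters of my own choosing, nothing to do with any firm's actual risk model): count how often a four-standard-deviation move shows up under a normal distribution versus under a Student-t distribution with three degrees of freedom, scaled to the same variance.

    import math
    import random

    random.seed(42)
    n, threshold, df = 1_000_000, 4.0, 3

    def fat_tailed(df):
        # Student-t variate (normal / sqrt(chi-square/df)), rescaled to unit variance
        z = random.gauss(0.0, 1.0)
        chi2 = random.gammavariate(df / 2.0, 2.0)
        return (z / math.sqrt(chi2 / df)) / math.sqrt(df / (df - 2))

    normal_hits = sum(abs(random.gauss(0.0, 1.0)) > threshold for _ in range(n))
    fat_hits = sum(abs(fat_tailed(df)) > threshold for _ in range(n))

    # Expect a few dozen extreme moves under the bell curve and several thousand under the fat-tailed one.
    print(f"|move| > {threshold} standard deviations: normal {normal_hits}, fat-tailed {fat_hits}, out of {n:,} draws")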

This article summarizes the situation as of August 2007 quite well. The best piece of writing on the topic was published in November 2007 under the title "A new fashion in modeling" and includes suggestions for the future under the name of imperfect-knowledge economics. The author points out that contemporary economists "continue to pursue the perfect economic forecast despite abundant evidence that it does not, and cannot, exist." (Economists focus on point, or sharp, forecasts rather than range forecasts.) While the column's author gives scant details on the new, better approach, the fact that "the forecaster recognises that his model will inevitably be less than perfect" sounds like a good start; using "qualitative regularities in the way that market participants respond to new information" would certainly make a fine policy, if only I could figure out what 'regularities' is supposed to mean here. (Economists!) We are also told about Knightian uncertainty - "most business decisions involve a step in the unknown that is to some extent unmeasurable" - and it seems hard to believe that a theory so central to today's business world was first disseminated in the 1920s. The journalist concludes with a comment by the authors of Imperfect-Knowledge Economics, regarding the rating agencies' performance and the subprime-mortgage mess: "[There is] empirical proof that relying on models alone is not wise." The question is: are we giving them the tools to do better next time?


Women in Engineering Academia

I'm sometimes asked what it's like to be a woman in an engineering field and in academia. I suppose the fact that I'm still here gives it away: it's been good - as a matter of fact it's been really very good, and I'm proud to be here. I've always been a bit suspicious of the media's emphasis on role models (isn't it more exciting to do something that has never been done before? do students even care?), but I like to think I've played a tiny role in helping some students figure out what was best for them. That's also why I enjoy my role as an academic adviser for engineering freshmen so much. While most of them keep themselves out of trouble and have a good idea of what they want to major in, every year I do help the one or two lost souls who don't know whether they should stay in engineering and need a sounding board to weigh their options. I'm not going to pretend I've transformed the lives of thousands. When I think about it, I can count about five students in my first three years at Lehigh for whom I really do believe I made a difference - seven, max. (While I've enjoyed discussions with many more, and many more have enjoyed IE 316, I doubt my not being there would have had a life-changing impact on them.) But hopefully the others will put the skills they've learned in IE 316 to good use, and it certainly matters to the five students in question that I was there. All this to say that, as someone who got into this line of work to help people and be a positive influence, I'm not going to complain about the increased visibility that being a woman in a traditionally male field has given me.

For the past two years or so I've been the only woman in my department (before that there was one other female faculty member), and until last September all my colleagues were white men, a significant fraction of them getting close to retirement, at a university that only began to admit women to its undergraduate programs in 1971. (Now we also have a visiting Assistant Professor of Indian origin.) It turns out that my colleagues are fantastic; they were fantastic when I interviewed and have remained so ever since. (Now, if only they stopped ordering thick pizzas for the lunch meetings... And for the record, they wrote my reappointment letter a long time ago.) In contrast with industry, the challenge in achieving gender equality in academia might not come from one's peers or one's superiors - especially at places on an upward trajectory like Lehigh, you're hired because you can do good work, and you're given the resources to perform well not because having more women is "nice" but because having professors succeed helps the university ascend in the rankings. I like that philosophy.

Instead, the main obstacle to having more women gain tenure in academia might well be related in part to the large number of international graduate students, especially when they come from countries where women are not always as respected as men. While many of these students might consider having a female adviser, if only because they can't afford to stay in school without a research assistantship, others may struggle to take women seriously. Of course you cannot blame it all on international students; a majority of my female acquaintances working in industry prefer to have male bosses. But while in industry people don't have a choice regarding who they work for, in academia graduate students have much more flexibility. In the end, female faculty members select advisees from a smaller pool of candidates, which can affect the quality of their papers, their publication record and their chances of getting tenure. That's why I am pleased with the well-publicized choice of women as university presidents but don't believe it is going to have much of an impact on gender equality in academia. The example of a female leader might cheer women in industry; it doesn't help female academics trying to recruit students.

As final words of wisdom for today, when I talk to female undergraduates I often advise them against over-interpreting any one situation. In my days on the Graduate Student Council at MIT, I served as co-chair of the Academics, Research and Career committee alongside two other co-chairs, both from India. I got along very well with one of the two, while the other kept trying to make all the decisions by himself. One day he even told me he would prepare the agenda for the meeting and I could order the food - I am not kidding. It would be easy to think I was being discriminated against. The truth is, after observing him and working with him for a while, I came to the conclusion that he simply was one of those people whose parents have told them a bit too often that they were little geniuses - the kind who wants all the attention and all the credit all the time, no matter what. He was still a jerk, but that had nothing to do with me.

If someone steps on your toes, you're not going to ask: "are you stepping on my toes because (a) you're angry at me, (b) you want the extra height, (c) you didn't see me, or (d) you step on any toes you can see within a five-mile radius just for the fun of it?" You're going to say: get off my toes! While, sadly, discrimination against women still exists in this day and age (the class-action suit against Wal-Mart is a good example; the harassment suit against Madison Square Garden is another), in many cases women spend a lot of time trying to figure out whether they're actually suffering from discrimination - and, although their boss might be terrible and make them very unhappy, they might not be. My take is that all this energy would be put to better use elsewhere. If you're not happy, take action. More often than not, people don't realize the extent of the choices they have because these choices don't match what they had hoped for themselves. (That is true of women who think they're discriminated against, of women who have female bosses they can't stand, of anyone who doesn't like his job.) I know of one friend at MIT who felt harassed by the male students in her lab and instead did her whole thesis in the research department of a company, so that she could keep her adviser but wouldn't have to work in the lab. People less determined would have dropped out; she created a solution that worked for her. I'll admit it's difficult to resist the temptation of self-pity at times, and we're all dealt bad hands every so often. But when all is said and done, I'd like my students to remember that their response is their choice. If I can use the fact that I am the only woman in my department to get my message across, that's perfectly fine with me.


Distance-Learning Education

The Economist has published a ranking of distance-learning MBA programs on its website. Many of these institutions do not have the same kind of name recognition as their brick-and-mortar counterparts, although the MBA degree lends itself quite well to distance education - students enrolled in those courses often keep their full-time job and can therefore apply their new skills as soon as the lecture is over, without having to put their career on hold for two years or to commute to evening classes at the end of a long day. (The drawback, of course, is that you don't benefit from a tightly knit network of classmates.)

Because not everybody has the chance to live near a first-rate university where they can go back to school part-time, and because technology has made it possible to take courses online, it makes sense for universities to put some of their course content on the Web. The issue is, of course, how to assess students' understanding when they're so removed from the professor and don't have much time to study. For that reason, distance-learning degrees tend to inspire more confidence when they further the recipient's knowledge rather than introduce him to a whole new discipline - that is, when they're geared toward continuing education and are used to obtain certificates (shorter courses of study well-suited for professionals who don't want to toil away on homework for years) rather than full-fledged Master's degrees.

Distance-learning MBAs will work best for people already in mid-level management positions who want a stamp of approval confirming they know what they're doing. Such individuals don't need to build a strong network of new relationships or to use the university's career services, as they're more likely to stay at their current company. But that also raises the question: is the degree worth the price? It seems that people are so keen on "self-improving" all the time that they spend their money on online courses when they could just as well buy a book and learn the same material, maybe in more depth. Continuing education to get a degree is certainly better than continuing education for the sake of it (this article in SmartMoney suggests that the instructors then become the ones with motivation problems, despite the hefty price tag for the students), and professionals who invest the time and money to learn should be commended. But I can't shake the feeling that people set themselves up for major disappointments if they hope an online degree will transform their lives.

I'd like to see higher education evolve toward a middle ground between distance learning and classroom learning. In an era of videoconferencing and telecommuting, students (including undergraduates) should be prepared for a wide array of work situations. Allowing some courses to be taken online could also help with large classes, where students tend to disengage faster than in recitations. The whole point of sitting in a classroom is to have more interaction with an instructor - which presumes you can actually hear what the instructor has to say, see the blackboard or the screen where the slides are projected, and not get distracted by your neighbors' discussions. Classes currently held in big auditoriums might benefit from having the lectures posted online, so that students can view them when they want rather than being force-fed introductory chemistry at the ungodly hour of eight in the morning. I'd much rather see online learning become a component of every student's education than a category of degrees that might or might not hold its ground when competing with its brick-and-mortar cousin.


Doomed from the Start?

Last week I came across a fascinating column in The Economist (February 2, 2008 issue), with the wonderful title of Toy Story, on what toys in Europe and America have to say about their residents' career aspirations. While Anglo-Saxons (Brits and Americans are lumped together) display a "fondness for heavily marketed novelties tied to films or television" and, as a result, in the States firms like Denmark's Lego or Germany's Playmobil are dwarfed by Mattel, "[in Germany] parents like to see boys assembling elaborate structures in their bedrooms; Lego is the top toy brand in Germany. In contrast, the French shun construction toys, preferring the world of the imagination. Playmobil is their leading toy brand." In the words of Playmobil's chief executive, "'the dream of every German mother' used to be to have an engineer for a son." It's hard to imagine American parents having the same dream. Maybe the fondness of American kids for super-heroes and movie figures plays a role in the country's infatuation with entrepreneurship - it is, after all, the closest you can get to calling the shots and ordering your troops around, with the obvious exception of the police and the military (which, however, have the "slight" disadvantage of requiring you to start at the bottom of the totem pole and work your way up in order to get a job vaguely resembling what is shown in movie theaters).

This raises the question: is engineering in the States doomed from the start? The main issue, I think, is that people tend to perceive it as an end choice (you start as an engineer, you retire as an engineer), while most engineering graduates quickly move to management jobs thanks to their ability to analyze data and understand numbers. A significant chunk of industrial engineering graduates in particular never enter the engineering ranks at all and are snapped up by consulting companies. Because business training, both at the undergraduate (BA in business and related majors) and graduate (MBA) levels, tends to be light on math, there is a real need for people who will not shy away from numbers or blithely accept whatever predictions they're being offered. Engineering graduates have the training to understand what is realistic and what is not; they also understand line operations better when they cover companies as analysts for financial firms. You can also find many engineering graduates in law schools, sometimes intending to specialize in intellectual property and patent issues and sometimes not, but always benefiting from the attention to detail and analytical skills that come with their training. The challenge, then, becomes to educate students and parents on what an engineering career really means - most of the time engineering will only provide a stepping stone to a management career path. (I am obviously biased when it comes to assessing which Bachelor's degree, the one in business or the one in engineering, ends up providing the better opportunities down the road.) This shortsightedness of the rest of the population is only good news for the current engineering students. Hopefully they will later be the ones buying Legos for their kids rather than the Spiderman XXIV action figure.


On Webpages and Web Search

I recently stumbled upon an old copy of Technology Review (May/June 2007 issue) in which the graphic designer Roger Black discusses recent advances in Web design, including Flash-based software and more interactivity, allowing the end user, i.e., the reader, to play with the page design ("Help me Redesign the Web", pp. 60-61). Black contributed to the New York Times Reader, where "if you resize the windows, columns reflow, pictures change size, and ads drop in and out." As the proud owner of the digital archives of the New Yorker, I can say with confidence that the New Yorker system - where each page is a giant photograph and scrolling up or down without losing your place in the paragraph can be politely described as complicated - could use some improvement. I do my own little version of designing when I occasionally block advertisers' content from the webpages I visit, in particular the ads with flashing colors that distract me when I try to read. (Right-click on the image and select "Block images from..."; of course you might also end up blocking images you really want, such as the one- to five-star ratings on Amazon.com, but you only have to right-click and select "Allow images from..." to reverse it. It's also great for keeping blinking ads out of your emails, courtesy of The Washington Post.)

It does seem to me, however, that a more pressing problem than two-way Web design is the relevance of the pages that come up in a web search. The homepage of the Journal of Optimization Theory and Applications has apparently not been updated since 1997. (That one made me think web pages should have expiration dates, say, two years after their last modification, and be removed from search results once they expire.) Google now allows users to filter for pages first seen in the last 24 hours, past week, and so on, up to the past year, but older pages updated recently, and as a result still relevant, are not picked up by the search. Search results, with their thousands of hits, make me think of one gigantic mail inbox with neither folders nor archives. Maybe one day we'll get tabs (thank you, Mozilla Firefox) or some kind of tree structure for search results - pdf files here, html files there, results from universities here and from .com sites there, and in bold the results where the key words appear more than five times - rather than asking users to put up with all the results in whatever order they come, out of faith in Google's PageRank algorithm.
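
A minimal sketch of the kind of grouping I have in mind (made-up results and a made-up rule of thumb, not any search engine's actual API): bucket hits by file type and by domain, and flag the ones where the keyword shows up more than five times.

    from collections import defaultdict

    # Each entry: (url, file type, number of keyword occurrences) - hypothetical data
    results = [
        ("https://university.edu/seminar.html", "html", 6),
        ("https://store.com/sales-pitch.html", "html", 2),
        ("https://university.edu/paper.pdf", "pdf", 7),
    ]

    grouped = defaultdict(list)
    for url, file_type, count in results:
        domain = "university" if ".edu/" in url else "commercial"
        label = "*" + url if count > 5 else url      # the asterisk stands in for bold
        grouped[(file_type, domain)].append(label)

    for bucket, urls in sorted(grouped.items()):
        print(bucket, "->", urls)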

Another interesting idea would be to let the algorithm learn the user's preferences, by allowing her to rate how useful she found this or that webpage and having PageRank adjust accordingly. As Black points out in the Technology Review article, "the Web was conceived as a way for researchers and scientists to share documents." You knew the value of a page beforehand, since the person who had created it was one of your collaborators. Nobody anticipated the Web's growth over the following decade, and nobody forecast the growing problem of cyber-junk. Now that search engines have become a necessary tool to navigate the Internet, we might want to turn them into cutting-edge GPS systems rather than cumbersome and outdated print maps.
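
A back-of-the-envelope sketch of what that could look like (my own toy scoring rule with made-up numbers, not how PageRank actually works): blend the engine's original score for a site with the average rating the user has given that site in the past, so that sites the user found useful drift up the list.

    base_scores = {"siteA.com": 0.9, "siteB.com": 0.7, "siteC.com": 0.6}   # hypothetical engine scores
    user_ratings = {"siteB.com": [5, 4], "siteC.com": [1]}                 # 1-5 stars given by the user

    def personalized_score(site, weight=0.3):
        ratings = user_ratings.get(site)
        feedback = sum(ratings) / len(ratings) / 5.0 if ratings else 0.5   # unrated sites stay neutral
        return (1 - weight) * base_scores[site] + weight * feedback

    for site in sorted(base_scores, key=personalized_score, reverse=True):
        print(site, round(personalized_score(site), 3))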


No Science For Congress

For a while it seemed like it was going to happen: politicians were talking about strengthening the nation's competitiveness, and the President himself proposed the American Competitiveness Initiative, stating: "Federal investment in research and development has proved critical to keeping America's economy strong by generating knowledge and tools upon which new technologies are developed." Presidential candidates offered innovation agendas. In its FY 2008 budget, the National Science Foundation requested $6.4 billion, which is alternatively described as an 8.7% increase over FY 2007 estimated expenses or a 6.8% increase over the FY 2007 budget request (the White House website also cites the 6.8% figure). This table illustrates NSF priorities quite nicely; for instance, the Office of Cyberinfrastructure hoped for a 9.6% increase, and some of the engineering directorates, including the one that funds my research, expected increases in the 14-16% range.

In what might be a move by Republicans and Democrats to clear the way for some easy victories for the new president next year (I am such an optimist...), the final budget bill gives NSF a puny 2.5% increase. A news bulletin issued by the American Institute of Physics on December 19, 2007 summarizes the situation as follows: "Later today, the House of Representatives will approve the FY 2008 budget bill. President Bush is expected to sign this bill, which contains a 2.5 percent increase for the National Science Foundation for FY 2008. This increase is but a fraction of the 8.7 percent increase requested by the Administration, the initial 10.0 percent increase approved by the House, and the initial 10.8 percent increase approved by the Senate." Craig Barrett, chairman of Intel, voiced his disapproval in an op-ed piece in the San Francisco Chronicle, which I discovered thanks to John Hunter's blog. (I found the fact that "the funding decisions on the America Competes Act took place a few days after Congress passed a $250 billion farm bill" particularly interesting.) Barrett emphasizes that science and engineering have become a priority for other countries, and that recent hurdles to legal immigration, due to the backlash against its illegal counterpart, have made it difficult to "import the best and brightest minds."

So what now? As much as I would have preferred to see a large increase in the federal budget for science and engineering (what a surprise), there are plenty of ways for the US to remain competitive besides the NSF providing research grants. The burden, however, shifts to decentralized actors - companies, universities and research labs - which makes it more difficult to bring research and innovation back to the forefront: companies need to write sizable checks, and thousands of administrators need to dip into their university endowments, before the total has a noticeable impact on the amount of research funded. (Undaunted by the challenge, quite a few companies have begun to fund research with no strings attached, in a trend that was missing from, but would have fit well into, The Economist's recent report on corporate social responsibility.) But all this talk focuses on input - the inflow of money - rather than output - a highly educated workforce that will implement innovative approaches to solve complex real-life problems. While there is some truth to the claim that scientists involved in basic research need to support many doctoral students on various related projects to get one breakthrough, now seems as good a time as ever to think about how to reach another critical segment of the student population: the students who graduate with a Master's degree.

As I mentioned in a November post, "a growing number of students pursue Master degrees because of the heightened competition at the Bachelor level - getting an advanced degree is an easy, and efficient, way to distinguish oneself. [...] Of course it's harder to get politicians and administrators excited about Master-level work - students who enter the program typically don't know how to do research, and they leave at the very moment where their training would pay off for their adviser. Research conducted by Master students rarely ends up in the New York Times - but that's not the point of the degree. Masters of Science and of Engineering train students in the latest developments in a field and prepare them for that buzzword of today's education system, "lifelong learning". You cannot implement a novel technique if you don't know it exists, and even when you're self-taught it's hard to recognize that a model applies to your specific situation if a professor didn't give you pointers beforehand."

Because many PhD holders who go to industry join R&D divisions, it is critical to train Master's students in innovative ways of thinking as well; they're the ones more likely to go back to school for an MBA and migrate toward the top-management ranks of a company. Unfortunately, they're also harder to teach - students admitted from other universities, in particular from abroad, have widely different backgrounds; some are superb and some struggle. The ones who have received the best training are the ones we have trained ourselves. For that reason, the latest reversal of fortune for science in Washington could turn into a blessing in disguise for dual Bachelor's/Master's programs and Master's-level education, which thesis advisers (and hence the NSF) typically do not fund. Such programs would provide a more structured way for students to learn cutting-edge techniques and gain exposure to research, but their effectiveness depends on advisers who view the Master's thesis as an education opportunity for the student, and not just free labor to write thousands of lines of code. And who knows - by the time Congress decides to care about science, maybe those students will come back to get their PhD after all.