
May 2015

MA State of Technology Report 2015

Back in March I attended an event held for the release of the Massachusetts State of Technology Report 2015. The best part of the morning, I thought, was the panel discussion with the mayors of Somerville, Holyoke, Worcester and Newton, all of whom came across as approachable, dedicated and driven to make their cities as good as they could be.

I particularly enjoyed hearing about the Holyoke Innovation District. The economic development strategy for Western MA - dubbed the Pioneer Valley - was recently made available here. The strategy focuses on the "reconstruction, reactivation and service expansions" of the north-south corridor, aka the New England Knowledge Corridor (NEKC), and of the east-west corridor; the completion of a broadband network; the growth of the clean energy industry; and "leveraging the impressive and wide range of economic attributes and assets concentrated in the interstate NEKC". In plain English, that means "a population of nearly 3 million, a workforce of more than 1.25 million, 215,000 college students enrolled in 41 colleges and universities, and in excess of 64,000 businesses", all of which makes the NEKC the 20th largest market in the country.

The State of Tech report itself makes the following points, among others:

  • "Innovation Districts in MA include the Boston Innovation District, the Cambridge Kendall Square District, the Route 3 Tech Corridor, the N^2 corridor (Newton/Needham), the Worcester Incubator, TechSpring (Springfield) and the Holyoke Innovation District."
  • Noteworthy media initiatives include the Innovation Hub radio show, launched and syndicated by WGBH, Radio Boston, launched by WBUR, and the return of the Boston Globe to local ownership.
  • Transportation-wise, "the MBTA extended late night service" (in 2014 and 2015; service now ends at 2am early Saturday and Sunday mornings), "JetBlue established a new Worcester hub", and "Logan direct air service [was] extended to 75 domestic and 44 international destinations".
  • Companies such as Amazon.com, Google, Microsoft, J&J and IBM opened or expanded their research centers.
  • There are now over 20 university incubators and accelerators as well as over 20 industry incubators and accelerators in MA (MassChallenge, Bolt and more), and technology investments in MA companies over 2011-2014 have been valued at $4.86 billion.

The report also considers "three emerging technology areas that are poised to profoundly impact people worldwide and that Massachusetts has the potential to dominate: the Internet of Things, Security and Healthcare & Life Science Information Technologies". ("It's been said that Massachusetts predominantly works on technology that is crucial to creating and advancing the state of the art as opposed to entertaining people." Take that, California.)

  1. The Internet of Things (IoT) refers to the upcoming connection of more than 50 billion devices to the Internet, unleashing up to $6.2 trillion in new economic value. A well-known example is thermostats as components of smart homes. "For most businesses, IoT is not just about connecting new products but also about how these connections will completely transform the way companies engage with their customers, and the types of opportunities that open up through this deeper relationship."
  2. IT security and cyber-security.
  3. Healthcare/Life science: "The ability to collect, store, and analyze massive amounts of data - coupled with the emergence of the cloud, sophisticated robotics and advanced manufacturing - is catalyzing major breakthroughs in life sciences research and development, and revolutionizing patient care every step of the way. The result is a new patient-centric paradigm in which technology is powering the new future of human health." 

Innovation at MIT

The Spring issue of MIT Spectrum is devoted to innovation. It follows the release in December of the report "The MIT Innovation Initiative: Sustaining and Extending a Legacy of Innovation" and the creation of the MIT Innovation Initiative. The December report highlights MIT's legacy of innovation, the engagement of the MIT Community and MIT's primary areas of focus to accelerate innovation: (i) strengthening and expanding MIT's innovation capabilities, (ii) cultivating innovation communities, (iii) developing transformative infrastructure and (iv) promoting the science of innovation through the new Laboratory for Innovation Science and Policy. Innovation and entrepreneurship resources available at MIT are listed here.

Innovation is a curious topic... so much more popular than research. Who can be against innovation, really? Addressing a problem by creating something new and successfully bringing it to market sounds impossible to oppose, just as (almost) everyone hopes to be a leader rather than a follower and (almost) everyone wants to use their unique skills to the fullest. There is a bit more pushback against "blind" entrepreneurship, i.e., self-employed people whose companies barely survive from year to year and whose talents could have had greater impact in a larger company (although at places like MIT or the Ivy League it is obviously tempting to conflate any startup with success stories like Facebook). Innovation, on the other hand, is a concept that doesn't seem to have a downside at the moment. And maybe there is no downside, but even concepts like leadership have been found to have less savory variants than others, and most top artists went through formative periods based on rigor, discipline and apprenticeship (qualities the media is less enamored of) before they ventured out on their own.

If I had one criticism of the MIT Spectrum articles, which showcase current innovation endeavors at MIT, it would be the focus on single individuals - nowadays few meaningful projects can be completed alone, and there would be value in reporting on how those teams worked together to achieve a common goal. (The MIT report released in December 2014 itself debunks the myth of the lone scientist.) But the articles themselves are so short that it would be hard for any reporter to do the topic justice.

The MIT Innovation Initiative website is full of helpful links and a highly recommended starting point for anyone interested in those issues.


More on the Economist's Special Report on Universities

Here are more highlights from The Economist's special report on universities, especially the "Flagging model" article (about the US system).

  • "Students... are buying degrees, whose main purpose is to signal to employers that an individual went to a - preferably highly selective - university... People are prepared to pay through the nose to buy advantage for their children, so top institutions charge ever higher prices and acquire ever more resources, while those at the bottom get less."  
  • "Employers are not much interested in the education universities provide either... Their principal filter was the applicant's university. Unless he had attended one of the top institutions, he was not even considered." A researcher from Northwestern University is quoted as saying: "Evaluators relied so intensely on school as a criterion of evaluation not because they believed that the content of elite curricula better prepared students for life in their firms... but because of the perceived rigor of the admissions process." The article also states: "After the status of the institution, recruiters looked not at students' grades but at their extracurricular activities."
  • "If employers are not interested in grades, students might as well take it easy... And since academics [professors] are promoted largely on the basis of their research, they might as well give up teaching. That is, indeed, what they seem to be doing." Other researchers conclude that "no actors in the system are primarily interested in undergraduate s student academic growth."

The special report also has an insert about MOOCs, subtitled "online learning could disrupt higher education, but many universities are resisting it." But the key question is whether the goal of a university should be to do research or to teach. It is nice to say that ideally it should do both, but in practice colleges and universities view their mission as one or the other: research universities define themselves through their connection to research, while liberal arts colleges define themselves through teaching, often through inquiry and the creation of new knowledge in student research projects. The thing is, just as we have witnessed grade inflation in high school and college, we now witness journal inflation and paper inflation, where someone creates a new outlet for publication so that more papers can get published, and lower-quality journals figure out how to play the impact-factor game. (The funny thing is, since the Economist says students don't care about grades anymore - which is debatable to begin with - maybe grade inflation is partly a consequence of some research-focused professors not wanting to deal with aggravated students and giving them all good grades to buy peace?) Altogether, there is a real case to be made that bottom-tier research universities - the ones with forgettable research and mediocre teaching - are not nearly as good as many liberal arts colleges dedicated to good teaching (even if their professors publish very little research).

This touches upon an issue highlighted in a recent article in the New York Times: "What's the Point of a Professor?" Forty-three percent of grades are now A's, and 61 percent of students say that their professors treated them like a colleague or peer. But, the article states, "while they’re content with teachers, students aren’t much interested in them as thinkers and mentors. They enroll in courses and complete assignments, but further engagement is minimal. One measure of interest in what professors believe, what wisdom they possess apart from the content of the course, is interaction outside of class... Students email teachers all the time — why walk across campus when you can fire a note from your room? — but those queries are too curt for genuine mentoring. We need face time." (This is something MOOC or SPOC [small private online course] proponents would do well to remember.)

The NYT article explains: "For a majority of undergraduates, beyond the two and a half hours per week in class, contact ranges from negligible to nonexistent." The article was written by an English professor, and this becomes obvious toward the end (or at least it becomes obvious that it was written by a humanities professor), when the author writes about students who have no urge to become disciples because college is now about a career rather than about ideas.

In engineering, undergraduate students have long joined industry - a very different career path from the one their professors chose for themselves. We can't really talk about undergraduate engineering students becoming disciples of professors; hopefully, over their careers, they will integrate a wide range of courses taught by different professors to become highly successful practitioners. But this also gives engineering professors a unique opportunity to serve as sounding boards and, yes, perhaps mentors, because they are not personally invested in the choice most undergraduate students will make in their careers.

By that I mean that doctoral students often struggle with the choice between industry and academia, and part of their dilemma lies in the fact that many at top institutions (in my anecdotal experience) don't want to tell their adviser they plan to go to industry for fear of retaliation. In fact I have friends who graduated with a PhD in science and later (after a postdoc) switched to industry, to the never-ending disapproval of their PhD adviser, although I hope such advisers are not the norm. Part of the reason for this behavior, I think, is that some people naturally tend to view the path they selected for themselves as the best path for everybody else. When you advise undergraduate students, you naturally interact with fewer students who might follow in your footsteps and get a PhD, so it is easier to view them as their own persons with their own goals and interests instead of mini-yous, and to give (I hope) better advice. All this to say, I think engineering professors can serve as good mentors (or reasonably good mentors) precisely because they are not quite on the industry-based career path their students want to pursue, but know enough about it to give good advice.

Finally, the following passage in the NYT article most resonated with me: "You can’t become a moral authority if you rarely challenge students in class and engage them beyond it." The objective of becoming a moral authority in itself worries me a little (maybe becoming an effective teacher would have been enough...), but challenging and engaging students is an important condition for impactful teaching.


HBR on Making Better Decisions

The May issue of Harvard Business Review has a spotlight on making better decisions, which primarily focuses on behavioral economics. Leaders as Decision Architects discusses how to mitigate the effects of cognitive biases and low motivation on decision making. People have two main modes of processing information: System 1, which is "automatic, instinctive and emotional," and System 2, which is "slow, logical and deliberate." The authors argue that "engaging System 2 requires exerting cognitive effort, which is a scarce resource... as the cognitive energy needed to exercise System 2 is depleted, problems of bias and inadequate motivation may arise." The authors seem to favor System 2, but in a business world heavy on interpersonal skills, there is also value in trusting your instincts if you think a potential partner is not trustworthy. That is not a cognitive bias. For instance, many people have a tell-tale sign right before or while they lie. You can spend your "cognitive energy" trying to articulate what the sign was - and perhaps if you are new to this, it is not a bad thing to articulate what gave you pause - or you can go with your instincts and try to formalize it all some other time.

(But the next article, Outsmart Your Own Biases, actually advocates precisely against that sort of behavior, because "unless you occasionally go against your gut, you haven't put your intuition to the test." My take is, if you don't follow your instincts and your instincts turn out to be right, you'll associate a lot more intense pain with the experience - in addition to having to deal with a problem employee you didn't want to hire in the first place - than if you had just gone with another qualified candidate from the beginning. Again, I don't think intuition would be a cognitive bias.

A cognitive bias would be something like this: School A distorted something the star varsity athlete of School B had posted on social media to get him suspended from a key game in the rivalry between Schools A and B and unfairly ruin his reputation, so now someone who graduated from School B cringes whenever he sees the name of School A on a resume and has sworn never to hire anyone who has ever been affiliated with School A. Yes, it would be good for that person to pause and analyze his aversion to School A logically, because he may be depriving his team of star performers.

Another cognitive bias would be this: you have a rags-to-riches personal story, or at least an underdog-who-beat-the-odds story, so you tend to be more inclined to give a chance to other people who claim to be underdogs too, but you are blind to the fact that some people are simply presenting themselves in the way they think will make you more likely to hire them. More on how to outsmart your own biases below.)

The authors of Leaders as Decision Architects identify two root causes of poor decision making: insufficient motivation (you know the right thing you're supposed to be doing but don't do it) and cognitive biases (you decide on the wrong course of action). Again, they seem to think that System 2 is the solution to everything ("Because problems of motivation and cognition often occur when System 2 thinking fails to kick in..."). Well, if the issue is motivation, maybe it'd be helpful to first have the employee connect with a positive emotional response - even if it is just discussing their favorite sports team - and then, once they are in a productive state, push them along the path to completing the task. The funny thing is, a key example they give is that of a company a co-author got involved with, which was plagued with severe attrition within months of onboarding: "The training failed to build an emotional bond between new hires and the organization and caused them to view the relationship as transactional rather than personal."

A quick mention is made of the 2008 book Nudge: Improving Decisions about Health, Wealth and Happiness by Thaler and Sunstein, and "Step 4: Design the solution" in the HBR article was particularly informative. But what I found most helpful was the sidebar on common biases that affect business decisions (click here for the chart in one picture, with helpful explanations for each bias, courtesy of HBR's Twitter feed):

  • Action-oriented biases: excessive optimism and overconfidence
  • Biases related to perceiving and judging alternatives: confirmation bias, anchoring and insufficient adjustment, groupthink, egocentrism
  • Biases related to the framing of alternatives: loss aversion, sunk-cost fallacy ("we pay attention to historical costs that are not recoverable when considering future courses of action"), escalation of commitment ("we invest additional resources in an apparently losing proposition because of the effort, money and time already invested"), controllability bias ("we believe that we can control outcomes more than is actually the case").
  • Stability biases: status quo bias and present bias.

The authors of Outsmart Your Own Biases advocate the use of checklists and algorithms to bypass strong emotional attachments and stay focused on the right things; they also recommend "trip wires" at key points in the decision-making process. In thinking about the future, they advise making three estimates for every forecast, thinking twice (making two forecasts and taking the average), taking an outside view (viewing the project you are involved in as an outsider would) and - my favorite activity - using premortems, i.e., imagining a future failure and then explaining its cause. This is also called prospective hindsight and was the topic of an old blog post of mine. The authors also recommend thinking about objectives and thinking about options. In particular, they refer to Chip Heath, Dan Heath and their book Switch: How to Change Things When Change Is Hard (which I love, as stated here) and describe the "vanishing options" test: what would you do if you couldn't choose any of the options you're currently weighing?
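
As an aside, the "think twice" advice has a simple statistical rationale: two forecasts made independently will rarely share exactly the same errors, so their average tends to land closer to the truth. Here is a minimal, hypothetical Python sketch (my own illustration, not from the HBR article; the target value and noise level are made up) that simulates the effect:

    import random

    # Hypothetical illustration of "think twice": averaging two independent,
    # unbiased but noisy forecasts cancels part of the noise, so the averaged
    # forecast lands closer to the truth on average than a single forecast.

    random.seed(0)
    TRUE_VALUE = 100.0   # the quantity being forecast (made-up number)
    NOISE = 20.0         # standard deviation of each individual forecast
    TRIALS = 100_000

    single_error = 0.0
    averaged_error = 0.0
    for _ in range(TRIALS):
        first = random.gauss(TRUE_VALUE, NOISE)   # forecast made today
        second = random.gauss(TRUE_VALUE, NOISE)  # independent forecast made later
        single_error += abs(first - TRUE_VALUE)
        averaged_error += abs((first + second) / 2 - TRUE_VALUE)

    print(f"mean error of a single forecast:    {single_error / TRIALS:.1f}")
    print(f"mean error of the averaged forecast: {averaged_error / TRIALS:.1f}")

In practice the two forecasts are never fully independent, of course; the less correlated their errors, the more the averaging helps.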

Finally, Fooled by Experience argues that "we view the past through filters that distort reality." Such filters include the business environment, our circle of advisors (who may censor information) and our own "focus on evidence that confirms our beliefs." Although the advice is a bit trite ("we can base our decisions on a clearer view of the world if we... surround ourselves with people who will speak frankly [and] search for evidence that our hunches are wrong"), it is a good reminder that our view of the past is imperfect at best.

Overall, the issue was a good read. I'm still not sold on behavioral economics but I'm having fun thinking about cognitive biases.  


The Economist on Universities

Back in late March, The Economist published a special report on universities (a broad theme, we'll all agree). Here are some highlights.

First, excerpts from the Leader article about the report, "The world is going to university". "Just as America's system is spreading, there are growing concerns about whether it is really worth the vast sums spent on it... There are, broadly, two ways of satisfying this huge demand [for a university degree]. One is the continental European approach of state funding and provision, in which most institutions have equal resources and status. The second is the more market-based American model, of mixed private-public funding and provision, with brilliant, well-funded institutions at the top and poorer ones at the bottom. The world is moving in the American direction... If America were getting its money's worth from higher education, that would be fine. On the research side, it probably is... but on the educational side, the picture is less clear."

Later in the Leader, we read: "A recent study of recruitment by professional-services firms found that they took graduates from the most prestigious universities not because of what the candidates might have learned but because of those institutions' tough selection procedures." In other words, professional-services firms farm out their selection process to universities' admissions committees - and to the decisions those committees made about 17-year-olds - because it is easier to pick someone who already has a "stamp of approval" from someone else, no matter how little connection the initial (college) selection has with the ultimate goal (job recruiting).

I would love to see a study in which 21-year-old college seniors were allowed to pretend they were about to graduate from an Ivy League school (in the major they actually have), were given fake transcripts and fake references, and were sent on the job market. (Somehow they'd have to go through campus recruiting, be given email addresses at the Ivy League school they're supposed to be from, and so on.) Not only do I think they'd do very well if they were chosen appropriately - personable students with strong grades from a "New Ivy", for instance - but I think they'd do as well in the workforce as real Ivy League grads, or perhaps even better if they're more used to handling setbacks and failures. My opinion is based on well-known psychology experiments in which teachers were told their students were highly talented ("academic bloomers") or were told their students were behind and challenging. There was actually no basis for those labels, but the teachers didn't know that and began to treat the supposedly talented students differently, although they were no more talented than the others. (The story is more complicated than saying every student could be talented, though.) The issue, of course, is that admitting that the labels we put on others affect their success, for better or worse, would bring a lot more candidates into the job selection process, and HR departments are not known for wanting to take risks.

Here's another excerpt from the Leader about America's universities: "The government rewards universities for research, so that is what professors concentrate on. Students are looking for a degree from an institution that will impress employers; employers are interested primarily in the selectivity of the institution a candidate has attended." I disagree with that last part. I think that employers want students who come with letter grades stamped on their foreheads, assigned by someone else, so that they don't have to figure it out themselves. We rank colleges, hospitals, places to live - we rank everything. Of course it is tempting for HR departments to hope applicants would come neatly ranked. Then they could go down the list and evaluate for fit with the actual position. The ranking would be a mix of environment, effort and emotional intelligence (I should call this the Three Es and start going on the talk show circuit... jk): the school, the GPA and the extracurricular activities.

Later, the article also touches upon common tests as a way to make the higher-education market work better. In a way, such tests would help ensure that the dreamed-of "student's grade" (which would make HR departments' jobs so much easier) really reflects what the student knows and is not distorted by the university's name. But didn't the SAT serve a similar purpose for college admissions? I have yet to find a single person nowadays who thinks highly of the SAT. In fact I'd be highly suspicious of anyone who thinks highly of multiple-choice tests. But they make the job of admission committees easier. In France, where there is an exit exam at the end of high school and entrance exams for engineering and (college-level) business schools, committees of professors must decide on that year's exams, places to administer the exams in a secure fashion must be found all over France every year in May, and a small army of professors and lecturers must grade the exams before the results can be posted. Those are in fact only the results of the first phase, the "admissibility" phase: if you are admissible, you go and take a series of oral exams, which require more instructors still. Is the system better or worse than in the U.S.? For STEM disciplines, which are the ones I know best, I think the French system is better. It also tells you a lot about students' ability to handle pressure and fatigue. (Each series of written exams lasts five days, with one exam in the morning and one in the afternoon; exams are typically three or four hours long. There are four main series of exams, i.e., groups of schools that use those exams to rank applicants: the Ecole Centrale group, the Mines group, the Ecole Polytechnique group and the ENSI group. You get a rank for each group of exams you take.)

It would be a complicated system to administer in the U.S., especially since the exam setup in France is supported by a structure of preparatory classes students take for two years after high school - they are not expected to graduate from high school and then blissfully sit through the grandes ecoles entrance exams. This structure doesn't exist in the U.S., but it could be created. (It'd be better if it didn't give rise to a whole new cottage industry of prep courses, prep books and pseudo-counselors feeding off parents' anxiety.) Another challenge in the U.S. is that students may pursue widely different majors once they have entered a given university, and they won't choose until the end of their freshman or sophomore year - in France the grandes ecoles have a much narrower focus. Finally, this narrower focus reflects a difference in mentality: the U.S. emphasizes well-rounded applicants with a jaw-dropping number of extracurricular or leadership activities, while in France, if you ace the exam (and therefore get a good ranking), you enter the best school that still has openings by the time the officials reach your rank number.

Yet, in the same way that it should be possible to design entrance exams for each college or super-domain (engineering, science, business, humanities) and level the playing field, one would imagine that top companies could design exams for their applicants if they wanted. In fact, consulting companies often do, although they call them "case studies": they try to put would-be employees in the position they'd be in if they got the job, and see how they react. Granted, the students who reach the "case study" phase of the recruiting process have already gotten past the gatekeepers - meaning they have probably already impressed recruiters with the name of the university they are graduating from. Consulting companies do deserve credit, however, for trying to improve their recruiting process by encouraging consulting clubs at the universities they recruit from, which allows them to interact early with potential recruits by sponsoring case study competitions and other activities. Obviously not every company can spearhead that sort of initiative, and students should also feel free to be students in college rather than future employees constantly judged by perhaps-bosses. But it all goes to show that there are many more potential solutions to the problems of matching students with universities and graduates with jobs than people imagine.

Well, this post is getting enormous and I haven't even reached the special report itself yet, so that'll be for some other time!