
Do Hiring Managers Care Where You Went to College?

This article is from our friends at LearnVest, a leading site for personal finance.


Does it really matter whether the diploma you receive carries the name recognition of the Ivy League?

That depends on whom you ask. To a 17-year-old coming out of an SAT haze, the perceived prestige of her future alma mater is the be-all and end-all. It's probably safe to say the same is true for her parents.

But if you ask business leaders, the response you’re likely to hear is: It’s really not that big of a deal.

Two surveys conducted by Gallup on behalf of the Lumina Foundation, an organization committed to increasing the percentage of Americans with post-high school degrees, found that hiring managers weren't particularly concerned with where their new hires earned their college degrees.

Rather, what matters, according to the more than 600 business leaders surveyed, is the candidate's knowledge in the field and her applicable skills. Even a candidate's college major outweighed her school's pedigree: 28% of those surveyed rated her major "very important" in the hiring decision, while only 9% gave the same weight to the institution on her diploma.

Meanwhile, Gallup's second survey, of American adults overall, revealed that the general public places far more value on the institution. When asked how important they think a candidate's college is to a hiring manager, 30% said it is "very important," and a surprising 47% felt the same about college majors.

So what does this mean for eager job hunters? Gallup puts it nicely: “Getting a job and achieving long-term success in one’s career may increasingly depend on demonstrating real value to employers through experience and targeted learning—and increasingly less on degrees, even if they are from prestigious universities.”
