Business schools today are sorely lacking in hands-on experience (the requirement of at least one semester-long internship notwithstanding). While I believe that having highly educated teachers (those holding doctorates) is important, I think schools of higher learning should place more emphasis on hiring professorial staff with real work experience, i.e., people who have worked in the profession they teach. This would, in my estimation, help students connect the theory taught in school with the practical application of their future trade. Over the last week I interviewed five candidates for an internship in my department, and only two knew the difference between advertising and marketing. What are they teaching these students in their marketing classes?