After two years in the workforce, I’m learning a lot about the way the world works, particularly in reference to my college degree. It really is funny how much emphasis has been placed on the value of a bachelor’s. Don’t get me wrong, I think they’re important, but to be perfectly honest…I don’t really need it for the job I’m doing.
I honestly didn’t take one class that is even remotely related to my career. I know, I know, college is not necessarily about the degree you receive but the work ethic necessary to reach graduation day. But why is a degree an ABSOLUTE requirement for most professional positions?
Take, for example, family physicians. They have to take all kinds of science and math classes as prerequisites for medical school. I’m sorry, but I don’t really remember the last time my doctor needed to know the formula for the inelastic collision of two objects. I’m pretty sure he just needs to know what my body temperature should be, how fast my heart should be beating, and other random stuff about my body. Why are physics and calculus a requirement for ALL doctors? I don’t care if my doc knows how to do integrals; I want to know if he can cure my gonorrhea (er, cold). The best way to learn to cure someone of their sickness is to be exposed to a million other people with that sickness, not to take earth science.
Think about how different life would be without mechanics. We would all be screwed and have no way to get our cars fixed. I’m taking a shot in the dark, but I’m willing to bet most mechanics didn’t go to college. Instead, they shadow and work with other mechanics day in and day out for years, and eventually (if they know their ‘ish) they get to work on cars without supervision. Why doesn’t that philosophy apply to more positions? A degree is required for my position, but I can tell you right now, I don’t think anything I do is beyond the intellectual capacity of an 18-year-old, although it is above the maturity of most kids that age.
Some of you may be getting a little offended, thinking I’m saying your degree is useless. Sorry, but it’s probably true. The primary reason a degree is valuable is because society says so. Be honest: is your degree absolutely essential to your position? I don’t mean whether it was a requirement to get the job, but would you be a total waste of space at work without it? All I’m saying is I think on-the-job training is really what helps us be great employees, not a bland degree in business administration.
P.S. If you are a physician, thanks for suffering through o-chem and microbiology; my body appreciates it 🙂