Some friends and I were talking about whether a college degree is necessary these days. I feel like it is, but I know plenty of people without one who got a job, have been there for 20 years, and are making good money. Times have changed, though, and education is a business, so if you don't have a degree, the people who do always seem to have a leg up when it comes to getting a job. How do you feel about this? Do you have a college degree? If not, do you feel like your life has been hindered by not having one? Or do you feel like you got a better deal by not having to pay back all those student loans?