Judging by how many now treat ugliness as a disease, it appears a good part of the profession has shifted from a health profession to a commodity. I'm hearing the frustrations of young dentists who feel they were somehow duped into believing they were training to become health professionals, only to find themselves needing to be marketers and salespeople instead. Veterans are feeling demoralized, sensing that the image of the profession no longer fits. Long-standing institutions of higher learning are struggling for enrollment, and some gurus think it's not just because of the economy. So what's going on? Is the change a good thing or a bad thing? Some say health-centered practice is going the way of the dodo bird. What do you think?