Americans have finally woken up to the scam of college.
The first line of the article reads:
Americans have grown sour on one of the longtime key ingredients of the American dream.
It has never been a key ingredient of the American dream. Hard work, freedom, and opportunity are the key ingredients, not going into lifelong debt, sold on a lie, for no real marketable skills.
College was always necessary for certain professions, such as medicine, law, and engineering, and I'm definitely not saying continuing education is a bad thing. But pushing it as necessary for all jobs is modern propaganda that certain people got rich promoting. It got worse once HR departments became a fixture in corporations and furthered the "all jobs need degrees" lie. And this was despite years of studies and surveys showing graduates arriving in the workplace unprepared, without skills, and still needing extensive training. Oh, and let's not forget all the high-paying jobs they weren't finding.
There are 439K open trade jobs in the U.S., and I believe that's just around data centers. Tradespeople go to school for months, get certified, and are on the job making six figures within a year or two, debt-free. When they go to lunch, they're being served by college graduates, buried in debt, who can't find a job in whatever specialty their degree supposedly prepared them for.