The Corporatization of America
The United States prides itself on being a capitalist economy, but over the past several decades it has steadily morphed into a corporatist society, one in which individuals hold far less freedom and power than corporations. Nearly every aspect of US society is now shaped and controlled by corporate interests, including health care, higher education, agriculture, and even politics. Is it too late to break free of this hold before democracy is lost altogether? The diverse viewpoints in this resource explore how we got here and what can be done to get America back on track.