Socialized medicine
Socialized medicine is a term used primarily in the United States to refer to certain kinds of publicly funded health care. The term is used most frequently, and often pejoratively, in the U.S. political debate over health care. Definitions vary and usage is inconsistent; the term can refer to any system of medical care that is publicly financed, government administered, or both.