California Mandates COVID Vaccinations for Health Care Workers

California has become the first state in the nation to require all workers in health care settings to be fully vaccinated against COVID-19.

Andria Borba reports.

(8-5-21)