Republicans increasingly think colleges are harming the U.S.