Why electing Joe Biden will make the culture war even worse
[thefederalist.com]
From the article: 'If Biden wins the White House this November, his administration will be run by a shadow team of radicals bent on warping America's culture to the left.'
Isn't that what is happening now?
Valid point.