Hollywood’s Real Impact on Culture & America’s Worldview

video blog
April 01, 2020

An economic war on entertainment is being waged in America. Has China decided Hollywood is one of the key American industries it wants to influence? Money controls the agenda and what gets produced, and too often the films that get funded are the ones that carry a leftist worldview. Is there a plan to influence America’s culture through entertainment? Kevin and Sam Sorbo discuss these and other timely topics.