The 1930s

The 1930s was a decade of depression and despair for the American people. It also ushered in the New Deal, whose social reforms changed the national landscape forever. Crime surged as well, but J. Edgar Hoover’s FBI tamed much of the violence by taking on the era’s out-of-control gangsters. Unfortunately, the decade closed with the outbreak of the Second World War.
