Have you ever found that things you were taught in school turned out not to be true once you encountered contrary sources later in your education or life? I'm not talking about sappy life-lesson ideas like being the 'freest' or 'most equal' country.
For example: I remember watching videos and being lectured that immigrants came to the US because the police there were fair and non-exploitative.
I'm now reading that in the August 1900 riot in New York, which broke out after a police officer was killed by an African American man, the police themselves joined in terrorizing ordinary black residents, dragging them off trolleys and beating them in the streets. There were similar cases of police joining in mass brutality against Jews.
The modern justice system that we occasionally hate, I find myself appreciating more at the moment.
Were you lied to in school?