(NineElevenNews) -- What has happened to the United States in the years since 9/11? The United States has become a hated country, known for telling lies, killing people, locking people up without the right to a fair trial, and framing people. The list goes on...