Japan has a reputation for being a "Westernized" country, but don't be fooled. There are plenty of cultural missteps to avoid when you visit.