Being an angst-filled, acne-prone high school student, I have been brainwashed through many years of anime, games, and other thingamajigs into thinking that perhaps Japan is GREAT! Of course, my fruitless hours spent browsing the "information highway" have shown me both the good and the bad of the land of the rising sun. Yet I'm still curious about life in Japan.
It's probably something that truly has to be experienced. Having lived in the US for 5 years (I'm from Mongolia), I have gone through culture shock and everything else involved in moving to a new place.
Well, enough blabbering, I'll just ask the question:
Is being in Japan so great that even the worst of it somehow makes it a unique and worthwhile experience? Kind of like me coming to the US, becoming a total social outcast, learning to live with it, and eventually fighting through it, and becoming a better person for it.
I swear, America and Americans can mess you up so bad you can't fart anymore, yet I still love this country and its people (and the women). You know what I mean.