The Truth about Germany: Football

Deutsche Welle
March 16, 2008, 4:58 PM
Germany is one of the greatest football nations on earth. It won the World Cup in 1954, 1974 and 1990, and even the German women's team has won the World Cup. But is the passion for football shared by all Germans?