Yesterday I was listening to NPR while a few senior analysts explained the Fannie/Freddie takeover. I was taken aback for a minute as I listened to what the government rescue of the mortgage giants meant for taxpayers. I began to wonder what it meant for what we so fondly call "The American Dream."
The American Dream, loosely defined, is the right to an opportunity for a better life, the right to a good job, the right to own a sufficient home, and the right to live and participate in our consumer-based society. However, when I hear about the "Mortgage Crisis," or how car dealerships are having trouble securing financing for their 2009 inventory, I wonder: what has the American Dream done to us?
In America, the land of the free, you will find a plethora of people, some more fortunate and some less fortunate. I think the real problem for America starts when we, as Americans, realize we have not fulfilled the American Dream. So we, the less fortunate, become unrealistically optimistic. With the idea of an American Dream in our heads, we feel invincible, but is this ultimately the demise of America?
What I mean to say is: if I feel that it's my "American" right to own a home, and Fannie Mae feels that it has the right to grant my "Dream," who can stop us? If I feel that I have the "American" right to own a car, and Ford feels it has the right to fulfill my "Dream," who can stop that?
Who can stop the American Dream when it's all some Americans have? If Americans tend to feel they are owed something and American businesses take advantage of it, we end up in a cycle of late payments on both sides.
Let me make it clear: I'm not against the Dream. I support it and I admire its capabilities, and our country's capabilities. But if everyone with less feels they deserve more, and everyone with more feels they deserve more, what will become of America? Will the idea that this land comes with a Dream be this land's demise?