I don't know about y'all, but I have about had it with reality TV. It starts with the fact that there really isn't much reality happening on reality TV. It ends with the fact that nothing truly productive or worthwhile seems to happen on reality TV. In fact, the opposite seems to happen, including the destruction of families.
Maybe you think I am overreacting to Jon and Kate's big announcement, and yes, it was billed, advertised, and marketed as their "big announcement" of divorce. WELL DUHH!! Did we not see that coming? Did we not watch as two people who obviously cared for each other started to like each other less and less? Honestly, anyone who watches that show just watched a family fall apart. I don't know about you, but I don't think that is entertainment. I think that is sad.
Meanwhile, my sons and I were the only members of the household who refused to watch Jon and Kate's big announcement last night. With all my heart I wanted them to say the show was ending and that they were going to work on their family. I did not want to hear that they were separating and a family was being pulled apart.
It just makes me wonder what kind of world we live in that it is entertainment to watch a family fall apart. What kind of reality is it anyway for a family of 10, whose only real income is their show, to take vacations to Hawaii, Park City, and Disney World, buy a million-dollar home, and so forth? That is not reality, and ultimately they paid a far greater price for all those handouts.
I am guilty of watching and enjoying the show in the first couple of seasons. As time went on, I found it just irritating and not really how a mom and dad taking care of 8 kids actually live. There are many more shows on TV right now that are one form of reality or another, and honestly I am done with them all. I think writers should go back to writing good shows, and reality should stay where it belongs: in real life, not on TV.