I have been wondering about this for a long time. I know men and women are different and mostly wear different types of clothing, but what is up with people showing off their feet so much? Not only do they wear sandals and flip-flops at the beach, they're now wearing them at school, at work, at church services, weddings, proms, on the red carpet, just about anywhere. A lot of them say they hate feet, but they sure love showing theirs off. I really don't care what people wear, but isn't showing off your feet all the time quite primitive?