You know what's bullshit?
Nature. You can't sit through an advert break these days without seeing about five different products advertised as containing 'natural ingredients', and we're supposed to unquestioningly accept that this makes them superior and worth paying extra money for in the middle of a global financial crisis.
The thing is, someone at some point arbitrarily decided that nature equals good, and the majority of the world has just accepted it without asking any questions. Hasn't anyone seen a nature documentary? Nature is a horrible thing to live your life by. It's full of disease, and pain, and animals getting ripped to shreds by other animals at every opportunity. Come to think of it, almost everything that people have built against nature is good. We've made ways to treat disease, care for the injured, and our general health and happiness are better today than at any previous point in time.
In fact, holding up nature as some sort of ideal is essentially giving a big "FUCK YOU!" to all the hard, mostly uncredited work that's been put in over the course of human history to improve the lives of everyone. From now on, anyone involved in advertising stuff as 'natural' should be forced to go into a forest somewhere next time they get sick or injured, and wait for Mr Squirrel and Mr Deer to tend to their wounds. Bastards.