I found this quote by Carl Jung and immediately loved it. I've been thinking a lot lately about all of the hype, positive and negative, surrounding climate change. Everyone has an opinion, but the opinions seem politically focused rather than focused on what really matters: our planet. Regardless, isn't it our responsibility to do our part and take care of the earth? In the end, even if "climate change" turns out to be a hoax, you've still helped make the planet a better place to live.