I really wish we could go back in time to when a company simply provided a service and did not have an opinion on social matters! Why do companies feel like they need to take a stand on anything not related to their business? Do they really think, or actually know, that doing so attracts new customers and brings in more money? A business exists to make money, so someone somewhere must believe this kind of behavior brings in more of it.