I can completely understand why some people choose to believe that somewhere, amid the chaotic and ever-changing complexities of life and the universe, there is something or someone who not only understands what it's all about but cares enough about everybody to make sure that it all turns out well in the end. I can see why that would offer a great wealth of comfort and support, and how it would help people try to live in a way that echoes that kind of care and compassion. This is faith. It may not be everyone's cup of tea, but in the grand scheme of things it's harmless and does a multitude of good for those who need it.
But whenever anyone takes that faith and twists it into some form of control over others; whenever anyone claims to know for certain not only that there is definitely a higher power, but exactly how that higher power does and doesn't want people to behave; whenever someone acts as though their beliefs grant them greater rights or some kind of preference above others — this is where faith ends and religion takes over.
Religion is the destructive and divisive dark side of faith. It drives people to war and violence over petty differences in interpretation. It's the misplaced certainty that allows parents to sit by and pray as their child suffers and dies of a medically treatable condition. It's the prudish and puritanical judgement that makes people think they have a right to poke their noses into the sex lives and health issues of others.
When the overwhelming message of almost every system of belief is "just be nice to one another," how did religion turn it into "a billion and one ways to be hateful to each other"?