I have other reasons for being cynical of the faith of my youth, but I’ve long been told that “the Church was sick” and that it was “in need of a reformation.” I understand the premise, and comparing what is being acted out with what the Bible clearly teaches, it’s hard to deny that many Christians don’t even seem to follow the teachings of Christ, or really any aspect of Christianity.
I am not convinced a healthy church will mean much, but I’d certainly prefer to live in a society that preaches love, compassion, and humility over personal power and wealth.
I do find it strange, and a bit telling, that Christians in modern times have been historically conservative, while the Gospel reads as rather progressive. How did the Church become a bastion of conservative ideology and fear?
I do wish you luck. Humanity certainly needs compassion and love. I’m not sure how much it needs the Church.