Highest Rated Comments


counting_noodles · 135 karma

As a lifelong Christian, I walked away from the church this past summer. I couldn't reconcile my personal values with the values my church was encouraging. Granted, I live in the South, but our church, for instance, would have whole sermons on why women are subservient to men, which seemed to be backed up by the scripture they pulled. These controversial issues only seem to have been exacerbated by the Trump era.

Walking away from the church isn't an isolated case - the majority of my left-leaning friends and siblings have done the same, and the majority of young people identify as left-leaning. Do you think the church should adapt to shifting ideologies (and thus, perhaps, interpret the Bible less literally) in order to retain its congregations? Or should it just double down on what its core base wants to hear?

ETA: My experience is limited to what I've heard and seen growing up in a non-denominational church in the South, which may represent some of the most egregious viewpoints the church still holds. I'll check out the denominations others have recommended in the replies.