When was the US destroyed? When slavery was made illegal? When women were given the right to vote? During the civil rights movement? Without religion, none of those things would have happened.
I don't really care either way what religion the founding fathers followed. They allowed the religious populace to use their Bible to enforce slavery and to treat women and minorities as less than human.