“Does God still heal people, like He did when Jesus was on earth? If He does, then why do we need doctors and medicine? Shouldn’t a strong faith be enough, and isn’t that what God wants us to have?”