Interest in the connection between faith and health has prompted a relatively large number of studies investigating the link. Some religious groups, including some actively involved in alternative medicine, claim to offer complete health to their adherents. Evaluating the health of believers in such religions would provide important evidence about the truth or falsity of their claims (as has been done for the Church of Christ, Scientist, whose members are known as Christian Scientists). For this reason, it is important to ask whether the Bible teaches that God promises to heal Christians. Does it?