Does God Put Sickness on People to “Teach” Them Something?
Just believing in healing is not enough. You must believe that it is God’s will to heal you. You have to believe that healing is yours, that it belongs to you. Because of the traditions of men, we have told the world that the God we serve has made us sick. What a lie to tell about the Father God, who is the God of love and mercy!
You cannot stand in faith against sickness and disease when you have been taught that sickness is God’s desire for you. How can you have faith for your healing when you think God has put cancer on you to teach you something? God wants you healed. Despite what tradition says, that’s the truth!
As Christians, we are supposed to be lighthouses of deliverance and help in a dark world. God’s desire is that we show His love and power to the hurting world around us. The world is supposed to see good works in our midst, not sickness and disease.
Our commission is to bring forth the Word of life: the Word of God concerning salvation, healing and deliverance to those around us. Jesus said we are to lay hands on the sick and they will recover. The idea that God gets glory from His children being sick makes no sense. But more importantly, it doesn’t agree with the Word of God.