I just saw a recent Pew poll stating that 64% of Americans identify as Christians, and of that 64%, only 20% read the Bible daily. That’s a surprisingly low number!
If you say you’re a Christian and you truly believe in the power of God’s Word and its foundational role in our faith, shouldn’t it take precedence in your daily life? Without the Word of God, how do you deepen your understanding of and relationship with God? How do you strengthen your faith? How do you find guidance and comfort in life’s challenges?
Folks, all I have to say is… maybe only that 20% are really living as Christians!
