On Friday, May 5, 2017 at 5:35:15 AM UTC-6, burfordTjustice wrote:
America Was Founded As A Christian Nation By Christians That Believed
That The Bible Is The Word Of God
http://www.dcclothesline.com/2017/05...e-word-of-god/
The Bible is, for the most part, a WORK OF FICTION. When are people going to
understand and accept that fact?
====