From The Art and Popular Culture Encyclopedia
Christian naturists are Christians, found in most branches and denominations of Christianity, who practice naturism or nudism. They see no conflict between the teachings of the Bible and living their lives and worshiping God without clothing, believing that covering the body leads to its sexualization. The common notion that nudity and sexuality go hand in hand is therefore regarded as a worldly point of view. In their view, the Christian understanding of the human body should remain separate and distinct from such materialistic attitudes.