The theology of the body is a term used in Christian theology to refer to the teaching of various Christian denominations on the human body in its relation to God and the church.