I need to stop reading books with a slave theme, because I never like them. Slavery is slavery is slavery. I don't care how nice the master/owner is; he chose to buy slaves and has no problem treating them as such.
I also really did not like the way the character was gang-banged, only for him to magically trust his master and start to love him, and then once they have sex, all is well, the end.
There's also no explanation of the culture. Why are there slaves at all? Slaves and free people do the same work, which seems strange to me. When did slavery become common? Why?