Whether or not Christianity was particularly bad in its mistreatment of women, it never overcame the culture or showed where the culture was wrong...in this or any other regard. The same goes for all religions. Their gods never seemed able to break out of the culture. If slavery was the "thing," the gods and holy men told 'em how to handle slaves. If dominating women was the thing, god and the holy men told 'em how to handle women. If the culture hated homosexuality, then their god hated it too. If they were ignorant about science, their god didn't know science either. If the culture superstitiously thought blood had some special magical power, then god used blood to mark doorways and wash away sins. If the culture thought disease and mental illness were due to sin's curse or demons, then their god acted like that was true too. Why do their gods never tell them anything they don't already know?