I don't think (at least in my own case) that it was anything created by my parents. It's just that when we were younger the boys were taught to have stereotypically male interests and the girls were taught to have stereotypically female interests. It was nothing my parents told me, but I didn't want to hang out with most boys because they liked to fart and wrestle while my friends and I wanted to pretend we were fairies. I'm not saying this is a good thing; there's no reason other than society's norms for young boys to be different from young girls. However, at least when I was younger, the separation had nothing to do with fear of rape or violence, and my parents didn't teach me to hate boys; I had male friends once I got a little older and became a bit of a tomboy, which my parents completely accepted.
When I was growing up there was a clear divide between boys and girls. Being the only girl in a large family, I enjoyed growing up with boys and playing their games, with their toys, etc. Then we entered grade school and learned that boys and girls were too different to play together and I, being the only girl, was ostracized from the only playmates I had.
I remember girls in my classes calling boys gross, boys calling girls dumb, and the divide was encouraged, if not enforced, by our parents and teachers. We were told that interacting with each other was inappropriate. As a girl, I was told that boys were "out to get me" and I better watch my back because befriending them would bring nothing but trouble. I dunno what the boys were told but, from the ridiculous crap I've heard, it wasn't anything good.
I would look beyond this particular group of girls that you work with. Is this an attitude brought into the group by one or two girls with particularly strong opinions and personalities? Are the parents friends? Do they belong to the same community? Schools? Churches? Book clubs? Is this an attitude the parents are collectively instilling in their children?