The article does seem to have a negative slant, but I guess you can't be PC all the time. Anyway, women do need to be more careful with their bodies. In addition, they need to encourage their sons to do right by their children. I think if more 'baby daddies' were pushed by their mothers to be involved in their kids' lives, they might actually become attached to them. Too often the mother is even more vocal than her son when it comes to helping him avoid responsibility for the life he has created. That makes even less sense when she herself was abandoned by her boyfriend when her own children were born. Granted, if your son is not living in your house there isn't much you can do, but if he is, and a blood test proves he is the father, then taking care of his kid should be an automatic rule of the house. And of course, even outside the house mom can still have an influence. Most adults still care on some level about what their parents think, even if they don't agree with them.