I'm curious to know how important your job is when telling a woman about it, and I mean more the type of job you have. For example, I plan on being a nurse, but that's seen as a more feminine line of work, so I'm slightly concerned women won't regard me as masculine because of it. My rough plan is to study, save up some money, then go to America to study for another year so I can get a better-paying nursing job. I'm well aware this shows ambition, but I'm still curious to hear your insight.
This site is pretty incredible, and I plan on posting here more often once I gain more knowledge.