A Definition Of Urology
Source: Adapted from the National Institutes of Health

What does the term “urology” mean? “Urology” is the branch of medicine that deals with the urinary system in women and the urogenital system in men.
