A Definition Of Radiology
Source: Adapted from the National Institutes of Health

What does the term “radiology” mean? Radiology is the branch of medicine that uses imaging, such as x-rays, to diagnose and treat disease.
