A Definition Of Rabies
Source: Adapted from the National Institutes of Health

What does the term “rabies” mean? “Rabies” refers to a viral disease of mammals, fatal if untreated, that is spread to humans by the bite of an infected animal.
