List of Dental Schools in the United States
A dental school (also called a school of dental medicine, school of dentistry, or dental college) is a tertiary educational institution, or part of one, that teaches dental medicine to prospective dentists and, in some cases, other dental auxiliaries. Dental school graduates receive a degree in Dentistry, Dental Surgery, or Dental Medicine,…
