Definition of tropical medicine in the English-English dictionary
Science of diseases seen primarily in tropical or subtropical climates. It arose in the 19th century, when European colonial doctors encountered infectious diseases unknown in Europe. The discovery that many tropical diseases (e.g., malaria, yellow fever) were spread by mosquitoes led to the discovery of other vectors' roles (see sleeping sickness, plague, typhus) and to efforts to destroy vector breeding grounds (e.g., by draining swamps). Later, antibiotics came to play an increasingly important role. Research institutes and national and international commissions were organized to control common tropical illnesses, at least in areas where Europeans were present. As colonies became independent, their governments took over most of these efforts, with help from the World Health Organization and the former colonizing countries.
the branch of medicine that deals with the diagnosis and treatment of diseases that are found most often in tropical regions