The history of infrared thermography can be traced back to the discovery of infrared radiation in 1800 and to the scientific research that followed through the 19th and early 20th centuries. Here are some key moments and important contributions in its development:
Discovery of infrared radiation: Infrared radiation is electromagnetic radiation with a wavelength longer than that of visible light, making it invisible to the human eye. It was first discovered by the British astronomer William Herschel in 1800. Herschel used a prism to separate sunlight into its component colors and placed thermometers in different regions of the spectrum to measure temperature changes. He found a region just beyond the red end of the visible spectrum where the temperature was higher still, which marked the discovery of infrared radiation.
Need for infrared imaging: Over time, scientists and engineers came to recognize the enormous potential of infrared radiation in fields such as night vision, thermal imaging, and medical diagnostics. This spurred continuous exploration and development of technologies capable of forming images from infrared radiation.
Development of early infrared thermography: In the early 20th century, scientists began studying methods for sensing and detecting infrared radiation. Early attempts included building infrared detectors from thermocouples, platinum-black-coated elements, and similar thermal sensors. Although these devices lacked the accuracy and sensitivity of modern infrared cameras, they laid the foundation for infrared imaging technology.
Advancements during World War II: Infrared technology became particularly important during World War II for military applications such as night vision devices and missile guidance systems, which drove rapid development and improvement of infrared technology.
Development of modern infrared thermography: In the second half of the 20th century, advances in semiconductor and infrared detector technology made modern infrared thermography widely available. These devices capture the infrared radiation emitted by objects and convert it into visible thermal images, and they are used in fields such as night vision, medicine, building inspection, and industry.
In short, infrared thermography grew out of the initial scientific exploration of infrared radiation and, driven by demand in both military and civilian sectors, matured through years of technological progress into the important imaging and measurement tool it is today.