An infrared (IR) sensor is an electronic device that detects and measures infrared radiation in its surroundings. Infrared radiation occupies the portion of the electromagnetic spectrum between the red end of visible light and microwaves, and it is invisible to the human eye.
In the popular imagination, the best-known uses of infrared sensing technology are low-light or night-vision devices and infrared remote controls for home entertainment systems. But the history of infrared, and the range of its uses, is much wider.
Infrared radiation (often abbreviated to IR) was discovered in 1800 by the British astronomer William Herschel while he was studying the Sun. He originally termed the phenomenon ‘calorific rays’, after the Latin word for heat, ‘calor’. The term infrared, meaning ‘below red’ (infrared falls below red on the spectrum of visible light), was not coined until later in the 19th century.
Infrared detection was first deployed in the Second World War, in image intensifiers based on work done in the 1930s by the Hungarian physicist Kálmán Tihanyi. Development of the technology continued throughout the 20th century, moving into commercial and industrial settings in the 1950s, and it saw an explosion in popularity once solid-state infrared sensors were developed in the 1970s, owing to their affordability and ruggedness compared with preceding technologies.