From the earliest days of radiation research, scientists have sought ways to measure and observe the radiation emitted by materials. To do this, they have developed a variety of radiation detection instruments, each with its own advantages and disadvantages. The most common type of detector in use today is the Geiger-Mueller (GM) detector, often fitted with a pancake-type probe. This portable device reports the amount of radiation present in counts per minute, counts per second, or in rate units such as microroentgen (µR) or microrem (µrem) per hour.
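As a rough sketch of how these readout units relate, the snippet below converts a count rate between per-minute and per-second form and estimates an exposure rate. The calibration factor is hypothetical; a real GM instrument's conversion factor is energy-dependent and comes from its calibration certificate.

```python
# Sketch: relating common GM-detector readout units.
# CPM_PER_UR_PER_HR is a hypothetical calibration factor, not a value
# for any particular instrument.

CPM_PER_UR_PER_HR = 3.3  # assumed counts per minute per µR/h

def cpm_to_cps(cpm: float) -> float:
    """Convert counts per minute to counts per second."""
    return cpm / 60.0

def cpm_to_uR_per_hr(cpm: float) -> float:
    """Estimate exposure rate in µR/h from a count rate in cpm."""
    return cpm / CPM_PER_UR_PER_HR

reading_cpm = 120.0
print(cpm_to_cps(reading_cpm))        # 2.0 counts per second
print(cpm_to_uR_per_hr(reading_cpm))  # exposure-rate estimate in µR/h
```

Note that the count-rate conversion is exact arithmetic, while the exposure-rate estimate is only as good as the calibration factor for the radiation energies being measured.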
Another type of detector used in radiation detection instruments is the scintillation detector. This device uses a material such as sodium iodide to detect gamma and x-ray radiation, and it is often used in gamma-ray spectroscopy to identify the specific radioactive material (radionuclide) emitting the radiation. Finally, electroscopes were among the earliest detectors used to measure radiation.
An electroscope uses a pair of gold leaves that repel each other when electrically charged; ionizing radiation ionizes the surrounding air and gradually discharges the leaves, and the rate at which they collapse indicates the intensity of the radiation. This provided a means of measuring radiation with better sensitivity than was reliably possible using photographic plates. In addition to these detectors, there are also specialized instruments known as teletectors that are specifically designed to detect gamma and x-ray radiation. These devices are often used in health physics and radiation protection to measure the absorbed dose, i.e., the energy deposited per unit mass of material by ionizing radiation. No matter which type of detector is used, it is important to understand how it works and how it can be used to accurately measure and detect radiation.
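The absorbed-dose definition above can be made concrete with a short calculation. Absorbed dose is energy deposited divided by the mass of the absorbing material; the SI unit is the gray (1 Gy = 1 J/kg). The function and example values here are illustrative, not taken from any particular measurement.

```python
# Sketch: absorbed dose as energy deposited per unit mass.
# 1 gray (Gy) = 1 joule of deposited energy per kilogram of material.

def absorbed_dose_gray(energy_joules: float, mass_kg: float) -> float:
    """Absorbed dose D = E / m, returned in gray (J/kg)."""
    return energy_joules / mass_kg

# Illustrative example: 0.002 J deposited in 1 kg of material
# corresponds to an absorbed dose of 0.002 Gy (2 mGy).
print(absorbed_dose_gray(0.002, 1.0))  # 0.002
```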
By understanding the different types of detectors available, scientists can choose the best instrument for their needs.