Did you know: your iPhone's camera can detect radiation and there's an app for that
Apparently, smartphones can do everything. Today we learned that their high-resolution cameras can also detect and estimate radiation levels! It's a good thing we rarely need to resort to that, but it's still nice to know. How is this possible, anyway? A digital camera is built on the same principle as the human eye: the retina absorbs light, converts it into a neural impulse, and hands it to the brain along the optic nerve. Likewise, a camera's CMOS sensor absorbs light, converts it into an electrical signal, and passes it to a processor that interprets it much like our brains do.
Cameras are more sensitive to radiation than you probably think. (Image source: Ray Detect)
The trick is that light is itself a form of radiation, thankfully not of the kind that turns turtles into ninjas. Since cameras work by soaking up light, it makes sense that they would also respond to ionizing radiation such as gamma rays and X-rays, which is essentially light of far higher energy. Old-timey cameras get their film fogged by it, while digital sensors register spurious signal spikes in individual pixels. Those spikes can be analyzed with software built around what scientists call a "continuous high-delta algorithm," allowing radiation levels to be estimated.

Said software isn't confined to specialized laboratories, either. Quite the contrary: you can actually have it on your iPhone! The Ray Detect app (link) was the first of its kind to implement the algorithm on mobile, taking advantage of the burst-shooting and graphics-processing features Apple introduced with iOS 7. It analyzes each pixel of the camera sensor individually, as well as the information from all the pixels combined, to come up with specific values. These are compared against a database that reproduces the radiation spectrum, letting Ray Detect pick out the ionizing hits and produce readings of radiation dose. Pretty cool, huh?
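To get a feel for the general idea (not Ray Detect's actual algorithm, which isn't public), here is a minimal sketch in Python: with the lens covered, any bright pixels in an otherwise dark frame are treated as candidate radiation hits, counted across a burst of frames, and mapped to a rough dose rate. The threshold, simulated frames, and calibration factor are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: count "hot" pixels in dark frames as candidate radiation hits.
# The threshold, frame size, and calibration factor are made-up assumptions,
# not values from Ray Detect.

HOT_PIXEL_THRESHOLD = 200       # 8-bit brightness above which a pixel counts as a hit
COUNTS_PER_MICROSIEVERT = 50.0  # hypothetical calibration: hits per frame per uSv/h

def count_hits(frame: np.ndarray) -> int:
    """Count pixels brighter than the threshold in a single dark frame."""
    return int(np.count_nonzero(frame >= HOT_PIXEL_THRESHOLD))

def estimate_dose_rate(frames: list) -> float:
    """Average hit count over a burst of frames, mapped to a rough dose rate (uSv/h)."""
    mean_hits = np.mean([count_hits(f) for f in frames])
    return mean_hits / COUNTS_PER_MICROSIEVERT

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate a burst of 30 dark frames (lens covered) with faint sensor noise...
    frames = [rng.normal(10, 3, size=(480, 640)).clip(0, 255) for _ in range(30)]
    # ...and sprinkle a few bright "radiation hits" into some of them.
    for f in frames[::5]:
        ys, xs = rng.integers(0, 480, 4), rng.integers(0, 640, 4)
        f[ys, xs] = 255
    print(f"Estimated dose rate: {estimate_dose_rate(frames):.3f} uSv/h (illustrative)")
```

In a real app the frames would come from the camera with the lens physically covered, so that ordinary light cannot trigger the sensor and only ionizing hits (plus noise) remain.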