Taking a cue from the way eyes and brains register images, Western Sydney University has unveiled a new type of camera that underpins a mobile space situational awareness module ready to contribute to space and defence systems.
Known as an event-based camera, the sensor at the heart of the Astrosite contains pixels that only capture data when they sense a change in light, reducing the amount of data contained in the end image.
And since the Astrosite’s pixels only register changes in light, they don’t become saturated and overexposed the way traditional pixels (or even chemical film photographs) do.
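How little data a changeless scene produces is easy to demonstrate in simulation. The sketch below is illustrative only, not the Astrosite’s actual sensor logic (which hasn’t been published in this detail); it models each pixel as emitting an event, with a position and a polarity, only when its brightness changes by more than an arbitrary threshold.

```python
import numpy as np

def simulate_events(prev_frame, new_frame, threshold=15):
    """Emit (y, x, polarity) events for pixels whose brightness changed
    by more than `threshold` since the previous frame. A static scene
    produces an empty list: no change, no data."""
    diff = new_frame.astype(int) - prev_frame.astype(int)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    # Polarity mirrors real event sensors: +1 brighter, -1 darker.
    return [(y, x, 1 if diff[y, x] > 0 else -1) for y, x in zip(ys, xs)]

# A dark, static patch of sky yields no events at all.
sky = np.zeros((480, 640), dtype=np.uint8)
print(len(simulate_events(sky, sky)))                 # 0

# A single moving satellite lights up only the pixels it crosses.
sky_with_satellite = sky.copy()
sky_with_satellite[240, 320] = 200
print(len(simulate_events(sky, sky_with_satellite)))  # 1
```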
The Astrosite was developed at WSU’s International Centre for Neuromorphic Systems (ICNS) with support from the Royal Australian Air Force’s Plan Jericho and the Defence Science and Technology (DST) Group, with the goal of producing quality images that can be processed more easily.
Project research lead, Associate Professor Greg Cohen, told iTnews that if you were to point the Astrosite’s event-based camera (also known as a neuromorphic retina) at a static scene, you wouldn’t get any data out of the camera.
“Only when something moves in the field of view do the affected pixels send the information,” Cohen said.
“So you can imagine if you're pointing it through a telescope at the sky, most of the sky is empty. It's just dark space, so you don't get any information out of that.
“If you use a normal camera, you're taking a picture of all this empty sky, or dark sky at least, continuously. So that's where the enormous amount of data comes from.”
And since sky monitoring systems tracking satellites, space debris or aircraft are only really interested in how things are changing or moving, the Astrosite’s elimination of redundant data makes it hugely beneficial.
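The scale of that saving is easy to put a rough number on. The figures below are invented for illustration, not measurements from WSU: a conventional sensor reads out every pixel of every frame, while an event sensor only reports the tiny fraction of pixels that see something move.

```python
# Back-of-the-envelope comparison with invented numbers, not WSU figures.
width, height, fps = 640, 480, 30
frame_pixels = width * height

# Conventional camera: every pixel of every frame, moving or not.
frame_based = frame_pixels * fps  # pixel reads per second

# Event camera watching a mostly empty sky: suppose only 0.1% of pixels
# see a moving object (satellite, debris, aircraft) in any given frame.
active_fraction = 0.001
event_based = int(frame_pixels * active_fraction * fps)  # events per second

print(f"frame-based: {frame_based:,} pixel reads/s")
print(f"event-based: {event_based:,} events/s "
      f"({event_based / frame_based:.1%} of the frame-based load)")
```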
Cohen added that because the pixels are asynchronous and operate independently of one another, the camera achieves a higher dynamic range and can track objects in space even during daylight hours.
“If it's very bright, it's now just looking for changes and contrast essentially around that point. So the reason the cameras can work during the day is that when there's lots of light, it's still just looking for changes.
“It's not looking for the absolute amount of light. Some of the pixels can be looking at a very bright part of the scene, and some pixels can look at some very dark parts of the scene so you don't get this overexposed, underexposed effect.”
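Cohen’s dynamic-range point can be sketched too. In the research literature, event-camera pixels are commonly modelled as firing when the logarithm of their intensity changes by a fixed contrast step; assuming the Astrosite’s sensor works the same way (the detail isn’t specified here), that is what lets a pixel staring at a bright patch and a pixel staring at a dark patch respond to the same relative change.

```python
import math

CONTRAST = 0.2  # log-intensity step per event; an assumed value

def pixel_fires(reference, current, contrast=CONTRAST):
    """A pixel fires on a *relative* brightness change above the
    contrast threshold, regardless of the absolute light level."""
    return abs(math.log(current) - math.log(reference)) > contrast

# A 30% brightening fires an event in dark and bright pixels alike...
print(pixel_fires(10, 13))          # True (dark part of the scene)
print(pixel_fires(10_000, 13_000))  # True (bright part of the scene)

# ...while a big absolute jump that is relatively small does not.
print(pixel_fires(10_000, 10_500))  # False: only 5% brighter
```

Because each pixel carries its own reference level, no single exposure setting has to fit the whole scene, which is the overexposed/underexposed effect Cohen describes disappearing.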
While space and astronomy are obvious applications for event-based cameras, Cohen said they represent a whole new approach to imaging with an array of industrial uses.
“I think there's some really interesting applications in underwater for example. Anywhere where you have low light. Autonomous vehicles, drones, anywhere where you use a camera to try to solve a problem where you don't really want to see a photo at the end of it, you want to do something.”