Planet Labs Tests AI-Powered Object Detection On Satellite
- Reference: 0181406998
- News link: https://tech.slashdot.org/story/26/04/08/0145220/planet-labs-tests-ai-powered-object-detection-on-satellite
- Source link:
> Artificial intelligence has now run directly on a satellite in orbit. A spacecraft about 500km above Earth captured an image of an airport and then immediately ran an onboard AI model to detect airplanes in the photo. Instead of acting like a simple camera in space that sends raw data back to Earth for later analysis, the satellite [2]performed the computation itself while still in orbit.
>
> The system used an NVIDIA Jetson Orin module to run the object detection model moments after the image was taken. Traditionally, Earth observation satellites capture images and transmit large datasets to ground stations where computers process them hours later. Running AI directly on the satellite could reduce that delay dramatically, allowing spacecraft to analyze events like disasters, infrastructure changes, or aircraft activity almost immediately.
> "This success is a glimpse into the future of what we call Planetary Intelligence at scale," [3]said Kiruthika Devaraj, VP of Avionics & Spacecraft Technology. "By running AI at the edge on the NVIDIA Jetson platform, we can help reduce the time between 'seeing' a change on Earth and a customer 'acting' on it, while simultaneously minimizing downlink latency and cost. This shift toward integrated AI at the edge is a technological leap that can help differentiate solutions like Planet's Global Monitoring Service (GMS), providing valuable insights for our customers and enabling rapid response times when it matters most."
[1] https://slashdot.org/~BrianFagioli
[2] https://nerds.xyz/2026/04/ai-space/
[3] https://www.businesswire.com/news/home/20260407165913/en/Planet-Successfully-Runs-AI-in-Space
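The pattern the story describes can be sketched in a few lines. Everything below is illustrative: the article names an NVIDIA Jetson Orin module but gives no API, so `run_detector` is a hypothetical stand-in for the onboard model, and the frame size and detection values are made up.

```python
import json

def run_detector(frame):
    """Hypothetical stand-in for the onboard object-detection model.
    Returns (label, confidence, bounding_box) tuples."""
    return [("aircraft", 0.91, (1204, 3310, 1260, 3388))]

def downlink_payload(frame, frame_id):
    """Serialize only the detections, not the pixels."""
    detections = run_detector(frame)
    return json.dumps({
        "frame_id": frame_id,
        "detections": [
            {"label": lbl, "conf": conf, "bbox": bbox}
            for lbl, conf, bbox in detections
        ],
    })

# A raw 5000x5000 single-band, 16-bit frame is 50,000,000 bytes (~50 MB);
# the detection payload is a few hundred bytes.
payload = downlink_payload(frame=None, frame_id="frame-12")
print(len(payload), "bytes downlinked instead of ~50 MB of pixels")
```

The point of the sketch is structural: the heavy data never leaves the spacecraft, only the answer does, which is where the claimed latency and downlink-cost savings come from.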
Not impressive, a Pre-ML 1990s PC doable problem (Score:2)
> Instead of acting like a simple camera in space that sends raw data back to Earth for later analysis, the satellite performed the computation itself while still in orbit.
Detecting aircraft in a satellite image is something that human-coded algorithmic computer vision, not machine learning, could do in the 1990s on a desktop PC. That a Jetson can handle a model recognizing aircraft is not surprising at all; it is a rather simple problem. Again: pre-ML, 1990s-PC doable.
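A toy sketch of the kind of pre-ML pipeline the comment has in mind: threshold the image, then group bright pixels into connected components and report their bounding boxes. The 8x8 "image" and threshold are invented for the demo; real 1990s systems would add shape and template tests on top of this.

```python
def find_blobs(image, threshold):
    """Return bounding boxes (min_row, min_col, max_row, max_col) of
    4-connected regions of pixels brighter than `threshold`."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    boxes = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                # flood fill from (r, c), tracking the bounding box
                stack = [(r, c)]
                seen[r][c] = True
                rmin = rmax = r
                cmin = cmax = c
                while stack:
                    y, x = stack.pop()
                    rmin, rmax = min(rmin, y), max(rmax, y)
                    cmin, cmax = min(cmin, x), max(cmax, x)
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                boxes.append((rmin, cmin, rmax, cmax))
    return boxes

# Two bright "aircraft" blobs on a dark tarmac.
img = [
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 8, 8, 8],
    [0, 0, 0, 0, 0, 0, 8, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0],
]
print(find_blobs(img, threshold=5))  # → [(1, 1, 2, 2), (4, 5, 5, 7)]
```

Nothing here needs a GPU or a trained model, which is the commenter's point about the difficulty of the task itself.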
Smart phones and smart watches are already pioneering local ML processing. The machine learning models that can be run on the CPU inside an Apple Watch are impressive. Amazing onboard voice analysis.
Missile, not satellite, probably more desired goal (Score:2)
This might be a proof-of-concept project; the real goal is getting the local ML processing onboard a missile. Or a rod from god.
and it almost worked! (Score:1)
Looking at the image, the AI mis-identified a tail section as wings of a complete, smaller plane.
bent pipe (Score:2)
for feck's sake
there is a reason why you don't do compute in space: it's dumb, and however much power you think is available up there, you still have to launch that weight into orbit
the best option is to do all of this on Earth, with the raw data transmitted down
the ultimate is a passive system like a bent pipe
get over it
Re: (Score:2)
Not a difficult task, local processing not a new idea, but ...
Local processing - whether on your smartphone, smartwatch, or a satellite in orbit - has utility in that:
(1) Data is private.
(2) Time/bandwidth is not needed to transmit the raw data.
With local processing, one might get an answer faster because the transmitted result is such a smaller piece of data.
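Rough numbers for that last point. The figures below (image size, downlink rate, result size) are illustrative assumptions, not values from the article:

```python
# Back-of-envelope comparison: transmitting a raw frame vs. just the result.
RAW_IMAGE_BYTES = 5000 * 5000 * 2   # 5000x5000 px, 16 bits/px = 50 MB (assumed)
RESULT_BYTES = 300                  # small JSON list of detections (assumed)
DOWNLINK_BYTES_PER_S = 100e6 / 8    # assume a 100 Mbit/s downlink

raw_seconds = RAW_IMAGE_BYTES / DOWNLINK_BYTES_PER_S
result_seconds = RESULT_BYTES / DOWNLINK_BYTES_PER_S

print(f"raw image: {raw_seconds:.2f} s")            # 4.00 s on the link
print(f"result:    {result_seconds * 1000:.3f} ms")  # well under a millisecond
```

Under these assumptions the result arrives five orders of magnitude faster than the raw frame, before even counting ground-station scheduling and processing delays.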