New Sats to "Persistent Stare" at Targets for Years

bobbymike

Over the next decade, the Pentagon plans to launch satellites that offer a revolutionary leap in surveillance technology by persistently staring at targets from space for long periods of time, an official said.

Under Secretary of Defense for Intelligence Michael Vickers gave the estimate at a defense conference this week in Washington, D.C.

The Defense Department is at a “pivotal moment for intelligence” due to the rapid technological and geopolitical change underway throughout the world, he said. Adapting to the environment requires both short- and long-term investments, he said.

Read more: http://defensetech.org/2014/11/21/pentagon-satellites-to-persistently-stare-at-targets-in-10-years/#ixzz3JkqVURhi
Defense.org
 
Are they planning on doing this from geostationary orbit or LEO? Both present some really hairy challenges. Near space makes better sense.
 
If DARPA's project MOIRE pans out, then yes, we will be doing the Gorgon Stare from GEO. High-latitude targets still require a polar orbit, though, so either three MOIRE rigs in Molniya orbits or a more traditional, conventional spysat in polar orbit to cover the GEO overwatch gaps.
 
ouroboros said:
If DARPA's project MOIRE pans out, then yes, we will be doing the Gorgon Stare from GEO. High-latitude targets still require a polar orbit, though, so either three MOIRE rigs in Molniya orbits or a more traditional, conventional spysat in polar orbit to cover the GEO overwatch gaps.

You're still going to be 22,000 miles away from the target and looking through ~50 miles of atmosphere. You can have an ungodly high-resolution imager to get more detail, plus software compensation for atmospheric distortion, and the pictures still won't look nearly as good as just having your robot bird with a realistic appearance fly by the target for a look.
 
sublight is back said:
Are they planning on doing this from geostationary orbit or LEO?

Why not neither? Use a statite and hover hundreds of miles over a spot for years.

Payload potential is probably pretty minimal... but it'd make an interesting system nonetheless. Imagine a small telescope that hangs over Tehran, 24/7. Or one given a small ability to maneuver, and it follows Putin around.
 
Yeah but... from Figure 2 of the patent, any feasible mass to area ratio statite will be AT LEAST twice geostationary altitude. Maybe Lunar orbit or beyond.
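A crude way to see why: a statite hovers by balancing Earth's gravity against solar radiation pressure, so for any given sail areal density there is a minimum hover distance. The numbers below are a back-of-envelope sketch (perfect reflector at 1 AU, hover geometry and solar gravity ignored), not figures from the patent:

```python
# Back-of-envelope check: how far out must a solar-pressure "statite" hover
# before sunlight can cancel Earth's gravity? Assumes a perfectly reflective
# sail at 1 AU and ignores hover geometry, solar gravity, and shadowing --
# purely an order-of-magnitude sketch, not the patent's analysis.
import math

MU_EARTH = 3.986e14        # Earth's GM, m^3/s^2
SOLAR_PRESSURE = 9.1e-6    # radiation pressure on a perfect reflector at 1 AU, Pa
GEO_RADIUS = 4.216e7       # geostationary orbital radius, m

def min_hover_radius(areal_density_g_m2):
    """Distance from Earth's center where sail thrust equals Earth's gravity."""
    sigma = areal_density_g_m2 * 1e-3          # convert g/m^2 to kg/m^2
    sail_accel = SOLAR_PRESSURE / sigma        # m/s^2 the sail can produce
    return math.sqrt(MU_EARTH / sail_accel)    # solve GM/r^2 = sail_accel for r

for sigma in (10.0, 1.0, 0.1):                 # g/m^2, heavy to very light sails
    r = min_hover_radius(sigma)
    print(f"{sigma:5.1f} g/m^2 -> hover radius {r / GEO_RADIUS:.1f} x GEO")
```

Even an aggressive 0.1 g/m² all-sail design only gets down to roughly 1.5-2× GEO on this estimate, which is broadly consistent with the "at least twice geostationary altitude" reading of the patent's Figure 2.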
 
sublight is back said:
Proximity and cost.


Care to share lifetime operating cost estimates vs. a representative GEO or LEO platform?


sublight is back said:
You're still going to be 22,000 miles away from the target and looking through ~50 miles of atmosphere. You can have an ungodly high-resolution imager to get more detail, plus software compensation for atmospheric distortion, and the pictures still won't look nearly as good as just having your robot bird with a realistic appearance fly by the target for a look.


For a number of reasons, atmospheric distortion is not an issue for imaging satellites looking down.

By detail I assume you mean optical resolution. Again, this is not that much of a concern - there are points of diminishing returns. The highest-resolution satellites available today have known mirror diameters, and these have not changed significantly in what, 30 years? NRO does not see a significant need for greater resolution, and in fact for many missions prefers lower optical resolution (at least in visible bands).

To get the "global" level of coverage they want would require enough "robot birds" or "near space" platforms to impact the photosynthesis of plant life on the surface.
 
Bill Walker said:
Yeah but... from Figure 2 of the patent, any feasible mass to area ratio statite will be AT LEAST twice geostationary altitude. Maybe Lunar orbit or beyond.

Yeah, but all ya gotta do is drop down a real fine wire of scrith and hang a camera at the end. Easy!
 
You might imagine that deploying membrane optics is an operation you would like to have some experience of before launching a billion-dollar satellite to geostationary orbit?
Perhaps you might want to practise using a subscale system in low Earth orbit; an X-37B-type platform could be useful for proof-of-principle experiments.
 
Mat Parry said:
You might imagine that deploying membrane optics is an operation you would like to have some experience of before launching a billion-dollar satellite to geostationary orbit?
Perhaps you might want to practise using a subscale system in low Earth orbit; an X-37B-type platform could be useful for proof-of-principle experiments.


That would make far too much sense.
More detail on the development of MOIRE's technology:
https://str.llnl.gov/january-2013/britten
 
Article on the maturation of Gorgon Stare technology:

http://www.forbes.com/sites/lorenthompson/2015/04/10/air-forces-secret-gorgon-stare-program-leaves-terrorists-nowhere-to-hide/
 
Gorgon Stare isn't really related to "persistent stare" satellite tech, is it? It's a panoramic camera technology for UAVs. I've seen the two conflated, but they aren't closely connected, AFAIK.
 
TomS said:
Gorgon Stare isn't really related to "persistent stare" satellite tech, is it? It's a panoramic camera technology for UAVs. I've seen the two conflated, but they aren't closely connected, AFAIK.

Tenuous, I know, but the initial ambition for both was to persistently 'stare' at 10 km × 10 km targets. The optics for drone and satellite systems will be very different, but perhaps the sensors would be related.

[Attached image: DroneGorgonStare.png]


http://www.darpa.mil/NewsEvents/Releases/2013/12/05.aspx
"From GEO it is believed, a satellite using MOIRE optics could see approximately 40 percent of the earth’s surface at once. The satellite would be able to focus on a 10 km-by-10 km area at 1-meter resolution, and provide real-time video at 1 frame per second."

https://medium.com/war-is-boring/the-new-sensor-on-this-drone-can-scan-a-whole-city-at-once-33c314d4c763

The PBS Rise of the Drones documentary is well worth a watch if you haven't seen it, just for a glimpse at the capabilities of the Argus sensor (from 31 min onwards... Just for fun, note the content of the discussion and the eyes of the expert being interviewed at 35:02)

https://youtu.be/HopKAYthJV4
 
Somewhat related: the AFRL "Gotcha" radar?

https://www.fbo.gov/index?s=opportunity&mode=form&id=cb163520f9cb87631daa9f7f652dd693&tab=core&_cview=0

'Gotcha radar' aims to help troops see in any conditions

By Jim DeBrosse

Staff Writer
WRIGHT-PATTERSON AIR FORCE BASE — The capability to scan an entire city from the air in any weather and detect ground movement of an object as small as a cockroach :eek: is a goal of the Gotcha radar program at the Air Force Research Laboratory. In development at AFRL for several years, the Gotcha radar was first tested in 2006, looking at a 1-kilometer-sized city section with the help of a shared supercomputer at the AFRL to process the real-time, 3-D video images.

With the unveiling this morning, Aug. 31, of a new supercomputer dedicated entirely to Gotcha, base researchers will be able to scan a circle of about 5 kilometers, converting Gotcha’s real-time radar data into a 400-megapixel image every second, said Dr. Dave Jerome, director of the AFRL’s Sensors Directorate. Each image has about 100 times the resolution of an ordinary camera. The Desch — named after Dayton codebreaker and computer pioneer Joe Desch — was custom-built for Gotcha researchers by Silicon Graphics Inc. of California. To move massive amounts of data quickly, the computer has an extra processor, an SGI Altix 450, dubbed “The Bombe” by researchers for the high-speed codebreaking machine Desch designed to crack the Nazi Enigma codes. Under a top-secret Navy program starting in 1943, NCR produced and built 120 of Desch’s codebreaking machines, which were so reliable they operated day and night until the end of the war.

Although the $2.2 million Desch supercomputer can assimilate, store and analyze radar data with blazing speed, its hardware alone can’t show researchers what’s moving on the ground. Radar operates by sending out electromagnetic energy and detecting the energy reflected back by objects. Radar’s advantage is that it can find objects day or night and peer through clouds, rain or snow. The disadvantage is that it requires complex computer coding to translate its readings into meaningful, real-time images for surveillance. “It’s not like putting together a Power Point presentation,” said Gotcha program manager Mike Minardi. “You have to write the code.” The Gotcha program is an Air Force team effort involving 50 researchers, most of them at Wright-Patterson. Minardi’s group handles the processing of the radar signals, a second maximizes the efficiency of the supercomputer, a third is tweaking the radar design and a fourth — based in Rome, N.Y. — is working on the datalink between the airborne-radar and ground crews.

If successful, the Gotcha radar will increase the real-time awareness of troops on the ground, particularly in urban areas where they wouldn’t be able to see the approach of vehicles or other threats, Minardi said. Because of its massive storage capability, as much as 2.2 million home computers, the Desch also will allow Gotcha users to peer back in time. When trouble erupts in a city, they can view past images and trace back the offenders to their origins. Minardi said the computer may be able to store a month’s worth of images. Tom Majumder, a Gotcha researcher, said the program’s aim is to build a system that provides persistent real-time surveillance for troops any time of day or night and under all conditions. “We compare what we’re doing with what Desch had done with his Bombe,” he said. “We need to do this surveillance 24-7.”
 
By my rough maths, that's roughly 20 pixels per m² (based purely on the "400MP" statement; it could be a blurry 400MP image), so unfortunately you'd need some prehistoric super-cockroaches to spot them from space.
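The ~20 px/m² figure reproduces if the article's "circle of about 5 kilometers" is read as a 5 km diameter; that diameter reading is an assumption on my part:

```python
# Reproducing the rough maths above: 400 megapixels spread over a
# 5 km-diameter circle (reading "circle of about 5 kilometers" as a
# diameter is an assumption) gives roughly 20 pixels per square meter.
import math

pixels = 400e6
radius_m = 2500.0                       # 5 km diameter -> 2.5 km radius
area_m2 = math.pi * radius_m ** 2       # ~19.6 million m^2
density = pixels / area_m2
print(f"{density:.1f} pixels per square meter")   # ~20.4
```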
 

A 400-megapixel image 'has about 100 times the resolution of an ordinary camera'. The author is slightly behind the times when it comes to the state of the art in camera sensors.
My nine-year-old, cheapish-at-the-time Sony DSC-H5 has a 7.2 MP sensor.
The fairly new, commercially available Nikon D7200 has a sensor of 25 MP nominally, 24 MP effectively.
 
Either the reporter got some details wrong or the radar isn't actually trying to generate a high-resolution image, but rather simply looking for sub-meter-level displacement from frame to frame. It seems to be a radar analog to Gorgon Stare. This system is pretty heavy on back-end computation, so I wonder if an alternate approach of simply optimizing a radar to cue on returns generated by humans could be scaled to cover a large area. A number of radars have been developed to look through forest canopies and track humans walking underneath. They could see if a blimp-borne version might be scaled up to cover larger areas.
 

[Attachment: Forrester Radar On Humingbird.jpg]

Could somebody please change "Persistant" (French spelling) to "Persistent" (English spelling) in the thread title?
 
Note it says "movement".

The data is collected in a single radar mode, but is processed into several different data products such as video Synthetic Aperture Radar (SAR), Ground Moving Target Indication (GMTI) with Minimum Detectable Velocity (MDV), Coherent Change Detection (CCD), Super-resolution 2D imagery, and 3D SAR imagery.
The cockroach is likely a reference to its GMTI capabilities. Megapixels probably refers to its SAR capabilities.

But since SAR requires the antenna to be moving, it is very probably not related to persistent *staring*.
 
The Gotcha radar is indeed very interesting... I doubt it is intended for use on a satellite (LEO or GEO). I'll leave this patent here for the interested, to give some background on the Rayleigh diffraction limit:
http://www.google.com/patents/US7375802?cl=en
(Note: the science behind the claims of this patent is... "unproven", with notables such as Prof. Brian Cox saying some very unkind things.)

In the context of the US developing new IMINT satellites, perhaps things are further along than is being claimed? It might explain this act of generosity
http://www.secretprojects.co.uk/forum/index.php/topic,15730.60.html
 
Old but relevant (IMHO)

http://www.defenseindustrydaily.com/darpas-moire-video-scud-hunts-from-space-07079/

"DARPA expected a $30-40 million award for Phase II, which came to pass in September 2011 with an award to Ball Aerospace. Phase 2 will produce and test a 5 meter brassboard telescope, showing a technological path to flight aboard a satellite. Membrane fabrication must also be trace-to-flight, and phase 2 will incorporate either sub-scale or fractional demonstration of the system’s unfurling in space. Phase 2 will end with a System Preliminary Design Review."

"MOIRE Phase 2
Sept 2/11: Ball Aerospace Corp. in Boulder, CO receives a $36.9 million cost-plus-fixed-fee contract for Phase 2 of the MOIRE Program.

Work will be performed in Broomfield, CO (73.2%); Livermore, CA (7.2%); Goleta, CA (17.2%); and Huntsville, AL (2.4%). Work is expected to be complete by Feb 10/13 (HR0011-10-C-0157)."

some possibilities:
1) these unfurling tests did not occur
2) they were launched as a piggyback payload on a DOD launch
3) X-37B (if not unfurling tests, then at least a sample of the membrane could have been carried in the payload bay, exposed to space for the duration of a mission, and then returned for inspection to evaluate the membrane's robustness to space)

I have read somewhere that humidity (in the pre-launch, terrestrial environment) is a challenge for earlier membrane materials. Not surprising that optics made from materials that resemble "cling-film"* are delicate.

*other thin-film food wrap products are available ::)
 
The fact that the white-world observation sat companies aren't all over membrane optics struck me as odd, especially the cubesat guys. MOIRE's earlier incarnation as a diffraction-grating laser-holed mesh must have enough negative attributes to make it not worthwhile, and the newer ion-milled Fresnel-lens membranes seem to be a giant pain to make. The folding issue is real, but having a hole in the middle makes it a lot easier. I never understood why Ball was using squarish frames for their design, though. Wouldn't a design based on triangular subframes stretching the membrane be easier, from a mount-mechanical perspective, at keeping the membrane flat? Any four-point attachment will always have some level of creasing across two points, but a three-point mount should yield an uncreased membrane.
 
Easy enough to just go to Ball and see what they have to say. Note how the membrane material is visibly wrinkled in some of the video. The critical factor is the lateral spacing of the synthetically generated fringes deposited on the membrane, so you can get away with this if it isn't too large. These "optics" can work in either refraction (as implemented for this program) or reflection, but you need a precision substrate if you do it in reflection. Like any diffractive optic, there is the issue of grating "efficiency", but given the size you can achieve, you more than make up for it. On the other hand, the resolution is directly related to diameter, so you can see it might be possible to build a high-resolution spy sat and park it in geostationary orbit. A scaling factor of 88 would be needed to match what a low-altitude spy sat could do (an 88-meter diameter at geostationary to equal a 1-meter diameter at 250 miles).
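That factor of 88 falls straight out of the range ratio: diffraction-limited resolution scales as λR/D, so holding resolution fixed means the aperture must grow in proportion to range. A quick check:

```python
# The "scaling factor of 88" is just the ratio of ranges: diffraction-limited
# ground resolution goes as 1.22 * lambda * R / D, so matching a 1 m aperture
# at 250 miles from GEO needs an aperture larger by R_geo / R_leo.
GEO_ALTITUDE_KM = 35786.0              # geostationary altitude
LEO_ALTITUDE_KM = 250 * 1.609          # the post's "250 miles", in km

scale = GEO_ALTITUDE_KM / LEO_ALTITUDE_KM
print(f"aperture scaling, GEO vs 250-mile LEO: {scale:.0f}x")   # ~89x
```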


https://www.youtube.com/watch?v=QAv06I10Bvw
 
bobbymike said:
Over the next decade, the Pentagon plans to launch satellites that offer a revolutionary leap in surveillance technology by persistently staring at targets from space for long periods of time, an official said.

Under Secretary of Defense for Intelligence Michael Vickers gave the estimate at a defense conference this week in Washington, D.C.

The Defense Department is at a “pivotal moment for intelligence” due to the rapid technological and geopolitical change underway throughout the world, he said. Adapting to the environment requires both short- and long-term investments, he said.

Read more: http://defensetech.org/2014/11/21/pentagon-satellites-to-persistently-stare-at-targets-in-10-years/#ixzz3JkqVURhi
Defense.org

Video of Michael Vickers comments

https://youtu.be/HLfvrBmQeMc

Skip to 13:10 for the section dealing with persistent global coverage... Just for fun, note that at 13:56 his hand gestures could be interpreted as being revealing (indicating low earth orbital coverage moving to geostationary?).

In the interim, new LEO IMINT satellites are on the way, unless NROL-71 surprises us all & launches to GEO ;) (unlikely)
http://spaceflightnow.com/2015/05/01/next-round-of-u-s-optical-spy-satellites-to-start-launching-in-2018/
 
Potentially another method for high-resolution (7 cm!) imaging from geostationary orbit:

[Attached slides: agorascope1.png, aragoscope2.png, aragoscope12.png, aragoscope14.png]


https://www.nextbigfuture.com/2016/08/giant-non-diffraction-limited-space.html

From the video viewable in the above link (from ~60 min onwards), the lead researcher says that the NRO threw the idea in the waste-paper basket...

One idea that initially crossed my mind: could the opaque disc (which sits in front of the optics package) block the satellite from Earth-based observation? My next thought was "Misty-eyed speculation":
http://www.space.com/637-anatomy-spy-satellite.html

Edited to fix broken links (20th August 2017)
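For a sense of why the 7 cm claim above raises eyebrows: a conventional, diffraction-limited telescope delivering that from GEO would need an enormous aperture (the wavelength is an assumed visible-band value; the Aragoscope's whole pitch is evading this limit):

```python
# The conventional aperture that 7 cm resolution from GEO would demand under
# the Rayleigh criterion -- the limit the Aragoscope claims to sidestep.
# The 550 nm wavelength is an assumption, not a figure from the slides.
GEO_ALTITUDE = 3.5786e7   # m, geostationary altitude
WAVELENGTH = 550e-9       # m, assumed visible-band wavelength
TARGET_GSD = 0.07         # m, the 7 cm figure from the post

aperture = 1.22 * WAVELENGTH * GEO_ALTITUDE / TARGET_GSD
print(f"equivalent conventional aperture: {aperture:.0f} m")   # ~343 m
```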
 
https://www.nextbigfuture.com/2017/06/israeli-gigapixel-camera-on-drones-with-30-hours-of-endurance-provide-persistant-surveillance-of-80-square-kilometers.html
 
Does GORGON STARE figure into this? I know they were looking at making use of high pixel density cellphone camera sensors in huge arrays. Think of a few thousand 16 megapixel sensors overlapping in conjunction with hyperspectral imaging for a persistent stare. I haven't heard much on the program lately so I figure it either petered out or is going operational for NRO. It was a DARPA program initially but I lost track of it.
 
https://www.nextbigfuture.com/2017/08/more-telescope-for-drones-and-satellites-with-weight-reduction-of-ten-to-100-times.html
 
phrenzy said:
Does GORGON STARE figure into this? I know they were looking at making use of high pixel density cellphone camera sensors in huge arrays. Think of a few thousand 16 megapixel sensors overlapping in conjunction with hyperspectral imaging for a persistent stare. I haven't heard much on the program lately so I figure it either petered out or is going operational for NRO. It was a DARPA program initially but I lost track of it.

It was featured on a science documentary not long ago over here in the UK, as it's a BAE product. Though the guy talked in general terms about it, they made it clear there was much they couldn't talk about, so I assume it's still going.
 
http://www.defenseone.com/technology/2017/09/future-spy-satellites-just-got-exponentially-smaller/140700/?oref=d-topstory
 
https://www.upi.com/Defense-News/2017/09/29/New-DARPA-radar-sensor-captures-video-through-clouds/6061506697848/?utm_source=sec&utm_campaign=sl&utm_medium=1
 
https://www.armytimes.com/news/your-army/2017/10/12/army-researchers-are-working-on-mapping-tech-that-can-help-you-see-the-battlefield/
 
http://www.nationaldefensemagazine.org/articles/2012/9/1/2012september-super-camera-puts-human-eye-to-shame
 
bobbymike said:
http://www.nationaldefensemagazine.org/articles/2012/9/1/2012september-super-camera-puts-human-eye-to-shame

Same folks who did this earlier version back in 2012.
https://www.youtube.com/watch?v=ASnm8P_OiTA
 