SpaceX (general discussion)

That still does not take care of the threshold problem. With a dim light source, if your interval is too short, the light collected on the sensor is not enough to trigger the sensor to do anything at all. Which means, for that interval, there is no information to add. Reset the sensor, and you start from scratch. Pick a longer interval, and the streaks from passing satellites play havoc with detecting the light from the dim source. For deep space imaging, exposure can range into tens of minutes, or longer.
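A toy numerical sketch of that threshold argument, with made-up numbers (not any particular sensor):

```python
# Toy model (made-up numbers): a dim source trickles signal in slowly.
# If each short interval is read out and reset, any interval that stays
# below the detection threshold contributes nothing at all.

signal_rate = 0.5     # accumulated units per second from the dim source
threshold = 100.0     # minimum accumulation before the sensor registers anything
total_time = 600.0    # a ten-minute observation

# One long exposure: the signal accumulates continuously.
long_exposure = signal_rate * total_time              # 300 -> clears the threshold

# The same time sliced into 60 ten-second exposures, each reset afterwards.
n_slices = 60
per_slice = signal_rate * (total_time / n_slices)     # 5 -> below threshold every time
detected = per_slice if per_slice >= threshold else 0.0
stacked = n_slices * detected                         # 0 -> nothing left to add up

print(f"long exposure: {long_exposure:.0f} (detected: {long_exposure >= threshold})")
print(f"stacked slices: {stacked:.0f} (no interval ever cleared the threshold)")
```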
 
What is determining whether a pixel gets "triggered"? Light over time? An instantaneous burst? What?
 
The amount of light striking the sensor during the polling interval or shutter time. If the accumulated input is enough to pass the threshold, there is a result, with anything past the threshold generating a brighter output. Below the threshold, no result at all.

Photographic film is still in use in astronomy, alongside CCDs
<edit> and, for very long exposure times, IPCS - the Image Photon Counting System </edit>
<edit2> which in some cases is a special type of CCD </edit2>
 
So it is being monitored. Then it should be possible to buffer the data and remove the time intervals where numerous pixels get blown out simultaneously. That said, we've probably run this discussion into the ground. Large constellations are coming. Whether the solution is reducing satellite albedo below a certain limit, software tweaking of the collected data, or a combination of techniques, it will get addressed in the end.
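For what it's worth, the rejection step being suggested here resembles the sigma clipping used in stacking pipelines; a minimal NumPy sketch (array shapes and numbers invented), which of course only helps for signal that registers in each individual frame:

```python
import numpy as np

def sigma_clipped_mean(frames, n_sigma=3.0):
    """Average a stack of frames, pixel by pixel, after rejecting
    samples that deviate wildly from the stack median (e.g. a
    satellite streak blowing out one frame)."""
    median = np.median(frames, axis=0)
    sigma = np.std(frames, axis=0)
    keep = np.abs(frames - median) <= n_sigma * sigma
    return np.nanmean(np.where(keep, frames, np.nan), axis=0)

# Ten clean frames plus one carrying a bright streak.
rng = np.random.default_rng(0)
frames = rng.normal(100.0, 5.0, size=(11, 64, 64))
frames[5, 30:34, :] += 5000.0          # simulated satellite trail
stacked = sigma_clipped_mean(frames)   # streak rows come back to ~100
```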
 
Again, some very dim objects are only visible at very long exposure times. Snip that exposure into shorter intervals, and collecting the information from the separate shorter intervals leaves you without any information about the dim object at all, because the detection threshold was not cleared during any of them. Input during an interval below the detection threshold -> no input detected in that interval -> no information to collect from it.
 
Yes, for film. I'm not seeing how it's applicable to a digital sensor. It either detects something or it doesn't. Or are you saying each pixel on a detector is analog? (Honest question.)

"A peculiar aspect of nearly all IR FPAs is that the electrical responses of the pixels on a given device tend to be non-uniform. In a perfect device every pixel would output the same electrical signal when given the same number of photons of appropriate wavelength. In practice nearly all FPAs have both significant pixel-to-pixel offset and pixel-to-pixel photo-response non-uniformity (PRNU). When un-illuminated, each pixel has a different "zero-signal" level, and when illuminated the delta in signal is also different. This non-uniformity makes the resulting images impractical for use until they have been processed to normalize the photo-response. This correction process requires a set of known characterization data, collected from the particular device under controlled conditions. The data correction can be done in software, in a DSP or FPGA in the camera electronics, or even on the ROIC in the most modern of devices."

This suggests they're already post-processing collected data.
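The correction that quote describes is essentially two-point (offset and gain) calibration; a rough sketch of the idea on synthetic data (all numbers invented, not a real device):

```python
import numpy as np

rng = np.random.default_rng(1)
shape = (64, 64)
offset = rng.normal(50.0, 5.0, shape)   # per-pixel "zero-signal" level
gain = rng.normal(1.0, 0.05, shape)     # per-pixel photo-response (PRNU)

def sensor_read(flux):
    """Simulated raw frame: every pixel responds a bit differently."""
    return offset + gain * flux

# Characterization under controlled conditions: a dark frame (no light)
# and a uniform flat field of known level.
dark = sensor_read(0.0)
flat_level = 1000.0
flat = sensor_read(flat_level)

def correct(raw):
    """Two-point correction: subtract the per-pixel offset, divide out
    the per-pixel gain inferred from the flat field."""
    return (raw - dark) / ((flat - dark) / flat_level)

raw = sensor_read(500.0)                # a uniform 500-unit scene
fixed = correct(raw)
print(raw.std(), fixed.std())           # large spread before, ~0 after
```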
 
Digital sensors have a detection threshold that has to be cleared before anything is detected. If the sensor's input is not energetic enough, there is input without detection. Accumulate the input over time, and at some point detection happens - because the CCD's charge has changed enough. Reset the sensor for detection of another image before that point is reached, and nothing is recorded.
 
I take it, from the fact that you failed to answer my point about regulatory failure in this particular issue, that you agree there has been a regulatory failure here.
 
If there is no image of the dim object to process, no amount of post-processing is going to produce one.
 
According to an anonymous source at the Florida Cocoa Beach site:

SpaceX is closing the site and moving to the new construction site at KSC.
Workers are being laid off and Starship Mk.2 will be abandoned,
while work begins on Mk.3 (Texas) and Mk.4 (KSC).

Source
 
And there has to be a way to monitor that accumulating CCD charge, else the post-processing mentioned above would be impossible. But we could go back and forth on this to what end? You have your opinion and I have mine. If the presence of satellite constellations brings on the extinction of ground-based telescopic observation, then you'll be able to pat yourself on the back. When (not if) they come up with a ground-based solution to the problem (could be software/hardware/firmware), rest assured, we'll be discussing this again.
 
Hardly. If the astronomical(?) community failed to make its case, that doesn't make it a regulatory failure.
 
I already posted that exact same video upthread.
 
How could you possibly know that they didn't make their case and it wasn't just ignored?
 
The process of capturing light is similar enough for chemical and electronic sensors. A strong signal will swamp a weak one in its vicinity. Processing afterwards will not recover what you failed to record in the first place.

Unless the light reflected from a satellite can travel back in time, it won't.

No, the sensors aren't magical. We don't have single photon detectors, so you can't buffer the arrival of each individual photon and its source and then throw away the ones you don't want.

Electronic sensors can be thought of as an array of tiny buckets which have to be emptied to be read, and which will only register as having something in them after a certain threshold is exceeded (say, the bottom is covered). So buffering just keeps resetting all the buckets to zero. Really dim objects generate tiny amounts of input and so may not cover the bottom of the bucket for hours. Naturally, someone going past with a firehose during that hour will likely completely screw up the detection of those dim objects (or the tiny variations in brightness which provide details of structure, etc.).
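That picture in code, roughly (all numbers invented): the trickle needs the whole hour to register, and a one-second "firehose" saturates the bucket and erases the distinction:

```python
# Bucket model (invented numbers): a pixel as a bucket with a
# detection threshold and a full-well (overflow) capacity.

threshold = 100.0      # registers only once the bottom is covered
full_well = 50_000.0   # the bucket can't hold more than this
trickle = 0.05         # fill rate from a dim object, units per second

# A quiet hour: the trickle slowly clears the threshold.
level = 0.0
for second in range(3600):
    level = min(level + trickle, full_well)
print(level, level >= threshold)            # ~180, True -> detected

# The same hour with a one-second satellite "firehose" halfway through.
level = 0.0
for second in range(3600):
    level = min(level + trickle, full_well)
    if second == 1800:
        level = full_well                   # streak saturates the pixel
print(level)                                # 50000.0 -> dim signal obliterated
```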

For what you describe, something would have to be keeping track of how much of the "bucket bottom" is full, no? Yes, buffering would reset them to zero, but if you knew how much of the bucket bottom was covered before you reset it, you'd just add that to what you capture after the reset.

No, and yes, my analogy is woefully inadequate. :D

No, the bucket only registers if the bottom is covered; that's the threshold sensitivity. If it could detect 1/100 of the bottom being covered, then that would be a sensor that was 100 times more sensitive to input (but then you'd still have the same problem, only at 1/100 of the original threshold :) ).

Basically, that's what I've been saying. Monitor each "bucket" every 0.xx seconds and record the information. Then add up your information over time, tossing out the times that whole fields get blown out.

Yeah, I know that's what you're saying. What I'm saying is that the current technology doesn't work in a way that makes that possible.
 
If you are using stacked exposures you are, in effect, adding numerous short-exposure images, where you still run into the threshold problem. With a long-exposure image the threshold can be cleared. It does not matter whether a physical shutter is used or the virtual shutter created by the processor's polling interval; the result is the same.
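In real-sensor terms, the closest analogue of this "threshold" for a CCD is the read noise added at every readout; a back-of-envelope comparison with illustrative numbers (all of them assumed):

```python
import math

source_rate = 0.05   # photoelectrons/s from a very dim object (assumed)
sky_rate = 0.2       # photoelectrons/s of sky background (assumed)
read_noise = 5.0     # electrons RMS added per readout (typical-ish CCD)
total_time = 3600.0  # one hour of total integration

def snr(n_reads):
    """Signal-to-noise for the hour split into n_reads equal exposures,
    co-added afterwards. Each readout contributes its own noise."""
    signal = source_rate * total_time
    noise = math.sqrt(signal + sky_rate * total_time + n_reads * read_noise**2)
    return signal / noise

print(f"1 read:     SNR = {snr(1):.2f}")     # ~5.9
print(f"60 reads:   SNR = {snr(60):.2f}")    # ~3.7
print(f"3600 reads: SNR = {snr(3600):.2f}")  # ~0.6 -> the object is buried
```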

There is a simple and obvious and historically correct solution: locate your telescope *above* the satellites. Problem solved.
 
In orbit. Or on the moon. It would be worth the money, but who is going to pay that bill?
 
Regarding Starlink satellites and Earth-based space observation, I feel the controversy is a bit old-fashioned: post-Big Bang space cartography has already been achieved thanks to multi-spectrum observation.
Optical ground-based observation will have to evolve, gaining much more depth in the process.
This is all part of the human story, and nobody in the telescope world is immune to it ;)
 
There is a simple and obvious and historically correct solution: locate your telescope *above* the satellites. Problem solved.

Bit rough on the amateurs who contribute so much but don't have access to a space program.
 
In orbit. Or on the moon. It would be worth the money, but who is going to pay that bill?
It will probably have to wait
Bit rough on the amateurs who contribute so much but don't have access to a space program.
They should probably crowd-fund a software solution then because satellite constellations are going to happen.
 
Bit rough on the amateurs who contribute so much but don't have access to a space program.

You know what's *really* rough on amateur astronomers? Light pollution. A far greater problem than mere occasional satellites at dawn and dusk. Nothing seems likely to help with that little problem any time before the next global thermonuclear war.
 
Even in countries as small and densely populated as the Netherlands, there are still places where light pollution is low enough for meaningful optical observation of the stars. Saying 'there has to be a way to monitor that [CCD] charge' and then suggesting crowd funding for a software solution is, in my opinion, an expression of wishful thinking.
 
Oh no, I'm not "wishful thinking". I'm saying if it is the "end of the world", and satellite constellations are the boogie man being claimed, then there is a choice. Either develop some way of removing their effect from imagery or pout. Banning satellite constellations ain't gonna happen.
 
In short, kill amateur astronomy for the sake of making a buck.
 
Even in countries as small and densely populated as the Netherlands, there are still places where light pollution is low enough for meaningful optical observation of the stars.

Where I live, in a rural area in Alberta, I'm about 150 km from the center of Calgary in one direction, and about 100 km from the center of Red Deer in the other. The 'domes of light' emanating from both these cities light up the night sky enormously. I wouldn't have thought there was anywhere in the Netherlands with no towns within a 200 km stretch.
 
In short, kill amateur astronomy for the sake of making a buck.
Nobody said they couldn't look or develop ways to adapt. Adapt or die. That's the name of the game, no? Lots of hobbies have gone bye-bye over the centuries.
 
Stargazing is a problem in the Netherlands: the Wadden islands are relatively dark, but not as dark as astronomers would wish. Anywhere else in the country, light pollution precludes observing anything but the brightest stars.
 
Erm, NO. Sites for earthbound astronomy can be found in sparsely populated areas of the world. Until, of course, we start flooding near space with lots of highly reflective satellites. It would be nice if there were some regulatory authority that would specify the use of non-reflective materials, with the parties that operate satellites actually listening to that agency. There are big advantages to space-borne observation platforms, but they come at great cost - beyond the means of amateur astronomers and most of the current professional astronomers.
 
I seem to recall SpaceX saying they were already working on reducing the albedo of their sats. Sounds like the perfect case for Vantablack (assuming it's durable enough). Don't know how that might affect vehicle heating, however.
 
Satellites are lit only near the terminator (around dusk/dawn). Satellites will generally be sub-arcsecond in angular extent (as a comparison, Mars is 25 arcseconds at closest approach, while a star will be a tiny fraction of an arcsecond). It doesn't matter if a satellite is lit or dark if it occludes the target you are looking at. All satellites at all altitudes do this.

Atmospheric turbulence, clouds, aircraft and even meteor contrails constantly affect optical astronomy. Radio astronomy has RF noise from all the various radio emitting satellites in space.

If you are doing science, you keep tabs on all this and schedule your observations accordingly. If you are a purist who yearns for an unspoiled sky, you were born in the wrong time. There is even a plan to set up artificial meteor showers for entertainment in the very near future.
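For scale, the small-angle arithmetic behind the angular-extent claim (sizes and altitudes are round, assumed numbers):

```python
import math

def angular_size_arcsec(size_m, range_km):
    """Small-angle approximation: theta ~= size / range."""
    theta_rad = size_m / (range_km * 1000.0)
    return math.degrees(theta_rad) * 3600.0

print(angular_size_arcsec(3.0, 550.0))    # ~1.1 arcsec: a 3 m body overhead at 550 km
print(angular_size_arcsec(3.0, 1200.0))   # ~0.5 arcsec at a higher orbit
```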
 
Can you imagine the effect if Solar Power Satellites become a thing?
 
It would be nice if there were some regulatory authority that would specify the use of non-reflective materials, with the parties that operate satellites actually listening to that agency.

So long as:
1: That agency pays the additional costs that such regulations impose (R&D, extra systems on the satellites to deal with the added thermal issues and mass, added launch cost due to heavier satellites, etc.)
2: That agency makes its money by means other than fees and taxes (i.e. it's not merely an authoritarian parasite, but a productive member of society).
 
Can you imagine the effect if Solar Power Satellites become a thing?


SPSs would be in geosynchronous orbit, meaning equatorial. Which means that, like other GEO sats, they'd pass through the Orion nebula from time to time. This could of course be predicted and planned for... and I can assure you that for every amateur complaining about an SPS messing up his shot of Orion, there'd be others who would specifically target that opportunity.

 
SpaceX successfully launched an uncrewed Dragon spacecraft for NASA today (Dec. 5) on the company's final cargo mission of the year, sending fresh supplies to the International Space Station — and also sticking a rocket landing on a drone ship off the Florida coast.

A shiny, new two-stage Falcon 9 rocket lifted off at 12:29 p.m. EST (1729 GMT) from Launch Complex 40 at Cape Canaveral Air Force Station carrying the company's robotic Dragon cargo capsule toward the orbiting lab following a 24-hour delay due to high winds.
 
