SpaceX (general discussion)


 


He is back, Greyhavoc !!
 
Causing more trouble for astronomers.

STARLINK SATELLITES PHOTOBOMB A METEOR SHOWER: Yes, there was an outburst of alpha Monocerotid meteors on Nov. 22nd. As predicted by forecasters Esko Lyytinen and Peter Jenniskens (NASA/Ames), Earth grazed a filament of comet dust, prompting a flurry of meteors to emerge from the constellation Monoceros (the Unicorn). On La Palma in the Canary Islands, a Global Meteor Network camera captured the display--and something more. Starlink photobombed the meteor shower.

 
I'm sure it's not the first time a satellite has ever appeared in a picture of space. Interesting that they haven't felt compelled to write articles every other time it's happened. :rolleyes:
 
The problem is crossing over from "annoying but bearable" to "will ruin most of our observations". That's what compels them to write.
 

I'm having a difficult time buying the hysteria. They probably already have filters available that remove things like satellites, airplane lights, etc. If they don't have them they better figure it out because it's happening. SpaceX is the first of many to come.
 

Filters can be used for things like sodium lights which emit at a single frequency. To filter out Starlink reflections, you'd have to eradicate most of the visible light spectrum.

Astronomers have figured out what needs to happen: satellite builders need to make their satellites less bright. There are lots of ways to do this; these are the first two I can come up with after two minutes of thinking about the issue:

- first and foremost, don't use reflective materials (like bright shiny MLI) when nonreflective materials are available,
- angle the solar panels and other unavoidably reflective surfaces to reflect light away from Earth.

The publicity is their way of pressuring manufacturers into doing this.
 
IIRC SpaceX is already working on reducing the albedo.
 
"On today's episode of "Tusk Man Bad"..."
He’s not above criticism, you know.
Nobody is. When he's singled out for doing what at least half a dozen other companies are planning to do, though, it's a bit obvious. And why weren't these people complaining a decade ago, when all this was in the planning stages? Lastly, space is the place to do observation from anyway. (Yes, I read the article. Sounds like a hysterical researcher trying to protect his turf, facts be damned.)
 
When he's singled out for doing what at least half a dozen other companies are planning to do though

Nope. Astronomers are very much aware of the other large constellations being prepared, so they protest all of them. They have been aware of the issue since the late 1990s, when the Iridium constellation was launched, which used large mirror-like flat panel antennas that would get very bright in the night sky. Thankfully, the Iridium NEXT satellites being launched now are far dimmer.

This year, Elon was so nice as to give us all a visual demonstration of what's to come, so of course it's a popular topic now. Oh, and when astronomers protested after the first launch, SpaceX listened (but not without firing off another 60 satellites of the initial, too-bright design).

It's easy to be glib and say "oh, all astronomers should just use space telescopes instead". The number of space telescopes in use today is a few dozen across the entire EM spectrum. I count about 800 professional-grade astronomical observatories on Earth, and that ignores the thousands of amateurs who contribute observations on this level.

Astronomers have always been protective of the EM spectrum they're trying to observe. To get FCC approval, SpaceX had to comply with regulations for the radio emissions from their Starlink sats that are aimed at protecting radio astronomy. For visual astronomy, such protections aren't in place at the moment, but the time has clearly come for regulation.
 
One can't help but wonder why, if they've been arguing about it for decades, no regulations are in place. Perhaps their concerns are perceived as being overblown. Still would like to know why they don't just filter the satellites out. It's not like satellites have never passed through a telescope's FOV before.
 
If you want to observe stars properly from the Earth's surface, you find a spot with as little environmental light as possible. If you are near any densely populated area, starlight is drowned out by the reflected light from surface sources. In the same way, the light scattered by highly reflective satellites drowns out starlight. On the Earth's surface, you can find other, darker locations for observing the sky. When the light is reflected from orbiting objects, you are out of luck. As to why the powers that be are deaf to astronomers' pleas: astronomy is not the most financially profitable of activities.
 

Still not seeing why they couldn't be filtered out. It's not like once a satellite clears the horizon it ruins the image of every telescope within line of sight, whether it's looking in its direction or not.
 

To get dim objects you need to leave the shutter open for a long time. Big bright streaks across the image will drown out dimmer signals.
 

Or take a *lot* of shorter duration photos and then stack them. It would seem to be simplicity itself for a good program to go through the pile of digital images and toss out the ones with anomalies, everything from satellites to meteors to airplanes to birds to clouds, and flashes from lightning and headlights and whatnot.
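For what it's worth, here is a minimal sketch of that frame-rejection idea in Python with NumPy. Everything in it is invented for illustration (the frames cube, the rejection factor, the hypothetical stack_rejecting_streaks name); it only shows the mechanics of dropping frames whose peak brightness jumps far above the rest before co-adding the survivors.

[CODE=python]
import numpy as np

def stack_rejecting_streaks(frames, factor=3.0):
    """Co-add a stack of short exposures, skipping frames that contain an
    obvious bright anomaly (satellite streak, aircraft, meteor).

    frames : ndarray of shape (n_frames, height, width), assumed to be
             already dark-subtracted and flat-fielded.
    factor : a frame is rejected if its brightest pixel exceeds this
             multiple of the median per-frame maximum (a crude, arbitrary
             choice for this sketch; real pipelines use better statistics).
    """
    per_frame_max = frames.max(axis=(1, 2))              # peak brightness of each frame
    keep = per_frame_max < factor * np.median(per_frame_max)
    return frames[keep].sum(axis=0), keep                # co-added image + mask of kept frames

# Hypothetical usage with synthetic data:
rng = np.random.default_rng(0)
frames = rng.poisson(2.0, size=(100, 64, 64)).astype(float)  # 100 faint, noisy frames
frames[17] += 500.0                                      # fake a satellite streak blowing out one frame
stacked, kept = stack_rejecting_streaks(frames)
print(f"kept {kept.sum()} of {len(frames)} frames")      # expect 99 of 100 here
[/CODE]

The trade-off the later posts get into is that every rejected frame is lost integration time, and every kept frame still contributes its own read noise.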
 
The point of leaving the shutter open is that the light of distant stars accumulates on your sensor/film. A short exposure will not pass the detection threshold of your sensor/film.
 
One can't help but wonder why, if they've been arguing about it for decades, no regulations are in place. Perhaps their concerns are perceived as being overblown. Still would like to know why they don't just filter the satellites out. It's not like satellites have never passed through a telescope's FOV before.
Or perhaps the regulatory authority is not up to the job in this particular case, has made a mistake, or has not considered all the opinions. It could be any of a hundred things. But instead you seem to automatically assume the astronomers are at fault.
 
To get dim objects you need to leave the shutter open for a long time. Big bright streaks across the image will drown out dimmer signals.
Sure. But it's not film anymore; it's digital. Buffer X number of frames and then remove those with the streaks in post.
 
To capture the light from distant stars, photon by photon, you cannot avoid extended exposure of the sensor/film. Irrespective of the use of electronic sensors or photographic film, you face a detection threshold that cannot be cleared with short-exposure images. Long-exposure images come with streaks from reflective orbiting objects which swamp the weak light from the stars. It would be possible to filter the streaks from the short-exposure images, but as not enough of the light from the dim object has reached your sensor to clear the threshold, the dim object does not register at all. Not with the first image. Or the second, or the umpteenth. After filtering, what would be left would be a streakless composite image without a hint of the dim object you wanted to study in the first place.
 

That's not how digital sensors work. Even if it was, again, you buffer what you capture, and remove the frames where a satellite passed over.
 
If you are using stacked exposures you are, in effect, adding numerous short-exposure images, where you still run into the threshold problem. With a long-exposure image the threshold can be cleared. It does not matter whether a physical shutter is used or the virtual shutter created by the processor's polling interval; the result is the same.
 
Again, it's not film. Each pixel in the detector could be monitored over time (which is what you'd have to do anyway). Excise any spikes in your rate of change and splice the rest together.
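As a rough sketch of that per-pixel approach (again Python/NumPy; the frame cube, the clipping level and the per_pixel_despike name are all invented for illustration): each pixel's time series is compared against its own median, and samples that spike well above it are masked before the survivors are averaged.

[CODE=python]
import numpy as np

def per_pixel_despike(cube, k=5.0):
    """For each pixel, drop the time samples that spike well above that
    pixel's own baseline, then average whatever survives.

    cube : ndarray of shape (n_frames, height, width) of short exposures,
           assumed calibrated; k is the clipping level in MAD units
           (an arbitrary choice for this sketch).
    """
    baseline = np.median(cube, axis=0)                            # per-pixel typical value
    scatter = np.median(np.abs(cube - baseline), axis=0) + 1e-12  # per-pixel spread (MAD);
                                                                  # real code would floor this at the read noise
    spikes = cube > baseline + k * scatter                        # satellite / aircraft hits
    clean = np.where(spikes, np.nan, cube)                        # mask the spikes
    return np.nanmean(clean, axis=0)                              # average the surviving samples
[/CODE]

Whether that average is still deep enough to pull a very faint target above the noise is exactly the threshold question the following posts argue about.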
 
The process of capturing light is similar enough for chemical and electronic sensors. A strong signal will swamp a weak one in its vicinity. Processing afterwards will not recover what you failed to record in the first place.
 

Unless the light reflected from a satellite can travel back in time it won't.
 
@sferrin: I think you nailed it, especially since such dynamic filters already exist, albeit in reverse (visual target tracking on a sensor).
 
Unless the light reflected from a satellite can travel back in time it won't.

No, the sensors aren't magical. We don't have single photon detectors, so you can't buffer the arrival of each individual photon and its source and then throw away the ones you don't want.

Electronic sensors can be thought of as an array of tiny buckets which have to be emptied to be read and which will only register as having something in them after a certain threshold is exceeded (say, the bottom is covered). So buffering just keeps resetting all the buckets to zero. Really dim objects generate tiny amounts of input and so may not cover the bottom of the bucket for hours. Naturally, someone going past with a firehose during that hour will likely completely screw up the detection of those dim objects (or the tiny variations in brightness which provide details of structure, etc.).
 
For what you describe something would have to be keeping track of how much of the "bucket bottom" is full, no? Yes, buffering would reset them to zero, but if you knew how much of the bucket bottom was covered before you reset it you'd just add that to what you capture after the reset.
 
No, and yes, my analogy is woefully inadequate. :D

No, the bucket only registers if the bottom is covered; that's the threshold sensitivity. If it could detect 1/100 of the bottom being covered then that would be a sensor that was 100 times more sensitive to input (but then you'd still have the same problem, only at 1/100 of the original threshold :) ).
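The bucket picture can be put into rough numbers. A back-of-the-envelope sketch (every rate below is a made-up example value, not a measurement): each readout of the sensor adds read noise, so slicing one long exposure into many short sub-exposures costs signal-to-noise even before any frames are thrown away.

[CODE=python]
import math

# Made-up example values, for illustration only.
source_rate = 0.05    # photo-electrons per second from the faint target
sky_rate    = 2.0     # photo-electrons per second from the sky background
read_noise  = 5.0     # electrons of read noise added by every readout
total_time  = 3600.0  # one hour of total integration

def snr(n_subframes):
    """Signal-to-noise of the co-added stack when the hour is split into
    n_subframes equal sub-exposures (simple CCD noise model)."""
    signal = source_rate * total_time
    shot_noise_sq = (source_rate + sky_rate) * total_time  # photon noise from target + sky
    read_noise_sq = n_subframes * read_noise ** 2          # one read-noise hit per sub-exposure
    return signal / math.sqrt(shot_noise_sq + read_noise_sq)

for n in (1, 60, 3600):
    print(f"{n:5d} sub-exposure(s) -> SNR ~ {snr(n):.2f}")
# Roughly: 1 -> 2.1, 60 -> 1.9, 3600 -> 0.6 with these made-up numbers.
[/CODE]

With these example numbers the target is marginally there in one long exposure but sinks into the read noise once the hour is diced into one-second frames, which is the "bucket never fills" problem in different words.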
 
Basically, that's what I've been saying. Monitor each "bucket" every 0.xx seconds and record the information. Then add up your information over time, tossing out the times that whole fields get blown out.
 
