would such a hypothetical mass defy the expansion of the universe to such a degree that, despite the space still expanding between the stars and galaxies, the galaxies within Attractor wouldn't lose photonic sight of one another in the far, far future as the universe expands as per Hubble's Law?
Yes, it would. On very large scales the distribution of matter in the universe appears uniform, and space undergoes metric expansion. But on smaller scales, where the distribution of matter is less uniform, denser regions collapse under their own gravity. The motions of stars within galaxies, and even of galaxies within clusters, are governed by the local gravitational field rather than by the cosmological expansion, so such galaxies never get pulled apart.
If dark energy remains constant (as for a cosmological constant), which we expect, then in the very distant future the proper distance to the cosmic event horizon will be about 17 billion light years. That will then be the maximum possible size of any gravitationally bound system in the universe. Right now, the largest bound clusters are a few hundred million light years across, though measuring the exact value is tricky.
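The ~17 billion light year figure can be sketched with a quick back-of-the-envelope computation. In a universe dominated by a cosmological constant, the Hubble rate settles to H∞ = H₀√Ω_Λ and the event horizon approaches the de Sitter value c/H∞. The parameter values below are illustrative Planck-like numbers, not authoritative:

```python
# Sketch: asymptotic cosmic event horizon for a Lambda-dominated universe.
# H0 and Omega_Lambda are assumed, Planck-like illustrative values.

H0 = 67.7                  # Hubble constant, km/s/Mpc (assumed)
omega_lambda = 0.69        # dark-energy density parameter (assumed)

c = 299_792.458            # speed of light, km/s
Mpc_in_ly = 3.2616e6       # light years per megaparsec

# In the far future H(t) -> H0 * sqrt(Omega_Lambda); the event horizon
# then approaches the de Sitter distance d = c / H_infinity.
H_inf = H0 * omega_lambda**0.5              # km/s/Mpc
d_horizon_Mpc = c / H_inf                   # Mpc
d_horizon_Gly = d_horizon_Mpc * Mpc_in_ly / 1e9

print(f"Asymptotic event horizon: about {d_horizon_Gly:.0f} billion light years")
```

With these inputs the result comes out near 17 billion light years, consistent with the figure quoted above.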
Attractor - could a similar gravitational anomaly of far greater mass hypothetically pull all the galaxies of the known universe to its core?
No, for a subtle reason. The observable universe is much larger (46 billion light years in proper distance) than the distance to the cosmological event horizon (about 16 billion light years). We receive light (emitted a long time ago) from galaxies that are now beyond that event horizon, but we cannot send signals to them now and have those signals ever reach them. Likewise, we cannot reach them ourselves, nor can their gravitation ever pull us there. The space in between is expanding too quickly.
If we imagine some very massive cluster (or something like it) beyond our cosmic event horizon, then we would retain some leftover velocity towards it from earlier times, when it was still inside our event horizon and could influence us. This is one proposed explanation for "dark flow". But again, we will never meet that source, nor will any other galaxy that has slipped beyond its cosmic horizon.
If we imagine further that there was some stupendously massive object, one whose mass made up the majority of the observable universe, then this would change the geometry of spacetime itself. Its gravitational field would dominate over the expansion everywhere, and everything would fall into it.
An interesting theory. Do any cosmological observations agree with it?
It has been tested in a few ways, but observations do not so much "agree" as rule out certain mass distributions, such as populations of black holes. This is now so tightly constrained that some authors argue we can be confident at least half of the dark matter cannot be explained by primordial black holes, because of upper limits on their numbers placed by observations of gravitational microlensing events. Basically, if there were many such black holes, we would expect them to occasionally pass in front of distant supernovae and produce an anomalous brightening. We see no sign of that.
Dark matter as primordial black holes is a hypothesis going back decades, but it grows weaker as more and more of the "possibility space" for such black holes is ruled out by observations, and it has so far had no predictive successes. Meanwhile, we have only grown more confident that dark matter, as some sort of weakly interacting massive particle, does exist.
So, my question: can the wavelengths of light waves also expand or stretch infinitely?
In principle there is no upper limit to the wavelength of a photon. This is because sources can oscillate with arbitrarily low frequency, and the Planck distribution has no lower cutoff. There is also no limit to how much a photon can be stretched: black hole horizons, the cosmological horizon, and even the expansion of the universe itself can generate infinite redshift.
Infinite redshift of a photon of course means its energy tends to zero. But it also means the photon is "delocalized", in the sense that if an instrument could detect it (as a photon strike), it could be detected anywhere, because the photon's wave function spreads over all of space. The catch is that the size of the detector must also tend to infinity to have any fair chance of catching it.
All of that is theory, but in practice there's no way for arbitrarily long photons to exist.
Let's use emission from black holes as an example.
Main idea: the closer to the event horizon a photon is emitted, the longer its wavelength will be when detected far away. Seems straightforward. However, we cannot say with arbitrary precision how close to the event horizon the photon was emitted. The uncertainty principle tells us that to constrain the distance from the horizon to Δx, the uncertainty in momentum (here, the momentum of the photon) must be at least ℏ/(2Δx). The momentum of a photon is p = E/c, and a photon's energy and wavelength are related by E = hc/λ. Go through the algebra, and we find that the wavelength (in the source frame) must be smaller than 4πΔx. What a simple and logical result! The closer to the horizon the photon was emitted, the shorter its wavelength must be, and they follow a direct proportionality. The wavelength of the light is itself the limit for the measurement.
(Aside: we can reach the exact same result by going through the energy-time uncertainty principle, and arguing that the time uncertainty, Δt, must be Δx/c. Also we may as well throw away the factor of 4π and say that the wavelength is equal to the distance Δx.)
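The algebra above is short enough to verify numerically. This sketch just walks the stated chain, Δp ≥ ℏ/(2Δx) and λ = h/p, and confirms it reproduces the λ ≤ 4πΔx bound:

```python
import math

# Sketch of the uncertainty-principle bound from the text: localizing a
# photon's emission point to within dx forces a minimum momentum
# uncertainty hbar/(2*dx); via p = h/lambda that caps the source-frame
# wavelength at 4*pi*dx.

def max_wavelength(dx):
    """Upper bound on source-frame wavelength for emission localized to dx."""
    return 4 * math.pi * dx

# Check the derivation explicitly, in SI units:
hbar = 1.054571817e-34        # J*s
h = 2 * math.pi * hbar
dx = 1.0                      # metres
p_min = hbar / (2 * dx)       # minimum momentum uncertainty
lam = h / p_min               # wavelength corresponding to that momentum
assert math.isclose(lam, max_wavelength(dx))

print(f"Max source-frame wavelength for dx = 1 m: {max_wavelength(dx):.2f} m")
```

For Δx = 1 m this gives 4π ≈ 12.6 m, and dropping the 4π (as in the aside) leaves just λ ≈ Δx.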
Now we'll compute how much this photon of wavelength λ is redshifted as it climbs away to infinity from its source at Δx above the horizon. For a fixed distance from the horizon, the redshift is more extreme for larger black hole masses. If we consider the black hole Sgr A*, with 4 million solar masses, then photons we know to have been emitted from within 1 m above the horizon must have a source-frame wavelength shorter than about 1 m. The gravitational redshift factor 1 m above the horizon is around 100,000, so the wavelength observed very far away must be shorter than about 100 km. Counter-intuitively, the smaller we make Δx here, the smaller the wavelength will be when seen far away: at 1 cm from the horizon, the maximum wavelength detected far away is about 10 km.
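A rough check of those numbers, under the same assumptions as the text: source-frame wavelength λ ≈ Δx (dropping the 4π), and the near-horizon redshift factor ≈ √(r_s/Δx), so the observed bound is √(r_s·Δx), which indeed shrinks with Δx. The Sgr A* mass is the assumed ~4 million solar masses:

```python
import math

# Sketch: maximum far-away wavelength for photons emitted within dx of
# the Sgr A* horizon, assuming lambda_source <= dx and the near-horizon
# gravitational redshift factor sqrt(r_s/dx) (valid for dx << r_s).

G = 6.674e-11          # m^3 kg^-1 s^-2
c = 2.998e8            # m/s
M_sun = 1.989e30       # kg

M = 4.0e6 * M_sun               # Sgr A* mass (assumed ~4 million M_sun)
r_s = 2 * G * M / c**2          # Schwarzschild radius, about 1.2e10 m

def observed_wavelength_bound(dx):
    """Max wavelength at infinity for a photon localized within dx of the horizon."""
    redshift_factor = math.sqrt(r_s / dx)   # near-horizon approximation
    return dx * redshift_factor             # equals sqrt(r_s * dx)

for dx in (1.0, 0.01):
    bound_km = observed_wavelength_bound(dx) / 1000
    print(f"dx = {dx*100:g} cm -> observed wavelength < {bound_km:.0f} km")
```

This reproduces the ~100 km bound for Δx = 1 m and the ~10 km bound for Δx = 1 cm quoted above.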
This sounds very backwards from what we're usually taught, that light emitted closer to the horizon is redshifted more. But that assumed the light could have an arbitrary wavelength at the source. The uncertainty principle forbids us from saying a 1 m wavelength photon was emitted 1 cm above the horizon. That wouldn't even make sense logically -- is 99% of that photon below the horizon? Or 99% of it at a distance greater than 1 cm? Either way we get a contradiction with the photon being observed at all, or with its having such a great redshift.
This (admittedly very simple and handwavy) argument using black hole horizons also generalizes fairly well for any sort of infinite redshift we may imagine. For cosmic expansion, the limit is placed by the size of the cosmological horizon and the amount of time the universe has existed.