All our cosmic assumptions about the universe rely on the speed of light remaining relatively constant. This seems an unreliable assumption to me, given that we have only a tiny vantage point from which to conduct our measurements.
The theory: the speed of light (and therefore time) speeds up in proportion to distance from the center of the universe.
So the speed of light is a function of "absolute" distance from the center, rather than of the distance between objects.
I want to make one distinction: light is not "emanating" from the center; rather, the center is the point where the speed of light reaches zero.
Ok, say we are 186,000 light years from "A" (the center of the universe). Now draw a line between Earth and "A"; the line should pass through the "distant star" in the drawing. Let's pretend the "distant star" is exactly halfway between us and "A".
Within the rules of this theory, observers on the "distant star" would say that light travels at half the speed (93,000 mi/s) that we measure on Earth. Proportionally, they are also 93,000 light years from the center.
So on Earth, we look at the distant star and say, "it takes light 93,000 years to reach us from that star, and light travels at 186,000 mi/s, so that object is roughly 5.47 × 10^17 miles away."
And on the "distant star", where light travels at half the speed, they say, "it takes light 186,000 years to reach us from Earth, and light travels at 93,000 mi/s, so that object is roughly 5.47 × 10^17 miles away."
So both objects would calculate the same distance from one another, even though light is traveling at vastly different speeds.
(Note: this math is simplified, since in this theory the speed of light would be increasing continuously over distance, rather than jumping only at the two points in question.)
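The two-observer arithmetic above can be checked directly. A minimal sketch, assuming (as the example does) that each observer treats their own local light speed as holding for the whole trip; all numbers are the hypothetical ones from the example:

```python
# Toy check of the symmetric-distance claim, using the example's hypothetical numbers.
# Each observer assumes light travels everywhere at their own local speed.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

c_earth = 186_000        # local light speed on Earth, miles per second
c_star = 93_000          # local light speed on the "distant star", miles per second

t_earth_years = 93_000   # travel time Earth measures for light from the star
t_star_years = 186_000   # travel time the star's observers measure for light from Earth

# Each side multiplies its own light speed by its own measured travel time.
d_from_earth = c_earth * t_earth_years * SECONDS_PER_YEAR  # miles
d_from_star = c_star * t_star_years * SECONDS_PER_YEAR     # miles

print(d_from_earth == d_from_star)  # the two calculations agree
print(f"{d_from_earth:.3e} miles")
```

The halved light speed and the doubled travel time cancel exactly, which is why both observers land on the same figure of roughly 5.5 × 10^17 miles.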
So we are not "wrong" to say that light travels at 186,000 mi/s. But what would happen if we tried to calculate the distance between objects without knowing where the center of the universe was?
For one, an object off axis (not directly between us and the center, shown at "B") might sit at a given distance from the center of the universe while actually being nearer to (or further from) Earth than our calculation suggests.
It follows that the "distant star" might not be directly between us and the center. It could actually be anywhere along the line of "C". If the "distant star" is closer to "A" but off axis ("B"), it would in fact be further from Earth than what we would calculate with our existing method. Conversely, if the "distant star" were further from "A" than Earth is, it could actually be anywhere along "D".
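The off-axis ambiguity can be sketched with the law of cosines: a star at the same radius from the center but at an angle off the Earth-to-center axis really is farther from Earth than the on-axis calculation gives. The radii below are the hypothetical figures from the example:

```python
import math

# Hypothetical setup: Earth and the "distant star" at fixed radii from center "A".
r_earth = 186_000   # Earth's distance from the center, light years
r_star = 93_000     # star's distance from the center, light years

def earth_to_star(theta_deg):
    """Distance from Earth to a star at radius r_star, sitting theta degrees
    off the Earth-to-center axis, by the law of cosines."""
    theta = math.radians(theta_deg)
    return math.sqrt(r_earth**2 + r_star**2 - 2 * r_earth * r_star * math.cos(theta))

print(earth_to_star(0))    # on-axis: 93,000 ly, the naive in-between figure
print(earth_to_star(60))   # off-axis: noticeably farther from Earth
```

Any angle greater than zero increases the true Earth-to-star distance, which is exactly the ambiguity the "B", "C", and "D" cases describe.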
This reminds me of certain quantum properties like the uncertainty principle, not to mention the elegant helix that emerges when you plot the possible locations of celestial objects, which is interesting. But more so, the reason I like this theory is that it reconciles the experience of living inside of time with a picture of time as a singular, unified plane with linear measurements like the other dimensions.
When you measure distance to the center of the universe instead of to other objects, the calculated distance of every object in the universe to the center comes out exactly the same.
Approaching the absolute center, the point of zero light speed, light and time move so slowly that objects there have the same calculated distance to the center as the most distant objects: the greater distance of far-away objects is compensated by light traveling there far faster than it does on Earth.
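This can be made concrete in the toy model. If local light speed is proportional to radius, c(r) = k·r, and each observer (as in the simplified example) assumes light covers the whole trip at their local speed, then the light-travel time from the center is r / c(r) = 1/k, identical at every radius, so every observer reckons the center the same number of their own light years away. A sketch, with k chosen so Earth's numbers match the example:

```python
# Toy model: local light speed proportional to distance from the center, c(r) = k*r.
k = 1 / 186_000            # per year; gives c = 1 light year/year at Earth's radius

def local_c(r_light_years):
    """Local light speed at radius r, in light years per year."""
    return k * r_light_years

def travel_time_from_center(r_light_years):
    # Simplified as in the text: light assumed to move at the observer's
    # own local speed for the entire trip from the center.
    return r_light_years / local_c(r_light_years)   # always 1/k years

for r in (93_000, 186_000, 372_000):
    print(r, travel_time_from_center(r))   # same travel time at every radius
```

The radius cancels out of r / (k·r), so whether the observer sits at 93,000, 186,000, or 372,000 light years from the center, each one measures the center as 186,000 of their own light years away.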
So distance becomes a singular constant, unified with time and light. Distance, therefore, does not exist in its own right; it is a tool for measuring time.
I’d love to get an astrophysicist to say “hmm…”, or more likely, “here’s why you are an idiot”, so if you know anybody that might shed some light on this please pass it along.