
Physicists in Vienna staged how a cube and a sphere would look to a camera while racing at up to 99.9 percent of light speed. They used lasers and ultrafast cameras to build a single snapshot from many thin slices, revealing a rotation illusion, not a squashed shape.
The result turns a classic prediction into something you can actually see. It shows what a camera would record at near light speed, in a small lab.
The work was led by Dominik Hornof at Vienna University of Technology (VUT). His research focuses on optical methods that make relativistic motion visible with simple tools.
In 1959, Roger Penrose showed that a fast object should look rotated in a snapshot, not flattened. James Terrell independently reached the same conclusion that year.
The Terrell-Penrose effect, a rotation illusion produced by differing light arrival times, predicts that a sphere keeps its circular outline in a true snapshot. It also says the image reveals slivers of the far side that normally hide from view.
A camera collects light that arrives together, so earlier light from the back mixes with later light from the front. The geometry of that timing shifts the picture in a way that looks like a rotation.
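To put a number on that timing mix, consider a hypothetical cube of side L moving at a fraction β of light speed: light from the back face travels an extra distance L to reach the camera, so it left L/c earlier, and in that head-start time the cube advances by βL. A minimal sketch of the arithmetic (the cube size and speed here are illustrative, not values from the study):

```python
C = 299_792_458.0  # speed of light, m/s

def back_face_shift(side_m: float, beta: float) -> float:
    """Distance the object moves while back-face light catches up.

    Light from the back face travels an extra side_m meters, so it was
    emitted side_m / C earlier; in that interval the object, moving at
    beta * C, covers beta * side_m.
    """
    lead_time = side_m / C           # head start of back-face light, s
    return beta * C * lead_time      # = beta * side_m

# Illustrative 1 m cube at 0.8 c: the snapshot shows a ~0.8 m sliver
# of the rear face, which reads as a rotation, not a squash.
print(back_face_shift(1.0, 0.8))
```

The punchline is that the sliver's width depends only on β and the cube's size, not on the camera, which is why the rotation illusion is so robust.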
They synthesized a snapshot from thin slices of reflected light. Each slice recorded only the light that returned during a carefully chosen instant.
Each pulse hit a stationary object, and a gated camera, a sensor that opens only for tiny time windows, captured a razor-thin slice. After each slice, they nudged the object forward by the distance it would have covered at the chosen speed.
Each exposure lasted 300 picoseconds, or 300 trillionths of a second, short enough to freeze light’s path into a thin slice. To mimic 0.8 times light speed, they stepped a 3-foot cube forward about 1.9 inches between slices.
For a sphere at 99.9 percent, the shift was about 2.4 inches per step, which let the final image show hints of its far side. The stitched image of the cube looked turned rather than shrunken along its motion.
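As a rough consistency check, suppose each step obeys Δx = β · c · Δt, where Δt is the interval of simulated flight time each slice represents (which need not equal the 300-picosecond gate duration). Back-computing Δt from the step sizes quoted above, a sketch using only unit conversions, not figures from the paper:

```python
C = 299_792_458.0   # speed of light, m/s
INCH = 0.0254       # meters per inch

def implied_slice_time(step_inches: float, beta: float) -> float:
    """Time interval a step of step_inches represents at speed beta * c."""
    return (step_inches * INCH) / (beta * C)

# Cube at 0.8 c, stepped about 1.9 inches per slice:
dt_cube = implied_slice_time(1.9, 0.8)      # roughly 2e-10 s (~200 ps)
# Sphere at 0.999 c, stepped about 2.4 inches per slice:
dt_sphere = implied_slice_time(2.4, 0.999)  # also roughly 2e-10 s

print(dt_cube, dt_sphere)
```

Both quoted speeds imply roughly the same interval per slice, around 200 picoseconds, which suggests a single timing scheme behind both measurements, though the paper's exact parameters may differ.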
A fast object is physically shortened along its direction of travel, a Lorentz contraction, the shortening predicted by special relativity and confirmed in particle experiments.
The camera does not directly record that shortening, because light from different points left at different times.
In special relativity, the physics of high-speed motion, a true snapshot collects photons that arrive together at the sensor. That timing shifts the apparent positions so the object looks rotated rather than squeezed.
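The textbook way to see why rotation wins out: for a small object at closest approach, the snapshot shows a sliver of the rear face of apparent width βL (the light-delay effect above) next to a side face Lorentz-contracted to L√(1−β²). Those two widths are exactly what a rigid cube rotated by θ = arcsin(β) would project. A sketch with an illustrative β, not a value tied to the study:

```python
import math

def terrell_projection(beta: float):
    """Apparent face widths of a unit cube in a true snapshot.

    The rear face shows a sliver of width beta (light-delay effect);
    the side face is Lorentz-contracted to sqrt(1 - beta**2).
    Together these match a rigid rotation by theta = arcsin(beta).
    """
    rear = beta
    side = math.sqrt(1.0 - beta**2)
    theta = math.asin(beta)
    # A cube rotated by theta projects sin(theta) of one face and
    # cos(theta) of the other -- identical to (rear, side).
    assert math.isclose(math.sin(theta), rear)
    assert math.isclose(math.cos(theta), side)
    return rear, side, math.degrees(theta)

print(terrell_projection(0.8))  # widths 0.8 and 0.6, about 53 degrees
```

The contraction is still there, hidden inside the cos θ term; the camera just repackages it as a turn.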
Researchers explained that the rotation isn’t a physical motion but a way to represent timing more clearly. They noted that the simplicity of the approach makes the concept easy to understand without requiring complex mathematics.
The clean rotation picture assumes parallel-ray projection, meaning the object is much smaller than its distance from the camera, so incoming light rays are nearly parallel. At close range the sphere and cube pick up mild distortions that the team also simulated.
The camera’s image intensifier used a microchannel plate, an array of tiny channels that multiply electrons, to catch faint photon signals without smearing time. That hardware enabled 300-picosecond gates with a workable signal-to-noise ratio.
Illumination geometry mattered, and a small tilt needed for even lighting made one measured axis slightly longer in the sphere image. The authors called this an artifact of the setup rather than a feature of relativity.
The method did not accelerate any object; it stitched slices from a still target with careful timing. That is why the team could test light-speed optics safely inside a standard room.
The same technique could help visualize well-known relativity scenarios, such as the train thought experiment that illustrates the constant speed of light.
It could also be used to study how motion alters shadows, reflections, and even the apparent order of events.
Researchers can turn diagrams into moving sequences that reveal how snapshots form when every photon must arrive together.
That kind of visual explanation makes complex timing effects tangible without relying on new equations.
Because the setup uses commercial lasers, cameras, and a motion stage, it remains accessible for experimental labs and research demonstrations. It opens the door to testing visual predictions that have mostly existed only in theory.
The team even animated the synthesized frames to slow apparent light to a walking pace, making cause and effect easier to observe frame by frame.
The images show relativistic motion, motion close to light speed, in a way that rewards patience and careful thinking. It is a physics story about timing, not a trick of software.
A snapshot made this way shows parts of a shape that would otherwise hide, because the back sent its light earlier. That is the thread that ties every image in this study together.
A sphere stayed a circle, a cube looked turned, and both matched the equations within the limits described. Seeing theory and experiment agree at room scale is the quiet headline here.
Results like this will keep reshaping how modern physics is taught from high school onward. It is a clear win for simple tools used with clever timing.
The study is published in Nature.
