Samsung is reportedly working on a new smartphone camera sensor that acts as an approximate global shutter, which would significantly reduce motion skew and “jello” artifacts in photos and video.
The sensor is said to be a 12MP unit with 1.5µm pixels and, according to a report in Sisa Journal that cites an anonymous source at Samsung Electronics, is set to appear in future Galaxy flagships.

It’s not a true global-shutter sensor, but the design aims to approach the performance of a global shutter by combining a new pixel structure with an optical-flow algorithm that compensates for motion. Should it make its way to future Galaxy devices, it would be the most significant change to Samsung’s camera stack since the mass adoption of 200MP sensors.
Why Global Shutter Matters on Smartphones
Most phone cameras use rolling shutters, which expose rows of pixels one after another. When a subject moves quickly, or when the camera pans, the timing gap between the top and bottom of the frame produces skewed shapes, wobble, and banding under artificial light. On mobile sensors, a full-frame readout typically takes tens of milliseconds, more than enough time to bend straight lines and warp the shape of spinning wheels or propellers.
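As a rough back-of-the-envelope illustration of why readout time matters (the numbers below are assumptions, not figures from the report), the lean of a vertical edge is roughly the subject’s horizontal speed across the frame multiplied by the full readout time. A minimal Python sketch:

```python
# Rolling-shutter skew, back-of-the-envelope (illustrative values, not Samsung specs).
readout_time_s = 0.025        # assumed full-frame readout: 25 ms
frame_height_px = 3000        # assumed number of sensor rows
pan_speed_px_s = 4000         # assumed horizontal motion of the subject, pixels/second

row_delay_s = readout_time_s / frame_height_px   # delay between adjacent rows
skew_px = pan_speed_px_s * readout_time_s        # horizontal offset between top and bottom rows

print(f"Per-row delay: {row_delay_s * 1e6:.1f} µs")
print(f"A vertical edge leans by roughly {skew_px:.0f} px across the frame")
# With a global shutter, all rows sample the same instant, so the lean disappears.
```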
With a global shutter, every pixel is exposed at the same moment, freezing motion cleanly. For everyday photographers, that means crisper sports shots, kids and pets without stretched limbs, and steadier video whether you’re panning quickly or shooting from a moving vehicle. It also reduces the horizontal flicker bands that LED lighting produces, an artifact that has long plagued gymnasiums and arenas.
What Samsung Reportedly Built for Future Cameras
According to Sisa Journal, Samsung’s new 12MP sensor keeps a rolling-shutter architecture but changes both the pixel layout and the signal path. Each 2×2 pixel block gets its own analog-to-digital converter (ADC), so four pixels share one converter. The source describes hybrid behavior: inside each 2×2 block, readout still proceeds sequentially like a rolling shutter, while the frame as a whole is handled in a way that resembles simultaneous exposure.
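To see why an ADC per 2×2 block narrows the gap to a global shutter, compare how readout time scales: with conventional column-parallel ADCs the frame converts one row at a time, while per-block converters work in parallel and each handles only four pixels. The sketch below uses assumed conversion times, not figures from the report:

```python
# Readout-time comparison under assumed numbers (not from the Sisa Journal report).
rows, cols = 3000, 4000      # assumed 12MP pixel layout
t_conv_s = 5e-6              # assumed time for one ADC conversion: 5 µs

# Column-parallel ADCs: one converter per column, so rows are converted sequentially.
t_column_parallel_s = rows * t_conv_s

# One ADC per 2x2 block: all blocks convert in parallel,
# so each converter performs only four conversions per frame.
t_block_parallel_s = 4 * t_conv_s

print(f"Column-parallel readout: {t_column_parallel_s * 1e3:.0f} ms")  # ~15 ms
print(f"Per-block readout:       {t_block_parallel_s * 1e6:.0f} µs")   # ~20 µs
```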
Moving the ADC into the pixel array shortens the analog signal path, which could reduce read noise and speed up conversion, both of which help cut skew. Samsung is also said to pair the hardware with an optical-flow algorithm that tracks motion vectors across the frame and corrects for them on the fly. Coupled with a 1.5µm pixel pitch for better light gathering, the sensor is designed to deliver cleaner motion capture without the resolution penalties typically associated with true global-shutter designs.
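The report doesn’t describe the algorithm itself, but a generic version of flow-based skew correction can be sketched with OpenCV: estimate dense motion between consecutive frames, then warp each row back along a fraction of that motion proportional to how late it was read. This is a simplified stand-in under assumed parameters, not Samsung’s implementation:

```python
import cv2
import numpy as np

def compensate_skew(prev_gray, curr_gray, readout_fraction=0.5):
    """Illustrative flow-based skew compensation (hypothetical, not Samsung's algorithm).

    prev_gray, curr_gray: consecutive frames as uint8 grayscale arrays.
    readout_fraction: assumed share of frame time spent reading out rows.
    """
    # Dense optical flow: per-pixel (dx, dy) motion from the previous frame to the current one.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    h, w = curr_gray.shape
    # Later rows were read later and so moved further; scale the correction
    # linearly with row index (a crude rolling-shutter model).
    row_weight = (np.arange(h, dtype=np.float32) / h * readout_fraction)[:, None]

    grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                 np.arange(h, dtype=np.float32))
    map_x = grid_x + flow[..., 0] * row_weight
    map_y = grid_y + flow[..., 1] * row_weight

    # Resample the current frame so every row approximates the same capture instant.
    return cv2.remap(curr_gray, map_x, map_y, cv2.INTER_LINEAR)
```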

Where It Fits Within Samsung’s Galaxy Device Lineup
At 12MP, this sensor isn’t likely to replace the 50MP-plus and 200MP main cameras found in premium Galaxy phones. The smarter fit is the ultrawide or the 3× telephoto module, where motion-induced skew in video and action shots is most visible. An ultrawide with a quasi-global shutter could stabilize fast pans, and a 3× tele module would better freeze athletes, vehicles, or wildlife.
On the processing side, the ISPs in the Exynos and Snapdragon platforms that power Galaxy flagships already fuse optical stabilization, electronic stabilization, and AI-driven scene analysis. A sensor that outputs motion-compensated data would give those pipelines a cleaner input, potentially enabling faster bursts, better subject tracking, and 4K video with fewer motion artifacts.
Rivals and Industry Context for Smartphone Sensors
True global-shutter sensors exist, but they’re typically reserved for industrial or automotive applications, because the per-pixel storage nodes and extra circuitry they require cut light-capture efficiency and limit resolution. In consumer phones, Sony and OmniVision are pushing faster rolling-shutter readouts to reduce skew at the source, while Apple and Google lean on multi-frame fusion and advanced stabilization to mask motion artifacts. A hybrid approach that delivers global-like motion handling without giving up resolution would be the key point of differentiation for Samsung’s ISOCELL portfolio.
The trade-offs are concrete: packing in all of those ADCs and running real-time motion modeling adds design complexity, draws more power, and generates more heat than a conventional readout. The payoff, if done right, is footage with the global-shutter motion of high-end cinema cameras: fewer leaning buildings, cleaner LED lighting, and a more natural motion cadence.
What to Watch Next for Samsung’s Imaging Roadmap
Samsung has not formally confirmed the sensor or its branding, but reports suggest it will appear in a future flagship series or foldable device. Mobile sensor programs typically take months to move from sampling to mass production, so a debut alongside the big Galaxy launch slated for Q2 next year would fit industry timescales.
Look for clues such as an ISOCELL announcement focused on motion capture, mentions of embedded ADCs or “optical flow” capture, and marketing language around anti-flicker or anti-jello video. If this sensor shows up in an ultrawide or a 3× module, it could quietly become the feature that makes fast action look effortless in the next wave of Galaxy flagships.
