Update July 4, 2024, 03:00 AM ET: A spokesperson from Samsung has told The Mac Observer that the report is false. The post has since been removed from The Elec.
A new report reveals Apple might be considering using advanced sensors from Samsung in the upcoming iPhones. According to The Elec, Apple is trying out Samsung’s advanced CMOS image sensors (CIS), which, if things go well, could appear in the iPhone 16, slated to launch this fall.
The report states that Apple is currently in the final stage of testing Samsung’s CIS, which could be included in the primary camera of the upcoming iPhone 16. Traditionally, Apple has partnered with Sony, but recent concerns over reliability and other factors have led Apple to explore additional partners.
If Samsung’s sensors meet Apple’s quality standards, it will mark the first time Samsung supplies this crucial component for an iPhone. It would also mean the iPhone 16 series relies on Samsung not only for advanced camera sensors but also for the advanced M14 display on the iPhone 16 Pro models and the M12 display on the iPhone 16 and 16 Plus.
What sets Samsung’s new image sensor apart from previous ones is its advanced three-wafer stack design, a departure from the two-stack setup used in current and previous iPhone cameras.
Here, each wafer has a specific role: one holds the photodiodes that convert light into electric signals, another contains the transistors that handle signal amplification and processing, and the third houses the analog-to-digital converter logic. By splitting these functions across three wafers, Samsung’s design allows for higher pixel density, less image noise, and smaller pixels overall.
Furthermore, Samsung’s latest image sensor utilizes wafer-to-wafer hybrid bonding, which connects wafers using copper pads instead of traditional signal-transferring bumps. As a result, it not only reduces the size of the image sensor but also enhances data transfer speed.