A U.K. woman was photographed standing in front of a mirror in which her reflections didn’t match, but not because of a glitch in the Matrix. Instead, it’s a simple iPhone computational photography mistake.
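The likely mechanism: modern phone cameras capture a short burst of frames and merge them, sometimes drawing different regions of the final image from different frames. If the subject moves between frames, one photo can show two moments at once. Here is a toy Python sketch of that idea; the tile-based scoring and all the names are made up for illustration, not Apple's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three 8x8 "frames" from a burst; the pixel value stands in for
# which pose the subject held when that frame was captured.
burst = [np.full((8, 8), pose) for pose in (1, 2, 3)]

# Hypothetical per-tile "quality" scores (sharpness, noise, etc.).
tile = 4
scores = rng.random((len(burst), 8 // tile, 8 // tile))

# Merge: for each tile, copy pixels from whichever frame scored best there.
merged = np.zeros((8, 8), dtype=int)
for ty in range(8 // tile):
    for tx in range(8 // tile):
        best = int(np.argmax(scores[:, ty, tx]))
        region = (slice(ty * tile, (ty + 1) * tile),
                  slice(tx * tile, (tx + 1) * tile))
        merged[region] = burst[best][region]

# Different tiles now come from different instants in time.
print(np.unique(merged))  # e.g. [1 2 3]: one photo, several poses
```

If the "poses" here were a person raising an arm between frames, the merged result is exactly the mirror photo's effect: the subject and her reflection end up sampled from different moments.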
Fair point, but I still think we’re exaggerating how much doctoring these phones are actually doing. There’s always been some level of discrepancy between real-life subjects and the images taken of them.
It’s just a tool creating media from sensor data. Those sensors aren’t the same as our eyes, and their processors don’t hold a candle to our own brains.
In the interest of not rambling, let’s look back at early black-and-white cameras. When people looked at those photos, did they assume the world was black and white? Or did they recognize it as a characteristic of the camera?