A U.K. woman was photographed standing in front of a mirror in which her reflections didn’t match her pose, but not because of a glitch in the Matrix. Instead, it’s a simple iPhone computational photography mistake.

  • e0qdk@kbin.social · 225 up / 14 down · 10 months ago

    This story may be amusing, but it’s actually a serious issue if Apple is doing this and people aren’t aware of it, because cellphone imagery is used in things like court cases. The relative positions of people in a scene really fucking matter in those situations. Someone’s photo of a crime could be dismissed or discredited using this exact news story as an example – or worse, someone could be wrongly convicted because the composite produced a misleading representation of the scene.

    • falkerie71@sh.itjust.works · 59 up / 14 down · 10 months ago

      I see your point, though I wouldn’t take it that far. It’s an edge case that has to happen within a very short time window.
      Similar effects can be achieved with traditional cameras that have a rolling shutter.
      If you’re only concerned about the relative positions of different people during a given time frame, I don’t think you need to be that worried. Being aware of it is enough.

      • Decoy321@lemmy.world · 18 up · 10 months ago

        We might be exaggerating the issue here. Fallibility has always been an issue with court evidence. Analog photos can be doctored too.

        • curiousaur@reddthat.com · 5 up / 1 down · 10 months ago

          Sure, but smartphones now automatically doctor every photo you take. The person who took the photo might not even know it was doctored and may think it represents the truth.

          • Decoy321@lemmy.world · 1 up · 10 months ago

            Fair point, but I still think we’re exaggerating the amount of doctoring being done by the phones. There’s always been some level of discrepancy between real-life subjects and the images taken of them.

            It’s just a tool creating media from sensor data. Those sensors aren’t the same as our eyes, and their processors don’t hold a candle to our own brains.

            In the interest of not rambling, let’s look back at early black-and-white cameras. When people looked at those photos, did they assume the world was black and white? Or did they recognize that as a characteristic of the camera?

      • ElderWendigo@sh.itjust.works · 17 up · edited · 10 months ago

        All digital photography is computational. I think the word you’re looking for is composite, not computational.

        • NotSoCoolWhip@lemmy.world · 4 up / 1 down · 10 months ago

          Unless the dude is saying only film should be admissible, which doesn’t sound all that bad.

          • ElderWendigo@sh.itjust.works · 8 up · 10 months ago

            Film is also subject to manipulation in the development stage (e.g. dodging and burning), even if you avoid compositing. Photographic honesty has been an open and active philosophical debate since the medium’s inception. It’s not like you can draw a line in the sand and make a blanket call on admissibility, although I’m sure established guidelines would help. Ultimately, it’s an argument about the validity of evidence that needs to be made on a case-by-case basis, and the manipulations involved need to be fully identified and accounted for in those discussions.

      • Blackmist@feddit.uk · 9 up · 10 months ago

        With all the image manipulation and generation tools available to even amateurs, I’m not sure how any photography is admissible as evidence these days.

        At some point there’s going to have to be a whole bunch of digital signing (and timestamped signatures) going on inside the camera for photos to even be considered.
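        Roughly the kind of thing I mean, as a sketch: bind the pixel data to a capture time and sign both with a key that lives in the camera. This is just an illustration in Python using the pyca/cryptography library; the device key, the trusted clock, and the manifest format here are placeholders, not any real camera’s firmware. (It’s basically the idea behind the C2PA / Content Credentials effort.)

        ```python
        # Illustrative in-camera signing sketch. Assumes an Ed25519 device key kept
        # in secure hardware; here it is simply generated in memory.
        import hashlib, json, time
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        device_key = Ed25519PrivateKey.generate()        # stand-in for a factory-provisioned key

        def sign_capture(image_bytes: bytes) -> dict:
            """Hash the image, record a capture timestamp, and sign both together."""
            manifest = {
                "sha256": hashlib.sha256(image_bytes).hexdigest(),
                "captured_at": time.time(),              # would come from a trusted clock
            }
            payload = json.dumps(manifest, sort_keys=True).encode()
            return {"manifest": manifest, "signature": device_key.sign(payload).hex()}

        def verify_capture(image_bytes: bytes, record: dict, public_key) -> bool:
            """Court-side check: the hash must match and the signature must verify."""
            if hashlib.sha256(image_bytes).hexdigest() != record["manifest"]["sha256"]:
                return False
            payload = json.dumps(record["manifest"], sort_keys=True).encode()
            try:
                public_key.verify(bytes.fromhex(record["signature"]), payload)
                return True
            except Exception:
                return False
        ```

        Any edit after capture changes the hash, so the signature stops verifying. The hard parts are key management and trusting the in-camera clock, not the math.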

      • Ook the Librarian@lemmy.world · 24 up · 10 months ago

        This was important in the Kyle Rittenhouse case. The zoom resolution was interpolated by software. It wasn’t AI per se, but because a jury couldn’t be relied upon to understand a black box algorithm and its possible artifacts, the zoomed video was disallowed.

        (this in no way implies that I agree with the court.)
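        To be concrete about what “interpolated” means: whenever software enlarges footage, it has to invent the pixel values between the ones the sensor actually recorded. Here’s a toy version of the idea, plain-Python bilinear upscaling, which is not whatever the playback device in court was actually doing:

        ```python
        # Toy illustration: "zooming" a 2x2 image to 4x4 by bilinear interpolation
        # invents in-between pixel values that were never captured by the sensor.
        def bilinear_upscale(img, factor):
            h, w = len(img), len(img[0])
            out_h, out_w = h * factor, w * factor
            out = []
            for y in range(out_h):
                sy = y * (h - 1) / (out_h - 1) if out_h > 1 else 0   # map back to source coords
                y0, y1 = int(sy), min(int(sy) + 1, h - 1)
                fy = sy - y0
                row = []
                for x in range(out_w):
                    sx = x * (w - 1) / (out_w - 1) if out_w > 1 else 0
                    x0, x1 = int(sx), min(int(sx) + 1, w - 1)
                    fx = sx - x0
                    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
                    bottom = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
                    row.append(top * (1 - fy) + bottom * fy)
                out.append(row)
            return out

        # A 2x2 "sensor" frame blown up to 4x4: 12 of the 16 output pixels are estimates.
        print(bilinear_upscale([[0, 100], [100, 200]], 2))
        ```

        The estimates are reasonable, but they’re still values the camera never recorded, which is exactly the kind of thing a lawyer can attack.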

        • Rob T Firefly@lemmy.world · 3 up · 10 months ago

          The zoom resolution was interpolated by software. It wasn’t AI per se

          Except it was. All the “AI” junk being hyped and peddled all over the place as a completely new and modern innovation is really just the same old interpolation by software, albeit software fueled by bigger databases and with more computing power thrown at it.

          It’s all just flashier autocorrect.

          • Ook the Librarian@lemmy.world · 1 up · 10 months ago

            As far as I know, nothing about AI entered into the arguments. No precedents regarding AI could have been set here. Therefore, this case wasn’t about AI per se.

            I did bring it up as relevant because, as you say, AI is just an over-hyped black box. But that’s my opinion, with no case law to cite (IANAL). So to say that a court would or should treat AI and fancy photo editing as the same thing is misleading. I know that wasn’t your point, but it was part of mine.

        • wagoner@infosec.pub · 2 up · 10 months ago

          I watched that whole court exchange live, and it helped the defendant’s case that the judge was computer illiterate.

          • Ook the Librarian@lemmy.world · 2 up · 10 months ago

            As it usually does. But the court’s ineptitude should favor the defense. It shouldn’t be an arrow in a prosecutor’s quiver, at least.

    • Jarix@lemmy.world · 12 up / 1 down · 10 months ago

      This isn’t an issue at all; it’s a bullshit headline. And it worked.

      This is the result of shooting in panorama mode.

      In other news, the sky is blue
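      For anyone curious about the mechanism: a panorama (and similar multi-capture modes) is stitched together from pieces captured at slightly different moments, so anything that moves between captures can end up frozen in two different poses in the final image. A toy model of the stitch in Python; this is only an illustration, not Apple’s actual pipeline:

      ```python
      # Toy panorama stitch: take the i-th column slice from the i-th frame.
      # Anything that moved between frames appears differently in different slices.
      def stitch_panorama(frames):
          height = len(frames[0])
          width = len(frames[0][0])
          slice_w = width // len(frames)
          pano = [[] for _ in range(height)]
          for i, frame in enumerate(frames):
              for y in range(height):
                  pano[y].extend(frame[y][i * slice_w:(i + 1) * slice_w])
          return pano

      # Frame at t0: the subject's "arm" (value 9) is on the left.
      # Frame at t1, a beat later: the arm has moved to the right.
      frame_t0 = [[9, 0, 0, 0]]
      frame_t1 = [[0, 0, 0, 9]]
      print(stitch_panorama([frame_t0, frame_t1]))   # [[9, 0, 0, 9]]: the arm shows up twice
      ```

      Each slice freezes a different instant, so a single moving subject can appear in two poses at once, which is the effect seen with the mirror reflections.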