A company that enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK’s privacy watchdog.

Last year, Clearview AI was fined more than £7.5m by the Information Commissioner’s Office (ICO) for unlawfully storing facial images.

Privacy International (who, I believe, helped bring the original case) responded to this on Mastodon:

"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview’s activities are entirely “related to the monitoring of behaviour” of UK data subjects.

In essence, what Clearview does is large-scale processing of a highly intrusive nature. On that, the Tribunal agreed.

BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn’t fall under UK GDPR jurisdiction.

So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical."

  • tillimarleen@feddit.de · 9 months ago

    so if I go to Britain, rob a home, take the loot out of the country and sell it there, it’s all good?

      • mozzribo@leminal.space · 8 months ago

        But as long as the data acquisition and storage happen on UK territory, isn’t it still illegal? Isn’t it like saying I’m robbing a bank, but since I wired the funds into a Swiss safe, I’m good?

    • ThenThreeMore@startrek.website · 9 months ago

      Only if you’re doing so in an official governmental capacity for your country.

      The article basically says they won the appeal because they only provide services to governments and law enforcement (having previously withdrawn their services to businesses after losing a lawsuit in the USA).

  • ThenThreeMore@startrek.website · 9 months ago

    So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical.

    While it’s extremely frustrating at the level of individual law, the article has a quote which makes perfect sense:

    it is not for one government to seek to bind or control the activities of another sovereign state

    If that weren’t a concept in law, any country could pass any law and expect it to apply internationally.

    • Waltzy@feddit.uk · 8 months ago

      Wouldn’t a UK court only concern itself with the activities of a company operating in the UK? If this company does not operate in the UK, I’m surprised it got far enough to need overturning.

      • OhNoMoreLemmy@lemmy.ml · 8 months ago

        Because it operates on the data of UK residents.

        The internet has made everything really weird in terms of jurisdictions. You can have photos of UK citizens taken in the UK and stored on a UK server, and if a company from somewhere else scrapes the data without permission and moves it out the UK, that doesn’t obviously mean that it’s now fine to use for whatever.

        Now of course the law has to have some jurisdictional limits, but it’s not surprising that there has been some disagreement about where they are.

  • ExtremeDullard@lemmy.sdf.org · 9 months ago

    but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical

    Let’s not forget one teeny tiny fact here: the people whose data Clearview can do whatever the hell it wants put it up online ALL BY THEMSELVES! Clearview scrapes the internet to find its material.

    I’ve refused to have my picture taken since 2000 under any circumstances - be it at work, in group photos in clubs, etc. The reason being, those photos invariably get uploaded somewhere, usually with a caption that says “From left to right: …”

    I’ve been called paranoid and batshit crazy since 2000. But guess what: Clearview doesn’t have my photo. Who’s having the last laugh now, eh?

    Clearview is a hateful turd of an outfit. It should be shut down for obscene immorality and its CEO can burn in hell. But let’s not forget that it exploits people’s carelessness. People’s data fuels the corporate surveillance economy and this has been public knowledge for more than a couple of decades. It should come as no surprise that somebody some day would attempt to match people’s faces with people’s names using the data people themselves provided.

    • ShortN0te@lemmy.ml · 9 months ago

      put it up online ALL BY THEMSELVES!

      Nope. You can upload images of other people too, and even tag them with names etc. Just as an example: old school photos of the entire class.

      • ExtremeDullard@lemmy.sdf.org · 9 months ago

        Did you read what I wrote?

        People should be aware of the dangers of group photos, and my point is that they should have known this was coming a long time ago and should limit their exposure.

        • hiddengoat@kbin.social · 9 months ago

          Just using the information you have posted publicly in various places, someone with access to the right sources could pick your rather unique mobile device out of a haystack with very little trouble. Doing so would give them location data that, combined with a number of hobbies you mention, would give them a reasonable idea of a few different places you could be found in a given area. From that point it’s down to either obtaining surveillance video or, more readily, just trawling the background of photos tagged with that location and using the physical descriptors you’ve used to determine which individual is you.

          And from there it’s just a matter of tracing other appearances you made in other people’s photos and surveillance video.

          They already have you, whether you want them to or not.

          • ExtremeDullard@lemmy.sdf.org · 9 months ago

            Indeed: spend enough time and effort and anybody can be deanonymized and fully documented. The point is that privacy-conscious individuals should make it as difficult to automate as possible.

            Clearview - and to a large extent all the other corporate surveillance players - go primarily for the low-hanging fruit: people who post selfies with their names attached or don’t remove the EXIF data, tagged group photos and such. Bots can easily scrape those. If you go out of your way either to not provide that data in the first place, or to pollute the well with fake photos and/or fake names attached, you make it harder for big data to exploit your data.

            It’s still possible, just less likely unless you’re a high value target - and realistically, most people aren’t.
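            The EXIF point above is concrete and easy to act on: in a JPEG, EXIF metadata (GPS coordinates, camera model, timestamps) lives in APP1 marker segments and can be stripped before a photo is shared. A minimal pure-stdlib sketch - the function name strip_exif_jpeg and the hand-built sample bytes are illustrative, not from any particular tool:

```python
import struct

def strip_exif_jpeg(data: bytes) -> bytes:
    """Return JPEG bytes with APP1 (EXIF/XMP) segments removed.

    Walks the marker segments after the SOI marker and skips any
    0xFFE1 (APP1) segment, which is where EXIF and XMP metadata live.
    Everything from the SOS marker onward is copied verbatim.
    """
    if data[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG (missing SOI marker)")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data):
        if data[i] != 0xFF:
            raise ValueError("corrupt JPEG marker stream")
        marker = data[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows
            out += data[i:]
            break
        # Segment length is big-endian and counts itself but not the marker.
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        if marker != 0xE1:  # keep every segment except APP1
            out += data[i:i + 2 + length]
        i += 2 + length
    return bytes(out)

# Tiny hand-built JPEG: SOI + APP1 ("Exif" payload) + SOS + scan data + EOI.
soi = b"\xff\xd8"
app1 = b"\xff\xe1" + struct.pack(">H", 8) + b"Exif\x00\x00"
rest = b"\xff\xda\x00\x02scan\xff\xd9"
cleaned = strip_exif_jpeg(soi + app1 + rest)
assert cleaned == soi + rest  # APP1 gone, image data intact
```

            A real workflow would read the file’s bytes, run them through this, and write the result back; the sketch ignores edge cases like padding bytes between markers, and formats such as PNG or HEIC store metadata differently.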

    • Leraje@lemmy.blahaj.zone (OP) · 9 months ago

      Whilst I agree it’s wise to take precautions, it seems weird to me that we think it’s OK for the onus to be on us to curtail a normal activity, like sharing a pic of you and your mates messing about, rather than on these companies not to harvest those pics into a sellable database of us that lets governments circumvent the need for warrants.

      Edit: and with the number of self-styled internet pranksters and influencers randomly shooting images and video of whatever they want, unless you leave the house wearing a balaclava you don’t really have any choice about your face becoming part of their dataset.

      • ExtremeDullard@lemmy.sdf.org · 9 months ago

        it seems weird to me that we think it’s OK for the onus to be on us to curtail a normal activity, like sharing a pic of you and your mates messing about, rather than on these companies not to harvest those pics

        I totally agree with you. In a sane and functional society, corporate surveillance would be illegal. But we don’t live in a sane society do we? The powers that be don’t do much of anything to curb the gross privacy violations, and they don’t because most of them are on big tech’s payroll and do big tech’s bidding.

        With that in mind, how does a concerned individual live in such a society? Carefully. If you value your privacy and you want to limit the amount of data you share with Big Data, everything you do is basically hamstrung by the thought of what harm it will bring to your privacy.

        Do you really think I like living my life denying everybody the right to take a photo with me in it? Of course I would like to be in the photos of company outings. Of course I would like to show my face on that Teams meeting call. But I just don’t want to show my face to Big Data, so I don’t. I wish those awful companies couldn’t legally misuse my data, but nobody is reining them in.

        • Leraje@lemmy.blahaj.zone (OP) · 8 months ago

          Mate, I’m 100% with you. It’s just saddening and maddening that those of us who are fed up with having our faces and lives sold to enhance some cunt’s bottom line have to be the ones who put the effort in, just to protect our right not to be a sellable data point.

          I dunno. It’s not like I didn’t already think this, but sometimes it feels overwhelming. We have to buy products and services just to preserve what privacy we have, in a never-ending arms race against all these huge companies trying to own us. All I want is to sometimes share photos of my family and friends without worrying about my face or my kids’ faces being sold to the highest bidder. I know that’s just the way things are these days, but it’s still incredibly frustrating and annoying.

          • ExtremeDullard@lemmy.sdf.org · 8 months ago

            but it’s still incredibly frustrating and annoying.

            I’ll tell you what’s even more frustrating and annoying: when you discuss these topics on a privacy-oriented forum and a bunch of buttmunches downvote you. It happens to me here and on all the other forums I patronize where I discuss these things. And I’m pretty sure the downvoters are members of the newer generations who have grown up with zero privacy, have never experienced it, don’t know what it should be or what they’re missing, and for whom corporate surveillance is normal and we old farts are raving lunatics.

            I don’t care about the silly forum points, but seeing people dismiss what you say without even attempting to argue because it doesn’t fit their warped normality, that’s fucking frustrating and really depressing for the future.

    • A1kmm@lemmy.amxl.com · 9 months ago

      Data being public (and privacy in general) shouldn’t be ‘all or none’. The problem is people joining the dots between individual bits of data to build a profile, not necessarily the individual bits of data.

      If you go out in public, someone might see you and recognise you, and that isn’t considered a privacy violation by most people. They might even take a photo or video which captures you in the background, and that in isolation isn’t considered a problem either (no expectation of privacy in a public place). But if someone sets out to do similar things at mass scale (e.g. by scraping, or networking cameras, or whatever) and piece together a profile of all the places you go in public, then that is a terrible privacy violation.

      Now you could similarly say that people who want privacy should never leave home, and otherwise people are careless and get what they deserve if someone tracks their every move in public spaces. But that is not a sustainable option for the majority of the world’s population.

      So ultimately, the problem is the gathering and collating of publicly available personally identifiable information (including photos) in ways people would not expect and don’t consent to, not the existence of such photos in the first place.
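      The ‘joining the dots’ danger can be made concrete with a toy sketch: a single tagged photo, joined against an ‘anonymised’ check-in feed on place and date, attributes a whole movement history to a named person. All names, places and device IDs below are invented:

```python
# Toy illustration of linkage: one tagged photo plus an "anonymous"
# check-in feed yields a full movement profile for a named person.

# A single tagged group photo places Alice somewhere, once.
tagged = {"name": "Alice", "place": "Camden Market", "date": "2023-05-01"}

# An "anonymised" dataset from another service: device IDs, no names.
checkins = [
    {"device": "dev-42", "place": "Camden Market", "date": "2023-05-01"},
    {"device": "dev-42", "place": "Kings Cross",   "date": "2023-05-02"},
    {"device": "dev-42", "place": "Soho",          "date": "2023-05-03"},
    {"device": "dev-99", "place": "Brixton",       "date": "2023-05-01"},
]

# Step 1: which devices were at the tagged place on the tagged date?
matching = {c["device"] for c in checkins
            if (c["place"], c["date"]) == (tagged["place"], tagged["date"])}

# Step 2: attribute every check-in of those devices to Alice.
profile = [c for c in checkins if c["device"] in matching]

print(len(profile))  # 3 check-ins now linked to a named person
```

      Neither dataset is alarming on its own; the privacy harm only appears when they are collated, which is exactly the point above.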