• palordrolap@kbin.social · 10 months ago

    Put something in robots.txt that isn’t supposed to be hit and is hard to hit by non-robots. Log and ban all IPs that hit it.

    Imperfect, but can’t think of a better solution.
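    A minimal sketch of the log-and-ban half of that idea. The trap path, log format, and IP addresses below are invented for illustration; a real deployment would feed the resulting set to a firewall or fail2ban rather than just print it.

```python
# Hypothetical sketch: scan an access log for hits on a trap path that
# robots.txt disallows, and collect the offending IPs as a ban list.
# Trap path, log format, and IPs are all made up for this example.

TRAP_PATH = "/secret-trap-8f3a/"  # disallowed in robots.txt, never linked for humans

def collect_ban_list(log_lines):
    """Each log line is assumed to look like '<ip> <method> <path>'."""
    banned = set()
    for line in log_lines:
        parts = line.split()
        if len(parts) >= 3 and parts[2].startswith(TRAP_PATH):
            banned.add(parts[0])
    return banned

log = [
    "203.0.113.7 GET /index.html",
    "198.51.100.4 GET /secret-trap-8f3a/",
    "203.0.113.7 GET /about.html",
]
print(collect_ban_list(log))  # {'198.51.100.4'}
```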

    • Lvxferre@mander.xyz · 10 months ago

      Good old honeytrap. I’m not sure, but I think that it’s doable.

      Have a honeytrap page somewhere in your website. Make sure that legit users won’t access it. Disallow crawling the honeytrap page through robots.txt.

      Then if some crawler still accesses it, you could record and ban it as you said… or you could be even nastier and let it continue. Fill the honeytrap page with poison: nonsensical text that looks like something humans would write.

      • CosmicTurtle@lemmy.world · 10 months ago

        I think I used to do something similar with email spam traps. Not sure if it’s still around, but basically you could help build NaCL lists by posting an email address on your website that was visible in the source code but not to normal users, e.g. in a div pushed way off the left side of the screen.

        Anyway, spammers that do regular expression searches for email addresses would email it and get their IPs added to naughty lists.

        I’d love to see something similar with robots.
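        A rough illustration of why the hidden address gets harvested: a naive regex sweep over the raw page source picks it up even though no human ever sees it. The markup and addresses here are invented for the sketch.

```python
import re

# The trap address sits in the HTML source but is pushed off-screen, so only
# a regex-driven harvester ever "sees" it. Markup and addresses are made up.

html = """
<p>Contact: sales@example.com</p>
<div style="position:absolute; left:-9999px">trap-3921@example.com</div>
"""

# the kind of naive sweep a scraper might run over the raw HTML
harvested = re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", html)
print(harvested)  # ['sales@example.com', 'trap-3921@example.com']
```

Any mail later sent to the trap address proves the sender scraped the page, and its source IP can go on the naughty list.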

        • Lvxferre@mander.xyz · 10 months ago

          Yup, it’s the same approach as email spam traps, except for the naughty list. But… holy fuck, a shareable bot-IP list would be an amazing addition; it would increase the damage to those web-crawling businesses.

          • Nighed@sffa.community · 10 months ago

            But with all of the cloud resources now, you can rotate through IP addresses without any trouble. Hell, you could just crawl from IPv6 addresses and not even worry, with how cheap those are!

            • Lvxferre@mander.xyz · 10 months ago

              Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.

              • Nighed@sffa.community · 10 months ago

                That’s when Google’s browser DRM thing starts sounding like a good idea 😭

      • KairuByte@lemmy.dbzer0.com · 10 months ago

        I’m the idiot human that digs through robots.txt and the site map to see things that aren’t normally accessible by an end user.

        • Lvxferre@mander.xyz · 10 months ago

          For banning: I’m not sure, but I don’t think so. It seems to me that prefetching behaviour is dictated by the page linking to another; to avoid any issue, all the site owner needs to do is not prefetch links to the honeytrap.

          For poisoning: I’m fairly certain that it doesn’t. At most you’d prefetch a page full of rubbish.

    • PM_Your_Nudes_Please@lemmy.world · 10 months ago

      Yeah, this is a pretty classic honeypot method. Basically make something available but inaccessible to the normal user. Then you know anyone who accesses it is not a normal user.

      I’ve even seen this done with Steam achievements before; there was a hidden game achievement which was only obtainable via hacking. So anyone who used hacks immediately outed themselves with a rare achievement that was visible on their profile.

      • Link@rentadrunk.org · 10 months ago

        That’s a bit annoying as it means you can’t 100% the game as there will always be one achievement you can’t get.

      • CileTheSane@lemmy.ca · 10 months ago

        There are tools that just flag you as having gotten an achievement on Steam, you don’t even have to have the game open to do it. I’d hardly call that ‘hacking’.

    • Ultraviolet@lemmy.world · 10 months ago

      Better yet, point the crawler to a massive text file of almost-but-not-quite grammatically correct garbage to poison the model: something it will recognize as language and internalize, but that will severely degrade the quality of its output.
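      One crude way to sketch that kind of poison: keep real words, so the text still registers as language to a crawler, but scramble their order so the statistics a model would learn from it are garbage. Real poisoning would need to be far subtler than this toy shuffle.

```python
import random

# Toy "poison" generator: real vocabulary, destroyed word order.
# Deterministic seed so the output is reproducible.

def poison(sentence, seed=0):
    words = sentence.split()
    random.Random(seed).shuffle(words)
    return " ".join(words)

print(poison("the quick brown fox jumps over the lazy dog"))
```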

    • Aatube@kbin.social · 10 months ago

      robots.txt is purely textual; you can’t run JavaScript from it or log anything. Plus, anyone who doesn’t intend to follow robots.txt wouldn’t query it in the first place.

      • BrianTheeBiscuiteer@lemmy.world · 10 months ago

        If it doesn’t get queried, that’s the fault of the webscraper. You don’t need JS built into the robots.txt file either. Just add a line like:

        User-agent: *
        Disallow: /here-there-be-dragons.html

        Any client that hits that page (and maybe doesn’t pass a captcha check) gets banned. Or even better, they get a long stream of nonsense.
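        For what it’s worth, checking a request path against such a robots.txt is nearly a one-liner with Python’s standard-library parser, so the server-side ban logic really is cheap to build:

```python
from urllib.robotparser import RobotFileParser

# Parse the trap rule from the comment above and check paths against it.
# A server could ban (or poison) any client that fetches a path this
# parser says is off-limits.

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /here-there-be-dragons.html",
])

print(rp.can_fetch("*", "/here-there-be-dragons.html"))  # False
print(rp.can_fetch("*", "/index.html"))                  # True
```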

      • ShitpostCentral@lemmy.world · 10 months ago

        Your second point is a good one, but you absolutely can log the IP that requested robots.txt. That’s just a standard part of any HTTP server ever, no JavaScript needed.

        • GenderNeutralBro@lemmy.sdf.org · 10 months ago

          You’d probably have to go out of your way to avoid logging this. I’ve always seen such logs enabled by default when setting up web servers.

      • ricecake · 10 months ago

        People not intending to follow it is the real reason not to bother, but it’s trivial to track who downloaded the file and then hit something they were asked not to.

        Like, ten minutes of work to do right. You don’t need JS to do it at all.

  • Cosmic Cleric@lemmy.world · 10 months ago

    As unscrupulous AI companies crawl for more and more data, the basic social contract of the web is falling apart.

    Honestly it seems like in all aspects of society the social contract is being ignored these days, that’s why things seem so much worse now.

    • TheObviousSolution@lemm.ee · 10 months ago

      Governments could do something about it, if they weren’t instead overwhelmed by bullshit from bullshit generators and led by people driven by their personal wealth.

    • PatMustard@feddit.uk · 10 months ago

      these days

      When, at any point in history, have people acknowledged that there was no social change or disruption and everyone was happy?

  • Optional@lemmy.world · 10 months ago

    Well the trump era has shown that ignoring social contracts and straight up crime are only met with profit and slavish devotion from a huge community of dipshits. So. Y’know.

    • Ithi@lemmy.ca · 10 months ago

      Only if you’re already rich or in the right social circles though. Everyone else gets fined/jail time of course.

  • MonsiuerPatEBrown@reddthat.com · 10 months ago

    The open and free web is long dead.

    Just thinking about robots.txt as a working solution against people who literally broker entire digital lives for hundreds of billions of dollars is so … quaint.

  • rtxn@lemmy.world · 10 months ago

    I would be shocked if any big corpo actually gave a shit about it, AI or no AI.

    if exists("/robots.txt"):
        no it fucking doesn't
    
    • bionicjoey@lemmy.ca · 10 months ago

      Robots.txt is in theory meant to be there so that web crawlers don’t waste their time traversing a website in an inefficient way. It’s there to help, not hinder them. There is a social contract being broken here and in the long term it will have a negative impact on the web.

    • DingoBilly@lemmy.world · 10 months ago

      Yeah, I always found it surprising that everyone just agreed to follow a text file on a website telling them how to act. It’s one of the worst-thought-out yet most significant conventions in browsing, still around from the beginning, pretty much.

  • moitoi@feddit.de · 10 months ago

    Alternative title: Capitalism doesn’t care about morals and contracts. It wants to make more money.

    • AutistoMephisto@lemmy.world · 10 months ago

      Exactly. Capitalism spits in the face of the concept of a social contract, especially if companies themselves didn’t write it.

      • WoodenBleachers@lemmy.world · 10 months ago

        Capitalism, at least in a laissez-faire marketplace, operates on a social contract; fiat money is an example of this. The market decides, the people decide. Are there ways to amass a certain amount of money to make people turn a blind eye? For sure, but all systems have their ways to amass power, no matter what.

        • nickwitha_k (he/him)@lemmy.sdf.org · 10 months ago

          I’d say that historical evidence directly contradicts your thesis. Were it factual, times of minimal regulation would be times of universal prosperity. Instead, they are the time of robber-barons, company scrip that must be spent in company stores, workers being massacred by hired thugs, and extremely disparate distribution of wealth.

          No. Laissez-faire capitalism has only ever consistently benefitted the already wealthy, and sociopaths happy to ignore the social contract for their own benefit.

          • WoodenBleachers@lemmy.world · 10 months ago

            You said “a social contract”. Capitalism operates on one. “The social contract”, as you presumably intend it here, is different. Yes, capitalism allows those with money to generate money, but a disproportionate distribution of wealth is not a violation of a social contract. I’m not arguing for deregulation, FAR from it, but the social contract is there. If a corporation does something too unpopular, then people don’t work for them and they cease to exist.

            • nickwitha_k (he/him)@lemmy.sdf.org · 10 months ago

              If a corporation is doing something too unpopular then people don’t work for them and they cease to exist.

              Unfortunately, this is not generally the case. In the US, for example, the corporation merely engages in legalized bribery to ensure that people are dependent upon it (ex. limiting healthcare access, erosion of social safety nets) and don’t have a choice but to work for them or die. Disproportionate distribution of wealth may not by itself be a violation of the social contract, but it gives the wealthy extreme leverage to use in coercing those who are not wealthy and further eroding protections against bad actors. This has been shown historically to be a self-reinforcing cycle that requires that the wealthy be forced to stop.

              • WoodenBleachers@lemmy.world · 10 months ago

                Yes, regulations should be in place, but the “legalized bribery” isn’t forcing people; it’s just easier to stick with the status quo than to change it. They aren’t forced to die, it’s just a lot of work not to. The social contract is there, it’s just one we don’t like.

    • gapbetweenus@feddit.de · 10 months ago

      Capitalism is a concept; it couldn’t care even if it wanted to, and it can’t want to begin with. It’s the humans. You will find greedy, immoral people in every system, and they will make it miserable for everyone else.

      • Aceticon@lemmy.world · 10 months ago

        Capitalism is the widely accepted self-serving justification of those people for their acts.

        The real problem is in the “widely accepted” part: a sociopath killing an old lady and justifying it because “she looked funny at me” wouldn’t be widely accepted, and Society would react in a suitable way. But if said sociopath scammed the old lady’s pension fund because (and this is a typical justification in Investment Banking) “the opportunity was there and if I didn’t do it somebody else would’ve, so better me and get the profit”, it’s deemed “acceptable” and Society does not react in a suitable way.

        Mind you, Society (as in, most people) might actually want to react in a suitable way, but the structures in our society are such that the Official Power Of Force in our countries is controlled by a handful of people who got there with crafty marketing and backroom plays, and those deem it “acceptable”.

        • gapbetweenus@feddit.de · 10 months ago

          People will always find justifications to be assholes. Capitalism harvested that energy and unleashed its full potential, with rather devastating consequences.

          • Chee_Koala@lemmy.world · 10 months ago

            Sure, but think-structures matter. We could have a system that doesn’t reward psychopathic business choices (as much), while still improving our lives bit by bit. If the system helps a bit with making the right choices, that would matter a lot.

            • gapbetweenus@feddit.de · 10 months ago

              That’s basically what I wrote: a (free) market economy, especially in combination with credit-based capitalism, gives those people the perfect system to thrive in. This seems to result in very fast progress and immense wealth, which is not distributed very equally. Then again, I prefer Bezos and Zuckerberg as CEOs rather than as politicians or warlords. Dudes with big egos and ambitions need something productive to work on.

        • Katana314@lemmy.world · 10 months ago

          It’s deemed “acceptable”? A sociopath scamming an old lady’s pension is basically the “John Wick’s dog” moment that leads to the insane death-filled warpath in recent movie The Beekeeper.

          This is the kind of edgelord take that routinely expects worse than the worst of society with no proof to their claims.

          • Aceticon@lemmy.world · 10 months ago

            This is the kind of shit I saw from the inside in Investment Banking before and after the 2008 Crash.

            None of those assholes ever gets prison time for the various ways in which they abuse markets and even insider info to swindle pension funds, among other things, so de facto the Society we have, with the power structures it has, accepts it.

  • YTG123@feddit.ch · 10 months ago

    We need laws mandating respect of robots.txt. This is what happens when you don’t codify stuff

    • Echo Dot@feddit.uk · 10 months ago

      It’s a bad solution to the problem anyway. If we’re going to legally mandate a solution, I want to take the opportunity to come up with an actually better fix than the hacky workaround that is robots.txt.

    • patatahooligan@lemmy.world · 10 months ago

      AI companies will probably get a free pass to ignore robots.txt even if it were enforced by law. That’s what they’re trying to do with copyright and it looks likely that they’ll get away with it.

      • SPRUNT@lemmy.world · 10 months ago

        The battle cry of conservatives everywhere: It’s too hard!

        Except if it involves oppressing minorities and women. Then it’s a moral imperative worth all the time and money you can shovel at it regardless of whether the desired outcome is realistic or not.

        • Jojo@lemm.ee · 10 months ago

          Seriously, could the party of “small government” get out of my business, please?

            • Jojo@lemm.ee · 10 months ago

              I just wish the push and pull of politics didn’t have to be played as a zero sum game. I wish someone could take the initiative and just…

              I think both parties in America sing pretty loud about “law and order.” I haven’t heard that cry particularly loudly from either side over the other. I don’t think I’ve heard anyone who claims to be a Democrat saying the end goal is “small government” but I have heard it from Republican voices.

              Honestly, I would really prefer if we were in a system that enabled more parties, so we didn’t have “parties” that did such contradictory things as the current ones…

              • JasonDJ@lemmy.zip · 10 months ago

                The GOP has historically been the party of law and order. Hence why they implied that blue lives matter more than black lives.

                thatsthejoke.png

                Just like how one party impeached a president of the other for obstruction and abuse of power, and the other impeached a president for checks notes lying about a blowjob.

    • AA5B@lemmy.world · 10 months ago

      Turning that into a law is ridiculous. You really can’t consider it more than advisory unless you enforce it with technical means; for example, put the content behind a login or captcha if you want only humans to see it.

        • AA5B@lemmy.world · 10 months ago

          Yes, and there’s also no law against calling an unlisted phone number

          Also we already had this battle with robots.txt. In the beginning, search engines wouldn’t honor it either because they wanted the competitive advantage of more info, and websites trusted it too much and tried to wall off too much info that way.

          There were complaints, bad PR, lawsuits, calls for a law.

          It’s no longer the Wild West:

          • search engines are mature and generally honor robots.txt
          • websites use rate limiting to conserve resources and user logins to fence off data there’s a reason to fence off
          • truce: neither side is as greedy
          • there is no such law nor is that reasonable
    • ArmokGoB@lemmy.dbzer0.com · 10 months ago

      Sounds like the type of thing that would either be unenforceable or profitable to violate compared to the fines.

    • wabafee@lemmy.world · 10 months ago

      I hope not; laws tend to get outdated really fast. Who knows, robots.txt might not even be used in the future, and it would just sit there taking up space for legal reasons.

      • Tyfud@lemmy.world · 10 months ago

        You can describe the law in a similar way to a specification, and you can make it as broad as needed. Something like the file name shouldn’t ever come up as an issue.

        • GhostMatter@lemmy.ca · 10 months ago

          The law can be broad with allowances to define specifics by decree, executive order or the equivalent.

      • BreakDecks@lemmy.ml · 10 months ago

        robots.txt is a 30 year old standard. If we can write common sense laws around things like email and VoIP, we can do it for web standards too.

      • kingthrillgore@lemmy.ml · 10 months ago

        robots.txt has been an unofficial standard for 30 years, and it’s augmented with sitemap.xml to help index uncrawlable pages and with Schema.org to expose contents for the Semantic Web. I’m not saying it shouldn’t be a law, but suggesting that norms might change is a pretty weak counterargument, man.

      • Echo Dot@feddit.uk · 10 months ago

        We don’t need new laws, we just need enforcement of existing ones. It is already illegal to copy copyrighted content; it’s just that the AI companies do it anyway and no one does anything about it.

        Enforcing respect for robots.txt doesn’t matter, because the AI companies are already breaking the law.

        • BreakDecks@lemmy.ml · 10 months ago

          I think the issue is that existing laws don’t clearly draw a line that AI can cross. New laws may very well be necessary if you want any chance at enforcement.

          And without a law that defines documents like robots.txt as binding, enforcing respect for it isn’t “unnecessary”, it is impossible.

          I see no logic in complaining about lack of enforcement while actively opposing the ability to meaningfully enforce.

          • Echo Dot@feddit.uk · 10 months ago

            Copyright law in general needs changing, though; that’s the real problem. I don’t see the advantage of turning a hacky workaround into a legally mandated requirement.

            Especially because there are many legitimate reasons to ignore robots.txt, including it being misconfigured, or it being set up only for search engines when your bot isn’t a search-engine crawler.

    • XTornado@lemmy.ml · 10 months ago

      All my scraping scripts will go to shit… please no, I need automation to live…

  • circuitfarmer@lemmy.world · 10 months ago

    Most every other social contract has been violated already. If they don’t ignore robots.txt, what is left to violate?? Hmm??

    • BlanketsWithSmallpox@lemmy.world · 10 months ago

      It’s almost as if leaving things to social contracts vs regulating them is bad for the layperson… 🤔

      Nah fuck it. The market will regulate itself! Tax is theft and I don’t want that raise or I’ll get in a higher tax bracket and make less!

      • Jimmyeatsausage@lemmy.world · 10 months ago

        This can actually be an issue for poor people, not because of tax brackets but because of income-based assistance cutoffs. If a $1/hr raise puts you above those cutoffs, that extra $160 a month could cost you $500 in food assistance, $5–$10/day for school lunch, or get you kicked out of government-subsidized housing.

        Yet another form of persecution that the poor actually suffer and the rich only pretend to.

      • SlopppyEngineer@lemmy.world · 10 months ago

        And then the companies hit the “trust thermocline”, customers leave them in droves and companies wonder how this could’ve happened.

      • Ogmios · 10 months ago

        Yea, because authoritarianism is well known to be sooooo good for the layperson.

  • KillingTimeItself@lemmy.dbzer0.com · 10 months ago

    Hmm, I thought websites just blocked crawler traffic directly? I know one site in particular has rules about it, and will even go so far as to ban you permanently if you continually ignore them.

      • KillingTimeItself@lemmy.dbzer0.com · 10 months ago

        I mean, yeah, but at a certain point you just have to accept that it’s going to be crawled. The obviously negligent ones are easy to block.

        • T156@lemmy.world · 10 months ago

          Except that it’d also catch people who use accessibility devices and might see the link anyway, or who use the keyboard to navigate a site instead of a mouse.

          • HACKthePRISONS@kolektiva.social · 10 months ago

            I don’t know, maybe there’s a canvas trick. I’m not a webdev, so I’m a bit out of my depth, mostly guessing and remembering 20-year-old technology.

        • oatscoop@midwest.social · 10 months ago

          If it weren’t so difficult and didn’t require so much effort, I’d rather have clicking the link cause the server to switch to serving up poisoned data: stuff that will ruin an LLM.

          • HelloHotel@lemm.ee · 10 months ago

            Visiting /enter_spoopmode.html would choose a theme and mangle the text of whatever page you go to next accordingly (think search-and-replace with swear words or Santa Claus).

            It would also show a banner letting the user know they are in spoop mode, with a JavaScript button to exit the mode, where the AJAX request URL is obfuscated (think base64). The banner is at the bottom of the HTML document (not necessarily the screen itself) and/or inside unusual/normally ignored tags.

          • T156@lemmy.world · 10 months ago

            Would that be effective? A lot of poisoning seems targeted to a specific version of an LLM, rather than being general.

            Like how the image poisoning programs only work for some image generators and not others.

      • Echo Dot@feddit.uk · 10 months ago

        Well, you can if you know the IPs they come in from, but that’s of course the trick.

      • KillingTimeItself@lemmy.dbzer0.com · 10 months ago
        10 个月前

        Last I checked, humans don’t access every page on a website nearly simultaneously…

        And if a crawler imitates a human, then honestly, who cares?
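        That “no human hits every page at once” heuristic can be sketched as a sliding-window rate check. The window, threshold, timestamps, and IPs below are all arbitrary illustration values, not something from the thread:

```python
from collections import defaultdict

# Flag any IP making more requests inside a short window than a human
# plausibly would. All numbers here are made-up illustration values.

WINDOW = 10.0    # seconds
THRESHOLD = 20   # max requests per window before we call it a bot

def flag_bursts(requests):
    """requests: list of (timestamp, ip) pairs, sorted by timestamp."""
    flagged = set()
    recent = defaultdict(list)
    for ts, ip in requests:
        times = recent[ip]
        times.append(ts)
        while times and times[0] < ts - WINDOW:   # expire old entries
            times.pop(0)
        if len(times) > THRESHOLD:
            flagged.add(ip)
    return flagged

# a crawler firing 30 requests in three seconds trips the threshold
reqs = sorted([(0.1 * i, "198.51.100.4") for i in range(30)] + [(5.0, "203.0.113.7")])
print(flag_bursts(reqs))  # {'198.51.100.4'}
```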

    • kingthrillgore@lemmy.ml · 10 months ago

      There are more crawlers than I have fucks to give; you’ll be in a pissing match forever. robots.txt was supposed to be the norm for telling crawlers what they can and cannot access. It’s not on you to block them. It’s on them, and it’s sadly a legislative issue at this point.

      I wish it weren’t, but legislative fixes are always the most robust and the most complied with.

      • KillingTimeItself@lemmy.dbzer0.com · 10 months ago

        Yes, but there’s also a point where it’s blatantly obvious, and I can’t imagine it’s hard to get rid of the obviously offending ones. Respectful crawlers are going to be imitating humans, so who cares; disrespectful crawlers will DDoS your site, and blocking that can’t be hard to implement.

        Though if we’re talking “hey, please don’t scrape this particular data”: yeah, nobody was ever respecting that, lol.

  • kingthrillgore@lemmy.ml · 10 months ago

    I explicitly have my robots.txt set to block AI crawlers, but I don’t know if anyone will observe the protocol. They should offer tools I could submit a sitemap.xml to, so I’d know whether I’ve been crawled. Until they bother to address this, I can only assume their intent is hostile, and unless someone serious builds a honeypot and exposes the tooling for us to deploy at large, my options are limited.
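    For reference, a robots.txt along those lines uses the user-agent tokens the big AI crawlers publish (GPTBot for OpenAI, CCBot for Common Crawl, Google-Extended for Google’s AI training); a list like this inevitably goes stale as new crawlers appear, so treat it as a starting point, not a complete block:

```text
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```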

    • phx@lemmy.ca · 10 months ago

      The funny (in a “wtf”, not “haha”, sense) thing is, individuals such as security researchers have been charged under digital trespassing laws for things like accessing publicly available systems and changing a number in a URL to reach data they normally wouldn’t, even after doing responsible disclosure.

      Meanwhile, companies completely ignore the standard ways of saying “you are not allowed to scrape this data” and then use OUR content/data to build up THEIR datasets, including for AI etc.

      That’s not a “violation of a social contract” in my book; that’s violating the site’s terms of service and essentially infringing copyright.

      No consequences for them though. Shit is fucked.

  • 𝐘Ⓞz҉@lemmy.world · 10 months ago

    There are no laws to govern them, so they can do anything they want. Blame boomer politicians, not the companies.

    • gian @lemmy.grys.it · 10 months ago

      Why not blame the companies? After all, they are the ones doing it, not the boomer politicians.

      And in the long term they are the ones who risk being “punished”; just imagine people getting tired of this shit and starting to block them at the firewall level…

      • WeirdGoesPro@lemmy.dbzer0.com · 10 months ago

        Because the politicians also created the precedent that anything you can get away with goes. They made the game, defined the objective, and then didn’t adapt it quickly enough, so that they and their friends would have a shot at cheating.

        There is absolutely no narrative of “what can you do for your country” anymore. It’s been replaced by the mottos of “every man for himself” and “get while the getting’s good”.

    • Dr_Satan@lemm.ee · 10 months ago

      I think that good behavior is implicitly mandated even if there’s nobody to punish you when you don’t comply.

  • lily33@lemm.ee · 10 months ago

    What social contract? When sites regularly have a robots.txt that says “only Google may crawl”, they are effectively helping enforce a monopoly, and that’s not a social contract I’d ever agree to.