We have paused all crawling as of Feb 6th, 2025 until we implement robots.txt support. Stats will not update during this period.

  • Pika · 3 hours ago

    Might be related to the issue linked here.

    It was a good read. Personally speaking, I think it would have been better to just block GoToSocial (if that's even possible, since it seems stuff gets blocked when you check it) until proper robots.txt support was provided. I found it weird that they paused the entire system.

    That being said, if I understand that issue correctly, I take the stance that it is GoToSocial that is misbehaving. They are poisoning data sets that are required for any type of federation to occur (nodeinfo, v1 and v2 statistics), on the grounds that the requesting program is not respecting the robots.txt file. They argue this only stops crawlers, but it's clear that more than just crawlers are being hit.

    IMO this looks bad; it definitely puts a bad taste in my mouth regarding the project. I'm not saying an operator shouldn't have to respect a robots.txt, but when you implement a system that negatively affects third parties, the response shouldn't be the equivalent of "sucks to suck, that's a you problem". Your implementation should respond with either zero or null; serving any other value is just abusive and hostile behavior from a program.

    • mesamune@lemmy.world (OP) · 3 hours ago

      Thank you for providing the link. I'm actually on GoToSocial's side on this particular one, but I wish both sides had communicated a bit more before this got rolled out.
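
    For what it's worth, the robots.txt support the banner mentions is fairly small to implement. Here's a minimal sketch using Python's stdlib `urllib.robotparser` — the user-agent string, the robots.txt rules, and the nodeinfo path are hypothetical examples, not the actual crawler's configuration:

    ```python
    import urllib.robotparser

    def allowed_to_fetch(robots_txt: str, user_agent: str, path: str) -> bool:
        """Decide whether `user_agent` may fetch `path`, given a robots.txt body."""
        rp = urllib.robotparser.RobotFileParser()
        rp.parse(robots_txt.splitlines())
        return rp.can_fetch(user_agent, path)

    # Hypothetical robots.txt that blocks one stats crawler but allows everyone else.
    robots = (
        "User-agent: statscrawler\n"
        "Disallow: /.well-known/nodeinfo\n"
        "\n"
        "User-agent: *\n"
        "Disallow:\n"
    )

    allowed_to_fetch(robots, "statscrawler", "/.well-known/nodeinfo")  # blocked
    allowed_to_fetch(robots, "otherbot", "/.well-known/nodeinfo")      # allowed
    ```

    A crawler that checks this before each fetch (and simply skips disallowed instances) avoids the whole "poisoned stats" problem, since it never records a value for hosts that opted out.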