Is there something that can generate random Internet usage to make the real sites I go to a bit obfuscated?

I’m thinking something that runs on my server and simply visits a random website. It probably shouldn’t be truly random, though; some sort of tweaking would be great, like the ability to have it visit every news site there is. That way the ISP will have a harder time telling my political bias.

The threat model for this sits below using a VPN for everyday browsing, although getting a dedicated VPN IP address is a project for another day.

    • HumanPerson · 6 points · 9 months ago

      Turn on your browser history for a while then use that.
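
      If you use Firefox, that history lives in the `moz_places` table of `places.sqlite`, so it can be dumped straight to a list. A sketch (the profile path in the usage example is machine-specific, and `export_history` is just a name I picked):

      ```shell
      # export_history: print URLs you have actually visited, most-visited first.
      export_history() {
          local db="$1" tmp
          tmp=$(mktemp)
          # Query a copy, since Firefox keeps the live database locked
          cp "$db" "$tmp"
          sqlite3 "$tmp" "SELECT url FROM moz_places WHERE visit_count > 0 ORDER BY visit_count DESC;"
          rm -f "$tmp"
      }

      # Example (adjust the profile directory to your own):
      # export_history ~/.mozilla/firefox/*.default-release/places.sqlite > websites.csv
      ```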

      • LWD@lemm.ee · 2 points · 9 months ago

        chrome://site-engagement for a slightly more accessible list

    • GluWu@lemm.ee · 2 points · edited · 9 months ago

      Just start listing the most popular and generic sites. Then Google a topic like technology and copy whatever those sites are. I imagine you could have a pretty decent list populated in 15 minutes. You could also just ask chatgpt to create lists of the top 100 sites for “x”.
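
      The merge step can be sketched in a few lines of shell. The topic files and URLs below are placeholder examples standing in for whatever lists you collect:

      ```shell
      # Build websites.csv by merging small hand-written topic lists.
      printf '%s\n' https://www.bbc.com https://www.reuters.com > news-sites.txt
      printf '%s\n' https://arstechnica.com https://www.reuters.com > tech-sites.txt

      # Merge, drop blank lines, and deduplicate into the file the script will read
      cat news-sites.txt tech-sites.txt | grep -v '^$' | sort -u > websites.csv
      ```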

      What would you write it in? I might be willing to help, because this interests me as well.

      • Dust0741@lemmy.world OP · 3 points · 9 months ago

        That’s a good idea.

        Probably just a shell script. Someone mentioned using curl, so that’d be pretty easy.

        • GluWu@lemm.ee · 2 points · 9 months ago

          Let me know if you start working on anything. I want to try Greasemonkey; I haven’t used it in years.

          • Dust0741@lemmy.world OP · 1 point · edited · 8 months ago

            Little curl shell script that works:

            #!/bin/bash
            
            # Random_Curl_Request.sh
            # Visit a random site from a CSV list once a minute.
            
            # CSV file containing websites (URL in the first column)
            CSV_FILE="/home/user/Documents/randomSiteVisitor/websites.csv"
            
            while true; do
                # Pick a random line and take the URL from the first column
                WEBSITE=$(shuf -n 1 "$CSV_FILE" | cut -d ',' -f 1)
            
                # Fetch quietly: follow redirects, discard the body
                curl -sL -o /dev/null "$WEBSITE"
            
                sleep 60
            done
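
            To keep the loop running unattended on the server, one option is a systemd service. This is only a sketch: the unit name, `User=`, and script path are assumptions to adapt.

            ```ini
            # /etc/systemd/system/random-curl.service  (name and paths are examples)
            [Unit]
            Description=Visit random sites to pad traffic logs
            After=network-online.target

            [Service]
            User=user
            ExecStart=/home/user/Documents/randomSiteVisitor/Random_Curl_Request.sh
            Restart=always

            [Install]
            WantedBy=multi-user.target
            ```

            Then `systemctl daemon-reload && systemctl enable --now random-curl.service`.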