• Lem Jukes@lemm.ee · +10/-11 · edited 10 hours ago

    This feels discouraging. As someone who struggled with learning programming for a very long time, it’s only with the aid of Copilot that I’ve finally crossed the hurdles I was facing and felt like I was actually learning and progressing again.

    Yes, I’m still interacting with, manually adjusting, and even writing sections of the code. But a lot of what Copilot does for me is interpret my natural-language understanding of how I want to manipulate the data and translate it into actual code, which I then work with and combine with the rest of the project.

    But I’ve stopped looking to join any game jams, because even when they don’t have an explicit ban on all AI, the sentiment I get is that people feel it’s cheating and look down on someone in my situation. I get that submitting AI slop wholesale is just garbage. But putting these blanket ‘no AI content’ stamps and badges on things excludes a lot of people.

    Edit:

    Is this slop? https://lemjukes.itch.io/ascii-farmer-alpha https://github.com/LemJukes/ASCII-Farmer

    Like, I know it isn’t good code, but I’m entirely self-taught and it seems to work (and, more importantly, I mostly understand how it works), so what’s the fucking difference? How am I supposed to learn without iterating? If any human wants to look at my code and tell me why it’s shit, that’d actually be really helpful and I’d be genuinely thankful.

    *Except whoever actually said that in the comment replies. I blocked you, so I won’t see any more from you anyway; also, piss off.

    • Probius@sopuli.xyz · +1 · 9 hours ago

      I like to use AI autocomplete when programming not because it solves problems for me (it fucking sucks at that if you’re not a beginner), but because it’s good at guessing what I want to type next so I don’t have to type it out. If I do something to the X coordinate, I probably want to do the same or similar thing to the Y and Z coordinates, and AI is really good at picking up that sort of pattern.
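The pattern being described can be sketched like this (a hypothetical snippet; the function and names are illustrative, not from any real codebase):

```python
# Hypothetical illustration of the kind of repetition an autocomplete
# model is good at finishing: once you've typed the "x" line, the "y"
# and "z" lines are near-mechanical variations of it.
def translate(point, offset):
    """Shift a point by an offset; both are dicts with x/y/z keys."""
    return {
        "x": point["x"] + offset["x"],  # you write this line yourself...
        "y": point["y"] + offset["y"],  # ...and the model predicts these two
        "z": point["z"] + offset["z"],
    }

print(translate({"x": 1, "y": 2, "z": 3}, {"x": 10, "y": -5, "z": 0.5}))
# {'x': 11, 'y': -3, 'z': 3.5}
```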

      • Lem Jukes@lemm.ee · +4/-2 · 10 hours ago

        If you learned math with a calculator, you didn’t learn math.

        • finitebanjo@lemmy.world · +7/-2 · edited 10 hours ago

          Firstly, a calculator doesn’t have a double-digit percent chance of bullshitting you with made-up information.

          If you’ve ever taken a calculus course, you likely weren’t allowed to use a calculator that could solve the problems for you, and you had to show all of your work on paper. So yes, that statement is correct.

      • Lumiluz@slrpnk.net · +5/-3 · 12 hours ago

        Same vibes as “if you learned to draw with an iPad then you didn’t actually learn to draw”.

        Or in my case, I’m old enough to remember “computer art isn’t real animation/art”, and also the criticism aimed at Photoshop.

        And plenty of people criticized Andy Warhol before that, too.

        Go back further in history and you can read criticisms of using typewriters instead of handwriting as well.

        • finitebanjo@lemmy.world · +6/-2 · 10 hours ago

          None of your examples are even close to a comparison with AI, which steals from people to generate approximate nonsense while costing massive amounts of electricity.

          • Lumiluz@slrpnk.net · +2/-1 · 2 hours ago

            Have you ever looked at the file size of something like Stable Diffusion?

            Considering the data it’s trained on, do you think it’s:

            A) 3 Petabytes
            B) 500 Terabytes
            C) 900 Gigabytes
            D) 100 Gigabytes

            Second, what’s the electrical cost of generating a single image with Flux vs. 3 minutes of Baldur’s Gate, or similar, on max settings?

            Surely you must have some idea on these numbers and aren’t just parroting things you don’t understand.

            • finitebanjo@lemmy.world · +1 · edited 5 minutes ago

              What a fucking curveball joke of a question: you take a nearly-impossible-to-quantify comparison and ask if it’s equivalent?

              Gaming:

              A high scenario electricity consumption figure of around 27 TWh, and a low scenario figure of 14.7 TWh

              North American gaming market is about 7% of the global total

              then that gives us a very very rough figure of about 210-285 TWh per annum of global electricity used by gamers.

              AI:

              The rapid growth of AI and the investments into the underlying AI infrastructure have significantly intensified the power demands of data centers. Globally, data centers consumed an estimated 240–340 TWh of electricity in 2022—approximately 1% to 1.3% of global electricity use, according to the International Energy Agency (IEA). In the early 2010s, data center energy footprints grew at a relatively moderate pace, thanks to efficiency gains and the shift toward hyperscale facilities, which are more efficient than smaller server rooms.

              That stable growth pattern has given way to explosive demand. The IEA projects that global data center electricity consumption could double between 2022 and 2026. Similarly, IDC forecasts that surging AI workloads will drive a massive increase in data center capacity and power usage, with global electricity consumption from data centers projected to double to 857 TWh between 2023 and 2028. Purpose-built AI infrastructure is at the core of this growth, with IDC estimating that AI data center capacity will expand at a 40.5% CAGR through 2027.

              Let’s just say we’re at the halfway point and it’s 600 TWh per annum, compared to 285 for gamers.

              So more than fucking double, yeah.
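Redoing the back-of-envelope arithmetic above as a quick sketch (every input is the comment’s own figure, not independently verified data; strictly, 27/0.07 comes out nearer 386 TWh than the 285 quoted):

```python
# Reproducing the rough estimates in the comment above; all inputs are
# the comment's own figures, not authoritative measurements.
na_gaming_twh = (14.7, 27.0)          # low/high scenarios, North America
na_share = 0.07                       # NA ~7% of the global gaming market

global_gaming = tuple(x / na_share for x in na_gaming_twh)
print([round(x) for x in global_gaming])   # [210, 386]

dc_2022_mid = (240 + 340) / 2         # midpoint of IEA's 2022 estimate, TWh
dc_2028 = 857                         # IDC's 2028 projection, TWh
halfway = (dc_2022_mid + dc_2028) / 2 # the "halfway point" of the ramp
print(round(halfway))                 # 574, roughly the ~600 quoted
```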

    • Demigodrick@lemmy.zip (mod) · +7/-3 · edited 18 hours ago

      FWIW I agree with you. The people who say they don’t support these tools come across as purists or virtue signallers.

      I would agree with not having AI art* or AI music and sounds. In the games I’ve played that used it, it sounds so out of place.

      However, using a tool to make coding more accessible shouldn’t be frowned upon. I wonder if people felt the same way when C was released and thought everyone should be an assembly programmer.

      The irony is that most programmers were just googling and getting answers from Stack Overflow; now they don’t even need to Google.

      *Unless the aim is procedurally generated games, I guess; but if they’re using assets, I get not using AI-generated ones.

      • mke@programming.dev · +3/-1 · 8 hours ago

        “The people who say they don’t support these tools come across as purists or virtue signallers.”

        It is now “purist” to protest the use of tools that by and large steal from the work of countless unpaid, uncredited, unconsenting artists, writers, and programmers. It is “virtue signalling” to say I don’t support OpenAI or their shitty capital-chasing pig-brethren. It’s fucking “organic labelling” to want to support like-minded people instead of big tech.

        Y’all are ridiculous. The more of this I see, the more radicalized I get. Cool tech, yes, I admit! But wow, you just want to sweep all those pesky little ethical issues aside because… it makes you more productive? Shit, it’s like you’re competing with Altman in the unlikeability rankings.

    • otp · +5/-4 · 15 hours ago

      Back in the day, people hated Intellisense/auto-complete.

      And back in the older day, people hated IDEs for coding.

      And back in the even older day, people hated computers for games.

      There’ll always be people who hate new technology, especially if it makes something easier that they used to have to do “the hard way”.