• @9488fcea02a9
    link
    English
    82
    3 months ago

    I’m not a developer, but I use AI tools at work (mostly LLMs).

    You need to treat AI like a junior intern… You give it a task, but you still need to check the output and use critical thinking. You can't just take some work from an intern, blindly incorporate it into your presentation, and then blame the intern if the work is shoddy…

    AI should be a time saver for certain tasks. It cannot (currently) replace a good worker.

    • @[email protected]
      link
      fedilink
      English
      34
      edit-2
      3 months ago

      As a developer I use it mainly for learning.

      What used to be a Google followed by skimming a few articles or docs pages is now a question.

      It pulls the specific info I need, sources it and allows follow up questions.

      I’ve noticed the new juniors can get up to speed on new tech very quickly nowadays.

      As for code I don’t trust it beyond snippets I can use as a base.

      • @[email protected]
        link
        fedilink
        English
        0
        edit-2
        3 months ago

        JFC they’ve certainly got the unethical shills out in full force today. Language Models do not and will never amount to proper human work. It’s almost always a net negative everywhere it is used, final products considered.

          • @[email protected]
            link
            fedilink
            English
            1
            3 months ago

            Its intended use is to replace human work in exchange for lower accuracy. There is no ethical use case scenario.

            • @[email protected]
              link
              fedilink
              English
              1
              3 months ago

              It’s intended to showcase its ability to generate text. How people use it is up to them.

              As I said, it’s great for learning, as it’s very accurate when summarising articles/docs. It even cites its sources so you can read up more if needed.

              • @[email protected]
                link
                fedilink
                English
                0
                3 months ago

                It’s been known to claim commands and documentation exist when they don’t. It very commonly gets simple addition wrong.

                • @[email protected]
                  link
                  fedilink
                  English
                  1
                  3 months ago

                  That’s because it’s a language processor, not a calculator. As I said, you’re using it wrong.

                  • @[email protected]
                    link
                    fedilink
                    English
                    1
                    3 months ago

                    So the correct usage is to have documents incorrectly explained to you? I fail to see how that does any good.

    • Rickety Thudds
      link
      fedilink
      English
      15
      3 months ago

      It’s clutch for boring emails and tedious document summaries. Sometimes I get a day’s work done in 4 hours.

      Automation can be great, when it comes from the bottom-up.

      • @[email protected]
        link
        fedilink
        English
        2
        3 months ago

        Honestly, that’s been my favorite - bringing in automation tech to help me in low-tech industries (almost all corporate-type office jobs). When I started my current role, I was working consistently 50 hours a week. I slowly automated almost all the processes and now usually work about 2-3 hours a day with the same outputs. The trick is to not increase outputs or that becomes the new baseline expectation.

    • @[email protected]
      link
      fedilink
      English
      8
      3 months ago

      I am a developer and that’s exactly how I see it too. I think AI will be able to write PRs for simple stories but it will need a human to review those stories to give approval or feedback for it to fix it, or manually intervene to tweak the output.