Maybe I’m the only one who didn’t know this, but it only just occurred to me to try - and it worked!

I gave it needle size, ply and the garment size I wanted - as well as asking for Australian sizing and instructions (so you’ll need to change that for whatever you’re used to) - and from what I could see the result was pretty good. I haven’t actually knitted it yet, but I may try it on a small project and see how it goes.
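
If you wanted to script it instead of using the chat window, it would look something like this - just a rough sketch with the official openai Python client, where the model name and prompt wording are placeholders for whatever you’d actually ask for:

```python
# Rough sketch only: asking a chat model for a beginner pattern via the API.
# The model name and prompt details are placeholders, not a tested recipe.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

prompt = (
    "Write a simple beginner knitting pattern for a beanie. "
    "Yarn: 8 ply. Needles: 4.0 mm. Size: Australian adult medium. "
    "Use Australian sizing and terminology throughout."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any chat-capable model works
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```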

Edit to say that I’m very aware of ChatGPT’s limits (I work in a field where it’s being abused), but I thought it was an interesting idea. Simplicity would be key. I’d consider myself a beginner, so this might be a good way of creating small, simple projects. Or nonsense! I have bags of cheap wool that I got through my local Buy Nothing group, so I’m always up for a bit of experimentation.

  • lurch (he/him) · 8 months ago

    AI doesn’t understand what it’s doing. It’s like a second-grader savant that has read every book. If you ask it to write a story about, or draw, a man using a baguette as a baseball bat, there’s a significant chance you’ll get something with the animal “bat” in it and the baguette not being used to hit a ball.

    • voracitude@lemmy.world · 8 months ago

      You think so? https://chat.openai.com/share/3e9d7f52-1a7d-4f20-b707-6aa51ed1c7d6

      Please provide your definition of “understanding” and explain how it differs between humans and ML models. Please explain the functional differences that, in your view, separate (for example) a human looking at and explaining the joke in a meme, and a machine learning model doing the same thing.

      Of course they’re not human-level yet in terms of adaptability, but the more I think about the above the more convinced I am that humans just have much, much larger “context windows” than current machine learning models, and that’s an advantage that is already eroding quickly.

      Edit: All I want is for someone to clearly show humans do it differently than these new machine learning models. That shouldn’t be too hard, if it’s so fundamental and obvious. Or, could it be that “understanding” is a nebulous term that’s actually quite hard to define?

      • andros_rex@lemmy.world · 8 months ago

        Specifically for knitting - can a machine be trained on the muscle movements involved in knitting? The feeling of tension in your yarn, how many stitches you feel comfortable crowding onto the needles, how you need to move your yarn out of the way?

        Knitting has some complicated stitches and movements that I don’t think have been replicated by machine. Crochet can’t be produced by machine at all. I think there is a kinesthetic understanding necessary for any sort of “AI” to really understand knitting, and it hasn’t been demonstrated by any models I’ve seen. Maybe someone will put sensors on someone’s muscles and try to “tokenize” the mechanics, but I don’t think that’s been done yet.
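
        To make the “tokenize the mechanics” idea concrete, this is roughly what it could look like - the tension readings are invented and, as far as I know, no such dataset exists, so treat it purely as illustration:

```python
# Toy sketch: turn a stream of invented yarn-tension readings into discrete
# tokens a sequence model could consume. Nothing here is real sensor data;
# the bin edges and vocabulary are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
tension = rng.normal(loc=5.0, scale=1.5, size=20)  # pretend grams of tension

bins = np.array([3.0, 4.5, 6.0, 7.5])    # fixed edges -> a tiny "vocabulary"
token_ids = np.digitize(tension, bins)   # one token id (0-4) per reading

vocab = ["very_loose", "loose", "medium", "tight", "very_tight"]
print([vocab[i] for i in token_ids])
```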

        Writing patterns is a skill. Professional pattern writers test their patterns and modify them, calling heavily on their knowledge of that kinesthetic understanding. You could not just read a bunch of pattern books and write your own without having done the activity. You would need to be choosy in your pattern books too - anyone who does historical knitting/crochet/fiber work can tell you that there are lots of confusing, ambiguous, or wrong instructions! Can you consistently discriminate between different notation styles? Do you have opinions on magic loop versus ch6? These are things that I don’t think have been tokenized - and since most of that is ambiguity in human communication, where is “AI” supposed to pick this up from?
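
        The notation problem alone is bigger than people assume. As a tiny illustration (the mapping below is the standard UK-to-US stitch conversion; the code itself is just a throwaway sketch), the same abbreviation can mean entirely different stitches depending on which convention a pattern uses:

```python
# Throwaway sketch: the same crochet abbreviation names different stitches in
# UK vs US terminology, so a pattern is ambiguous unless you know which
# convention it was written in.
UK_TO_US = {
    "ch": "ch",    # chain is the same in both
    "dc": "sc",    # UK double crochet  -> US single crochet
    "htr": "hdc",  # UK half treble     -> US half double crochet
    "tr": "dc",    # UK treble          -> US double crochet
    "dtr": "tr",   # UK double treble   -> US treble
}

def uk_to_us(stitches):
    """Map UK abbreviations to US ones; unknown terms pass through unchanged."""
    return [UK_TO_US.get(s, s) for s in stitches]

print(uk_to_us(["ch", "dc", "tr", "dc"]))  # ['ch', 'sc', 'dc', 'sc']
```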

        • voracitude@lemmy.world · 8 months ago

          Specifically for knitting - can a machine be trained on the muscle movements involved in knitting? The feeling of tension in your yarn, how many stitches you feel comfortable crowding onto the needles, how you need to move your yarn out of the way?

          Yes, it can - but that’s actually two sets of questions. First, a machine learning model can be trained on the motions required to create any given pattern. How it would create the pattern would differ from how a human does, unless we gave it human form and trained it to use that machinery to knit or crochet the pattern exactly the way you would.

          I don’t think we’re there yet, but these models are the only kind of software so far that even has a hope of processing the insanely complex inputs and outputs required for each and every movement on the needle/hook in exactly the right way. That’s the reason crochet hasn’t been machine-replicated yet: it’s just too complex for existing, simple machines.

          Think about how much information your brain actually processes just to throw a baseball at a target, or to balance yourself on an unsteady surface. That’s where modern machine learning models excel: finding patterns in huge amounts of information much faster than a human could. Boston Dynamics has been doing that kind of coordination successfully for over a decade already.

          Researchers have also already successfully reconstructed images people were thinking about from nothing but MRI scans: https://www.science.org/content/article/ai-re-creates-what-people-see-reading-their-brain-scans

          And that’s where we find the answer to the question of personal preferences, too. Take EEG readings while you knit and, with the right preparation, that data can train the model to your preferred styles, flourishes, what have you.
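
          Purely as a sketch of what “train the model to your preferences” could look like (the features below are random numbers standing in for EEG or muscle-sensor data, so the score is meaningless - only the shape of the pipeline is the point):

```python
# Toy sketch: fit a classifier mapping made-up per-sample sensor features to a
# knitter's preferred style. Random numbers stand in for real EEG/EMG data, so
# accuracy will hover around chance; this only shows the pipeline shape.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 16))     # 200 samples of 16 fake sensor features
y = rng.integers(0, 3, size=200)   # 0=continental, 1=English, 2=Portuguese

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```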

          Writing patterns is a skill. Professional pattern writers test their patterns and modify them, calling heavily on their knowledge of that kinesthetic understanding.

          Folding proteins is far, far more complex than crochet or knitting, but modern AI models are better at it than anything we’ve had before, including humans: https://www.nature.com/articles/d41586-022-02083-2

          It’s a scary time, with how fast this stuff is going. A knitting bot is no longer impossible; it’s just a matter of someone taking the time to gather and prep the data, and build it.

          • andros_rex@lemmy.world · 8 months ago

            Folding proteins is applying complex algorithms to data, which is what computers are good at.

            Writing knitting or crochet patterns is a skill that requires being good at knitting or crochet as a human. You have to understand how a human does it to write a pattern - where to place stitch markers logically, when to switch tensions, knowing yourself well enough to know how crowded you can let the needle get before you’re going to fuck it up… Maybe we could apply “AI” to the mechanical movements and have it create an object from a pattern, but I think going backwards would be significantly harder.

            Often I feel like “AI” stuff ignores a lot of the technical aspects of these crafts in general. It only has access to the visual/audio information that has been uploaded to the internet in some form, and I doubt most learning happens that way. I watched hours of knitting videos and understood absolutely nothing; I attended an in-person class and it stuck. Even if “AI” can model the outcomes of the creations, it can’t really be trained on the creative process.