Beyond recognizing the laureates’ inspirations from condensed-matter physics and statistical mechanics, the prize celebrates interdisciplinarity. At its core, it is about how elements of physics have driven the development of computational algorithms that mimic biological learning, shaping how we make discoveries across STEM today.
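To make that physics connection concrete, here is a minimal sketch of the model the prize cites: a Hopfield network storing a pattern with the standard Hebbian outer-product rule and recalling it by energy minimization. The network size, noise level, and step count below are arbitrary illustration choices, not anything from the laureates’ work.

```python
import numpy as np

# Minimal Hopfield network sketch (illustrative, not the laureates' code):
# store a binary pattern with the Hebbian outer-product rule, then recover
# a corrupted copy by iterating the update rule, which monotonically lowers
# the spin-glass-style energy E = -0.5 * s @ W @ s.

rng = np.random.default_rng(0)

def train(patterns):
    # patterns: array of shape (num_patterns, num_units), entries +/-1
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, state, steps=10):
    state = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(state)):  # asynchronous updates
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

pattern = rng.choice([-1, 1], size=32)
W = train(pattern[None, :])

noisy = pattern.copy()
flip = rng.choice(32, size=8, replace=False)
noisy[flip] *= -1  # corrupt a quarter of the bits

print("recovered:", np.array_equal(recall(W, noisy), pattern))
```

The point of the toy: recall here is literally relaxation toward a low-energy configuration, the same mathematics physicists use for spin glasses.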
A physics Nobel Prize awarded for a computer science achievement; actual physics is having a dry spell, I guess.
They explain the flex at least
Seems like a pretty extreme flex, I’m worried it’ll snap.
If they award a Nobel for materials science, this should win.
To be fair, regardless of one’s stance on the utility of current AI or the wisdom of developing it, it is an extremely difficult and potentially world-changing technical achievement, and given there isn’t a computer science prize, physics is probably the most relevant category for it.
Not really. A lot of the techniques have been known for decades; what we didn’t have back then was insane compute power.
And there’s the Turing Award for computer science.
Insane compute wasn’t everything. Hinton helped develop the techniques (greedy layer-wise pretraining of deep belief nets, circa 2006) that allowed more data to be processed in more layers of a network without totally losing coherence. Before then it was more of a toy, because it capped out in how much data could be used and how many layers of a network could be trained, and I believe even in whether GPUs could be used efficiently for ANNs, but I could be wrong on that one.
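Roughly what that looks like, as a toy sketch (not Hinton’s actual code; the layer sizes, learning rate, and epoch count are made-up placeholders, and biases and hidden-state sampling are omitted for brevity):

```python
import numpy as np

# Toy sketch of greedy layer-wise pretraining: train one restricted
# Boltzmann machine (RBM) at a time with one step of contrastive
# divergence (CD-1), then feed its hidden activations to the next layer.

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=5, lr=0.1):
    n_visible = data.shape[1]
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    for _ in range(epochs):
        v0 = data
        h0 = sigmoid(v0 @ W)        # positive phase
        v1 = sigmoid(h0 @ W.T)      # one-step reconstruction
        h1 = sigmoid(v1 @ W)        # negative phase
        W += lr * (v0.T @ h0 - v1.T @ h1) / len(data)  # CD-1 update
    return W

def pretrain_stack(data, layer_sizes):
    # Train each RBM on the previous layer's hidden probabilities.
    weights, x = [], data
    for n_hidden in layer_sizes:
        W = train_rbm(x, n_hidden)
        weights.append(W)
        x = sigmoid(x @ W)
    return weights

toy_data = (rng.random((100, 64)) < 0.3).astype(float)  # fake binary inputs
stack = pretrain_stack(toy_data, [32, 16])
print([W.shape for W in stack])  # [(64, 32), (32, 16)]
```

The design point: each RBM is shallow and trainable on its own, and stacking them gives the deep net a good initialization, which is the usual account of why this unlocked networks with more layers.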
Either way, after Hinton’s research in ~2010-2012, problems that seemed extremely difficult to solve (e.g., classifying images and identifying objects in images) became borderline trivial, and in under a decade ANNs went from an almost fringe technology that many researchers saw as a toy useful for a few problems to basically dominating all AI research and CS funding. In almost no time, every university suddenly needed machine learning specialists on payroll, and now, about 10 years later, every year we are pumping out papers and tech that seemed many decades away… Every year… In a very broad range of problems.
The GTX 580 and CUDA made a big impact, but Hinton’s work was absolutely pivotal in being able to utilize that, and in even making ANNs seem feasible at all, and it was an overnight thing. Research very rarely explodes this fast.
Edit: I guess it’s also worth clarifying that Hinton was one of the few people researching these techniques in the 80s and has continued being a force in the field, so these big leaps are the culmination of a lot of old, but also very recent, work.
I’m here to remind you that for the last 20-ish years, half the time the chemistry Nobel has gone to biologists, and now they’ve doubled down on AI wankery by giving it to AlphaFold.
To be fair, AlphaFold is pretty incredible. I remember when it was first revealed (but before they open-sourced parts of it), the scientific community was shocked by how effective it was and assumed it was going to be technologically far more complex than it ended up being. Systems biologist Mohammed AlQuraishi captures this quite well in this blog post.
I’m a biochemist who has more interest in the computery side of structural biology than many of my peers, so I often have people asking me stuff like “is AlphaFold actually as impressive as they say, or is it just more overhyped AI nonsense?”. My answer is “Yes.”