AI is here to stay, but I can’t wait to see it get past the point where every app has to have its own AI shoehorned in, regardless of what the app is. Sick of it.
I think they’ll be on this for a while, since unlike NFTs this is actually useful tech. (Though not in every field yet, certainly.)
There are going to be some sub-fads related to GPUs and AI that the tech industry will jump on next. All this is speculation:
- Floating-point operations will be replaced by highly quantized integer math, which is much faster and more efficient, and almost as accurate. Some buzzword like “quantization” will be thrown at the general public; recall “blast processing” for the Sega Genesis. It will be the downfall of NVIDIA, and for a few months the reduced power consumption will have AI companies clamoring about being green. (There’s a toy sketch of what quantization means after this list.)
- (The marketing of) personal AI assistants (to help with everyday tasks, rather than just queries and media generation) will become huge; I’d put that around 2026 or so.
- You can bet that tech will find ways to deprive us of ownership of our devices and software; hard drives will get smaller to force users into the cloud. (This will have another buzzword.)
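For anyone curious what that quantization buzzword would actually mean, here’s a toy sketch with NumPy. This is a minimal symmetric int8 scheme of my own for illustration, not anyone’s production recipe:

```python
import numpy as np

# Toy symmetric int8 quantization: map float32 weights onto [-127, 127]
# integers plus a single per-tensor scale factor.
def quantize_int8(weights):
    scale = np.abs(weights).max() / 127.0  # one float kept for dequantizing
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
q, s = quantize_int8(w)
print(np.max(np.abs(w - dequantize(q, s))))  # rounding error stays tiny
```

The trade is exactly what the bullet claims: you give up a little precision for 4x smaller weights and cheap integer arithmetic.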
In this thread: people doing the exact opposite of what they do seemingly everywhere else and ignoring the title to respond to the post.
Figuring out what the next big thing will be is obviously hard; if it weren’t, investing would be trivially easy.
I feel like a lot of what has been exploding lately is ideas someone had a long time ago that have just become easier and gotten more PR. 3D printing was invented in the ’80s but had to wait for computation and cost reduction. The idea that would become neural networks is from the ’50s, and was toyed with repeatedly over the years, but ultimately the big breakthrough was just that computing became cheap enough to run massive server farms. AR goes back to the ’60s and gets trotted out, slightly better, every generation or so, but it was tech getting smaller that made it viable. What other theoretical ideas from the last century could now be done for a much lower price?
I genuinely find LLMs to be helpful with a wide variety of tasks. I have never once found an NFT to be useful.
Here’s a random little example: I took a photo of my bookcase, with about 200 books on it, and had my LLM make a spreadsheet of all the books with their title, author, date of publication, cover art image, and estimated price. I then used that spreadsheet to bulk-upload them to Facebook Marketplace. In about 20 minutes I had over 200 Facebook ads posted, one for each of my books; I only had to do a quick review of the spreadsheet to fix any glaring issues. I also had it use some marketing psychology to write attractive descriptions for the ads.
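For anyone wondering what that step actually looks like, here’s roughly the shape of it with the OpenAI Python client. The model name, prompt, and CSV columns are my own stand-ins, not necessarily what the commenter used:

```python
import base64
import csv
import json

from openai import OpenAI  # official openai package, v1+ SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Encode the bookshelf photo so it can be sent inline.
with open("bookshelf.jpg", "rb") as f:
    photo_b64 = base64.b64encode(f.read()).decode()

resp = client.chat.completions.create(
    model="gpt-4o",  # any vision-capable model works here
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text":
                "List every book visible on this shelf as a JSON array of "
                "objects with keys: title, author, year, estimated_price_usd."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/jpeg;base64,{photo_b64}"}},
        ],
    }],
)

# In practice the model may wrap the JSON in markdown, so some cleanup
# (or a JSON response format) is usually needed before parsing.
books = json.loads(resp.choices[0].message.content)

# Dump to CSV for a manual review pass before any bulk upload.
with open("books.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["title", "author", "year", "estimated_price_usd"])
    writer.writeheader()
    writer.writerows(books)
```

The manual review pass matters: as the comment says, the model gets a few titles and prices wrong, and the spreadsheet is the cheap place to catch that.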
I remember trying to investigate using crypto as a replacement for international bank transfers. The gas fees were much larger than the already greatly inflated fee my bank was charging. Another time, I used crypto to donate to a hacker whose work I liked, and realized the crypto transfer was actually more traceable once you account for know-your-customer laws and the public ledger. That was when I realized crypto was truly useless. AI is mildly useful when coding: it points me to packages I wouldn’t have heard of and provides straightforward examples. That’s the only time I use it. The tech industry and the investor class are desperate for it to be the next world-changing thing, which is leading them to slap it on everything. That will eventually wear off.
I’m waiting for the cheap graphics cards.
Guesses at the next tech-bro stuff (some already in the wild). Unfortunately, we’re not done with AI yet:
- AI teachers and tutors
- Full AI video commercials
- 3D AI experiences in VR
- AI medical diagnosis, for both consumers and insurers
- AI pricing for insurance
- AI shopping assistants: clothes, styling, decorating
- AI mid-level management to rat out people not working 60 hours a week
AI is now a catch-all acronym that is becoming meaningless. The old, conventional light switch on the wall of the house I first lived in some 70 years ago could be classified as ‘AI’. The switch makes a decision based on what position I put it in: I turn the light on, and it remembers that decision and stays on. The thing is, the decision was first made by me, and the switch carried it out, based on criteria designed into it.
That is, AI still does not make any decision that humans have not designed it to make in the first place.
What is needed is more appropriate terminology describing the actual process of what we call AI. And really, the more appropriate descriptor would not be Artificial Intelligence but Human-made Intelligent devices. All of these so-called AI devices and applications are, after all, completely human-designed and human-made. The originating intelligence still comes from the minds of humans.
Most of the applications which we call Artificial Intelligence are actually Algorithmic Intelligence - decisions made based on algorithms designed by humans in the first place. The devices just follow these algorithms. Since humans have written these algorithms, it should really be no surprise that these devices are making decisions very similar to the decisions humans would make. Duhhh. We made them in our own image, no wonder they ‘think’ like us.
Really, these AI devices do not make decisions, they merely follow the decisions humans first designed into them.
Deep Blue, the IBM chess-playing computer, plays excellent chess because humans designed it to play chess and to make chess decisions, based on how humans designed the game in the first place.
What would be really scary is if Deep Blue decided of its own volition that it no longer wanted to play chess, but wanted to play a game it designed itself.
399 responses and counting. I got bored going through them. The train, apparently, is VERY long and will indeed take a VERY long time to pass.
Fascism. Apparently.
Synthetic biology. This is a hype wave waiting to happen. Can’t wait for crops to get enshittified /s Hopefully we’ve moved beyond the Silicon Valley business model by then.
The AI hype will pass, but AI is here to stay. Current models already let us automate processes that were impossible to automate just a few years ago. Here are some examples:
- Detecting anomalies in X-ray and CT scans
- Normalizing unstructured information (see the sketch after this list)
- Information distribution in organizations
- Learning platforms
- Stock photos
- Modelling
- Animation
Note that these are just the obvious applications.
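To make the “normalizing unstructured information” item concrete, here’s a minimal sketch in Python. The input line, schema, and model name are illustrative assumptions, not a specific product:

```python
import json

from openai import OpenAI  # official openai package, v1+ SDK

client = OpenAI()

RAW = "ACME corp, invoice #4417: 12 units widget-B @ 3,50 EUR, due end of Nov"

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; any instruction-following model
    response_format={"type": "json_object"},  # ask for strict JSON back
    messages=[{
        "role": "user",
        "content": "Normalize this line into JSON with keys vendor, "
                   "invoice_number, quantity, item, unit_price_eur, due: "
                   + RAW,
    }],
)

record = json.loads(resp.choices[0].message.content)
print(record)  # e.g. {"vendor": "ACME corp", "invoice_number": 4417, ...}
```

The point is less the specific API and more that messy free text (emails, invoices, notes) can now be mapped into a fixed schema without hand-written parsers for every format.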
AND the huge AR/metaverse wave!
I’m seeing foldable phones and tablets in lots of movies, like it’s amazing tech people can’t wait for, when in reality they’re spinning their wheels trying to get you to keep buying a new $1,500 phone every year. This one has a new button!!!
Killing the poor