I am so tired of people, especially people who pretend to be computer experts online, completely failing to understand what Moore’s Law is.
Moore’s Law != “Technology improves over time”
It’s an observation that semiconductor transistor density roughly doubles every ~2 years. That’s it. It doesn’t apply to anything else.
And also for the record, Moore’s Law has been dead for a long time now. Getting large transistor density improvements is hard.
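For concreteness, the whole observation fits in a couple of lines. Here's a minimal sketch (the 1971 baseline of ~2,300 transistors, roughly the Intel 4004, and the perfectly clean 2-year doubling are simplifying assumptions for illustration, not claims about any specific chip):

```python
# Toy sketch of "transistor density roughly doubles every ~2 years".
# Baseline (~2,300 transistors in 1971, roughly the Intel 4004) and the
# perfectly regular 2-year cadence are illustrative assumptions.
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for y in (1971, 1991, 2011, 2021):
    print(y, f"{transistors(y):,.0f}")
```

That lands in roughly the right ballpark for flagship chips until fairly recently, and it's exactly the curve that has been flattening out.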
I’m gonna go on “no stupid question” and ask why my old hard drives aren’t doubling in size.
You need to properly feed, water and fertilize them. If you don’t do this, your old hard drives will just waste away until they’re just a few megabytes, not flourish into giant petabyte trees.
Did you try evolving them?
You have to walk around in the right environment otherwise they’re all going to turn into generic eevees, and you don’t want that
It might be from a brand that doesn’t evolve, or only has one evolution instead of two.
Also, the improvements in computer speed that came alongside Moore's Law were really from Dennard scaling: as transistors shrink, their operating voltage and capacitance shrink too, so you can clock them faster without increasing power density. That scaling broke down in the mid-2000s.
Heat dissipation has been the bottleneck for a long time now.
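For anyone curious, the classic Dennard argument can be sketched with the idealized textbook scaling factors (ideal-case numbers, not what real processes actually achieved):

```python
# Idealized Dennard scaling: shrink linear dimensions by k < 1; in the ideal
# case voltage and capacitance shrink by k while clock frequency rises by 1/k.
# Dynamic power per transistor ~ C * V^2 * f, and transistor area shrinks by k^2.
def dennard(k, C=1.0, V=1.0, f=1.0, area=1.0):
    C2, V2, f2, area2 = C * k, V * k, f / k, area * k**2
    power = C2 * V2**2 * f2
    return power, power / area2  # power per transistor, power density

for k in (1.0, 0.7, 0.5):
    p, d = dennard(k)
    print(f"scale {k}: power/transistor {p:.2f}, power density {d:.2f}")
```

Power density stays flat in the ideal case, which is why clocks could keep climbing for decades; once voltage stopped scaling (leakage), that broke down and heat became the wall.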
Sure, but also no.
Moore's Law is, at the most fundamental level, an observation about the exponential curve of technological progress.
It was originally about semiconductor transistors, and that is what Moore was specifically looking at, but the observed pattern does 100% apply to other things.
In modern language, the way language is used and perceived determines its meaning, not its origins.
In modern language, the way language is used and perceived determines its meaning, not its origins.
So we should start calling monitors computers, desktop towers modems (or CPUs, or hard drives), wifi the internet, browsers search engines, and search engines browsers. None of this is incorrect, according to the average person.
Moore's Law is, at the most fundamental level, an observation about the exponential curve of technological progress.
No. Let me reiterate:
Moore’s Law was an observation that semiconductor transistor density roughly doubles every ~2 years.
It is not about technological progress in general. That's just how the term gets incorrectly applied by a small subset of people online who want to sound like they're being technical.
Moore’s Law is what I described above. It is not “technology gets better”.
I meant that sentence quite literally; semiconductors are technology. My perspective is that the original "Moore's Law" is only a single example of what many people will understand when they hear the term in a modern context.
At some point we're debating semantics, and those are subjective, local and sometimes cultural. I'd prefer to avoid spending energy fighting about that.
Instead I'll provide my own line of thinking toward what I consider a valid use of the term outside semiconductors. I am open to suggestions if there is better language.
From my own understanding, I observe a pattern where technology (mostly digital technology, but that could be exposure bias) keeps improving at an increasingly fast rate. The mathematical term is exponential.
To me, seeing such a pattern is vital to understanding what's going on. Humans are not designed to extrapolate exponential curves. A good example is AI, which largely still sucks today, but the historical numbers don't lie about the potential.
I have a rather convoluted way of speaking; it's very impractical.
Language, at best, should just get the message across in an effective manner.
I invoke (reference) Moore's Law to refer to the observation of exponential progress. Usually this gets my point across very effectively (not that it comes up often in my everyday life).
To me, Moore's Law in semiconductors is the first and original example of the pattern. The fact that this interpretation is subjective has never been relevant to getting my point across.
In modern language, the way language is used and perceived determines its meaning, not its origins.
This is technically correct but misleading in this context, given that it falsely implies that the original meaning (doubling transistor density every 2y) became obsolete. It did not. Please take context into account. Please.
Furthermore, you're missing the point. The other comment is not just picking on words, but highlighting that people bring up "it's Moore's Law" to babble inane predictions about the future. That's doubly true when people assume (i.e. make shit up) that "doubling every 2y" applies to other things, and/or that it's predictive in nature instead of just observational. Cue the OP.
Please take context into account. Please.
(this is a lil’ lemmy thread and I think everyone understands what OP had in mind)
but the observed pattern does 100% apply to other things.
Sure, if you retroactively go back and look for patterns where it matches something, but that isn't a very useful exercise.
Moore’s law is about circuit density, not about storage, so the premise is invalidated in the first place.
There is research being done into 5D storage crystals, where a disc can theoretically hold up to 360TB of data, but don’t hold your breath about them being available soon.
That would certainly benefit my Plex server setup
I always thought holographic 3D discs were going to be a really cool medium back in the infancy days of Blu-ray and HD DVD. I can't believe it's been over a decade since the company behind it went bankrupt.
Probably a stupid question, but how can the crystals be 5D if our universe is (at a meaningful scale) 4D?
Not a stupid question at all. Here’s the Wikipedia article for it. The significant part is this:
The 5-dimensional discs [have] tiny patterns printed on 3 layers within the discs. Depending on the angle they are viewed from, these patterns can look completely different. This may sound like science fiction, but it’s basically a really fancy optical illusion. In this case, the 5 dimensions inside of the discs are the size and orientation in relation to the 3-dimensional position of the nanostructures. The concept of being 5-dimensional means that one disc has several different images depending on the angle that one views it from, and the magnification of the microscope used to view it. Basically, each disc has multiple layers of micro and macro level images.
That’s fucking dope.
Wavelength could add a dimension. For example, if you have an optical disc (2D) that can be read and written separately by red and blue lasers, that makes it 3D.
That’s neat, so it’s using a trick of mathematics and physics to store info in greater dimensionality than just what the physical universe is limited to? Kinda like how we can use coordinates to represent 4d points on a graph even if we can’t really visualize it?
Yes. Generally, “three dimensions” refers to three spatial dimensions: left/right, up/down, forward/backward. And then the fourth dimension is usually time. But if you’re not talking about movement in space/time, you can have as many dimensions as you want. For example, in a video game, you can have movement in three dimensions, but you could also allow the player to move through time (fourth dimension), change characters that interact with the world differently (fifth), and so on.
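As a toy illustration of that: a "dimension" is just another independent parameter you can vary per stored element. For the 5D discs, that's three spatial coordinates plus the size and orientation of the nanostructure. The field names and units below are made up purely for illustration:

```python
# Hypothetical address of one "voxel" in a 5D optical medium:
# 3 spatial coordinates + 2 physical properties = 5 independent dimensions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Voxel:
    x_um: float             # spatial position (micrometres)
    y_um: float
    layer: int              # which internal layer (the third spatial dimension)
    size: float             # nanostructure size: one extra data "dimension"
    orientation_deg: float  # nanostructure orientation: another one

print(Voxel(x_um=12.5, y_um=3.0, layer=2, size=0.8, orientation_deg=45.0))
```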
This is true, but…
Moore’s Law can be thought of as an observation about the exponential growth of technology power per $ over time. So yeah, not Moore’s Law, but something like it that ordinary people can see evolving right in front of their eyes.
So a $40 Raspberry Pi today runs benchmarks 4.76 times faster than a multimillion dollar Cray supercomputer from 1978. Is that Moore's Law? No, but the bang/$ curve probably looks similar to it over those decades.
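Rough numbers on that, as a sketch: the ~$8M Cray price tag and the ~45-year gap are my own plugged-in assumptions; only the 4.76x benchmark figure comes from the claim above.

```python
import math

cray_price, pi_price = 8_000_000, 40  # assumed prices, in dollars
speedup = 4.76                        # claimed Pi-vs-Cray benchmark ratio
years = 45                            # roughly 1978 to the early 2020s

gain = speedup * cray_price / pi_price  # performance per dollar
doublings = math.log2(gain)
print(f"~{gain:,.0f}x bang/$, about {doublings:.0f} doublings, "
      f"one every ~{years / doublings:.1f} years")
```

So the bang/$ curve really does come out Moore's-Law-shaped, even though it isn't Moore's Law.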
You can see a similar curve when you look at data transmission speed and volume per $ over the same time span.
And then there's storage. Going from 5 1/4" floppy disks, or effing cassette drives, back on the earliest home computers, or the round tapes we used to cart around when I started working in the 80s, which had a capacity of around 64KB, to microSD cards with multi-terabyte capacity today.
Same curve.
Does anybody care whether the storage is a tape, or a platter, or 8 platters, or circuitry? Not for this purpose.
The implication of "That's not Moore's Law" is that the observation isn't valid. Which is BS. Everyone understands that the true wonderment is how your bang/$ goes up exponentially over time.
Even if you're technical, you have to understand that this factor drives the applications.
Why aren’t we all still walking around with Sony Walkmans? Because small, cheap hard drives enabled the iPod. Why aren’t we all still walking around with iPods? Because cheap data volume and speed enabled streaming services.
While none of this involves counting transistors per inch on a chip, it's actually more important/interesting than Moore's Law. Because it speaks to how the power of the technology available for everyday uses is exploding over time.
Moore’s law factored in cost, not just what was physically possible.
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year. Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years.
About 5 years ago I pirated all the games ever normally published for my childhood gaming system and my friend's different gaming system.
If I went to the past and told my younger self that, and that it all fits on a medium the size of a pinky fingernail, he wouldn't have believed me. It's just so far out there.
Yeah, taken as a guideline and observation that computer speeds/storage/etc. continue to improve, I think it's fair. It may not always be a doubling, but it's still significantly different from other physical processes that have "stagnated" by a similar metric (like top speed or miles per gallon on an average vehicle).
Hard drive density has stagnated. There haven't been any major technology breakthroughs since 750GB PMR drives came out in 2006. Most of the capacity improvements since then have come from minor materials improvements and stacking increasing numbers of platters per drive, which has reached its limit. The best drives we have, at 24TB, have 10 platters, while drives in the 2000s only had 1-4 platters.
Meanwhile, semiconductors have been releasing new manufacturing processes every few years and haven’t stopped.
Moore’s Law somewhat held for hard drives up until 2010, but since then it has only been growing at a quarter of the rate.
Right now there are only 24TB HDDs, with 28TB enterprise options available with SMR. The big breakthrough that may be coming next year is HAMR, which would allow for 30TB drives. Meanwhile, 60TB 2.5"/E3.S SSDs are now pretty common in the enterprise space, with some niche 100TB SSDs also available in that form factor.
I think if HAMR doesn't catch on fast enough, SSDs will start to outcompete HDDs on price per terabyte. We will likely see 16TB M.2 SSDs very soon. Street prices for M.2 drives are currently $45/TB compared to $14/TB for HDDs. Only about a 3:1 advantage for HDDs, or less than 4 years in Moore's Law terms.
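Spelling out that last bit of arithmetic, using the street prices quoted above:

```python
import math

ssd_per_tb, hdd_per_tb = 45, 14  # street prices, $/TB
ratio = ssd_per_tb / hdd_per_tb  # ~3.2x in HDDs' favour
print(f"{ratio:.1f}x gap = {math.log2(ratio):.2f} halvings "
      f"= about {2 * math.log2(ratio):.1f} years at one halving per ~2 years")
```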
Many enterprise customers have already switched over to SSDs after considering speed, density, and power, so if HDDs don’t keep up on price, there won’t be any reason to choose them over SSDs.
sources: https://youtu.be/3l2lCsWr39A https://www.tomshardware.com/pc-components/hdds/seagates-mozaic-3-hamr-platform-targets-30tb-hdds-and-beyond
I've only looked at the consumer space, and all I've noticed is that SSD prices were finally going down after stagnating for years, but then the manufacturers decided prices were too low and intentionally slowed down production to push them back up, so prices are actually higher than they were a year ago.
Sometimes the prices go up, but they steadily go down over time.
This chart is really good for seeing storage prices
Right, over the long term prices go down, but it still greatly annoys me that they jacked up prices in the short term. Thankfully I have no need to purchase any storage and won’t for years.
That chart doesn’t really show the recent price hike. Late last year, I bought an 8TB Samsung SATA SSD for $350. If I wanted to buy that same drive today, it would be $630.
Do you have to archive all the porn on the Internet?
“We do these things not because they are easy. But because we are hard!” -JFK
Did I just pave the way for the greatest joke today?
I gave the subject a check. From Tom’s Hardware, industry predictions are like:
Year: capacity (TB)
2022: 1~22
2025: 2~40
2028: 6~60
2031: 7~75
2034: 8~90
2037: 10~100

Or, doubling roughly every 4 years. Based on that, state-of-the-art disks would be at roughly 500TB in 2040. Make it ~2050 for affordable external storage.
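Here's a sketch of that extrapolation. The ~22TB top end in 2022 and the 4-year doubling are both read off the trend above, so this is guesswork stacked on guesswork:

```python
# Crude extrapolation: top-end HDD capacity doubling every ~4 years from ~22TB in 2022.
def projected_tb(year, base_year=2022, base_tb=22, doubling_years=4):
    return base_tb * 2 ** ((year - base_year) / doubling_years)

for y in (2030, 2040, 2050):
    print(y, f"~{projected_tb(y):.0f} TB")  # ~88, ~498, ~2816
```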
However, note that this is extrapolation over a future estimation, and the estimation itself is also an extrapolation over past trends. Might as well guess what I'm going to have for lunch exactly one year from now; it'll be about as accurate.
To complicate things further, you currently have competition between two main techs: spinning disks vs. solid state. SSDs might be evolving at a different pace, and as your typical SSD has less capacity, it might even push the average for consumers back a bit (as they swap HDDs for SSDs with slightly lower capacity).
And I was impressed by Seagate launching their Mozaic 3+ 32TB HDDs…
That’s honestly intense. I would be terrified of having that much data in one place
While not hard drives: at $dayjob we bought a new server with 16 x 64TB NVMe drives. We don't even need the speed of NVMe for this machine's role. It was the density that was most appealing.
It feels crazy having a petabyte of storage (albeit with some lost to RAID redundancy). Is this what it was like working in tech up until the mid-00s, with significant jumps just turning up?
This is exactly what it was like, except you didn’t need it as much.
Storage used to cover how much a person needed and maybe 2-8x more, then datasets shot upwards with audio/MP3, then video, then again with AI.
Well hell, it’s not like it’s your money.
A petabyte of SSDs is probably cheaper than a petabyte of HDDs when you account for rack costs, electricity costs, and maintenance.
Not a problem I’ve ever faced before, admittedly
The size increase in hard drives around that time was insane. Compared to the mid-90s, just a decade earlier, hard drive capacities increased around 100 times. On average, drive capacities were doubling every year.
Then things slowed down. In the past 20 years, we’ve maybe increased the capacities 30-40 times for hard drives.
Flash memory, on the other hand, is a different story. Sometime around 2002-3 or so I paid something like $45 for my first USB flash drive - a whole 128MB of storage. Today I can buy one that’s literally 1000 times larger, for around a third of that price. (I still have that drive, and it still works too!)
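Rough math on that anecdote, taking "a third of the price" as about $15 and calling it ~21 years elapsed (both approximations on my part):

```python
import math

gb_per_dollar_then = 0.128 / 45  # 128MB for $45, around 2003
gb_per_dollar_now = 128 / 15     # ~1000x larger for ~a third the price
gain = gb_per_dollar_now / gb_per_dollar_then
years = 21
print(f"~{gain:,.0f}x more flash per dollar, "
      f"doubling every ~{years / math.log2(gain):.1f} years")
```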
I guess you’re expected to set those up in a RAID 5 or 6 (or similar) setup to have redundancy in case of failure.
Rebuilding after a failure would be a few days of squeaky bum time though.
Absolutely not. At those densities, the write speed isn’t high enough to trust to RAID 5 or 6, particularly on a new system with drives from the same manufacturing batch (which may fail around the same time). You’d be looking at a RAID 10 or even a variant with more than two drives per mirror. Regardless of RAID level, at least a couple should be reserved as hot spares as well.
EDIT: RAID 10 doesn't necessarily rebuild any faster than RAID 5/6, but the write speed is relevant because it determines the total time to rebuild. That determines the likelihood that another drive in the array fails (more likely during a rebuild due to added drive stress). With RAID 10, it's less likely that second failure will be in the same span. Regardless, it's always worth restating that RAID is no substitute for your 3-2-1 backups.
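For a sense of the rebuild window, here's a crude estimate: capacity divided by sustained write speed is the best case, and real rebuilds under load are slower. The drive sizes and speeds below are illustrative assumptions, not measurements.

```python
def rebuild_hours(capacity_tb, write_mb_s):
    # best case: stream the whole drive at its sustained write speed
    return capacity_tb * 1e12 / (write_mb_s * 1e6) / 3600

print(f"64TB NVMe @ 1500 MB/s: ~{rebuild_hours(64, 1500):.0f} h")  # ~12 h
print(f"24TB HDD  @  200 MB/s: ~{rebuild_hours(24, 200):.0f} h")   # ~33 h
```

Either way, that's a long window for a second failure, which is the whole argument for mirrors plus hot spares plus real backups.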
Yeah, I have 6 14TB drives in RAID 10; I'll get 2 more if I need it.
With RAID 6, rebuilds are 4.2 roentgens: not great, but not horrible. I keep old backups, but the data isn't irreplaceable anyway.
RAID 5 is suicide if you care about your data.
I’m more shocked how little I need extra space!
I’m rocking an ancient 1TB for backups. And my main is a measly 512GB SSD.
But I don’t store movies anymore, because we always find what we want to see online, and I don’t store games I don’t actively use, because they are in my GOG or Steam libraries.
With 1 gigabit per second internet, it only takes a few minutes to download anyway. Come to think of it, my phone has almost as much usable space, with its 512GB of internal storage. 😋
Maybe I'm a fringe case, IDK. But it's been a long time since storage ceased to be a problem. I download both Windows and Linux offline installers when I buy games at gog.com; it's one of the reasons I buy there.
I can understand that having your own copy is nice, especially if the service is closed for some reason.
I just don’t bother doing that anymore, I prefer browsing my library on GOG instead of a file-manager.
2028: ~363TB
2029: ~439TB
2030: ~531TB
This is what I came up with.
Source?
Data from various searches,
https://www.oceanclub.org/h5_en/post/info/id/1545 https://www.thestack.technology/the-evolution-of-storage-alex-mcmullan-pure-storage/ https://blocksandfiles.com/2024/08/16/the-128tb-ssd/ https://www.trendforce.com/presscenter/news/20240913-12303.html
I guesstimated a ~20% growth rate
This was just mental guess work. I’m not claiming I know.
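For what it's worth, the year-over-year growth implied by those numbers works out to about 21%, so the ~20% guess is at least self-consistent:

```python
pairs = [(2028, 363), (2029, 439), (2030, 531)]  # TB figures from above
for (y1, c1), (y2, c2) in zip(pairs, pairs[1:]):
    print(f"{y1} -> {y2}: {(c2 / c1 - 1) * 100:.0f}% growth")
```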
Just attempting to format those in a more readable way.
We can argue as much as we want about whether Moore's Law covers technological development in general, or be pedantic like good old fundamentalist Christians and only read what the words say.
The bigger problem is that we have reached the era of what we could tentatively call "wal s'eroom". Thanks to enshittification (another one of those slippery words!), I predict that technological progress now runs in reverse, halving every 2 years.