Early personal computers didn’t have hard drives. Instead, they relied on floppy disks to store the few hundred kilobytes of data they needed. In 1981, Apple introduced its first hard drive, the ProFile, which held 5MB of data and retailed for $3,499. That equates to roughly $700,000 per gigabyte of storage, and a gigabyte is barely enough to hold a few photos today.
Moore’s Law, now famous in the technology community, informally holds that computer performance, or more precisely transistor density, doubles roughly every two years. A similar principle, Kryder’s Law, has governed storage for the last few decades and has actually outpaced Moore’s Law, according to Scientific American:
Over the years there has been a lot of talk about Moore's Law and the way that doubling the power and memory of computer semiconductors every 18 months has driven technological advance. But from where Mark Kryder sits, another force is at least as powerful, perhaps more: the cramming of as many bits as possible onto shrinking magnetic hard drives.
The 61-year-old engineer might be on to something. Since the introduction of the disk drive in 1956, the density of information it can record has swelled from a paltry 2,000 bits to 100 billion bits (gigabits), all crowded in the small space of a square inch. That represents a 50-million-fold increase. Not even Moore's silicon chips can boast that kind of progress.
At some point, both performance gains and falling storage costs will have to level off. For now, though, cloud storage services such as Dropbox routinely offer 2GB of storage for free, and 1-terabyte external hard drives, which might have cost as much as $700 million 30 years ago, are available for less than $100.
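The dollar figures above can be sanity-checked with a few lines of arithmetic. This sketch assumes decimal units (1GB = 1,000MB, 1TB = 1,000GB) purely for simplicity:

```python
# Apple ProFile (1981): 5MB for $3,499
profile_price = 3499        # USD
profile_capacity_mb = 5     # megabytes

cost_per_mb = profile_price / profile_capacity_mb   # about $700 per MB
cost_per_gb = cost_per_mb * 1000                    # about $700,000 per GB
cost_per_tb = cost_per_gb * 1000                    # about $700 million per TB

# Areal-density growth quoted from Scientific American:
# 2,000 bits to 100 billion bits per square inch
density_growth = 100_000_000_000 / 2_000            # a 50-million-fold increase

print(f"${cost_per_mb:,.0f}/MB, ${cost_per_gb:,.0f}/GB, ${cost_per_tb:,.0f}/TB")
print(f"Density growth: {density_growth:,.0f}x")
```

Both checks line up with the article's claims: a terabyte at ProFile prices lands near $700 million, and the density figures do work out to a 50-million-fold increase.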