Here's an unsigned 32-bit integer's maximum value (2^32 − 1):
4,294,967,295
Pretty easy to visualize, as it's roughly 4.3 billion possible values. And in a crude bid to bring some frame of reference to it, we could say that many bits is 536,870,912 bytes, or 512 MiB. Or we could say it could address 4,294,967,296 possible bytes, which is why 32-bit systems can only address up to about 4 GiB of RAM.
But unsigned 64-bit numbers' possible combinations (pardoning any off-by-one errors I may have in this post), whoa-boy...:
18,446,744,073,709,551,616
That number is fucking huge, and with it you could have a system with up to 16,777,216 TiB of addressable RAM.
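Sanity-checking the numbers above in a couple lines of Python (binary units, since we're talking address spaces):

```python
# Quick sanity check of the sizes above.
UINT32_MAX = 2**32 - 1      # largest unsigned 32-bit value
addressable_32 = 2**32       # bytes a 32-bit pointer can reach
addressable_64 = 2**64       # bytes a 64-bit pointer can reach

print(f"{UINT32_MAX:,}")                        # 4,294,967,295
print(f"{addressable_32 // 2**30} GiB")         # 4 GiB
print(f"{addressable_64 // 2**40:,} TiB")       # 16,777,216 TiB
```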
I don't even know how to begin visualizing that or what I can even compare it to (since I can compare 4 GiB of RAM to the many GiBs of hentai that I have).
Considering how insanely huge 64-bit integers get, especially if they're unsigned, are we ever going to need to get any bigger?
Big numbers thread.
>exponential growth is big
holy shit who would have thought
>>55067085
With Moore's law coming to an end, probably not for a long time, though there are likely some scientific applications that would benefit from an architecture that can process a 128-bit value in one go.
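You don't even need special hardware to play with 128-bit values; Python's ints are arbitrary-precision, so here's a rough sketch of the two 64-bit halves such an architecture would juggle in registers (the split is just an illustration, not any real ISA):

```python
# A 128-bit unsigned value spans two 64-bit machine words;
# Python's bignum ints handle it transparently.
UINT128_MAX = 2**128 - 1
print(UINT128_MAX)  # 340282366920938463463374607431768211455

# Splitting it into the high and low 64-bit halves a CPU would juggle:
hi, lo = UINT128_MAX >> 64, UINT128_MAX & (2**64 - 1)
assert (hi << 64) | lo == UINT128_MAX
```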
>>55067085
>are we ever going to need to get any bigger?
Yes, because retarted programming will slowly require gazillions of memory.
>>55067168
we probably won't have gazillions of memory, at least not for a long time. 16,777,216 terabytes ought to be enough for anyone
>>55067133
B-but it's not my fault.
I'm a DBA for a "web scale" startup, and my boss figured a 32-bit ID field was far too small, because we could only have about 4.3 billion rows per table. And with the amount of events happening per-second, that wouldn't be nearly enough after a few months or years of operation.
So we're using a 64-bit field, and I told my boss that'd be way, WAY more than we could ever possibly need in hundreds of lifetimes, and that he didn't have to worry.
He asked me what I could compare it to, and I didn't really have an answer, and now he's skeptical about it. So here I am.
Don't blame me, I know it's impossibly high, but I'm doing it for my boss.
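If it helps with the boss: even at an absurdly high ingest rate (the million rows per second below is a made-up assumption, plug in your own), a 64-bit ID space takes geological time to exhaust:

```python
# How long to burn through 2**64 IDs at a (hypothetical) rate?
ids = 2**64
rows_per_second = 1_000_000          # assumed ingest rate
seconds_per_year = 60 * 60 * 24 * 365

years = ids / (rows_per_second * seconds_per_year)
print(f"{years:,.0f} years")  # 584,942 years
```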
>>55067253
you could easily have given the memory example. if that fails, tell him about the age of the universe or the number of grains of sand in a desert or something.
>Busy beaver function
>Graham's number
a ternary digit (trit) can hold log2(3) ≈ 1.585 bits, so a 21-trit computer could handle a larger range of values than a 32-bit computer (3^21 > 2^32).
Now imagine the ternary counterpart to a 64-bit machine: 41 trits :o
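Quick check of the trit math (log2(3) bits of information per trit, assuming you pack values the obvious way):

```python
import math

# Bits of information per trit:
bits_per_trit = math.log2(3)          # ≈ 1.585

# Trits needed to cover a 32-bit and a 64-bit range:
print(math.ceil(32 / bits_per_trit))  # 21
print(math.ceil(64 / bits_per_trit))  # 41

# And indeed 21 trits beat 32 bits, while 20 fall short:
assert 3**21 > 2**32 > 3**20
```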
>>55067196
But it's not enough for -everyone-
>>55067253
4.35 x 10^17 seconds is about the current age of the universe. If you were to insert 1 row of data every second since the beginning of the universe till now, it would take you about another 41 times that (2^64 / 4.35x10^17 ≈ 42) to fill all of the rows.
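Same back-of-the-envelope in code instead of in your head (the 4.35e17 figure is the ~13.8-billion-year age of the universe in seconds):

```python
# Rough check of the age-of-the-universe comparison.
age_of_universe_s = 4.35e17   # ~13.8 billion years in seconds
ids = 2**64                   # rows a 64-bit ID field can label

# One row per second since the Big Bang, repeated this many times
# over, fills the table:
ratio = ids / age_of_universe_s
print(f"{ratio:.0f}")  # 42
```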
>>55067168
>retarted
Millennials, everyone
>>55067085
>16,777,216 TB of RAM
Not enough.
Think bigger OP
Like a distributed Dyson swarm that reaches out past Jupiter.
>>55067085
now imagine how many different states the RAM could take: if each of those 2^64 addresses held just one bit, that's 2^18,446,744,073,709,551,616 different states. Now imagine what pussy feels like, cause that's the closest you're going to get.