One thing to remember is that arrays are essentially pointers under the hood: an array decays to a pointer to its first element. Even if you don't use raw pointers directly, or always use smart pointers like `std::shared_ptr<T>`, they're still there.
For example, indexing into the following array is really just pointer arithmetic:

```c++
int foo[3] = {1, 2, 3}; // foo decays to int*, but its type (int[3]) also carries the size
foo[1] == 2;            // index syntax
*(foo + 1) == 2;        // equivalent pointer arithmetic
```
Realistically, in modern C++ you could likely avoid raw pointers entirely. C++ references cover a ton of the pointer use cases. I'd say the main goal is to prevent data from being copied around needlessly, since that costs both time and memory bandwidth.
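As a rough illustration of that last point, here's a minimal sketch (the `SensorData` struct and function names are made up for this example) of passing by const reference instead of by value, so the underlying vector never gets copied:

```c++
#include <numeric>
#include <vector>

// Hypothetical payload type used only for this example.
struct SensorData {
    std::vector<double> samples; // potentially large buffer
};

// Pass by value: the whole vector is copied on every call.
double average_by_value(SensorData data) {
    return std::accumulate(data.samples.begin(), data.samples.end(), 0.0) /
           data.samples.size();
}

// Pass by const reference: no copy, and the caller's data can't be modified.
double average_by_ref(const SensorData& data) {
    return std::accumulate(data.samples.begin(), data.samples.end(), 0.0) /
           data.samples.size();
}

int main() {
    SensorData data{std::vector<double>(1'000'000, 1.0)};
    double a = average_by_value(data); // copies ~8 MB of doubles
    double b = average_by_ref(data);   // copies nothing
    return (a == b) ? 0 : 1;
}
```

Same result either way; the by-reference version just skips the copy, which is usually what you want for anything bigger than a couple of machine words.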
We got nerd sniped at almost the exact same time, but approached this in very different ways. I applaud your practical approach, but based on what I calculated, you should stop now. It will never reach 99.999%.
A few calculations:
- There are 9,592 prime numbers less than 100,000 (a quick sieve check of this count is sketched after this list). Assuming the test suite only tests the numbers 1-99,999, the accuracy should actually be only 90.408% (90,407 correct out of 99,999), not 95.121%.
- The 1 trillionth prime number is 29,996,224,275,833. This means that even testing every number up to roughly 30 trillion (which contains only the first trillion primes) would only get you to 96.667% accuracy.
- The density of primes near x can be approximated using the Prime Number Theorem as 1/ln(x). Solving 99.9995 = 100 - 100/ln(x) for x gives e^200000, or about 7.88 × 10^86858. In other words, the universe will end before any current computer could check that many numbers.
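For anyone who wants to double-check the first bullet, here's a small sketch (a plain sieve of Eratosthenes, nothing specific to the model being discussed) that counts the primes below 100,000 and works out the accuracy of a classifier that just answers "not prime" for everything:

```c++
#include <cstdint>
#include <iostream>
#include <vector>

int main() {
    const int limit = 100'000;

    // Sieve of Eratosthenes over 0..99,999.
    std::vector<bool> is_prime(limit, true);
    is_prime[0] = is_prime[1] = false;
    for (int i = 2; i * i < limit; ++i) {
        if (is_prime[i]) {
            for (int j = i * i; j < limit; j += i) {
                is_prime[j] = false;
            }
        }
    }

    // Count primes below 100,000 (expected: 9,592).
    std::int64_t prime_count = 0;
    for (int i = 2; i < limit; ++i) {
        if (is_prime[i]) ++prime_count;
    }

    // Accuracy of always answering "not prime" on the numbers 1..99,999.
    const std::int64_t tested = limit - 1;              // 99,999 numbers
    const std::int64_t correct = tested - prime_count;  // every non-prime, including 1
    std::cout << "primes below 100,000: " << prime_count << '\n'
              << "always-'not prime' accuracy: "
              << 100.0 * correct / tested << "%\n";     // ~90.408%
    return 0;
}
```

The same always-"not prime" baseline is what drives the other two bullets: the primes thin out as you go further, but nowhere near fast enough to reach five nines.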
To be fair, I used to work there, and not even Microsoft understands their docs.