Is base clock speed still relevant for TurboBoost processor performance?
Solution 1:
"Turbo Boost" only kicks in when the CPU is under very high utilization and, depending on power profile, the CPU decides that it can do better with higher clock speed. On laptops you might not see turbo boost having as much of an effect, especially on battery power, because the software or firmware might not want to use turbo boost and its accompanying energy consumption because it'd burn through your battery too fast.
On desktop computers, assuming you aren't worried about your electric bill, you can set your power profile to "performance", which should allow Turbo Boost to kick in whenever it would be useful and keep the CPU at its full base clock speed the rest of the time.
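If you want to verify which profile is actually in effect, here's a minimal sketch; it's Linux-only and assumes the cpufreq sysfs interface is exposed (on Windows you'd check the active power plan instead):

```python
# Minimal sketch (Linux-only, assumes the cpufreq sysfs interface exists):
# read the active scaling governor and the current core frequency to see
# whether the "performance" profile is in effect.
from pathlib import Path

CPUFREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq")

governor = (CPUFREQ / "scaling_governor").read_text().strip()
cur_khz = int((CPUFREQ / "scaling_cur_freq").read_text())
max_khz = int((CPUFREQ / "cpuinfo_max_freq").read_text())

print(f"governor: {governor}")
print(f"current:  {cur_khz / 1e6:.2f} GHz (hardware max {max_khz / 1e6:.2f} GHz)")
```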
Here's something to consider.
Assuming the following:

- Both processors have equally deep instruction pipelines.
- The speculative execution engine on both processors is the same (generally only true of CPUs from the same generation).
- The processors have the same number of hardware threads (cores and HT).
- The processors have the same Thermal Design Power (TDP).
Then we should expect that, when Turbo Boost isn't engaged (e.g. under a modest load), the processor with the higher base clock will get more work done, faster, for the same amount of energy.
This is not always true, and I'm oversimplifying a bit, because other factors can cause my assumptions to miss the whole picture, but this is the general idea.
To take it to an extreme, if you had an old 486 processor that had the same TDP as a Core i7 but only operated at 30 MHz, you better believe the i7 @ 2.6 GHz will be worlds faster, assuming that (somehow) both CPUs were otherwise equal in architecture / pipeline / caching / etc.
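To put a rough number on that extreme case, here's a back-of-the-envelope sketch; the IPC figure is a made-up placeholder, since the whole point of the assumption is that both chips retire the same number of instructions per cycle:

```python
# Back-of-the-envelope throughput comparison: with identical IPC (an
# assumption, not a real figure), work per second scales with clock rate.
ipc = 2.0            # hypothetical instructions retired per cycle (same for both)

clock_486 = 30e6     # 30 MHz
clock_i7 = 2.6e9     # 2.6 GHz

print(f"486: {ipc * clock_486:.2e} instructions/s")
print(f"i7:  {ipc * clock_i7:.2e} instructions/s")
print(f"speedup from clock alone: {clock_i7 / clock_486:.0f}x")
```

Clock ratio alone predicts roughly an 87x gap; the real-world difference would be far larger once IPC, caches, and memory bandwidth stop being equal.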
Since most typical desktop applications (browser, word processing, email) won't trigger turbo mode, you might expect very slight improvements in processing time from a faster base clock, but 0.3 GHz is not really anything to write home about. If it were 1 GHz, I'd say maybe you could notice. But remember, if the CPU is pegged at 100% utilization for any substantial amount of time, Turbo Boost will probably kick in, and once that happens both CPUs are operating at the same clock rate, so any difference in performance is negligible (assuming, as I said, that the other factors are equal across the CPUs).
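If you want to watch this happen on your own machine, here's a minimal sketch that pegs a core and reads the reported frequency; it's Linux-only, uses the same sysfs files as above, and the busy loop isn't pinned to cpu0, so treat the reading as indicative rather than exact:

```python
# Minimal sketch (Linux-only): spin for a couple of seconds and sample the
# reported frequency mid-loop to see whether turbo engages under load.
import time
from pathlib import Path

FREQ = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq")

def sample_ghz() -> float:
    return int(FREQ.read_text()) / 1e6  # sysfs reports kHz

print(f"idle:   {sample_ghz():.2f} GHz")

end = time.time() + 2.0
peak = 0.0
while time.time() < end:      # busy loop keeps utilization pegged
    peak = max(peak, sample_ghz())

print(f"loaded: {peak:.2f} GHz")  # at or near the turbo bin if boost engaged
```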
The i5-3570 and i5-3570S are from the same microarchitecture generation, and both target the same market at a similar price point. But here's the critical difference:
The i5-3570 has a Max TDP of 77 Watts, whereas the i5-3570S has a Max TDP of 65 Watts!
Those extra 12 watts of thermal headroom mean the 3570 is allowed to consume more power, which is almost certainly why its base clock is higher. So, mystery solved: it isn't a better microarchitecture or anything like that which makes the 3570 faster; it's that it eats more power. Of course we would expect the chip that consumes more energy to be faster, given the same microarchitecture.
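As a rough sanity check, here's a sketch using Intel's published base clocks and TDPs for the two parts; treating base clock as a throughput proxy is itself an assumption (it ignores IPC, turbo behavior, and actual draw versus TDP):

```python
# Rough perf-per-watt comparison from spec-sheet figures. TDP is a design
# envelope rather than measured draw, and equal IPC is assumed.
chips = {
    "i5-3570":  {"base_ghz": 3.4, "tdp_w": 77},
    "i5-3570S": {"base_ghz": 3.1, "tdp_w": 65},
}

for name, spec in chips.items():
    mhz_per_watt = spec["base_ghz"] * 1000 / spec["tdp_w"]
    print(f"{name}: {spec['base_ghz']} GHz / {spec['tdp_w']} W "
          f"= {mhz_per_watt:.1f} MHz per watt")
```

On those figures the S part actually squeezes out slightly more base clock per watt, which fits the point above: the 3570 isn't better silicon, it just gets to spend more power.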
Solution 2:
"Assuming I provide adequate cooling to allow the chip to easily reach the max turbo frequency, is it reasonable to expect the chips to operate at similar frequencies under normal circumstances?"
Short answer: No.
Accurate answer: It depends.
Per Intel's spec sheets, the TDP of the 3570S is 12 watts lower than that of the 3570.
A lower TDP, given the same architecture and assuming power-saving measures aren't being employed, means the chip is designed to draw less power under load.
If your motherboard, chipset, BIOS, and OS all permit the CPU to engage in core parking and to lower its frequency when load is low, it's very possible that the two CPUs will run at the same temperature in day-to-day use.
This is especially true given that, for the vast majority of people who aren't gaming most of the time, day-to-day use will probably produce median CPU loads of around 2%.
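If you want to check that claim on your own machine, here's a minimal sketch; it assumes the third-party psutil package is installed:

```python
# Minimal sketch: sample overall CPU utilization for about a minute and
# report the median, to see how close day-to-day load really is to idle.
# Assumes the third-party psutil package (pip install psutil).
import statistics
import psutil

samples = [psutil.cpu_percent(interval=1.0) for _ in range(60)]

print(f"median load: {statistics.median(samples):.1f}%")
print(f"peak load:   {max(samples):.1f}%")
```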