About ten years ago, a company was marketing a revolutionary machine for vending french fries. Put in some coins and get french fries in seconds, piping hot, and delicious. Trouble was, the machine had only one major part, a dwarf who ran through a hole in the back and fetched the chips from the kitchen.
All software, no matter how simple or complex, causes the CPU to execute machine-level instructions. Processor chips have speed limits. Heat is an issue. Nothing comes without cost. Issues are traded off until a workable design emerges. Would Intel spend a billion dollars developing marginally faster chips without first exploring ways to optimize its current products? So at the chip level we reach an insurmountable barrier. The overclockers look for cracks in this wall, but they rarely find one wide enough to float a barge through.
As for doubling RAM and disk space, I have my doubts. Compression simply removes redundant data, making formerly large files fit into smaller spaces. It consumes processing power and trades speed for size, a trade-off that is well understood and documented. We cannot expect both.
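To make that trade-off concrete, here is a rough sketch (my own illustration, not anything from the company's product) using Python's standard zlib module. It compresses a highly redundant payload and an equally sized block of random bytes, and reports how much each shrank and how long it took. Redundant data compresses well at a CPU cost; random data barely compresses at all, which is why "double your disk" claims cannot hold for every file:

```python
import os
import time
import zlib

def ratio_and_time(data):
    """Compress with zlib and return (compressed/original size, seconds taken)."""
    start = time.perf_counter()
    compressed = zlib.compress(data, 9)  # maximum compression level
    elapsed = time.perf_counter() - start
    return len(compressed) / len(data), elapsed

# Highly redundant data: the same line repeated many times.
redundant = b"the quick brown fox jumps over the lazy dog\n" * 100_000

# Random data: no redundancy for the compressor to remove.
random_bytes = os.urandom(len(redundant))

for label, payload in [("redundant text", redundant), ("random bytes", random_bytes)]:
    r, t = ratio_and_time(payload)
    print(f"{label}: compressed to {r:.0%} of original size in {t * 1000:.1f} ms")
```

On a typical run the redundant text shrinks to a few percent of its original size while the random bytes stay at essentially 100%, and the compression itself burns measurable CPU time either way.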
For further investigation, I'd want to see this company's product produce the same improvements under third-party, or my own, benchmarks. A simple test: if a doubling of disk space is claimed, start saving files and see when the disk fills. If 10 GB is claimed and only 5 GB gets stored, the claim is incorrect.
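Here is a rough sketch of how I'd run that fill test, assuming a Unix-like system and a hypothetical mount point /mnt/testdrive (both are my assumptions, not part of any vendor's instructions). It writes incompressible random data in fixed-size chunks until the drive refuses further writes, then reports how much it actually accepted. Random data matters: transparent compression cannot shrink it, so the result reflects real capacity.

```python
import os

# Hypothetical test parameters; point TARGET_DIR at the drive under test.
TARGET_DIR = "/mnt/testdrive"
CHUNK_MB = 100

# Random data defeats any transparent compression layer.
chunk = os.urandom(CHUNK_MB * 1024 * 1024)
written_mb = 0

try:
    i = 0
    while True:
        path = os.path.join(TARGET_DIR, f"fill_{i:05d}.bin")
        with open(path, "wb") as f:
            f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # force the data onto the disk, not just the cache
        written_mb += CHUNK_MB
        i += 1
except OSError:
    # "No space left on device" (or similar) ends the test.
    print(f"Disk accepted roughly {written_mb / 1024:.1f} GB before filling.")
```

If the drive is sold as 10 GB and the script stops near 5 GB, the doubling claim fails on its own terms.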
I could be very, very wrong. I just think things which sound too good to be true usually are. And this sounds wonderful.
Cheers, PW.