Non-Tech : Kirk's Market Thoughts

To: Kirk © who wrote (25841)9/9/2025 10:00:28 AM
From: Elroy  Read Replies (3) of 26440
 
While we're on the subject of technology, who knows the answer to this one?

Everyone and their grandma is building a massive-compute AI data center. The processing demands of these AI data centers are enormous, and the buyers cannot get enough processing power. More processing, please: AI accelerators, which (I think) move the stuff to be processed to the processors faster, are in extremely high demand, so all the AI data centers can process things more and more quickly.

In this system, where are all the inputs that need to be processed? It's as if ChatGPT were using EVERYTHING EVER WRITTEN to determine how to answer a query. Where is EVERYTHING EVER WRITTEN stored?

My assumption (perhaps wrong) is that if processing MUST increase as much as possible and be as speedy as possible, the volume of stuff TO BE PROCESSED must also be massive. So... in an AI data center, where is that stuff stored? Where does it come from?

How is it that the part of the AI system that accesses the inputs (which need to be processed!!!) is not the bottleneck?
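To put that bottleneck worry in concrete terms, here's a toy model (all numbers made up by me, nothing from any real data center): each training batch has to be read from storage and then crunched by an accelerator. If the reads can't overlap the compute, storage really would be the bottleneck; if the next batch is prefetched while the current one is computing, the read time mostly disappears.

```python
# Toy model of one AI training loop: each batch is read from storage
# (I/O) and then processed on an accelerator (compute). The numbers
# below are hypothetical illustrative values, not real figures.

def total_time(num_batches, read_s, compute_s, prefetch):
    """Wall-clock time to process num_batches.

    Without prefetching, every step pays read_s + compute_s.
    With prefetching, the next batch is read while the current one
    computes, so the steady-state cost per step is max(read_s, compute_s).
    """
    if not prefetch:
        return num_batches * (read_s + compute_s)
    # The very first batch still has to be read before compute can start.
    return read_s + num_batches * max(read_s, compute_s)

# Hypothetical numbers: 2 s to read a batch, 5 s to compute on it.
naive = total_time(100, 2.0, 5.0, prefetch=False)      # 700.0 seconds
pipelined = total_time(100, 2.0, 5.0, prefetch=True)   # 502.0 seconds
```

So in this made-up example, overlapped reads cost almost nothing extra as long as reading a batch is faster than computing on it; the question above is really asking whether, at AI-data-center scale, the reads can stay that fast.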