Strategies & Market Trends : The Art of Investing

To: Sun Tzu who wrote (10588), 11/14/2025 8:32:35 PM
From: Johnny Canuck
 
I normally add the instruction "Provide a chain of thought." to my prompts.
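
If you scripted that habit, it would look roughly like this (a minimal sketch; ask_model here is a hypothetical stand-in for whatever call you use to reach Grok, Perplexity, Claude or another model, not a real library function):

COT_SUFFIX = " Provide a chain of thought."

def ask_with_chain_of_thought(ask_model, question):
    # Append the chain-of-thought instruction to every prompt so the model
    # shows its reasoning steps instead of only a final answer.
    # ask_model is the hypothetical callable described above: it takes a
    # prompt string and returns the model's text reply.
    return ask_model(question + COT_SUFFIX)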

Before ChatGPT 5 started hiding the chain of thought, you could see its thought process and ask it to expand or restrict the scope along any dimension outlined in that chain of thought.

Most of the other AIs, like Grok, Perplexity and Claude, will still provide the chain of thought if you ask for it in the prompt.

I started doing this after reading a paper that suggested some LLMs give different answers depending on whether the chain of thought was asked for.

For example, in Grok, if I ask for the discounted cash flow value of a stock without prompting for a chain of thought, it will just search the web for an answer; but if I ask for a chain of thought, it will actually calculate the answer from the raw data.
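
For reference, the calculation it does in that case is roughly the standard DCF arithmetic. A minimal sketch in Python (the cash flows, discount rate, growth rate and share count below are made-up placeholders, not data for any real stock):

def dcf_per_share(free_cash_flows, discount_rate, terminal_growth, shares_out):
    # Discount each projected year's free cash flow back to today.
    pv_fcf = sum(fcf / (1 + discount_rate) ** (year + 1)
                 for year, fcf in enumerate(free_cash_flows))
    # Gordon-growth terminal value, discounted back from the last projected year.
    terminal = free_cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv_terminal = terminal / (1 + discount_rate) ** len(free_cash_flows)
    # Intrinsic value per share = total present value / shares outstanding.
    return (pv_fcf + pv_terminal) / shares_out

# Placeholder inputs: five years of free cash flow (in millions), a 10%
# discount rate, 2% terminal growth, 100 million shares outstanding.
print(dcf_per_share([500, 550, 600, 650, 700], 0.10, 0.02, 100))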

The paper I read focused on an example where the LLM was asked to solve a math problem. Without the additional prompt to provide a chain of thought the answer was wrong; with it, the answer was right.