Caching is an access shunt. It reduces the need to go up the access line, say to the 'bone, in order to get often-used pages. The terms "core" and "edge" don't really have an applicable meaning in the mechanics of caching. Caching is an edge phenomenon, but the concept can obviously be used in the core. Those terms are ambiguous enough that it's best to avoid them.
The idea in caching is to store often-used pages in caching servers in the headend, or as near as possible to the user. Which pages are stored, along with their priority, queuing status, and lifetime, is controlled by load algorithms. When a link makes a call for a page, the traffic server determines where the page is and directs the call there. If there is a caching server, the location of that server is a prime consideration. It may be that the call has to go up to the 'bone. Unfortunately, due to the diversity of traffic and the primitive state of these technologies, going to the 'bone is what happens the majority of the time.
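To make the routing decision concrete, here is a minimal sketch of the traffic-server logic described above: check the available caching servers, serve from the nearest one that holds the page, and fall back to the 'bone on a miss. All names (CacheServer, TrafficServer, route) are hypothetical illustrations, not any vendor's actual API.

```python
from dataclasses import dataclass, field


@dataclass
class CacheServer:
    name: str
    distance: int                      # hops from the subscriber; lower is nearer
    pages: set = field(default_factory=set)

    def has(self, url: str) -> bool:
        return url in self.pages


@dataclass
class TrafficServer:
    caches: list                       # caching servers in the headend or nearer the user

    def route(self, url: str) -> str:
        # Prefer the nearest cache that already holds the page.
        hits = [c for c in self.caches if c.has(url)]
        if hits:
            nearest = min(hits, key=lambda c: c.distance)
            return f"serve {url} from cache '{nearest.name}'"
        # Cache miss: the call has to go up to the 'bone.
        return f"fetch {url} over the backbone from the origin server"


# Usage: a headend cache holding one popular page.
headend = CacheServer("headend", distance=1, pages={"/news/front.html"})
ts = TrafficServer(caches=[headend])
print(ts.route("/news/front.html"))   # served locally from the headend cache
print(ts.route("/rare/page.html"))    # goes up to the 'bone
```

In practice the load algorithms decide what ends up in each cache's page set; the sketch only shows the lookup-and-redirect step.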
Downloading IExx from, say, Connexion is a different process. This is where the term "core" makes more sense, since Connexion's servers are considered remote relative to an exercise of the caching model. The bytes transferred go through the 'bone. I believe you have to go to the 'bone to get to Connexion's servers even if you're next door to the server room.