Hello Steven, and thank you for those acknowledgments. Sorry it's taken a while to get back to you.
Your question concerning the apparently low number of strands in the sub-sea section is a common one. It crosses the minds of most industry observers once they get around to focusing on the business potential and the metrics involved in determining bandwidth revenues. The short answer is that strand count is not linearly related to overall system gain.
The long answer follows.
Because photonic development is still in its nascent stages, there are some seemingly unlikely, yet very real, factors at play here. As I stated, the relationship is not linear, for reasons I hope to explain below.
Many, if not most, of the reasons for a seemingly low strand count can be found in physical constraints, while others, I strongly believe, are strategic, stemming from legacy thinking among the parties involved, which in turn fosters fundamental in-house rules concerning what, exactly, they want to sell. In short, they want to sell bandwidth derivatives (the golden eggs), but not the factory that produces them (the golden goose).
Very few carriers, even domestically (MFNX being one of the exceptions), set out to sell raw optical capacity, although it is hoped that this will change with time, first on the domestic front and then, probably much later, internationally. For now, however, they still very much view this as giving away the store.
Instead, for the moment they limit (artificially control?) supply by selling tightly managed streams in various denominations and packaging arrangements, just as they always have, despite the much greater economies afforded by the optical model. Nothing new here that we haven't been exposed to for eons, except the scope: never before have such huge potentials in gain existed as they do now with the optical model.
In any event, when it comes to transoceanic crossings, the physical constraints are the most formidable factors to get around when formulating overall system design parameters, because the distances involved are far greater than those over which even MFNX is now selling dark fiber. Again, this is due to the primary and secondary effects of the lengthened distances.
"... assuming that there would be no management team more attuned to this unalloyed potential, future demand than GBLX's management team..."
What GBLX is not all about at this time is flooding the world with optical bandwidth capacity. Instead, they are about providing enough capacity to allow themselves to gain a huge advantage from conditions characterized by a lopsided undersupply, or shortage, of bandwidth between the continents. And therein lies a very significant difference. The mention of this, of course, suggests another possible argument, and that is the question of whether or not a glut could even possibly be created, but that is an entire thread unto itself. Suffice it to say that GBLX will prosper on even moderate improvements in supply that they can bring to the table for now, since a severe shortage does, in fact, exist between the end points they've targeted.
I'm convinced that even if GBLX had a mind to sell strands and wavelengths to all comers at this time, they still couldn't do it, because of some of the constraints you alluded to which I'll attempt to cover in more detail, in a moment. But consider the following rationale, first.
The fiber being used today is still untapped in the upper regions of its potential. This goes not only for the individual strands, but for their derived wavelengths (after they go through the DWDM process) as well. For this reason (vastly untapped reserves), GBLX would view sales of strands and wavelengths purely on an optical-capacity basis as giving away capacity, untapped as it is today, which would have the same effect as cutting off future realized revenues once the state of the art permitted much greater yields from improved DWDMs.
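To put some rough numbers on the size of that untapped reserve, here's a quick sketch. The strand counts, channel counts, and line rates below are my own illustrative assumptions, not GBLX specifications; the point is only the multiple between what today's terminal gear extracts and what the same glass could carry after DWDM upgrades.

```python
# Illustrative only: strand, channel, and rate figures are assumptions
# chosen to show the scale of the headroom, not actual system specs.

strands_per_direction = 4    # assumed lit strands each way
lambdas_today = 16           # assumed DWDM channels per strand today
lambdas_future = 128         # assumed channels after later DWDM upgrades
gbps_per_lambda = 2.5        # OC-48 line rate, in Gb/s

today = strands_per_direction * lambdas_today * gbps_per_lambda
future = strands_per_direction * lambdas_future * gbps_per_lambda

print(f"capacity sellable against today's gear: {today:,.0f} Gb/s")
print(f"same glass after DWDM upgrades:         {future:,.0f} Gb/s")
print(f"untapped multiple: {future / today:.0f}x")
```

Sell the strand outright today and that entire multiple walks out the door with it.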
In fact, we reach an ironic crossroad in this analysis, a paradox, if you will, if you consider that because they are loath to remove the stigma of a bandwidth shortage, they actually perpetuate it (the shortage) by stifling supply, to no one's gain. Or so it would seem at first.
And in so doing, they fulfill their ultimate game plan, which has all to do with maintaining absolute control over every drop of capacity that they deliver. And so it should be, for those drops, and the cables they flew in on, belong to them.
And even if countless terabits of capacity remain untapped for the time being, or forever, going undelivered in the ultimate scenario, they can do this. They're not going to give it away, in any event. As you could well imagine, this makes for some interesting philosophical discussion, one which is in reality based in economics, however, but I've digressed enough here, and I'll get back to the strand count.
For this reason they will continue to provide "derived" bandwidth in the conventional ways that carriers have been delivering it all along. They will hand it off from one of their own DWDM-TDM complexes, or POS routers, or map IP directly onto wavelengths. Which, in turn, may feed other routers, or ATM switches, but always in ways that could be quantified in terms of OC-3s, or OC-12s, etc. or STM1, STM4, etc. (OC-n's being North American SONET denominations, and the STM-n's being European/ITU SDH).
Or they may yield to some as-yet-unrealized means, such as Gigabit Ethernet (GE) or native ATM, but in very rigid and quantifiable ways, nonetheless. It's all about bookends, packaging, and unit-level pricing.
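For readers keeping score on the denominations mentioned above, the rough SONET/SDH equivalences are standard hierarchy figures and can be tabulated like so:

```python
# Standard SONET (North American) to SDH (European/ITU) equivalences,
# with line rates in Mb/s. These are the published hierarchy figures.
sonet_sdh = {
    "OC-3":   ("STM-1",    155.52),
    "OC-12":  ("STM-4",    622.08),
    "OC-48":  ("STM-16",  2488.32),
    "OC-192": ("STM-64",  9953.28),
}

for oc, (stm, mbps) in sonet_sdh.items():
    print(f"{oc:7s} ~ {stm:7s} {mbps:9.2f} Mb/s")
```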
Translation: With few exceptions GBLX will continue to sell optically derived transmission capacity in traditional formats, such as T3s, OC-3s, OC-48s, STM-n. They will not make a major push to sell native optical capacity to ordinary end users any time soon, the exceptions I alluded to being other carriers and the very largest enterprise players. If they chose to do otherwise at this time, they would run out of production capacity before they were even able to optimize what they have in place.
Here's the problem they would face. If eventual improvements in DWDM gain factors allowed increases in the number of wavelengths that could be derived from existing strands, then GBLX would rather be the ones to implement those increases and sell off the spoils as additional yield, as opposed to their customers doing it on a one-time sunk-cost basis.
An additional factor at play here is, if they were to decide to sell by the individual strand or lambda (wavelength) to a customer on an indefeasible right of use (IRU) basis, then that customer can optimize that strand or wavelength to an OC-768, or even Terabit levels, if the state of the art eventually permits, at their own election, without any additional revenue ever going to the carrier, or to GBLX, in this case.
"... why is GBLX laying such relatively 'few' strands... Is it a question of component (laser modules or other) or fiber supply constraints, cost?"
The list of generic constraints probably reads like most engineering challenges of this sort, at least in principle. Cost engineering takes into account many factors both real (physical) and market related (subjective perceptions).
The subjective class of factors being primarily focused on the perceived current value, and the perceived time to obsolescence. When does a cable become obsolete? When the fiber begins taking in too many hydroxyl ions due to water ingress? Or when it costs more to maintain than another, newer cable could bring in, in the way of revenues, if a more recent and improved set of technologies were used?
It's probably some answer that rests in between these two extremes, but rapidly favors the latter as you take the time to decide which.
Among the primary physical factors explaining why there are so few strands is the need to provide amplification at the fiber-strand level for each strand placed. Optical amplifiers, in turn, require special engineering and powering themselves, which means that the entire cable must be electrified from one end to the other. No mean feat.
So, in addition to the optical loss budgets which dictate optical amplifier spacings, the engineers must also contend with longitudinal electric power delivery which has a budget all of its own, as well, and other attendant problems associated with electrification at the material level. And very likely, these can be more problematic to engineer and provision than the optical loss budgets of the fiber itself.
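A back-of-the-envelope version of that optical loss budget follows. The route length, amplifier gain, and per-pod power draw are generic assumptions on my part (the attenuation figure is typical for 1550 nm fiber), not data on any particular cable, but they show why amplifier count, and hence feed power, dominates the design:

```python
import math

# All figures below are illustrative assumptions, not specs for any
# actual submarine system.
route_km = 6000          # assumed transatlantic route length
fiber_loss_db_km = 0.22  # typical attenuation at 1550 nm, dB/km
amp_gain_db = 15.0       # assumed gain per in-line optical amplifier
amp_power_w = 30.0       # assumed electrical draw per amplifier pod

# Each amplifier can make up amp_gain_db of accumulated fiber loss,
# which fixes the maximum span between repeaters.
span_km = amp_gain_db / fiber_loss_db_km
n_amps = math.ceil(route_km / span_km)

print(f"amp spacing: ~{span_km:.0f} km -> {n_amps} repeaters along the route")
print(f"amplifier load per lit fiber pair: ~{n_amps * amp_power_w:.0f} W, fed longitudinally")
```

Note that all of that power has to be delivered down the cable itself from the shore ends, which is the separate longitudinal power budget referred to above.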
If these two considerations were not enough, then there is the form factor associated with not only the cable construction, but the amplifier pod, the repeater itself, a torpedo-like in-line amplifier housing that almost seamlessly resides as a part of the cable construction itself, that must be laid with minimum disruption. Size, shape and weight are factors which have to be taken into account here, too.
Increasing the number of strands by a factor of two, or even 1.5, say, would have enormous implications in the total number of additional amplifiers required, and by extension, this would have even greater implications in the manner and amount of power required in the overall system budget, in order to support them.
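Continuing the same illustrative figures (again, my assumptions, not system data), the power implication of adding strands scales directly, since every added fiber pair needs its own amplification at every repeater:

```python
# Sketch of how total amplifier load scales with lit fiber pairs.
# Repeater count and per-amp draw are illustrative assumptions.
n_repeaters = 88     # assumed repeater housings along the route
amp_power_w = 30.0   # assumed draw per fiber-pair amplifier

def system_power(fiber_pairs):
    """Total amplifier load: one amplifier per pair per repeater."""
    return n_repeaters * fiber_pairs * amp_power_w

base = system_power(4)
for pairs in (4, 6, 8):
    p = system_power(pairs)
    print(f"{pairs} pairs: {p / 1000:.1f} kW  ({p / base:.1f}x)")
```

A 1.5x or 2x strand increase means a 1.5x or 2x amplifier load, before even counting the heavier cable and the larger repeater housings needed to contain it.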
Longitudinal weight and stress factors also come into play, as do other physical dynamics. And, in the end, if you did increase the strand count, one could ask, "Why?" And, "For what?"
Since no one fiber will be tapped for its full potential capacity, anyway, why bother to add more of them, beyond the four or five in each direction needed, and the few spares required in case of defects and maintenance requirements? This is one argument that I've heard.
Of course, this argument presumes the basic tenet that I covered above, which is, primarily, to sell bandwidth derivation only, and not raw optical capacity, hence the reintroduction of one of the subjective factors I previously alluded to, which comes from marketing.
And if that is the proposition going into this game, then it makes at least a modicum of sense, if you think about it. As long as improvements in DWDM continue to be made at the end points, and as long as the operating ranges of the deep sea amplifiers can handle some marginal increases in spectrum width, then why fire up additional strands when they are not needed, when they will never be fully optimized before they are functionally obsolete from an overall system standpoint, anyway?
With quantifiable improvements in fiber coming about every two years, and faster by some measures, at this point, there may be a time when the strands being laid today might be deemed obsolete much sooner than one would have previously thought, and newer ones may need to be pulled more quickly, as a consequence. It is thought that even by that time, whenever that is, the full potential of those being laid today will not have been realized.
Given the rapid pace of economic and technological improvements in photonics being what they are, then, any incremental costs associated with the continued maintenance of today's fibers will at some point in time exceed any future benefits. This is when you reach that point of diminishing returns, or worse, operating at a loss, when you stop to consider what the competition may be up to.
And, very ironically, even when the cable is deemed obsolete at that point in the future, when it is capped and sealed on both sides of the pond, it still will not have reached its fullest potential. Nowhere near it.
Hope this helped. Comments and corrections welcome.
Regards, Frank Coluccio