Technology Stocks : Novell (NOVL) dirt cheap, good buy?


To: Retired Eagle One who wrote (29779)1/4/2000 11:48:00 PM
From: ToySoldier
 
The first-week sell-off was predicted by a few analysts in the dying days of 1999. One of the main reasons I heard in '99 was that investors who had seen day after day of new highs in many sectors (especially the tech sector) would be foolish to lock in their gains in the 1999 tax year when they could lock them in a couple of days later in 2000, thereby deferring the declaration of capital gains until the next tax year.

That, combined with the Y2K bug being a no-show as far as the media and public are concerned, is accelerating the fear of interest rate hikes in February.

Also, from what I understand, fund managers have until the end of January to close their books on 1999, so some analysts predicted that they might ride the crest as late as the third or fourth week of January before locking in gains and closing the books. These fund managers would also be concerned about the February interest rate rise.

Well, because of the Y2K no-show, and the crest seeming to sink quickly (because of the expected first days of trading for tax purposes), I would think many of these fund managers might have decided to close their books earlier than the end of January.

Well, this just keeps the downslide going even faster. I would think tomorrow will also not be a good day for the markets, until all these factors stabilize (i.e. people lock in their profits and the analysts feel we investors have suffered enough to appease God Greenspan in February). Then people will be looking for the deals, and off we go looking for new highs in the indexes again.

Just my opinions.

Toy



To: Retired Eagle One who wrote (29779)1/5/2000 8:23:00 AM
From: Steve Bannister
 
Hi Retired Eagle One. What Lemon and I are debating is how pervasive and durable a change XML will be in the industry. A matter of degree, really. If it's at my end of the scale, it should drive product decisions differently than if it's at Lemon's end. Scott is very good at these things.

Novell, and many other companies, are already incorporating XML into their products. For example, from Novell, DirXML uses XML to externally represent data stored in NDS, Novell Directory Services. XML parsing, or processing, capability is also integrated into WebSphere for NetWare, the next-generation open web application server bundled with the next release of NetWare, which by the way will ship very, very soon. That allows organizations converting to eBusiness to use XML as the common data format between legacy applications, or between legacy and new applications, and to use WebSphere logic to connect the two and present the results on the web.
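The "XML as a common data format" idea above can be sketched in a few lines: one application serializes its data to XML, and any other application can parse it back without knowing the first one's internals. This is a minimal illustration only; the element names here are hypothetical and are not DirXML's actual schema.

```python
# Sketch: XML as a neutral interchange format between two applications.
# Element names ("entry", "name", "email") are made up for illustration.
import xml.etree.ElementTree as ET

def entry_to_xml(name, email):
    """Serialize a directory-style entry to an XML string (producer side)."""
    entry = ET.Element("entry")
    ET.SubElement(entry, "name").text = name
    ET.SubElement(entry, "email").text = email
    return ET.tostring(entry, encoding="unicode")

def xml_to_entry(xml_text):
    """Parse the XML back into a plain dict (consumer side)."""
    entry = ET.fromstring(xml_text)
    return {child.tag: child.text for child in entry}
```

The point is that the producer and consumer only have to agree on the XML vocabulary, not on each other's internal data structures.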

Caching is a different thing. Caching on the web makes everything go faster, and scales the web, because it moves commonly used data closer to the end user. It's exactly the same idea as the various levels of cache inside your computer. You have cache on the CPU, a second level close to the processor chip, a third level in RAM, a fourth level perhaps in a disk cache, and finally persistent store on the disk.
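The "keep commonly used data close, evict what you haven't touched lately" policy those hardware levels use can be sketched as a tiny least-recently-used (LRU) cache, just to make the mechanism concrete:

```python
# A toy LRU cache: the same idea as the hardware cache hierarchy,
# reduced to a dictionary that evicts the least recently used entry.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None                    # miss: fetch from the slower tier
        self.store.move_to_end(key)        # mark as recently used
        return self.store[key]

    def put(self, key, value):
        self.store[key] = value
        self.store.move_to_end(key)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
```

A web cache applies the same policy, just with URLs as keys and pages as values.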

Internet caching, in a way, turns the web into one big cached computer. This is not a technically pure description, but it is illustrative. Novell leads the universe in caching technology for the web, and in market share, depending on how it's measured. It is an increasing revenue stream for Novell, and with the release of ICS last quarter it should continue to be an interesting area. IDC claims it will be a $2B market in a couple of years. Caching platforms do more than that, and potentially much more, but that is where most of the market cap is right now.

Novell does have a thousand points of light. The relationship among them is that everything is totally focused on the key areas of networking, and increasingly not just the ones we were really good at when we basically invented the industry two decades ago, though those are still extremely relevant. I know this area looks complex, especially to newcomers, because it is. Keep at it; there is no more exciting industry around.



To: Retired Eagle One who wrote (29779)1/5/2000 12:36:00 PM
From: P. Ramamoorthy
 
Retired Eagle,
Re."... Great timing? ..."
Understand how it feels. I suspect the short run-up was due to the George Gilder report and magazines like Smart Money naming NOVL as one of the ten to watch in 2000. If you take this out of the equation, the NOVL price is heading down to its comfort zone. There is a lot of talk about technology. When will this technology turn into sales and revenue? Would NOVL know? Ram



To: Retired Eagle One who wrote (29779)1/5/2000 12:39:00 PM
From: Scott C. Lemon
 
Hello Retired Eagle One,

You asked a couple of good questions, and Steve seemed to address them ... I was going to offer one other analogy that might help the understanding of "caching" ...

You asked:

> Is NOVL caching going to be the big money maker for NOVL this year
> or something else?

And Steve had, as part of his reply:

> Caching is a different thing. Caching on the web makes everything
> go faster, and scales the web, because it moves commonly used data
> closer to the end user. It's exactly the same idea as the various
> levels of cache inside your computer. You have cache on the CPU,
> a second level close to the processor chip, a third level in RAM,
> a fourth level perhaps in a disk cache, and finally persistent
> store on the disk.

Another way to look at this is at a much higher level. (Something that I try to do every now and then ... ;-)

If you look at the Internet as a massive, global communications system, then it should become obvious that many of the problems that exist in other communications systems must also exist in the Internet ... and be dealt with.

In the "TV" world, programming is created by various people and production companies. This is then sent to the networks, who beam the content out via satellites (or high-speed fiber) to all of the regional broadcasters, who then send out the radio signals locally to our TVs. Even cable TV almost always picks up the satellite feeds at a local or regional "head end" and then sends them over the cable to our homes.

But you would never really see a production house create a program, and then attempt to broadcast from their building to the entire U.S. or world ... it becomes a problem of scale ... you would need a massively powerful transmitter, and might even microwave to death your local audience. This issue of scale is what Steve was referring to.

Let me take this to the Internet world. Currently we are seeing various performance problems (problems of scale) as more and more people try to access content on the Internet. This is because the producer often tries to "transmit to the world" by setting up a web server on a limited Internet connection. Either the web server or the Internet connection soon becomes swamped by the number of requests, and delays occur.

So the "first step" in Internet evolution was for companies like Exodus Communications (and others) to set up big data centers, with huge web servers and huge Internet connections. Anyone can now "rent" space on one of these "huge transmitters" of Internet content. This is ok, but still has delays because so much of the network between you and them is out of the control of even Exodus ...

So the "next step" has been the creation of large distribution networks of servers, which are spread throughout the Internet. Companies like Akamai, SandPiper, RBN, and InterVU are all part of this revolution. What they are doing is setting up the "regional broadcasters" throughout the Internet, so that consumers access the content via the "closest" source.

Caches are a very good solution for creating these distribution networks. Each is a server optimized to "cache", or store a copy of, recently accessed content ... web pages, pictures, audio, and now even video. Think of it as your local Blockbuster Video store ... it's a place to get a copy of a movie nearby, without having to get the movie from the studio directly!

I refer to this whole process as "Object Routing" ... it is the process of routing objects (web pages, pictures, audio, video, Java applets, etc.) through the infrastructure of the Internet.

The other core place a cache comes in handy is in front of any web server. Caches can help spread the requests for web pages across numerous caches, instead of hitting a single server.
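One common way to spread those requests is to hash each URL so that every request for the same object lands on the same cache node (keeping each node's working set coherent). A minimal sketch, with hypothetical node names:

```python
# Sketch: distribute requests across several front-end caches by hashing
# the URL. Node names are made up for illustration.
import hashlib

CACHE_NODES = ["cache-1", "cache-2", "cache-3"]

def pick_cache(url):
    """Map a URL deterministically onto one of the cache nodes."""
    digest = hashlib.md5(url.encode()).hexdigest()
    return CACHE_NODES[int(digest, 16) % len(CACHE_NODES)]
```

Real deployments tend to use consistent hashing instead of a plain modulus, so that adding or removing a node only remaps a fraction of the URLs, but the load-spreading idea is the same.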

Novell has unparalleled price/performance with their ICS and BorderManager products. There is no other cache that is able to provide the raw horsepower to move data in the Internet. They have also signed up numerous OEMs who are now pushing their hardware products with Novell's ICS software.

I believe that as the recognition of object routing and these distribution networks increases, the sales opportunity for caches will equal or exceed the market that Cisco enjoys today for "packet routers" ... so it's going to be big ... ;-)

Hope this helps ... I like to try and think of better ways to explain things for presentations that I do ... let me know if it helped or not ...

Thanks!

Scott C. Lemon