From: Frank Sully, 2/27/2022 7:45:07 PM
 
Introduction

Hi Gang!

Josh invited me to post here so here I am. I am interested in “machine learning” and training deep neural networks (DNN) AI models. As you may know, a conversational DNN AI model developed and trained by NVIDIA and Microsoft has 530 billion parameters.

Message 33543488

As discussed here, 2021 was the year of mammoth monster AI models

Message 33629486

I originally became interested in AI & Robotics as a child. My father was a life-long sci fi fan and he turned me on at a young age to the Hugo Award winning classic “A Canticle For Leibowitz” by Walter Miller. I also grew up on the “Tom Corbett: Space Cadet” book series and began devouring all the sci fi my local public library (a short walk from my grammar school) had on hand, finding in particular Isaac Asimov’s classic Hugo Award winning Foundation Trilogy (have you seen the new Apple TV+ adaptation - it’s great!) and his “I, Robot” short stories and books. And of course then there was the Star Trek series, the Star Wars trilogy and the Blade Runner movies, and more recently the Westworld and Continuum series. A friend of mine in Chapel Hill, North Carolina, where I completed my doctoral studies in chaotic dynamical systems, called me a “Space General”, claiming I had graduated from being a “Space Cadet”.

I retired two and a half years ago and began watching my retirement nest egg portfolio much more carefully. In late 2020 the Enterprise AI company C3.ai (AI) went public and I bought 180 shares shortly thereafter at about $110/share. In a matter of weeks it soared to $180/share and I cashed in profits on 50 shares. As those of you who watch CNBC probably know, last year C3.ai plummeted and is currently trading near $20/share. I quadrupled my position at about $40/share so we’ll see what happens.

I began to want a better understanding of current artificial intelligence and robotics technology as an investment. I found the SI “Singularity, AI, etc.” message board moderated by the late John Pitera and met my SI AI mentor Glenn Petersen there. He encouraged me to think of AI investments as long-term and to pack them away at the bottom of a sock drawer. John had written an extensive Introduction Header for his message board and I devoured it eagerly. He explained how NVIDIA’s invention of and continual refinements to the GPU, originally developed for computer gamers, found application in training deep neural network (DNN) AI models thanks to the GPU’s ability to do parallel processing, while CPUs such as Intel’s x86 line were built for single-stream computation, albeit at a blazingly fast rate. This, along with CEO Jensen Huang’s decision to move the company into AI about a decade ago, led to NVIDIA’s current dominance in the global AI & Robotics chip and software markets. For the archived Introduction Header see the link below:

The Coming AI Singularity Message 33598107



I invested heavily in NVIDIA, and last year the stock soared some 130%, driven by incredible growth in earnings and revenues (on the order of 60%), its position in the Metaverse sector with its Omniverse, Digital Twins and Digital Avatars, and its position in autonomous vehicles. In the last three months it has retreated considerably along with the entire tech sector, but I’m still up some 80% in a bit over a year.

As I continued to research the AI & Robotics space I found that Alphabet (aka Google) was also an AI and Robotics powerhouse, with subsidiaries like the world-dominating AI & Robotics lab DeepMind, autonomous vehicle developer Waymo (with fully autonomous robotaxis already operating in Phoenix and San Francisco), robot developer Everyday Robots, robotics software developer Intrinsic and, most recently, drug discovery company Isomorphic Labs, which uses the protein-folding AI AlphaFold (developed last year by DeepMind, once again stunning the world). See link:

Message 33561414

I invested heavily in Google as well. Up at one point some 80%, it is now up about 55% in a bit over a year. I have continued researching and investing in AI and Robotics, with a total portfolio of nine companies now making up half of my retirement nest egg. “So I’m a gambler! Roll dem dice! Oh no, snake eyes!”

Anyway, after John Pitera’s death a bit over a year ago, my SI AI mentor Glenn Petersen became moderator of the message board, revamped as “AI, Robotics & Automation” and linked here:

AI, Robotics & Automation Message Board: Subject 59856

I post extensively on Glenn’s message board on NVIDIA and other AI & Robotics topics. Glenn, seeing my intense interest in NVIDIA and noting that the NVIDIA board had become dormant and that its moderator hadn’t posted anywhere on SI for a number of years, suggested that I volunteer to moderate that board.

I became moderator of the NVIDIA message board in the middle of last year and immediately set out to revamp and update it. Since the board’s Introduction Header was very skimpy and obsolete, I updated and revised it considerably. I added an extensive history of NVIDIA’s legacy product, graphics GPU chips (which it invented over two decades ago), which now contribute a bit less than half its annual revenue (NVIDIA dominates this world market with an 80% share; AMD holds pretty much the remaining 20%). I also included a comprehensive discussion of its data center and supercomputer AI GPU chips, which contribute almost half of its annual revenues (NVIDIA dominates this world market too, with over an 80% share:

<<< Analyst firm Omdia recently ranked Nvidia at the top of the AI processor market with an 80% share in 2020, well ahead of its competitors.

The tabulation puts Nvidia AI processor revenue at $3.2 billion in 2020, an improvement from $1.8 billion in 2019. Omdia ranked Xilinx second with its FPGA products. >>>

Note that Xilinx was ranked #2. AMD recently acquired Xilinx, and I bought AMD six months ago in anticipation of the acquisition’s impact on AMD’s AI & Robotics capabilities.)

Back to the NVIDIA Introduction Header: I also included substantial discussion of NVIDIA’s deep neural network AI models for autonomous driving and their current and upcoming implementations, e.g.:
Ride in NVIDIA's Self-Driving Car (seven-and-a-half-minute video)

CEO Jensen Huang recently said in an interview:

<<< VB: What is your confidence level in the automotive division and how we are moving forward with self-driving cars? Do you feel like we are getting back on track after the pandemic?

Huang: I am certain that we will have autonomous vehicles all over the world. They all have their operating domains. And some of that is just within the boundaries of a very large warehouse. They call them AMRs, autonomous moving robots. You could have them inside walled factories, and so they could be moving goods and inventory around. You could be delivering goods across the last mile, like Neuro and others. All these great companies that are doing last-mile delivery. All of those applications are very doable, so long as you don’t over promise.

And I think there will be thousands and thousands of applications of autonomous vehicles, and it’s a sure thing. This year is going to be the inflection year for us on autonomous driving. This will be a big year for us. And then next year, it’ll be even bigger. And in 2025, that’s when we deploy our own software, where we do revenue sharing with the car companies.

And so if the license was $10,000, we shared 50-50. If it’s a subscription base of $1,000 or $100 a month, we share 50-50. I think I’m pretty certain now that autonomous vehicles will be one of our largest businesses. >>>



I also included substantial discussion of NVIDIA’s work on the Omniverse (NVIDIA’s version of the Metaverse), Digital Twins and Digital Avatars, future growth drivers of revenues and earnings. CEO Jensen Huang recently said in an interview:

<<< VB: And how are you viewing progress on the metaverse? It seems like everybody’s a lot more excited about the metaverse, and the Omniverse is always part of that conversation.

Huang: Let’s see. Omniverse for enterprise is being trialed and tested in about 700 or so different companies around the world. We have entered into some major licenses already. And so our numbers are off to a great start. People are using it for all kinds of things. They’re using it for connecting designers and creators. They’re using it to simulate logistics warehouses, simulate factories. They’re using it for synthetic data generation because we simulate sensors physically and accurately. You could use it for simulating data for training AIs that are collected from LiDAR, radars, and of course cameras. And so they’re using it for the simulation part of the software development process. You need to validate software as part of the release process, and you can put Omniverse in the flow of robotics application validation. People are using it for digital twins, too.

VB: You’re going to make the biggest digital twin of all, right? [Nvidia plans to make a digital twin of the Earth as part of its Earth 2 simulation, which will use AI and the world’s fastest supercomputers to simulate climate change and predict how the planet’s climate could change over decades].

Huang: We’re off building or scoping out the architecture and building the ultimate digital twin.

VB: And do you feel like we’re also heading towards an open metaverse? Will it be sufficiently open as opposed to a bunch of walled gardens?

Huang: I really do hope it’s open, and I think it will be open with Universal Scene Description (USD). As you know, we’re one of the largest contributors, the primary contributor, to USD today. And it was invented by Pixar. It’s been adopted by so many different content developers. Adobe adopted it, and we have gotten a lot of people to adopt it. I’m hoping that everybody will go to USD and then it will become essentially the HTML of the metaverse. >>>

For those interested here is a link to the message board:

NVIDIA Message Board: Subject 24955

And finally, to get to the crux of the matter, machine learning and training deep neural network AI models, I included an introduction, presented below:

Deep Learning

Modern AI is based on deep learning algorithms. Deep learning is a subset of machine learning; it is essentially a neural network with three or more layers. These neural networks attempt to simulate the behavior of the human brain (albeit far from matching its ability), allowing the network to “learn” from large amounts of data.
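To make “three or more layers” concrete, here is a toy sketch in Python of a tiny fully connected network doing a forward pass. The weights are made-up numbers, not learned values; a real model would learn them from data (and would have millions or billions of them, not fifteen).

```python
import math

def layer(inputs, weights, biases):
    """One fully connected layer: weighted sum plus bias, then tanh activation."""
    return [math.tanh(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# A toy 3-layer network: 2 inputs -> 3 hidden -> 3 hidden -> 1 output.
# Weights here are arbitrary illustration values, not trained ones.
w1 = [[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]];               b1 = [0.0, 0.1, -0.1]
w2 = [[0.2, 0.7, -0.5], [0.6, -0.1, 0.3], [-0.4, 0.2, 0.9]]; b2 = [0.1, 0.0, 0.2]
w3 = [[0.3, -0.6, 0.8]];                                    b3 = [0.05]

def forward(x):
    h1 = layer(x, w1, b1)      # layer 1
    h2 = layer(h1, w2, b2)     # layer 2
    return layer(h2, w3, b3)[0]  # layer 3: a single output in (-1, 1)

print(forward([1.0, 2.0]))
```

Training is then the process of adjusting all those weights so that the outputs match the data, which is where gradient descent (below) comes in.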

Deep Learning Algorithms for AI

The first, one-hour video explains how this works. Amazingly, at its core it is just least-squares minimization of the neural network’s loss function using multi-dimensional gradient descent (an iterative method in the spirit of Newton-Raphson, but using only first derivatives). See the second, half-hour video. Who thought Calculus would come in handy?

1. MIT Introduction to Deep Learning

2. Gradient Descent, Step-by-Step
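For anyone who would rather see it in code than in a video, here is a minimal gradient descent sketch in Python: fitting a line to four points by least squares, with the gradient of the loss worked out by hand. The data and learning rate are made up for illustration.

```python
# Least-squares line fit by plain gradient descent.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1, so we know the right answer

a, b = 0.0, 0.0             # parameters to learn
lr = 0.02                   # learning rate (step size)

for _ in range(5000):
    # Gradient of the loss L = sum((a*x + b - y)^2) with respect to a and b.
    grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys))
    grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys))
    a -= lr * grad_a        # step "downhill" against the gradient
    b -= lr * grad_b

print(round(a, 3), round(b, 3))  # converges to roughly 2.0 and 1.0
```

A deep neural network works the same way, except the two parameters become millions or billions of weights and the gradient is computed by backpropagation rather than by hand.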

Math Issues: Optimizing With Multiple Peaks Or Valleys



A problem with gradient descent is that it only follows the local slope downhill, so it converges to whatever stationary point lies below its starting position. A loss function can have multiple peaks and valleys, so properly speaking gradient descent finds only local minima, whereas in machine learning one wants a global minimum (or at least a good, low one). This makes the problem considerably more difficult, particularly since loss functions for deep learning neural networks can have millions or even billions of parameters.
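Here is a tiny illustration of the local-minimum trap: a one-dimensional “double well” loss with one shallow valley and one deep valley. Depending on where it starts, plain gradient descent lands in one or the other. The function is made up for illustration.

```python
def f(x):
    """Double-well loss: global minimum near x = -1, shallower local one near x = +1."""
    return (x * x - 1) ** 2 + 0.3 * x

def df(x):
    """Derivative of f."""
    return 4 * x * (x * x - 1) + 0.3

def descend(x, lr=0.01, steps=2000):
    """Plain gradient descent from starting point x."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_right = descend(2.0)    # starts right of the hill: stuck near +1 (local minimum)
x_left = descend(-2.0)    # starts left of the hill: finds the global minimum near -1
print(x_right, x_left, f(x_right) > f(x_left))  # the right-hand valley is higher
```

Same algorithm, same function, two different answers depending only on the starting point; that is the whole difficulty in miniature.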

Another problem has to do with the size of the data sets used to train deep learning neural networks, which can be huge. Since gradient descent is an iterative process, it becomes prohibitively time-consuming to evaluate the loss function at each and every data point, even with high-performance AI chips. This leads to stochastic gradient descent: the loss function is evaluated at a relatively small random sample of the data at each iterative step.

3. Stochastic Gradient Descent:
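A minimal sketch of the idea, continuing the line-fitting example: instead of summing the gradient over all 10,000 points at every step, sample a small random batch and use its average gradient as a cheap, noisy estimate of the true gradient. The dataset, batch size and learning rate are made up for illustration.

```python
import random

random.seed(0)

# Synthetic dataset: 10,000 points from y = 2x + 1 plus a little noise.
data = [(x, 2 * x + 1 + random.gauss(0, 0.1))
        for x in (random.uniform(-1, 1) for _ in range(10_000))]

a, b, lr, batch = 0.0, 0.0, 0.1, 32

for _ in range(3000):
    # Estimate the gradient on a random batch of 32 instead of all 10,000 points.
    sample = random.sample(data, batch)
    grad_a = sum(2 * (a * x + b - y) * x for x, y in sample) / batch
    grad_b = sum(2 * (a * x + b - y) for x, y in sample) / batch
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 2), round(b, 2))  # close to the true values 2 and 1
```

Each step here touches 32 points instead of 10,000, a ~300x saving, and the random noise in the estimates averages out over many steps. That trade is what makes training on billions of examples feasible at all.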

Any comments or discussion?

Slainte! (Irish for “Cheers!”)
Frank “Sully”

BTW, with St. Paddy’s Day coming up I’ve been posting Irish music (“Ceol na nGael”) on the Music Box message board, also moderated by Josh. Here are some links if anyone is interested:

St. Paddy’s Day Prelude: “Irish Eyes”. Here’s me on classical guitar with the “crowd pleaser” favorite “When Irish Eyes Are Smiling”: Message 33724643
St. Paddy’s Day Prelude #2: Cead Mile Failte! (Irish for “A Hundred Thousand Welcomes!”): Message 33725002
St. Paddy’s Day Prelude #3: Cead Mile Failte! (Irish for “A Hundred Thousand Welcomes!”): Message 33726503

Erin Go Bragh! (Irish for “Ireland till the end of time”)
