GSI Technologies Appear Ever More Relevant To The Modern Computing Landscape

Mar. 15, 2022, 11:48 AM ET · GSI Technology, Inc. (GSIT), GOOG


  • The Gemini APU sidesteps the Von Neumann bottleneck, which is why it is so much faster in sparse matrix operations, where memory access slows things down on conventional hardware.
  • While it handles sparse matrices efficiently from an informational perspective, an area where conventional hardware struggles even at smaller information loads, it is well suited to large matrices too.
  • Google is shifting to a multi-modal model for its search, and I think that the Gemini APU would work well for these algorithms.
  • Looking for a portfolio of ideas like this one? Members of The Value Lab get exclusive access to our model portfolio. Learn More »

Published on the Value Lab 12/3/21

This article is about GSI Technologies (NASDAQ:GSIT), but it is just as much about what Google (NASDAQ:GOOG) is doing with its search algorithms. I am a data scientist and not a computer scientist, so not an expert on hardware, but as I understand it, Google's move from BERT to multimodal methods in search would make the Gemini APU relevant if GSIT can commercialise it soon enough. With the Gemini APU APIs being built out now, hopefully they will be in time to catch this wind. GSIT is currently valued at only about a hundred million dollars in market cap, but if it became a major hardware provider to Google, there is probably at least a 5x opportunity here, if not 50x, considering the breadth of applications and the growth in those applications. With the Gemini APU ideal for search applications as well as the recommendation applications discussed in previous articles, we think it could be a revolutionary play for yet another reason.

What is Google Doing that GSIT Could Help With?

Google is moving from using BERT to using a multi-modal system for understanding the information contained in a query.

What is BERT though?

My explanation for the layman goes something like this. If you represent a word by some unique matrix of numbers, that representation does not depend on the context of the word. Of course, context matters a lot: think 'man bites dog' versus 'dog bites man'. BERT trains itself by guessing which words are missing in sentences where words have been randomly masked out. It can then understand future queries because it has learned to pay attention to every element of a sentence with reference to every other element.

Every time you train this model, it requires a series of matrix operations, and many permutations of those operations for each word in a sentence, because the model is attending to the relations between words. On top of the representation of each word already being a matrix, this produces a great many matrix operations, and they happen across many attention heads in parallel and through many layers. So the operations are complex on their own, and there is effectively a matrix of these operations too, which means a lot of reaching into memory before computation can happen.

All this results in a model that can generate language so convincingly you would absolutely not know it is an AI, which means it also has an almost human understanding of the intention behind a query. This is what makes Google a good search engine. You don't need to phrase a query perfectly; the engine infers meaning from your query to give you the results you want.
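As a rough illustration of why this is so computation-heavy, here is a toy version of the attention step described above. The dimensions and weights are made-up illustrative values, not BERT's actual configuration (BERT-base uses 768-dimensional representations and 12 attention heads), but the shape of the work is the same: several matrix multiplies per head, per layer, for every sentence.

```python
import numpy as np

# Toy self-attention: 3 "words", each a 4-dim vector. Real models repeat
# this across many heads and many layers, which is where the load comes from.
rng = np.random.default_rng(0)
seq_len, d = 3, 4
X = rng.standard_normal((seq_len, d))          # word representations

Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv               # three matrix multiplies

scores = Q @ K.T / np.sqrt(d)                  # every word scored against every word
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)  # softmax
out = weights @ V                              # context-aware word representations

print(out.shape)                               # one updated vector per word
```

Even in this tiny sketch, every word interacts with every other word, so the cost grows quadratically with sentence length before you ever scale up the dimensions.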

Now, imagine that in addition to all these operations, you add a whole other dimension. Imagine combining the meaning of words with the appearance of images. Much like words, images also have matrix representations, and they are three dimensional: two dimensions are the space, and the last dimension is the colour channel (RGB). The size of the image determines the number of pixels, each with an intensity value for red, green and blue. Lots of numbers and potentially big matrices. MUM, the multi-modal model Google now wants to use instead of BERT, combines an understanding of the meaning of a sentence with the pixel intensities in each channel that make up the appearance of an image. So in addition to all those matrix operations related to words, the complexity of learning increases massively because word matrices are now combined with image matrices as well, which have a lot of values in them! In the future, even audio information could be given numerical representations and folded into these multi-modal systems, combining data of various types to enhance Google's understanding of queries.
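To make the "lots of numbers" point concrete, here is a sketch of a single hypothetical 1080p image represented the way described above, as a height x width x channel array:

```python
import numpy as np

# A hypothetical 1920x1080 image: two spatial dimensions plus one RGB
# channel dimension, one intensity value per pixel per channel.
image = np.zeros((1080, 1920, 3), dtype=np.uint8)

print(image.shape)   # (height, width, channels)
print(image.size)    # total intensity values in this single image
```

That single image holds 1080 x 1920 x 3 = 6,220,800 values, before any of the word matrices it would be combined with in a multi-modal model.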


GPUs and even TPUs work off Von Neumann architecture. This has an inherent limitation: frequent reaches into memory before performing calculations create a bottleneck. The patent GSIT acquired from MikaMonu means this is no longer a problem, because in-place operations within memory become possible. Given all the computation Google's models already do against memory, and with multi-modal systems likely to increase the complexity of those calculations in pretty much an exponential fashion, the Gemini APU, which does not rely on Von Neumann architecture, could change the game. The company remains self-financing, and we have hope that it can start delivering a commercial product soon. As long as its patent stays protected and it can ship a commercial product, our understanding is that the APU will be useful for some of the most valuable applications in the world.
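The sparse-matrix case from the bullet points shows this bottleneck clearly. Below is a minimal, hand-rolled sketch of a compressed-sparse-row (CSR) matrix-vector multiply (illustrative only, not GSIT's method): note the indirect lookup `x[indices[j]]`, where on Von Neumann hardware each scattered index is a memory fetch that must complete before any arithmetic can happen. In-memory computation aims to remove exactly that round trip.

```python
# CSR matrix-vector multiply. The arithmetic is trivial; the cost on
# conventional hardware is the scattered memory reads via indices[j].
def csr_matvec(data, indices, indptr, x):
    y = []
    for row in range(len(indptr) - 1):
        acc = 0.0
        for j in range(indptr[row], indptr[row + 1]):
            acc += data[j] * x[indices[j]]   # indirect, scattered memory read
        y.append(acc)
    return y

# The 3x3 matrix [[1,0,2],[0,0,3],[4,0,0]] stored sparsely:
data, indices, indptr = [1.0, 2.0, 3.0, 4.0], [0, 2, 2, 0], [0, 2, 3, 4]
print(csr_matvec(data, indices, indptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0, 4.0]
```

On a dense matrix the hardware can stream memory predictably; with sparsity, the access pattern is irregular, which is why memory interaction rather than raw arithmetic dominates the runtime.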

Of course, we don't know what will happen, and we are not experts in hardware. But we know enough to understand what this APU might be able to do for these companies. With massively faster and less energy-intensive calculations on offer, Google, Netflix (NASDAQ:NFLX) and Amazon (NASDAQ:AMZN) should be lining up at GSIT's doorstep once the Gemini is ready to be sold.

It's hard to say exactly how big the opportunity is. The Gemini is going to be orders of magnitude faster at certain applications, and consequently likely to reduce energy usage by 60-70%. Supposing the obtainable market can be built bottom-up from Netflix's cloud computing costs, consider its AWS expense of around $30 million per month, or $360 million per year. Netflix is about 50% of the streaming market, so for the needs of streaming companies, which are substantially recommendation engines, the market for Gemini might be around $720 million in streaming. Of course, streaming recommendation engines are just a small subset of the recommendation engines running on servers across all of ecommerce, and then there is also search, so looking only at streaming understates things. But taking the $720 million as a super-conservative figure and supposing a 10% operating margin, which is what GSIT had for its legacy SRAM business, you get $72 million in EBIT. GSI Technology is a $100 million company, so that is a 1.4x multiple on this forecast EBIT. That is very low, with semiconductor companies easily commanding a 15x EBIT multiple, which would value GSIT at about $1 billion, suggesting a potential 10x opportunity. It could end up being much higher than this, considering we only considered server costs with respect to streaming.
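For transparency, the back-of-envelope math above runs as follows (the inputs are the paragraph's own assumptions, not verified figures):

```python
# Assumptions from the text: Netflix AWS spend ~$30M/month, Netflix ~50% of
# streaming, 10% operating margin (GSIT's legacy SRAM margin), $100M market cap.
netflix_aws_annual = 30e6 * 12                 # ~$360M per year
streaming_market = netflix_aws_annual / 0.50   # ~$720M obtainable streaming market
ebit = streaming_market * 0.10                 # ~$72M forecast EBIT
market_cap = 100e6

print(round(market_cap / ebit, 1))             # ~1.4x multiple on forecast EBIT
print(15 * ebit)                               # ~$1.08B at a 15x EBIT multiple, ~10x upside
```

Every line here is conditional on the paragraph's assumptions; change the margin or the market share and the multiple moves accordingly.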

In any case, while GSIT gets the Gemini ready for production in foundries and for shipping to the servers of hopefully marquee customers like Amazon and Google, it is still managing to keep somewhat above water on cash burn with its legacy businesses. Without R&D, operating income would be about $3 million, but with R&D above $20 million at this point, while the company develops APIs and libraries for the Gemini, it is in cash-burn territory. So equity raises are in the cards, with additional paid-in capital growing 20% since last year. Dilution is certainly non-negligible here, which is a risk. But the APIs are being worked on as we speak, and the company hopes to get its product out to first customers in Q1 2022. After that, it shouldn't take more than a couple of years of dilution before the product is fully launched, hopefully as semiconductor shortages ease. With 10x, or perhaps 5x after two years of dilution, as the highly conservative estimate of upside, and markets available beyond streaming, like search and Google's more complex multimodal algorithms, the opportunity remains very compelling as a small, speculative exposure.

If you thought our angle on this company was interesting, you may want to check out our service, The Value Lab. We focus on long-only value strategies, where we try to find international mispriced equities and target a portfolio yield of about 4%. We've done really well for ourselves over the last 5 years, but it took getting our hands dirty in international markets. If you are a value-investor, serious about protecting your wealth, our group of buy-side and sell-side experienced analysts will have lots to talk about. Give our no-strings-attached free trial a try to see if it's for you.

This article was written by

Author of The Value Lab
A long-only voice that eclipsed growth through the 2020 and 2022 bear markets.

Valkyrie Trading Society seeks to provide a consistent and honest voice through this blog and our Marketplace Service, the Value Lab, with a focus on high conviction and obscure developed market ideas.

DISCLOSURE: All of our articles and communications, including on the Value Lab, are only opinions and should not be treated as investment advice. We are not investment advisors. Consult an investment professional and take care to do your own due diligence.

DISCLOSURE: Some of Valkyrie's former and/or current members also have contributed individually or through shared accounts on Seeking Alpha. Currently: Guney Kaya contributes on his own now, and members have contributed on Mare Evidence Lab.


Disclosure: I/we have a beneficial long position in the shares of GSIT either through stock ownership, options, or other derivatives. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.
