Seeking Alpha


View brokeagain73's Comments BY TICKER:
  • Cliffs Natural Resources Could Be Worth Way More Than $1 Per Share [View article]
    Good point on the pelletizing. Cliffs delivers a high-margin, highly marked-up product in the US compared to the Australian and Brazilian miners. This is why they modeled in zero sensitivity with respect to US ore even at $100/ton. Cliffs will do OK and limp along, marginally profitable and cash flow positive, until iron ore recovers in a couple of years.
    Dec 23, 2014. 12:06 PM | 3 Likes
  • Update: AMD Earnings: Survive 2015, Aim To Thrive In 2016 [View article]
    @Vlad - not too familiar with the cross-licensing situation, but presumably even if that isn't transferable, there is the second option in which AMD sells off its IP in parts, goes private, etc. However, assuming you are right, that doesn't leave many options for AMD now, does it?
    Oct 20, 2014. 01:53 AM
  • Update: AMD Earnings: Survive 2015, Aim To Thrive In 2016 [View article]
    If any of you authors (AE, Sean, etc.) want to write something worth reading, write something that describes the "day after" - the point at which AMD needs to position itself to be acquired or have its IP sold off. I am curious what that might look like... just a thought.
    Oct 19, 2014. 11:35 PM
  • AMD Earnings Announcement A Disaster [View article]
    @Fiberton - as are most tech companies (20%-35%) and that is exactly my point. There has to be at least some R&D being cut.
    Oct 18, 2014. 02:11 AM
  • AMD Earnings Announcement A Disaster [View article]
    What part of the "organization" do you think that 700-person reduction will come from? If not R&D, in part, then where? You're deluding yourself if you believe it's all from HR, accounting, legal, admin, sales and marketing, and executive management. Logically, as a rule of thumb, you'd have to assume at least the percentage of headcount attributed to R&D is susceptible.
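    To put rough numbers on that rule of thumb (the split here is assumed - AMD doesn't publish a headcount breakdown):

```python
# Back-of-the-envelope sketch with a hypothetical headcount split:
# if R&D is ~30% of staff (mid-range of the 20%-35% typical for tech
# companies) and cuts fall proportionally across the organization,
# a 700-person reduction cannot spare R&D entirely.
total_cut = 700
rd_share = 0.30  # assumed share of headcount in R&D

rd_positions_cut = round(total_cut * rd_share)
print(rd_positions_cut)  # prints 210
```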
    Oct 17, 2014. 08:42 PM
  • Nvidia And AMD: Higher Resolutions Could Trigger Higher Revenues [View article]
    @Justin - the margins and ASPs for auto chips were mentioned on their last few CCs and a couple of investor/analyst presentations. If I can find one, I'll post it.

    As for arguing both sides - that comment wasn't aimed at you. It was a generalization about the AMD investor community. I don't recall you ever doing this - but some of your readers are. I apologize for not being more specific.
    Jul 10, 2014. 09:33 PM
  • Nvidia And AMD: Higher Resolutions Could Trigger Higher Revenues [View article]
    @Justin - "So around 170k Audi's are sold annually - even if we assume every single new audi has a tegra K1 at the heart of it's infotainment system, that's a fraction of a fraction of a fraction of smart phones:"

    You do realize that they are selling, in many cases, more than just the K1 chip? That is why they created Jetson on the hardware side and the software stack to support building instrumentation panels.

    In addition to that, autos present a multi-chip opportunity per unit and cellular connectivity.

    Finally, the TK1 that goes into autos is sold at higher margins - it's not an apples-to-apples comparison and not necessarily a volume argument. There are several estimates out there that make a case for this market to reach upwards of $100 billion by 2025. That's not chump change even if the volumes are low. With NVIDIA's lead and the adaptability of their GPU to the solutions needed in autos, there is no reason they couldn't hold on to a nice portion of that market.

    On a side note - the points you make in your article to support a larger-than-expected upgrade cycle are also the same points one might make as to why the discrete GPU has quite a bit of life in it and why integrated GPUs aren't going to be good enough anytime soon for the high-end market. Many AMD fans want to argue both sides of that equation - simply because it suits AMD.
    Jul 10, 2014. 01:57 PM | 1 Like
  • Bitcoin/Litecoin Miners Leave AMD And Nvidia In The Dust [View article]
    @geek - define "excellent"?

    As I understand it, their margins for consoles are on par with or less than what NVIDIA sees with Tegra. Considering the volumes, that isn't excellent if you ask me.

    And not to be a d1ck, but you're wrong about MS and Sony paying for the NRE - that is typically split, if not solely absorbed, by AMD. Since they don't disclose this, you'll never know for sure, but I can tell you that when I worked with them, they stated the following with regard to NRE: "With a custom solution we typically want to see something from the customer - just so we know they have some skin in the game." Most manufacturers will absorb some if not a large portion of the NRE since they are the supplier. The only time you see the customer pay all the NRE is when they get the IP and are only paying for the engineering - get your facts straight.
    Jun 27, 2014. 02:27 AM
  • Bitcoin/Litecoin Miners Leave AMD And Nvidia In The Dust [View article]
    @xxx - I agree - AMD certainly isn't losing money. Though the margins aren't great either, the argument is more about the appeal of consoles and the fact that they are subsidized in many cases, which broadens their appeal somewhat - but you are correct, AMD DOES NOT lose money on them; Sony and MS do/have.

    I also agree that there is a very good chance that if the DNA of the console stays the same then AMD is the likely candidate for the win.
    Jun 26, 2014. 09:24 PM
  • Bitcoin/Litecoin Miners Leave AMD And Nvidia In The Dust [View article]
    @Tri - it seems a few other posters have already posted on this subject. As Hanson says, the consoles are sold at retail for less than the cost to the manufacturer - that is a top-line loss and commonly referred to as a loss leader.

    However, since you asked...
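    For anyone who wants the loss-leader arithmetic spelled out (the numbers are hypothetical - Sony and MS don't disclose per-unit costs):

```python
# Illustrative loss-leader math with assumed figures: when the build
# cost exceeds the retail price, every console sold is a top-line loss
# for the platform holder, recouped later through games and services.
retail_price = 399.0  # assumed launch price
unit_cost = 450.0     # assumed manufacturing + shipping cost

per_unit_loss = unit_cost - retail_price
print(per_unit_loss)  # prints 51.0
```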
    Jun 26, 2014. 07:25 PM
  • Bitcoin/Litecoin Miners Leave AMD And Nvidia In The Dust [View article]
    @Rand - "Expensive? Not for what one gets." You've got to be kidding me, though I guess it could depend on what one values. Oh, and keep in mind these consoles are heavily subsidized in many cases. Let's be honest; you essentially get a PC that is bested by even mid-range PC rigs three years older. The software has just as much to do with the viability and success of the platform as the hardware - you have to know that.

    "If you do not think current console sales represent success, just what would it take to be a success to you?" You obviously didn't fully read my comment. I don't disagree - but my comments were forward-looking, which was the context of the post I was responding to. The current gen is the here and now - but we don't just invest in the current scenario, do we?

    "In MY opinion, a lot of people just want to flip a switch to game, and they do not want the cost, insecurity, complexity and maintenance a PC inevitably involves." I don't disagree, but maybe you're missing my point: this comes in many form factors... and again, read my earlier comments - cloud and ARM/Android gaming. Can't get any more turnkey than that.

    "Well, yes, BUT most PC-level games ALSO need an x86, not an ARM, even on SteamOS. And only AMD can put both good graphics and x86 on an SoC." Rand, I am not sure what you are reading into my comments, but I am not talking about PC gaming per se. However, what do you think the consoles are if not PCs? Here's a hint - a PC.

    "Most NEW AAA games will be designed for x86 consoles with AMD GCN graphics, not ARM and not so much Nvidia. Consoles may be the best step up for smartphone ARM game players who want more." Oh, like Watch Dogs and anything running on top of Unreal Engine 4? Again, come on.

    "And since the AMD SkyBridge project will support ARM with GCN graphics on an SoC, if ARM ever does become useful for console gaming, AMD has concrete plans to be there." Sure, AMD has no problem kludging together buzzword-compliant products that fail to perform - NVIDIA has almost a 10-year lead here and a current architecture that scales with proven economies.

    "Pot, meet kettle." Really? Because AMD'ers have quite the reputation. Seriously, it's like a religious mission for you guys, but hey, it's all opinion anyway.
    Jun 25, 2014. 06:53 PM | 1 Like
  • Bitcoin/Litecoin Miners Leave AMD And Nvidia In The Dust [View article]
    @Tri - please "try" and understand. Consoles, as their DNA exists today, are, in my opinion, in their twilight years. They are expensive and clunky - so, though it may make sense, if you create a traditional console, to use AMD (I don't disagree), in the future x86 will likely be only one of many choices. Cloud and ARM-based gaming products have plenty of potential here, especially if used in conjunction - cost, portability, low power, and great performance are all benefits of combining the two. So, if you had the choice, why bother with x86 at all on the client? The only thing that is really keeping x86 in the game is that most PC games run on Windows. However, both NVIDIA and Google are pushing hard to make this NOT true much longer. If that does come to fruition, then NVIDIA has the lead in a majority of these new markets. Now, AMD does have some upside, but it shouldn't be taken blindly, just like the downside for NVIDIA shouldn't be taken blindly.

    All too many of you AMD diehards let your zealotry and greed overshadow some rather glaring but simple analysis, despite the incessant rhetoric, the claims of unfairness, and the unfortunate cosmic bad breaks that your typical AM-Disciple loves to proclaim. NVIDIA isn't 3x the market cap and infinitely more profitable (prior to the last 2 quarters) by chance - the company is a top-notch, engineering-centric firm that is fundamentally run well! While AMD is chasing high-cost semi-custom wins and competing with the likes of TI and Freescale, NVIDIA is constantly reinventing itself and driving new markets. Now maybe it's just me, but I'd rather bet on the consummate leader and not the perennial underdog.
    Jun 25, 2014. 02:17 PM | 2 Likes
  • AMD Is Giving Nvidia A Headache [View article]
    @rav - you are changing the argument, but that is fine.
    " You refuse to understand that nVidia is getting squeezed in three and quite possibly 4 arenas."

    That is because all the empirical evidence points otherwise: continued financial growth, the emergence of new high-margin markets, continued dominance in the discrete market, etc. All you seem to want to provide in this discussion is hyperbole and conjecture. "Could", "maybe", "possibly"... none of your statements are steeped in fact. Clearly the analysts agree with me.

    " The nVidia-Intel cross license agreement expires in 2016. With it is a loss of $250 million per year income for nVidia. " I already made my argument with respect to this - go back and read it again; maybe this time you'll understand it.

    "Most analysts agree that the mid range market for GPU's will be substantially reduced as AMD APU's and Intel IGP evolves and becomes more sophisticated. "

    "The Justice Department order to Intel to support PCIe also expires in 2016. " So what? I also made my case on why Intel won't abandon PCIe without replacing it, and also made a case on how NVIDIA may mitigate that.

    This is more BS from you, trying to infer bits and pieces from all sources to create a supercharged world for AMD - provide me the links!

    "And MAntle is not optimised for nVidia. Mantle has also been adopted by at least 40 studios developing games. "

    Please name them - because from what I am hearing and reading, Mantle is all but dead.
    "Mantle has also scared the hell out of Microsoft who is desperately trying to get Directx12 to market." I hope so, because I would love to see MS dump the Xbox and partner with IBM and NVIDIA to deliver games over the cloud - btw, have you checked out GRID? It f'ing rocks! Better than console play, scales, and moves rapidly - consoles are doomed!

    "Evidentlly Tegra just lost the Surface design win. Tegra is a disappointment."
    More conjecture - more guesswork and rumor from an internet tabloid - that rumor hasn't been confirmed. By the same token, the K1 is rumored to be in Google TV and the new Nexus - codenamed Molly and Flounder respectively.

    The difference here is that the rumor was based off of upstream change logs referencing both the code names and Tegra - which, once leaked, were removed, which tells me there may be some truth to it... but until it's confirmed, I won't count my chickens. My point is I can play the other side of the BS "perfect world" game too.

    "AMD has for years been the best friend the IT consumer ever had. " I can't fully agree with this; it is more opinion than fact, based on my experience. What AMD does have going for it is that they are willing to do things really cheap to get business. I have firsthand experience with this.

    " Intel is irrelevant in the Tablet and Mobile market because their silicon costs to much to fab. " OK, so what is your point here? Wouldn't this bode well for NVIDIA, considering they have a pretty solid chip in Tegra that can be used in lots of markets? How does this benefit AMD, as they have zero mobile presence and no ARM products yet?

    "And AMD has developed new innovative products and has an OEM presence in the new ARM server market with revolutionary and possibly disruptive scalable tech."

    I partially agree - AMD has done some new stuff, but it isn't exactly groundbreaking. They seem to follow and jump on the bandwagon late - which is fine if you are the 800-lb gorilla like MS, because then you can, in most cases, buy your success, but with AMD that is a much dicier proposition. Nonetheless, you can make the same argument for any number of companies - Intel, NVIDIA, etc. They are all doing new and innovative stuff - though as investors we want to choose the companies whose innovations we feel have the best chance of sticking.
    May 19, 2014. 02:22 PM
  • AMD Is Giving Nvidia A Headache [View article]
    @rav -

    It's becoming clearer that you are creating scenarios and a world where ONLY AMD can be successful: a perfect storm of improbable bad decisions and unlikely circumstances that come together to screw both Intel and NVIDIA. You make it sound like Intel is going to pull the rug out from underneath NVIDIA, stating mostly your opinion, which is fine, but then you turn that same big bad wolf into a bumbling puppy who's about to have its @ss handed to it by the likes of AMD - a company on the verge of insolvency, with low employee morale and a brand-new CEO with a BS in Information Systems from Hartwick College?

    With that being said...

    "You are missing the point. The discrete gpu produced by nVidia to sell in the x86 space is dependent on old tech being sold in a mid-price point market. All new bleeding edge designs get replaced by new bleeding edge designs. "

    You're changing the argument again; the argument was regarding the viability of and alternatives to PCIe and how it would affect NVIDIA. If that is your point, then fine, but you are assuming that somehow the world abandons PCIe overnight, and you fail to acknowledge that the midrange market is not NVIDIA's sweet spot, which pretty much makes your last point moot.

    However, if you want to continue in this direction, all I can say is: NVIDIA already started down this path some 8-10 years ago, and now we have SoCs with high-end GPUs that scale efficiently, virtualized GPUs for the data center, automotive, etc. If you read back, I already made brief reference to this.

    " The x86 AMD APU is eroding that mid-range market to the point that most analysts agree will be the end of the discrete gpu in about 3 to 5 years in anything but a workstation. " Again, conjecture and opinion - not to mention an argument that has failed to materialize for either AMD or Intel in the last 10 years, despite grand promises.

    "Without the x86 discrete GPU market nVidia can not pay to design new high performance silicon and expect to be competitive pricewise in the market." You clearly don't know what butters NVIDIA's bread. How many times do I have to say HIGH-F'ING-END?!

    "Tegra is also a cheap chip nowhere near the power of anything AMD is producing for the x86 market "

    Yes, because that is the market it is intended to sell into, and it does so with a TINY 5-8W draw! Again, read my comments about how NVIDIA has invested IP in scaling those blocks. There is no reason, with PD, that NVIDIA couldn't create a chip on par.

    " or for that matter the ARM Opteron server market." HA! That POS isn't even out and you're claiming victory! Shall I remind you that NVIDIA was the first to consider this market with Denver, and to answer your question: YES! AMD is putting together some crappy stock cores and calling it a day, while PD was designed to scale. If you're going to rebuke tech, be sure you are well informed; clearly you make statements off the cuff.

    "And yes AMD is producing ARM Operon servers to sell with SeaMicro tech. Can nVidia do that?

    12 core Opteron ARM oerver silicon is hardly a cheap chip and a much better point of a spear than Tegra. And the margins are substantialy higher."

    Start by reading my last paragraph, and YES! You might want to listen to the last NVIDIA CC. However, I think it's a market that will yield little financial reward, but hey, even NVIDIA has their issues. Also, again, there is no reason NVIDIA couldn't field a similar chip.

    "nVidia is about to get squeezed from both sides. Intel is very aware that it has no answer to this HD graphics design conundrum. Since intel does not design discrete GPU's they have no portfolio to use that's "paid for"."

    Sounds like more of the same rhetoric I have heard for years that never materialized, but hey you have your opinion and I have mine.

    "That is the writing on the wall for nVidia. " Cool; can't wait to see how much of that writing you actually read and understood, because if I remember correctly, you guys were all saying the same thing while pumping one of the umpteen new directions AMD was going in and how this was going to "spark" its turnaround. Despite all that, I wish you the best in your investments and hope you make boatloads of money. If AMD does turn it around, great, but I wouldn't bet too heavily against NVIDIA - they are doing some really cool stuff if you ask me, and they seem to know the business side well.
    May 19, 2014. 01:17 AM
  • AMD Is Giving Nvidia A Headache [View article]

    "Actually you might want to rethink this abit from a different perhaps less "wordy" perspective."
    Thanks for the advice, but I'll stick to my original statement - I've used both, and I am speaking, in part, from experience.

    "Any comparison of CUDA to OpenCL; all things being equal algorythmically begins to fall apart when you consider that OpenCL will likely "work better" with AMD's x86-64 enhanced instruction set."

    Perhaps you've never used either? The algorithms used will affect performance and vary from architecture to architecture, because how a developer chooses to implement such an algorithm directly affects how that high-level code is translated into instructions - heavy use of floating-point vs. integer arithmetic, or something like an FFT, for example.
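    A toy way to see why algorithm choice can swamp the CUDA-vs-OpenCL question (pure operation counting, not a real GPU benchmark):

```python
import math

# The "same" computation implemented two ways maps to wildly different
# instruction counts, so platform comparisons hinge on implementation
# at least as much as on the API or the silicon underneath.

def naive_dft_ops(n):
    # direct DFT: every output bin touches every input sample -> O(n^2)
    return n * n

def fft_ops(n):
    # radix-2 FFT: roughly n * log2(n) butterfly operations
    return int(n * math.log2(n))

n = 4096
print(naive_dft_ops(n), fft_ops(n))  # prints 16777216 49152
```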

    "OpenCL will likely "work better" with AMD's x86-64 enhanced instruction set."

    Arguing in conjecture is moot - "will likely" is neither strong enough conviction nor a plan for me to believe it in earnest without empirical data. Simply taking the things that are in AMD's wheelhouse and "guessing", "hoping", "praying" they make a difference on a somewhat equal playing field, without substantive data, is not helping your argument.

    " Mantle most assuredly will have x86 extensions and I do not see it unlikely that OpenCL will not have the same benefit" OpenCL and Mantle serve different (though somewhat overlapping) needs. Whether AMD decides to mature Mantle into a parallel computing platform is yet to be seen.

    "Besides a very good piece last year on The Register places OpenCL and CUDA on the endangered species list for some very good reasons."

    Then why bother making the argument in favor of OpenCL if you believe that?
    Besides, the entire 500-word article (yes, very substantive) is told with an AMD bias - the article's title starts with "AMD Thinks...". If you want to pull articles from the internet, please provide more objective ones - I am sure you'd feel the same way if I pulled an article entitled "NVIDIA says console gaming is dead".

    "neither AMD nor Nvidia can get away with prescribing a programming language to developers and expect widespread adoption."

    NVIDIA has gotten away with it in part. CUDA is very widely used in the arena it was intended for. I am not sure widespread adoption was their initial intent - the intent was to provide a tool which allowed a special niche to take advantage of the GPU's highly parallel workflow, though it does seem like they want it to be more ubiquitous.

    Because of that, NVIDIA continues to work at abstracting the nuances of the GPU away from the developer. As of today I can, using VS, create a CUDA app with no need to write any CU/GPU-specific instructions, and on top of that there are plenty of pre-built libraries I can simply drop in that provide tons of common functions - already doing what Lewis is claiming AMD WANTS to do. His entire argument about "having to know" the GPU is a bit hollow, as you can abstract that stuff away by adding layers as you mature the product. That is how tooling is built.
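    A loose sketch of that layering argument (the names here are invented for illustration, not any real CUDA or OpenCL API):

```python
# Tooling hides the back end behind a stable call, so the caller never
# writes device-specific code. In a real stack a "gpu" entry in this
# hypothetical registry would dispatch to an actual kernel launch;
# the caller's code stays identical either way.

def _cpu_backend(data):
    return [x * x for x in data]

_BACKENDS = {"cpu": _cpu_backend}  # hypothetical back-end registry

def square(data, backend="cpu"):
    return _BACKENDS[backend](data)

print(square([1, 2, 3]))  # prints [1, 4, 9]
```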
    May 19, 2014. 12:49 AM | 1 Like