Advanced Micro Devices, Inc. Message Board

hei4me  |  57 posts  |  Last Activity: 8 hours ago  |  Member since: Nov 8, 2004
  • The significance of Knights Landing over its predecessor is that it is a stand-alone parallel processor that does not need a host Xeon CPU in order to function. This gives Knights Landing an advantage over Nvidia's Tesla HPC accelerators, which still need a host Xeon. Now you know the rest of the story about why Nvidia joined IBM's OpenPOWER consortium: Nvidia had no other choice, because going forward it was up against Knights Landing.

  • Reply to

    AMD still has not started to use 20nm process tech

    by usforced Mar 25, 2015 8:32 AM
    hei4me Mar 25, 2015 10:45 AM

    I believe you should read what is printed instead of running off at the mouth.

    It is clear you do not understand the difference between low-power and high-power processes; it is all in the wattage the chip is designed to run at. Low power means just that: chips in mobile phones run at low power, while a high-power chip, such as a discrete GPU, runs at high power. If TSMC does not offer a high-power 20 nm process, then there is no way AMD or Nvidia can move to 20 nm.

    I know it is common to hear that the 20 nm process was too expensive for AMD and Nvidia, but that was just made up to save face. TSMC, in order to maximise profit and at the same time serve a new master, Apple, abandoned the high-power 20 nm process, leaving Nvidia and AMD hanging.

  • Reply to

    Mullins Is Great For Game-Centric Tablets

    by usforced Mar 25, 2015 7:38 AM
    hei4me Mar 25, 2015 10:27 AM

    Only a truly gullible AMD fanboy would believe Papermaster's claim that AMD will not sell into the Chromebook market while it is, at the same time, selling into the console market, where margins are even slimmer.

    If Mullins were all it was supposed to be, it would be a perfect fit for Chromebooks, since the key is producing it cheaply in order to serve a low-ASP market. AMD and its customers realize Mullins has other failings that keep it out of the Chromebook market. Papermaster was too embarrassed to name those failings, so instead he made up some rubbish.

  • Reply to

    AMD still has not started to use 20nm process tech

    by usforced Mar 25, 2015 8:32 AM
    hei4me Mar 25, 2015 10:12 AM

    It is amazing how you can miss the point of all this!!!

    It does not matter if Samsung or TSMC moves down to 16 nm or 14 nm nodes if those nodes, just like 20 nm, are low power, while AMD needs high-power nodes for its business.

  • Reply to

    AMD still has not started to use 20nm process tech

    by usforced Mar 25, 2015 8:32 AM
    hei4me Mar 25, 2015 9:56 AM

    "TSMC's 20nm manufacturing process was designed for low-power mobile system-on-chips for smartphones and tablets. At present it is used by Apple, Qualcomm and some other customers of the world's largest contract maker of semiconductors."

    It's not a matter of AMD skipping 20 nm; it is a matter of TSMC skipping AMD. TSMC is now catering to the likes of Apple.

    Who in his right mind expected anything different from the foundries? When foundries need to make a name for themselves they take care of all their customers, but once the initial pecking order is in place the larger customers get the gravy, and customers such as Nvidia and AMD, who have waited four years for a process shrink, are left holding the bag.

    It's much worse for AMD than for Nvidia, since Nvidia understood TSMC's 28 nm process and created GPUs that grew performance while using less power. AMD, on the other hand, seemingly did not understand the 28 nm process, since its GPUs may have grown performance, but at the terrible price of extremely high TDP. Nothing spotlights this AMD failure more than the two years that have gone by without AMD responding to Nvidia's new GPU designs.

  • Reply to

    Freesync is a scam

    by hei4me Mar 24, 2015 1:48 PM
    hei4me Mar 24, 2015 2:34 PM

    I do not know if you realize this or not, but G-Sync, and FreeSync if it worked, are not intended for systems that can comfortably power 2K or 4K displays. But I guess if you went with Nvidia G-Sync you could get away with a slightly less powerful system, thanks to G-Sync's low-end masking abilities.

    Anyway, I am not on either the G-Sync or FreeSync bandwagon, since everything I read about why you need those technologies is made-up horse dung. The reason modern games stutter is that the graphics RAM gets used up, and the GPU has to flush and replenish that memory for the next frames. That is why the Titan X has 12 GB of GDDR5 memory: minimizing the flush-and-replenish cycles in turn minimizes stutter.

    The first game in which this memory stutter was most pronounced was F.E.A.R., in the era of 128 MB, 256 MB and 512 MB cards. The web was alive with gamers trying to fix the F.E.A.R. stuttering problem on their high-end cards, but there was no fix other than turning down settings; the same was true of Crysis.

    You need to ask yourself: why would Nvidia and AMD pay the cost of large amounts of GDDR5 memory if lack of memory were not the cause of gaming stutter? A rough back-of-the-envelope sketch follows this post.
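
    To put a number on that claim, here is a rough back-of-the-envelope sketch in Python. Everything in it (the texture counts, sizes, and the 512 MB budget) is my own illustrative assumption, not data from any game or vendor; it only shows how quickly uncompressed textures can outgrow a card of that era and force the flush-and-replenish cycle described above.

    # Rough, hypothetical estimate of texture memory versus card VRAM.
    # All sizes and counts below are illustrative assumptions.
    def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
        """Approximate VRAM needed for one uncompressed texture."""
        base = width * height * bytes_per_pixel
        # A full mipmap chain adds roughly one third on top of the base level.
        return int(base * 4 / 3) if mipmaps else base

    # Hypothetical scene: three hundred 512x512 textures plus fifty 1024x1024 ones.
    scene = [(512, 512)] * 300 + [(1024, 1024)] * 50
    total = sum(texture_bytes(w, h) for w, h in scene)

    vram = 512 * 1024 * 1024  # a 512 MB card of the F.E.A.R./Crysis era
    print(f"textures: {total / 2**20:.0f} MiB, VRAM: {vram / 2**20:.0f} MiB")
    if total > vram:
        # Anything over budget has to be swapped in over the bus mid-game,
        # which is the flush-and-replenish hitching described above.
        print(f"over budget by {(total - vram) / 2**20:.0f} MiB -> expect stutter")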

  • Reply to

    Freesync is a scam

    by hei4me Mar 24, 2015 1:48 PM
    hei4me Mar 24, 2015 2:08 PM

    I know you realize these are the first FreeSync displays to hit the market, and thus they have been shepherded by AMD from the get-go. As I said, this explains the slow introduction of FreeSync displays: who would want to be the first on the block to bring ghosting back to the gaming world?

  • Read the PC Perspective article about the new FreeSync monitors and how they perform. I came away with the perception that FreeSync does absolutely nothing, which is the reason FreeSync can allow multiple outputs.

    Here is a clue for you: G-Sync was developed so Nvidia could enter the PC console market with its Shield brand. With G-Sync, Nvidia envisioned low-power Shield consoles appearing to deliver smooth gaming on AAA titles. This is done, as PC Perspective explained, by redrawing the same frame perhaps 50 times while waiting for the next frame to render (see the sketch after this post). The result appears to be smooth frame rates even though the GPU is performing like a dog. FreeSync has no low end, so you get the stuttering and judder expected from a game performing below the minimum.

    Why did I bring this up? Simply put, FreeSync will not help AMD's APUs game any better than they do without it, since without a low end, as G-Sync has for weaker systems, the stuttering will continue for AMD APUs.

    In other words, Nvidia's G-Sync masks underpowered system performance, while FreeSync does nothing at all for low-powered systems.

    As for the ghosting issue PCPer found with FreeSync displays, this is a matter of the displays being tuned outside their ranges in order to adapt to FreeSync. Perhaps that is the reason FreeSync displays have taken so long to debut. The worst part of all this is that ghosting is supposed to be a thing of the past, considering the ultra-fast refresh rates of modern displays. How unruly must FreeSync be to bring ghosting back from the past to the present?
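
    As a thought experiment, here is a minimal Python sketch of the low-frame-rate behaviour PC Perspective described for G-Sync: repeat the last finished frame until the next one is ready, so the panel never drops below its floor. The 40 Hz floor and every name in it are my own assumptions for illustration, not Nvidia's actual module logic; the point is only that FreeSync, as reviewed, had no such repeat path below the panel's minimum.

    # Hypothetical model of frame repetition below a panel's variable-refresh floor.
    PANEL_MIN_HZ = 40.0  # assumed bottom of the panel's variable range

    def scanouts_per_frame(gpu_frame_time_s):
        """How many times the last frame is redrawn before the next one is ready."""
        max_hold_s = 1.0 / PANEL_MIN_HZ  # longest the panel can hold a single image
        if gpu_frame_time_s <= max_hold_s:
            return 1  # inside the variable range: one scanout per rendered frame
        # Below the floor: keep repeating the previous frame until the new one lands.
        return int(gpu_frame_time_s / max_hold_s) + 1

    # A weak system crawling at 5 fps (200 ms per frame) gets each frame
    # re-scanned about nine times, so motion looks steadier than the GPU deserves.
    print(scanouts_per_frame(0.200))     # -> 9
    print(scanouts_per_frame(1.0 / 60))  # 60 fps, inside the range -> 1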

  • hei4me Mar 24, 2015 10:58 AM

    A better question is what happened to the huge Verizon deal that was promised two years ago?

    One thing you can depend upon with AMD pumpers: they will never hold the company's feet to the fire over announced new customers that then, like a puff of smoke, are gone!

  • hei4me Mar 19, 2015 11:07 AM

    After many years of calling the so-called bashers on this board idiots for pointing out the problems with Bulldozer, we now have pumpers throwing Bulldozer under the bus.

    To make this sudden turnaround of allegiance far worse, the pumpers offer no apologies for leading newbie AMD investors down the Bulldozer primrose path. All they offer is turning their backs on today's technology in order to pump tomorrow's technology. Funny, though, that over the years the present technology was always pumped just as hard as tomorrow's.

  • Reply to

    Interesting article on AMD

    by mykhello Mar 13, 2015 8:51 PM
    hei4me Mar 14, 2015 8:30 AM

    What I find weak about the article is that Mr Luskin at no time mentions that AMD has slashed R&D back to 1990s levels. Perhaps when he called AMD nimble it was an optimistic view of slashing R&D, management, and the workforce many times over the last couple of years?

    I would rather believe Mr Read, who, in a rare moment of truthfulness, stated that going forward AMD investors would suffer many bumps in the road. Mr Read paid for his candidness by being replaced shortly afterward. Since that time AMD's revenues have fallen from $1.6 billion to $1.24 billion, showing that Mr Read was right and that the Board, just like Mr Luskin, are snake-oil salesmen who would rather imagine how things could be than face reality as Mr Read did.

    And by the way, Apple has always been an OEM, but now it is a vertically integrated corporation that sells complete systems in which many components are Apple-produced. Comparing AMD to what Apple once was is ridiculous; it would be the same as comparing Intel to HP or Dell. Intel, like AMD, is a component supplier; Apple, Dell, and HP are OEMs.

  • hei4me Mar 13, 2015 11:58 AM

    Everything you list as a positive in reality has negative connotations attached.

    1. "AMD is spending less money" would be a good thing if there weren't global competitors out there spending on R&D at an extremely rapid pace. R&D spending as low as AMD's has not been seen since the turn of the century.

    2. "It has less debt" is only true if you are comparing against the $5.4 billion owed after the ATI closing; if you are speaking of the present you are mistaken, since AMD's debt has grown by $200 million over the last two years, which erodes the savings that should have come from refinancing the debt earlier.

    3. "It has better current assets" is a bald-faced lie, since AMD has sold off every hard asset it owned and is left with IP assets that are only worth something if AMD's products perform; if they do not, other companies will make AMD's IP worthless.

    4. "The long-term debt is better distributed over time" depends on the eyes of the beholder. Some may say that extending debt makes it seem to go away, but others would say extending AMD's debt just makes the quarterly interest payments go on and on, with the balloon payment just below the horizon.

    5. "It has better income from new activities, and the focus of the CEO is to change in order to capitalize on new opportunities in the future semiconductor market." Sorry to remind you, but the custom-chip business is a loser that brings in bare revenue and does not grow the company's bottom line. Some may argue that it keeps the lights on, but if AMD needed to reduce R&D spending so dramatically in order to show near-break-even numbers, I would argue it is the cuts that are driving AMD's financials and not the semi-custom unit.

  • Reply to

    relentless AMD attack by sellers once again

    by hard_primer Mar 12, 2015 10:44 AM
    hei4me Mar 12, 2015 11:13 AM

    A clue for you guys: Intel dropped its Q1 2015 revenue forecast by a billion dollars, citing the PC slowdown. In the best of times AMD performs in lockstep with Intel in the consumer PC market. But these aren't the best of times for AMD and its consumer PC business, so if Intel is warning of lower PC revenues, you can be certain that, proportionally, AMD will suffer lower PC sales and then some.

    So, my friends, the uninformed such as yourself blame AMD's stock-price drop on weak hands, but in truth those sellers, without AMD saying a word, see the writing on the wall.

    Truly, the writing on the wall is not etched upon some low and narrow wall; it is etched upon a neon, billboard-sized wall, since AMD's PC unit has now had falling revenues for over four years running.

  • Reply to

    More worries for AMD

    by hei4me Mar 11, 2015 11:59 AM
    hei4me Mar 11, 2015 2:49 PM

    It is not really apples to apples, since all the tests are run on benchmarks that should favor ARM because they are not x86. It is very similar to tech writers saying Intel's mobile processors are slower than their ARM counterparts without accounting for the fact that Intel's mobile processors still have to run Android or an x86 OS. The same goes for these microserver benchmarks: if they, like 95% of the server world, used x86 operating-system benchmarks, Intel's CPUs would sing and the X-Gene 1 would get a big fat N/A.

    This is another reason, beyond the X-Gene 1 doing so poorly on performance per watt: who is going to spend money purchasing the microserver and then have to write or buy apps that work with the X-Gene 1? There may be a few die-hard ARM loyalists in IT out there, but 99% of IT personnel are going to stick with the horse that brought them rather than be talked into buying a flea-bitten nag.

  • If everything else were not bad enough, it seems the 64-bit ARMv8 AppliedMicro X-Gene 1 is a watt-sucking underachiever compared with all the hype about microservers using ARM64.

    According to AnandTech's Johan De Gelas, the X-Gene 1's performance is lower than that of the Atom-based C2750, while it consumes the same or more power than a Haswell-based Xeon that performs 10X better.

    Now you know the rest of the story of why AMD is not in a hurry to release its own ARM64 microserver chip. I can remember a time when the hype masters were in full swing, but now, when it comes to AMD's ARM64, the silence is deafening.

  • hei4me Mar 10, 2015 8:49 AM

    It's hard sometimes to read reports that are half-truths, such as "the first time AMD has licensed its Radeon graphics technology." How about the Xbox 360, where AMD still receives royalties for the GPU inside? Or how about the Nvidia and Intel deal, in which, over a six-year period, Intel pays Nvidia $1.5 billion for Nvidia's graphics technology?

    Or how about the lawsuits Nvidia has filed against some of the ARM SoC graphics companies over using Nvidia IP?

    You can thank Nvidia for AMD's MediaTek deal: MediaTek, needing graphics IP, did not want to go the infringement route its ARM brothers have taken, because sooner or later the chickens are going to come home to roost and they are going to have to pay Nvidia for its technology.

    Quite funny how the writer mentions that Nvidia has not licensed a single technology since 2013 but does not mention that the entire ARM ecosystem infringed upon Nvidia instead of licensing from it.

  • Reply to

    Freesync is ready

    by usforced Mar 6, 2015 9:20 AM
    hei4me Mar 6, 2015 10:38 AM

    For the same reason Mantle has been placed in hibernation, FreeSync will suffer the same fate. AMD does not have enough market share, and having FreeSync work only with the latest AMD APUs and cards reduces that market further. Just as with Mantle, if AMD did not pay developers to code for it, it was not used; with FreeSync, displays will not come with the special display scaler chip if the AMD market is not large enough to support it.

    Also, FreeSync is Mantle's kissing cousin: both are used to give an underperforming CPU/GPU a boost. Good for AMD, but not needed when you purchase an Intel/Nvidia or Intel/Radeon system.

    I can see now why Nvidia introduced G-Sync: with Shield, Nvidia is becoming a seller of complete systems. The problem with those Shield systems is that they are very weak performers and need the stable, medium frame rates a G-Sync display can offer.

    This could also be true of consoles, if the console CPU/GPU supported FreeSync. Enough consoles are sold to make it worthwhile for display producers to add the display scaler chip.

  • Reply to

    Mantle Ends Up Like All AMD Projects: DEAD

    by bowjagger Mar 3, 2015 8:33 PM
    hei4me Mar 5, 2015 9:43 AM

    The correct story as far as AMD's Mantle goes: Mantle was a gimmick that created a graphics filter that lowered visuals in order to gain CPU performance. What do you think Microsoft's closer-to-the-metal Xbox 360 does in order to play modern games? The Xbox 360 takes what is supposed to be a DirectX 10 or 11 game and lowers the visuals to DirectX 9 levels.

    Since it is ATI's unified-shader GPU inside the Xbox 360, I would imagine AMD inherited this closer-to-the-metal knowledge of how to create the same filter for the PC that Microsoft uses for the console.

    Until just recently there was no need for a graphics filter in the PC arena, but with the Bulldozer fiasco AMD lost the ability to run DirectX 10 and 11 efficiently in the PC CPU/GPU arena, so AMD was forced to pull the console's close-to-the-metal rabbit out of its hat for the PC.

    Just as consoles have lowered game visual standards for the last seven years, Mantle would have done the same for the PC, lowering visual standards in order to let AMD's faulty CPUs game slightly better.

    This is how I see it: if you want visually filtered console gaming, purchase a console; but if you want PC gaming, purchase an Intel/Nvidia or Intel/Radeon gaming PC, which allows present-day DirectX gaming.

  • Reply to

    Mantle Ends Up Like All AMD Projects: DEAD

    by bowjagger Mar 3, 2015 8:33 PM
    hei4me Mar 5, 2015 2:26 AM

    Mantle was a gimmick that showed very little extra performance beyond what a driver refresh delivers. That is why, at the same time AMD released Mantle, Nvidia did its own driver refresh and soundly tore Mantle a new one.

    As far as Microsoft is concerned, AMD did not influence it in any way, since DirectX 12 was due a year or two ago but Microsoft placed it on the back burner for two reasons. First, PCs were getting less powerful, as companies such as HP and Dell configured five-year-old legacy components and branded them as new systems. This is called good-enough computing!

    Second, Microsoft, in order to deliver an OS that would not bog down tablets or low-end netbooks, needed a bare-bones version of Windows that did not include a demanding graphics API such as DirectX 12.

  • hei4me Mar 4, 2015 10:05 AM

    With the Mantle obituary, another blaze that the AMD fire company is going to let burn out!!!!!
