
Advanced Micro Devices, Inc. Message Board

hei4me 21 posts  |  Last Activity: Jul 6, 2016 11:03 AM Member since: Nov 8, 2004
  • Reply to

    Is Nvidia’s GP104 based GT1060 real?

    by mark1952 Jul 6, 2016 9:52 AM
    hei4me Jul 6, 2016 11:03 AM

    This is about the dumbest story ever. The 1080 die is the cream of the crop of the wafer, the 1070 is a die that did not make the 1080 grade, and every other die that comes off that wafer is a lower-grade die than the 1080 or the 1070.

    You cannot have it both ways, claiming low yields for the 1080 and then saying there cannot be a 1060 because the 1080 takes up all of the available dies. The lower the yield for the 1080 or 1070, the higher the yield for the 1060.

    The only reason Nvidia has not released the 1060 is to protect the ASPs of the 980 and 980 Ti, since it is obvious the 1060 will be every bit as good as those cards, given the performance advantage the 1080 and 1070 hold over them.

    In other words, do not depend upon this article to make investment choices, or you will be a very sad investor indeed.

  • hei4me Jul 6, 2016 10:48 AM

    No, this is what Nvidia did to lower its wattage ratings: Nvidia scaled back consumer GPGPU (general-purpose GPU computing) abilities long ago. It is AMD that carried the highly power-inefficient consumer GPGPU design of the 290 series forward into the 480.

    To make matters worse, AMD has not been able to turn GPGPU off when there is no need for it, which is 99% of the time, making AMD cards power hogs compared to Nvidia's.

    AMD's first attempt at lowering consumer GPGPU abilities was the Fury series, which showed that by cutting GPGPU back to Nvidia's level you got a card whose performance and wattage were far better than any of the GPGPU-dependent 290 or 390 cards. For the gamer, who is number one with a bullet when it comes to purchasing graphics cards, the Fury was AMD's first attempt to be competitive with Nvidia in the consumer space.

  • Reply to

    AMD Radeon RX 480 8GB Review

    by usforced Jul 5, 2016 7:01 AM
    hei4me Jul 5, 2016 10:15 AM

    You do realize that the 1080 gives you more than twice the performance of the 480 while drawing only about 4 W more? That means the 1080 is roughly 100% more efficient than the 480; put another way, the 480 delivers about half the performance per watt of the 1080 or 1070.
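The efficiency arithmetic above can be sketched with illustrative numbers. The figures below are assumptions for the calculation, not measured data (exact draws vary by review); the point is what "twice the performance at nearly the same power" implies for performance per watt:

```python
# Illustrative performance-per-watt comparison. The performance and wattage
# numbers are placeholder assumptions, not measured figures.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Efficiency as relative performance delivered per watt drawn."""
    return relative_perf / watts

rx480 = perf_per_watt(relative_perf=1.0, watts=164.0)    # assumed board draw
gtx1080 = perf_per_watt(relative_perf=2.0, watts=168.0)  # assumed ~2x perf, +4 W

# Twice the work at near-identical power makes the faster card ~100% MORE
# efficient; the slower card is ~50% as efficient, not "100% less efficient".
advantage = gtx1080 / rx480
print(f"1080 efficiency advantage: {advantage:.2f}x")
```

Under these assumed numbers the ratio works out to roughly 1.95x, i.e. close to double the performance per watt.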

    I would say the guy who wrote this article is an AMD fanboy or is on the AMD payroll, since he does not realize the 480 isn't in the same league as the 1080 or 1070 and should not be compared with them.

    As we saw in the Anandtech review, the 480 was compared to AMD's previous 290 and 390 series cards and could only match the four-year-old 290, making the 480 an upgrade only if you have a four-year-old 290 or an older card.

  • hei4me Jul 4, 2016 12:54 PM

    Did you make those wattage numbers up yourself? The TDP of the GTX 750 is rated at 60 watts. Also, Nvidia did not tell everyone that the 750 Ti was a world beater; it was a $99 card for first-time gamers.

    It was AMD that spoon-fed gamers the idea that the 480 was all they would need; now that AMD has been caught fudging the numbers by over-tuning these review cards, AMD fanboys are trying to find instances where others have done the same.

    Well, you need look no further than AMD itself, which fudged the Fury X by tuning it higher and higher until it needed water cooling, leaving no headroom for consumers to overclock their purchased Fury X cards. The 480 suffers from the same problem: tuned so high that it exceeds the 6-pin connector and PCIe standards, negating the headroom purchasers would need to overclock their card.

    It's time for AMD fanboys to admit the problem AMD has already acknowledged and offered a BIOS fix for, lowering the tuning.

  • hei4me Jul 3, 2016 2:15 PM

    Maximum PC magazine's August 2017 edition has, at the end of the issue, Budget, Midrange, and Turbo builds, which highlight a 1070 for the Midrange and a 1080 for the Turbo. The Budget build highlights an R9 380 4GB for $190.

    So you see, AMD's PR department may claim there will be demand only for the 480, but AMD already has a 4GB 380 that sells for $190. What is the sense of receiving an additional $9 for a 480 that undercuts not only the 380 but the 390 and 390X?

    What we have here are AMD fanboys with no idea of AMD's pricing policy, hence the idiotic joy they found in the 480 selling so cheaply. LOL

  • hei4me Jul 2, 2016 3:37 PM

    Hide your head in shame, since the 6-pin connector was supposed to give the 480 all the additional power it would need.

    Since the issue shows up on the PCIe bus, anyone with half a brain can see the problem: AMD ratcheted the power up beyond what the 6-pin connector can deliver, so the card draws the additional power through the PCIe bus.

    In other words, not only did AMD clock these cards higher than what the 6-pin connector could deliver, they also altered the order in which power is drawn: first the PCIe slot's 75 watts, then all further power needs from the 6-pin connector. I believe if anyone checked the 6-pin connector's power draw, they would see that it was maxed out as well.
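The power-budget argument above can be put into a small sketch. The 75 W slot and 75 W 6-pin limits come from the PCI Express specifications; the measured draws below are hypothetical placeholders, not test data:

```python
# Sketch of the power-budget argument: a card with one 6-pin connector has a
# 150 W spec budget (75 W slot + 75 W connector). Draws below are hypothetical.

PCIE_SLOT_LIMIT_W = 75.0  # PCIe x16 slot budget per the PCIe CEM spec
SIX_PIN_LIMIT_W = 75.0    # 6-pin auxiliary connector budget per spec

def over_budget(slot_draw_w: float, six_pin_draw_w: float) -> dict:
    """Report which power sources a card pushes past their spec limits."""
    return {
        "slot_over": slot_draw_w > PCIE_SLOT_LIMIT_W,
        "six_pin_over": six_pin_draw_w > SIX_PIN_LIMIT_W,
        "total_over": (slot_draw_w + six_pin_draw_w)
                      > (PCIE_SLOT_LIMIT_W + SIX_PIN_LIMIT_W),
    }

# Hypothetical overclocked draw: both sources pushed past their limits,
# with the slot supplying the overflow the connector cannot.
report = over_budget(slot_draw_w=82.0, six_pin_draw_w=78.0)
print(report)
```

The check is trivial, but it captures the claim being made: once total board power is tuned above 150 W, something must exceed its rated limit, and here it is the slot.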

    Thus the only moron here is you, taking what AMD says at face value because you know nothing about the subject at hand.

  • hei4me Jul 2, 2016 2:02 PM

    So AMD calls it tuning; everyone else calls it overclocking.

    There's nothing worse than a company taking a middling-performance component and ratcheting up the voltage and current to make it perform better than it should, then selling those same components to consumers or OEMs tuned much lower.

    Then AMD admits, all too quickly, that it tuned review samples higher than what the PCIe slot and 6-pin power connector are rated for.

    We can only thank individuals such as those over at Tom's Hardware for exposing AMD; otherwise, when things went wrong or a consumer's performance did not match the review sites' numbers, the blame would have been placed on anyone's shoulders but AMD's.

  • hei4me Jul 1, 2016 12:25 PM

    Just like the Fury X, right out of the gate AMD has overclocked the 480 to the maximum.

    This will not be a problem for those who buy their own motherboards, which can handle the additional wattage on the 12-volt PCIe rail, but for those buying a lowest-common-denominator OEM system with a 480, the OEM will underclock the card to keep it from crashing the entire system.

    In other words, just like the reported BIOS fix lowering the 480's power consumption, OEMs will do the same, and typical 480 performance will not be as high as review sites currently report.

  • hei4me Jun 30, 2016 11:38 AM

    The main problem with the 480 is that it does not differentiate itself from the 290 or 390 currently on the market. AMD has thus undermined all of its 300 series pricing without giving gamers a card that significantly outperforms those older, more expensive cards.

    This is the same thing AMD did to its Radeon line by selling console chips that outperform all but the highest-end AMD PC systems, cannibalizing the margins and ASPs of those systems.

    Over at Tom's Hardware they didn't even bother testing the 480 against the 1080 or 1070, since doing so would have made it clear to the reader that in buying a 480 you were at best buying a status quo card from two generations ago, not a next-generation performer.

    Beyond AMD cannibalizing itself as the consoles have done, the 480 also reminds us of AMD's fatal flaw of extending the lifetime of bad designs such as Bulldozer. The 390 and 480 are nothing more than a rehash of the 290, and that rehashing has cost AMD 60% of its market share to Nvidia since the 290's launch.

    One last point: do you guys remember all the talk of Nvidia not getting its hands on HBM2 memory? Well, what happened to AMD when it comes to HBM2? In other words, nothing other than Nvidia's server accelerator cards has HBM2, so consumer cards will have to wait until 2017 and beyond, making a mockery of what was written here on the subject.

  • hei4me Jun 27, 2016 9:44 AM

    Trying to rewrite AMD/GF history there, Tom?

    If you have not paid attention, AMD was a $1.6 billion-a-quarter revenue company; now it is below $900 million. During this time AMD has not once covered its yearly contractual wafer agreement with GF. In other words, AMD has stiffed GF at least $200 million a year, and for each of the last three years GF has forgiven AMD that obligation.

    Does that sound as if GF is the problem, or is it that AMD's designs are truly of poor quality?

  • hei4me Jun 15, 2016 8:25 PM

    Not crying, just giving you a chance to hop off the bandwagon before you get taken to the cleaners once again.

    For the last eight years AMD fanatics have thrown every product that preceded a newly announced one under the bus, only to have the new products disappoint the heck out of the market.

    Anyway, I believe AMD, for the sake of the x86 nation, had better get off its rear and come up with better product to help Intel fight off the ARM hordes. Notice I did not group Nvidia with AMD and Intel, since as far as I am concerned Nvidia is part of the army of ARM invaders trying to overthrow x86.

  • hei4me Jun 15, 2016 3:33 PM

    Seeing as both of you have the patience of a saint and have followed the Zen and Polaris sagas from the onset, I believe late 2012, you do not have to read the article, since deep down you already realize that Zen and Polaris, like everything that came before them, could be failures all the same.

    I was especially impressed by Mark bringing up what AMD said about its mobile Polaris GPU. Imagine that: a 2017 AMD mobile APU that performs better than a console APU released in 2013. When I read that I almost fell out of my chair laughing. What really makes me laugh is that before that APU is released, AMD will have already produced a better console APU for Sony and Microsoft, leaving the poor AMD mobile owner sucking wind just as in the previous four years.

  • hei4me Jun 10, 2016 1:21 PM

    Love it when AMD fanboys cite a game, AOS, that AMD paid the developers to slant toward AMD.

    I really do not care either way, but fanboys, stop the crying when Nvidia offers money and software to gain advantages over AMD GPUs.

    By the way, don't you think it is strange that no finished DirectX 12 games are out yet, even though Win 10 has been available for more than a year now? A bunch of blockbusters were released during that time, but the developers did not switch from DirectX 11 to DirectX 12. Could it be that the developers' over-reliance on console game sales keeps them from moving on?

    Here again AMD kicks itself in the rear: it makes the console chips good enough for millions using DirectX 11 or whatever, while trying at the same time to convince PC gamers that DirectX 12 and AMD's Radeon are the future. LOL

  • hei4me Jun 9, 2016 1:01 PM

    So what this writer is implying is that the 480 beats the 980 Ti, and therefore also beats the Fury X by a whopping amount?

    If that were true, then why is the 480 priced at $199 and the Fury series at $600?

    From what I have read on notable tech sites, the 480 will be a little better than the 380 and not as good a performer as the 290 or 390, leaving no apparent reason to upgrade if you have one of those AMD cards.

  • Reply to

    RX 480 vs GTX 1070

    by seavanus Jun 2, 2016 12:05 PM
    hei4me Jun 3, 2016 3:23 AM

    Another hilarious angle to the 480 story is that AMD fanboys claim the 1080's performance is just on par with its frequency advantage over the 980 Ti; just a bump, they say.

    If that is true, then our AMD fanboys should explain how AMD's 480 has a lower frequency on a smaller process shrink and, to make matters worse, less than half the performance of a 1080.

    But, hei4me, the 480 costs less, $199 to be precise. That is no badge of honor when you take into account that the performance of the 1080 is 30% better than a Titan X at $400 less. Now that is a real performance-to-cost winner for Nvidia.

  • Reply to

    RX 480 vs GTX 1070

    by seavanus Jun 2, 2016 12:05 PM
    hei4me Jun 2, 2016 2:05 PM

    Funny that AMD guys will tell you they would not pay so much for a high-end graphics card, when the Nano Fury card sold for just a few dollars less than a 980 Ti. The Nano gave you half the performance of the Ti at almost the same price.

    When Fixusanow says he would run two 480s instead of one 1080, he may do so, but the vast majority of graphics cards are purchased through OEMs, and unless you as a customer customize when ordering, OEMs will not, since the power usage and cost of two 480s come within a few cents of a single 1080.

    Another point: by the time AMD actually starts selling 480s, Nvidia will have less powerful but cheaper 1000-series cards ready for wholesale to OEMs and retail for add-in board partners, forcing AMD to once again lower its prices.

    All that blah blah hype by the Radeon Graphics Group has been proven to be just that: hype!!!

    The truth is that AMD has, in Intel's language, created a Tick: the same design on a new process shrink.

    I take that back: AMD has created a half tick, where it did not take advantage of the process shrink as Nvidia has. That tells you a lot about how poor GF/Samsung's 14 nm is compared to TSMC's 16 nm shrink.

    Taking into account that neither TSMC's nor GF/Samsung's process shrink is a true 14 nm or 16 nm process, since the dies are 20 nm-class in size, AMD in die size only gains from 28 nm to 20 nm, losing half the benefit of a full 14 nm shrink.
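The "half the benefit" point rests on idealized area scaling, which can be made explicit. This is a simplification (real process nodes do not shrink every dimension by the nameplate ratio), assuming die area scales with the square of the feature size:

```python
# Idealized die-area scaling: area ~ (feature size)^2. A simplification only;
# actual node-to-node scaling is messier than the nameplate numbers suggest.

def area_ratio(new_nm: float, old_nm: float) -> float:
    """Fraction of the old die area an ideally shrunk die would occupy."""
    return (new_nm / old_nm) ** 2

full_shrink = area_ratio(14, 28)     # 0.25: a true 14 nm die is 1/4 the area
partial_shrink = area_ratio(20, 28)  # ~0.51: a 20 nm-class die, about half

print(f"28->14 nm ideal area: {full_shrink:.2f} of original")
print(f"28->20 nm ideal area: {partial_shrink:.2f} of original")
```

Under this idealization a true 14 nm die would occupy a quarter of the 28 nm area, while a 20 nm-class die occupies about half, which is the gap the post is describing.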

  • hei4me May 15, 2016 1:13 PM

    Same old, same old: AMD cannot help spinning truth into AMD truth. "Production" is in the eye of the beholder, and Nvidia beat AMD to the punch with Pascal while AMD is still spinning a weak 290/390-class Polaris, whose launch date no one really knows. To make matters worse, the Fury-class Polaris is due sometime in 2017, which means no competition for Nvidia for another year or so.

    In other words, AMD is in the same spin cycle, claiming everything is fine, but just like in 2014 and 2015, AMD will have nothing in 2016 to compete with Nvidia, thus guaranteeing Nvidia's dominance in the graphics market.

  • hei4me May 13, 2016 9:45 AM

    And then reality set in: Nvidia up by more than AMD's entire share price.

    Always remember, Nvidia does not compete in the x86 market, yet its revenues top AMD's by $500 million.

    Now the Wells Fargo guy gets the rotten egg award for unsuccessfully trying to disparage Nvidia ahead of Nvidia's release of financials. Nvidia grew revenues in both gaming and server, the two avenues Mr. Ray bet against.

  • hei4me May 10, 2016 1:04 PM

    What strikes me as hilarious is how, after AMD execs have already told AMD fanboys that Polaris is at best a middle-to-low-end chip, fanboys all over the Web try to make a silk purse out of a sow's ear: they claim AMD will take market share precisely because AMD will not have a high-end product. LOL

    These same fanboys did not hold this opinion when AMD released Fury to compete at the high end; only now, without a high end, do they claim midrange is the way to go.

    Some here are new to the board, so they do not recall the pain of the many times Nvidia launched products with AMD being late and then lacking competitive products, thus losing market share.
