IBM unveils world’s first 7nm chip

Discussion in 'Tech Talk' started by Deadend, Jul 9, 2015.

  1. Deadend
    Veteran Crowfall Member

    Joined:
    Jun 22, 2008
    Messages:
    1,449
    Likes Received:
    14
    Occupation:
    Monkey.
    http://arstechnica.com/gadgets/2015/07/ibm-unveils-industrys-first-7nm-chip-moving-beyond-silicon/
    Death of the big box PC is coming; it's been a bit slower than I thought it would be. If it weren't for certain entrenched commercial producers it would be gone by now. 14nm was more than enough for a powerful system on a chip, but there's still too much big business in supplying for big box PCs, although it's in a slow decline now.

    Even though they are saying 2 years for commercial use, they don't mean us. They mean server farms and supercomputers and such; it's probably 5 years, or more likely closer to 10, before this sees retail.

    AMD is already beginning to split their company up, likely in an attempt to have something survive the transition, and it wouldn't surprise me in the least if Nvidia was quietly trying to sell right now.

    It depends on how much trouble companies have putting this through an actual retail production line, though. Intel is having problems with their 10nm production, but I think they are still using silicon only rather than an alloy like the 7nm process does; I don't know how much easier or harder the alloy will make things.

    Well, at the very least we've all probably got another 5 years, maybe 10, with our big sexy boxes :D
     
  2. EniGmA1987
    Veteran Staff Member Xenforcer

    Joined:
    Aug 25, 2010
    Messages:
    4,778
    Likes Received:
    34
    The EUV news is the most significant part of the article by far. Every single player in computer chip fabrication has been working towards and waiting on that tech for a long time now, and not having it available has significantly held back advancement.
     
  3. Rbstr
    Veteran

    Joined:
    Feb 16, 2012
    Messages:
    391
    Likes Received:
    0
    Extreme-UV, also known as X-ray ;). It's always been odd to me that they feel like they need to use a euphemism there.
    A very large issue with going any smaller is that you start to run out of atoms and bump into a regime where your electrons can simply tunnel through the gate in large numbers. That leakage current completely destroys any efficiency gains you'd usually see.
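    For anyone curious, here's a quick back-of-envelope sketch of both points: the photon energy of the 13.5 nm EUV lithography light, and a textbook WKB estimate of how fast electron tunneling through a thin barrier blows up as it shrinks. The constants are standard; the 3 eV barrier height and the widths are made-up illustrative numbers, not real transistor specs.

```python
import math

# Physical constants (CODATA values); barrier numbers below are illustrative.
HBAR = 1.054571817e-34    # reduced Planck constant, J*s
M_E = 9.1093837015e-31    # electron rest mass, kg
EV = 1.602176634e-19      # joules per electron-volt
HC_EV_NM = 1239.84        # h*c in eV*nm

def photon_energy_ev(wavelength_nm: float) -> float:
    """Photon energy E = h*c / lambda, in eV."""
    return HC_EV_NM / wavelength_nm

def tunneling_probability(barrier_ev: float, width_nm: float) -> float:
    """WKB estimate exp(-2*kappa*d) for an electron hitting a
    rectangular barrier much taller than its own energy."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR
    return math.exp(-2 * kappa * width_nm * 1e-9)

# EUV lithography light: 13.5 nm works out to roughly 92 eV,
# which is right at the border of the soft X-ray range.
print(f"13.5 nm photon: {photon_energy_ev(13.5):.1f} eV")

# Leakage grows exponentially as the barrier thins by single nanometers.
for width in (3.0, 2.0, 1.0):
    print(f"{width:.0f} nm barrier: P ~ {tunneling_probability(3.0, width):.1e}")
```

    The takeaway is the exponential: thinning a hypothetical 3 eV barrier from 3 nm to 1 nm raises the tunneling probability by many orders of magnitude, which is why leakage current dominates at these scales.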

    2 years is highly optimistic for deployment anywhere. The nature of integrated circuit technology is also that most developments deploy to just about all important market segments at once. Intel's new node usually lands on laptop chips first. It's unlikely you'll see supercomputer architecture being the first place this is deployed; cost effectiveness usually rules that space and the server-farm space, and new nodes aren't as great for that as they might have been in the past.

    The other aspect of this is that die shrinkage doesn't completely equate to current players being in trouble. They're only in trouble if they don't make stuff that people want.
    Even though mobile is the growth area, many people want more power than a Core-M gives, and plenty of people want a really powerful computer for various reasons. AMD is in trouble because everything they make is basically outclassed above by Intel or below by the various mobile chips (and now Intel too, TBH). nVidia is in hotter water, but someone still needs to make bigboy graphics, and they have seen the mobile light, even if they aren't too successful there.
     
  4. EniGmA1987
    Veteran Staff Member Xenforcer

    Joined:
    Aug 25, 2010
    Messages:
    4,778
    Likes Received:
    34
    Nvidia burned most mobile companies too many times, which is why you really don't see their Tegra line anywhere anymore, though I do love me some Tegra X1.

    They have a lot of business on the server and workstation side because of their good software integration and support, something AMD severely lacks. Supposedly, though, people on the server side of the industry are not too happy about NVLink. I'm not really sure why; both Nvidia and IBM are behind the tech and it seems all good on paper. I guess it is more about the tech being proprietary and the move to it locking options down to Nvidia-only hardware. Seems odd to me, since it is IBM's Power architecture that has NVLink built into the CPU to actually allow communication with the NVLink built into Nvidia GPUs, and when you are buying IBM server stuff you are pretty much locked into IBM-only stuff anyway... *shrug*

    I'm interested in seeing the tech on the desktop side, but since it requires CPU hardware integration as well as motherboard integration, that probably means we will never see it on AMD processors unless AMD is also allowed to use it on their GPUs. And unless Intel has particular plans for the tech, I doubt they will dedicate a chunk of their transistor budget to Nvidia's proprietary tech, especially when it means having both NVLink and PCI-E graphics connections built into the hardware to support both GPU companies. That is just asking too much, regardless of Nvidia's marketshare.
     
  5. Deadend
    Veteran Crowfall Member

    Joined:
    Jun 22, 2008
    Messages:
    1,449
    Likes Received:
    14
    Occupation:
    Monkey.
    Intel just pushed their 10nm chips out another year and will be running another 14nm CPU line for 2016.
     
  6. EniGmA1987
    Veteran Staff Member Xenforcer

    Joined:
    Aug 25, 2010
    Messages:
    4,778
    Likes Received:
    34