Intel Corporation Message Board

  • wallisweaver Mar 18, 2013 11:51 PM

    Intel Cuts Manufacturing Costs With Big Data

    Intel is finding big value in big data. Over the past two years the company has developed more than a dozen data-intensive projects that have bolstered both its operational efficiency and bottom line.

    According to Ron Kasabian, general manager of big data solutions for Intel's data center group, these ongoing efforts have resulted in millions of dollars of cost savings.

    "We started the year before last, realizing we had an opportunity to leverage data that's floating around the enterprise today, (data) we weren't dealing with," Kasabian told InformationWeek in a phone interview.

    One of these predictive analytics efforts, implemented on a single line of Intel Core processors in 2012, allowed Intel to save $3 million in manufacturing costs. In 2013-14, Intel expects to extend the process to more chip lines and save an additional $30 million, the company said.

    Data-intensive processes also help Intel detect failures in its manufacturing line, which is a highly automated environment. "A lot of what we're doing is pulling log files out of manufacturing and test machines," said Kasabian. "Across our entire factory network, we're talking about 5 terabytes an hour. So it's very big volume."

    By capturing and analyzing this information, Intel can determine when a specific step in one of its manufacturing processes starts to deviate from normal tolerances.
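
    For illustration only (the article does not describe Intel's actual pipeline): below is a minimal sketch of what "pull log files and flag a step that drifts out of tolerance" can look like, assuming each log record carries a timestamp, a step name, and a measured value. The step names and tolerance bands are hypothetical.

    import csv
    from collections import defaultdict

    # Hypothetical spec tolerances per process step: (lower, upper) in nm -- illustrative only.
    TOLERANCES = {
        "etch_depth": (48.0, 52.0),
        "oxide_thickness": (9.5, 10.5),
    }

    def scan_tool_log(path):
        """Read a tool log (CSV rows: timestamp,step,value) and collect out-of-tolerance readings."""
        violations = defaultdict(list)
        with open(path, newline="") as f:
            for row in csv.DictReader(f, fieldnames=["timestamp", "step", "value"]):
                limits = TOLERANCES.get(row["step"])
                if limits is None:
                    continue
                value = float(row["value"])
                if not (limits[0] <= value <= limits[1]):
                    violations[row["step"]].append((row["timestamp"], value))
        return violations

    if __name__ == "__main__":
        for step, hits in scan_tool_log("tool_log.csv").items():
            print(f"{step}: {len(hits)} readings outside tolerance, first at {hits[0][0]}")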

    Big data benefits Intel's security efforts too. The company says its big data platform can process 200 billion server events, and provide early warning of security threats within 30 minutes.
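
    Again purely illustrative (the article says nothing about how Intel's security platform works internally): one simple way to turn a stream of server events into a 30-minute early warning is a sliding time window with a per-host rate threshold. The event shape and the threshold below are assumptions.

    from collections import deque
    from datetime import timedelta

    WINDOW = timedelta(minutes=30)   # warning horizon mentioned in the article
    THRESHOLD = 500                  # hypothetical: suspicious events per host per window

    def alert_on_bursts(events):
        """events: iterable of (timestamp: datetime, host: str), assumed time-ordered.
        Yields (timestamp, host, count) whenever a host exceeds THRESHOLD events
        within any 30-minute window."""
        recent = {}                  # host -> deque of timestamps still inside the window
        for ts, host in events:
            q = recent.setdefault(host, deque())
            q.append(ts)
            while q and ts - q[0] > WINDOW:
                q.popleft()
            if len(q) > THRESHOLD:
                yield ts, host, len(q)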

    From InformationWeek

    • keep pumping Waldo, maybe some day INTC might get back to 24.00; pity the markets got overbought and are taking INTC, the dog, down with it.

    • "A lot of what we're doing is pulling log files out of manufacturing and test machines," said Kasabian. "Across our entire factory network, we're talking about 5 terabytes an hour. So it's very big volume."
      By capturing and analyzing this information, Intel can determine when a specific step in one of its manufacturing processes starts to deviate from normal tolerances....
      SPC
      Statistical process control (SPC) is a method of quality control which uses statistical methods. SPC is applied in order to monitor and control a process. Monitoring and controlling the process ensures that it operates at its full potential. At its full potential, the process can make as much conforming product as possible with a minimum (if not an elimination) of waste (rework or trash). SPC can be applied to any process where the "conforming product" (product meeting specifications) output can be measured. Key tools used in SPC include control charts; a focus on continuous improvement; and the design of experiments. An example of a process where SPC is applied is manufacturing lines.
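
      To make the control-chart idea concrete, here is a minimal Shewhart x-bar chart sketch (generic SPC math, not anything Intel has published), for the simple case where the process target and sigma are already known; subgroup means outside the 3-sigma control limits are flagged.

      from statistics import mean
      from math import sqrt

      def xbar_limits(target, sigma, n):
          """Control limits for an x-bar chart with known target and sigma: target +/- 3*sigma/sqrt(n)."""
          half_width = 3 * sigma / sqrt(n)
          return target - half_width, target + half_width

      def out_of_control(subgroups, target, sigma):
          """Return (index, subgroup mean) for every subgroup whose mean falls outside the limits."""
          lcl, ucl = xbar_limits(target, sigma, len(subgroups[0]))
          return [(i, mean(g)) for i, g in enumerate(subgroups) if not (lcl <= mean(g) <= ucl)]

      # Made-up example: oxide thickness in nm, target 10.0, known sigma 0.15, lots of 5 wafers.
      lots = [
          [10.1, 9.9, 10.0, 10.2, 9.8],
          [10.0, 10.1, 9.9, 10.0, 10.1],
          [10.5, 10.6, 10.4, 10.5, 10.7],   # this lot has drifted upward
      ]
      print(out_of_control(lots, target=10.0, sigma=0.15))   # -> [(2, 10.54)]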

      • 2 Replies to semi_equip_junkie
      • SPC was first used on a sustaining basis by Western Electric in the late 1920s and 1930s, led by Dr. Walter Shewhart. When Motorola started having quality problems in the 1980s, some smart people decided to research how phone people made high-quality phones back in the "old days", and lo and behold, they found that the folks at Western Electric (the people who made the phones and equipment for the AT&T system - a government-granted monopoly, by the way) were pretty smart about it.

        SPC has great application at Intel - high-volume production that requires extraordinarily tight tolerances - exactly what SPC is best at. The danger with SPC is getting people to change processes once they get them "in control" - use of SPC can reinforce a natural resistance to change. This is a problem for all businesses to one degree or another, but in spades for tech: "controlled" production processes in an industry where change is frequent.

      • SPC is nothing really new - but Intel refined it and enhanced it.
        Six Sigma is a set of tools and strategies for process improvement originally developed by Motorola in 1985.[1][2] Six Sigma became well known after Jack Welch made it a central focus of his business strategy at General Electric in 1995,[3] and today it is used in different sectors of industry.[4]

        Six Sigma seeks to improve the quality of process outputs by identifying and removing the causes of defects (errors) and minimizing variability in manufacturing and business processes.[5] It uses a set of quality management methods, including statistical methods, and creates a special infrastructure of people within the organization ("Champions", "Black Belts", "Green Belts", "Orange Belts", etc.) who are experts in these very complex methods.[5] Each Six Sigma project carried out within an organization follows a defined sequence of steps and has quantified financial targets (cost reduction and/or profit increase).[5]
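
        As a small worked illustration of how Six Sigma quantifies defect rates (generic math, not specific to Intel or Motorola): defects per million opportunities (DPMO) maps to a "sigma level" through the normal distribution, conventionally with a 1.5-sigma shift, and 6 sigma corresponds to about 3.4 DPMO. The defect counts below are made up.

        from statistics import NormalDist

        def dpmo(defects, units, opportunities_per_unit):
            """Defects per million opportunities."""
            return defects / (units * opportunities_per_unit) * 1_000_000

        def sigma_level(dpmo_value, shift=1.5):
            """Short-term sigma level for a given DPMO, using the conventional 1.5-sigma shift."""
            yield_fraction = 1 - dpmo_value / 1_000_000
            return NormalDist().inv_cdf(yield_fraction) + shift

        # Made-up example: 1,200 defects over 50,000 units with 10 inspection points each.
        d = dpmo(1200, 50_000, 10)
        print(round(d), round(sigma_level(d), 2))   # -> 2400 DPMO, roughly 4.32 sigma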

 