    Nvidia Exec: Moore’s Law in Danger of Dying Out

    By Jeff Burt - May 3, 2010

      For more than four decades, Moore’s Law has held true, thanks in large part to Intel first cranking up the clock speeds of its processors and, more recently, rapidly growing the number of processing cores on a single chip.

      However, according to Bill Dally, Moore’s Law has reached its limit on traditional CPUs from the likes of Intel and Advanced Micro Devices, and a new way of doing things is needed if it is to continue.

      Not surprisingly, Dally, vice president and chief scientist at graphics chip maker Nvidia, believes the only salvation for Moore’s Law lies in moving from serial processing to parallel processing, and more specifically, from CPUs to GPUs.

      In a column posted April 29 on Forbes.com, Dally argues that the energy demands of the CPUs Intel and AMD are shipping have created an environment in which Moore’s Law can no longer hold.

      “We have reached the limit of what is possible with one or more traditional, serial central processing units, or CPUs,” Dally wrote. “It is past time for the computing industry - and everyone who relies on it for continued improvements in productivity, economic growth and social progress - to take the leap into parallel processing.”

      Moore’s Law sprang from a paper written by Intel co-founder Gordon Moore 45 years ago, in which he predicted that the number of transistors on a chip would double roughly every two years, and thus the performance of the CPU also would double over that time.
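
      Compounded over the law’s lifetime, that cadence is staggering. A quick back-of-envelope illustration (the two-year doubling period is the commonly cited pace, not a figure from this article):

```python
# Compound effect of the doubling cadence described above.
years = 45                 # Moore's 1965 paper to this 2010 article
doublings = years / 2      # one doubling roughly every two years
print(f"{2 ** doublings:,.0f}x the transistors")   # ~5.9 million x
```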

      However, what worked in the 1980s and 1990s is not working anymore, despite what Intel officials say, and a new way of computing must be adopted, Dally said.

      In comparing serial processing with parallel processing, the Nvidia executive pointed to the task of counting the words of his column. In serial processing, one person would count each word. In parallel processing, each paragraph would be given to a different person, and the word counts from the paragraphs would be added together.
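
      Dally’s analogy maps directly onto a standard fork-join pattern. A minimal sketch of the two approaches, with illustrative function names and sample text that do not come from the column:

```python
# Serial vs. parallel word counting, per Dally's analogy.
from multiprocessing import Pool

def count_words(paragraph):
    # One worker's share: count the words in a single paragraph.
    return len(paragraph.split())

def serial_count(paragraphs):
    # One "person" walks every paragraph in turn.
    return sum(count_words(p) for p in paragraphs)

def parallel_count(paragraphs):
    # Each paragraph goes to a different worker; partial sums are combined.
    with Pool() as pool:
        return sum(pool.map(count_words, paragraphs))

if __name__ == "__main__":
    column = ["Moore's Law has held for four decades.",
              "Dally argues serial CPUs have hit an energy wall.",
              "Parallel processors turn transistors into performance."]
    assert serial_count(column) == parallel_count(column)
```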

      As the demand for greater computer performance grows, the problems with the serial CPU architecture will become more apparent, and Moore’s Law will end, he said.

      “[T]hese needs will not be met unless there is a fundamental change in our approach to computing,” Dally wrote. “The good news is that there is a way out of this crisis. Parallel computing can resurrect Moore’s Law and provide a platform for future economic growth and commercial innovation. The challenge is for the computing industry to drop practices that have been in use for decades and adapt to this new platform.”

      What is needed now, Dally said, are energy-efficient systems built around parallelism rather than serial processing.

      “A fundamental advantage of parallel computers is that they efficiently turn more transistors into more performance,” Dally wrote. “Doubling the number of processors causes many programs to go twice as fast. In contrast, doubling the number of transistors in a serial CPU results in a very modest increase in performance - at a tremendous expense in energy.

      “More importantly, parallel computers, such as graphics processing units, or GPUs, enable continued scaling of computing performance in today’s energy-constrained environment. Every three years we can increase the number of transistors (and cores) by a factor of four. By running each core slightly slower, and hence more efficiently, we can more than triple performance at the same total power. This approach returns us to near historical scaling of computing performance.”
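
      Dally’s figures are consistent with the textbook rule of thumb that a core’s dynamic power scales roughly with the cube of its clock frequency (power tracks capacitance times voltage squared times frequency, with voltage scaling alongside frequency). The sketch below works through that model; it is an illustration of the arithmetic, not Nvidia’s published calculation:

```python
# Back-of-envelope check of Dally's scaling claim, assuming per-core
# dynamic power ~ frequency**3. A textbook approximation, not Nvidia's model.
baseline_cores = 1
new_cores = 4 * baseline_cores                 # 4x transistors -> 4x cores

# Slow each core so total power stays constant:
#   new_cores * f**3 == baseline_cores * 1.0**3
new_freq = (baseline_cores / new_cores) ** (1 / 3)   # ~0.63x clock

throughput = new_cores * new_freq              # ~2.52x at the same power
print(f"clock: {new_freq:.2f}x, throughput: {throughput:.2f}x")
# Per-transistor energy gains from the newer process node over those three
# years push this toward the "more than triple" Dally cites.
```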

      Nvidia has been aggressively pushing its GPU technology into more mainstream computing environments, particularly in areas such as HPC (high-performance computing). Nvidia in October 2009 introduced its new Fermi GPU architecture, which incorporates more than 3 billion transistors and 512 CUDA cores.

      CUDA is the parallel computing engine for Nvidia’s GPUs.
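
      In the CUDA model, the programmer writes a small function, called a kernel, that the GPU runs across thousands of threads at once, each handling one piece of the data. As a hedged illustration of that style only: the sketch below uses the third-party Numba library from Python, which is this example’s choice and not something mentioned in the article (Nvidia’s own CUDA toolchain is C-based).

```python
# A CUDA-style kernel via the Numba library: one GPU thread per element.
# Requires an Nvidia GPU with the CUDA toolkit installed.
import numpy as np
from numba import cuda

@cuda.jit
def scale(values, factor, out):
    i = cuda.grid(1)              # this thread's global index
    if i < values.size:           # guard threads past the end of the array
        out[i] = values[i] * factor

values = np.arange(1_000_000, dtype=np.float32)
out = np.zeros_like(values)
threads_per_block = 256
blocks = (values.size + threads_per_block - 1) // threads_per_block
scale[blocks, threads_per_block](values, 2.0, out)   # launch the kernel
```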

      AMD, through its ATI unit, also is looking to bring graphics computing more into the mainstream and is working on its Fusion strategy of offering full CPU and GPU capabilities on a single chip. For its part, Intel also is expected to continue growing the graphics capabilities of its processors.

      Intel and Nvidia have been partners, but the relationship recently has been strained. Intel in February 2009 sued Nvidia, claiming a 2004 agreement between the two did not give Nvidia the right to develop chip sets for newer Intel chips, such as those developed with the “Nehalem” architecture. The suit is scheduled to go to trial this year.

      The Federal Trade Commission is suing Intel for alleged anticompetitive practices, not only for its treatment of AMD but also in regard to Nvidia. Intel officials have denied the allegations.

      Nvidia also has created a website called “Intel’s Insides,” which offers a series of editorial-style, one-panel cartoons mocking Intel’s various legal issues.

      Jeff Burt
      Jeffrey Burt has been with eWEEK since 2000, covering an array of areas that includes servers, networking, PCs, processors, converged infrastructure, unified communications and the Internet of things.
