    Nvidia Exec: Moore’s Law in Danger of Dying Out

    Written by Jeff Burt
    Published May 3, 2010


      For more than four decades, Moore’s Law has held true, thanks in large part to Intel, first by continuing to crank up the speed of its processors and more recently by rapidly growing the number of processing cores on a single chip.

      However, according to Bill Dally, Moore’s Law has reached its limit on traditional CPUs from the likes of Intel and Advanced Micro Devices, and needs a new way of doing things if it is to continue.

      Not surprisingly, Dally, vice president and chief scientist at graphics chip maker Nvidia, believes the only salvation for Moore’s Law lies in moving from serial processing to parallel processing, and more specifically, from CPUs to GPUs.

      In a column posted on Forbes.com April 29, Dally argues that the energy demands of the CPUs Intel and AMD are shipping have created an environment in which Moore’s Law can no longer hold.

      “We have reached the limit of what is possible with one or more traditional, serial central processing units, or CPUs,” Dally wrote. “It is past time for the computing industry, and everyone who relies on it for continued improvements in productivity, economic growth and social progress, to take the leap into parallel processing.”

      Moore’s Law sprang from a paper written by Intel co-founder Gordon Moore 45 years ago, in which he predicted that the number of transistors on a chip would double at a regular cadence, popularly cited as every 18 months to two years, and that CPU performance would double along with it.

      However, what worked in the 1980s and 1990s is not working anymore, despite what Intel officials say, and a new way of computing must be adopted, Dally said.

      In comparing serial processing with parallel processing, the Nvidia executive pointed to the task of counting the words of his column. In serial processing, one person would count each word. In parallel processing, each paragraph would be given to a different person, and the word counts from each paragraph would be added together.
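Dally’s counting analogy can be sketched in a few lines of Python. This is a hypothetical illustration; the paragraph text below is made up, and the thread pool is only a stand-in for the many workers he describes:

```python
# Serial vs. parallel word counting, per Dally's analogy.
from multiprocessing.dummy import Pool  # thread pool; real speedups would need separate cores

paragraphs = [
    "Moore's Law has held for four decades.",
    "Serial CPUs are hitting an energy wall.",
    "Parallel processors turn transistors into performance.",
]

# Serial: one worker counts every word in order.
serial_count = sum(len(p.split()) for p in paragraphs)

# Parallel: each paragraph goes to a different worker,
# and the per-paragraph counts are added together.
with Pool() as pool:
    parallel_count = sum(pool.map(lambda p: len(p.split()), paragraphs))

assert serial_count == parallel_count  # same answer, different work distribution
```

Both approaches produce the same total; the parallel version simply divides the work so that adding workers shortens the job.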

      As the demand for greater computer performance grows, the problems with the serial CPU architecture will become more apparent, and Moore’s Law will end, he said.

      “[T]hese needs will not be met unless there is a fundamental change in our approach to computing,” Dally wrote. “The good news is that there is a way out of this crisis. Parallel computing can resurrect Moore’s Law and provide a platform for future economic growth and commercial innovation. The challenge is for the computing industry to drop practices that have been in use for decades and adapt to this new platform.”

      There is now a need for energy-efficient systems built around parallelism rather than serial processing, Dally said.

      “A fundamental advantage of parallel computers is that they efficiently turn more transistors into more performance,” Dally wrote. “Doubling the number of processors causes many programs to go twice as fast. In contrast, doubling the number of transistors in a serial CPU results in a very modest increase in performance, at a tremendous expense in energy.

      “More importantly, parallel computers, such as graphics processing units, or GPUs, enable continued scaling of computing performance in today’s energy-constrained environment. Every three years we can increase the number of transistors (and cores) by a factor of four. By running each core slightly slower, and hence more efficiently, we can more than triple performance at the same total power. This approach returns us to near historical scaling of computing performance.”
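Dally’s arithmetic can be checked with a back-of-envelope model. The capacitance and voltage factors below are illustrative assumptions, not figures from his column; the model simply treats per-core dynamic power as proportional to C·V²·f:

```python
# Back-of-envelope check of the claim: quadruple the cores, slow each
# one down slightly, and more than triple performance at the same power.
# The scaling factors are illustrative assumptions, not Nvidia's figures.

cores = 4.0   # 4x the transistor/core budget per ~3-year process step (per the column)
cap = 0.5     # assumed per-core switched capacitance after the shrink (relative)
volt = 0.8    # assumed supply voltage after the shrink (relative)
freq = 0.78   # each core run "slightly slower" (relative clock)

per_core_power = cap * volt**2 * freq  # ~0.25 of one original core
total_power = cores * per_core_power   # ~1.0: same power budget as before
throughput = cores * freq              # ~3.1x aggregate performance

assert abs(total_power - 1.0) < 0.01
assert throughput > 3.0
```

Under these assumed factors, four cores at 78 percent clock draw roughly the original power while delivering just over three times the throughput, which is the shape of the trade-off Dally describes.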

      Nvidia has been aggressively pushing its GPU technology into more mainstream computing environments, particularly in such areas as HPC (high-performance computing). Nvidia in October 2009 introduced its Fermi GPU architecture, which incorporates more than 3 billion transistors and 512 CUDA cores.

      CUDA is the parallel computing engine for Nvidia’s GPUs.
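The CUDA model is data-parallel: the programmer writes one kernel function, and the GPU runs an instance of it for every element of the data. A rough conceptual sketch of that idea in plain Python (real CUDA kernels are written in C/C++, and the index comes from the hardware’s blockIdx/threadIdx rather than a loop):

```python
# Conceptual sketch of CUDA-style data parallelism, illustrative only.

def vector_add_kernel(i, a, b, out):
    # One kernel instance handles one element; on a GPU,
    # thousands of these instances execute concurrently.
    out[i] = a[i] + b[i]

def launch(kernel, n, *args):
    # Stand-in for a grid launch: the loop here is sequential,
    # but on a GPU each index maps to its own hardware thread.
    for i in range(n):
        kernel(i, *args)

a = [1, 2, 3, 4]
b = [10, 20, 30, 40]
out = [0] * 4
launch(vector_add_kernel, 4, a, b, out)
# out is now [11, 22, 33, 44]
```

Because each element is independent, performance scales with the number of cores available, which is the property Dally credits for keeping Moore’s Law alive on GPUs.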

      AMD, through its ATI unit, also is looking to bring graphics computing more into the mainstream and is working on its Fusion strategy of offering full CPU and GPU capabilities on a single chip. For its part, Intel also is expected to continue growing the graphics capabilities of its processors.

      Intel and Nvidia have been partners, but the relationship recently has been strained. Intel in February 2009 sued Nvidia, claiming a 2004 agreement between the two did not give Nvidia the right to develop chipsets for newer Intel chips, such as those built on the “Nehalem” architecture. The suit is scheduled to go to trial this year.

      The Federal Trade Commission is suing Intel for alleged anticompetitive practices, not only for its treatment of AMD but also in regard to Nvidia. Intel officials have denied the allegations.

      Nvidia also has created a Website called “Intel’s Insides,” which offers a series of editorial-style one-panel cartoons mocking Intel’s various legal issues.

      Jeff Burt
      Jeffrey Burt has been with eWEEK since 2000, covering an array of areas that includes servers, networking, PCs, processors, converged infrastructure, unified communications and the Internet of things.
