    Intel Looks to Multicore Future

    Written by Mark Hachman
    Published March 3, 2005

      SAN FRANCISCO—By 2015, Intel may deliver processors with tens or even hundreds of individual cores, Intel researchers said Thursday at the Intel Developer Forum here.

      This “many core” strategy may support hundreds or thousands of instruction threads, and managing those threads with the appropriate compiler software may create problems of its own, executives said. Intel Corp.’s top researcher also indicated that future Intel microprocessor designs will steal a page from rival AMD (Advanced Micro Devices Inc.).

      The last day of the Intel Developer Forum is typically marked by a sci-fi look at the future, and Intel researchers debuted “super-resolution” video techniques, personal communicators, and parallelism techniques for multiprocessor systems.

      The company also described an important breakthrough: the first laser implemented in standard CMOS silicon, a critical step in developing future optical interconnects.

      Justin Rattner, an Intel senior fellow in the Corporate Technology Group who has replaced Pat Gelsinger as the public face of Intel’s research division, said that Intel’s research teams are forced to conceptualize ideas 10 years into the future, to make sure that they become products in time.

      Rattner compared Intel’s platform evolution to that of the peppered moth, whose lightly colored wings grew darker during the Industrial Revolution to match the soot being deposited on trees and buildings. He said Intel works with users and customers to determine the direction of the company’s evolution, to make sure that the company is in step with their needs.

      “The key to this on our platform is to never stop evolving. If you stop evolving, you die,” Rattner said.

      The theme of IDF, however, has been a deeper exploration of Intel’s dual-core plans, and Rattner said Intel scientists have already begun thinking about how to anticipate and solve problems as the technology develops.

      Two of those problems are developing the compiler software needed to send a balanced load of instructions to the individual cores, and transferring information between the cores and the rest of the system.

      In a project code-named “Shangri-la,” Intel researchers used a language named “Baker” to manage the flow of data in a series of eight simulated cores, each managing eight instruction threads.

      Intel uses compiler software to translate instructions coded in C++ and other languages into the machine instructions used by the cores. Baker, apparently, will place even more intelligence in the compiler software, using the Shangri-la runtime to route instructions and even take unused cores offline to save power.
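
      Neither Baker nor the Shangri-la runtime was publicly available, so the snippet below is only a rough sketch of the scheduling idea described above: assign each incoming instruction thread to the least-loaded core, and take any core that receives no work offline. The core and thread counts follow the article’s description; the class names and the greedy policy are illustrative assumptions, not Intel’s design.

```python
# Hypothetical sketch only -- not Baker or the Shangri-la runtime.
# It illustrates two ideas from the article: balancing instruction threads
# across cores and taking unused cores offline to save power.

NUM_CORES = 8          # the article describes eight simulated cores
THREADS_PER_CORE = 8   # each managing eight instruction threads


class Core:
    def __init__(self, core_id):
        self.core_id = core_id
        self.threads = []      # instruction threads assigned to this core
        self.online = True

    def load(self):
        return len(self.threads)


def schedule(threads):
    """Assign each thread to the least-loaded core that still has room,
    then power down any core that received no work."""
    cores = [Core(i) for i in range(NUM_CORES)]
    for thread in threads:
        candidates = [c for c in cores if c.load() < THREADS_PER_CORE]
        if not candidates:
            raise RuntimeError("all cores are fully loaded")
        min(candidates, key=Core.load).threads.append(thread)

    for core in cores:
        if core.load() == 0:
            core.online = False   # unused core goes offline to save power
    return cores


if __name__ == "__main__":
    for core in schedule([f"thread-{i}" for i in range(5)]):
        state = "online" if core.online else "offline"
        print(f"core {core.core_id}: {core.load()} threads ({state})")
```

      Scheduling fewer threads than there are cores, as in the example run, leaves the last few cores empty, and the sketch marks them offline, mirroring the power-saving behavior the article attributes to the Shangri-la runtime.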

      Today’s chips transfer data by way of pins, the tiny pieces of metal along the chip’s edge. As data rates climb higher and higher, the number of pins required will exceed the physical space along the edge of the die, Rattner said.

      To solve this, Intel is considering two alternatives: “3-D stacking,” in which a dedicated memory wafer is bonded on top of the logic wafer, and die stacking, a more conventional technique.

      In 3-D stacking, an entire wafer would be bonded on top of another. Normally, a 300-mm wafer is fabricated as a discrete unit, and the microprocessor dice etched into its surface are removed and packaged.

      In Intel’s research scenario, however, a second wafer dedicated to DRAM (dynamic RAM) would be bonded to the top of the microprocessor wafer, creating millions of possible connections. In this case, on-chip memory controllers would be required, a tactic already employed by AMD.

      Rattner said it was “inevitable” that Intel’s future multicore designs will emulate one element of rival AMD’s processors and use an on-chip memory controller.

      “You have to have an on-chip memory controller—there’s no place else to go,” Rattner said, adding that he expected there would be a “rather large” number of memory controllers built directly into the die.

      The other alternative Intel is considering is stacking the processor dice directly on top of one another, Rattner said, similar to the way in which Intel stacks flash chips one on top of the other.

      Silicon photonics will be another way that Intel solves the data-transfer problem, at least between chips. Optical fiber is already used in Fibre Channel connections, for example. A few weeks ago, however, Intel created the first continuous-wave silicon laser, the source of the signal, which Rattner said was “truly a breakthrough.”

      Rattner also invited Intel researchers on stage to show off three of Intel’s ongoing projects: a personal communicator that serves as an adjunct to the notebook PC; a research technique that uses computational power to improve the resolution of recorded video; and a project to virtualize not just the processor, but also other components in the system.

      In one demonstration, a researcher took several still pictures of a scene and compared them with a grainy, low-resolution video of the same object. While the video, taken with a cell phone, looked horrible when scaled up to normal resolution, combining the still images with mathematical interpolation produced a clearer picture. The same technique eventually could be used on the fly to improve the apparent resolution of prerecorded video.
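
      The article does not say which interpolation method the researchers used, so the sketch below only illustrates the general multi-frame idea, assuming the per-frame alignment offsets are already known: upsample each low-resolution capture onto a common high-resolution grid, shift it back into alignment, and average. The function names and the nearest-neighbor upsampling are assumptions made for illustration, not Intel’s method.

```python
import numpy as np


def upsample(frame, factor):
    """Nearest-neighbor upsample of a 2-D grayscale frame by an integer factor."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)


def multi_frame_super_resolution(frames, shifts, factor):
    """Combine several low-resolution captures of the same scene into one
    higher-resolution image by averaging them on a common upsampled grid.

    frames -- list of 2-D numpy arrays (grayscale low-resolution captures)
    shifts -- per-frame (dy, dx) alignment offsets in high-resolution pixels,
              assumed known here; a real system would have to estimate them
    factor -- integer upsampling ratio
    """
    height, width = frames[0].shape
    acc = np.zeros((height * factor, width * factor), dtype=np.float64)
    for frame, (dy, dx) in zip(frames, shifts):
        up = upsample(frame.astype(np.float64), factor)
        acc += np.roll(np.roll(up, dy, axis=0), dx, axis=1)
    return acc / len(frames)
```

      Averaging several slightly shifted captures suppresses the noise and blockiness that any single frame carries; real super-resolution systems add motion estimation and sharper reconstruction filters, but the underlying principle is the same.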

      A second presentation took the virtualization concept a step further. In a demonstration, an Intel integrated graphics controller was shared between two PCs. Future work will include storage and networking, Rattner said.
