    Intel Scales Up the Stakes with Multicore Chip Strategy

    Written by

    John G. Spooner
    Published March 9, 2006

      SAN FRANCISCO—Intel is assembling the building blocks for a radically different chip architecture that could arrive by the end of the decade.

      Although the chip giant officially announced its Core Microarchitecture at its spring Developer Forum here, researchers at the company are already working on a potential follow-on capable of harboring tens of cores, far more than the Core Microarchitecture or its successor, which is already in development, Intel executives said.

      Driving the new research is the fact that, within six to eight years, Intel will be able to produce chips with between 16 billion and 32 billion transistors, versus a maximum of 2 billion now, based on the Moore's Law tenet that the number of transistors inside chips doubles every two years.
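      Those figures follow from simple doubling arithmetic. A short sketch reproduces them (the `project_transistors` helper is ours, for illustration, not anything from Intel):

```python
# Transistor-count projection under Moore's Law: counts double every two years.
# The starting point and horizons are the figures cited in the article.

def project_transistors(current: int, years: int, doubling_period: int = 2) -> int:
    """Project a transistor count forward under a fixed doubling period."""
    return current * 2 ** (years // doubling_period)

base = 2_000_000_000  # ~2 billion transistors, the 2006 maximum

low = project_transistors(base, years=6)   # three doublings -> 16 billion
high = project_transistors(base, years=8)  # four doublings  -> 32 billion

print(low, high)  # 16000000000 32000000000
```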

      Given the immense increase expected, Intel's researchers are changing the way they look at chip design, introducing a program they call Tera-Scale Computing.

      Tera-Scale Computing, at its heart, strives to shift from small numbers of complex processor cores to a battalion of simple, general-purpose cores, backed by more specialized cores for jobs such as encryption.

      “It's a radical change across both the capabilities it could provide to users—either consumers or corporate users—because now you're talking about teraflop operations delivered to each person,” said Jeff McVeigh, technical assistant to Intel CTO Justin Rattner, in Hillsboro, Ore. “It gets down to enabling platforms to take on more human-like capabilities.”


      The work is a departure from the company's Core Microarchitecture, which focuses on getting as much work done as possible per clock cycle within two or four cores, and uses a good deal of parallelism, or breaking up jobs to process them more quickly.

      But instead of focusing on ways to wedge more powerful copies of its current style of processor core into a single chip, Intel's Tera-Scale Computing research project focuses on creating large numbers of smaller, simpler cores, which it can augment, on chip, with more specialized cores capable of handling complex jobs such as cryptography.

      Tera-Scale chips would have enough power to improve tasks involving recognition, data mining and synthesis, boosting the performance of text or video search and features like speech recognition by adding predictive capabilities to computers.

      Meanwhile, the chips might make more immersive learning and virtual meeting software possible as well.

      “All those pieces are the capabilities that this kind of computing power would enable,” McVeigh said. “We view it as very important and we're putting in the effort to make sure it happens.”

      That said, McVeigh stressed the project is still in the research stages and declined to say whether it would even make it into products.


      Intel research labs tend to focus on technology that's between five and 10 years in the future. The labs bring projects to the proof-of-concept level—basically proving they can be used—but then Intel product groups must decide whether to incorporate them into the company's products.

      The Tera-Scale project is unique in many ways, however, and points to a higher-than-normal level of importance to the company.

      Whereas individual research projects tend to get code-names, the effort, unveiled at a pre-IDF Intel Research briefing, essentially has its own brand name.

      Pouring Resources into Tera-Scale

      Meanwhile, Intel is pouring a tremendous amount of resources into the effort. About 40 percent of the researchers in its Corporate Technology Group, which includes its Microprocessor Technology Lab, Communications Technology Lab and Systems Technology Lab, about 900 researchers in all, are working on some 80 projects involving Tera-Scale Computing.

      Those projects could be interwoven in many ways to support the project. A chip with many, many cores would need a very big pipeline for data. Intel researchers are working on Silicon Photonics, a project that involves building optical connections into silicon using standard manufacturing techniques.

      The project, which the company has said is targeted at chip-to-chip connections, could present one avenue for creating pipelines to keep the chip flush with data.

      Meanwhile, inside the MTL, researchers are designing new TCP/IP processing cores and new types of memories, including configurable caches, 3-D stacked memory and high-bandwidth memory.

      Researchers are also advocating transactional memory, which coordinates multiple threads accessing the same memory, versus today's approach of locking it for use by one thread at a time.
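      As a rough software analogy of the idea (the `Versioned` and `transact` names are invented here, not any real Intel or Python API, and real transactional memory is far more involved), each thread reads a shared value optimistically, computes an update, and commits only if no other thread changed the value in the meantime, retrying on conflict rather than holding a lock for the whole operation:

```python
# Toy sketch of the transactional idea: optimistic read, conditional commit,
# retry on conflict. Only the brief commit step is serialized.

import threading

class Versioned:
    def __init__(self, value):
        self.value = value
        self.version = 0
        self._commit_lock = threading.Lock()  # guards only the commit itself

    def read(self):
        return self.value, self.version

    def try_commit(self, expected_version, new_value):
        with self._commit_lock:
            if self.version != expected_version:
                return False  # another thread committed first: conflict
            self.value = new_value
            self.version += 1
            return True

def transact(cell, update):
    """Optimistically apply `update` to `cell`, retrying on conflict."""
    while True:
        value, version = cell.read()
        if cell.try_commit(version, update(value)):
            return

counter = Versioned(0)
threads = [
    threading.Thread(
        target=lambda: [transact(counter, lambda v: v + 1) for _ in range(1000)]
    )
    for _ in range(4)
]
for t in threads: t.start()
for t in threads: t.join()
print(counter.value)  # 4000: every increment survives despite concurrency
```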


      Intel's labs are also working on new types of transistors that are smaller, faster and more efficient, in addition to techniques that would allow two different chip wafers to be glued together, making it possible to tightly pair processors and memory in a way that resembles an Oreo cookie.

      The new approach is unlikely to be rolled out all at once. An intermediate phase, in which some of the features of Tera-Scale Computing are pulled forward, is likely to arrive in the less-distant future.

      Given its customarily conservative nature, Intel is likely already working on a many-core chip that would bridge the gap between its current architectural approach and a Tera-Scale one.

      McVeigh declined to discuss any such efforts, but did not deny their existence.

      Chips with many cores are at least six or eight years out, said Kevin Krewell, editor in chief of the Microprocessor Report, in an interview with eWEEK.

      However, he said Tera-Scale-style chips could change the landscape of chip design.

      “There's lots of re-architecting that could be done,” Krewell said. “If you have the right software, say, a function can be taken out of the main CPU and diverted off to a dedicated piece of hardware.

      “Once you get more sophisticated scheduling, a processor can make the decision… it can decide then [data] is going off to an accelerator.”

      “Once you've got the ability to push the boundaries of how big you can make a chip and put that many transistors on it, it really opens up the boundaries of… what you can do with it—how many cores make sense, what you do with those cores.”

      Making the Chips

      If the project were to form the basis of its future chip architectures, Tera-Scale design principles could be shared across all of Intel's processor lines, including those for handheld devices, ultramobile PCs, notebook PCs, desktops and servers, by varying the number of general-purpose cores in each case.

      A server processor would be the best candidate for a larger number of general-purpose cores, as well as specialized XML processing cores, whereas a handheld chip might use fewer general-purpose cores.

      Chips for all of the categories might use TCP/IP processing and/or cryptography cores, however.

      The general-purpose cores, for their part, would look different from Intel's current processors as well. The company's Core Microarchitecture, revealed this week, places emphasis on getting more work done and delivering greater energy efficiency than the NetBurst architecture behind preceding chips like the Pentium 4 and the dual-core Pentium D.

      Intel has confirmed that, at a minimum, it will be capable of producing four-core chips with the architecture.

      However, the research project is taking researchers in the direction of a “totally different architecture,” said Bob Crepps, a technology strategist in Intels Corporate Technology Group.

      Tera-Scale has Intel researchers devising much simpler processor cores, likely short-pipeline, in-order cores, which would take up less room on a chip and crunch through data very quickly by dividing it up into chunks and processing them in parallel.
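      A minimal sketch of that chunk-and-process style: split a dataset into pieces and hand each piece to a separate worker, with threads standing in for the many simple cores. The workload here (summing squares) is purely illustrative:

```python
# Divide data into chunks, process each chunk in a separate worker, and
# combine the partial results. Threads stand in for cores in this sketch.

from concurrent.futures import ThreadPoolExecutor

def sum_squares(chunk):
    return sum(x * x for x in chunk)

def parallel_sum_squares(data, n_chunks=4):
    """Split `data` into roughly equal chunks and reduce the partial sums."""
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_chunks) as pool:
        return sum(pool.map(sum_squares, chunks))

data = list(range(1000))
print(parallel_sum_squares(data))  # identical to the serial sum_squares(data)
```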

      Specialized cores for TCP/IP or cryptography could have their own specific designs that emphasize efficiency.

      A cryptography core, for one, might be designed to be wider than 64 bits, something that would allow it to rip through a 1024-bit or 2048-bit key more quickly than a general-purpose, 64-bit core, researchers said.
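      The width argument is easy to see in numbers: a 64-bit core must juggle a 2048-bit operand as dozens of machine words, while a wider datapath handles more of it per step. The `limbs` helper and the 256-bit width below are our own illustration, not an Intel design:

```python
# How many machine words one big-integer operand occupies at a given core width.

def limbs(key_bits: int, word_bits: int) -> int:
    """Words needed to hold a key_bits-wide operand, rounding up."""
    return (key_bits + word_bits - 1) // word_bits

print(limbs(1024, 64), limbs(2048, 64))  # 16 and 32 words on a 64-bit core
print(limbs(2048, 256))                  # 8 words on a hypothetical 256-bit core
```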

      Challenges would remain in keeping such large numbers of cores flush with data, likely leading to changes in cache design—Intel researchers are working on this as part of their configurable cache, 3D-stacked memory and high-bandwidth memory experiments—in addition to system platform changes to speed up system input/output.

      Processor power management and heat management would change as well, in part by varying the number of cores that were turned on at a given moment.

      General-purpose cores might turn on or off based on demand. Similarly, a cryptography core, if present, could be made to quickly awaken, make its calculations and then shut down.

      Cores could also work on data and, if they got too hot, hand it off to others, creating the data equivalent of a hot potato. The same scenario could ensure servers remain up-and-running by supporting an on-chip fail-over system, researchers said.
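      The hot-potato scheme can be simulated in a few lines. The temperatures, threshold and handoff policy below are invented for illustration; the point is only that work migrates away from hot cores so no single core cooks:

```python
# Toy "hot potato" scheduler: when the current core would exceed a thermal
# threshold, the task is handed to the coolest core instead.

def run_hot_potato(task_units, n_cores=4, heat_per_unit=3, threshold=10):
    temps = [0] * n_cores
    log = []          # which core executed each unit of work
    core = 0
    for _ in range(task_units):
        if temps[core] + heat_per_unit > threshold:
            # hand the work off to the coolest core
            core = min(range(n_cores), key=lambda c: temps[c])
        temps[core] += heat_per_unit
        log.append(core)
    return log, temps

log, temps = run_hot_potato(task_units=12)
print(log)         # [0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3]
print(max(temps))  # 9: no core ever exceeds the threshold of 10
```

      The same handoff mechanism doubles as a crude fail-over: a core dropped from the candidate list simply stops receiving work.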


      “It drastically changes how applications and operating systems would run,” McVeigh said. “Now that you have the abundance of cores, you get away from the notion of having to schedule things on one or a few cores…and are able to be more efficient—almost dedicate parts of the cores and I/O and memory to dedicated tasks to provide for better reliability, better performance… so we're not always having to balance the resources.”

      Software would also present a major challenge in putting Tera-Scale-like processors into production. It would require Intel to work with software developers to encourage them to create software that can take advantage of the many-core chips.

      Software changes could allow for something called speculative multithreading, where compilers could look for areas of parallelism in application code and break them up for parallel processing, researchers said.

      However, Intel has yet to disclose whether it has begun doing any such work with software makers.

      Ultimately, “This effort isn't just within our Microprocessor [Technology] Lab. It spans all of the other areas as well, because it has impacts on communications and systems—where you're dealing with the memory, dealing with virtualization, partitioning, trust and how those apply directly to Tera-Scale style platforms,” McVeigh said.


      John G. Spooner
      John G. Spooner, a senior writer for eWeek, chronicles the PC industry, in addition to covering semiconductors and, on occasion, automotive technology. Prior to joining eWeek in 2005, Mr. Spooner spent more than four years as a staff writer for CNET News.com, where he covered computer hardware. He has also worked as a staff writer for ZDNET News.
