    Google Developing Quantum Computing Chip

    Written by Jeff Burt
    Published September 3, 2014

      Google is teaming with researchers from the University of California, Santa Barbara, to develop processors for quantum computing systems, expanding an effort that the search giant began more than a year ago.

      In a post on the Google Research blog Sept. 2, Hartmut Neven, director of engineering for the Google Research group, wrote that the UC Santa Barbara researchers will give the company’s Quantum Artificial Intelligence unit a needed hardware partner as it looks to build on the work it has done with D-Wave Systems over the past year.

      “With an integrated hardware group the Quantum AI team will now be able to implement and test new designs for quantum optimization and inference processors based on recent theoretical insights as well as our learnings from the D-Wave quantum annealing architecture,” Neven wrote.

      The UC Santa Barbara group is headed by John Martinis. The group in April reported that it had built a quantum computing array that is more reliable than past systems, which would represent a critical step forward in the development of such systems.

      “Quantum hardware is very, very unreliable compared to classical hardware,” Austin Fowler, a staff scientist in the physics department at the university, said in a story published on the UC Santa Barbara Current online news site. “Even the best state-of-the-art hardware is unreliable. Our paper shows that for the first time reliability has been reached.”

      Google is among a growing list of tech companies looking to unlock the vast potential of quantum computing. Most recently, IBM announced in July that it will spend $3 billion over five years on a series of projects exploring the future of processors and systems, not only continuing the push to shrink traditional chips but also investigating what will replace silicon processors when the architecture reaches its physical limits.

      The possibilities include quantum computing as well as neurosynaptic computing, carbon nanotubes and graphene.

      Later that month, Microsoft officials began talking more about Station Q, a quantum computing lab at UC Santa Barbara where researchers with the software giant are working with counterparts at the university.

      Quantum computing holds the promise of creating systems that are millions of times faster than current supercomputers. In traditional computing, bits can only hold values of “1” or “0.” However, quantum bits—or “qubits”—can hold values of 1, 0, or both at the same time, opening up the possibility of systems running through millions of calculations simultaneously. In an interview with eWEEK in July, Bernie Meyerson, IBM Fellow and vice president of Innovation at the company, said the expansion of possibilities was analogous to human communications.

      “It would be frustrating to have a conversation where you could only say ‘yes’ or ‘no,’” Meyerson said. “What if you could say ‘maybe’?”
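      To make the bit-versus-qubit contrast concrete, the sketch below simulates a single qubit on a classical machine. This is an illustrative toy in Python with NumPy, not the hardware or software described in the article: the qubit is modeled as a two-element complex state vector, a Hadamard gate puts it into an equal superposition of 0 and 1, and a simulated measurement collapses it to one value or the other with equal probability.

      import numpy as np

      # A classical bit is simply 0 or 1.
      classical_bit = 0

      # A qubit is a two-element complex state vector: amplitudes for |0> and |1>.
      ket0 = np.array([1, 0], dtype=complex)   # definitely 0
      ket1 = np.array([0, 1], dtype=complex)   # definitely 1

      # The Hadamard gate puts a qubit into an equal superposition of 0 and 1.
      H = np.array([[1, 1],
                    [1, -1]], dtype=complex) / np.sqrt(2)

      superposed = H @ ket0   # amplitudes (1/sqrt(2), 1/sqrt(2))

      # Measurement collapses the state: each outcome appears with probability
      # equal to the squared magnitude of its amplitude (50/50 here).
      probabilities = np.abs(superposed) ** 2
      outcome = np.random.choice([0, 1], p=probabilities)

      print("amplitudes:", superposed)
      print("probabilities:", probabilities)
      print("measured:", outcome)

      Real quantum processors draw their power from many such qubits interacting at once, which is precisely what classical simulation cannot scale to.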

      Chip makers for decades have been trying to keep up with Moore’s Law, the observation by Intel co-founder Gordon Moore that the number of transistors on a chip would double every 18 to 24 months. However, it is becoming increasingly difficult to continue shrinking transistors and circuitry, and eventually such techniques will reach their physical limits.
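      As a rough back-of-the-envelope illustration of that cadence (the starting transistor count and time horizon below are arbitrary assumptions, not figures from the article), doubling every 24 months compounds quickly:

      # Moore's Law as stated above: transistor counts double every 18 to 24 months.
      # The starting count of 1,000,000 is an arbitrary illustration, not a real chip.
      def transistors_after(years, start=1_000_000, months_per_doubling=24):
          doublings = (years * 12) / months_per_doubling
          return start * 2 ** doublings

      for years in (2, 4, 6, 8, 10):
          print(f"after {years:>2} years: ~{transistors_after(years):,.0f} transistors")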

      At the same time, trends like mobile computing, big data and the cloud are putting pressure on systems for better performance, greater bandwidth capacity and more memory, and businesses are demanding computers that consume less power. Conventional chip designs will reach 7nm and maybe a little smaller. What’s after that is uncertain.

      Google has been aggressive in developing its own servers and networking systems to deliver the performance and power efficiency its massive data centers require. The company last year launched its Quantum Artificial Intelligence lab at NASA’s Ames Research Center using a quantum computer from D-Wave.

      “We believe quantum computing may help solve some of the most challenging computer science problems, particularly in machine learning,” Neven wrote in a blog post in May 2013. “Machine learning is all about building better models of the world to make more accurate predictions. If we want to cure diseases, we need better models of how they develop. If we want to create effective environmental policies, we need better models of what’s happening to our climate. And if we want to build a more useful search engine, we need to better understand spoken questions and what’s on the Web so you get the best answer.”

      In his latest post, Neven said Google “will continue to collaborate with D-Wave scientists and to experiment with the ‘Vesuvius’ machine at NASA Ames, which will be upgraded to a 1000 qubit ‘Washington’ processor.”

      Jeff Burt
      Jeffrey Burt has been with eWEEK since 2000, covering an array of areas that includes servers, networking, PCs, processors, converged infrastructure, unified communications and the Internet of Things.
