    Internet Insight: The Paradox of Grid Computing

    By Peter Coffee - January 7, 2002

      Experienced aviators warn novice pilots, “the problem with a multiengine airplane is that sometimes you need them all.” During takeoff, for example, failure of even a single engine is a high-risk situation—but with more engines, there is a greater chance of at least one failure.

      Distributed computing systems, such as grid computing, involve a similar paradox: The more resources the system has, the greater the number of points where the system can fail or degrade—and the harder the task of ensuring adequate performance in all situations, without unacceptable overhead.

      A computing grid faces four new tasks, in addition to whatever problems it was built to solve. The grid must discover and allocate available resources as their availability comes and goes; it must protect long-distance interactions against intrusion, interception and disruption; it must monitor network status; and it must initiate and manage communications among the processing nodes to make each one's needs known to the others. There is no single optimal approach to any of these tasks but rather a family of possible solutions that match up with different types of problems.

      Delays in communication between widely separated nodes fall into two groups. Fundamental is the speed-of-light limit: A node at one location cannot possibly become aware of a change in state at another location in less than the straight-line, speed-of-light propagation time of almost exactly 1 nanosecond per foot of separation.

      That sounds good until it's compared, for example, with modern local memory access times of, at most, a few tens of nanoseconds. Tightly coupled applications, such as simulation or process control, are therefore disadvantaged in distributed environments "until science discovers a method of communication that is not limited by the speed of light," as Aerospace Corp. scientists Craig Lee and James Stepanek wrote in their paper published in April 2001 (which can be accessed via www.eweek.com/links).
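      The speed-of-light floor is easy to quantify. A minimal sketch (the helper name is hypothetical, using the column's figure of roughly 1 nanosecond per foot):

```python
# Minimum one-way latency imposed by straight-line, speed-of-light
# propagation, at ~1 ns per foot as cited in the column.
def min_latency_ns(distance_miles):
    feet = distance_miles * 5280
    return feet  # ~1 ns per foot of separation

# A node 3,000 miles away cannot learn of a remote state change
# in less than about 15.84 million ns, i.e. ~15.84 ms -- roughly a
# million times slower than a few-tens-of-nanoseconds memory access.
coast_to_coast_ns = min_latency_ns(3000)
```

No engineering improvement can reduce this figure; only restructuring the problem to need fewer long-distance round trips helps.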

      There are problem decomposition techniques that aren't as badly handicapped by the speed of light: for example, Monte Carlo simulation, or the kind of data parceling strategies made famous by the SETI@Home project, which distributes sets of radio telescope data for intelligent-life detection by screen saver software. When problems lend themselves to this approach, they often don't need frequent synchronization and therefore aren't severely hampered by distance.
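      This style of decomposition can be sketched in miniature: a Monte Carlo estimate of pi, split into independent work units that could each run on a separate node, with no communication needed until the final reduction. (The function and unit sizes here are illustrative, not drawn from any grid toolkit.)

```python
import random

# Each work unit is fully independent: given a sample count and a
# seed, it counts random points that land inside the unit circle.
def work_unit(samples, seed):
    rng = random.Random(seed)
    return sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

# Ten units, each of which could be parceled out to a different node;
# the only synchronization is the final sum.
units = [work_unit(100_000, seed) for seed in range(10)]
pi_estimate = 4 * sum(units) / (10 * 100_000)
```

Latency between nodes barely matters here, because each unit runs for a long time before reporting one small result.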

      What does affect the latter class of problem, though, is the limited bandwidth of networks and network interfaces. Plotting recent progress in the paper cited earlier, Lee and Stepanek find network access bandwidth, as determined by available interface cards, doubling every 2.5 years, ominously lagging the 1.5-year doubling time of processor performance, assuming continued Moore's Law improvement, which many project as likely through 2010.
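      The arithmetic behind those doubling times shows how quickly the gap compounds. A quick calculation over the nine years from 2001 to 2010, using the figures cited above:

```python
# Exponential growth from a doubling period.
def growth(doubling_years, elapsed_years):
    return 2 ** (elapsed_years / doubling_years)

years = 9                      # e.g. 2001 through 2010
cpu = growth(1.5, years)       # 2**6 = 64x processor performance
nic = growth(2.5, years)       # 2**3.6, about 12x interface bandwidth
ratio = cpu / nic              # processors pull roughly 5x further ahead
```

In other words, by decade's end a processor would have dozens of cycles to spend on every bit its interface card can move, where it once had a handful.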

      With processor speed outpacing the ability of interface cards to send and receive to the grid, it follows that some processing power will be best employed in boosting information content per bit: for example, by continuing the refinement of data compression algorithms using techniques such as the wavelet transforms in the JPEG 2000 standard.
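      The trade of cycles for bandwidth is easy to demonstrate with general-purpose compression (using the standard zlib library here, rather than the wavelet techniques named above; the payload is invented for illustration):

```python
import zlib

# A redundant, text-heavy payload of the kind grid nodes might
# exchange: 1,000 repetitions of a short status record.
payload = b"sensor=42 reading=3.14159 status=OK\n" * 1000

# Spend CPU time (maximum effort, level 9) to shrink what must
# cross the network.
packed = zlib.compress(payload, 9)
savings = 1 - len(packed) / len(payload)  # well over 90% for data this redundant
```

The more the processor outpaces the interface card, the more of this kind of work it pays to do before touching the wire.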

      Data compression developments such as these are offset, however—perhaps to devastating effect—by the growth of data overhead entailed in the use of XML syntax to make data more self-disclosing than it is in application-specific binary data structures. There's a difficult trade-off to be made between ad hoc availability of data for unanticipated uses and efficient, cost-effective packaging of data.
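      The overhead is visible even in a one-record example: the same reading expressed as a self-describing XML fragment versus an application-specific binary record (both encodings invented for illustration):

```python
import struct

# Self-disclosing: anyone can read the field names off the wire.
xml_form = b"<reading><node>17</node><value>3.14159</value></reading>"

# Application-specific: a 2-byte node id plus an 8-byte double,
# meaningful only to code that knows the layout.
binary_form = struct.pack("<Hd", 17, 3.14159)

overhead = len(xml_form) / len(binary_form)  # several times larger
```

The XML form is more than five times the size, which is precisely the trade-off described above: every byte of self-description is a byte the interface card must carry.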

      Sad to say, a great deal of processing power may also be consumed by the calculations needed to implement data integrity and security measures, such as encryption for authentication of messages sent and received. Grid computing, in an open environment such as an IP network, invites attempts both to read the mail between the nodes and to analyze the patterns of traffic for what they might reveal about concentrations of valuable information.
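      One such per-message cost can be sketched with standard-library primitives: authenticating each inter-node message with an HMAC, assuming a shared secret between nodes (the key and message are invented, and key distribution is a further problem this sketch ignores):

```python
import hashlib
import hmac

# Hypothetical shared secret and message between two grid nodes.
secret = b"shared-grid-key"
message = b"node-7: work unit 42 complete"

# The sender computes a 32-byte SHA-256 tag over every message...
tag = hmac.new(secret, message, hashlib.sha256).digest()

# ...and the receiver recomputes it and compares in constant time,
# rejecting anything tampered with in transit.
ok = hmac.compare_digest(
    tag, hmac.new(secret, message, hashlib.sha256).digest())
```

Every message pays this hashing cost twice, once at each end, which is exactly the kind of processing-power tax the paragraph above describes.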

      If the network and the computer are the same, it follows that the network—an inherently exposed asset—is increasingly the locus of IT value. Enterprise IT architects and service providers will have to learn to protect it without crippling its hoped-for performance gains.

      Peter Coffee
      Peter Coffee is Director of Platform Research at salesforce.com, where he serves as a liaison with the developer community to define the opportunity and clarify developers' technical requirements on the company's evolving Apex Platform. Peter previously spent 18 years with eWEEK (formerly PC Week), the national news magazine of enterprise technology practice, where he reviewed software development tools and methods and wrote regular columns on emerging technologies and professional community issues. Before he began writing full-time in 1989, Peter spent eleven years in technical and management positions at Exxon and The Aerospace Corporation, including management of the latter company's first desktop computing planning team and applied research in applications of artificial intelligence techniques. He holds an engineering degree from MIT and an MBA from Pepperdine University, and he has held teaching appointments in computer science, business analytics and information systems management at Pepperdine, UCLA, and Chapman College.