It's hard to be a modern electronics hacker when the performance of your test and measurement gear has to advance at 10 times the pace of the hardware mainstream. Thinking about this problem, though, made me realize that the same issue arises in business process measurement and control.
It's easy to explain why electronic measurement equipment has to be 10 times as good as the gear that it tests. The input amplifier of an oscilloscope, for example, has to respond to input changes about 10 times as quickly as the most rapid changes of interest in any signal that you want to examine. Otherwise, that wiggly line on the screen isn't really a picture of the signal; it's more a picture of the behavior of that amplifier circuit as it tries to keep up with the signal's spikes and ripples.
When the test instrument can't keep up, the roles are reversed: The outside signal becomes the test probe, and the scope's own amplifier circuit becomes the test subject. That's not what you paid for.
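To put some numbers behind that, here's a quick back-of-the-envelope simulation. It's my own sketch, not anything from a scope vendor's data sheet, and every parameter is invented: the input stage is modeled as a simple first-order low-pass filter. When its time constant is a small fraction of the edge you're trying to see, the trace follows the signal; when it's comparable, the trace is mostly the amplifier's own exponential settling.

```python
import numpy as np

# A minimal sketch (invented numbers): model the scope's input stage as a
# first-order low-pass filter with time constant tau.
def amplifier_response(signal, dt, tau):
    """Simulate a simple RC-style amplifier trying to track `signal`."""
    out = np.zeros_like(signal)
    alpha = dt / (tau + dt)              # discrete-time smoothing factor
    for i in range(1, len(signal)):
        out[i] = out[i - 1] + alpha * (signal[i] - out[i - 1])
    return out

dt = 1e-9                                # 1 ns simulation step
t = np.arange(0, 200e-9, dt)
step_signal = (t > 50e-9).astype(float)  # a sharp edge arriving at 50 ns

fast_amp = amplifier_response(step_signal, dt, tau=1e-9)   # much quicker than the edge we care about
slow_amp = amplifier_response(step_signal, dt, tau=20e-9)  # comparable to, or slower than, the edge

# How long each "trace" takes to reach 90 percent of the step: the slow
# amplifier's answer reflects its own time constant, not the input edge.
for name, trace in (("fast", fast_amp), ("slow", slow_amp)):
    settle_ns = (t[np.argmax(trace > 0.9)] - 50e-9) * 1e9
    print(f"{name} amplifier reaches 90% of the edge after {settle_ns:.0f} ns")
```

Run it and the slow amplifier reports an edge that takes tens of nanoseconds to settle, even though the input changed in a single step. That's the amplifier measuring itself.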
This problem is especially on my mind as nanotechnology merges into IT providers' near-term plans and long-term road maps. Intel, for example, said at the end of last month that its use of silicon will continue well into the second decade of this century. However, reports in November said that Intel's flash-memory products will incorporate devices on the order of 20 nanometers by around 2012. The company will therefore need fabrication accuracies of a nanometer or two at least that soon, even if it's not actually using nanoscale components by that time.
Intel also disclosed at the end of last month that it will intensify its research in fabricating transistors from carbon nanotubes—structures that are themselves only 1 nanometer in diameter, implying subnanometer manufacturing precision.
Samsung and others already propose to build “field emission display” screens, now in prototype and expected to ship in volume as soon as the end of next year, using nanotubes as pixel-by-pixel electron guns. These screens promise the brightness and contrast of a CRT and the 1-inch thickness of a flat panel, without the power consumption of current plasma screens.
So here's the problem: When we work with components approaching the smallest structures that can be made, at least out of atoms as we know them, what do we use to inspect them or perform other forms of quality control? The demand for building blocks such as carbon nanotubes already outpaces the market's ability to ensure that people get what they pay for. A December report from Lux Research cited several examples of fraud in the carbon nanotube market, including shipments containing 30 percent useless growth catalyst and others containing mere carbon soot.
The same Lux Research report cited another incident in which a supplier claimed the ability to make particles of a certain size but turned out to be unable to ship those particles in a way that kept them from clumping back into larger chunks. Some suppliers merely fool their customers; others fool themselves as well. Either way, the customer has to be able to dive into deeper levels of insight in the search for assurance.
Testing and measuring today's accelerating business processes creates the same kind of punishing demands at both extremes of scale and precision. The test data sets used by developers must be larger than those of the actual application to provide early warning of scalability problems. The time steps used in recording and controlling transactions must be smaller than the time scale of the transactions themselves to avoid inconsistent or even inaccurate results.
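The same arithmetic applies to recording transactions. Here's another sketch of my own, again with invented numbers: a bursty transaction stream counted in one-second buckets looks unremarkable, while the same events counted in 10-millisecond buckets reveal the short peaks that actually stress the system.

```python
import random
from collections import defaultdict

# A sketch with invented numbers: bursty transactions recorded at two
# different time resolutions.
random.seed(1)

events = []                               # transaction timestamps, in seconds
t = 0.0
while t < 60.0:                           # one simulated minute of traffic
    if random.random() < 0.02:            # occasionally, a 100 ms burst of activity
        burst_end = t + 0.1
        while t < burst_end:
            events.append(t)
            t += 0.002                    # ~500 transactions/sec inside the burst
    else:
        events.append(t)
        t += random.expovariate(20)       # ~20 transactions/sec the rest of the time

def peak_rate(events, step):
    """Highest transactions-per-second seen when counting in buckets of `step` seconds."""
    buckets = defaultdict(int)
    for e in events:
        buckets[int(e / step)] += 1
    return max(buckets.values()) / step

print("peak rate, 1 s buckets:   %5.0f tx/sec" % peak_rate(events, 1.0))
print("peak rate, 10 ms buckets: %5.0f tx/sec" % peak_rate(events, 0.01))
```

The coarse record smooths the bursts into a modest average; the fine-grained record shows the real peak, which is what capacity planning and process control actually have to handle.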
The tools and processes that we use to build the next generation of technology must be, in a sense, beyond that generation already. If you can't construct it, inspect it and measure it more precisely than anyone plans to use it, the only way you can test it is to turn it on and see whether it works.
Hardware hackers call that a “smoke test”: It's as bad an idea as it sounds.
Technology Editor Peter Coffee can be reached at peter_coffee@ziffdavis.com.