rogerspace 3.0 | On Miniaturization






Miniaturization is the main trend technology has been bringing us. Whatever task needs doing, it can now be done with a smaller tool than was previously required.



Miniaturization in Aircraft

In WWII, at first only heavy bombers could accurately deliver a substantial ordnance load over long distances, since only they had both the range and the stability needed to deliver it accurately. Then it was found that the Lockheed P-38 Lightning, a twin-engine, single-pilot fighter/attack plane, could deliver its 4,000 lb. rated load the same distance a single B-17G could deliver that load. The longer a B-17's mission, the greater the percentage of its cargo load that had to be fuel, so on maximum-range missions carrying only a 4,000 lb. bomb load was typical. The P-38 could carry that, plus the (radically smaller vs. a B-17) required fuel, the same distance. It had only a fraction of the B-17's stability, though, especially once the B-17 got an autopilot system for use on bomb runs, so massed P-38s with a clear-nosed lead P-38 serving as the designated (and only real potential) bombardier were seldom used. A usable autopilot system could not be fitted to the P-38 for space and weight reasons.

Thirty years later the Grumman F-14 Tomcat could both deliver a B-17's maximum possible bomb load over a longer range (via in-flight refueling) and protect itself from hostile fighters, using a total crew of only two. Smaller bomb loads could by then be delivered with an accuracy completely unknown in WWII via gyroscope-, infrared-, and by now laser- and GPS-guided missiles, maximizing effectiveness while minimizing collateral damage and civilian casualties. Ten years later, we had well-developed versions of both small single-pilot fighter jets (the F-16 Falcon) and larger two-person fighter/attack aircraft (the F-15 Eagle) with always-on autopilot systems so reliable that they were used to keep the plane stabilized despite its naturally unstable center-of-gravity-behind-center-of-lift design, maximizing agility while minimizing pilot control difficulties.

Aircraft instrument panels have increasingly shifted from a sturdily framed array of ten to thirty gauges weighing a few pounds each to large, multipurpose, on-the-fly-configurable, microprocessor-driven LCD panels capable of displaying reliable 3-D GPS-guided navigation maps in any weather.

Fifty years after its first use, there were plans to produce reproductions of the Messerschmitt Me 262 twin-turbojet interceptor of late WWII, but using smaller engines with 50% greater thrust.

The latest turbo-ramjet engine designs, while looking very much like an SR-71 Blackbird's engines with their moving-spike intakes and being about the same size, are six times as efficient at generating thrust.

Having already gone from a ten-man B-17's bombing capabilities being greatly exceeded by a single-man, hyper-agile F-16 fighter jet, we are now seeing F-15 fighter/bomber groups being partly replaced by unmanned Predators. The human operator sits behind an LCD screen of sorts showing the view from the Predator's nose via radio link, controlling the aircraft by radio: essentially a full-scale radio-control model of, well, the future?



Miniaturization in (Scaled-Down) Remote Control Aircraft

Remote-control model aircraft have gone from the 1950s' "U-Control" single-speed, gas two-cycle-driven planes going in tethered circles with fishing line controlling only the elevator to ballistic electric aerobatic fixed-wing aircraft and a wide array of gyroscopically stabilized helicopters. Some radio-control fixed-wing aircraft use Mylar wings and are small and light enough to be strictly for indoor use, "cruising" at roughly a walking pace and having a turning radius small enough to circle over some dinner tables. Some radio-control helicopters are similarly strictly for indoor use, and made docile and amazingly affordable partly by piezoelectric gyroscopes smaller and lighter than a penny.



Automotive Miniaturization Effects

Engines

Automotive engines have seen enormous efficiency and cleanliness improvements from miniaturization. Miniaturized on-board computer power very carefully regulates fuel flow, with high-pressure fuel injectors maximizing fuel atomization (minimizing droplet size) to minimize output of unburned hydrocarbons and carbon monoxide and maximize efficiency. Engineering and metallurgical refinements have brought better cylinder head cooling, allowing a resurgence in compression ratios, and the efficiency improvements that come with them, via four-valve-per-cylinder pent-roof or semi-hemispherical cylinder head designs. A four-valve combustion chamber requires much greater casting and machining refinement, with much greater potential for local hot spots if not designed well. Such a design allows a central spark plug location, which allows undisturbed flame front propagation, thereby minimizing pinging and preignition; that is largely what allows the resurgence in compression ratios. The extra valves and increased compression ratios in turn allow higher engine speeds, providing higher power levels from a given engine weight without drivability or efficiency problems (just the opposite, in fact). The extra cam lobes on each camshaft for the additional valves generate more heat, which, along with the extra combustion intensity, requires more use of aluminum instead of heavier iron for improved heat flow away from the combustion chambers. Some engines, such as that in my '86 Honda CRX Si (three valves per cylinder, all-aluminum engine), have an oil-to-water heat exchanger attached to the oil filter feed so the oil temperature can be kept under control without additional external plumbing and heat exchangers.

Computer miniaturization has brought very sophisticated engine management systems that control not only ignition timing, but in many cases valve timing and lift, plus increasingly variable intake manifold harmonic resonance characteristics. Honda has continued to lead in variable valvetrain timing and lift control, typically using differing lift profiles for each of a cylinder's intake valves at partial throttle to maximize efficiency via maximized intake charge velocity. There are three main reasons why large engines designed for high peak horsepower have, in the past at least, had extremely uneven low-load behavior:

First, large, single intake valves in each cylinder allow a low-momentum moseying of intake air/fuel mix into the cylinders, where it doesn't have the vortex-forming ability to spread the flame kernel fast enough once the spark plug fires.

Second, carburetion (used until the mid '80s) and throttle-body fuel injection (the simplest form of fuel injection) rely on a "wet" manifold, versus the "dry," air-only manifold of port fuel injection engines. A wet manifold lets rich and lean pockets of intake gases reach the cylinders, depending on rpm, throttle position and other factors. When a "wet" manifold is cold, fuel condenses on its walls and trickles into the cylinder as a liquid instead of a vapor. Some of that fuel gets blasted past the piston down into the crankcase, where it mixes with the oil pan's oil and prematurely ages it. Some condensed fuel directly hits the spark plug, temporarily fouling it.

Third, carburetion at least doesn't "atomize" the fuel (minimize its droplet size) anywhere near as well as fuel injectors do, especially at idle.

These three factors combine under low load (high manifold vacuum), and especially in a cold engine, to prevent even, efficient fuel combustion. Port fuel injection blasts a fine spray of fuel at the intake valve for only as long as the controlling computer decides the engine needs. "Dry" manifolds also allow extra manifold runner length, letting specific harmonic resonant frequencies boost engine airflow at the rpm points where it's needed. That was used in my old CRX Si to boost power at about 2500 and 5000 rpm at the expense of 3750 and 7500 rpm power, which works fine because the engine naturally breathes well from 3500 to 6500 rpm anyway, and 7500 rpm is almost the point where oily bits start coming apart. Some newer engines have "variable" intake runner lengths via flapper valves, so what would be a harmonically dead spot in the rpm range at one runner length can become a harmonically hot spot at another, depending on rpm. The most developed valve control systems for four-valve-per-cylinder heads mostly close one intake valve under low load, so intake air momentum and velocity can be maintained without restricting intake under heavier loading. The smaller valves (vs. two-valve designs), being lighter, also open and close quickly more easily, allowing lighter valve springs, which make life easier for the entire top of the cylinder head.
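
For a rough feel for why runner length matters, the runner plus cylinder is often modeled as a Helmholtz resonator: longer runners resonate at lower frequencies, which is the effect the flapper valves exploit. Below is a minimal Python sketch with made-up dimensions (purely illustrative guesses, not my CRX's or any real engine's figures); real manifold tuning involves wave dynamics well beyond this single-resonator picture.

    import math

    # Rough Helmholtz-resonator picture of an intake runner feeding one cylinder.
    # All dimensions are illustrative guesses, not any real engine's figures.
    C = 343.0                            # speed of sound in warm intake air, m/s
    runner_area = math.pi * 0.017 ** 2   # ~34 mm diameter runner, m^2
    cyl_volume = 0.0004                  # effective cylinder volume, m^3 (~400 cc)

    def runner_frequency(length_m):
        """Natural frequency (Hz) of runner + cylinder treated as a Helmholtz resonator."""
        return (C / (2 * math.pi)) * math.sqrt(runner_area / (cyl_volume * length_m))

    # A flapper valve effectively switches between a long and a short runner:
    # the longer runner resonates lower, moving the acoustic boost down the rpm range.
    for name, length_m in (("long runner (0.45 m)", 0.45), ("short runner (0.25 m)", 0.25)):
        print(f"{name}: natural frequency ~{runner_frequency(length_m):.0f} Hz")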

The newest "direct injection" systems feed extremely high pressures to centrally mounted fuel injectors (spark plug right next to them) to inject the fuel right at the moment it is expected to ignite. This has a wide array of benefits: no more intake valve deposits, no more preignition (compression ratios, even for turbo engines, can go way up), the fuel cools the chamber immediately before combustion, the fuel has no chance to condense on the cylinder walls, and fuel atomization doesn't get any better than this. While these systems require much better engineering and machining than regular sequential port fuel injection, meaning extra initial expense, the reduced fuel and environmental costs more than compensate. The approach was developed from diesel fuel delivery technology, and when multiple injections per power cycle are used, spaced milliseconds apart, traditional diesel clatter is reduced to where it sounds and feels plenty refined for regular passenger car use. Coupled with turbocharging, such diesels have plenty of power as well as traditional diesel stump-pulling torque. One such diesel even won Audi the 24 Hours of Le Mans in 2006, even while coupled to a particulate filter to remove any soot.

The soot typically seen spewing from older trucks' exhaust stacks under acceleration, rather like some sort of road-going coal-fired steam locomotive, comes from simple mechanical fuel injection giving the cylinders a full-power dose of fuel before the engine's turbocharger has spooled up enough to deliver the air full power requires. Like an out-of-tune older car putting out black, smoky exhaust when cold, or an early low-pressure-ratio turbojet spewing a smoke trail on takeoff, the mixture is rich, producing carbon that insufficient combustion temperature and intake air leave unburned, so it simply vents out the exhaust. Modern trucks use computer-controlled fuel injection with a vast array of sensors, so fuel is injected only as fast as the engine takes in enough air to burn it.



Accident Avoidance Systems

Automobiles have developed gyroscope-based, (multi)microprocessor-controlled systems that combine multiple forms of skid control. Besides anti-lock brakes, there are now yaw control systems that keep a car's two ends from swapping places in traction-limited situations. These systems sometimes involve reducing engine output, especially in the increasing number of cars where the throttle pedal and engine throttle are linked electronically rather than strictly mechanically.
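
The control idea behind these yaw control systems is simple to sketch: estimate the yaw rate the driver is asking for from steering angle and speed, compare it with the yaw rate a gyroscope actually measures, and brake an individual wheel or trim engine output when the two diverge. Here is a minimal, purely illustrative Python sketch of that decision logic (not any manufacturer's actual algorithm; the bicycle-model estimate and thresholds are assumptions):

    def stability_control_step(steer_angle_rad, speed_mps, measured_yaw_rate,
                               wheelbase_m=2.5, threshold=0.05):
        """Very simplified yaw-control decision for one control cycle.
        Positive yaw rate means turning left; returns a corrective action label."""
        # Kinematic (bicycle-model) estimate of the yaw rate the driver is requesting.
        intended_yaw_rate = speed_mps * steer_angle_rad / wheelbase_m

        error = measured_yaw_rate - intended_yaw_rate
        if abs(error) < threshold:
            return "no intervention"
        if error * intended_yaw_rate > 0:
            # Rotating faster than asked: oversteer (the tail is stepping out).
            return "oversteer: brake outside front wheel, reduce engine torque"
        # Rotating less than asked: understeer (the nose is washing out).
        return "understeer: brake inside rear wheel, reduce engine torque"

    # Example: 20 m/s (~45 mph), moderate left steering, but the car barely rotates.
    print(stability_control_step(steer_angle_rad=0.08, speed_mps=20.0,
                                 measured_yaw_rate=0.15))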



Miniaturization in Industry

Microprocessor-driven, sensor-based control systems have been fitted to a very wide array of older industrial equipment, retrofitting it to operate, through automation, with a degree of efficiency unheard of when the equipment was first installed. Partly because of this miniaturization, many jobs that at least occasionally required substantial muscle can now be done by people with less of the muscle mass and endurance previously required, making the work easier while effectively bringing more women into the workforce.



Miniaturization in Communications Gear

Communication has benefited massively from miniaturization. No longer tied to land lines, communications can be maintained wherever a cell phone tower is more or less visible within a few to several miles. Instead of quiet, peaceful time to oneself, we can now be reached anywhere, any time (as long as the phone is on). Instead of a building's land-line-connected wall-mounted or tabletop phone, we carry a complete miniature computer system in a pocket, increasingly with both cell phone and Internet communication capabilities.



Miniaturization Effects on Photography

Photography has seen a complete revolution from miniaturization. In the late 1960s the world started marveling at the reliability of uniquely well-machined film cameras from Japan, which was finally practicing well the capitalism taught to it by America immediately following WWII. The transistor was developed in the late 1940s at Bell Labs in America, and Japan fairly quickly learned to build affordable, reliable consumer devices around it. Once the integrated circuit was developed, the electronics miniaturization trend began in earnest, with Japan continuing to develop some of the world's best affordable electronics gear. Japanese cameras gained that progressing electronics in their automatic exposure and aperture control mechanisms, eventually followed by automatic focusing.

The Charge-Coupled Device (CCD) light detection array was developed and increasingly used to help the cameras' electronics "understand" a scene so that the automatic settings could be optimized with increasing sophistication. Eventually CCDs were developed that could themselves serve as a rudimentary form of photographic film that never needed developing or changing. CCD and support circuitry development has proceeded to the point where a pocket-size device can deliver an image fully suitable for 8x10 printing. These pocket cameras now typically include 3X to 10X optical zoom lenses, some of which (including a 10X model) do not extend out from the pocket-sized case in use. That, in previous decades, was strictly the stuff of James Bond movies at best. The smaller, much less energy-hungry CMOS light sensor array is also seeing constant development. It is now at about the stage CCDs reached a few years ago, allowing significantly more compact and affordable cameras, albeit generally (but not always) of lower image quality. Some CMOS sensor arrays have started seeing use in cars as both back-up view cameras and forward night-vision cameras reliant on infrared light cast by dedicated light-emitting diodes near the headlights. The infrared-based night vision capability lets a driver see a deer or pedestrian with much greater lead time than any legal headlights would allow.
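
As a quick sanity check on the "suitable for 8x10 printing" claim: at the roughly 300 dots per inch usually considered photo quality, an 8x10 needs about 7 megapixels, which pocket cameras of that era were reaching. The arithmetic, in a few lines of Python:

    # Pixels needed for a print of a given size at a given resolution.
    def pixels_needed(width_in, height_in, dpi):
        return (width_in * dpi) * (height_in * dpi)

    for dpi in (200, 300):
        mp = pixels_needed(8, 10, dpi) / 1e6
        print(f"8x10 inch print at {dpi} dpi needs about {mp:.1f} megapixels")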

In the late 1980s I lugged about ten pounds of film-based camera gear up Mount Chocorua in New Hampshire, resulting in a couple dozen or so excellent nature shots plus, at the top, two of the best photos I've ever taken. The equipment included an approximately three-pound 300mm telephoto lens. My current [2006] camera can store over 500 higher-quality images and has the equivalent of a 38 to 380mm zoom lens built in. I also have a decent-enough 3X teleconverter for extreme optical reach, giving a 38mm to 1140mm 35mm-equivalent optical zoom range from something weighing not much more than that old 300mm telephoto lens by itself, and about as long. A single film image can generally show a wider brightness range than a single digital camera image, but it's not hard to bracket differing exposures of the same scene with a digital camera and then combine the images, as many as one cares to take, into a single image with no lost shadow detail or blown-out highlights, just great detail from corner to corner regardless of lighting.
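
The bracket-and-combine trick just described can be sketched in a few lines: weight each exposure's pixels by how far they are from being crushed to black or blown to white, then blend. This is only a bare-bones illustration of exposure fusion (dedicated tools do it far better), and it assumes the frames are already aligned:

    import numpy as np

    def fuse_exposures(frames):
        """Blend aligned bracketed exposures (float arrays scaled 0..1).
        Mid-tone pixels get the most weight; near-black and near-white the least."""
        stack = np.stack([f.astype(np.float64) for f in frames])    # (n, h, w) or (n, h, w, 3)
        weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))    # favor mid-tones
        weights /= weights.sum(axis=0, keepdims=True) + 1e-12       # normalize per pixel
        return (weights * stack).sum(axis=0)

    # Toy example: a dark, a normal, and a bright exposure of the same "scene".
    scene = np.linspace(0.0, 1.0, 5)            # stand-in for real image data
    brackets = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
    print(fuse_exposures(brackets))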

I chose a Fujifilm S5200 for its honeycomb-geometry CCD sensor array, giving it much more light sensitivity than other cameras in its ($300 at the time) price range. With film, night photography involved significant additional exposure time because the longer film is exposed, the less sensitive it effectively becomes, so exposures have to grow disproportionately long to get the image you expect ("reciprocity failure"). What a 15-second exposure should capture, and does with a CCD-based digital camera, requires about a 1-minute exposure with film; a calculated 1-minute film exposure requires 4 to 15 minutes in practice. Many night photos I took with film ran 15 to 30 minutes per exposure for that reason. If the exposure goes on long enough, the very different wavelengths of ultraviolet light affect the image, giving it a slightly blurred, bluish center area where the lens focused the ultraviolet light differently from visible light.

While CCDs eventually overheat if exposed (given trigger voltage) for too long, my S5200 will give, in its 15-second maximum exposure time, an image comparable to one requiring 10 to 15 minutes with film in my experience. Satellite and some terrestrial telescope CCDs use Peltier solid-state heat-exchanger wafers under the CCDs not only to prevent overheating but also to minimize the digital noise (oddly bright "hot pixels") caused by high CCD temperatures. Integrated circuit miniaturization has brought ever-evolving digital noise management to digital cameras: while my 2003 Olympus C720UZ didn't seem to combat its own hot pixels (ISO 800, 8-second maximum exposure), my 2006 Fujifilm S5200 (ISO 1600, 15-second maximum exposure) generates essentially none under any conditions, thanks to post-exposure processing as explained in the manual. You might see them on the LCD right after shooting, but they're not in the image later (unless shooting in unprocessed RAW format).

The exposure range in each of the S5200's images is also much greater than in the C720UZ's. The S5200's images can be "pushed" (shadow detail brought out) much further than the C720UZ's, suggesting about a 2:1 color depth ratio between the two. One way to combat digital noise, and so help "push" a digital camera's images further, is to average together multiple identical maximum-exposure images so the random noise largely cancels itself out. That is not hard at all with current affordable software (AutoStitch from AutoStitch.net and the basic version of Photomatix are both free).
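
That averaging trick is easy to illustrate: stack N identically exposed frames and take the per-pixel mean, and the random part of the noise falls off roughly as the square root of N. A minimal Python sketch with simulated frames (the noise level and scene are made up):

    import numpy as np

    rng = np.random.default_rng(0)
    true_scene = np.full((100, 100), 0.2)    # a dim, uniform patch of night sky

    def noisy_frame():
        # One simulated long exposure: the true signal plus random sensor noise.
        return true_scene + rng.normal(0.0, 0.05, true_scene.shape)

    for n_frames in (1, 4, 16):
        stacked = np.mean([noisy_frame() for _ in range(n_frames)], axis=0)
        residual = (stacked - true_scene).std()
        print(f"{n_frames:2d} frame(s): residual noise ~{residual:.4f}")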

As an update, since 2008 I've been using a Panasonic FZ18 (8MP), with an 18:1 all-glass optical zoom lens designed and made by camera and lens maker Leica. At lower resolutions it will crop in on the sensor to extend that zoom range, so if you only need a 3MP image you can get a 28.7:1 zoom. It also has two really good lens-movement-based image stabilization modes, which combined with the 28.7:1 maximum zoom means I've gotten plenty of good images of planes and helicopters flying way overhead (no time for tripod use; you have to follow the aircraft anyway). The lens is of sufficient quality that there is no chromatic aberration ("purple fringing"), and optical distortion (pincushion and barrel distortion) is negligible. The zoom is so long, and the lens so clear, that using the previously mentioned 3X teleconverter actually reduces detail clarity. So now one camera with a normal-looking lens, no tripod, can take a better image than my old film-based Petri SLR with a custom Mad Max-grade megazoom mounted on plywood resting only on something with house-grade stability. (And of course the FZ18 focuses itself very well.)
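
The 28.7:1 figure is mostly simple arithmetic: dropping from the full-resolution output to 3MP lets the camera crop into the center of the sensor, and the effective focal length scales with the square root of the pixel-count ratio. A quick Python check using nominal round-number resolutions (the camera's exact sensor and crop sizes differ slightly, which is why this lands near, rather than exactly on, 28.7:1):

    import math

    optical_zoom = 18.0   # FZ18 full-resolution optical zoom ratio
    full_mp = 8.0         # nominal full-resolution megapixels
    cropped_mp = 3.0      # nominal reduced-resolution output

    # Cropping to fewer pixels narrows the field of view; zoom scales with the
    # linear dimension, i.e. the square root of the pixel-count ratio.
    effective_zoom = optical_zoom * math.sqrt(full_mp / cropped_mp)
    print(f"Effective zoom at {cropped_mp:.0f} MP: about {effective_zoom:.1f}:1")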

In summary, the technology that brought us the Hubble Space Telescope has made quality photography affordable enough to make our world far more documented, far more photographed, than ever before. There are the stitched panoramas of the Mars landscape taken from NASA's Mars rovers, the Hubble Space Telescope's awe-inspiring images of the edges of the universe, and the incredible fisheye videos (with audio) taken from the sides of NASA Shuttle booster rockets. There is also the tourist with just a pocket camera and some free software showing off a mind-blowing, seemingly infinitely detailed, seamlessly stitched panorama of the Grand Canyon. There's the person proudly photographing their home newly decked out in Christmas lights at night, leading to an heirloom-quality noiseless 8x10 print with zero lost shadow detail and zero blown-out highlights. There's the guy feeling he's looking at God from photographing just part of the clear, star-filled New Hampshire night sky with a camera that seems to have better night vision than he has.



Miniaturization Effects on Lighting

Efficiency improvements in artificial lighting have been driven partly by miniaturization. Spiral-type fluorescent bulbs that screw into standard Edison-type sockets are four times as efficient as standard incandescent bulbs and last several times longer. While economies of scale are largely responsible for their long-term cost now being much lower than incandescent-based costs, without the decent compact electronics at their base to power them up quickly (and to let them fit in the first place) they would have no chance against as-small-as-you-want, instantly-on incandescent bulbs. Newer, small cold cathode fluorescent lamps (CCFLs) generate 50% less heat and use 75% less energy than standard fluorescent bulbs, plus they start instantly and don't flicker (all reasons for their common use in flatbed scanners).
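
To put the "four times as efficient" figure in household terms, here is a quick illustrative calculation. The wattages, hours and electricity rate are assumptions (a 60 W incandescent replaced by a 15 W compact fluorescent of similar output, four hours a day, $0.12 per kWh), not measured figures:

    # Illustrative numbers only: wattages, daily hours, and rate are assumptions.
    incandescent_w = 60
    cfl_w = 15
    hours_per_day = 4
    rate_per_kwh = 0.12

    def annual_use(watts):
        kwh_per_year = watts * hours_per_day * 365 / 1000
        return kwh_per_year, kwh_per_year * rate_per_kwh

    for name, w in (("incandescent", incandescent_w), ("compact fluorescent", cfl_w)):
        kwh, cost = annual_use(w)
        print(f"{name:20s}: {kwh:6.1f} kWh/year, about ${cost:.2f}")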

Now LED-based bulbs are starting to become available, offering CCFL-like improvements over standard fluorescents roughly similar to the improvement fluorescents offer over incandescent bulbs, but they always reach full brightness immediately and last essentially forever... They are simply diodes, after all. LED technology stems directly from transistor development. Laser diodes, the LED's close relatives, have already replaced larger devices for laser use, allowing affordable, energy-efficient laser printers and fingertip-sized laser pointers.

Do you know why light bulbs will screw into traditional metal gas can lids? Thomas Edison wanted folks to feel some sense of familiarity with this new "electric light" thing he was pushing... The "Edison bulb" socket used (and still uses) the same thread diameter and pitch as the refill caps for kerosene lamp tanks, and today's portable gas/kerosene tanks and screw-in light bulbs are their direct descendants. The earliest electric outlets were the equivalent of wall-mounted lamp sockets, not changed to recessed double-spade-type connectors until the early 1920s. LEDs, however, don't really need sockets, and in fact need almost no space at all. As the 12VDC CFLs in my home's photovoltaic-charged 12V subsystem started developing age issues after a few years, I replaced many with very bright 12V LED light strips. One of the strips I separated into four mini-strips for use in 12V bulb sockets... Now I have base-only, bulb-less LED "bulbs" producing comparable (and instantly full) light for half to a third of the energy of the CFLs they replaced. Most of my artificial light is now from whole 12V LED light strips attached to the ceiling here and there, with hidden wiring. It might sound weird, but it works great.



Miniaturized Audio Gear

Another area of extremely miniaturized consumer electronics is audio gear. What in the 1940s required a decent-sized bookcase full of vinyl 33 1/3 rpm records and a vacuum-tube-based turntable/amplifier to play them on now takes about as much room as two or three adult fingers, including the in-ear headphones that go with it.

Some folks say, though, that the older technology produced a mellower, smoother sound. "Clipping" happens when an amplifier receives a signal so large that its power supply can't keep up with the resulting current demands, so named because the waveform's otherwise curved top gets clipped off into a flat plateau. I gather that vacuum tubes, or at least discrete transistors rather than an all-in-one amplifier chip, clip much more smoothly and less jarringly, but much of the reason for the smoother sound remains a mystery to those seriously looking into it. Generally, though, advances in digital signal processing have allowed extremely high fidelity to the original source from radically smaller devices, especially compared with the crackly, no-bass, no-treble, monophonic 9V-draining pocket radios I remember from childhood. And my 2GB SanDisk MP3 player / radio / audio recorder, about twice the volume of my thumb, has yet to eat a cassette tape.
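
Clipping itself is easy to visualize numerically: ask an amplifier for more swing than its supply rails allow and the tops of the waveform get sheared flat, adding the harsh higher harmonics people complain about. A tiny Python sketch of hard clipping, with a tanh curve standing in (as a common rough approximation, not an actual tube model) for the more gradual, "rounded" clipping tube amps are said to produce:

    import numpy as np

    t = np.linspace(0.0, 1.0, 1000, endpoint=False)
    signal = 1.5 * np.sin(2 * np.pi * 5 * t)     # asking for 1.5x the available swing
    rail = 1.0                                   # supply-rail limit

    hard_clipped = np.clip(signal, -rail, rail)  # transistor-style hard clipping
    soft_clipped = np.tanh(signal)               # gentler, tube-like compression curve

    print("peak asked for:   ", round(signal.max(), 3))
    print("hard-clipped peak:", round(hard_clipped.max(), 3))   # flat-topped at the rail
    print("soft-clipped peak:", round(soft_clipped.max(), 3))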

When converting an audio source to MP3 (or another typical lossy format), though, you want to turn up the quality setting to make sure you don't lose too much detail, because over-compressed MP3 files (just like over-compressed JPEG image files) can really trash source fidelity.
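
As a rough guide to what the quality setting costs in storage, an MP3's size is essentially just bitrate times duration. A quick illustrative Python calculation for a four-minute track at a few common constant bitrates (real encoders also offer variable bitrate, so actual sizes vary):

    duration_s = 4 * 60    # a four-minute track

    for kbps in (64, 128, 192, 320):
        size_mb = kbps * 1000 / 8 * duration_s / 1_000_000
        print(f"{kbps:3d} kbps: about {size_mb:.1f} MB")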



The Source of Miniaturization: The Personal Computer

And then of course there is the personal computer and its quickly growing family of derivatives: the stand-alone network server, the very thin rack-mount modular network server, and the similarly compact and sometimes hot-running laptop computer, which has its own family of smaller relatives, from the notebook computer to the netbook to the hand-held Personal Digital Assistant. All of these have become powerful enough, in sufficiently compact and affordable packages, that pretty much everyone is now able to have both a home computer and some sort of Web site hosted by at least their Internet service provider.

For my own use, until a few years ago I used my trusty AMD-powered desktop for occasional computational heavy lifting and my sister's old laptop (mostly solar-powered in the warmer half of the year) for regular use (such as creating this document). Now I occasionally use a Windows-based laptop for workstation-grade work, and my 12V-powered Linux Eee PC 900 netbook for regular use (including nearly all Internet access, since Linux doesn't get eaten alive by viruses the way Windows so readily does), all about 95% solar-powered. (The DSL modem and router are normally grid-powered, though I don't switch them on unless I'm doing something that requires Internet access.)

This abundance of small, networked computers has spawned interest in grid computing, in which the otherwise unused CPU cycles of millions of powered-on but mostly idle computers are put to work on tasks ranging from possible cures for different types of cancer to examining distant radio emissions for hints of intelligent extraterrestrial life. Cumulatively, they provide the analytical power of a very powerful and very expensive mainframe supercomputer. (Supercomputers themselves are now constructed similarly, around parallel sets of individual PC-grade CPUs, albeit many thousands of them.)
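
The core idea of grid computing is simply splitting one big job into independent work units and farming them out. Here is a toy Python sketch in which worker processes on one machine stand in for volunteer computers on a network; it is a hypothetical illustration, not SETI@home's or any real project's protocol:

    from multiprocessing import Pool

    def work_unit(chunk):
        """Stand-in for real analysis, e.g. scanning one slice of data for a signal."""
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        total_size, chunk_size = 1_000_000, 100_000
        chunks = [range(start, min(start + chunk_size, total_size))
                  for start in range(0, total_size, chunk_size)]

        with Pool() as pool:             # each worker plays one "volunteer PC"
            partial_results = pool.map(work_unit, chunks)

        print("combined result:", sum(partial_results))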

A curious thing about miniaturization is that while the transistor has gotten so much smaller, and so uses so much less power, there are exponentially more of them in use than a decade ago, so total personal computer power consumption hasn't really decreased with miniaturization (until very recently). The power supply for an old Cyrix-based 200 MHz AT was rated at about 200 watts, vs. 60 watts for my late-1980s 64K Tandy TRS-80 Color Computer, and 400 watts for my desktop (well, tower actually) computer. The laptop I typed the original of this on had, similar to the "Trash 80 CoCo," a power supply rated at 65W. It did fine for current regular tasks with its 512MB of RAM and an 800 MHz Celeron running MS XP Pro, while my old CoCo did fine in the late '80s with the tasks I gave it, using a 0.79 / 1.4 MHz multitasking-capable Motorola chip and 64kB of RAM. The fluorescent lamp next to the laptop used only 5 watts, though (about 2W now for its LED replacement), vs. the 60+ watt incandescent plus the 60 watt television the CoCo used as a monitor, so I'm able to get more done, in an infinitely more user-friendly manner, using about a third of the total energy with the laptop vs. the old days. Currently (2010), the energy used is about 95% from solar panels, plus about 15W for the DSL modem and router... But usually folks fire up their desktop and separate (sometimes still CRT-based) monitor for any task (not everyone has laptops yet), and many still mostly use incandescent lighting. So my power consumption situation, especially in warmer weather with the solar panels getting direct bright sunlight, is statistically a freak show.
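
The "about a third" claim follows from the wattages just listed; here is a quick check using those nameplate figures (real draw is lower than nameplate, especially for the laptop, which is why the real-world ratio plausibly lands at the third described rather than the 39% the ratings give):

    # Nameplate wattages from the paragraph above; actual draw is lower in both cases.
    old_setup = {"CoCo power supply": 60, "incandescent lamp": 60, "TV used as monitor": 60}
    new_setup = {"laptop power supply": 65, "fluorescent desk lamp": 5}

    old_total = sum(old_setup.values())
    new_total = sum(new_setup.values())
    print(f"old setup: {old_total} W, laptop setup: {new_total} W "
          f"({new_total / old_total:.0%} of the old draw)")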

Computers are in essentially everything now, from humidity-sensing toasters and microwave ovens, to the wall thermostat that saves you 40% on heating costs by cutting back the heat however much you want while everyone's out, to the microprocessor that turns a window air conditioner's compressor on only after at least three minutes have passed (so it doesn't stall and trip its thermal protection circuit) and runs the blower only as fast as the room temperature requires. However interesting personal computers of whatever sort may seem, the effectiveness and efficiency improvements from home and business device automation have been the real computer revolution, achievable only via computer miniaturization.



Conclusion

Most of the miniaturization I've described has been driven by miniaturization of computers of all sorts, hence my finishing with that topic. It has had extremely far-reaching consequences for everyday life, giving us what our great-grandparents would have considered science-fiction lives. What the Romans did with slaves we do with microprocessor-controlled tools. We can observe our surroundings with Star Trek-like accuracy via CCD cameras, work on disease cures via countless thousands of otherwise bored home computers, and get assistance upon encountering trouble on long trips instead of becoming a corpse found in a future spring's snowmelt-swollen streams.

It will continue, leading to technologies not even science fiction writers have typically envisioned. May it be used more for assisting the common good than for working against the common good.






