Tuesday, July 15, 2008

Rocketcam

Space exploration has captivated Americans for decades. Perhaps Captain Kirk of "Star Trek" said it best when he called space "the final frontier." We stand on the edge of a wilderness so tantalizingly close and mysterious, we can't help but risk our lives to know more about it.

rocketcam view from spacecraft
NASA/Ecliptic Enterprises Corporation
An example of the stunning view made possible by RocketCam

But until it's cheap enough to hitch rides to the moon, the public will have to settle for vicarious space exploration via video. In fact, movies and space exploration have been connected for a long time. One of the first popular movies, "A Trip to the Moon," made in 1902, is a fantastical tale of lunar exploration. Sixty-seven years later, televisions glowed with video feed of the first manned moon landing, fueling the public's imagination about space travel.

But video's place in space goes beyond thrills. Today's ultra-connected world revolves around the idea that the more informed we are, the better the decisions we make. This rings particularly true in space exploration, where tiny problems can trigger disastrous consequences -- think of the space shuttle Columbia tragedy.

In 2003, the shuttle broke apart during re-entry into the Earth's atmosphere, killing all seven astronauts aboard. The problem traced back to a piece of insulating foam that fell from the surface of the external fuel tank 81 seconds after liftoff and struck the left wing. NASA engineers viewed video of the mishap, but it was filmed from a distance, and they concluded that the insulation didn't cause any serious damage. They were wrong.

Had the shuttle been equipped with a RocketCam during the launch, the damage to Columbia's wing might have been properly diagnosed and the disaster avoided. The RocketCam is an ordinary video camera with an extraordinary purpose. Attached to the launching vehicle, it provides a crucial view during takeoff -- the view from the vehicle down to the ground. In fact, a RocketCam caught video of insulation flying off the Discovery shuttle's fuel tank during the 2005 return-to-flight launch. That video helped engineers make the informed decision that -- this time -- the insulation didn't cause any damage.

Components of RocketCam

The RocketCam doesn't look like anything special. But don't let the primitive exterior fool you -- this gadget is rugged. It can endure extreme speeds, vibration and temperatures.

Beneath its clunky exterior, the basic camera weighs about 2.8 ounces (about 80 grams) and is about four inches (10 centimeters) long [source: Space Show, Fine]. RocketCams use professional-grade Sony miniature color video cameras (in the past, the XC-999 model). But you can't just take one of these cameras and strap it to a rocket. To make the camera ultra durable, Ecliptic Enterprises Corporation takes the camera apart and puts it back together again. During this process, it ruggedizes (or strengthens) the camera, bolting it to a platform with added features that make it tough enough to withstand extreme environments. The camera's casing has an aerodynamic design, and a layer of insulating foam on its surface helps protect it from extreme temperatures.

RocketCam AVS camera assembly
Ecliptic Enterprises Corporation
RocketCam camera equipment

The Ecliptic engineers also add technical features. For instance, every RocketCam that is built to take a ride on the external tank of a shuttle has a radio transmitter and antenna to send the video information down to the ground from space using electromagnetic radio waves. This allows the information to be sent quickly to multiple receivers. The camera may also incorporate various battery sizes or power support boards and other adaptable features, depending on the use.

RocketCams may be either analog or digital. These terms simply refer to how the visual data is recorded and reproduced. In analog recording, an older method, the device records information as continuously varying waves. Digital technology doesn't record waves; it records information as numbers -- 1s and 0s. Digital versions of the RocketCam are more sophisticated, with added features such as "store-and-forward" viewing, which allows those on board to review video, similar to how a TiVo works. Digital versions also make more efficient use of the available radio frequency (RF) bandwidth.
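To make the analog/digital distinction concrete, here is a minimal sketch (the sample rate and bit depth are made-up numbers, not anything Ecliptic has published) that samples a smoothly varying signal and quantizes each sample into the 1s and 0s a digital recorder would store:

    import math

    # A continuous (analog) signal: its amplitude varies smoothly with time.
    def analog_signal(t):
        return math.sin(2 * math.pi * 5 * t)  # a 5 Hz test tone

    sample_rate = 40   # samples per second (illustrative)
    bits = 4           # bits per sample -> 16 possible levels
    levels = 2 ** bits

    digital_samples = []
    for n in range(sample_rate):                       # one second of samples
        t = n / sample_rate
        value = analog_signal(t)                       # continuous value between -1 and 1
        level = round((value + 1) / 2 * (levels - 1))  # quantize to an integer 0..15
        digital_samples.append(format(level, "04b"))   # store it as 1s and 0s

    print(digital_samples[:8])

An analog recorder would keep the continuously varying signal itself; the digital version keeps only these numbered samples, which is what makes features like store-and-forward playback straightforward.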

rocketcam DVS assembly
Ecliptic Enterprises Corporation
RocketCam Digital Video Controller for spacecraft
(five cameras shown)

Digital models take in a large amount of information in order to capture things like payload separation and to aid in failure investigations. To accommodate that, the digital RocketCam compresses this information before sending it over radio waves down to Earth. (Although digital technology doesn't use electromagnetic waves to record information, it can use those waves to transmit information it has recorded.) Compressing digital information involves finding repeated patterns and shortening them, as you'll learn in How File Compression Works. To carry out the process, the engineers incorporate a compressor that uses a digital signal processor (DSP) chip to format the information.
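As a rough illustration of the "find repeated patterns and shorten them" idea -- not the actual DSP-based compressor the RocketCam uses -- here's a minimal run-length encoding sketch:

    def run_length_encode(data):
        """Collapse runs of repeated values into [value, count] pairs."""
        encoded = []
        for value in data:
            if encoded and encoded[-1][0] == value:
                encoded[-1][1] += 1          # extend the current run
            else:
                encoded.append([value, 1])   # start a new run
        return encoded

    # One row of video pixels with large uniform areas (think black sky around a bright plume)
    row = [0] * 50 + [255] * 3 + [0] * 47
    print(run_length_encode(row))   # [[0, 50], [255, 3], [0, 47]] -- three pairs instead of 100 values

Real video compressors are far more elaborate (they also exploit similarities between neighboring frames), but the underlying idea is the same: describe repetition compactly instead of transmitting every value.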

rocketcam protective casing pod
Ecliptic Enterprises Corporation
RocketCam Analog Video System Pod.

Although they are best known for their positions on the outside of a launching vehicle, some RocketCams offer even more information about a space launch by recording the interior of a ship. SpaceShipOne, the first privately funded manned ship to go to space, used two interior RocketCams in the cockpit in addition to cameras on the exterior. These internal cameras don't require protective casing.

When used on space shuttles, NASA mounts RocketCams to the external fuel tank and on the two solid rocket boosters. These structures separate from the shuttle after takeoff, so each camera only hitches a ride temporarily. But by that point, the vital information has already traveled to the ground.

At a mere 5 pounds (about 2.3 kilograms), this camera serves a lot of purposes [source: Space Show]. Read on to find out why it's so important.

Value of RocketCam

When monitoring the data of a rocket launch, the faster the information travels, the better. RocketCam can send live feeds of the launch, and this situational awareness means ground engineers know what's happening in real time and can make fast, informed decisions if something goes wrong.

rocketcam view from spacecraft
NASA/Ecliptic Enterprises Corporation
The RocketCam allows ground engineers to view certain actions, like the release of the solid rocket
boosters, shown here.

RocketCam can be a highly valuable tool for research in space as well. As part of the Lunar Crater Observation and Sensing Satellite (LCROSS) mission, NASA plans to send a rocket to the moon -- in fact, the plan is for the rocket to crash into the moon in an attempt to stir up debris. They suspect the impact could release water ice; even if it doesn't, analysis of the debris might offer clues as to whether there are traces of water on the moon. The RocketCam will aid in the research. Unlike with shuttle launches, it won't be released along the way. It will be attached to a small spacecraft that separates before impact so that the camera can send NASA a video feed of the crash.

Public relations is also a major part of the business of space exploration, and many consider video the best way to get people excited about it. Watching video of a launch from the ground is only so interesting to television audiences. But give them a video view from the rocket itself, and they're riveted.

rocketcam showing external fuel tank separation
NASA/Ecliptic Enterprises Corporation
Here, the RocketCam shows a view from the external fuel tank as the shuttle releases it.

Public appeal goes beyond NASA and shuttle launches -- private companies can help drive space tourism with video, and they use RocketCams as well. After the Columbia disaster in 2003, NASA imposed a hiatus on its shuttle program, meaning private launches got more attention. One of the most famous of these private uses was on the suborbital SpaceShipOne, which made the first privately funded manned flight to space in June 2004. To help encourage the expanding field of private space exploration, Ecliptic has made efforts to keep the RocketCam affordable (an easy endeavor given that the company doesn't manufacture the camera technology, just its rugged casing) [source: Fine].

RocketCam has been included on more than 60 launches, including rockets (orbital and suborbital), spacecraft and shuttles. A lot of these launches include non-space-related projects as well, which we'll talk about next.

Blinded Me with Rocket Science

RocketCam's first blastoff on space shuttle Atlantis in October 2002 provided viewers with spectacular video -- but only for a few minutes. Unfortunately, the rockets that work to separate the solid rocket boosters from the shuttle released exhaust that significantly blurred the RocketCam view. Now, NASA mounts the RocketCam strategically to avoid this problem [source: Adams].


Source: http://science.howstuffworks.com/rocketcam.htm

NOTE: I do not claim ownership of this article in any sense.
RocketCam™ is a trademark of Ecliptic Enterprises Corporation.

Saturday, July 5, 2008

How Night Vision Works

The first thing you probably think of when you see the words night vision is a spy or action movie you've seen, in which someone straps on a pair of night-vision goggles to find someone else in a dark building on a moonless night. And you may have wondered "Do those things really work? Can you actually see in the dark?"
day/night binoculars
Photo courtesy of B.E. Meyers Company
Gyro-stabilized day/night binoculars manufactured by B.E. Meyers Company.

The answer is most definitely yes. With the proper night-vision equipment, you can see a person standing over 200 yards (183 m) away on a moonless, cloudy night! Night vision can work in two very different ways, depending on the technology used.

  • Image enhancement - This works by collecting the tiny amounts of light, including the lower portion of the infrared light spectrum, that are present but may be imperceptible to our eyes, and amplifying it to the point that we can easily observe the image.
  • Thermal imaging - This technology operates by capturing the upper portion of the infrared light spectrum, which is emitted as heat by objects instead of simply reflected as light. Hotter objects, such as warm bodies, emit more of this light than cooler objects like trees or buildings.

In this article, you will learn about the two major night-vision technologies. We'll also discuss the various types of night-vision equipment and applications. But first, let's talk about infrared light.

Infrared Light

In order to understand night vision, it is important to understand something about light. The amount of energy in a light wave is related to its wavelength: Shorter wavelengths have higher energy. Of visible light, violet has the most energy, and red has the least. Just next to the visible light spectrum is the infrared spectrum.
light spectrum
Infrared light is a small part of the light spectrum.

Infrared light can be split into three categories:

  • Near-infrared (near-IR) - Closest to visible light, near-IR has wavelengths that range from 0.7 to 1.3 microns, or 700 billionths to 1,300 billionths of a meter.
  • Mid-infrared (mid-IR) - Mid-IR has wavelengths ranging from 1.3 to 3 microns. Both near-IR and mid-IR are used by a variety of electronic devices, including remote controls.
  • Thermal-infrared (thermal-IR) - Occupying the largest part of the infrared spectrum, thermal-IR has wavelengths ranging from 3 microns to over 30 microns.
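To connect these bands to the energy relationship mentioned above (photon energy E = hc/λ, so shorter wavelengths carry more energy), here's a small sketch; the specific wavelengths are simply representative points in each band:

    # Photon energy falls as wavelength grows: E = h * c / wavelength
    h = 6.626e-34   # Planck's constant (joule-seconds)
    c = 3.0e8       # speed of light (meters/second)

    wavelengths_microns = {
        "violet (visible)":      0.4,
        "red (visible)":         0.7,
        "near-IR upper bound":   1.3,
        "mid-IR upper bound":    3.0,
        "thermal-IR (typical)": 10.0,
    }

    for name, microns in wavelengths_microns.items():
        energy_joules = h * c / (microns * 1e-6)
        energy_ev = energy_joules / 1.602e-19       # convert joules to electron volts
        print(f"{name:22s} {microns:5.1f} microns -> {energy_ev:.2f} eV")

Running this shows a violet photon carries roughly 25 times the energy of a 10-micron thermal-IR photon.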

The key difference between thermal-IR and the other two is that thermal-IR is emitted by an object instead of reflected off it. Infrared light is emitted by an object because of what is happening at the atomic level.

Atoms
Atoms are constantly in motion. They continuously vibrate, move and rotate. Even the atoms that make up the chairs that we sit in are moving around. Solids are actually in motion! Atoms can be in different states of excitation. In other words, they can have different energies. If we apply a lot of energy to an atom, it can leave what is called the ground-state energy level and move to an excited level. The level of excitation depends on the amount of energy applied to the atom via heat, light or electricity.

An atom consists of a nucleus (containing the protons and neutrons) and an electron cloud. Think of the electrons in this cloud as circling the nucleus in many different orbits. Although more modern views of the atom do not depict discrete orbits for the electrons, it can be useful to think of these orbits as the different energy levels of the atom. In other words, if we apply some heat to an atom, we might expect that some of the electrons in the lower energy orbitals would transition to higher energy orbitals, moving farther from the nucleus.

atom
An atom has a nucleus and an electron cloud.

Once an electron moves to a higher-energy orbit, it eventually wants to return to the ground state. When it does, it releases its energy as a photon -- a particle of light. You see atoms releasing energy as photons all the time. For example, when the heating element in a toaster turns bright red, the red color is caused by atoms excited by heat, releasing red photons. An excited electron has more energy than a relaxed electron, and just as the electron absorbed some amount of energy to reach this excited level, it can release this energy to return to the ground state. This emitted energy is in the form of photons (light energy). The photon emitted has a very specific wavelength (color) that depends on the state of the electron's energy when the photon is released.

Anything that is alive uses energy, and so do many inanimate items such as engines and rockets. Energy consumption generates heat. In turn, heat causes the atoms in an object to fire off photons in the thermal-infrared spectrum. The hotter the object, the shorter the wavelength of the infrared photons it releases. An object that is very hot will even begin to emit photons in the visible spectrum, glowing red and then moving up through orange and yellow toward white. Be sure to read How Light Bulbs Work, How Lasers Work and How Light Works for more detailed information on light and photon emission.
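The "hotter object, shorter wavelength" rule has a simple quantitative form, Wien's displacement law: the peak emission wavelength is about 2,898 micron-kelvins divided by the object's temperature. A quick sketch, using rough temperatures chosen for illustration:

    # Wien's displacement law: peak emission wavelength shrinks as temperature rises.
    WIEN_CONSTANT = 2898.0   # micron-kelvins

    def peak_wavelength_microns(temp_celsius):
        temp_kelvin = temp_celsius + 273.15
        return WIEN_CONSTANT / temp_kelvin

    objects = [("human body", 37), ("hot engine block", 400),
               ("toaster element", 750), ("lamp filament", 2500)]

    for label, temp_c in objects:
        print(f"{label:16s} {temp_c:5d} C -> peak ~{peak_wavelength_microns(temp_c):4.1f} microns")

A body at skin temperature peaks near 9 microns, squarely in the thermal-IR band, while a 2,500-degree filament peaks near 1 micron -- close enough to visible light that it glows.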

In night vision, thermal imaging takes advantage of this infrared emission.

Thermal Imaging

Here's how thermal imaging works:
  1. A special lens focuses the infrared light emitted by all of the objects in view.
  2. The focused light is scanned by a phased array of infrared-detector elements. The detector elements create a very detailed temperature pattern called a thermogram. It only takes about one-thirtieth of a second for the detector array to obtain the temperature information to make the thermogram. This information is obtained from several thousand points in the field of view of the detector array.
  3. The thermogram created by the detector elements is translated into electric impulses.
  4. The impulses are sent to a signal-processing unit, a circuit board with a dedicated chip that translates the information from the elements into data for the display.
  5. The signal-processing unit sends the information to the display, where it appears as various colors depending on the intensity of the infrared emission. The combination of all the impulses from all of the elements creates the image. (A small code sketch of this temperature-to-color mapping appears after the diagram below.)
thermal imaging system
Image courtesy of Infrared, Inc.
The basic components of a thermal-imaging system
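Step 5 above boils down to mapping each temperature in the thermogram to a display color. Here is a minimal sketch of that idea, with an invented two-color palette and a tiny made-up thermogram (real systems use far larger detector arrays and richer palettes):

    # Map a thermogram (a 2-D grid of temperatures) to display colors.
    def temp_to_color(temp_c, min_c, max_c):
        """Scale a temperature into a simple cold-to-hot palette (blue -> red)."""
        t = max(0.0, min(1.0, (temp_c - min_c) / (max_c - min_c)))
        return (int(255 * t), 0, int(255 * (1 - t)))   # (R, G, B)

    # A tiny 3x4 thermogram: a warm "person" against a cool background
    thermogram = [
        [18.0, 18.5, 19.0, 18.2],
        [18.1, 36.5, 37.0, 18.4],
        [18.3, 36.8, 36.9, 18.0],
    ]

    # Stretch the palette across the temperatures actually present in the scene
    lo = min(min(row) for row in thermogram)
    hi = max(max(row) for row in thermogram)

    image = [[temp_to_color(t, lo, hi) for t in row] for row in thermogram]
    for row in image:
        print(row)   # the warm cells come out strongly red, the background strongly blue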

Types of Thermal Imaging Devices
Most thermal-imaging devices scan at a rate of 30 times per second. They can sense temperatures ranging from -4 degrees Fahrenheit (-20 degrees Celsius) to 3,600 F (2,000 C), and can normally detect changes in temperature of about 0.4 F (0.2 C).

day
Image courtesy of Infrared, Inc.
It is quite easy to see everything during the day...
night
Image courtesy of Infrared, Inc.
...but at night, you can see very little.
thermal imaging
Image courtesy of Infrared, Inc.
Thermal imaging lets you see again.

There are two common types of thermal-imaging devices:

  • Un-cooled - This is the most common type of thermal-imaging device. The infrared-detector elements are contained in a unit that operates at room temperature. This type of system is completely quiet, activates immediately and has the battery built right in.
  • Cryogenically cooled - More expensive and more susceptible to damage from rugged use, these systems have the elements sealed inside a container that cools them to below 32 F (zero C). The advantage of such a system is the incredible resolution and sensitivity that result from cooling the elements. Cryogenically-cooled systems can "see" a difference as small as 0.2 F (0.1 C) from more than 1,000 ft (300 m) away, which is enough to tell if a person is holding a gun at that distance!

While thermal imaging is great for detecting people or working in near-absolute darkness, most night-vision equipment uses image-enhancement technology.


Image Enhancement

Image-enhancement technology is what most people think of when you talk about night vision. In fact, image-enhancement systems are normally called night-vision devices (NVDs). NVDs rely on a special tube, called an image-intensifier tube, to collect and amplify infrared and visible light.
image intensifier tube
The image-intensifier tube changes photons to electrons and back again.

Here's how image enhancement works:

  1. A conventional lens, called the objective lens, captures ambient light and some near-infrared light.
  2. The gathered light is sent to the image-intensifier tube. In most NVDs, the power supply for the image-intensifier tube receives power from two N-Cell or two "AA" batteries. The tube outputs a high voltage, about 5,000 volts, to the image-tube components.
  3. The image-intensifier tube has a photocathode, which is used to convert the photons of light energy into electrons.
  4. As the electrons pass through the tube, similar electrons are released from atoms in the tube, multiplying the original number of electrons by a factor of thousands through the use of a microchannel plate (MCP) in the tube. An MCP is a tiny glass disc that has millions of microscopic holes (microchannels) in it, made using fiber-optic technology. The MCP is contained in a vacuum and has metal electrodes on either side of the disc. Each channel is about 45 times longer than it is wide, and it works as an electron multiplier.

    When the electrons from the photocathode hit the first electrode of the MCP, they are accelerated into the glass microchannels by the 5,000-V bursts being sent between the electrode pair. As electrons pass through the microchannels, they cause thousands of other electrons to be released in each channel using a process called cascaded secondary emission. Basically, the original electrons collide with the side of the channel, exciting atoms and causing other electrons to be released. These new electrons also collide with other atoms, creating a chain reaction that results in thousands of electrons leaving the channel where only a few entered. (A rough numeric sketch of this cascaded gain appears after the list.) An interesting fact is that the microchannels in the MCP are created at a slight angle (about a 5-degree to 8-degree bias) to encourage electron collisions and reduce both ion and direct-light feedback from the phosphors on the output side.

    night vision image
    Photo courtesy of B.E. Meyers Company
    Night-vision images are known for their eerie green tint.
  5. At the end of the image-intensifier tube, the electrons hit a screen coated with phosphors. These electrons maintain their position in relation to the channel they passed through, which provides a perfect image since the electrons stay in the same alignment as the original photons. The energy of the electrons causes the phosphors to reach an excited state and release photons. These phosphors create the green image on the screen that has come to characterize night vision.
  6. The green phosphor image is viewed through another lens, called the ocular lens, which allows you to magnify and focus the image. The NVD may be connected to an electronic display, such as a monitor, or the image may be viewed directly through the ocular lens.
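The cascaded secondary emission described in step 4 compounds exponentially: if each collision inside a microchannel knocks loose a few new electrons, a handful of collisions is enough to turn one electron into thousands. Here's a back-of-the-envelope sketch; the yield per collision and the collision count are illustrative guesses, not published MCP specifications:

    # Cascaded secondary emission in a microchannel, modeled as simple exponential gain.
    yield_per_collision = 2.5     # assumed: new electrons released per wall collision
    collisions_per_channel = 10   # assumed: collisions as an electron travels the channel

    gain = yield_per_collision ** collisions_per_channel
    print(f"One entering electron becomes roughly {gain:,.0f} electrons")   # about 9,500

    photons_detected = 40   # a very dim scene
    electrons_at_screen = photons_detected * gain
    print(f"{photons_detected} photoelectrons -> about {electrons_at_screen:,.0f} electrons hitting the phosphor screen")

With gains like this, even the faint light of a moonless night produces enough electrons at the phosphor screen to form a visible image.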


Generations

NVDs have been around for more than 40 years. They are categorized by generation. Each substantial change in NVD technology establishes a new generation.
  • Generation 0 - The original night-vision system created by the United States Army and used in World War II and the Korean War, these NVDs use active infrared. This means that a projection unit, called an IR Illuminator, is attached to the NVD. The unit projects a beam of near-infrared light, similar to the beam of a normal flashlight. Invisible to the naked eye, this beam reflects off objects and bounces back to the lens of the NVD. These systems use an anode in conjunction with the cathode to accelerate the electrons. The problem with that approach is that the acceleration of the electrons distorts the image and greatly decreases the life of the tube. Another major problem with this technology in its original military use was that it was quickly duplicated by hostile nations, which allowed enemy soldiers to use their own NVDs to see the infrared beam projected by the device.
  • Generation 1 - The next generation of NVDs moved away from active infrared, using passive infrared instead. Once dubbed Starlight by the U.S. Army, these NVDs use ambient light provided by the moon and stars to augment the normal amounts of reflected infrared in the environment. This means that they did not require a source of projected infrared light. This also means that they do not work very well on cloudy or moonless nights. Generation-1 NVDs use the same image-intensifier tube technology as Generation 0, with both cathode and anode, so image distortion and short tube life are still a problem.
  • Generation 2 - Major improvements in image-intensifier tubes resulted in Generation-2 NVDs. They offer improved resolution and performance over Generation-1 devices, and are considerably more reliable. The biggest gain in Generation 2 is the ability to see in extremely low light conditions, such as a moonless night. This increased sensitivity is due to the addition of the microchannel plate to the image-intensifier tube. Since the MCP actually increases the number of electrons instead of just accelerating the original ones, the images are significantly less distorted and brighter than earlier-generation NVDs.
  • Generation 3 - Generation 3 is currently used by the U.S. military. While there are no substantial changes in the underlying technology from Generation 2, these NVDs have even better resolution and sensitivity. This is because the photo cathode is made using gallium arsenide, which is very efficient at converting photons to electrons. Additionally, the MCP is coated with an ion barrier, which dramatically increases the life of the tube.
  • Generation 4 - What is generally known as Generation 4 or "filmless and gated" technology shows significant overall improvement in both low- and high-level light environments.

    Removing the ion barrier that was added to the MCP in Generation 3 reduces background noise and thereby enhances the signal-to-noise ratio. Removing the ion film also allows more electrons to reach the amplification stage, so the images are significantly less distorted and brighter.

    The addition of an automatic gated power supply system allows the photocathode voltage to switch on and off rapidly, thereby enabling the NVD to respond to a fluctuation in lighting conditions in an instant. This capability is a critical advance in NVD systems, in that it allows the NVD user to quickly move from high-light to low-light (or from low-light to high-light) environments without any halting effects. For example, consider the ubiquitous movie scene where an agent using night vision goggles is “sightless” when someone turns on a light nearby. With the new, gated power feature, the change in lighting wouldn’t have the same impact; the improved NVD would respond immediately to the lighting change.

Many of the so-called "bargain" night-vision scopes use Generation-0 or Generation-1 technology, and may be disappointing if you expect the sensitivity of the devices used by professionals. Generation-2, Generation-3 and Generation-4 NVDs are typically expensive to purchase, but they will last if properly cared for. Also, any NVD can benefit from the use of an IR Illuminator in very dark areas where there is almost no ambient light to collect.

NVD camera
Photo courtesy of B.E. Meyers Company
NVDs come in a variety of styles, including ones that can be mounted to cameras.

A cool thing to note is that every single image-intensifier tube is put through rigorous tests to see if it meets the requirements set forth by the military. Tubes that do are classified as MILSPEC. Tubes that fail to meet military requirements in even a single category are classified as COMSPEC.


Night Vision Equipment and Applications

Night-vision equipment can be split into three broad categories:
  • Scopes - Normally handheld or mounted on a weapon, scopes are monocular (one eye-piece). Since scopes are handheld, not worn like goggles, they are good for when you want to get a better look at a specific object and then return to normal viewing conditions.
    pocketscope
    Photo courtesy of B.E. Meyers Company
    DARK INVADER Multi-purpose Pocketscope
  • Goggles - While goggles can be handheld, they are most often worn on the head. Goggles are binocular (two eye-pieces) and may have a single lens or stereo lens, depending on the model. Goggles are excellent for constant viewing, such as moving around in a dark building.
    goggles
    Photo courtesy of B.E. Meyers Company
    DARK INVADER Night-vision Goggles 4501
  • Cameras - Cameras with night-vision technology can send the image to a monitor for display or to a VCR for recording. When night-vision capability is desired in a permanent location, such as on a building or as part of the equipment in a helicopter, cameras are used. Many of the newer camcorders have night vision built right in.
    video camera
    Photo courtesy of B.E. Meyers Company
    Stealth 301 Series Day/Night Video Camera

Applications

soldier goggles
Photo courtesy of B.E. Meyers Company
This soldier is using DARK INVADER night-vision goggles.
Common applications for night vision include:
  • Military
  • Law enforcement
  • Hunting
  • Wildlife observation
  • Surveillance
  • Security
  • Navigation
  • Hidden-object detection
  • Entertainment

The original purpose of night vision was to locate enemy targets at night. It is still used extensively by the military for that purpose, as well as for navigation, surveillance and targeting. Police and security often use both thermal-imaging and image-enhancement technology, particularly for surveillance. Hunters and nature enthusiasts use NVDs to maneuver through the woods at night.

Detectives and private investigators use night vision to watch people they are assigned to track. Many businesses have permanently-mounted cameras equipped with night vision to monitor the surroundings.

A really amazing ability of thermal imaging is that it reveals whether an area has been disturbed -- it can show that the ground has been dug up to bury something, even if there is no obvious sign to the naked eye. Law enforcement has used this to discover items that have been hidden by criminals, including money, drugs and bodies. Also, recent changes to areas such as walls can be seen using thermal imaging, which has provided important clues in several cases.

camcorder
Photo courtesy of B.E. Meyers Company
Camcorders are a fast-growing segment
of the night-vision industry.

Many people are beginning to discover the unique world that can be found after darkness falls. If you're out camping or hunting a lot, chances are that night-vision devices can be useful to you -- just be sure to get the right type for your needs.

Link for this article:

http://electronics.howstuffworks.com/nightvision.htm

How can my spoon taste my food for me?

It's been another long day at work, and the last thing you want to do is go home and cook for your family. No matter how hard you try, the peanut gallery will find some reason to complain. "The soup is too salty," says one child. Another child complains that his casserole is too runny. "This pie crust could really use some citrus," reports your spouse, who apparently moonlights as a food critic for the local newspaper.

man cooking and tasting food
LWA/Photodisc/Getty Images
Let an intelligent spoon handle the food tasting.

What you need is an intelligent spoon, a mixing spoon that uses sensors to measure different food qualities, like acidity, during the cooking process. This automated kitchen utensil was developed by two students at the Massachusetts Institute of Technology (MIT) in 2005. The students were working in MIT's Counter Intelligence Lab, which tries to create smart technology products that make our time in the kitchen easier. Smart technology automates appliances and devices with computers so that most of the work and the required thought are eliminated from everyday tasks.

Some of the Counter Intelligence Lab's other cool innovations for kitchens of the future include the "Talking Trivet," which will let you know if something needs to be rewarmed, if it's too hot to touch or if it's ready to eat. Faucets will let you know both the temperature of the water and also if you're using too much of it. In the future, you could project recipes directly onto a countertop so the cookbook doesn't get dirty, and you could save electricity by never having to open your fridge just to know what's inside. Instead, a small camera and projection screen could let you know your fridge's contents when you're thinking about what's for dinner.

But what about those intelligent spoons? How can they help with dinner? How do they use smart technology to taste food? Go on to the next page to find out what these spoons can stir up.

Tasting Spoons
Looking for a way to taste food while you cook that's a little less high-tech? Traditional tasting spoons have a spoon on each side of the handle -- one side is for stirring and the other side is just for tasting. This two-headed spoon is connected by a little trough. When you're ready to try your concoction, tip the stirring side so that the contents run down into the other spoon. This way you can taste the food while keeping everything hygienic.

Intelligent Spoon

Currently, the prototype intelligent spoon is made of clear plastic so that you can see an array of wires running through the inside. The smart spoon contains zinc, aluminum and gold sensors. When the spoon touches food, the sensors measure the concoction's temperature, acidity, salinity and viscosity. When you connect the spoon to a computer with a cord, the computer analyzes the information from the spoon and provides feedback. Although the intelligent spoon is still in development, we guess that it measures these factors and creates suggestions based on a database of recipes stored in the computer.

smart spoon
Prototype of a smart spoon

How might this play out in the kitchen? Say you're making salad dressing from scratch. You find a recipe on the computer and start following it. You accidentally add too much salt, and you're worried the entire dressing is too salty. Rather than making you taste the dressing, figure out how to adjust it and taste it again, the computer would tell you to add a certain amount of sugar. The computer would tell you how much vinegar or lime juice to add so that you had just the right amount of acidity, as measured by the mix's pH level. Is the dressing not holding together? The viscosity sensor will alert the computer that you need more egg yolk. It would be like having a GPS system for each recipe -- step-by-step instructions based on what's in the bowl.
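The spoon's software hasn't been published, so the following is only a guess at what such feedback logic might look like: sensor readings compared against a recipe's target ranges, with a suggestion whenever a value drifts out of range. Every threshold, unit and suggestion here is invented for illustration:

    # Hypothetical feedback loop for an "intelligent spoon" -- all values are invented.
    recipe_targets = {
        "salinity_percent": (0.8, 1.2),    # acceptable salt range for the dressing
        "ph":               (3.4, 3.9),    # acidity target
        "viscosity_cp":     (300, 600),    # thickness in centipoise
    }

    suggestions = {
        "salinity_percent": ("add a pinch more salt", "add a little sugar or water to offset the salt"),
        "ph":               ("add a splash of vinegar or lime juice", "add a little oil to soften the acidity"),
        "viscosity_cp":     ("whisk in more egg yolk to thicken", "thin with a teaspoon of water"),
    }

    def advise(readings):
        for name, value in readings.items():
            low, high = recipe_targets[name]
            if value < low:
                print(f"{name}: {value} is low -> {suggestions[name][0]}")
            elif value > high:
                print(f"{name}: {value} is high -> {suggestions[name][1]}")
            else:
                print(f"{name}: {value} looks fine")

    # A reading from the too-salty dressing in the example above
    advise({"salinity_percent": 1.6, "ph": 3.6, "viscosity_cp": 250})

The design choice worth noting is that the intelligence lives in the computer, not the spoon: the utensil only has to report raw numbers, and the recipe database on the other end of the cord decides what advice to give.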

Now, if you love to cook, this might seem a bit silly. Some cooks would argue that your intuition and personal preference are better guides for preparing food than any precise recipe. It's the joy of tasting and tinkering that creates magic in the kitchen, and mistakes are part of the process. You just may stumble onto something better than what's in the cookbook. Cooks may even point out that you can't always follow a recipe word for word. For example, baking at higher altitudes is much different than baking at sea level because altitude affects moisture content. It's too early to tell whether the intelligent spoon will provide this level of assistance.

While cooking may become intuitive with practice, the intelligent spoon could really help new or clumsy cooks get off on the right foot in the kitchen. They may eventually learn from their intelligent spoon and feel more comfortable cooking on their own. However, the intelligent spoon is not for sale yet, and there are several unknowns, including cost and ease of use. Smart appliances in general are not widely available in the marketplace, in part because they're usually more expensive than traditional appliances and because they can be more complicated to use. Many people may not trust a smart appliance when the regular one works just fine. In the case of the smart spoon, not everyone will want to drag a laptop into the kitchen for cooking advice.

Still, we may see the smart spoon in stores one day -- there are already some sensor spoons for sale. These spoons, meant to be used in the microwave, change color as the meal heats. When you see that your spoon is a certain color, you'll know it's time to eat. One variation is specifically for babies, so that a caregiver will know if food is too hot for a baby's gums.

For more information on this try

http://electronics.howstuffworks.com/spoon-taste-food1.htm

Tuesday, January 22, 2008

Robot Skin gets sensitive

Robots have all the luck. Researchers have been working overtime to make sure that robots will have soft, sensitive skin that feels pain and pressure. Now kindly DARPA researchers are making sure that human beings will also have sensitive artificial skin.
In 2005, Goddard technologist Vladimir Lumelsky worked on sensor-embedded skin coverings for robots. This high-tech skin (see photo) is essential for robots doing space exploration, because humans and robots will have to work together under widely varying conditions. Robots need to be sensitive.
Takao Someya led a group of University of Tokyo researchers that same year in creating a robotic skin consisting of pressure-sensing and temperature-sensing networks laminated together (see photo). The resulting artificial skin can detect both properties simultaneously. The transistors used in the circuits and the semiconductors both use "organic" materials based on chains of carbon atoms, so maybe robots will "feel" a little bit more human.
How will artificial skin conduct sensation for human beings? Vertically aligned carbon nanotubes can make the surface of prosthetic devices feel like the covering you were born with. A nanotube is delicately balanced on top of gold filaments and threaded through a rubbery polymer. The polymer/nanotube skin will transfer heat from the surface to the sensor network below as quickly as your own skin communicates this information. The nanotubes enhance the piezoelectric effect inherent in the polymer. The sensors can then generate a signal to the brain.
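As a very rough, hypothetical sketch of that pressure-to-signal step: a piezoelectric material produces a charge proportional to the force applied to it, and the readout electronics see that charge as a voltage. The coefficient, patch size and capacitance below are placeholder values, not measurements from the DARPA or Goddard work:

    # Rough sketch of a piezoelectric patch turning pressure into a voltage signal.
    # All material constants are placeholders, not data from the projects described above.
    d33 = 30e-12          # piezoelectric charge coefficient (coulombs per newton), polymer-like order of magnitude
    area_m2 = 1e-4        # a 1-square-centimeter sensing patch
    capacitance_f = 1e-9  # patch capacitance in farads

    def touch_voltage(pressure_pa):
        force_n = pressure_pa * area_m2    # pressure over the patch becomes a force
        charge_c = d33 * force_n           # piezoelectric effect: charge proportional to force
        return charge_c / capacitance_f    # the voltage the readout electronics would see

    for label, pressure_pa in [("light touch", 1000), ("firm press", 20000)]:
        print(f"{label}: ~{touch_voltage(pressure_pa) * 1000:.1f} mV")

In this toy model a light touch produces a few millivolts and a firm press tens of millivolts -- a graded difference the sensor network could pass along as a signal.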
Source: www.livescience.com