ENERGY | WIRELESS | NANOTECH | MEMS | OPTICS | QUANTUM | 3D | CHIPS | ALGORITHMS

Tuesday, May 31, 2011

#ALGORITHMS: "Cloud Computing Expands Reach with iDataCenters"


Around the world, data centers are popping up to service the continued expansion of cloud services into mainstream computing. Following the lead of Google, IBM, Amazon and others, Apple continues to prep massive data centers east and west for its new thrust into cloud computing services. The debut of Apple's iDataCenters and its new cloud computing services, rumored for the coming weeks, will include the rollout of its iCloud.com domain.

The first cloud service to be offered by Apple will likely be dedicated to streaming music to iPads, iPhones, iPods and even PCs running iTunes—but that is just the start. Apple is also planning to retool its MobileMe service to provide a "storage locker" in the clouds not only for music, but for any file—enabling its iOS-based devices to store massive amounts of personal and corporate data that can be quickly accessed from the clouds as if it were stored locally.

The eastern Apple iDataCenter (not to be confused with the data center management app by the same name) houses a massive half million square feet of server farm and support facility space in rural North Carolina. Apple’s eagerly awaited cloud music service will most likely be streaming music from this new data center in Maiden, N.C. The iDataCenter was personally welcomed to the state by Gov. Bev Perdue, who signed an agreement that gives Apple a state tax credit worth $46 million.

The sprawling iDataCenter in North Carolina is rumored to be setting Apple back a cool billion dollars, and will draw an astounding 100 megawatts of power, which had to be negotiated with Duke Energy Corp. (Charlotte). The installation will boost Apple's cloud computing capabilities with five times more throughput than its existing 100,000-square-foot data center in Newark, Calif.

Apple has already reached agreements with EMI Group, Sony and Warner Music Group regarding its plans to stream audio to mobile device users who own those songs, once the labels authorize them for streaming. The only remaining holdout is Universal Music Group—the last member of the big four—which is expected to sign on before Apple unveils iCloud.com at its Worldwide Developers Conference next week (June 6-10 in San Francisco).

Apple is also planning a new data center in Santa Clara, Calif., which is being built by DuPont Fabros Technology—a developer of wholesale data center space. This smaller 11,000-square-foot facility consumes 2.3 megawatts. Apple has signed a seven-year lease on the space, which is near the company’s Cupertino headquarters. The new western iDataCenter is rumored to be housing a reinvented MobileMe service retooled to offer storage locker space for cloud computing tasks.
Further Reading

Thursday, May 26, 2011

#MEMS: "High-temp MEMS goes seismic"


Analog Devices Inc. is offering a dual-axis accelerometer capable of withstanding up to 175 degrees Celsius (347 degrees Fahrenheit) for ruggedized industrial applications. The device is based on what ADI calls the world's first high-temperature micro-electro-mechanical system (MEMS) technology. Today, applications using an accelerometer in a high-temperature environment, such as tools used in geological down-hole measurements, require complex compensation circuitry to ensure that readings are not skewed by temperature. The new iMEMS ADXL206, on the other hand, has virtually no quantization errors or other non-monotonic behaviors over its entire operating range, from -40 degrees to +175 degrees Celsius, according to ADI (Norwood, Mass.).
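To see what that compensation burden looks like, here is a minimal sketch of the first-order offset-and-gain correction a legacy high-temperature design might apply in software; the drift coefficients are hypothetical placeholders, not values from any ADI datasheet, and this is exactly the step the ADXL206 is claimed to make unnecessary.

```python
# Hypothetical first-order temperature compensation for a legacy accelerometer.
# The drift coefficients below are made-up placeholders; a real design would
# calibrate them per device over its rated temperature range.

T_REF_C = 25.0                  # calibration reference temperature (deg C)
OFFSET_DRIFT_G_PER_C = 0.001    # zero-g offset drift, in g per deg C (assumed)
SENS_DRIFT_PER_C = 0.0005       # sensitivity (gain) drift, fraction per deg C (assumed)

def compensate(raw_g: float, temp_c: float) -> float:
    """Correct a raw reading (in g) for temperature-induced offset and gain drift."""
    offset = OFFSET_DRIFT_G_PER_C * (temp_c - T_REF_C)
    gain = 1.0 + SENS_DRIFT_PER_C * (temp_c - T_REF_C)
    return (raw_g - offset) / gain

# Example: a 0.50 g raw reading taken down-hole at 160 degrees C
print(round(compensate(0.50, 160.0), 3))   # ~0.342 g once the drift is removed
```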
Further Reading

#ALGORITHMS: "Smarter Conservation from Analytics and Cloud Computing"


The Water Pilot Study displayed analytics about each household’s usage patterns, comparing them to average usage and to other engaged members of the program. (Source: IBM, City of Dubuque)

Ecological conservation of precious natural resources, such as clean water, can be made smarter by using cloud computing to track usage patterns and software analytics to encourage voluntary conservation efforts.

Anecdotal evidence has long suggested that consumers will voluntarily change their usage patterns to foster conservation when given clear choices about how to do so. Now the city of Dubuque, Iowa, together with IBM, has used cloud computing and analytics to determine just how much might be saved voluntarily—measuring a 6.6 percent reduction in water consumption and an eightfold improvement in leak detection and response time.

As a part of its Smarter Sustainable Dubuque research initiative, the city of Dubuque—a community of over 60,000 residents—sponsored the Water Pilot Study, which instrumented 151 homes with smart meters for nine weeks. During that time, the city supplied the owners with a real-time readout of their current consumption rate. Households were also given analysis of how they could improve their consumption patterns, along with social networking tools with which they could compete for conservation awards. (An additional 152 control homes were also instrumented, but those households were not given a readout of their consumption.)

The smart meters constantly updated water usage and communicated it to the IBM Research Cloud every 15 minutes. Each household's data set was then analyzed for anomalies, which were reported back to the households to help them understand their consumption. One surprising finding was that 8 percent of the households—12 out of 151—had water leaks that they were unaware of until their house was instrumented and its data analyzed.
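IBM has not published its anomaly-detection logic, but one common heuristic illustrates how 15-minute readings can betray a hidden leak: a household whose meter never drops to zero overnight is probably losing water continuously. The reading format and thresholds in this sketch are assumptions, not details from the study.

```python
# Toy leak heuristic over 15-minute smart-meter readings (gallons per interval).
# Assumption: a healthy household shows some zero-flow intervals overnight; a
# meter that never reads zero between midnight and 5 a.m. suggests a steady leak.

from datetime import datetime, timedelta

def likely_leak(readings, start: datetime, min_flow_gal: float = 0.05) -> bool:
    """readings: gallons used in consecutive 15-minute intervals, starting at 'start'."""
    overnight = []
    for i, gallons in enumerate(readings):
        t = start + timedelta(minutes=15 * i)
        if t.hour < 5:                        # midnight-to-5-a.m. window
            overnight.append(gallons)
    # Leak suspected if every overnight interval shows at least a trickle.
    return bool(overnight) and all(g >= min_flow_gal for g in overnight)

# Example: one day of readings where a leaky toilet wastes ~0.1 gallon per interval
day = [0.1] * 96                              # 96 fifteen-minute intervals in 24 hours
print(likely_leak(day, datetime(2011, 5, 1))) # True
```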

Data was collected and analyzed anonymously, but the consumption patterns, trends and anomalies were shared with both city officials and other community members without revealing individuals' names or addresses. Using a Web portal, community members signed on to view their own household usage patterns as well as comparisons with others and overall averages. Online games and competitions were also sponsored to promote sustainable consumption habits and to help consumers perceive the communitywide impact of their efforts, in terms of reduced water bills, fewer gallons consumed and a smaller household carbon footprint.

In total, 89,090 gallons were saved by the 151 households over nine weeks, which would amount to more than 514,740 gallons when extrapolated to a year, or about 3,409 gallons per household annually. If the program were extended to the entire city of Dubuque, which consists of 23,000 households, the smart meters and displayed analytics would save households a total of over $190,930 a year.
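Those totals follow from straightforward proportional scaling of the pilot numbers; the quick check below reproduces them (the per-1,000-gallon rate at the end is a back-calculation implied by the article's figures, not a number from the study).

```python
# Reproduce the article's extrapolation from the nine-week pilot numbers.
gallons_saved = 89_090          # saved by the 151 pilot households over 9 weeks
households_pilot = 151
weeks = 9

annual_total = gallons_saved * 52 / weeks                 # ~514,742 gallons per year
annual_per_household = annual_total / households_pilot    # ~3,409 gallons per year

city_households = 23_000
city_gallons = annual_per_household * city_households     # ~78.4 million gallons per year

# The article quotes $190,930 per year citywide, which implies a water rate of:
implied_rate_per_1000_gal = 190_930 / (city_gallons / 1000)   # ~$2.44 per 1,000 gallons

print(round(annual_total), round(annual_per_household), round(implied_rate_per_1000_gal, 2))
```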

Surveys of household members also revealed that 77 percent thought the Web portal increased their understanding of water conservation, 70 percent said they understood better the community impact of the choices they make, and 61 percent reported that they had made personal changes in the ways they used water, such as taking shorter showers, fixing leaks, purchasing more water-efficient appliances or altering their yard watering time of day.
Further Reading

Tuesday, May 24, 2011

#CHIPS: "Smarter eEyes Focus on Cure for Blind"


Real biological eyes—diagrammed at top—use arrays of retinal cells that are lined up in rows, but their interconnections (middle) use a fractal pattern common in nature, according to University of Oregon professor Richard Taylor. (Source: University of Oregon)

By designing electronic-eye (eEye) implants with fractal interconnects, researchers aim to overcome the mismatch between conventional image chips and the biological retina in bionic eyes. Today, several efforts are under way worldwide to create silicon retinas that can be implanted in the eyes of the blind, thereby enabling them to see again, albeit at vastly reduced resolution. Now researchers are aiming to remedy that by replicating the fractal-like interconnection topology of real eyes.

Real biological eyes contain the equivalent of 127 million pixels, whereas conventional eEyes currently use sensors with fewer than 64 pixels, and even next-generation designs are only aiming for about 1,024. What is even worse, these researchers say, is that the interconnection topology of an image chip is a square array, whereas the interconnection matrix for the "pixels" in a real biological eye is a branching structure called a fractal.

Fractals are common among all living things as a result of growing techniques that repeat a basic set of instructions—a fractal algorithm. For instance, the trunk of a tree divides into branches using the same fractal algorithm that is used for the veins in a leaf. In nature, trees, clouds, rivers, galaxies, lungs and neurons use the same fractal pattern of interconnections.

Now researchers are aiming to replicate this technique for interconnecting imaging elements in eEyes. Today's eEyes just sink metallic electrodes—one for each pixel—into the ganglia behind the eye, an approach that depends on the plasticity of the brain's visual cortex to decipher the output from these new pixels, even though they do not match the normal topology of the biological retina. However, new research efforts are developing a technique that starts with a metallic seed that then grows all the repeated branching structures that in turn mate to the optic nerve behind the eye, thereby delivering to the brain the same kind of signals as retinal neurons.

The specific algorithm harnessed by the technique is called "diffusion limited aggregation," which researchers are using to grow image sensor interconnections that mimic a natural neural topology before being surgically implanted and interfaced to the optic nerve.
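Diffusion limited aggregation itself is easy to simulate: random walkers wander until they touch the growing cluster and stick, producing exactly the kind of branching Brownian tree described above. The toy grid model below is only an illustration of the algorithm, not the team's actual metal-growth process.

```python
# Minimal diffusion-limited aggregation (DLA) on a 2D grid: random walkers
# stick when they land next to the cluster, growing a fractal "Brownian tree".
import math
import random

SIZE = 61
CENTER = SIZE // 2
LAUNCH_R = 28                                  # walkers start on this circle
grid = [[False] * SIZE for _ in range(SIZE)]
grid[CENTER][CENTER] = True                    # metallic "seed" at the center

def touches_cluster(x, y):
    return any(grid[y + dy][x + dx]
               for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
               if 0 <= x + dx < SIZE and 0 <= y + dy < SIZE)

def add_particle():
    while True:
        angle = random.uniform(0, 2 * math.pi)           # launch from a circle
        x = CENTER + int(LAUNCH_R * math.cos(angle))
        y = CENTER + int(LAUNCH_R * math.sin(angle))
        while 0 <= x < SIZE and 0 <= y < SIZE:
            if touches_cluster(x, y):
                grid[y][x] = True                        # stick to the aggregate
                return
            dx, dy = random.choice(((1, 0), (-1, 0), (0, 1), (0, -1)))
            x, y = x + dx, y + dy
        # walker wandered off the grid without sticking: relaunch it

for _ in range(150):
    add_particle()

print(sum(row.count(True) for row in grid))    # cluster size: seed plus stuck walkers
```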

This summer Professor Richard Taylor and doctoral candidate Rick Montgomery will begin a yearlong quest with Professor Simon Brown at the University of Canterbury in New Zealand to grow these metallic fractal interconnection topologies for the backside of silicon image chips.

Instead of just providing a single output for each pixel, as with conventional eEyes, image sensors with the fractal interconnects will connect to the optic nerve with the same overlapping topology used by real biological retinal neurons. As a result, the researchers hope that the brain's visual cortex can perform the same sort of functions for eEyes that it does for real eyes, enabling the blind to recover not just some vision, but a visual experience that rivals that of normal people.
One challenge cited by the researchers is finding metals that can be coaxed with diffusion limited aggregation to form the Brownian trees typical of retinal cells and yet can be safely implanted into humans without side effects. Funding is being provided by the Office of Naval Research (ONR), the U.S. Air Force and the National Science Foundation.
Further Reading

Monday, May 23, 2011

#OPTICS: "Plastic optics boosted to 25 Gbit/s"


A new vertical-cavity surface-emitting laser (VCSEL) technology from VI Systems aims to extend the reach of cheap plastic fiber optics, with scientists at Georgia Institute of Technology reporting successful operation at 25 Gbits per second. VI Systems (Berlin) reports that it has achieved 40 Gbit/s in the lab and is aiming for 100 Gbit/s performance.
Further Reading

Sunday, May 22, 2011

#ENERGY: "Smarter Hydrogen Fuel Maker Mimics Plants"

The emerging hydrogen economy depends on finding smarter ways to generate the volatile gas from plentiful natural resources, such as splitting water into hydrogen and oxygen with sunlight. Smarter hydrogen generators will ditch precious metals for fields of silicon nanopillars etched with semiconductor fabrication equipment, thus realizing the dream of cheap, abundant hydrogen fuel generated from water and sunlight a la plants.


Silicon nanopillars—each 2 microns in diameter—etched with semiconductor fabrication equipment substitute for expensive platinum electrodes, enabling cheap hydrogen generation from water and sunlight. (Source: Technical University of Denmark)

According to the Department of Energy (DoE), we should be mimicking the way plants generate their own fuel from water and sunlight, but unfortunately the price of conventional electrolysis is too high due to its use of expensive platinum catalysts. To realize the dream, DoE-funded researchers are now fabricating tiny micrometer-sized pillars of cheap, abundant silicon to take the place of expensive platinum catalysts, thus promising to bring down the price of hydrogen fuel and enabling widespread commercialization.

Plants use photosynthesis to produce a fuel (adenosine triphosphate) from sunlight and water, which is then stored until it is needed for respiration, growth and other normal cellular operations. The "hydrogen economy" concept mimics this operation by using sunlight to drive an artificial photosynthesis-like action more accurately termed photo-electro-chemical (PEC) water splitting.

The result could be abundant, cheap hydrogen gas that can be stored indefinitely without the need for batteries, then converted into energy on-the-fly either by burning it directly in engines or using it to produce electricity in fuel cells.
Today most hydrogen fuel is produced from natural gas, which unfortunately releases carbon dioxide as a by-product. However, if artificial photosynthesis can be perfected, then hydrogen fuel could be produced from nothing more than water and sunlight, making it cleaner and cheaper than any conventional fuel.

Unfortunately, the easiest way to split water into hydrogen and oxygen makes use of expensive platinum catalysts, but now the SLAC (originally Stanford Linear Accelerator Center) National Accelerator Laboratory, working with Stanford University and the Technical University of Denmark, believes it has eliminated the need for expensive catalysts in favor of microscopic fields of pillars etched in silicon.

The key to the researchers' hydrogen generation method was their discovery that depositing nanoscale clusters of molybdenum-sulfide molecules onto the fields of silicon pillars enabled them to split the hydrogen from the oxygen in H2O (water) when exposed to sunlight. The resulting "chemical solar cell" was found to work as well as conventional designs that use expensive platinum catalysts.
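For reference, photo-electro-chemical water splitting boils down to the textbook half-reactions below; in this scheme the molybdenum-sulfide clusters stand in for platinum on the hydrogen-evolving side.

```latex
% Standard water-splitting half-reactions (textbook chemistry, not specific to this work)
\begin{align*}
\text{anode (oxygen evolution):}\quad & 2\,\mathrm{H_2O} \;\rightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \\
\text{cathode (hydrogen evolution):}\quad & 4\,\mathrm{H^+} + 4\,e^- \;\rightarrow\; 2\,\mathrm{H_2} \\
\text{overall:}\quad & 2\,\mathrm{H_2O} \;\xrightarrow{\;h\nu\;}\; 2\,\mathrm{H_2} + \mathrm{O_2}
\end{align*}
```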

Now the researchers are working on a mechanism that separates the hydrogen from the oxygen generated, thus allowing each to be separately stored until needed as fuel for combustion or to produce electricity in a fuel cell.

Jens Nørskov at the DoE's SLAC National Accelerator Laboratory worked on the project with researchers at Stanford University and a team at the Technical University of Denmark led by Ib Chorkendorff and Søren Dahl.
Further Reading: http://bit.ly/NextGenLog-ij31

Friday, May 20, 2011

#ENERGY: "Algae creates hydrogen fuel"

Algae can produce hydrogen fuel from water and sunlight, with a little boost from man-made nanoparticle catalysts, according to engineers at the U.S. Department of Energy's Argonne National Laboratory. By commandeering the photosynthetic machinery that algae use to harness the energy of the sun, the researchers say, the organisms can produce abundant fuel to power an emerging hydrogen economy.


Chemist Lisa Utschig tests a container of photosynthetic proteins linked with platinum nanoparticles, which can produce hydrogen from sunlight. Tiny bubbles of hydrogen are visible in the container at right.


Led by Argonne National Lab chemist Lisa Utschig, working with colleague David Tiede, the team at Argonne's Photosynthesis Group recently demonstrated how its platinum nanoparticles can be linked to key proteins in algae to coax them into producing hydrogen fuel five times more efficiently than the previous world record.
Further Reading: http://bit.ly/NextGenLog-l63E

#ALGORITHMS: "Cloud Makes 3D Models from Aerial Photos"

Cloud-based services are enabling fast, cheap, large-scale three-dimensional models of almost any landscape. The models are generated from easy-to-obtain aerial photos from drones—unmanned aerial vehicles.


Unmanned drones can take thousands of aerial photographs today, but stitching them together has required human expertise and sophisticated high-end software. (Source: EPFL)


New software from EPFL spinoff Pix4D automatically generates 3D models from aerial photos. (Source: EPFL)

Unmanned aerial vehicles (UAVs) are becoming inexpensive enough for small businesses or even individuals to use, permitting thousands of aerial photographs to be snapped of points of interest. Unfortunately, the high-powered analysis software required to stitch together aerial photos is outside the budget of all but large corporations. Now a new genre of inexpensive cloud-based services is appearing, capable not only of stitching together those patchworks of photos, but even of automatically interpreting what they see, thereby generating three-dimensional (3D) models on the cheap.

The Pix4D project does just that. A spin-off of the Ecole Polytechnique Federale de Lausanne (EPFL) in Switzerland, Pix4D was named for its ability to transform the fourth dimension—time—into a method of generating 3D models from 2D images shot by aerial drones. By harnessing time, a UAV with a digital camera can take thousands of photographs from the air, capturing every possible angle of view of objects on the ground. Without smart cloud-based computing resources, however, these aerial photos would have to be hand-assembled, and even then they would only yield a flat 2D map of the area photographed.

Pix4D software running in the clouds, on the other hand, not only automatically stitches together thousands of 2D images to make accurate maps, but can also infer the 3D information needed to make a model that can then be viewed from any orientation. The cloud service works with the geo-tags on each image, comparing them with those taken at nearby times and locations, resulting in a stunning 3D model of whatever is imaged using a relatively inexpensive cloud-based service.
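Pix4D has not published its pipeline, but the geo-tag step can be sketched simply: each photo's GPS fix is used to pick out nearby shots as candidates for the expensive feature-matching stage, so the heavy image comparison only runs on pairs likely to overlap. The photo records, thresholds and function names below are assumptions made for illustration.

```python
# Toy geo-tag pre-filter: only photos taken close together in space and time are
# paired up for the expensive feature-matching / 3D-reconstruction step.
import math
from itertools import combinations

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters between two GPS fixes (haversine)."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def candidate_pairs(photos, max_dist_m=60.0, max_gap_s=120.0):
    """photos: list of dicts with 'name', 'lat', 'lon' and 'time' (seconds into flight)."""
    pairs = []
    for a, b in combinations(photos, 2):
        close_in_space = distance_m(a["lat"], a["lon"], b["lat"], b["lon"]) <= max_dist_m
        close_in_time = abs(a["time"] - b["time"]) <= max_gap_s
        if close_in_space and close_in_time:
            pairs.append((a["name"], b["name"]))
    return pairs

flight = [
    {"name": "img_001.jpg", "lat": 46.5200, "lon": 6.6320, "time": 0},
    {"name": "img_002.jpg", "lat": 46.5203, "lon": 6.6324, "time": 5},
    {"name": "img_150.jpg", "lat": 46.5400, "lon": 6.6600, "time": 900},
]
print(candidate_pairs(flight))   # only the two nearby, back-to-back shots pair up
```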

The Pix4D cloud service accepts a stream of related photos from which it generates a 3D model in as little as 30 minutes. The service not only automatically generates the 3D maps, but also adds points of interest that can be cataloged by users. To demonstrate the service, Pix4D took 50,000 photos of its host city—Lausanne, Switzerland—and created the world's highest-resolution 3D model of the city. The Pix4D user interface then allows users to navigate to any location in the city and view it from any orientation.

Don't have a ready UAV? EPFL has spun off another startup, senseFly, whose pint-sized aerial vehicle is currently being used to take high-resolution photos for many applications, from farmers who wish to survey the evolution of their crops over large distances and long periods of time to archaeologists hunting for evidence of as yet undiscovered ruins.
Further Reading: http://bit.ly/NextGenLog-mkG1