
Tuesday, August 30, 2011

#3D: "All-in-One Solution Offers Complete Stereoscopic Production"

By including a 3D Web camera along with a stereoscopic display, LG has created the world's first all-in-one 3D laptop for creating, viewing and sharing stereoscopic imagery on its smartphones, TVs and glasses-free monitors.



LG's new all-in-one 3D laptop houses the first complete stereoscopic production studio for making, showing and posting 3D social media online for viewing on its smartphones, TVs and glasses-free monitors.

Most stereoscopic displays today use active-shutter technology, which displays left- and right-eye images in quick alternating succession but requires users to wear expensive battery-powered LCD shutter glasses whose lenses alternate between black and clear. Unfortunately, replacing the batteries is not the only hassle: some viewers have reported seeing "flicker" as the active-shutter lenses switch from black to clear.

An alternative is an auto-stereoscopic display like the one LG debuted recently in its glasses-free D-2000 computer monitor. For its first all-in-one 3D laptop, however, LG chose a middle ground, using its patented Film Patterned Retarder (FPR) technology, which stripes the laptop screen with an alternating grid of light polarizers. By displaying a similarly striped image--where vertical pixel strips alternate between left- and right-eye views--a pair of passive polarized glasses allows the brain to reconstruct the original 3D image. Unlike active-shutter glasses, which can cost up to $100, passive polarized glasses cost no more than a pair of inexpensive sunglasses. A 3D Sound Retrieval System (SRS) is also included, completing the stereoscopic experience.
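To make the idea concrete, here is a minimal sketch (illustrative only, not LG's software) of how a stripe-interleaved stereo frame could be composed for a patterned-retarder display, assuming simple image arrays and the column-wise striping described above:

```python
# Illustrative sketch (not LG's code): compose a stripe-interleaved
# stereo frame for a patterned-retarder display. Even pixel columns
# carry the left-eye view, odd columns the right-eye view; the FPR
# film polarizes each stripe so passive glasses route it to one eye.
import numpy as np

def interleave_stereo(left, right):
    """left, right: (height, width, 3) uint8 views of the same scene."""
    assert left.shape == right.shape
    frame = left.copy()
    frame[:, 1::2, :] = right[:, 1::2, :]   # odd columns from the right eye
    return frame

left = np.zeros((1080, 1920, 3), dtype=np.uint8)       # stand-in left view
right = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # stand-in right view
frame = interleave_stereo(left, right)                  # feed to 3D mode
```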

The 15-inch A530 laptop display can show full high-definition-quality movies--1920-by-1080 pixels--in normal or 3D modes. But the most novel aspect of LG's all-in-one 3D laptop is its inclusion of a stereoscopic Webcam--dual cameras spaced at a distance appropriate for imaging the user against a 3D background. And by including software that marries the 3D Webcam capability to self-publishing on YouTube, the A530 can serve as the world's first all-in-one 3D production studio--at least for non-critical social media applications.

LG's 3D Space Software allows users to view 3D games as well as to create stereoscopic video, photos and movies for viewing on its smartphones, TVs and glasses-free monitors.

Besides being the world's first all-in-one 3D production console, the LG A530 also seems to be a pretty cool laptop in general, with a built-in fingerprint reader for quick log-ons, an Intel Core i7 processor, an Nvidia GeForce GT 555M graphics card, 8GB of RAM, a 750GB hybrid hard drive housing 4GB of flash for fast boots, USB 3.0 and a luxurious brushed-metal exterior complemented by diamond-cut edges.

Further Reading

Monday, August 29, 2011

#WIRELESS: "Social Net Saves Fuel With Smartphone"

Crowd-sourcing the state of traffic lights from dash-mounted smartphones enables smart social networkers to keep their cars rolling through green lights, cutting fuel consumption by up to 20 percent.



Smarter social networking apps could actually save you money on fuel, according to researchers at the Massachusetts Institute of Technology and Princeton University, who recently demonstrated their SignalGuru app. The app watches for red lights, then alerts friends in cars behind the driver so they can adjust their speed and avoid stopping.

The idea behind the MIT/Princeton research project is to crowd-source red-light timing information and then share that information with other drivers. The smarter software works with a dash-mounted smartphone whose camera faces slightly upward, where it can see the traffic lights as a car drives past them. A cloud-based server collects each traffic light's state and enters it into a real-time database. Then, depending on a particular car's current location, the driver is told the proper speed to drive in order to hit the next light while it is green. In trials, SignalGuru users saved an average of 20 percent on their fuel costs compared with drivers who had to stop at lights.
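A toy sketch of the advisory calculation (assumed logic, not the actual SignalGuru code) shows the core idea: given a crowd-sourced fixed-cycle schedule for the next light, pick a legal speed that arrives during the green window:

```python
# Hedged sketch of the advisory idea (not the MIT/Princeton code):
# speeds in m/s, times in seconds; the light's schedule would come
# from the crowd-sourced database of camera sightings.
def advise_speed(dist_m, now_s, cycle_s, green_start_s, green_end_s,
                 v_min=8.0, v_max=16.0):
    """Return a legal speed that reaches the light during green, else None."""
    v = v_max
    while v >= v_min:
        arrival = (now_s + dist_m / v) % cycle_s   # phase within the cycle
        if green_start_s <= arrival <= green_end_s:
            return v                               # cruise at this speed
        v -= 0.5
    return None                                    # no luck: coast to a stop

# 400 m from a light on a 60 s cycle that is green from t=30 s to t=55 s:
print(advise_speed(400, now_s=12, cycle_s=60, green_start_s=30, green_end_s=55))
```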

Where previous experimental traffic-light advisory systems used GPS data or data from traffic sensors, SignalGuru uses visual data from cellphone cameras. (Source: MIT)

The main investigator on the project, MIT doctoral candidate Emmanouil Koukoumidis, notes that automobiles account for 28 percent of energy consumption and 32 percent of carbon dioxide emissions in the United States, so SignalGuru could have a strong, positive impact on both the environment and the economy if enough people start using commercial versions of the app.

Koukoumidis was inspired to create the app by the growing number of drivers who already dash-mount their smartphones to use them as navigation devices; SignalGuru simply repurposes the forward-facing camera to keep an eye out for traffic lights. Eventually, SignalGuru could be integrated with navigation apps to offer not only a recommended speed, but perhaps even route changes that keep the driver hitting green lights all the way.

Koukoumidis, who worked with MIT professor Li-Shiuan Peh and Princeton professor Margaret Martonosi on the project, also envisions crowd-sourcing other information as drivers proceed on their commutes: where the lowest gas prices are available along a driver's route, where parking spaces are open, and even the progress of city buses for smartphone users who are hoofing it.

SignalGuru worked best in Cambridge, Mass., where traffic lights operate on a fixed schedule, allowing drivers to synchronize their driving speed with the changing traffic lights to within two-thirds of a second. The system worked less well in Singapore tests, where traffic lights track how many cars are on the road and vary their timing. Nevertheless, SignalGuru kept Singapore commuters driving through green lights with an error of slightly more than a second, which still enabled drivers to reap significant fuel savings.

Further Reading

Friday, August 26, 2011

#SENSORS: "Tattoo-Like Sensors Monitor Brain and Muscles"

Wearable electronic devices can be deposited on the skin like a temporary tattoo.



Electronics have been successfully transferred from a dissolvable carrier to the skin, enabling tattoo-like sensors to monitor brain waves and muscle actions for everything from remote medical diagnosis to immersive gaming to spy-inspired covert surveillance.

The next time you see an indecipherable tattoo in San Diego, it could be an experimental new electronic sensor being tested by bioengineering professor Todd Coleman. The device was invented by engineering professor John Rogers at the University of Illinois, who worked there with Coleman and with Northwestern University engineering professor Yonggang Huang; Coleman is now perfecting applications for the device at UCSD.

An example of tattoo-like wearable electronic devices. (Source: University of Illinois)

The smarter part of the device is its use of a flexible, water-soluble substrate on which sensors and, eventually, wireless transmitters can be integrated. The substrate deposits the electronics on the surface of the skin and is then dissolved, leaving only the electronics behind--a process similar to applying a temporary tattoo (minus its paper carrier).

In the lab, the researchers have installed the sensors on a subject's skin to monitor the muscle movements of sub-vocalized speech, which were used to control a computer game--a capability the team envisions as a prelude to covert sensors that military surveillance operatives could use to communicate sighting details in complete silence.

Today, the kinds of electrodes that can monitor such muscle motions are bulky and require firm attachment to the skin, often with messy conductive pastes for optimal performance. The wearable tattoo sensors being perfected by Coleman's team are claimed to have all the desirable attributes of standard electrodes, minus the bulk and the messy paste.

Many patients have debilitating conditions that could benefit from 24/7 monitoring. Epileptics, for example, might have their seizures predicted if they constantly wore electrodes, but attacks are too infrequent to warrant the inconvenience of wearing conventional electrodes all the time. Many other medical conditions could likewise benefit from 24/7 monitoring, which Coleman's tattoo-like sensor arrays could make far more convenient.

Next, Coleman's team is working toward integration of the tattoo-like sensors with the rest of the circuitry required to create a wireless sensor node. For instance, by adding an active transmitter to the passive sensors, the device could monitor muscles or even brain waves and use the results for diagnosis or to control communications or even games.

Meanwhile, Rogers' team back at the University of Illinois is developing electronics that can be integrated onto human organs. For instance, the team is working on an electronic eyeball camera that adapts to the curvature of the eye for improved imaging, and another sensor model can be deposited directly on the heart to monitor its beating. Eventually, Rogers hopes to perfect implantable sensors that provide high-resolution mapping of the electrophysiology of the brain, heart and other organs, and that can be integrated with other electronics to correct maladies requiring periodic electronic stimulation.

Funding was provided by the National Science Foundation and the Air Force Research Laboratory.

Further Reading

Thursday, August 25, 2011

#SECURITY: "Five Steps to Securing Internet-Enabled Devices"

So far the most serious security breaches have been from PCs, but embedded system designers are working to prevent Internet-enabled consumer devices from being used as backdoors for future intrusions.



Wind River, makers of a popular Real-Time Operating System (RTOS) called VxWorks for Internet-enabled devices, claims that there will be 50 billion connected devices in use by 2020, any one of which could become the Internet's weakest link.

New forms of attacks and exploits are materializing all the time, according to computer security experts at McAfee, which claims that 55,000 new malware programs and 200,000 zombies are uncovered every day, on top of the 2 million malicious Websites that already exist.

So with more connected devices and growing threats, how do you secure Internet-enabled devices?

"We are working with McAfee to deliver stronger security for embedded devices, using a holistic approach that considers security issues at every layer--from the silicon chips and virtualization used, to the operating-system, network and communication stacks, to the application layer," said Marc Brown, vice president of tools & marketing operations at Wind River.

Wind River suggests taking a five-pronged approach to securing Internet-enabled devices.

The most serious intrusions so far appear to be orchestrated by organized crime and government agencies committing serious crimes, ranging from embezzling money to stealing state secrets to altering the behavior of physical systems, potentially harming equipment and endangering lives.

In 2009, Operation Night Dragon was found to be monitoring energy companies--an advanced persistent threat (APT) that used coordinated spear-phishing emails, Trojans and remote-control zombies to funnel operational details, exploration results and even the contents of sealed bids to command-and-control computers. Also in 2009, Google, Adobe Systems, Juniper Networks, Rackspace and others reported Operation Aurora, which they claim was aimed at accessing valuable source-code repositories. And McAfee's latest exposure was Operation Shady RAT, which used malware to break into secure government computers worldwide.

To secure Internet-enabled devices from becoming the new backdoors for such intrusions, Wind River is advising its clients to take five steps. First, make a thorough threat assessment of the communications capabilities that need to be secured on all new devices. Next, compartmentalize the logical components using virtualization, so they can be separately reset in case intrusions are detected. Then select certified runtime components, such as Wind River's Achilles-certified VxWorks RTOS.

Then, application-level white- and gray-listing needs to be enforced to make sure infected code is never allowed to run. And finally, test and validation suites need to be run periodically to make sure unforeseen vulnerabilities do not creep into code at any point in its lifecycle.
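As a concrete illustration of the white-listing step, here is a minimal sketch (hypothetical, not Wind River code; the digest and loader hook are placeholders) that refuses to launch any binary whose hash is not on an approved list:

```python
# Hypothetical sketch of application-level white-listing (not Wind
# River code): refuse to run any binary whose SHA-256 digest is not
# on the approved list provisioned at build time.
import hashlib

APPROVED_DIGESTS = {
    "<sha-256 digest of an approved image>",  # placeholder entry
}

def is_approved(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() in APPROVED_DIGESTS

def launch(path):
    if not is_approved(path):
        raise PermissionError(f"{path} is not white-listed; refusing to run")
    # ...hand the verified image off to the RTOS loader here...
```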

Further Reading

Wednesday, August 24, 2011

#QUANTUM: "Tunneling Enables 3D Touch"

Spiked nanoparticles ease quantum tunneling to allow the third dimension to be sensed by touch-screen users.



Soon, fingers will be able to sense the third dimension by virtue of a supersensitive thin film that harnesses quantum tunneling to sense force. This will allow novel gaming, artistic and control gestures, such as searches that dig deeper into a file structure the harder the user presses.

Today, smartphones use either inexpensive resistive (Nokia C5) or more expensive capacitive (Apple iPhone) touch-screen technologies, each with its own advantages and disadvantages. A new kind of overlay--called Quantum Tunneling Composite (QTC)--provides the best of both worlds, and in addition enables touch screens to recognize 3D gestures. QTC is already used to add a sense of touch to the fingertips of NASA's Robonaut, and its maker, Peratech, has now licensed the technology to Samsung for 3D joysticks and to Nissha Printing for 3D touch screens in smartphones and tablets.

Spiked nanoparticles aid in quantum tunneling, allowing Peratech's film to sense a continuum between feather-light and heavy touches, thereby enabling 3D gestures.

The QTC Clear overlay can be added to any capacitive touch screen, or can completely replace a traditional resistive touch screen. Capacitive touch screens today cannot sense force; QTC adds a readout of the force a touch exerts on the screen, making on-screen objects easier to manipulate and enabling 3D gesture recognition. The property can be used in applications ranging from novel gaming interfaces to artistic expression with line widths that vary with the force exerted.

Besides enabling 3D gesture recognition, QTC can also drastically cut the power consumed by capacitive touch screens. Today, capacitive touch screens must be constantly "on and ready" to sense touches; with the QTC film added, the capacitive layer can be switched on only when a finger is actually resting on the screen.

QTC overlays are less than eight microns thick and consist of two layers of transparent conductors sandwiched between plates of glass. Unlike resistive touch screens, which must be made of a soft polymer in order to deflect enough to be sensed, QTC overlays can sense, through glass, deflections of just a few thousandths of a millimeter.

The key to the QTC technology is its use of spike-shaped nanoparticles embedded in a transparent insulator. Quantum tunneling works because electrons are actually waves, which take a finite distance to dissipate when they hit an insulating medium. By adding spikes to normally round conductive nanoparticles, the quantum likelihood of an electron wave traversing the insulator in which the nanoparticles are suspended is increased, making QTC overlays much more sensitive than traditional films.
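A back-of-the-envelope sketch shows why this matters (illustrative numbers, not Peratech data): because tunneling probability falls off roughly exponentially with the gap width, tiny compressions of the film yield large, easily measured changes in current:

```python
# Back-of-the-envelope sketch (illustrative numbers, not Peratech
# data): relative tunneling probability ~ exp(-2 * kappa * d) for gap
# width d, so slight compression produces a large current change.
import math

KAPPA = 5.0   # assumed decay constant, 1/nm (order of magnitude only)

def rel_tunneling(gap_nm):
    return math.exp(-2.0 * KAPPA * gap_nm)

for gap in (2.0, 1.8, 1.5):   # the gap shrinks as a finger presses harder
    print(f"gap {gap:.1f} nm -> relative probability {rel_tunneling(gap):.2e}")
```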

Peratech's force-sensitive films are already being used by NASA, the Defense Department and several robot makers to add tactile sensing to everything from textiles to articulated hands. Next, the QTC overlays being readied by Nissha will mark the technology's first foray into touch-screen-based devices.

Further Reading

Tuesday, August 23, 2011

#MARKETS: "Analysis Finds Business Disasters Result From Mimicry"




Innovation drives markets, but innovation without oversight can result in isomorphism, in which one company's breakthrough money-maker turns out to be the downfall of a whole industry, according to a leading business analyst.

Isomorphism in business is the tendency of proven strategies to spread over time, leading industries into similar practices that have been successful for others. Unfortunately, without oversight, a practice that is good for one company can be bad for an industry, in the worst case leading to "terminal isomorphism" similar to what struck the mortgage industry a few years ago.

University of California, Berkeley’s Haas School of Business professor Jo-Ellen Pozner recently used the mortgage meltdown as an example of what not to do--that is, mimic the practices of a successful endeavor in one market or company, while ignoring the basic business oversight that executives normally exercise to restrain overzealous follow-the-leader management practices.

All industries suffer from isomorphism, where success is often emulated using the "imitation is the most sincere form of flattery" principle. For instance, look at the iPhone's success and the subsequent smartphone industry where a large proportion of new models imitate it. That model for success is currently being repeated by all the touch-screen tablets that imitate Apple's iPad.

Growth in mortgage loan fraud, based on U.S. Department of the Treasury Suspicious Activity Report Analysis.

Imitation in and of itself is not a bad thing and often makes sense as a mass-marketing strategy. For instance, Microsoft's MS-DOS imitated an earlier operating system called CP/M from the now-defunct Digital Research, and later Microsoft's Windows imitated the Apple Mac OS, resulting in a highly successful software dynasty.

However, isomorphism goes too far when it emulates practices that do not reflect mainstream business know-how, leading, in the worst case, to an industry meltdown, which Pozner calls "terminal isomorphism."

Pozner gives an example to illustrate this point. "It might be that one firm (firm A) in a given industry starts promoting very junior employees to supervisory positions. Because those places become attractive to the youngest, freshest and most innovative crowd (if that's what those very junior employees represent), those people flock to firm A. Eventually firms B, C and D catch on and begin to do the same. What you would see, eventually, is a bunch of firms racing to get this young talent, overpaying those employees, putting them in roles they are not prepared to handle, and alienating more senior employees. Eventually, those organizations will be so overwhelmed with internal organizational issues they will be unable to spend the time necessary to innovate. So although promoting the "cool kids" was rational at the individual firm level, at the industry level it becomes self-destructive," said Pozner.

In regulated industries, like the one that caused the mortgage crisis, the answer is closer supervision by regulatory bodies; for the rest of the business community, the answer is closer supervision by corporate executives to avoid drifting from "Main Street logic" to "Wall Street logic," according to Pozner.

The research "explains more than predicts, but we hope to increase people's awareness and give them the tools to be more vigilant and identify potential problems earlier on," said Pozner.

Pozner performed the analysis with professor Paul Hirsch at Northwestern University’s Kellogg School of Management, and University of California, Berkeley Haas doctoral candidate Mary Katherine Stimmler.

Further Reading

Monday, August 22, 2011

#ALGORITHMS: "Evolution Has Ossified the Internet"




The Internet is evolving, according to researchers at the Georgia Institute of Technology, but unfortunately extinction has resulted in a rigid structure in which all information is forced through a small set of mid-layer protocols, reducing flexibility and decreasing security. As a remedy, Georgia Tech recommends restructuring the middle layers into a set of nonoverlapping protocols that do not compete with one another and thus will not become extinct as they evolve.

Anyone who has used the Internet for very long knows about its evolution by the number of extinct protocols that are no longer used. For instance, FTP (File Transfer Protocol) used to be the only way to transmit files too large for SMTP (Simple Mail-Transfer Protocol), but clever programmers have devised ways of using server-side algorithms to deliver large files using HTTP (Hypertext Transfer Protocol). As a result, FTP has become virtually extinct on all but legacy systems.

Researchers at the Georgia Institute of Technology wondered if these evolution and extinction phenomena on the Internet were in any way similar to evolution and extinction in nature. After all, protocols could be viewed as species that compete for resources, with the weaker ones eventually becoming extinct. Similarly, the evolution of the Internet's architecture could be described as a competition among protocols, with some thriving and others becoming extinct.

To test their theory, the group headed by computer science professor Constantine Dovrolis crafted a research program that tracks the evolution of architectures, called EvoArch. The overall goal was to help understand how protocols evolve in order to develop better ones that protect the Internet from the wide variety of threats it is facing today and to prevent extinctions that ossify the Internet, making it more vulnerable to attacks. The general conclusion derived from EvoArch was that unless new protocols are crafted to avoid competition, they will inevitably lead to extinctions.

The six layers, from top to bottom, are specific applications (like Firefox), application protocols (like HTTP), transport protocols (like TCP), network protocols (like IP), data-link protocols (like Ethernet) and physical layer protocols (like DSL).

In particular, the six layers of the Internet have evolved into an hour-glass shape where protocols at the very top and bottom continue to evolve, but where those toward the middle have become stagnant, leaving unnecessary security-risk opportunities open for exploitation.

At the top application layer where browsers, email clients, video and audio streamers exist, there is still plenty of diversity and competition among alternatives. Evolution here is still healthy, weeding out the weaker applications and strengthening those with better security. At the application protocols layer, where HTTP, SMTP and newer protocols like RTP (Real-time Transfer Protocol) exist, extinction has eliminated some of the weaker protocols, but enough variety still exists.

In the middle layers, however, extinction has left only a few survivors, ossifying the structure. At the transport layer (layer three), TCP (Transmission Control Protocol) competes with only a few alternatives, such as UDP (User Datagram Protocol), and at layer four, the network layer, IP (Internet Protocol) and ICMP (Internet Control Message Protocol) are used almost exclusively. Diversity resurfaces at layers five and six, where Ethernet and other data-link protocols such as PPP (Point-to-Point Protocol) communicate with a wide variety of physical-layer protocols including DSL (digital subscriber line), coaxial cable and fiber-optic alternatives.

From running simulations with EvoArch, the researchers concluded that the only way to reintroduce diversity into the middle layers without inevitable extinctions is to create protocols that do not overlap with one another. By thus eliminating competition for the same resources, a rich set of middle-layer protocols with increased security should be able to survive.
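A toy simulation in the spirit of EvoArch (not the Georgia Tech code) captures the core rule: a protocol goes extinct when a more valuable same-layer competitor serves mostly the same upper-layer users, while protocols with distinct niches survive side by side:

```python
# Toy sketch in the spirit of EvoArch (not the Georgia Tech model):
# a protocol dies when a same-layer rival serves mostly the same
# upper-layer users and is more valuable.
def evolve(layer, overlap_threshold=0.6):
    """layer: dict proto -> (value, set of upper-layer users)."""
    extinct = set()
    for p, (val_p, users_p) in layer.items():
        for q, (val_q, users_q) in layer.items():
            if p == q or not users_p:
                continue
            overlap = len(users_p & users_q) / len(users_p)
            if overlap >= overlap_threshold and val_q > val_p:
                extinct.add(p)   # q out-competes p for the same users
    return {p: v for p, v in layer.items() if p not in extinct}

transport = {"TCP": (10, {"HTTP", "SMTP"}),
             "XTP": (4,  {"HTTP", "SMTP"}),   # overlaps TCP: goes extinct
             "UDP": (6,  {"DNS", "RTP"})}      # distinct niche: survives
print(evolve(transport))
```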

Further Reading

Friday, August 19, 2011

#SPACE: "Kepler technology spots 'exoplanets'"







In a matter of months after its launch two years ago, the Kepler spacecraft had made more planet sightings than had been made in the entire previous history of astronomy--1,235 and counting. So far, only 17 have been confirmed, but scientists at the National Aeronautics and Space Administration (NASA) are confident that 80 percent will eventually be verified.
Further Reading

Thursday, August 18, 2011

#CHIPS: "IBM demos cognitive computer chips"




By replicating the functions of neurons, synapses, dendrites and axons in the brain using special-purpose silicon circuitry, IBM claims to have developed the first custom cognitive computing cores that bring together digital spiking neurons with ultra-dense, on-chip, crossbar synapses and event-driven communication.
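A hedged sketch of the ingredients IBM names--digital spiking neurons, crossbar synapses and event-driven communication--using a toy leaky integrate-and-fire model (illustrative only, not IBM's design):

```python
# Toy leaky integrate-and-fire network (not IBM's circuitry): spikes
# propagate through a crossbar of synaptic weights, and work happens
# only when a spike event arrives (event-driven communication).
import numpy as np

N = 4
# Crossbar of synaptic weights: weights[i, j] drives neuron i when j spikes.
weights = np.array([[0.0, 0.6, 0.0, 0.9],
                    [1.1, 0.0, 0.7, 0.0],
                    [0.0, 1.2, 0.0, 0.0],
                    [0.8, 0.0, 1.0, 0.0]])
potential = np.zeros(N)          # membrane potentials
LEAK, THRESHOLD = 0.9, 1.0

def step(spiking):
    """spiking: indices of neurons that fired on the previous tick."""
    global potential
    potential *= LEAK             # passive leak toward rest
    for j in spiking:             # event-driven: work only on spike arrival
        potential += weights[:, j]
    fired = np.flatnonzero(potential >= THRESHOLD)
    potential[fired] = 0.0        # reset neurons that fired
    return list(fired)

events = [0]                      # inject one seed spike
for _ in range(4):
    events = step(events)
    print(events)                 # spikes ripple through the crossbar
```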
Further Reading

Wednesday, August 17, 2011

#SECURITY: "China Implicated as Intruder by McAfee"

While never mentioning China by name, a recent McAfee report implicates China as being behind a rash of malware intrusions into government and industry computers in search of state secrets, industrial designs and intellectual property.



McAfee, the Internet security provider, recently released a study that implicates China as the state sponsor behind a rash of malware intrusions into the computers of governments, industry and human-rights organizations, resulting in petabytes of state secrets, industrial designs and other intellectual property being stolen.

"The loss represents a massive economic threat not just to individual companies and industries but to entire countries," said Dmitri Alperovitch, vice president of Threat Research at McAfee, in his report entitled: Revealed: Operation Shady RAT.
While the state actor behind the intrusions is never specifically named, McAfee's report on Operation Shady RAT cites all of China's traditional adversaries as victims, including the governments of the United States, Canada, South Korea, Vietnam and Taiwan, as well as the United Nations.

"Interest in the information held at the Asian and Western national Olympic Committees, as well as the International Olympic Committee (IOC) and the World Anti-Doping Agency in the lead-up and immediate follow-up to the 2008 Olympics [which were held in China] was particularly intriguing and potentially pointed a finger at a state actor behind the intrusions," said Alperovitch in the report.

McAfee "gained access" to a so-called command-and-control server which was logging its intrusions into the computers which were infected with a spear-phishing email that was sent to individuals with access to the targeted computer systems. Once opened, the email triggered the download of malware that initiated a backdoor communication channel to the command-and-control server penetrated by McAfee. This server in turn interprets instructions that had been encoded into the hidden comments embedded in an infected webpage. This same method was used by Operation Aurora--the intrusion into Google's computers which triggered that company's threat to stop doing business in China back in 2010.

McAfee describes its report as "the most comprehensive analysis ever revealed of victim profiles from a five year targeted operation by one specific actor--Operation Shady RAT [a common acronym in the industry which stands for Remote Access Tool]."

McAfee describes the adversary as having been "motivated by a massive hunger for secrets and intellectual property" and describes its treasure trove of stolen documents as "nothing short of a historically unprecedented transfer of wealth--closely guarded national secrets (including from classified government networks), source code, bug databases, email archives, negotiation plans and exploration details for new oil and gas field auctions, document stores, legal contracts, supervisory control and data acquisition [SCADA] configurations, design schematics and much more."

Further Reading

Tuesday, August 16, 2011

#WIRELESS: "Apple Propelling Mobile, Killing Navigation/MP3/Cameras"

Led by Apple's iPhone and iPad, smartphones and touch-screen tablets are propelling mobile broadband, but cannibalizing single-function devices for navigation, playing music and taking photographs.



Smartphones and tablets are driving the mobile broadband market while personal navigation, music players and digital-still cameras simultaneously decline in market share. (SOURCE: IHS iSuppli)

More and more users are moving to smartphones and touch-screen tablets—led by Apple's iPhone and iPad—as their main mobile broadband device, but are then using them to also perform navigation, music-playing and picture-taking tasks, thereby causing those markets to fall into decline.

According to IHS iSuppli, the market for touch-screen tablets is the fastest-growing mobile broadband sector, with over a 57 percent increase in device sales. The mobile broadband market—which also includes laptops, netbooks and e-books—will top 157 million units in 2011, compared with just over 100 million units in 2010. The big boom in mobile broadband sales will cool somewhat in the coming years, according to iSuppli, but will still sizzle at 38 percent growth in 2012 and stay in the double-digit range out to 2015, when it will grow 11 percent to over 350 million units. Overall, the five-year compound annual growth rate in mobile broadband is predicted by iSuppli to exceed 28 percent.
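A quick arithmetic check of those figures (assuming a base of roughly 100 million units in 2010) shows how a compound annual growth rate just over 28 percent lands near the 350-million-unit forecast for 2015:

```python
# Quick compound-growth check of iSuppli's figures (illustrative).
base_2010 = 100.0e6               # assumed ~100 million units in 2010
cagr = 0.285                      # "exceed 28 percent"
units_2015 = base_2010 * (1.0 + cagr) ** 5
print(f"{units_2015 / 1e6:.0f} million units in 2015")   # ~350 million
```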

At the same time, rising smartphone and tablet sales will cut into markets for personal-navigation devices, music players and digital-still cameras, according to IHS iSuppli. Tablets by themselves will lead at a remarkable growth rate of over 72 percent through 2015, while smartphones will grow at 28 percent, and personal-navigation devices, music players and digital-still cameras will flatten or fall into decline.

The iPhone and iPad will continue to lead in sales, with Apple's iPad receiving the highest satisfaction rating of all tablets, according to an IHS iSuppli survey of 1,404 end users this spring. Apple iPad owners rated their satisfaction at 8.8 on a scale of 10, compared with 8.5 for Samsung's Galaxy Tab, 8.4 for Motorola's Xoom and 7.6 for HP's TouchPad. A surprising second place, at 8.75, went to the Zenithink tablet from Shenzhen, China (although its sample set was very small compared with the iPad's).

Among U.S. tablet owners, over 79 percent said they own an Apple iPad, and 61 percent said they have already decided to stick with the Apple brand for their next tablet. Among users who do not yet own tablets, over 50 percent said they would likely choose the Apple brand, with Dell a distant second choice at 11 percent. As a result, IHS iSuppli predicts that Apple's iPad will account for the majority of tablet sales through 2012 and will remain the top-selling brand at least through 2015.
Further Reading

#ALGORITHMS: "Virtual Rat to Cure Human Diseases"

The National Institutes of Health is investing $13 million in a Virtual Rat Project--not to save rats, but to develop analytics that cure human diseases.



Daniel Beard at the Medical College of Wisconsin (Milwaukee) is leading the virtual rat project.

Due to their similar physiology, lab rats have already helped find cures for cardiovascular diseases in humans. But housing, feeding and breeding them is costly. Now the National Institutes of Health is aiming to simplify procedures by creating virtual rats that behave just like the real thing. The five-year, $13 million NIH-funded effort aims to develop models of interventions that can be perfected in rats. The models will then be used to mitigate human disease--from high blood pressure to heart failure.

Unlike the European Union's Virtual Physiological Human project, which aims to perfect models of human biological functions, the Virtual Physiological Rat project aims to determine why genetic predispositions express themselves as disease, and how to prevent it.

Researcher Daniel Beard of the Medical College of Wisconsin (Milwaukee)

"The goal of the rat project is to determine why an individual rat comes to have certain markers of cardiovascular disease, to predict which markers which rats will develop and why, and to engineer new rats to test out predictive understanding," said computational biologist Daniel Beard at the Medical College of Wisconsin (Milwaukee).

Beard, who will be running the Virtual Physiological Rat program, said computer models of rat physiology have already been used to advance the understanding of cardiovascular disease, the leading cause of death worldwide.

The new Virtual Physiological Rat simulation will extend that understanding to environmental factors, enabling researchers to unearth the complex interrelationships between multiple genes and their real-world expression.

Live-rat experiments will still be used, but only for advanced diagnostics aimed at verifying the accuracy of the simulations. As the model grows in complexity, the researchers will be able to run what-if simulations about the effects of interventions, then use experiments to verify or falsify their hypotheses.

The project will begin by bringing together all the knowledge accumulated about rats with known genomes, such as how properly functioning hearts, kidneys, skeletal muscles and blood vessels work together to produce healthy rats. Once a detailed model of a healthy rat's cardiovascular system is completed, the researchers will use the model to make predictions. They will then go into the lab to verify that live rats respond in the manner predicted by the model.

Once a healthy rat's physiology has been successfully tested, Beard's group plans to extend the model to rats with high blood pressure and other cardiovascular diseases, hopefully finding relationships between these conditions, specific genes, and the environmental factors that contribute to their expression. Eventually, Beard hopes to create detailed rat analytics that can recommend early interventions to stop heart disease from developing. To test the prowess of the final analytical model, Beard plans to breed new strains of rats with novel genetics.

"We are using a number of defined genetic strains that have interesting cardiovascular phenotypes that mimic important aspects of human disease. We also plan to engineer new strains as useful throughout the life of the project," said Beard.
By predicting the cardiovascular health of new strains and making recommendations on early interventions, the researchers can follow up with live-rat experiments to confirm or falsify those hypotheses, enabling predictive analytics to be developed and perfected. Beard's team will also include scientists from the United Kingdom, Norway and New Zealand.


Further Reading

Monday, August 15, 2011

#MARKETS: "Era of the PC Waning"




Personal computers were once considered a bottomless market, but the rise of tablets, smart TVs and other Internet-connected devices has finally established an anchor point from which to vault over the PC. First the Internet connected all the world's PCs; now its dominance is sweeping those antiques away as new consumer-oriented devices offer more convenient access to cloud-based services. Although this trend has been predicted before, it was not until the stunning success of Apple's iPad that the cannibalization started in earnest, prompting IHS iSuppli to predict that Internet-connected consumer devices will surpass PCs in unit sales by 2013.

Lumping together all the Internet-enabled consumer devices--from televisions to gaming consoles--IHS iSuppli predicts that their unit shipments will surge from 161 million in 2010 to nearly 504 million in 2013. In contrast, it predicts 434 million units for PCs in 2013 (up from 345 million in 2010). And by 2015, Internet-enabled consumer devices will top 780 million units, according to IHS iSuppli.

The PC market will be eclipsed by Internet-enabled consumer devices by 2013. (Source: IHS iSuppli)

"The Internet is revolutionizing the consumer electronics business by delivering products that can bring Web-based content to homes," said Jordan Selburn, principal analyst for consumer platforms at IHS. "In the future, consumers will be more likely to access the Internet through their televisions than with their PCs."

Selburn predicts that Internet-enabled consumer devices will top 241 million units this year and will grow by another 50 percent in 2012. Last year the top Internet-enabled consumer device was the gaming console, at just over 50 million units, but in 2011 the meteoric rise of touch-screen tablets will shoot past gaming consoles, growing at a rate of 214 percent to almost 62 million units, from under 20 million units in 2010. By 2015, IHS iSuppli predicts, touch-screen tablets will be shipping at 300 million units per year.

For its measurements, IHS iSuppli did not include devices it categorizes as data-processing or inherently wireless, such as PCs and smartphones, leaving televisions, networked Blu-ray players, gaming consoles, set-top boxes, digital media players and touch-screen tablets. Tablets, which could also be grouped in the wireless or data-processing categories, were slotted with Internet-enabled consumer devices because of their role in what IHS iSuppli calls the "connected home."

For instance, Apple's iPad can play music libraries on the home audio system and display video on the television, giving it a central role in the connected home. After media tablets, Internet-enabled Blu-ray players will exhibit the second-fastest growth rate for the connected home, according to IHS iSuppli.

IHS iSuppli's report, "It's 2011--Where's My Connected Home?" details the market watcher's predictions.

Further Reading

Thursday, August 11, 2011

#ALGORITHMS: "Synthetic Biology Redefining Life"


Genetic engineering got an uptick recently when biologists began creating synthetic organisms that will henceforth compete with humans for natural resources, prompting a Presidential Commission to advise "prudent vigilance."

Last year, J. Craig Venter, Hamilton Smith and Clyde Hutchison created the first synthetic bacterial cell capable of self-replication, and the race was on between nature-made and man-made organisms. President Obama immediately formed a commission to study the impact of synthetic biology and to make recommendations. The Presidential Commission for the Study of Bioethical Issues recommends "prudent vigilance," according to Amy Gutmann, its chair, in a report entitled The Ethics of Synthetic Biology: Guiding Principles for Emerging Technologies.

Gutmann discussed the commission's report for The Hastings Center, where several other leading scientists have also provided commentary on the merits and risks of synthetic organisms competing with natural organisms in the environment. According to Gutmann, no new regulations are needed to police synthetic biology, but "responsible stewardship requires that existing federal agencies conduct an ongoing and coordinated review of the field's risks, benefits and moral objections as it matures."

One of the biggest arguments in favor of synthetic organisms is the ability to move beyond mere genetic engineering into a space populated entirely by new biological processes that could serve humans by producing, for instance, synthetic fuels. In that vein, Agilent Technologies recently became the first industry member of the University of California at Berkeley Synthetic Biology Institute (SBI). There, top researchers in health, medicine, energy and new materials will balance the benefits of synthetic organisms against their threat to the environment.

The three pillars of Synthetic Biology according to the European Conference on Synthetic Biology.

Agilent has made a multi-year, multi-million-dollar commitment to assist SBI researchers with its high-precision measurement tools and assaying algorithms, including the active participation of its staff of engineers and scientists. Also contributing to SBI's efforts will be the Lawrence Berkeley National Laboratory (LBNL), which will assist in translating research findings into organisms that can drive fruitful industrial processes, products and technologies.

Currently, SBI employs 33 scientists from eight departments at UC Berkeley and four divisions of LBNL. Projects already underway include synthetic organisms that produce inexpensive drugs and biofuels, entities that directly attack cancer cells, and organisms that aid in purifying water, increase agricultural yields, remediate pollutants and create new miracle materials.

The genie is out of the bottle, according to William P. Sullivan, Agilent CEO and president, who issued the following statement when Agilent joined SBI: "Synthetic biology potentially can have as profound an impact in the 21st century as semiconductor technology had in the 20th."

SBI is led by acting director Adam Arkin of LBNL, working with associate director Douglas Clark, a professor of chemical and biomolecular engineering and executive associate dean of the College of Chemistry at UC Berkeley.

Further Reading

#CHIPS: "New architecture promises better battery"


A change in architecture is promising to close the gap between semiconductor technology and battery technology, which has traditionally lagged behind semiconductors due to its dependence on unchangeable chemical reactions. Instead of storing charge in a main battery--then doling it out to individual devices on demand--a new breed of hybrid capacitor/battery is storing just enough energy for an adjacent device's exclusive use. Ioxus Inc. (Oneonta, N.Y.) says it is solving the "battery problem" by defining a new distributed-energy architecture.
Further Reading

Wednesday, August 10, 2011

#CLOUD: "Hundreds of U.S. Data Centers Closing"



The bulk of the U.S. government data center shutdowns will be on the East Coast, but a total of 30 states will have at least one data center plug pulled.

To save money, the U.S. government will shut down hundreds of data centers across the country and consolidate their services into the remaining centers. The White House Office of Management and Budget recently announced that it will shut down 373 U.S. government data centers by the end of 2012. Over the last two years, the number of U.S. government data centers has quadrupled, and yet they run at only about 27 percent utilization, according to the Office of Management and Budget. The maintenance costs of these data centers--including backup power supplies, air conditioning, fire suppression and special security devices--have been astronomical, causing them to consume 200 times more power than typical office space. By more fully utilizing the remaining data centers, the White House hopes to maintain current service levels while drastically cutting costs.

So far the Administration has shut down 81 of these data centers this year, has a goal of shutting down another 195 during the remainder of 2011, and plans 97 more by the end of 2012, for the total of 373. Beyond 2012, the overall goal is to shut down 800 data centers by the end of 2015, which the Administration claims will save taxpayers over $3 billion annually. The shutdowns are part of the Obama Administration's cost-cutting effort, the Campaign to Cut Waste.

The data centers range in size from a 195,000-square-foot Department of Homeland Security facility in Alabama that is bigger than three football fields, all the way down to four tiny 1,000-square-foot Department of Agriculture data centers all in the same zip code.

The 373 data centers to be shut down by the end of 2012 include 113 used by the U.S. Department of Defense, 44 used by the U.S. Department of Agriculture, 36 used by the U.S. Justice Department, 25 used by the U.S. Department of the Interior, 24 used by the U.S. Department of Homeland Security, 22 used by the U.S. Department of Transportation, 22 used by the U.S. Department of Commerce, 19 used by the U.S. Department of Human and Health Services, 15 used by NASA, 12 used by the U.S. Environmental Protection Agency, 10 used by the U.S. Department of Treasury, six used by the U.S. Department of State, six used by the U.S. Veterans Administration, five used by the U.S. Department of Energy, five used by the U.S. General Services Administration, four used by the U.S. Academic Decathlon, and two each used by the U.S. Labor Department and the U.S. Small Businesses Administration.

Further Reading

Tuesday, August 09, 2011

#ALGORITHMS: "IntraLinks SaaS Community Roadmap Unveiled"




Premier business-to-business deal- and process-management SaaS maker IntraLinks reveals its roadmap for a transformation into a real-time collaboration network.

Already an industry leader in secure, cloud-based SaaS (software as a service) tools, which more than 1 million business users access to communicate, share documents and manage critical business processes, IntraLinks recently revealed its roadmap for transforming into a real-time collaborative network, complete with community-building toolsets.

"Today, we have over 70 enterprise use cases where we provide secure SaaS solutions for high-value transactions, such as mergers and acquisition deals," said John Landy, chief technology officer at IntraLinks. "But looking forward, we are investing heavily in a roadmap that provides enhanced case-management tools for adaptive workflows, community-building, and structured collaboration you can use online or off."

IntraLinks' community-building efforts permit the user base to interact by maintaining profiles, contact lists and analytics regarding opportunities to participate in upcoming projects.
(Source: IntraLinks)
IntraLinks users access the company's services either with their Web browser or with IntraLinks' mobile applications for iPad, iPhone or BlackBerry. Once signed in, users can access templates of best practices that simplify the use of containers to exchange documents relevant to their particular business process. The multi-tenant SaaS solution allows users to search across workspaces to find and exchange documents, no matter where they are stored. Besides its two data centers, IntraLinks maintains two additional SunGard-protected repositories for secure documents, as well as multi-factor authentication protocols and encryption services for users who request the highest level of security.

For the future, IntraLinks has embarked on a year-long transformation process that will be incrementally released over the rest of the year, and which beta-testers are already helping debug.

"Pharmaceutical companies already use us today for their start-up processes when they have a new drug study, making all the documentation available and making sure [participants] have uploaded all the proper forms. Now we want to build additional use cases around that model," Landy said, adding that this is an area where the company plans to make great investments going forward.

The first enhanced SaaS capability will be a new case-management process that IntraLinks calls "guided collaboration," which will provide adaptive workflow monitoring and tracking of due-diligence processes, including approvals, all of which will be managed from convenient dashboards.

The second major enhancement to its offerings will formalize community access for its million-strong user base, allowing individuals to search and find colleagues with whom they have previously worked, or who have skill sets that are relevant to a new project they are starting. The social-networking-style interface will provide both individuals and enterprises with a marketplace-style community, allowing new projects to be offered and skill sets to be hawked using industry-specific tagging schemes.

The third new capability will build on a separate IntraLinks application called Courier, which is used offline to automate the electronic exchange of documents for red-lining and approvals, eliminating the need for faxing or courier exchanges of physical documents. The improved Courier SaaS, now called by its internal code name "Project Collabrio," will be integrated with IntraLinks portals to provide real-time collaboration services, including chat.

Further Reading

Monday, August 08, 2011

#MATERIALS: "Rare Earths Getting Rarer"

Intematix has addressed the problem of rare earth scarcity by moving its production to China, for now, and by developing alternatives to rare earths at its manufacturing plant in the U.S.


The irony about rare earths is that they are not that rare; it's just that the only operating mines right now are in China. Eventually, mines in the U.S., Canada, Australia and many other sites in Asia and Europe will open, but for the next five years price gouging and hoarding are likely to run rampant...

Further Reading

#ALGORITHMS: "XSEDE Cyber-Science to Exceed Teragrid"




The Extreme Science and Engineering Discovery Environment (XSEDE) exceeds its predecessor, the TeraGrid project, by linking the most advanced U.S. supercomputers into a cyber-infrastructure provisioned with simplified user-access software, enabling researchers to tackle more diverse projects.

The National Science Foundation has kicked off a successor to its popular TeraGrid project that links together 17 of the nation's fastest supercomputer and software development centers, with infrastructure support to realize the world's most advanced applications in materials science, medicine, genomics, astronomy, biology and specialty fields like earthquake engineering.

The new Extreme Science and Engineering Discovery Environment (XSEDE) “will expand on the range of services offered by TeraGrid," said John Towns of the University of Illinois's National Center for Supercomputing Applications. Towns, who had a variety of roles in the TeraGrid project, will lead the XSEDE project. "XSEDE will establish a distributed environment that increases the productivity of researchers with collaborative tools and direct access to instrument data in addition to high-performance computing resources."

The result will be a cyber infrastructure that encompasses resources from XSEDE and other providers, creating a living ecosystem of services that researchers and educators can use to develop capabilities that go beyond the raw supercomputer power provided by the previous TeraGrid program.

XSEDE will create immersive 3D simulations like this one of a blowout similar to the one that destroyed the Deepwater Horizon oil rig in the Gulf of Mexico, where "ribbons" of color indicate flows. (Source: LSU Center for Computation and Technology)

"XSEDE will put processes in place that evolve the environment over time as new technologies emerge and as new requirements are understood from the user community," said Towns.

The NSF Office of Cyberinfrastructure will coordinate the addition of new disciplinary areas to engage more research communities than were included under the TeraGrid umbrella. The NSF has already allocated $121 million to kick off XSEDE, which will be led by the University of Illinois's National Center for Supercomputing Applications and will link 16 supercomputer sites: the University of Illinois at Urbana-Champaign, Carnegie Mellon University/University of Pittsburgh, University of Texas at Austin, University of Tennessee, University of Virginia, Southeastern Universities Research Association, University of Chicago, University of California San Diego, Indiana University, Jülich Supercomputing Centre, Purdue University, Cornell University, Ohio State University, University of California Berkeley, Rice University and the National Center for Atmospheric Research.

The Shodor Education Foundation will also participate, helping to develop new software tools that propel scientific discovery. A simplified User Access Layer--managing authentication and job monitoring--will be developed to enable applications to be run on its resources without the detailed knowledge that was typically necessary to make optimal use of previous TeraGrid resources.

More than 10,000 scientists used the TeraGrid supercomputer resources over its 10-year lifetime (2001 to 2011), all at no cost. XSEDE will likewise provide integrated supercomputer, networking and software infrastructure resources at no cost to scientists and engineers nationwide, but will broaden its application base to include new kinds of community-based projects.

Further Reading

Friday, August 05, 2011

#CHIPS: "Authentication Chips Combat Rampant Counterfeiting"




Counterfeiting has spread from credit cards to microchips to circuit boards and entire networking appliances, prompting semiconductor makers worldwide to pioneer a new billion dollar market for smart authentication microchips.

Smart-card makers pioneered the anti-counterfeiting market by perfecting the authentication microchips that ensure credit and security cards are genuine and authorized. Now major microchip makers are diversifying into authentication chips for a variety of markets rife with counterfeiting. The chips can be used in everything from computer circuit boards to the networking devices that connect to cloud computers.

"Counterfeiters are cloning a range of electronic components, prompting networking companies like Cisco to start using authentication chips in their devices," said John Devlin, senior practice director for AutoID and smart cards at ABI Research (Scottsdale, Ariz.). "The market is small now--only around $100 million--but will grow to over $4 billion by 2016."


The authentication microchip market is small today, but will grow rapidly within a diverse anti-counterfeiting market expected to be worth $6 billion by 2016.

Authentication chips cast into hardware the identity-based cryptography protocols that exchange secure keys to guarantee a device really was made by the manufacturer that claims to have made it. Encryption is usually also employed to prevent the kind of eavesdropping on the secure key exchange that counterfeiters could use to fake identity-based protocols. The exact algorithms used are often closely guarded trade secrets of the authentication microchip manufacturers, which include the makers of smart-card microchips, such as NXP, STMicroelectronics, Infineon, Inside Secure, Maxim and Renesas.
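The general idea can be sketched as a simple challenge-response exchange (a minimal illustration using a standard HMAC; the vendors' actual algorithms are proprietary): the host sends a random challenge, and only a chip holding the shared secret can return the correct response, so a cloned board without the key fails:

```python
# Minimal challenge-response sketch using a standard HMAC (the chips'
# real algorithms are proprietary; this is an assumption for clarity).
import hmac, hashlib, os

SECRET = os.urandom(32)   # key provisioned into the genuine chip at the fab

def chip_respond(secret, challenge):
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def host_verify(challenge, response):
    expected = hmac.new(SECRET, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)                                     # fresh nonce
print(host_verify(challenge, chip_respond(SECRET, challenge)))       # True
print(host_verify(challenge, chip_respond(b"cloned-key", challenge)))  # False
```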

Today, microchip-based security solutions amount to just five percent of the total anti-counterfeiting, brand-protection and authentication market, but they are one of its most profitable segments, accounting for as much as 32 percent of total revenue, according to ABI Research. Going forward, the firm predicts that authentication microchips will be the fastest-growing segment of the market for the next five years.

Counterfeit components are being found in nearly every category of device, according to Devlin, who notes that counterfeit components have been detected and replaced even aboard President Obama's Air Force One, prompting the Federal Aviation Administration to introduce special training procedures for spotting counterfeits. The National Electronics Distributors Association estimates that in 2010 the IT industry alone lost $100 billion to counterfeit components.

Secure memory chips have been used for several years in many anti-counterfeiting applications, according to Devlin, but authentication microchips have only been widely available for about a year and a half, permitting almost any type of electronic device to employ smart algorithms that guarantee that they are genuine.

Authentication microchips typically add less than a dollar to the cost of manufacturing an electronic device and are usually mounted on printed-circuit boards alongside other components. Prices are dropping too, which will permit even cost-sensitive devices like radio-frequency identification tags to include authentication protocols in the near future, according to Devlin.

Further Reading

Tuesday, August 02, 2011

#CHIPS: "Freescale processors gain on-chip e-reader"




Earlier this year, Freescale Semiconductor Inc. announced the first line of processors designed to power sub-$99 e-readers. Now it has extended that line downward with integrated E-Ink driver circuitry for low-end devices, from medical and home/office automation to watches whose faces are electronic paper displays (EPDs).
Further Reading

#ALGORITHMS: "Top Trader Bot Beats All Humans"


Human traders don't have a chance against bots, which have been competing among themselves for a decade, resulting in a super-bot whose adaptive aggression annihilates all animates.

A decade has passed since IBM's seminal demonstration that software bots could beat humans at securities and commodities trading. Since then the bots have been competing among themselves, until now, on the 10-year anniversary of the bots' supremacy, an architecture called AA (adaptive-aggressive) has emerged as king of the hill.

The International Joint Conference on Artificial Intelligence (IJCAI), held last month in Barcelona, marked a decade since its 2001 meeting, where software robots (bots) proved their supremacy over humans at stock-market trading. Technically called a continuous double auction (CDA)--since both buyers and sellers set prices on their offers--this mechanism today dominates trading activity on nearly all commodity and stock exchanges. Most institutional investors use bots that are variations on the well-known CDA strategies outlined a decade ago in an IBM paper at IJCAI 2001 entitled Agent-Human Interactions in the Continuous Double Auction.

In that paper, software bots were shown to consistently outperform human traders on real-time CDA markets, in particular by using two trading-agent strategies called ZIP (zero-intelligence plus) and GD (after the economists Steven Gjerstad and John Dickhaut). In subsequent work, the same research group at Thomas J. Watson Research Center reported an improved strategy they called GDX, which outperformed both ZIP and GD when bots were pitted against other bots (as is the case today on most modern stock and commodities exchanges).

The performance of IBM's GDX continues to be used as a figure of merit for commercial bidding strategies, but according to a new paper presented this year at IJCAI 2011, a newer strategy called Adaptive Aggressive (AA) outperforms ZIP, GD and GDX in bot-versus-bot trading. The paper, entitled Human-Agent Auction Interactions: Adaptive-Aggressive Agents Dominate, was presented last month at the conference by professor Dave Cliff and doctoral candidate Marco De Luca, both of the University of Bristol (England).

The AA algorithm differs from the other algorithms in that it aggressively trades off profit for the ability to complete a transaction, then adapts its aggressiveness after each completed buy or sell. By replicating the method used in IBM's original paper to judge the bots, Cliff and De Luca claim to show that AA outperforms all the other strategies in both bot-versus-bot and human-versus-bot trading.
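A toy sketch of that adapt-after-each-trade loop (illustrative only, not Cliff and De Luca's implementation; the interpolation rule here is an assumption) for a buyer with a fixed limit price:

```python
class AABuyer:
    """Toy adaptive-aggressive buyer (assumed interpolation rule)."""
    def __init__(self, limit_price, aggression=0.5, rate=0.1):
        self.limit = limit_price      # never bid above this valuation
        self.aggression = aggression  # 0 = hold out for profit, 1 = deal-hungry
        self.rate = rate

    def quote(self, market_price):
        # Interpolate between a deep-discount bid and the limit price.
        passive = 0.5 * min(market_price, self.limit)
        return passive + self.aggression * (self.limit - passive)

    def update(self, traded):
        # Completed a trade: try for more profit next time; missed: chase.
        self.aggression += -self.rate if traded else self.rate
        self.aggression = min(1.0, max(0.0, self.aggression))

bot = AABuyer(limit_price=100.0)
for market, traded in [(96.0, False), (97.0, False), (95.0, True)]:
    print(f"bid {bot.quote(market):.2f}")
    bot.update(traded)
    print(f"aggression -> {bot.aggression:.2f}")
```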

Further Reading

Monday, August 01, 2011

#MATERIALS: "Wet Electronics Open Door to New Possibilities"


Gadgets, gizmos and wireless wonders must be fastidiously protected from moisture today, but researchers using circuitry with the consistency of Jell-O claim that the smarter electronics of the future will be all wet.

Twice I ran my old Sony Ericsson cell phone through the washing machine, and it miraculously survived--but that is only a testimonial to the device's excellent waterproofing. That all may change soon, when the ultra-secure, moisture-friendly prototypes recently shown by North Carolina State University (NCSU) are commercialized.

Today, electronic devices of all types must be protected not only from submersion in water, but even from humidity in the air. Medical implants, for instance, must be hermetically sealed to keep them from shorting out. By harnessing the synergy between water-compatible hydrogels and liquid metals, NCSU researchers herald a new era of smarter, moisture-compatible electronic devices.

A 2-by-2 array of crossbar switches where memory-resistors at each crossing operate like synapses in the brain. (Source: NCSU)

As you might imagine, materials that can happily be submerged without dissolving or shorting out their circuitry are few and far between. And those that can--such as plastics--have inferior electrical characteristics, making them too slow-reacting for medical implants and other mission-critical electronics that must work rain or shine. However, by combining liquid metals with polyelectrolyte hydrogels, which have the consistency of Jell-O, a new class of fast submersible gadgets is on the horizon.

The key to this invention of NCSU professor Michael Dickey, however, is not the water compatibility of the materials themselves, but rather the ability of the metal--a eutectic alloy of gallium and indium--to form a nonconductive oxide skin when current flows through it. Because that skin can be grown or removed electrically, the switches at each crossing of a crossbar can be programmed to act like synapses in the brain. In effect, these crossbar switches remember their "experiences"--an effect called a memory-resistor, or memristor, by University of California at Berkeley professor Leon Chua, who first described it (the technology is currently being commercialized by HP Labs and Hynix).
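A toy model of that memory effect (illustrative only, not the NCSU device physics): the junction's conductance drifts with the charge that has flowed through it, so each crossing "remembers" past current the way a synapse strengthens or weakens with use:

```python
# Toy memory-resistor (illustrative, not the NCSU device model): the
# junction's conductance falls as charge flows one way, mimicking the
# growth of the oxide skin, so repeated pulses leave a lasting trace.
class ToyMemristor:
    def __init__(self, g_min=0.01, g_max=1.0):
        self.g_min, self.g_max = g_min, g_max
        self.g = 0.5 * (g_min + g_max)        # conductance state

    def apply(self, voltage, dt=1.0, k=0.1):
        current = self.g * voltage
        # Conductance drifts with the charge that flowed this step.
        self.g = min(self.g_max, max(self.g_min, self.g - k * current * dt))
        return current

m = ToyMemristor()
for _ in range(3):
    print(f"{m.apply(1.0):.3f}")   # successive pulses draw less current
```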

Consequently, the new liquid-metal/hydrogel combination can be used to create brainlike circuitry that learns from its environment. The first task of these new water-compatible circuits, however, will be much less ambitious, since for one thing they are still being built on the millimeter scale rather than the micron- and nano-scale of circuitry in the brain. However, simple circuitry can be realized with the new approach to create biological sensors that can be directly implanted for medical monitoring.

NCSU doctoral candidates Hyung-Jun Koo and Ju-Hee So also contributed to the work, which was funded by the National Science Foundation and the U.S. Department of Energy.

Further Reading

#NANOTECH: "Nanocircuits that adhere to any substrate"


Researchers at Stanford University recently demonstrated a novel wafer-scale lift-off process for fabricating nanowire-based circuits on reusable silicon wafers, then transferring them to any substrate in any shape. The research team, led by professor Xiaolin Zheng, claims the flexible circuitry can be used to create anything from paper-thin displays and solar cells to biomedical sensors that attach directly to the tissue being monitored.
Further Reading