Friday, October 28, 2011

#MARKETS: "Jobs' Secret Formula at Apple Revealed"

Steve Jobs' death ends an era at Apple, but his legacy in innovation is matched by his execution of solid business practices in a market filled with casualties. In particular, Apple's margins on its top-selling products lead the industry, but analysts question whether Jobs' vision and high-margin execution strategies can be maintained without him.

Steve Jobs’ genius may prove irreplaceable at Apple, according to IHS iSuppli, which recently polled the opinions of its analysts, asking the question: "Can Apple Stay on Top Without Jobs?"
The experts disagree on how much Apple's stunning successes have depended on Jobs’ skills--and even which skills were most important. Consequently, IHS iSuppli turned to the hard facts in an attempt to trace the company's success to its source. The most important factual explanation for Apple's successes, according to senior analyst Wayne Lam, is Apple's margins.
According to Lam, Apple's highest-volume product--the iPhone--enjoys hardware margins approaching 70 percent. That helps make Apple the industry's top money-maker and, in terms of market capitalization, its most valuable company, according to Lam, who claims that Apple now has "more cash reserves than the U.S. Treasury."
"Apple’s iPhone product line has historically enjoyed very high hardware margins [of 60 to 70 percent]. In contrast, other leading smartphone brands typically get hardware margins of 50 percent or less," Lam said. "Another way of looking at this issue is that Apple's iPhone has consistently had the highest average selling price on the market today--around $645 while their competitors hover at around $400.”
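Hardware margin here is simply the gap between average selling price (ASP) and hardware cost, expressed as a fraction of ASP. A minimal sketch of that arithmetic, using the article's $645 ASP and 70 percent margin as illustrative inputs (the actual per-unit hardware costs are not given in the piece):

```python
def hardware_margin(asp, hardware_cost):
    """Hardware margin as a fraction of average selling price (ASP)."""
    return (asp - hardware_cost) / asp

def implied_hardware_cost(asp, margin):
    """Per-unit hardware cost implied by a given ASP and margin."""
    return asp * (1.0 - margin)

# Illustrative only: a $645 ASP at a 70 percent hardware margin
# implies roughly $193.50 in hardware cost per unit.
cost = implied_hardware_cost(645.0, 0.70)
```

By the same arithmetic, a competitor selling at $400 with a 50 percent margin would be clearing about $200 per unit--less than a third of Apple's implied gross per handset.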
According to Lam, Apple's high margins can be directly traced to Jobs' "laserlike focus," which enabled Apple to envision a clear path to success for new products. And once launched, Jobs did not allow naysayers to divert Apple's course, but saw product development efforts through to the end.
"Jobs didn’t care about quarterly reports, and he didn’t care what critics and the media said," noted Lee Ratliff, principal analyst, broadband and digital home for IHS. "All he cared about was implementing his vision. The force of his personality in this regard was enormous."
For instance, Apple did not invent the mouse and icon (that was done at Xerox PARC). However, Jobs did reinvent the graphical user interface that employed them for the Mac, then repeated that same model for success with the iPod, iPhone and iPad.
"Apple [took] all these products to a new level by reinventing how people interact with them," said Bob Braverman, senior director, communications and consumer electronics, at IHS. "This is entirely because of the visionary way in which Steve Jobs saw these markets and their usage models."
But can Apple maintain that methodology without Jobs? The past is littered with the remains of now-defunct companies that had seminal visions but lacked the wherewithal to sustain them once the visionary leadership was gone. However, there are also success stories like Ford Motor Company that show it is possible to instill the founder’s personality into a corporate persona.
"Jobs has been compared to Thomas Edison and Henry Ford," said Dale Ford, senior vice president, electronics market intelligence for IHS. "These men were able to carry on their vision and legacy because the institutions they established lasted after their deaths. The question now is whether Apple has learned enough from Jobs that it can continue on with its success--like Ford Motor Co. did after the death of Henry Ford."
IHS claims the jury is still out, and will remain out for many years as Apple struggles to maintain its phenomenal margins against a sea of competitors, intent on clipping its wings.
"Without Jobs at the helm, Apple’s massive margins have got to wane at some point," said Braverman.
Further Reading

#ALGORITHMS: "Primer: BPM's Role in Business Agility"

Purported to be the most important advancement in business management since the Industrial Revolution, business process management aims to turn enterprises into lean, agile, profit-making machines that maximize customer satisfaction, product quality and delivery speed while shortening time to market.

Business process management improves business agility by treating the processes of business as an enterprise asset in the design of lean, efficient organizations and information technology. By taking a holistic approach, rather than concentrating on specific aspects of the business process--like workflow--BPM consolidates and optimizes best-of-breed business and IT practices.
BPM aims to align all aspects of an enterprise's operations with its stated corporate goals--traditionally one of the most difficult aspects of a business to manage, especially in large organizations. The best part of BPM, according to business leaders, is that it achieves efficiency and conformance while also promoting innovation and the integration of new technologies. BPM is also claimed to increase customer satisfaction, product quality and delivery speed while cutting time to market.

Implementing BPM is simplified by software that manages the complexity of the entire process lifecycle with a growing library of BPM assets and dependencies. (Source: IBM)
Modern BPM software aims to solve business problems by automating an enterprise's processes--sometimes called business process automation (BPA)--thus enhancing business agility when responding to changes in customer needs, new regulatory demands and other sources of stress on businesses. Several software suites codify BPM, enabling it to be applied in real time, thereby turning the practice into an optimization process that can be constantly applied to incrementally improve an enterprise's products and services as well as its bottom line.
The processes--basically any business activity that aids in delivering value-added products and services to customers--become the strategic assets of the enterprise, with the BPM software helping them to become clearly understood, managed and constantly improved. BPA software then automates these processes into workflows that include tasks performed both by people in the enterprise and by IT application suites.
For those new to BPM or BPM applied to improve business agility, there are many good sources of information available today. According to the introduction to the "Handbook on Business Process Management" by Michael Hammer, a business process pioneer who was named one of "Time" magazine's 25 most influential people, BPM is the most important new concept in organizational performance since the Industrial Revolution. The six steps to realizing the advantages of BPM, according to the Handbook, are envision (strategically defining corporate goals), design (describing the affected processes), modeling (simulating the processes in software), execution (creating the software and services that realize the processes), monitoring (tracking and measuring the success of each process) and optimization (identifying bottlenecks and realizing process enhancements).
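The Handbook's six steps form a repeating cycle rather than a one-shot sequence: optimization feeds back into a fresh vision. A toy sketch of that loop (the phase names come from the article; everything else is illustrative):

```python
BPM_PHASES = [
    "envision",      # strategically define corporate goals
    "design",        # describe the affected processes
    "modeling",      # simulate the processes in software
    "execution",     # build the software and services that realize them
    "monitoring",    # track and measure the success of each process
    "optimization",  # find bottlenecks and feed improvements back in
]

def next_phase(current):
    """Advance through the BPM lifecycle, wrapping from
    optimization back to envision for the next iteration."""
    i = BPM_PHASES.index(current)
    return BPM_PHASES[(i + 1) % len(BPM_PHASES)]
```

The wraparound in `next_phase` is the point: applied continuously, the cycle becomes the incremental optimization process the article describes.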
As the bridge between business goals and IT, a variety of BPM suites are available to help implement, manage and automate processes, as well as integrate them with existing operational intelligence, business modeling and analytical tools.
Several companies offer integrated sets of software tools for creating, executing and optimizing business processes. IBM, for example, offers a comprehensive Business Process Manager to build, manage and optimize process applications. Usable in service-oriented architecture (SOA) or non-SOA environments, it provides sophisticated visualization tools such as "heat maps" that pinpoint bottlenecks and allow drilling down to root causes. IBM also offers cloud-based integration of social business with BPM in the free-to-try Blueworks Live, with built-in instant messaging, comment sharing and live news feeds.
Further Reading

Thursday, October 27, 2011

#MATERIALS: "Erasable E-Paper Saves Trees, Cuts Costs"

Electronic paper that runs through thermal printers but can be electrically erased and rewritten aims to make tree paper obsolete. The paper makes use of flexible plastic that retains printed text without batteries or moving parts.

With all the movement to green technology, the day was bound to come when ordinary sheets of paper would be replaced with electronic versions that run through printers but can be erased electrically and rewritten over and over. Made of a durable plastic, e-paper uses technology similar to that of reflective e-readers, but in a more flexible form factor that does not require a battery and can be instantly erased.

Further Reading

Wednesday, October 26, 2011

#ALGORITHMS: "Cities Awarded $50 Million to Get Smart"

IBM's Smarter Cities Challenge offers budget-constrained cities relief with free software and consulting services aimed at simultaneously improving a city's economic outlook and its citizen-service delivery. These goals are to be achieved by making each city's operations more efficient with smarter management, planning and forecasting.
Today cities account for more than half of the world's population, and by 2050 forecasters believe that the number of urban dwellers will rise to over 70 percent. As a result, cities are becoming the most important unit of government, charged with delivering services to the vast majority of the world's population. The worldwide recession and overtaxed budgets are forcing cities to tighten their belts, operate more efficiently, and seek smarter methods of managing resources and planning.

Today cities hold more than 50 percent of the world's population, and by 2050 IBM estimates that will rise to 70 percent or more. (Source: IBM)
Relief is being offered by IBM's $50 million Smarter Cities Challenge—a three-year program (2011-2013) that aims to help 100 cities worldwide operate smarter by harnessing new technologies and methodologies to solve long-standing civic challenges. Each 2012 award, which will average $500,000, including all provided services and software, will be made to cities in need that apply before Dec. 6, 2011. Last year, 26 awards were made to cities worldwide, with applications open to the governing bodies of cities speaking English, French, Spanish, Russian, Arabic, Chinese (Simplified) or Japanese.
Winners are visited by top IBM consultants, who spend weeks studying a city's problems to ascertain their needs—from finance to sustainability to public safety to citizen services. After conferring with officials, citizens, local businesses, academics and community leaders, IBM recommends actions that make smarter use of existing resources. Economic challenges, according to IBM, are met head-on with innovative recommendations on how to more efficiently deliver services to citizens. Experts from a variety of fields are brought into the process, including specialists in employment, health, public safety, transportation, social services, recreation, education, energy and sustainability.
Winning cities will be encouraged to become savvy users of the free online portal to city statistics called City Forward. By making public data available to City Forward, public policy experts worldwide can benefit from the success stories of other cities around the world, thus helping to identify local problems and potential solutions.
In 2012 and 2013, IBM will make awards to 76 applicant cities worldwide that demonstrate strong leadership and are willing to collaborate with diverse stakeholders on solutions to their outstanding problems. To read case studies on how the program has already benefited cities worldwide, visit the Smarter Cities Challenge Website, which describes IBM's recommendations to 2011 grant recipients. In North America, for instance, several cities have acted on IBM's 2011 recommendations, including ways to reduce the crime rate in St. Louis (Missouri), more efficiently manage aquaponics in Milwaukee (Wisconsin) and reduce high traffic fatality rates in Edmonton (Alberta).

Further Reading

Tuesday, October 25, 2011

#HEALTH: "Smart ePetri Dish Monitors 24/7"

Biological research may never be the same if the California Institute of Technology succeeds in transforming the way cell cultures are monitored by using a cell phone camera as the platform for a "smart" Petri dish.

Today, the venerable Petri dish for culturing living cells is still in wide use even though it has not changed substantially since it was invented in the 19th century by Julius Richard Petri. Now Caltech wants to bring the Petri dish into the 21st century. By disassembling a smartphone and using its camera and display to monitor and illuminate, Caltech's "smart" Petri dish can monitor cell culture activity 24/7.
Researchers use Petri dishes to grow samples of selected micro-organisms, often merely to multiply their number enough to determine their type. For instance, a sample from a throat swab might be cultured in a Petri dish with a medical technician monitoring it until enough has grown to use in tests to see what variety of bug is making the patient's throat sore.
Often, after samples are grown, they are transferred to many other Petri dishes in which individual tests can be run to determine which drug works best. Here again, a lab technician must visually inspect each Petri dish to see if the culture is still growing--showing immunity to the drug--or whether growth is suppressed by the drug. Many other related uses of Petri dishes are made in other biological research areas, but they all share the same disadvantage of requiring trained technicians to monitor growth.
Caltech wants to change all that by making the Petri dish smart. And what better way to do that than to use the microelectronic components from a smartphone to automate their monitoring? Dubbed "ePetri," the new smart cell-culture dish was announced this month in the prestigious "Proceedings of the National Academy of Sciences."
To be sure, medical test automation is also being pursued by drug companies and other researchers using expensive microchip arrays that can culture as many as a million separate samples simultaneously. However, the Caltech invention shows how even the humblest research lab can profit from individual ePetri dishes using inexpensive smartphone components.
Individual Petri dishes are usually inserted into an incubator with dozens of other samples, each of which must be periodically removed and visually inspected to record how much growth has occurred since the last check. The ePetri dish, on the other hand, can monitor cell growth 24/7 as well as allow a technician sitting at a computer to quickly inspect it without removing it from the incubator. This not only saves time, but also reduces the risk of contamination.
"We can directly track the cell culture or bacteria culture within the incubator," said doctoral candidate Guoan Zheng, who works in the Caltech lab of electrical engineering and bioengineering professor Changhuei Yang. "The data from the ePetri dish automatically transfers to a computer outside the incubator by a cable connection."

Caltech’s ePetri dish uses inexpensive smartphone components.
The poor man's ePetri dish was assembled atop a standard cell phone image sensor chip in a rig constructed from, believe it or not, Lego building blocks. The cell culture is placed directly on the image sensor chip, eliminating the need for the expensive camera and lens system used to monitor cultures in multi-well microchip arrays. The smartphone's screen was mounted above the ePetri dish to illuminate it. An Android app was then used to monitor growth 24/7 and record the images, which are accessed on a computer connected directly to the ePetri by USB.
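The monitoring loop amounts to periodically reading the bare image sensor and estimating how much of its area the culture now covers. A hypothetical sketch of that idea (the threshold, frame format and interval are assumptions for illustration; the actual ePetri pipeline is described in the PNAS paper, not here):

```python
def coverage(frame, threshold=100):
    """Fraction of sensor pixels darker than a threshold, as a crude
    proxy for how much of the chip the culture has overgrown.
    frame: list of rows of 8-bit grayscale pixel values."""
    flat = [p for row in frame for p in row]
    occupied = sum(1 for p in flat if p < threshold)
    return occupied / len(flat)

def monitor(frames, interval_minutes=30):
    """Log (elapsed minutes, estimated coverage) for each
    periodic readout taken inside the incubator."""
    return [(i * interval_minutes, coverage(f)) for i, f in enumerate(frames)]
```

A technician at the connected computer would then watch the logged coverage curve instead of pulling dishes out of the incubator for visual inspection.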
For the future, the researchers want to integrate the ePetri dish with a built-in incubator so that a handheld lab-on-a-chip-type device could be used in a doctor’s office or in the field for remote diagnostics.
Other researchers contributing to the work included Caltech biologist Michael Elowitz, postdoctoral scholar Yaron Antebi and doctoral candidate Seung Ah Lee. Funding came from the Coulter Foundation.

Further Reading

Monday, October 24, 2011

#WIRELESS: "Smart Navigation Mates Cars to Signage"

Crowd-sourced consumer GPS navigation units are feeding a new array of smart transportation services, such as smart signage that recommends detour routes with specific times-to-destination.

Dashboard-mounted GPS units are almost as common as iPods, and now this network of location sensors is crowd-sourcing its resources by feeding into smart transportation systems that improve traffic management and planning.
"Our services are not just for consumers anymore," said Nick Cohn, business development manager at TomTom. "The individual drivers in their cars have more current information about the condition of the road than the government authorities have, but [we] can help close that gap."

TomTom amalgamates real-time traffic information from many sources--including its dashboard-mounted GPS receivers, its iPhone app users and TomTom fleet users--and then gives time-saving rerouting suggestions to travelers with specific times-to-destination instead of miles or kilometers. (Source: TomTom)
As one of a handful of dedicated GPS navigation unit providers, TomTom has also become one of the favorite iPhone applications. Its hardware is also built into the dashboards of Renault, Alfa Romeo, Fiat, Toyota, Mazda and Subaru. Additionally, its traffic data is displayed by the navigation units built by GM, Audi, Mercedes Benz, Ford, BMW and VW. TomTom also has deals to provide GPS tracking and reporting units for fleets worldwide so that management always knows where its cargo trucks are located. All these navigation units are communicating with TomTom's real-time servers, where algorithms run to analyze the data from both consumer and industrial users and provide rerouting suggestions, as appropriate, to drivers.
At the ITS World Congress (held last week in Orlando, Fla.), TomTom described its vision of the future where crowd-sourced information flowing through its servers helps governments around the globe to manage traffic and simplify urban planning as well as help industry fleets and individual drivers find their way about, not by the shortest route in miles, but by the quickest time on the clock.
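Routing by the clock rather than the odometer just means weighting each road segment by its length divided by its currently observed speed before running an ordinary shortest-path search. A minimal sketch (the graph and crowd-sourced speeds are invented; TomTom's actual routing algorithms are proprietary):

```python
import heapq

def quickest_route(graph, start, goal):
    """Dijkstra's algorithm over travel-time weights.
    graph: {node: [(neighbor, length_km, speed_kmh), ...]},
    where speed_kmh would come from crowd-sourced observations."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        t, node = heapq.heappop(pq)
        if node == goal:
            break
        if t > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, km, kmh in graph.get(node, []):
            nt = t + km / kmh * 60  # segment travel time in minutes
            if nt < dist.get(nbr, float("inf")):
                dist[nbr] = nt
                prev[nbr] = node
                heapq.heappush(pq, (nt, nbr))
    # walk predecessors back from the goal
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[goal]
```

With congestion-aware speeds plugged in, a longer detour can beat the direct road: a 10 km segment crawling at 20 km/h takes 30 minutes, while a 15 km detour at 60 km/h takes 15 -- exactly the kind of time-to-destination rerouting the smart signs display.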
"In The Netherlands, for instance, they are putting a freeway underground in a project that will take seven years," said Cohn. "To reroute commuters, they need to give them up-to-date information, but they have no hardware on the ground but their smart signs; we provide all the information to the signs."
TomTom is currently offering real-time and historical traffic information to industrial and government users in 18 countries globally, including Australia, Austria, Belgium, Canada, France, Germany, Ireland, Italy, Luxembourg, the Netherlands, New Zealand, Poland, Portugal, South Africa, Spain, Switzerland, the United Kingdom and the United States. TomTom also promises to continue rolling out real-time traffic services in the remaining countries it serves through 2012.
Further Reading

#ALGORITHMS: "Social Analytics Tracks Baseball Sentiments"

The Annenberg Social Sentiment Index--powered by IBM Social Analytics--is measuring fans' feelings from analytics that scan millions of baseball World Series tweets.

Retargeting analytics from pure business decision-support to tracking social sentiments in the more fun aspects of life--such as the ongoing baseball World Series--is the goal of a new social sentiment index created by the University of Southern California. The USC Annenberg Innovation Lab will present its baseball analytics project at the 2011 IBM Information on Demand and Business Analytics Forum (this week in Las Vegas).
The project's aim is to show how the same analytics that IBM's Watson used to beat human champions on the TV game show "Jeopardy" could be repurposed for interesting social media data mining. Analytics included both semantic and linguistic analysis of real-time posts on Twitter. Past social-sentiments projects by the Annenberg Innovation Lab include journalistic-oriented analytics applied to news stories, movie-oriented analytics predicting successes from film critiques, and retail-oriented analytics identifying trends in fashion shows.
IBM has had a long interest in baseball, as evidenced here in a picture of its team in 1938. (Source: IBM)
The Social Sentiment Index offers a "unique opportunity to gain valuable knowledge in the use of advanced analytics technologies [applied] to real-world settings to understand how this new information can benefit a variety of industries," said professor Jonathan Taplin, director of the USC Annenberg Innovation Lab.
Baseball analytics was recently featured in a major book and film--"Moneyball"--which describes how "big data" analytics was instrumental in the decisions made by Oakland Athletics general manager Billy Beane. And some baseball experts have suggested that the same “Moneyball” approach to selecting players and assembling a team is what got the Rangers and the Cardinals to the World Series over teams with higher payrolls.
"Organizations are realizing the value of analytics to better respond to customer needs, whether it’s analyzing fan sentiment during a sports event, hospital patient data for personalized treatment programs or the latest fashion trends for more targeted marketing campaigns," said Rod Smith, vice president of emerging technology, IBM.
During the World Series, fans are being encouraged to use the Twitter hashtag "#postseason," which simplifies the accumulation of tweets for analysis. Compiled by USC students using IBM Social Analytics technology, over a million tweets have already been analyzed during the National League Championship Series (NLCS). This work has now been broadened for the World Series.
IBM Social Analytics is a relatively new feature in IBM Connections, which is available for compatible Home Pages, Communities and Profile pages. A "Recommendations" widget suggests content that users might find interesting and related communities they might want to join. The "Do You Know" widget recommends people to add to a social network, the "Things in Common" widget identifies others with common interests, and the "Who Connects Us" widget traces the social "path" that connects users.
So far, Annenberg's Social Sentiment Index has found that the St. Louis Cardinals' Chris Carpenter is the most popular player (1,573 tweets, 61 percent of which were positive). Cardinal David Freese came in second (768 tweets, 89 percent of which were positive). Overall, however, the Texas Rangers are the most popular team (over 56,000 tweets, 79 percent of which were positive), with five times as many tweets as the Cardinals (11,500 tweets, 80 percent of which were positive).
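At its simplest, the per-player numbers above reduce to counting hashtag-matched tweets that mention a player and scoring each one positive or negative. A toy keyword sketch of that tally (the word lists and scoring rule are invented for illustration; IBM's actual semantic and linguistic analysis is far richer):

```python
# Invented sentiment lexicons -- stand-ins for real linguistic models.
POSITIVE = {"great", "clutch", "amazing", "love"}
NEGATIVE = {"terrible", "choke", "awful", "boo"}

def tally(tweets, player):
    """Count #postseason tweets mentioning a player and
    return (total, percent scored positive)."""
    total = positive = 0
    for text in tweets:
        t = text.lower()
        if "#postseason" not in t or player.lower() not in t:
            continue
        total += 1
        words = set(t.split())
        if len(words & POSITIVE) > len(words & NEGATIVE):
            positive += 1
    return total, (100 * positive // total) if total else 0
```

The common hashtag is what makes the accumulation step trivial: the stream can be filtered on one string before any heavier analysis runs.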
Further Reading

Friday, October 21, 2011

#ALGORITHMS: "Android Beats Apple With Location-Based Security"

Location-based services usually mean trying to sell something to nearby consumers, but Virginia Tech has invented location-based security for Android-based smartphones or tablets. With a capability that Apple's iOS can't match, the approach allows secure information to be viewed when in a designated vicinity but automatically zeros it out when the user leaves the premises.

Virginia Tech has modified Google's Android operating system to provide a location-based service that maintains privacy by automatically wiping sensitive information from smartphones and tablets.
The approach could be used to provide access to files when a person is in a secure location, but when the user leaves the facility that information is erased. The technique, which is not available for Apple iOS-based devices, could also be used to keep medical records and other sensitive files private, as well as to prevent teenagers from sexting.
The new security algorithm was recently demonstrated to Virginia Tech alumni in an inside-the-beltway group called VT IDEA (Virginia Tech Intelligence and Defense Executive Alumni), which is interested in research that benefits the intelligence and military communities. At VT IDEA the new technology was portrayed as the modern equivalent of the "Mission: Impossible" franchise's self-destructing tapes, which allowed spies to hear secret instructions that were then wiped out. Likewise, with the new Virginia Tech work, any data set can be tied to a physical location when viewed on the specially modified smartphones. All traces of that data are then erased when the user leaves the secure location.
The technology is the brainchild of professor Jules White in Virginia Tech's Department of Electrical and Computer Engineering, who said that the system provides something that has never been available before: "it puts physical boundaries around information in cyberspace."
By fencing in sensitive data, the system is designed to prevent both intentional security breaches and inadvertent leaks caused by tell-tale trails left behind in the caches of browsers and other viewing apps. Android smartphones and tablets are given permission to access sensitive data while in a particular area, but when the devices leave the area, or when a supervisor finds that a device has been lost or stolen, their data can be completely wiped--a level of security that is unavailable elsewhere today, according to White.
"There are commercial products that do limited versions of these things, but nothing that allows for automating wiping and complete control of settings and apps on smartphones and tablets," said White.
Besides providing location-based viewing and wiping of sensitive data, the system also allows the different capabilities of the Android smartphone or tablet to be disabled when in certain areas. For instance, when entering a "for-your-eyes-only" room, the camera of the phone could be disabled to prevent spies from photographing sensitive information.
The same capabilities could also be used by parents to limit when and where their children can use their smartphone camera and email, to prevent distractions at school. Parents can even specify to whom teenagers can send images, to prevent sexting.
Medical records could likewise be tied to a caregiver’s examination rooms, preventing doctors or nurses from walking out with patient records on their Android devices. Camera, email, Web access and other distractions could be blocked from a surgeon's smartphone or tablet in the operating room, to prevent mistakes during surgery (and to prevent photos of famous patients from making their way to the Internet).
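The pattern underlying all of these use cases is a geofence check: measure the device's distance from the authorized site on every location update and wipe the protected data the moment the boundary is crossed. A hypothetical sketch of that logic (the coordinates, radius and in-memory store are invented for illustration and are not Virginia Tech's implementation):

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters (haversine formula)."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

class GeofencedStore:
    """Holds sensitive records only while the device stays inside the fence."""
    def __init__(self, lat, lon, radius_m):
        self.center = (lat, lon)
        self.radius = radius_m
        self.records = {}

    def on_location_update(self, lat, lon):
        """Called on each GPS fix; wipes everything on exit."""
        if distance_m(lat, lon, *self.center) > self.radius:
            self.records.clear()  # automatic wipe on leaving the premises
            return "wiped"
        return "ok"
```

A production system would also have to wipe caches and disable hardware such as the camera, which is why the Virginia Tech work modifies the operating system itself rather than running as an ordinary app.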
The research was underwritten by the Virginia Tech Applied Research Corporation.
Further Reading

Wednesday, October 19, 2011

#SECURITY: "Privacy and Trust On Trial in 2012"

Georgia Tech predicts that Internet privacy and trust will erode in 2012 due to a new era of sophisticated cyber-threats including search poisoning, peer-to-peer botnets and rampant mobile-device breaches.

As cyber-threats become more sophisticated in 2012, users will become increasingly vulnerable to exploitation by malicious websites, stealth botnets, and wholesale commercial use of stolen data caches, eroding user confidence in the Internet.
Georgia Institute of Technology recently released its annual analysis of the cyber threats facing Internet users in 2012. Prepared by its Information Security Center (GTISC) and Research Institute (GTRI), the report was presented at the recent Georgia Tech Cyber Security Summit, where academia, industry, and government IT security specialists gather each year.
"Malicious actors have the ability to compromise and control millions of computers that belong to governments, private enterprises, and ordinary citizens," said Mustaque Ahamad, director of GTISC in the report. "Academia, the private sector, and government must work together to understand emerging threats and to develop proactive security solutions to safeguard the Internet and physical infrastructure that relies on it."
According to the report, security measures are not keeping pace with the bad-guys’ increasingly sophisticated techniques to capture and exploit online user data.

Security guru Bruce Schneier weighs the ROI to enterprises in securing their computers, finding that the optimal level of security balances the cost of breaches against the cost of security measures.
Attacks "are becoming increasingly sophisticated and better funded," said Bo Rotoloni, director of GTRI’s Cyber Technology and Information Security Laboratory (CTISL). "We can no longer assume our data is safe sitting behind perimeter-protected networks. Attacks penetrate our systems through ubiquitous protocols, mobile devices, and social engineering, circumventing the network perimeter."
The most sobering aspect of the report is an emerging trend in which stealth botnets collect vast databases of personal information about users, then package it as if it were legitimately collected information and sell it into reputable marketing channels. According to the report, sales leads qualified in this way can be sold for up to $20 each. As security experts work to plug this hole by targeting malware command-and-control computers, the bad guys are progressing to peer-to-peer botnets that fill out online forms with stolen information to secure new services, such as credit lines that direct funds to the bad guys.
Malware dealers are also gaming search-engine optimization (SEO)--called search poisoning--in order to insert their phony websites into Google results, further eroding user confidence in the Internet. And here too, the bad guys are one step ahead of attempts to prevent malicious use of SEO. For instance, by infecting domain name server (DNS) provisioning systems, they can directly substitute malicious URLs for legitimate ones. And by fortifying infected DNS servers with stolen or counterfeit certificates, they can make it impossible even for security experts to tell legitimate websites for banks and other financial institutions from phony ones designed to collect usernames and passwords.
Last year the USB stick was the easiest way for malware to get around firewalls, but in 2012 the mobile phone will serve the same end of installing stealth malware on otherwise secure computer systems, according to the report. The new frontier in malware--mobile devices--appears even more daunting, as vulnerabilities in wireless browsers for handhelds often endure even after they are discovered. Particularly troublesome is the fact that there is no routine way to install security updates on many mobile devices.
Legions of mobile-phone stealth botnets are already silently growing. And even if all mobile browsers were to adopt a standard security update methodology, the bad guys are already one step ahead with compound threats which combine web browser, email, and text-messaging vulnerabilities to sidestep holes plugged in any one medium.
To combat the mobile threat, Georgia Tech is working with nine mobile browser makers in 2012 to identify and remediate their vulnerabilities.
Further Reading

Monday, October 17, 2011

#ALGORITHMS: "Smart Systems to Top $1 Trillion"

Smart systems that combine microprocessor control, connectivity and a high-level operating system will grow from a $1 trillion market today to $2 trillion by 2015.

Microsoft Embedded argues that all the connected devices in use today provide unprecedented opportunities for fine-grained, real-time business analytics.

Smart systems are proliferating in nearly all fields. And their use covers quite a broad range, including smart household appliances, smartphone navigation apps, smart security apps that identify suspicious activity, and supercomputers that use artificial intelligence to give expert medical or legal advice.
Already there are 1.8 billion smart systems in service worldwide, cutting across every application area under the sun—from personal hygiene to public transportation—but that number will more than double to over 4 billion over the next five years.
"IDC believes this new generation of intelligent systems and its ecosystem will have broad reach and establish the next wave in computing over the next five years," said Mario Morales, vice president of IDC's Semiconductor research program. "Cloud-based applications and analytic workloads will extract significant business value from all of the end-user data."
The idea is that all the electronic devices in the world are streaming information from which business intelligence can be derived—all the devices that users are carrying today, plus all the environmental sensors being deployed worldwide, plus all the surveillance video cameras, plus all Internet-connected computers and wireless devices such as remotely monitored medical implants. All these sources serve as the data streams from which the businesses of the future will derive their business intelligence.
A recent forecast by IDC ("Intelligent Systems: The Next Big Opportunity") claimed that over $1 trillion is already spent on smart systems of all types today, and by 2015 the market could top $2 trillion. IDC claims that market penetration today is about 20 percent, but in five years over one-third of all electronics systems will be smart devices. And that's just the start.
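IDC's figures imply steep but plausible annual growth rates. A quick sketch, using only the revenue and unit numbers cited above (the CAGR formula itself is standard, not from the report):

```python
# Implied compound annual growth rates (CAGR) from the IDC forecast:
# revenue doubles from $1T (2011) to $2T by 2015, and installed units
# grow from 1.8 billion to over 4 billion in five years.

def cagr(start, end, years):
    """Compound annual growth rate, returned as a fraction."""
    return (end / start) ** (1.0 / years) - 1.0

revenue_growth = cagr(1.0, 2.0, 4)   # market doubles over 4 years
unit_growth = cagr(1.8, 4.0, 5)      # 1.8B to 4B units over 5 years

print(f"revenue CAGR: {revenue_growth:.1%}")  # roughly 19% per year
print(f"unit CAGR: {unit_growth:.1%}")        # roughly 17% per year
```

Both rates land near 20 percent a year, consistent with IDC's claim that smart devices will jump from about one-fifth to over one-third of all electronics systems.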
Embedded processors—the electronic brains that instill the intelligence into most smart systems—already outnumber the processor cores used in PCs, servers and mobile phones combined, according to IDC, which predicts that more than 14.5 billion cores will be embedded into smart systems by 2015.
Beyond 2015, IDC predicts accelerated growth sparked by the availability of an ecosystem of hardware, software and services. The smart-systems development tools that will be used to quickly prototype and manufacture new smart systems include programming templates for supercomputer-caliber microprocessors, vast arrays of nanoscale sensors and a wide array of cloud-based apps that extract analytics from the big-data streams in order to provide business intelligence.
"The cloud is the essential link," said Morales, principal author of IDC's report. "Cloud-based applications and analytic workloads will extract business value from all of this end-user data."
Further Reading

Friday, October 14, 2011

#ALGORITHMS: "Smart Systems Standardized by Feds"

Standardizing smart systems is a newly announced goal of the National Institute of Standards and Technology (NIST), the federal agency that has standardized everything from the yardstick to the atomic clock.

A $1 million award to the Institute for Systems Research will team University of Maryland at College Park researchers with scientists at the National Institute of Standards and Technology in a three-year program to create standards, testing methods and measurement tools to consistently rank the performance of smart systems.
Nearly every large system is getting smarter, from the largest nationwide networks to the smallest standalone devices, and even the underlying materials themselves are getting smart. At each level, these smart systems apply the same principles, according to NIST, since each layer includes the three elements of computation, communication and automation. From our household appliances to vehicles to buildings and even the utility grids themselves--all are getting smarter in similar ways.
"While we can expect an ever larger and more diverse range of smart systems and applications [in the future] they all share a basic set of requirements that should not be addressed in stovepipe fashion," said Shyam Sunder, director of NIST’s Engineering Laboratory. "We will take a broad view of these new technologies as we develop standards and measurement tools that apply to all."
NIST already keeps the U.S. "gold standards" for all types of measurements--from units of time to chemical composition. NIST's newest effort will standardize smart systems that combine computers, communications and automation--what ISR calls a cyber-physical system (CPS).
"Investigating and understanding how the cyber-components can be synergistically interweaved with the diverse physical components in CPS pose foundational research challenges in science, engineering and computing," said John Baras, principal investigator and founding director of the Institute for Systems Research (ISR).
NIST and ISR aim to surmount the research challenges of smart systems by identifying obstacles and ascertaining the need for measurement standards to overcome them. (ISR operates from the School of Engineering at the University of Maryland, where more than 100 faculty researchers participate in what is one of six Engineering Research Centers established by the National Science Foundation.) ISR will help design, integrate, test and manage a toolset that uses open standards to enable subsystems from different U.S. manufacturers to interoperate. In addition, ISR will identify existing and anticipated markets, as well as develop a framework to help guide U.S. investments in smart systems.
Besides performance comparison, NIST cited security as a major motivation, since smart systems need to be safe and reliable in order to fulfill NIST's mission to "enhance economic security and improve our quality of life." Smart systems called out by NIST as needing an extra measure of security include building control systems and remotely monitored medical implants, both of which were noted to be vulnerable to cyber threats.
"Smart vehicles, buildings, electric grids and manufactured products combine IT and physical technologies into interactive, self-repairing systems," said Sunder. "We want to help industry ensure that the systems are safe, secure and resilient."
According to NIST, by the end of the decade, more than 50 percent of the cost of cars, planes, machine tools, medical equipment and many other everyday items will be due to their computing, communication and automation capabilities. To create a level playing field against which innovation can be measured, NIST and ISR will create a suite of standards and measurement tools that can quantitatively distinguish the performance of smart systems.
Further Reading

Thursday, October 13, 2011

#ENERGY: "Artificial Leaf Turns Water Into Fuel"

As free as a water lily floating on a lake, solar panels coated with a new catalytic material can harvest hydrogen fuel from ordinary water with no wires attached.

Many energy researchers have proposed using solar cells to generate electricity for electrolysis, which splits water into its component parts--hydrogen and oxygen--that can then be used to power fuel cells. However, researchers at the Massachusetts Institute of Technology (MIT) have combined all those functions into a single free-standing unit, which generates fuel whenever it is submerged in water illuminated by the sun.
Plants generate their own fuel for metabolism by harvesting sunlight, making MIT's new invention something like an artificial water-lily leaf. The true novelty of the device, however, is that it is free standing--a simple single panel with no connection wires to anything outside itself, and yet when submerged in water it immediately starts generating fuel from sunlight.

A flat-panel module with no wires attached uses built-in solar cells to split water into hydrogen fuel and oxygen. (Source: MIT)
The key to its novel behavior is the catalytic material used to coat each side of the free-standing solar panel. Rather than using rare, expensive catalysts like platinum, the team of MIT professor Daniel Nocera used only Earth-abundant materials. On the front side of the silicon solar panel is a layer of cobalt catalyst, which Nocera discovered a few years ago and subsequently licensed to Sun Catalytix, the company he founded to commercialize the technology. Sun Catalytix chemist Steven Reece, Nocera's former graduate student, worked with a team of five other researchers from Sun Catalytix and MIT to adapt the cobalt catalyst to the oxygen-producing side of the new artificial leaf.
The other side of the artificial leaf is coated with a nickel-molybdenum-zinc alloy, which splits off hydrogen from the water molecules. Systems using the technique will work by constructing a water tank with two chambers separated by the two-sided solar panels. Hydrogen can then be collected from one side while oxygen is collected from the other. Each gas can be stored separately and later used to power a fuel cell that recombines them to generate electricity on demand; the hydrogen could even be burned in an internal combustion engine to produce work directly.
Next, the researchers want to increase the efficiency of their artificial leaf, which is only about 2.5 percent efficient. The team is also trying out solar cells made from less expensive materials than crystalline silicon, such as thin films of iron oxide, potentially lowering the price of finished systems.
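The 2.5 percent figure supports a back-of-envelope yield estimate. Only the efficiency comes from the article; the peak insolation, peak-sun hours, and hydrogen heating value below are standard textbook assumptions, not MIT's numbers:

```python
# Rough hydrogen yield for a 2.5-percent-efficient artificial leaf.
# Assumptions (not from the article): 1,000 W/m^2 peak sunlight,
# ~6 peak-sun hours per day, hydrogen higher heating value ~142 MJ/kg.

SOLAR_FLUX = 1000.0      # W per square meter at peak sunlight
EFFICIENCY = 0.025       # sunlight-to-hydrogen, per the article
PEAK_SUN_HOURS = 6.0     # equivalent hours of peak sun per day
H2_HHV = 142e6           # J per kg stored in hydrogen (HHV)

energy_per_day = SOLAR_FLUX * EFFICIENCY * PEAK_SUN_HOURS * 3600  # J/m^2
h2_grams = energy_per_day / H2_HHV * 1000

print(f"~{h2_grams:.1f} g of hydrogen per square meter per day")
```

A few grams per square meter per day explains the push to raise efficiency before such panels could be practical fuel generators.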
Further Reading

#TABLET: "Enterprises Get Ruggedized Secure Tablet"

Tablets used in the enterprise require strong security and other features that many consumer models simply do not offer. One manufacturer is trying to address the enterprise market with the introduction of a bullet-proof professional's tablet.

Touted as the world's first enterprise tablet, the Motorola ET1 is the first of a family of touch screen, ruggedized computers that will be sold to companies wishing to modernize their workforce without compromising on durability, configurability and branding.
Touch-screen tablets are arriving from all directions, each trying to offer all the features of an Apple iPad, but without addressing the issues that prevent enterprises from retrofitting them to the needs of existing personnel. HP promised an enterprise tablet but did not deliver, and Apple seems content to produce delicate instruments that ignore issues such as breakage when dropped.

Motorola's first enterprise tablet (ET1) gives sales, marketing, servicing, warehousing and delivery personnel a ruggedized alternative to consumer-grade tablets.
Motorola Solutions, on the other hand, already supplies businesses nationwide with ruggedized mobile computers running Windows Embedded Handheld (aka Windows Mobile) and Windows Embedded Compact (aka Win-CE). These devices are used by retailers at the point-of-sale, by inventory personnel in the warehouse, and by delivery and service personnel in the field. By adding the ruggedized Android-based ET1 with a seven-inch touch-screen to the mix, Motorola Solutions claims to be offering the world's first enterprise-ready, secure tablet for businesses wishing to "be cool" like an iPad, but without the risk and expense of using a consumer-grade appliance.
The ET1 has all the features necessary for the enterprise, such as extra thick Corning Gorilla Glass that keeps the touch-screen from breaking even if dropped daily. A removable bezel around the ruggedized touch-screen will allow a company to customize the tablet with its own logo and branding medallions. Hot-swappable battery packs allow the ET1 to be shared among users who can keep the device running 24/7. Secure log-in features allow the ET1 to be instantly provisioned depending on a user's security clearances. And a wide variety of accessories--from barcode scanners to magnetic stripe readers--allow the ET1 to substitute for a cash-register for sales personnel and an inventory computer for stock clerks.
Applications that run on Motorola's existing Windows-based handhelds also run on the ET1, including an assisted sales app, a mobile point-of-sale app, an electronic dashboard for managers, a planogram management app, and an item locator. A new HTML5 app development environment called RhoElements was also announced. RhoElements leverages Motorola’s existing PocketBrowser developers’ suite to enable ET1 customers to create their own enterprise apps for deployment on the Android-based tablet or on Motorola's Windows Embedded Handhelds.
The first version of the ET1 is WiFi only for applications within the four walls of an enterprise, but Motorola Solutions said that wide-area network (WAN) versions will be available too in 2012. The ET1 will be delivered to large enterprise customers, such as retail chain stores, in the fourth quarter of 2011, in time for the Christmas season.
R. Colin Johnson has been writing non-stop daily stories about next-generation electronics and related technologies for 20+ years. His unique perspective has prompted coverage of his articles by a diverse range of major media outlets--from the ultra-liberal National Public Radio to the ultra-conservative Rush Limbaugh Show.
Further Reading

Tuesday, October 11, 2011

#3D: "Perfecting 3-D Microchips"

YOU’VE HEARD THE HYPE: The foundation of semiconductor fabrication will be transformed over the next few years as multistory structures rise up from dice that today are planar. After almost a decade of major semiconductor engineering efforts worldwide aimed at making the structures manufacturable, three-dimensional ICs are poised for commercialization starting next year—several years behind schedule.

Chip makers have spent the past several years perfecting the through-silicon vias that will interconnect 3-D ICs. Now that TSVs have been honed for 2-D tasks, such as transferring data from the front side of a planar chip to bumps on the flip side, the stage is set for 3-D ICs using stacked dice.
Further Reading

#MATERIALS: "Make Waves, Not Electricity, to Save Power"

Today the crystalline structures in microchips conduct electrical current, wasting power as heat when electrons scatter on their way through the semiconductor during computations. Charge-density waves, on the other hand, burn less power by encoding data on modulations in the semiconductor's crystalline lattice.

Researchers at the University of California at Riverside have received a $1.5 million grant to encode data as charge-density waves instead of electrical current, thereby cutting the power requirements for digital electronic devices.
Charge-density waves have been known for almost a century, but this UC Riverside research group claims to be the first to use them to encode information. By encoding information as the collective states of a semiconductor's crystalline lattice, instead of an electrical current, the researchers aim to drastically lower the power needed to perform computations in semiconductors. In that way, many more charge-density wave computations can be performed using the same amount of power as conventional computers.
Encoding data on charge-density waves was the brainchild of electrical engineer Alex Balandin, chairman of the materials science and engineering program at UC Riverside, and recipient of this year's Pioneer of Nanotechnology Award from the IEEE Nanotechnology Council. Balandin is collaborating on the project with fellow Professor of Electrical Engineering Roger Lake, as well as University of Georgia Professor of Chemistry John Stickney.
The new charge-density wave efforts will create a material and data encoding technology that complements the conventional silicon transistors used today, allowing the new devices to be fabricated on the same chips using conventional complementary-metal-oxide-semiconductor (CMOS) manufacturing processes. Operating at room temperature, and without requiring any specialized materials such as ferroelectrics, charge-density waves can process digital information with far less resistive loss than today's current-based electronics.
Prototypes that prove the concept have been built in Balandin's Nano-Device Laboratory using materials grown in Stickney's lab. Various combinations of the phase, frequency and amplitude of charge-density waves are being tried out to encode data. One of the most promising encoding algorithms so far uses interference among multiple charge-density waves, which has the potential to encode massive amounts of information that could then be processed in parallel using much less energy than today.
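To see what encoding data on wave phase means in principle, here is a loose software analogy: each bit becomes a cosine whose phase is 0 or pi, and decoding works by interfering (correlating) each wave with a reference. This illustrates phase encoding in general; it is not a model of the UC Riverside group's charge-density-wave physics.

```python
# Conceptual sketch: encode bits as wave phase, decode by interference
# with a phase-0 reference wave. Pure illustration, not the actual
# charge-density-wave mechanism.
import math

def encode(bits, samples=64):
    """Each bit becomes one cycle of a cosine: phase 0 for a 0 bit,
    phase pi for a 1 bit."""
    waves = []
    for b in bits:
        phase = math.pi if b else 0.0
        waves.append([math.cos(2 * math.pi * n / samples + phase)
                      for n in range(samples)])
    return waves

def decode(waves, samples=64):
    """Correlate each wave with a phase-0 reference: constructive
    interference (positive sum) means 0, destructive means 1."""
    ref = [math.cos(2 * math.pi * n / samples) for n in range(samples)]
    return [1 if sum(r * w for r, w in zip(ref, wave)) < 0 else 0
            for wave in waves]

bits = [1, 0, 1, 1, 0]
assert decode(encode(bits)) == bits
```

The appeal of interference-based decoding is that many superposed waves can, in principle, be read out collectively, which is the parallelism the researchers hope to exploit.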
The $1.5 million award was won in the "Nanoelectronics for 2020 and Beyond" competition, a joint effort of the National Science Foundation and the Nanoelectronics Research Initiative of the Semiconductor Research Corp. (SRC—a technology research consortium whose members include Intel and IBM).
Further Reading

#ALGORITHMS: "Free Business Intelligence From SaaS"

A popular platform-as-a-service (PaaS) business intelligence provider now offers a free software-as-a-service (SaaS) version for personal use. Called Cloud Personal, the SaaS lets individual users access MicroStrategy's business analytics suite before committing their company to its PaaS.

Business intelligence helps make sense of the glut of information streaming into enterprises, providing visualizations that help executives pinpoint relevant trends. MicroStrategy's platform as a service (PaaS) provides these services to its enterprise customers, but a new software-as-a-service (SaaS) version called Cloud Personal gives individual users free access to the company's business analytics.
"Cloud Personal combines data discovery, mobile business intelligence and social data sharing," said Kevin Spurway, vice president of cloud intelligence at MicroStrategy. "It's a 100 percent self-service environment running on our cloud."

MicroStrategy's free SaaS version of its business intelligence apps called Cloud Personal gives users the ability to produce visualizations like this graphic illustrating aircraft schedule tardiness.
MicroStrategy's main business is its PaaS, which installs on an enterprise's own server to provide its users with access to business intelligence and analytical tools. Cloud Personal SaaS offers a free subset of those capabilities to users who can bring their own data or can use a set of public data sets to evaluate its capabilities. Cloud Personal provides both analytics and a rich set of visualization tools that can be used to populate interactive dashboards for posting on a blog, Facebook or Twitter, or to share with mobile users of MicroStrategy Mobile apps for iOS devices.
Cloud Personal eliminates the need for IT to process many business intelligence tasks that users can now perform themselves, according to MicroStrategy. Users merely upload their data sets to the MicroStrategy Cloud platform, where its Visual Insight algorithms are triggered to analyze and then visualize results that uncover trends. Users can then construct dashboards to present their results using visualizations that can then be shared with colleagues.
MicroStrategy claims the user-configured analytics and visualization capabilities of its Visual Insight engine can match the results obtained by trained analysts using specialized software tools. Users can view results from any browser, or by using MicroStrategy's Mobile apps for iPhone and iPad.
For users who want to kick the tires before uploading their own data, public data sets from The World Bank and the National Center for Education Statistics are already preloaded into MicroStrategy's Cloud platform. Users can also view the results already obtained by other users in the MicroStrategy Cloud Personal Gallery, where each month's best dashboard receives a prize, and where MicroStrategy's own 20 example dashboards can also be viewed.
Further Reading

Monday, October 10, 2011

#WIRELESS: "Enterprise Tablet Ready for Business"

Touchscreen tablets for the enterprise were unveiled by Motorola Solutions Inc., maker of push-to-talk radios and ruggedized wireless terminals for enterprises as diverse as first responders, warehouse managers and point-of-sale clerks.

The new Enterprise Tablet (ET1) joins a family of wireless terminals that allow firefighters to see around corners, buyers to visualize their entire supply chain, and salespeople to assess their inventory, display product features, and make sales by scanning barcodes and credit cards.
Further Reading

Friday, October 07, 2011

#ALGORITHMS: "Cloud Security Hardened with Hardware"

There's a new sheriff guaranteeing security to cloud users with a footprint so small that malware has no place to hide from it. Called the Strongly Isolated Computing Environment by its IBM and North Carolina State inventors, SICE uses the hardware on x86 multi-core processors to isolate secure cloud resources.

Professor Peng Ning, of the computer science department in North Carolina State University's engineering school, wrote the roughly 300 lines of code that are all that must be trusted to ensure security.

Cloud security has special needs--especially in public clouds, where your data may be stored and processed on the same servers as your competitors'. However, by repurposing the System Management Mode (SMM) hardware on x86 multi-core processors, IBM and North Carolina State University claim that secure isolated partitions can be easily managed--results that will be presented at this month's ACM Conference on Computer and Communications Security (Oct. 17-21, 2011, Chicago, Ill.).
Today, malware can gain a foothold in thousands of lines of vulnerable code, any single entry point of which could yield unauthorized access to an entire computing environment. IBM and North Carolina State's Strongly Isolated Computing Environment (SICE), on the other hand, reduces the vulnerable-code footprint from thousands of lines to just a few hundred that guarantee isolation among users.
"Our approach relies on a software foundation called the trusted computing base, that has approximately 300 lines of code," said Professor Peng Ning, of the computer science department of the engineering school at North Carolina State University. "Only these 300 lines of code need to be trusted in order to ensure isolation."
Ning worked with the IBM T.J. Watson Research Center to create SICE. Using the SMM hardware of x86 multi-core processors, the researchers crafted a trusted computing base that manages secure, isolated environments in which to run separate users' jobs. Any malware that gets into a user's applications or data will not impact other users, and can be easily "flushed" when detected by closing and reloading that user's jobs.
The SMM runs below even the hypervisor and operating system, and answers only to code in the firmware BIOS plus the aforementioned 300 lines of SICE's trusted computing base. By isolating each workload on a separate core, independently of the hypervisor, immunity to malware is combined with peace of mind about the security of sensitive data being processed in public clouds. SMM isolates and secures each separate computing environment so that its data exists only while in use and is never exposed to other cloud users, the inventors claim.
The SICE framework adds about 3 percent overhead to system performance for most jobs, but slows down when direct network access is required by an application. The researchers said their next task was to optimize performance overhead for multi-core processors that require direct network access.
SICE was created by Ning with Xiaolan Zhang, a member of the technical staff at IBM’s T.J. Watson Research Center, and NC State doctoral candidate Ahmed Azab. Funding was provided by the National Science Foundation, the U.S. Army Research Office and IBM.
Further Reading

#MEDICAL: "iPhone Microscope Diagnoses Disease"

Researchers recently demonstrated how easy it could be to remotely diagnose disease anywhere in the world by merely snapping a close-up lens onto your iPhone and photographing a sample of blood.

Pollen (left) and plant stems (middle, right) are shown from an expensive microscope (top) and with an inexpensive add-on lens for an iPhone (bottom). (Source: University of California, Davis)

Medical and scientific measurements of all types are done with optical microscopes, which snap photos of blood samples that are then evaluated by screening technicians who refer suspect images to full-fledged doctors.
Researchers at the University of California, Davis have demonstrated an under-$50 close-up lens for the iPhone that can do the same thing--namely, screen photos of blood samples, then refer suspect images to remotely located doctors over the phone's 3G Internet connection. The team's proposed add-on for the iPhone will be described in detail this month at the Optical Society of America's (OSA) annual meeting, Frontiers in Optics (FiO) 2011 (San Jose, Calif., Oct. 16-20, 2011).
With all the sophisticated medical diagnosis techniques available today, optical microscopes are still the workhorse technology, with sales of over $3 billion in 2011 and growing at a rate of about five percent, according to BCC Research LLC. The group predicts the optical microscope market will exceed $4 billion by 2016.

Today microscopy depends on "micron range" optical photography of blood and other patient samples, which can be achieved with a humble iPhone, according to Sebastian Wachsmann-Hogiu, a physicist with UC Davis' Department of Pathology and Laboratory Medicine at the Center for Biophotonics, Science and Technology. The 1.7-micron pixel size of the iPhone's camera, combined with an inexpensive 5X magnification lens, gives the iPhone a microscope function with just the micron-range resolution needed to diagnose patient samples.
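The micron-range claim is easy to sanity-check from the two figures the article gives. With 5X magnification, each 1.7-micron sensor pixel samples a 0.34-micron patch at the sample plane; the standard two-pixel Nyquist criterion (an assumption here, not a claim from UC Davis) then puts the smallest resolvable feature well under a micron:

```python
# Sanity check of the "micron range" resolution claim using the
# article's figures. The two-pixel Nyquist criterion is a standard
# rule of thumb, not from the UC Davis team.

PIXEL_UM = 1.7       # iPhone sensor pixel size in microns (article)
MAGNIFICATION = 5.0  # ball-lens magnification (article)

sample_pixel = PIXEL_UM / MAGNIFICATION  # microns per pixel at sample
smallest_feature = 2 * sample_pixel      # Nyquist: ~2 pixels per feature

print(f"{sample_pixel:.2f} um/pixel -> features down to ~{smallest_feature:.2f} um")
```

Red blood cells are several microns across, so this resolution is enough for the cell counting and shape inspection described below.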
For developing nations, an add-on microscope for smartphones could enable health workers to send in photos of patient samples for doctors to diagnose remotely. In fact, there are other smartphones with integrated microscopes and medical diagnostic apps, such as the LifeLens project to identify malaria in the field with a Windows-based smartphone microscope. But the UC Davis team's aim was to demonstrate how inexpensively a bare-bones system could be made.
The first-generation prototype was set up to need no additional hardware at all--just a drop of water atop the iPhone's camera lens, which magnified images as well as the ball lens that was finally chosen for the team's prototype.
"The water formed a meniscus, and its curved surface acted like a magnifying lens," said Wachsmann-Hogiu. "It worked fine, but the water evaporated too fast."
A one-millimeter ground-glass ball lens--an under-$50 sphere that acts as a magnifying glass--was used in the prototype constructed by Kaiqin Chu, a post-doctoral researcher at UC Davis. The ball lens was mounted in a rubber surround that was fitted over the iPhone’s own camera lens.
The images were not as high-resolution as those from a laboratory microscope, but doctors at the UC Davis Medical Center were still able to make diagnoses by counting blood cells and noting their shapes, such as the banana shape typical of sickle cell anemia.
Next the team wants to further improve the lens resolution to enable diagnoses of other samples, such as skin, as well as develop apps for automatic screening, such as counting the number of red and white blood cells in a sample. They are also developing an inexpensive spectrometer add-on that could potentially identify the chemical signatures of substances by their spectra, such as reading out how much oxygen is in a patient's blood.
Further Reading

Wednesday, October 05, 2011

#ALGORITHMS: "Autism Traits Prove Valuable for Software Testing"

Smarter software-debugging services are being performed by savants--people with high-functioning autism whose intense focus and superlative technical abilities make them shine at the task.

Asperger's syndrome, a form of autism that preserves linguistic skills and sometimes features exceptional cognitive development, turns out to be a boon for tedious, time-consuming software-debugging tasks, according to Aspiritech NFP.
People with Asperger's syndrome--raised to popular awareness by movies like "Rain Man," in which Tom Cruise played the brother of an autistic savant portrayed by Dustin Hoffman--are often better than ordinary programmers at complex mathematics and tedious computer troubleshooting. Unfortunately, their poor social skills put them at a disadvantage when interviewing for programming jobs and getting along with other employees.
Oran Weitzberg has a form of high-functioning autism, called Asperger's syndrome, which enables him to happily spend long hours performing software debugging tasks that are stultifying for ordinary programmers.
Aspiritech's mission is to demonstrate that many individuals with autism have savant-like skills with computers that can be channeled into successful businesses debugging software. Modeled on a similar company called Specialisterne, which has high-profile customers like Oracle and Microsoft, Aspiritech has recruited a growing team of exceptional programmers specializing in computer-software testing procedures. Aspiritech and Specialisterne are just two of a growing number of software-testing companies in Belgium, Japan and Israel that have been recruiting high-functioning adults with autism.
According to Aspiritech, its programmers have unique talents that make them exceptional software testers. These talents include attention to details, superlative technical aptitude and the "ability to thrive" while performing repetitive, task-oriented jobs that ordinary programmers find stultifying. After receiving initial funding from donations to their nonprofit organization, Aspiritech has since built up a portfolio of nine satisfied customers who report exceptional results from the team.
Aspiritech's board of directors includes social service providers, therapists, a vocational expert and a software engineer. The nonprofit also received start-up advice and consultation from Keita Suzuki, who has co-founded a similar company, called Kaien, in Japan. Aspiritech has hired and trained seven recruits with Asperger's syndrome. These recruits have since worked on software-testing projects for smartphone and cloud-computing applications. Aspiritech now offers functional-, compatibility- and regression-testing, as well as test-case development, with experience in cloud-computing platforms including Salesforce.
This year, 60 percent of Aspiritech's funding came from donations and just 40 percent from billable hours to clients; however, the company aims to raise that ratio to 50:50 next year. Aspiritech prices its services in the same ballpark as offshore software testing companies, and pays its programmers up to $15 per hour while providing a relaxed atmosphere that encourages the development of social skills.
Further Reading