
Thursday, September 29, 2011

#CLOUD: "World's Largest Academic Cloud"

Not to be left in the dust of industry's march forward, academia recently unveiled its largest cloud computing platform, aimed at providing the petabytes of storage required for giant scientific simulations and big-data analytics.



The San Diego Supercomputer Center (SDSC) Cloud is connected to 10 other supercomputer sites nationwide on the high-speed TeraGrid network.

The world's largest academic cloud, the SDSC Cloud, serves University of California at San Diego researchers and their associates, including 10 supercomputer centers nationwide connected by the high-speed TeraGrid.
Hosted at the San Diego Supercomputer Center, the SDSC Cloud offers academic researchers the resources to run gigantic simulations and analytics that they could not afford to support on commercial cloud providers.

"The SDSC Cloud may well revolutionize how data is preserved and shared," said Michael Norman, director of SDSC. "Every data object has a unique URL [universal resource locator] and can be accessed over the Web."
The Web-based storage array sustains transfer rates of 8 to 10 gigabytes per second over 768 Ethernet connections, each running at 10 gigabits per second. Storage capacity today is 5.5 petabytes and is expected to grow to hundreds of petabytes, since capacity scales linearly with each added resource.
Conceived in UC San Diego's Research Cyberinfrastructure (RCI) project, the initiative grew in scope to include UC San Diego's Libraries, School of Medicine, Rady School of Management, Jacobs School of Engineering, and SDSC research faculty doing federally funded research projects for the National Science Foundation, National Institutes of Health, and Centers for Medicare and Medicaid Services. All these centers can now share data sets in the same SDSC Cloud.
"The SDSC Cloud marks a paradigm shift," said Richard Moore, SDSC’s deputy director. "One that says 'if you think your data is important, then it should be readily accessible and shared with the broader community'."
The key to the SDSC Cloud's ease of use is OpenStack Swift Object Storage, a program written by Rackspace for large NASA data sets. OpenStack organizes files into objects that are written to multiple physical storage arrays simultaneously, keeping at least two verified copies on different servers at all times. In addition, a Cloud Backup package uses SDSC's CommVault backup service, with continuous automatic data verification and integration with the commercial cloud providers Rackspace and Amazon S3, so that a third copy can be replicated off-site for increased security.
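The object interface itself is simple enough to script against. Below is a minimal sketch using the python-swiftclient library; the endpoint, credentials, and container names are hypothetical placeholders, not SDSC's actual configuration.

```python
# Minimal sketch of storing and retrieving an object in an OpenStack Swift
# store such as the SDSC Cloud. The endpoint, credentials, and container
# names are hypothetical placeholders.
from hashlib import md5

from swiftclient.client import Connection

conn = Connection(authurl="https://cloud.example.edu/auth/v1.0",
                  user="researcher:lab", key="secret")

data = b"simulation checkpoint bytes"
# Swift verifies integrity by comparing MD5 checksums (ETags) on upload;
# behind the scenes it keeps multiple replicas on different servers.
etag = conn.put_object("datasets", "run42/checkpoint.bin",
                       contents=data, etag=md5(data).hexdigest())

# Every object is then addressable at a unique URL, e.g.
#   https://cloud.example.edu/v1/ACCOUNT/datasets/run42/checkpoint.bin
headers, body = conn.get_object("datasets", "run42/checkpoint.bin")
assert headers["etag"] == etag and body == data
```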
Next month the SDSC Cloud will also start transferring huge data sets over simultaneous multiple 10-Gbit per second connections to CENIC (Corporation for Education Network Initiatives in California), ESNet (Energy Sciences Network), and XSEDE (Extreme Science and Engineering Discovery Environment).
The SDSC Cloud can also make use of the other advanced supercomputer services at the San Diego Supercomputer Center, including Data Oasis, which can transfer three terabytes of data per minute, and Gordon, the world's first supercomputer to integrate large flash-based SSDs (solid-state drives) for six-terabyte-per-minute transfers.
Further Reading

Wednesday, September 28, 2011

#3D: "3D Pico Projectors Grow at Double Digits"

As sizes shrink down to palm-size pico projectors, and with a new injection of interest from 3D, double-digit market growth in projectors offers a beacon of hope that global economic uncertainty cannot dim.



Despite worldwide economic volatility, the growing markets for pico-size projectors as well as growing interest in 3D will enable double-digit growth for front projectors from 2011 to 2015, according to Pacific Media Associates.
PMA predicts that front-projector shipments will grow 22 percent over 2010 to top 10.4 million units in 2011, and will exceed 39.4 million units by 2015. PMA claims to have uncovered the technological features and product segments driving this market's fast growth in its recent biennial survey of U.S. front-projector users.

The thumb-size pico-projector's micro mirror is fabricated in the center of a thumbnail-size MEMS chip (Source: Microvision)
According to PMA, the factors driving growth differ among users of three categories of projectors, defined by size and output power.
In the "New Era" category, the biggest draw was the small size of the pico-size projectors enabled by micro-electro-mechanical system (MEMS) chips that have downsized the micron-size mirrors for palm-size projectors. Pico projectors are also going 3D, with the latest announcement coming from Microvision, which now offers a dual-lens laser-based unit that projects 3D in a manner similar to the digital cinema projectors, requiring only inexpensive, passive polarized glasses.
Among users of pico projectors (under 500 lumens), the biggest difference was between enterprises and individual consumers, according to PMA. Organizations cited presentations as their most frequent application 75 percent of the time, compared with only 35 percent of individual consumers. Individuals cited viewing photos as their most frequent application 58 percent of the time, versus 45 percent of organizational users.
The second most common use of pico projectors by both groups was watching videos: 60 percent of organizations and 55 percent of individual consumers used them to project videos. As a result, many models of video cameras, camcorders and even some digital still cameras now include a built-in pico projector. Pico projectors are also being built into high-end smartphones, according to PMA, proving popular in developing countries where a user's phone is the main computing device.
In the "Mainstream" category of 500-to-4,999 lumens at PMA, front projectors were used mainly in classrooms, corporate meeting rooms, and in home theater settings, where nearly 8 million units will be sold in 2011. PMA predicts that mainstream projectors will grow by 50 percent over the next five years to top 12 million units by 2015.
Education continues to be a main driver for mainstream projectors, despite austere government budgets worldwide, but technology is also driving acceptance at corporations. For instance, PC-free presentations are driving corporate users to favor newer devices that include WiFi so that the projector is wirelessly controlled from the presenter’s handheld unit, such as a touch-screen tablet or smartphone.
High-end projectors, with 5,000 lumens of output or more, are mostly used for digital cinema, auditoriums, and scientific simulation and visualization applications. PMA predicts that double-digit growth rates will more than double sales of high-end projectors, from 200,000 units in 2011 to over 430,000 units by 2015, a compound annual growth rate of about 21 percent. In this market, new high-end widescreen models are driving a modernization of existing theaters and auditoriums as well as pioneering emerging markets in developing countries.
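Those endpoints are internally consistent, as a quick sketch of the compound-annual-growth-rate arithmetic confirms:

```python
# Check PMA's high-end numbers: 200,000 units in 2011 growing to about
# 430,000 units by 2015 implies a compound annual growth rate near 21
# percent over the four intervening years.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1 / years) - 1

rate = cagr(200_000, 430_000, 2015 - 2011)
print(f"CAGR: {rate:.1%}")                            # CAGR: 21.1%
print(f"2015 units: {200_000 * (1 + rate)**4:,.0f}")  # 2015 units: 430,000
```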
Further Reading

Monday, September 26, 2011

#OPTICS: "World's First Nanoscale Optical Fibers'

Photonic chips that compute with light instead of electricity--the holy grail of optics--will enable a faster, more sustainable Internet, by virtue of lowering its power consumption with the world's smallest optical fibers.



Researchers have created nanowires generated by direct laser writing, here stacked to form a three-dimensional woodpile photonic crystal with a pronounced stop gap.

At its current rate of growth, the Internet is on track to consume half the world's energy within a decade, according to professor Min Gu, director of the Centre for Micro-Photonics at Swinburne University of Technology. To head off that debacle, Gu's research group has created the world's smallest optical fibers, which he claims will enable not only a faster but also a more sustainable Internet, by virtue of reduced energy consumption.
"The Internet has become a major energy consumer," said Gu. "In the next decade the Internet will account for half of the world’s energy usage--so making it more efficient will make a huge difference to our carbon footprint."

His university, recently ranked among the top 100 worldwide for physics research in the Academic Ranking of World Universities published annually by the Shanghai Ranking Consultancy, is part of the Centre for Ultrahigh bandwidth Devices for Optical Systems (CUDOS), a six-university, Australian-government-funded effort to stem Internet power consumption. The aim of CUDOS is to create an all-photonics Internet that drastically cuts energy consumption by computing with light instead of electricity.
Photonics encodes the binary ones and zeros of Internet communications on pulses of light instead of electronic charge packets. Electricity must overcome the resistance of the wires through which it travels, which causes them to heat up, wasting much of the energy required for communications signals. As a result, all long-haul Internet communications today are performed with optical fibers which carry signals between metropolitan areas faster and without all the heat.
However, once the signals reach their target city, they are translated into electrical signals by expensive, power-hungry converters, which receive the weightless photons of light from the long-haul fibers and translate them into electrical signals that travel down a cable or digital-subscriber-line (DSL) connection to the user's computer. Likewise, between distant metropolitan areas, "repeaters" must periodically translate the optical signals into electrical ones, then use high-powered lasers to re-encode the electrical signals as light pulses that continue down fibers to the next repeater.
The holy grail of photonics is to eliminate the need for repeaters, cable, DSL and even the wires between the chips inside your computer, using nanoscale optical fibers that perform all computing with light and thus never translating weightless, energy-conserving photons into heavy, energy-wasting electrons. With optical fibers small enough, even the communications signals on the processor inside your computer could be made optical, saving energy for computers, routers, and other communications equipment as well as extending the battery life of mobile devices themselves.
In its most recent step toward realizing the dream of all-optical computing, doctoral candidate Elisa Nicoletti, working on a CUDOS project in Gu's lab, created the world's smallest optical fibers. Measuring just 68 nanometers in diameter, the new optical fibers are small enough to be used inside all-optical routers, switches and other future photonic microchips. At one-twelfth the wavelength of the light transmitted through them, the nanofibers were written directly onto microchips with a laser, using a nonlinear material called a chalcogenide.
Other groups have created nanowires out of plastic, a passive material that merely passes the signal along. By using a chalcogenide instead, Gu's group was able to demonstrate nanowires stacked into a three-dimensional "woodpile" photonic crystal with a pronounced stop gap. Such structures can work with laser-powered light pumps to perform all the functions handled today by power-hungry electrical routers and switches.
Further Reading

Friday, September 23, 2011

#SPACE: "NASA'S Planet Hunter Bags a First"




NASA's Kepler spacecraft is logging planet after planet, finding more potential exoplanets in its first month than all previous terrestrial efforts in history. Recently Kepler found the first planet orbiting two stars simultaneously, and it is only just beginning...
Further Reading

Thursday, September 22, 2011

#MEMS: "ICs entering mass consumer markets"




As MEMS enters the mainstream, the high-volume markets will favor mergers and acquisitions over the next five years, as the larger players fill in the gaps in their integrated solutions for OEMs, who must fuse the outputs of multiple MEMS sensors for a wider variety of applications than just tablets and phones.
Further Reading

Wednesday, September 21, 2011

#CLOUD: "Enterprise Telecom Migrating to Cloud"




Telecommunications infrastructure is the latest enterprise asset to begin the move to cloud-based solutions, and over the next five years ABI Research predicts a mass migration of enterprise communications applications to the cloud. Premises-based communications and telecommunications capabilities, including email, telephony, and audio-, video- and Web-conferencing, are steadily shifting to cloud-based solutions, according to ABI Research, which predicts that 41 percent of all enterprise communications application users worldwide will migrate to the cloud by 2016.
"The communications customers-premises equipment [CPE] market will only be growing at a 4.3 percent rate, while cloud communications will be growing at over 21 percent, reaching $8 billion in revenues by 2016," said ABI senior analyst Subha Rama.
The migration trend for enterprise applications infrastructure will steadily shift from customer premises equipment (CPE) to private and public cloud-based solutions, according to ABI Research.
Driving the CPE-to-cloud migration trend, according to ABI Research, is the general adoption of data center-based virtualization technologies, the need to offer a "connected experience" to mobile users of smartphones and tablets, and the potential of lower costs and higher efficiencies for cloud-based solutions.
CPE vendors will feel the squeeze the most, according to ABI Research, as their customer base slowly erodes in favor of public-, private- and hybrid-cloud solutions. Within five years, CPE will have lost 386 million users to virtual infrastructure solutions, with the drain steadily increasing as enterprises gain more confidence in migrating information and communications technology (ICT) to the cloud.
Many enterprises are hesitant to make major investments in the transition to cloud infrastructure, according to ABI Research, which explains that extending the lifetime of legacy CPE solutions will continue as long as the cost of transitioning to the cloud remains relatively high. But as the price tag drops, the cloud transition will accelerate in direct proportion.
According to ABI Research, cloud technology is acknowledged as increasing business agility through infrastructure consolidation, but the difficulties of managing security, exposure and integration with existing infrastructure will favor a steady, orderly transition rather than a quick switch. For the time being, hybrid approaches that allow legacy systems to continue to be useful will dominate, especially for large enterprises that face considerable investment costs to make the transition.
Smaller CPE vendors will feel the squeeze the most, according to ABI Research, since larger CPE vendors are already transitioning their own telecommunications capabilities to the cloud, potentially offsetting their losses to newer cloud-only vendors. The vendors expected to do best will be those that make the cloud transition seamless and without performance penalties, according to ABI's report titled: "Enterprise Cloud Applications and Vertical Analysis."
Further Reading

#WIRELESS: "Subconscious Smartphone Extends Battery Life"




A new subconscious mode for smartphones, announced at this week's Mobile Computing and Networking conference, can improve battery life by as much as 50 percent through a smarter listening algorithm. Savvy smartphone users know they can save battery life by switching off WiFi, but few take the trouble, since it's so convenient to just leave it at the ready to speed up Internet access. Now the new subconscious mode could let smartphone users have their cake and eat it too.
Proposed by University of Michigan professor Kang Shin and prototyped by doctoral candidate Xinyu Zhang, the new subconscious mode can be licensed by makers of smartphones and other WiFi devices to extend battery life by as much as 50 percent.
WiFi speeds Internet connections by running orders of magnitude faster than typical 3G connections, convincing most smartphone users that it is worth leaving on so that acceleration is automatic whenever a familiar WiFi signal is detected. As a result, smartphones spend most of their time in what is called "idle listening" mode, waiting for data packets to send over the nearest, strongest WiFi connection and listening for incoming WiFi communications intended for them.
The team estimates that more than 90 percent of mobile devices with WiFi spend up to 80 percent of their time in idle mode today. The new subconscious mode, on the other hand, cuts power to the WiFi circuits until they are needed, then springs back to life before any data packets can be lost, greatly extending battery life without having to manually turn WiFi on and off.
The official name of the subconscious mode, coined by Shin and Zhang, is Energy-Minimizing Idle Listening (E-MiLi), and its inventors are proposing it to original-equipment manufacturers (OEMs) of all types of WiFi-equipped, battery-powered devices.
The researchers achieved the savings by down-clocking the WiFi circuits to one-sixteenth of their normal rate while in E-MiLi mode, then quickly ramping the clock speed back up whenever a packet arrives to be transmitted or the smartphone's address appears in an incoming WiFi header. The outgoing activation was easy to achieve, according to Shin and Zhang, but detecting incoming packets while clocked down required custom algorithms that listen specifically for packet headers addressed to the device.
To implement the new E-MiLi mode, smartphone makers will have to add the algorithms for varying the WiFi circuits' clock rate, and WiFi chip-makers will have to adopt the new header structure, which remains recognizable even to down-clocked WiFi chips. Shin predicts that other wireless protocols, such as industrial ZigBee wireless networks, will also adopt the technique to enable similar power savings on their mobile devices.
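The published details operate at the signal-processing level, but the control flow can be sketched roughly as follows; the clock figures and the prefix-match header test are invented stand-ins, not the authors' code.

```python
# Illustrative sketch (not the authors' implementation) of the E-MiLi idea:
# idle-listen at a fraction of full clock speed, ramping up only when an
# outgoing packet appears or a header addressed to this device is detected.
FULL_CLOCK_HZ = 88_000_000   # hypothetical full-speed WiFi baseband clock
DOWNCLOCK_FACTOR = 16        # E-MiLi down-clocks by 16x while idle

class EmiliRadio:
    def __init__(self, my_address: str):
        self.my_address = my_address
        self.clock_hz = FULL_CLOCK_HZ // DOWNCLOCK_FACTOR  # start down-clocked

    def header_addressed_to_me(self, header: bytes) -> bool:
        # The real scheme uses a special header encoding that a down-clocked
        # receiver can still recognize; a plain prefix match stands in here.
        return header.startswith(self.my_address.encode())

    def on_incoming_header(self, header: bytes) -> None:
        if self.header_addressed_to_me(header):
            self.clock_hz = FULL_CLOCK_HZ  # ramp up before the payload arrives
        # otherwise stay down-clocked and let the packet pass by

    def on_outgoing_packet(self, payload: bytes) -> None:
        self.clock_hz = FULL_CLOCK_HZ      # outgoing traffic is the easy case

    def idle(self) -> None:
        self.clock_hz = FULL_CLOCK_HZ // DOWNCLOCK_FACTOR  # back to sipping power

radio = EmiliRadio("02:00:5e:10:00:01")
radio.on_incoming_header(b"02:00:5e:10:00:01|payload...")  # wakes the radio
assert radio.clock_hz == FULL_CLOCK_HZ
radio.idle()
```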

Further Reading

Friday, September 16, 2011

#MEDICAL: "Dr. Watson Diagnoses Medical Maladies"


IBM's Watson technology has put on its white coat and stethoscope at WellPoint in order to simplify and speed up the diagnosis of diseases by matching symptom sets in real time against millions of medical records, journal articles and research results. Watson demonstrated its prowess to the world earlier this year by beating the most talented human contestants on the TV game show "Jeopardy." Nevertheless, Watson's artificial intelligence (AI) was always intended to improve the human condition rather than leapfrog it. As the first major step toward that goal, WellPoint, the nation's largest health benefits provider, is using Watson to help make more accurate and informed medical diagnoses.
With millions of pages of medical journals, research results and physician reports produced each year, diagnoses of difficult symptoms often involve successive appointments with multiple specialists until a patient finds a suitable treatment plan. WellPoint is aiming to speed up and improve the accuracy of medical diagnoses by using Watson to scan the vast volumes of data from those sources to match patients' symptoms with known successful treatment plans.
As the first commercial application of Watson's AI, WellPoint's solution will deliver up-to-the-minute, evidence-based medical diagnoses for millions of Americans enrolled in its various health plans nationwide. WellPoint, including affiliates such as Blue Cross and Blue Shield, is the nation's largest health benefits organization, with 34 million members in its health plans and 36 million more served through its subsidiaries. Watson's scalability, which allows its algorithms to run on cluster supercomputers brought online on demand, will let it serve nearly any number of simultaneous diagnostic sessions. WellPoint said it was banking on Watson's ability to analyze the meaning of symptom sets and match them with available treatment plans, which will be presented to doctors and nurses as a list of "most likely" diagnoses.
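IBM has not published Watson's matching algorithm, but the flavor of a "most likely" list can be conveyed with a toy overlap ranking over a made-up knowledge base:

```python
# A toy illustration (not IBM's algorithm) of producing a "most likely"
# list: rank candidate conditions by how well a patient's symptoms overlap
# the symptoms documented for each condition in an invented knowledge base.
knowledge_base = {
    "influenza":     {"fever", "cough", "fatigue", "body aches"},
    "strep throat":  {"fever", "sore throat", "swollen lymph nodes"},
    "mononucleosis": {"fever", "fatigue", "sore throat", "swollen lymph nodes"},
}

def rank_diagnoses(symptoms):
    """Score each condition by Jaccard overlap with the observed symptoms."""
    scores = [(condition, len(symptoms & known) / len(symptoms | known))
              for condition, known in knowledge_base.items()]
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

# A patient presenting with fever, fatigue, and sore throat:
for condition, score in rank_diagnoses({"fever", "fatigue", "sore throat"}):
    print(f"{condition}: {score:.2f}")  # mononucleosis ranks first (0.75)
```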
"We believe [Watson] will be an invaluable resource for our partnering physicians and will dramatically enhance the quality and effectiveness of medical care they deliver to our members," said WellPoint's Chief Medical Officer Sam Nussbaum.
IBM will work with WellPoint to adapt Watson's AI from playing on "Jeopardy" to medical diagnostics, crafting algorithms that also analyze the interactions among drugs and therapies to ensure that no unforeseen side effects result from the treatment regimes it suggests. Watson will also be programmed to streamline communications among physicians during complex clinical review cases, quickly directing patients to the physician within their health plan and service area that already has a high success rate at treating similar maladies.
WellPoint’s "Dr. Watson" is planned for initial deployment in a pilot program in early in 2012.
Further Reading

#ENERGY: "Nuclear Generators Go Green"


Nuclear "generators" for the home, car, and industry are claimed to be truly green, unlike faux-green nuclear "reactors" which emit no greenhouse gases, but create dangerous nuclear waste. Called nuclear-powered laser-turbine electricity generators, they harness harmless thorium ore instead of radioactive uranium.
Nuclear reactors nudge enriched uranium into a critical mass whose chain reaction, unless controlled, can run away, and which produces dangerous radioactive byproducts that must be kept away from humans for thousands of years. Nuclear-powered laser-turbine electricity generators, on the other hand, harness the harmless radiation from the natural decay of thorium, an abundant natural ore.

Thursday, September 15, 2011

#ALGORITHMS: "Motions lifts MEMS-based remotes"


Armed with a new deal to integrate its MEMS sensor algorithms into Texas Instruments Inc.'s ZigBee-based radio frequency for consumer electronics (RF4CE) hardware platform, RemoTI, Hillcrest Laboratories Inc. hopes to penetrate further into the fast-growing markets for MEMS-based motion-control interfaces for smart TV, streaming video, motion-based gaming and 3-D gesture control.
Further Reading

#WIRELESS: "Smartphones' Downward Push Going Over Top"


Due to dropping prices for hitherto pricey components, plus 3G network availability in developing countries, touch-screen smartphones are expected to dominate cellphone sales by 2015. Smartphone unit shipments will more than double by 2015, making smartphones the largest cellphone market segment, according to IHS iSuppli, which predicts sales exceeding one billion units. In addition, the vast majority of those smartphones will use touch screens as costs plummet for the previously pricey component, according to ABI Research. In fact, ABI predicts that 97 percent of smartphones will have touch screens by 2016.
Apple's iPhone is still leading the way, continuing to pioneer the smartphone space with new features, such as the ultra-high-resolution Retina display, and an ever-expanding complement of apps. However, the fastest-growing segment of the smartphone business is at the low end, where fewer features and scant app support are compensated for by more affordable prices for users in developing countries.

Driven by lower-priced models, smartphones will account for the majority of cell phone sales by 2015, according to IHS iSuppli.
Samsung is addressing the low-end smartphone market with models whose sales are growing at a much faster pace than high-end smartphones. As a result, IHS iSuppli predicts that smartphone shipments overall will account for more than half of cell phone unit shipments in just four years. And the vast majority of even the lowest-end models will sport touch screens, according to ABI Research.
By 2015, global smartphone sales will top one billion units, according to IHS iSuppli, accounting for over 54 percent of all cell phones sold that year. That is more than double the smartphone shipments of 2011, which are predicted to be about 478 million units--over 32 percent of the total cell phone market this year. The meteoric rise of smartphones started from humble beginnings, accounting for just under 16 percent of all cellphone sales in 2009.
Low-end smartphones cost less because they have less memory and fewer features, allowing their sales to grow at a faster pace--estimated at more than 115 percent per year through 2015, compared with just over 16 percent for mid- and high-end smartphones. Samsung appears to be gaining the most ground from low-end smartphone sales in China and Latin America, where its sales skyrocketed by 600 percent over the last year, according to IHS iSuppli. Samsung's success is pegged to its use of newly available, low-cost, single-chip 3G baseband processors and the license-fee-free Android operating system.
Mobile app development is also driving the market overall, according to IHS iSuppli, but a thirst for adding the fun factor with touch screens is spurring many consumers to trade up to smartphones, according to ABI Research, which claims that just 7 percent of smartphones had touch screens in 2006 compared with 75 percent in 2010. ABI Research also credits the build-out of 3G networks in developing countries with boosting smartphone adoption.
R. Colin Johnson has been writing non-stop daily stories about next-generation electronics and related technologies for 20+ years. His unique perspective has prompted coverage of his articles by a diverse range of major media outlets--from the ultra-liberal National Public Radio to the ultra-conservative Rush Limbaugh Show.
Further Reading

Tuesday, September 13, 2011

#ALGORITHMS: "Analytics Weighs Costs, Benefits of Cloud"


Manufacturing resource planning (MRP) tools have become routine for many industries, but now IT executives have their own decision-support tools for deciding when and how to make best use of cloud computing resources. Instead of lashing together handmade spreadsheets chock full of estimates and outright guesses, IT executives can now make use of a standardized set of business analytics that compares the cost of implementing services with dedicated, virtualized, private or public clouds.
Sentilla Analytics for Cloud weighs, compares and contrasts the business impact of dedicated servers, virtualization, and both private and public clouds, allowing IT managers to make more-informed business decisions. Available as a subscription service, the Sentilla software suite monitors energy usage, application utilization and other real-time metrics to guarantee the accuracy of its predictions, allowing IT managers to "what-if" about different cloud scenarios.

Sentilla Analytics for Cloud allows IT to input the types of tasks and resources needed for them into a form (bottom) from which analytics produce a cost-comparison chart for dedicated versus virtualized, private- or public-cloud implementations.
"Sentilla Analytics of Cloud is the world's first planning tool to provide IT executives with accurate predictions of the impact that different strategies will have on their business," said Joe Polastre, chief technology officer at Sentilla. "Other industries like manufacturing have had resource-planning tools for some time, but ours is the first MRP-like tool for IT."
Sentilla Analytics for Cloud helps IT determine the impact of migrating applications from dedicated in-house resources to virtual clusters, private or public clouds. It can also be used to determine ways of delivering more services within the same power and space footprint, by virtue of virtualization and cloud integration. Sentilla also recently announced integration with VMware's vCenter Server and vSphere, which enables Analytics for Cloud to track the cost of running dedicated versus virtual IT services.
Dimensions of impact include capital spending, license fees, operating expenses, maintenance and power consumption. Input data includes the applications being considered, in-house hardware capabilities, personnel costs and utility bills. Analytics then takes over to predict how infrastructure as a service (IaaS) and platform as a service (PaaS) could potentially lower costs. Graphical displays show comparisons between the various options--dedicated, virtualized, private- or public-cloud--with precise cost-per-year metrics as well as potential percentage savings. Sentilla Analytics for Cloud offers precise metrics for both Amazon EC2 and Rackspace public clouds, but can also be adapted to other cloud providers by tweaking an XML file.
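Sentilla has not published its models, but the shape of such a comparison is easy to sketch; every rate and workload figure below is invented purely for illustration:

```python
# Hypothetical sketch of the kind of dedicated-versus-virtualized-versus-
# public-cloud cost comparison such a tool performs. All rates and workload
# figures are invented for illustration only.
import math

SERVERS = 20          # dedicated machines the workload occupies today
UTILIZATION = 0.30    # their average utilization

def dedicated_cost(servers: int) -> float:
    capex = 4_000 / 3            # hardware amortized over three years
    power_cooling = 1_200        # per server per year
    licenses_maintenance = 800   # per server per year
    return servers * (capex + power_cooling + licenses_maintenance)

def virtualized_cost(servers: int, utilization: float) -> float:
    # Consolidate low-utilization machines onto hosts run at 80 percent.
    hosts = math.ceil(servers * utilization / 0.8)
    return dedicated_cost(hosts) + hosts * 500  # plus hypervisor licensing

def public_cloud_cost(servers: int) -> float:
    instance_hour = 0.34         # hypothetical on-demand rate
    return servers * instance_hour * 24 * 365

print(f"   dedicated: ${dedicated_cost(SERVERS):>9,.0f}/year")
print(f" virtualized: ${virtualized_cost(SERVERS, UTILIZATION):>9,.0f}/year")
print(f"public cloud: ${public_cloud_cost(SERVERS):>9,.0f}/year")
```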
Sentilla claims its Analytics for Cloud is the only available IT tool for collecting, monitoring and analyzing both performance and cost from actual resource utilization, workload, operating costs and power consumption data that is tracked in real time by its software.
Sentilla recently received an endorsement from the world's second-largest telecommunications company, the SingTel Group, whose investment arm, SingTel Innov8, led a $15 million Series C investment in the company in return for an equity position and a seat on the board of directors.
Further Reading

#ALGORITHMS: "Client hosting takes virtualization mobile"




Virtualization has been a boon for multiuser systems, letting them run the Windows operating system and applications on servers. Each user’s state is saved as a virtual desktop that can be remotely accessed from PCs, laptops, netbooks, tablets, smartphones and even dumb “thin clients” (terminals costing as little as $200). The downside is that remote users of server-hosted virtualization need to be online to take advantage of the virtual desktop infrastructure (VDI)—a show-stopper for highly mobile workers. That limitation has provided the motivation for a new paradigm called client-hosted virtualization, which runs on laptop computers.

Further Reading

Monday, September 12, 2011

#ROBOTICS: "Robotics Growing, Diversifying, Taking Charge"

The robotics market is experiencing healthy growth after a slowdown during the recession, with several new categories leading the charge, including service, military, and security robots.



Global demand for robots, broken down by region, shows Asia as the largest market, with Europe growing fastest. (Source: BCC Research)

Robotics is breaking out of the niche markets into which it had been shunted, promising to grow to $30 billion by 2016. The significant uptick is mainly due to explosive growth in commercially viable professional service, military, and security robots, in addition to the field's traditional strengths in manufacturing, medicine, surgery, planetary exploration, and the handling of hazardous materials.
The global market for robots generated great hype during the first decade of the new millennium, but was hit hard by the recession, which flattened its growth through 2009. By 2010, however, robots were back on track, and healthy growth is expected to continue through 2016, according to BCC Research.

As a subfield of automation, robotics integrates many disciplines, all aimed at building devices that can perform tasks that are too tedious or dangerous for people, or that require more precision than the typical human can deliver. Most robots work in industrial settings performing highly specialized jobs, but a new breed of domestic, professional, and security robots is opening the door to a new era of smarter robots that can handle open environments and less precise instructions. For instance, search-and-rescue robots perform many of the same tasks as warehouse robots, but in an unstructured environment where navigation and task execution are more open-ended.

The International Federation of Robotics (IFR) attributes the recent surge in robotics partly to pent-up demand that accumulated while industries delayed buying new robots and replacing old ones until the recession receded. However, an analysis by BCC Research of newly filed patents led the firm to predict rapid growth in new types of robots. BCC also reports that venture capital, which dried up during the recession, has recently rebounded, allowing many new startups to break ground in 2011.
Robotics is now growing in diverse markets, according to BCC, including the aerospace, automotive, chemical, construction, defense, electronics, food processing, home care, medical, pharmaceutical, and textile and clothing manufacturing industries.

In 2011, the robotics market will grow to nearly $22 billion, according to BCC Research. The firm predicts a five-year compound annual growth rate of 6.7 percent out to 2016, when the market will top $30 billion. Broken down by region, Europe shows the fastest growth rate, starting at $4.9 billion in 2011 and growing at a 9.6 percent rate to $7 billion by 2016. Asia shows the second-fastest growth rate, starting at $7.7 billion in 2011 and growing at a 7.2 percent rate to nearly $11 billion by 2016. In North America, where industrial robots are already commonplace, the growth rate is predicted to be a scant 2.7 percent, from $4.9 billion in 2011 to $5.6 billion by 2016.

Further Reading

#MARKETS: "NTT sees growth in South America"




The next big growth market in connected electronics is South America, according to Japan's NTT Communications Corp. (NTT Com), which announced Monday (Sept. 12) that it has expanded the reach of its tier-one global IP network there with a new point-of-presence (PoP) location in São Paulo, Brazil's most populous city.
Further Reading

Friday, September 09, 2011

#SECURITY: "Simulations Guarantee Earthquake-Proof Stadium"




California Memorial Stadium was slowly being pulled apart by a fault line that runs down its middle, but computer simulations have enabled a face lift that secures fans even if "the big one" hits during a game.
If a seismic fault was discovered to be splitting a sports stadium in half even without a quake, would the smart thing be to relocate the stadium? Not according to California engineers who claim to have validated a $321 million renovation of California Memorial Stadium with detailed computer simulations which "allow the fault to rupture without endangering life," according to David Friedman, a principal with Forell/Elsesser Engineers, which did the structural engineering design.
With the threatened stands placed atop movable seismic blocks that ride out an earthquake, fans in the new 63,000-seat stadium "will find themselves taking a ride--but a safe one."

Even if a quake splits the California Memorial Stadium in half, computer simulations assure fans they will be safe atop modular stands that slide around without coming apart.

California Memorial Stadium straddles an active earthquake fault--the Hayward Fault--which over the stadium's 88-year life has split the structure in half with cracks up to 9 inches wide, requiring innovative seismic engineering to repair and renovate.

According to the computer simulations, the renovated stadium will be able to survive tremors without sustaining significant damage. Even if the "big one" hits, fans will ride gigantic seismic blocks that can move up to 6 feet laterally and drop up to 2 feet without coming apart. The seismic blocks will float on a 4-foot-thick concrete mat covered with two layers of high-density plastic with sand in between to facilitate sliding. The computer simulation also shows that by dividing the overhanging boxes for the press and VIPs into five 60-foot-wide modules, they will be able to safely sway and spring back on giant shock absorbers that damp movement down to about 1 foot.

Structural engineers and seismic specialists helped construct the detailed computer models whose simulations allowed them to verify the soundness of the design. First, trenches were dug and boreholes drilled all around the stadium and adjacent areas in order to accurately model the exact geological formations of soil and rock. The core samples revealed that over the last 90 years the Hayward Fault has been moving in a geological process called "fault creep." The improved structure will therefore have adjustable gaps between sections so that the fault can continue to creep without cracking the structure.

To see one of their simulations of what will happen during an earthquake, check out the stadium-quake animation.

The renovations currently under way at the stadium are due to be finished in time for the 2012 football season. (The Golden Bears are playing at AT&T Park in San Francisco for the 2011 season while construction is under way.)

Further Reading

#MEMS: "Movea enlists MEMS to bring pay-TV into Internet age"




Cable and satellite television providers have lagged behind in exploiting the opportunities already seized by Internet protocol television (IPTV), which today is being watched on computers, gaming consoles and a new breed of Internet-connected smart TVs, all of which encourage consumers to "cut the cable." MoveTV, on the other hand, is designed to put cable and satellite TV back in the driver's seat by enlisting advanced motion control with MEMS sensors, integrated with a deep software infrastructure that brings pay-TV into the Internet age.

Further Reading

Thursday, September 08, 2011

#SPACE: "NASA Space Laser to Make Radio Obsolete"




Radio-based space communications could be made obsolete by a laser-based long-haul optical connection that runs 10 to 100 times faster.

NASA will demonstrate a long-haul optical network by connecting California and Hawaii with a laser communication link that works like fiber optics, sans the fiber. In this first demonstration of what is called a free-space optical connection--the light travels through the open air rather than within a fiber--NASA will show that long-haul lasers can communicate with pulses of light at 100 megabits per second.

Today, space probes must transmit data back at 6 megabits per second using radio waves, so receiving a single 4-gigabyte high-resolution image takes almost 90 minutes. By encoding the bits on laser beams, as terrestrial long-haul fiber-optic networks do, that same image would take only about five minutes to transmit. NASA hopes to demonstrate its free-space optical transceivers between ground stations in California and Hawaii by bouncing their communications laser off a satellite. If the test is successful, NASA's Laser Communications Relay Demonstration (LCRD) mission could also enable remote telepresence, in which astronauts use remote-control robots to visit heavenly bodies virtually.
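The arithmetic behind those transfer times is easy to check:

```python
# The transfer-time arithmetic behind those figures: a 4-gigabyte image
# over a 6-Mbit/s radio link versus a 100-Mbit/s laser link.
def transfer_minutes(size_gbytes: float, rate_mbps: float) -> float:
    bits = size_gbytes * 8e9              # gigabytes -> bits
    return bits / (rate_mbps * 1e6) / 60  # seconds -> minutes

print(f"radio: {transfer_minutes(4, 6):.0f} minutes")    # radio: 89 minutes
print(f"laser: {transfer_minutes(4, 100):.1f} minutes")  # laser: 5.3 minutes
```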

Laser Communications Relay Demonstration (LCRD) will act as a long-haul fiber-optic network--sans the fiber. (Source: NASA)

"Optical communication will enable a rapid return of the voluminous data associated with sending spacecraft and humans to new frontiers," said NASA Chief Technologist Bobby Braun at NASA headquarters in Washington during the announcement of the technology demonstration program for its laser-based communications. The $175 million NASA Technology Demonstration Mission program will support two other missions--one testing a new atomic clock for precise space navigation, and the other a solar-sail for propellant-free propulsion.

According to NASA, free-space laser communications not only provide significantly higher data rates than radio-frequency communications, but lasers also decrease the mass, size and power consumption of the spacecraft using them. The project could also allow real-time data streaming from instruments that today must store and forward files, like hyperspectral imagers and synthetic aperture radar (SAR). And laser-speeded communications will allow astronauts to use telepresence to safely investigate nearby planets, moons and asteroids.

NASA Goddard Space Flight Center (Greenbelt, Maryland) will build the LCRD system with the Space Communications and Navigation (SCaN) office in the Human Exploration and Operations Mission Directorate and the NASA Office of the Chief Technologist. The mission will take four years and is slated to be finished in 2016. The other two technology demonstration missions--the deep-space atomic clock for precise space navigation and the propellant-free solar-sail propulsion system--will run in parallel but take only three years, to be demonstrated in 2015.

Further Reading

Wednesday, September 07, 2011

#WIRELESS: "Unstoppable iPad's Only Competition Is iPhone"

With HP's quick exit from the tablet market, and lackluster sales of other tablets, Apple's main competition for the stocking-stuffing Christmas 2011 market is predicted to be the iPhone.



Even with HP announcing that it would briefly dip its toe back into tablet production, the iPad seems unstoppable. Specifically, the iPad is gaining from HP's quick exit from the touch-screen tablet market, with forecasters increasing their predictions for Apple's dominance. With burgeoning iPad sales to the education market just kicking off this month, and clear sailing after its earlier supply-chain woes, Apple's main competition for the lucrative Christmas 2011 market may be the company's own iPhone 5, which IHS iSuppli predicts will be released this fall.

IHS iSuppli predicts that Apple will ship more than 60 million iPads in 2011, up nearly 246 percent from the more than 17 million it shipped in 2010, when supply was tight. Apple's stunning success will drive overall touch-screen tablet sales to more than 274 million units by 2015, according to IHS iSuppli.

IHS iSuppli recently upped its forecast for Apple iPad shipments after HP's exit and the lackluster sales of other competitors like Samsung's Galaxy Tab. (Source: IHS iSuppli)

"Right now Apple's iPad is dominating the market since many of its competitors are really just getting started," said Rhoda Alexander, senior manager of tablet and monitor research for IHS. "Our research shows Apple continuing to dominate the high-end tablet market out to 2015, but we also believe that many people are just using their media tablet for email and Web surfing, which should open significant opportunities for competing tablets that are priced $100 to $150 less than the iPad."

As a bulwark against such low-end market erosion, Alexander predicts that Apple may repeat its iPhone strategy, in which it kept the iPhone 3GS in its catalog after the iPhone 4 debuted but lowered its price by $100. Apple could similarly keep the iPad 2 around at a lower price point than the iPad 3, which is expected early in 2012. Keeping the iPad 2 around could make good business sense, too: the iPad 3 will likely use the ultra-high-resolution Retina display, enabling Apple to continue manufacturing the iPad 2 at a significantly lower cost.
However, by 2015 Samsung, Amazon, LG and other mass-market experts will likely have low-end tablets that appeal to casual users who are mainly interested in email and Web access. Even so, IHS iSuppli predicts that Apple will sell more than 120 million iPads in 2015.

In the short term, Apple is expected to lap its competition--even in developing markets like China, where iPad sales are expected to get a boost in the first quarter of 2012 during the Chinese Lunar New Year holiday season. Plus, Apple is expanding its focus on the IT market with additional support staff for corporate users accessing databases from their iPads.

Further Reading

Thursday, September 01, 2011

#ENERGY: "Gap Being Filled by Social Media"

A lack of understanding of their energy bills is prompting consumers to depend on the advice of friends and family, opening new avenues for education, according to the "2011 IBM Global Utility Consumer Survey."




Energy consumers worldwide want to conserve and take advantage of smarter technologies but lack a basic understanding of how to do so. Consumers surveyed by the "2011 IBM Global Utility Consumer Survey" say they need smarter ways of making their energy decisions.

"We surveyed over 10,000 people in 17 different countries in nine different languages, and really focused on their expectations and perceptions as to where they see energy fitting into their lives," said Michael Valocchi, vice president, Global Energy & Utilities Industry Leader for IBM Global Business Services. "We found a really startling lack of knowledge."

IBM helped Malta build the world's first national smart utility grid, replacing 250,000 analog electric meters with smarter meters that track usage in real time, identify sources of loss, and set variable rates; the grid is being integrated with a new smart water-metering system. (Source: IBM)

According to Valocchi, "30 percent didn't understand the basics of their energy bill," leading to decision-making that depends on the evaluations of trusted advisers rather than on an understanding of the choices made available by the smart grid and smart meters. Younger consumers, in particular, were much more inclined to depend on the consensual decisions of their social networks than on the traditional financial motivations being hawked by energy providers.

"Younger consumers under 25 are three times more likely to make their energy decisions based on family or friends advise," said Valocchi. "That’s information that is outside the control of the energy provider itself."
IBM's survey revealed that 60 percent of consumers still do not understand the terms "smart grid" and "smart meter," yet understanding proved key to acceptance: 61 percent of consumers who understood smarter energy efforts approved of them, compared with only 43 percent of those who lacked a basic understanding.
"People want to conserve energy," said Valocchi. "We just need to get better at showing them how."

IBM recommends that, rather than trying to educate consumers in the language and metrics of electrical energy, such as "dollars per kWh," utilities create consumer-oriented portals that compare a consumer's energy consumption with that of their neighbors, presenting clear visual evidence of the results of conservation and describing simple, effective methods of improving their standing among their peers.
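The kind of neighbor comparison such a portal might surface is easy to picture; the readings below are invented for illustration:

```python
# A toy sketch of the "compare with your neighbors" presentation IBM
# describes; the monthly readings (kWh) are invented.
import statistics

your_usage = 642
neighbors = [712, 655, 698, 731, 642]

avg = statistics.mean(neighbors)
pct = (avg - your_usage) / avg * 100
direction = "less" if pct >= 0 else "more"
print(f"You used {abs(pct):.0f}% {direction} than your neighborhood's "
      f"average of {avg:.0f} kWh this month.")
```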

"We are providing energy providers with a new way of engaging their consumers in a way that allows them to see the choices they can make in terms that they understand," said Valocchi. "It’s going to allow a whole new way of customer engagement that we have never seen in the industry."

As an example of progress already made, IBM described ongoing efforts with the government of Malta (a small central Mediterranean archipelago), where research has revealed that deploying smart meters must be coupled with new billing presentations that focus on the concrete steps consumers can take to improve conservation with the new technology. The five-year effort, now in its fourth year, is a harbinger of how to make smart energy deployments successful and avoid consumer backlash during initial rollouts.

"Smart metering needs to go hand in hand with the larger transformation," said Jean-Christophe Samin, project manager for IBM’s smart grid deployment for electricity and water in Malta.
According to Samin, the entire information chain needs to enhance consumer awareness, from the smart meter and the information it provides to the billing system.

IBM's pilot efforts with the Malta-government-owned Enemalta Corporation and Water Services Corporation include creating an online smart energy portal that explains conservation in easy-to-understand terms, offers just a few simple, concrete choices, and then provides the tools for measuring progress against the efforts of peers--what IBM calls "social proof."

Further Reading