A small team at Otherlab, which does all kinds of weird things, has been using ARPA-E funding to develop what they're calling "thermally adaptive materials." We'll call it self-poofing fabric, for its ability to dynamically change its insulation in response to temperature. The idea is that the fabric will provide a small amount of insulation when it's warm out, and then increase how insulating it is (by trapping more air) in response to colder temperatures. When you see the prototype fabric in action, it looks like magic.
What's most exciting about Otherlab's fabric is that it operates completely passively. There's no power source, no wiring, and no controls—nothing but a combination of common synthetic fibers, each of which has different thermal expansion characteristics.
It might take a minute or so for the fabric to transform itself from completely flat to completely poofed, and then another minute to go back, but that's certainly quick enough to be useful. At minimum poof, the fabric insulates you about as well as a heavy t-shirt. And at maximum poof, it's equivalent to heavy outdoor gear, nearly tripling its insulating ability in response to a temperature drop of 15 °C.
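A back-of-the-envelope calculation shows what that tripling buys. Steady-state heat loss through a layer is just the temperature difference divided by its thermal resistance; the R-values below are illustrative guesses, not Otherlab's measurements:

```python
# Steady-state heat loss through an insulating layer: Q = deltaT / R.
# The R-values here are illustrative assumptions, not Otherlab's numbers.

def heat_flux(delta_t_c, r_value):
    """Heat flow per unit area (W/m^2) across a layer with thermal
    resistance r_value (m^2*K/W) and temperature difference delta_t_c (K)."""
    return delta_t_c / r_value

R_FLAT = 0.05            # assumed resistance at minimum poof (m^2*K/W)
R_POOFED = 3 * R_FLAT    # "nearly tripling its insulating ability"

# Skin at ~33 C; outside air drops from 23 C to 8 C (a 15-degree drop).
mild = heat_flux(33 - 23, R_FLAT)
cold = heat_flux(33 - 8, R_POOFED)
print(f"mild day: {mild:.0f} W/m^2, cold day: {cold:.0f} W/m^2")
```

In this toy model, heat loss per unit area actually ends up slightly lower after the 15 °C drop once the fabric poofs, which is exactly the behavior a passive adaptive insulator is after.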
Since the fabric is made of commodity materials and the manufacturing process is just like that used to produce other fabrics, the team at Otherlab (including Brent Ridley, Jean Chang, and Leah Bryson) says it should be able to scale up from fabric samples to actual garments within the next year.
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next two months; here’s what we have so far (send us your events!):
European Robotics Forum – March 22-24, 2017 – Edinburgh, Scotland
NDIA Ground Robotics Conference – March 22-23, 2017 – Springfield, Va., USA
Automate – April 3-6, 2017 – Chicago, Ill., USA
ITU Robot Olympics – April 7-9, 2017 – Istanbul, Turkey
ROS Industrial Consortium – April 7, 2017 – Chicago, Ill., USA
U.S. National Robotics Week – April 8-16, 2017 – USA
NASA Swarmathon – April 18-20, 2017 – NASA KSC, Florida, USA
RoboBusiness Europe – April 20-21, 2017 – Delft, Netherlands
RoboGames 2017 – April 21-23, 2017 – Pleasanton, Calif., USA
ICARSC – April 26-30, 2017 – Coimbra, Portugal
AUVSI Xponential – May 8-11, 2017 – Dallas, Texas, USA
AAMAS 2017 – May 8-12, 2017 – Sao Paulo, Brazil
Austech – May 9-12, 2017 – Melbourne, Australia
Innorobo – May 16-18, 2017 – Paris, France
Let us know if you have suggestions for next week, and enjoy today’s videos.
The winner of the best video competition at HRI 2017 was this masterpiece from Professor Hiroshi Ishiguro (aka “The Man Who Made a Copy of Himself”) and colleagues Dylan F. Glas, Malcolm Doering, Phoebe Liu, and Takayuki Kanda from the Advanced Telecommunications Research Institute in Kyoto:
Stay tuned for the live concert tour, and read the paper at the link below.
Dreamer’s disembodied head can follow you without blinking. Ever.
And apparently, this is how UT Austin students choose to spend their Saturday nights: Trying to get a robot to poke itself in the eye.
[ UT Austin ]
An example of an Endeavor PackBot with a tethered Fotokite UAV demonstrated for safe nuclear plant inspection and decommissioning. This is an extension of our AirJoey work where a mother robot carries and supports daughter UAVs. This was funded by the DOE Environmental Management division through a cooperative agreement with Sandia National Labs.
[ CRASAR ]
Sphero is here to wish you a happy St. Patrick’s Day:
[ Sphero ]
RoboBoat 2017 takes place this June in Daytona, Fla.:
[ Roboboat ]
Try and crash this drone with a suite of TeraRanger sensors on it into an obstacle. Go on, try it!
[ TeraRanger ]
I’m glad a couple of Sawyers have a job at DHL in the U.K., but for the life of me I can’t figure out what they’re accomplishing:
[ Rethink Robotics ]
Endeavor Robotics, the military and security robot company spun out of iRobot, offers a behind-the-scenes look at how it builds and tests its Made in the USA ground robots:
NXROBO’s BIG-I home robot was part of a demo smart bedroom from Haier:
It’s still a little bit difficult to tell how real-world functional or useful this robot will be, but I’m liking its design more and more.
[ NXROBO ]
You’ll probably need to turn on subtitles for this video from Pollen Robotics, but it’s worth it:
Meed has been designed to be a robot accessible to all: a way to discover robotics and learn to code through new adventures that you will share with Meed. It is intended for girls and boys from age 10 up, but also for anyone interested in discovering robotics in a new, entertaining, and poetic way.
Once they figure out what Meed will be and do (you can contribute to this, if you have ideas), they’ll be crowdfunding it, so stay tuned.
[ Pollen Robotics ]
From the University of Zurich:
We propose a novel collaborative transport scheme, in which two quadrotors transport a cable-suspended payload at accelerations that exceed the capabilities of previous collaborative approaches, which make quasi-static assumptions. Furthermore, this is achieved completely without explicit communication between the collaborating robots, making our system robust to communication failures and making consensus on a common reference frame unnecessary. Instead, they only rely on visual and inertial cues obtained from on-board sensors.
They’ll be presenting a paper on this at ICRA 2017 in Singapore.
[ UZH RPG ]
The simplest solution is always the most robust. With this in mind, we designed a four wheeled rover able to easily overcome vertical steps of more than 150% of its ground clearance without any active control. In other words, ROVéo doesn’t need to actively change its shape to overcome obstacles. The extended mobility is purely provided by the mechanical design of its chassis.
[ Rovenso ]
Robots playing games, drumming, serving drinks... I would say that Kuka’s booth at the China International Industry Fair was going overboard, but that might imply that I don’t approve of everything that they were doing:
[ Kuka ]
The Takanishi Lab at Waseda University in Japan has uploaded a pile of old robot videos. There’s all kinds of weird stuff, but here’s a selection of the weirdest:
[ Takanishi Lab ]
Al Gore didn’t really claim to invent the Internet in 1999, but he did champion a NASA mission that installed a deep space webcam pointed at Earth in 2015. And yesterday President Trump put a bullseye on that mission. Or, rather, on part of it. Trump’s 2018 budget blueprint asks Congress to defund the Earth-facing instruments on the Deep Space Climate Observatory (DSCOVR). Its sensors tracking magnetic storms emanating from the Sun would keep doing their jobs.
Selectively deep-sixing well-functioning instruments on a satellite 1.5 million kilometers from Earth is one of the stranger entries in President Trump’s first pass at a budget request. But it fits a pattern: Throughout the document programs aimed at comprehending or addressing climate change take deep cuts, even where there is no obvious fiscal justification.
“The budget targets almost anything that is related to climate,” observes David M. Hart, who directs the Center for Science and Technology Policy at George Mason University, near Washington, D.C.
Asked about climate change cuts at a press briefing yesterday, Trump Administration budget director Mick Mulvaney stated categorically: "We're not spending money on that anymore. We consider that to be a waste of your money." Whether the proposals come to pass, say Hart and other experts, will depend on Congress, and on how much political capital Trump and his administration gain or lose fighting on other issues such as immigration and health care in the months ahead.
Trump’s budget officials swung hardest at the Environmental Protection Agency, verifying earlier leaks that he would ask for a 31 percent slash in funding from its anticipated budget for fiscal 2017 (which ends 1 October). Many programs would lose ground under the proposed $2.6 billion reduction. Those targeted for elimination include the Clean Power Plan, which regulates CO2 emissions from power plants, EPA's climate change research and partnership programs, and the Energy Star product labelling program—“the most successful voluntary energy efficiency movement in history," according to its website.
Cuts proposed for the Department of Energy, meanwhile, are deeper than expected and disproportionately hit programs designed to carry energy innovations across the so-called valley of death between basic research and commercialization. Trump’s blueprint would nearly eliminate the department's applied science offices with a $2 billion reduction, and it zeroes out its tech incubator, the Advanced Research Projects Agency-Energy (ARPA-E). ARPA-E had $291 million for fiscal 2016.
The Washington-based Information Technology and Innovation Foundation (ITIF) warned in a statement yesterday that these and other proposed cuts to Federal research and development would, if enacted, “signal the end of the American century as a global innovation leader.”
George Mason University’s Hart, who is also a senior fellow with the foundation, sees an ideological take on the innovation process driving Trump’s cuts. Hart has documented close alignment between the president's proposals and a budget plan issued by the Heritage Foundation. Heritage, a conservative Washington think tank, argues for a sharp division between government-funded lab research and proprietary corporate-funded product development.
"A more realistic view is that you have a continuum of projects. There’s a broad middle where the benefits are shared and thus the investment should be shared,” says Hart. Bridging that middle ground is critical in today's power sector, he argues, because deregulation has dried up the cash that once fuelled its cooperative R&D body. “The Electric Power Research Institute still exists, but it’s a shadow of its former self,” says Hart.
Venture capital attracted by ARPA-E-backed energy technologies, meanwhile, suggests that DOE’s efforts are paying off.
NASA looks like a budget survivor at first glance—Trump’s blueprint would shave just 1 percent off the agency's $19 billion 2016 top line and only 5 percent off of its $1.9 billion Earth sciences budget. “That is much less than the Earth science community feared,” says Marcia Smith, president of Arlington, Va.-based consultancy Space and Technology Policy Group and editor of SpacePolicyOnline.com.
Nevertheless, some of the Earth science cuts are potentially pernicious, and all target efforts to understand climate. Beyond the 2018 spending reductions, three planned NASA Earth science missions would be scrubbed, along with DSCOVR’s Earth-facing sensors.
In three of the four cases, Trump would forego real benefits to gain minimal budgetary relief. For example, Smith figures NASA might save about $1 million by downgrading DSCOVR. Yet DSCOVR measures Earth’s albedo, a “critical parameter for climate,” according to Harvard University atmospheric chemist Steven Wofsy. Its measurements incorporate the scattering of sunlight by clouds and aerosols, which is “a tricky thing to calculate,” says Wofsy.
Smith adds that, in her personal opinion, Gore was right about DSCOVR's unique, full-disc image of the Earth (and the Moon orbiting it): "It is useful to remind people just how fragile the Earth is.” Given the "tiny amount of money” at stake, Smith says that cut “has to count as a political issue, not a money issue.”
Another targeted mission, a follow-on to the Orbiting Carbon Observatory (OCO) that launched in 2014, awaits a 2018 launch. It was assembled from earlier missions’ spare parts and can be cheaply launched since it is destined for the International Space Station.
Whereas the existing OCO-2 scans CO2 emissions across the globe every 16 days, OCO-3 promises high-precision measurement of regional carbon sources and sinks. One obvious application, says Wofsy, is fact-checking greenhouse gas reports: “It could really be powerful … to assess the emissions in China or in India where you can’t trust the numbers.”
Then there is the Climate Absolute Radiance and Refractivity Observatory (CLARREO) mission, whose first iteration is, like OCO-3, to be bolted on to the ISS. CLARREO Pathfinder packs a finely calibrated spectrometer designed to cross-calibrate optical sensors on the entire fleet of U.S. and international Earth observing satellites, thus improving their accuracy 5- to 10-fold. "It would make sure that what they’re saying about climate is correct,” says University of Colorado senior scientist Michael King, who chairs the U.S. National Research Council’s Committee on Earth Science and Applications from Space.
King says better satellite data should, in turn, boost confidence in climate models, whose findings have been questioned by President Trump and top Administration officials, including EPA Administrator Scott Pruitt and Secretary of State Rex Tillerson. "There are uncertainties in climate models. Improving their accuracy should be in everybody’s best interest,” says King.
Wofsy also worries about unspecified reductions in Earth science research grants, which he calls the "seed corn” for future satellites.
Whether any of these attacks on climate science and action come to pass is ultimately up to Congress, and the reaction yesterday was weak even among Trump’s fellow Republicans. Smith notes that Rodney Frelinghuysen, who chairs the House Appropriations Committee, responded with the dry reminder that Congress holds “the power of the purse.”
South Carolina Senator Lindsey Graham, meanwhile, called Trump’s budget “dead on arrival” over its proposed deep cuts to the State Department. And Democrats also issued blistering rejections.
Bill Foster, a physicist representing metropolitan Chicago, said in a statement: “It is hard to overstate how much damage this budget will do to our ability to remain at the forefront of innovation and problem solving.”
How much of the blueprint survives Congress is linked to how the Trump Administration’s credibility and popularity evolve in the months ahead, according to Hart and other budget watchers. “It may depend on how much clout the administration really has, [and] whether they’re deemed to be worth listening to."
A team of researchers based in Switzerland is on the way to laying bare much of the secret technology inside commercial processors. They pointed a beam of X-rays at a piece of an Intel processor and were able to reconstruct the chip’s warren of transistors and wiring in three dimensions. In the future, the team says, this imaging technique could be extended to create high-resolution, large-scale images of the interiors of chips.
The technique is a significant departure from the way the chip industry currently looks inside finished chips, in order to reverse engineer them or check that their own intellectual property hasn’t been misused. Today, reverse engineering outfits progressively remove layers of a processor and take electron microscope images of one small patch of the chip at a time.
But “all it takes is a few more years of this kind of work, and you'll pop in your chip and out comes the schematic,” says Anthony Levi of the University of Southern California. “Total transparency in chip manufacturing is on the horizon. This is going to force a rethink of what computing is,” he says, and what it means for a company to add value in the computing industry.
Even if this approach isn’t widely adopted to tear down competitors’ chips, it could find a use in other applications. One of those is verifying that a chip only has the features it is intended to have, and that a “hardware Trojan”—added circuitry that could be used for malicious purposes—hasn’t been introduced.
The work, which was published this week in the journal Nature, was conducted at the Paul Scherrer Institute’s Swiss Light Source. The facility is a synchrotron; it accelerates electrons around a ring at close to the speed of light in order to generate beams of X-rays.
To produce a 3D rendering of the Intel chip—an Intel G3260 processor—the team shined an X-ray beam through a portion of the chip. The various circuit components—its copper wires and silicon transistors, for example—scatter the light in different ways and cause constructive and destructive interference. Through a technique called X-ray ptychography, the researchers could point the beam at their sample from a number of different angles and use the resulting diffraction patterns to reconstruct the chip’s internal structure.
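The principle behind ptychography can be sketched in one dimension: a known, localized probe scans an unknown complex-valued object with heavy overlap, only the far-field diffraction intensities are recorded, and an iterative update recovers the object's amplitude and phase. The ePIE-style reconstruction below runs on simulated data; the object, probe, and scan geometry are invented for illustration and have nothing to do with the Intel chip measurements themselves:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64                                              # 1-D object length (pixels)
obj = (0.6 + 0.4 * rng.random(N)) * np.exp(1j * 0.3 * rng.random(N))

x = np.arange(N)
probe = np.exp(-((x - N // 2) ** 2) / (2 * 6.0 ** 2))  # localized Gaussian beam
positions = np.arange(0, N, 4)                      # scan shifts, heavy overlap

# "Measure" only diffraction intensities |FFT(probe * object)|^2 per position.
intensities = [np.abs(np.fft.fft(np.roll(probe, s) * obj)) ** 2
               for s in positions]

# ePIE-style reconstruction from intensities alone, from a flat starting guess.
recon = np.ones(N, dtype=complex)
for _ in range(150):
    for s, meas in zip(positions, intensities):
        p = np.roll(probe, s)
        psi = p * recon
        Psi = np.fft.fft(psi)
        Psi = np.sqrt(meas) * np.exp(1j * np.angle(Psi))  # keep measured modulus
        recon += np.conj(p) / np.max(np.abs(p)) ** 2 * (np.fft.ifft(Psi) - psi)

# Up to a global phase, the recovered amplitudes should match the object's.
err = np.linalg.norm(np.abs(recon) - np.abs(obj)) / np.linalg.norm(np.abs(obj))
print(f"relative amplitude error: {err:.3f}")
```

The overlap between neighboring probe positions is what makes the phase recoverable from intensity-only data; shrink the overlap and the reconstruction stagnates.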
They began their experiments with a chip they knew the structure of—an ASIC developed at the institute to read out data from a light detector. The team found an excellent match between the chip design and their measurements, and they were able to identify and study the features of a particular circuit inside the chip in fine detail.
Then they moved on to image circuitry inside the Intel processor, about which they had limited information. The ASIC was produced using 110-nanometer chip manufacturing technology, more than a decade removed from the cutting edge. But the Intel chip was just a couple of generations behind the state of the art: It was produced using the company’s 22-nm process. (Intel has since jumped to 14 nm, and 10 nm is now in the works.)
The resolution for this technique in any one direction is 14.6 nm, which renders individual transistor components fairly blurry. The resolution can be improved, but it may be a race to stay close to the smallest features in the chip industry: per Moore’s Law, chip dimensions have shrunk since 22 nm and continue to do so.
This is not the first time that researchers have attempted to use X-rays to image the interior of integrated circuits, says team member Gabriel Aeppli. But “the resolution is better than it’s been in the past. The scale is also larger,” he says. “It’s a huge chunk of the chip compared to what you could do with any other technique.”
In this proof-of-concept, the team was using an experimental set-up that was optimized for cylindrical samples, and they had to carve a 10-micrometer-wide pillar out of the chip in order for the imaging process to work. But in the future, the group aims to move toward a different geometry that will allow them to image circuitry without having to damage the chip.
The imaging itself was no small feat. The sample has to remain stable, and interferometers must be used to continuously measure its position. It took about 24 hours to perform the X-ray measurements and the data processing took about as long, says team leader Mirko Holler. But additional computers should easily speed the processing, he says. And improvements to X-ray sources as well as other parts of the experimental apparatus could improve the imaging speed by a factor of 1000.
“I don’t see any other probe that could image a full processor without slicing it up,” says Jerry Hastings, a professor at the SLAC National Accelerator Laboratory in Menlo Park, Calif. “You need something that simultaneously gives you the resolution and the penetrating power. And that’s where X-rays are unique.”
“What they’ve done is a proof of concept and it’s quite impressive at that level,” says Dick James, an emeritus fellow at TechInsights, which strips chips down to the transistor level to examine how the devices are built and wired together.
James says there are practical limitations to the amount of circuitry that can be analyzed using those traditional tear-down techniques, which, depending on the packaging, can start with a bath in boiling sulphuric acid. But he notes a lot can be gleaned by piecing together small electron microscope pictures of the chip: “You can get most of the cell library by looking at smaller areas.”
And even with a perfect image of an entire chip, there will be some things you can’t know about how the chip was made. “So many steps are sacrificial,” James says. “Unless there’s some sort of residue or fingerprint left by a process, you can’t actually see what’s been done. You just have to infer it from the structures that are left behind.”
Although X-ray ptychography does promise bigger, high-resolution views, it faces several obstacles, James says. For one, chip feature sizes at the cutting edge could prove a challenge for its resolution. “The industry is getting ahead of this technique already,” he says. Another impediment is the need for a synchrotron source. These aren’t likely to find their way into chipmaking fabs, although there are a number of such facilities around the world and it is possible to rent time on them.
Because of these limitations, James says, the best application for this imaging technique could be on chips that are made with older manufacturing processes, and thus have larger features. This is the case for a number of chips used in military and space applications. “If you can look at the whole chip, then you can compare the chip with the original design,” James says, and “do a direct comparison [to] see if there are any obvious faults or any extra circuitry that’s been put in.”
Accelerometers introduced smartphone users to many handy new features—recording the distance you walk and automatically rotating the view when you turn the phone sideways, to name just two you probably used today. Though these sensors are generally quite helpful and accurate, computer scientists from the University of Michigan have just found a way to scam them.
A research team figured out that they could fool accelerometers using sound waves—in particular, a single tone played at an accelerometer’s resonant frequency. With it, they can cause two signal processing components within the phone to generate a false report of the accelerometer’s behavior. The group is led by Kevin Fu of the University of Michigan and includes collaborators from the University of South Carolina.
Patrick McDaniel, a security researcher at Pennsylvania State University, says the security risk of the particular scheme devised by Fu’s group is low. But the broader problem is a big one in the industry: Devices and software programs tend to blindly trust any data gathered from built-in sensors.
On Tuesday, the Industrial Control Systems Cyber Emergency Response Team of the U.S. Department of Homeland Security issued a public alert about the findings.
The Michigan group’s work underscores the point that any device that relies on data from a sensor to make a critical decision can potentially be led astray by that sensor. Besides smartphones, accelerometers are also used to activate airbags in motor vehicles, and to measure the rate and depth of chest compressions during CPR.
“If you're trusting your sensor inputs and you have no way to validate those inputs, you're going to have problems,” McDaniel says.
The University of Michigan team tested 20 models of capacitive microelectromechanical systems (MEMS) accelerometers from five manufacturers: Bosch, STMicroelectronics, InvenSense, Analog Devices, and Murata Manufacturing. (The model numbers are listed here.)
They found that 75 percent of the accelerometers could be fooled by an attack that allowed them to slightly alter the sensors’ signals for a brief moment, and 65 percent were vulnerable to a more severe attack that allowed the team to control their signals indefinitely.
Fu says he hopes the work—which he calls a proof of concept—will start a conversation in the industry: “We need to question, why do we trust our sensors?” he says.
Their trick was possible because sound waves impart a physical force to any object they encounter. The University of Michigan team essentially used the vibrations produced by sound waves to alter accelerometers’ records of what was happening to them.
An accelerometer contains a physical mass suspended on springs. When a device moves, the mass moves too, and that motion changes the capacitance—the ability to store charge—between the mass and the fixed electrodes around it, a change the sensor reads out as movement. By producing vibrations through sound waves that moved that mass in a particular way, the group launched a series of attacks on the unsuspecting sensors.
The group first had to identify the resonance, or preferred frequency, of each accelerometer. At the resonant frequency, each sound wave reinforces the effect of the previous one on the mass—leading to a much larger signal than you’d get at other frequencies. To find each accelerometer’s resonance, the team played tones at increasing frequencies from 2 kilohertz to 30 kilohertz, until they found a frequency where the accelerometer produced an outsized reaction.
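That hunt for resonance can be illustrated with the textbook model of a MEMS sensor: a driven, damped mass on a spring. Every parameter value below is invented for illustration (no real accelerometer is being modeled); the point is the outsized steady-state response when the drive tone hits the natural frequency:

```python
import numpy as np

# Textbook model of a MEMS proof mass: m*x'' + c*x' + k*x = F0*sin(w*t).
# All parameter values are illustrative, not taken from any real sensor;
# the natural frequency is deliberately placed at 10 kHz.
F0 = 1.0                              # drive amplitude (arbitrary units)
M = 1e-9                              # proof-mass (kg), illustrative
F_NAT = 10_000.0                      # natural frequency f0 = sqrt(k/m)/(2*pi)
K = M * (2 * np.pi * F_NAT) ** 2      # spring constant consistent with f0
C = 2e-6                              # light damping coefficient

def steady_state_amplitude(f_drive):
    """Closed-form steady-state displacement amplitude of the driven mass."""
    w = 2 * np.pi * f_drive
    return F0 / np.sqrt((K - M * w ** 2) ** 2 + (C * w) ** 2)

sweep = np.arange(2_000.0, 30_000.0, 250.0)   # the 2-30 kHz tone sweep
amps = np.array([steady_state_amplitude(f) for f in sweep])
f_res = sweep[np.argmax(amps)]
print(f"outsized response near {f_res:.0f} Hz")
```

With light damping, the response at resonance is tens of times larger than at the sweep's edges, which is why the outsized reaction is easy to spot.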
Next, the team subjected the sensors to two types of attacks using sound waves at the resonant frequency. The first, called output biasing, exploits a feature of the low pass filter, a signal processing component that filters out high frequency interference. This technique can be used to slightly alter readings produced by the accelerometer for several seconds.
The second, called output control, takes advantage of the phone’s amplifier, which typically handles the raw signal even before it reaches the low pass filter. This method can be used to take control of the accelerometer indefinitely and produce false signals.
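A toy signal-chain model suggests why such attacks can work. Here an attacker's 20 Hz on/off pattern amplitude-modulates a tone at an assumed resonant frequency; an asymmetrically saturating amplifier rectifies it toward a DC offset, and a low-pass filter then passes that offset—the attacker's envelope—as if it were genuine low-frequency acceleration. The sample rate, clipping levels, and filter are illustrative stand-ins, not the paper's exact hardware model:

```python
import numpy as np

# Toy model of the "output control" signal chain. All values below are
# illustrative assumptions, not measurements of any real device.
FS = 200_000                          # simulation sample rate (Hz)
F_RES = 10_000                        # assumed accelerometer resonance (Hz)
t = np.arange(0, 0.1, 1 / FS)

# The attacker's message: a 20 Hz on/off pattern modulating the resonant tone.
envelope = 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * 20 * t))
tone = envelope * np.sin(2 * np.pi * F_RES * t)

# An asymmetrically saturating amplifier rectifies the tone toward a DC offset.
clipped = np.clip(tone, -0.3, 1.0)

# A one-pole low-pass filter (~50 Hz cutoff) stands in for the sensor's filter:
# it removes the 10 kHz tone but passes the rectified, attacker-chosen envelope.
alpha = 1.0 / (1.0 + FS / (2 * np.pi * 50.0))
out = np.zeros_like(clipped)
for i in range(1, len(clipped)):
    out[i] = out[i - 1] + alpha * (clipped[i] - out[i - 1])

# The filtered "acceleration" now tracks the attacker's 20 Hz pattern.
on = out[envelope > 0.9].mean()       # mean output while the tone is on
off = out[envelope < 0.1].mean()      # mean output while the tone is off
print(f"tone on: {on:.3f}, tone off: {off:.3f}")
```

With symmetric clipping the rectified offset vanishes; it is the amplifier's asymmetry that turns a kilohertz tone into a low-frequency signal the filter happily passes.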
To show that it was possible to spoof accelerometers with these techniques, the group used each method to spell out “WALNUT” in a chart of the sensors’ acceleration over time.
Next, they wanted to use sound waves to hack actual devices, so they reached for a Samsung Galaxy S5, which comes with an MPU-6500 accelerometer from InvenSense. They loaded a music video with the accelerometer’s resonant frequency embedded in it, and remotely prompted the phone to play the video.
At the same time, they ran a game on the phone called Spy Toys that relies on the accelerometer to control a toy car. While the video played, the toy car accelerated or decelerated in accordance with the pulses of the signal they had embedded in the video.
In their final demo, they used an off-the-shelf speaker to play a tone that caused a Fitbit to log 2,100 steps in just 40 minutes, earning them 21 reward points on a health tracking site (they declined to cash in their points, citing ethical concerns).
Though these scams are certainly possible, they are not subtle—the attacker must be within close range of the device they wish to target, and has to know the model and resonance of the accelerometer inside.
In the group’s own example, an attacker would have to stand behind the owner and blare an audio track to take control of the Spy Toys car, or somehow prompt the owner’s phone to start playing the resonance frequency—perhaps by sending them to a website that automatically plays the track once they arrive.
“It falls into that kind of cool, but not something that would keep me awake at night, type of vulnerability,” McDaniel says.
Still, to prevent any issues, Fu suggests accelerometer designers choose a resonance in the ultrasound range, which is more difficult to generate with off-the-shelf speakers. And encasing devices in foam is a good way to stop sound waves from reaching a device’s accelerometer, though not always practical.
Through the University of Michigan, the research team is also attempting to sell software programs to manufacturers that it says can prevent such attacks in products that are already on the market.
Just in case, McDaniel thinks entrepreneurs and consumers should be wary of turning too many decisions and responsibilities over to any devices that rely on sensor data, until the industry figures out how to better validate that data.
“If you're using that sensor input for a security critical decision, well then that's something we really need to worry about,” McDaniel says. “If we can't be sure they're trustable, we need to limit the kind of security decisions we're making off of them.”