Monday, December 31, 2018

Physicists record “lifetime” of graphene qubits

Researchers from MIT and elsewhere have recorded, for the first time, the “temporal coherence” of a graphene qubit — meaning how long it can maintain a special state that allows it to represent two logical states simultaneously. The demonstration, which used a new kind of graphene-based qubit, represents a critical step forward for practical quantum computing, the researchers say.

Superconducting quantum bits (qubits, for short) are artificial atoms that use various methods to produce bits of quantum information, the fundamental component of quantum computers. Similar to traditional binary circuits in computers, qubits can maintain one of two states corresponding to the classic binary bits, a 0 or 1. But these qubits can also be a superposition of both states simultaneously, which could allow quantum computers to solve complex problems that are practically impossible for traditional computers.

The amount of time that these qubits stay in this superposition state is referred to as their “coherence time.” The longer the coherence time, the greater the ability for the qubit to compute complex problems.
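A common simplification models the loss of superposition as an exponential decay with the coherence time as the characteristic timescale. The sketch below is illustrative only, using the 55-nanosecond figure reported later in this article; the exponential model is a textbook approximation, not the paper's exact analysis.

```python
import math

# Illustrative model: remaining coherence decays roughly as exp(-t/T2).
# T2 = 55 ns is the value reported in the article; the decay law here
# is a common simplifying assumption, not the team's measured curve.
T2_NS = 55.0

def coherence_fraction(t_ns: float, t2_ns: float = T2_NS) -> float:
    """Fraction of coherence remaining after t_ns nanoseconds."""
    return math.exp(-t_ns / t2_ns)

print(round(coherence_fraction(55.0), 3))   # -> 0.368 (1/e after one T2)
print(round(coherence_fraction(200.0), 3))  # -> 0.026 (almost none left)
```

Under this model, a computation must finish well within one coherence time, which is why longer coherence directly enables more complex computations.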

Recently, researchers have been incorporating graphene-based materials into superconducting quantum computing devices, which promise faster, more efficient computing, among other perks. Until now, however, there has been no recorded coherence for these advanced qubits, so there was no way of knowing whether they are feasible for practical quantum computing.

In a paper published today in Nature Nanotechnology, the researchers demonstrate, for the first time, a coherent qubit made from graphene and exotic materials. These materials enable the qubit to change states through voltage, much like transistors in today’s traditional computer chips — and unlike most other types of superconducting qubits. Moreover, the researchers put a number to that coherence, clocking it at 55 nanoseconds, before the qubit returns to its ground state.

The work combined expertise from co-authors William D. Oliver, a physics professor of the practice and Lincoln Laboratory Fellow whose work focuses on quantum computing systems, and Pablo Jarillo-Herrero, the Cecil and Ida Green Professor of Physics at MIT who researches innovations in graphene.

“Our motivation is to use the unique properties of graphene to improve the performance of superconducting qubits,” says first author Joel I-Jan Wang, a postdoc in Oliver’s group in the Research Laboratory of Electronics (RLE) at MIT. “In this work, we show for the first time that a superconducting qubit made from graphene is temporally quantum coherent, a key requisite for building more sophisticated quantum circuits. Ours is the first device to show a measurable coherence time — a primary metric of a qubit — that’s long enough for humans to control.”

There are 14 other co-authors, including Daniel Rodan-Legrain, a graduate student in Jarillo-Herrero’s group who contributed equally to the work with Wang; MIT researchers from RLE, the Department of Physics, the Department of Electrical Engineering and Computer Science, and Lincoln Laboratory; and researchers from the Laboratory of Irradiated Solids at the École Polytechnique and the Advanced Materials Laboratory of the National Institute for Materials Science.

A pristine graphene sandwich

Superconducting qubits rely on a structure known as a “Josephson junction,” where an insulator (usually an oxide) is sandwiched between two superconducting materials (usually aluminum). In traditional tunable qubit designs, a current loop creates a small magnetic field that causes electrons to hop back and forth between the superconducting materials, causing the qubit to switch states.

But this flowing current consumes a lot of energy and causes other issues. Recently, a few research groups have replaced the insulator with graphene, an atom-thick layer of carbon that’s inexpensive to mass produce and has unique properties that might enable faster, more efficient computation.

To fabricate their qubit, the researchers turned to a class of materials, called van der Waals materials — atomically thin materials that can be stacked like Legos on top of one another, with little to no resistance or damage. These materials can be stacked in specific ways to create various electronic systems. Despite their near-flawless surface quality, only a few research groups have ever applied van der Waals materials to quantum circuits, and none have previously been shown to exhibit temporal coherence.

For their Josephson junction, the researchers sandwiched a sheet of graphene between two layers of a van der Waals insulator called hexagonal boron nitride (hBN). Importantly, graphene takes on the superconductivity of the superconducting materials it touches. The selected van der Waals materials can be made to usher electrons around using voltage, instead of the traditional current-based magnetic field. Therefore, so can the graphene — and so can the entire qubit.

When voltage is applied to the qubit, electrons bounce back and forth between two superconducting leads connected by graphene, changing the qubit from ground (0) to excited or superposition state (1). The bottom hBN layer serves as a substrate to host the graphene. The top hBN layer encapsulates the graphene, protecting it from any contamination. Because the materials are so pristine, the traveling electrons never interact with defects. This represents the ideal “ballistic transport” for qubits, where a majority of electrons move from one superconducting lead to another without scattering with impurities, making a quick, precise change of states.

How voltage helps

The work can help tackle the qubit “scaling problem,” Wang says. Currently, only about 1,000 qubits can fit on a single chip. Having qubits controlled by voltage will be especially important as millions of qubits start being crammed on a single chip. “Without voltage control, you’ll also need thousands or millions of current loops too, and that takes up a lot of space and leads to energy dissipation,” he says.

Additionally, voltage control means greater efficiency and a more localized, precise targeting of individual qubits on a chip, without “cross talk.” That happens when a little bit of the magnetic field created by the current interferes with a qubit it’s not targeting, causing computation problems.

For now, the researchers’ qubit has a brief lifetime. For reference, conventional superconducting qubits that hold promise for practical application have documented coherence times of a few tens of microseconds, a few hundred times greater than the researchers’ qubit.

But the researchers are already addressing several issues that cause this short lifetime, most of which require structural modifications. They’re also using their new coherence-probing method to further investigate how electrons move ballistically around the qubits, with aims of extending the coherence of qubits in general.



from MIT News http://bit.ly/2R3gzhS

Thursday, December 27, 2018

Exploring New England's coastal ecosystems in the dead of winter

In early January 2018, a nor’easter pummeled the East Coast. A record-breaking high tide rendered many streets in Boston impassable, and seawater rushed down Seaport Boulevard in the city’s Seaport District. A deluge of water poured down the steps leading to the Aquarium subway station, forcing it to close.

Less than a week later, in a dry classroom on MIT’s campus, a group of students discussed how coastal cities like Boston can cope with worsening floods due to rising sea levels.

“We live in a coastal city, so obviously we are being significantly impacted by sea level rise,” says Valerie Muldoon, a third-year mechanical engineering student. “We talked about the bad nor’easter earlier in January and brainstormed ways to mitigate the flooding.”

Muldoon and her fellow students were enrolled in 2.981 (New England Coastal Ecology), a class that meets during MIT’s Independent Activities Period. The course is offered through the MIT Sea Grant College Program, which is affiliated with MIT’s Department of Mechanical Engineering.

MIT Sea Grant instructors Juliet Simpson, a research engineer, and Carolina Bastidas, a research scientist, use the four-week class to introduce students to the biological makeup of coastal ecosystems, to the crucial role these areas play in protecting the environment, and to the effects human interaction and climate change have had on them.

“We want to give a taste of coastal communities in New England to the students at MIT — especially those who come from abroad or other parts of the U.S.,” says Bastidas, a marine biologist who focuses her research primarily on coral and oyster reefs.

Muldoon, who is a double minor in energy studies and environment and sustainability, says she was “so excited to see a Course 2 class on coastal ecology.”

“I’m passionate about protecting the environment, so the topic really resonated with me,” she says.

The course begins with an introduction to the different types of coastal ecosystems found in the New England area, such as rocky intertidal regions, salt marshes, eelgrass meadows, and kelp forests. In addition to providing an overview of the makeup of each environment, the course instructors also discuss the physiology of the countless organisms that live in them.

Halfway through the course, students learn about how human impacts like climate change, eutrophication, and increased development have affected coastal habitats.

“We focus on climate change as it impacts coastal communities like rocky shores and salt marshes,” says Simpson, a coastal ecologist who studies how plants and algae respond to human interference. “There are a lot of interesting implications for sea level rise for intertidal organisms.”

Sea level rise, for example, has forced organisms that live in salt marshes to migrate upland. Changes in both water and air temperature also have a drastic effect on the inhabitants of coastal regions.

“As temperatures rise, all of those organisms are going to need to adapt or the communities are going to change, possibly dramatically,” explains Simpson.

Protecting coastal ecosystems has far-reaching implications that go beyond the animals and plants that live there, because these areas offer a natural defense against climate change. Many coastal ecosystems are natural hot spots for carbon capture and sequestration. Salt marshes and seagrass meadows capture vast amounts of carbon that can be stored in peat for several thousand years.

“I was shocked at how much carbon the plants in these ecosystems can hold through sequestration,” recalls Muldoon.

Protecting these areas is essential to continue this natural sequestration of carbon and prevent carbon already stored there from leaking out. Coastal ecosystems are also instrumental in protecting coastal cities, like Boston, from flooding due to sea level rise.

“We talk about the ecology of coastal cities and how flooding from storms and sea level rise impacts human communities,” adds Simpson.

The class culminates in a field trip to Odiorne Point State Park in New Hampshire, where students get to interact with the communities they’ve learned about. Using fundamental techniques in ecology, students collect data about the species living in the salt marsh and rocky shore nearby. 

Bastidas and Simpson will expand the class’ scope beyond New England in a new course — 2.982 (Ecology and Sustainability of Coastal Ecosystems) — which will be offered in fall 2019.

While the effects of climate change on coastal ecosystems often paint a dire picture, the instructors want students to focus on the positive.

“Rather than have students focus on the gloom and doom aspect, we want to encourage them to come up with novel solutions for dealing with climate change and carbon emissions,” adds Bastidas.

Muldoon sees a special role for mechanical engineers like herself in developing such solutions.

“I think it’s so important for mechanical engineering students to take classes like this one because we are definitely going to be needed to help mitigate the problems that come with sea level rise,” she says.



from MIT News http://bit.ly/2Cz8ad6

Stamp of approval

The German postal service has issued a stamp to honor the Illustris simulation project, a supercomputer simulation of the universe that has provided astrophysicists with new insights into the formation and evolution of galaxies.

The Illustris project is headed by an international collaboration of astrophysicists from MIT, Harvard University, the Heidelberg Institute for Theoretical Studies, the Max-Planck Institutes for Astrophysics and for Astronomy, and the Center for Computational Astrophysics. 

Mark Vogelsberger, an associate professor of physics at MIT and the MIT Kavli Institute for Astrophysics and Space Research, leads MIT’s involvement in the Illustris project. The MIT team has been heavily involved in the development of the simulation, its execution, and the analysis of the resulting simulation data.

Since its publication in 2014, the simulation data has been studied by more than 2,000 researchers resulting in nearly 200 papers. This year, the team published first results of simulations from what they call "Illustris — The Next Generation" or IllustrisTNG, which provides an even more accurate picture of the universe.

“The German government was looking for astrophysical research that had a major impact on science in recent years,” Vogelsberger says.

Olaf Scholz, Germany’s federal minister of finance, contacted the Illustris collaboration team for permission to use the visual data from simulations for a postage stamp, Vogelsberger says. Almost a year after the initial inquiry, the Illustris collaboration was notified that a German stamp would be issued to honor their work.

The stamp was officially unveiled by Deutsche Post on Dec. 18.

“As far as I know, this is the first astrophysical simulation to ever get its own stamp, and maybe even the first stamp to honor general simulation work,” Vogelsberger says.



from MIT News http://bit.ly/2RiqmA0

MIT research honored with Physics World “Breakthrough of the Year” awards

Three scientific and engineering advances led by researchers in the MIT community have been named to Physics World’s 10 Breakthroughs of 2018. One MIT-led discovery received the magazine’s top honor: 2018 Breakthrough of the Year.

Pablo Jarillo-Herrero, an associate professor of physics at MIT, was awarded the Breakthrough of the Year award for his leadership role in the discovery that graphene sheets can act as an insulator or a superconductor when rotated at a “magic angle.” Steven Barrett, associate professor of aeronautics and astronautics at MIT, was honored among the top 10 for his team’s success in building and flying the first-ever plane with no moving propulsion parts. Alan Rogers, a scientist at MIT’s Haystack Observatory, was also selected for the top 10 for his contributions to observations of the earliest evidence of hydrogen gas, tracing signals to just 180 million years after the Big Bang.

Physics World editors selected these discoveries based on criteria including the significance of the findings toward expanding knowledge and understanding, the importance of the work for scientific progress, and development of real-world applications.

Birth of “twistronics”

Jarillo-Herrero and his collaborators found that graphene, a two-dimensional layer of carbon atoms with a honeycomb-like lattice, can behave at two electrical extremes when rotated at a certain angle: as an insulator, in which electrons are completely blocked from flowing; and as a superconductor, in which electrical current can stream through without resistance. This discovery led to the development of “twistronics,” which Physics World reporter Hamish Johnston called “a new and very promising technique for adjusting the electronic properties of graphene.”

Jarillo-Herrero and his colleagues published their findings in two papers in Nature last March. Yuan Cao, a graduate student in electrical engineering and computer science who works in Jarillo-Herrero’s lab, was featured for this same research in the 2018 Nature’s 10, profiles of 10 people who mattered in 2018 based on their contributions to science.

Flying on ions

MIT’s Electric Aircraft Initiative, which Steven Barrett coordinates, is focusing on technologies that, in the long term, could result in planes with near-silent propulsion and low or no emissions. The team’s ion aircraft, which resembles a lightweight glider, features an array of wires strung like horizontal fencing along and beneath the wing’s leading edge.

The wires act as positively charged electrodes, while similarly arranged thicker wires, running along the back end of the plane’s wing, serve as negative electrodes. Batteries supply 40,000 volts to positively charge the wires via a power converter. The wires attract and strip away negatively charged electrons from the surrounding air molecules, like a giant magnet attracting iron filings. The air molecules that are left behind are newly ionized, and are, in turn, attracted to the negatively charged electrodes at the back of the aircraft. As the newly formed cloud of ions flows toward the negatively charged wires, each ion collides millions of times with other air molecules, creating a thrust that propels the aircraft forward.
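A simple one-dimensional relation often used for this kind of electroaerodynamic ("ionic wind") propulsion estimates thrust as T = I·d/μ, where I is the ion current, d the gap between the electrodes, and μ the ion mobility in air. The sketch below is a hedged illustration of that relation; the current, gap, and mobility values are generic textbook-style assumptions, not the MIT team's measured parameters.

```python
# Back-of-the-envelope electroaerodynamic thrust estimate: T = I * d / mu.
# All numbers are illustrative assumptions, not data from the Nature paper.
MU_AIR = 2.0e-4  # typical ion mobility in air, m^2/(V*s)

def ead_thrust(current_a: float, gap_m: float, mu: float = MU_AIR) -> float:
    """Idealized 1-D electroaerodynamic thrust in newtons."""
    return current_a * gap_m / mu

# A few milliamps of ion current across a ~0.2 m electrode gap:
print(ead_thrust(5e-3, 0.2))  # -> 5.0 (newtons)
```

The relation shows why a large electrode gap and low ion mobility favor thrust, even though the absolute forces remain modest compared with propellers or jets.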

Barrett and his colleagues published their groundbreaking results in Nature in November.

Very old hydrogen

Alan Rogers, a scientist at MIT Haystack Observatory in Westford, Massachusetts, was honored for the observation of hydrogen gas from the very early universe with the Experiment to Detect the Global Epoch of reionization Signature (EDGES) collaboration. Using a small radio antenna located in remote western Australia, Rogers and colleagues were able for the first time to detect the earliest hydrogen signals yet observed: just 180 million years after the Big Bang. The discovery provides insight into the formation of the first stars via their interaction with the surrounding hydrogen gas and its absorption of cosmic background radiation, which was explained in a National Science Foundation (NSF) video.
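The age of the signal follows from how much cosmic expansion has stretched the 21-centimeter hydrogen line, which is emitted at about 1420.4 MHz. The sketch below converts an observed frequency into a redshift; the ~78 MHz center of the EDGES absorption feature is taken from the published result as context, not from this article.

```python
# The 21-cm hydrogen line has a rest frequency of ~1420.4 MHz.
# Cosmic expansion stretches it, so the observed frequency gives the
# redshift z via (1 + z) = f_rest / f_observed.
# The 78 MHz value below is the published EDGES absorption center,
# quoted here as an assumption for illustration.
REST_MHZ = 1420.4

def redshift(observed_mhz: float) -> float:
    """Redshift implied by an observed 21-cm line frequency (MHz)."""
    return REST_MHZ / observed_mhz - 1.0

print(round(redshift(78.0), 1))  # -> 17.2
```

A redshift near 17 corresponds, in standard cosmology, to roughly 180 million years after the Big Bang, the epoch the article describes.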

“There was a great technical challenge to making this detection, as sources of noise can be a thousand times brighter than the signal — it’s like being in the middle of a hurricane and trying to hear the flap of a hummingbird’s wing,” says Peter Kurczynski, the NSF program officer who supported this study. “These researchers with a small radio antenna in the desert have seen farther than the most powerful space telescopes, opening a new window on the early universe.”

The EDGES collaboration is led by Judd Bowman of Arizona State University, and the research was published in Nature in February.



from MIT News http://bit.ly/2SmcZfl

Wednesday, December 26, 2018

Julia language co-creators win James H. Wilkinson Prize for Numerical Software

Three co-creators of the MIT-incubated Julia programming language are the recipients of the 2019 James H. Wilkinson Prize for Numerical Software.

With origins in the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Department of Mathematics, Julia is a programming language created in 2009 by Jeff Bezanson PhD ’15, former MIT Julia Lab researchers Stefan Karpinski and Viral B. Shah, and professor of mathematics Alan Edelman.

The prize will be awarded to Bezanson, Karpinski, and Shah “for the creation of Julia, an innovative environment for the creation of high-performance tools that enable the analysis and solution of computational science problems.”

Released publicly in 2012, Julia has over 3 million downloads and is used in over 1,500 universities for scientific and numerical computing. “I am proud of the intellectual contributions of the Julia Lab, which applies the latest in computer science to science and engineering problems, while engaging interdisciplinary collaborations all over campus and beyond,” said Edelman. “Julia is increasingly the language of instruction for scientific computing at MIT.”

The prize returns to MIT for the first time since 1999, when Matteo Frigo PhD ’99 and now-professor of mathematics Steven Johnson ’95, PhD ’01 received the award for the Fastest Fourier Transform in the West. The Wilkinson Prize is awarded every four years to the authors of an outstanding piece of numerical software, and is intended to recognize innovative software in scientific computing and to encourage researchers in the earlier stages of their career.

“Julia allows researchers to write high-level code in an intuitive syntax and produce code with the speed of production programming languages,” according to a statement from the selection committee. “Julia has been widely adopted by the scientific computing community for application areas that include astronomy, economics, deep learning, energy optimization, and medicine. In particular, the Federal Aviation Administration has chosen Julia as the language for the next-generation airborne collision avoidance system.”

The award will be presented at the 2019 Society for Industrial and Applied Mathematics (SIAM) Computational Science and Engineering conference (CSE19), held Feb. 25 to March 1, 2019, in Spokane, Washington.



from MIT News http://bit.ly/2ShCzSB

MIT AgeLab hosts Governor Baker, winners of In Good Company Challenge

The MIT AgeLab, in collaboration with General Electric, Benchmark Senior Living, and the Governor’s Council to Address Aging in Massachusetts, recognized four entrepreneurial ventures as winners of the In Good Company Optimal Aging Challenge in a reception held at the Samberg Conference Center at MIT.

The In Good Company Optimal Aging Challenge was designed to support the development of breakthrough technologies, community resources, and solutions to address the issues of social isolation and loneliness among older adults in the Commonwealth of Massachusetts. Social isolation has been identified as a significant health issue, with an observed effect on mortality equivalent to that of obesity and smoking 15 cigarettes per day. Nearly one in five older adults in the U.S. is socially isolated, according to research from AARP.

“The best solutions happen when we are open to learning from others, collaborating and welcoming new ideas, and adapting ideas and services that have been around for a while to improve quality of life,” said Massachusetts Governor Charlie Baker. “People are going to live a lot longer, and not only are they going to live a lot longer, they’re going to have a great deal to offer.”

“Social isolation should never be accepted as an inevitable consequence of growing old,” said Massachusetts Health and Human Services Secretary Marylou Sudders. “This challenge is the kind of forward-thinking, cross-platform, collaborative approach that the governor expects of all of us.”

The challenge sought ideas-in-development and ready-to-market products within the domains of caregiving, transportation, housing, and volunteering. With participation from MIT, private industry, and the governor’s office, the challenge judges synthesized perspectives from the academic, business, and policymaking spheres.

The challenge winners are:

  • Share Kitchens, which strives to end loneliness where it often creeps in: during the simple, daily occurrence of mealtime. The nonprofit helps older adults transform their domestic kitchens into culinary spaces that foster communal cooking and intergenerational preparation and enjoyment of food. The organization’s vision is to create a network of Share Kitchens across Massachusetts that ensures food stability and provides employment for seniors.
     
  • FriendshipWorks, whose mission is to reduce social isolation, enhance quality of life, and preserve the dignity of older adults in the Greater Boston area. Founded in 1984, the organization is based on the belief that no one should have to be alone, regardless of age or frailty. FriendshipWorks strives to achieve this vision by recruiting and training volunteers of all ages, faiths, and backgrounds to provide friendship, advocacy, education, assistance, and emotional support to the elderly.
     
  • HelpAroundTown, a personalized, localized community marketplace that connects neighbors organically by facilitating transactions between people who need help and neighbors looking for flexible work. HelpAroundTown believes that some people have what other people need, and that by connecting them, we can strengthen community ties and create intergenerational bonds.
     
  • Care.coach, whose avatars combine human intelligence and compassion with software automation and clinical algorithms to provide 24/7 psychosocial support for older adults. Researchers at universities and clinicians in diverse care settings have validated Care.coach’s innovative approach in caregiving and its ability to reduce loneliness, improve perceived social support, and drive outcomes — including reducing the need for nursing visits to the home, preventing falls, and mitigating delirium among hospitalized older adults.

In addition to the recognition they received, the winners will split a cash prize of $20,000 to further expand their businesses.

In an echo of the challenge’s goals, Boston was recently named by Inc.com as the 15th-best place to start a business in the United States, with the MIT AgeLab’s presence and rapid growth in the longevity economy cited as reasons for the city’s ranking.

“Although it is often posed as a challenge, longevity is a great opportunity,” said AgeLab Director Joseph F. Coughlin. “What we call old age today is actually one-third of adult life, and that one-third of life is uncharted territory for business, policy, and all of us as individuals.”



from MIT News http://bit.ly/2LAXAoR

Sound and technology unlock innovation at MIT

Sound is a powerfully evocative medium, capable of conjuring authentic emotions and unlocking new experiences. This fall, several cross-disciplinary projects at MIT probed the technological and aesthetic limits of sound, resulting in new innovations and perspectives: from motion-sensing headphones that enable joggers to maintain a steady pace to virtual reality technology that enables blind people to experience comic book action, as well as projects that challenge our very relationship with technology.

Sound as political participation

“Sound is by nature a democratic medium,” says Ian Condry, an anthropologist and professor in MIT’s Department of Global Studies and Languages, adding that “sound lets us listen around the margins and to follow multiple voices coming from multiple directions.”

That concept informed this year’s Hacking Arts Hackathon Signature Hack, which Condry helped coordinate. The multi-channel audio installation sampled and abstracted audio excerpts from recent presidential inaugural addresses, then blended them with breathing sounds that the team recorded from a live audience. Building on this soundtrack, two team members acted as event DJs, instructing the audience to hum and breathe in unison, while their phones — controlled by an app created for the hackathon — played additional breathing and humming sounds.

“We wanted to play with multiple streams of speech and audio,” says Adam Haar Horowitz, a second-year master’s student at the MIT Media Lab, and member of the winning team. “Not just the words, which can be divisive, but the texture and pauses between the words.”

A guy walks into a library…

What happens when artificial intelligence decides what’s funny? Sound and democracy played prominently in "The Laughing Room," an installation conceived by a team including author, illustrator, and MIT PhD candidate Jonny Sun and Stephanie Frampton, MIT associate professor of literature, as part of her project called ARTificial Intelligence, a collaboration between MIT Libraries and the Cambridge Public Library.

Funded in part by a Fay Chandler Faculty Creativity Seed Grant from the MIT Center for Art, Science and Technology (CAST), "The Laughing Room" invited public library visitors into a set that evoked a television sitcom living room, where they told stories or jokes that were analyzed by the room’s AI. If the algorithm determined a story was funny, it played a recorded laugh track. "The Laughing Room" — as well as the AI’s algorithmic calculations — were then broadcast on screens in "The Control Room," a companion installation at MIT’s Hayden Library.

While fun for the public, the project also mined more serious issues. “There is a tension in society around technology,” says Sun, “between the things technology allows you to do, like having an algorithm tell you your joke is funny, and the price we pay for that technology, which is usually our privacy.”

Using sound to keep the pace

How can audio augmented reality enhance our quality of life? That challenge was explored by more than 70 students from multiple disciplines who competed in the Bose MIT Challenge in October. The competition, organized by Eran Egozy, professor of the practice in music technology and an MIT graduate who co-founded Harmonix, the company that developed iconic video games Guitar Hero and Rock Band, encourages students to invent real-life applications for Bose AR, a new audio augmented reality technology and platform.

This year’s winning entry adapted Bose’s motion-sensing AR headphones to enable runners to stay on pace as they train. When the runner accelerates, the music is heard behind them. When their pace slows, the music sounds as if it’s ahead of them.

“I’d joined hackathons at my home university,” said Dominic Co, a one-year exchange student in architecture from the University of Hong Kong and member of the three-person winning team. “But there’s such a strong culture of making things here at MIT. And so many opportunities to learn from other people.”

Creating a fuller picture with sound

Sound — and the technology that delivers it — has the capacity to enhance everyone’s quality of life, especially for the 8.4 million Americans without sight. That was the target audience of Project Daredevil, which won the MIT Creative Arts Competition last April.

Daniel Levine, a master’s candidate at the MIT Media Lab, teamed with Matthew Shifrin, a sophomore at the New England Conservatory of Music, to create a virtual-reality system for the blind. The system’s wearable vestibular-stimulating helmet enables the sightless to experience sensations like flying, falling, and acceleration as they listen to an accompanying soundtrack.

Shifrin approached Levine two years ago for help in developing an immersive 3-D experience around the Daredevil comic books — a series whose superhero, like Shifrin, is blind. As a child, Shifrin’s father read Daredevil to him aloud, carefully describing the action in every panel. Project Daredevil has advanced that childhood experience using technology.

“Because of Dan and his engineering expertise, this project has expanded far beyond our initial plan,” says Shifrin. “It’s not just a thing for blind people. Anyone who is into virtual reality and gaming can wear the device.”

A beautiful marriage of art and technology

Another cross-disciplinary partnership in sound and technology that resulted in elegant outcomes this fall is the ongoing partnership between CAST Visiting Artist Jacob Collier and MIT PhD candidate Ben Bloomberg.

Bloomberg, who completed his undergraduate and master’s studies at MIT, studied music and performance design with Tod Machover, the Muriel R. Cooper Professor of Music and Media and director of the Media Lab’s Opera of the Future group. Bloomberg discovered Collier’s music videos online about four years ago; he then wrote the artist to ask whether he needed any help in adapting his video performances to the stage. Fortunately, the answer was yes.

Working closely with Collier, Bloomberg developed a computerized audio/visual performance platform that enables the charismatic composer and performer to move seamlessly from instrument to instrument on stage and sing multiple parts simultaneously. The duo continues to develop and perfect the technology in performance. “It’s like a technological prosthesis,” says Bloomberg, who has worked with dozens of artists, including Bjork and Ariana Grande.

While technology has opened the door to richer sound explorations, Bloomberg firmly places it in an artistic realm. “None of this would make any sense were it not for Jacob’s amazing talent. He pushes me to develop new technologies, or to find new ways to apply existing technology. The goal here isn’t to integrate technology just because we can, but to support the music and further its meaning.”

Explorations in sound continue into 2019 with the innovative annual performance series MIT Sounding. Highlights of the 2018-2019 season include a collaboration with the Boston Modern Orchestra Project in honor of MIT Institute Professor John Harbison’s 80th birthday; the American premiere of the Spider’s Canvas, a virtual 3-D reconstruction of a spider’s web with each strand tuned to a different note; and residencies by two divergent musicians: the Haitian singer and rapper BIC and the innovative American pianist Joel Fan performing works by MIT composers.



de MIT News http://bit.ly/2Q4gOEe

Leg nerves activated by light offer new path to restoring mobility

For the first time, MIT researchers have shown that nerves engineered to express light-sensitive proteins can produce limb movements that are adjustable in real time, using cues generated by the motion of the limb itself. The technique leads to movement that is smoother and less fatiguing than that produced by the similar electrical systems sometimes used to stimulate nerves in spinal cord injury patients and others.

While this method was tested on animals, with further research and future trials in humans this optogenetic technique could be used someday to restore movement in patients with paralysis, or to treat unwanted movements such as muscle tremor in Parkinson’s patients, said Shriya Srinivasan, a PhD student in medical engineering and medical physics at the MIT Media Lab and the Harvard-MIT Division of Health Sciences and Technology.

The first applications of the technology might be to restore motion to paralyzed limbs or to power prosthetics, but an optogenetic system has the potential to restore limb sensation, turn off unwanted pain signals or treat spastic or rigid muscle movements in neurological diseases such as amyotrophic lateral sclerosis or ALS, Srinivasan and her colleagues suggest.

The MIT team is one of very few research groups using optogenetics to control nerves outside the brain, Srinivasan said. “Most people are using optogenetics as sort of a tool to learn about neural circuits, but very few are looking at it as a clinically translatable therapeutic tool as we are.”

“Artificial electrical stimulation of muscle often results in fatigue and poor controllability. In this study, we showed a mitigation of these common problems with optogenetic muscle control,” said Hugh Herr, who led the research team and heads the Media Lab’s Biomechatronics group. “This has great promise for the development of solutions for patients suffering from debilitating conditions like muscle paralysis.”

The paper was published in the Dec. 13 issue of Nature Communications. The team included MIT researchers Benjamin E. Maimon, Maurizio Diaz, and Hyungeun Song.

Light versus electricity

Electrical stimulation of nerves is used clinically to treat breathing, bowel, bladder, and sexual dysfunction in spinal cord injury patients, as well as to improve muscle conditioning in people with muscular degenerative diseases. Electrical stimulation can also control paralyzed limbs and prosthetics. In all cases, electrical pulses delivered to nerve fibers called axons trigger movement in muscles activated by the fibers.

This type of electrical stimulation quickly fatigues muscles, can be painful, and is hard to target precisely, however, leading scientists like Srinivasan and Maimon to look for alternative methods of nerve stimulation.

Optogenetic stimulation relies on nerves that have been genetically engineered to express light-sensitive algae proteins called opsins. These proteins control electrical signals such as nerve impulses — essentially, turning them on and off — when they are exposed to certain wavelengths of light.

Using mice and rats engineered to express these opsins in two key nerves of the leg, the researchers were able to control the up and down movement of the rodents’ ankle joint by switching on an LED that was either attached over the skin or implanted within the leg.

This is the first time that a “closed-loop” optogenetic system has been used to power a limb, the researchers said. Closed-loop systems change their stimulation in response to signals from the nerves they are activating, as opposed to “open-loop” systems that don’t respond to feedback from the body.

In the case of the rodents, different cues including the angle of the ankle joint and changes in the length of the muscle fibers were the feedback used to control the ankle’s motion. It’s a system, said Srinivasan, “that in real time observes and minimizes the error between what we want to happen and what’s really happening.”
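The closed-loop principle, continuously comparing where the joint should be with where it actually is and adjusting the stimulation to shrink the difference, can be sketched as a simple proportional controller. The sketch below is purely illustrative: the function names, gain, and clamping range are hypothetical, not the researchers' actual control law.

```python
def stimulation_command(target_angle, measured_angle, gain=0.1,
                        min_power=0.0, max_power=1.0):
    """One cycle of a proportional closed-loop controller: scale the
    error between the desired and observed ankle angle (in degrees)
    into an LED drive level, clamped to the emitter's valid range."""
    error = target_angle - measured_angle  # feedback from the limb itself
    power = gain * error
    return max(min_power, min(max_power, power))

# The ankle is under-flexed by 5 degrees, so the drive level rises.
print(stimulation_command(30.0, 25.0))  # -> 0.5
```

In the actual experiments, the feedback cues were the ankle angle and the changing length of the muscle fibers, read in real time from the moving limb.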

Stroll versus sprint

Optogenetic stimulation also led to less fatigue during cyclic motion than electrical stimulation, in a way that surprised the research team. In electrical systems, large-diameter axons are activated first, along with their large and oxygen-hungry muscles, before moving on to smaller axons and muscles. Optogenetic stimulation works in the opposite way, stimulating smaller axons before moving on to bigger fibers.

“When you’re walking slowly, you’re only activating those small fibers, but when you run a sprint, you’re activating the big fibers,” explained Srinivasan. “Electrical stimulation activates the big fibers first, so it’s like you’re walking but you’re using all the energy it requires to do a sprint. It’s quickly fatiguing because you’re using way more horsepower than you need.”

The scientists also noticed another curious pattern in the light-stimulated system that was unlike electrical systems. “When we kept doing these experiments, especially for extended periods of time, we saw this really interesting behavior,” Srinivasan said. “We’re used to seeing systems perform really well, and then fatigue over time. But here we saw it perform really well, and then it fatigued, but if we kept going for longer the system recovered and started performing well again.”

This unexpected rebound is related to how opsin activity cycles in the nerves, in a way that allows the full system to regenerate, the scientists concluded.

With less fatigue involved, the optogenetic system might be a good future fit for long-term motor operations such as robotic exoskeletons that allow some people with paralysis to walk, or as long-term rehabilitation tools for people with degenerative muscle diseases, Srinivasan suggested.

For the method to make the leap into humans, researchers need to experiment with the best ways to deliver light to nerves deep within the body, as well as find ways to express opsins in human nerves safely and efficiently.

“There are already some 300 trials using gene therapy, and a few trials that use opsins today, so it’s likely in the foreseeable future,” said Srinivasan.

The study was funded by the MIT Media Lab Consortium.



de MIT News http://bit.ly/2Q3hOZ2

School of Science honors postdocs and research staff with 2018 Infinite Kilometer Awards

The MIT School of Science has announced the 2018 winners of the Infinite Kilometer Award, established in 2012 to highlight and reward the extraordinary work of the school's postdocs and research staff.

Recipients of the award are exceptional contributors to their research programs. In many cases, they are also deeply committed to their local or global MIT community, and are frequently involved in mentoring and advising their junior colleagues, participating in the school’s educational programs, making contributions to the MIT Postdoctoral Association, or contributing to some other facet of the MIT community.

In addition to a monetary award, the honorees and their colleagues, friends, and family are invited to a celebratory reception in the spring semester.

The 2018 Infinite Kilometer winners are:

Matthew Golder, a National Institutes of Health Postdoctoral Fellow in the Department of Chemistry, nominated by Jeremiah Johnson, an associate professor of chemistry;

Robert Grant, manager of the crystallography lab in the Department of Biology, nominated by Michael Laub, a professor of biology;

Slawomir Gras, a research scientist on the LIGO project at the MIT Kavli Institute for Astrophysics and Space Research, nominated by Nergis Mavalvala, the Curtis and Kathleen Marble Professor of Astrophysics, and Matthew Evans, an associate professor of physics;

Yeong Shin Yim, a postdoc at the McGovern Institute for Brain Research, nominated by Gloria Choi, an assistant professor of brain and cognitive sciences; and

Yong Zhao, a postdoc in the Laboratory for Nuclear Science, nominated by Iain Stewart, a professor of physics.

The School of Science is also currently accepting nominations for its Infinite Mile Awards. All School of Science employees are eligible, and nominations are due by Feb. 15, 2019. The Infinite Mile Awards will be presented with the Infinite Kilometer Awards this spring.



de MIT News http://bit.ly/2QQtqDX

Studying pavement from a marathon runner's perspective

Being a research student requires hours behind a desk. For Concrete Sustainability Hub researcher Thomas Petersen, though, it also requires hours of running, and perhaps even a marathon or two.

“I love running,” Petersen says. “I find that it helps my productivity.”

Petersen has made running a regular part of his research routine. By stepping away from his desk and jogging along the adjacent Charles River, he says, he can think about problems “in a different way.”

“Whenever you’re working on a computer or with a paper in front of you, you tend to look at the details,” he explains. “Whereas, if you step back and do something that detaches you from the specifics of the equations you can think about the general processes more.”

Running is a lifestyle for Petersen. He ran as a collegiate athlete at Arizona State University and at North Carolina State University. Since then, he has also completed several marathons, including the New York Marathon, the San Francisco Marathon, and the Boston Marathon — which he has run three times.

And when Petersen isn’t running on pavement, he’s studying it. His work revolves around the material science of concrete and asphalt and, in particular, how they degrade due to their chemical composition or various stresses like temperature change.

In a climate like that of Boston, temperature changes can generate considerable stresses inside pavements. “Something I’ve been studying for much of my PhD is the mechanics of how internal loads develop due to various physical properties,” he says.

“Here in Boston, temperature cycles will have a significant impact. Pavements are laid down on a substrate, and if they expand or contract on the substrate and the substrate resists that expansion or contraction, cracks can occur. I often see them when I run, actually,” he says. As a result, he says, an ideal subgrade should be stiff and well bonded to the pavement so that loads are effectively transferred.

Yet, sometimes cracks can be beneficial. “Often we try to release energy in the pavement by cutting joints,” says Petersen, “and in that case, it’s not effective to have a rigid connection because you are trying to create cracks that relieve the stresses.” This is why the average sidewalk has lines cut through it, Petersen explains. The lines direct the cracks away from the surface of the sidewalk and discourage the slab from storing mechanical stress.

Pavement durability depends not just on how the concrete is set, but also on the textures and composition of the materials. Some of Petersen’s work looks at the formation of calcium-silicate-hydrates (C-S-H), which occur when water and cement clinker, the chemical source and a catalyst for nucleation, are mixed to create cement. Ultimately, this cement is mixed with filler materials like sand and gravel to form concrete.

When studying the formation of C-S-H, which occurs as nanometer-sized particles, Petersen looks at two key variables — the ability of the particles to diffuse through space and their stability with respect to one another. He has found that when C-S-H forms an unstable, rapidly moving phase, the final product looks almost like the pattern of cowhide, with large bubbles of clinker mixing with large pockets of air. This kind of heterogeneous pattern tends to generate more stresses and, in turn, decreases durability.

However, when C-S-H forms slowly, its pattern becomes more diffuse and homogeneous, with small, evenly spaced air pockets and solids. This latter formation is by far the sturdier of the two. Though altering these final patterns is difficult, Petersen’s modeling has helped explain how particle mobility and stability determine them. These findings could provide others with a framework to better engineer nanotextures and, in turn, create a tougher, more resilient material.

As Petersen nears the completion of his PhD, his work on pavements has absorbed much of his attention. While he continues to run, he has chosen to cut back on marathons for the time being.

“I’m not so happy about it,” he laughs. “I’m only running three or four times a week.”

He’s already eyeing his next marathon, however. Once he finishes his dissertation, he plans to fly to Germany to compete in the Berlin Marathon, where he hopes to break 2 hours and 30 minutes. Unlike the past marathons he has run, the Berlin Marathon is relatively flat. “I think if I run Berlin,” he says hopefully, “that pace might just be possible.”

The MIT Concrete Sustainability Hub (CSHub) is a team of researchers from several departments across MIT working on concrete and infrastructure science, engineering, and economics. Its research is supported by the Portland Cement Association and the Ready Mixed Concrete Research and Education Foundation.



de MIT News http://bit.ly/2BDieAe

Oil and water: Studying the Persian Gulf's most pressing environmental dilemmas

On a black monitor in a dusty office in MIT’s Green Building, an iceberg the width of three football fields wallows in the shallow, briny waters of the Persian Gulf, 6,000 miles from its home.

Facing the screen is Maryam Rashed Alshehhi, a visiting assistant professor and recent doctoral graduate from the Masdar Institute of Science and Technology in the United Arab Emirates (UAE). Using advanced climate models, Alshehhi is estimating how quickly an iceberg will melt in one of the warmest regions in the world, and how existing water and wildlife, long accustomed to a parched desert climate, will react.

The mirage-like simulation is not entirely hypothetical. Engineering firms in the UAE have recently touted plans to tow chunks of ice wrapped in plastic from Antarctica to the coast of the Persian Gulf to be used as fresh drinking water. There is perhaps no one better to vet the plan’s plausibility than Alshehhi, who in 2016 became the first United Arab Emirates national to get a PhD in earth observation and ocean color remote sensing. 

Since March, Alshehhi has been working in the Department of Earth, Atmospheric and Planetary Sciences’ (EAPS) Marshall Group to create the first-ever climate models of the Persian Gulf, a small body of water in an arid region where water availability and environmental standards have historically taken a back seat to rapid urbanization and oil interests.

Alshehhi comes to Cambridge, Massachusetts, as a part of a collaboration between MIT and Masdar Institute, a research-focused graduate university located in Abu Dhabi. The program, called the MIT/MI Cooperative Program, started in 2007 with the aim of strengthening the country’s research capabilities to transform its oil-based economy. Over 93 joint research projects have resulted, many focusing on sustainability issues, such as Abu Dhabi’s urban heat island effect and the Persian Gulf nation’s growing thirst for fresh water. 

“There’s really been a surge of interest in sustainability solutions in my country,” says Alshehhi. “And fresh water is one of the biggest problems that we face. It’s really become the motivation for my research.”

Alshehhi grew up on the heels of the oil boom in the UAE. The second youngest of seven siblings, she was raised in Ras Al-Khaimah, a coastal city of 345,000 or so nestled between Dubai and the country’s easternmost tip. The explosive rise of industry in the country paved the way for her and her siblings to be the first college-educated generation in her family.

As an undergrad, she followed in one of her older sister’s footsteps, studying chemical engineering at Khalifa Institute before switching to environmental engineering during graduate studies at Masdar Institute. Oceanography, she realized once in graduate school, proved to be a challenging field of research in the UAE.

For one, collecting data in the Persian Gulf “is not easy at all,” she says. The extreme heat, which averages 41 degrees Celsius (106 degrees Fahrenheit) during the summer, makes for painstaking fieldwork. And on numerous occasions a research vessel that Alshehhi was aboard was ordered to turn around because it was approaching another coastal country’s waters.

But more often, Alshehhi recalls the excitement of being a part of the nascent field. She contributed to the first field campaign in the gulf region that followed NASA protocol, resulting in the first data on optical oceanography in the Arabian Sea.

“I love that research like this opens your eyes to new things,” she says. “And that you’re never doing the same routine day after day.” 

Alshehhi’s graduate research focused on monitoring the Persian Gulf from above, using satellite imagery to track biological life. The work produced a clearer picture of the enormous economic and ecological cost of the country’s engineering infrastructure, such as its desalination plants.

Currently, to meet fresh water demands, the country relies on 31 water desalination plants running around the clock, purifying the equivalent of 4 billion bottles of saltwater a day. The plants pump the waste, a salty sludge, back into the gulf, resulting in steadily rising salinity levels that threaten local fauna and marine life and cause outbreaks of algal blooms.

“These blooms then come into the desalination plants and plug the filters, and we get a shortage of water,” Alshehhi explains. “In 2008, some plants were closed for three months, and we had a lot of issues.”

Dust storms in the region make it difficult to use conventional satellite images to detect these harmful algal blooms. Alshehhi and collaborators developed algorithms to clear the noise from these images by looking at the spectrum of satellite signals and eliminating some of the radiance of the aerosols.

The results, which were published in the Geoscience and Remote Sensing Symposium proceedings, became Alshehhi’s PhD dissertation. In 2016, Alshehhi became the first UAE national to successfully defend a doctoral thesis in her field.

Today, at MIT, Alshehhi’s work on the Persian Gulf goes beyond looking at just its surface. Together with John Marshall, EAPS Cecil and Ida Green Professor of Oceanography and expert in global ocean circulation modeling, she is creating in-depth models of the Persian Gulf using numerical and theoretical frameworks supported by field and satellite data.

The project, which is in the first year of its three-year stint, will produce the first record of the Persian Gulf’s water columns, circulation, and climate dynamics. Later on, these physical models will be overlaid with biogeochemical models to see how effects like increasing pollution, changing salinity levels, and rising temperatures are influencing marine life in the gulf.

“I’m really enjoying working with Maryam,” says Marshall. “I am learning so much from her about the [Persian] Gulf and its oceanography, but also something of the people and customs of the region, and the environmental challenges they face.”

Marshall says he’s looking forward to future planned visits to the UAE, where these models will help answer unsolved questions like where the salty sludge from the desalination plants ends up, the link between the dust storm deposits and biological life, and what kind of nutrients are settling into the water — and at what depth. 

“There are a lot of things that we don’t know, things we have never seen before,” says Alshehhi.

The research also comes at a time when the UAE faces urgent environmental dilemmas. Alshehhi is hopeful her science can provide better insight into issues such as the safety of the first nuclear power plant under construction in Abu Dhabi, the impact of pollution left behind by the more than 100 oil tankers passing through the gulf every day, and whether an iceberg can prevent a fresh water shortfall and meet future demand.

These problems may be close to home for Alshehhi, but she stresses the broad implications of her research. 

“People may look at the project like, ‘That’s a small region,’” says Alshehhi. “But if you look at the bigger picture, there are a lot of research questions that we can take out on this project. Maybe there is a new thing that we have never observed about the gulf that can be applied to a similar region.” 

Alshehhi’s appointment is for a year, which means that next March, she will shed her recently purchased winter coat and boots and return home to the UAE.

“I am looking forward to going home and taking this experience with me,” she says. “It’s so important to come here and learn from experts like John and others so we can improve education and research back home.”

The snow, she adds with a laugh, makes it a little easier to leave.



de MIT News http://bit.ly/2rTTCib

viernes, 21 de diciembre de 2018

Lidar accelerates hurricane recovery in the Carolinas

Hurricane Florence's slow trot over North and South Carolina in September led to inundating rain, record storm surges, and another major disaster for the Federal Emergency Management Agency (FEMA) to contend with. Facing damage over hundreds of square miles, FEMA again called upon MIT Lincoln Laboratory to use their state-of-the-art lidar system to image the destruction in the region.

Installed onto an airplane and flown nightly over a disaster site, lidar (which stands for light detection and ranging) sends out pulses of laser light that bounce off the land and structures below and are collected again by the instrument. The timing of each light pulse's return to the instrument is used to build what researchers call a "point-cloud map," a high-resolution 3-D model of the scanned area that depicts the heights of structures and landscape features. Laboratory analysts can then process this point-cloud data to glean information that helps FEMA focus their recovery efforts — for example, by estimating the number of collapsed houses in an area, the volume of debris piles, and the reach of flood waters.
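The timing step at the heart of the technique is simple: the interval between firing a pulse and detecting its echo, multiplied by the speed of light and halved (the pulse travels out and back), gives the range to the surface. A minimal sketch of that calculation, illustrative only and not code from the Lincoln Laboratory system:

```python
C = 299_792_458.0  # speed of light in meters per second

def pulse_range(round_trip_seconds):
    """Convert a lidar pulse's round-trip travel time into a one-way
    distance in meters; the factor of 2 accounts for the out-and-back path."""
    return C * round_trip_seconds / 2.0

# An echo delayed by 2 microseconds puts the surface about 300 m away.
print(round(pulse_range(2e-6), 1))  # -> 299.8
```

Repeating this calculation for millions of pulses, each tagged with the aircraft's position and the beam's direction, is what assembles the point-cloud map.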

Yet quickly sending the nearly two terabytes of data from a single night's scan, or sortie, to the Laboratory for processing is a challenge. After a storm, local internet connections may be gone or spotty. When the Laboratory used this same lidar platform after Hurricane Maria in Puerto Rico, downed networks meant having to physically ship a hard drive back to Massachusetts — a more than two-day delay in getting the data into analysts' hands. When the team started the campaign in the Carolinas in mid-September, they faced the same obstacle.

This time, the obstacle was hurdled thanks to MCNC. The nonprofit organization formerly known as the Microelectronics Center of North Carolina is based out of Research Triangle Park near Durham, North Carolina, which was not directly affected by Hurricane Florence. MCNC gave the Laboratory free access to their North Carolina Research and Education Network (NCREN).

"Our state was hit hard by Hurricane Florence," says Tommy Jacobson, MCNC's chief operating officer and vice president. "For MCNC's leadership, it was a quick and easy decision to enable MIT, who was in the state to assist FEMA, with access to our network resources to help however we could in making sure relief got to those that needed it."

NCREN is North Carolina's broadband backbone, connecting more than 750 institutions including all public school districts in the state, universities and colleges, public safety locations, and other community anchor institutions. Access for the Laboratory meant rack space for equipment inside the MCNC data center. From there, MCNC provisioned a 10-gigabit IP connection from the NCREN to Internet2, an ultrafast, advanced network that connects research centers around the world. This connection gave the team the ability to upload large volumes of data daily from their equipment inside the data center back to a computing center on MIT campus that is also connected to Internet2.

From there, another 10-gigabit connection bounced the data from campus to the Lincoln Laboratory Supercomputing Center in Holyoke, where the data were processed.

"The 10-gig uplink from MCNC allowed us to transmit the data at such a higher speed that some of our uploads were done in about six to seven hours," says Daniel Ribeirinha-Braga, a member of the Laboratory's data management team in this hurricane effort. "Keep in mind that this is lidar data, which we get about 1.5 to 1.9 terabytes a night of, that needs to be copied to multiple places, such as other hard drives, organized to a single SSD [solid-state drive], and then uploaded to the Laboratory from MCNC." 
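Back-of-the-envelope arithmetic, sketched here purely for illustration, shows what those figures imply: at the full 10-gigabit line rate, a night's 1.9 terabytes would move in under half an hour, so six-to-seven-hour uploads correspond to a sustained throughput of roughly 0.65 gigabits per second, still a dramatic improvement over the two-day delay of shipping a hard drive.

```python
def transfer_hours(terabytes, gigabits_per_second):
    """Hours needed to move a dataset at a given sustained throughput."""
    bits = terabytes * 1e12 * 8          # terabytes -> bits
    seconds = bits / (gigabits_per_second * 1e9)
    return seconds / 3600

print(round(transfer_hours(1.9, 10), 2))    # -> 0.42 (ideal line rate)
print(round(transfer_hours(1.9, 0.65), 1))  # -> 6.5 (observed upload times)
```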

The collaboration between the Laboratory and MCNC came about through Matt Daggett, a staff member in the Humanitarian Assistance and Disaster Relief (HADR) Systems Group. He had worked at MCNC more than a decade ago.

"I was aware of the NCREN backbone and the data center on the MCNC campus," Daggett says. "When it became clear that our flight operations would be based out of the RDU [Raleigh–Durham International] airport, I knew MCNC would be the perfect place to get our data onto the Internet2."

"We were grateful that MIT sought us out to provide that help," adds Jacobson.

With the data processing underway, the Laboratory has begun delivering reports to FEMA. The lidar imagery reveals things that would be impossible for FEMA to know from looking only at satellite images. "The most important difference between a satellite image and the lidar image is that you can do 3-D measurements on it," says Anthony Lapadula of the HADR Systems Group. "So, because it's 3-D data, we can do things like tell you how big a hole in a road is, or tell you how big an elevation drop is as a result of a landslide."

One of the greatest advantages of the lidar work has been the time saved for FEMA. When someone reported damage at a specific location, FEMA could assess the damage quickly by asking Laboratory analysts to virtually visit the location in the point-cloud map and report what they found.

For instance, FEMA asked them to zoom in on a small town on the Lumber River in North Carolina that had been inundated with flood waters. Analysis of the data told FEMA the extent of the flood inundation, the volume of debris piles in the town, and changes in the river's path. There were also environmental questions to answer, such as the volume of a coal ash pile, to determine how much of it, if any, had washed away with the flood waters. They could also check in on public infrastructure, like a lock and dam along the Cape Fear River that the data showed to be completely flooded.

The 40 sorties requested over the Carolinas took the team several weeks to complete, right up until Thanksgiving. The sorties covered areas down the coastline from the Cape Lookout National Seashore to Myrtle Beach and through bands stretching inland. These hundreds of miles of lidar data were processed to a resolution of approximately 25 centimeters. To put that resolution into perspective, Lapadula says that if the scanned areas were covered completely with basketballs, they would be able to precisely measure the location of each ball.

But with only one of these extremely advanced systems available for use, Lincoln Laboratory staff are limited in how much area they can cover and how many disasters they can respond to with the technology. The lidar system was originally developed by the Active Optical Systems Group, which has been assisting the HADR Systems Group with data collection, processing, and algorithm development.

Several industry collaborators also participated in this effort. Employees from the small business 3DEO, which specializes in Geiger-mode lidar technology, assisted with the data collection. The small business LEEP has been helping with data analytics and providing training to FEMA analysts to facilitate the use of the data. Another partner, Basler Turbo Conversions, supported engineering aspects of installing the lidar on the BT-67 aircraft, which is being leased from the company AIRtec. 

While the Laboratory has been involved in disaster recovery since the 2010 Haiti earthquake, it has never been as active in these efforts as in the year since Hurricanes Harvey, Irma, and Maria struck in 2017.

"We went a decade without a major hurricane hitting the continental United States," Lapadula reflected. "Now, it's like they just keep coming."



de MIT News http://bit.ly/2EGL3Pc

Three at MIT named 2018 fellows of the National Academy of Inventors

MIT President L. Rafael Reif and two engineering faculty members have been named 2018 fellows of the National Academy of Inventors (NAI). In alphabetical order:

Linda G. Griffith is the School of Engineering Teaching Innovation Professor of Biological and Mechanical Engineering and a MacVicar Fellow at MIT, where she directs the Center for Gynepathology Research. Her research is in the field of regenerative medicine and tissue engineering. Her laboratory was the first to combine a degradable scaffold with donor cells to create tissue-engineered cartilage in the shape of a human ear. The 3-D printing process she co-invented for creation of complex biomaterials scaffolds is used for manufacture of FDA-approved scaffolds for bone regeneration. She also contributed new concepts to nanoscale biophysical control of receptor engagement by biomaterials, and has developed and commercialized a microfluidic multiwell bioreactor for 3-D culture models of liver and other tissues.

Muriel Médard is the Cecil H. Green Professor of Electrical Engineering and Computer Science at MIT, where she leads the Network Coding and Reliable Communications Group at the Research Laboratory for Electronics. Her research interests are in the areas of network coding and reliable communications, particularly for optical and wireless networks. Her work in network coding, hardware implementation, and her original algorithms have received widespread recognition and awards. Médard is the co-founder of three companies to commercialize network coding — CodeOn, Steinwurf, and Chocolate Cloud.

Rafael Reif serves as president of MIT. He pioneers efforts to help shape the future of higher education and champions both fundamental science and MIT’s signature style of interdisciplinary, problem-centered research. Among other things, Reif developed the Institute’s latest experiments in online learning, MITx and edX, which he spearheaded in his previous role as MIT provost. He launched The Engine, an accelerator specially geared to help “tough tech” ventures deliver innovations that address humanity’s great challenges. To advance the frontier of human and machine intelligence to accelerate the invention of artificial intelligence tools for every discipline, Reif announced the MIT Quest for Intelligence in February 2018. And in October 2018, in response to the ubiquity of computing and the rise of AI across disciplines, he announced the MIT Stephen A. Schwarzman College of Computing, the most significant reshaping of MIT since the 1950s.

Election to NAI Fellow status is the highest professional distinction accorded to academic inventors who have demonstrated a prolific spirit of innovation in creating or facilitating outstanding inventions that have made a tangible impact on quality of life, economic development, and the welfare of society.

The 2018 class of Fellows represents 125 research universities and governmental and nonprofit research institutes worldwide, and its members are named inventors on nearly 4,000 issued U.S. patents. To date, the more than 1,000 NAI Fellows have generated more than 11,000 licensed technologies and companies, created more than 1.4 million jobs, and generated over $190 billion in revenue.

The 2018 NAI Fellows will be inducted on April 11, 2019, as part of the Eighth Annual Conference of the National Academy of Inventors in Houston, Texas. U.S. Commissioner for Patents Andrew H. Hirshfeld will deliver the keynote address for the induction ceremony. In honor of their outstanding accomplishments, fellows will receive a special trophy, medal, and rosette pin.



from MIT News http://bit.ly/2ByasYw

On the right path to fusion energy

A new report on the development of fusion as an energy source, written at the request of the U.S. Secretary of Energy, proposes adoption of a national fusion strategy that closely aligns with the course charted in recent years by MIT’s Plasma Science and Fusion Center (PSFC) and privately funded Commonwealth Fusion Systems (CFS), a recent MIT spinout.

Fusion technology has long held the promise of producing safe, abundant, carbon-free electricity, while struggling to overcome the daunting challenges of creating and harnessing fusion reactions to produce net energy gain. But the Consensus Study Report from the National Academies of Sciences, Engineering, and Medicine states that magnetic-confinement fusion technology (an MIT focus since the 1970s) is now “sufficiently advanced to propose a path to demonstrate fusion generated energy within the next several decades.”

It recommends continued U.S. participation in the international ITER fusion facility project and “a national program of accompanying research and technology leading to the construction of a compact pilot plant that produces electricity from fusion at the lowest possible capital cost.”

That approach (which the report says would require up to $200 million in additional annual funding for several decades) leverages opportunities presented by new-generation superconducting magnets, reactor materials, simulators, and other relevant technologies. The committee places particular emphasis on advances in high-temperature superconducting magnets, which can reach higher fields and enable smaller machines, and it recommends a U.S. program to prove out high-field, large-bore magnets. These are seen as enabling faster and less costly cycles of learning and development than extremely large experiments like ITER, which will not come online until 2025, while still benefiting from the knowledge that emerges from those programs.

This smaller-faster-cheaper approach is embodied in the SPARC reactor concept, which was developed at the PSFC and forms the foundation of CFS’s aggressive effort to demonstrate energy-gain fusion by the mid-2020s and produce practical reactor designs by the early 2030s. The approach rests on the same conclusion: that high-field, high-temperature magnets represent a game-changing technology. A $30 million program between CFS and MIT to demonstrate high-field, large-bore superconducting magnets is underway at MIT and is a key step toward a compact fusion energy system. Although a handful of other privately funded fusion companies have offered roughly comparable timelines, the National Academies report does not envision demonstration fusion reactors appearing until around 2050.

The report also affirms that the scientific underpinnings of the tokamak approach have been strengthened over the previous decade, giving increasing confidence that this approach, which is the basis of ITER and SPARC, is capable of achieving net energy gain and forming the basis for a power plant. Based on this increased confidence the committee recommends moving forward with technology developments for a pilot power plant that would put power on the grid.

“The National Academies are a very thoughtful organization, and they’re typically very conservative,” says Bob Mumgaard, chief executive officer of CFS. “We’re glad to see them come out with a message that it’s time to move into fusion, and that compact and economical is the way to go. We think development should go faster, but it gives validation to people who want to tackle the challenge and lays out things we can do in the U.S. that will lead toward putting power on the grid.”

Andrew Holland, director of the recently formed Fusion Industry Association and Senior Fellow for Energy and Climate at the American Security Project, notes that the report’s authors were charged with creating “a consensus science report that reflects current pathways, and the current pathway is to build ITER and go through the experimental process there, while meanwhile designing a pilot plant, DEMO.”

Shifting the consensus toward a faster way forward, adds Holland, will require experimental results from companies like CFS. “That’s why it’s notable to have privately funded companies in the U.S. and around the world pursuing the scientific results that will bear this out. And it’s certainly important that this study is aimed at getting the government-based science community to think about a strategic plan. It should be seen as part of a starting gun for the fusion community coming together and organizing its own process.”

Or, as Martin Greenwald, deputy director of the PSFC and a veteran fusion researcher, puts it, “There’s a tendency in our community to argue about a 20-year plan or a 30-year plan, but we don’t want to take our eyes off what we need to do in the next three to five years. We might not have consensus on the long scale, but we need one for what to do now, and that’s been the consistent message since we announced the SPARC project — engaging the broader community and taking the initiative.

“The key thing to us is that if fusion is going to have an impact on climate change, we need answers quickly, we can’t wait until the end of century, and that’s driving the schedule. The private money that’s coming in helps, but public funding should engage with and complement that. Each side has an appropriate role. National labs don’t build power plants, and private companies don’t do basic research.”

While several approaches to fusion are being pursued in public and private organizations, the National Academies report focuses exclusively on magnetic confinement technology. This reflects the report’s role in the Department of Energy’s response to a 2016 Congressional request for information on U.S. participation in ITER, a magnetic-confinement project. The report committee’s 19 experts, who conducted two years of research, were also charged with exploring related questions of “how best to advance the fusion sciences in the U.S.” and “the scientific justification and needs for strengthening the foundations for realizing fusion energy given a potential choice of U.S. participation or not in the ITER project.”

The report’s publication comes at a time of renewed activity and interest in fusion energy, with some 20 private companies pursuing its development, increased funding in the most recent federal budget, and the formation of the Fusion Industry Association to advocate for the community as a whole. But the report cautions that “the absence of a long-term research strategy for the United States is particularly evident when compared to the plans of our international partners.”

That situation may be evolving. “We had a very nice meeting of stakeholders a month and a half ago in DC, and there was a lot of resonance among private companies, the research community, the Department of Energy, and Congressional staffers from both parties,” says Greenwald. “It seems like there’s momentum, though we don’t know yet just what form it will take.” He adds that the establishment of an industry association is very helpful for navigating and communicating in Washington.

“We would love to see the government have a role in things that lift all fusion companies, like advanced materials labs, the process of extracting heat from reactors, additive manufacturing, simulations, and other tools,” says Mumgaard. “There are many opportunities for collaboration and cooperation; every company will have a different mix of partnerships, even on personnel exchange as we do with MIT.”



from MIT News http://bit.ly/2Aa6QMp

Thursday, December 20, 2018

How different types of knowledge impact the growth of new firms

Diversifying into new industries is vital to an economy’s ability to grow and generate wealth. But to branch out into new industrial activities, a city, region or country must first have a pool of people with the right mix of knowledge and experience to make those pioneering firms a success.

So how do local economies ensure they have the right mix of experience to allow new ventures to thrive?

In a paper published this week in the Proceedings of the National Academy of Sciences, a team led by César A. Hidalgo, director of the Collective Learning Group in the MIT Media Lab, studied the effects of occupation-specific, industry-specific, and location-specific knowledge on the success of pioneer firms. These are firms operating in an industry that has not previously been present in a region.

They found these pioneering firms were significantly more likely to survive and grow when their first hires were people with experience in the same or a related industry, rather than those who had experience carrying out the same type of job.

The notion that knowledge is central to driving growth, an insight for which economist Paul Romer was awarded the Nobel Prize in economics earlier this year, is already well established.

The new paper also builds on previous work by Hidalgo’s group over the past decade, including a paper in Science in 2007, in which the researchers developed measures of how economies are able to successfully move into new products based on how closely related they are to their existing product base.

“We know these diversification events are more likely to happen when you have related activities at a location, but someone must still be the first to enter,” Hidalgo says. “That pioneer has to get their knowledge from somewhere.”

To understand where this knowledge comes from, and what type of knowledge is likely to lead to the greatest success, the team, which also included lead author and MIT PhD student Cristian Jara-Figueroa, MIT postdoc Bogang Jun, and Edward Glaeser, the Fred and Eleanor Glimp Professor of Economics at Harvard University, investigated the different types of experience that workers carry with them when they join a new firm.

They used data from 2002 to 2013 from Brazil’s Annual Social Security Information Report (RAIS). The RAIS dataset covers around 97 percent of the country’s formal labor market and includes fine-grained information on individual workers. Using this dataset, they studied the workforce hired by new pioneer firms within a region to identify the industry, occupation, and location of their previous jobs.

“So for a nurse in a hospital, their knowledge of nursing is their occupation-specific knowledge, while their experience in a hospital is their industry-specific knowledge,” Hidalgo says.
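The three knowledge types can be made concrete with a small sketch. This is a hypothetical illustration, not code from the study: the record fields and toy values are invented, and the real analysis used the full RAIS microdata.

```python
# Hypothetical sketch of the worker-knowledge classification described above.
# Field names and records are illustrative, not drawn from RAIS.

def knowledge_types(hire, firm):
    """Tag which kinds of prior knowledge a new hire brings to a firm."""
    return {
        "industry": hire["prev_industry"] == firm["industry"],
        "occupation": hire["prev_occupation"] == hire["new_occupation"],
        "location": hire["prev_region"] == firm["region"],
    }

def is_pioneer(firm, industries_by_region):
    """A pioneer firm operates in an industry new to its region."""
    return firm["industry"] not in industries_by_region[firm["region"]]

# Toy data: electronics has never been present in Recife
existing = {"Recife": {"textiles"}, "Manaus": {"textiles", "electronics"}}
firm = {"region": "Recife", "industry": "electronics"}
hire = {"prev_industry": "electronics", "prev_occupation": "nurse",
        "new_occupation": "technician", "prev_region": "Recife"}

print(is_pioneer(firm, existing))    # True: electronics is new to Recife
print(knowledge_types(hire, firm))   # industry and location knowledge, but not occupation
```

In this toy case the hire carries industry-specific and location-specific knowledge but not occupation-specific knowledge, the combination the study found most valuable for pioneer firms.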

They found that it is far better for pioneer firms to hire people with industry-specific knowledge, even if those workers had a very different occupation in their previous job.

They then compared these results with those of new firms in a region that were not pioneers, but instead were involved in an industry that was already present in the area. They found that industry-specific knowledge was significantly more important for pioneer than nonpioneer firms.

Location-specific knowledge proved to be the second most important type of experience, while occupation-specific knowledge was not significant at all for pioneer firms, and provided a small boost for nonpioneers.

“These results strongly suggest that when regions try to develop new industries, they should focus on accumulating industry-specific knowledge that entrepreneurs can leverage,” Jara-Figueroa says. “Once the industry has been developed in a place, both types of knowledge become important.”

It may be that knowledge of an industry can only be acquired while working within it, making it harder to pass on to others than occupation-based skills and ideas that can be taught, says Hidalgo.

What’s more, industry knowledge is important because it includes a familiarity with the social network within that sector. So, for example, someone who has worked in a particular industry for some time will have a better understanding of both the suppliers and customers within the sector, and the firm’s competitors, Hidalgo adds.

The research has particular implications for governments in the developing world, according to Glaeser.

“This is related to the broad question of whether the developing world needs foreign direct investment, or whether it can succeed with home-grown entrepreneurship,” Glaeser says. “Our paper supports the view that domestic entrepreneurship can work, as long as it has access to the relevant forms of industry-specific capital.”

The research also suggests there may be a need for governments to develop more industry-specific, rather than occupation-based, education programs, Hidalgo says.

The research team now plans to investigate how this demand for industry-specific knowledge varies from industry to industry.



from MIT News http://bit.ly/2Ezlm3b

Modeling climate risk where it hits home

Long-term assessment of likely regional and local climate impacts is critical to enabling municipalities, businesses, and regional economies to prepare for potentially damaging and costly effects of climate change — from prolonged droughts to more frequent and intense extreme events such as major storms and heatwaves.

Unfortunately, the tools most commonly used to project future climate impacts, Earth-system models (ESMs), are not up to the task. ESMs are too computationally time consuming and too expensive to run at sufficient resolution to provide the detail needed at the local and regional level.

To address this, a new MIT-led study in the journal Earth and Space Science uses a regional climate model of the northeastern United States to downscale the middle and end-of-century climate projections of an ESM under a high-impact emissions scenario to a horizontal resolution of 3 kilometers. Through downscaling, output from the ESM was used to drive the regional model at a higher spatial resolution, enabling it to simulate local conditions in greater detail. The resulting high-resolution climate projections consist of more than 200 climate variables at an hourly frequency.

Among other things, the study projects that between now and the end of the century, the region will experience significantly more days per year in which mean and maximum temperatures exceed 86 degrees Fahrenheit, and fewer days per year in which the minimum temperature falls below freezing. Over that period in Boston, the annual number of days the mean temperature exceeds 86 F increases from three to 22, and the number of days the daily maximum temperature exceeds 86 F increases from 49 to 78.
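Threshold-day tallies like those quoted above are straightforward to compute from a daily temperature series. The sketch below is illustrative only, using made-up temperatures rather than the study’s downscaled output:

```python
# Minimal sketch of counting threshold-exceedance days in a year of
# daily temperatures. The toy series below is invented for illustration.

def days_above(daily_temps_f, threshold_f=86.0):
    """Count days whose temperature statistic exceeds a Fahrenheit threshold."""
    return sum(1 for t in daily_temps_f if t > threshold_f)

def days_below_freezing(daily_min_temps_f):
    """Count days whose minimum temperature falls below 32 F."""
    return sum(1 for t in daily_min_temps_f if t < 32.0)

# Toy year: mostly mild, with a short hot spell and a cold snap
daily_mean = [70.0] * 360 + [88.0, 90.5, 87.2, 85.9, 91.0]
daily_min = [40.0] * 330 + [30.0] * 35

print(days_above(daily_mean))          # 4 days with mean above 86 F
print(days_below_freezing(daily_min))  # 35 freezing days
```

The study’s analysis works the same way in principle, but over hourly model output aggregated to daily means, maxima, and minima for each grid cell.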

“Our approach allows for analysis of changes in temperature, precipitation, and other climate variables within a single 24-hour period,” says Muge Komurcu, the lead author of the study and a research scientist with the MIT Joint Program on the Science and Policy of Global Change and Department of Earth, Atmospheric and Planetary Sciences (EAPS). “The aim of these projections is to support further assessments of climate change impacts and sustainability studies in the region.”

Downscaling of climate projections provides climate variables at the resolution needed to assess climate change impacts at regional and local scales. As a result, the variables produced in the study may be used as input to other models and analyses to assess the likely future impact of climate change on extreme precipitation and heat wave events, regional ecosystems, agriculture, the spread of infectious diseases (e.g. Lyme disease), hydrology, the economy, and other concerns.

To produce the study’s climate variables, the researchers used a high-resolution regional climate model, the Weather Research and Forecasting (WRF) model, to downscale middle and end-of-century climate projections of the Community Earth System Model (CESM) under a high greenhouse gas emissions scenario to a horizontal resolution of 3 kilometers for the northeastern U.S.

To ensure that their method is reliable, Komurcu and her co-authors — MIT EAPS professor of atmospheric science Kerry Emanuel, Purdue University professor Matthew Huber, and PhD student Rene Paul Acosta — simulated the process using historical climate observations. They showed that their technique reproduced observed historical mean and extreme climate events over a 10-year period.

The study’s more than 200 climate variables, produced at 3-kilometer resolution, cover 55 years, encompassing the middle- and end-of-century time periods.

“To our knowledge, this is the first and only study that has downscaled global model projections to such a high resolution for a long time period for this region,” says Komurcu.

To assist regional assessments of climate change impacts and sustainability studies in the northeastern U.S., the researchers plan to make all model input and output files from this study publicly available through the University of New Hampshire’s Data Distribution Center. The study was supported by the National Science Foundation through the New Hampshire Established Program to Stimulate Competitive Research.



from MIT News http://bit.ly/2EIh7Do

Gut-brain connection signals worms to alter behavior while eating

When a hungry worm encounters a rich food source, it immediately slows down so it can devour the feast. Once the worm is full, or the food runs out, it will begin roaming again.

A new study from MIT now reveals more detail about how the worm’s digestive tract signals the brain when to linger in a plentiful spot. The researchers found that a type of nerve cell found in the gut of the worm Caenorhabditis elegans is specialized to detect when bacteria are ingested; once that occurs, the neurons release a neurotransmitter that signals the brain to halt locomotion. The researchers also identified new ion channels that operate in this specialized nerve cell to detect bacteria.

“In terms of a precise mechanism of how the gut signals back up to the brain, it was unclear what was going on,” says Steven Flavell, an assistant professor of brain and cognitive sciences and a member of MIT’s Picower Institute for Learning and Memory. “Food is something that really motivates this animal, so people have studied this for a long time, but the mechanism of how food ingestion is detected by the nervous system to drive a behavioral change, that had really been missing.”

Flavell is the senior author of the study, which appears in the Dec. 20 issue of Cell. Jeffrey Rhoades, a former technical assistant in the Picower Institute, is the paper’s first author.

Gut-brain connection

In all animals, the gut and the brain have a strong connection. Signals from our gastrointestinal tract let us know when we’re full and help to control our appetite, via hormones such as leptin and ghrelin. The GI tract is also the source of most of the body’s serotonin, which also plays a role in appetite.

Additionally, the digestive tract has its own semi-independent nervous system, known as the enteric nervous system. This system of neurons governs GI functions such as contraction of the digestive organs and the secretion of hormones and digestive enzymes.

While the full complexities of the human enteric nervous system have yet to be understood, many researchers use C. elegans, which has a much simpler nervous system, as a model to study feeding behavior. Previous research has shown that food greatly influences the locomotion of C. elegans: the worm’s diet consists mainly of bacteria, and whenever the worms encounter a large patch of food, they slow down to consume it. Flavell and his colleagues found that once the animals are satiated, they begin wandering around their environment again. Thanks to their recent study, we are beginning to understand why.

“There was behavioral evidence that C. elegans’ nervous system is receiving information about the food in the environment, but we didn’t know how that worked,” Flavell says.

The researchers knew that serotonin release was driving the slowdown, but they didn’t know what was triggering that release. To try to figure that out, they decided to study a type of serotonin-producing enteric neurons called neurosecretory-motor (NSM) neurons, which are located in the lining of the C. elegans digestive organ, known as the pharynx.

Through a series of experiments, the researchers found that NSM neurons become active immediately when worms eat bacterial food. NSM neurons have a single long extension, or neurite, that projects into the worms’ gut. The researchers also discovered that this neurite acts as a sensory ending for NSM neurons, playing a key role in activating the neurons when the animal eats bacterial food.

Further studies revealed two new ion channels, called DEL-3 and DEL-7, that are located at the very tip of this neurite and are required for NSM neurons to be activated by bacterial food. These channels are part of a family of proteins called acid-sensing ion channels (ASICs), which are found in all animals including humans. Some of these channels have roles in taste and in pain detection, while others’ functions are still unknown. Intriguingly, these channels are also expressed in enteric neurons in the mammalian gut. Flavell speculates that ASICs may play a general role in sensing bacterial populations present along the digestive tract and potentially elsewhere.

“A particularly intriguing aspect of this work is the identification of the putative food sensors in NSM,” says Piali Sengupta, a professor of biology at Brandeis University who was not involved in the study. “These turn out to be members of the ASIC ion channel family, which are activated by multiple stimuli in other systems, including mechanical stimuli and pH. In C. elegans, these channels appear to be activated by an as yet unidentified component of bacteria, the food source of C. elegans.”

The researchers are now trying to figure out how DEL-3 and DEL-7 detect bacteria. One possibility is that the channels directly detect a compound secreted by bacteria. Alternatively, a bacterial compound may interact with a nearby receptor that then activates the ion channels, Flavell says.

Many neurons

Flavell’s lab is also planning to study other C. elegans neurons that also have extensions into the gut, to see if they play a similar role to the NSM neurons, possibly detecting other components of bacteria or other food cues.

“With the tools that we now have in place, it should be straightforward for us to go into these other cell types and ask if they are activated by food ingestion, and if so, what kinds of channels do they express,” Flavell says. “There are about 30 other ion channels that are closely related to DEL-3 and DEL-7 in the worm, and they might be detecting other bacterial signals.”

The researchers are also exploring in more detail the effects of serotonin on the rest of the animals’ brain. Once the ion channels in the NSM cells are activated, the cells begin releasing serotonin, which can then be detected by nearby neurons that express serotonin receptors.

“We’re trying to really look at the whole rest of the nervous system to see how serotonin changes the activity of many downstream cells to ultimately lead to this behavioral change,” Flavell says.

Another unanswered question is whether this basic mechanism for bacterial sensing that the Flavell lab has discovered also operates in humans. The human gut contains a diverse array of bacteria, referred to as the microbiome. In addition, the gut is lined with neuron-like cells called enterochromaffin cells that make serotonin and release it to sensory neurons that carry information to the brain. Researchers are only just beginning to understand the channels and receptors that might allow these cells to detect the contents of the gut, and how serotonergic gut-to-brain signaling might alter behavior in mammals. These new mechanisms worked out in the worm may guide future studies in this area.

The research was funded by the JPB Foundation, the Picower Institute Innovation Fund, the Picower Neurological Disorders Research Fund, the NARSAD Young Investigator program, the National Institutes of Health, the Howard Hughes Medical Institute, and the National Science Foundation.



from MIT News https://ift.tt/2Gvo2kQ

New threat to ozone recovery

Earlier this year, the United Nations announced some much-needed, positive news about the environment: The ozone layer, which shields the Earth from the sun’s harmful ultraviolet radiation, and which was severely depleted by decades of human-derived, ozone-destroying chemicals, is on the road to recovery.

The dramatic turnaround is a direct result of regulations set by the 1987 Montreal Protocol, a global treaty under which nearly every country in the world, including the United States, successfully acted to ban the production of chlorofluorocarbons (CFCs), the main agents of ozone depletion. As a result of this sustained international effort, the United Nations projects that the ozone layer is likely to completely heal by around the middle of the century.

But a new MIT study, published today in Nature Geoscience, identifies another threat to the ozone layer’s recovery: chloroform — a colorless, sweet-smelling compound that is primarily used in the manufacturing of products such as Teflon and various refrigerants. The researchers found that between 2010 and 2015, emissions and concentrations of chloroform in the global atmosphere have increased significantly.

They were able to trace the source of these emissions to East Asia, where production of chloroform-based products appears to be on the rise. If chloroform emissions continue to increase, the researchers predict that the recovery of the ozone layer could be delayed by four to eight years.

“[Ozone recovery] is not as fast as people were hoping, and we show that chloroform is going to slow it down further,” says co-author Ronald Prinn, the TEPCO Professor of Atmospheric Science at MIT. “We’re getting these little side stories now that say, just a minute, species are rising that shouldn’t be rising. And certainly a conclusion here is that this needs to be looked at.”

Xuekun Fang, a senior postdoc in Prinn’s group, is the lead author of the paper, which includes researchers from the United States, South Korea, Japan, England, and Australia.

Short stay, big rise

Chloroform is among a class of compounds called “very short-lived substances” (VSLS), so named for their relatively brief stay in the atmosphere (about five months for chloroform). If the chemical were to linger, it would be more likely to be lofted into the stratosphere, where, like CFCs, it would decompose into ozone-destroying chlorine. But because it is generally assumed that chloroform and other VSLSs are unlikely to do any real damage to ozone, the Montreal Protocol does not regulate the compounds.

“But now that we’re at the stage where emissions of the more long-lived compounds are going down, the further recovery of the ozone layer can be slowed down by relatively small sources, such as very short-lived species — and there are a lot of them,” Prinn says.

Prinn, Fang, and their colleagues monitor such compounds, along with other trace gases, with the Advanced Global Atmospheric Gases Experiment (AGAGE) — a network of coastal and mountain stations around the world that has been continuously measuring the composition of the global atmosphere since 1978.

There are 13 active stations scattered around the world, including in California, Europe, Asia, and Australia. At each station, air inlets atop typically 30-foot-tall towers pull in air about 20 times per day, and researchers use automated instruments to analyze the atmospheric concentrations of more than 50 greenhouse and ozone-depleting gases. With stations around the world monitoring gases at such a high frequency, AGAGE provides a highly accurate way to identify which emissions might be rising and where these emissions may originate.

When Fang began looking through AGAGE data, he noticed an increasing trend in the concentrations of chloroform around the world between 2010 and 2015. He also observed about three times as much atmospheric chloroform in the Northern Hemisphere as in the Southern Hemisphere, suggesting that the source of these emissions lay somewhere in the Northern Hemisphere.

Using an atmospheric model, Fang and his colleagues estimated that between 2000 and 2010, global chloroform emissions remained at about 270 kilotons per year. However, this number began climbing after 2010, reaching a high of 324 kilotons per year in 2015. Fang observed that most stations in the AGAGE network did not measure substantial increases in the magnitude of spikes in chloroform, indicating negligible emission rises in their respective regions, including Europe, Australia, and the western United States. However, two stations in East Asia — one in Hateruma, Japan, and the other in Gosan, South Korea — showed dramatic increases in the frequency and magnitude of spikes in the ozone-depleting gas.
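A minimal sketch of the baseline-versus-spike screening described here might look like the following. The readings are invented, not AGAGE measurements, and the simple median-based rule stands in for the more careful statistical methods the researchers would use:

```python
# Hedged sketch: flag station readings that spike far above the local
# background. A robust baseline (the median) is used so that the spikes
# themselves don't inflate the reference level. Toy data, not AGAGE.
import statistics

def flag_spikes(series_ppt, factor=2.0):
    """Return readings that exceed the station's median baseline by a factor."""
    baseline = statistics.median(series_ppt)
    return [x for x in series_ppt if x > factor * baseline]

# Toy station records: a quiet background site vs. one seeing pollution events
quiet = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1]
polluted = [10.0, 10.2, 9.9, 10.1, 35.0, 10.0, 41.0, 10.1]

print(flag_spikes(quiet))     # []
print(flag_spikes(polluted))  # [35.0, 41.0]
```

A median is chosen here deliberately: a mean-plus-standard-deviation rule would be dragged upward by the very spikes it is meant to detect.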

MIT researchers have back-tracked chloroform in East Asia using AGAGE measurements and 3-dimensional atmospheric transport models. Courtesy of the researchers.

The rise in global chloroform emissions seemed, then, to come from East Asia. To investigate further, the team used two different three-dimensional atmospheric models that simulate the movement of gases and chemicals, given global circulation patterns. Each model can essentially trace the origins of a certain parcel of air. Fang and his colleagues fed AGAGE data from 2010 to 2015 into the two models and found that they both agreed on chloroform’s source: East Asia.

“We conclude that eastern China can explain almost all the global increase,” Fang says. “We also found that the major chloroform production factories and industrialized areas in China are spatially correlated with the emissions hotspots. And some industrial reports show that chloroform use has increased, though we are not fully clear about the relationship between chloroform production and use, and the increase in chloroform emissions.”

“An unfortunate coherence”

Last year, researchers from the United Kingdom reported on the potential threat to the ozone layer from another very short-lived substance, dichloromethane, which, like chloroform, is used as a feedstock to produce other industrial chemicals. Those researchers estimated how both ozone and chlorine levels in the stratosphere would change with increasing levels of dichloromethane in the atmosphere.

Fang and his colleagues used similar methods to gauge the effect of increasing chloroform levels on ozone recovery. They found that if concentrations remained steady at 2015 levels, the increase observed from 2010 to 2015 would delay ozone recovery by about five months. If, however, concentrations were to continue climbing as they have through 2050, this would set a complete healing of the ozone layer back by four to eight years.

The fact that the rise in chloroform stems from East Asia adds further urgency to the situation. This region is especially susceptible to monsoons, typhoons, and other extreme storms that could give chloroform and other short-lived species a boost into the stratosphere, where they would eventually decompose into the chlorine that eats away at ozone.

“There’s an unfortunate coherence between where chloroform is being emitted and where there are frequent storms that puncture the top of the troposphere and go into the stratosphere,” Prinn says. “So, a bigger fraction of what’s released in East Asia gets into the stratosphere than in other parts of the world.”

Fang and Prinn say that the study is a “heads-up” to scientists and regulators that the journey toward repairing the ozone layer is not yet over.

“Our paper found that chloroform in the atmosphere is increasing, and we identified the regions of this emission increase and the potential impacts on future ozone recovery,” Fang says. “So future regulations may need to be made for these short-lived species.”

“Now is the time to do it, when it’s sort of the beginning of this trend,” Prinn adds. “Otherwise, you will get more and more of these factories built, which is what happened with CFCs, where more and more end uses were found beyond refrigerants. For chloroform, people will surely find additional uses for it.”

This research was supported by NASA, the National Institute of Environmental Studies in Japan, the National Research Foundation of Korea, the U.K. Natural Environment Research Council, the Commonwealth Scientific and Industrial Research Organization of Australia, the Department for Business, Energy & Industrial Strategy, and other organizations.



from MIT News https://ift.tt/2ExQnVe