Monday, December 30, 2019

Tracking emissions in China

In January 2013, many people in Beijing experienced a multiweek period of severely degraded air, known colloquially as the “Airpocalypse,” which made them sick and kept them indoors. As part of its response, the central Chinese government accelerated implementation of tougher air pollution standards for power plants, with limits to take effect in July 2014. One key standard limited emissions of sulfur dioxide (SO2), which contributes to the formation of airborne particulate pollution and can cause serious lung and heart problems. The limits were introduced nationwide, but varied by location. Restrictions were especially stringent in certain “key” regions, defined as highly polluted and populous areas in Greater Beijing, the Pearl River Delta, and the Yangtze River Delta.

All power plants had to meet the new standards by July 2014. So how did they do? “In most developing countries, there are policies on the books that look very similar to policies elsewhere in the world,” says Valerie J. Karplus, an assistant professor of global economics and management at the MIT Sloan School of Management. “But there have been few attempts to look systematically at plants’ compliance with environmental regulation. We wanted to understand whether policy actually changes behavior.”

Focus on power plants

For China, focusing environmental policies on power plants makes sense. Fully 60 percent of the country’s primary energy use is coal, and about half of it is used to generate electricity. With that use comes a range of pollutant emissions. In 2007, China’s Ministry of Environmental Protection required thousands of power plants to install continuous emissions monitoring systems (CEMS) on their exhaust stacks and to upload hourly, pollutant-specific concentration data to a publicly available website.

Among the pollutants tracked on the website was SO2. To Karplus and two colleagues — Shuang Zhang, an assistant professor of economics at the University of Colorado at Boulder, and Douglas Almond, a professor in the School of International and Public Affairs and the Department of Economics at Columbia University — the CEMS data on SO2 emissions were an as-yet-untapped resource for exploring the on-the-ground impacts of the 2014 emissions standards, over time and plant-by-plant.

To begin their study, Karplus, Zhang, and Almond examined changes in the CEMS data around July 2014, when the new regulations went into effect. Their study sample included 256 power plants in four provinces, among them 43 that they deemed “large,” with a generating capacity greater than 1,000 megawatts (MW). They examined the average monthly SO2 concentrations reported by each plant starting in November 2013, eight months before the July 2014 policy deadline.

Emissions levels from the 256 plants varied considerably. The researchers were interested in relative changes within individual facilities before and after the policy, so they determined changes relative to each plant’s average emissions — a calculation known as demeaning. For each plant, they calculated the average emissions level over the whole time period being considered. They then calculated how much that plant’s reading for each month was above or below that baseline. By taking the averages of those changes-from-baseline numbers at all plants in each month, they could see how much emissions from the group of plants changed over time.
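To make the demeaning step concrete, here is a minimal sketch in Python, assuming a hypothetical long-format table of monthly readings; the file and column names (plant_id, month, so2_mg_m3) are illustrative, not the study's actual schema.

```python
import pandas as pd

# Hypothetical long-format CEMS table: one row per plant per month.
cems = pd.read_csv("cems_monthly.csv", parse_dates=["month"])

# Demean: subtract each plant's own average over the full sample period,
# so each reading is measured relative to that plant's baseline.
plant_mean = cems.groupby("plant_id")["so2_mg_m3"].transform("mean")
cems["so2_demeaned"] = cems["so2_mg_m3"] - plant_mean

# Average the deviations across plants within each month to trace how
# the group as a whole moved relative to baseline over time.
monthly_profile = cems.groupby("month")["so2_demeaned"].mean()
print(monthly_profile)
```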

The demeaned CEMS concentrations are plotted in the first accompanying graph, labeled “SO2 concentrations (demeaned).” At zero on the Y axis in Figure 1 in the slideshow above, levels at all plants — big emitters and small — are on average equal to their baseline. In January 2014, plants were well above their baseline; by July 2016, they were well below it. So average plant-level SO2 concentrations were declining slightly before the July 2014 compliance deadline, but they dropped far more dramatically after it.

Checking the reported data

Based on the CEMS data from all the plants, the researchers calculated that total SO2 emissions fell by 13.9 percent in response to the imposition of the policy in 2014. “That’s a substantial reduction,” notes Karplus. “But are those reported CEMS readings accurate?”

To find out, she, Zhang, and Almond compared the measured CEMS concentrations with SO2 concentrations detected in the atmosphere by NASA’s Ozone Monitoring Instrument. “We believed that the satellite data could provide a kind of independent check on the policy response as captured by the CEMS measurements,” she says.

For the comparison, they limited the analysis to the 43 large plants — those with capacity greater than 1,000 MW — which should generate the strongest signal in the satellite observations. Figure 2 in the slideshow above shows data from both the CEMS and the satellite sources. Patterns in the two measures are similar, with substantial declines in the months just before and after July 2014. That general agreement suggests that the CEMS measurements can serve as a good proxy for atmospheric concentrations of SO2.

To double-check that outcome, the researchers selected 35 relatively isolated power plants whose capacity makes up at least half of the total capacity of all plants within a 35-kilometer radius. Using that restricted sample, they again compared the CEMS measurements and the satellite data. They found that the new emissions standards reduced both SO2 measures. However, the SO2 concentrations in the CEMS data fell by 36.8 percent after the policy, while concentrations in the satellite data fell by only 18.3 percent. So the CEMS measurements showed twice as great a reduction as the satellite data did. Further restricting the sample to isolated power plants with capacity larger than 1,000 MW produced similar results.
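The isolation screen itself is simple to express. Below is one plausible implementation in Python, assuming a hypothetical plant table with lat, lon, and capacity_mw columns and using great-circle distance; the paper's exact procedure may differ in detail.

```python
import numpy as np
import pandas as pd

# Hypothetical plant list with coordinates (degrees) and capacity (MW).
plants = pd.read_csv("plants.csv")

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def is_isolated(row, radius_km=35.0):
    # Total capacity of all plants within the radius (self included).
    d = haversine_km(row.lat, row.lon, plants["lat"].values, plants["lon"].values)
    nearby_capacity = plants["capacity_mw"].values[d <= radius_km].sum()
    # "Isolated": this plant supplies at least half of that capacity.
    return row.capacity_mw >= 0.5 * nearby_capacity

isolated = plants[plants.apply(is_isolated, axis=1)]
print(f"{len(isolated)} isolated plants")
```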

Key versus non-key regions

One possible explanation for the mismatch between the two datasets is that some firms overstated the reductions in their CEMS measurements. The researchers hypothesized that the difficulty of meeting targets would be higher in key regions, which faced the biggest cuts. In non-key regions, the limit fell from 400 to 200 milligrams per cubic meter (mg/m3). But in key regions, the limit went from 400 to 50 mg/m3. Firms may have been unable to make such a dramatic reduction in so short a time, so the incentive to manipulate their CEMS readings may have increased. For example, they may have put monitors on only a few of all their exhaust stacks, or turned monitors off during periods of high emissions.

Figure 3 in the slideshow above shows results from analyzing non-key and key regions separately. At large, isolated plants in non-key regions, the CEMS measurements show a 29.3 percent reduction in SO2 and the satellite data a 22.7 percent reduction. The ratio of the estimated post-policy declines is 77 percent — not too far out of line.

But a comparable analysis of large, isolated plants in key regions produced very different results. The CEMS measurements showed a 53.6 percent reduction in SO2, while the satellite data showed no statistically significant change at all.

One possible explanation is that power plants actually did decrease their SO2 emissions after 2014, but at the same time nearby industrial facilities or other sources increased theirs, with the net effect being that the satellite data showed little or no change. However, the researchers examined emissions from neighboring high-emitting facilities during the same time period and found no contemporaneous jump in their SO2 emissions. With that possibility dismissed, they concluded that manipulation of the CEMS data in regions facing the toughest emissions standards was “plausible,” says Karplus.

Compliance with the new standards

Another interesting question was how often the reported CEMS emissions levels were within the regulated limits. The researchers calculated the compliance rate at individual plants — that is, the fraction of time their emissions were at or below their limits — in non-key and key regions, based on their reported CEMS measurements. The results appear in Figure 4 in the slideshow above. In non-key regions, the compliance rate at all plants was about 90 percent in early 2014. It dropped a little in July 2014, when plants had to meet their (somewhat) stricter limits, and then went back up to almost 100 percent. In contrast, the compliance rate in key regions was almost 100 percent in early 2014 and then plummeted to about 50 percent at and after July 2014.
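A compliance rate of this kind reduces to a simple calculation: flag each reading as at or below its applicable limit, then average the flags over time. A minimal Python sketch, again with illustrative rather than actual column names:

```python
import pandas as pd

# Hypothetical hourly CEMS readings with each plant's limit attached.
readings = pd.read_csv("cems_hourly.csv", parse_dates=["timestamp"])

# A reading is compliant when it is at or below the applicable limit.
readings["compliant"] = readings["so2_mg_m3"] <= readings["limit_mg_m3"]

# Compliance rate = fraction of readings in compliance, computed here
# per region and per month, mirroring the cut shown in Figure 4.
rate = (readings
        .groupby(["region", readings["timestamp"].dt.to_period("M")])["compliant"]
        .mean())
print(rate)
```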

Karplus, Zhang, and Almond interpret that result as an indication of the toughness of complying with the stringent new standards. “If you think about it from the plant’s perspective, complying with tighter standards is a lot harder than complying with more lenient standards, especially if plants have recently made investments to comply with prior standards, but those changes are no longer adequate,” she says. “So in these key regions, many plants fell out of compliance.”

She makes another interesting observation. Their analyses had already produced evidence that firms in key areas may have falsified their reported CEMS measurements. “So that means they could be both manipulating their data and complying less,” she says.

Encouraging results plus insights for policymaking

Karplus stresses the positive outcomes of their study. She’s encouraged that the CEMS and satellite data both show emission levels dropping at most plants. Compliance rates were down at some plants in key regions, but that’s not surprising when the required cuts were large. And she notes that even though firms may not have complied, they still reduced their emissions to some extent as a result of the new standard.

She also observes that, for the most part, there’s close correlation between the CEMS and satellite data. So the quality of the CEMS data isn’t all bad. And where it’s bad — where firms may have manipulated their measurements — it may have been because they’d been set a seemingly impossible task and timeline. “At some point, plant managers might just throw up their hands,” says Karplus. The lesson for policymakers may be to set emissions-reduction goals that are deep but long-term so that firms have enough time to make the necessary investment and infrastructure adjustments.

To Karplus, an important practical implication of the study is “demonstrating that you can look at the alignment between ground and remote data sources to evaluate the impact of specific policies.” A series of tests confirmed the validity of their method and the robustness of their results. For example, they performed a comparable analysis focusing on July 2015, when there was no change in emissions standards. There was no evidence of the same effects. They accounted for SO2 emitted by manufacturing facilities and other sources, and their results were unaffected. And they demonstrated that when clouds or other obstructions interfered with satellite observations, the resulting data gap had no impact on their results.

The researchers note that their approach can be used for other short-lived industrial air pollutants and by any country seeking low-cost tools to improve data quality and policy compliance, especially when plants’ emissions are high to begin with. “Our work provides an illustration of how you can use satellite data to obtain an independent check on emissions from pretty much any high-emitting facility,” says Karplus. “And, over time, NASA will have instruments that can take measurements that are even more temporally and spatially resolved, which I think is quite exciting for environmental protection agencies and for those who would seek to improve the environmental performance of their energy assets.”

This research was supported by a seed grant from the Samuel Tak Lee Real Estate Entrepreneurship Laboratory at MIT and by the U.S. National Science Foundation.



from MIT News https://ift.tt/35b36a7

Bringing artificial intelligence and MIT to middle school classrooms

In the age of Alexa, YouTube recommendations, and Spotify playlists, artificial intelligence has become a way of life, improving marketing and advertising, e-commerce, and more. But what are the ethical implications of technology that collects and learns personal information? How should society navigate these issues and shape the future?

A new curriculum designed for middle school students aims to help them understand just that at an early age, as they grow up surrounded by the technology. The open-source educational material, designed by an MIT team and piloted during Massachusetts STEM Week this past fall, teaches students how AI systems are designed, how they can be used to influence the public — and how to use them to be successful in the jobs of the future.

During Mass STEM Week in October, middle schools across the commonwealth replaced their regular curriculum with an immersive week of hands-on learning led by a team including Cynthia Breazeal, associate professor of media arts and sciences at MIT; Randi Williams ’18, graduate research assistant in the Personal Robots Group at the MIT Media Lab; and the nonprofit organization i2 Learning.

“Preparing students for the future means having them engage in technology through hands-on activities. We provide students with tools and conceptual frameworks where we want them to engage with our materials as conscientious designers of AI-enabled technologies,” Breazeal says. “As they think through designing a solution to address a problem in their community, we get them to think critically about the ethical implications of the technology.”

Three years ago, the Personal Robots Group began a program around teaching AI concepts to preschoolers. This effort then broadened into planning learning experiences for more children, and the group developed a curriculum geared toward middle school students. Last spring, an AI curriculum was shared with teachers and piloted in Somerville, Massachusetts, to determine which activities resonated the most in the classrooms.

“We want to make a curriculum in which middle-schoolers can build and use AI — and, more importantly, we want them to take into account the societal impact of any technology,” says Williams.

This curriculum, How to Train Your Robot, was first piloted at an i2 summer camp in Boston before being presented to teachers from local schools during Mass STEM Week. The teachers, many of whom had little familiarity with STEM subjects, also participated in two days of professional development training to prepare them to deliver more than 20 class hours of AI content to their students. The curriculum ran in three schools across six classrooms.

The AI curriculum incorporates the work of Blakeley Hoffman Payne, a graduate research assistant in the Personal Robots Group, whose research focuses on the ethics of artificial intelligence and how to teach children to design, use, and think about AI. Students participated in discussions and creative activities, designing robot companions and using machine learning to solve real-world problems they have observed. At the end of the week, students share their inventions with their communities.

“AI is an area that is becoming increasingly important in people’s lives,” says Ethan Berman, founder of i2 Learning and MIT parent. “This curriculum is very relevant to both students and teachers. Beyond just being a class on technology, it focuses on what it means to be a global citizen.”

The creative projects provided opportunities for students to consider problems from a variety of angles, including thinking about issues of bias ahead of time, before a system is designed. For example, one student, whose project focused on sign language, trained her sign-language-recognition algorithm on examples from students with a wide range of skin tones, and incorporated adults, too — considering potential algorithmic bias to inform the design of the system.

Another group of students built a “library robot,” designed to help find and retrieve a book for people with mobility challenges. Students had to think critically about why and how this might be helpful, and also to consider the job of a librarian and how this would impact a librarian’s work. They considered how a robot that finds and retrieves books might be able to free up more of a librarian’s time to actually help people and find information for them.

Some of the current opportunities include scaling for more classrooms and schools, and also incorporating some other disciplines. There is interest in incorporating social studies, math, science, art, and music by finding ways to weave these other subjects into the AI projects. The main focus is on experiential learning that impacts how students think about AI.

“We hope students walk away with a different understanding of AI and how it works in the world,” says Williams, “and that they feel empowered to play an important role in shaping the technology.”



from MIT News https://ift.tt/2Qvj3CF

Featured video: 50 years of Interphase EDGE

Fifty years ago, in response to the assassination of Martin Luther King Jr., a new legacy was born at MIT: Project Interphase, a summer session for incoming first-year students that aims to ease the transition to the Institute and build community among new students.

This fall, alumni, current students, faculty, staff, administrators, friends, and family gathered to celebrate the 50th anniversary of the program, today known as Interphase EDGE (Empowering Discovery, Gateway to Excellence).

As part of the 50th anniversary celebration, the MIT Office of Minority Education, which coordinates the program, welcomed back Shirley Ann Jackson ’68, PhD ’73, president of Rensselaer Polytechnic Institute and one of the leaders behind the original idea for the program; as well as several members of the Project Interphase inaugural cohort, including Sylvester “Jim” Gates ’73, PhD ’77, a professor of physics at Brown University.

A new video celebrates the history of Interphase EDGE, which now extends beyond an initial summer session into students’ first two academic years. “I recall that experience being really instrumental in helping me to feel a part of the MIT community,” says Eboney Hearn ’01, who is now the executive director of MIT’s Office of Engineering Outreach Programs.

Adds Gates: “It was the singular, most important academic experience I ever had in my life.”

Submitted by: Office of Minority Education | Video by: MIT Video Productions | 3 min, 48 sec



from MIT News https://ift.tt/2QzJ7wf

Thursday, December 26, 2019

Jeffrey Grossman named head of the Department of Materials Science and Engineering

Jeffrey Grossman, the Morton and Claire Goulder and Family Professor in Environmental Systems and a MacVicar Faculty Fellow, has been appointed the new head of the Department of Materials Science and Engineering, effective Jan. 1, 2020.

Grossman received his PhD in theoretical physics from the University of Illinois and performed postdoctoral work at the University of California at Berkeley. He was a Lawrence Fellow at the Lawrence Livermore National Laboratory and returned to Berkeley as director of a Nanoscience Center and head of the Computational Nanoscience research group, with a focus on energy applications. In fall 2009, he joined MIT, where he has developed a research program known for its contributions to energy conversion, energy storage, membranes, and clean-water technologies.

Grossman’s passion for teaching and outstanding contributions to education are evident in courses such as 3.091 (Introduction to Solid-State Chemistry), in which Grossman applies MIT’s “mens-et-manus” (mind-and-hand) learning philosophy. He uses “goodie bags” containing tools and materials that he covers in his lectures, encouraging hands-on learning and challenging students to ask big questions, take chances, and collaborate with one another.

In recognition of his contributions to engineering education, Grossman was named an MIT MacVicar Faculty Fellow and received the Bose Award for Excellence in Teaching, in addition to being named a fellow of the American Physical Society. He has published more than 200 scientific papers, holds 17 current or pending U.S. patents, and recently co-founded a company, Via Separations, to commercialize graphene-oxide membranes.

“Professor Grossman has done remarkable work in materials science and engineering, in particular energy conversion, energy storage, and clean-water technologies,” says Anantha Chandrakasan, dean of the MIT School of Engineering and the Vannevar Bush Professor of Electrical Engineering and Computer Science. “He has demonstrated exceptional commitment and vision as an educator. I am thrilled that he will be serving as the new head of our materials science and engineering department, and know he will be a tremendous leader.”



from MIT News https://ift.tt/360K4US

Chemistry bonds “quirky” researchers in hard-working Surendranath lab

When Sneaky the Lizard received his PhD in chemistry from MIT, an enthusiastic team of researchers in the lab of Yogesh “Yogi” Surendranath was there to celebrate. Although Sneaky is just a fictional, photoshopped character, he’s an important part of the lab culture, and his “graduation” was akin to a family milestone.

“Sneaky the Lizard graduated in 2018, despite never showing up to work,” says Surendranath, the Paul M. Cook Career Development Associate Professor of Chemistry, while proudly showing off a lab photo with Sneaky up-front and center. “My group is so weird, but I love them so much.”

The Surendranath lab is a tight-knit group that enjoys a lot of inside jokes — about mangoes and coconuts, as well as imaginary lizards. But it’s also about groundbreaking work in electrochemistry that is opening up new paths to a low-carbon future.

Those who work in the lab say the two are related.

“At the end of the day, we work on really, really hard problems, and in order to work in that environment and stay sane, you need a culture that’s supportive and makes it fun and exciting and interesting,” says Surendranath, who this summer received a Presidential Early Career Award for Scientists and Engineers, the highest honor the U.S. government gives to outstanding scientists and engineers beginning independent careers.

“We’re one community wherever we are, and we all take pleasure in solving these problems at the electrochemical interfaces,” says postdoc Marcel Schreier. “This allows us to be a little bit ahead sometimes. We ask more questions and try and try and try to answer them.”

All of the work in the Surendranath lab centers on using electricity to rearrange chemical bonds — fundamental scientific research with a host of possible applications. A key focus is finding ways to make carbon dioxide (CO2), a major greenhouse gas, useful — research central to addressing climate change. Surendranath, who serves as the associate director of the Carbon Capture, Utilization, and Storage Center, one of the Low-Carbon Energy Centers run by the MIT Energy Initiative (MITEI), says, “Our whole group works on the grand challenges MITEI undertakes on the low-carbon future of energy.”

A wealth of applications

Already, the Surendranath group has made major advances in the design of catalysts for converting CO2 into carbon monoxide — work that holds promise for one day using renewable energy to turn CO2 emissions into high-quality fuels. The lab has also developed a new graphite-based catalyst that could potentially replace expensive and rare metals in fuel cells.

“Our work is so fundamental, there isn’t a specific application we’re targeting. Batteries, fuel cells, any electrochemical transduction technology is going to have an interfacial question that we’re hoping to address,” says postdoc Michael Pegis.

Interestingly, the 18 members of the lab tackle many different kinds of questions within the broad spectrum of electrochemical research. While Pegis works on how electric fields influence the rate of bond-breaking and bond-forming reactions in oxygen reduction reactions — work that could improve fuel cells, for example — Jonathan “Jo” Melville, a PhD candidate and Tata Fellow, is researching nitrogen fixation for fertilizers in an effort to find a less energy-intensive way to produce food.

“Nitrogen is key for feeding billions around the world,” Melville says, noting that without nitrogen-rich fertilizers, there would not be enough arable land on earth to feed the population. Since the current system of production uses fossil fuels, generating roughly 2 percent of anthropogenic CO2 emissions, Melville is hoping to develop a sustainable alternative process. “I went into chemistry because I really care about solving the energy crisis,” he says.

Schreier’s work takes on the challenge of reaching a low-carbon future from another angle. He focuses on the catalytic capabilities of copper in the hope of finding new ways to store energy chemically — work broadly applicable to the challenge of improving the storage of energy generated by such intermittent sources as solar and wind.

PhD candidate Soyoung Kim, meanwhile, works to make useful chemicals from natural gas using metal-ion catalysts driven by electricity — a method she says could make it possible to sustain the reaction with energy from renewable sources.

For lab members — including specialists in inorganic chemistry, physical chemistry, chemical engineering, and electrochemistry — the wide variety of work taking place in the lab expands the opportunities for useful collaboration. “There’s so much knowledge in so many fields, I’ve been able to learn about new things — like computational chemistry from a postdoc who sits behind me,” Pegis says.

Surendranath deliberately fosters this synergy through regular team meetings as well as off-site activities such as hiking trips and retreats. “I think of science as a gift economy,” he says — with each researcher giving the gift of time and skills to other lab members in full expectation that similar gifts will be returned.

“We help each other all the time, informally,” Schreier says. “If someone has a problem, they will start drawing on the white board, and everyone will chime in and offer solutions.”

This esprit de corps carries through to everyday lab chores. There is no lab manager in the Surendranath lab; responsibilities are shared by the team, with individuals taking on such jobs as overseeing safety procedures, caring for particular instruments, ordering solvents, and organizing cleanups. Recently, the group worked in shifts to bar-code 35,000 chemicals. “In some cases, a lab manager can be useful, but it can be good to get together to make sure the lab is a cleaner and safer place,” Pegis says.

“We have lab tasks,” Schreier explains. “This works quite smoothly.”

Lab members also make their own hours and work out disputes among themselves. “I give my students enormous freedom,” says Surendranath, who was recently awarded the Nobel Laureate Signature Award for Graduate Education in Chemistry from the American Chemical Society, together with his graduate student Anna Wuttig PhD ’18. (Wuttig is now a postdoc at the University of California at Berkeley.) “All I care about is that they care about the science and do great work,” says Surendranath.

Mangoes, kites, and coconuts

With so much independent thinking, it’s perhaps not surprising that the word “quirky” comes up a lot when members are asked about the lab.

“Yogi is very supportive and approachable as a boss, while super-energetic and engaging when it comes to discussing science. That has attracted many hard-working and sometimes quirky people to the lab,” Kim says.

“It’s definitely a very quirky group of people,” Pegis agrees.

Indeed, the description applies even to Surendranath himself, who is crazy for mangoes, fascinated by tumbleweeds, and passionate about kite-flying. Perhaps that’s why he built a team that supports each member — quirks and all.

Schreier tells a story to illustrate. The lab was on a hike together in the White Mountains and running behind schedule because Surendranath needed to bring a coconut with him — a lab tradition with somewhat obscure origins — and he had had trouble finding one. So, once the team reached the peak, everyone was eager to head back — except Schreier. He had spotted a radio tower (a passion of his) and could not resist dashing off for a closer look, delaying everyone.

When he got back, “the whole group, with Yogi in the center, was waiting for me very patiently. It seemed to them the most normal thing that I would need to check out this transmitting tower,” he says. The experience really warmed Schreier’s heart and is one reason the team is so special to him. “It’s the way the group works. Everyone’s interests are taken seriously.”

Melville agrees, saying this depth of support has made it easier for him to cope with the pressures of grad school and noting that it all comes from the top. “Yogi sets the gold standard for proactive and ethical mentorship,” he says. “We love him.”

The feeling is mutual. “I love my people,” Surendranath says. “It is a true joy to interact with enthusiastic, like-minded, passionate people every day and engage with them on really stimulating problems … I think the culture day-to-day is more rewarding than the science, because you have an impact on people’s lives: how they mature.”



from MIT News https://ift.tt/2tT4RvC

Wednesday, December 25, 2019

Scholarships help build an exceptional student body

Senior Emily Soice, a talented violinist, has thrived at MIT, pursuing a dual major in civil and environmental engineering and music. “MIT has an amazing music program,” she says. “You really get a rigorous conservatory experience here.” A member of two performance ensembles, she enjoys connecting with others on campus through their shared love of music. 

In her engineering studies, Soice is focused on the issue of sustainable agriculture. “The wealth of research opportunities at MIT is astounding,” she says. “I’m able to contribute to that research during my undergraduate years.”

A scholarship to MIT made it possible for Soice to pursue her passions and seek out solutions to pressing global challenges. “When I got into MIT,” she recalls, “my family had been unemployed for more than a year. It wouldn’t have been possible for me to go to MIT if I didn’t have a scholarship.”

As one of only five universities in the United States with need-blind admissions for all students, both U.S. and international, MIT is committed to meeting the full financial need of every accepted student without requiring them to take out loans, according to Stuart Schmill ’86, dean of admissions and student financial services. “Scholarships allow us to attract the best students from around the world, regardless of their financial or geographic background,” he says. 

This past fiscal year, nearly 5,000 alumni and friends around the country and the world contributed $38 million toward undergraduate financial aid — a record amount for the Institute. Gifts for scholarships ensure that MIT can continue to uphold its commitment to need-blind admissions. A priority of the MIT Campaign for a Better World — launched in 2016 to drive the Institute’s work on some of humanity’s biggest challenges — undergraduate student aid continues to have significant needs.

Fundraising for scholarships helps MIT continue to bring the most-promising students to campus regardless of income level. In academic year 2018-19, MIT provided need-based financial aid awards to 59 percent of undergraduate students, with a median scholarship of approximately $53,000, the equivalent of MIT’s 2019-20 undergraduate tuition. 

Both Soice and Schmill point out that financial aid does more than attract students to MIT. “It also helps them succeed once they’re here,” explains Schmill. “If students are stressed about finances, it’s going to affect their educational choices and their ability to participate fully in the life of the Institute.” 

“There is so much open to you once you get into MIT,” says Soice. “It’s an amazing place to explore, and having a scholarship has allowed me to explore so much.” After graduation, she plans to attend graduate school, then work for a nonprofit focused on solving problems in agriculture or food systems. 

Scholarships, according to Schmill, help create “a robustly talented and diverse class in order to enhance the living and learning environment, and therefore the educational outcomes, for all our students. Every scholarship introduces a new mind into the MIT community, and simultaneously enriches the life of the recipient and the campus.”

Emily Soice agrees. “Without a scholarship,” she says, “I wouldn’t be here.” 



from MIT News https://ift.tt/2ru1PNB

Monday, December 23, 2019

Widening metal tolerance for hydrogels

Researchers seeking to develop self-healing hydrogels have long sought to mimic the natural ability of mussels to generate strong, flexible threads underwater that allow the mussels to stick to rocks.

The natural process that gives these mussel threads, known as byssal threads, the ability to break apart and re-form is purely chemical, not biological, MIT graduate student Seth Cazzell noted in a presentation at the Materials Research Society fall meeting in Boston on Dec. 5.

The critical step in the process is the chemical binding of polymer chains to a metal atom (a protein-to-metal bond in the case of the mussel). These links are called cross-linked metal coordination bonds. Their greatest strength occurs when each metal atom binds to three polymer chains, and they form a network that results in a strong hydrogel.

In a recently published PNAS paper, Cazzell and associate professor of materials science and engineering Niels Holten-Andersen demonstrated a method to create a self-healing hydrogel in a wider range of metal concentrations through the use of competition controlled by the pH, or acidity and alkalinity, of the environment. Cazzell is a former National Defense Science and Engineering Graduate Fellow.

Using their computational model, Cazzell showed that in the absence of pH-controlled competition, excess metal — typically iron, aluminum, or nickel — overwhelms the ability of the polymer to form strong cross-links. In the presence of too much metal, the polymer chains bind singly to metal atoms instead of forming cross-linked complexes, and the material remains a liquid.

One commonly studied mussel-inspired metal-coordinating ligand is catechol. In this study, a modified catechol, nitrocatechol, was attached to polyethylene glycol. By studying the nitrocatechol system coordinated with iron, as well as a second model hydrogel system (histidine coordinated with nickel), Cazzell experimentally confirmed that the formation of strong cross-links could be induced even at excess metal concentrations. The result supports the team’s computational evidence for the competitive role of hydroxide ions (negatively charged hydrogen-oxygen pairs), which compete with the polymer for binding to metal.

In these solutions, polymer chains can bind to metal atoms singly, in pairs, or in threes. When more metal atoms bind to hydroxide ions, fewer metal atoms are available to bind the polymer, which increases the likelihood that polymer chains will bind to metal atoms in the strong triple cross-links that produce the desired putty-like gel.

“What we really like about this study is we’re not looking at biology directly, but we think it’s giving us nice evidence of something that might be happening in biology. So it’s an example of materials science informing what we think the organism is actually using to build these materials,” Cazzell says.

In simulations, Cazzell plotted the effect of the hydroxide competitor on strong hydrogel formation and found that as competitor strength increases, “we can enter into a range where we can form a gel almost anywhere.” But, he says, “Eventually the competitor gets too strong, and you lose the ability to form a gel at all.”
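That window behavior can be illustrated with a deliberately simple toy calculation in Python (not the paper’s actual thermodynamic model): treat the competitor as sequestering a fraction of the metal, and watch how the effective ligand-to-metal ratio moves relative to the ideal 3:1 stoichiometry for triple cross-links. Every number below is a placeholder.

```python
# Toy competition model (illustrative only, not the published model).
# A competitor such as hydroxide sequesters metal, so the metal that is
# effectively available to the polymer is roughly M_tot / (1 + Kc*C),
# where Kc*C is a dimensionless competitor strength set by pH.
M_tot, L_tot = 0.10, 0.10           # total metal and ligand (molar); metal in excess

for KcC in [0, 1, 3, 10, 30, 100]:  # sweep competitor strength
    M_eff = M_tot / (1 + KcC)       # metal left over after competition
    ratio = L_tot / M_eff           # effective ligand:metal ratio
    note = ("near the ideal 3:1 -> strong triple cross-links, gel"
            if 2 <= ratio <= 6
            else "metal excess -> single bonds dominate, liquid"
            if ratio < 2
            else "too little metal -> no gel")
    print(f"Kc*C = {KcC:>3}: ligand:metal = {ratio:6.1f}  ({note})")
```

The sweep reproduces the qualitative window Cazzell describes: too little competition leaves excess metal and mostly single bonds, a middle range lands near the 3:1 ratio that favors gelation, and too much competition starves the network of metal entirely.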

These results have potential for use in advanced 3D printing of synthetic tissues and other biomedical applications.

This work was supported by the National Science Foundation through the MIT Materials Research Laboratory’s Materials Research Science and Engineering Center program, and by the U.S. Office of Naval Research.



from MIT News https://ift.tt/2SkrQd6

The billion-year belch

Billions of years ago, in the center of a galaxy cluster far, far away (15 billion light-years, to be exact), a black hole spewed out jets of plasma. As the plasma rushed out of the black hole, it pushed away material, creating two large cavities 180 degrees from each other. In the same way you can calculate the energy of an asteroid impact by the size of its crater, Michael Calzadilla, a graduate student at the MIT Kavli Institute for Astrophysics and Space Research (MKI), used the size of these cavities to figure out the power of the black hole’s outburst.

In a recent paper in The Astrophysical Journal Letters, Calzadilla and his coauthors describe the outburst in galaxy cluster SPT-CLJ0528-5300, or SPT-0528 for short. Combining the volume and pressure of the displaced gas with the age of the two cavities, they were able to calculate the total energy of the outburst. At greater than 10^54 joules, an energy equivalent to about 10^38 nuclear bombs, this is the most powerful outburst reported in a distant galaxy cluster. Coauthors of the paper include MKI research scientist Matthew Bayliss and assistant professor of physics Michael McDonald.
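The energy bookkeeping follows the standard X-ray cavity method: a bubble inflated in the hot cluster gas carries an enthalpy of roughly 4pV if it is filled with relativistic plasma, and dividing the total by the cavity age gives a mean outburst power. The Python sketch below uses placeholder inputs chosen only to land near the quoted energy scale; none of these numbers are measurements from the SPT-0528 paper.

```python
import math

kpc = 3.086e19             # one kiloparsec in meters
p = 1.0e-10                # ambient gas pressure, Pa (placeholder)
r = 30 * kpc               # cavity radius, m (placeholder)
age_s = 1.0e8 * 3.156e7    # assumed cavity age: 100 million years, in seconds

V = (4.0 / 3.0) * math.pi * r**3   # volume of one spherical cavity
E_total = 2 * 4 * p * V            # two cavities, enthalpy ~4pV each

print(f"total outburst energy ~ {E_total:.1e} J")            # ~3e54 J
print(f"mean outburst power   ~ {E_total / age_s:.1e} W")
```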

The universe is dotted with galaxy clusters, collections of hundreds and even thousands of galaxies that are permeated with hot gas and dark matter. At the center of each cluster is a black hole, which goes through periods of feeding, where it gobbles up plasma from the cluster, followed by periods of explosive outburst, where it shoots out jets of plasma once it has reached its fill. “This is an extreme case of the outburst phase,” says Calzadilla of their observation of SPT-0528. Even though the outburst happened billions of years ago, before our solar system had even formed, it took around 6.7 billion years for light from the galaxy cluster to reach Chandra, NASA’s orbiting X-ray observatory.

Because galaxy clusters are full of gas, early theories about them predicted that as the gas cooled, the clusters would see high rates of star formation, which need cool gas to form. However, these clusters are not as cool as predicted and, as such, weren’t producing new stars at the expected rate. Something was preventing the gas from fully cooling. The culprits were supermassive black holes, whose outbursts of plasma keep the gas in galaxy clusters too warm for rapid star formation.

The recorded outburst in SPT-0528 has another peculiarity that sets it apart from other black hole outbursts. It’s unnecessarily large. Astronomers think of the process of gas cooling and hot gas release from black holes as an equilibrium that keeps the temperature in the galaxy cluster — which hovers around 18 million degrees Fahrenheit — stable. “It’s like a thermostat,” says McDonald. The outburst in SPT-0528, however, is not at equilibrium.

According to Calzadilla, if you look at how much power is released as gas cools onto the black hole versus how much power is contained in the outburst, the outburst is vastly overdoing it. In McDonald’s analogy, the outburst in SPT-0528 is a faulty thermostat. “It’s as if you cooled the air by 2 degrees, and the thermostat’s response was to heat the room by 100 degrees,” McDonald explains.

Earlier in 2019, McDonald and colleagues released a paper looking at a different galaxy cluster, one that displays completely opposite behavior to that of SPT-0528. Instead of producing an unnecessarily violent outburst, the black hole in this cluster, dubbed Phoenix, isn’t able to keep the gas from cooling. As a result, Phoenix is full of young star nurseries, which sets it apart from the vast majority of known galaxy clusters.

“With these two galaxy clusters, we’re really looking at the boundaries of what is possible at the two extremes,” McDonald says of SPT-0528 and Phoenix. He and Calzadilla will also characterize the more normal galaxy clusters, in order to understand the evolution of galaxy clusters over cosmic time. To explore this, Calzadilla is characterizing 100 galaxy clusters.

The reason for characterizing such a large collection is that each telescope image captures a cluster at a specific moment in time, whereas cluster behaviors unfold over cosmic time. These clusters cover a range of distances and ages, allowing Calzadilla to investigate how the properties of clusters change over cosmic time. “These are timescales that are much bigger than a human timescale or what we can observe,” explains Calzadilla.

The research is similar to that of a paleontologist trying to reconstruct the evolution of an animal from a sparse fossil record. But instead of bones, Calzadilla is studying galaxy clusters, ranging from SPT-0528 with its violent plasma outburst on one end to Phoenix with its rapid cooling on the other. “You’re looking at different snapshots in time,” says Calzadilla. “If you build big enough samples of each of those snapshots, you can get a sense of how a galaxy cluster evolves.”



from MIT News https://ift.tt/2SgXXKR

Bose grants for 2019 reward bold ideas across disciplines

Now in their seventh year, the Professor Amar G. Bose Research Grants support visionary projects that represent intellectual curiosity and a pioneering spirit. Three MIT faculty members have each been awarded one of these prestigious grants for 2019 to pursue diverse questions in the humanities, biology, and engineering.

At a ceremony hosted by MIT President L. Rafael Reif on Nov. 25 and attended by past awardees, Provost Martin Schmidt, the Ray and Maria Stata Professor of Electrical Engineering and Computer Science, formally announced this year’s Amar G. Bose Research Fellows: Sandy Alexandre, Mary Gehring, and Kristala L.J. Prather.

The fellowships are named for the late Amar G. Bose ’51, SM ’52, ScD ’56, a longtime MIT faculty member and the founder of the Bose Corporation. Speaking at the event, President Reif expressed appreciation for the Bose Fellowships, which enable highly creative and unusual research in areas that can be hard to fund through traditional means. “We are tremendously grateful to the Bose family for providing the support that allows bold and curious thinkers at MIT to dream big, challenge themselves, and explore.”

Judith Bose, widow of Amar’s son, Vanu ’87, SM ’94, PhD ’99, congratulated the fellows on behalf of the Bose family. “We talk a lot at this event about the power of a great innovative idea, but I think it was a personal mission of Dr. Bose to nurture the ability, in each individual that he met along the way, to follow through — not just to have the great idea but the agency that comes with being able to pursue your idea, follow it through, and actually see where it leads,” Bose said. “And Vanu was the same way. That care that was epitomized by Dr. Bose not just in the idea itself, but in the personal investment, agency, and nurturing necessary to bring the idea to life — that care is a large part of what makes true change in the world."

The relationship between literature and engineering

Many technological innovations have resulted from the influence of literature, one of the most notable being the World Wide Web. According to many sources, Sir Tim Berners-Lee, the web’s inventor, found inspiration in a short story by Arthur C. Clarke titled “Dial F for Frankenstein.” Science fiction has presaged a number of real-life technological innovations, including the defibrillator, noted in Mary Shelley’s “Frankenstein”; the submarine, described in Jules Verne’s “20,000 Leagues Under the Sea”; and earbuds, described in Ray Bradbury’s “Fahrenheit 451.” But the data about literature’s influence on STEM innovations are spotty, and these one-to-one relationships are not always clear-cut.

Sandy Alexandre, associate professor of literature, intends to change that by creating a large-scale database of the imaginary inventions found in literature. Alexandre’s project will enact the step-by-step mechanics of STEM innovation via one of its oft-unsung sources: literature. “To deny or sever the ties that bind STEM and literature is to suggest — rather disingenuously — that the ideas for many of the STEM devices that we know and love miraculously just came out of nowhere or from an elsewhere where literature isn’t considered relevant or at all,” she says.

During the first phase of her work, Alexandre will collaborate with students to enter into the database the imaginary inventions as they are described verbatim in a selection of books and other texts that fall under the category of speculative fiction — a category that includes but is not limited to the subgenres of fantasy, Afrofuturism, and science fiction. This first phase will, of course, require that students carefully read these texts in general, but also read for these imaginary inventions more specifically. Additionally, students with drawing skills will be tasked with interpreting the descriptions by illustrating them as two-dimensional images.

From this vast inventory of innovations, Alexandre, in consultation with students involved in the project, will decide on a short list of inventions that meet five criteria: they must be feasible, ethical, worthwhile, useful, and necessary. This vetting process, which constitutes the second phase of the project, is guided by a very important question: what can creating and thinking with a vast database of speculative fiction’s imaginary inventions teach us about what kinds of ideas we should (and shouldn’t) attempt to make into a reality? For the third and final phase, Alexandre will convene a team to build a real-life prototype of one of the imaginary inventions. She envisions this prototype being placed on exhibit at the MIT Museum.

The Bose research grant, Alexandre says, will allow her to take this project from a thought experiment to a lab experiment. “This project aims to ensure that literature no longer plays an overlooked role in STEM innovations. Therefore, the STEM innovation, which will be the culminating prototype of this research project, will cite a work of literature as the main source of information used in its invention.”

Nature’s role in chemical production

Kristala L.J. Prather ’94, the Arthur D. Little Professor of Chemical Engineering, has been focused on using biological systems for chemical production during the 15 years she’s been at the Institute. Biology as a medium for chemical synthesis has been successfully exploited to commercially produce molecules for uses that range from food to pharmaceuticals — ethanol is a good example. However, there is a range of other molecules with which scientists have been trying to work, and those efforts have been hampered by low yields and by a lack of defined steps for making a specific compound.

Prather’s research is rooted in the fact that there are a number of naturally (and unnaturally) occurring chemical compounds in the environment, and cells have evolved to be able to consume them. These cells have evolved or developed a protein that will sense a compound’s presence — a biosensor — and in response will make other proteins that help the cells utilize that compound for its benefit.

“We know biology can do this,” Prather says, “so if we can put together a sufficiently diverse set of microorganisms, can we just let nature make these regulatory molecules for anything that we want to be able to sense or detect?” Her hypothesis is that if her team exposes cells to a new compound for a long enough period of time, the cells will evolve the ability to either utilize that carbon source or develop an ability to respond to it. If Prather and her team can then identify the protein that’s now recognizing what that new compound is, they can isolate it and use it to improve the production of that compound in other systems. “The idea is to let nature evolve specificity for particular molecules that we’re interested in,” she adds.

Prather’s lab has been working with biosensors for some time, but her team has been limited to sensors that are already well characterized and that were readily available. She’s interested in how they can get access to a wider range of what she knows nature has available through the incremental exposure of new compounds to a more comprehensive subset of microorganisms.

“To accelerate the transformation of the chemical industry, we must find a way to create better biological catalysts and to create new tools when the existing ones are insufficient,” Prather says. “I am grateful to the Bose Fellowship Committee for allowing me to explore this novel idea.”

Prather’s findings as a result of this project hold the possibility of broad impacts in the field of metabolic engineering, including the development of microbial systems that can be engineered to enhance degradation of both toxic and nontoxic waste.

Adopting orphan crops to adapt to climate change

In the context of increased environmental pressure and competing land uses, meeting global food security needs is a pressing challenge. Although yield gains in staple grains such as rice, wheat, and corn have been high over the last 50 years, these have been accompanied by a homogenization of the global food supply; only 50 crops provide 90 percent of global food needs.

However, there are at least 3,000 plants that can be grown and consumed by humans, and many of these species thrive in marginal soils, at high temperatures, and with little rainfall. These “orphan” crops are important food sources for farmers in less developed countries but have been the subject of little research.

Mary Gehring, associate professor of biology at MIT, seeks to bring orphan crops into the molecular age through epigenetic engineering. She is working to promote hybridization, increase genetic diversity, and reveal desired traits for two orphan seed crops: an oilseed crop, Camelina sativa (false flax), and a high-protein legume, Cajanus cajan (pigeon pea).

C. sativa, which produces seeds with potential for uses in food and biofuel applications, can grow on land with low rainfall, requires minimal fertilizer inputs, and is resistant to several common plant pathogens. Until the mid-20th century, C. sativa was widely grown in Europe but was supplanted by canola, with a resulting loss of genetic diversity. Gehring proposes to recover this genetic diversity by creating and characterizing hybrids between C. sativa and wild relatives that have increased genetic diversity.

“To find the best cultivars of orphan crops that will withstand ever increasing environmental insults requires a deeper understanding of the diversity present within these species. We need to expand the plants we rely on for our food supply if we want to continue to thrive in the future,” says Gehring. “Studying orphan crops represents a significant step in that direction. The Bose grant will allow my lab to focus on this historically neglected but vitally important field.”



from MIT News https://ift.tt/2EJvs0n

MIT Inclusive Innovation Challenge drives a more equitable economy

In 2015, the world began to realize a stark reality — the unprecedented wealth and prosperity ushered in by the digital age was not being shared equally across society. Research indicated that income inequality was rising, and the headlines reflected a growing public fear of unemployment driven by automation. 

MIT Sloan School of Management's Erik Brynjolfsson and Andrew McAfee, co-directors of the MIT Initiative on the Digital Economy (IDE) and co-authors of “The Second Machine Age,” recognized this reality as a challenge that could be solved if the right resources were brought to bear.   

IDE’s response was the MIT Inclusive Innovation Challenge (IIC), a global tournament for entrepreneurs harnessing technology to ensure a more equitable future. Since the IIC was launched at the first MIT Solve in 2015, IDE has identified 160 organizations from around the world, awarding a total of $5 million in prizes to accelerate their missions. In three years, those IIC winners have collectively generated over $170 million in revenue, raised over $1 billion in capital, created more than 7,000 jobs, and served 350 million people. 

IIC awardees include entrepreneurs like Hugo Pinaretta, whose grandparents were smallholder farmers in Peru. He and his co-founders now lead AGROS, which applies remote sensing and precision agriculture technologies to increase small farmers’ yields. 

In another case, Helen Adeosun, a former home health care aide, launched CareAcademy, which provides online professional development to teach and up-skill caregivers, a growing profession that is unlikely to be replaced by robots or automation. Adeosun and her team provide opportunities for millions of workers to prepare for the future of health care.

The IIC has been a grassroots initiative — a small team relying on the power of an international community to scale and to drive their mission. They have worked with more than 600 experts to select IIC winners from among 4,500 global registrants and engaged more than 100 outreach partners: like-minded for-profit and nonprofit organizations that helped the IIC source inclusive innovators from every corner of the world. They also partnered with nine collaborator organizations, including Merck KGaA in Germany, MaRS Discovery District in Canada, and Liquid Telecom in Kenya, to host 14 celebrations on five continents, attended by more than 4,000 investors, policymakers, academics, and business leaders.

After accelerating the global future-of-work movement for the past four years, the IIC team began looking for ways to further amplify their impact. Coming full circle, they looked to MIT Solve, an Institute-wide initiative designed to address the world’s most pressing problems through partnership and open innovation. The IIC will transition into Solve in 2020, powering its Economic Prosperity Challenge to drive increased resources and global awareness to the inclusive innovators who are creating an equitable future of work for all. This transition will magnify the impact of IIC winners and the transformative, lasting change that is imperative for today’s global economy.



from MIT News https://ift.tt/2Sv3ei7

Friday, December 20, 2019

Finding a good read among billions of choices

With billions of books, news stories, and documents online, there’s never been a better time to be reading — if you have time to sift through all the options. “There’s a ton of text on the internet,” says Justin Solomon, an assistant professor at MIT. “Anything to help cut through all that material is extremely useful.”

With the MIT-IBM Watson AI Lab and his Geometric Data Processing Group at MIT, Solomon recently presented a new technique for cutting through massive amounts of text at the Conference on Neural Information Processing Systems (NeurIPS). Their method combines three popular text-analysis tools — topic modeling, word embeddings, and optimal transport — to deliver better, faster results than competing methods on a popular benchmark for classifying documents.

If an algorithm knows what you liked in the past, it can scan the millions of possibilities for something similar. As natural language processing techniques improve, those “you might also like” suggestions are getting speedier and more relevant. 

In the method presented at NeurIPS, an algorithm summarizes a collection of, say, books into topics based on commonly used words in the collection. It then divides each book into its five to 15 most important topics, with an estimate of how much each topic contributes to the book overall.

To compare books, the researchers use two other tools: word embeddings, a technique that turns words into lists of numbers to reflect their similarity in popular usage, and optimal transport, a framework for calculating the most efficient way of moving objects — or data points — among multiple destinations. 

Word embeddings make it possible to leverage optimal transport twice: first to compare topics within the collection as a whole, and then, within any pair of books, to measure how closely common themes overlap. 
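In outline, optimal transport is applied at two levels: between topics, with word-embedding distances as the ground cost, and then between documents’ topic distributions, with those topic-to-topic distances as the ground cost. The Python sketch below shows that two-level structure on toy random inputs using the POT optimal-transport library; it is a schematic of the idea, not the paper’s full pipeline (no topic-model fitting or truncation to the top five to 15 topics).

```python
import numpy as np
import ot  # POT: Python Optimal Transport (pip install pot)
from scipy.spatial.distance import cdist

# Toy stand-ins for the real inputs: `emb` holds one embedding vector per
# vocabulary word, each row of `topics` is a topic's distribution over
# words, and `doc_a`/`doc_b` are documents' distributions over topics.
rng = np.random.default_rng(0)
n_words, n_topics, dim = 50, 5, 20
emb = rng.normal(size=(n_words, dim))
topics = rng.dirichlet(np.ones(n_words), size=n_topics)
doc_a = rng.dirichlet(np.ones(n_topics))
doc_b = rng.dirichlet(np.ones(n_topics))

# Ground cost between words: distance between their embeddings.
word_cost = cdist(emb, emb)

# Level 1: the distance between two topics is the minimum cost of moving
# one topic's word mass onto the other's (an optimal-transport problem).
topic_cost = np.array([[ot.emd2(topics[i], topics[j], word_cost)
                        for j in range(n_topics)]
                       for i in range(n_topics)])

# Level 2: the distance between two documents is the minimum cost of
# moving one document's topic mass onto the other's, with topic-to-topic
# distances as the ground cost.
doc_distance = ot.emd2(doc_a, doc_b, topic_cost)
print(f"document distance: {doc_distance:.4f}")
```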

The technique works especially well when scanning large collections of books and lengthy documents. In the study, the researchers offer the example of Frank Stockton’s “The Great War Syndicate,” a 19th-century American novel that anticipated the rise of nuclear weapons. If you’re looking for a similar book, a topic model would help to identify the dominant themes shared with other books — in this case, nautical, elemental, and martial.

But a topic model alone wouldn’t identify Thomas Huxley’s 1863 lecture, “The Past Condition of Organic Nature,” as a good match. The writer was a champion of Charles Darwin’s theory of evolution, and his lecture, peppered with mentions of fossils and sedimentation, reflected emerging ideas about geology. When the themes in Huxley’s lecture are matched with Stockton’s novel via optimal transport, some cross-cutting motifs emerge: Huxley’s geography, flora/fauna, and knowledge themes map closely to Stockton’s nautical, elemental, and martial themes, respectively.

Modeling books by their representative topics, rather than individual words, makes high-level comparisons possible. “If you ask someone to compare two books, they break each one into easy-to-understand concepts, and then compare the concepts,” says the study’s lead author, Mikhail Yurochkin, a researcher at IBM. 

The result is faster, more accurate comparisons, the study shows. The researchers compared 1,720 pairs of books in the Gutenberg Project dataset in one second — more than 800 times faster than the next-best method.

The technique also does a better job of accurately sorting documents than rival methods — for example, grouping books in the Gutenberg dataset by author, product reviews on Amazon by department, and BBC sports stories by sport. In a series of visualizations, the authors show that their method neatly clusters documents by type.

In addition to categorizing documents more quickly and accurately, the method offers a window into the model’s decision-making process. Through the list of topics that appear, users can see why the model is recommending a document.

The study’s other authors are Sebastian Claici and Edward Chien, a graduate student and a postdoc, respectively, at MIT’s Department of Electrical Engineering and Computer Science and the Computer Science and Artificial Intelligence Laboratory, and Farzaneh Mirzazadeh, a researcher at IBM.



from MIT News https://ift.tt/2tDHjL7

SMART and NTU researchers design polymer that can kill drug-resistant bacteria

Researchers from Singapore-MIT Alliance for Research and Technology (SMART), MIT’s research enterprise in Singapore, and Nanyang Technological University (NTU) have designed an antimicrobial polymer that can kill bacteria resistant to commonly used antibiotics, including the superbug methicillin-resistant Staphylococcus aureus (MRSA). The breakthrough could pave the way for medicines to which bacteria develop resistance significantly more slowly, and help prevent hundreds of thousands of deaths each year caused by drug-resistant bacteria.

The new polymer is described in a paper published last month in Nature Communications. The work was carried out jointly by scientists at NTU and SMART’s Antimicrobial Resistance Interdisciplinary Research Group (AMR IRG), led by Mary Chan-Park, SMART AMR principal investigator and professor at NTU’s School of Chemical and Biomedical Engineering, and Kevin Pethe, associate professor at the Lee Kong Chian School of Medicine at NTU.

Increasing resistance to antimicrobial medicine is a cause for serious concern, with at least 700,000 deaths each year caused by drug-resistant infections and diseases, according to a recent World Health Organisation report. In the United States alone, an antibiotic-resistant infection is acquired every 11 seconds, while a related death occurs every 15 minutes. While alpha-peptides have long been used to treat resistant bacteria such as MRSA, they tend to be unstable or toxic in the body. So, for the first time, NTU and SMART researchers tested the use of beta-peptides to fight such bacteria in living beings. Designed for stability, the new polymer degrades slowly in the body, giving it more time to work. Importantly, it shows little to no toxicity.

“Typically, antibiotics don’t work on various forms of bacteria like biofilm and persistent bacteria as they become resistant,” says Chan-Park. “We are therefore really excited that our new beta-peptide polymer has shown great promise in combating existing antibiotic-resistant strains of bacteria. Further, it has also proven its lethality against biofilm and persistent types of bacteria, which current antibiotics have limited action upon.”

Innovative medical research like the new co-beta-peptide is a crucial step toward preventing the staggering number of deaths from persistent and resistant bacteria. AMR also has plans to test this polymer for curing MRSA-affected livestock. This is a growing issue globally, with up to 50 percent of pig herds in parts of Europe affected by the bacteria. The new drug will be particularly beneficial to farm workers, as MRSA has been detected in 20 to 80 percent of workers in MRSA-positive herds.

While the next step for the research is to test the polymer on animals infected by MRSA in pig farms, the researchers are also preparing to have the drug tested in clinical trials for public use.

“This is a promising new approach to combating antimicrobial resistance that hasn’t been done before,” says Pethe. “The toxicity and proof-of-concept studies have shown that this can be on the drug development pathway, as it shows good potency and low toxicity, and we look forward to having this developed as a topical drug for humans.”

Currently, AMR is looking for potential partners for further development of the antimicrobial polymers, particularly for human use.

The AMR IRG is a translational research and entrepreneurship program that tackles the growing threat of antimicrobial resistance. By leveraging talent and convergent technologies across Singapore and MIT, they aim to tackle AMR head-on by developing multiple innovative and disruptive approaches to identify, respond to, and treat drug-resistant microbial infections.

The Singapore-MIT Alliance for Research and Technology is MIT’s research enterprise in Singapore, established by MIT in partnership with the National Research Foundation of Singapore (NRF) in 2007. SMART is the first entity in the Campus for Research Excellence and Technological Enterprise, developed by NRF. SMART currently comprises an Innovation Center and six IRGs.



from MIT News https://ift.tt/38XG8pZ

When machine learning packs an economic punch

A new study co-authored by an MIT economist shows that improved translation software can significantly boost international trade online — a notable case of machine learning having a clear impact on economic activity.

The research finds that after eBay improved its automatic translation program in 2014, commerce shot up by 10.9 percent among pairs of countries where people could use the new system.   

“That’s a striking number. To have it be so clear in such a short amount of time really says a lot about the power of this technology,” says Erik Brynjolfsson, an MIT economist and co-author of a new paper detailing the results.

To put the results in perspective, he adds, consider that physical distance is, by itself, also a significant barrier to global commerce. The 10.9 percent change generated by eBay’s new translation software increases trade by the same amount as “making the world 26 percent smaller, in terms of its impact on the goods that we studied,” he says.

The paper, “Does Machine Translation Affect International Trade? Evidence from a Large Digital Platform,” appears in the December issue of Management Science. The authors are Brynjolfsson, who is the Schussel Family Professor of Management Science at the MIT Sloan School of Management, and Xiang Hui and Meng Liu, who are both assistant professors in the Olin Business School at Washington University in St. Louis.

Just cause

To conduct the study, the scholars examined what happened after eBay, in 2014, introduced its new eBay Machine Translation (eMT) system — a proprietary machine-learning program that, by several objective measures, significantly improved translation quality on eBay’s site. The new system was initially focused on English-Spanish translations, to facilitate trade between the United States and Latin America.

Previously, eBay had used Bing Translator to render the titles of objects for sale. By one evaluation measure, called the Human Acceptance Rate (HAR), in which three experts accept or reject translations, the eMT system increased the share of acceptable Spanish-language item titles on eBay from 82 percent to 90 percent.

Using administrative data from eBay, the researchers then examined the volume of trade on the platform, among countries, after the eMT system went into use. Other factors being equal, the study showed that the new translation system not only had an effect on sales, but that trade increased by 1.06 percent for each additional word in the titles of items on eBay.

That is a substantial change for a commerce platform on which, as the paper notes, items for sale often have long, descriptive titles such as “Diamond-Cut Stackable Thin Wedding Ring New .925 Sterling Silver Band Sizes 4-12,” or “Alpine Swiss Keira Women’s Trench Coast Double Breasted Wool Jacket Belted.” In those cases, making the translation clearer helps potential buyers understand exactly what they might be purchasing.

Given the study’s level of specificity, Brynjolfsson calls it “a really fortunate natural experiment, with a before-and-after that sharply distinguished what happened when you had machine translation and when you didn’t.”

The structure of the study, he adds, has enabled the researchers to say with confidence that the new eBay program, and not outside factors, directly generated the change in trade volume among affected countries.

“In economics, it’s often hard to do causal analyses and prove that A caused B, not just that A was associated with B,” says Brynjolfsson. “But in this case, I feel very comfortable using causal language and saying that improvement in machine translation caused the increase in international trade.”
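In regression form, this kind of natural experiment is commonly estimated as a difference-in-differences. The paper’s actual specification is richer, but a hedged sketch with invented data and variable names, assuming the statsmodels package, conveys the logic:

```python
# Hedged difference-in-differences sketch; synthetic data, invented variable
# names, and no controls -- not the paper's actual specification.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),  # 1 if the country pair got the new system
    "post": rng.integers(0, 2, n),     # 1 if the observation is after the rollout
})
# Synthetic log exports with a built-in 10.9 percent treatment effect.
df["log_exports"] = (0.5 * df.treated + 0.2 * df.post
                     + np.log(1.109) * df.treated * df.post
                     + rng.normal(0, 0.5, n))

# The coefficient on treated:post estimates the effect of the rollout.
model = smf.ols("log_exports ~ treated * post", data=df).fit()
print(model.params["treated:post"])  # close to log(1.109), i.e., about 0.103
```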

Larger puzzle: The productivity issue

The paper stems from an ongoing question about new technology and economic productivity. While many forms of artificial intelligence have been developed and expanded in the last couple of decades, the impact of AI, including things like machine-translation systems, has not been obvious in economic statistics.

“There’s definitely some amazing progress in the core technologies, including in things like natural language processing and translation,” Brynjolfsson says. “But what’s been lacking has been evidence of an economic impact, or business impact. So that’s a bit of a puzzle.”

When looking to see if an economic impact for various forms of AI could be measured, Brynjolfsson, Hui, and Liu thought machine translation “made sense, because it’s a relatively straightforward implementation,” Brynjolfsson adds. That is, better translations could influence economic activity, at least on eBay, without any other changes in technology occurring.

In this vein, the findings fit with a larger hypothesis Brynjolfsson has developed in recent years — that the adoption of AI technologies produces a “J-curve” in productivity. As Brynjolfsson has previously written, broad-ranging AI technologies nonetheless “require significant complementary investments, including business process redesign, co-invention of new products and business models, and investments in human capital” to have a large economic impact.

As a result, when AI technologies are introduced, productivity may appear to slow down, and when the complementary technologies are developed, productivity may appear to take off — in the “J-curve” shape.

So while Brynjolfsson believes the results of this study are clear, he warns against generalizing too much on the basis of this finding about the impact of machine learning and other forms of AI on economic activity. Every case is different, and AI will not always produce such notable changes by itself.

“This was a case where not a lot of other changes had to happen in order for the technology to benefit the company,” Brynjolfsson says. “But in many other cases, much more complicated, complementary changes are needed. That’s why, in most cases with machine learning, it takes longer for the benefits to be delivered.”



from MIT News https://ift.tt/38YFdFP

Exploring hip hop history with art and technology

A new museum is coming to New York City in 2023, the year of hip hop’s 50th birthday, and an MIT team has helped to pave the way for the city to celebrate the legacy of this important musical genre — by designing unique creative experiences at the intersection of art, learning, and contemporary technology.

With “The [R]evolution of Hip Hop Breakbeat Narratives,” a team led by D. Fox Harrell, professor of digital media and artificial intelligence and director of the MIT Center for Advanced Virtuality, has created an art installation that takes museum-goers on an interactive, personalized journey through hip hop history.

The installation served as the centerpiece of an event held this month by leaders of the highly anticipated Universal Hip Hop Museum (UHHM), which will officially open in just a few years in the Bronx — where many agree the genre of hip hop originated.

“Hip hop is much more than a musical genre. It is a global phenomenon, with a rich history and massive social and cultural impact, with local roots in the Bronx,” Harrell says. “As an educational center, the Universal Hip Hop Museum will have the power to connect people to the surrounding community.”

Harrell’s immersive art installation takes museum-goers on a journey through hip hop culture and history, from the 1970s to the present. However, not everyone experiences the installation in the same way. Using a computational model of users’ preferences and artificial intelligence technologies to drive interaction, the team of artists and computer scientists from the Center for Advanced Virtuality has created layered, personalized virtual experiences.

When approaching the exhibit, museum-goers are greeted by “The Elementals,” characters named after the five elements of hip hop (MC, DJ, Breakdance, Graffiti Art, and Knowledge) who guide users and ask key questions — “What is your favorite hip hop song?” or “Which of this pair of lyrics do you like the most?” Based on those answers, the Elementals take users through their own personalized narrative of hip hop history.

Harrell developed the Elementals with professors John Jennings of the University of California at Riverside and Stacey Robinson of the University of Illinois — artists collectively known as Black Kirby. This visual aesthetic ties the work into the rich, imaginative cultures and iconography of the African diaspora.

Through their conversations with the Elementals, people can explore broad social issues surrounding hip hop, such as gender, fashion, and location. At the end of their journey, they can take home a personalized playlist of songs. 

“We designed the Breakbeat Narratives installation by integrating Microsoft conversational AI technologies, which made our user modeling more personable, with a music visualization platform from the TunesMap Educational Foundation,” Harrell says.

The exploration of social issues is about as close to the heart of Harrell’s mission at the Center for Advanced Virtuality as one can get. At the center, Harrell designs virtual technologies to stimulate creative expression, cultural analysis, and positive social change.

“We wanted to tell stories that pushed beyond stereotypical representations, digging into the complexities of both empowering and problematic representations that often coexist,” he says. “This work fits into our endeavor called the Narrative, Orality, and Improvisation Research (NOIR) Initiative that uses AI technologies to forward the art forms of diverse global cultures.”

Through this art project enabled by contemporary technologies, Harrell hopes that he has helped museum leadership to achieve their goal of celebrating hip hop’s heritage and legacy.

“Now, people internationally can have a stake in this great art.”



from MIT News https://ift.tt/36Uf8Wv

Thursday, December 19, 2019

Researchers produce first laser ultrasound images of humans

For most people, getting an ultrasound is a relatively easy procedure: As a technician gently presses a probe against a patient’s skin, sound waves generated by the probe travel through the skin, bouncing off muscle, fat, and other soft tissues before reflecting back to the probe, which detects and translates the waves into an image of what lies beneath.

Conventional ultrasound doesn’t expose patients to harmful radiation as X-ray and CT scanners do, and it’s generally noninvasive. But it does require contact with a patient’s body, and as such, may be limiting in situations where clinicians might want to image patients who don’t tolerate the probe well, such as babies, burn victims, or other patients with sensitive skin. Furthermore, ultrasound probe contact induces significant image variability, which is a major challenge in modern ultrasound imaging.

Now, MIT engineers have come up with an alternative to conventional ultrasound that doesn’t require contact with the body to see inside a patient. The new laser ultrasound technique leverages an eye- and skin-safe laser system to remotely image the inside of a person. When trained on a patient’s skin, one laser remotely generates sound waves that bounce through the body. A second laser remotely detects the reflected waves, which researchers then translate into an image similar to conventional ultrasound.

In a paper published today in the Nature journal Light: Science & Applications, the team reports generating the first laser ultrasound images of humans. The researchers scanned the forearms of several volunteers and observed common tissue features such as muscle, fat, and bone, down to about 6 centimeters below the skin. These images, comparable to conventional ultrasound, were produced using remote lasers focused on a volunteer from half a meter away.

“We’re at the beginning of what we could do with laser ultrasound,” says Brian W. Anthony, a principal research scientist in MIT’s Department of Mechanical Engineering and Institute for Medical Engineering and Science (IMES), a senior author on the paper. “Imagine we get to a point where we can do everything ultrasound can do now, but at a distance. This gives you a whole new way of seeing organs inside the body and determining properties of deep tissue, without making contact with the patient.”

Anthony’s co-authors on the paper are lead author and MIT postdoc Xiang (Shawn) Zhang, recent doctoral graduate Jonathan Fincke, along with Charles Wynn, Matthew Johnson, and Robert Haupt of MIT’s Lincoln Laboratory.

Yelling into a canyon — with a flashlight

In recent years, researchers have explored laser-based methods in ultrasound excitation in a field known as photoacoustics. Instead of directly sending sound waves into the body, the idea is to send in light, in the form of a pulsed laser tuned at a particular wavelength, that penetrates the skin and is absorbed by blood vessels.

The blood vessels rapidly expand and relax — instantly heated by a laser pulse then rapidly cooled by the body back to their original size — only to be struck again by another light pulse. The resulting mechanical vibrations generate sound waves that travel back up, where they can be detected by transducers placed on the skin and translated into a photoacoustic image.

While photoacoustics uses lasers to remotely probe internal structures, the technique still requires a detector in direct contact with the body in order to pick up the sound waves. What’s more, light can only travel a short distance into the skin before fading away. As a result, other researchers have used photoacoustics to image blood vessels just beneath the skin, but not much deeper.

Since sound waves travel further into the body than light, Zhang, Anthony, and their colleagues looked for a way to convert a laser beam’s light into sound waves at the surface of the skin, in order to image deeper in the body. 

Based on their research, the team selected 1,550-nanometer lasers, a wavelength that is highly absorbed by water (and is eye- and skin-safe with a large safety margin). As skin is largely composed of water, the team reasoned that it should efficiently absorb this light, and heat up and expand in response. As it oscillates back to its normal state, the skin itself should produce sound waves that propagate through the body.

The researchers tested this idea with a laser setup, using one pulsed laser set at 1,550 nanometers to generate sound waves, and a second continuous laser, tuned to the same wavelength, to remotely detect reflected sound waves.  This second laser is a sensitive motion detector that measures vibrations on the skin surface caused by the sound waves bouncing off muscle, fat, and other tissues. Skin surface motion, generated by the reflected sound waves, causes a change in the laser’s frequency, which can be measured. By mechanically scanning the lasers over the body, scientists can acquire data at different locations and generate an image of the region.
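As a toy illustration of that scan-and-listen loop, the sketch below simulates one echo trace per scan position and stacks the envelope-detected traces into an image. It is a generic pulse-echo simulation with invented numbers, not the team’s laser-ultrasound reconstruction code:

```python
# Generic pulse-echo imaging toy; invented numbers, not the team's code.
import numpy as np
from scipy.signal import hilbert

c = 1540.0                 # speed of sound in soft tissue (m/s)
fs = 20e6                  # sampling rate (Hz)
t = np.arange(2048) / fs   # time axis for one echo trace
n_positions = 64           # lateral scan positions of the lasers
reflector_depth = 0.03     # a single reflector 3 cm below the surface (assumed)

image = np.zeros((len(t), n_positions))
for i in range(n_positions):
    delay = 2 * reflector_depth / c   # round-trip travel time to the reflector
    # Echo: a short 5 MHz pulse arriving after 'delay', plus measurement noise.
    trace = np.exp(-(((t - delay) * fs / 40.0) ** 2)) * np.sin(2 * np.pi * 5e6 * t)
    trace += 0.05 * np.random.randn(len(t))
    image[:, i] = np.abs(hilbert(trace))  # envelope detection

# 'image' is a simple B-mode-style map: depth (rows) by scan position (columns).
depth_axis = c * t / 2
print("reflector appears near depth:",
      round(float(depth_axis[image[:, 0].argmax()]), 4), "m")
```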

“It’s like we’re constantly yelling into the Grand Canyon while walking along the wall and listening at different locations,” Anthony says. “That then gives you enough data to figure out the geometry of all the things inside that the waves bounced against — and the yelling is done with a flashlight.”

In-home imaging

The researchers first used the new setup to image metal objects embedded in a gelatin mold roughly resembling skin’s water content. They imaged the same gelatin using a commercial ultrasound probe and found both images were encouragingly similar. They moved on to image excised animal tissue — in this case, pig skin — where they found laser ultrasound could distinguish subtler features, such as the boundary between muscle, fat, and bone.

Finally, the team carried out the first laser ultrasound experiments in humans, using a protocol that was approved by the MIT Committee on the Use of Humans as Experimental Subjects. After scanning the forearms of several healthy volunteers, the researchers produced the first fully noncontact laser ultrasound images of a human. The fat, muscle, and tissue boundaries are clearly visible and comparable to images generated using commercial, contact-based ultrasound probes.

The researchers plan to improve their technique, and they are looking for ways to boost the system’s performance to resolve fine features in the tissue. They are also looking to hone the detection laser’s capabilities. Further down the road, they hope to miniaturize the laser setup, so that laser ultrasound might one day be deployed as a portable device.

“I can imagine a scenario where you’re able to do this in the home,” Anthony says. “When I get up in the morning, I can get an image of my thyroid or arteries, and can have in-home physiological imaging inside of my body. You could imagine deploying this in the ambient environment to get an understanding of your internal state.” 

This research was supported in part by the MIT Lincoln Laboratory Biomedical Line Program for the United States Air Force and by the U.S. Army Medical Research and Materiel Command's Military Operational Medicine Research Program.



from MIT News https://ift.tt/2rVRPwS

Israel Ruiz to step down as MIT’s executive vice president and treasurer

Israel Ruiz will step down next year as MIT’s executive vice president and treasurer, a position he has held since 2011. President L. Rafael Reif announced the news today in a letter to MIT faculty and staff.

“Widely respected across MIT, Israel is a brilliant strategic thinker whose commitment to excellence has advanced innovative solutions to complex challenges while earning broad support along the way,” Reif wrote. “His efforts have transformed many aspects of our campus to better serve and support the MIT community. Since my earliest days as provost, he has been among my most important advisors.”

Ruiz, who has played a key role over the past decade in numerous MIT initiatives to advance innovation and entrepreneurship — efforts ranging from MIT’s Kendall Square Initiative to the launches of MITx, edX, and The Engine — plans to devote the next chapter of his career to efforts outside of academia in which he can more directly drive innovation and impact.

“I am immensely proud of my work to help launch these transformative initiatives,” Ruiz says. “At the same time, considering the accomplishments of the last decade and my career at MIT, I’ve been contemplating a change over the last couple of years. I feel it is time for me to focus firsthand on opportunities that accelerate innovation in the way this community has inspired me to do.”

Ruiz expects to transition out of his role at MIT during the spring semester. In his letter to the community, President Reif indicated that he will work in the coming months with members of Ruiz’s senior team — including Vice President for Finance Glen Shor and Vice President for Campus Services and Stewardship Joe Higgins — to determine how best to allocate Ruiz’s responsibilities.

Ruiz arrived at MIT as a master’s student in the MIT Sloan School of Management. After completing his degree in 2001, he became a consultant to MIT, serving former president Charles Vest and former provost Robert Brown. In 2002, Ruiz formally joined MIT as manager of financial planning and analysis, becoming associate director of the Office of Budget and Financial Planning in 2003. He was named director of finance in 2005, leading an Institute-wide rebalancing that several years later yielded the first balanced general unrestricted budget in 15 years.

Ruiz was named vice president for finance in 2007; in his letter to the community, President Reif noted that MIT has not had an operating loss since then. Ruiz became MIT’s executive vice president and treasurer in 2011.

In addition to strengthening MIT’s financial position, Ruiz has played an instrumental part in many of MIT’s major initiatives of the past decade. For instance, guided by President Reif’s vision of a new model for driving innovation from the lab to commercial reality, Ruiz led the development of The Engine, an accelerator for “tough tech” entrepreneurs from MIT and across the Boston area.

Launched in 2016, The Engine has since provided selected startups with specialized lab infrastructure and substantial capital — as well as an experienced support network. Ruiz brought in significant commitments for The Engine’s first $205 million venture fund and recruited Katie Rae as The Engine’s founding president and CEO. To date, The Engine has supported 20 companies.

Ruiz’s efforts to foster innovation in Cambridge have also included his extensive work with Provost Martin Schmidt and the MIT Investment Management Company (MITIMCo) to reshape Kendall Square, enhancing the dynamism of what has become home to one of the world’s greatest concentrations of innovative companies. He was central to MIT’s 2017 agreement with the federal government to redevelop the John A. Volpe National Transportation Systems Center, starting a process to turn a 14-acre parcel in Kendall Square into a more vibrant mixed-use site that will benefit MIT’s mission and the Cambridge community.

“Israel’s quiet leadership and trusted partnership have produced a dramatic change in and around Kendall Square,” says MITIMCo President Seth Alexander. “His vision has transformed MIT’s neighborhood into one of the most dynamic innovation districts anywhere in the world.”

In 2018, Ruiz played the key role in developing financial models and funding strategies to underpin the MIT Stephen A. Schwarzman College of Computing, MIT’s most significant structural change since the 1950s. He also played a critical role in negotiating the college’s $350 million foundational gift.

“I feel incredibly fortunate to have worked with Israel on several projects including The Engine and the MIT Schwarzman College of Computing,” says Dean of Engineering Anantha Chandrakasan. “He is highly collaborative, has always engaged energetically with our community in launching new initiatives, and has consistently offered me excellent advice. I continue to be grateful for his extraordinary work and leadership in creating a robust strategy during the 2008 financial crisis, and in keeping MIT on sound financial footing ever since.”

Working alongside outgoing Deputy Executive Vice President Tony Sharon, Ruiz fortified MIT’s administrative and operating units, including Finance, MIT Medical, Human Resources, Environmental Health and Safety, Sustainability, Campus Planning, Facilities, MIT Police, Information Systems and Technology, and Audit. Recognizing the need to fund renewal of the Institute’s aging infrastructure, Ruiz charted the MIT2030 campus framework, which balances new construction, renewal of older buildings, and the development of a staff to maintain these buildings for future generations of faculty and students. He then secured MIT Corporation approval of a $5.2 billion capital plan to bring this vision to life.

“What Israel has led and executed has been amazing to watch,” Chancellor Cynthia Barnhart says. “Through his leadership, vision, and partnership, we are seeing a transformation of our student residences, and of our student life and academic facilities. His deep commitment to our students has been reflected in his sustained support for key elements of the MIT student experience and education, including financial aid, student well-being, MindHandHeart, and MITx.”

Ruiz has also focused on improving services for the broader MIT community. Working with Executive Director Robin Elices, his other accomplishments have included the digital platform MIT Atlas and the MIT Atlas Service Center, as well as a new MIT Welcome Center expected to open in Kendall Square next summer. With Ruiz’s guidance, MIT has also become a leader in providing options and incentives to help employees improve their commutes while reducing vehicles on the road. According to a recent Boston Globe series, “No other major Boston-area employers … offer commuter incentives with the scale and sophistication of MIT’s.”

“I have been inspired by MIT for over two decades, since the very first day I arrived on campus as an admitted student and met Professor Paul Samuelson,” Ruiz says. “I am still mesmerized by what MIT and its brilliant people accomplish for the world.”



from MIT News https://ift.tt/372Puz9

New project advances commitment to expanding graduate housing

MIT is embarking on a project to design and construct new graduate housing at the west end of campus on the site of the West Lot parking area and Building W89 (MIT Police). Currently in an early planning stage, the apartment-style residence hall is expected to provide 550 new graduate student housing beds, completing the October 2017 commitment MIT made to add 950 beds to the graduate housing system on campus. 

Chancellor Cynthia Barnhart, Provost Martin Schmidt, and Executive Vice President and Treasurer Israel Ruiz announced the project in a letter to the graduate student community today. They highlighted the benefits of the project, noting that “graduate students and graduate student families will now have access to the convenience and independence of apartment-style living in a residence hall minutes away from the heart of campus. We are pleased to be diversifying our housing portfolio to include an option that promises to be an exciting alternative to living off-campus for many students.”

New housing will be constructed on West Lot parking site

Construction of a pair of buildings located along Vassar Street, on the site of Building W89 and the West Lot parking area adjacent to Simmons Hall, will begin after the completion of planning and design work that will be conducted over the next couple of years. In a study of potential locations by the Office of Campus Planning, this site stood out as an ideal solution for expanding graduate student housing options while also reinforcing the connection to Fort Washington Park and other graduate student communities, as well as further defining the Vassar Street corridor.

“We are very grateful for the long-term opportunities this project presents,” says Suzy Nelson, vice president and dean for student life. “Giving more graduate students the opportunity to fully engage in the Institute’s vibrant campus life is an essential way of enhancing their MIT experience. This exciting new project makes that possible, and it makes living at MIT more attractive to them and to their families.”

Graduate housing is a top priority

MIT’s graduate student population has been growing steadily over the past two decades. In recent years, two different Graduate Student Housing Working Group reports highlighted student housing as one of MIT’s competitive strengths and noted MIT’s attention to expanding on-campus graduate student housing over the years. In 1980 and 1990, MIT provided housing for 27 percent of its graduate students; today, MIT provides housing for 38 percent of the graduate student population, satisfying the preferences of today’s students with fewer shared bedrooms and more efficiency units.

Both the working group’s Report to the Provost (May 2014) and the working group’s Report to the Chancellor (August 2018) recommended that MIT undertake a further expansion of student housing, and the findings encouraged MIT to consider flexible and new on-campus housing options. Families, in particular, face housing challenges in the Boston-Cambridge, Massachusetts, area because of cost and availability. Both working groups noted that an increase in one- and two-bedroom apartment-style units would help address unmet demand and would give more graduate student families the opportunity to benefit from living on campus.

With these recommendations in mind, MIT made a commitment in October 2017 to its graduate students (and the City of Cambridge, as part of the Volpe zoning petition) to add 950 beds to MIT’s graduate student housing stock. 

Accommodations for student families and singles

As planned, the new West Campus graduate residence will fulfill the 2017 commitment and will respond to the working group reports by providing a mix of housing unit types that align with the evolving needs of graduate students and student families. 

On-campus graduate housing at MIT, which currently includes more than 2,500 beds across eight residence halls, will soon be supplemented by the new graduate student residence hall under construction in Kendall Square, providing more than 450 beds total (replacing Eastgate beds and adding 250 net new beds). The West Campus student residence is expected to add 550 new graduate student housing beds in a range of apartment styles, including single units as well as larger units suitable for families. These initiatives, in concert with the addition of 150 new graduate student housing beds resulting from renovations, will fulfill MIT’s commitment to add 950 beds to its graduate student housing system.

Next steps for the housing project

MIT is planning to work with an experienced third-party campus housing developer to design, develop, and operate the student residence hall. The developer will provide the financial capital for construction, giving MIT the financial flexibility and bandwidth to expedite adding beds while reserving MIT resources for making capital renewal improvements and addressing deferred maintenance work throughout the student housing system. In the upcoming months, MIT and the developer will work together to establish an agreement and formulate a program plan informed by the Graduate Student Housing Working Group reports. 

As part of the project, the MIT Police office, currently in Building W89, is expected to be relocated to accommodate the new complex. Planning for this move is underway, and more details will be shared when available. The project team is also working closely with the Parking and Transportation Office regarding the parking spaces that will be displaced. The Parking and Transportation Office will work with individual parkers on assignments, taking into consideration proximity to housing and offices. 

The graduate community will have opportunities to continue to engage in discussion around the residence project, providing the project team with feedback and valuable input. MIT is committed to beginning the permitting process for this project by the end of December 2020, and construction is expected to begin in 2021-22.



from MIT News https://ift.tt/2PVuwuM

Model beats Wall Street analysts in forecasting business financials

Knowing a company’s true sales can help determine its value. Investors, for instance, often employ financial analysts to predict a company’s upcoming earnings using various public data, computational tools, and their own intuition. Now MIT researchers have developed an automated model that significantly outperforms humans in predicting business sales using very limited, “noisy” data.

In finance, there’s growing interest in using imprecise but frequently generated consumer data — called “alternative data” — to help predict a company’s earnings for trading and investment purposes. Alternative data can comprise credit card purchases, location data from smartphones, or even satellite images showing how many cars are parked in a retailer’s lot. Combining alternative data with more traditional but infrequent ground-truth financial data — such as quarterly earnings, press releases, and stock prices — can paint a clearer picture of a company’s financial health on even a daily or weekly basis.

But, so far, it’s been very difficult to get accurate, frequent estimates using alternative data. In a paper published this week in the Proceedings of the ACM SIGMETRICS Conference, the researchers describe a model for forecasting financials that uses only anonymized weekly credit card transactions and three-month earnings reports.

Tasked with predicting quarterly earnings of more than 30 companies, the model outperformed the combined estimates of expert Wall Street analysts on 57 percent of predictions. Notably, the analysts had access to any available private or public data and other machine-learning models, while the researchers’ model used a very small dataset of the two data types.

“Alternative data are these weird, proxy signals to help track the underlying financials of a company,” says first author Michael Fleder, a postdoc in the Laboratory for Information and Decision Systems (LIDS). “We asked, ‘Can you combine these noisy signals with quarterly numbers to estimate the true financials of a company at high frequencies?’ Turns out the answer is yes.”

The model could give an edge to investors, traders, or companies looking to frequently compare their sales with competitors. Beyond finance, the model could help social and political scientists, for example, to study aggregated, anonymous data on public behavior. “It’ll be useful for anyone who wants to figure out what people are doing,” Fleder says.

Joining Fleder on the paper is EECS Professor Devavrat Shah, who is the director of MIT’s Statistics and Data Science Center, a member of the Laboratory for Information and Decision Systems, a principal investigator for the MIT Institute for Foundations of Data Science, and an adjunct professor at the Tata Institute of Fundamental Research.  

Tackling the “small data” problem

For better or worse, a lot of consumer data is up for sale. Retailers, for instance, can buy credit card transactions or location data to see how many people are shopping at a competitor. Advertisers can use the data to see how their advertisements are impacting sales. But getting those answers still primarily relies on humans. No machine-learning model has been able to adequately crunch the numbers.

Counterintuitively, the problem is actually a lack of data. Each financial input, such as a quarterly report or weekly credit card total, is only one number. Quarterly reports over two years total only eight data points. Credit card data for, say, every week over the same period amount to only roughly another 100 “noisy” data points, meaning they contain potentially uninterpretable information.

“We have a ‘small data’ problem,” Fleder says. “You only get a tiny slice of what people are spending and you have to extrapolate and infer what’s really going on from that fraction of data.”

For their work, the researchers obtained consumer credit card transactions — at typically weekly and biweekly intervals — and quarterly reports for 34 retailers from 2015 to 2018 from a hedge fund. Across all companies, they gathered 306 quarters’ worth of data in total.

Computing daily sales is fairly simple in concept. The model assumes a company’s daily sales remain similar, only slightly decreasing or increasing from one day to the next. Mathematically, that means sales values for consecutive days are multiplied by some constant value plus some statistical noise value — which captures some of the inherent randomness in a company’s sales. Tomorrow’s sales, for instance, equal today’s sales multiplied by, say, 0.998 or 1.01, plus the estimated number for noise.

If given accurate model parameters for the daily constant and noise level, a standard inference algorithm can calculate that equation to output an accurate forecast of daily sales. But the trick is calculating those parameters.
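Written out, that description corresponds to a simple linear state-space model. A minimal simulation of the dynamics, with constants invented for illustration rather than taken from the paper:

```python
# Minimal simulation of the daily-sales dynamics described above; all
# constants are invented for illustration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
a = 1.001          # day-over-day constant (e.g., 0.998 or 1.01)
noise_std = 50.0   # standard deviation of the daily sales noise
days = 90          # roughly one quarter

sales = np.empty(days)
sales[0] = 10_000.0
for t in range(1, days):
    # Tomorrow's sales = today's sales times a constant, plus noise.
    sales[t] = a * sales[t - 1] + rng.normal(0.0, noise_std)

# A credit card panel sees only an unknown fraction of weekly totals, noisily.
fraction = 0.01
weekly = sales[: days // 7 * 7].reshape(-1, 7).sum(axis=1)
observed = fraction * weekly + rng.normal(0.0, 5.0, size=weekly.shape)

print("true quarterly total:", round(float(sales.sum())))
print("observed weekly credit card signal:", observed.round(1))
```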

Untangling the numbers

That’s where quarterly reports and probability techniques come in handy. In a simple world, a quarterly report could be divided by, say, 90 days to calculate the daily sales (implying sales are roughly constant day-to-day). In reality, sales vary from day to day. Also, including alternative data to help understand how sales vary over a quarter complicates matters: Apart from being noisy, purchased credit card data always consist of some indeterminate fraction of the total sales. All that makes it very difficult to know how exactly the credit card totals factor into the overall sales estimate.

“That requires a bit of untangling the numbers,” Fleder says. “If we observe 1 percent of a company’s weekly sales through credit card transactions, how do we know it’s 1 percent? And, if the credit card data is noisy, how do you know how noisy it is? We don’t have access to the ground truth for daily or weekly sales totals. But the quarterly aggregates help us reason about those totals.”

To do so, the researchers use a variation of the standard inference algorithm, called Kalman filtering or Belief Propagation, which has been used in various technologies from space shuttles to smartphone GPS. Kalman filtering uses data measurements observed over time, containing noise inaccuracies, to generate a probability distribution for unknown variables over a designated timeframe. In the researchers’ work, that means estimating the possible sales of a single day.

To train the model, the technique first breaks down quarterly sales into a set number of measured days, say 90 — allowing sales to vary day-to-day. Then, it matches the observed, noisy credit card data to unknown daily sales. Using the quarterly numbers and some extrapolation, it estimates the fraction of total sales the credit card data likely represents. Then, it calculates each day’s fraction of observed sales, noise level, and an error estimate for how well it made its predictions.
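A scalar version of that filtering step might look like the following: a generic textbook Kalman filter with the parameters assumed known, whereas the paper also has to estimate them from the quarterly aggregates:

```python
# Generic scalar Kalman filter for the daily-sales model; parameters are
# assumed known here, whereas the paper also estimates them from data.
import numpy as np

def kalman_daily_sales(observations, a, q, f, r, x0, p0):
    """Estimate latent daily sales x_t from noisy observations y_t = f*x_t + v_t.

    a: day-over-day constant; q: process noise variance;
    f: observed fraction of sales; r: observation noise variance.
    """
    x, p = x0, p0
    estimates = []
    for y in observations:
        x, p = a * x, a * a * p + q      # predict one day ahead
        k = p * f / (f * f * p + r)      # Kalman gain
        x = x + k * (y - f * x)          # correct with the noisy observation
        p = (1 - k * f) * p
        estimates.append(x)
    return np.array(estimates)

# Example on synthetic data (daily observations for simplicity, rather than
# the weekly totals the real credit card data would provide).
rng = np.random.default_rng(1)
true = 10_000 * 1.001 ** np.arange(90) + rng.normal(0, 50, 90)
y = 0.01 * true + rng.normal(0, 5, 90)
est = kalman_daily_sales(y, a=1.001, q=50.0**2, f=0.01, r=5.0**2,
                         x0=10_000, p0=1e6)
print("mean absolute error in daily sales:",
      round(float(np.abs(est - true).mean()), 1))
```

Summing the filtered daily estimates then yields the weekly, monthly, or quarterly forecasts described below.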

The inference algorithm plugs all those values into the formula to predict daily sales totals. Then, it can sum those totals to get weekly, monthly, or quarterly numbers. Across all 34 companies, the model beat a consensus benchmark — which combines estimates of Wall Street analysts — on 57.2 percent of 306 quarterly predictions.

Next, the researchers are designing the model to analyze a combination of credit card transactions and other alternative data, such as location information. “This isn’t all we can do. This is just a natural starting point,” Fleder says.



from MIT News https://ift.tt/2Z2khcs