Friday, March 30, 2018

MIT Libraries host Grand Challenges Summit

Forty-five experts from across disciplines gathered at MIT March 19-23 for a week of workshops focused on the most vital issues in information science and scholarly communication. Supported by a grant from The Andrew W. Mellon Foundation, the summit on Grand Challenges in Information Sciences and Scholarly Communication aimed to identify critical problems that are solvable within 10 years and that have broad implications across the scholarly community.

“We gathered experts from across many domains, locations, and social roles,” said Chris Bourg, director of MIT Libraries. “We spent a lot of time developing an idea of what the challenges are from many points of view.”

The summit focused on three areas: scholarly discovery, digital curation and preservation, and open scholarship. For each topic, there was a keynote speaker and a one-and-a-half-day workshop to produce a draft research agenda.

Scholarly discovery

In the opening keynote, Kate Zwaard, director of digital strategy at the U.S. Library of Congress, shared creative projects from LC Labs, which encourages innovation with the library’s digital collections. One initiative invites the public to develop digital projects using congressional data; others use color as a way to explore the library’s catalog. “We’re still in the early days of the disruption that computation is going to bring to our profession,” said Zwaard. But she sees collaboration and experimentation as critical to keeping libraries welcoming places: “We need to invite people into the tent.” 

In the workshop that followed, participants considered several key concerns: how to make discovery environments that reflect the values of transparency, agency, and participation; how to ensure discovery is globally inclusive and supports mutual exchanges of ideas; and the political, social, policy/legal, and economic barriers to creating these kinds of environments.

Curation and preservation

Keynote speaker Anasuya Sengupta leads Whose Knowledge?, a global campaign to center the knowledge of marginalized communities on the internet. She stressed the responsibility of knowledge curators in deciding whose voices are heard and preserved. Three-quarters of the people online today are from Asia, Africa, and Latin America, but the content online does not reflect who is online. Creating a truly global, inclusive sense of knowledge, said Sengupta, is both a strategic choice and an ethical one: “So many communities are waiting for us to do this work.”

Workshop participants focused on four themes in digital curation and preservation: making knowledge global, making data useful, making participation open and inclusive, and promoting skill building.

Open scholarship

A longtime advocate of internet freedom and open scholarship, MIT Media Lab Director Joi Ito discussed the profound impact of the internet on scholarship and publishing. “The business model of transferring information was completely turned on its head,” he said. “Innovation and research was pushed to the edges because the cost of collaboration was diminished. What’s not keeping up are the academic publications.” Ito described high-impact “citizen science” projects using open-source technology, his work with Creative Commons to legally share knowledge, and a first-of-its-kind collaboration between the Media Lab and the MIT Press to transform publishing.

In the final workshop, participants explored possible incentives to motivate communities to participate in open scholarship and what infrastructures are needed to sustain it. Discussion also examined the challenges to establishing the credibility, durability, and integrity of the record of human knowledge.

A forthcoming summit report will be made available on the open publishing platform PubPub, where the community will be invited to comment. Background readings and presentations are also being shared through PubPub, and each keynote speech can be streamed at grandchallenges.mit.edu.



from MIT News https://ift.tt/2J7Ax3w

With buildings and infrastructure, it pays to take a life-cycle perspective

In the face of limited funding to address massive infrastructure needs, and with climate action top of mind, it is more important than ever for engineers, designers, and policy makers to understand the full economic and environmental costs of infrastructure project decisions — not just the impacts of material choice or initial construction, but the impacts of choices across the entire life cycle of a project.

“As we develop strategies to reach sustainability goals, it is vital that we adopt methodologies that use a life-cycle perspective to evaluate impacts and use that knowledge to create a strategic path moving forward,” says Jeremy Gregory, research scientist in the MIT Department of Civil and Environmental Engineering and executive director of the MIT Concrete Sustainability Hub (CSHub).

Life-cycle analysis methodologies exist for both environmental and economic impacts: life-cycle assessment (LCA) examines environmental impacts, while life-cycle cost analysis (LCCA) examines economic impacts. LCA and LCCA enable engineers, designers, and decision-makers to better understand opportunities to reduce environmental and economic impacts, but CSHub research has found that these tools are rarely used at the point in the decision-making process where they can have the greatest effect. The CSHub team recently released several new papers and materials discussing research designed to improve life-cycle thinking for buildings and pavements.

“For buildings, placing too much emphasis on minimizing initial costs and not paying enough attention to the use phase can lead to higher costs, both environmentally and economically,” says Gregory. “Construction projects that focus on first costs fail to account for costs associated with lifetime energy use, and the stakeholders who aren’t typically involved in early planning stages, such as future homeowners, insurance agencies, and taxpayers, are the ones left holding the bill.”

The environmental impacts are significant: in the United States, the heating, cooling, and operation of buildings and homes account for more than 40 percent of carbon dioxide emissions each year. The CSHub has several projects underway that quantify the full life-cycle impacts of buildings, from initial construction to demolition, and has developed building LCA tools that allow impacts to be quantified earlier in the design process than traditional methodologies allow. Researchers have published five recent papers on the topic, all of which can be found on the CSHub website in a section dedicated to building LCA.

“LCA and LCCA approaches work best when they accompany each other, by providing the necessary economic context to implement solutions into the marketplace,” explains Gregory. “Poorly insulated and leaky residential construction leads to high annual energy costs, which can result in substantially higher life-cycle costs. Likewise, roadway closures cause traffic congestion, which leads to higher costs for road users.”
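The first-cost trap Gregory describes can be made concrete with a back-of-the-envelope life-cycle cost comparison. The sketch below is illustrative only — the construction costs, energy costs, discount rate, and time horizon are invented for the example, not CSHub figures — but it shows how discounting lifetime energy costs can reverse a decision made on initial cost alone.

```python
# Illustrative life-cycle cost comparison (hypothetical numbers, not CSHub data):
# a cheaper-to-build but poorly insulated house vs. a costlier, well-insulated one.

def life_cycle_cost(first_cost, annual_energy_cost, years=30, discount_rate=0.03):
    """Net present value: first cost plus discounted annual energy costs."""
    pv_energy = sum(annual_energy_cost / (1 + discount_rate) ** t
                    for t in range(1, years + 1))
    return first_cost + pv_energy

# Hypothetical scenario: the efficient build costs more up front but far less to run.
cheap_build = life_cycle_cost(first_cost=200_000, annual_energy_cost=4_000)
efficient_build = life_cycle_cost(first_cost=215_000, annual_energy_cost=1_500)

print(f"Low first cost:  ${cheap_build:,.0f}")
print(f"Efficient build: ${efficient_build:,.0f}")
```

Under these assumed numbers, the building that looks $15,000 cheaper at construction ends up tens of thousands of dollars more expensive over 30 years once energy use is counted.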

For pavements, CSHub LCA work considers all life-cycle phases — initial construction, operation, maintenance, and end of life — along with factors such as traffic delay, lighting demand, and future maintenance, while LCCA research considers the full life cycle, project context, and future conditions, and also incorporates risk.

The team recently released a pavements LCCA and LCA info sheet, which highlights key concepts and statistics from CSHub studies. CSHub tools use probabilistic price projections compatible with existing software used by pavement designers, such as the Federal Highway Administration’s RealCost tool. One highlighted study found a 32 percent improvement in 20-year cost estimates and LCCA results for roadway projects in Colorado when CSHub models were used.

CSHub research is supported by the Portland Cement Association and the Ready Mixed Concrete Research and Education Foundation.



from MIT News https://ift.tt/2pTncni

Engineers turn plastic insulator into heat conductor

Plastics are excellent insulators, meaning they can efficiently trap heat — a quality that can be an advantage in something like a coffee cup sleeve. But this insulating property is less desirable in products such as plastic casings for laptops and mobile phones, which can overheat, in part because the coverings trap the heat that the devices produce.

Now a team of engineers at MIT has developed a polymer thermal conductor — a plastic material that, counterintuitively, works as a heat conductor, dissipating heat rather than trapping it. The new polymers, which are lightweight and flexible, can conduct 10 times as much heat as most commercially used polymers.

“Traditional polymers are both electrically and thermally insulating. The discovery and development of electrically conductive polymers has led to novel electronic applications such as flexible displays and wearable biosensors,” says Yanfei Xu, a postdoc in MIT’s Department of Mechanical Engineering. “Our polymer can thermally conduct and remove heat much more efficiently. We believe polymers could be made into next-generation heat conductors for advanced thermal management applications, such as a self-cooling alternative to existing electronics casings.”

Xu and a team of postdocs, graduate students, and faculty have published their results today in Science Advances. The team includes Xiaoxue Wang, who contributed equally to the research with Xu, along with Jiawei Zhou, Bai Song, Elizabeth Lee, and Samuel Huberman; Zhang Jiang, a physicist at Argonne National Laboratory; Karen Gleason, associate provost of MIT and the Alexander and I. Michael Kasser Professor of Chemical Engineering; and Gang Chen, head of MIT’s Department of Mechanical Engineering and the Carl Richard Soderberg Professor of Power Engineering.

Stretching spaghetti

If you were to zoom in on the microstructure of an average polymer, it wouldn’t be difficult to see why the material traps heat so easily. At the microscopic level, polymers are made from long chains of monomers, or molecular units, linked end to end. These chains are often tangled in a spaghetti-like ball. Heat carriers have a hard time moving through this disorderly mess and tend to get trapped within the polymeric snarls and knots.

And yet, researchers have attempted to turn these natural thermal insulators into conductors. For electronics, polymers would offer a unique combination of properties, as they are lightweight, flexible, and chemically inert. Polymers are also electrically insulating, meaning they do not conduct electricity, and can therefore be used to prevent devices such as laptops and mobile phones from short-circuiting in their users’ hands.

Several groups have engineered polymer conductors in recent years, including Chen’s group, which in 2010 invented a method to create “ultradrawn nanofibers” from a standard sample of polyethylene. The technique stretched the messy, disordered polymers into ultrathin, ordered chains — much like untangling a string of holiday lights. Chen found that the resulting chains enabled heat to skip easily along and through the material, and that the polymer conducted 300 times as much heat as ordinary plastics.

But the insulator-turned-conductor could only dissipate heat in one direction, along the length of each polymer chain. Heat couldn’t travel between polymer chains, due to weak van der Waals forces — the interactions that loosely attract neighboring molecules to each other. Xu wondered whether a polymer material could be made to scatter heat away in all directions.

Xu conceived of the current study as an attempt to engineer polymers with high thermal conductivity, by simultaneously engineering intramolecular and intermolecular forces — a method that she hoped would enable efficient heat transport along and between polymer chains.

The team ultimately produced a heat-conducting polymer known as polythiophene, a type of conjugated polymer that is commonly used in many electronic devices.

Hints of heat in all directions

Xu, Chen, and members of Chen’s lab teamed up with Gleason and her lab members to develop a new way to engineer a polymer conductor using oxidative chemical vapor deposition (oCVD), whereby two vapors are directed into a chamber and onto a substrate, where they interact and form a film. “Our reaction was able to create rigid chains of polymers, rather than the twisted, spaghetti-like strands in normal polymers,” Xu says.

In this case, Wang flowed the oxidant into a chamber, along with a vapor of monomers — individual molecular units that, when oxidized, form into the chains known as polymers.

“We grew the polymers on silicon/glass substrates, onto which the oxidant and monomers are adsorbed and reacted, leveraging the unique self-templated growth mechanism of CVD technology,” Wang says.

Wang produced relatively large-scale samples, each measuring 2 square centimeters — about the size of a thumbprint.

“Because this sample is used so ubiquitously, as in solar cells, organic field-effect transistors, and organic light-emitting diodes, if this material can be made to be thermally conductive, it can dissipate heat in all organic electronics,” Xu says.

The team measured each sample’s thermal conductivity using time-domain thermal reflectance — a technique in which they shoot a laser onto the material to heat up its surface and then monitor the drop in its surface temperature by measuring the material’s reflectance as the heat spreads into the material.

“The temporal profile of the decay of surface temperature is related to the speed of heat spreading, from which we were able to compute the thermal conductivity,” Zhou says.
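As a rough illustration of the idea Zhou describes — fitting the surface-temperature decay to extract a time scale — the toy sketch below recovers a decay constant from simulated noisy data with a log-linear least-squares fit. Real time-domain thermal reflectance analysis fits a full heat-diffusion model to the measured signal; the exponential model, noise level, and decay constant here are all invented for illustration.

```python
import numpy as np

# Toy version of the analysis idea: the faster the surface temperature decays,
# the faster heat spreads into the material. Here we simulate a simple
# exponential decay with 1% multiplicative noise and recover its time constant.

rng = np.random.default_rng(0)
t = np.linspace(0.1, 5.0, 200)         # time after the heating pulse (arbitrary units)
tau_true = 1.5                         # assumed decay constant for the simulation
signal = np.exp(-t / tau_true) * (1 + 0.01 * rng.standard_normal(t.size))

# Log-linear least squares: ln(signal) ≈ -t / tau, so the fitted slope gives tau.
slope, intercept = np.polyfit(t, np.log(signal), 1)
tau_fit = -1.0 / slope

print(f"fitted decay constant: {tau_fit:.2f} (simulated value: {tau_true})")
```

In the actual measurement, the fitted quantity is related to the material’s thermal conductivity through a heat-diffusion model rather than read off directly as a single time constant.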

On average, the polymer samples conducted heat at about 2 watts per meter per kelvin — about 10 times the conductivity of conventional polymers. At Argonne National Laboratory, Jiang and Xu found that the polymer samples appeared nearly isotropic, or uniform. This suggests that the material’s properties, such as its thermal conductivity, should also be nearly uniform. Following this reasoning, the team predicted that the material should conduct heat equally well in all directions, increasing its heat-dissipating potential.
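To see what a tenfold jump in conductivity means for something like a device casing, one can apply Fourier’s law for steady-state conduction through a slab, q = k·ΔT/d. In the sketch below, 2 W/m·K is the value reported for the new polymer; the 0.2 W/m·K figure for a conventional polymer is inferred from the stated tenfold improvement, and the temperature difference and wall thickness are invented for illustration.

```python
# Steady-state heat flux through a thin wall via Fourier's law: q = k * dT / d.
# k values: 2 W/m·K for the new polymer (reported); 0.2 W/m·K assumed for a
# conventional polymer, consistent with the stated ~10x improvement.

def heat_flux(k_w_per_m_k, delta_t_kelvin, thickness_m):
    """Heat flux density (W/m^2) through a slab of given conductivity."""
    return k_w_per_m_k * delta_t_kelvin / thickness_m

dT, d = 20.0, 0.002  # hypothetical: 20 K across a 2 mm casing wall
conventional = heat_flux(0.2, dT, d)
new_polymer = heat_flux(2.0, dT, d)

print(f"Conventional polymer: {conventional:,.0f} W/m^2")
print(f"New polymer:          {new_polymer:,.0f} W/m^2 "
      f"({new_polymer / conventional:.0f}x)")
```

For the same wall and temperature difference, the flux scales linearly with conductivity, so the new polymer moves ten times as much heat out of the casing.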

Going forward, the team will continue exploring the fundamental physics behind polymer conductivity, as well as ways to enable the material to be used in electronics and other products, such as casings for batteries, and films for printed circuit boards.

“We can directly and conformally coat this material onto silicon wafers and different electronic devices,” Xu says. “If we can understand how thermal transport [works] in these disordered structures, maybe we can also push for higher thermal conductivity. Then we can help to resolve this widespread overheating problem, and provide better thermal management.”

This research was supported, in part, by the U.S. Department of Energy — Basic Energy Sciences and the MIT Deshpande Center.



from MIT News https://ift.tt/2GpyAxq

KSA meeting puts “Spotlight” on region’s diversity issues

The 10th annual meeting of the Kendall Square Association (KSA) took a hard look at the region’s widespread diversity issues, historical and current, and featured members of The Boston Globe’s Spotlight team, who discussed their latest investigation into racism in Boston.

The KSA is a nonprofit organization of 175 industry and academic partners — including Google, Microsoft, and MIT — in and around Kendall Square that promotes the vibrancy of the district. Each year, the KSA holds a meeting to elect new board members, provide updates on projects, and host guest speakers who share insights on technology, business, education, and other fields.

Welcoming around 350 attendees to the Boston Marriott Cambridge, Cambridge Mayor Marc McGovern praised the KSA and thanked the Spotlight reporters for taking on the “incredibly important topic” that’s critical to his city.

“Even here, in the People’s Republic of Cambridge,” he quipped, “we have race and class issues that sometimes we don’t want to address, because we think they don’t happen here, but they do. Putting a spotlight — pun intended — on that issue, and having us take a look at ourselves and self-reflect, is something we have to continuously do.”

Cambridge, he added, can be “a city of contradictions.” Even with high income levels, Cambridge “has a higher poverty rate than the state average. It’s also a community where 500 homeless people are on our streets every night and where death by overdose has doubled in the last year.”

The city is currently tackling those issues, he said: “We want to truly be the socially economic and just community we claim to be.”

The keynote panel comprised three members of the Globe’s current Spotlight team: reporter Andrew Ryan, columnist Adrian Walker, and editor Patricia Wen. The team’s latest seven-part project — titled, “Boston. Racism. Image. Reality.” — aims to “show through data that we aren’t as liberal and progressive in terms of our actions,” Wen said.

Projection screens in the front of the room displayed some of the data uncovered as part of the Spotlight series. For instance: A recent national survey by the Globe found black people ranked Boston as the least welcoming city for people of color among eight major U.S. cities; black student enrollment at many top Boston universities is in the single digits; only two black politicians have been elected to statewide office in the last 50 years; at publicly traded firms, only 1 percent of board members are black; and the median net worth for whites is nearly $250,000, while it’s a mere $8 for blacks. “That’s eight — as in single-digit eight,” said panel moderator Malick W. Ghachem, a professor of history at MIT, kicking off the discussion.

Early in the panel, Ghachem asked a core question: Why isn’t racism a scandal in Boston? He referenced the bombshell 2002 Spotlight report on the sex abuse scandal in the Boston archdiocese. The reporting team won the 2003 Pulitzer Prize for Public Service and the investigation became the basis for the 2015 film “Spotlight.”

“My guess is Hollywood won’t pick up this story,” Ghachem said. “Racism is not a scandal in America, generally speaking, and perhaps especially not a scandal in Boston.”

But Walker disagreed. It’s a scandal — just more widespread and without “specific bad actors.” People react to the issue differently than they would to “a specific, abusive priest, but that doesn’t mean it’s any less scandalous,” he said. “I don’t think it gets dismissed.”

Agreeing, Wen said media and the public generally seek a “villain” in news stories. “Racism is an issue that is just more ubiquitous” and lacks a specific villain to rail against, she said.

But, as Ghachem pointed out, the recent Spotlight report reveals Boston’s diversity issues have, in fact, been largely dismissed. The data indicate not much has changed since Globe reporters tackled the region’s racial equality in 1983: In 2015, 4.6 percent of black workers were officials and managers, compared to 4.5 percent in 1983; the black unemployment rate is still, after 30 years, double the rate among whites; and the “Vault” — an organization of Boston’s most powerful business leaders, now called “New Vault” — still has zero black members.

Some things have changed, Ryan added, such as that African-Americans “didn’t necessarily fear for their safety if they went into certain neighborhoods. But that’s a pretty low bar” for measuring progress, he said.

Conversation soon turned to the rising Seaport neighborhood in the South Boston Waterfront — one part of the Spotlight investigation — which has very few black residents. In the past decade, according to the Spotlight team, lenders have issued only three residential mortgages, out of 660, to black buyers. Yet, the neighborhood is funded by nearly $18 billion in taxpayer dollars.

The neighborhood represents a squandered opportunity for Boston to encourage diversity, according to the Spotlight team. Their report, Ryan said, aimed to “underscore the point that … it’s not just private funding that has created this situation. … When you tally up what we’ve spent on those buildings … it was pretty extraordinary,” but unrepresentative of Boston’s citizens.

Among other issues discussed were: individual vs. systematic racism; how to diversify neighborhoods and schools in Boston; how the general public can fight racism; and Amazon’s plans to lease an office building in the Seaport that could create 2,000 jobs, despite black and Latino drivers having filed a 2017 class action suit in Massachusetts accusing the online retailer of racial discrimination.  

Ghachem’s final question centered on bringing “inclusion riders” to sectors in Boston. Made popular by actress Frances McDormand’s Oscar speech this year, the concept refers to actors demanding, in contract, diversity among a film’s cast and crew. “Can inclusion riders work in the tech world?” Ghachem asked. “Is this a concept people can use to diversify higher education, technology, construction?”

There’s a good chance they could, Wen said. Using an example of law firms, she said some corporations are starting to hold law firms accountable for lack of partner diversity — which has encouraged greater diversity. “The pressure point has to come from somewhere,” she said. “That’s the only thing sometimes that will trigger change.”

The meeting included the official announcement of new KSA President C.A. Webb, who in 2015 co-founded Underscore VC, a tech venture firm, and led the New England Venture Capital Association for four years. She has served on the KSA board for years and as interim president for the past nine months.

KSA is a “connective tissue,” Webb said, that builds relationships around Kendall Square, gathers feedback from community members, and conducts research. “Then, we look for the actions that we can take that have the greatest impact, that best leverage our resources, and will respond to the needs of this community,” she said.

Her vision of Kendall Square’s future, she said, “is one of shared prosperity, a broad inclusivity, and an open access.” Plans include improvements to the MBTA — more maintenance, service fixes, and mobility solutions — “so all of us can move into, out of, and through Kendall Square more easily.”

Inviting the Spotlight team to speak at the meeting was “a carefully considered choice,” she said, as it would address racism, sexism, homophobia, and other issues rife in the region and around the world. “Today marks an important part of our journey …  to take a hard look at ourselves and ask what we must do to develop … strategies for building truly diverse and truly inclusive organizations,” she said.



from MIT News https://ift.tt/2IhXVtE

Fueling collaborations between MIT faculty and researchers across the globe

Celebrating its 35th year, MIT International Science and Technology Initiatives (MISTI) continues to ignite new international collaborations between MIT faculty and researchers abroad through the Global Seed Funds (GSF) program. MISTI GSF enables participating teams — composed of faculty and students — to connect with their international peers with the aim of developing and launching joint projects. Many of these collaborations have led to published papers, subsequent grants, and lasting connections between individuals, and between MIT and other leading research institutions.

This year MISTI GSF received 253 applications and awarded over $2 million to 110 faculty from 23 departments across the Institute. “Through these collaborations,” explains Suzanne Berger, professor of political science and former director of MISTI, “faculty gain access to knowledge they don’t have within their own labs.” Launched under Berger’s leadership in 2008, the grants program has tripled in size, offering more than 25 location- or institute-specific funds in the 2017-2018 grant cycle.

Additionally, the MIT Global Partnerships Fund (GPF), administered by MISTI, was launched this year through the Office of the Associate Provost for International Activities. Created in response to MIT’s Global Strategy, the GPF cultivates region-specific faculty and Institute-level collaborations in targeted regions. It will allow MIT to connect and engage with regions where the Institute has strategically decided to deepen its engagement; develop stronger collaborations with peer institutions; and explore opportunities for collaboration in education, innovation and entrepreneurship, and research. In its first year, the MIT GPF focused on projects in Africa and Mexico. Eight faculty projects were awarded, representing eight departments and four MIT schools: Architecture and Planning; Engineering; Humanities, Arts, and Social Sciences; and Science.

MISTI also administers the MIT-Imperial College London Seed Fund, which was created through a partnership between the MIT Office of the Associate Provost for International Activities and Imperial College London. Since 2015, 11 MIT-Imperial College London collaborations have received funding totaling $177,000.

Over the last decade MISTI GSF has awarded over $15.7 million to 735 faculty projects across MIT. From exploring the properties of the Higgs boson at CERN in Switzerland to leading policy reforms to change the future of hydropower in Chile, GSF faculty are working with peers to better understand — and help solve — today’s pressing global challenges.

The next MISTI GSF call for proposals will be announced in May with a deadline in early fall. For more details about the GSF program, please visit the MISTI site.

Originally launched as the MIT-Japan Program in 1983, MISTI has expanded to include opportunities for students and faculty in more than 20 countries. This past year over 1,200 MISTI students interned, researched, and taught abroad. To prepare for their experiences abroad, MISTI students complete coursework in the language and culture of their host country and attend MISTI-prepared, location-specific training sessions.

MISTI is a part of the Center for International Studies within the School of Humanities, Arts, and Social Sciences (SHASS).



from MIT News https://ift.tt/2J4XyUM

Thursday, March 29, 2018

Innovation fosters inclusive teaching at MIT

“What are we doing to enable every student at MIT to make the most of the opportunities that are here for them?” Vice Chancellor Ian A. Waitz posed this question at the start of the March 9 MacVicar Day symposium, which was titled “Inclusive Pedagogies: Building a Vibrant Community of Learners at MIT.”

The MacVicar Faculty Fellows Program, which recognizes exceptional undergraduate teaching, was established in honor of Margaret MacVicar. MacVicar was, among many things, the first dean for undergraduate education, the founder of the Undergraduate Research Opportunities Program (UROP), and a crusader for diversity and inclusiveness at MIT.

This year’s fellows are David Autor, the Ford Professor of Economics and associate head of the department; Christopher Capozzola, an associate professor of history; Shankar Raman, a professor of literature; and Merritt Roe Smith, the Leverett and William Cutten Professor of the History of Technology in the Department of History and the Program in Science, Technology, and Society (STS).

After introducing the 2018 fellows, Waitz noted several ways in which MIT has responded to calls for more inclusivity, including implicit bias training, increases to financial aid, and the creation of the special subjects MIT and Slavery and Designing the First Year at MIT. But, citing recent student survey responses, he acknowledged that there was still much to do. He hoped that the symposium would give the audience the opportunity to learn from instructors who have made considerable progress in these efforts.

The term “inclusive pedagogies” refers to classroom practices and teaching strategies that include, engage, and support all students. As the afternoon’s presenters demonstrated, approaches to inclusive teaching can vary greatly, and the path to change is often unexpected and surprising.

The (stereotype) threat is real

The first speaker was Catherine Drennan, a professor of chemistry and biology and a professor and investigator with the Howard Hughes Medical Institute. She is also a 2015 MacVicar Faculty Fellow.

Drennan recounted how several years ago she was asked to speak with underrepresented minority students majoring in chemistry, and was alarmed to find that there were only two. After meeting with one of the students, she learned that he was discouraged because he did not see anyone in the field who looked like him and did not feel that his teaching assistants believed in him.

Drennan recognized that the student was experiencing stereotype threat, the perceived danger of confirming a negative generalization about a racial, ethnic, gender, or cultural group. Worrying about being stereotyped can lead to feelings of being judged unfairly and can hurt students’ performance, perpetuating the problem.

“I really like doing research in education,” Drennan reflected. “I always learn something I’m not expecting when I ask questions.”

Knowing that something had to change, she created a series of videos, which highlighted the diverse backgrounds of those in the field of chemistry, and a booklet, detailing stereotype threat and ways to counteract it. After implementing these materials in department-wide teaching assistant (TA) training, she began to host weekly “clicker competitions” in her 5.111 (Principles of Chemical Science) classes. Recitation groups, led by their TAs, faced off against each other to see who could answer the most questions correctly.

The competitions, which have been replicated at the University of California at Irvine, with similar results, allowed the TAs and students to bond. The TAs became more comfortable teaching and supporting their students, who in turn experienced a greater sense of belonging.

Active learning as inclusive learning

Katrina LaCurts, a lecturer in the electrical engineering and computer science department, presented the results of her attempt to make class participation more equitable and effective in 6.033 (Computer System Engineering).

6.033 is a Communication Intensive in the Major (CI-M) subject, and as such focuses heavily on writing and oral presentation. Each recitation is based on a different technical paper, which students are expected to read and be ready to discuss in class. In the past, discussions would often be dominated by just a few students, or students would arrive at class unprepared and disengaged.

Hoping that a more intimate environment would lead to more participation, LaCurts encouraged her recitation instructors to incorporate small group techniques into their classes. But she quickly discovered that there was a big difference between suggesting active learning and understanding what that entailed. She found, like Drennan had in 5.111, that change would come only when instructors were sufficiently trained.

After explaining the benefits of an active learning approach, LaCurts conducted training and worked with her instructors to apply it. When compared to prior offerings of the subject, from what LaCurts and her staff jokingly call “the dark times,” students are more engaged and have a greater sense of camaraderie with their classmates. Furthermore, instructors have found that students exhibit less anxiety and a more thorough understanding of the material. Lessons are more effective and enjoyable for all involved.

Finding a home in education

Education subjects “recognize [students’] diverse set of interests and finds [a] home for them,” said Eric Klopfer, a professor in and the director of the Scheller Teacher Education Program (STEP).

There is an overrepresentation of women and underrepresented minorities in the introductory education subjects. According to Klopfer, many students were motivated to take the class because of their personal experience as part of a group that “wasn’t expected to succeed in math and science.” Their success in STEM has compelled them to give back.

Students learn about inclusive pedagogies by implementing them at the K-12 level, designing games, teaching lessons, and making presentations. This practice of working in schools gives them firsthand experience of being both teacher and student. They observe not only the diverse backgrounds of the students themselves, but also the diversity of the ways in which students learn.

With this diversity in mind, Meredith Thompson, a research scientist at the Teaching Systems Lab (TSL) and STEP, presented Swipe Right for CS. The game, which is being developed as part of a UROP, allows teachers to practice connecting students’ strengths and interests to computer science. Some rationales don’t fit, Thompson explained, and recognizing that students’ motivations for learning vary is enormously important for aspiring teachers.

Inquiry-based learning

Christine Ortiz, the former dean for graduate education and the current Morris Cohen Professor of Materials Science and Engineering, was the afternoon’s final speaker.

In a new special subject, 3.S03 (Materials, Societal Impact and Social Innovation), Ortiz and her students explored what could happen if inclusion were incorporated into every step of the learning process. Ortiz cited the lack of inclusive perspective in pedagogy as one of the causes of disparity in STEM fields.

In response to this inequality, the class examined two bodies of literature — course-based undergraduate research and scholarly work in equity. When considered together, these areas of study could inform one another and lead to new, innovative approaches in inquiry-based learning.

After students were equipped with a foundation of inclusive principles to internalize and use in their work, they completed a research project. Each step of the process was intentional, with careful consideration given to how to include ideas of equity. Ortiz provided continuous feedback to her students, making revision an iterative and edifying process. One of the completed projects looked at sustainability as a form of social justice, with students designing a process to recycle 3-D-printed materials. This class, Ortiz concluded, “was really a joy to teach.”

In closing, Vice Chancellor Waitz expressed his appreciation for all of the presenters and their thoughtful efforts. “Thank you for trying new things. It’s wonderful to see the impacts of this work.”



from MIT News http://bit.ly/2Ic5X7G

Sprouting greenery and community

MacGregor House, situated between the Charles River and the MIT soccer fields, is a dorm known for its strong sense of community. Suites of the dormitory (called "entries") each have their own personalities, and the people who live within them often consider each other family. Part of their togetherness stems from the fact that MacGregor is a cook-for-yourself dorm. Students share communal kitchen space and come together to cook and eat dinner most nights of the week.

Even though residents enjoy cooking for themselves, when the academic and extracurricular commitments of a full MIT semester kick in, free time to shop for fresh produce dwindles.

Rachel Weissman, a first-year student studying urban studies and planning at MIT, heads a project to grow fresh produce in the halls of MacGregor. With support from the MindHandHeart Innovation Fund, a grant program promoting wellness, community, and inclusion on campus, the project set up a seven-box hydroponic garden inside the dorm in January. This spring, MacGregor residents will be able to pick tomatoes, Bibb lettuce, spinach, parsley, and basil from the boxes.

In addition to providing fresh vegetables, the project offers a healthy stress-reliever for the more than 15 students involved in its creation. “There’s a component of gardening which is relaxing on its own,” says Weissman. “We wouldn’t have been able to do it without the MindHandHeart Innovation Fund.”

The group hopes to expand the garden and broaden participation among MacGregor residents. “I’d just like to give a big thank you to the heads of house,” Weissman says. “We’re glad to have this opportunity, and we’re excited to see where it goes as more people get involved in the future.”

Sponsored by the Office of the Chancellor and MIT Medical, the MindHandHeart Innovation Fund is accepting applications through March 31.



from MIT News https://ift.tt/2GB7TJA

Scientists find different cell types contain the same enzyme ratios

By studying bacteria and yeast, researchers at MIT have discovered that vastly different types of cells still share fundamental similarities, conserved across species and refined over time. More specifically, these cells contain the same proportion of specialized proteins, known as enzymes, which coordinate chemical reactions within the cell.

To grow and divide, cells rely on a unique mixture of enzymes that perform millions of chemical reactions per second. Many enzymes, working in relay, perform a linked series of chemical reactions called a “pathway,” where the products of one chemical reaction are the starting materials for the next. By making many incremental changes to molecules, enzymes in a pathway perform vital functions such as turning nutrients into energy or duplicating DNA.

For decades, scientists wondered whether the relative amounts of enzymes in a pathway were tightly controlled in order to better coordinate their chemical reactions. Now, researchers have demonstrated that cells not only produce precise amounts of enzymes, but that evolutionary pressure selects for a preferred ratio of enzymes. In this way, enzymes behave like the ingredients of a cake, which must be combined in the correct proportions, and all life may share the same enzyme recipe.

“We still don’t know why this combination of enzymes is ideal,” says Gene-Wei Li, assistant professor of biology at MIT, “but this question opens up an entirely new field of biology that we’re calling systems level optimization of pathways. In this discipline, researchers would study how different enzymes and pathways behave within the complex environment of the cell.”

Li is the senior author of the study, which appears online in the journal Cell on March 29, and in print on April 19. The paper’s lead author, Jean-Benoît Lalanne, is a graduate student in the MIT Department of Physics.

An unexpected observation

For more than 100 years, biologists have studied enzymes by watching them catalyze chemical reactions in test tubes, and — more recently — using X-rays to observe their molecular structure.

And yet, despite years of work describing individual proteins in great detail, scientists still don’t understand many of the basic properties of enzymes within the cell. For example, it is not yet possible to predict the optimal amount of enzyme a cell should make to maximize its chance of survival.

The calculation is tricky because the answer depends not only on the specific function of the enzyme, but also how its actions may have a ripple effect on other chemical reactions and enzymes within the cell.

“Even if we know exactly what an enzyme does,” Li says, “we still don’t have a sense for how much of that protein the cell will make. Thinking about biochemical pathways is even more complicated. If we gave biochemists three enzymes in a pathway that, for example, break down sugar into energy, they would probably not know how to mix the proteins at the proper ratios to optimize the reaction.”

The study of the relative amounts of substances — including proteins — is known as “stoichiometry.” To investigate the stoichiometry of enzymes in different types of cells, Li and his colleagues analyzed three different species of bacteria — Escherichia coli, Bacillus subtilis, and Vibrio natriegens — as well as the budding yeast Saccharomyces cerevisiae. Among these cells, scientists compared the amount of enzymes in 21 pathways responsible for a variety of tasks including repairing DNA, constructing fatty acids, and converting sugar to energy. Because these species of yeast and bacteria have evolved to live in different environments and have different cellular structures, such as the presence or lack of a nucleus, researchers were surprised to find that all four cell types had nearly identical enzyme stoichiometry in all pathways examined.

Li’s team followed up their unexpected results by detailing how bacteria achieve a consistent enzyme stoichiometry. Cells control enzyme production by regulating two processes. The first, transcription, converts the information contained in a strand of DNA into many copies of messenger RNA (mRNA). The second, translation, occurs as ribosomes decode the mRNAs to construct proteins. By analyzing transcription across all three bacterial species, Li’s team discovered that the different bacteria produced varying amounts of mRNA encoding for enzymes in a pathway.

Different amounts of mRNA theoretically lead to differences in protein production, but the researchers found instead that the cells adjusted their rates of translation to compensate for changes in transcription. Cells that produced more mRNA slowed their rates of protein synthesis, while cells that produced less mRNA increased the speed of protein synthesis. Thanks to this compensation, the stoichiometry of enzymes remained constant across the different bacteria.
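The compensation described above can be sketched in a few lines of code. This is a toy illustration with invented numbers, not data from the study: protein output is modeled simply as mRNA level times translation rate, so two species can reach the same enzyme ratio by trading one factor off against the other.

```python
def protein_output(mrna_level, translation_rate):
    """Relative protein production for one enzyme:
    roughly mRNA abundance times translation efficiency."""
    return mrna_level * translation_rate

# Hypothetical two-enzyme pathway in two bacterial species.
# Species A transcribes enzyme 2 at twice the rate of enzyme 1.
species_a = {
    "enzyme1": protein_output(mrna_level=10, translation_rate=1.0),
    "enzyme2": protein_output(mrna_level=20, translation_rate=1.0),
}

# Species B makes less mRNA for enzyme 2, but doubles its translation
# rate, so the 1:2 enzyme stoichiometry is preserved.
species_b = {
    "enzyme1": protein_output(mrna_level=10, translation_rate=1.0),
    "enzyme2": protein_output(mrna_level=10, translation_rate=2.0),
}

ratio_a = species_a["enzyme2"] / species_a["enzyme1"]  # 2.0
ratio_b = species_b["enzyme2"] / species_b["enzyme1"]  # 2.0
```

Despite different transcription levels, both hypothetical species end up with the same 1:2 enzyme ratio, which is the pattern the researchers observed across real bacterial species.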

“It is remarkable that E. coli and B. subtilis need the same relative amount of the corresponding proteins, as seen by the compensatory variations in transcription and translation efficiencies,” says Johan Elf, professor of physical biology at Uppsala University in Sweden. “These results raise interesting questions about how enzyme production in different cells has evolved."

“Examining bacterial gene clusters was really striking,” lead author Lalanne says. “Over a long evolutionary history, these genes have shifted positions, mutated into different sequences, and been bombarded by mobile pieces of DNA that randomly insert themselves into the genome. Despite all this, the bacteria have compensated for these changes by adjusting translation to maintain the stoichiometry of their enzymes. This suggests that evolutionary forces, which we don’t yet understand, have shaped cells to have the same enzyme stoichiometry.”

Searching for the stoichiometry regulating human health

In the future, Li and his colleagues will test whether their findings in bacteria and yeast extend to humans. Because unicellular and multicellular organisms manage energy and nutrients differently, and experience different selection pressures, researchers are not sure what they will discover.

“Perhaps there will be enzymes whose stoichiometry varies, and a smaller subset of enzymes whose levels are more conserved,” Li says. “This would indicate that the human body is sensitive to changes in specific enzymes that could make good drug targets. But we won’t know until we look.”

Beyond the human body, Li and his team believe that it is possible to find simplicity underlying the complex bustle of molecules within all cells. Like other mathematical patterns in nature, such as the spiral of seashells or the branching pattern of trees, the stoichiometry of enzymes may be a widespread design principle of life.

The research was funded by the National Institutes of Health, Pew Biomedical Scholars Program, Sloan Research Fellowship, Searle Scholars Program, Natural Sciences and Engineering Research Council of Canada, Howard Hughes Medical Institute, National Science Foundation, Helen Hay Whitney Foundation, Jane Coffin Childs Memorial Fund, and the Smith Family Foundation.



from MIT News https://ift.tt/2E2wt0x

Wednesday, March 28, 2018

Center for Theoretical Physics celebrates 50 years

To celebrate the 50th anniversary of its founding, the Center for Theoretical Physics (CTP) hosted a symposium on Saturday, March 24. "CTP50: The Center for Theoretical Physics: The First Fifty Years" brought together present and former members of the CTP as well as friends, supporters, and others interested in the past, present, and future of theoretical physics.

The celebration of 50 years of physics at the CTP featured speakers that included former students, postdocs, and faculty as well as some current CTP faculty members. Some of the key topics explored at the symposium included gravitational waves, black holes, dark matter, neutron stars, and nuclear physics; dualities and symmetries in string theory, condensed matter physics, and quantum field theory; quantum information and computing; and the foundations of quantum physics. Presentations on recent work in these areas were interspersed with historical perspectives and recollections of the CTP's last 50 years, discussion and videos illustrating the current activities in the CTP, and speculations regarding future directions in theoretical physics.

"In its 50 years, the CTP has seen its faculty, postdocs, and students make discoveries that have advanced our theoretical understanding of how the universe works," Michael Sipser, Dean of the School of Science, said in his introductory comments to lead off the day. "Now we have a new group of young faculty poised to make discoveries into the nature of the universe in areas such as dark matter — the unknown substance that comprises more than 80 percent of the matter in the universe."

The afternoon session was led off by remarks from George Fai from the Office of Nuclear Physics in the U.S. Department of Energy (DOE), who read a congratulatory letter from Tim Hallman, the associate director of the DOE Office of Science. Fai's remarks were followed by commentary from Laboratory for Nuclear Science (LNS) Director Bolek Wyslouch. The CTP is a part of LNS, and Wyslouch commented on the increased level of collaboration between young faculty in nuclear and particle physics, in both theoretical and experimental work.

David Kaiser, the Germeshausen Professor of the History of Science and a professor of physics, also gave an engaging history of the founding of the Center for Theoretical Physics in a talk entitled "It was Fifty Years Ago Today ... A Brief Look Back at Physics, MIT and the World of 1968." The CTP was founded in 1968 under its first director, Herman Feshbach, while Viki Weisskopf was the head of the Department of Physics. In his talk, Kaiser traced the development of theoretical physics, beginning with Galileo and Newton, who were at once mathematicians, astronomers, and philosophers, and highlighted the relatively recent emergence of "theoretical physicist" as a job title. The CTP was one of the first such dedicated centers for theoretical physics in the United States.

The recent observation of gravitational waves from mergers of black holes and neutron stars by the LIGO experiment (for which MIT's Rai Weiss received the 2017 Nobel Prize) occurred more than 100 years after Einstein's development of the theory of general relativity, which predicts gravitational waves that carry energy across space. This observation has in turn stimulated new developments in theory. Chung-Pei Ma SB '93, PhD '96, who is now the Judy Chandler Webb Professor of Astronomy and Physics at the University of California at Berkeley, described new progress in identifying supermassive black holes at the centers of distant galaxies, and the prospects for detecting gravitational wave signals from mergers of these objects. Sanjay Reddy, a former CTP postdoc who is now a professor at the Institute for Nuclear Theory at the University of Washington, described how combined gravitational and electromagnetic signals from a neutron star merger observed late last year have provided important new information about nuclear matter at the highest achievable densities, as well as about how heavy elements such as gold and platinum are produced in the universe.

The mystery of dark matter, which constitutes roughly 80 percent of the mass density of the universe, also provided substantial material for discussion. MIT/CTP Nobel laureate Frank Wilczek described in an entertaining talk how he named the "axion" particle, which is a likely dark matter candidate, after a laundry detergent. Former CTP faculty member Lisa Randall, now the Frank B. Baird, Jr., Professor of Science at Harvard, spoke about some new ideas about dark matter, in particular about dark matter particles that may interact with one another. CTP faculty members Will Detmold, Tracy Slatyer, and Jesse Thaler, as well as experimentalist Lindley Winslow from LNS, were featured in the premiere of a new video directed by Bill Lattanzi on efforts at MIT to understand and discover dark matter.

Another theme at the symposium was the development of new approaches to understanding quantum field theories, combining methodology from string theory with insights from condensed matter physics. Former CTP postdoc Dam Son, who is now a University Professor at the University of Chicago, described a new theoretical description of a fractional quantum Hall fluid, a special topological state of matter, in terms of composite fermions with equivalent (dual) descriptions in which a particle density in one description becomes a magnetic field in the other, and vice versa. Former MIT Pappalardo Fellow David Tong, a professor of theoretical physics at Cambridge University, using methods motivated by string theory, showed how this was just one among a web of dualities, permitting descriptions of condensed matter systems in terms of very different kinds of field theories, and how these dualities are giving new insights into the structure of quantum field theory in general. Another former Pappalardo Fellow, University of Michigan professor of physics Henriette Elvang, showed how a different novel approach to quantum field theory based on scattering amplitudes can place strong constraints on what kinds of effective theories of low-energy excitations can be consistent in the presence of broken symmetries, relating to the famous work of emeritus CTP faculty member Jeffrey Goldstone in 1961 that led to the Higgs mechanism and the standard model of particle physics. Frank Wilczek also described new ideas about broken time symmetries in quantum field theory, leading to new states of matter called "time crystals" that may lead to new kinds of precision sensors.

Quantum theory — including quantum computing, quantum information, connections to quantum gravity, and its foundations — was another focal point of interest at the symposium. Andrew Childs PhD '04, now professor of computer science at the University of Maryland, described efficient methods for simulating quantum physics on quantum computers. Bill Lattanzi also premiered a second new video featuring CTP faculty members Daniel Harlow and Aram Harrow and their work on quantum error correction, black hole physics, and the connections between these ideas. CTP professor Alan Guth spoke on the Cosmic Bell Experiment, a test of quantum entanglement and Einstein's "spooky action at a distance" that makes use of some of the oldest light in the universe to address a loophole in previous tests of the foundations of quantum theory.

Finally, a panel on the future of theoretical physics featured a lively engagement among the most recent generation of CTP faculty including professors William Detmold, Aram Harrow, Daniel Harlow, Tracy Slatyer, and Jesse Thaler. Some of this discussion focused on the way in which current developments in theory are bringing together once disparate disciplines such as string theory, field theory, nuclear physics, and condensed matter theory in new ways, and ways in which theoretical physicists are getting more closely involved with experiment as large amounts of data become available from particle physics and astrophysics observations. Another theme was the increasing role of large-scale computing in theoretical physics, from lattice QCD, which uses large computers to solve difficult problems of nuclear interactions, to machine learning, which is increasingly used in theoretical and experimental physics, and quantum computing, which may, as Richard Feynman originally suggested, eventually be the most effective way of analyzing real or hypothetical quantum systems.

A theme throughout the day, with many former students and postdocs returning to the CTP, was the important role of interactions and community within the theoretical physics group. A video by Lillie Paquette illustrated the unified research and teaching environment in the CTP, made possible with the 2008 renovation when the Elings Center for Theoretical Physics in the Green Center was constructed. 

Another video, made by Harry Bechkes, premiered as well. It shows how CTP faculty members Iain Stewart and Barton Zwiebach are using new technologies developed together with the Office of Digital Learning (led by CTP faculty member and Dean for Digital Learning Krishna Rajagopal) to enhance the teaching of effective field theory and quantum mechanics to MIT students by blending online and in-class education, while at the same time reaching learners around the globe and shaping the future of their disciplines.

At a celebratory dinner at the Samberg Center, several speakers commented on different aspects of the CTP history and culture. Professor Ernest Moniz — who has served as a CTP faculty member, department head for Physics, director of the MIT Energy Initiative, and U.S. Secretary of Energy during the Obama Administration — emphasized the commitment to social responsibility that has played an important role in the CTP and strongly influenced his career. This ranged from the involvement of CTP founders Herman Feshbach and Francis Low with the Union of Concerned Scientists, which decried military research at MIT and sought to aid silenced researchers behind the Iron Curtain like Andrei Sakharov, to recent examples such as the newly released book "The Physics of Energy" by the CTP's former director Robert Jaffe and its current director Washington Taylor, which gives a unified perspective on physics through the theme of energy and its role and impact on our world.

The evening concluded with remarks by Harvard Professor Cumrun Vafa '81, who shared stories of the generous and open environment among the math and physics faculty during his formative time at MIT. Vafa echoed the sentiments of many of the symposium attendees, who had fond recollections of their undergraduate, graduate, and postdoc years at the center.



from MIT News https://ift.tt/2pOwtNq

3 Questions: Innovating for the clean energy economy

Daniel Kammen is a professor of energy at the University of California at Berkeley, with parallel appointments in the Energy and Resources Group (which he chairs), the Goldman School of Public Policy, and the Department of Nuclear Science and Engineering. Recently, he gave a talk at MIT examining the current state of clean energy innovation and implementation, both in the U.S. and internationally. Using a combination of analytical and empirical approaches, he discussed the strengths and weaknesses of clean energy efforts on the household, city, and regional levels. The MIT Energy Initiative (MITEI) followed up with him on these topics.

Q: Your team has built energy transition models for several countries, including Chile, Nicaragua, China, and India. Can you describe how these models work and how they can inform global climate negotiations like the Paris Accords?

A: My laboratory has worked with three governments to build open-source models of the current state of their energy systems and possible opportunities for improvement. This model, SWITCH, is an exceptionally high-resolution platform for examining the costs, reliability, and carbon emissions of energy systems as small as Nicaragua’s and as large as China’s. The exciting recent developments in the cost and performance improvements of solar, wind, energy storage, and electric vehicles permit the planning of dramatically decarbonized systems that have a wide range of ancillary benefits: increased reliability, improved air quality, and monetizing energy efficiency, to name just a few. With the Paris Climate Accords placing 80 percent or greater decarbonization targets on all nations’ agendas (sadly, except for the U.S. federal government), the need for an "honest broker" for the costs and operational issues around power systems is key.

Q: At the end of your talk, you mentioned a carbon footprint calculator that you helped create. How much do individual behaviors matter in addressing climate change?

A: The carbon footprint, or CoolClimate project, is a visualization and behavioral economics tool that can be used to highlight the impacts of individual decisions at the household, school, and city level. We have used it to support city-to-city competitions for “California’s coolest city,” to explore the relative impacts of lifestyle choices (buying an electric vehicle versus, or along with, changes of diet), and more.

Q: You touched on the topic of the “high ambition coalition,” which formed at the 2015 United Nations Climate Change Conference around the goal of keeping warming under 1.5 degrees Celsius. Can you expand on this movement and the carbon-negative strategies it would require?

A: As we look at paths to a sustainable global energy system, efforts to limit warming to 1.5 degrees Celsius will require not only zeroing out industrial and agricultural emissions, but also removing carbon from the atmosphere. This demands increasing natural carbon sinks by preserving or expanding forests, sustaining ocean systems, and making agriculture climate- and water-smart. One pathway, biomass energy with carbon capture and sequestration, has both supporters and detractors. It involves growing biomass, using it for energy, and then sequestering the emissions.

This talk was one in a series of MITEI seminars supported by IHS Markit.



from MIT News https://ift.tt/2J3lffZ

Structure of key growth regulator revealed

A team of researchers from Whitehead Institute and the Howard Hughes Medical Institute has revealed the structure of a key protein complex in humans that transmits signals about nutrient levels, enabling cells to align their growth with the supply of materials needed to support that growth. This complex, called GATOR1, acts as a kind of on-off switch for the “grow” (or “don’t grow”) signals that flow through a critical cellular growth pathway known as mTORC1.

Despite its importance, GATOR1 bears little similarity to known proteins, leaving major gaps in scientists’ understanding of its molecular structure and function. Now, as described online on March 28 in the journal Nature, Whitehead scientists and their colleagues have generated the first detailed molecular picture of GATOR1, revealing a highly ordered group of proteins and an extremely unusual interaction with its partner, the Rag GTPase.

“If you know something about a protein’s three-dimensional structure, then you can make some informed guesses about how it might work. But GATOR1 has basically been a black box,” says senior author David Sabatini, a member of Whitehead Institute, a professor of biology at MIT, and investigator with the Howard Hughes Medical Institute (HHMI). “Now, for the first time, we have generated high-resolution images of GATOR1 and can begin to dissect how this critical protein complex works.”

GATOR1 was first identified about five years ago. It consists of three protein subunits (Depdc5, Nprl2, and Nprl3), and mutations in these subunits have been associated with human diseases, including cancers and neurological conditions such as epilepsy. However, because it bears little similarity to other proteins, much of the GATOR1 complex has remained a molecular mystery. “GATOR1 has no well-defined protein domains,” explains Whitehead researcher Kuang Shen, one of the study’s first authors. “So, this complex is really quite special and also very challenging to study.”

Because of the complex’s large size and relative flexibility, GATOR1 cannot be readily crystallized — a necessary step for resolving protein structure through standard, X-ray crystallographic methods. As a result, Shen and Sabatini turned to HHMI’s Zhiheng Yu. Yu and his team specialize in cryo-electron microscopy (cryo-EM), an emerging technique that holds promise for visualizing the molecular structures of large proteins and protein complexes. Importantly, it does not utilize protein crystals. Instead, proteins are rapidly frozen in a thin layer of vitrified ice and then imaged by a beam of fast electrons inside an electron microscope column.

“There have been some major advances in cryo-EM technology over the last decade, and now, it is possible to achieve atomic or near atomic resolution for a variety of proteins,” explains Yu, a senior author of the paper and director of HHMI’s shared, state-of-the-art cryo-EM facility at Janelia Research Campus. Last year’s Nobel Prize in chemistry was awarded to three scientists for their pioneering efforts to develop cryo-EM.

GATOR1 proved to be a tricky subject, even for cryo-EM, and required some trial-and-error on the part of Yu, Shen, and their colleagues to prepare samples that could yield robust structural information. Moreover, the team’s work was made even more difficult by the complex’s unique form. With no inklings of GATOR1’s potential structure, Shen and his colleagues, including co-author Edward Brignole of MIT, had to derive it completely from scratch.

Nevertheless, the Whitehead-HHMI team was able to resolve near-complete structures for GATOR1 as well as for GATOR1 bound to its partner proteins, the Rag GTPases. (Two regions of the subunit Depdc5 are highly flexible and therefore could not be resolved.) From this wealth of new information as well as from the team’s subsequent biochemical analyses, some surprising findings emerged.

First is the remarkable level of organization of GATOR1. The complex is extremely well organized, which is quite unusual for proteins that have no predicted structures. (Such proteins are usually quite disordered.) In addition, the researchers identified four protein domains that have never before been visualized. These novel motifs — named NTD, SABA, SHEN, and CTD — could provide crucial insights into the inner workings of the GATOR1 complex.

Shen, Sabatini, and their colleagues uncovered another surprise. Unlike other proteins that bind to Rag GTPases, GATOR1 contacts these proteins at at least two distinct sites. Moreover, one of the binding sites serves to inhibit — rather than stimulate — the activity of the Rag GTPase. “This kind of dual binding has never been observed — it is highly unusual,” Shen says. The researchers hypothesize that this feature is one reason why GATOR1 is so large — because it must hold its Rag GTPase at multiple sites, rather than one, as most other proteins of this type do.

Despite these surprises, the researchers acknowledge that their analyses have only begun to scratch the surface of GATOR1 and the mechanisms through which it regulates the mTOR signaling pathway.

“There is much left to discover in this protein,” Sabatini says.

This work was supported by the National Institutes of Health, Department of Defense, National Science Foundation, the Life Sciences Research Foundation, and the Howard Hughes Medical Institute.



from MIT News https://ift.tt/2GB11vY

Understanding the Earth under Hawaii

In the 1960s, some 50 years after German researcher Alfred Wegener proposed his continental drift hypothesis, the theory of plate tectonics gave scientists a unifying framework for describing the large-scale motion of the surface plates that make up the Earth’s lithosphere — a framework that subsequently revolutionized the geosciences.

How those plates move around the Earth’s surface is controlled by motion within the mantle — the driving force of which is convection due to thermal anomalies, with compositional heterogeneity also expected. However, the technical challenge of visualizing structures inside an optically impenetrable, 6,371-kilometer-radius rock sphere has made understanding the compositional and thermal state of the mantle, as well as its dynamic evolution, a long-standing challenge in Earth science.

Now, in a paper published today in Nature Communications, researchers from MIT, Imperial College, Rice University, and the Institute of Earth Sciences in France report direct evidence for lateral variations in mantle composition below Hawaii. The results provide scientists with important new insights into how the Earth has evolved over its 4.5-billion-year history, why it is the way it is now, and what this means for rocky planets elsewhere.

Compositional variation

Scientists treat the mantle as two layers — the lower mantle and the upper mantle — separated by a boundary layer termed the mantle transition zone (MTZ). Physically, the MTZ is bounded by two seismic-velocity discontinuities near 410 km and 660 km depth (referred to as 410 and 660). These discontinuities, which are due to phase transitions in silicate minerals, play an important role in modulating mantle flow. Lateral variations in depth to these discontinuities have been widely used to infer thermal anomalies in the mantle, as mineral physics predicts a shallower 410 and a deeper 660 in cold regions and a deeper 410 and a shallower 660 in hot regions.
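The rule quoted above can be made concrete: each discontinuity shifts with temperature according to its Clapeyron slope, which is positive for the 410 and negative for the 660, so a hot anomaly deepens the 410 and shallows the 660, thinning the MTZ. A minimal numerical sketch (the slope and density values are generic textbook-range assumptions, not numbers from this study):

```python
# Sketch: how a temperature anomaly dT shifts the 410 and 660 km
# discontinuities. The Clapeyron relation gives dP = gamma * dT, and the
# pressure shift maps to depth via dz = dP / (rho * g).
G = 9.8  # gravitational acceleration, m/s^2 (roughly constant here)

def depth_shift_km(clapeyron_mpa_per_k, rho_kg_m3, dT_k):
    """Depth shift in km of a phase boundary for a temperature anomaly (positive = deeper)."""
    dP = clapeyron_mpa_per_k * 1e6 * dT_k        # pressure shift, Pa
    return dP / (rho_kg_m3 * G) / 1e3            # depth shift, km

# Illustrative slopes: +3 MPa/K at 410 km (olivine -> wadsleyite),
# -2 MPa/K at 660 km (ringwoodite -> bridgmanite + ferropericlase).
dT_hot = +200.0  # K, a hot upwelling
d410 = depth_shift_km(+3.0, 3500.0, dT_hot)   # positive: deeper 410 in hot regions
d660 = depth_shift_km(-2.0, 4000.0, dT_hot)   # negative: shallower 660 in hot regions
print(f"hot anomaly: 410 shifts {d410:+.1f} km, 660 shifts {d660:+.1f} km")
```

With these assumed values, a 200 K hot anomaly moves the 410 down by roughly 17 km and the 660 up by roughly 10 km, matching the sign convention described in the text.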

Previous petrological and numerical studies also predict compositional segregation of basaltic and harzburgitic material (and thus compositional heterogeneity) near the base of the MTZ in the relatively warm low-viscosity environments near mantle upwellings. But observational evidence for such a process has been scarce.

The new study, however, demonstrates clear evidence for lateral variation in composition near the base of the MTZ below Hawaii. This evidence could have important implications for our general understanding of mantle dynamics.

As lead author Chunquan Yu PhD '16, a former grad student in the Hilst Group at MIT who is now a postdoc at Caltech, explains, “At mid-ocean ridges, plate separation results in the ascent and partial melting of the mantle material. Such a process causes differentiation of the oceanic lithosphere, with basaltic material in the crust and harzburgitic residue in the mantle. As the differentiated oceanic lithosphere cools, it descends back into the mantle along the subduction zone. Basalt and harzburgite are difficult to separate in cold environments. However, they can segregate in relatively warm, low-viscosity environments, such as near mantle upwellings, potentially providing a major source of compositional heterogeneity in the Earth’s mantle.”

Looking with earthquakes

To explore this idea, Yu and his colleagues used a seismic technique involving the analysis of underside shear wave reflections off mantle discontinuities — known as SS precursors — to study MTZ structures beneath the Pacific Ocean around Hawaii.

“When an earthquake occurs, it radiates both compressional (P) and shear wave (S) energy. Both P and S waves can reflect from interfaces in the Earth’s interior,” Yu explains. “If an S wave leaves a source downward and reflects at the free surface before arriving at the receiver, it is termed SS. SS precursors are underside S-wave reflections off mantle discontinuities. Because they travel along shorter ray paths, they are precursory to SS.”
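The timing logic Yu describes can be illustrated with a vertical-incidence approximation: a precursor that reflects off a discontinuity at depth d skips the two-way shear travel time through the layer above the reflector, so it arrives roughly 2d/v ahead of SS. The average shear speed below is an assumed round number, not a value from the paper:

```python
# Sketch: lead time of an SdS precursor relative to SS, assuming vertical
# incidence and a uniform average shear-wave speed above the reflector.
# The precursor skips the two-way leg between the reflector and the
# surface, so delta_t = 2 * d / v_s.
def precursor_lead_s(depth_km, vs_km_s=4.9):
    """Seconds by which SdS arrives ahead of SS (vertical-incidence sketch)."""
    return 2.0 * depth_km / vs_km_s

print(f"S410S leads SS by ~{precursor_lead_s(410):.0f} s")
print(f"S660S leads SS by ~{precursor_lead_s(660):.0f} s")
```

Real differential times depend on epicentral distance and the actual velocity structure, but this back-of-envelope version shows why deeper reflectors produce earlier precursors.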

Using a novel seismic array technique, the team was able to improve the signal-to-noise ratio of the SS precursors and remove interfering phases. As a result, much more data that would otherwise have been discarded became accessible for analysis.
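Stacking across an array works because the precursor signal is coherent from trace to trace while the noise is not: averaging N traces leaves the signal amplitude unchanged but shrinks the noise by about the square root of N. A generic illustration of that principle, not the team's actual processing chain:

```python
import numpy as np

rng = np.random.default_rng(0)
n_traces, n_samples = 100, 500

# A weak coherent "precursor" pulse present in every trace.
t = np.linspace(-1.0, 1.0, n_samples)
signal = 0.2 * np.exp(-(t / 0.05) ** 2)

# Each trace: the same signal plus independent unit-variance noise.
traces = signal + rng.normal(0.0, 1.0, size=(n_traces, n_samples))

stack = traces.mean(axis=0)  # linear stack across the array

# Noise in the stack shrinks by ~sqrt(n_traces); the signal does not.
single_noise = traces[0] - signal
stack_noise = stack - signal
print(f"single-trace noise RMS: {single_noise.std():.3f}")
print(f"stacked noise RMS:      {stack_noise.std():.3f}  (~1/sqrt(100) of original)")
```

A 0.2-amplitude pulse is invisible under unit noise in any single trace but stands clearly above the ~0.1 residual noise of the 100-trace stack.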

They also employed so-called amplitude versus offset analysis, a tool widely used in exploration seismology, to constrain elastic properties near MTZ discontinuities.

The analysis finds strong lateral variations in radial contrasts in mass density and wavespeed across the 660, while no such variations are observed across the 410. Complementing this, the team’s thermodynamic modeling, over a range of mantle temperatures for several representative mantle compositions, precludes a thermal origin for the inferred lateral variations in elastic contrasts across the 660. Instead, the inferred 660 contrasts can be explained by lateral variation in mantle composition: from average (pyrolitic; about 60 percent olivine) mantle beneath Hawaii to a mixture with more melt-depleted harzburgite (about 80 percent olivine) southeast of the hotspot. Such compositional heterogeneity is consistent with numerical predictions that segregation of basaltic and harzburgitic material could occur near the base of the MTZ near hot deep mantle upwellings like the one often invoked to explain volcanic activity on Hawaii.

“It has been suggested that compositional segregation between basaltic and harzburgitic materials could form a gravitationally stable layer over the base of the MTZ. If so it can provide a filter for slab downwellings and lower-mantle upwellings, and thus strongly affect the mode of mantle convection and its chemical circulation,” says Yu.

This study presents a promising technique for constraining the thus-far elusive distribution of compositional heterogeneity within Earth’s mantle. Compositional segregation near the base of the MTZ has been expected since the 1960s, and evidence that this process does indeed occur has important implications for our understanding of the chemical evolution of the Earth.

Yu’s co-authors are Elizabeth Day, a former postdoc in the Hilst Group who is now a senior teaching fellow at Imperial College; Maarten V. de Hoop of Rice University; Michel Campillo of the Institute of Earth Sciences in France; Saskia Goes and Rachel Blythe of Imperial College; and Professor Robert van der Hilst, head of the MIT Department of Earth, Atmospheric and Planetary Sciences.

This work was funded by the Simons Foundation, the National Science Foundation, the Natural Environment Research Council, and a Royal Society Fellowship.



from MIT News https://ift.tt/2GDa53m

Tuesday, March 27, 2018

Reimagining and rethinking engineering education

A new report from MIT puts a spotlight on worldwide trends in the changing landscape of engineering education, pinpoints the current and emerging leaders in the field, and describes some of its future directions.

“Engineers will address the complex societal challenges of the 21st century by building a new generation of machines, materials, and systems. We should fundamentally rethink how we educate engineers for this future,” says Ed Crawley, the Ford Professor of Engineering in the Department of Aeronautics and Astronautics and faculty co-director of the New Engineering Education Transformation (NEET) initiative at MIT.

This realization, Crawley says, is what prompted MIT’s engineering faculty to rethink how they were approaching their own offerings on campus, and to launch NEET. “We’re targeting MIT education at the industries of the future rather than industries of the past,” says Anette “Peko” Hosoi, associate dean of engineering and Crawley’s co-lead at NEET; Hosoi is also the Neil and Jane Pappalardo Professor of Mechanical Engineering.

While their on-campus pilot was at the design stage, Crawley decided to take a broader, benchmarking view. “I knew from my five years as founding president of Skoltech in Moscow that there were examples of educational innovation scattered across the world,” he says, “but these distributed developments are difficult to identify and learn from.”

Until now. Crawley and his colleagues in the NEET program have just released “Global state of the art in engineering education.” The report, authored by Ruth Graham, is a global review of cutting-edge practice in engineering education. It is informed by interviews with 178 thought leaders with knowledge of and experience with world-leading engineering programs, combined with case studies from four different universities. The report paints a rich picture of successful innovation in engineering education as well as some of its opportunities and challenges.

The study identifies institutions considered to be the current leaders in engineering education; Olin College and MIT were cited by the majority of experts who were consulted, along with Stanford University, Aalborg University in Denmark, and Delft University of Technology (TU Delft) in the Netherlands. Outside of the U.S. and northern Europe, the only university among the top 10 cited for its educational leadership was the National University of Singapore (NUS).

“The profile of the emerging leaders is very different,” Graham notes. “While they include universities in the U.S. and Europe — Olin College, Iron Range Engineering, and University College London are among the most frequently cited — thought leaders identified emerging leaders from across the world, such as Singapore University of Technology and Design (SUTD), Pontifical Catholic University of Chile (PUC), NUS (Singapore), and Charles Sturt University (Australia).” (The report includes case studies of four of the emerging leaders: SUTD, UCL, Charles Sturt, and TU Delft.)

The study attributes this contrast to a range of sources. For one, Graham notes, “Many political leaders outside of the U.S. are making major investments in engineering education as an incubator for the technology-based entrepreneurial talent that will drive national economic growth.”

The report also identifies some key challenges facing engineering education, and in some cases higher education as a whole. These include aligning the goals of national governments and higher education, delivering student-centered learning to large student cohorts, and setting up faculty appointment and promotion systems that better reward high-quality teaching.

According to Graham’s report, three trends are likely to define the future of engineering education. The first is a tilting of the global axis of engineering education leadership so it is less focused on U.S. and northern European institutions. The second is a shift toward programs that integrate student-centered learning with a curriculum oriented to the pressing challenges of the 21st century — societal, environmental, and technological. And the third is the emergence of a new generation of leaders with the capacity to deliver student-centered curricula at scale.

The case studies highlighted in the report include universities that may be paving the way by, for example, achieving curricular coherence and integration through a connective spine of design projects. In the longer term, the world’s leading engineering programs may be those that blend off-campus personalized learning, accessed online as students need it, with experiential learning both in work-based placements and on campus.



from MIT News https://ift.tt/2pJZ9HB

Ed Boyden receives 2018 Canada Gairdner International Award

Ed Boyden, the Y. Eva Tan Professor in Neurotechnology at MIT, has been named a recipient of the 2018 Canada Gairdner International Award — Canada’s most prestigious scientific prize — for his role in the discovery of light-gated ion channels and optogenetics, a technology to control brain activity with light.

Boyden’s work has given neuroscientists the ability to precisely activate or silence brain cells to see how they contribute to — or possibly alleviate — brain disease. By optogenetically controlling brain cells, it has become possible to understand how specific patterns of brain activity might be used to quiet seizures, cancel out Parkinsonian tremors, and make other improvements to brain health.

Boyden is one of three scientists the Gairdner Foundation is honoring for this work. He shares the prize with Peter Hegemann from Humboldt University of Berlin and Karl Deisseroth from Stanford University.

“I am honored that the Gairdner Foundation has chosen our work in optogenetics for one of the most prestigious biology prizes awarded today,” says Boyden, who is also a member of MIT’s McGovern Institute for Brain Research and an associate professor in the Media Lab, the Department of Brain and Cognitive Sciences, and the Department of Biological Engineering at MIT. “It represents a great collaborative body of work, and I feel excited that my angle of thinking like a physicist was able to contribute to biology.”

Boyden, along with fellow laureate Karl Deisseroth, brainstormed about how microbial opsins could be used to mediate optical control of neural activity, while both were students in 2000. Together, they collaborated to demonstrate the first optical control of neural activity using microbial opsins in the summer of 2004, when Boyden was at Stanford. At MIT, Boyden’s team developed the first optogenetic silencing (2007), the first effective optogenetic silencing in live mammals (2010), noninvasive optogenetic silencing (2014), multicolor optogenetic control (2014), and temporally precise single-cell optogenetic control (2017).

In addition to his work with optogenetics, Boyden has pioneered the development of many transformative technologies that image, record, and manipulate complex systems, including expansion microscopy and robotic patch clamping. He has received numerous awards for this work, including the Breakthrough Prize in Life Sciences (2016), the BBVA Foundation Frontiers of Knowledge Award (2015), the Carnegie Prize in Mind and Body Sciences (2015), the Grete Lundbeck European Brain Prize (2013), and the Perl-UNC Neuroscience prize (2011). Boyden is an elected member of the American Academy of Arts and Sciences and the National Academy of Inventors.

“We are thrilled Ed has been recognized with the prestigious Gairdner Award for his work in developing optogenetics,” says Robert Desimone, director of the McGovern Institute. “Ed’s body of work has transformed neuroscience and biomedicine, and I am exceedingly proud of the contributions he has made to MIT and to the greater community of scientists worldwide.”

The Canada Gairdner International Awards, created in 1959, are given annually to recognize and reward the achievements of medical researchers whose work contributes significantly to the understanding of human biology and disease. The awards provide a $100,000 (CDN) prize to each scientist for their work. Each year, the five honorees of the International Awards are selected after a rigorous two-part review, with the winners voted by secret ballot by a medical advisory board composed of 33 eminent scientists from around the world.



from MIT News https://ift.tt/2Gu6GDZ

Featured video: Magical Bob

As a child, Institute Professor Robert S. Langer was captivated by the “magic” of the chemical reactions in a toy chemistry set. Decades later, he continues to be enchanted by the potential of chemical engineering. He is the most cited engineer in the world, and shows no signs of slowing down, despite four decades of ground-breaking work in drug delivery and polymer research.

Langer explains, “For me, magic has been discovering and inventing things. Discovering substances that can stop blood vessels from growing in the body, which can ultimately lead to treatments for cancer and blindness.”

The Langer Lab has had close to 1,000 students and postdocs go through its doors. Hundreds are now professors around the world. Many have started companies.

“I’m very proud of all of them,” says Langer. “I hope that I help them a little bit. That’s what we try to do.”

Submitted by: Melanie Miller Kaufman / Department of Chemical Engineering | Video by: Lillie Paquette / School of Engineering | 1 min, 26 sec



from MIT News https://ift.tt/2I7AvY8

Monday, March 26, 2018

Cheetah III robot preps for a role as a first responder

If you were to ask someone to name a new technology that emerged from MIT in the 21st century, there’s a good chance they would name the robotic cheetah. Developed by the MIT Department of Mechanical Engineering's Biomimetic Robotics Lab under the direction of Associate Professor Sangbae Kim, the quadruped MIT Cheetah has made headlines for its dynamic legged gait, speed, jumping ability, and biomimetic design.

The dog-sized Cheetah II can run on four articulated legs at up to 6.4 meters per second, make mild running turns, and leap to a height of 60 centimeters. The robot can also autonomously determine how to avoid or jump over obstacles.

Kim is now developing a third-generation robot, the Cheetah III. Instead of improving the Cheetah’s speed and jumping capabilities, Kim is converting the Cheetah into a commercially viable robot with enhancements such as a greater payload capability, wider range of motion, and a dexterous gripping function. The Cheetah III will initially act as an inspection robot in hazardous environments such as a compromised nuclear plant or chemical factory. It will then evolve to serve other emergency response needs.

“The Cheetah II was focused on high speed locomotion and agile jumping, but was not designed to perform other tasks,” says Kim. “With the Cheetah III, we put a lot of practical requirements on the design so it can be an all-around player. It can do high-speed motion and powerful actions, but it can also be very precise.”

The Biomimetic Robotics Lab is also finishing up a smaller, stripped down version of the Cheetah, called the Mini Cheetah, designed for robotics research and education. Other projects include a teleoperated humanoid robot called the Hermes that provides haptic feedback to human operators. There’s also an early stage investigation into applying Cheetah-like actuator technology to address mobility challenges among the disabled and elderly.

Conquering mobility on the land

“With the Cheetah project, I was initially motivated by copying land animals, but I also realized there was a gap in ground mobility,” says Kim. “We have conquered air and water transportation, but we haven’t conquered ground mobility because our technologies still rely on artificially paved roads or rails. None of our transportation technologies can reliably travel over natural ground or even man-made environments with stairs and curbs. Dynamic legged robots can help us conquer mobility on the ground.”

One challenge with legged systems is that they “need high torque actuators,” says Kim. “A human hip joint can generate more torque than a sports car, but achieving such condensed high torque actuation in robots is a big challenge.”

Robots tend to achieve high torque at the expense of speed and flexibility, says Kim. Factory robots use high torque actuators but they are rigid and cannot absorb energy upon the impact that results from climbing steps. Hydraulically powered, dynamic legged robots, such as the larger, higher-payload, quadruped Big Dog from Boston Dynamics, can achieve very high force and power, but at the expense of efficiency. “Efficiency is a serious issue with hydraulics, especially when you move fast,” he adds.

A chief goal of the Cheetah project has been to create actuators that can generate high torque in designs that imitate animal muscles while also achieving efficiency. To accomplish this, Kim opted for electric rather than hydraulic actuators. “Our high torque electric motors have exceeded the efficiency of animals with biological muscles, and are much more efficient, cheaper, and faster than hydraulic robots,” he says.

Cheetah III: More than a speedster

Unlike the earlier versions, the Cheetah III design was motivated more by potential applications than pure research. Kim and his team studied the requirements for an emergency response robot and worked backward.

“We believe the Cheetah III will be able to navigate in a power plant with radiation in two or three years,” says Kim. “In five to 10 years it should be able to do more physical work like disassembling a power plant by cutting pieces and bringing them out. In 15 to 20 years, it should be able to enter a building fire and possibly save a life.”

In situations such as the Fukushima nuclear disaster, robots or drones are the only safe choice for reconnaissance. Drones have some advantages over robots, but they cannot apply the large forces necessary for tasks such as opening doors, and there are many disaster situations in which fallen debris prohibits drone flight.

By comparison, the Cheetah III can apply human-level forces to the environment for hours at a time. It can often climb or jump over debris, or even move it out of the way. Compared to a drone, it’s also easier for a robot to closely inspect instrumentation, flip switches, and push buttons, says Kim. “The Cheetah III can measure temperatures or chemical compounds, or close and open valves.”

Advantages over tracked robots include the ability to maneuver over debris and climb stairs. “Stairs are some of the biggest obstacles for robots,” says Kim. “We think legged robots are better in man-made environments, especially in disaster situations where there are even more obstacles.”

The Cheetah III was slowed down a bit compared to the Cheetah II, but also given greater strength and flexibility. “We increased the torque so it can open the heavy doors found in power plants,” says Kim. “We increased the range of motion to 12 degrees of freedom by using 12 electric motors that can articulate the body and the limbs.”

This is still far short of the flexibility of animals, which have over 600 muscles. Yet, the Cheetah III can compensate somewhat with other techniques. “We maximize each joint’s work space to achieve a reasonable amount of reachability,” says Kim.

The design can even use the legs for manipulation. “By utilizing the flexibility of the limbs, the Cheetah III can open the door with one leg,” says Kim. “It can stand on three legs and equip the fourth limb with a customized swappable hand to open the door or close a valve.”

The Cheetah III has an improved payload capability to carry heavier sensors and cameras, and possibly even to drop off supplies to disabled victims. However, it’s a long way from being able to rescue them. The Cheetah III is still limited to a 20-kilogram payload, and can travel untethered for four to five hours with a minimal payload.

“Eventually, we hope to develop a machine that can rescue a person,” says Kim. “We’re not sure if the robot would carry the victim or bring a carrying device,” he says. “Our current design can at least see if there are any victims or if there are any more potential dangerous events.”

Experimenting with human-robot interaction

The semiautonomous Cheetah III can make ambulatory and navigation decisions on its own. However, for disaster work, it will primarily operate by remote control.

“Fully autonomous inspection, especially in disaster response, would be very hard,” says Kim. Among other issues, autonomous decision making often takes time, and can involve trial and error, which could delay the response.

“People will control the Cheetah III at a high level, offering assistance, but not handling every detail,” says Kim. “People could tell it to go to a specific location on the map, find this place, and open that door. When it comes to hand action or manipulation, the human will take over more control and tell the robot what tool to use.”

Humans may also be able to assist with more instinctive controls. For example, if the Cheetah uses one of its legs as an arm and then applies force, it’s hard to maintain balance. Kim is now investigating whether human operators can use “balanced feedback” to keep the Cheetah from falling over while applying full force.

“Even standing on two or three legs, it would still be able to perform high force actions that require complex balancing,” says Kim. “The human operator can feel the balance, and help the robot shift its momentum to generate more force to open or hammer a door.”

The Biomimetic Robotics Lab is exploring balanced feedback with another robot project called Hermes (Highly Efficient Robotic Mechanisms and Electromechanical System). Like the Cheetah III, it’s a fully articulated, dynamic legged robot designed for disaster response. Yet the Hermes is bipedal, and it is completely teleoperated by a human who wears a telepresence helmet and a full body suit. Both the robot and the suit are rigged with sensors and haptic feedback devices.

“The operator can sense the balance situation and react by using body weight or directly implementing more forces,” says Kim.

The latency required for such intimate real-time feedback is difficult to achieve with Wi-Fi, even when it’s not blocked by walls, distance, or wireless interference. “In most disaster situations, you would need some sort of wired communication,” says Kim. “Eventually, I believe we’ll use reinforced optical fibers.”

Improving mobility for the elderly

Looking beyond disaster response, Kim envisions an important role for agile, dynamic legged robots in health care: improving mobility for the fast-growing elderly population. Numerous robotics projects are targeting the elderly market with chatty social robots. Kim is imagining something more fundamental.

“We still don’t have a technology that can help impaired or elderly people seamlessly move from the bed to the wheelchair to the car and back again,” says Kim. “A lot of elderly people have problems getting out of bed and climbing stairs. Some elderly with knee joint problems, for example, are still pretty mobile on flat ground, but can’t climb down the stairs unassisted. That’s a very small fraction of the day when they need help. So we’re looking for something that’s lightweight and easy to use for short-time help.”

Kim is currently working on “creating a technology that could make the actuator safe,” he says. “The electric actuators we use in the Cheetah are already safer than other machines because they can easily absorb energy. Most robots are stiff, which would cause a lot of impact forces. Our machines give a little.”

By combining such safe actuator technology with some of the Hermes technology, Kim hopes to develop a robot that can help elderly people in the future. “Robots can not only address the expected labor shortages for elder care, but also the need to maintain privacy and dignity,” he says.



from MIT News https://ift.tt/2GcaKW0

Scientists report first results from neutrino mountain experiment

This week, an international team of physicists, including researchers at MIT, is reporting the first results from an underground experiment designed to answer one of physics’ most fundamental questions: Why is our universe made mostly of matter?  

According to theory, the Big Bang should have produced equal amounts of matter and antimatter — the latter consisting of “antiparticles” that are essentially mirror images of matter, only bearing charges opposite to those of protons, electrons, neutrons, and other particle counterparts. And yet, we live in a decidedly material universe, made mostly of galaxies, stars, planets, and everything we see around us — and very little antimatter.

Physicists posit that some process must have tilted the balance in favor of matter during the first moments following the Big Bang. One such theoretical process involves the neutrino — a particle that, despite having almost no mass and interacting very little with other matter, is thought to permeate the universe, with trillions of the ghostlike particles streaming harmlessly through our bodies every second.

There is a possibility that the neutrino may be its own antiparticle, meaning that it may have the ability to transform between a matter and antimatter version of itself. If that is the case, physicists believe this might explain the universe’s imbalance, as heavier neutrinos, produced immediately after the Big Bang, would have decayed asymmetrically, producing more matter, rather than antimatter, versions of themselves.

One way to confirm that the neutrino is its own antiparticle is to detect an exceedingly rare process known as a “neutrinoless double-beta decay,” in which an isotope, such as tellurium or xenon, gives off certain particles, including electrons and antineutrinos, as it naturally decays. If the neutrino is indeed its own antiparticle, then according to the rules of physics the antineutrinos should cancel each other out, and this decay process should be “neutrinoless.” Any measurement of this process should record only the electrons escaping from the isotope.

The underground experiment known as CUORE, for the Cryogenic Underground Observatory for Rare Events, is designed to detect a neutrinoless double-beta decay from the natural decay of 988 crystals of tellurium dioxide. In a paper published this week in Physical Review Letters, researchers, including physicists at MIT, report on the first two months of data collected by CUORE (Italian for “heart”). And while they have not yet detected the telltale process, they have been able to set the most stringent limits yet on the amount of time that such a process should take, if it exists at all. Based on their results, they estimate that a single atom of tellurium should undergo a neutrinoless double-beta decay, at most, once every 10 septillion (1 followed by 25 zeros) years.

Taking into account the massive number of atoms within the experiment’s 988 crystals, the researchers predict that within the next five years they should be able to detect at least five atoms undergoing this process, if it exists, providing definitive proof that the neutrino is its own antiparticle.
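The arithmetic behind a projection like this follows from the standard radioactive-decay law: for observation times far shorter than the half-life, the expected number of decays is approximately N·ln2·t/T. A sketch with illustrative round numbers (the atom count, run time, and absence of efficiency corrections are assumptions, not the collaboration's published inputs):

```python
import math

def expected_decays(n_atoms, half_life_yr, t_yr):
    """Expected decays of n_atoms over t_yr years for a given half-life.

    Uses expm1 for precision: computing 1 - 2**(-t/T) directly underflows
    to zero when t is ~25 orders of magnitude smaller than T.
    """
    return -n_atoms * math.expm1(-math.log(2.0) * t_yr / half_life_yr)

# Illustrative: ~1e26 candidate atoms against the 1e25-year (10 septillion)
# half-life limit, over a 5-year run, before isotopic-abundance and
# detection-efficiency corrections.
n = expected_decays(1e26, 1e25, 5.0)
print(f"expected decays in 5 years: {n:.1f}")
```

Because the quoted half-life is a lower limit, the count above is an upper bound on how many decays the current data allow; real sensitivity projections also fold in isotopic abundance and detection efficiency, which pushes the figure down toward the handful of events quoted above.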

“It’s a very rare process — if observed, it would be the slowest thing that has ever been measured,” says CUORE member Lindley Winslow, the Jerrold R. Zacharias Career Development Assistant Professor of Physics at MIT, who led the analysis. “The big excitement here is that we were able to run 988 crystals together, and now we’re on a path to try and see something.”

The CUORE collaboration includes some 150 scientists primarily from Italy and the U.S., including Winslow and a small team of postdocs and graduate students from MIT.

Coldest cube in the universe

The CUORE experiment is housed underground, buried deep within a mountain in central Italy, in order to shield it from external stimuli such as the constant bombardment of radiation from sources in the universe.

The heart of the experiment is a detector consisting of 19 towers, each containing 52 cube-shaped crystals of tellurium dioxide, totaling 988 crystals in all, with a mass of about 742 kilograms, or 1,600 pounds. Scientists estimate that this amount of crystals embodies around 100 septillion atoms of the particular tellurium isotope. Electronics and temperature sensors are attached to each crystal to monitor signs of their decay.

The entire detector resides within an ultracold refrigerator, about the size of a vending machine, which maintains a steady temperature of 6 millikelvin, or -459.6 degrees Fahrenheit. Researchers in the collaboration have previously calculated that this refrigerator is the coldest cubic meter that exists in the universe.   

The experiment needs to be kept exceedingly cold in order to detect minute changes in temperature generated by the decay of a single tellurium atom. In a normal double-beta decay process, a tellurium atom gives off two electrons, as well as two antineutrinos, which amount to a certain energy in the form of heat. In the event of a neutrinoless double-beta decay, the two antineutrinos should cancel each other out, and only the energy released by the two electrons would be generated. Physicists have previously calculated that this energy must be around 2.5 megaelectron volts (MeV).

In the first two months of CUORE’s operation, scientists have essentially been taking the temperature of the 988 tellurium crystals, looking for any minuscule spike in energy around that 2.5 MeV mark.

“CUORE is like a gigantic thermometer,” Winslow says. “Whenever you see a heat deposit on a crystal, you end up seeing a pulse that you can digitize. Then you go through and look at these pulses, and the height and width of the pulse corresponds to how much energy was there. Then you zoom in and count how many events were at 2.5 MeV, and we basically saw nothing. Which is probably good because we weren’t expecting to see anything in the first two months of data.”
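The counting step described here amounts to histogramming calibrated pulse energies and tallying events inside a narrow region of interest around the decay's expected energy. A toy sketch on simulated background (the window half-width is an assumption; the roughly 2.53 MeV Q-value for tellurium-130 is from the literature):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy calibrated event energies in MeV: a flat background, no signal.
energies = rng.uniform(0.5, 3.0, size=10_000)

Q_VALUE = 2.527   # MeV, approximate Q-value of Te-130 double-beta decay
WINDOW = 0.05     # MeV half-width of the region of interest (assumed)

# Select events whose energy falls inside the region of interest.
in_roi = np.abs(energies - Q_VALUE) < WINDOW
print(f"events in ROI: {in_roi.sum()} of {energies.size}")
```

In the real analysis, the background under the ROI is itself modeled and subtracted; "basically saw nothing" means the ROI count was consistent with that background alone.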

The heart will go on

The results more or less indicate that, within the short window in which CUORE has so far operated, not one of the 1,000 septillion tellurium atoms in the detector underwent a neutrinoless double-beta decay. Statistically speaking, this means that it would take at least 10 septillion, or 10^25, years for a single atom to undergo this process if the neutrino is in fact its own antiparticle.

“For tellurium dioxide, this is the best limit for the lifetime of this process that we’ve ever gotten,” Winslow says.

CUORE will continue to monitor the crystals for the next five years, and researchers are now designing the experiment’s next generation, which they have dubbed CUPID — a detector that will look for the same process within an even greater number of atoms. Beyond CUPID, Winslow says there is just one more, bigger iteration that would be possible, before scientists can make a definitive conclusion.

“If we don’t see it within 10 to 15 years, then, unless nature chose something really weird, the neutrino is most likely not its own antiparticle,” Winslow says. “Particle physics tells you there’s not much more wiggle room for the neutrino to still be its own antiparticle, and for you not to have seen it. There’s not that many places to hide.”

This research is supported by the National Institute for Nuclear Physics (INFN) in Italy, the National Science Foundation, the Alfred P. Sloan Foundation, and the U.S. Department of Energy.



from MIT News https://ift.tt/2Gae3Nh