Monday, February 9, 2026

3 Questions: Using AI to help Olympic skaters land a quint

Olympic figure skating looks effortless. Athletes sail across the ice, then soar into the air, spinning like a top, before landing on a single blade just 4-5 millimeters wide. To help figure skaters land quadruple axels, Salchows, Lutzes, and maybe even the elusive quintuple without looking the least bit stressed, Jerry Lu MFin ’24 developed an optical tracking system called OOFSkate that uses artificial intelligence to analyze video of a figure skater’s jump and make recommendations on how to improve. Lu, a former researcher at the MIT Sports Lab, has been aiding elite skaters on Team USA with their technical performance and will be working with NBC Sports during the 2026 Winter Olympics to help commentators and TV viewers make better sense of the complex scoring system in figure skating, snowboarding, and skiing. He’ll be applying AI technologies to explain nuanced judging decisions and demonstrate just how technically challenging these sports can be.

Meanwhile, Professor Anette “Peko” Hosoi, co-founder and faculty director of the MIT Sports Lab, is embarking on new research aimed at understanding how AI systems evaluate aesthetic performance in figure skating. Hosoi and Lu recently chatted with MIT News about applying AI to sports, whether AI systems could ever be used to judge Olympic figure skating, and when we might see a skater land a quint.

Q: Why apply AI to figure skating?

Lu: Skaters can always keep pushing, higher, faster, stronger. OOFSkate is all about helping skaters figure out a way to rotate a little bit faster in their jumps or jump a little bit higher. The system helps skaters catch things that might pass an eye test, allowing them to target some high-value areas of opportunity. The artistic side of skating is much harder to evaluate than the technical elements because it’s subjective.

To use the mobile training app, you just need to take a video of an athlete’s jump, and it will spit out the physical metrics that drive how many rotations you can do. It tracks those metrics and builds in data from other current and former elite athletes. You can see your data and then see, “This is how an Olympic champion did this element; perhaps I should try that.” You get the comparison and the automated classifier, which shows you that if you did this trick at the World Championships and it were judged by an international panel, this is approximately the grade of execution score they would give you.

Hosoi: There are a lot of AI tools that are coming online, especially things like pose estimators, where you can approximate skeletal configurations from video. The challenge with these pose estimators is that if you only have one camera angle, they do very well in the plane of the camera, but they do very poorly with depth. For example, if you’re trying to critique somebody’s form in fencing, and they’re moving toward the camera, you get very bad data. But with figure skating, Jerry has found one of the few areas where depth challenges don’t really matter. In figure skating, you need to understand: How high did this person jump, how many times did they go around, and how well did they land? None of those rely on depth. He’s found an application where pose estimators do really well, and that doesn’t pay a penalty for the things they do badly.
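Hosoi’s point about depth can be made concrete: from a single camera, a pose estimator’s 2D keypoints are enough to recover air time, jump height, and rotation count. The sketch below is a hypothetical illustration of that idea, not code from OOFSkate; the function, its inputs, and the `pixels_per_meter` calibration constant are all assumptions for the example.

```python
import math

def jump_metrics(hip_y, shoulder_xy, frame_rate, pixels_per_meter):
    """Estimate air time, jump height, and rotations from 2D keypoint tracks.

    hip_y: list of hip vertical positions per frame (pixels; image y grows downward)
    shoulder_xy: list of ((lx, ly), (rx, ry)) shoulder positions per frame
    """
    # Air time: count frames where the hip rises above its standing baseline.
    baseline = hip_y[0]
    airborne = [y for y in hip_y if baseline - y > 0]
    air_time = len(airborne) / frame_rate

    # Jump height from peak vertical hip displacement -- no depth needed.
    height_m = (baseline - min(hip_y)) / pixels_per_meter

    # Rotations: accumulate the frame-to-frame change in shoulder-line angle.
    total_turn = 0.0
    prev = None
    for (lx, ly), (rx, ry) in shoulder_xy:
        ang = math.atan2(ry - ly, rx - lx)
        if prev is not None:
            d = ang - prev
            # Unwrap across the +/- pi boundary so a continuous spin accumulates.
            d = (d + math.pi) % (2 * math.pi) - math.pi
            total_turn += abs(d)
        prev = ang
    rotations = total_turn / (2 * math.pi)
    return air_time, height_m, rotations
```

All three metrics come from in-plane motion only, which is why, as Hosoi notes, a single camera angle suffices for this application.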

Q: Could you ever see a world in which AI is used to evaluate the artistic side of figure skating?

Hosoi: When it comes to AI and aesthetic evaluation, we have new work underway thanks to an MIT Human Insight Collaborative (MITHIC) grant. This work is in collaboration with Professor Arthur Bahr and IDSS graduate student Eric Liu. When you ask an AI platform for an aesthetic evaluation such as “What do you think of this painting?” it will respond with something that sounds like it came from a human. What we want to understand is, to get to that assessment, are the AIs going through the same sort of reasoning pathways or using the same intuitive concepts that humans go through to arrive at, “I like that painting,” or “I don’t like that painting”? Or are they just parrots? Are they just mimicking what they heard a person say? Or is there some concept map of aesthetic appeal? Figure skating is a perfect place to look for this map because skating is aesthetically judged. And there are numbers. You can’t go around a museum and find scores, “This painting is a 35.” But in skating, you’ve got the data.

That brings up another even more interesting question, which is the difference between novices and experts. It’s known that expert humans and novice humans will react differently to seeing the same thing. Somebody who is an expert judge may have a different opinion of a skating performance than a member of the general population. We’re trying to understand differences between reactions from experts, novices, and AI. Do these reactions have some common ground in where they are coming from, or is the AI coming from a different place than both the expert and the novice?

Lu: Figure skating is interesting because everybody working in the field of AI is trying to figure out AGI or artificial general intelligence and trying to build this extremely sound AI that replicates human beings. Working on applying AI to sports like figure skating helps us understand how humans think and approach judging. This has down-the-line impacts for AI research and companies that are developing AI models. By gaining a deeper understanding of how current state-of-the-art AI models work with these sports, and how you need to do training and fine-tuning of these models to make them work for specific sports, it helps you understand how AI needs to advance.

Q: What will you be watching for in the Milan Cortina Olympics figure skating competitions, now that you’ve been studying and working in this area? Do you think someone will land a quint?

Lu: For the winter games, I am working with NBC for the figure skating, ski, and snowboarding competitions to help them tell a data-driven story for the American people. The goal is to make these sports more relatable. Skating looks slow on television, but it’s not. Everything is supposed to look effortless. If it looks hard, you are probably going to get penalized. Skaters need to learn how to spin very fast, jump extremely high, float in the air, and land beautifully on one foot. The data we are gathering can help showcase how hard skating actually is, even though it is supposed to look easy.

I’m glad we are working in the Olympic sports realm because the world watches once every four years, and these are traditionally coaching-intensive, talent-driven sports, unlike a sport like baseball, where if you don’t have an elite-level optical tracking system you are not maximizing the value that you currently have. I’m glad we get to work with these Olympic sports and athletes and make an impact here.

Hosoi: I have always watched Olympic figure skating competitions, ever since I could turn on the TV. They’re always incredible. One of the things that I’m going to be practicing is identifying the jumps, which is very hard to do if you’re an amateur “judge.”

I have also done some back-of-the-envelope calculations to see if a quint is possible. I am now totally convinced it’s possible. We will see one in our lifetime, if not relatively soon. Not in this Olympics, but soon. When I saw we were so close on the quint, I thought, what about six? Can we do six rotations? Probably not. That’s where we start to come up against the limits of human physical capability. But five, I think, is in reach.
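The kind of back-of-the-envelope estimate Hosoi describes can be sketched with projectile motion: air time follows from jump height, and total rotations are spin rate times air time. The jump height and spin rate below are illustrative assumptions for the example, not figures from her calculation.

```python
G = 9.81  # gravitational acceleration, m/s^2

def rotations_possible(jump_height_m, spin_rate_rev_per_s):
    # Time in the air for a jump that rises to jump_height_m:
    # t = 2 * sqrt(2h / g), counting both the rise and the fall.
    air_time = 2 * (2 * jump_height_m / G) ** 0.5
    return spin_rate_rev_per_s * air_time

# Illustrative figures: a roughly 0.65 m jump at about 6 revolutions
# per second yields on the order of 4.4 rotations, so modest gains in
# height or spin rate push the total toward five.
print(round(rotations_possible(0.65, 6.0), 2))
```

On these assumed numbers, five rotations sit just past the edge of current performance, which matches the intuition that a quint is in reach while six is not.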



from MIT News https://ift.tt/7VHbzhp

Times Higher Education ranks MIT No. 1 in arts and humanities, business and economics, and social sciences for 2026

The 2026 Times Higher Education World University Ranking has ranked MIT first in three subject categories: Arts and Humanities, Business and Economics, and Social Sciences, repeating the Institute’s top spot in the same subjects in 2025.

The Times Higher Education World University Ranking is an annual publication of university rankings by Times Higher Education, a leading British education magazine. The subject rankings are based on 18 rigorous performance indicators categorized under five core pillars: teaching, research environment, research quality, industry, and international outlook.

Disciplines included in MIT’s top-ranked subjects are housed in the School of Humanities, Arts, and Social Sciences (SHASS), the School of Architecture and Planning (SA+P), and the MIT Sloan School of Management.

“SHASS is a vibrant crossroads of ideas, bringing together extraordinary people,” says Agustín Rayo, the Kenan Sahin Dean of SHASS. “These rankings reflect the strength of this remarkable community and MIT’s ongoing commitment to the humanities, arts, and social sciences.” 

“The human dimension is capital to our school's mission and programs, be they architecture, planning, media arts and sciences, or the arts, and whether at the scale of individuals, communities, or societies,” says Hashim Sarkis, dean of SA+P. “The acknowledgment and celebration of their centrality by the Times Higher Education only renews our deep commitment to human values.”

“MIT and MIT Sloan are providing students with an education that ensures they have the skills, experience, and problem-solving abilities they need in order to succeed in our world today,” says Richard M. Locke, the John C Head III Dean at the MIT Sloan School of Management. “It’s not just what we teach them, but how we teach them. The interdisciplinary nature of a school like MIT combines analytical reasoning skills, deep functional knowledge, and, at MIT Sloan, a hands-on management education that teaches students how to collaborate, lead teams, and navigate challenges, now and in the future."

The Arts and Humanities ranking evaluated 817 universities from 74 countries in the disciplines of languages; literature and linguistics; history; philosophy; theology; architecture; archaeology; and art, performing arts, and design. This is the second consecutive year MIT has earned the top spot in this subject.

The ranking for Business and Economics evaluated 1,067 institutions from 91 countries and territories across three core disciplines: business and management; accounting and finance; and economics and econometrics. This is the fifth consecutive year MIT has been ranked first in this subject.

The Social Sciences ranking evaluated 1,202 institutions from 104 countries and territories in the disciplines of political science and international studies, sociology, geography, communication and media studies, and anthropology. MIT claimed the top spot in this subject for the second consecutive year.

In other subjects, MIT was also named among the top universities, ranking third in Engineering and Life Sciences, and fourth in Computer Science and Physical Sciences. Overall, MIT ranked second in the Times Higher Education 2026 World University Ranking.



from MIT News https://ift.tt/YTajtMs

A quick stretch switches this polymer’s capacity to transport heat

Most materials have an inherent capacity to handle heat. Plastic, for instance, is typically a poor thermal conductor, whereas materials like marble move heat more efficiently. If you were to place one hand on a marble countertop and the other on a plastic cutting board, the marble would conduct more heat away from your hand, creating a colder sensation compared to the plastic.

Typically, a material’s thermal conductivity cannot be changed without re-manufacturing it. But MIT engineers have now found that a relatively common material can switch its thermal conductivity. Simply stretching the material quickly dials up its heat conductance, from a baseline similar to that of plastic to a higher capacity closer to that of marble. When the material springs back to its unstretched form, it returns to its plastic-like properties.

The thermally reversible material is an olefin block copolymer — a soft and flexible polymer that is used in a wide range of commercial products. The team found that when the material is quickly stretched, its ability to conduct heat more than doubles. This transition occurs within just 0.22 seconds, which is the fastest thermal switching that has been observed in any material.
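The practical effect of more than doubling thermal conductivity can be illustrated with Fourier’s law for steady-state conduction, q = k·ΔT/d. The conductivity values below are illustrative assumptions (typical polymers sit near 0.2–0.3 W/m·K), not measurements from the study.

```python
def heat_flux(k_w_per_mk, delta_t_kelvin, thickness_m):
    """Steady-state conductive heat flux (W/m^2) through a flat slab."""
    return k_w_per_mk * delta_t_kelvin / thickness_m

dT = 10.0  # e.g., skin-to-ambient temperature difference, in kelvin
d = 1e-3   # a 1 mm slab of material

relaxed = heat_flux(0.30, dT, d)    # unstretched: plastic-like baseline (assumed)
stretched = heat_flux(0.65, dT, d)  # stretched: conductivity more than doubled

print(relaxed, stretched, stretched / relaxed)
```

Because flux scales linearly with conductivity, a fabric woven from such fibers would shed heat at more than twice the rate the moment it is stretched, consistent with the sub-second switching described above.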

This material could be used to engineer systems that adapt to changing temperatures in real time. For instance, switchable fibers could be woven into apparel that normally retains heat. When stretched, the fabric would instantly conduct heat away from a person’s body to cool them down. Similar fibers could be built into laptops and infrastructure to keep devices and buildings from overheating. The researchers are working on further optimizing the polymer and on engineering new materials with similar properties.

“We need cheap and abundant materials that can quickly adapt to environmental temperature changes,” says Svetlana Boriskina, principal research scientist in MIT’s Department of Mechanical Engineering. “Now that we’ve seen this thermal switching, this changes the direction where we can look for and build new adaptive materials.”

Boriskina and her colleagues have published their results in a study appearing today in the journal Advanced Materials. The study’s co-authors include Duo Xu, Buxuan Li, You Lyu, and Vivian Santamaria-Garcia of MIT, and Yuan Zhu of Southern University of Science and Technology in Shenzhen, China.

Elastic chains

The key to the new phenomenon is that when the material is stretched, its microscopic structures align in ways that suddenly allow heat to travel through easily, increasing the material’s thermal conductivity. In its unstretched state, the same microstructures are tangled and bunched, effectively blocking heat’s path.

As it happens, Boriskina and her colleagues didn’t set out to find a heat-switching material. They were initially looking for more sustainable alternatives to spandex, which is a synthetic fabric made from petroleum-based plastics that is traditionally difficult to recycle. As a potential replacement, the team was investigating fibers made from a different polymer known as polyethylene.

“Once we started working with the material, we realized it had other properties that were more interesting than the fact that it was elastic,” Boriskina says. “What makes polyethylene unique is it has this backbone of carbon atoms arranged along a simple chain. And carbon is a very good conductor of heat.”

The microstructure of most polymer materials, including polyethylene, contains many carbon chains. However, these chains exist in a messy, spaghetti-like tangle known as an amorphous phase. Despite the fact that carbon is a good heat conductor, the disordered arrangement of chains typically impedes heat flow. Polyethylene and most other polymers, therefore, generally have low thermal conductivity.

In previous work, MIT Professor Gang Chen and his collaborators found ways to untangle the mess of carbon chains and push polyethylene to shift from a disordered amorphous state to a more aligned, crystalline phase. This transition effectively straightened the carbon chains, providing clear highways for heat to flow through and increasing the material’s thermal conductivity. In those experiments, however, the switch was permanent; once the material’s phase changed, it could not be reversed.

As Boriskina’s team explored polyethylene, they also considered other closely related materials, including olefin block copolymer (OBC). OBC is predominantly an amorphous material, made from highly tangled chains of carbon and hydrogen atoms. Scientists had therefore assumed that OBC would exhibit low thermal conductivity. If its conductance could be increased, it would likely be permanent, similar to polyethylene.

But when the team carried out experiments to test the elasticity of OBC, they found something quite different.

“As we stretched and released the material, we realized that its thermal conductivity was really high when it was stretched and lower when it was relaxed, over thousands of cycles,” says study co-author and MIT graduate student Duo Xu. “This switch was reversible, while the material stayed mostly amorphous. That was unexpected.”

A stretchy mess

The team then took a closer look at OBC, and how it might be changing as it was stretched. The researchers used a combination of X-ray and Raman spectroscopy to observe the material’s microscopic structure as they stretched and relaxed it repeatedly. They observed that, in its unstretched state, the material consists mainly of amorphous tangles of carbon chains, with just a few islands of ordered, crystalline domains scattered here and there. When stretched, the crystalline domains seemed to align and the amorphous tangles straightened out, similar to what Gang Chen observed in polyethylene.

However, rather than transitioning entirely into a crystalline phase, the straightened tangles stayed in their amorphous state. In this way, the team found that the tangles were able to switch back and forth, from straightened to bunched and back again, as the material was stretched and relaxed repeatedly.

“Our material is always in a mostly amorphous state; it never crystallizes under strain,” Xu notes. “So it leaves you this opportunity to go back and forth in thermal conductivity a thousand times. It’s very reversible.”

The team also found that this thermal switching happens extremely fast: The material’s thermal conductivity more than doubled within just 0.22 seconds of being stretched.

“The resulting difference in heat dissipation through this material is comparable to a tactile difference between touching a plastic cutting board versus a marble countertop,” Boriskina says.

She and her colleagues are now taking the results of their experiments and working them into models to see how they can tweak a material’s amorphous structure, to trigger an even bigger change when stretched.

“Our fibers can quickly react to dissipate heat, for electronics, fabrics, and building infrastructure,” Boriskina says. “If we could make further improvements to switch their thermal conductivity from that of plastic to that closer to diamond, it would have a huge industrial and societal impact.”

This research was supported, in part, by the U.S. Department of Energy, the Office of Naval Research Global via Tec de Monterrey, the MIT Evergreen Graduate Innovation Fellowship, the MathWorks MechE Graduate Fellowship, and the MIT-SUSTech Centers for Mechanical Engineering Research and Education, and was carried out, in part, with the use of MIT.nano and ISN facilities.



from MIT News https://ift.tt/MEaklcs

Sunday, February 8, 2026

How MIT’s 10th president shaped the Cold War

Today, MIT plays a key role in maintaining U.S. competitiveness, technological leadership, and national defense — and much of the Institute’s work to support the nation’s standing in these areas can be traced back to 1953.

Two months after he took office that year, U.S. President Dwight Eisenhower received a startling report from the military: The USSR had successfully exploded a nuclear bomb nine months sooner than intelligence sources had predicted. The rising Communist power had also detonated a hydrogen bomb using technology more sophisticated than that of the U.S. And lastly, there was evidence of a new Soviet bomber that rivaled the B-52 in size and range — and the aircraft was of an entirely original design from within the USSR. There was, the report concluded, a significant chance of a surprise nuclear attack on the United States.

Eisenhower’s understanding of national security was vast (he had led the Allies to victory in World War II and served as the first supreme commander of NATO), but the connections he’d made during his two-year stint as president of Columbia University would prove critical to navigating the emerging challenges of the Cold War. He sent his advisors in search of a plan for managing this threat, and he suggested they start with James Killian, then president of MIT.

Killian had an unlikely path to the presidency of MIT. “He was neither a scientist nor an engineer,” says David Mindell, the Dibner Professor of the History of Engineering and Manufacturing and a professor of aeronautics and astronautics at MIT. “But Killian turned out to be a truly gifted administrator.”

While he was serving as editor of MIT Technology Review (where he founded what became the MIT Press), Killian was tapped by then-president Karl Compton to join his staff. As the war effort ramped up on the MIT campus in the 1940s, Compton deputized Killian to lead the RadLab — a 4,000-person effort to develop and deploy the radar systems that proved decisive in the Allied victory.

Killian was named MIT’s 10th president in 1948. In 1951, he launched MIT Lincoln Laboratory, a federally funded research center where MIT and U.S. Air Force scientists and engineers collaborated on new air defense technologies to protect the nation against a nuclear attack.

Two years later, within weeks of Eisenhower’s 1953 request, Killian convened a group of leading scientists at MIT. The group proposed a three-part study: The U.S. needed to reassess its offensive capabilities, its continental defense, and its intelligence operations. Eisenhower agreed.

Killian mobilized 42 engineers and scientists from across the country into three panels matching the committee’s charge. Between September 1954 and February 1955, the panels held 307 meetings with every major defense and intelligence organization in the U.S. government. They had unrestricted access to every project, plan, and program involving national defense. The result, a 190-page report titled “Meeting the Threat of a Surprise Attack,” was delivered to Eisenhower’s desk on Feb. 14, 1955.

The Killian Report, as it came to be known, would go on to play a dramatic role in defining the frontiers of military technology, intelligence gathering, national security policy, and global affairs over the next several decades. Killian’s input would also have dramatic impacts on Eisenhower’s presidency and the relationship between the federal government and higher education.

Foreseeing an evolving competition

The Killian Report opens by anticipating four projected “periods” in the shifting balance of power between the U.S. and the Soviet Union.

In 1955, the U.S. had a decided offensive advantage over the USSR, but it was overly vulnerable to surprise attack. In 1956 and 1957, the U.S. would have an even larger offensive advantage and be only somewhat less vulnerable to surprise. By 1960, the U.S.’ offensive advantage would be narrower, but it would be in a better position to anticipate an attack. Within a decade, the report stated, the two nations would enter “Period IV” — during which “an attack by either side would result in mutual destruction … [a period] so fraught with danger to the U.S. that we should push all promising technological development so that we may stay in Periods II and III as long as possible.”

The report went on to make extensive, detailed recommendations — accelerated development of intercontinental ballistic missiles and high-energy aircraft fuels, expansion and increased ground security for “delivery system” facilities, increased cooperation with Canada and more studies about establishing monitoring stations on polar pack ice, and “studies directed toward better understanding of the radiological hazards that may result from the detonation of large numbers of nuclear weapons,” among others.

“Eisenhower really wanted to draw the perspectives of scientists and engineers into his decision-making,” says Mindell. “Generals and admirals tend to ask for more arms and more boots on the ground. The president didn’t want to be held captive by these views — and Killian’s report really delivered this for him.”

On the day it arrived, President Eisenhower circulated the Killian Report to the head of every department and agency in the federal government and asked them to comment on its recommendations. The Cold War arms race was on — and it would be between scientists and engineers in the United States and those in the Soviet Union.

An odd couple

The Killian Report made many recommendations based on “the correctness of the current national intelligence estimates” — even though “Eisenhower was frustrated with his whole intelligence apparatus,” says Will Hitchcock, the James Madison Professor of History at the University of Virginia and author of “The Age of Eisenhower.” “He felt it was still too much World War II ‘exploding-cigar’ stuff. There wasn’t enough work on advance warning, on seeing what’s over the hill. But that’s what Eisenhower really wanted to know.” The surprise attack on Pearl Harbor still lingered in the minds of many Americans, Hitchcock notes, and “that needed to be avoided.”

Killian needed an aggressive, innovative thinker to assess U.S. intelligence, so he turned to Edwin Land. The cofounder of Polaroid, Land was an astonishingly bold engineer and inventor. He also had military experience, having developed new ordnance targeting systems, aerial photography devices, and other photographic and visual surveillance technologies during World War II. Killian approached Land knowing their methods and work style were quite different. (When the offer to lead the intelligence panel was made, Land was in Hollywood advising filmmakers on the development of 3D movies; Land told Killian he had a personal rule that any committee he served on “must fit into a taxicab.”)

In fall 1954, Land and his five-person panel quickly confirmed Killian and Eisenhower’s suspicions: “We would go in and interview generals and admirals in charge of intelligence and come away worried,” Land reported to Killian later. “We were [young scientists] asking questions — and they couldn’t answer them.” Killian and Land realized this would set their report and its recommendations on a complicated path: While they needed to acknowledge and address the challenges of broadly upgrading intelligence activities, they also needed to make rapid progress on responding to the Soviet threat.

As work on the report progressed, Land and Killian held briefings with Eisenhower. They used these meetings to make two additional proposals — neither of which, President Eisenhower decided, would be spelled out in the final report for security reasons. The first was the development of missile-firing submarines, a long-term prospect that would take a decade to complete. (The technology developed for Polaris-class submarines, Mindell notes, transferred directly to the rockets that powered the Apollo program to the moon.)

The second proposal — to fast-track development of the U-2, a new high-altitude spy plane — could be accomplished within a year, Land told Eisenhower. The president agreed to both ideas, but he put a condition on the U-2 program. As Killian later wrote: “The president asked that it should be handled in an unconventional way so that it would not become entangled in the bureaucracy of the Defense Department or troubled by rivalries among the services.”

Powered by Land’s revolutionary imaging devices, the U-2 would become a critical tool in the U.S.’ ability to assess and understand the Soviet Union’s nuclear capacity. But the spy plane would also go on to have disastrous consequences for the peace process and for Eisenhower.

The aftermath(s)

The Killian Report has a very complex legacy, says Christopher Capozzola, the Elting Morison Professor of History. “There is a series of ironies about the whole undertaking,” he says. “For example, Eisenhower was trying to tamp down interservice rivalries by getting scientists to decide things. But within a couple of years those rivalries have all gotten worse.” Similarly, Capozzola notes, Eisenhower — who famously coined the phrase “military-industrial complex” and warned against it — amplified the militarization of scientific research “more than anyone else.”

Another especially painful irony emerged on May 1, 1960. Two weeks before a meeting between Eisenhower and Khrushchev in Paris to discuss how the U.S. and USSR could ease Cold War tensions and slow the arms race, a U-2 was shot down in Soviet airspace. After a public denial by the U.S. that the aircraft was being used for espionage, the Soviets produced the plane’s wreckage, cameras, and pilot — who admitted he was working for the CIA. The peace process, which had become the centerpiece of Eisenhower’s intended legacy, collapsed.

There were also some brighter outcomes of the Killian Report, Capozzola says. It marked a dramatic reset of the national government’s relationship with academic scientists and engineers — and with MIT specifically. “The report really greased the wheels between MIT scientists and Washington,” he notes. “Perhaps more than the report itself, the deep structures and relationships that Killian set up had implications for MIT and other research universities. They started to orient their missions toward the national interest,” he adds.

The report also cemented Eisenhower’s relationship with Killian. After the launch of Sputnik, which induced a broad public panic in the U.S. about Soviet scientific capabilities, the president called on Killian to guide the national response. Eisenhower later named Killian the first special assistant to the president for science and technology. In the years that followed, Killian would go on to help launch NASA, and MIT engineers would play a critical role in the Apollo mission that landed the first person on the moon. To this day, researchers at MIT and Lincoln Laboratory uphold this legacy of service, advancing knowledge in areas vital to national security, economic competitiveness, and quality of life for all Americans.

As Eisenhower’s special assistant, Killian met with him almost daily and became one of his most trusted advisors. “Killian could talk to the president, and Eisenhower really took his advice,” says Capozzola. “Not very many people can do that. The fact that Killian had that and used it was different.”

A key to their relationship, Capozzola notes, was Killian’s approach to his work. “He exemplified the notion that if you want to get something done, don’t take the credit. At no point did Killian think he was setting science policy. He was advising people on their best options, including decision-makers who would have to make very difficult decisions. That’s it.”

In 1977, after many tours of duty in Washington and his retirement from MIT, Killian summarized his experience working for Eisenhower in his memoir, “Sputnik, Scientists, and Eisenhower.” Killian said of his colleagues: “They were held together in close harmony not only by the challenge of the scientific and technical work they were asked to undertake but by their abiding sense of the opportunity they had to serve a president they admired and the country they loved. They entered the corridors of power in a moment of crisis and served there with a sense of privilege and of admiration for the integrity and high purpose of the White House.”



from MIT News https://ift.tt/H8YzCrG

Friday, February 6, 2026

“This is science!” – MIT president talks about the importance of America’s research enterprise on GBH’s Boston Public Radio

In a wide-ranging conversation, MIT President Sally Kornbluth joined Jim Braude and Margery Eagan live in studio for GBH’s Boston Public Radio on Thursday, February 5. They talked about MIT, the pressures facing America’s research enterprise, the importance of science, that Congressional hearing on antisemitism in 2023, and more – including Sally’s experience as a Type 1 diabetic.

Reflecting on how research and innovation in the treatment of diabetes has advanced over decades of work, leading to markedly better patient care, Kornbluth exclaims: “This is science!”

With new financial pressures facing universities, increased competition for talented students and scholars from outside the U.S., as well as unprecedented pressures on university leaders and campuses, co-host Eagan asks Kornbluth what she thinks will happen in years to come.

“For us, one of the hardest things now is the endowment tax,” remarks Kornbluth. “That is $240 million a year. Think about how much science you can get for $240 million a year. Are we managing it? Yes. Are we still forging ahead on all of our exciting initiatives? Yes. But we’ve had to reconfigure things. We’ve had to merge things. And it’s not the way we should be spending our time and money.”   

Watch and listen to the full episode on YouTube. President Kornbluth appears one hour and seven minutes into the broadcast.

Following Kornbluth’s appearance, MIT Assistant Professor John Urschel – also a former offensive lineman for the Baltimore Ravens – joined Edgar B. Herwick III, host of GBH’s newest show, The Curiosity Desk, to talk about his love of his family, linear algebra, and football.

On how he eventually chose math over football, Urschel quips: “Well, I hate to break it to you, I like math better… let me tell you, when I started my PhD at MIT, I just fell in love with the place. I fell in love with this idea of being in this environment [where] everyone loves math, everyone wants to learn. I was just constantly excited every day showing up.”

Prof. Urschel appears about 2 hours and 40 minutes into the webcast on YouTube.

Coming up on Curiosity Desk later this month…

Airing weekday afternoons from 1-2 p.m., The Curiosity Desk will welcome additional MIT guests in the coming weeks. On Thursday, Feb. 12, Anette “Peko” Hosoi, Pappalardo Professor of Mechanical Engineering, and Jerry Lu MFin ’24, a former researcher at the MIT Sports Lab, visit The Curiosity Desk to discuss their work using AI to help Olympic figure skaters improve their jumps.

Then, on Thursday, Feb. 19, Professors Sangeeta Bhatia and Angela Belcher talk with Herwick about their research to improve diagnostics for ovarian cancer. We learn that ovarian cancer starts in the fallopian tubes about 80 percent of the time, and how this insight points the way to a whole new approach to diagnosing and treating the disease.



from MIT News https://ift.tt/frEd3P9

I’m walking here! A new model maps foot traffic in New York City

Early in the 1969 film “Midnight Cowboy,” Dustin Hoffman, playing the character of Ratso Rizzo, crosses a Manhattan street and angrily bangs on the hood of an encroaching taxi. Hoffman’s line — “I’m walking here!” — has since been repeated by thousands of New Yorkers. Where cars and people mix, tensions rise.

And yet, governments and planners across the U.S. haven’t thoroughly tracked where cars and people mix. Officials have long measured vehicle traffic closely while largely ignoring pedestrian traffic. Now, an MIT research group has assembled a routable dataset of sidewalks, crosswalks, and footpaths for all of New York City — a massive mapping project and the first complete model of pedestrian activity in any U.S. city.

The model could help planners decide where to make pedestrian infrastructure and public space investments, and illuminate how development decisions could affect non-motorized travel in the city. The study also helps pinpoint locations throughout the city where there are both lots of pedestrians and high pedestrian hazards, such as traffic crashes, and where streets or intersections are most in need of upgrades.

“We now have a first view of foot traffic all over New York City and can check planning decisions against it,” says Andres Sevtsuk, an associate professor in MIT’s Department of Urban Studies and Planning (DUSP), who led the study. “New York has very high densities of foot traffic outside of its most well-known areas.”

Indeed, one upshot of the model is that while Manhattan has the most foot traffic per block, the city’s other boroughs contain plenty of pedestrian-heavy stretches of sidewalk and could probably use more investment on behalf of walkers.

“Midtown Manhattan has by far the most foot traffic, but we found there is a probably unintentional Manhattan bias when it comes to policies that support pedestrian infrastructure,” Sevtsuk says. “There are a whole lot of streets in New York with very high pedestrian volumes outside of Manhattan, whether in Queens or the Bronx or Brooklyn, and we’re able to show, based on data, that a lot of these streets have foot-traffic levels similar to many parts of Manhattan.”

And, in an advance that could help cities anywhere, the model was used to quantify vehicle crashes involving pedestrians not only as raw totals, but on a per-pedestrian basis.

“A lot of cities put real investments behind keeping pedestrians safe from vehicles by prioritizing dangerous locations,” Sevtsuk says. “But that’s not only where the most crashes occur. Here we are able to calculate accidents per pedestrian, the risk people face, and that broadens the picture in terms of where the most dangerous intersections for pedestrians really are.”
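The per-pedestrian risk idea Sevtsuk describes is, at its core, a normalization: divide crash counts by pedestrian exposure rather than ranking locations by raw totals. A minimal sketch of that calculation, using entirely made-up numbers (not figures from the study):

```python
# Hypothetical illustration of per-pedestrian crash risk: raw crash totals
# vs. crashes normalized by pedestrian exposure. All numbers are invented
# for the example, not taken from the study.

def crashes_per_pedestrian(crashes: int, peds_per_hour: float, hours: float) -> float:
    """Crash risk per pedestrian passage: crashes divided by total exposure."""
    exposure = peds_per_hour * hours
    return crashes / exposure

# Two hypothetical locations over one year of evening peak hours (365 hours).
busy = crashes_per_pedestrian(crashes=30, peds_per_hour=1700, hours=365)
quiet = crashes_per_pedestrian(crashes=6, peds_per_hour=40, hours=365)

# The busy square has five times the crashes, but far lower risk per pedestrian.
print(f"busy:  {busy:.2e} crashes per pedestrian")
print(f"quiet: {quiet:.2e} crashes per pedestrian")
```

In this toy comparison the quieter location, despite far fewer total crashes, is roughly an order of magnitude more dangerous per walker — exactly the kind of location a raw-totals ranking would miss.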

The paper, “Spatial Distribution of Foot-traffic in New York City and Applications for Urban Planning,” is published today in Nature Cities.

The authors are Sevtsuk, the Charles and Ann Spaulding Associate Professor of Urban Science and Planning in DUSP and head of the City Design and Development Group; Rounaq Basu, an assistant professor at Georgia Tech; Liu Liu, a PhD student at the City Form Lab in DUSP; Abdulaziz Alhassan, a PhD student at MIT’s Center for Complex Engineering Systems; and Justin Kollar, a PhD student at MIT’s Leventhal Center for Advanced Urbanism in DUSP.

Walking everywhere

The current study continues work Sevtsuk and his colleagues have conducted charting and modeling pedestrian traffic around the world, from Melbourne to MIT’s Kendall Square neighborhood in Cambridge, Massachusetts. Many cities collect some pedestrian count data — but not much. And while officials usually request vehicle traffic impact assessments for new development plans, they rarely study how new developments or infrastructure proposals affect pedestrians.

However, New York City does devote part of its Department of Transportation (DOT) to pedestrian issues, and about 41 percent of trips city-wide are made on foot, compared to just 28 percent by vehicle, likely the highest such ratio in any big U.S. city. To calibrate the model, the MIT team used pedestrian counts that New York City’s DOT recorded in 2018 and 2019, covering up to 1,000 city sidewalk segments on weekdays and up to roughly 450 segments on weekends.

The researchers were able to test the model — which incorporates a wide range of factors — against New York City’s pedestrian-count data. Once calibrated, the model could expand foot-traffic estimates throughout the whole city, not just the points where pedestrian counts were observed.
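One simple way such a calibrate-then-extrapolate step can work — the study's actual model is far richer — is to fit a statistical relationship between segment features and observed counts, then apply the fitted relationship to every segment citywide. A sketch under that assumption, with invented features and numbers:

```python
# A minimal sketch of model calibration against observed counts, assuming a
# linear relationship between segment features and pedestrian volume. The
# features, weights, and counts here are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Features per sidewalk segment, e.g. nearby jobs, transit stops, amenities.
n_all, n_observed = 1000, 100          # citywide segments vs. counted ones
X_all = rng.uniform(0, 1, size=(n_all, 3))
true_w = np.array([800.0, 400.0, 150.0])
counts_observed = X_all[:n_observed] @ true_w + rng.normal(0, 20, n_observed)

# Calibrate: least-squares fit on the segments with manual counts...
w, *_ = np.linalg.lstsq(X_all[:n_observed], counts_observed, rcond=None)

# ...then extrapolate foot-traffic estimates to every segment in the city.
estimates_all = X_all @ w
print("fitted weights:", np.round(w, 1))
```

The key point the sketch captures is that a few hundred counted segments can anchor estimates for tens of thousands of uncounted ones, provided the fitted relationship generalizes.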

The results showed that in Midtown Manhattan, there are about 1,697 pedestrians, on average, per sidewalk segment per hour during the evening peak of foot traffic, the highest in the city. The financial district in lower Manhattan comes in second, at 740 pedestrians per hour, with Greenwich Village third at 656.

Other parts of Manhattan register lower levels of foot traffic, however. Morningside Heights and East Harlem register 226 and 227 pedestrians per block per hour. And that’s similar to, or lower than, some parts of other boroughs. Brooklyn Heights has 277 pedestrians per sidewalk segment per hour; University Heights in the Bronx has 263; Borough Park in Brooklyn and the Grand Concourse in the Bronx average 236; and a slice of Queens in the Corona area averages 222. Many other spots are over 200.

The model overlays many different types of pedestrian journeys for each time period and shows that people are generally headed to work and schools in the morning, but conduct more varied types of trips in mid-day and the evening, as they seek out amenities or conduct social or recreational visits.

“Because of jobs, transit stops are the biggest generators of foot traffic in the morning peak,” Liu observes. “In the evening peak, of course people need to get home too, but patterns are much more varied, and people are not just returning from work or school. More social and recreational travel happens after work, whether it’s getting together with friends or running errands for family or family care trips, and that’s what the model detects too.”

On the safety front, pedestrians face danger in many places, not just the intersections with the most total accidents. On a per-pedestrian basis, many parts of the city turn out to be riskier than the locations that record the most pedestrian-related crashes.

“Places like Times Square and Herald Square in Manhattan may have numerous crashes, but they have very high pedestrian volumes, and it’s actually relatively safe to walk there,” Basu says. “There are other parts of the city, around highway off-ramps and heavy car-infrastructure, including the relatively low-density borough of Staten Island, which turn out to have a disproportionate number of crashes per pedestrian.”

Taking the model across the U.S.

The MIT model stands a solid chance of being applied in New York City policy and planning circles, since officials there are aware of the research and have been regularly communicating with the MIT team about it.

For his part, Sevtsuk emphasizes that, as distinct as New York City might be, the MIT model can be applied to cities and towns anywhere in the U.S. As it happens, the team is working with municipal officials in two other places at the moment. One is Los Angeles, where city officials are not only trying to upgrade pedestrian and public transit mobility for regular daily trips, but making plans to handle an influx of visitors for the 2028 Summer Olympics.

Meanwhile, the state of Maine is working with the MIT team to evaluate pedestrian movement in over 140 of its cities and towns, to better understand the kinds of upgrades and safety improvements it could make for pedestrians across the state. Sevtsuk hopes that still other places will take notice of the New York City study and recognize that the tools are in place to analyze foot traffic more broadly in U.S. cities, to address the urgent need to decarbonize cities, and to start balancing what he views as the disproportionate focus on car travel prevalent in 20th-century urban planning.

“I hope this can inspire other cities to invest in modeling foot traffic and mapping pedestrian infrastructure as well,” Sevtsuk says. “Very few cities make plans for pedestrian mobility or examine rigorously how future developments will impact foot-traffic. But they can. Our models serve as a test bed for making future changes.” 



from MIT News https://ift.tt/mwSefEo

Thursday, February 5, 2026

Some early life forms may have breathed oxygen well before it filled the atmosphere

Oxygen is a vital and constant presence on Earth today. But that hasn’t always been the case. It wasn’t until around 2.3 billion years ago that oxygen became a permanent fixture in the atmosphere, during a pivotal period known as the Great Oxidation Event (GOE), which set the evolutionary course for oxygen-breathing life as we know it today.

A new study by MIT researchers suggests some early forms of life may have evolved the ability to use oxygen hundreds of millions of years before the GOE. The findings may represent some of the earliest evidence of aerobic respiration on Earth.

In a study appearing today in the journal Palaeogeography, Palaeoclimatology, Palaeoecology, MIT geobiologists traced the evolutionary origins of a key enzyme that enables organisms to use oxygen. The enzyme is found in the vast majority of aerobic, oxygen-breathing life forms today. The team discovered that this enzyme evolved during the Mesoarchean — a geological period that predates the Great Oxidation Event by hundreds of millions of years.

The team’s results may help to explain a longstanding puzzle in Earth’s history: Why did it take so long for oxygen to build up in the atmosphere?

The very first producers of oxygen on the planet were cyanobacteria — microbes that evolved the ability to use sunlight and water to photosynthesize, releasing oxygen as a byproduct. Scientists have determined that cyanobacteria emerged around 2.9 billion years ago. The microbes, then, were presumably churning out oxygen for hundreds of millions of years before the Great Oxidation Event. So, where did all of cyanobacteria’s early oxygen go?

Scientists suspect that rocks may have drawn down a large portion of oxygen early on, through various geochemical reactions. The MIT team’s new study now suggests that biology may have also played a role.

The researchers found that some organisms may have evolved the enzyme to use oxygen hundreds of millions of years before the Great Oxidation Event. This enzyme may have enabled the organisms living near cyanobacteria to gobble up any small amounts of oxygen that the microbes produced, in turn delaying oxygen’s accumulation in the atmosphere for hundreds of millions of years.
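The delay mechanism described here is a mass balance: atmospheric oxygen accumulates only when production outpaces the combined biological and geochemical sinks. A toy illustration with arbitrary flux units (not measured values):

```python
# A toy mass-balance sketch of the delay mechanism described above:
# atmospheric oxygen accumulates only once production exceeds the combined
# biological and geochemical sinks. All fluxes are arbitrary illustrative
# units, not measured values from the study.

def atmospheric_o2(years: float, production: float, bio_uptake: float, rock_sink: float) -> float:
    """Net O2 accumulated after `years`, clamped at zero (cannot go negative)."""
    net_flux = production - bio_uptake - rock_sink
    return max(0.0, net_flux * years)

# Rocks alone absorb most of the output: some oxygen still accumulates.
without_respirers = atmospheric_o2(1e8, production=10.0, bio_uptake=0.0, rock_sink=7.0)

# Add early aerobic respirers consuming the remainder: accumulation stalls.
with_respirers = atmospheric_o2(1e8, production=10.0, bio_uptake=3.0, rock_sink=7.0)

print(without_respirers)  # oxygen builds up over time
print(with_respirers)     # complete draw-down, delaying the GOE
```

Even a modest biological sink, if it closes the gap between production and the geochemical sinks, is enough to stall accumulation indefinitely in this toy model — which is the essence of the proposed delay.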

“This does dramatically change the story of aerobic respiration,” says study co-author Fatima Husain, a postdoc in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS). “Our study adds to this very recently emerging story that life may have used oxygen much earlier than previously thought. It shows us how incredibly innovative life is at all periods in Earth’s history.”

The study’s other co-authors include Gregory Fournier, associate professor of geobiology at MIT, along with Haitao Shang and Stilianos Louca of the University of Oregon.

First respirers

The new study adds to a long line of work at MIT aiming to piece together oxygen’s history on Earth. This body of research has helped to pin down the timing of the Great Oxidation Event as well as the first evidence of oxygen-producing cyanobacteria. The overall understanding that has emerged is that oxygen was first produced by cyanobacteria around 2.9 billion years ago, while the Great Oxidation Event — when oxygen finally accumulated enough to persist in the atmosphere — took place much later, around 2.33 billion years ago.

For Husain and her colleagues, this apparent delay between oxygen’s first production and its eventual persistence inspired a question.

“We know that the microorganisms that produce oxygen were around well before the Great Oxidation Event,” Husain says. “So it was natural to ask, was there any life around at that time that could have been capable of using that oxygen for aerobic respiration?”

If there were in fact some life forms that were using oxygen, even in small amounts, they might have played a role in keeping oxygen from building up in the atmosphere, at least for a while.

To investigate this possibility, the MIT team looked to heme-copper oxygen reductases, which are a set of enzymes that are essential for aerobic respiration. The enzymes act to reduce oxygen to water, and they are found in the majority of aerobic, oxygen-breathing organisms today, from bacteria to humans.

“We targeted the core of this enzyme for our analyses because that’s where the reaction with oxygen is actually taking place,” Husain explains.

Tree dates

The team aimed to trace the enzyme’s evolution backward in time to see when the enzyme first emerged to enable organisms to use oxygen. They first identified the enzyme’s genetic sequence and then used an automated search tool to look for this same sequence in databases containing the genomes of millions of different species of organisms.

“The hardest part of this work was that we had too much data,” Fournier says. “This enzyme is just everywhere and is present in most modern living organisms. So we had to sample and filter the data down to a dataset that was representative of the diversity of modern life and also small enough to do computation with, which is not trivial.”

The team ultimately isolated the enzyme’s sequence from several thousand modern species and mapped these sequences onto an evolutionary tree of life, based on what scientists know about when each species likely evolved and branched off. They then looked through this tree for specific species that might offer related information about their origins.

If, for instance, there is a fossil record for a particular organism on the tree, that record would include an estimate of when that organism appeared on Earth. The team would use that fossil’s age to “pin” a date to that organism on the tree. In a similar way, they could place pins across the tree to effectively tighten their estimates for when in time the enzyme evolved from one species to the next.
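The core logic of those "pins" can be sketched very simply: a fossil gives a minimum age for its lineage, and every ancestor of that lineage must be at least that old, so pins propagate up the tree and tighten the lower bound on ancestral nodes. The actual study's molecular dating is far more sophisticated; the tree and ages below are invented for illustration (ages in billions of years):

```python
# A simplified sketch of fossil "pinning" on a tree: each fossil sets a
# minimum age for its species, and an ancestral node must be at least as old
# as the oldest pin among its descendants. Tree topology and ages are
# hypothetical, chosen only to illustrate the propagation of constraints.

tree = {                                  # node -> children
    "enzyme_ancestor": ["cladeA", "cladeB"],
    "cladeA": ["sp1", "sp2"],
    "cladeB": ["sp3"],
}
fossil_pins = {"sp1": 2.0, "sp3": 2.9}    # oldest fossil per species (Ga)

def min_age(node: str) -> float:
    """Minimum age of a node: the oldest pin among itself and its descendants."""
    own = fossil_pins.get(node, 0.0)
    return max([own] + [min_age(child) for child in tree.get(node, [])])

print(min_age("enzyme_ancestor"))  # -> 2.9
```

Here a single 2.9-billion-year fossil pin deep in one branch forces the shared ancestral node to be at least that old, which is how scattered fossil dates can collectively tighten the estimate for when the enzyme itself emerged.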

In the end, the researchers were able to trace the enzyme as far back as the Mesoarchean — a geological era that lasted from 3.2 to 2.8 billion years ago. It’s around this time that the team suspects the enzyme — and organisms’ ability to use oxygen — first emerged. This period predates the Great Oxidation Event by several hundred million years.

The new findings suggest that, shortly after cyanobacteria evolved the ability to produce oxygen, other living things evolved the enzyme to use that oxygen. Any such organism that happened to live near cyanobacteria would have been able to quickly take up the oxygen that the bacteria churned out. These early aerobic organisms may have then played some role in preventing oxygen from escaping to the atmosphere, delaying its accumulation for hundreds of millions of years.

“Considered all together, MIT research has filled in the gaps in our knowledge of how Earth’s oxygenation proceeded,” Husain says. “The puzzle pieces are fitting together and really underscore how life was able to diversify and live in this new, oxygenated world.”

This research was supported, in part, by the Research Corporation for Science Advancement Scialog program.



from MIT News https://ift.tt/m96ASXY