Friday, March 27, 2026

MIT hackathon tackles real-world challenges in Ukraine

During this year’s Independent Activities Period (IAP), students, researchers, and collaborators across seven time zones came together to tackle urgent technical challenges facing Ukraine as the full-scale war enters its fourth year. 

A four-week hackathon, Build for Ukraine 2.0, brought MIT students and Ukrainian collaborators into a shared innovation environment where power outages, air-raid alerts, and subzero temperatures were part of the daily reality of teamwork.

The event was co-led by the MIT-Ukraine Program, MIT Edgerton Center, and MIT Lincoln Laboratory Beaver Works, with support from Mission Innovation X, MathWorks, and MIT.nano.

Designed and taught as an IAP subject EC.S01/EC.S11 (Build for Ukraine 2026), the hackathon paired technically diverse participants with Ukrainian organizations seeking near-term solutions to problems arising directly from wartime conditions.

“It’s not every working group that has to reschedule team meetings because some members are in Ukraine and just had a blackout,” says Hosea Siu ’14, SM ’15, PhD ’18, one of the lead organizers. “This class is unusual — in the most meaningful ways.”

A collaborative class built for real-world urgency

Build for Ukraine centered on co-design and rapid prototyping with in-country partners. Organizers spent the fall gathering challenge statements from stakeholders in Ukraine, Taiwan, the United Kingdom, Spain, and across the United States. The goal: identify problems where a small, interdisciplinary team could make measurable progress in one month.

The participant pool reflected MIT’s open IAP structure. First-year undergraduates worked alongside senior engineers, international researchers, and Ukrainian colleagues participating remotely despite frequent blackouts. Many joined meetings from darkened apartments in Kyiv, Kharkiv, and Cherkasy — often relying on unstable heaters and backup battery packs. One participant excused himself from a design review due to an air-raid alert.

“These groups developed what I call ‘quantum entanglement,’” says Svetlana Boriskina, a principal research scientist at MIT and director of the Multifunctional Metamaterials Laboratory in the Department of Mechanical Engineering. “They were sharing data in real time across continents, while experiencing the war’s impacts directly and indirectly.”

Setting the foundation: briefings and technical overviews

The first week introduced participants to the geopolitical, technical, and humanitarian landscape that would frame their work. Topics included:

  • War context and co-design practices. Boriskina and Elizabeth Wood, faculty director of the MIT-Ukraine Program and professor of history at MIT, outlined current conditions in Ukraine. Student mentor Natalie Dean ’26 (vice president of MIT’s Assistive Technology Club) led a session on co-design — emphasizing partnership with, not for, Ukrainian collaborators.
  • Extreme-environment engineering. Boriskina introduced two possible technical tracks proposed by her collaborators at Kharkiv Institute of Physics and Technology: radiation-hardened materials and self-powered sensors for extreme environments, and acoustic analysis for monitoring supercritical water cooling systems in nuclear reactors. One team, later known as HotPot, adopted the latter challenge.
  • AI, Open Source Intelligence, and disinformation. Phil Tinn ’16, a research scientist at SINTEF and an affiliate of the MIT-Ukraine Program, along with specialists from IN2, described how disinformation narratives travel across platforms, from Telegram to global social media. Cambridge University researcher Jon Roozenbeek discussed early threat-signal detection using pricing fluctuations in fake SMS verifications. Ukrainian partners presented on large language model bias propagation, bot detection, and media-anomaly analysis — groundwork for the eventual VibeTracking team.
  • Explosive ordnance disposal. Experts from MineSight and the U.S. Army National Guard detailed the scale of landmine contamination in Ukraine — by some estimates affecting a third of the country. These sessions inspired Clearview Interface, which worked on improving visual feedback for de-mining tools.
  • Drone detection. Engineers from Skyfall and MIT’s student community introduced acoustic, radiofrequency (RF), and fiber-optic-tether detection methods for drones — leading to two separate teams: Birdwatch (acoustic detection) and Hrobachki (RF detection).

Five teams, seven time zones, and one month of development

Nearly 90 people joined the project through Discord, and by the end of week one, five core teams had formed. Roles blurred: undergraduates mentored professionals, Ukrainian engineers supplied real-time operational data, and faculty offered rapid problem-solving guidance. Each team completed a Preliminary Design Review, a Critical Design Review, and a final presentation to an audience of more than 80 people, online and in person.

Despite the compressed timeline, the teams delivered promising prototypes and analyses with potential real-world application.

Team highlights

Clearview Interface — Visualizing metal-detector data for safer de-mining

Two undergraduates from Olin College developed a method for converting complex metal-detector audio signals — often an overwhelming sequence of indistinguishable beeps — into intuitive visual information. Their approach could help de-miners identify object types more quickly and accurately, enhancing both safety and mapping. The team reverse-engineered commercial detector outputs and produced a preliminary interface they plan to refine this spring.

HotPot — Acoustic monitoring for nuclear-reactor cooling systems

This team of seven (five at MIT and two from the Kharkiv Institute of Physics and Technology) worked to detect transitions from water to supercritical states inside steam pipes — a critical safety parameter in nuclear facilities that have remained in operation during wartime. Combining physics simulations, hardware engineering, and acoustics, the group analyzed data from Ukrainian partners and proposed a model capable of identifying supercritical conditions via remote monitoring.

Birdwatch — Acoustic detection of fiber-optic-controlled drones

With drones frequently used along the front and often tethered to fiber-optic control lines that evade RF detection, the Birdwatch team built an audio-based detection system using a network of cameras and microphones. They trained their model on drone signatures recorded across MIT’s campus and integrated early detections into a decision-support tool to help operators interpret and act on the alerts.
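To illustrate the general idea (not Birdwatch's actual model, which was trained on drone signatures recorded on campus), acoustic detection often begins by checking whether energy in a characteristic frequency band dominates the audio stream. A minimal sketch, with an invented `band_energy_detector` function and arbitrary band and threshold values:

```python
import numpy as np

def band_energy_detector(signal, sample_rate, band=(200.0, 2000.0),
                         frame_size=1024, threshold_db=-20.0):
    """Flag frames whose energy in a target band dominates the spectrum.

    Toy stand-in for a trained acoustic classifier: real systems learn
    drone signatures from labeled spectrograms; here we just threshold
    the fraction of spectral energy falling inside `band`.
    """
    freqs = np.fft.rfftfreq(frame_size, d=1.0 / sample_rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    window = np.hanning(frame_size)
    flags = []
    for i in range(len(signal) // frame_size):
        frame = signal[i * frame_size:(i + 1) * frame_size]
        spectrum = np.abs(np.fft.rfft(frame * window)) ** 2
        ratio = spectrum[in_band].sum() / (spectrum.sum() + 1e-12)
        flags.append(bool(10.0 * np.log10(ratio + 1e-12) > threshold_db))
    return flags
```

A production system would replace the fixed threshold with a classifier trained on labeled recordings, but the front end (framing, windowing, spectral features) is broadly similar.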

Hrobachki — Radiofrequency localization for long-range drones

Two MIT students, along with collaborators at Kenyon College, Olin College, and a partner in Cherkasy, Ukraine, focused on RF detection for drones operating beyond front-line distances. They established nodes at MIT, Olin, and the town of Milton, Massachusetts, demonstrating the feasibility of distributed RF sensing for aerial threat identification.

VibeTracking — Following the movement of disinformation narratives

The smallest team — a master’s student in Lviv supported by several advisors — collaborated with IN2 to build a large-language-model pipeline that classifies and groups narratives across platforms such as Telegram and X. Their system demonstrated the likely propagation path of a specific narrative, illustrating how early-stage disinformation can be identified before it reaches mainstream channels.
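To give a flavor of narrative grouping (the team's actual pipeline uses large language models; the toy below substitutes raw word counts and a greedy cosine-similarity merge, with invented function names and an arbitrary threshold):

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def group_messages(messages, threshold=0.5):
    """Greedily group messages whose word overlap exceeds a threshold.

    Toy stand-in for LLM-based narrative clustering: each message joins
    the first existing group it resembles, otherwise starts a new one.
    """
    vectors = [Counter(m.lower().split()) for m in messages]
    groups = []      # lists of message indices
    centroids = []   # one merged word-count vector per group
    for i, vec in enumerate(vectors):
        for g, c in zip(groups, centroids):
            if cosine(vec, c) >= threshold:
                g.append(i)
                c.update(vec)   # fold the message into the centroid
                break
        else:
            groups.append([i])
            centroids.append(Counter(vec))
    return groups
```

Embedding-based versions swap the word-count vectors for dense sentence embeddings, which lets paraphrases of the same narrative cluster together even without shared vocabulary.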

Resilience, connection, and next steps

On the final day of presentations, specialists from Ukrainian universities, industry partners, and MIT-affiliated programs filled the room and populated the Zoom call. Their response was enthusiastic, not only because of what the teams produced in four weeks, but because of the collaborative networks formed under difficult conditions.

“The most important outcome is the community that emerged,” Boriskina says. “These teams built tools — but they also built relationships that will carry this work forward.”

Organizers expect several projects to continue this spring through research internships, Undergraduate Research Opportunity Program projects, and follow-on collaborations with Ukrainian institutions.

Students interested in joining ongoing Build for Ukraine projects can email the MIT-Ukraine Program. To support MIT-Ukraine initiatives, contact Svitlana Krasynska.



from MIT News https://ift.tt/xHtmVQD

Thursday, March 26, 2026

Implantable islet cells could control diabetes without insulin injections

Most diabetes patients must carefully monitor their blood sugar levels and inject insulin multiple times per day to keep their blood sugar from getting too high.

As a possible alternative to those injections, MIT researchers are developing an implantable device that contains insulin-producing cells. The device encapsulates the cells, protecting them from immune rejection, and it also carries an on-board oxygen generator to keep the cells healthy.

This device, the researchers hope, could offer a way to achieve long-term control of type 1 diabetes. In a new study, they showed that these encapsulated pancreatic islet cells could survive in the body for at least 90 days. In mice that received the implants, the cells remained functional and produced enough insulin to control the animals’ blood sugar levels.

“Islet cell therapy can be a transformative treatment for patients. However, current methods also require immune suppression, which for some people can be really debilitating,” says Daniel Anderson, a professor in MIT’s Department of Chemical Engineering and a member of MIT’s Koch Institute for Integrative Cancer Research and Institute for Medical Engineering and Science. “Our goal is to find a way to give patients the benefit of cell therapy without the need for immune suppression.”

Anderson is the senior author of the study, which appears today in the journal Device. Former MIT research scientist Siddharth Krishnan, who is now an assistant professor of electrical engineering at Stanford University, and former MIT postdoc Matthew Bochenek are the lead authors of the paper. Robert Langer, the David H. Koch Institute Professor at MIT, is also a co-author.

Insulin on demand

Islet cell transplantation has already been used successfully to treat diabetes in patients. Those islet cells typically come from human cadavers, or more recently, can be generated from stem cells. In either case, patients must take immunosuppressive drugs to prevent their immune system from rejecting the transplanted cells.

Another way to prevent immune rejection is to encapsulate cells in a protective device. However, this raises new challenges, as the coating that surrounds the cells can prevent them from receiving enough oxygen.

In a 2023 study, Anderson and his colleagues reported an islet-encapsulation device that also carries an on-board oxygen generator. This generator consists of a proton-exchange membrane that can split water vapor (found abundantly in the body) into hydrogen and oxygen. The hydrogen diffuses harmlessly away, while oxygen goes into a storage chamber that feeds the islet cells through a thin, oxygen-permeable membrane.
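For background, the water-splitting chemistry such a proton-exchange-membrane generator relies on can be summarized by the standard electrolysis half-reactions and Faraday's law (textbook electrochemistry, not taken from the paper):

```latex
% Anode (oxygen-evolving) half-reaction:
\[
  2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-
\]
% Protons crossing the membrane recombine into hydrogen at the cathode:
\[
  4\,\mathrm{H^+} + 4\,e^- \;\longrightarrow\; 2\,\mathrm{H_2}
\]
% Faraday's law fixes the oxygen production rate for a driving current I:
\[
  \dot{n}_{\mathrm{O_2}} = \frac{I}{4F}, \qquad F \approx 96{,}485\ \mathrm{C/mol}
\]
```

Because four electrons are transferred per oxygen molecule, the current delivered to the membrane directly sets the oxygen supply rate available to the encapsulated cells.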

Cells encapsulated within this device, they found, could produce insulin for up to a month after being implanted in mice.

“A month is a good timeframe in that it shows basic proof-of-concept. But from a translational standpoint, it’s important to show that you can go quite a bit longer than that,” Krishnan says.

In the new study, the researchers increased the lifespan of the devices by making them more waterproof and more resilient to cracking. They also improved the device electronics to deliver more power to the oxygen generator. The implant is powered wirelessly by an external antenna placed on the skin, which transfers energy to the device. By optimizing the circuitry, the researchers were able to increase the amount of power reaching the oxygen-generating system.

The additional power allowed the device to produce more oxygen, helping the encapsulated cells to survive and function more effectively. As a result, the cells were able to generate much more insulin over time.

Protein factories

In studies in rats and mice, the researchers showed that the new device could function for at least 90 days after being implanted under the skin. During this time, donor islet cells were able to produce enough insulin to keep the animals’ blood sugar levels within a healthy range.

The researchers saw similar results with islet cells derived from induced pluripotent stem cells, which could one day provide an indefinite supply that could be used for any patient who needs them. These islets didn’t fully reverse diabetes, but they did achieve some control of blood sugar levels.

“We’re hoping that in the future, if we can give the cells a little bit longer to fully mature, that they’ll secrete even more insulin to better regulate diabetes in the animals,” Bochenek says.

The researchers now plan to study whether they can get the devices to last for even longer in the body — up to two years, or longer.

“Long-term survival of the islets is an important goal,” Anderson says. “The cells, if they’re in the right environment, seem to be able to survive for a long time. We are excited by the duration we’ve already achieved, and we will be working to extend their function as long as possible.”

The researchers are also exploring the possibility of using this approach to deliver cells that could produce other useful proteins, such as antibodies, enzymes, or clotting factors.

“We think that these technologies could provide a long-term way to treat human disease by making drugs in the body instead of outside of the body,” Anderson says. “There are many protein therapies where patients must receive repeated, lengthy infusions. We think it may be possible to create a device that could continuously create protein therapeutics on demand and as needed by the patient.”

The research was funded, in part, by Breakthrough T1D, the Leona M. and Harry B. Helmsley Charitable Trust, the National Institutes of Health, and a Koch Institute Support (core) Grant from the National Cancer Institute.



from MIT News https://ift.tt/PVcoYZv

Study reveals why some cancer therapies don’t work for all patients

Drugs that block enzymes called tyrosine kinases are among the most effective targeted therapies for cancer. However, they typically work for only 40 to 80 percent of the patients who would be expected to respond to them.

In a new study, MIT researchers have figured out why those drugs don’t work in all cases: Many of these tumors have turned on a backup survival pathway that helps them keep growing when the targeted pathway is knocked out.

“This seems to be hardwired into the cells and seems to be providing activation of a critical survival pathway in cancer cells,” says Forest White, the Ned C. and Janet C. Rice Professor of Biological Engineering at MIT. “This pathway allows the cells to be resistant to a wide variety of therapies, including chemotherapies.”

Additionally, the researchers found that they could kill those drug-resistant cancer cells by treating with both a tyrosine kinase inhibitor and a drug that targets the backup pathway. Clinical trials are now underway to test one such combination in lung cancer patients.

White is the senior author of the study, which appears this week in the Proceedings of the National Academy of Sciences. Cameron Flower PhD ’24, who is now a postdoc at Dana-Farber Cancer Institute and Boston Children’s Hospital, is the paper’s lead author.

Tumor survival

Tyrosine kinases are involved in many signaling pathways that allow cells to receive input from the external environment and convert it into a response such as growing or dividing. There are about 90 types of these kinases in human cells, and many of them are overactive in cancer cells.

“These kinases are very important for regulating cell growth and mitosis, and pushing the cell from a nondividing state to a dividing state depends on the activity of a lot of different tyrosine kinases,” Flower says. “We see a lot of mutations and overexpression of these kinases in cancer cells.”

These cancer-associated kinases include EGFR and BCR-ABL. Many cancer drugs targeting these kinases, including imatinib (Gleevec), have been approved to treat leukemia and other cancers. However, these drugs are not effective for all of the patients whose tumors overexpress tyrosine kinases — a phenomenon that has puzzled cancer researchers.

That lower-than-expected success rate motivated the MIT team to look into these drugs and try to figure out why some tumors do not respond to them.

For this study, the researchers examined six different cancer cell lines, which originally came from lung cancer patients. They chose two cell lines with EGFR mutations, two with mutations in a tyrosine kinase called MET, and two with mutations in a tyrosine kinase called ALK. Each pair included one line that responded well to the tyrosine kinase inhibitor targeting the overactive pathway and one line that did not.

Using a technique called phosphoproteomics, the researchers were able to analyze the signaling pathways that were active in each of the cells, before and after treatment. Phosphoproteomics is used to identify proteins that have had a phosphate group added to them by a kinase. This process, known as phosphorylation, can activate or deactivate the target protein.

The researchers’ analysis revealed that the drugs were working as intended in all of the cancer cells. Even in resistant cells, the drugs did knock out signaling by their target kinase. However, in the cells that were resistant, an alternative network was already turned on, which helped the cells survive in spite of the treatment.

“Even before the therapy begins, the cells are in a state that intrinsically is resistant to the drug,” Flower says.

This survival network consists of signaling pathways that are regulated by another group of kinases, known as SRC family kinases. Activation of this network appears to help cancer cells proliferate and possibly to migrate to new locations in the body. In addition to lung cancer, researchers from White’s lab have also found SRC family kinases activated in melanoma cells, where they also play a role in drug resistance, and in glioblastoma, a type of brain cancer.

“As inhibitors for SRC kinases are also drugs, the work suggests that combining inhibitors of driver oncogenes with SRC inhibitors could increase the number of patients who would benefit. This strategy merits testing in new clinical trials,” says Benjamin Neel, a professor of medicine at NYU Grossman School of Medicine, who was not involved in the study.

These findings might also explain why some patients who initially respond to tyrosine kinase inhibitors end up having their tumors recur later; the cells may end up activating this same survival pathway, but not until sometime after the initial treatment.

Combating resistance

The researchers also found that treating the resistant cells with both a tyrosine kinase inhibitor and a drug that inhibits SRC family kinases led to much greater cell death rates. By coincidence, a clinical trial testing the combination of a tyrosine kinase inhibitor called osimertinib and an SRC inhibitor is now underway, in patients with lung cancer. The MIT team now hopes to work with the same drug company to run a similar trial in pancreatic cancer patients.

The researchers also showed that they could use phosphoproteomics to analyze patient biopsy samples to see which cells already have the SRC pathways turned on.

“We are really excited to watch these clinical trials and to see how well patients do on these combinations. And I really think there’s a future for using tyrosine phosphoproteomics to guide this clinical decision-making,” White says.

This therapy might also be useful for patients whose tumors are originally susceptible to tyrosine kinase inhibitors but then later become resistant by turning on SRC pathways.

“Among the sensitive cells, some of them are able to upregulate this survival pathway and survive, which might be the residual disease that’s still there after treatment,” White says. “One of the interesting avenues here is, could we improve therapy for almost everybody, regardless of whether their tumors have intrinsic or adaptive resistance?”

The research was funded by the National Institutes of Health and the MIT Center for Precision Cancer Medicine.



from MIT News https://ift.tt/jH2oUJu

“Near-misses” in particle accelerators can illuminate new physics, study finds

Particle accelerators reveal the heart of nuclear matter by smashing together atoms at close to the speed of light. The high-energy collisions produce a shower of subatomic fragments that scientists can then study to reconstruct the core building blocks of matter.

An MIT-led team has now used the world’s most powerful particle accelerator to discover new properties of matter, through particles’ “near-misses.” The approach has turned the particle accelerator into a new kind of microscope — and led to the discovery of new behavior in the forces that hold matter together.

In a study appearing this week in the journal Physical Review Letters, the team reports results from the Large Hadron Collider (LHC) — a massive underground, ring-shaped accelerator in Geneva, Switzerland. Rather than focus on the accelerator’s particle collisions, the MIT team searched for instances when particles barely glanced by each other.

When particles travel at close to the speed of light, they are surrounded by an electromagnetic halo that flattens when particles pass close but don’t collide. The pancaked energy fields produce extremely high-energy photons. Occasionally, a photon from one particle can ping off another particle, like an intense, quantum-sized pinprick of light.

The MIT team was able to pick out such near-miss pinpricks, or what scientists call “photonuclear interactions,” from the LHC’s particle-collision data. They found that when some photons pinged off a particle, they kicked out a type of subatomic particle, known as a D0 meson, that the scientists could measure for the first time.

D0 mesons are subatomic particles that contain a charm quark, a rare type of quark not normally found in ordinary nuclear matter. Quarks are the fundamental building blocks of all matter, and they are bound together by gluons, massless particles that act as the invisible glue, or “strong force,” holding matter together. Because charm quarks can only be created in high-energy interactions, they provide an especially clean, unambiguous probe of the quarks and gluons inside a nucleus.

Through their measurements of D0 mesons, the researchers could estimate how tightly gluons are packed, and, essentially, how strong the strong force is within a particle’s nucleus.

“Our result gives an indication that when nuclear matter is squeezed together, then gluons start behaving in a funny way,” says lead author Gian Michele Innocenti, an assistant professor of physics at MIT. “We need to know how these gluons behave in these extreme conditions because gluons keep the universe together. And at this point, photonuclear interactions are the best way we have to study gluon behavior.”

The study’s co-authors include members of the CMS Collaboration — a global consortium of physicists who operate and maintain the Compact Muon Solenoid (CMS) experiment, one of the largest detectors at the LHC and the one used to collect the study’s data.

Bringing a “background” into focus

With each run, the Large Hadron Collider fires off needle-thin beams of particles in opposite directions around a 27-kilometer-long underground ring. When the beams cross paths, particles can collide. If the collisions happen to take place in a region of the ring where the CMS detector is set up, the detector can record the collisions, and scientists can then analyze the aftermath to reconstruct the fragments that make up the original particles.

Since the LHC began operations in 2008, the focus has been overwhelmingly on the detection and analysis of “head-on” collisions. Physicists have known that by accelerating particle beams, they would also produce photonuclear interactions — near-miss events where a particle might collide not with another particle, but with its cloud of photons. But such light-nucleus interactions were thought to be simply noise.

“These photonuclear events were considered a background that people wanted to cancel,” Innocenti says. “But now people want to use it as a signal because a collision between a photon and a nucleus can essentially be like a super-high-accuracy microscope for nuclear matter.”

When a photon pings off a particle, the abundance, direction, and energy of the produced D0 meson relates directly to the energy and density of the gluons in the nucleus. If scientists can detect and measure this photon interaction, it would be like using an extremely small and powerful flashlight to illuminate the nuclear structures. But until now, it was assumed that photonuclear interactions would be impossible to pick out amid the various physics processes that can occur in such collisions.

“People didn’t think it was possible to remove the huge mess of all these other collisions, to zoom in on single photons hitting single nuclei producing a D0 meson,” Innocenti says. “We had to devise a system to recognize those very rare photonuclear interactions while data was being taken of particle collisions.”

Illuminating charm

For their new study, Innocenti and his colleagues first simulated what a photonuclear interaction would look like amid a shower of other particle collisions. In particular, they simulated a scenario in which a photon pings off a nucleus and produces a D0 meson. Although these events are rare, D0 mesons are among the most abundant particles that contain a charm quark. The team reasoned that if they could detect signs of a charm quark in D0 mesons produced in a photonuclear interaction, it could give valuable information about the gluons that hold the nucleus together.

With their simulations, the researchers then developed an algorithm to detect photonuclear interactions. They implemented the algorithm at the CMS detector to search for signals in real-time during the LHC’s particle-colliding runs.

“We had to collect tens of billions of collisions in order to extract a few hundred of these rare instances where a photon hits a nucleus and produces one of these exotic D0 meson particles,” Innocenti explains.

From this enormous dataset, the team identified a clean sample of these rare events by exploiting CMS’s advanced detector capabilities to select near-miss events and reconstruct the properties of the D0 mesons.
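To give a sense of what "selecting near-miss events" can mean in practice (the real CMS selection is far more sophisticated and is not detailed in the article), ultraperipheral candidates are typically characterized by low track multiplicity and an empty region, or "rapidity gap," on one side of the detector. A toy filter under those assumptions, with invented thresholds:

```python
def is_photonuclear_candidate(tracks, max_tracks=10, gap_edge=2.0):
    """Toy ultraperipheral-event filter.

    `tracks` is a list of (eta, pt) tuples for reconstructed charged
    tracks. Near-miss (photonuclear) events tend to leave few tracks and
    an empty region on the side of the photon-emitting nucleus. The
    thresholds here are illustrative, not the CMS selection.
    """
    if len(tracks) > max_tracks:
        return False              # head-on collisions produce far more tracks
    # Require a gap: no activity beyond +gap_edge OR beyond -gap_edge.
    forward = any(eta > gap_edge for eta, _ in tracks)
    backward = any(eta < -gap_edge for eta, _ in tracks)
    return not (forward and backward)
```

In the real experiment, a selection of this general shape runs online, in the trigger, so that the rare candidates are kept while billions of ordinary collisions are discarded.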

Through this process, the team detected instances of D0 meson production and then worked back to calculate properties of the particles’ charm quarks and the gluons that would have held them together in the original nucleus. 

“We are constraining what happens to gluons when they are squeezed in ions that are very large that are traveling very fast,” Innocenti says. “So far, our data confirms what people expect in terms of high-density nuclear matter. In reality, this is the first time we’ve shown this kind of measurement is feasible.”

The team is working to improve the measurement’s accuracy in order to provide a clearer picture of how quarks and gluons are arranged inside a nucleus.

“Gluons are a very strong force that keeps the universe together,” Innocenti says. “The description of the strong force is at the basis of everything we see in nature. Now we have a way to either fully confirm, or show deviations from, that description.”

This work was supported, in part, by the U.S. Department of Energy, including support from a DOE Early Career Research Program award, and it builds on the contributions of a large MIT team of graduate students, undergraduate researchers, scientists, and postdocs.



from MIT News https://ift.tt/Z9pNbae

Wednesday, March 25, 2026

AI system learns to keep warehouse robot traffic running smoothly

Inside a giant autonomous warehouse, hundreds of robots dart down aisles as they collect and distribute items to fulfill a steady stream of customer orders. In this busy environment, even small traffic jams or minor collisions can snowball into massive slowdowns.

To avoid such an avalanche of inefficiencies, researchers from MIT and the tech firm Symbotic developed a new method that automatically keeps a fleet of robots moving smoothly. Their method learns which robots should go first at each moment, based on how congestion is forming, and adapts to prioritize robots that are about to get stuck. In this way, the system can reroute robots in advance to avoid bottlenecks.

The hybrid system utilizes deep reinforcement learning, a powerful artificial intelligence method for solving complex problems, to figure out which robots should be prioritized. Then, a fast and reliable planning algorithm feeds instructions to the robots, enabling them to respond rapidly in constantly changing conditions.

In simulations inspired by actual e-commerce warehouse layouts, this new approach achieved about a 25 percent gain in throughput over other methods. Importantly, the system can quickly adapt to new environments with different numbers of robots or varied warehouse layouts.

“There are a lot of decision-making problems in manufacturing and logistics where companies rely on algorithms designed by human experts. But we have shown that, with the power of deep reinforcement learning, we can achieve super-human performance. This is a very promising approach, because in these giant warehouses even a 2 or 3 percent increase in throughput can have a huge impact,” says Han Zheng, a graduate student in the Laboratory for Information and Decision Systems (LIDS) at MIT and lead author of a paper on this new approach.

Zheng is joined on the paper by Yining Ma, a LIDS postdoc; Brandon Araki and Jingkai Chen of Symbotic; and senior author Cathy Wu, the Class of 1954 Career Development Associate Professor in Civil and Environmental Engineering (CEE) and the Institute for Data, Systems, and Society (IDSS) at MIT, and a member of LIDS. The research appears today in the Journal of Artificial Intelligence Research.

Rerouting robots

Coordinating hundreds of robots in an e-commerce warehouse simultaneously is no easy task.

The problem is especially complicated because the warehouse is a dynamic environment, and robots continually receive new tasks after reaching their goals. They need to be rapidly redirected as they leave and enter the warehouse floor.

Companies often leverage algorithms written by human experts to determine where and when robots should move to maximize the number of packages they can handle.

But if there is congestion or a collision, a firm may have no choice but to shut down the entire warehouse for hours to manually sort the problem out.

“In this setting, we don’t have an exact prediction of the future. We only know what the future might hold, in terms of the packages that come in or the distribution of future orders. The planning system needs to be adaptive to these changes as the warehouse operations go on,” Zheng says.

The MIT researchers achieved this adaptability using machine learning. They began by designing a neural network model to take observations of the warehouse environment and decide how to prioritize the robots. They train this model using deep reinforcement learning, a trial-and-error method in which the model learns to control robots in simulations that mimic actual warehouses. The model is rewarded for making decisions that increase overall throughput while avoiding conflicts.

Over time, the neural network learns to coordinate many robots efficiently.
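The training signal described above — reward throughput, penalize conflicts — can be illustrated with a toy bandit-style learner. This is a deliberately simplified sketch, not the paper's system: the two-robot conflict, the reward numbers, and the epsilon-greedy update are all invented for illustration, standing in for full deep reinforcement learning.

```python
import random

# Toy sketch: a learner picks which robot gets priority at a simulated
# conflict. Action 0/1 = give priority to robot A/B. The reward favors
# throughput and penalizes collisions, mirroring the signal described
# in the article; all numbers here are hypothetical.

random.seed(0)

def simulated_reward(action):
    # Hypothetical environment: prioritizing robot B tends to clear
    # congestion (higher expected throughput, fewer collisions).
    base = 1.0 if action == 1 else 0.4
    collision_prob = 0.3 if action == 0 else 0.05
    penalty = 0.5 if random.random() < collision_prob else 0.0
    return base - penalty + random.gauss(0, 0.1)

q = [0.0, 0.0]       # estimated value of each prioritization choice
counts = [0, 0]
for step in range(2000):
    # epsilon-greedy: mostly exploit the best-known choice, sometimes explore
    a = random.randrange(2) if random.random() < 0.1 else max((0, 1), key=lambda i: q[i])
    r = simulated_reward(a)
    counts[a] += 1
    q[a] += (r - q[a]) / counts[a]   # incremental mean update

print(q)  # the learner should come to value prioritizing robot B more highly
```

In the real system a neural network plays the role of this value table, and the "environment" is a full warehouse simulation rather than a one-step reward draw.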

“By interacting with simulations inspired by real warehouse layouts, our system receives feedback that we use to make its decision-making more intelligent. The trained neural network can then adapt to warehouses with different layouts,” Zheng explains.

It is designed to capture the long-term constraints and obstacles in each robot’s path, while also considering dynamic interactions between robots as they move through the warehouse.

By predicting current and future robot interactions, the model plans to avoid congestion before it happens.

After the neural network decides which robots should receive priority, the system employs a tried-and-true planning algorithm to tell each robot how to move from one point to another. This efficient algorithm helps the robots react quickly in the changing warehouse environment.

This combination of methods is key.

“This hybrid approach builds on my group’s work on how to achieve the best of both worlds between machine learning and classical optimization methods. Pure machine-learning methods still struggle to solve complex optimization problems, and yet it is extremely time- and labor-intensive for human experts to design effective methods. But together, using expert-designed methods the right way can tremendously simplify the machine learning task,” says Wu.
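One concrete way to realize the priority-then-plan pipeline can be sketched with a classical prioritized planner (the paper's exact planner may differ). Here the list order of the robots stands in for the neural network's priority decision, and each robot is routed by breadth-first search through space-time, treating higher-priority robots' reserved cells as obstacles. Edge-swap conflicts are ignored in this toy.

```python
from collections import deque

GRID = 5  # 5x5 open grid, purely illustrative

def plan(start, goal, reserved, horizon=30):
    """Breadth-first search over (cell, time) states, avoiding reserved
    space-time cells. Returns one cell per timestep, start to goal."""
    frontier = deque([(start, 0, [start])])
    seen = {(start, 0)}
    while frontier:
        cell, t, path = frontier.popleft()
        if cell == goal:
            return path
        x, y = cell
        for dx, dy in ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)):  # wait or move
            nxt, nt = (x + dx, y + dy), t + 1
            if not (0 <= nxt[0] < GRID and 0 <= nxt[1] < GRID):
                continue
            if nt > horizon or (nxt, nt) in reserved or (nxt, nt) in seen:
                continue
            seen.add((nxt, nt))
            frontier.append((nxt, nt, path + [nxt]))
    return None

def plan_all(robots):
    reserved, paths = set(), []
    for start, goal in robots:  # list order stands in for learned priority
        path = plan(start, goal, reserved)
        assert path is not None, "no conflict-free path within horizon"
        for t, cell in enumerate(path):
            reserved.add((cell, t))
        for t in range(len(path), 31):   # robot parks at its goal afterward
            reserved.add((path[-1], t))
        paths.append(path)
    return paths

# Two robots heading straight at each other along the same row: the
# lower-priority robot must wait or detour around the first one's path.
paths = plan_all([((0, 0), (4, 0)), ((4, 0), (0, 0))])
for t in range(max(len(p) for p in paths)):
    cells = [p[min(t, len(p) - 1)] for p in paths]
    assert len(set(cells)) == len(cells)   # no vertex collisions
print([len(p) for p in paths])
```

Because the search itself is cheap and deterministic, the learned component only has to decide the ordering — which is the division of labor Wu describes.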

Overcoming complexity

Once the researchers trained the neural network, they tested the system in simulated warehouses that were different from those it had seen during training. Since industrial simulations were too inefficient for this complex problem, the researchers designed their own environments to mimic what happens in actual warehouses.

On average, their hybrid learning-based approach achieved 25 percent greater throughput than both traditional algorithms and a random search method, measured in packages delivered per robot. Their approach could also generate feasible robot path plans that overcame congestion caused by traditional methods.

“Especially when the density of robots in the warehouse goes up, the complexity scales exponentially, and these traditional methods quickly start to break down. In these environments, our method is much more efficient,” Zheng says.

While their system is still far from real-world deployment, these demonstrations highlight the feasibility and benefits of a machine learning-guided approach in warehouse automation.

In the future, the researchers want to include task assignments in the problem formulation, since determining which robot will complete each task impacts congestion. They also plan to scale up their system to larger warehouses with thousands of robots.

This research was funded by Symbotic.



from MIT News https://ift.tt/UHpf568

Why solid-state batteries keep short-circuiting

Batteries that use solid materials as their charge-carrying electrolyte could potentially be a safer and far more energy-dense alternative to lithium-ion batteries. However, these solid-state batteries have been plagued by the formation of metallic filaments called dendrites that cause them to short-circuit.

The problem has so far prevented such batteries from becoming a major player in energy storage. But now, research from MIT could finally help engineers find a way to get past this hurdle.

For decades, many researchers have treated dendrites as largely the result of mechanical stress — like cracks that form on the sidewalk when a tree root grows underneath. But MIT engineers have discovered the exact opposite: Faster dendrite growth was associated with lower stress levels in a commonly used battery electrolyte material. Using a new technique that allowed them to directly measure the stress around growing dendrites, the researchers found cracks formed at stress levels as low as 25 percent of what would be expected under mechanical stress alone.

The experiments, published in Nature today, instead revealed another culprit: chemical reactions caused by high electrical currents that weaken the electrolyte and make it more susceptible to dendrite growth. Researchers had previously proposed that such reactions cause dendrite growth, but the new study provides the first experimental data on the interplay between chemical degradation and mechanical stress in dendrite formation.

“Direct measurement techniques allowed us to see how tough the material is as we cycle the cell,” says Cole Fincher, the paper’s first author and an MIT PhD student in materials science and engineering. “What we saw was that if you just test the ceramic electrolyte on the benchtop, it’s about as tough as your tooth. But during charging, it gets a lot weaker — closer to the brittleness of a lollipop.”

The findings reveal why developing stronger electrolytes alone hasn’t solved the dendrite problem. They also point to the importance of developing more chemically stable materials to finally fulfill the promise of high-density solid-state batteries.

“There’s a large community of researchers that are constantly trying to discover and design better solid electrolytes to enable the solid-state battery,” says senior author Yet-Ming Chiang, MIT’s Kyocera Professor of Materials Science and Engineering. “This study provides guidance in those efforts. We discovered a new mechanism by which these dendrites grow, allowing us to explore ways to design around it to make solid-state batteries successful.”

Joining Fincher and Chiang on the paper are MIT PhD student Colin Gilgenbach; Thermo Fisher Scientific scientists Christian Roach and Rachel Osmundsen; MIT.nano researcher Aubrey Penn; MIT Toyota Professor in Materials Processing W. Craig Carter; MIT Kyocera Professor of Materials Science and Engineering James LeBeau; University of Michigan Professor Michael Thouless; and Brown University Professor Brian W. Sheldon.

Measuring stress

Dendrites have presented a major roadblock to battery development since the 1970s. One reason lithium-ion batteries have become ubiquitous while other approaches have stalled is that their commonly used graphite anodes are less susceptible to dendrite formation. That’s a shame because solid-state batteries that use lithium metal as an anode and a solid electrolyte could theoretically store far more energy in the same-sized package with less weight. They could thus enable longer-lasting phones and laptops, or electric cars with double the range of today’s options.

“There’s no more energy-dense form of lithium than lithium metal,” Chiang says. “But the dendrite problem has limited progress with solid-state batteries.”

Lithium metal is soft like taffy. Fincher, who has been studying the dendrite problem in the labs of Chiang and Carter, says one puzzle is how such a soft material can penetrate into the hard electrolyte materials being explored for use in solid-state batteries.

“The ceramics that have been used in these applications are stiff, like a coffee mug, so it’s been hoped that solid-state batteries would stop this relatively soft dendrite from growing,” Fincher explains.

Believing that mechanical stress causes dendrites, scientists have worked to develop stronger electrolytes that can withstand more mechanical stress. Some researchers have proposed that chemical reactions play a role in dendrite formation, but how those reactions worked with mechanical stress was not known.

For their Nature study, the researchers set out to directly observe mechanical and chemical changes in a commonly used solid-state electrolyte material as dendrites grew. Solid-state batteries are typically organized like a sandwich, which makes it hard to look inside the middle electrolyte layer. For their first experiment, the researchers developed a special solid-state battery cell in which the ceramic layers can be observed from the side, allowing them to watch dendrite growth occurring in the electrolyte.

The researchers also used birefringence microscopy, a measurement technique Fincher developed as part of his PhD thesis, to precisely measure the stress around growing dendrites.

“It works the same way as polarized sunglasses when you look at something like a windshield,” Fincher explains of the technique. “When light comes through, residual stresses in the glass enable light of some orientations to pass faster than others, and that can give rise to observable rainbow patterns. These patterns can be used to measure stress.”

The technique gave the researchers a way to both visualize and quantify stress around actively growing dendrites for the first time, leading to the unexpected findings.
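As a rough quantitative illustration of how fringe patterns translate into stress, classical photoelasticity uses the stress-optic law, which relates the observed fringe order to the principal stress difference. The fringe constant and specimen thickness below are hypothetical values for illustration, not numbers from the paper.

```python
# Stress-optic law from classical photoelasticity:
#   sigma_1 - sigma_2 = N * f_sigma / t
# where N is the observed fringe order, f_sigma the material fringe
# constant (N/m per fringe), and t the specimen thickness.

def principal_stress_difference(fringe_order, fringe_constant, thickness):
    """Principal stress difference implied by an observed fringe order."""
    return fringe_order * fringe_constant / thickness

# Hypothetical example: fringe order N = 2, f_sigma = 7000 N/m per
# fringe, specimen thickness 1 mm:
delta_sigma = principal_stress_difference(2, 7000.0, 1e-3)
print(f"{delta_sigma / 1e6:.1f} MPa")  # 14.0 MPa
```

A lower-than-expected stress at a given growth rate, measured this way, is exactly the signature of embrittlement the team reports.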

“Normally you would expect that the faster a dendrite grows, the more stress it creates,” Chiang says. “Instead, we observed exactly the opposite. The faster it grew, the lower the stress around it, meaning the solid electrolyte is breaking under a lower stress, and therefore it’s been embrittled.”

In fact, the dendrites grew at stress levels far lower than expected. Fincher describes the weakened electrolyte as electrochemically corroded.

“Imagine you test a piece of glass one day, and the next day it’s only a quarter as strong,” Chiang says. “It was very surprising.”

Led by LeBeau, the researchers then cooled the electrolyte to extremely low temperatures and applied a powerful imaging technique called cryogenic scanning transmission electron microscopy that allowed them to study the area around the dendrite on nearly atomic scales. The imaging revealed that the passage of ionic current through the material had caused chemical reactions that made it more brittle.

“The electric current drives the flow of lithium ions through the solid electrolyte,” Chiang explains. “That causes a highly concentrated flow of lithium ions at the dendrite tip. We believe that leads to a chemical reduction of the material compound, which leads to its decomposition into new phases. You start with a crystalline phase of the electrolyte, then there’s a volume contraction after the deposition that is consistent with the embrittlement we see.”

Toward better batteries

The experiment was done on one of the most stable electrolytes used in solid-state batteries, making the researchers confident the findings will carry over to other electrolyte materials.

“This tells us we have to look for electrolyte materials that are even more stable, especially when in contact with lithium metal, which chemically speaking is very reducing,” Chiang says. “This will help direct the search for new materials.”

For instance, Chiang says now that they understand more about the chemical changes causing embrittlement, researchers could explore materials that actually get tougher as cracks grow.

The researchers say it will take more work to figure out what electrochemical reactions are taking place to make the electrolyte so much weaker. But they say their approach for directly observing stresses could also help improve materials for use in devices like fuel cells and electrolyzers.

The work was supported by the Center for Mechano-Chemical Understanding of Solid Ionic Conductors, a U.S. Department of Energy Energy Frontier Research Center; the National Science Foundation; and Fincher’s Department of Defense Science and Engineering Graduate Fellowship, and was carried out using MIT.nano facilities.



from MIT News https://ift.tt/tJEYHUg

Wristband enables wearers to control a robotic hand with their own movements

The next time you’re scrolling your phone, take a moment to appreciate the feat: The seemingly mundane act is possible thanks to the coordination of 34 muscles, 27 joints, and over 100 tendons and ligaments in your hand. Indeed, our hands are the most nimble parts of our bodies. Mimicking their many nuanced gestures has been a longstanding challenge in robotics and virtual reality.

Now, MIT engineers have designed an ultrasound wristband that precisely tracks a wearer’s hand movements in real time. The wristband produces ultrasound images of the wrist’s muscles, tendons, and ligaments as the hand moves, and is paired with an artificial intelligence algorithm that continuously translates the images into the corresponding positions of the five fingers and palm.

The researchers can train the wristband to learn a wearer’s hand motions, which the device can communicate in real time to a robot or a virtual environment.

In demonstrations, the team has shown that a person wearing the wristband can wirelessly control a robotic hand. As the person gestures or points, the robot does the same. In a sort of wireless marionette interaction, the wearer can manipulate the robot to play a simple tune on the piano and shoot a small basketball into a desktop hoop. With the same wristband, a wearer can also manipulate objects on a computer screen, for instance pinching their fingers together to enlarge and minimize a virtual object.

The team is using the wristband to gather hand motion data from many more users with different hand sizes, finger shapes, and gestures. They envision building a large dataset of hand motions that can be plumbed, for instance, to train humanoid robots in dexterity tasks, such as performing certain surgical procedures. The ultrasound band could also be used to grasp, manipulate, and interact with objects in video games, design applications, or other virtual settings.

“We think this work has immediate impact in potentially replacing hand tracking techniques with wearable ultrasound bands in virtual and augmented reality,” says Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also provide huge amounts of training data for dexterous humanoid robots.”

Zhao, Gengxi Lu, and their colleagues present the wristband’s new design in a paper appearing today in Nature Electronics. Their MIT co-authors are former postdocs Xiaoyu Chen, Shucong Li, and Bolei Deng; graduate students SeongHyeon Kim and Dian Li; postdocs Shu Wang and Runze Li; and Anantha Chandrakasan, MIT provost and the Vannevar Bush Professor of Electrical Engineering and Computer Science. Other co-authors are graduate students Yushun Zheng and Junhang Zhang, Baoqiang Liu, Chen Gong, and Professor Qifa Zhou from the University of Southern California.

Seeing strings

There are currently a number of approaches to capturing and mimicking human hand dexterity in robots. Some approaches use cameras to record a person’s hand movements as they manipulate objects or perform tasks. Others involve having a person wear a glove with sensors, which records the person’s hand movements and transmits the data to a receiving robot. But setting up a complex camera system for different applications is impractical, and cameras are prone to visual obstructions. And sensor-laden gloves could limit a person’s natural hand motions and sensations.

A third approach uses the electrical signals from muscles in the wrist or forearm, which scientists then correlate with specific hand movements. Researchers have made significant advances in this approach; however, these signals are easily affected by environmental noise. They are also not sensitive enough to distinguish subtle changes in movement. For instance, they may discern whether a thumb and index finger are pinched together or pulled apart, but not much of the in-between path.

Zhao’s team wondered whether ultrasound imaging might capture more dexterous and continuous hand movements. His group has been developing various forms of ultrasound stickers — miniaturized versions of the transducers used in doctor’s offices that are paired with hydrogel material that can safely stick to skin.

In their new study, the team incorporated the ultrasound sticker design into a wearable wristband to continuously image the muscles and tendons in the wrist.

“The tendons and muscles in your wrist are like strings pulling on puppets, which are your fingers,” Lu says. “So the idea is: Each time you take a picture of the state of the strings, you’ll know the state of the hand.”

Mapping manipulation

The team designed a wristband with an ultrasound sticker that is the size of a smartwatch, and added onboard electronics that are about as small as a cellphone. They attached the wristband to a volunteer’s wrist and confirmed that the device produced clear and continuous images of the wrist as the volunteer moved their fingers in various gestures.

The challenge then was to relate the black and white ultrasound images of the wrist to specific positions of the hand. As it turns out, the fingers and thumb are capable of 22 degrees of freedom, or different ways of extending or angling. The researchers found that they could identify specific regions in their ultrasound images of the wrist that correlate to each of these 22 degrees of freedom. For instance, changes in one region relate to thumb extension, while changes in another region correlate with movements of the index finger.

To establish these connections, a volunteer wearing the wristband would move their hand in various positions while the researchers recorded the gestures with multiple cameras surrounding the volunteer. By matching changes in certain regions of the ultrasound images with hand positions recorded by the cameras, the team could label wrist image regions with the corresponding degree of freedom in the hand. But doing this translation continuously and in real time would be an impossible task for humans.

So, the team turned to artificial intelligence. They used an AI algorithm that can be trained to recognize image patterns and correlate them with specific labels and, in this case, the hand’s various degrees of freedom. The researchers trained the algorithm with ultrasound images that they meticulously labeled, annotating the image regions associated with a specific degree of freedom. They tested the algorithm on a new set of ultrasound images and found it correctly predicted the corresponding hand gestures.
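The image-to-pose mapping can be sketched with a deliberately simplified stand-in for the trained model: a ridge regression from flattened "ultrasound image" features to a 22-dimensional vector of hand degrees of freedom, fit on synthetic data. The paper uses a deep network trained on real labeled images; the image size, the data, and the linear model here are all illustrative assumptions.

```python
import numpy as np

# Synthetic stand-in for the supervised setup described above: each
# "ultrasound image" (flattened to a feature vector) maps to a
# 22-dimensional pose vector, one entry per hand degree of freedom.
rng = np.random.default_rng(0)
n_train, pixels, dofs = 500, 16 * 16, 22

# Invented ground truth: pose is a fixed linear function of the image,
# plus a little label noise (standing in for camera-derived labels).
W_true = rng.normal(size=(pixels, dofs))
X = rng.normal(size=(n_train, pixels))                     # training "images"
Y = X @ W_true + 0.01 * rng.normal(size=(n_train, dofs))   # labeled poses

# Ridge regression fit: W = (X^T X + lam * I)^{-1} X^T Y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(pixels), X.T @ Y)

# Predict the 22-DoF hand pose for a new, unseen image
x_new = rng.normal(size=(1, pixels))
pose = x_new @ W
print(pose.shape)  # (1, 22)
```

The real model replaces the linear map with a deep network so that it can pick up the nonlinear, region-specific image changes the researchers identified, but the training recipe — labeled images in, degree-of-freedom vector out — is the same shape.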

Once the researchers successfully paired the AI algorithm with the wristband, they tested the device on more volunteers. For the new study, eight volunteers with different hand and wrist sizes wore the wristband while they formed various hand gestures and grasps, including making the signs for all 26 letters in American Sign Language. They also held objects such as a tennis ball, a plastic bottle, a pair of scissors, and a pencil. In each case, the wristband precisely tracked and predicted the position of the hand.

To demonstrate potential applications, the team developed a simple computer program that they wirelessly paired with the wristband. As a wearer went through the motions of pinching and grasping, the gestures corresponded to zooming in and out on an object on the computer screen, and virtually moving and manipulating it in a smooth and continuous fashion.

The researchers also tested the wristband as a wireless controller of a simple commercial robotic hand. While wearing the wristband, a volunteer went through the motions of playing a keyboard. The robot in turn mimicked the motions in real time to play a simple tune on a piano. The same robot was also able to mimic a person’s finger taps to play a desktop basketball game.

Zhao is planning to further miniaturize the wristband’s hardware, as well as train the AI software on many more gestures and movements from volunteers with wider ranging hand sizes and shapes. Ultimately, the team is building toward a wearable hand tracker that can be worn by anyone, to wirelessly manipulate humanoid robots or virtual objects with high dexterity.

“We believe this is the most advanced way to track dexterous hand motion, through wearable imaging of the wrist,” Zhao says. “We think these wearable ultrasound bands can provide intuitive and versatile controls for virtual reality and robotic hands.”

This research was supported, in part, by MIT, the U.S. National Institutes of Health, the U.S. National Science Foundation, the U.S. Department of Defense, and Singapore National Research Foundation through the Singapore-MIT Alliance for Research and Technology.



from MIT News https://ift.tt/kJj9ysL