Thursday, April 30, 2026

Beacon Biosignals is mapping the brain during sleep

The human brain remains one of the most fascinating and perplexing mysteries in medicine. Scientists still struggle to match neurological activity with brain function and detect problems early, slowing efforts to treat neurological disorders and other diseases.

Beacon Biosignals is working to make sense of the brain by monitoring its activity while people sleep. The company, which was founded by Jake Donoghue PhD ’19 and former MIT researcher Jarrett Revels, developed a lightweight headband that uses electroencephalogram (EEG) technology to measure brain activity while people enjoy their normal sleep routines at home. Those data are processed by machine-learning algorithms to monitor the effects of novel treatments, find new signs of disease progression, and create patient cohorts for clinical trials.

“There’s a step-change in what becomes possible when you remove the sleep lab and bring clinical-grade EEG into the home,” says Donoghue, who serves as Beacon’s CEO. “It turns sleep from a constrained, facility-based test into a scalable source of high-quality data for diagnostics, drug development, and longitudinal brain health.”

Beacon partners with pharmaceutical companies to accelerate its path to patients. The company’s FDA 510(k)-cleared medical device has already been used in over 40 clinical trials across the globe as part of studies aimed at treating conditions including major depressive disorder, schizophrenia, narcolepsy, idiopathic hypersomnia, Alzheimer’s disease, and Parkinson’s disease.

With each deployment, Beacon learns more about how the brain works — insights it is using to create a “foundation model” of the brain.

“It’s our belief that the dataset that’s going to transform brain health doesn’t exist yet — but we are rapidly creating it,” Donoghue says. “Our platform can characterize the heterogeneity of disease progression, generating dynamic insights that are impossible to fully capture through static modalities like sequencing or imaging. The brain is an electric organ and changes through synaptic plasticity, so tracking brain function across many diseases at scale will allow us to discover novel subgroups of diseases and map them over time.”

Illuminating the brain

Donoghue trained in the Harvard-MIT Program in Health Sciences and Technology, completing his PhD in neuroscience at MIT under the guidance of Earl K. Miller, professor of cognitive sciences, along with the clinical training required for an MD. While in the program, Donoghue trained at Massachusetts General Hospital and Boston Children’s Hospital, where he helped care for patients, including in oncology, during the rise of genomic sequencing to guide precision cancer therapies. He later worked in neurology and psychiatry, where care often relied on more iterative approaches — highlighting an opportunity to bring similarly data-driven precision to brain health.

“What struck me most was the inability to measure brain function in the ways that cardiologists can longitudinally monitor cardiac function in patients from home,” Donoghue says. “At MIT, I built this conviction that processing a lot of brain data and working to correlate that with brain function would be transformative to how these neurological diseases are identified and treated.”

Toward the end of his training, Donoghue began developing his ideas further, engaging with mentors including HST and Harvard Medical School professors Sydney Cash and Brandon Westover. He had met Revels, who was working as a research software engineer in MIT’s Julia Lab, during his PhD, and convinced him to co-found Beacon with him in 2019.

“We decided building a business to understand the organ of interest — the brain — would be a great start to understanding heterogeneous neuropsychiatric diseases and building better treatments,” Donoghue recalls.

Beacon began as a computation and analytics company, building wearable devices to expand its clinical impact and reach. From its early days, Beacon has partnered with large pharmaceutical companies running clinical trials, offering a less invasive way to monitor brain activity and learn how drugs affect the brain as well as how patients sleep.

“It was clear sleep was the right window to understand the brain,” Donoghue says. “Neural activity during sleep can be an order of magnitude higher and more structured, almost like a language. It’s a great surface area for understanding brain function and how different drugs affect the brain.”

Donoghue says Beacon’s devices can collect lab-grade data on each patient over multiple consecutive nights, resulting in higher-quality assessments. The company uses machine learning to extract insights, such as the time patients spend in different sleep stages and the number of small awakenings that occur throughout the night. It can also detect subtle changes in sleep architecture that might precede cognitive decline.

“We’re starting to take features of sleep activity and link them to outcomes in a way that’s never been done with this level of precision,” Donoghue says.

To date, Beacon has taken part in clinical trials for sleep and psychiatric disorders as well as neurodegenerative diseases, where sleep changes can emerge years before the presentation of symptoms.

“We do a lot of work in areas like Alzheimer’s disease and Parkinson’s, which affected my grandfather,” Donoghue says. “We’re analyzing features of rapid-eye-movement and slow-wave sleep to detect early changes that precede clinical symptoms. It’s an opportunity to move these diseases from late recognition to much earlier, data-driven detection.”

Improving brain treatments for millions

Last year, Beacon acquired an at-home sleep apnea testing company that serves more than 100,000 patients each year across the U.S., accelerating access to high-quality, comprehensive testing in the home and expanding the reach of its platform. Then in November, the company raised $97 million to accelerate that expansion.

“The vision has always been to reach patients and help people at scale,” Donoghue says. “What’s powerful is that we’re building a longitudinal record of brain function over time. A patient might come in for sleep apnea screening, but if they develop Parkinson’s years later, that earlier data becomes a window into the disease before symptoms emerged. That turns routine testing into a foundation for entirely new prognostic biomarkers — and a path to detecting and intervening in brain disease earlier, potentially before symptoms ever begin.”



From MIT News https://ift.tt/hfvSsJt

A materials scientist’s playground

Scientists and engineers around the world are working to improve quantum bits, or qubits, the minuscule building blocks of the quantum computer. Qubits are incredibly sensitive, which makes them prone to errors and lowers device yield. But a new cluster tool at MIT.nano introduces capabilities that will allow researchers to keep advancing qubit performance.

Passersby outside MIT.nano may have recently noticed a complex-looking piece of equipment being installed in the first-floor cleanroom. What looks like a sci-fi movie prop is actually a state-of-the-art, custom-built molecular beam epitaxy (MBE) system: a physical vapor deposition tool that operates under ultra-high vacuum to produce high-quality thin films. With the ability to grow different crystalline materials on a wafer, the tool will support quantum researchers and materials scientists by allowing them to study how film growth affects the properties of the materials used in making qubits.

“To realize the full promise of quantum computing, we need to build qubits that are robust, reproducible, and extensible,” says William D. Oliver, the Henry Ellis Warren (1894) Professor of Electrical Engineering and Computer Science and professor of physics at MIT. “To date, most of the improvements to superconducting qubit performance are traceable to circuit design — essentially, designing qubit circuits that are less sensitive to their environmental noise. However, those improvements have largely run their course. Going forward, we need to address the fundamental materials science and fabrication engineering required to reduce the sources of environmental noise. This multi-chamber, cassette-loaded, 200-millimeter wafer MBE system is exactly the right tool at the right time. And there’s no place better to do this research than at MIT.nano.”

That is because MIT.nano is well prepared to receive this type of system, with the physical space, climate controls, policies and procedures for researchers, and expert staff needed to manage the lab. Through an equipment support plan, Oliver’s Engineering Quantum Systems (EQuS) group is able to install and run the tool inside MIT.nano in a high-performance, safe, and reliable environment.

A controlled environment is essential for the MBE. “Think of this system like an inverted International Space Station (ISS),” explains Patrick Strohbeen, research scientist in the EQuS group. “The ISS is a small chamber of atmosphere surrounded by the vacuum of space. This MBE system is a chamber of space-level vacuum surrounded by atmosphere.” That chamber is held at a steady minus 90 degrees Celsius, which enables precise growth of thin films on an atomic scale. It is the largest single deposition chamber (1 meter in diameter) that the manufacturer, DCA, has sold in the United States.

The journey of a wafer

The system, which in total takes up 600 square feet, is made up of six chambers. First is the load lock, where the wafer is placed into the system and brought down from atmospheric pressure to near the vacuum level of space. Then, the wafer enters the distribution center. This space acts like a central hub, transferring the wafers to other chambers. Next is the deposition, or “growth,” chamber. This is where the system’s primary function takes place — depositing materials, specifically atoms of superconducting metal, onto a substrate, typically silicon. From there, it moves to the oxidation chamber, which facilitates the growth of key ceramic materials for qubits. A fifth storage chamber can hold an additional 10 wafers within the vacuum.

A unique aspect of this system is its sixth chamber, designed for X-ray photoelectron spectroscopy (XPS). In this chamber, researchers direct X-ray photons at the wafer’s surface; the photons excite electrons inside the material, which are ejected and picked up by a sensor that tells the researcher about the chemical environment each electron came from. As individual layers of atoms are put down in the growth chamber, scientists can move the wafer to the XPS chamber to measure changes in the material structure of the film and back again, all while keeping it inside the vacuum.

Why is this important? “The quantum community has excellent device physicists and device engineers,” says Strohbeen. “The last piece of the puzzle is: We need to understand the materials platform that we’re using for these devices.” The buried interfaces have so far been understudied because of the difficulty of probing them, he explains.

For those of us who are not MBE experts, think of the snow that fell in Massachusetts this winter. How can you tell how much ice is on the pavement without removing all of the snow on top of it? And without changing the natural setting where the snow, ice, and pavement meet? With this system, specifically the XPS chamber, scientists can study the interfaces of buried materials without disturbing the physical or chemical environments. “It is a materials scientist’s playground,” jokes Strohbeen — a controlled space where researchers can learn about and explore materials’ interactions within layers of atoms.

Why MIT.nano?

When Oliver, who is also the director of the MIT Center for Quantum Engineering, secured the MBE Quantum, the next question was where to put it. Enter MIT.nano. Housing 45,000 square feet of cleanroom, this facility exists at MIT to support complex, sensitive equipment with both the infrastructure and the staff needed to maintain it.

“MIT.nano’s ultra-stable building utilities and lab environment are exactly what is needed to support a system that demands extreme repeatability and purity,” says Nick Menounos, MIT.nano associate director of infrastructure. “The success of this installation grew from the early collaboration. Professor Oliver engaged the MIT.nano team in the procurement process almost two years in advance. That foresight, combined with the infrastructure momentum we gained from the recent CHIPS Act project, meant that we could prepare the cleanroom perfectly. We compressed the installation process that normally takes several months and had this extraordinary machine running in under three weeks.”

“From the very beginning, the MIT.nano staff were helpful, knowledgeable, and willing to go above and beyond to make this happen,” says Oliver. “While the MIT.nano facility is certainly an infrastructural crown jewel at MIT, it’s the MIT.nano staff who make it the national treasure it is today.”

Positioning the MBE Quantum in the cleanroom helps the team focus on scalability and device yield. Humidity and particle count, two things carefully measured and maintained at MIT.nano, can affect the output of the device. Minimizing as many variables as possible is key to improving qubit performance. The cleanroom also allows for new device research because an array of fabrication and metrology tools are available without having to leave the clean environment.

“We’re really excited to see what we can do with it,” says Strohbeen. “We bought it as a materials science tool, and it will also be a device development tool due to the flexibility of having it in the cleanroom.”

The MBE system was purchased through a combination of grants from the Army Research Office (ARO) and the Laboratory for Physical Sciences (LPS). The ARO award, a Defense University Research Instrumentation Program grant, is ARO’s premier mechanism for funding large capital equipment purchases expected to prove disruptive in technologically relevant areas. It arrives at an important time on campus, as one of MIT’s strategic initiatives — the MIT Quantum Initiative — aims to apply quantum breakthroughs to the most consequential challenges in science, technology, industry, and national security.



From MIT News https://ift.tt/UHNIgds

Making the case for curiosity-driven science

“The thing that really struck me when I came to MIT and strikes me every single day is the stuff that’s going on here is amazing. The science, the engineering… every day I hear something that makes my jaw drop,” remarked President Sally Kornbluth during a live discussion with Lizzie O’Leary of Slate’s “What Next: TBD” podcast.

Kornbluth spoke about everything from the importance of curiosity-driven science and why basic science is critical to our nation’s future, to AI and education, and even bravely joined O’Leary in a rendition of the Williams College song, “The Mountains,” in honor of their shared alma mater.

“We are in this time of incredible uncertainty,” said Kornbluth of the current state of higher education and funding for scientific research. “What we are trying to do is keep the science robust.”

Harking back to her time at Duke and her love of college basketball, she likened addressing skepticism about higher education in Washington, D.C., to a combination of zone coverage and man-to-man defense. She emphasized: “As one of the top institutions in the world it’s part of our responsibility to articulate the importance of science. Behind the scenes, I am – along with many other [university] presidents – I am in D.C. all the time now. I want to speak to Congressmen and women, Senators, people in the executive branch to explain the importance of what we are doing.”

Kornbluth emphasized that the pipeline of basic science flowing from U.S. universities is a critical asset for the country, cautioning that continuing to strain that pipeline could have enormous negative ramifications for the U.S. down the line.

“If you think about research done in this country, it’s done in universities, it’s done in national labs, and it’s done in industry,” said Kornbluth. Universities are where most of the science with a long pathway to impact, the kind that requires patience, gets its start. She pointed to immunotherapy for cancer, which began 30 to 40 years ago in basic immunology research, as an example. With that pipeline being drained, what does the future hold for new cancer therapies or new AI and quantum technologies?

Kornbluth also underscored that uncertainty and lost funding are having a “huge impact on the talent pipeline,” delving into the unique role universities play in training graduate students, who are the next generation of scientific researchers. “We hear, ‘Oh it would be okay if research was more in industry.’ I say, ‘Would you fly on a plane with a pilot who had never flown?’ How do they think people learn how to do research? We are training the next generation… and we are losing funding for them.” She added: “I think we are going to see reverberations for many decades if we don’t rectify that issue.”

When asked how she and her colleagues are working to keep research moving forward, Kornbluth explained that at MIT, “we have tried to find alternative ways to elevate the science. We have a series of presidential initiatives that cut across the whole campus in things like health and life sciences, quantum, humanities and social sciences. The notion is that we are trying to create new opportunities.”

Still, she acknowledged that losses from the endowment tax and diminished federal funding are painful. “There are only four schools right now that are subject to the 8% endowment tax, which is a tax on our earnings. For us, that means $240 million dollars a year plus other losses in grants. So, let’s say the whole thing is, we budgeted for a loss of $300 million a year on a $1.7 billion budget… That has definitely had an impact on us. No question about it. 

“The other thing about it is again there’s all this uncertainty. Our investigators are writing a ton of grants. They don’t know if they’re going off into the void or they really have the sort of competitive opportunities they’ve always had in the past.”

Asked why universities did not see this moment coming, Kornbluth offered a few thoughts. “Look at MIT – 30,000 companies have come from MIT. When you look at something like that, why would you think any government that wants economic flourishing in their country would come after MIT?” she reflected. “It just never would have occurred to us.”

Turning towards the rapid advances in AI, and how the field is impacting education, Kornbluth noted that at MIT and other universities, “we have to focus on the human element, we have to educate our students, they need to know how to write and do mathematics…they have to view AI as a tool to augment their capabilities. That is how we are thinking about it.”

In the course of the conversation, Kornbluth also expressed her unwavering support for international students, noting that most want the opportunity to stay and contribute to research in the U.S. after graduation. “The talent brought to us through our international community is unbelievable. We can attract the very best in the world. You can bet when they talk about competitiveness with China, for example, in AI, quantum, etc., they are not sitting around in China saying, ‘Oh it’s great America is taking all our students.’ They’re thinking, ‘It’s great that America doesn’t want to take as many of our students anymore because we can train them.’ It’s a competitive issue that we really should lean into.”



From MIT News https://ift.tt/AvKkgIJ

Wednesday, April 29, 2026

An engineer’s guide to birds

Feathers give birds their dazzling colors. They repel water and trap heat, keeping them warm and dry. They can even stifle sound, allowing species such as owls to hunt in virtual silence.

All of these functions come from the remarkable structure of feathers, explored in two chapters in “Birds Up Close,” by MIT materials engineer and lifelong birder Lorna J. Gibson. The book takes a microscopic look at birds’ feathers, bones, bills, eggs, and the mechanics of flight to explain their extraordinary abilities — how they can hover in place, silently swoop down on prey, and fly hundreds of miles without tiring.

Gibson spent four decades studying the mechanical behavior of materials — examining their underlying structure to determine what makes them hard or soft, supple or brittle. She specialized in cellular materials, such as engineered honeycombs and foams, as well as natural ones such as wood and bamboo.

Now a post-tenure professor, she’s turned her materials engineering perspective to birds, a subject that has long fascinated her. She’s given talks on the properties of feathers, including the Department of Materials Science and Engineering’s Wulff Lecture in 2017, and studied how sandgrouse carry water in their feathers to their young and how woodpeckers avoid brain damage despite their constant battering.

As a graduate student, she recalls, a colleague told her that woodpeckers have foam between their skulls and brains to cushion the blows of pecking. Intrigued, she dug into the topic and discovered a 1976 study in which neurologists dissected a woodpecker’s head and found no foam at all. So how do woodpeckers avoid brain injury?

“Eventually I recognized that because woodpeckers have such tiny brains, they don’t need the kind of protection that larger animals would need,” Gibson writes in the preface. That understanding led to talks for birders — and eventually to the idea for a whole book explaining how birds work through the lens of materials science and mechanics.

Engineering meets birding

Gibson describes “Birds Up Close,” published by the MIT Press, as a book by an engineer for anyone interested in birds, drawing largely on published research. Readers need no scientific or engineering background; sidebars include calculations for those who want more detail.

“I wasn’t writing it for engineers; I was writing it for birders — people who are curious about natural history,” she says. “I think engineers will enjoy it because there are engineering pieces to it, but I really wrote it for birders.”

Birding has surged in popularity in the United States; 96 million people — about one in three Americans — consider themselves birders, according to the U.S. Fish and Wildlife Service.

Those readers will find no shortage of memorable facts. Two chapters on feathers, titled “Fantastic Feathers,” explore striking features such as the wood duck’s brilliant colors and hummingbirds’ iridescence. Gibson explains both the science behind feather colors and how we perceive them.

“The color we see comes from light that is reflected from a surface,” she writes, describing the pigments responsible for the blacks and grays of seagulls as well as the vibrant reds and greens of the African turaco. But the blue jay in your backyard gets its color another way: blue is not a pigment at all, but a structural color, produced by the interaction of light with microscopic structures within the feathers.

Using photography, sketches, and microscope images, Gibson examines the microscopic structures of feathers. The contour feathers covering a bird’s body, for example, are branched structures with connecting barbs and parallel barbules. Scanning electron microscope images reveal details invisible to the naked eye, including the foamy core of a feather shaft.

She uses the same approach to explore how male hummingbirds produce high-pitched buzzing sounds with their tail feathers during dramatic courtship dives. The sound comes from the fluttering edges of the outermost tail feathers — like blowing across a blade of grass.

Gibson also shows how barn owl feathers enable stealthy flight. Comb-like serrations on the wing break up airflow and reduce noise.

Richard Prum, an ornithologist at Yale University who contributed images to the book, says Gibson’s engineering perspective deepens how we think about birds and evolution. Prum, author of “The Evolution of Beauty,” notes that her approach helps explain not just how birds survive, but how their unique features evolve and function.

“The public has absorbed generations of statements about survival and adaptation and ecology,” Prum says. “But that really sweeps under the rug: How do birds do it?”

Each chapter focuses on a different feature and can be read independently — readers can skip to the second feathers chapter to learn how water literally rolls off a duck’s back, or to the bills chapter to explore the bristles on a woodpecker’s tongue that help it capture insects inside a tree.

The final chapters focus on flight — how raptors soar thousands of feet and glide effortlessly, and how geese gain energy efficiency flying in V formations. Gibson is frank: These chapters are more technical, focusing on forces like lift and drag. “The reward is that you’ll learn some of the secrets of bird flight,” she writes.

The human side of the science

It’s not just natural history and science that fill “Birds Up Close.” In her preface, Gibson recalls childhood walks along the Niagara River in Ontario and a summer trip watching breeding colonies of puffins and guillemots in the Farne Islands in her parents’ native England.

In the epilogue, she reflects on writing an earlier version of the book while her wife, Jeannie, faced an aggressive brain cancer in 2019 and died a year later, as the world shut down amid the Covid-19 pandemic. Unable to visit friends and family in Canada — or host them in Boston — Gibson found solace exploring local green spaces, such as Jamaica Pond and the Arnold Arboretum.

On difficult days, “I would be out on a walk, spot something — the kingfisher cackling from its perch on a branch overhanging Leverett Pond, or a wood duck paddling on Jamaica Pond or a hawk circling overhead — and stop in awe and think: Oh, wow, I love seeing that. And for that moment, the grief would disappear.”

Returning to the manuscript, she noticed it was missing something critical. “I had all the science there, but I felt it was too much like a textbook,” she says. She consulted friends, colleagues, and an editor who helped turn “this textbook-y thing into something people would enjoy reading.”

The mix of the scientific and personal stood out to Scott Edwards, professor of organismic and evolutionary biology at Harvard University.

“That’s what science is,” Edwards says. “Science is done by humans. It’s not like we can morph into some ultra-objective person when we’re being a scientist. We bring to science our whole selves.”

He also praises the clear writing and illustrations, which “cuts through all the noise and gets right to the core of the message.” He plans to use the book in his class on birds at Harvard.

“Birds Up Close” goes on sale May 5. Gibson is scheduled to discuss the work at the MIT Museum on May 6. She will also appear at other events; a full list is available on the book’s website. She reflects on the book’s reach:

“Part of it was my own sense of awe and wonder. I couldn’t believe the things that I found out about birds,” Gibson says. “I think a lot of birders are into what’s called listing — seeing lots of species and keeping track of how many different species of birds they see. That’s great, if that’s what you want to do. But this book is really a different way of looking at birds.”



From MIT News https://ift.tt/YwRyWoG

A month in Panama: Rethinking what real estate development can be

Cherry Tang, a master of science in real estate development student at the MIT Center for Real Estate, recently participated in an experiential learning opportunity in Panama working with Conservatorio, a development firm based in Casco Viejo. What began as a modeling exercise quickly became a deeper exploration of how development, community, and environment intersect, shaped as much by people and culture as by the work itself.

“I went in expecting to build a financial model. I didn’t expect that the experience would fundamentally reshape how I think about development,” Tang reflects.

The project centered on Santa Catalina, a remote surf town on Panama’s Pacific coast. The development comprises approximately 140 residential units across condos, villas, and homes, along with vacant lots, four retail spaces, a surf school with a stadium, and a restaurant with a pool — all envisioned as the town’s first true center.

At first glance, Tang says, Santa Catalina didn’t resemble a typical “prime” development market. It had limited infrastructure, low density, and no established core.

“What it does have is something powerful: world-class surf and access to Coiba National Park, a premier diving destination,” Tang says. “Here, the ocean becomes the anchor tenant.”

The project is designed as an open, walkable master-planned community that integrates seamlessly with the existing town. Anchored by surfing and diving, it introduces a diverse product mix and a 600-meter linear park, positioning it as the future heart of Santa Catalina and a differentiated alternative to both local developments and traditional resort-style communities.

Tang saw this as a different vision of place-making. “It wasn’t about building a resort. It was about building a center of gravity for a community that has never really had one.”

Tang’s primary role was to build the project’s financial model from the ground up. The capital structure, with land contributed as equity and sales deposits used to fund construction, required a different way of thinking than the institutional frameworks she had used in previous roles in Toronto and Boston.

“It was more than a technical exercise,” she explains. “It reinforced how financial, physical, and strategic decisions are deeply interconnected, and how thoughtful structuring can unlock projects that might otherwise not be feasible.”

Working directly with KC Hardin, founder and CEO of Conservatorio, and the broader leadership team, Tang gained firsthand exposure to real-time development decision-making. She presented her financial model to leadership and prospective investors, and her assumptions helped shape conversations around phasing, design, and construction.

“Development is a feedback loop between underwriting and the built environment,” Tang says.

Throughout the month, Tang and her colleagues met with a range of people shaping the project’s future. They spent time with local developers and brokers, learning about infrastructure improvements and ongoing real estate activity in the region. 

Tang described meeting one family with long-standing ties to the area as one of the more memorable moments.

“Their coastline conservation work in Panama is deeply inspiring,” she says.

They also met with scientists from the Smithsonian Tropical Research Institute, trekking through mangroves and learning about coastal ecosystems and the long-term environmental implications of development.

“It was a vivid reminder that development decisions don’t exist in isolation,” says Tang.

Outside of work, Panama had its way of leaving an impression. Sailing through the Panama Canal ... watching cargo ships pass through landscapes filled with monkeys and sloths ... living in Casco Viejo — each added another layer to the experience for Tang. The neighborhood itself served as a real-life case study in thoughtful, community-oriented development.

“What stayed with me most was Conservatorio’s approach to revitalization, not through displacement, but through deep engagement, trust-building, and creating pathways for local residents to be part of the area’s transformation.”

That same spirit was reflected in everyday moments, such as co-workers who went out of their way to make interns feel welcome.

“Strangers greeted us like neighbors,” says Tang. “The level of warmth and hospitality defined the experience as much as the work itself.”

By the end of the month, the experience had left her with more than technical skills; it had also shifted her perspective.

“I began to see development less as a formula and more as a system,” she explains. “One that sits at the intersection of finance, design, environment, and community.”

Her takeaway is that value can be created in unconventional ways, and leadership in real estate is grounded in trust, curiosity, and a deep respect for place.

Tang arrived in Panama to build a model. She left with a deeper understanding of what it means to build thoughtfully — as a developer, and as a steward of place.



from MIT News https://ift.tt/hvQVg6X

The MIT-IBM Computing Research Lab launches to shape the future of AI and quantum computing

The following is a joint announcement by the MIT Schwarzman College of Computing and IBM.

IBM and MIT today announced the launch of the MIT-IBM Computing Research Lab, advancing their long-standing collaboration to shape the next era of computing. The new lab expands the collaboration’s scope to include quantum computing alongside foundational artificial intelligence research, with the goal of unlocking new computational approaches that go beyond the limits of today’s classical systems.

The MIT-IBM Computing Research Lab builds on a distinguished history of scientific excellence at the intersection of research and academia. Evolving from the MIT-IBM Watson AI Lab, which originated in 2017 on MIT’s campus, the new lab reflects a transformed technology landscape — one in which AI has entered mainstream deployment and quantum computing is rapidly advancing toward practical impact. Together, MIT and IBM aim to help lead research in AI and quantum and to redefine mathematical foundations across both domains.

“We expect the MIT-IBM Computing Research Lab to emerge as one of the world’s premier academic and industrial hubs accelerating the future of computing,” says Jay Gambetta, director of IBM Research and IBM Fellow, and IBM chair of the MIT-IBM Computing Research Lab. “Together, the brightest minds at MIT and IBM will rethink how models, algorithms, and systems are designed for an era that will be defined by the sum of what’s possible when AI and quantum computing come together.”

“For a decade, the collaboration between MIT and IBM has produced leading-edge research and innovation, provided mentorship, and supported the professional growth of researchers at both MIT and IBM,” says Anantha Chandrakasan, MIT’s provost, who, as then-dean of the School of Engineering, spearheaded the creation of the MIT-IBM Watson AI Lab and will continue as MIT chair of the lab. “The incredible technical achievements set the bar high for our work together over the next 10 years. I look forward to another decade of impact.”

Addressing the next frontiers in computation

The MIT-IBM Computing Research Lab will serve as a focal point for joint research between MIT and IBM in AI, algorithms, and quantum computing, as well as the integration of these technologies into hybrid computing systems. The lab is designed to accelerate progress toward powerful new computational approaches that take advantage of rapid advances in AI and quantum-centric supercomputing, including those that combine maturing quantum hardware with classical systems and advanced AI methods.

The lab’s research agenda includes improving AI capabilities and integrating AI with traditional computing, alongside pursuing advances in small, efficient, modular language-model architectures, novel AI computing paradigms, and enterprise-focused AI systems designed for deployment in real-world environments, where reliability, transparency, and trust are essential.

In parallel, the lab will rethink the mathematical and algorithmic foundations that underpin the next era of computing by accelerating the development of novel quantum algorithms for complex problems, with impacts in areas such as materials science, chemistry, and biology.

Additionally, the lab will investigate mathematical and algorithmic foundations of machine learning, optimization, Hamiltonian simulations, and partial differential equations, which are used to approximate the behaviors of dynamical systems that currently stump classical systems beyond limited scales and accuracy. Innovations from the lab could have wide implications for global industries, from more accurate weather and air turbulence prediction to better forecasts of financial market performance. Similarly, with improved optimization approaches, research from the lab could help lower risks in areas like finance, predict protein structures for more targeted medicine, and streamline global supply chains.

With its focus on AI, algorithms, and quantum, the MIT-IBM Computing Research Lab will complement and enhance the work of two of MIT’s strategic initiatives, the MIT Generative AI Impact Consortium and the MIT Quantum Initiative. MIT President Sally Kornbluth launched these strategic initiatives to broaden and deepen MIT’s impact in developing solutions to serious global challenges. The MIT-IBM Computing Research Lab will also leverage IBM’s longtime leadership and expertise in quantum computing. As part of its ambitious roadmap, IBM has laid out a clear path to delivering the world’s first fault-tolerant quantum computer by 2029, and is working across industries to drive value from quantum-centric supercomputing, tightly integrating quantum computers with high-performance computing and AI accelerators to solve the world’s toughest problems.

Deep integration with scientific domains

The MIT-IBM Computing Research Lab will also continue to serve as a foundation for training the next generation of computational scientists and innovators. It will do so by engaging faculty and students across MIT departments, enabling new computational approaches to accelerate discoveries in the physical and life sciences.

The lab will continue to be co-directed by Aude Oliva, senior research scientist at MIT’s Computer Science and Artificial Intelligence Laboratory, and David Cox, vice president of AI Foundations at IBM Research. MIT and IBM have appointed leads for each of the lab’s three focus areas — AI, algorithms, and quantum. Jacob Andreas, associate professor in the Department of Electrical Engineering and Computer Science (EECS), and Kenney Ng, principal research scientist at IBM Research and the MIT-IBM science program manager, will co-lead AI; Vinod Vaikuntanathan, the Ford Foundation Professor of Engineering in EECS, and Vasileios Kalantzis, IBM Research senior research scientist, will co-lead algorithms; and Aram Harrow, professor of physics, and Hanhee Paik, IBM director of Quantum Algorithm Centers, will co-lead quantum.

“The MIT-IBM Computing Research Lab reflects an important expansion of the collaboration between MIT and IBM and the increasing connections across AI, algorithms, and quantum. This deepened focus also underscores a strong alignment with the MIT Schwarzman College of Computing’s mission to advance the forefront of computing and its integration across disciplines,” says Dan Huttenlocher, dean of the MIT Schwarzman College of Computing and MIT co-chair of the lab. “I’m excited about what this next chapter will enable in these three areas, and their impact broadly.”

Building on nearly a decade of collaboration

The MIT-IBM Watson AI Lab helped pioneer a model for academic-industry research collaboration, aligning long-term scientific inquiry with real-world impact. Since its inception, the lab has funded over 210 research projects involving over 150 MIT faculty members and over 200 IBM researchers. Collectively, the projects have led to over 1,500 peer-reviewed articles. The lab also helped shape the career growth of a number of MIT students and junior researchers, funding more than 500 students and postdocs.

“The true measure of this lab is not just innovation, but transformation of a field. Hundreds of students have contributed to thousands of publications in top conferences and journals, demonstrating their capabilities to address meaningful problems,” says Oliva. “The MIT-IBM Computing Research Lab builds on an extraordinary legacy of impact to advance a trusted collaboration that will redefine the future of AI and quantum computing in a way never seen before.”

“By coupling academic rigor with industrial scale, the lab aims to define the computational foundations that will power the next generation of AI, quantum, and scientific breakthroughs,” says Cox. “By bringing together advances in AI, algorithms, and quantum computing under one integrated research effort, we’re creating the conditions to rethink the mathematical and computational foundations of science and engineering.”

The MIT-IBM Computing Research Lab will capitalize on this foundation, expanding both the scientific scope and the ecosystem of collaborators across the Cambridge-Boston region and beyond.



from MIT News https://ift.tt/9cKzX1x

MIT engineers’ virtual violin produces realistic sounds

There is no question that violin-making is an art form. It requires a musician’s ear, a craftsperson’s skill, and an historian’s appreciation of lessons learned over time. Making a violin also takes trust: Violin makers, or luthiers, often must wait until the instrument is finished before they can hear how all their hard work will sound.

But a new tool developed by MIT engineers could help luthiers play around with a violin’s design and tweak its sound even before a single part is carved.

In a study appearing today in the journal npj Acoustics, the MIT team reports on a new “computational violin” — a computer simulation that captures the detailed physics of the instrument and realistically produces the sound of a violin when its strings are plucked.

While there are software programs and plug-ins that enable users to play around with virtual violins, their sounds are typically the result of sampling and averaging over thousands of notes played by actual violins.

In contrast, the new computational violin takes a physics-based approach: It produces sound based on the way the instrument, including its vibrating strings, physically interacts with the surrounding air.

As a demonstration, the researchers used the computational violin to play two short excerpts: one from Bach’s “Fugue in G Minor,” and another from “Daisy Bell” — a nod to the first song that was ever produced by a computer-synthesized voice.

The computational violin currently simulates the sound of plucked strings — a type of playing that musicians know as “pizzicato.” Violin bowing, the researchers say, is a much more complicated interaction to model. However, the computational violin represents the first physics-based foundation of a plucked violin sound that could one day be paired with a model of bowing to produce realistic, bowed violin music.

For now, the team says the new virtual violin could be used in the initial stages of violin design. Luthiers can tweak certain parameters such as a violin’s wood type or the thickness of its body, and then listen to the sound that the instrument would make in response.

“These days, people try to improve designs little by little by building a violin, comparing the sound, then making a change to the next instrument,” says Yuming Liu, senior research scientist at MIT. “It’s very slow and expensive. Now they can make a change virtually and see what the sound would be.”

“We’re not saying that we can reproduce the artisan’s magic,” adds Nicholas Makris, professor of mechanical engineering at MIT. “We’re just trying to understand the physics of violin sound, and perhaps help luthiers in the design process.”

Makris and Liu’s MIT co-authors include Arun Krishnadas PhD ’23 and former postdoc Bryce Campbell, along with Roman Barnas of the North Bennet Street School.

Sound matrix

The quality of a violin’s sound is determined by its dimensions and design. The instrument is made from thoughtfully crafted parts and materials that all work to generate and amplify sound. In recent years, scientists have sought to understand what artisans have intuited for centuries, in terms of what specific parameters shape a violin’s sound.

In one early effort in 2006, scientists, as part of the Strad3D project, put a rare Stradivarius violin through a CT scanner. The violin was crafted in 1715 by the master violinmaker Antonio Stradivari, during what is considered the “Golden Age” of violin making. To better understand the violin’s anatomy and its relation to sound, the scientists scanned the instrument and produced 600 “slices,” or views, of the violin.

The CT scans are available online for people to view and use as data for their own experiments. For their study, Makris and his colleagues first imported the CT scans into a solid modeling software program to generate a detailed three-dimensional model of the violin. They then ran a finite element simulation, essentially dividing the violin into millions of tiny individual cubes, or “elements.”

For each cube, they noted its material type, such as if a cube from the violin’s back plate is made from maple or spruce, or if a string is made from steel or natural fibers. They then applied physics-based equations of stress and motion to predict how each material element would move in relation to every other element across the instrument.
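As an illustrative aside (not taken from the study), the reason material assignment matters is that a material’s properties set how fast vibrations travel through it. For an idealized string, this collapses to one formula, c = sqrt(tension / linear density); a minimal sketch, using rough, assumed values for a steel violin E string:

```python
import math

# Illustrative only: the team's model applies full 3D stress-and-motion
# equations per element. For an ideal string, the material parameters
# reduce to a single transverse wave speed, c = sqrt(T / mu).
def string_wave_speed(tension_newtons, linear_density_kg_per_m):
    return math.sqrt(tension_newtons / linear_density_kg_per_m)

# Rough, assumed values for a steel E string (not from the study):
c = string_wave_speed(tension_newtons=80.0, linear_density_kg_per_m=4e-4)

# A fixed string of length L vibrates at a fundamental of c / (2L).
f_fundamental = c / (2 * 0.33)
```

With these assumed numbers the fundamental lands in the high 600s of hertz, in the neighborhood of the E string’s 659 Hz — which is why tension, density, and dimensions are exactly the knobs such a simulation exposes.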

They also carried out a similar process for the air surrounding the violin, dividing up a roughly cubic-meter volume of air and applying acoustic wave equations to predict how each tiny parcel of air would move and contribute to generating sound.

“The entire thing is a matrix of millions of individual elements,” explains Krishnadas. “And ultimately, you see this whole three-dimensional being, which is the violin and the air all connected and interacting with each other.”

A plucky model

The team then simulated how the new computational violin would sound when plucked. When a violinist plucks a string, they pull the string sideways and let it go, causing the string to vibrate. These vibrations travel across the instrument and inside it; the air’s vibrations are amplified as they travel out of the violin and into the surroundings, where a listener hears the vibrations as sound.

For their purposes, the engineers simulated a simple string pluck by directing one of the virtual violin’s strings to stretch out and then rebound. The simulation computed all the resulting motions and vibrations of the millions of elements in the violin, and the sound that the pluck would produce.

For notes that require pressing down on a violin’s fingerboard, they simulated the same plucking, and in addition, included a condition in which the string is held fixed in the section of the fingerboard where a violinist’s finger would press down.
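To make the idea concrete, here is a minimal, self-contained sketch of the same kind of physics-based pluck in one dimension: an ideal string with fixed ends, released from a triangular initial displacement and stepped forward with a finite-difference solver for the wave equation. All parameter values are assumed for illustration; the team’s actual model is a full 3D finite element simulation of the instrument coupled to the surrounding air.

```python
import numpy as np

def pluck_string(length=0.33, wave_speed=280.0, pluck_pos=0.25,
                 pluck_amp=1e-3, n_points=200, duration=0.01):
    """Finite-difference simulation of an ideal plucked string
    (1D wave equation, both ends fixed). Illustrative sketch only;
    all parameter values are assumed, not from the study."""
    dx = length / (n_points - 1)
    dt = 0.5 * dx / wave_speed          # CFL-stable time step
    r2 = (wave_speed * dt / dx) ** 2

    # Triangular initial shape: string pulled sideways at pluck_pos.
    x = np.linspace(0.0, length, n_points)
    peak = pluck_pos * length
    u = np.where(x <= peak,
                 pluck_amp * x / peak,
                 pluck_amp * (length - x) / (length - peak))
    u_prev = u.copy()                   # released from rest

    n_steps = int(duration / dt)
    bridge_signal = np.empty(n_steps)   # displacement near one end
    for n in range(n_steps):
        # Leapfrog update of the interior; endpoints stay clamped at 0.
        u_next = np.zeros_like(u)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next
        bridge_signal[n] = u[-2]
    return bridge_signal, 1.0 / dt      # waveform and its sample rate
```

Resampling `bridge_signal` to an audio rate would yield a simple plucked tone; the mix of harmonics depends on `pluck_pos`, one of the playing subtleties Makris describes. A fingered note corresponds to clamping an interior point to zero, shortening the vibrating length.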

The researchers carried out this computational process to virtually pluck out the notes in several measures of “Daisy Bell” and Bach’s “Fugue in G Minor.”

“If there’s anything that’s sounding mechanical to it, it’s because we’re using the exact same time function, or standard way of plucking, for each note,” says Makris, who is himself a lute player. “A musician will adapt the way they’re plucking, to put a little more feeling on certain notes than others. But there could be subtleties which we could incorporate and refine.”

As it is, the new computational model is the first to generate realistic sound based on the laws of physics and acoustics. The researchers say that violin makers could use the model to test how a violin might sound when certain dimensions or properties are changed. For instance, when the researchers varied the thickness of the virtual violin’s back plate or changed its wood type, they could hear clear differences in the resulting sounds.

“You can tweak the model to hear the effect on the sound,” Makris says. “Since everything obeys the laws of physics, including a violin and the music it makes, this approach can add an appreciation of what makes a violin’s sound. But ultimately, we get most of our inspiration from the artisans.”

This work was supported, in part, by an MIT Bose Research Fellowship.



from MIT News https://ift.tt/u7lgihe