With help from Derek Robertson
As leaders battled it out over Russia’s invasion of Ukraine at the United Nations in New York this week, Cynthia Breazeal, dean for digital learning at MIT and director of MIT RAISE, came to the city on a different mission: to change minds about AI and robots.
It’s a field dominated by talk of efficiency and cost savings, and a subject of controversy over the technology’s implications for job losses and surveillance.
Breazeal is a tech evangelist of a different kind, using the UN General Assembly carnival to preach about “design justice” and “AI that helps promote human flourishing.” And she wants your kids to use AI both in school and outside of it.
In her research, AI is a potential savior for children whose education has been knocked off track, whether by Covid, by displacement as refugees or by disabilities that school systems fail to support.
The numbers in need of support are huge. Roughly 36 million children are displaced worldwide, and globally “80 percent of children with intellectual disabilities don’t go to school,” Tim Shriver, chairman of the Special Olympics, told me.
With the United Nations predicting a global teacher shortage of around 70 million by 2030, so-called Socially Assistive Robots, which aim to interact with people in emotionally or intellectually helpful ways, may be among the most viable tools for helping these groups develop and catch up with their peers.
The UN’s Office of Innovation is worried about “Generation AI”: officials say that national AI strategies, where they exist, barely mention children. If a biased AI system instills that bias in children from a young age, for example, it’s not clear who is responsible or how the harm can be undone.
How do you test AI with children? “We actually teach the teacher and the parents enough about AI, that it’s not this scary thing,” Breazeal said of plans for a pilot project in refugee-friendly Clarkston, Georgia, the “Ellis Island of the South.”
“We want to be super clear on what the role is of the robot versus the community, of which this robot is a part of. That’s part of the ethical design thinking,” Breazeal continued. “We don’t want to have the robot overstep its responsibilities. All of our data that we collect is protected and encrypted.”
How do parents and teachers react to the role of a robot in their children’s lives? “It’s not about replacing people at all, but augmenting human networks,” Breazeal said. “This is not about a robot tutor, where teachers feel like competing against the robot.”
Breazeal said the children she’s studied are “not confusing these robots with a dog or a person, they see it as its own kind of entity,” almost “like a Disney sidekick that plays games with you, as a peer.”
What works with children in need might also be leveraged for other age groups and communities. “A huge part of this project is to come up with ethical frameworks and processes that go beyond the specifics of this particular case study,” Breazeal said.
As new technologies like blockchain and immersive, networked VR begin to transform the digital landscape, veterans of the current digital governance regime are starting to think about their broader impact.
At a panel held by the McCourt Institute yesterday titled “Digital Governance and the State of Democracy: Why Does it Matter?” a group including Stanford’s Erik Brynjolfsson, Upworthy co-founder Eli Pariser, and the Georgetown Ethics Lab’s Maggie Little convened to hash it out. The main point of consensus: Whatever we did last time around hasn’t worked.
“We’ve built a system that not by design, but by effect, amplifies the more primitive parts of our brains,” Brynjolfsson said. “That’s not the part that’s created the civilization we want.”
Pariser proposed a more radically public-minded conception of digital spaces than what the world has seen in recent years, saying, “I think we live in an essentially autocratic digital environment. Yes, you can participate, you can post a tweet, but if you want to change how Twitter works, you need to have $50 billion, or it’s one person, at the end of the day; there’s like five guys who make all the decisions ultimately about how these systems work.”
“If people don’t feel like they have meaningful power over their environment, then they start to shrink back or look for someone powerful enough to punch through,” he added. “Like Elon.” — Derek Robertson
Divisions within the Democratic Party around crypto policy are starting to get clearer.
A bill introduced last month in the Senate Agriculture Committee that would place crypto regulation under the purview of the Commodity Futures Trading Commission, largely seen as a more favorable regulator than the SEC, has gained both a notable supporter and a detractor. As POLITICO’s Sam Sutton first reported in today’s Morning Money, Sen. Sherrod Brown (D-Ohio), who chairs the Senate Banking Committee, said w