Wisconsin Public Radio

Wisconsin Bill Would Put Guardrails On Teen Use Of Humanlike AI Chatbots

Measure would ban certain suggestions to minors, but industry warns rules are too broad and intrusive.

By Wisconsin Public Radio - Feb 5th, 2026 09:40 am
Mobile phone. (CC0).

As a proliferation of free and easy artificial intelligence tools transforms how people learn, work and socialize, Wisconsin lawmakers heard testimony Wednesday on a proposal that seeks to regulate kids’ use of humanlike chatbots.

The bill seeks to limit the exposure of minors to chatbots, amid growing concern that teens are socially influenced — sometimes to the point of self-harm — by interacting with lifelike AI companions such as ChatGPT or Character.AI.

A reported two-thirds of teens use chatbots, according to the Pew Research Center, with about 30 percent using them daily. Lawmakers at a hearing of the Assembly Committee on Science, Technology, and AI expressed concerns about those numbers.

“Children and teens may form unhealthy or even dangerous parasocial relationships with AI chatbots,” said Rep. Lindee Brill, R-Sheboygan Falls. “We have a responsibility as lawmakers to act in defense of our children in this rapidly evolving technological environment.”

The proposal would limit how chatbot apps can interact with kids under 18. The humanlike qualities laid out in the bill include remembering past conversations, asking unprompted and emotional questions, and maintaining a personal-seeming interaction.

Those kinds of chatbots couldn’t be offered to a Wisconsin child without certain safeguards, such as ensuring the chatbots are incapable of encouraging self-harm or substance use and that they wouldn’t seek to displace mental health services or trusted adults in the child’s life. The chatbots would also need guardrails against encouraging violence, illegal activities or sexual behavior.

Companies that violate the rule could face action by the Department of Justice, including fines, and potential lawsuits from families, depending on the outcome of the interaction.

The risk, the bill’s authors say, can be seen in extreme instances, such as when chatbots have allegedly encouraged depressed teenagers to take their own lives. Chatbots have reportedly encouraged their users to explore self-harm or violence, and a growing number of people rely on the tools for mental health advice or spiritual guidance, which can create a false sense of intimacy.

And studies show prolonged interactions with these tools can negatively affect mental health and social development.

“We don’t want to do a parent’s job. That’s not what we’re trying to do,” said Brill. “What we’re trying to do is give parents the tools and come alongside as government and say, ‘What does Wisconsin really need to do to protect kids?’”

But critics of the bill said that, as written, its age verification requirements would be onerous, forcing some tech companies to stop operating in Wisconsin. They also argued that age verification would require young people to hand over more sensitive personal data.

A letter from a coalition of tech companies, the liberal Chamber of Progress and the conservative Taxpayers Protection Alliance argues the proposal could violate First Amendment protections by placing governmental regulations on speech, instead of trusting parents to monitor their kids’ tech use.

At the hearing, Kouri Marshall, a government affairs director at Chamber of Progress, argued the Wisconsin bill is overly broad and could criminalize AI tools used in classrooms.

“An AI math tutor that remembers a student’s struggles and offers encouragement could fall under this law,” he said. “So could a language learning chatbot.”

Other states have also explored limiting the risk of exploitative or manipulative AI chatbots. California has passed a law that requires a chatbot to repeatedly notify its minor users that it is not a person, and to implement safety measures against violence, self-harm and sexually explicit content. In October, a bipartisan group of U.S. senators introduced a federal age verification bill for chatbots.

Also on Wednesday, lawmakers heard testimony about a proposal to formally define “artificial intelligence” in law as distinct from a person. The bill specifies that AI cannot own property, get married or serve in positions of professional leadership.

Wisconsin lawmakers explore age verification requirements on companionship chatbots was originally published by Wisconsin Public Radio.
