I wonder if part of the problem isn't just AI-created content but the algorithms themselves.
Many algorithms reward creators for publishing hyper-frequently, which puts them on a creation hamster wheel.
When that's the incentive, you either get
a) something created by AI,
b) something created by humans sounding like AI,
c) burned-out creators who still create great content but whose personal lives go down the drain,
d) content from the few lucky ones who can avoid a)-c), at least for a while (which isn't great for variety, because you keep seeing content from the same names).
I'm exaggerating, but only slightly, if the phenomenon of "burned-out YouTube stars" is any indication. Thankfully, Medium is not like YouTube, and yet some of the same systemic pressures still apply.
In addition to maybe having a Medium-wide code of conduct about when and how to use AI-generated content, it might also help to look at what can be done to avoid pushing creators toward a "hamster wheel" scenario.
How can the algorithm reward truly creative content, even if it's produced at a slower and more sustainable pace?
In addition to improving quality (and further setting Medium apart from less positive social media platforms), that could also help increase diversity on here. Many people can't dedicate their entire days to creating content... but they might still have something important to say! :)
So, to conclude, I think the question "what should we do with AI-generated content?" is the symptom, while the actual cause of the problem goes deeper ("how do we make sure the algorithms are supporting our goals?").
It's not an easy issue to solve, and it's one that influences us in so many different areas of life. And yet, if there's a company I trust to solve this issue, it's Medium.