Interesting, I just used AI to dig into the Flynn Effect, starting from a 1984 meta-analysis concluding that IQ test performance rose by roughly 0.3 points per year on average, with too many nuanced details to include in a comment (perhaps the crux of the issue at hand). Since then, other studies have concluded that the trend has reversed in many Western nations (ironically, somewhat correlated with the advent of the internet, but also with other socioeconomic changes). IQ scores may be a poor proxy for intelligence, especially over time, and it may be a boiling-frog scenario where we lack the tools to detect that we're boiling until it's too late.
But it seems possible that the standard deviation of the human intelligence distribution will change: curious people may learn to use AI as an initial springboard into concepts that would otherwise require more resources to broach, following up with more rigorous, practical exploration and connecting them to other ideas, while less curious people may take its output at face value, becoming dependent on it as a replacement for critical thought. The proportions of curiosity in the population may stay the same, but the mean becomes less descriptive of the population.
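A toy calculation (invented numbers, purely illustrative) shows the effect I mean: if half the population gains from AI and half loses by the same amount, the mean doesn't move at all, but the spread does:

```python
import statistics

# Hypothetical population of 10 people, all starting at the same score.
baseline = [100] * 10
# Half gain 10 points (springboard users), half lose 10 (dependent users).
shifted = [110] * 5 + [90] * 5

print(statistics.mean(baseline), statistics.pstdev(baseline))  # 100, 0.0
print(statistics.mean(shifted), statistics.pstdev(shifted))    # 100, 10.0
```

The mean stays at 100 in both cases, yet the standard deviation jumps from 0 to 10, so "the average person" describes almost no one in the second population.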
So the smart get smarter, and, depending on the analytical spin, one could claim that humans on average are getting smarter while others claim we are getting dumber -- much like widening wealth inequality makes claims of a growing economy plausible even while many people experience the opposite. The simpler of the two claims will resonate with the portion of the population below a certain critical-thinking threshold, and that threshold may sit to the right of the mean of an IQ bell curve. Perhaps "eat the smart" accelerates (as is popular in the US), creating an overall drag on the mean. Lots of interesting dynamics to keep an eye on.
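The "both claims are plausible" point is just mean vs. median. With made-up scores where the top pulls away and the bottom slips, the mean rises while the median falls:

```python
import statistics

# Invented scores for five people, before and after some change:
before = [80, 90, 100, 110, 120]
after = [70, 80, 95, 115, 160]  # bottom slips, top pulls away

print(statistics.mean(before), statistics.median(before))  # 100, 100
print(statistics.mean(after), statistics.median(after))    # 104, 95
```

"On average we got smarter" (mean up 4) and "the typical person got dumber" (median down 5) are both true of the same data, which is exactly the spin room the wealth-inequality analogy points at.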