I'll tell you everything you need to know based on the last few Walking Dead seasons and Hollywood in general: they will probably suck. The Walking Dead is too big to escape the Hollywood treatment. They'll just become more media avenues for pushing the virtue-signalling agenda, like everything that gets big in media does. Game of Thrones, Star Wars, Star Trek: all were great until the axe called DEI came down.