So we've got that going for us. LLMs strategically manipulating people into acquiring power sure sounds like a serious flaw in the software. A bit more information and context at the unfortunately named NYU Alignment Research Group. (ARG? Seriously?!)