tinyBuild doesn’t spy on employees with AI, says CEO who suggested doing so to identify ‘time vampires’

Speaking at this week’s Develop:Brighton conference, tinyBuild CEO Alex Nichiporchik gave examples of how large language models like ChatGPT could be used by game studios to identify “potentially problematic players on the team.” Suggestions included feeding employees’ text conversations and video call transcripts into a system to detect certain words that could indicate burnout and “time vampires”.

After receiving online criticism, Nichiporchik tweeted to say that parts of his presentation were taken out of context and that the examples were “hypothetical”.


“We don’t use AI tools for HR, this part of the presentation was hypothetical,” Nichiporchik said in the final tweet of a thread.

During the presentation, Nichiporchik described a process he called “I, Me Analysis”, as reported by Whynow Gaming. The process involves feeding Slack transcripts and automated video call transcripts into ChatGPT to count the number of times an employee uses the words “I” and “me”.

“There’s a direct correlation between the number of times someone uses ‘I’ or ‘me’ in a meeting, versus the amount of words they use overall, and the person’s likelihood of burning out,” Nichiporchik reportedly said. “I should really protect this, because as far as I know, nobody invented this.”

He also explained how a similar process could be used to identify “time vampires” – employees who talk too much in meetings. “Once that person is no longer with the company or the team, the meeting takes 20 minutes and we get five times as much done.”

With these systems, Nichiporchik suggested that a company might be able to “identify someone who is on the verge of burnout, who might be the reason that co-workers who work with that person are burning out, and you might be able to identify and fix it early on.”

Whynow Gaming also reports that Nichiporchik said he ran these processes retroactively on former tinyBuild employees, but that the company is now starting to use them actively. “We had our first case last week, where a head of studio was not in a good place, and no one told us. If we had waited a month, we probably wouldn’t have had a studio. So I’m very happy that it worked out,” he reportedly said.

This would seem to contradict Nichiporchik’s current insistence that tinyBuild “doesn’t use AI tools for HR” and that the presentation was purely “hypothetical”, claims he repeated on the tinyBuild blog.

As many have already pointed out, the use of large language models for employee monitoring under the guise of detecting burnout is deeply dystopian. “Black Mirror-y”, to use Nichiporchik’s own description. The same goes for describing employees as “time vampires”, or suggesting that a ChatGPT process could lead someone to “no longer be part of the company”. (Also, as someone else has surely pointed out, you don’t need a large language model to count the number of instances of a word in text. You can just hit Ctrl+F in a decent text editor.)
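To labour the point: the whole of “I, Me Analysis” as described amounts to a word ratio, which needs no AI at all. Here’s a minimal sketch in Python, purely for illustration – the function name and the example sentence are mine, not anything from Nichiporchik’s presentation:

```python
import re

def i_me_ratio(transcript: str) -> float:
    """Share of words in a transcript that are 'I' or 'me'."""
    words = re.findall(r"[A-Za-z']+", transcript)  # crude word tokeniser
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.lower() in ("i", "me"))
    return hits / len(words)

print(i_me_ratio("I think me and I should talk about me."))  # ~0.44
```

Whether that ratio correlates with burnout is, of course, an entirely separate question.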

I use “I” and “me” a lot in work meetings, but that’s only because I fight burnout by singing the Because I’m Me hook at the start of every Teams call. Allow me to make a presentation at the next Develop.
