AI is People
HR’s love affair with big data means that, at some point, businesses end up choosing magic over insight.
You’re reading the newsletter of Hear Me Out. We talk to employees off the record, then help their leaders make work more rewarding. Learn about our service.
Mother Teresa powerlifting. The Golden Gate Bridge, built for light rail. Impressionist drawings of gay, Black ballet dancers. A new, surreal aesthetic is emerging from the primordial soup of the internet, generated by tools like DALL-E 2 and Midjourney: images that feel conjured from an alternate reality, but are actually drawn by an AI. These uncanny, dreamlike images have kicked off the latest round of a perennial, but still much-hyped question: Will automation finally render writers and artists obsolete? Or do they supply the spark? (The answer is “yes.”) But the discourse sidesteps a more useful question: How do we engage with new digital tools without getting lost in them? And the implications for businesses, and for HR teams working to hire and retain key employees, reach far beyond the art world.
Behind the Magic, Countless Hours of Human Labor
All good creative work is the tip of a very large iceberg. When Steve Jobs spent $100,000 on a corporate identity for NeXT, the precursor of the modern Mac, he received a single logo option. That logo, however, came with a 100-page book detailing the entire creative process.
AI-generated art is no different: behind the stunning artwork created with these complex systems lies a mountain of tedious human labor. Artists spend hours tweaking text prompts and generating hundreds of variations to create a unique style or render a particular scene. (I’m speaking from experience here: I created the banner image for this piece with Midjourney myself, including overpainting in Photoshop to achieve the final effect.) And mastering the tools can take hours of reading, exploring others’ prompts to see what works and what doesn’t, and digging through Reddit posts and chat threads for hidden gems. Prompt engineering, as it’s come to be known, is already its own field of study, with at least one dissertation in the works.
But it’s not just the labor of digital artists that makes these systems possible. It’s also the labor of millions of others, spanning millennia. This includes all the artists, both world-famous and amateurs, who created the work these systems were trained on. It also includes the people who labeled the images so the AI knows what it’s looking at, many of whom work for pennies on the dollar in developing countries. Applications like driverless car development rely on millions of labeled images and thousands of hours of labeled video. Countless hours of human labor, all to make automated outputs look effortless.
The Illusion of Magic is Central to Tech’s Narrative
In tech, hiding human labor behind a mystical façade is par for the course. Kiwibot, which bills its tiny, four-wheeled robots as “the adorable future of delivery,” conveniently fails to mention the students it employs in Medellín, Colombia, who navigate the bots out of corners for about $2 an hour. And the only thing stopping a simple YouTube or Google search from returning images of terrorism and child abuse is the labor of third-party content moderators, many of whom suffer nightmares and panic attacks from constant exposure to extreme violence. Our fully automated, luxurious lives are made possible in exchange for theirs, and our peace of mind is sustained by their invisibility.
The fact that we humans buy into these myths so easily shouldn’t come as a surprise. After all, everyone loves a good magic trick. Consider ELIZA, the first chatbot, created by Joseph Weizenbaum in 1966. The bot was rudimentary by today’s standards, using simple rules and keywords to simulate psychoanalysis. But to Weizenbaum’s alarm, early users confided in the program, wanted to be alone with it, and seemed to believe it felt genuine empathy. The experience inspired him to become one of the earliest technology critics, deriding his peers as “compulsive programmers” and “the artificial intelligentsia” for their uncritical overreliance on technological solutions. Thirty-eight years later, Rebecca Solnit would build on this argument in her writing on the tyranny of the quantifiable, “the way what can be measured almost always takes precedence over what cannot.”
Automation Distracts from Real Efforts for Change
These days, HR teams are victims of this tyranny, awash in engagement and retention technologies driven by quantification for its own sake. Venture-backed startups promise to predict which candidates have the best chance to succeed, nudge managers into being more attentive, and flag employees who are checked out. These technologies leverage our existing dependence on digital tools to surveil, classify, and influence our behavior on the job. And all too often, the false sense of security they provide executive teams distracts from real, on-the-ground efforts to keep employees engaged and productive.
The most pernicious example of tech-induced blinders is also one of the most common: the humble engagement survey. Precisely measuring employees’ willingness to endorse the culture is an easy way to calm the nerves of metrics-hungry executives, providing a feeling of control in an area of the business that often feels completely out of their control. But that precision often comes at the expense of accuracy.
Engagement survey data is far from representative. Typical completion rates range from 65 to 85 percent, according to Culture Amp, and can go even lower. While that might sound ideal (after all, most U.S. elections barely crack 50 percent), the sample is skewed, since the least engaged employees are also the least likely to respond. And in practice, obsessing over employee satisfaction hasn’t kept employees very satisfied, with many leaving for the passion economy to find work that truly aligns with their values.
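To see why non-response matters, here is a rough illustration. The numbers below are hypothetical, not Culture Amp’s data: we simply assume that an employee’s chance of completing the survey rises with their true engagement, and watch what happens to the average score.

```python
# Hypothetical simulation of non-response bias in an engagement survey.
# Assumption (not from any vendor's data): the least engaged employees
# are the least likely to respond.
import random

random.seed(42)

# True engagement scores for 1,000 employees, on a 1-5 scale.
employees = [random.uniform(1, 5) for _ in range(1000)]
true_mean = sum(employees) / len(employees)

def response_prob(score):
    # A disengaged employee (score 1) responds ~40% of the time,
    # a highly engaged one (score 5) ~95%.
    return 0.40 + 0.55 * (score - 1) / 4

# Only the employees who happen to respond show up in the survey data.
responses = [s for s in employees if random.random() < response_prob(s)]
observed_mean = sum(responses) / len(responses)

print(f"True average engagement:    {true_mean:.2f}")
print(f"Survey-reported engagement: {observed_mean:.2f}")
print(f"Completion rate:            {len(responses) / len(employees):.0%}")
```

Under these assumptions the completion rate lands in the typical 65–85 percent band, yet the reported average is consistently higher than the true one: the survey looks precise while quietly flattering the company.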
That’s why at Hear Me Out, we believe the only path to long-term engagement is through listening and responding to people on their terms, not shoehorned into a patchwork of chatbots and surveys. While automated feedback tools can offer a high-level picture of employees’ needs, they rarely explain what employees really believe, or why they feel a certain way about the company. True insight, as any product manager will tell you, comes from one-on-one interviews with a moderator who can get to the root of things, following up to reveal the full context in rich, nuanced detail.
The risk of believing in magic is that, at some point, we stop looking for other solutions. Despite the hype, technologies like engagement surveys and AI-driven retention tools sacrifice depth for efficiency. They replace real solutions with easy ones, and in selling us the dream, they cover up the people who make it all work.
The more advanced the technology, the more we need to remember that the engine is still people, whether we’re talking about engineers and their software, or artists and their art. In either case, without the painstaking, creative, and strategic work that only human intelligence can drive, all we’re left with is a picture that doesn’t add up.