When technically savvy teams get talking, the AI conversation can be a bit intimidating. But you don’t need to speak in algorithms and data sets to understand the basics. Get answers to seven simple questions about AI from Phillip Maggs, Superside’s Creative Director of New Horizons, and hold your own in any creative conversation.
What’s the deal with AI and hands? Why are my AI designs being flagged as NSFW? And how does generative AI work, anyway?
We all have a million questions about AI… but we might not always be comfortable asking them. With every new Midjourney model and evolution of ChatGPT, it can feel like getting up to speed with this technology moves further and further out of reach.
But it’s never too late to catch up. And I have the perfect person to help: Phillip Maggs, Creative Director of New Horizons at Superside, which is a fancy way of saying he’s in charge of innovative design services, including our pilot program for AI-enhanced creative.
In other words, Maggs knows a thing or two about AI in design, so sit back and get answers to the questions about AI you may be just a little reluctant to ask!
Generative AI has two elements to it: Context and cognition. The context is a massive data set of pretty much all human endeavors to date, and the cognition is the algorithm’s inference machine that will reproduce the next word or diffuse images out of noise.
Right, but what does that really mean? Basically, generative AI is built on data sets, which are massive amounts of data scraped from all over the internet. Anything from literature to technical manuals and even this blog post can be part of these data sets. Then, when you give an AI tool a prompt, it uses custom-built algorithms to generate an answer derived from that data.
Here’s the best way to explain it simply:
How much do you know about your car’s engine? Sure, you might recognize the names of parts like “pistons,” “spark plugs” and “crankshaft” (okay, maybe not that last one) but do you know how they work together? That’s the “cognition” part of AI, the engine that turns your prompt into the baseline for your next creation. And while every car has an engine, they’re all built slightly differently, like AI models.
In the same way a car’s engine turns fuel into movement, AI turns mountains of data into text, images, or even full videos. But it still needs a prompt to know what to do with all that data, just like a car needs your input to do anything other than just idle in place. You don’t have to understand how the engine works to get the car moving—but that kind of know-how can help you when things go wrong.
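If you want to see that loop in action, here’s a minimal sketch in Python using the open-source Hugging Face transformers library and the small GPT-2 model (my picks for a free, runnable example—not anything Maggs or Superside uses): a prompt goes in, and the model strings together likely next words based on patterns in its data set.

```python
# A minimal "prompt in, text out" sketch using the open-source transformers
# library and the small GPT-2 model (chosen only because it's tiny and free).
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "A good logo for a coffee shop should"
result = generator(prompt, max_new_tokens=30, do_sample=True)

# The model has no idea what a logo is; it's predicting likely next words
# based on patterns it picked up from its training data.
print(result[0]["generated_text"])
```

Swap in a bigger model and a sharper prompt and the output gets better, but the basic engine is the same: data in, prediction out.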
It takes a lot of human feedback to tell AI what it’s doing right or what it’s doing wrong because a lot of times it doesn’t know it’s making hands. It knows it’s diffusing pixels out of noise. We have to work with it to try and make it better.
Short answer: Hands are hard. Lots of humans still have a hard time drawing hands, even after years of practice. And even with all of human knowledge at its circuit-board fingertips, generative AI is still in its infancy when it comes to producing convincing images, text and other types of media. So expecting AI to get hands right is a bit like expecting a third grader to paint the Mona Lisa.
Because we’re still in the early days of generative AI, the truth is these tools just don’t really know what they’re doing. Everything they give us is essentially their best guess at answering a prompt. That’s why AI hallucinations are a thing, and why you should always spend a little extra time on your prompt to get a result that’s exactly what you want—and why you probably shouldn’t use the first thing an AI makes as a final deliverable.
AI doesn’t necessarily learn; rather, it looks for similarities between the data available to it and what you’re asking it to do. In that way, AI learns that certain words and pixels are closer to each other. AI needs lots and lots of data to learn.
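To picture what “closer to each other” means, here’s a toy sketch with made-up vectors standing in for the embeddings a model actually learns. Related words end up pointing in similar directions, and a cosine similarity score captures how close they are. The numbers below are invented for illustration, not pulled from any real model.

```python
import numpy as np

# Hand-picked toy vectors standing in for learned embeddings (not from a real model).
embeddings = {
    "cat":   np.array([0.9, 0.8, 0.1]),
    "dog":   np.array([0.85, 0.75, 0.2]),
    "piano": np.array([0.1, 0.2, 0.95]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Higher scores mean the two vectors point in more similar directions."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["cat"], embeddings["dog"]))    # high: "nearby" words
print(cosine_similarity(embeddings["cat"], embeddings["piano"]))  # lower: "distant" words
```

That’s the sense in which AI “learns”: with enough examples, related things end up near each other, and unrelated things end up far apart.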
When we work with AI, it’s tempting to think we’re in a sort of teacher-student relationship. We ask an AI design tool to do something, and it gives the prompt its best shot. If it gets something wrong, we give it a gentle nudge, and hopefully, it produces a better answer the next time. But that’s not exactly how it works…
Imagine you’re a baseball coach trying to teach a pitcher to throw the best fastball they can, right down the middle. With a human pitcher, you might just give them a few bits of advice and suggest some adjustments before they start getting the gist of it.
With AI, the approach is more like throwing an endless number of pitches until it gets that fastball right. Then, it has the data it needs to throw fastballs over and over again. It might take thousands of pitches for AI to get the right pitch, but in the time it takes a human pitcher to throw the ball once, AI has thrown it a thousand times.
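For a feel of that trial-and-error at machine speed, here’s a toy loop that “throws” one pitch after another, nudging a single number toward a target based on how far off the last attempt was. It’s only an analogy for the kind of iterative adjustment that happens during training, not how a real model actually learns.

```python
# A toy "pitching machine": nudge one number toward a target over many tries.
# This illustrates trial-and-error at machine speed, not real model training.
target = 95.0          # the "perfect fastball" speed we're aiming for
guess = 0.0            # the machine's first, wildly wrong pitch
learning_rate = 0.05   # how big an adjustment each piece of feedback triggers

for pitch in range(1, 10_001):
    error = guess - target          # how far off this pitch was
    guess -= learning_rate * error  # adjust slightly based on that feedback
    if abs(error) < 0.01:
        print(f"Dialed in after {pitch} pitches: {guess:.2f} mph")
        break
```

A human pitcher couldn’t throw a few hundred pitches in the blink of an eye; a loop like this runs in a fraction of a second.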
This is part of what makes AI such a powerful tool. With creative processes that are so dependent on getting a bunch of things wrong until you find the right approach, AI can help you get to the optimal result faster.
Everyone thought that AI was coming to disrupt more lower-level tasks and that it would take a very long time for it to be creative or think critically. I would say AI will disrupt every industry in the same way that computation did. More people can do more with less, and we’ll be able to have entirely new industries in the future.
There was a time when AI was a niche tool. It was great for chatbots since all it had to do was answer the most frequently asked questions a company might get from its customers. It was used for digital assistants like Siri because Apple essentially built a contained environment where it only needed a limited amount of data.
But the generative AI genie is out of the bottle. With that comes disruption.
Think of just about any creative workflow, and you can find countless ways to disrupt it with AI. Illustrators can use AI to explore different styles without having to spend time hand-drawing, opening the doors to more creativity and giving them a head start on choosing a direction. Writers can streamline research and editorial tasks. Creative directors can generate pre-visualizations and quick concepts in minutes instead of days.
That’s disruption. It makes work that took days either irrelevant or lightning-quick, freeing up time for innovation, and ultimately, entirely new industries.
AI is a democratizing force like education. It means more people have more access to do more things they want to do. It means you don’t need the world’s most expensive equipment to produce video, you don’t need the world’s best hardware to produce 3D. It allows people to express themselves more.
Prompting ChatGPT to write poems celebrating your coworkers’ achievements. Making fun profile pictures for your team with DALL-E. Using AI voice-overs to share inside jokes. These are all things everyone can do with AI.
Play is such an important part of creativity, and without AI, it was limited to people who’d spent years building up the necessary skill set.
With AI, anyone can pretend to be a designer for a day, whether it’s to share a meme in Slack or to quickly generate preliminary ideas for a new marketing campaign. Of course, it’s not a tool for replacing the professional designers on your team, but it can help everyone else work with them more efficiently.
Sometimes, the AI will get it wrong and flag SFW content as NSFW. It has a blacklist of words, but that list isn’t perfect. Look at your prompting style and method and keep your words as SFW as possible.
If we crack open your browser history, how much NSFW stuff will we find?
Just kidding. But unless you’re deliberately trying to create NSFW content, your prompts are probably just getting flagged because they’re using specific words or phrases that the AI’s algorithm has been programmed to recognize as NSFW. You’re getting false positives, but it’s pretty easy to get around this with a bit of experimentation.
For example, try stripping your Midjourney prompt down to its most essential components. Are there different words you could be using to try and generate your content?
Then, experiment with different variations. Treat AI like play, and be as creative as you can. This experimentation will help you become more familiar with the type of content AI tools can generate reliably without any issues.
Data sets have been aggregated from publicly available images, which probably have been airbrushed or photoshopped. The images will tend towards smoother skin tones and more beautified imagery.
Generally speaking, any weirdness you find across AI-generated images—like worlds populated exclusively with models—usually comes down to two factors: your prompt and the tool’s data set.
If your prompt is too simple, an AI tool will try and give you something that’s most similar to it, staying away from any specifics that might go against the language in your prompt. That means if its data set is already biased towards model-esque people—like most are—you’re going to get people that look like models.
By making your prompt more specific—and experimenting with the language in it—you can work against some of the biases in your AI tool’s data set. Fewer models, more real people.
There’s no question: AI is the hot topic of the year (decade?). But just like with any powerful tool, knowledge is what’s going to keep you ahead of the curve.
Hopefully, Maggs’ answers helped add to your knowledge bank today!
And for tomorrow, remember to put details into your prompts, experiment, and look for ways AI can enhance design projects and eliminate some of the tedium in your creative workflows—not for people it can replace.
Nick is a Content Writer and Strategist specializing in long-form marketing content and turning SEO traffic into paying customers. He's well-versed in the technology industry and pulls from his experience as a marketer who's worked closely with many creatives to craft content for Superside. Two truths and a lie: He's been a professional wrestler, writes on a blue typewriter and reads 100 books a year.