
Creating Filters on Spark AR that Unveil Thoughts About Ourselves IRL



(L-R) Actor/Screenwriter Cassidy Civiero & Digital Artist Ruth Isabel Guerra (me!)

BTW, this project starts with two strangers meeting online…

Ruth Isabel Guerra

In November of 2019, I watched a video of Maisie Williams (Arya Stark, for a good portion of you) consume increasingly hotter vegan wings on YouTube. The clip, as you might have guessed, was an episode of Hot Ones. In it, she talked about her latest endeavor—the launch of a digital platform bearing her name but with a springtime twist, Daisie (the D standing in for her co-creator Dom Santry). The site and app were designed as a way for emerging creatives to connect and collaborate on new artistic projects. Naturally, I was curious about it. I signed up for an account, uploaded my profile pic (this very one you see in my Medium bio), slapped my website on it, scrolled through some pages, and closed my computer. Like most new things that crop up online, I quickly forgot about it.

Until I got an email in my inbox from someone named Cassidy.

I don’t receive emails like this too often. I set up my website years ago as a placeholder, if you will, for the person I want the internet to think I am. I update it every few months with some things I create that I’m not overly embarrassed about, and, consistent with my brand, I frequently forget about it. This last part has become a coping mechanism for me—to be vulnerable with my work online and then immediately discard it from my thoughts so I don’t regret posting in the first place.


Uncertain of where this email thread would take us, Cassidy and I started asking questions about each other. We chatted about our home life, relationships with people closest to us, mental health, and most importantly, what we wanted to say about it all. These convos quickly found their way into a Google doc, where we could word vomit everything we were feeling on any given day.

An excerpt from our Google doc (I highlighted the phrases in green that spoke to me the most)

Our thoughts took shape as images, too.

From this simple exercise, it became clear what our project was really about and that AR was the medium we would use to talk about it.

I had been interested in AR lenses for quite some time, after working for Snapchat for almost a year and subsequently stumbling upon Instagram lens creators like Ommy Akhe and Johanna Jaskowska on my feeds. Their work felt like where Gen Z and Millennial digital creators (like me) were inevitably headed—straight back into the motherboard that gave birth to us: The Internet.

The most striking thing to me about face filters (CGI that uses facial tracking) is that they’re not a new concept. I remember testing out various mask effects on my dinky webcam in the early 2000s, when I was no more technologically savvy than I am now. But what started as a funny joke of cycling through different hat styles to mess with your friends has become, well, less of a punchline and a whole lot more commonplace.

Now, almost every selfie we see online has been edited with a filter that morphs oh so effortlessly with the person’s face and surroundings. Freckles. Doll eyes. Rainbow vomit. Butterflies circling our heads. Sinkholes that invite us to fall into them. They’re all made to look like they belong in the real world, and are part of how we present ourselves and our casual lives to the digital masses.

That said, I’ll be the first to admit that I’m a consumer of this culture. As a child, I learned how to use a computer faster than my parents could. I would wake up at 5am before my Saturday morning cartoons to watch infomercials on the newest gadgets made available for the low, low price of $19.95. My love for movies didn’t come from watching them in theaters; it grew out of my obsession with cameras and editing. For as long as I can remember, I’ve had this constant itch to iterate, to experiment, to play, to transform.

And so this itch fueled Cassidy’s and my project: a fascination with our ever-evolving technology, and with how the digital space, particularly social media, affects how we feel on a daily basis, for better or worse. We didn’t get to all the filters we wanted to create (it was a steep AR learning curve for me), but we were able to capture the essence of our real thoughts splattered across that working Google doc.

Our mantra of existence.

Cassidy models “I Wake Up, I Go Online” filter

This was my first pass at creating a 3D model in Blender 2.8, and it. did. not. turn. out. as. planned. I had followed an easy enough tutorial on YouTube on building 3D text, but I could not for the life of me figure out how to import the awesome metallic texture I gave it into Spark AR. (Again, the learning curve was steeeeeep.)

WakeUp-Online.blend
Patches on Blender

Luckily, I could build a similar texture in Spark, but with a limited color wheel, which I compensated for by adding various light sources and colors that switched on and off for different durations and at different positions on the 3D text.

Spark patches for lights and colors

Using a combo of nodes connected to the QuickAnimBlock (part of a preset template in Spark whose name I could have changed but didn’t bother to), I made the object rotate a full 360 degrees at a fixed speed and also allowed the user to place it on any part of a floor surface.

Spark patches for spinning 3D world object
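For anyone who prefers scripting over patches, a similar spin can be approximated in Spark AR’s JavaScript API. This is only a sketch under my own assumptions: the scene-object name `3dText` and the 4-second loop duration are placeholders (not from the actual project), and the floor placement itself would still come from a plane tracker in the scene.

```javascript
// Sketch of a fixed-speed, endlessly looping Y-axis rotation in Spark AR's
// scripting API. '3dText' is an assumed object name -- match your hierarchy.
const Scene = require('Scene');
const Animation = require('Animation');

Scene.root.findFirst('3dText').then((text3d) => {
  // A driver that loops forever: one full turn every 4 seconds.
  const driver = Animation.timeDriver({
    durationMilliseconds: 4000,
    loopCount: Infinity,
  });
  // Sweep the rotation from 0 to 2*pi radians (a full 360 degrees).
  const sampler = Animation.samplers.linear(0, 2 * Math.PI);
  text3d.transform.rotationY = Animation.animate(driver, sampler);
  driver.start();
});
```

The `loopCount: Infinity` option is what keeps the text spinning indefinitely, the scripted analogue of a looping animation patch.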

And the final result:

Preview of “I Wake Up, I Go Online” filter

All in all, this took me upwards of 14 hours to figure out over the course of a single weekend (#sorrynotsorry). The end result is satisfactory. The texture of the 3D object produces a glitchy, screen-like effect that mimics how hypnotizing our little technology machines can be. The phrase repeated over several lines displays how much technology has become part of our daily routines and vice versa. I mean, what’s the first object you reach for when you wake up in the morning? For me, it’s my phone.

Beauty in the eyes of our followers.

Cassidy models “Persona” filter

This filter originated from a phrase that we, unfortunately, hear too often: “She’s not as hot as she looked online.” The saying cuts two ways in its effects on a woman’s sense of self. It echoes the pressures we face to look a certain way in order to be appreciated in photos (perhaps one of the many reasons why image filters and editing apps are so ubiquitous), only for us to be judged as “fake” or “not as [fill in adjective here]” by the way we look in person. It has also become a frequent excuse for others to ghost, deflect rejection or jealousy, and downplay past relationships with women. Phrases like this one have caused me to reconsider sharing photos of myself, much less selfies I’ve taken, over the years.

I approached this build as a digital collage, which is a medium I’m familiar with from previous projects. I gathered eyes, eyebrows, and lips from digital scans of very traditional beauty magazine spreads. I then scaled up their sizes at least 2x in Photoshop to simulate the many face filters that make us look like perfect dolls. Using a 2D Face Mesh, I placed the cut-out facial images on the areas where they would fall on a real face. (I’d like to note here that I built this particular filter with Cassidy’s face in mind.)

Image files over 2D Face Mesh

The only configuring I did for this filter in Spark was adjusting the axis points for the two separate image files — eyes with eyebrows, and lips — and positioning them at different distances from the face. The two files were fighting each other to be the source file for the face tracker, and if I kept the images at equal distances from the face, one would inevitably dissolve into the face. I’m sure there is a more elegant way to both describe and fix the issue, but this was the quickest solution I could come up with.

Transformations on Spark
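In script form, that quick fix (nudging each face-tracked plane to a different depth so the two stop fighting over the tracker) might look something like the following sketch; the object names and offset values are my own placeholders, not from the actual project.

```javascript
// Sketch: give the two face-tracked image planes different Z offsets so
// neither one dissolves into the face mesh. Names and values are assumed.
const Scene = require('Scene');

Promise.all([
  Scene.root.findFirst('eyesPlane'),
  Scene.root.findFirst('lipsPlane'),
]).then(([eyes, lips]) => {
  // Slightly different distances in front of the face tracker.
  eyes.transform.z = 0.02;
  lips.transform.z = 0.01;
});
```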

Final result:

Preview of “Persona” filter

The filter appears just real enough that it’s attractive on a person but exaggerated enough that it also looks unsettling. Maybe this was also my subconscious reaction to filters manifesting itself onto a filter. When I use filters or see others using them, I am intrigued, concerned, and weirded out all at the same time.

Who are we today?

Cassidy models “404 Error” filter

These were probably my favorite photos of the bunch. They are indicative of our collective behavior in public, as we are often transfixed by our personal devices, no matter our surroundings or view. I mean, here we are in sunny DTLA, one of the architectural wonders of the world, and rather than taking it all in or just taking photos of the landscape, we’re taking selfies. (Granted, this was the point of the photoshoot, but you get the idea.) This also reminds me of the number of times I catch myself texting while crossing the street — not even bothering to check for oncoming vehicles — which is a little disturbing to process.

The filter we used here was also my favorite to create. It merges concepts from the first two filters: techie in its design like “I Wake Up” but plastered on a human face like “Persona.” The 3D text was again built in Blender 2.8 in a similar fashion as before (but with fewer hours wasted).

I used a lined Face Mesh as a texture source and placed it on top of a light blue metallic texture I built in Spark, along with various contrasting light sources and colors.

faceMesh.png
Spark patches for light sources and color

The filter is also tap responsive and allows someone to switch between the phrases “404 ERROR” and “IDENTITY NOT FOUND.” I used a simple combo of nodes to do this, starting with the Screen Tap patch.

Spark patches for tap responsiveness
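As a rough scripted equivalent of that patch chain, Spark AR’s TouchGestures module can drive the same toggle. Everything below is an assumption on my part: the two text-object names are invented, and the Touch Gestures capability would need to be enabled in the project’s properties.

```javascript
// Sketch: flip between two phrases on each screen tap, roughly what the
// Screen Tap patch chain does. 'text404' and 'textNotFound' are assumed names.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');

Promise.all([
  Scene.root.findFirst('text404'),
  Scene.root.findFirst('textNotFound'),
]).then(([error404, notFound]) => {
  let showError = true;      // start on "404 ERROR"
  notFound.hidden = true;
  TouchGestures.onTap().subscribe(() => {
    showError = !showError;  // swap which phrase is visible
    error404.hidden = !showError;
    notFound.hidden = showError;
  });
});
```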

Final result:

Preview of “404 Error” filter

One of the effects I meditate on the most re: filters is the process of sharing and adopting. When we see a filter we like on someone, our natural inclination is to try that filter out and post it. Then, all of a sudden, there is one more person on someone else’s feed with this same filter. And then another. And another. And another. At some point, this one filter on multiple faces starts to blend together. The same goes for Instagram posts we fancy: we like them and repost them. (With proper credit to the original poster, I hope.)

At what point does this transaction of data blur the lines between who we are IRL and who we are under a URL? When does repurposed content become part of our own unique voice? And at what point do we all just turn into humanoids crossing the street in the same direction, staring at the glass on our phones?

Tread lightly on me.

Cassidy models “Fragile” filter

This filter is kind of like a reverse filter. It’s exactly the level of vulnerability that we’d never want the world to know about us, and yet it follows most of us everywhere we go—both in digital and physical spaces. It’s the filter we choose not to use, but it’s the feeling we can’t ever shake off.

I created this 2D frame in Adobe Illustrator and applied facial tracking to it in Spark.

fragile-frame.png

Final result:

Preview of “Fragile” filter

Although simple in its design, this one probably hits the closest to home. It’s what I feel most times when I open up to the internet. It’s what I feel while writing this. It makes me think how afraid I am sometimes of my own work — creating and then dumping it on a website — hoping that one day someone will come across it and appreciate it, but logging off before I can find out.

You can find all four filters under the “Face” tab on my Instagram: @ridguerra.

Source: https://arvrjourney.com/creating-filters-on-spark-ar-that-unveil-thoughts-about-ourselves-irl-b6741d2b7262?source=rss—-d01820283d6d—4
