XR.3

For this week’s project I made an AR experience using Adobe Aero.

A still from my construction of the AR experience.

My idea was to create a card that you could send to friends who are ending their Saturn return this year, myself included. It's a monumental time in the world of astrology, because Saturn is leaving its home sign of Aquarius. When I saw that there was an existing Saturn asset in Aero, I knew this had to be the experience.

If you have the Adobe Aero app on iOS, you can check it out using my QR code below.

Basically, when you approach Saturn, it disappears and reveals some golden mathematical objects floating in space. If you tap some of them, a sign appears to congratulate you on enduring another transit of Saturn.

I thought it would be cool to have them float around in a "trippy" way, since that suited the cosmic vibe of the card.

One issue I had was figuring out how to hide some of the assets until I wanted them to appear. In the end, I decided to just hide them inside the Saturn model instead of making them appear from nothing, which seemed like a suitable workaround.

Enjoy!

Personal Project Musings

I have been thinking about personal projects since I started this program, and I've been particularly inspired by Steve Hansen's presentation about 3-D. I think I want to take on a project that uses 3-D modeling to create 2-D still images, that is, single frames or shots rendered from a 3-D scene. In this way, I would be using 3-D much as Steve does: as a tool toward a different graphic end.

I am not sure of the actual medium of the final artifact, but I think it may be a zine or a poster, something in print. I want to explore the techniques of 3-D modeling software, so this project would be an excuse to learn that software, but with a context and purpose to orient my self-learning. That context and purpose would be something I get to choose, tailored to my personal interests, free of the normal constraints of a school project or a client's brief, which would hopefully invite more creativity.

I think it would take about a year to execute this project, because I'd need a good amount of time to plan out what I want to do, then time to learn the tools, then time to execute, and finally time to revise my execution as I run into the inevitable problems.

Other than time, I'd need to acquire the necessary software and tools, as well as materials for print. I think it's a great idea, though, and I want to try to do this for special projects next year. In the meantime, I'd need to settle on a topic. There's a lot to figure out.

XR.2

This week for my AR/XR module I reviewed the Google Lens/Google Translate app. Google Lens is an app from Google that pairs the phone camera with Google's image-recognition and search capabilities so you can interact with the world around you in helpful ways.

The Google Translate feature of this app lets you point your phone at written text in the real world and have the translation appear on your screen, superimposed over (covering up) the real-world text.

I found the app very sleek and user-friendly, which is something I have come to expect from Google's apps. At least, I found this to be the case for the Google Translate feature of Google Lens; I can't speak to the other features because I did not try them out.

One of my favorite parts of this app was how intuitive the use of AR was. There weren't any frills: you just point it at the thing you want translated and see the translation. To me, this shows a clear real-world advantage of using AR over the vanilla Google Translate app. When you are traveling in a foreign country, you may not have the language skills or the keyboard support to type foreign orthographies into the Google Translate app. Being able to point your phone at what you want translated is such an elegant, precise application of AR. This made me happy, because I think many AR apps can feel contrived or simply aesthetic (not that there's anything wrong with that, but I personally am more moved by the functional nature of this app).

I can imagine that for someone who isn't already familiar with the basics of AR (such as the fact that you can point your phone at anything at all), this kind of technology may be confusing. There is no tutorial for the totally new user. Since I am already familiar with the general concept of AR (my first experience with it was on the Nintendo 3DS, probably back in 2011), I understand that you have to point the camera at something and sometimes wait a little for the software to do its thing. If you didn't know that, you might think nothing was happening and move the camera away too fast. To account for this kind of user behavior, the app would benefit from a tooltip that tells the user to hold still until the translation is complete.

XR.1

For the first assignment in the XR module of New Media, I chose to implement a scannable animated emoji in EyeJack.

This is the initial still from the emoji. When scanned in EyeJack, it animates to reveal that it is the kissyface emoji and plays a kissing sound effect.

I landed on this idea while thinking about what would make a simple showcase of the EyeJack AR app's core capability: scanning a flat image and animating it. I chose this because I like how you can't tell what it is from the still. It's kind of cheeky, like you're getting a delightful little surprise from an unassuming smileyface.

I created the animated GIF by drawing each frame and compiling the frames in Photoshop.
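If I ever wanted to script that compiling step instead of doing it in Photoshop, something like the sketch below would do it. It's just a minimal example using Python's Pillow library; the frame filenames, frame timing, and output name are placeholders, not what I actually used.

```python
# Minimal sketch: compile hand-drawn frames into an animated GIF with Pillow.
# Filenames, duration, and output name are hypothetical placeholders.
from PIL import Image

# Load the drawn frames in order (e.g. frame_00.png, frame_01.png, ...).
frame_paths = ["frame_00.png", "frame_01.png", "frame_02.png"]
frames = [Image.open(path).convert("RGB") for path in frame_paths]

# Save the first frame and append the rest as an animation.
frames[0].save(
    "kissyface.gif",
    save_all=True,             # write a multi-frame file
    append_images=frames[1:],  # the remaining frames
    duration=100,              # milliseconds per frame
    loop=0,                    # 0 = loop forever
)
```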

This project went smoothly without any hitches. I was actually quite surprised at how easy it was to upload, and how well the AR worked.

Here’s the QR code to be viewed with your EyeJack app.