Widgeting Around - Project 1
This page details my experience building the first project of course CS 428 at UIC, called “Things Will Never be the Same”. You can also find a video demo of the application at this link.
The project involves building and displaying certain augmented reality widgets that display useful information to the user. The widgets are displayed over image markers from the Vuforia Mars dataset. The widgets include -
- Date widget over the Astronaut image marker
- Time widget over the Drone image marker
- Temperature and humidity widget over the Fissure image marker
- Wind speed and direction widget over the Oxygen image marker
- Weather condition widget over the Mars Box - Front image marker
These widgets have their own models (made by me) that are displayed over their respective image markers. The information for the date and time widgets is obtained from the system that the application runs on, whereas all the weather information is received from periodic calls to the OpenWeatherMap API.
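The weather-fetching logic can be sketched roughly as follows. This is a minimal illustration in Python, not the project's actual Unity/C# code: the endpoint, query parameters, and response fields shown are taken from OpenWeatherMap's public current-weather API, and the city name is an arbitrary placeholder.

```python
import json
from urllib.parse import urlencode

# OpenWeatherMap's current-weather endpoint (public API).
API_BASE = "https://api.openweathermap.org/data/2.5/weather"

def build_request_url(city, api_key, units="imperial"):
    """Build the query URL a handler would poll periodically."""
    return f"{API_BASE}?{urlencode({'q': city, 'appid': api_key, 'units': units})}"

def parse_weather(payload):
    """Extract the fields the widgets consume from a current-weather response."""
    data = json.loads(payload)
    return {
        "temperature": data["main"]["temp"],
        "humidity": data["main"]["humidity"],
        "wind_speed": data["wind"]["speed"],
        "wind_deg": data["wind"]["deg"],
        "condition": data["weather"][0]["main"],
    }

# Example response trimmed to only the fields used above.
sample = ('{"main": {"temp": 71.6, "humidity": 40},'
          ' "wind": {"speed": 8.0, "deg": 270},'
          ' "weather": [{"main": "Rain"}]}')
```

In the actual project this request is issued by the RequestHandler game object on a 30-second timer; here the response is a canned string so the parsing can be shown without a network call.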
Running the Code
To run the code, you need the following on your computer -
Once you have satisfied the above requirements, go to the GitHub repo for the project and clone it. Obtain copies of the above-mentioned image markers used in the project. Open the project in the Unity editor and find the RequestHandler game object in the scene. Add your OpenWeatherMap API key to the API Key field in the inspector for the RequestHandler game object; without this key, the application will not be able to query weather data from OpenWeatherMap. Once these steps are complete, enter Play Mode in Unity to run the application.
As mentioned above, the time and date widgets get their data from the system that the application is running on. The models are a calendar-like object that displays the current date and a capsule-shaped clock that displays the time. The application can show information in either imperial (default) or metric units, toggled by pressing the U key. The date is shown in MM/DD/YYYY format for imperial and DD/MM/YYYY for metric.
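The date-format toggle described above amounts to switching between two format strings. A minimal sketch (in Python rather than the project's C#, with standard strftime codes):

```python
from datetime import date

def format_date(d, metric=False):
    """Imperial shows MM/DD/YYYY; pressing U toggles to metric DD/MM/YYYY."""
    return d.strftime("%d/%m/%Y") if metric else d.strftime("%m/%d/%Y")

d = date(2021, 9, 3)
print(format_date(d))               # → 09/03/2021 (imperial default)
print(format_date(d, metric=True))  # → 03/09/2021 (after toggling units)
```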
Next are the temperature and humidity widgets. The temperature is displayed using a bulb-like thermometer whose red mercury level rises and falls with the current temperature, whereas humidity is shown as a beaker filled with water according to the humidity percentage. You can press U to switch from Fahrenheit (imperial) to Celsius (metric).
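The mapping from a weather reading to a visual fill level can be sketched as below. The temperature scale bounds (-30 °C to 50 °C) are illustrative assumptions, not values taken from the project; the unit conversion is the standard formula.

```python
def fahrenheit_to_celsius(f):
    """Standard Fahrenheit-to-Celsius conversion (the U-key toggle)."""
    return (f - 32.0) * 5.0 / 9.0

def thermometer_fill(temp_c, lo=-30.0, hi=50.0):
    """Normalize temperature to a 0..1 mercury-column fill, clamped to the scale.
    lo/hi are assumed scale bounds for illustration."""
    return min(1.0, max(0.0, (temp_c - lo) / (hi - lo)))

def beaker_fill(humidity_pct):
    """Humidity is already a percentage, so the water level is just pct/100."""
    return humidity_pct / 100.0
```

In Unity this fraction would then drive the local Y-scale of the mercury or water mesh.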
The wind speed and direction widget is represented using a flagpost. The height of the flag on the post varies with the wind speed, and the flag rotates to match the wind direction. To get an accurate direction, place the top of the Oxygen image marker in line with North, because the flag orients itself relative to the top of the image marker. Wind speed can be shown in mph (imperial) or m/s (metric).
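The flag's two degrees of freedom can be sketched as follows. The 20 m/s cap is an illustrative assumption; the yaw formula assumes the API reports the meteorological bearing the wind blows *from*, so the flag streams the opposite way.

```python
def flag_height(speed_mps, max_speed=20.0, max_height=1.0):
    """Scale flag height with wind speed, capped at the top of the post.
    max_speed is an assumed saturation point for illustration."""
    return max_height * min(1.0, speed_mps / max_speed)

def flag_yaw_degrees(wind_deg):
    """OpenWeatherMap reports the direction the wind comes FROM;
    a flag streams the opposite way, measured relative to the
    marker's top edge (assumed aligned with North)."""
    return (wind_deg + 180.0) % 360.0

def mps_to_mph(speed_mps):
    """Metric-to-imperial conversion for the U-key toggle."""
    return speed_mps * 2.23694
```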
The weather condition widget displays one of 9 different weather conditions listed here. Each weather condition has a separate model to represent that condition, as well as a unique, looping sound effect taken from SoundBible. The sound volume decreases as the distance from the model increases. Some of the models are animated, using particle systems to achieve effects like rain, snow, and fog. The weather conditions can also be cycled through using the Space key after the current condition is initialized, though doing so means the user loses track of the actual current condition.
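The Space-key cycling and distance-based audio can be sketched as below. The condition names are placeholders (the project's actual list is linked above, not reproduced here), and the inverse falloff is one simple choice; Unity's AudioSource offers its own logarithmic and linear rolloff modes.

```python
# Placeholder names standing in for the project's 9 weather conditions.
CONDITIONS = ["Clear", "Clouds", "Rain", "Drizzle", "Thunderstorm",
              "Snow", "Mist", "Fog", "Wind"]

def next_condition(current_index):
    """Space key advances to the next condition, wrapping around."""
    return (current_index + 1) % len(CONDITIONS)

def volume_at_distance(distance, rolloff=1.0):
    """Simple inverse falloff: full volume at the model, fading with distance."""
    return 1.0 / (1.0 + rolloff * max(0.0, distance))
```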
The Use of Widgets in a Future with AR Glasses
- There is a lot of potential for widgets in a future where everyone is wearing AR glasses. Imagine if, instead of checking your phone every time, you had an augmented widget in the corner of your vision that always shows the time. The same can easily be imagined for similar types of information such as the date, the weather, etc.
- Another useful widget would be one that displays messages in an augmented window of your vision. This would pair really well with speech recognition technology that could be used to dictate a response back to the sender. It would look very weird in public to see someone seemingly “talking to themselves”, but in a way this seems more doable than an AI assistant that interacts with you the way Jarvis does with Tony Stark.
- Health and fitness tracking apps can benefit a lot from an AR upgrade. A lot of the notification ecosystem we have today depends on the phone, so if we can bypass it and have the notifications appear on an AR glass, we can make sure that a dependent user will not miss, for example, an insulin dose, or an antibiotic dose.
- A big space that could be disrupted is advertising. As a customer strolls through the street and looks at various signs and logos, you could use them as image targets to receive extra information about a business or a brand. This is the natural next step from notification-based advertising on smartphones.
- A lot of existing productivity tools can be naturally enhanced with AR glasses. We rely on our phones or laptops to track things like to-do lists, which adds the extra step of interacting with the device and can itself become an unnecessary distraction. Having such productivity trackers augmented in our vision ensures that the trackers don’t defeat their own purpose.
- As always, one of the burning questions with regard to AR-based notifications and widgets is interaction. What would the interaction look like for each widget, and how do we ensure that a widget doesn’t pop up at unwanted times, for example while driving, and block our vision?
- Some of the widgets, like time and weather, need to be at a specific point in the display and not rely on an image target or a plane to be initialized. It can be argued that this can be achieved without the use of AR.
- Having widgets initialize on an image target or a plane enables us to view all angles of the widget, including undesirable areas such as the backside of a solid object. This doesn’t help if there isn’t useful information on that side.
- The control room marker has not been implemented. This is because I was not able to detect a key press on a virtual button at any level of sensitivity, either due to the poor quality of my camera or because of an unknown error that I did not have time to fix. As a result, I have gone with a keyboard key to change the units from imperial to metric, as opposed to a virtual button.
- Any changes in the units will only take effect after the next response comes back from the API. Since the RequestHandler game object waits 30 seconds between API calls, the user is advised to wait at least that long to observe the change.
Sound Credits
- Rain - http://soundbible.com/1999-Rain.html - by Pwlae
- Sunny Day - http://soundbible.com/1661-Sunny-Day.html - by stephan
- Outdoor Carnival - http://soundbible.com/2139-Outdoor-Carnival.html - by Daniel Simon
- Distant Thunder and Light Rain - http://soundbible.com/886-Distant-Thunder-And-Light-Rain.html - by Mike Koenig
- Monsoon - http://soundbible.com/948-Monsoon.html - by Mike Koenig
- Perfect Thunderstorm - http://soundbible.com/916-Perfect-Thunder-Storm.html - by Mike Koenig
- Snowing - http://soundbible.com/633-Snowing.html - by Mike Koenig
- Wind - http://soundbible.com/1810-Wind.html - by Mark DiAngelo