As I’ve mentioned several times before, I am a student worker at my university’s library. In just four years, I have seen plenty of changes come about. The computer system my department uses has changed, our methodologies for certain tasks have changed, and my department has completely moved spaces in recent months. Right before I went to college, a café opened in the library, the full collection of books was moved between floors, and a computer lab was put in.
All in all, the library has seen a lot of change, and in my time I haven’t paid most of it much mind. However, one change in particular has really captured my attention.
Last summer, around a fifth of the shelf space for books and journals was removed, and the books were reorganized into a smaller space. It was strange returning to school last fall to find a huge empty space in the area where I usually worked. I was curious about why so many shelves had been removed, but didn’t think too much about it. A classmate later told me that the space was being cleared for a virtual reality center.
So, in the months since, I have watched construction slowly advance in this part of the library. The area was taped off from students, walls and supports went up, and eventually the entire space was closed off. I held out hope that I would get to see the virtual reality technology before I graduated this spring.
The virtual reality center opened between semesters, but mostly for department and student workers. Still, after a few emails back and forth, I paid a visit to the Tennessee Technological University HIVE last week.
HIVE stands for Hybrid Immersive Visualization Environment. The area is part of a brand-new initiative intended to facilitate research in a number of areas among students and faculty alike. While visiting as part of a group, I was able to see some of the technology and research firsthand.
As a heads up, I won’t be able to show you what the demos looked like, as some were made in-house at the HIVE. Plus, there’s no way to recreate the feel of virtual reality in writing, but I’ll describe it as best I can.
After hearing so much about the Oculus Rift over the past few years, with its promise of affordable virtual reality, I finally got to experience it for myself. I was shown a tech demo created in Unity and Maya by the HIVE’s interns and student workers in collaboration with the Tennessee Aquarium in Chattanooga, Tennessee.
I was informed that the HIVE’s Oculus Rift was the latest, most up-to-date model. I didn’t know what to expect from the headset, as I had never used one before. Overall, it was a comfortable fit. The head straps felt fine and the device itself wasn’t very heavy, so prolonged use didn’t seem like it would strain my neck. I arrived ahead of the standard tour and was given extra time with the device. Though I only used it for a few minutes, I never felt light-headed or dizzy. The demo environment did look blurry and out of focus for the first few seconds after I put on the headset, but this cleared up quickly. I’m not sure if my eyes simply needed to adjust to the Oculus or if the attendant modified the settings.
The demo started out in a sunny park enclosed on all sides by green mountains. A looped stream ran through the middle of the park, with a small hill in the center and lush trees scattered about. The attendant dropped me in the water, where movement was automatic and I simply floated along. There were large boulders in the water, various flora, and a variety of fish swimming about.
What impressed me most was that I had completely free movement to look around in the water. Fish swam right by me and I could turn my head and track their movements. I looked up and saw the sun shining through the shimmering water and traces of the blue sky above. It was an odd sensation, because I felt like a floating head: I looked down and around and didn’t see my body.
The demo will eventually be used in the Tennessee Aquarium as part of an exhibit for children to explore. Its purpose is to show the effects of pollution on the environment. While I was floating through the stream, the attendant altered the state of the water, making it murky and filled with algae. She clarified that this was not the latest version of the demo. In the most recent version, sound had been added, the wildlife were all native to Nickajack Lake near Chattanooga, Tennessee, and the effects of pollution were more visible, with fish even dying and floating to the surface.
Moving on, I joined a group and saw a demonstration of the VisCube in action. I hadn’t heard much about this piece of technology, so I didn’t quite know what to expect. We were taken into a totally dark side room and received a ten-minute demonstration. The VisCube is composed of four large screens: one on each of three walls and one on the floor. We were informed that Tennessee Tech is the only institution in the state of Tennessee with a VisCube.
As we entered the room, we were each given a pair of glasses. These were similar to the 3D glasses usually handed out at a 3D movie, but felt much sturdier. I wear glasses, but was able to fit the 3D glasses comfortably over them. One pair of glasses was different from the rest, with sensors across the top.
Inside the VisCube, the sensor-equipped pair of glasses controlled the perspective of the image on the screens. The wearer could move their head, like with the Oculus Rift, and the image would change. Everyone else, in the standard glasses, saw a 3D image of what the person in the center saw. In other demos, the 3D environment was static and the glasses were instead used to look around, but not change, the environment.
The first demo we saw was a 3D model of the human heart. The person on the platform could move around and view the heart from different angles, while everyone else saw what he saw. By leaning forward, the man could actually stick his head inside the heart and see all the inner parts: the atria, ventricles, and so on. The attendant said another version of the demo animated a beating heart and showed the blood flowing through it.
The second demo was a recreation of Derryberry Hall, the main building on Tennessee Tech’s campus. Student workers had faithfully recreated the surrounding grounds and interior halls of the building. They did some very impressive work; it looked true to life, with the hall proportions matching and the rooms all accurately laid out. The attendant noted that bumping had not been added to the demo yet, which gave it an airy feel. Bumping refers to the physical solidity of the environment: with bumping, you would rise as you walked up a set of stairs, and the walls would be solid. As it was, you could walk straight through walls in the demo.
Later, we were able to talk to an intern working at the HIVE. She is a computer science major, and her project for the semester was to add bumping to the Derryberry Hall demo in Unity. I’m not in computer science myself, but it looked like a lot of hard work, and she’s doing an impressive job.
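For the curious, here’s a rough idea of what bumping involves under the hood. This isn’t the HIVE’s actual code (the demo was built in Unity, which handles this with colliders); it’s just a minimal Python sketch of the core idea, axis-aligned bounding-box collision: represent walls as boxes and refuse any move that would overlap one.

```python
# Minimal sketch of "bumping" (collision detection) in 2D.
# A box is a tuple (min_x, min_y, max_x, max_y).

def overlaps(a, b):
    """True if axis-aligned boxes a and b intersect."""
    return a[0] < b[2] and a[2] > b[0] and a[1] < b[3] and a[3] > b[1]

def try_move(player, dx, dy, walls):
    """Return the player's new box, refusing moves that would enter a wall."""
    moved = (player[0] + dx, player[1] + dy, player[2] + dx, player[3] + dy)
    if any(overlaps(moved, wall) for wall in walls):
        return player  # bumping: the wall is solid, so the player stays put
    return moved       # without this check, you'd walk straight through

walls = [(5, 0, 6, 10)]   # one wall segment spanning x=5..6
player = (0, 0, 1, 1)     # a 1x1 player box at the origin

player = try_move(player, 3, 0, walls)  # open floor: move succeeds
print(player)                           # (3, 0, 4, 1)
player = try_move(player, 2, 0, walls)  # would enter the wall: blocked
print(player)                           # still (3, 0, 4, 1)
```

A real engine does this in 3D with sliding, gravity, and stair-stepping on top, but the “solid wall” behavior the attendant described comes down to checks like this one.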
I was given time with the VisCube during the third demo, an in-game map from Quake: the environment alone, with no characters or items. The attendant asked me if I got motion sickness and I said I didn’t. I usually don’t, but I did start to feel a bit light-headed in this demo. She moved me around the environment, up and down stairs and around corners, using a handheld controller that projected a white line onto the screens; the device seemed to control movement through the map and steer the camera.
When she stopped, I could freely look around. It felt like I had been dropped directly into the game. I peered around a street corner and could sense the depth between the wall and the street. She stopped me on a bridge and mentioned that I could look over the edge. So, I stepped forward and saw the ground beneath my feet. Incredibly, there was real depth between the ledge and the ground below.
Unannounced, she moved forward and dropped me off the ledge, about a twenty-foot drop to the ground. The movement was incredible. Though I of course had my feet on the ground the whole time, it absolutely felt like I was falling. My body tensed and my legs felt weak as the avatar fell to the ground. I could feel my body involuntarily bracing for the impact, all in a span of a second or two. The demo had an amazing psychological impact.
This was real virtual reality. Whenever I stood still, it felt like I was really in the game, immersed in environments that felt totally real. I had roughly a fifteen-foot square of free movement. Staring out into the distance, it really felt like I could start walking in that direction. I would soon hit a real wall, of course, but it certainly looked and felt real.
I traded glasses with another attendee, who was shown more of the map. I felt a bit envious as the tour guide took him straight up in the air, where he hovered a good seventy-five feet above the ground. I wonder what that must have felt like. If my fall felt so real, what would it feel like to fly up and around the map? What would it be like to look down and feel as though you are high above the ground? Honestly, I wish I could have stayed in the VisCube all day.
When I left the HIVE after the demonstrations, I felt energetic, like I had just seen a small glimpse of something bigger. The entire tour only took about twenty minutes, but felt much longer. Before the tour, I had only seen brief glimpses of virtual reality. I remember when Avatar first came out and I was hyped for months beforehand. The plot was nothing special, but it did what no film had done before and fully showed off what a 3D film could look like. The film was so engrossing that it felt like I was really there. The world was so lifelike and real that it was on my mind for days.
Visiting the HIVE was a similar experience.
Before visiting the HIVE, I hadn’t really considered the dual uses of virtual reality. I was excited to see how the technology could be used for video games, but I hadn’t given much thought to its other applications. Watching the demos and hearing people’s reactions gave me a great glimpse of how this technology can be used for research and education.
The VisCube could certainly be used by medical students studying organs and the human body. One demo we didn’t see was similar to the heart demo, but on a larger scale: it featured the whole human body. Someone could view the body from all angles and even stick their head inside to see all its inner workings up close. Something like that is more effective than looking at static images in a textbook and much cleaner than dissecting actual organs for study.
When going through the Derryberry Hall demo, we were taken through the main lecture hall. We were told that it was possible to stand on the stage and fill all the seats with human avatars. Imagine being able to practice a speech in front of a realistic crowd in preparation for the real thing. That could certainly help with stage fright.
I think it is great that a demo is being prepared for use in an aquarium, where children can see the realistic effects of pollution on an environment. Combining fun and education can be a great means of promoting learning, and given how realistic 3D technology can look, it provides an excellent medium for it. Hopefully, the Oculus Rift fulfills its promise of cost-friendly, effective virtual reality.
Aside from the vast educational applications of this technology, I couldn’t stop thinking about how it could be used in a video game. I would love to see other maps similar to the one from Quake. I wouldn’t care if they were completely empty; it would be an absolute joy to walk through the environments and take in the beauty.
While there has been controversy over whether some games are more of a passive experience than something truly interactive, this technology is begging for more experiences like those. Imagine something out of BioShock: walking through Rapture or Columbia and seeing real depth, whether in a submerged city or a city floating in the sky. Or how about an open world like Skyrim? Exploring haunted houses would probably be too realistic for me, but would provide a challenge for braver souls. Or something like No Man’s Sky, exploring the cosmos and distant worlds with 3D depth and realistic environments?
I realize that technology like the VisCube is still too expensive for the average consumer, and between cost and size it could take years for something like it to reach home video games. Hopefully, virtual reality will become cheap enough for more people to fully enjoy. Still, the video game industry changes rapidly: graphics today are leaps and bounds beyond games from twenty years ago. Who’s to say we won’t have something like the VisCube in every home?
After seeing the research being conducted at the HIVE, I am excited to see how it will continue to be used in the future, both in education and entertainment. The sky’s the limit for how this technology can be utilized.