GTC 2014: Beyond the Keynotes
GTC 2014 wasn't all keynotes and talks. The event was also a time for the manufacturers and end-product distributors supporting the NVIDIA GPU line to show up and push their ideas and newest innovations to partners, enthusiasts and investors.
In between keynotes, we took the time to explore the exhibition hall and see what these innovations were.
The Eye Tribe
Ocular, or vision, control has been on the market for some time. However, the problem with this fledgling technology has always been latency: the response time between where our eyes look and what happens on screen has always hampered innovators from pushing the technology further. The primary reason behind this lag was simple: a lack of powerful hardware that could run the tracking efficiently.
Fast forward a few years, throw in a healthy amount of NVIDIA's GPU technology, and we get a company like the Eye Tribe making headway with its eye-sensing technology. Before you ask: no, the product isn't on the consumer market yet. It is available as a Software Development Kit (SDK) for US$99 (approx. RM330). The Eye Tribe's tracker was accurate in following our eyes, allowing us to scroll through a webpage merely by looking up and down the page and, more importantly, without any lag.
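The Eye Tribe's actual SDK API isn't shown here, but the scrolling behaviour we tried can be illustrated with a hypothetical sketch: the tracker reports where on screen you're looking, and the application maps gaze near the top or bottom edge to a scroll command. The function name and thresholds below are our own assumptions, not part of the real SDK.

```python
def scroll_action(gaze_y, viewport_height, margin=0.15):
    """Map a vertical gaze position to a scroll command (hypothetical sketch).

    gaze_y: vertical gaze coordinate in pixels (0 = top of screen).
    viewport_height: screen height in pixels.
    margin: fraction of the screen near each edge that triggers scrolling.
    Returns "up", "down", or None.
    """
    if gaze_y < viewport_height * margin:
        return "up"        # looking near the top of the page: scroll up
    if gaze_y > viewport_height * (1 - margin):
        return "down"      # looking near the bottom: scroll down
    return None            # gaze in the middle: leave the page alone
```

With a low-latency tracker feeding this loop many times a second, the page appears to follow your eyes, which matches what we experienced at the booth.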
What really took the cake for us was that you could actually play games with the Eye Tribe. The representative demonstrated a round of Fruit Ninja using nothing but his eyes, and he didn't miss a single fruit. Talk about deadly accuracy.
The Oculus Rift
Yes, the gaming world was gripped by the shocking news of Facebook's sudden and curious acquisition of Oculus VR. Even more surprising was the absence of its CTO, John Carmack, from the event altogether. But that didn't stop the company from exhibiting its VR headset.
What we tested at the Oculus VR booth wasn't the first-generation Oculus Rift either. This was Crystal Cove, the second-generation development kit, which has a display resolution of 960 x 1080 for each eye, for a combined 1920 x 1080 across both eyes. Simply put, where the first-generation Oculus Rift only had a maximum resolution of 720p, Crystal Cove manages 1080p, which is Full HD.
Our time with the demo units gave us a chance to play a little mini-game against another tester, in which we went at each other with sword-and-shield characters. Sadly, we couldn't record our gameplay, so you'll have to take the word of our writer, who was absolutely smitten by the device.
There's no word as to when the Oculus Rift will officially be available to the world, and we believe it will be a while longer before the company makes an announcement. That being said, we're definitely still on the edge of our seats.
Tegra and Electric Cars
The Audi that rolled out on stage during Jen-Hsun's keynote wasn't just for show-and-tell. In fact, Audi isn't the only manufacturer making cars fitted with NVIDIA embedded systems.
After the keynote, we were shown the way outside the convention center, where we got the chance to sit in cars from a few of the partners that have signed up with NVIDIA. There was Audi, BMW, and even Tesla. Now, we're not biased, but it's not every day that we get to sit in a fully electric car like a Tesla (no, hybrid cars do not count; that's why they're called hybrids).
Unfortunately, our writer couldn't drive the car, chiefly because he didn't have an international license. Luckily, there was an on-site handler who gave us a ride. The model we rode in was the Tesla P85. The inside was extremely spacious, and because the car is fully electric, it was dead silent. We're not joking: when you get in, there's absolutely no indication that the car is operational. All you get is a dashboard that lights up once you start the ignition, and that's it.
At this point, some of you are thinking, "Oh, it's an electric car. It's definitely not fast." Well, we've never been so glad to prove the skeptics wrong. The P85 we sat in was tuned to generate 485HP (not brake horsepower), giving it a top speed of 269km/h. In short, the car was pretty much on par, in terms of speed and power, with the Aston Martin V12 Vantage or the Bentley Continental GT.
It goes without saying that every piece of electronics, from the windshield wipers to the car's dashboard, is powered by NVIDIA's Tegra K1 mobile processor. The most notable piece, however, was the built-in control panel. At best, we were expecting a panel no bigger than two 5-inch phablets beside the driver. Instead, we were greeted by a huge capacitive display panel measuring roughly the size of four full-sized iPads. From it, we could operate the on-board GPS and navigation, activate the mobile data system, and run a whole lot of other programs.
Now, with all that said and done, here's the bad news: the cars and electric technology we witnessed and experienced that day will not, sadly, be coming to our shores anytime soon. But don't give up hope just yet, because, keep in mind, Audi is still one of the many automobile manufacturers with a strong presence in Malaysia and could very well push for the car's availability in the country.
The Rise of Mobile Supercomputers and Driverless Cars
While still on the subject of cars, it is at this point that we would draw your attention away from those electrically powered cars and, for the moment, steer towards another application of the NVIDIA Tegra K1 SoC.
It was right after the keynote on Day 2 that we were given a special (if brief) tour of the company's endeavors in ADAS (Advanced Driver Assistance Systems); Facial Detection and Driver Monitoring; and the UI Composer Studio with Material Definition Language.
If this all sounds Greek to you, stay with us: we'll explain each one to the best of our abilities. Let's start with the first one: ADAS.
ADAS, in a nutshell, is essentially what its name suggests: a system designed to assist the driver on the road in real-time. The concept has been circulating in automobile circles for a while now; to be precise, it's been around since Mercedes first implemented the idea between 2005 and 2006.
Over the years, the technology has evolved; companies have advanced the idea and made it more intuitive and extremely adaptive to the road. In the demo shown to us, our handler showed how they had made ADAS a system that could tell you certain things about your driving: if your car veered too much to one side of the lane, the system would alert you to it. As another example, if you were driving in a 50mph (approximately 80km/h) zone, the system would capture an image of the sign and plaster it onto your dashboard interface.
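NVIDIA didn't share how its demo computes these alerts, but the lane-departure part can be sketched in simple terms: once the camera has detected the two lane boundaries, the system only needs to check how far the car has drifted from the lane's centre. Everything below (function name, tolerance value) is our own illustrative assumption, not NVIDIA's implementation.

```python
def lane_departure_alert(car_center, lane_left, lane_right, tolerance=0.25):
    """Crude lane-departure check (illustrative sketch, not NVIDIA's code).

    car_center: lateral position of the car (metres, arbitrary origin).
    lane_left / lane_right: detected lane-boundary positions (metres).
    tolerance: allowed drift, as a fraction of half the lane width.
    Returns True if the car has veered too far from the lane centre.
    """
    lane_center = (lane_left + lane_right) / 2   # middle of the lane
    half_width = (lane_right - lane_left) / 2    # centre-to-boundary distance
    return abs(car_center - lane_center) > tolerance * half_width
```

The hard part in a real ADAS, of course, is detecting those boundary positions from video in real-time, which is exactly where the Tegra K1's GPU horsepower comes in.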
At this point, we should remind you that ADAS is a system that detects the cars in front of you and how fast you're actually supposed to be driving within the zone. The question is: how does it manage to detect all this?
If your answer was "through a camera," you are correct. This was the next part our minder presented to us. Dubbed the Facial Detection and Driver Monitoring module, it can best be described as the eyes of the entire ADAS system. Funnily enough, the module was also shown to serve a more driver-oriented purpose. As our minder stood directly in front of the camera module, we noticed lines popping up directly on his face (on the screen, not his actual face). The lines, as explained to us, indicated when the subject's eyes were on the road and when they were facing elsewhere. The detection rate was astounding, with minimal lag each time.
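The final attention decision can be pictured as a simple threshold on the head pose that the facial-detection stage estimates: if the driver's head is turned too far from straight ahead, the eyes are presumably off the road. This is purely our own sketch of the idea; the function and its angle limits are hypothetical, and NVIDIA's module certainly does something far more sophisticated with the facial landmark lines we saw.

```python
def eyes_on_road(yaw_deg, pitch_deg, yaw_limit=20.0, pitch_limit=15.0):
    """Crude driver-attention check from an estimated head pose (hypothetical).

    yaw_deg: head rotation left/right in degrees (0 = facing straight ahead).
    pitch_deg: head rotation up/down in degrees.
    Returns True if the pose suggests the driver is watching the road.
    """
    # Within both angular limits, treat the driver as attentive.
    return abs(yaw_deg) <= yaw_limit and abs(pitch_deg) <= pitch_limit
```

Run many times per second on the camera feed, a check like this is what lets the system flag, with minimal lag, the moment the driver's gaze wanders.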
The third part of the system leaned more towards a cosmetic function. Focusing on the design and look of the vehicle's dashboard and panel interface, this demonstration was a direct mirror of what we had seen inside the electric car we had ridden in earlier.
There's a reason we've mentioned all this. Remember how, during the first day's keynote, Jen-Hsun Huang brought out a driverless car? Well, the culmination of all that technology is the self-driving car you see below:
Yes, that is the exact same car that drove itself on stage, and yes, it is using all of the aforementioned technology.
Again, Audi isn't the only car company taking part in this NVIDIA initiative. Hyundai, Aston Martin, BMW, Tesla, and Bentley are just a few of the automobile manufacturers working with NVIDIA on the development of driverless cars.
TITAN Blacks, Gaming Notebooks, and Framerates
We've said it time and time again: it's just not an NVIDIA event without a mention or display of the company's graphics cards.
Aside from the announcement of the TITAN Z during the first day's keynote, we spotted something else equally impressive: a gaming rig fitted with four TITAN Black graphics cards. As a quick recap, the TITAN Black is the successor to the original TITAN and features some of the same performance tweaks found on the GeForce GTX 780 Ti.
The rig demonstrated its prowess by running a racing simulation flawlessly across three 50-inch Full HD TVs. Obviously, there was no lag, considering the power on tap. What was most impressive, though, was how quiet the rig remained, even with all the ambient noise in the background to mask it, despite running a full load in the graphics department.
Aside from that, NVIDIA also showcased the latest notebooks fitted with its latest GeForce GTX 800M series discrete graphics. Tucked away in one corner, we came face to face with Alienware's newest gaming notebook, as well as MSI's GS60 Ghost Ultrabook. To show off their graphical prowess, both notebooks were running the newly released multiplayer game, Titanfall.
As an added bonus, we actually got a chance to play the game, though not on the notebook; we played it on the NVIDIA SHIELD. This was the first time in HardwareZone Malaysia's history that we had a chance to field test the device's game-streaming function, and frankly, we were not disappointed. Controls were sharp, and the lag in between transmissions was virtually non-existent. Unfortunately, none of these gaming boons changed the fact that our writer didn't know how to play the game properly and, in turn, lost more than a couple of matches.
NVIDIA also exhibited some PC displays fitted with G-Sync technology. We won't get too much into it here, but those of you who are unfamiliar with the technology can read up on it in our previous coverage here.
And with that, this article pretty much sums up the more interesting happenings in between each keynote. NVIDIA's GTC 2014 is definitely one of the events we most looked forward to this year, and here's hoping we get a chance to cover it again in April 2015.
You can check out our coverage of all three days here.
For more news from NVIDIA, please follow us here.