Qixin Yiwei founder and CEO Huang Tongbing

For VR, interaction is a key area of innovation, and GPU performance is a major bottleneck. On the former, startups keep releasing new interaction systems; the latter is where GPU giant Nvidia is focusing its efforts.

At Nvidia's GTC China 2016 on September 13, Qixin Yiwei, a domestic startup specializing in eye tracking, released the first eye-control accessory for the HTC Vive, bringing the technology to VR.

This product, billed as the world's first VR eye-tracking accessory, brings two major capabilities to VR: eye-control interaction and gaze-point rendering. The former adds a new way to interact in VR, while the latter improves GPU rendering efficiency through a new rendering method.

Eye control: an upgraded version of the gyroscope?

The earliest Cardboard had very simple interaction: the user turned their head to move the view and placed a cursor on an object to select it, which was achieved with the phone's built-in gyroscope. Eye tracking upgrades this interaction: besides turning the head, the user can also select objects simply by looking at them, even while the head stays still.
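To make the mechanism concrete, here is a minimal sketch in Python of how gaze-based selection can work: treat the gaze as a direction vector and pick the object whose direction lies within a small angular threshold. The object names, directions, and threshold are invented for illustration and are not from any real SDK.

```python
import math

# Hypothetical scene objects: name -> direction vector (x, y, z) in head space.
# Names, directions, and the 5-degree threshold are illustrative only.
SCENE_OBJECTS = {
    "menu_button": (0.0, 0.1, -1.0),
    "door":        (0.5, 0.0, -0.8),
    "painting":    (-0.6, 0.2, -0.7),
}

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def angle_between(a, b):
    """Angle in degrees between two direction vectors."""
    a, b = normalize(a), normalize(b)
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(dot))

def pick_by_gaze(gaze_dir, objects=SCENE_OBJECTS, max_angle_deg=5.0):
    """Return the object closest to the gaze direction, if within the threshold.

    With only a gyroscope, gaze_dir is just the head's forward vector; with eye
    tracking, it is the tracked eye direction, so selection works even while
    the head stays still.
    """
    best_name, best_angle = None, max_angle_deg
    for name, direction in objects.items():
        angle = angle_between(gaze_dir, direction)
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

if __name__ == "__main__":
    # The user looks slightly upward and straight ahead: the menu button is picked.
    print(pick_by_gaze((0.0, 0.12, -1.0)))
```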

Qixin Yiwei's eye-control VR accessory

To track the user's eye movements, Qixin Yiwei's accessory is installed inside the HTC Vive, layered on top of the headset's lenses. Each lens gets a sensor (camera) mounted below it and a ring of infrared fill lights around it, since the headset is fully enclosed; a USB Type-C port handles power and data transmission. According to Huang Tongbing, CEO of Qixin Yiwei, the camera sits at the bottom because the headset uses Fresnel lenses, whose concentric-ring texture would otherwise interfere with tracking, so the camera was placed below the lens.

In terms of tracking, the product supports full-field-of-view tracking covering the entire VR display; it runs at a 220 Hz tracking frequency with low latency; and it supports add-on corrective lenses, so near-sighted users can play without their glasses, which also keeps glasses from interfering with eye tracking.

To improve the availability of eye tracking, the company has introduced deep learning into its algorithms, according to Huang Tongbing.

Before deep learning, traditional machine learning methods such as SVM did not perform well, because the features were hand-crafted and often did not match real-world conditions. The advantage of deep learning is that the machine learns features automatically, without such constraints, so training on a large number of samples can produce a very powerful model that can then be used for eye tracking.


Previous technology could track the human eye with an availability rate of about 95%, meaning that out of 10,000 people, roughly 500 might not be able to use it. That is not good enough for the consumer market, though it works for specialized industries. Bringing eye tracking to consumers requires much higher availability, and deep learning can raise it from 95% to 99.9% or even higher.
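As a rough illustration of the deep-learning approach described above, the sketch below (assuming PyTorch) shows a small convolutional network that regresses a 2D gaze point directly from an infrared eye image, learning its own features from data instead of relying on hand-crafted ones. The architecture and input size are invented for illustration; Qixin Yiwei's actual model is not public.

```python
import torch
import torch.nn as nn

class GazeNet(nn.Module):
    """Toy CNN that regresses a normalized 2D gaze point from a grayscale eye image.

    Purely illustrative: the point is that features are learned from data
    rather than hand-crafted as in an SVM pipeline.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),  # 16x16 -> 8x8
            nn.ReLU(),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128),
            nn.ReLU(),
            nn.Linear(128, 2),  # (x, y) gaze point on the display
        )

    def forward(self, x):
        return self.regressor(self.features(x))

if __name__ == "__main__":
    model = GazeNet()
    eye_image = torch.randn(1, 1, 64, 64)  # batch of one 64x64 IR eye image
    print(model(eye_image).shape)          # torch.Size([1, 2])
```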

Developers care more about gaze-point rendering

Although eye-based interaction sounds cool, in actual use it is not as natural as motion controllers plus positional tracking; as noted above, it is more like an upgraded gyroscope. In fact, what developers expect most from eye tracking is gaze-point rendering. Huang Tongbing said developers' priorities for the system are, in order: gaze-point rendering, interaction, and eye-movement analysis.

Gaze-point rendering (also called foveated rendering) uses near-infrared sensors to track the eye and determine its fixation point, then renders at full quality only in the region around that point, which moves with the gaze. Built on top of multi-resolution shading, it further narrows the area that must be rendered at high resolution and greatly improves rendering efficiency.
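A minimal sketch of the idea: divide the frame into tiles and assign each one a shading rate based on its distance from the tracked fixation point, so only the foveal region is shaded at full resolution. The tile size, radii, and rates below are illustrative values, not Nvidia's multi-resolution shading API or Qixin Yiwei's implementation.

```python
# Sketch of gaze-point (foveated) rendering logic: choose a shading rate per
# screen tile based on its distance from the tracked fixation point.
# Tile size and radius thresholds are illustrative values only.

TILE = 64                      # tile edge in pixels
WIDTH, HEIGHT = 2160, 1200     # HTC Vive panel resolution (both eyes combined)

def shading_rate(tile_center, gaze_point):
    """Return the fraction of full resolution used to shade this tile."""
    dx = tile_center[0] - gaze_point[0]
    dy = tile_center[1] - gaze_point[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < 200:      # foveal region: full detail
        return 1.0
    elif dist < 500:    # near periphery: half resolution
        return 0.5
    else:               # far periphery: quarter resolution
        return 0.25

def build_rate_map(gaze_point):
    """Shading-rate map for the whole frame; rebuilt whenever the gaze moves."""
    rates = []
    for ty in range(0, HEIGHT, TILE):
        row = []
        for tx in range(0, WIDTH, TILE):
            row.append(shading_rate((tx + TILE / 2, ty + TILE / 2), gaze_point))
        rates.append(row)
    return rates

if __name__ == "__main__":
    rates = build_rate_map(gaze_point=(1080, 600))  # user looking at screen center
    full = sum(r == 1.0 for row in rates for r in row)
    total = sum(len(row) for row in rates)
    print(f"{full}/{total} tiles shaded at full resolution")
```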

The effect of gaze-point rendering

Huang Tongbing said that after a VR real-estate viewing developer adopted the system, "a scene that needed a GTX 1080 to run smoothly could, with eye tracking, run on a GTX 980, which greatly reduces their hardware costs; at the same time, visual quality can be improved, with effects such as complex lighting added." Of course, the actual gains depend on each developer's circumstances.

The platform that most needs reduced GPU load is mobile VR, which has to balance power consumption against performance. Huang Tongbing said, "Bringing this to mobile VR works on the same principle, but the structure needs to change. Mobile VR has a power-consumption limit, so the frame rate is usually reduced appropriately; it could actually run at full frame rate, but since mobile VR's frame rate is already low, there is no need to."

According to the company, power consumption with monocular tracking is under 500 mW, and lower still in power-saving mode. As for processing requirements, Huang Tongbing said Qualcomm's Snapdragon 820 can handle it, and "mainstream chips are no problem."

However, this product will not be sold as a consumer version. It is expected to launch on JD.com's crowdfunding platform in October at around RMB 3,000, aimed mainly at developers. Consumer-oriented products will be launched in cooperation with headset makers and built into their devices; the timing is still to be determined, but reportedly Qualcomm's all-in-one VR reference design will be a key channel for pushing the technology, and domestic vendors, including 3Glasses, are also very active.
