Pratima H
INDIA: Most of us have spent the better part of our childhoods making Super Mario jump from one ladder to another, slaying enemies along the way, chasing pots of gold, bouncing high for bonus rewards and trying hard to evade the ambush attacks that sap his energy. What for? Let us know if you ever figure that out.
But on a fascinating second thought, a Mario reincarnated in a new avatar would not be traipsing through those jungles and dungeons anymore. Today, the backdrop would have to be a topology of clouds.
After all, think of anything – desktops, tablets, apps, documents, selfies, air tickets, presentations, songs, movies – almost everything today is parked on some cloud. Why should graphics heroes be left behind, then? But clouds can steal points with the same agility with which they shower them. Is it easy to hop onto the silver lining while skipping past the slippery floors?
An interview with Rohit Biddappa, Head of Enterprise Marketing at NVIDIA, helps us read the game better as we understand how graphics acceleration, virtualization, the cloud, application virtualization and new chip-level trends, as well as questions around Moore's Law, are redefining the plot of graphics computing as we know it.
How has NVIDIA shaped up across the new computing landscape that has surfaced in the last few years?
As you know well, we are a visual computing player. From the 1990s, when we brought in breakthroughs ranging from the GPU (Graphics Processing Unit) to parallel processing, to current advances in the cloud, we have come a long way. Apart from the leaps made on the gaming side and in consumer markets, we have also shown enterprise markets what we can do with Quadro processors for CAD (Computer-Aided Design) and CAM (Computer-Aided Manufacturing), and with rich design capabilities. In the recent change the industry has gone through, a lot is happening around graphics in the cloud. Some years back only mainstream applications used gaming GPUs, but a lot has changed. We have been putting a lot of effort into R&D and into collaborations (with virtualization players like VMware and Citrix) on new advancements like bringing GRID into the mainstream market.
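For readers curious what the GPU's parallel-processing model actually looks like, here is a minimal, illustrative CUDA sketch (our addition, not part of the interview): a SAXPY kernel in which each of roughly a million GPU threads updates one array element, instead of a CPU working through them one by one.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY: y[i] = a * x[i] + y[i]. Each GPU thread handles one element,
// so the whole array is updated in parallel rather than in a serial loop.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                     // about one million elements
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory: visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // ~4096 blocks of 256 threads
    cudaDeviceSynchronize();

    printf("y[0] = %f (expected 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

The same data-parallel style underpins the CAD, CAM and rendering workloads discussed here, which is why they move to GPUs so naturally.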
Graphics on the cloud – why?
NVIDIA invented the graphics processor, or GPU. It has now pioneered vGPU, or virtual GPU, technology, which enables hardware sharing of graphics processing. The end user no longer needs a workstation and can access graphics from any thin client; from the comfort of any PC, tablet or phone, users can work on graphics and collaborate through the cloud. Some businesses, such as those handling confidential car designs, can also navigate tough IP security issues by allowing visibility into a project's progress without granting access that could jeopardize the IP. GRID works well for enterprises that see benefit in a private cloud. The case for graphics virtualization rests on the rising demand for greater compute power in datacenters and workstations, and on the low efficiency of current VDI options when multiple enterprise users bring differing workloads. Moreover, all types of information and data are becoming graphics-intensive, which makes it challenging to virtualize compute resources without compromising performance.
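One practical consequence of hardware sharing, as a side note: applications inside a guest VM see the vGPU as an ordinary graphics device. Assuming a guest with the NVIDIA driver and CUDA toolkit installed (our assumption, not a claim from the interview), a standard device-query sketch like the one below is a common way to inspect what the virtual device exposes.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Enumerate the GPUs visible to this (possibly virtual) machine.
// Under vGPU, a guest VM sees its slice of the physical GPU as a
// regular CUDA device, so stock code runs unmodified.
int main() {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess || count == 0) {
        printf("No CUDA device visible: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        printf("Device %d: %s, %zu MB memory, %d multiprocessors\n",
               d, prop.name, prop.totalGlobalMem >> 20, prop.multiProcessorCount);
    }
    return 0;
}
```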
What do you bring to this new table?
NVIDIA has been trying to plug this gap: by virtualizing the GPU, it delivers greater compute power to enterprise users without any significant hardware investment. It is a more efficient and scalable form of desktop virtualization, and it helps businesses eliminate silos in IT access. We have partnered with companies like VMware to enable graphics-accelerated desktop virtualization as a service, and with Google to integrate the technology into future Chromebooks for the enterprise. Our proprietary solution, GRID, provides instant access to powerful applications and collaboration on the go. I strongly feel that with NVIDIA GRID vGPU supported in VMware Horizon 6 built on vSphere, the economics and scalability of the virtualization solution are even better.
Would the Indian market welcome this?
We started looking at the Indian market early last year. It is still at an early stage but is showing a lot of potential. We have several pilots lined up in India and are at the evangelizing stage now. The typical candidate is an organization on a three-year workstation refresh cycle that might be considering an infrastructure concept like this. We feel companies will go about it in a phased manner, and those with IP concerns could be the early adopters.
Limiting access serves security but does it not adversely affect collaboration?
There is enough scope for reviews and collaboration, but once a user logs off, the data is off limits. This also guards against the dangers of lost or stolen devices. It is compatible with all major visual applications and allows data to be used in a collaborative context.
Where does application virtualization fit in with the way consumption models are changing?
VDI captures the move of applications off desktops, and it is being used for general business apps. GRID is a different level of abstraction, because visual computing is a different ballgame. You can virtualize large, complex data sets quite easily today.
Is GRID the same as what supercomputing has long been dabbling in?
The two are related but not directly connected. For instance, an NVIDIA Tesla processor can be a complete solution for crunching large data sets very quickly, but some applications, like car design, may use different kinds of computing.
So many new buzzwords have started dotting the chip segment, like HBM (High Bandwidth Memory), where Samsung, Hynix and AMD have a head start; the Hybrid Memory Cube, where Micron is taking a leap; or Intel's TSV (Through-Silicon Via) work. How is NVIDIA grappling with this new topology, and what context does the upcoming Pascal architecture take here?
NVIDIA has been focusing on technology in the specific landscape of visualization. We have identified the cloud as a major driver and GRID as part of the solution. We will be launching a SaaS gaming model for consumer markets, along with things like 4K TV products and convergence devices. We are now moving strongly and briskly towards complete solutions instead of just compute ingredients. Examples like self-driving cars, Tesla's offerings, and machine learning and cognitive abilities in advanced automobiles prove that point.
What about the debate on the relevance of Moore's Law in current times? Is software replacing hardware there?
That is a moot point. Software today is an always-connected ingredient. Today's apps are complex, powerful and huge, but hardware is still a crucial factor. I feel that optimizing software without hardware is not possible.