June 1, 2021, 2:30 p.m. | Tobias Schlichtmeier
The »GeForce RTX 3080 Ti« and »GeForce RTX 3070 Ti« are the basis for the games of the future.
Nvidia's Jeff Fisher and Manuvir Das joined Computex in Taipei virtually. Using Microsoft's Flight Simulator rendered on a GeForce RTX 3080, the team had specially recreated the city of Taipei. You can find out here which innovations Nvidia has in store for gamers and companies.
Jeff Fisher, senior vice president of Nvidia's GeForce business, announced two new gaming GPUs, the »GeForce RTX 3080 Ti« and the »GeForce RTX 3070 Ti«. They are said to be the basis for the gaming of the future, but are equally intended for designers and students. RTX already accelerates more than 130 games and apps, and more games will follow, for example Icarus and Doom Eternal. According to Fisher, the RTX 3080 Ti is 1.5 times faster than its predecessor. It will be available starting June 3 at prices from $1,199, while the RTX 3070 Ti will be available starting June 10 at $599.
Manuvir Das, head of enterprise computing, announced the certification of new servers for use with Nvidia's AI Enterprise software, among other announcements. The company is thereby expanding its Certified Systems program to more than 50 systems. Essentially, the GPU manufacturer wants to strengthen three areas: the hardware base, the software platform for artificial intelligence (AI), and the software platform for collaborative design.
To help system builders create AI-optimized designs, the company has launched the »Nvidia-Certified« program. It is aimed at servers with GPU acceleration, Das explained. »It's time to democratize AI by making its power accessible to every business and its customers«, Das said.
With the »Base Command« platform – a cloud-based development platform – companies can implement AI projects with little effort. The software is designed for large-scale AI development workflows with multiple users and teams, hosted either on-premises or in the cloud. It enables researchers and data scientists to work simultaneously on accelerated compute resources, helping organizations maximize the productivity of both their developers and their AI infrastructure.
Google Cloud is among the first cloud service providers planning to enable Nvidia's Base Command platform for managing and orchestrating clusters in its cloud instances.
Base Command is available now through a monthly premium subscription offered jointly by Nvidia and NetApp. The platform with NetApp applications includes access to Nvidia's »DGX SuperPOD« AI supercomputer and NetApp data management. Google Cloud also plans to add Base Command to its Marketplace later this year.
Many manufacturers already offer suitable x86 servers – including Advantech, Asus, Dell, Gigabyte, and Lenovo. As of now, more than 50 servers are certified to run Nvidia's AI Enterprise software. They are primarily used in mainstream data centers. This is how the company brings computing power to various industries – healthcare, manufacturing, and retail, for example.
With the servers, companies can support workloads in data centers and hybrid clouds. This includes running the AI Enterprise suite – AI and data analytics software – on VMware vSphere, as well as Omniverse Enterprise, an advanced design-simulation platform, and Red Hat OpenShift for developing AI systems. The servers also integrate Cloudera Data Engineering for machine learning, so models can be trained in a short time.
Several server manufacturers are announcing new systems with Nvidia's BlueField-2 data processing units (DPUs) at this year's Computex. While systems with Nvidia GPUs are already available with the option of adding BlueField-2 DPUs, many applications might not need a GPU at all and would still benefit from a DPU.
Servers from Asus, Dell, and Gigabyte suit businesses looking for the additional performance, security, and manageability they get with BlueField-2. Servers running primarily software-defined networking – a stateful load balancer or distributed firewall, for example – software-defined storage, or traditional enterprise applications will benefit from the DPU. With it, they can accelerate, offload, and isolate infrastructure workloads for networking, security, and storage.
Systems running VMware vSphere, Windows, or hyper-converged applications also benefit from including a DPU, whether they are running AI and machine learning applications, graphics-intensive workloads, or traditional business applications.