Nvidia

2022-09-21


NVIDIA Delivers Quantum Leap in Performance, Introduces New Era ... (NVIDIA Blog)

NVIDIA today unveiled the GeForce RTX® 40 Series of GPUs, designed to deliver revolutionary performance for gamers and creators, led by its new flagship, ...

The RTX 4090 is the world’s fastest gaming GPU with astonishing power, acoustics and temperature characteristics. In fully ray-traced games, the RTX 4090 with DLSS 3 is up to 4x faster than last generation’s RTX 3090 Ti with DLSS 2. The RTX 4080 16GB has 9,728 CUDA cores and 16GB of high-speed Micron GDDR6X memory; with DLSS 3 it is 2x as fast in today’s games as the GeForce RTX 3080 Ti, and more powerful than the GeForce RTX 3090 Ti at lower power. The RTX 4080 12GB has 7,680 CUDA cores and 12GB of Micron GDDR6X memory, and with DLSS 3 it is faster than the RTX 3090 Ti, the previous generation’s flagship GPU.

For decades, rendering ray-traced scenes with physically correct lighting in real time has been considered the holy grail of graphics. The Ada architecture’s Micro-Mesh Engine provides the benefits of increased geometric complexity without the traditional performance and storage costs of complex geometries. Shader Execution Reordering (SER) improves execution efficiency by rescheduling shading workloads on the fly to better utilize the GPU’s resources. DLSS 3 can overcome CPU performance limitations in games by allowing the GPU to generate entire frames independently.

[NVIDIA Omniverse](https://www.nvidia.com/en-us/omniverse/)™, included in the NVIDIA Studio suite of software, will soon add [NVIDIA RTX Remix](https://www.nvidia.com/en-us/geforce/news/rtx-remix-announcement/), a modding platform for creating RTX remasters of classic games. Portal with RTX, a version of the classic game with RTX graphics, will be released as free, official downloadable content in November, just in time for Portal’s 15th anniversary. “Ada provides a quantum leap for gamers and paves the way for creators of fully simulated worlds,” Huang said.
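The frame-generation claim above can be sketched with a deliberately simplified model: DLSS 3 interleaves one AI-generated frame after each conventionally rendered frame, so presented frame rate is roughly double the rendered rate. This is a toy illustration only; real uplift depends on game, overhead, and settings, and the function below is not part of any NVIDIA API.

```python
def presented_fps(rendered_fps: float, generated_per_rendered: int = 1) -> float:
    """Toy model: each rendered frame is followed by N AI-generated frames,
    so the display receives (1 + N) frames per rendered frame."""
    return rendered_fps * (1 + generated_per_rendered)

# A game rendering 60 fps would present roughly 120 fps with one
# generated frame per rendered frame (the DLSS 3 default behaviour).
print(presented_fps(60))
```

This also illustrates why frame generation sidesteps CPU limits: the generated frames are produced on the GPU without the CPU submitting any extra draw work.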


Nvidia debuts new products for robotics developers, including ... (TechCrunch)

At its fall 2022 GTC developer conference, Nvidia announced new products geared toward robotics developers, including a cloud-based Isaac Sim and the Jetson ...



Nvidia Puts AI at Center of Latest GeForce Graphics Card Upgrade (Bloomberg)

Nvidia Corp., the most valuable semiconductor maker in the US, unveiled a new type of graphics chip that uses enhanced artificial intelligence to create ...

The top-of-the-line RTX 4090 will cost $1,599 and go on sale in October. Other versions that arrive in November will retail for $899 and $1,199. Codenamed Ada Lovelace, the new architecture underpins the company’s GeForce RTX 40 series of graphics cards, unveiled by co-founder and Chief Executive Officer Jensen Huang at an online event Tuesday.


Nvidia unveils new gaming chip with AI features, taps TSMC for ... (Reuters)

Nvidia Corp on Tuesday announced new flagship chips for video gamers that use artificial intelligence (AI) to enhance graphics, saying it has tapped Taiwan ...

Nvidia designs its chips but has them manufactured by partners. The Lovelace chips extend the company’s AI techniques to generate entire frames of a game using AI. The flagship GeForce RTX 4090 model will sell for $1,599 and go on sale in October. Nvidia has gained attention in recent years with its booming data center business, which sells chips used in artificial intelligence work such as natural language processing; the announcement comes amid a U.S. ban on selling Nvidia’s top data center AI chips to China.


Nvidia unveils Drive Thor, one chip to rule all software-defined ... (TechCrunch)

Nvidia revealed a next-generation automotive-grade chip that will unify a wide range of in-car technology and go into production in 2025.



Chipmaker Nvidia launches new system for autonomous driving (Reuters)

Chip giant Nvidia Corp on Tuesday unveiled its new computing platform called DRIVE Thor that would centralize autonomous and assisted driving as well as ...

"There's a lot of companies doing great work, doing things that will benefit mankind and we want to support them," Shapiro said. The announcement comes amid a U.S. ban on exports of two top Nvidia computing chips for data centers to China. [GM's](https://www.reuters.com/companies/GM.N) autonomous driving unit Cruise last week said it had developed its own chips to be deployed by 2025 ([read more](https://www.reuters.com/business/autos-transportation/upset-by-high-prices-gms-cruise-develops-its-own-chips-self-driving-cars-2022-09-14/)).


Nvidia Cancels Atlan Chip For AVs, Launches Thor With Double ... (Forbes)

At the fall 2022 GTC, Nvidia CEO Jensen Huang announced the cancellation of the Atlan automated driving chip, announced in 2021, and the introduction of Thor with ...

When it was announced, Atlan promised the highest performance of any automotive SoC to date, with up to 1,000 trillion operations per second (TOPS) of integer computing capability. For comparison, the Parker SoC that powered version 2 of Tesla Autopilot (in combination with a Pascal GPU) from 2016 delivered about 1 TOPS, and was followed in 2020 by the Xavier chip with 30 TOPS. At this week’s fall 2022 GTC, Huang announced that Atlan had been canceled and replaced with a new design dubbed Thor, which will offer twice the performance and data throughput while still arriving in 2025.
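The generational jumps above are easy to tabulate. The snippet below uses only the TOPS figures quoted in this excerpt; Thor's 2,000 TOPS is derived from the stated "twice the performance" of the cancelled 1,000-TOPS Atlan, so treat the numbers as the article's claims, not measured results.

```python
# Integer-compute figures (TOPS) for Nvidia automotive SoCs, as quoted above.
# Thor's figure is derived: "twice the performance" of Atlan's 1,000 TOPS.
socs = {"Parker": 1, "Xavier": 30, "Atlan": 1000, "Thor": 2000}

# Print the generation-over-generation speedup factors.
names = list(socs)
for prev, curr in zip(names, names[1:]):
    factor = socs[curr] / socs[prev]
    print(f"{prev} -> {curr}: {factor:g}x")
```

The striking part is the middle step: the Xavier-to-Atlan design target was a roughly 33x jump, dwarfing both the 2016-to-2020 and the Atlan-to-Thor increments.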


Nvidia debuts new high-end RTX 4090 GPU after previous ... (TechCrunch)

The upgrade comes at an interesting time for PC users, who have been starved out of the GPU market by crypto miners for years, and now have their choice of ...



NVIDIA Launches Lovelace GPU, Cloud Services, Ships H100 ... (Forbes)

It's impossible to convey the excitement of a Jensen Huang keynote address at GTC, but here's what caught my eye.

NVIDIA made a slew of technology and customer announcements at the Fall GTC this year. While at Cambrian-AI we tend to focus on the data center and AI at the edge, we would be remiss if we did not mention the star of the show, the Lovelace GPU.

Lest you think that LLMs are a solution looking for a problem, the generative-model craze alone is staggering. Seizing the opportunity, NVIDIA is positioning the Hopper GPU as a breakthrough in reducing the exorbitant costs of training these massive models. Huang indicated that NVIDIA DGX servers and HGX modules using SXM and NVLink will ship in Q1 2023, with wide-scale public cloud support expected in the same timeframe. He also pointed out that the Grace-Hopper superchip will deliver 7x the fast-memory capacity (4.6TB) and 8,000 TFLOPS versus today’s CPU-GPU configurations, critical for the recommender models used by e-commerce and media super-websites.

NVIDIA also announced several cloud services that will reduce barriers to adoption of the company’s extensive software portfolio. One is a new cloud service that will enable far more professionals to interact and collaborate on the development and testing of digital twins. As it matures, cloud-based 3D graphics generation will greatly expand the utility of Omniverse, the industry’s only metaverse platform targeting professional and creative communities. Another dozen technologies merit attention as well, such as the addition of NVIDIA Clara to the Broad Institute Terra Cloud Platform.

Disclosures: This article expresses the opinions of the author and is not to be taken as advice to purchase from nor invest in the companies mentioned. We have no investment positions in any of the companies mentioned in this article and do not plan to initiate any in the near future.


Nvidia RTX 4090 vs Nvidia RTX 4080: Which Lovelace GPU is better? (TrustedReviews)

Nvidia recently revealed its latest batch of GPUs, code-named Lovelace. But how do the latest releases compare to each other?

Looking at the specs of each GPU, it’s clear the RTX 4090 has a lot more power. The first configuration of the RTX 4080 comes with 12GB of memory, alongside 7,680 CUDA cores, 639 Tensor-TFLOPS and 92 RT-TFLOPS; the high-end 16GB RTX 4080 offers 113 RT-TFLOPS, while the RTX 4090 features 191 RT-TFLOPS. Having more CUDA cores means the hardware can process more data in parallel. The RTX 4080 comes in either 12GB or 16GB, with the latter costing more; the RTX 4090 has only one variant, but it costs a lot more than its companions, with a starting price of $1,599.
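Why more CUDA cores matter can be sketched with a toy model: for a fixed pile of independent work items, execution time scales roughly with ceil(items / cores). This ignores clocks, memory bandwidth, and scheduling, so it is an illustration of the parallelism argument, not a performance prediction; the workload size is arbitrary.

```python
import math

def steps(items: int, cores: int) -> int:
    """Idealized step count: each core handles one independent item per step."""
    return math.ceil(items / cores)

# Using the CUDA-core counts quoted above for the two RTX 4080 variants,
# with a hypothetical batch of one million independent shading tasks:
work = 1_000_000
print(steps(work, 7_680))   # RTX 4080 12GB
print(steps(work, 9_728))   # RTX 4080 16GB
```

Under this idealization the 16GB card's extra 2,048 cores cut the step count by a little over a fifth; real gains are smaller because workloads are never perfectly parallel.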


Nvidia RTX 4080 vs Nvidia RTX 3080: Is newer better? (TrustedReviews)

Nvidia just announced its new line of GPUs with the Nvidia RTX 4000 Series, code-named Lovelace. Here's how it compares to its predecessor.

This is the company’s third generation of RTX graphics cards, with plenty more updates and improvements for gamers and creatives. DLSS, developed by Nvidia, uses AI to boost a game’s frame rate, allowing you to play at higher frame rates without overloading your GPU; gamers should be able to play supported DLSS games with even higher frame rates and more impressive graphics. The company also revealed that full ray tracing will be coming to Portal. While it’s expected that next-generation hardware will cost more, the price of the latest RTX 4080 is a lot higher than its predecessor’s. Looking back at the RTX 3080, which costs only $699 (12GB) / $649 (10GB), it is definitely the better option if you’re looking to upgrade without breaking the bank.


Nvidia's New 4000-Series PC Graphics Cards Are Too Damn ... (Kotaku)

Are graphics cards like the just-revealed RTX 4090 and RTX 4080s becoming unaffordable?

Today, after many months of leaks, rumors, and speculation, Nvidia finally officially revealed its next generation of graphics cards, the RTX 4000 series, along with [a ray-traced version of Portal](https://www.nvidia.com/en-us/geforce/news/portal-with-rtx-ray-tracing/). The price point of the RTX 4090 starts at $1,599. Nvidia says the RTX 4080 16GB is 3x the performance of the RTX 3080 Ti on next-gen content like Cyberpunk with RT Overdrive mode or Racer RTX, for the same price of $1,199. But viewing events from the consumer side, it really feels like the costs of enthusiast PC gaming are continuing to skyrocket, and at a time when the costs of just about everything else are, too. The 16GB RTX 4080, which many observers take to be the closest to a true 3080 successor, arrives at a whopping $1,199, an increase of $500. Even the 12GB RTX 4080’s MSRP of $899 is $400 more than the RTX 3070’s original MSRP of $499. Indeed, in 2018, Nvidia attracted criticism for pricing its then-new RTX 20-series cards a full “tier” higher than the previous 10-series cards had cost; the RTX 2070, for example, cost almost as much as the prior high-end GTX 1080, despite being less of a flagship card. [One commenter looked back](https://old.reddit.com/r/hardware/comments/xjbobv/geforce_rtx_4090_revealed_releasing_in_october/ip7pdmc/) to that launch to pinpoint why today’s prices felt so exorbitant: “With the 20 series, they bumped all of the prices a whole fucking tier, and it looks like they are doing it again. They are trying to sell you a 4070 rebranded as a 4080 for 900$ lmao.” I hope there is some sort of relief on the horizon, because as one Redditor put it, “I love PC gaming, but I can’t fucking afford to be a part of it anymore.”


Nvidia announces RTX 40-series graphics cards with DLSS 3 and 2 ... (GSMArena.com)




NVIDIA's Omniverse Lets You Create Digital Twins of Your Data ... (Data Center Knowledge)

Nvidia promises to make 3D modeling and digital technology more accessible for your data center, just not at the moment.

We admit, many of the firm’s announcements have a decided cool factor, leveraging the power of 3D for realistic simulations, but is there anything here that will change your life as a data center pro today or in the very near future? We’re seeing a trend here in that NVIDIA wants simulation technology to be readily available and as plug-and-play as possible for enterprises. The implementation of robotics to maintain data center equipment would get a boost from NVIDIA’s Omniverse, based on case studies from other verticals such as the automotive and railway industries. We in the data center industry know what that means: more demand for access to storage, network, and compute. Systems to support the chips are coming in the first half of 2023. Our friends at Siemens provided us with access to a [webinar on digital twin technology](https://new.siemens.com/global/en/markets/data-centers/events-webinars/webinar-digital-twin-applications-for-data-centers-apac-emea.html).


Nvidia eyeing "AI factories" with H100 GPUs (CRN Australia)

Nvidia revealed Tuesday that its next-generation H100 Tensor Core GPU is now in full production and will ship next month.

“First, we are super excited to announce that the Nvidia H100 is now in full production,” said Ian Buck, vice president and general manager of Nvidia’s Tesla Data Center business, during a media pre-briefing before Nvidia CEO Jensen Huang’s keynote speech Tuesday at the Nvidia GPU Technology Conference (GTC). “Starting with the announcement in the keynote, our customers will be able to get access to the Nvidia H100 in Nvidia’s LaunchPad.” “Our customers are looking to deploy data centers…that are basically AI factories producing AIs for production use cases,” he said. On large language models: “You don’t need to train the large language model from scratch. It will democratise access to large language models…” “In much the same way today that you can experience the Internet on multiple devices, different types of browsers, and everything seamlessly, you’ll be able to interact with 3D environments in much the same way,” Kerris said.


Nvidia launches new services for training large language models (TechCrunch)

Today at the company's fall 2022 GTC conference, Nvidia announced the NeMo LLM Service and BioNeMo LLM Service, which ostensibly make it easier to adapt LLMs ...



Nvidia's Flagship RTX 4090 card is $1599 and Available in October (Gizmodo)

Team Green also announced the RTX 4080 starts at $899, though we don't know a release date.

The RTX 4090 looks like a significant improvement over the RTX 3090, and especially over any GTX 1000 or RTX 2000 series GPUs you might still be rocking. Nvidia promises its new card will be 2x faster at non-ray-tracing tasks and 4x faster at ray-tracing ones. With blanket promises that the card is 2-4x faster than the RTX 3090 Ti (itself an upgraded version of the 3090), Huang showed 4K framerates above 100 fps in both Cyberpunk 2077 and Microsoft Flight Simulator, both with ray tracing on. To support that dream, the company also announced DLSS 3.0, upgraded tech that supplies more frames with less work, plus SER, a new RTX graphics technique that promises to drastically increase ray-tracing performance. Huang also demonstrated the RTX Remix modding platform with The Elder Scrolls III: Morrowind, a game from 2002; while we don’t know much about what kind of power it will need in addition to an RTX 4000 series GPU, it’s going to be available for modders everywhere. Still, Nvidia’s RTX 4080 is also here if $1,599 seems too costly, and Huang said the company will continue to support its RTX 3000 line as well.


Nvidia's New 4000-Series PC Graphics Cards Are Too Damn ... (Kotaku Australia)

Today, after many months of leaks, rumours, and speculation, Nvidia finally officially revealed its next generation of graphics cards, the RTX 4000.

Nvidia [is widely rumoured to have a large surplus of now-last-gen GPUs](https://www.hardwaretimes.com/nvidia-set-to-offer-steeper-price-cuts-in-september-to-get-rid-of-excess-inventory-report/) now that the crypto GPU-mining craze has (temporarily?) subsided. Yet the price point of the RTX 4090 starts at $US1599 ($AU2,220), and the 16GB RTX 4080, which many observers take to be the closest to a true 3080 successor, arrives at a whopping $US1199 ($AU1792), an increase of $US500 (approx $AU747). Nvidia says the RTX 4080 16GB is 3x the performance of the RTX 3080 Ti on next-gen content like Cyberpunk with RT Overdrive mode or Racer RTX, for the same price of $US1199. But what about the 12GB RTX 4080 at $US899 ($AU1344)? Even then, its MSRP is $US400 (approx $AU600) more than the RTX 3070’s original MSRP of $US499 ($AU746). Viewing events from the consumer side, it really feels like the costs of enthusiast PC gaming are continuing to skyrocket, at a time when the costs of just about everything else are, too. One Redditor [lamented](https://old.reddit.com/r/hardware/comments/xjbobv/geforce_rtx_4090_revealed_releasing_in_october/ip7dx9h/), “I love PC gaming, but I can’t fucking afford to be a part of it anymore.” [Said another](https://old.reddit.com/r/hardware/comments/xjbobv/geforce_rtx_4090_revealed_releasing_in_october/ip7doby/), “The prices are downright insulting. With the 20 series, they bumped all of the prices a whole fucking tier, and it looks like they are doing it again.” Nvidia also announced [a ray-traced version of Portal](https://www.nvidia.com/en-us/geforce/news/portal-with-rtx-ray-tracing/).


Nvidia Introduces New Ada Lovelace Architecture, OVX Systems ... (HPCwire)

In his GTC keynote today, Nvidia CEO Jensen Huang launched another new Nvidia GPU architecture: Ada Lovelace, named for the legendary mathematician regarded ...

Ada Lovelace is not a subset of Nvidia’s Hopper GPU architecture (announced just six months prior), nor is it truly a successor; instead, Ada Lovelace is to graphics workloads as Hopper is to AI and HPC workloads. The company also announced two GPUs based on the Ada Lovelace architecture, the workstation-focused RTX 6000 and the datacenter-focused L40, along with the Omniverse-focused, L40-powered, second-generation OVX system. Nvidia said that the RTX 6000 would be available in a couple of months from channel partners, with wider availability from OEMs late this year into early next year to align with developments elsewhere in the industry. The second-generation OVX system features an updated GPU architecture and enhanced networking technology. “With a massive 48GB frame buffer, OVX, with eight L40s, will be able to process giant Omniverse virtual world simulations.” “In the case of OVX, we do optimize it for digital twins from a sizing standpoint, but I want to be clear that it can be virtualized.”

With Omniverse Cloud, users can collaborate on 3D workflows without the need for local compute power; Omniverse Cloud will also be available as Nvidia-managed services via early access by application. There is also Omniverse Replicator, a 3D synthetic data generator for researchers, developers, and enterprises that integrates with Nvidia’s AI cloud services. “Planning our factories of the future starts with building state-of-the-art digital twins using Nvidia Omniverse,” said Jürgen Wittmann, head of innovation and virtual production at BMW Group. “Using this technology to generate large volumes of high-fidelity, physically accurate scenarios in a scalable, cost-efficient manner will accelerate our progress towards our goal of a future with zero accidents and less congestion.”

Nvidia GeForce RTX 40 series doubles power of last gen's GPUs (New Atlas)

Nvidia has unveiled its latest lineup of graphics cards, the GeForce RTX 40 series. Built on third-generation architecture, the cards are claimed to boast ...

Nvidia has unveiled its latest lineup of graphics cards, the GeForce RTX 40 series. Valve’s classic Portal is getting a free update with 4K visuals and ray tracing, which could lead to some really interesting effects as the portals bend and warp light sources, as well as giving off their own colored light. As with previous generations, there are three models in the GeForce RTX 40 series, although they’ve had a bit of a change to naming conventions.


iTWire - NVIDIA unveils DRIVE Thor — centralised car computer ... (iTWire)

COMPANY NEWS: NVIDIA today introduced NVIDIA DRIVE Thor, its next-generation centralised computer for safe and secure autonomous vehicles. DRIVE Thor,...

“The shift to software-defined vehicles with centralized electronic architectures is accelerating, driving a need for more powerful and more energy-efficient compute platforms,” said Guidehouse Insights principal research analyst Sam Abuelsamid. “DRIVE Thor is the superhero of centralised compute, with lightning-fast performance to deliver continuously upgradable, safe and secure software-defined supercomputers on wheels.” This equips automakers with the compute headroom and flexibility to build software-defined vehicles that are continuously upgradeable through secure, over-the-air software updates.

DRIVE Thor is designed for the highest levels of functional safety; the DRIVE Thor SoC and AGX board are developed to comply with ISO 26262 standards. Another advantage of DRIVE Thor is its 8-bit floating point (FP8) capability. With its transformer engine, DRIVE Thor can accelerate inference performance of transformer deep neural networks by up to 9x, which is paramount for supporting the massive and complex AI workloads associated with self-driving. The advantage of NVLink-C2C is its ability to share, schedule and distribute work across the link with minimal overhead. DRIVE Thor supports multi-domain computing, isolating functions for automated driving and IVI; with MIG support for graphics and compute, it uniquely enables IVI and advanced driver-assistance systems to run with domain isolation, which allows concurrent time-critical processes to run without interruption.

ZEEKR CEO An Conghui said: “ZEEKR users demand a luxury experience that includes the latest technology and safety features.” Huang announced NVIDIA DRIVE Thor and other key automotive developments with IVI, mapping, simulation and more in his [GTC keynote address](https://www.youtube.com/watch?v=PWcNlRI00jo).
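One concrete reason FP8 matters for inference can be sketched with simple arithmetic: an 8-bit value occupies one byte versus two for FP16 and four for FP32, so activations and weights stored in FP8 halve memory footprint and traffic relative to FP16. The parameter count below is hypothetical, chosen only to make the scaling visible.

```python
def tensor_bytes(num_params: int, bits: int) -> int:
    """Storage for a dense tensor of num_params values at the given bit width."""
    return num_params * bits // 8

# Hypothetical 100M-parameter network, stored at three precisions:
n = 100_000_000
print(tensor_bytes(n, 32))  # FP32
print(tensor_bytes(n, 16))  # FP16
print(tensor_bytes(n, 8))   # FP8
```

Smaller tensors mean more of a model fits in on-chip memory and less bandwidth is spent moving data, which is where much of the claimed transformer-inference speedup comes from.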


NVIDIA GTC 2022 Delivers Graphics, Animation, Performance, and ... (StorageReview.com)

NVIDIA kicked off the GTC 2022 session with a keynote by CEO Jensen Huang that was heavy with impressive graphics and animation.

First on the agenda was the announcement of the next-generation GeForce RTX 40 series GPUs powered by Ada Lovelace, designed to deliver extreme performance for gamers and creators. Next was NVIDIA DLSS 3, the next revolution in the company’s Deep Learning Super Sampling neural graphics technology for games and creative apps.

NVIDIA Omniverse Cloud is the company’s first software- and infrastructure-as-a-service offering. Using Omniverse Cloud, individuals and teams can experience in one click the ability to design and collaborate on 3D workflows without the need for any local compute power. The L40 GPU’s third-generation RT Cores and fourth-generation Tensor Cores will deliver powerful capabilities to Omniverse workloads running on OVX, including accelerated ray-traced and path-traced rendering of materials, physically accurate simulations, and photorealistic 3D synthetic data generation.

A five-year license for the NVIDIA AI Enterprise software suite is now included with H100 for mainstream servers, and some of the world’s leading higher-education and research institutions will use H100 to power their next-generation supercomputers. NVIDIA is also working with operating system partners like Canonical, Red Hat, and SUSE to bring full-stack, long-term support to the platform. The NVIDIA BioNeMo Service is a cloud application programming interface (API) that expands LLM use cases beyond language and into scientific applications, accelerating drug discovery for pharma and biotech companies.

NVIDIA DRIVE Thor is the next-generation centralized computer for safe and secure autonomous vehicles; NVIDIA also announced that its automotive pipeline has increased to over $11 billion over the next six years, following a series of design wins with vehicle makers from around the globe. The NVIDIA Jetson family now spans six Orin-based production modules supporting a full range of edge AI and robotics applications.
