Product Information
NVIDIA RTX 6000 Ada Generation
Performance For Endless Possibilities
The way people work is undergoing a drastic change with distributed teams and remote workers as the new normal. Artists are facing an ever-increasing demand for differentiated and visually compelling content. Designers and engineers are striving to create more complex and efficient designs in highly supply-constrained environments. Scientists, researchers, and medical professionals are faced with incredible challenges that require the rapid development of solutions on a global scale.
The NVIDIA RTX™ 6000 Ada Generation is designed to meet the challenges of today’s professional workflows. Built on the NVIDIA Ada Lovelace architecture, the RTX 6000 combines 142 third-generation RT Cores, 568 fourth-generation Tensor Cores, and 18,176 CUDA® cores with 48 GB of graphics memory to deliver the next generation of AI graphics and petaflop inferencing performance, dramatically accelerating rendering, AI, graphics, and compute workloads. RTX 6000-powered workstations provide what you need to succeed in today’s ultra-challenging business environment.
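The core counts above translate into headline throughput numbers with simple arithmetic. The sketch below derives approximate peak FP32 throughput and memory bandwidth from the listed specs; the boost clock (~2.5 GHz) and GDDR6 per-pin data rate (20 Gbps) are assumptions not stated in this listing, so treat the results as back-of-envelope figures rather than official datasheet values.

```python
# Back-of-envelope throughput math for the RTX 6000 Ada.
CUDA_CORES = 18_176       # from the spec table
BOOST_CLOCK_GHZ = 2.505   # assumed boost clock (not in this listing)
BUS_WIDTH_BITS = 384      # from the spec table
DATA_RATE_GBPS = 20       # assumed GDDR6 per-pin data rate (not in this listing)

# FP32: each CUDA core can retire one fused multiply-add (2 FLOPs) per cycle.
fp32_tflops = 2 * CUDA_CORES * BOOST_CLOCK_GHZ / 1000

# Memory bandwidth: bus width in bytes times per-pin data rate.
bandwidth_gbs = BUS_WIDTH_BITS / 8 * DATA_RATE_GBPS

print(f"FP32 peak: ~{fp32_tflops:.1f} TFLOPS")      # ~91.1 TFLOPS
print(f"Memory bandwidth: ~{bandwidth_gbs:.0f} GB/s")  # ~960 GB/s
```

Under these assumed clocks the arithmetic lands on roughly 91 TFLOPS FP32 and 960 GB/s of memory bandwidth, which is consistent with the "petaflop inferencing" framing once Tensor Core sparsity formats are factored in.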
Processor | |
CUDA | Yes |
CUDA cores | 18,176 |
GPU family | NVIDIA |
GPU | RTX 6000 Ada Generation |
Lithography | 4 nm |
Memory | |
Graphics memory | 48 GB |
Graphics memory type | GDDR6 |
Memory bus | 384-bit |
Ports & interfaces | |
Interface | PCI Express 4.0 |
Number of DisplayPorts | 4 |
DisplayPort version | 1.4 |
Design | |
Cooling type | Active |
Number of fans | 1 |
Form factor | Full-Height/Full-Length (FH/FL) |
Power | |
Power consumption (max) | 300 W |
Weight & dimensions | |
Weight | 1.18 kg |
Depth | 266.7 mm |
Height | 111.8 mm |
Warranty | |
Base warranty | None |
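The PCI Express 4.0 interface in the table bounds how fast data can move between host and card. A minimal sketch of the theoretical link bandwidth, assuming a standard x16 slot (the listing does not state the lane count):

```python
# Theoretical PCIe 4.0 link bandwidth per direction.
GT_PER_S = 16          # PCIe 4.0 transfer rate per lane (16 GT/s)
ENCODING = 128 / 130   # 128b/130b line-code efficiency
LANES = 16             # assumed x16 slot (not stated in the listing)

per_lane_gbs = GT_PER_S * ENCODING / 8   # GB/s per lane, per direction
total_gbs = per_lane_gbs * LANES

print(f"~{total_gbs:.1f} GB/s per direction")  # ~31.5 GB/s
```

At roughly 31.5 GB/s per direction, the host link is about 30x slower than the card's on-board GDDR6, which is why workloads are typically staged into the 48 GB of graphics memory rather than streamed over the bus.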