Original Link: https://www.anandtech.com/show/1827
G72 and G73 Info; Plus NVIDIA's Quest for a GPU Socket
by Kristopher Kubicki on October 19, 2005 12:00 AM EST - Posted in GPUs
We continue to hear new details about G72 and G73 here in Taiwan, and the latest batch of info from our vendors is that G72 and G73 will be pin-compatible with NV40 and NV43. In other words, your next NVIDIA video card might have the same PCB as the 6600GT, but with a different GPU.
This means lower cost to the manufacturer - there is no need for new R&D or board designs. It also means G72 and G73 can launch very quickly once the decision comes from NVIDIA, as vendors can easily switch production from the older chips to the new ones. Two vendors confirmed to us that they are already retooling their PCBs for a six-pin 12V Molex connector in anticipation that G72 and G73 SLI might need the additional power, but even NVIDIA won't comment to the manufacturers at this point.
A lot seems to hinge on ATI's future choices with R580, X1600 and X1300. As of now, the launch date for the X1600 is still late November, and NVIDIA isn't exactly hurting for new value and midrange SKUs given the success of the 6600GT. The X800GTO and X800GTO2 really give the 6600GT a run for its money, but we digress.
NVIDIA's Secret Flip Chip GPU
Manufacturers seem to think G72 and G73 will be an easy retool from NV40/43, but another vendor claims NVIDIA has bigger plans. They claim that NVIDIA is working on flip-chip GPU sockets for motherboards. Apparently, engineering teams inside NVIDIA have several prototypes where the GPU, rather than the CPU, is the main focus of a motherboard with two sockets: one for the GPU and another for the CPU. Whether or not such a machine will ever see the light of day is difficult to say right now. However, the idea of pin-compatible GPUs already suggests that we are halfway to buying GPUs the same way we buy CPUs: as flip chips. We have plenty of questions, like how the memory interface will work and how that will affect performance, but GPU sockets are likely less a question of "if" than of "when".