|
Post by mathmac on Apr 26, 2009 7:09:29 GMT
aquamac and colleagues have done a lot of hard work on getting us going with NVIDIA PC cards as graphics cards, and this and the work of others seems to indicate I will have to wait for Snow Leopard for full functionality from the GTX 260 and 285 I am playing with. But suppose (a) I do not need the card to be a boot monitor (so I do not care about EFI64), (b) I do not need it to display a picture, and (c) I DO need the card recognized under CUDA. Is there a hack I can do to at least get as far as (c) under 10.5.6?

I have got as far as editing things like GeForce.kext and others (and forcing a cache rebuild), e.g. to add the 285 PCI ID string, but the card, while it registers under PCI and Graphics as a generic card, does not appear under CUDA, let alone drive a monitor, with or without that PCI ID added.

My test setup is a 2008 Mac Pro with an 8800GT plus either a Zotac 260 (216 cores, 892 MB) or a Palit 285 (240 cores, 2 GB RAM). Under Windows Boot Camp both configs appear as CUDA dual-GPU configurations (and run monitors), so the hardware is plugged in fine. The 8800GT seems to manage startup OK under all OSes (so there is some EFI64 stuff in there to get the system up). It is not a power issue, as I have all three 6-pin PCI-E connectors supplied in various ways, and both cards are in the correct slots 1 and 2.

I do not know whether I am (a) just missing a critical hack, (b) missing a fundamental piece of driver software, or (c) wasting my time until Snow Leopard comes out. On the other hand, the Quadro 4800 Mac has been announced as not needing Snow to work...

Bit stuck right now - any ideas? I had a look at the post on EFI strings and it seemed to offer some Snow Leopard help, but with two different cards in there I was not sure how to adapt the advice. Thanks in advance - note that I am a regular Mac Pro user - this is not a hackintosh!
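For reference, here is the kind of Info.plist edit I mean, as a small Python sketch using plistlib. The personality name and the IOPCIMatch key are assumptions on my part (check the real GeForce.kext before patching, and back it up first); the appended ID is the 285's 0x05e3 with NVIDIA's vendor ID 0x10de:

```python
import plistlib

# Hypothetical sketch of the GeForce.kext edit: append a GTX 285 PCI
# match string (device 0x05e3, vendor 0x10de) to every IOKit
# personality's IOPCIMatch list. Key names are assumptions -- inspect
# the actual Info.plist of your kext before editing.
def add_pci_match(plist_bytes: bytes, new_id: str = "0x05e310de") -> bytes:
    info = plistlib.loads(plist_bytes)
    for personality in info.get("IOKitPersonalities", {}).values():
        match = personality.get("IOPCIMatch", "")
        if new_id not in match.split():
            # IOPCIMatch is a space-separated list of vendor/device IDs
            personality["IOPCIMatch"] = (match + " " + new_id).strip()
    return plistlib.dumps(info)
```

After writing the edited plist back, the stale cache still has to go - the "forcing a cache rebuild" part - which on Leopard I do by removing /System/Library/Extensions.mkext and touching /System/Library/Extensions before rebooting.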
|
|
|
Post by aquamac on Apr 26, 2009 18:00:49 GMT
Hi Mathmac,
I have found that in a real Mac Pro I could only get the GTX 280 going with Snow Leopard, and only with a real Mac card in the first PCIe slot. You will still need NVDarwin so the card can be initialized properly, as it obviously does not have an EFI ROM. Support will be here very soon in Leopard with the release of the Quadro FX 4800. There is no way to get things going sensibly in Leo at the moment unless you are looking for basic VESA support. That would mean no QE or CI, and you would still have to have a Mac card in slot 1.
When the 4800 comes out, maybe there is a chance to use its ROM to turn a GTX 280 into a full Mac card, but it will involve replacing the ROM chip unless you are lucky enough to have a 128 KB chip fitted already.
|
|
|
Post by mathmac on Apr 27, 2009 6:46:59 GMT
Thanks very much. That all makes sense and I will stop banging my head against a brick wall for the time being. I will see what is in the 10.5.7 extensions and the Quadro driver disc - I might get lucky, as I do not need to boot from these cards - but I fear it is at best a wait for Snow Leopard. I was just intrigued by some info on Macforums where a poster got CUDA up long before he could get the video working properly.

Added in edit. So you have this sequence of steps for two cards under Snow Lep: aquamac.proboards.com/index.cgi?board=hack1&action=display&thread=569

Most of that makes sense to me and I can just about figure out how to adapt it to the two-card situation. BUT - this method seems tailored to a hackintosh: you refer to EFI 8 (my Mac Pro thinks it is using EFI v1.3) and it has a rather different multi-boot (I use ALT on boot to switch between 10.5.6, Windows XP32, other OSes etc.). Do you know how to adapt those instructions to a real Mac Pro? Should I throw out all the NVDAResman stuff etc.? If not, what would I do with NVDarwin under SL?
|
|
|
Post by aquamac on Apr 27, 2009 17:58:50 GMT
I am not sure if EFI strings will work on a real Mac Pro - to be honest, I have never tried it. If you want to give it a go, then you would want to remove NVDarwin for sure, as it will overwrite your strings when it starts up. You certainly do not want to remove NVDAResman.kext or any other NV kexts, as these are required to get your card working properly.
The link you gave is for a hackintosh; it would be interesting to know if it works on a real Mac Pro though. It may well result in a KP!
|
|
|
Post by mathmac on Apr 28, 2009 9:59:57 GMT
OK - I will have a go at the weekend. Meantime, there are a couple of questions involved in adapting the two-card input file you used to make the hex.
1. The NVCAP values you have: if I run the NVCAP Maker app on my two ROM files (created with nvflash in Windows), I get the values you have for the Mac 8800GT ROM but a different set for my 285! Should I use your values or my 285 values, given that you are using a 280?
2. With my 285, as I am on a real Mac Pro and cannot get the display to work at all, I cannot run your program to determine the top-level PCI code. Should I just use trial and error (1, 2, 3, 4, 5 etc. in the middle), or should I pare the boot down to VESA so I can run your program?
Sorry to ask so many questions - I went through your script carefully and it makes sense to me, but I have these issues to nail down.
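In case it helps anyone with question 1: NVCAP Maker reports the 20 bytes as hex, while the in.plist wants them base64-encoded inside `<data>` tags. A small Python sketch of the conversion (the helper names are mine):

```python
import base64
import binascii

# The <data> elements in an EFI-string plist are base64, but NVCAP Maker
# (or a hex editor on the ROM) gives plain hex bytes. These helpers
# round-trip the 20-byte NVCAP between the two representations.
def hex_to_plist_data(hex_str: str) -> str:
    return base64.b64encode(binascii.unhexlify(hex_str.replace(" ", ""))).decode()

def plist_data_to_hex(b64: str) -> str:
    return binascii.hexlify(base64.b64decode(b64)).decode()
```

For example, the 8800GT value BAAAAAAAAwAMAAAAAAAABwAAAAA= decodes to hex 04000000000003000c0000000000000700000000, and re-encoding that hex gives the base64 back.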
|
|
|
Post by mathmac on Apr 30, 2009 7:44:29 GMT
I got a chance to try this last night. Only kernel panics so far, I am sorry to say. Right now I do not know whether (a) I am missing a key setting, (b) this is hopeless without newer drivers, or (c) the EFI on the real Mac just does not process the hex.

For anyone who is interested, here is my current attempt at the in.plist file - if anyone spots a mistake, or indeed gets this mixed-card config (cards are in the text below) going on a hackintosh, do let me know. I have tried some variations around this, especially for NVCAP and memory (I am guessing 80 for the 2 GB card, but also tried 40), and moved my 8800 around to extract the top-level PCI code (the file below assumes the 8800 in the bottom slot).

It always results in a KP, with a traceback to NVDAResman, unless I power down the 285. The KP stopped when I removed the 200-series PCI ID from the NVDAResman kext, but then other odd things started to happen... like the Apple renderer kicking in and other strange goings-on.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PciRoot(0x0)/Pci(0x5,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@0,device_type</key>
        <string>display</string>
        <key>@0,name</key>
        <string>NVDA,Display-A</string>
        <key>@1,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@1,device_type</key>
        <string>display</string>
        <key>@1,name</key>
        <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key>
        <string>0x01000000</string>
        <key>@2,#size-cells</key>
        <string>0x00000000</string>
        <key>@2,compatible</key>
        <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key>
        <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key>
        <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key>
        <string>0x02000000</string>
        <key>@2,name</key>
        <string>sensor-parent</string>
        <key>@2,reg</key>
        <string>0x02000000</string>
        <key>NVCAP</key>
        <data>BAAAAAAAAwAMAAAAAAAABwAAAAA=</data>
        <key>NVPM</key>
        <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key>
        <data>AAAAIA==</data>
        <key>device_type</key>
        <string>NVDA,GeForce</string>
        <key>model</key>
        <string>NVIDIA GeForce 8800GT</string>
        <key>name</key>
        <string>NVDA,Parent</string>
        <key>rom-revision</key>
        <string>3172a</string>
    </dict>
    <key>PciRoot(0x0)/Pci(0x1,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@0,device_type</key>
        <string>display</string>
        <key>@0,name</key>
        <string>NVDA,Display-A</string>
        <key>@1,compatible</key>
        <string>NVDA,NVMac</string>
        <key>@1,device_type</key>
        <string>display</string>
        <key>@1,name</key>
        <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key>
        <string>0x01000000</string>
        <key>@2,#size-cells</key>
        <string>0x00000000</string>
        <key>@2,compatible</key>
        <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key>
        <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key>
        <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key>
        <string>0x02000000</string>
        <key>@2,name</key>
        <string>sensor-parent</string>
        <key>@2,reg</key>
        <string>0x02000000</string>
        <key>NVCAP</key>
        <data>BAAAAAAADwAAAAAAAAAABwAAAAA=</data>
        <key>NVPM</key>
        <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key>
        <data>AAAAgA==</data>
        <key>device_type</key>
        <string>NVDA,GeForce</string>
        <key>model</key>
        <string>NVIDIA GeForce GTX 285 (Palit 2GB)</string>
        <key>name</key>
        <string>NVDA,Parent</string>
        <key>rom-revision</key>
        <string>3172a</string>
    </dict>
</dict>
</plist>
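On the "40 vs 80" memory guess: as far as I can tell (treat this as an assumption), VRAM,totalsize is just the VRAM size in bytes as a 32-bit little-endian value, base64-encoded for the `<data>` element. A Python sketch of the encoding:

```python
import base64
import struct

# Assumed encoding of VRAM,totalsize: memory size in bytes packed as an
# unsigned 32-bit little-endian integer, then base64 for the <data> tag.
def vram_totalsize(megabytes: int) -> str:
    return base64.b64encode(struct.pack("<I", megabytes * 1024 * 1024)).decode()
```

That would make 512 MB (0x20000000) come out as AAAAIA== (the 8800GT value), 1 GB (0x40000000) as AAAAQA==, and 2 GB (0x80000000) as AAAAgA== - which is where the 40-vs-80 byte in the hex comes from.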
|
|
|
Post by aquamac on Apr 30, 2009 21:08:19 GMT
Hi Mathmac,
You need to use NVDarwin, strings won't work, but NVDarwin will!
|
|
|
Post by mathmac on May 1, 2009 9:39:56 GMT
EDITED: now trying this out with NVDarwin and Snow Leopard - one more go before I decide to wait and see how this new alleged EVGA Mac 285 pans out!
Hmmm, still getting KPs. I noted that the current NVDarwin does not have the PCI ID for the 285, so I have added the 0x05e3 string to the NVDarwin plist. Still getting a KP. I also read what was said on insanelymac about where the cards should sit in a real Mac. What else do you have to do - how critical is the memory setting, given that I have a 2 GB card? I cannot see where to edit that in the plist.
Also (sorry) - how do I actually check whether NVDarwin is loading at all?
|
|
|
Post by mathmac on May 3, 2009 9:05:58 GMT
I edited the CoreVidia (new NVDarwin) plist with my best params, hacked the CoreVidia binary to replace 280 with 285 and the PCI IDs as well, and got my head around the permissions better. 8800 + 285 now alive. Looking into acceleration status etc. Thanks for all your help. Someone has already asked Xdarwin to update CoreVidia to make this less painful. Only working with 1 GB settings - replacing 40 with 80 in the plist causes trouble.
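For anyone repeating the binary hack: the replacement bytes have to be exactly the same length as the originals, or the Mach-O gets corrupted. A hedged sketch of the idea (the example strings are illustrative only - inspect the real CoreVidia binary to see what actually needs patching):

```python
# Equal-length in-place binary patch: Mach-O binaries must not grow or
# shrink when edited directly, so the replacement must match the
# original pattern byte-for-byte in length.
def patch_binary(blob: bytes, old: bytes, new: bytes) -> bytes:
    if len(old) != len(new):
        raise ValueError("replacement must keep the binary the same length")
    if old not in blob:
        raise ValueError("pattern not found in binary")
    return blob.replace(old, new)
```

Work on a copy of the file, and remember to fix ownership/permissions (root:wheel) on the kext afterwards, or it will not load.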
|
|