rz
New Member
Posts: 2
|
Post by rz on Jan 19, 2010 0:04:38 GMT
Are the generated PciRoot strings dependent on software in any way? Or once I have them, are they good so long as I keep the cards in the same slots? (It is really tedious to take the cards out, reboot, etc.)
My machine was going into a kernel panic when booting from PEG2 (even the installer), but I'm aware that it is necessary.
Is there anything I need to do besides the steps in this guide?
|
|
|
Post by tickleboo on Jan 20, 2010 1:32:45 GMT
Hey Aquamac, just wanted to say that I really appreciate your taking the time to write up this guide and help people all the way through the thread. Just wanted to report success on the first try. My setup: Asus P6T Deluxe v2, 2x Nvidia GTX 275, i7 920. BTW: the hex for 896MB cards is 00000038, you might want to put that in your main post. Thanks again!
|
|
|
Post by candykane on Jan 22, 2010 9:54:53 GMT
Can someone please take a look? I'm trying to get 3 Nvidia 9800 GTX+ 512MB cards to play nice on an Asus P6T7 Supercomputer (I think the "super" stands for super problems). Anyway, it's not my system, I'm just trying to help the man with 2 left hands; he has been begging me to help him out for weeks now. I got 10.6.2 installed clean with just Chameleon 2 RC4 and a few kext files that aqua recommended in another thread on this forum, plus iPortable 10.5.4 on a USB stick. I took the dual-GTX file and extended it for 3 slots, basically copied the first half and pasted it to the bottom. Booted iPortable (Leo), went through the motions: 1st card in the first slot, get the ID, remove it, then the second card in the second slot, etc., and pasted it all into one file:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>PciRoot(0x0)/Pci(0x1c,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key> <string>NVDA,NVMac</string>
        <key>@0,device_type</key> <string>display</string>
        <key>@0,name</key> <string>NVDA,Display-A</string>
        <key>@1,compatible</key> <string>NVDA,NVMac</string>
        <key>@1,device_type</key> <string>display</string>
        <key>@1,name</key> <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key> <string>0x01000000</string>
        <key>@2,#size-cells</key> <string>0x00000000</string>
        <key>@2,compatible</key> <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key> <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key> <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key> <string>0x02000000</string>
        <key>@2,name</key> <string>sensor-parent</string>
        <key>@2,reg</key> <string>0x02000000</string>
        <key>NVCAP</key> <data>BAAAAAAAAwAMAAAAAAAABwAAAAA=</data>
        <key>NVPM</key> <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key> <data>AAAAIA==</data>
        <key>device_type</key> <string>NVDA,GeForce</string>
        <key>model</key> <string>NVIDIA GeForce 9800 GTX DDL</string>
        <key>name</key> <string>NVDA,Parent</string>
        <key>rom-revision</key> <string>3172a</string>
    </dict>
    <key>PciRoot(0x0)/Pci(0x3,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key> <string>NVDA,NVMac</string>
        <key>@0,device_type</key> <string>display</string>
        <key>@0,name</key> <string>NVDA,Display-A</string>
        <key>@1,compatible</key> <string>NVDA,NVMac</string>
        <key>@1,device_type</key> <string>display</string>
        <key>@1,name</key> <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key> <string>0x01000000</string>
        <key>@2,#size-cells</key> <string>0x00000000</string>
        <key>@2,compatible</key> <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key> <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key> <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key> <string>0x02000000</string>
        <key>@2,name</key> <string>sensor-parent</string>
        <key>@2,reg</key> <string>0x02000000</string>
        <key>NVCAP</key> <data>BAAAAAAAAwAMAAAAAAAABwAAAAA=</data>
        <key>NVPM</key> <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key> <data>AAAAIA==</data>
        <key>device_type</key> <string>NVDA,GeForce</string>
        <key>model</key> <string>NVIDIA GeForce 9800 GTX DDL</string>
        <key>name</key> <string>NVDA,Parent</string>
        <key>rom-revision</key> <string>3172a</string>
    </dict>
    <key>PciRoot(0x0)/Pci(0x7,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)</key>
    <dict>
        <key>@0,compatible</key> <string>NVDA,NVMac</string>
        <key>@0,device_type</key> <string>display</string>
        <key>@0,name</key> <string>NVDA,Display-A</string>
        <key>@1,compatible</key> <string>NVDA,NVMac</string>
        <key>@1,device_type</key> <string>display</string>
        <key>@1,name</key> <string>NVDA,Display-B</string>
        <key>@2,#adress-cells</key> <string>0x01000000</string>
        <key>@2,#size-cells</key> <string>0x00000000</string>
        <key>@2,compatible</key> <string>NVDA,sensor-parent</string>
        <key>@2,device_type</key> <string>NVDA,gpu-diode</string>
        <key>@2,hwctrl-params-version</key> <string>0x02000000</string>
        <key>@2,hwsensor-params-version</key> <string>0x02000000</string>
        <key>@2,name</key> <string>sensor-parent</string>
        <key>@2,reg</key> <string>0x02000000</string>
        <key>NVCAP</key> <data>BAAAAAAAAwAMAAAAAAAABwAAAAA=</data>
        <key>NVPM</key> <data>AQAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA==</data>
        <key>VRAM,totalsize</key> <data>AAAAIA==</data>
        <key>device_type</key> <string>NVDA,GeForce</string>
        <key>model</key> <string>NVIDIA GeForce 9800 GTX DDL</string>
        <key>name</key> <string>NVDA,Parent</string>
        <key>rom-revision</key> <string>3172a</string>
    </dict>
</dict>
</plist>

I made a hex string and put that in the plist in the Extra folder. Well, that did not work, the computer panicked. Was I doing something wrong? On a side note, I still love your G4 case work, keep up the good work.
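Copy-pasting dict blocks by hand makes it easy to drop or duplicate a key. Since each per-slot capture maps one PciRoot(...) path to its property dictionary, the files can instead be merged programmatically. A minimal sketch using Python's `plistlib`; the file names in the usage comment are hypothetical:

```python
import plistlib

def merge_device_plists(in_paths, out_path):
    """Merge several single-card property plists into one.

    Each input file is assumed to map one PciRoot(...) device path
    to its property dictionary; the merged file simply contains all
    of those top-level entries.
    """
    merged = {}
    for path in in_paths:
        with open(path, "rb") as f:
            merged.update(plistlib.load(f))
    with open(out_path, "wb") as f:
        plistlib.dump(merged, f)

# Hypothetical usage, one file captured per slot:
# merge_device_plists(["slot1.plist", "slot2.plist", "slot3.plist"], "in.plist")
```

This keeps every card's properties intact and guarantees each device path appears exactly once.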
|
|
|
Post by aquamac on Jan 22, 2010 19:39:06 GMT
The PCI addresses for the ASUS P6T7 are as follows (the blue PCI slots 1, 3 & 5; don't use the one furthest from the processor). The boards all seem to work fine with 2 cards in slots 1 & 5:
PciRoot(0x0)/Pci(0x3,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)
PciRoot(0x0)/Pci(0x3,0x0)/Pci(0x0,0x0)/Pci(0x2,0x0)/Pci(0x0,0x0)
PciRoot(0x0)/Pci(0x7,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)
As you can see one of yours is incorrect.
But here is a strange thing: I have tried 2 boards with triple GTX 280s and cloned hard drives; one works fine, one KPs. All BIOS settings are the same. I have also failed to get triple cards going in a P6T6, but the same input list works for another member on this forum.
I would be interested to see how you get on now you have the correct PCI addresses.
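A quick way to catch a wrong address before rebooting is to compare the top-level keys of the merged in.plist against the known-good list. A small sketch; the expected paths here are the P6T7 ones listed in this post, so this check only applies to that board:

```python
# Known-good device paths for the ASUS P6T7 (blue slots 1, 3 & 5), as
# listed above. Other boards will have a different set.
P6T7_PATHS = {
    "PciRoot(0x0)/Pci(0x3,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)",
    "PciRoot(0x0)/Pci(0x3,0x0)/Pci(0x0,0x0)/Pci(0x2,0x0)/Pci(0x0,0x0)",
    "PciRoot(0x0)/Pci(0x7,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)/Pci(0x0,0x0)",
}

def unknown_paths(plist_keys):
    """Return any top-level device paths not in the known-good set."""
    return sorted(set(plist_keys) - P6T7_PATHS)
```

Run against the plist posted above, this would flag its first key (the `Pci(0x1c,0x0)` one) as not belonging to the board.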
|
|
|
Post by lordofthedread on Jan 27, 2010 23:28:52 GMT
Hi Aquamac, that's a nice forum you have running. Of course, like almost everyone, I have a question: I have 2 different cards that I would like to use in my all-new i5 hackintosh. My mobo is a P55-UD3P, and my graphics cards are 1 GTS 250 from Gigabyte and 1 GT 210 from XFX. Since I don't have any Leopard systems running, I wondered if there's another way of making or editing the files? Thanks in advance.
|
|
|
Post by aquamac on Jan 28, 2010 6:17:26 GMT
Hi there and welcome.
You can edit the input files in Snow to change items like the card's name and slot number. You need to download Apple's Developer Tools and use Property List Editor. When you open up your in.plist, for the items you want to change, select String instead of Data; you will then see the actual name as opposed to a load of numbers. Edit the name to one of your choice, then when you have finished, change the item back to Data and save the file. You have now edited your in.plist.
When you come to making the output file, this must be done in Leo; there is no other way, I'm afraid.
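The same Data-to-String round trip can also be done with Python's `plistlib` (which ships with OS X), if you'd rather not install the Developer Tools. A sketch under the assumption that the value is a data blob holding a plain, possibly NUL-terminated, ASCII string; the function name is mine:

```python
import plistlib

def rename_model(path, device_path, new_name):
    """Replace the 'model' value in an in.plist, preserving its type.

    device_path is the PciRoot(...) key for the card to edit; if the
    value is stored as a data blob, the new name is written back as one.
    """
    with open(path, "rb") as f:
        plist = plistlib.load(f)
    props = plist[device_path]
    old = props["model"]
    if isinstance(old, bytes):
        # Data blob: decode, swap, re-encode (keep a trailing NUL if present)
        nul = b"\x00" if old.endswith(b"\x00") else b""
        props["model"] = new_name.encode("ascii") + nul
    else:
        props["model"] = new_name
    with open(path, "wb") as f:
        plistlib.dump(plist, f)
```

This mirrors the Property List Editor procedure: the type of the item is left exactly as it was, only the text inside changes.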
|
|
|
Post by lordofthedread on Jan 28, 2010 8:05:49 GMT
Thx for this quick answer.
Is there a chance that if I change the slot number (by plugging them in one by one and editing my in.plist) I can get the two cards to work?
I will find a way to get Leopard somehow; I may have some laptop able to run it somewhere... I'll let you know how it's going.
|
|
|
Post by aquamac on Jan 28, 2010 19:16:44 GMT
Hi,
You will probably have to set your BIOS to boot from the second card, as it is a Gigabyte! Set it to PEG2 and attach your monitor to the second card, otherwise you will get a KP when booting.
|
|
|
Post by frankov on Feb 4, 2010 22:27:02 GMT
I followed your instructions, and everything seems to work just fine! The only problem I have is with the total VRAM size. I tried to add the string for 1024MB cards, but System Profiler shows only 771MB on both cards. Here's an image, and my in.plist is attached. Is this normal? EDIT: Also, is "Core Image" supposed to be "Supported" or "Hardware Accelerated"? Thanks a lot!
|
|
|
Post by aquamac on Feb 5, 2010 20:08:54 GMT
You must have had an incorrect value in your in.plist before you made the strings. You will have to do them again with the value:
<00000040>
where it says VRAM,totalsize.
Core Image sometimes says Supported and sometimes says Hardware Accelerated, depending on which version of OS X you are using.
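For reference, these VRAM values appear to be just the card's memory size in bytes, written as four little-endian bytes in hex. A small sketch that derives them:

```python
def vram_hex(megabytes):
    """Hex digits for VRAM,totalsize: size in bytes, 4 bytes little-endian."""
    return (megabytes * 1024 * 1024).to_bytes(4, "little").hex()

# 512MB  -> 00000020
# 896MB  -> 00000038  (the value tickleboo mentioned earlier)
# 1024MB -> 00000040  (the <00000040> value above)
```

So for any other card size, the value can be computed rather than looked up.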
|
|
|
Post by frankov on Feb 5, 2010 23:26:47 GMT
I just opened the in.plist file to re-check, and the VRAM key had HTML codes for the < and > characters.
I'm going to edit it again and make sure it doesn't get converted this time.
Thanks a lot for the fast reply, mate.
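For anyone who hits the same thing: some editors escape < and > to their HTML entities (&lt; and &gt;) when text is pasted in. Python's standard `html` module can undo that; a quick sketch for checking a suspect value:

```python
import html

def unescape_plist_text(text):
    """Turn &lt;...&gt; (and friends) back into literal < and > characters."""
    return html.unescape(text)

# e.g. a mangled VRAM value:
# unescape_plist_text("&lt;00000040&gt;") -> "<00000040>"
```

Text with no entities passes through unchanged, so it is safe to run over the whole file's contents.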
---------------------------------------------------------------------------------------
EDIT
You were right, as usual: now my 2 cards are installed, support everything, and report the right VRAM.
Thanks again !
|
|
|
Post by justanopinion on Feb 11, 2010 16:48:58 GMT
First off, I wanted to thank you for your work. I have been trying to set up my hackintosh for a while. When I found out that I might have to remove one of my graphics cards every time I wanted to boot into OSX instead of Win7, I was extremely frustrated. Although I haven't quite got it working here either... at least I have hope.

So: I have two EVGA 9800 GTX+ cards in my computer, a P55-UD5 (Gigabyte) with an i7. PCI addresses popped into a plist, plist moved over to a Leopard install to get the hex outputs, BIOS set to boot from PEG2.

If I install with a 9800 in the second slot, it's the only one that will work. Boot-up will be fine with just that one installed. Once I put the second one in and try to boot, it'll load up everything, then I'll get a couple of black screens, with the computer sounding like everything's loaded (i.e., the fan noise drops to a soft hum instead of the rumbling during boot).

If I install with a 9800 in the first slot, it's the only one that will work. Boot-up will be fine with just that one installed. Once I put the second one in and try to boot, same deal as above.

I use the same string in both cases, whether only one card is installed or not. So, just to clarify: one string, outputted from Leopard. It works for the top card if only one is installed and it was installed when I installed SL. It works for the bottom card if only one is installed and it was installed when I installed SL. But once I put both in: black screen, soft hum.

Thoughts? If you need anything more from me, I'd be happy to give more information.
|
|
|
Post by oniijin on Feb 17, 2010 5:18:44 GMT
Thanks for this great post. I got dual cards working in 10.5.8; however, now I'm trying for 10.6.2.
I'm running 10.6.2 on an Asus P5K Deluxe and trying to have both an 8800 and an 8600. As of now, using EFI strings I can boot using either, but not both at the same time. If I have both plugged in, it KPs right before it hits the GUI, after the grey pinwheel. Any ideas?
Things I've tried:
- Installed NV files from 10.6.0
- Used two 8800 strings in EFI
- Used two 8600 strings in EFI
- In the BIOS, switched to PCI/PEG (from PEG/PCI)
- Used strings from EFI Studio, then manually entered them
- Tried to add device IDs to kext files
- Tried different Chameleons (v2, 3, 4)
- Tried using GraphicsEnabler in the boot plist
All ended the same way, with a KP right before the GUI loads.
|
|
|
Post by inky76 on Feb 18, 2010 8:47:39 GMT
Same result as justanopinion, but I run a P6T Deluxe v2. Just can't get both cards to run simultaneously: a GTX 260 896MB and an 8800 GT 512MB.
|
|
|
Post by kocoman on Feb 18, 2010 15:33:16 GMT
Is it possible to make this work,
i.e. get the PciRoot,
for a notebook that has internal video and then a ViDock (PCI-e) attached to it?
The problem is that I can't remove the internal adapter.
|
|