How do we get graphics cards working with Puppy?

Problems and successes with specific brands/models of computer video hardware
Message
Author
User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

How do we get graphics cards working with Puppy?

#1 Post by Mike Walsh »

Afternoon, kiddiwinks.

Now; I figured this is probably the best place for this. If not, Flash; please move it.

Question:-

What, exactly, is involved in getting a discrete graphics card up-and-running under Puppy?

Background

I have a basic, Asus GeForce 210 that I purchased a few years ago for use with the now-defunct Compaq tower. I was never able to use it for two reasons:-

1 ) The PCI-e slot in the Compaq was truly ancient; it was the very early 1.0a standard, so probably wouldn't have properly supported the card (the GeForce 210 needs at least PCI-e 2.0).

2 ) The other reason I was never able to use it was because some of the pins at the 'far' end of the slot were in fact damaged. You figure it out....

-----------------------------------

Having upgraded to a brand new HP 'compact' tower, with a PCI-e 3.0 slot, I'm considering whether to give this another try. The main reason I'd like to do so is purely because the card has 1GB of its own dedicated memory, so it will do away with the need for the Intel Pentium's 'on-die' GPU to 'borrow' memory from system RAM.

Now; being Nvidia, I gather these work pretty well with Linux. I want to try this in Bionicpup64; there are several drivers available, but.....how the hell d'you know which one you need? What do you do with the driver once you've selected it? And more to the point, I'm going to need 'hand-holding', and being walked through this stuff step-by-step! (I'm always the same with any new procedure that I've never attempted before. Having got the hang of it, I'm usually fine after that.)

The new monitor that came with the HP has both VGA and HDMI; needless to say, I'm using HDMI. The card has outputs for VGA, HDMI and DVI...

I can install the card, no problem. It's what you do after that I'm not so sure about.....

Any and all advice will, as usual, be very much appreciated.

TIA.


Mike. :wink:

User avatar
rockedge
Posts: 1864
Joined: Wed 11 Apr 2012, 13:32
Location: Connecticut, United States
Contact:

#2 Post by rockedge »

Can the card do GPU computations?

User avatar
OscarTalks
Posts: 2196
Joined: Mon 06 Feb 2012, 00:58
Location: London, England

#3 Post by OscarTalks »

Hi Mike,

They are quite good in my experience.
Assuming you are able to get it connected to a slot and configured in the BIOS I would expect Puppy to automatically load the nouveau driver and everything would look OK. The nouveau drivers are better these days than they were a few years back.

The http://nvidia.com website has a driver-search facility which tells you which card needs which driver. At a glance it looks like the GeForce 210 needs the 340 series, but you should check that.

If you want to upgrade to an nvidia proprietary driver, BionicPup64 has some in Quickpet so that is probably all you need to do. The nouveau driver has to be disabled and blacklisted as it clashes with the nvidia driver, but I think it is all set up to do that. I believe you have to reboot though so it is not easy to test if running with no save.

If pre-built drivers are not available or if you like delving into it in more depth, you can download the .run installer from nvidia and use that to compile, but you have to exit to prompt and issue a few commands from there. There is the get-nvidia utility which is supposed to handle that process and make it a bit easier.
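The manual .run route Oscar describes can be sketched roughly as below. This is a hedged sketch, not a definitive recipe: the installer filename (340.108) is the legacy version mentioned later in this thread, and the blacklist file is written to a scratch directory here so the snippet can be dry-run safely; on a real system it belongs in /etc/modprobe.d/.

```shell
# Rough sketch of the manual .run route (assumptions: the 340.108
# legacy installer, and a system that reads /etc/modprobe.d/).
# Written against a scratch directory so it can be dry-run safely.

SCRATCH=$(mktemp -d)    # stand-in for /etc/modprobe.d

# 1) Blacklist nouveau so it cannot claim the card on the next boot:
printf 'blacklist nouveau\noptions nouveau modeset=0\n' \
    > "$SCRATCH/blacklist-nouveau.conf"

# 2) Reboot, exit X to a console prompt, then run the installer, e.g.:
#       sh NVIDIA-Linux-x86_64-340.108.run
#    and restart X afterwards ('xwin' in Puppy).

FIRST_LINE=$(head -n1 "$SCRATCH/blacklist-nouveau.conf")
echo "$FIRST_LINE"
```

The get-nvidia utility Oscar mentions automates essentially these same steps.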
Oscar in England

dancytron
Posts: 1519
Joined: Wed 18 Jul 2012, 19:20

#4 Post by dancytron »

Just to add that most of these cards have a sound driver that supplies sound through HDMI.

If you aren't going to use the sound in hdmi, you will probably need to blacklist the graphics card's sound driver.

As long as the drivers are already compiled, like they are for most puppies, it is pretty easy.
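To see which kernel driver has grabbed the card's audio half (the thing you would be blacklisting), lspci -k shows it. The snippet below parses a sample of typical GeForce 210 output so it runs anywhere; the sample text itself is illustrative, and on a real machine you would pipe live lspci -k output instead.

```shell
# Find the card's HDMI audio function and the driver bound to it.
# SAMPLE is illustrative lspci -k output for a GeForce 210;
# on a real box: lspci -k | grep -iA2 audio
SAMPLE='01:00.0 VGA compatible controller: NVIDIA Corporation GT218 [GeForce 210]
	Kernel driver in use: nouveau
01:00.1 Audio device: NVIDIA Corporation High Definition Audio Controller
	Kernel driver in use: snd_hda_intel'

AUDIO_DRIVER=$(echo "$SAMPLE" | grep -A1 'Audio device' \
    | awk '/driver in use/ {print $NF}')
echo "HDMI audio driver: $AUDIO_DRIVER"
```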

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#5 Post by Mike Walsh »

@ Oscar:-

Mm. Ok. So; if I've got this straight, from what you're saying, I don't have to worry about the Nvidia drivers straight away, no?

I can get the card installed, set whatever needs setting in the UEFI/BIOS, boot up, and she should load and use the nouveau drivers automatically?

That sounds fairly straightforward....


Mike. :wink:

ndujoe1
Posts: 851
Joined: Mon 05 Dec 2005, 01:06

#6 Post by ndujoe1 »

How do you blacklist the HDMI sound driver? Thanks.

dancytron
Posts: 1519
Joined: Wed 18 Jul 2012, 19:20

#7 Post by dancytron »

ndujoe1 wrote:how do you blacklist the HDMI sound driver. thanks.
I think it might vary from card to card, but in Debian Dog Stretch 64 I created a file /etc/modprobe.d/blacklist.conf with "blacklist snd_hda_intel" in it.

see https://techgage.com/news/disabling_nvi ... der_linux/

In puppy, I seem to recall there is a gui utility to do the same thing once you find out what the driver is called for that particular card.
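A quick read-only sanity check before touching modprobe.d: make sure the module you're about to blacklist is the one actually loaded. lsmod is just a formatter over /proc/modules, so grep works directly (snd_hda_intel is dancytron's example name; substitute whatever your card uses):

```shell
# Check whether snd_hda_intel is currently loaded (read-only).
# /proc/modules is present on any Linux system; a headless box may
# well have no snd_* modules loaded at all, which is also an answer.
if grep -q '^snd_hda_intel' /proc/modules 2>/dev/null; then
    STATE="loaded"
else
    STATE="not loaded"
fi
echo "snd_hda_intel is $STATE"
```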

User avatar
mikeslr
Posts: 3890
Joined: Mon 16 Jun 2008, 21:20
Location: 500 seconds from Sol

#8 Post by mikeslr »

ndujoe1 wrote:how do you blacklist the HDMI sound driver. thanks.
In Puppies:

Menu > System > BootManager, click the Modules tab, then click 'Blacklist Module'. Find the unwanted module in the right-hand pane, select it, and click the Remove button.

SAVE (that change to your SaveFile/Folder) and reboot; i.e., it's a component of BootManager.

User avatar
don570
Posts: 5528
Joined: Wed 10 Mar 2010, 19:58
Location: Ontario

#9 Post by don570 »

I have an Nvidia GT 1030 card inside a Dell 990, and there is no need to blacklist the HDMI sound.

Line out and the headphone jack work fine. The HDMI monitor works as well.


My instructions for installing nvidia package in fatdog64....
http://45.33.15.200/puppy/viewtopic.php ... 7c#1022637


User avatar
bigpup
Posts: 13886
Joined: Sun 11 Oct 2009, 18:15
Location: S.C. USA

#10 Post by bigpup »

First, in the computer's BIOS, it needs to be set to use the Nvidia card and not the integrated graphics.
Look for the graphics settings.
Something that will change which one is used.

Some BIOSes auto-switch if they see a separate graphics card.

If the Nvidia card is used, Bionicpup64 8.0 will use the generic Nouveau driver.
It works OK for most graphics.
However, the Nvidia driver does provide full feature support.

Bionicpup64 8.0 has Quickpet.
Quickpet->Drivers has several drivers already packaged for Bionicpup.
Nvidia says the 340 driver is the one that works for this card.

Do not try a newer driver; it may or may not still support that card. That is old hardware, and newer drivers have dropped support for older hardware.

You may have to run Quickpet->Info->Bionicpup updates to get all the Nvidia drivers showing.

Just click on the 340 driver.
It will download, install, and do everything needed so it is now being used.

If you do not want to use the pets already made for Bionicpup64 8.0:

How to install a Nvidia run package
http://www.murga-linux.com/puppy/viewtopic.php?t=110611
The things they do not tell you, are usually the clue to solving the problem.
When I was a kid I wanted to be older.... This is not what I expected :shock:
YaPI(any iso installer)

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#11 Post by Mike Walsh »

@ all:-

Well, thanks for the varied advice, guys. I'll let y'all know how I get on with this in a few days' time; it's a 'mini-project' for the weekend, I think.

I'll try the existing 340 driver .pet in Quickpet to start with, if I decide to give the Nvidia driver a go. Otherwise, I'll follow bigpup's instructions and see if I can build it from the .run package (I've already got the current 340.108 package from the Nvidia website, and have just downloaded shinobar's get-nvidia .pet.)

Initially, I want to see if Oscar's info about the nouveau drivers holds water. I'll decide what to do after that. I'm not too fussed about ultimate picture quality, though additional adjustments would of course be nice to have. The main objective here is to get the graphics running off their own dedicated pool of RAM, rather than 'poaching' system RAM.... I'm not too sure if the card's DDR3 RAM means it will actually run slower than the on-die adapter - which is using DDR4 - though from what I understand, graphics RAM is usually one generation ahead of its designation, i.e., GDDR3 is equivalent to DDR4 SDRAM, speedwise. And apparently bandwidth v latencies run to opposites in each instance.....meaning that ultimately, system RAM & graphics VRAM can't be directly compared, any way you look at it.

All bloody confusing to a neophyte. Anybody more 'clued-up' about this than I am, by any chance? :?

Stay tuned.....!


Mike. :wink:
Last edited by Mike Walsh on Fri 07 Feb 2020, 01:45, edited 3 times in total.

User avatar
perdido
Posts: 1528
Joined: Mon 09 Dec 2013, 16:29
Location: ¿Altair IV , Just north of Eeyore Junction.?

#12 Post by perdido »

Probably be a good idea to make a backup of /etc/X11/xorg.conf before you get going.
Just in case of murphy's law.

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#13 Post by Mike Walsh »

perdido wrote:Probably be a good idea to make a backup of /etc/X11/xorg.conf before you get going.
Just in case of murphy's law.
Thanks for the reminder, perdido. Murphy's law and I have an ongoing history of repeated encounters.... :lol: :roll:

Sod's law, of course, states that if I don't, then you know darned well what's gonna happen..!

Good advice, mate. Thanks for the tip! (Plus, I think, at least 2 complete backups of Bionicpup64.....in different locations. Just to be on the safe side, like...) :)


Mike. :wink:

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#14 Post by Mike Walsh »

UPDATE

Well, I'm.....gobsmacked, is probably the appropriate expression. That was so easy it was unreal.

Oscar, you were absolutely 'on the money', mate. I had an hour or so to spare while Mama was getting her weekly hair-do, so I thought 'In for a penny....'

Powered-off. Unhooked everything. Had 'er up on the worktop. Off with the case-side; unhooked the optical drive, undid 4 screws and swung the hard drive cage out of the way. Removed the appropriate slot cover; plugged the card in, and clipped the retainer in place.

Did it all back up. Plugged everything back in, with the HDMI cable now going to the card instead. Powered-on, booted into Bionicpup, and.....it just works. Unbelievable. So the 'nouveau' drivers obviously have improved, 'cos there's absolutely no visible difference between the card and the on-die GPU. And that's pretty neat.

Actually, I tell a lie. There is one improvement 'twixt the two.

Right from day one, I'd noticed a slight, diagonal imperfection in the upper left quadrant of the new monitor. Only noticeable when scrolling, but it's been getting gradually more noticeable, y'know?

This has now gone. (*sighs of relief...*) Perhaps a slight conflict between the hardware of this particular monitor, the GPU, even the HDMI cable? Who knows.....

If the 'nouveau' driver is this good nowadays, then for my personal use-case I'm going to stick with it. I don't see the need to mess about with the Nvidia drivers, because they're not going to do anything spectacular, from what I can see of things.

Audio appears unaffected. Although lspci now shows a new Nvidia audio device (in addition to the card itself), I don't think anything will need altering. Retrovol is still dispensing the Intel HD audio device's output without issues.

It even gives me a new core-temp reading in gKrellM, for the card itself.....which appears to be holding absolutely steady at around 38/40°C. Can't ask for more than that.

--------------------------------------------

There didn't appear to be any settings in the BIOS/UEFI for manually selecting graphics output, so I have to assume that the card was auto-detected & configured. Certainly, it all appears to be functioning as it should.
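One way to confirm the auto-detection without trusting the BIOS screens is to ask sysfs which driver is bound to each VGA-class PCI device. This is a generic-Linux sketch (nothing Puppy-specific assumed; on a box with no discrete GPU it simply reports zero devices):

```shell
# List every VGA-class PCI device and the kernel driver bound to it.
# Each PCI device exposes its class code and (if bound) a 'driver'
# symlink under /sys/bus/pci/devices/.
FOUND=0
for dev in /sys/bus/pci/devices/*; do
    class=$(cat "$dev/class" 2>/dev/null)
    case "$class" in
        0x0300*)    # 0x0300xx = VGA-compatible controller
            if [ -e "$dev/driver" ]; then
                drv=$(basename "$(readlink "$dev/driver")")
            else
                drv="none"
            fi
            echo "GPU at $(basename "$dev"): driver $drv"
            FOUND=$((FOUND + 1))
            ;;
    esac
done
echo "VGA-class devices found: $FOUND"
```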

PupSysInfo now gives this for graphics output:-

▶—— Video ——◀

Display Specifications:
• Monitor VertRefresh: 60.00 times/s
• Screen Dimensions: 1920x1080 pixels (508x285 millimeters)
• Screen Depth: 24 bits (planes)

Xorg Startup Log (/var/log/Xorg.0.log):
• Xorg Driver in use: nouveau
• Loaded Modules: dbe dri2 exa fb fbdevhw glx shadowfb
• X.Org version: 1.19.6

OpenGL 2D/3D Rendering:
• Direct Rendering: Yes
• Vendor: nouveau
• Renderer: NVA8
• Version: 3.1 Mesa 18.2.2

VGA controller [0300]: NVIDIA Corporation GT218 [GeForce 210] [10de:0a65] (rev a2)
• Kernel Driver: nouveau
• Memory Used by Driver: 1660.00 KB
• Path: /lib/modules/4.19.23/kernel/drivers/gpu/drm/nouveau/nouveau.ko
• Description: nVidia Riva/TNT/GeForce/Quadro/Tesla/Tegra K1+
• Video RAM: 1008M total, 256M 32M prefetchable

Remember, I didn't want this for gaming, or owt like that. I primarily wanted to stop the graphics from poaching system RAM to use it as VRAM. I'm probably going to obtain a second 4GB stick of DDR4 to fill the remaining slot, but that will be it, as far as it goes. This will then be the perfect 'Puppy' system for me.....and I remain astounded, even after all these years, as to just how good Pup really is.

Amazing.

Now all I need to do is to get used to the slight extra whine from the fan on the card's own cooler.... Aside from that, ATM I'm as happy as a pig in the brown stuff!!


Mike. :wink:

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#15 Post by Mike Walsh »

Footnote:-

Just as an end to this tale, I've been doing a wee bit of research. That the card is functioning as intended, of that there is no doubt. However:-

-----------------------------------------

.....it turns out this is a very old card. Bigpup wasn't wrong when he said that!

It dates from 2009; this is when Nvidia first released the reference version of this card. And the chip itself - the GT218 - actually dates back to the Quadros, etc, a few years earlier still.

It has a grand total of 16 cores, okay? The on-die Intel UHD 610, built into the G5400, has 96!! Along with pixel & texture fill rates that blow the 210 into the dust.... The main downside is that the 'internal' GPU borrows system RAM.....whereas the GeForce doesn't, since it has its own 1 GB of VRAM to draw from.

Okay, okay; I know what most of you are going to say. Dump the Nvidia, use the on-die 'internal'. Mm. I'll say this; at the end of the day, what counts is real-world performance. I've tried out everything I use which is in any WAY graphics-intensive.....including a first-person 'shoot-em-up' type of game called 'Xonotic', which I run as a Windoze PortableApp under WINE. It works darned well, too! No tearing, juddering, artefacts, anything like that.

I only tried Xonotic once on the old Compaq, and gave it up as a bad job. It was so slow as to be unplayable. I'm not a 'gamer', though I do play around with 'em very occasionally when I get totally bored. On this new HP, it's completely different, and is in fact quite an enjoyable diversion for half-an-hour. Very similar to 'Assault Cube', 'Doom'.....that kind of genre. Modern hardware definitely seems to help for this sort of thing!

-----------------------------------------

I think I'm going to keep the card. There's a very slim possibility I might look around for something a little bit 'beefier', though for my use-case it's not important. The main reason I might do so is to see if I can get hold of one of those 'silent' types (with the huge heatsink). There's certainly room for it in the Pavilion's case. The noise from that little fan on the GeForce 210 is quite intrusive after a couple of hours.....

We'll see.

EDIT:- Probably one of these, I think:-

https://www.amazon.co.uk/ASUS-GT710-SL- ... B07489XSJP

GeForce GT710; 2 GB GDDR5, 192 CUDA cores.....and only around GBP £35. I know these things are described as 'occupying 2 slots', because of that huge, 'passive' heatsink; on most boards, there are several PCI-e x16 slots fairly close together, the heatsink overhangs the neighbouring slot, and so one tends to get 'wasted'. This HP board has a single PCI-e x16 slot and a single PCI-e x4 slot.....and they're well spaced apart.

Perfect.


Mike. :wink:
Last edited by Mike Walsh on Sun 09 Feb 2020, 19:12, edited 1 time in total.

jamesbond
Posts: 3433
Joined: Mon 26 Feb 2007, 05:02
Location: The Blue Marble

#16 Post by jamesbond »

@Mike, you have got all the important points right.

I just want to add a few comments.

1. Xonotic is a game from the 2000-ish era. Your 2009 Nvidia is almost a decade ahead of it, so it should not have a problem playing it. If this is the kind of game you're playing, then the Nvidia will be more than enough.

2. To make sure that your integrated GPU doesn't eat system RAM, you need to disable it in BIOS. If you don't, and just leave it inactive, it would still steal system RAM for itself.

3. If you like Xonotic so much, they actually have a Linux version, both 32-bit and 64-bit. You don't need WINE for that.
Fatdog64 forum links: [url=http://murga-linux.com/puppy/viewtopic.php?t=117546]Latest version[/url] | [url=https://cutt.ly/ke8sn5H]Contributed packages[/url] | [url=https://cutt.ly/se8scrb]ISO builder[/url]

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#17 Post by Mike Walsh »

@ jamesbond:-
jamesbond wrote:2. To make sure that your integrated GPU doesn't eat system RAM, you need to disable it in BIOS. If you don't, and just leave it inactive, it would still steal system RAM for itself.

3. If you like Xonotic so much, they actually have a Linux version, both 32-bit and 64-bit. You don't need WINE for that.
re: #2. I have to assume that this thing is auto-detected/configured/enabled/disabled, as & when. There is absolutely nothing in the UEFI/BIOS to cover doing it manually, but that's typical of HP; the original BIOS in my previous machine was extremely simplified, too. (Compaq were bought out by HP, way back in 2002).

re: #3. Thanks for the tip about the Linux download. I've obtained it, and it works very well. The only reason I've been running it under WINE is because I get a lot of 'portable' Windoze apps from PortableApps.com, and came across it one day when I was browsing their listings. The old Compaq never had the grunt to run it, but trying it out the other day on the new machine it works surprisingly well. Not being an habitual gamer, I hadn't bothered to look for a Linux version.

(As an aside, the game and the card are actually a lot closer together than you realise, date-wise. According to the Wikipedia article, the first release of Xonotic was in fact on the 8th September, 2011:-

https://en.wikipedia.org/wiki/Xonotic )


Mike. :wink:

jamesbond
Posts: 3433
Joined: Mon 26 Feb 2007, 05:02
Location: The Blue Marble

#18 Post by jamesbond »

Mike Walsh wrote:re: #2. I have to assume that this thing is auto-detected/configured/enabled/disabled, as & when. There is absolutely nothing in the UEFI/BIOS to cover doing it manually, but that's typical of HP; the original BIOS in my previous machine was extremely simplified, too. (Compaq were bought out by HP, way back in 2002).
Oh, I see. Ok. Well that's too bad then. If you really want to know then open terminal and run "free -h" before and after you install the graphics card. But I suppose there is no point in doing that because if the result is unexpected, there is nothing you can do anyway due to the BIOS restriction.
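jamesbond's 'free -h' comparison can also be read straight from /proc/meminfo, which is where free gets its figures anyway; RAM the firmware carves out for the integrated GPU simply never appears in MemTotal. A minimal sketch:

```shell
# Read the same figure 'free -h' reports, straight from /proc/meminfo.
# Note the reading, swap the card (or toggle the iGPU), and compare:
# memory the firmware reserves for the iGPU never shows up in MemTotal.
MEMTOTAL_KB=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
echo "MemTotal: ${MEMTOTAL_KB} kB"
```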
Thanks for the tip about the Linux download. I've obtained it, and it works very well.
No worries. I find the game enjoyable too. Played it with my kids from the time they were easy targets until the point where __I__ became the easy target :lol:
(As an aside, the game and the card are actually a lot closer together than you realise, date-wise. According to the Wikipedia article, the first release of Xonotic was in fact on the 8th September, 2011:-

https://en.wikipedia.org/wiki/Xonotic )
And if you read a bit further, Xonotic is actually a fork of Nexuiz. Nexuiz development started in 2001, using a modified Quake engine. Quake was a game from the late 1990s (whose engine was open-sourced). So my original point stands :D

User avatar
rcrsn51
Posts: 13096
Joined: Tue 05 Sep 2006, 13:50
Location: Stratford, Ontario

#19 Post by rcrsn51 »

re: #2. I have to assume that this thing is auto-detected/configured/enabled/disabled, as & when. There is absolutely nothing in the UEFI/BIOS to cover doing it manually,
In my experience, this is the usual procedure. I assumed it was because the add-on card uses the same IRQ numbers as the on-board adapter, which then gets hidden automatically.

User avatar
Mike Walsh
Posts: 6351
Joined: Sat 28 Jun 2014, 12:42
Location: King's Lynn, UK.

#20 Post by Mike Walsh »

FOOTNOTE:-

We-e-e-elll.....that was a very short-lived experiment. I've proved the point to my own satisfaction, though; the card had been sitting in the back of a cupboard for over 4 years - never worked with the old Compaq - and I was dying to try it out.

However, tech has come on a long way in the last 5 years. The 'on-die' GPUs that are now built into many CPUs are comparable in performance to some of the higher-end cards from 5 years ago. The GeForce 210's fan is incredibly noisy for such a tiny thing; 'twas driving me to distraction. And since installing it on Friday, I've had a very odd network problem. Instead of connecting to my router, it's insisted on connecting to a peculiar external IP that I've never seen before. The tray icon, whether SNS or Network Wizard, has proclaimed there's an active connection.....but every one of my browsers has told a different tale. No connection; no DNS resolution.

Even the firewall's been complaining.....the only way to get a 'proper' connection has been to re-boot the router, followed by re-booting Puppy. And that's plain ridiculous.

So; a couple of hours ago, out came the 210. Graphics are now running from the 'on-die' Intel UHD 610 GPU once again; curiously, even the afore-mentioned 'glitch' on-screen has now disappeared. (Which I'm not complaining about!) The router/network are now behaving perfectly.....and peace & quiet reigns once more. It's a no-brainer, really; the 'on-die' GPU has six times as many cores, is way more advanced than the 210 in every respect......and is silent!!! Yay!

------------------------------------------

What I take away from this is that it doesn't always do to try and mix tech which is more than a decade apart in development terms.....especially given the breakneck pace of that development. I'm guessing that somewhere there must have been IRQ conflicts or other similar things going on. The Phoenix Award BIOS on the old Compaq was very good at telling me stuff like that; on this modern HP, the BIOS/UEFI 'hides' so much stuff from you it's crazy. And there's no visible method for toggling it to give more.....

I'll wait until I can afford a more modern GPU card before I try this out again. Frankly, it'll be cheaper to buy another stick of RAM for the UHD 610 to poach as VRAM.....

(Just as a demo, have a look here:-

http://madebyevan.com/webgl-water/

It's a WebGL demo gjuhasz pointed me towards in my 'portable-Chrome' thread last night. When running this with the GeForce card, not only was it starting to run quite hot, but the Pentium was also up to around 70°C+ from its usual 35-40.

Running it again just now from the 'on-die' GPU, all I see is the temperatures rising to the low 50s. And that's it.)


Mike. :wink:
Last edited by Mike Walsh on Mon 10 Feb 2020, 16:36, edited 1 time in total.

Post Reply