OpenGL for ZSNES under Win.

MaxSt
ZSNES Developer
ZSNES Developer
Posts: 113
Joined: Wed Jul 28, 2004 7:07 am
Location: USA
Contact:

Post by MaxSt »

Nightcrawler wrote: The backbuffer does not have to be in video memory.
If it's not in video memory, then how will the GPU be able to filter it?

MaxSt.
byuu

Post by byuu »

Nightcrawler wrote:The backbuffer does not have to be in video memory.
In DDraw, you can use a system RAM buffer and then use Blt()/BltFast() to draw it on screen, instead of using a backbuffer. The result will never be filtered by the graphics card, you lose any and all hardware acceleration, and you cannot use Flip() [page flipping] for smoother rendering.
Since I believe you were referring to D3D, though: if you specify a backbuffer in D3D (as you were intending to use DrawPrimitive to draw your textured quad sprites), the backbuffer must be in video RAM. lpd3dd->Present(...) flips the backbuffer and the main screen video pointers around, and that wouldn't work if the backbuffer were in system RAM. I know of no way to use DrawPrimitive on system memory, and to reiterate: it wouldn't be worth it, as you would lose all video acceleration. Hardware acceleration works because the video card does all video manipulation within the card itself. So unless the video card natively supports the filtering you want, you lose hardware acceleration by adding it.
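(For anyone following along, here is a rough sketch of the system-RAM path being described, using the DirectDraw 7 interfaces. This is not ZSNES code: the names and the 2x window rect are made up, and error checking, the clipper, and pixel-format handling are left out.)
[code]
#include <windows.h>
#include <ddraw.h>   // link with ddraw.lib and dxguid.lib

LPDIRECTDRAW7        g_dd      = NULL;
LPDIRECTDRAWSURFACE7 g_primary = NULL;  // what the monitor shows
LPDIRECTDRAWSURFACE7 g_sysbuf  = NULL;  // the 256x224 frame, kept in system RAM

bool InitSurfaces(HWND hwnd)
{
    if (FAILED(DirectDrawCreateEx(NULL, (void**)&g_dd, IID_IDirectDraw7, NULL)))
        return false;
    g_dd->SetCooperativeLevel(hwnd, DDSCL_NORMAL);

    DDSURFACEDESC2 ddsd;
    ZeroMemory(&ddsd, sizeof(ddsd));
    ddsd.dwSize         = sizeof(ddsd);
    ddsd.dwFlags        = DDSD_CAPS;
    ddsd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;
    if (FAILED(g_dd->CreateSurface(&ddsd, &g_primary, NULL)))
        return false;

    // Offscreen surface explicitly placed in system memory: the GPU never
    // sees these pixels until the Blt, so it cannot filter them.
    ZeroMemory(&ddsd, sizeof(ddsd));
    ddsd.dwSize         = sizeof(ddsd);
    ddsd.dwFlags        = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
    ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;
    ddsd.dwWidth        = 256;
    ddsd.dwHeight       = 224;
    return SUCCEEDED(g_dd->CreateSurface(&ddsd, &g_sysbuf, NULL));
}

void PresentFrame()
{
    // Each frame: Lock() g_sysbuf, write the emulated image, Unlock(),
    // then stretch-blit to the window. No Flip(), no hardware filtering.
    RECT src = { 0, 0, 256, 224 };
    RECT dst = { 0, 0, 512, 448 };   // e.g. a 2x window (clipper not shown)
    g_primary->Blt(&dst, g_sysbuf, &src, DDBLT_WAIT, NULL);
}
[/code]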

To be honest, your best bet would be to double the size of your tilesets and sprites and filter them individually before uploading them to the video card as textures. The effect would be slightly less pleasing to the eye than a full-screen filter if you don't anti-alias the edges with varying alpha levels to make things blend smoothly, but it would work...
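(A rough sketch of that per-tile approach, with the caveat that the filter shown here is the much simpler Scale2x/AdvMAME2x rule rather than hq2x, and that real code would also have to deal with palettes, transparency, and blending across tile edges:)
[code]
#include <cstdint>
#include <vector>

// Doubles one w x h tile of 32-bit pixels on the CPU using the Scale2x rule.
// The result is what you would upload to the card as the tile's texture.
std::vector<uint32_t> Scale2xTile(const uint32_t* src, int w, int h)
{
    std::vector<uint32_t> dst(size_t(w) * 2 * h * 2);
    auto at = [&](int x, int y) {               // clamp to the tile border
        if (x < 0) x = 0; else if (x >= w) x = w - 1;
        if (y < 0) y = 0; else if (y >= h) y = h - 1;
        return src[y * w + x];
    };
    for (int y = 0; y < h; ++y)
    for (int x = 0; x < w; ++x) {
        uint32_t B = at(x, y - 1), D = at(x - 1, y), E = at(x, y),
                 F = at(x + 1, y), H = at(x, y + 1);
        uint32_t E0 = (D == B && B != F && D != H) ? D : E;   // top-left
        uint32_t E1 = (B == F && B != D && F != H) ? F : E;   // top-right
        uint32_t E2 = (D == H && D != B && H != F) ? D : E;   // bottom-left
        uint32_t E3 = (H == F && D != H && B != F) ? F : E;   // bottom-right
        uint32_t* out = &dst[size_t(y) * 2 * (w * 2) + x * 2];
        out[0]     = E0;   out[1]         = E1;
        out[w * 2] = E2;   out[w * 2 + 1] = E3;
    }
    return dst;   // 2w x 2h pixels, ready for glTexImage2D / CreateTexture
}
[/code]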
MaxSt wrote:There is no such thing.
All current hardware is limited to only 2 magnification filters - point filtering and bilinear filtering.
No other hardware magnification filters exist.
Exactly. And despite this simple and obvious truth, that never stops anyone from wanting every last graphics API in the world added. I would like to see what can be done with pixel shaders, as I know nothing about them... but adding them seems like a bad idea for another 5 years while video cards catch up, anyway.

I don't think I've ever heard of a consumer-level graphics card supporting OpenGL better than DDraw. And even if one does, you're only copying a 256x224 RAM buffer onto the video card; does the speed really even matter? But I guess it doesn't hurt having the option there to choose between the two, just in case... not up to me anyway.
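(Since the thread is about an OpenGL path, here is roughly what that per-frame copy looks like: upload the 256x224 buffer into a texture, pick one of the two hardware magnification filters mentioned above, and draw a quad. The names and the 512x256 texture size are illustrative only; real code also has to handle the pixel format, aspect ratio, orientation, and vsync.)
[code]
#include <GL/gl.h>
#include <cstdint>

GLuint g_frameTex = 0;

// One-time setup. Older cards want power-of-two textures, hence 512x256.
void InitFrameTexture(bool bilinear)
{
    glGenTextures(1, &g_frameTex);
    glBindTexture(GL_TEXTURE_2D, g_frameTex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 256, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    // The entire "filter" choice the fixed-function hardware offers:
    GLint f = bilinear ? GL_LINEAR : GL_NEAREST;   // bilinear or point
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, f);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, f);
}

// Per frame: copy the RAM buffer to the card and draw one textured quad.
void DrawFrame(const uint32_t* frame /* 256x224 RGBA pixels */)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, g_frameTex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 256, 224,
                    GL_RGBA, GL_UNSIGNED_BYTE, frame);
    const float u = 256.0f / 512.0f, v = 224.0f / 256.0f;
    glBegin(GL_QUADS);                 // orientation handling omitted
        glTexCoord2f(0, v); glVertex2f(-1.0f,  1.0f);
        glTexCoord2f(u, v); glVertex2f( 1.0f,  1.0f);
        glTexCoord2f(u, 0); glVertex2f( 1.0f, -1.0f);
        glTexCoord2f(0, 0); glVertex2f(-1.0f, -1.0f);
    glEnd();
    // SwapBuffers(hdc) happens in the caller.
}
[/code]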
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

Well, much along the same lines: Windows XP Media Center Edition, Zoom Player, MAME, and a few other programs are now using Direct3D for display instead of DDraw despite being 2D-only; even Xbox Media Center does.

As for what pixel/fragment shaders can do, well... almost anything, really. They basically let you use your GPU as a generic processing device.

Look here http://www.beyond3d.com/articles/shadercomp/results/ for a D3D Pixel Shader 2.0 program that runs a version of Frogger entirely on the GPU (game logic, hit detection, etc.); the only thing your main CPU does is input, since the video card doesn't have keyboard access.
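(To make the "generic processing device" idea a little more concrete: with the GLSL interfaces, which are core in OpenGL 2.0 or available through the older ARB extensions, the host side boils down to handing the driver a small program as text, which then runs once per output pixel. The shader body below is a meaningless placeholder; anything expressible as "one result per pixel, inputs fetched from textures" can be moved onto the GPU the same way.)
[code]
#include <GL/glew.h>   // or fetch the GL 2.0 entry points via wglGetProcAddress
#include <cstdio>

static const char* kFragSrc =
    "uniform sampler2D state;\n"
    "void main() {\n"
    "    vec4 v = texture2D(state, gl_TexCoord[0].xy);\n"
    "    gl_FragColor = v * v;   // placeholder per-pixel computation\n"
    "}\n";

GLuint BuildProgram()
{
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &kFragSrc, NULL);
    glCompileShader(fs);

    GLint ok = GL_FALSE;
    glGetShaderiv(fs, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[1024];
        glGetShaderInfoLog(fs, sizeof(log), NULL, log);
        fprintf(stderr, "shader compile failed: %s\n", log);
        return 0;
    }
    GLuint prog = glCreateProgram();
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;   // glUseProgram(prog), bind the input texture, draw a quad
}
[/code]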

EDIT-Oh, and I wouldn't say shaders are 5 years off. Pixel Shader 2.0 capable hardware has been around since late 2002. Any nVidia card from GeforceFX and up, or Radeon 9500 and up can run PS 2.0 programs just fine(although the low end GeforceFX models are extremely slow at floating point math).
Noxious Ninja
Dark Wind
Posts: 1271
Joined: Thu Jul 29, 2004 8:58 pm
Location: Texas
Contact:

Post by Noxious Ninja »

There's also

http://www.gpgpu.org/
http://graphics.stanford.edu/projects/brookgpu/
Reznor007 wrote:EDIT-Oh, and I wouldn't say shaders are 5 years off. Pixel Shader 2.0 capable hardware has been around since late 2002. Any nVidia card from GeforceFX and up, or Radeon 9500 and up can run PS 2.0 programs just fine(although the low end GeforceFX models are extremely slow at floating point math).
Yeah, but people are still running ZSNES on original Pentiums.
[u][url=http://bash.org/?577451]#577451[/url][/u]
bohdy
Rookie
Posts: 13
Joined: Sun Feb 13, 2005 9:28 pm

Re: OpenGL for ZSNES under Win.

Post by bohdy »

MaxSt wrote:
pagefault wrote:Plus it allows support of new filters that work on 3d hardware.
There is no such thing.
All current hardware is limited to only 2 magnification filters - point filtering and bilinear filtering.
No other hardware magnification filters exist.

MaxSt.
You forgot to mention cubic (flat and Gaussian) filtering, but that is beside the point. He was talking about fragment-program-based filters that are in fact executed by the hardware (and that have been mentioned many times in this topic).
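(As a concrete example of a fragment-program filter that is neither point nor bilinear, here is the Scale2x rule from earlier in the thread expressed as GLSL, stored as a C++ string and compiled like the host-side sketch a few posts up. It is deliberately much simpler than hq2x; orientation conventions and the question of whether a given filter fits a given shader model are glossed over, which is exactly what the rest of this thread is arguing about.)
[code]
// Scale2x as a fragment program: drawn over a quad at 2x (or larger),
// with the source texture set to GL_NEAREST so the 5 taps are exact texels.
static const char* kScale2xFrag = R"(
uniform sampler2D frame;     // the 256x224 source image
uniform vec2 sourceSize;     // vec2(256.0, 224.0)

bool eq(vec4 a, vec4 b) { return all(lessThan(abs(a - b), vec4(1.0 / 255.0))); }

void main() {
    vec2 texel = 1.0 / sourceSize;
    vec2 uv = gl_TexCoord[0].xy;
    vec4 E = texture2D(frame, uv);                          // centre
    vec4 B = texture2D(frame, uv + vec2( 0.0, -texel.y));   // neighbours
    vec4 D = texture2D(frame, uv + vec2(-texel.x, 0.0));
    vec4 F = texture2D(frame, uv + vec2( texel.x, 0.0));
    vec4 H = texture2D(frame, uv + vec2( 0.0,  texel.y));

    // Which quadrant of the source pixel does this output pixel land in?
    vec2 q = step(vec2(0.5), fract(uv * sourceSize));

    vec4 top = (q.x < 0.5)
        ? ((eq(D, B) && !eq(B, F) && !eq(D, H)) ? D : E)
        : ((eq(B, F) && !eq(B, D) && !eq(F, H)) ? F : E);
    vec4 bottom = (q.x < 0.5)
        ? ((eq(D, H) && !eq(D, B) && !eq(H, F)) ? D : E)
        : ((eq(H, F) && !eq(D, H) && !eq(B, F)) ? F : E);

    gl_FragColor = (q.y < 0.5) ? top : bottom;
}
)";
[/code]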
byuu

Post by byuu »

Reznor007 wrote:EDIT-Oh, and I wouldn't say shaders are 5 years off. Pixel Shader 2.0 capable hardware has been around since late 2002. Any nVidia card from GeforceFX and up, or Radeon 9500 and up can run PS 2.0 programs just fine(although the low end GeforceFX models are extremely slow at floating point math).
I have a Radeon 7500 and a GeForce 4 MX. I have absolutely no intention of upgrading my video cards again until they're both dead from old age/use. I would imagine there are many people such as myself out there, and thus it would not be beneficial to rely on the technology being there for at least a few years. The technology should only be used in 3D games, where everyone upgrades their graphics cards biannually.
<sarcastically> I want pictures, dammit >_<
bohdy wrote:You forgot to mention cubic (flat and Gaussian) filtering, but that is beside the point.
Those are designed for polygon filtering. But yes, beside your point, I suppose.
Is anyone ever going to post pictures of this so-called hq2x pixel shader? I'd like to see how it looks for myself.
Last edited by byuu on Mon Feb 28, 2005 4:22 am, edited 1 time in total.
Nach
ZSNES Developer
ZSNES Developer
Posts: 3904
Joined: Tue Jul 27, 2004 10:54 pm
Location: Solar powered park bench
Contact:

Post by Nach »

byuusan wrote: I have a Radeon 7500 and a GeForce 4 MX. I have absolutely no intention of upgrading my video cards again until they're both dead from old age/use.
I still have a GeForce 2 in my main machine, and I pretty much have no reason to upgrade.
Maybe I'll upgrade when they make a decent GCN emulator that offloads some work onto the GPU.
May 9 2007 - NSRT 3.4, now with lots of hashing and even more accurate information! Go download it.
_____________
Insane Coding
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

Noxious Ninja wrote:There's also

http://www.gpgpu.org/
http://graphics.stanford.edu/projects/brookgpu/
Reznor007 wrote:EDIT-Oh, and I wouldn't say shaders are 5 years off. Pixel Shader 2.0 capable hardware has been around since late 2002. Any nVidia card from GeforceFX and up, or Radeon 9500 and up can run PS 2.0 programs just fine(although the low end GeforceFX models are extremely slow at floating point math).
Yeah, but people are still running ZSNES on original Pentiums.
Yeah, but there's nothing stopping them from using older versions of ZSNES. That's something I like about MAME development: the devs don't get too bothered if a new feature requires new hardware. If people insist on using old hardware, or have to, they can use older versions of the software.
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

Nach wrote:
byuusan wrote: I have a Radeon 7500 and a GeForce 4 MX. I have absolutely no intention of upgrading my video cards again until they're both dead from old age/use.
I still have a GeForce 2 in my main machine, and I pretty much have no reason to upgrade.
Maybe when they make a decent GCN emulator which offloads some work onto the GPU will I upgrade.
The emulator Dolphin uses Pixel shader 2.0 for graphics. I'm not sure how compatible it is, but I've managed to run Ikaruga on it.
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

byuusan wrote:
Reznor007 wrote:EDIT-Oh, and I wouldn't say shaders are 5 years off. Pixel Shader 2.0 capable hardware has been around since late 2002. Any nVidia card from GeforceFX and up, or Radeon 9500 and up can run PS 2.0 programs just fine(although the low end GeforceFX models are extremely slow at floating point math).
I have a Radeon 7500 and a GeForce 4 MX. I have absolutely no intention of upgrading my video cards again until they're both dead from old age/use. I would imagine there are many people such as myself out there, and thus it would not be beneficial to rely on the technology being there for at least a few years. The technology should only be used in 3D games, where everyone upgrades their graphics cards biannually.
Even bi-annually right now would be a card from early 2003, which would be a Radeon 9500 Pro for $200 US (at the time). Even the GeForce 3/4 (non-MX models) and Radeon 8500 have Pixel Shader 1.x capabilities.
byuu

Post by byuu »

Reznor007 wrote:Even bi-annually right now would be a card from early 2003, which would be a Radeon 9500 Pro for $200 US (at the time). Even the GeForce 3/4 (non-MX models) and Radeon 8500 have Pixel Shader 1.x capabilities.
... biannually means twice a year. Perhaps you were thinking of biennial? And I was saying that only 3D gamers waste their money like that. As I said above, I have a Radeon 7500 and I don't plan to upgrade for at least 3-5 years, or until the card dies of old age. Preferably the latter.
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

byuusan wrote:
Even bi-annually right now would be a card from early 2003, which would be a Radeon 9500 pro for $200 US(at the time). Even Geforce3/4(non-MX models) and Radeon 8500 have Pixel shader 1.x capabilities.
... biannually means twice a year. Perhaps you were thinking of biennial? And I was saying that only 3D gamers waste their money like that. As I said above, I have a Radeon 7500 and I don't plan to upgrade for at least 3-5 years, or until the card dies of old age. Preferably the latter.
Yeah...misread it, my fault.

But saying only hardcore gamers have new hardware isn't really relevant, as hardware capable of running shaders isn't new. While doing some research for something else, I found that ATI alone had sold 1 million DX9 (Shader 2.0-capable) video chips as of March 3, 2003. That's 1 million top-of-the-line cards sold in less than 6 months. It's been 2 years since then, and prices have dropped a lot.
byuu

Post by byuu »

Reznor007 wrote:While doing some research for something else, I found that ATI alone had sold 1 million DX9 (Shader 2.0-capable) video chips as of March 3, 2003. That's 1 million top-of-the-line cards sold in less than 6 months. It's been 2 years since then, and prices have dropped a lot.
You've got me there. I can only speak for myself, so I have no idea what percentage of people have PS 2.0-capable graphics cards. It seems that most people on these forums fancy 486s and first-generation Pentiums and expect perfect full-speed 60fps emulation with sound, however... :roll:
MaxSt
ZSNES Developer
ZSNES Developer
Posts: 113
Joined: Wed Jul 28, 2004 7:07 am
Location: USA
Contact:

Re: OpenGL for ZSNES under Win.

Post by MaxSt »

bohdy wrote:You forgot to mention cubic (flat and Gaussian) filtering
I didn't forget. No current hardware supports such things.
bohdy wrote:He was talking about fragment-program-based filters that are in fact executed by the hardware
There is no proof that it's possible to implement filters as complex as hq2x in shaders.

MaxSt.
Nightcrawler
Romhacking God
Posts: 922
Joined: Wed Jul 28, 2004 11:27 pm
Contact:

Re: OpenGL for ZSNES under Win.

Post by Nightcrawler »

MaxSt wrote:
bohdy wrote:He was talking about fragment-program-based filters that are in fact executed by the hardware
There is no proof that it's possible to implement filters as complex as hq2x in shaders.

MaxSt.
I forget the exact details, but the Pixel Shader 3.0 spec is supported by the new nVidia cards, and it's supposed to allow a significantly larger amount of code space for pixel shader programs than 2.0. Something like HQ2x would probably be more feasible with a 3.0-capable card.
[url=http://transcorp.romhacking.net]TransCorp[/url] - Home of the Dual Orb 2, Cho Mahou Tairyku Wozz, and Emerald Dragon SFC/SNES translations.
[url=http://www.romhacking.net]ROMhacking.net[/url] - The central hub of the ROM hacking community.
Clements
Randomness
Posts: 1172
Joined: Wed Jul 28, 2004 4:01 pm
Location: UK
Contact:

Post by Clements »

Pixel Shader 3.0 allows for 32,768 maximum instructions, while the GeForce FX handles up to 512.
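(For anyone curious what their own card actually reports, the ARB_fragment_program limits can be queried directly; a minimal check, assuming the extension is present and its entry points are loaded:)
[code]
#include <GL/glew.h>   // or load glGetProgramivARB via wglGetProcAddress
#include <cstdio>

void PrintFragmentProgramLimits()
{
    GLint total = 0, native = 0;
    glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                      GL_MAX_PROGRAM_INSTRUCTIONS_ARB, &total);
    glGetProgramivARB(GL_FRAGMENT_PROGRAM_ARB,
                      GL_MAX_PROGRAM_NATIVE_INSTRUCTIONS_ARB, &native);
    printf("fragment program instruction limit: %d (native: %d)\n",
           total, native);
}
[/code]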
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

byuusan wrote:
Reznor007 wrote:While doing some research for something else, I found that ATI alone had sold 1 million DX9 (shader 2.0 capable) video chips as of March 3 2003. That's 1 million top of the line cards sold in less than 6 months. It's been 2 years since then, and prices have dropped alot.
You've got me there. I can only speak for myself, thusly I have no idea what percentage of people have PS2.0-capable graphics cards. It seems that most people on these forums fancy 486's and first-generation Pentiums and expect perfect full speed 60fps emulation with sound, however... :roll:
Yeah, that kind of stuff is annoying, but once old hardware starts holding things back, you just have to stop supporting it. The next version of Windows (Longhorn) is going to require Pixel Shader 2.0 hardware as a minimum. It may seem a bit extreme to draw the line there, but when you think about it, by the time it comes out, capable hardware will have been around for about 4-5 years.

This applies even to ZSNES. Changing from ASM to C is going to slow it down a bit, and improving emulation accuracy and timing will slow it down further, but it's the only way to make it better. MAME just recently dropped their 68000 ASM CPU core in favor of the more accurate C version.
funkyass
"God"
Posts: 1128
Joined: Tue Jul 27, 2004 11:24 pm

Post by funkyass »

Unless someone can cough up a shader that'll do transparency rendering entirely on the GPU, all this talk of hardware acceleration is useless.
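(For what it's worth, the arithmetic half of SNES transparency, the color math itself, is just a per-pixel add or average of the main screen and sub screen, which a fragment program handles trivially; the hard part funkyass is pointing at is everything around it: per-layer enables, color windows, subtract mode, and producing both screens in the first place. A toy sketch of only the blend step, with those parts deliberately left out:)
[code]
// Assumes the main screen and sub screen have already been rendered into
// two textures. Windows, per-layer enables, and subtract mode are exactly
// the parts this sketch does NOT cover.
static const char* kColorMathFrag = R"(
uniform sampler2D mainScreen;
uniform sampler2D subScreen;
uniform float halve;    // 1.0 = add-and-halve (average), 0.0 = plain add
void main() {
    vec2 uv  = gl_TexCoord[0].xy;
    vec4 m   = texture2D(mainScreen, uv);
    vec4 s   = texture2D(subScreen,  uv);
    vec4 add = clamp(m + s, 0.0, 1.0);          // additive color math
    gl_FragColor = mix(add, (m + s) * 0.5, halve);
}
)";
[/code]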
byuu

Post by byuu »

Reznor007 wrote:The next version of Windows (Longhorn) is going to require Pixel Shader 2.0 hardware as a minimum.
I heard that was just to get all the effects. There's supposed to be a low-quality mode as well. I'm going to hold out on upgrading to that as long as I can, as well :)
Reznor007 wrote:MAME just recently dropped their 68000 ASM CPU core in favor of the more accurate C version.
Really, just recently? Everyone always mentions how concerned MAME is with accuracy... I'm surprised they ever used Starscream over the Musashi 68K core...
Noxious Ninja
Dark Wind
Posts: 1271
Joined: Thu Jul 29, 2004 8:58 pm
Location: Texas
Contact:

Post by Noxious Ninja »

byuusan wrote:
The next version of Windows(Longhorn) is going to require Pixel shader 2.0 hardware as a minimum.
I heard that was just to get all the effects. There's supposed to be a low-quality mode as well. I'm going to hold out upgrading that as long as I can, as well :)
I'm hoping Wine continues its great progress. There are only a few things left that make me need Windows.
[u][url=http://bash.org/?577451]#577451[/url][/u]
Nightcrawler
Romhacking God
Posts: 922
Joined: Wed Jul 28, 2004 11:27 pm
Contact:

Post by Nightcrawler »

Clements wrote:Pixel Shader 3.0 allows for 32,768 maximum instructions, while the GeForce FX handles up to 512.
Thanks... that will certainly allow for MUCH more complex algorithms than 512 instructions could ever hope to achieve!

ATI chose not to support 3.0 in their current generation. I feel 3.0 support will help the longevity of the new GeForce cards. No one takes advantage of 3.0 yet... but I'm sure they will soon enough, and since it's a major code space advantage, I think it's worth having if you're buying a new card and are into the new 3D games.

I only have a Geforce 2. I don't have ANY shaders of any kind! haha
I don't even have a hardware T&L engine!
[url=http://transcorp.romhacking.net]TransCorp[/url] - Home of the Dual Orb 2, Cho Mahou Tairyku Wozz, and Emerald Dragon SFC/SNES translations.
[url=http://www.romhacking.net]ROMhacking.net[/url] - The central hub of the ROM hacking community.
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

byuusan wrote:
The next version of Windows(Longhorn) is going to require Pixel shader 2.0 hardware as a minimum.
I heard that was just to get all the effects. There's supposed to be a low-quality mode as well. I'm going to hold out upgrading that as long as I can, as well :)
MAME just recently dropped their 68000 ASM CPU core in favor of the more accurate C version.
Really, just recently? Everyone always mentions how concerned MAME is with accuracy... I'm surprised they ever used starscream over the musashi 68k core...
Well, for a long time they had been compiling the official binaries with the C core, but the ASM core was left in as a compile-time option for people who wanted it. A few versions back the ASM code was completely ripped out. Even the C 68K core has been updated a lot lately, to support the Sega/Hitachi FD1089 and FD1094 encrypted 68Ks (very complex encryption).

I have heard that Longhorn will have a fallback mode for old hardware, simply because some new hardware wouldn't be fully compliant (Intel integrated crap, S3, etc.). I seriously wish Intel would stop making video chips. They all suck, and it does nothing but hold back PC graphics. Old hardware will require new drivers though, as MS is making a new system called WGF (Windows Graphics Foundation) which requires a new driver architecture.
Reznor007
Lurker
Posts: 118
Joined: Fri Jul 30, 2004 8:11 am
Contact:

Post by Reznor007 »

Nightcrawler wrote:
Clements wrote:Pixel Shader 3.0 allows for 32,768 maximum instructions, while the GeForce FX handles up to 512.
Thanks... that will certainly allow for MUCH more complex algorithms than 512 instructions could ever hope to achieve!

ATI chose not to support 3.0 in their current generation. I feel 3.0 support will help the longevity of the new GeForce cards. No one takes advantage of 3.0 yet... but I'm sure they will soon enough, and since it's a major code space advantage, I think it's worth having if you're buying a new card and are into the new 3D games.

I only have a Geforce 2. I don't have ANY shaders of any kind! haha
I don't even have a hardware T&L engine!
Here are the official differences between PS 2.0 and PS 3.0:
http://www.microsoft.com/whdc/winhec/pa ... VIDIA.mspx

The GeForce 2 actually does have hardware T&L. That was the big new feature for the GeForce 1/2 line over the old TNT line. GeForce 3/4 added the Shader 1.x stuff, the GeForce FX was Shader 2.0, and the GeForce 6800 is 3.0.

On ATI's side, the Radeon/Radeon 7x00 are hardware T&L (DX7), the 8500/9200 are Shader 1.x (DX8), and the 9500+/X700+ are Shader 2.x (DX9).

I myself use a Radeon 9600 128MB that I got late November 2003 for $75 at Best Buy. I'm not sure if I want to wait and upgrade to PCI Express next, or get a higher end AGP card...
Noxious Ninja
Dark Wind
Posts: 1271
Joined: Thu Jul 29, 2004 8:58 pm
Location: Texas
Contact:

Post by Noxious Ninja »

The GeForce FX line was pretty bad. nVidia is lucky there weren't many heavy DX9 games yet. Fortunately, the GeForce 6 series is much better.
[u][url=http://bash.org/?577451]#577451[/url][/u]
MaxSt
ZSNES Developer
ZSNES Developer
Posts: 113
Joined: Wed Jul 28, 2004 7:07 am
Location: USA
Contact:

Post by MaxSt »

Clements wrote:Pixel Shader 3.0 allows for 32,768 maximum instructions, while the GeForce FX handles up to 512.
hq4x32.obj = 300Kb.

MaxSt.