RTX 3090 worth it?

Why not work with 8/16-bit TIFF?

That can be done, but it is relatively inefficient.

The primary problem is that unless the process of decompressing, deinterlacing and denoising can be done perfectly, the flaws are permanently baked into the pictures.

Secondly, an image sequence can’t carry an audio track, which can lead to audio sync complications.

Generally, exporting to a stream of single images is most useful to prepare for re-editing in a graphics editing program that can run macros on each image to effectively animate special effects into the footage. It is also a useful way to find the single frame that will be “perfect” for a still graphic composition, such as a poster or a photo for an article.

As such, a truly lossless CODEC is still the best way to go.
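For example, FFmpeg’s FFV1 is one truly lossless option that keeps the audio track intact. A minimal sketch (the filenames are placeholders, and FFV1 is just one of several lossless codecs you could pick):

```python
# Sketch: export a truly lossless intermediate while keeping the audio,
# using FFmpeg's FFV1 codec. Filenames here are placeholders.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "input.mp4",   # source clip
        "-c:v", "ffv1",      # lossless video codec
        "-c:a", "copy",      # pass the audio track through untouched
        "lossless_master.mkv",
    ],
    check=True,
)
```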


Hah, I’m an idiot, I did forget about the audio.


I’m really pissed off about this benchmark, and I’ll spare you my swear words.

That the author has no idea what the GPUs accelerate within the apps can be seen in his comment on memory: he thinks the size plays a role and makes the apps faster, but you could run 12x Capture One in the 4090’s memory; C1 only needs about 2 GB.

In Capture One, the on-screen display is also GPU-accelerated; without the GPU it is jerky even with the 24-core 3960X.

And what really annoys me is that the Topaz Labs products were left out, and Neat Image, Helicon Focus, PTGui and DxO PL6 too, all GPU-accelerated, all specialised pro-user apps.

Of course, you can also get by with a slow GPU if you use the wrong apps and work like it’s 15 years ago.

GPGPU computing is always a symbiosis: a CPU with the highest clock speed, the fastest RAM connection and the widest PCIe bandwidth, paired with a GPU that has the fastest memory connection to the most cores.


And why weren’t additional Radeon GPUs tested? Are they all afraid that team green would suddenly be overtaken by a mid-range model?

I’ve seen someone use an outdated Gigapixel version to make Radeon look bad.

I remember in 2020 when I remotely tested a Vega Pro VII and was told that a Radeon Pro W5700 was just as fast in Denoise as an RTX 6000.

I didn’t want to hear that back then either.

https://petapixel.com/2022/10/17/the-nvidia-rtx-4090-is-amazing-and-photographers-should-not-buy-it/

I’m not very concerned about whether they tested any Topaz apps or not. I really do disagree about how useful all that GPU power is for still (or single-image) photo/graphics processing.

I am a voracious photographer. It is just a dream come true for someone like me that you can now have a pocket-sized Hasselblad digital camera that can shoot UHD video, with lots of other photo options and XPAN, too. (And yes, it also has a phone built into it. :nerd_face:)

Obviously, enhancing video needs more GPU power than single still images do, but that isn’t the sole consideration.

I own several photo editing apps, several of which make very good use of AI and a powerful GPU; complex editing operations that used to seem to take all day now happen in seconds.

Another thing to consider: making a photo WAY larger than needed for editing and refinement purposes, then reducing it to a ‘normal’ resolution for output. Using that method, coupled with a few editing tricks and a bit of TLC, you can really make those photos pop.
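A minimal sketch of that oversample-then-refine-then-downsample idea, using Pillow (the 4x factor and filenames are my own example values; in practice the upscale step would be an AI upscaler like Gigapixel rather than plain Lanczos):

```python
# Sketch: edit at 4x resolution, then resample down to the output size.
# Plain Lanczos resampling stands in for an AI upscaler here.
from PIL import Image

src = Image.open("photo.jpg")
big = src.resize((src.width * 4, src.height * 4), Image.LANCZOS)

# ... detailed retouching and refinement happens at 4x here ...

out = big.resize((src.width, src.height), Image.LANCZOS)
out.save("photo_final.jpg", quality=95)
```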

There will always be articles (especially on the net) with outrageous headlines that completely ignore the facts and spout outrageous BS on the subject, then sit back and make $$$ on the click-tracking numbers for the advertising that gets washed into your eyeballs when you load their page.

Unfortunately, there are still people out there who will believe anything they read. :roll_eyes:
:thinking:

I think it will be delivered tomorrow morning…
:cowboy_hat_face:


I do not see any advertising.

uBlock & NoScript.


If you have time, I’d be interested in seeing times of Gigapixel, Denoise AI and DxO Deep Prime Extreme Detail.

OK! However, I should also mention that the 3090 I currently have installed is being used at a higher and higher percentage for rendering and preview since the last few beta releases. Very gratifying. But I don’t know whether the improvements Topaz has recently made for the Nvidia 3000 series will also be present for the 4000 series yet. I assume the Nvidia drivers for these cards will be the same in many ways, but I’m certain they will also have differences. We’ll see. I don’t expect to have time to change over to the 4090 for a few more days…

Seems like the 4090’s power connector is melting.

Look over at Reddit.

The PR disaster is already rolling.

@Martyprod

Sounds like a case of cheap power connectors. (I’ve been there on the University-wide level.)

And I’m going to be extra careful when I install mine. It’s still in the box…

Wow, I just read the Reddit thread (it’s very, very long) and an article. They suggest having an ATX 3.0 power supply, as it has no issue with the cables.
Be careful. Is the fault in the cable connectors (4x8 on an ATX 2.0 PSU) or in the one on the card?

MSI Afterburner can reduce the card’s power consumption until there is better information on this…
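If you’d rather script it than click through the Afterburner GUI, nvidia-smi can also cap the board power limit (needs admin rights; the 300 W figure below is only an example, not a recommendation):

```python
# Sketch: cap the GPU's board power limit via nvidia-smi as a stopgap.
# Requires admin rights; 300 W is an example value, not a recommendation.
import subprocess

def set_power_limit(gpu_index=0, watts=300):
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

set_power_limit()
```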

The one on the card.

In combination with the connector, the adapter can develop such a high electrical resistance at the contact points when bent at the connector that the adapter and connector melt.

The adapter must not be bent within the first 3.5 cm, which is quite difficult with the many cables hanging from it and the card being so wide.
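To put rough numbers on why a bad contact is enough (my own back-of-the-envelope figures, not measurements from the thread):

```python
# Back-of-the-envelope contact heating, P = I^2 * R.
# Assumed figures: 600 W at 12 V shared across the 6 power pins.

def contact_heat_watts(load_w=600.0, volts=12.0, pins=6, r_contact_ohm=0.001):
    amps_per_pin = (load_w / volts) / pins  # ~8.3 A per pin
    return amps_per_pin ** 2 * r_contact_ohm

print(contact_heat_watts(r_contact_ohm=0.001))  # good contact, ~1 mOhm: ~0.07 W
print(contact_heat_watts(r_contact_ohm=0.05))   # degraded contact, 50 mOhm: ~3.5 W
```

A few watts concentrated at one tiny contact point is plenty to soften the plastic housing.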


In the two tower cases I’m using, space is a problem.

:scream:

I have too much space

:nerd_face: :sunglasses:


The way I see it, they could have worked around the problem if the adapter had two clamps: then it couldn’t slip, though it would be harder to get loose.

The clamp is on the opposite side from the 12V rails. The grounding is on the clamp side, so the 12V side has the most room to move.

One of the redditors ran the card upright; his cable was barely bent, but that was probably enough to burn it.


Thomas,

As I mentioned earlier, I have my new Gigabyte GeForce RTX 4090 (Gaming OC), but it is still sitting in the carton. I’m waiting for a good timeframe to take everything down and swap out the RTX 3090 Vision OC currently in that system.

Since most of the complaints about melting adapters seem to indicate problems caused by bending the wires too close to the connectors, I decided to check on what Gigabyte furnished with their card.

It appears to be four 8-pin connectors wired separately to a small 12-pin mini connector that fits the video card. Each leg is about 4" long and black fabric covered. There should be no problem bending these cables in a place where it won’t break conductors or cause overheating and shorts. Actually, I’m a bit surprised that they needed to resort to this kind of hook-up arrangement at all. I’m fairly sure that in the near future there will be several 3rd-party cables available that will make this whole problem go away.

One problem that I foresee after doing this install is finding an adequate UPS backup suitable for a 1000 W PSU. Most of the competitively priced UPS units are rated at 1500 VA, which supports roughly an 850-900 watt load. Nearly all the popular manufacturers’ prices rise steeply when the power requirement goes beyond 900 watts. I’m also looking into the idea of using multiple PSUs: one to drive the computer and video card, and an additional one to power the pumps and fan motors. Looks like I may need to do some homework before I start my mods.
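The VA-vs-watts gap is the catch: consumer UPS units are rated in volt-amperes, and the usable wattage is the VA rating times the power factor (roughly 0.6 for typical units in this class; that figure is my assumption, so check your model’s spec sheet). A quick sanity check:

```python
# UPS sizing sanity check: usable watts = VA rating * power factor.
# The 0.6 power factor is typical for consumer units (my assumption).

def usable_watts(va_rating, power_factor=0.6):
    return va_rating * power_factor

print(usable_watts(1500))        # ~900 W: marginal for a fully loaded 1000 W PSU
print(usable_watts(1500, 0.9))   # ~1350 W: higher power-factor units cost more
```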


JayZ did try hard to melt the connector but wasn’t able to.

https://youtu.be/_z58lEnnX1k


Good link! He was showing the same “original” cable that came with my Gigabyte RTX 4090. And I’m gonna buy one of those CableMod cables. They’re way better.

The video clearly shows that JTC didn’t understand the principle of why the connectors melt. He kept bending the cable on the table, but bent cables alone are not the reason the connectors melt.

A poor connection within the connector is the main reason. A small contact area inside the connector increases resistance, which generates heat and melts the plastic. To reproduce the situation you need to put stress on the connector, but he didn’t test that in the video, which is why nothing happened.

CableMod advises not to bend at the connector.
https://cablemod.com/12vhpwr/
It seems that JTC viewed those images on Twitter or only read the title, without reading or understanding the whole text. The guide clearly states that “terminals coming loose or misaligning → may lead to uneven load across the wires → risk of overheating damage”.

Bending the cable while the connector is plugged into the graphics card is what causes the “terminals coming loose or misaligning”; that is the key. If you bend the cable on the table by hand first, then plug it into the graphics card nicely, it won’t cause any problem. That is why he failed to reproduce the situation.

Anyway, I know JTC is a YouTuber, not an engineer; most people watch his videos for fun, not for science. If he melted the connector in front of the camera, Nvidia probably wouldn’t send him review samples anymore. LOL


I like to keep my cables harnessed logically together and fastened so that the wiring most likely to need changing due to component replacements, upgrades, etc., sits as close to the top of the bundle as possible for easy changeout.

The CableMod cable shown in the video link is very attractive, and probably good for those who want their systems to look really impressive through the glass side panels.

Since I try to keep my cable bundles out of sight, this isn’t necessary, but having adequate power to get up to the video card is an important consideration.

I found two cables on Amazon that run power from 3 PCIe power sockets (Corsair) to the single connector on the RTX 4090. It is not as snazzy as the CableMod version but looks to be very well-engineered and uses stranded copper of an appropriate grade. It should not overheat. (See image below.) It costs in the vicinity of $20.00 US.


I am running a 4090 with a 5950X and 64 GB RAM, and you don’t need a 1000 W power supply. 850 W is just fine, especially for Topaz (as opposed to some GPU benchmarking).
The 4090 is about 50% more power efficient than the last generation. It’s also only about 10% faster than a 3090 in Topaz, though (at least in Artemis or Proteus). I am running the 4090 slightly undervolted @ 2600 MHz, and with 3 x 1080p Artemis tasks running simultaneously it consumes about 250 W. Plus 150 W for the 5950X (fixed voltage + frequency), that’s 400 W for GPU + CPU.
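For anyone who wants to sanity-check their own build, the arithmetic is simple (the ~75 W figure for motherboard, RAM, drives and fans is my own rough estimate, not from this thread):

```python
# Rough PSU headroom check using the figures above plus an assumed
# ~75 W for motherboard, RAM, drives and fans.

def psu_load(gpu_w=250, cpu_w=150, rest_w=75, psu_w=850):
    total = gpu_w + cpu_w + rest_w
    return total, total / psu_w

total, fraction = psu_load()
print(f"{total} W load = {fraction:.0%} of an 850 W PSU")  # ~475 W, ~56%
```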
