why not faster

R
Posted By
rjphoto
Sep 7, 2003
Views
233
Replies
17
Status
Closed
I have two machines, AMD Athlons at 900 MHz and 2.1 GHz, both with Asus motherboards, both running WinXP Pro, a newer (faster) video card in the 2.1 GHz machine, both with 512 MB RAM (the 2.1 machine has 400 FSB DDR memory). The 2.1 benchmarks six times faster. So why does PSE2 open pictures at the same speed, run filters (e.g. Despeckle) at the same speed, etc.? Is there a bottleneck in PSE? I have a very large file of an old picture, and it takes 4 minutes on both machines to despeckle. Suggestions? Comments? How do I go faster? Thanks

BB
brent bertram
Sep 7, 2003
You might look at hard drive speed. In Photoshop (and I assume Elements is the same), the hard drive scratch disk is the primary memory for the program. Existing RAM is used as an image cache. Any operation that is dependent on the hard drive will not gain from a faster CPU. That’s what I’m suspecting in your case, anyway.

🙂

Brent
P
Phosphor
Sep 7, 2003
And from my experience – I installed a new hard drive with an 8MB cache while I was still running with an old G3 processor. All Elements functions got noticeably faster. I then switched the processor to a G4 that was not just faster, but also Altivec enhanced. I did pick up a little more speed, but the biggest boost came with the new hard drive (and then the activation of the Altivec plugin for Elements, which is, of course, moot on Win machines.)

Also, how does the amount of RAM compare in the two computers? While Brent says that isn’t as important as the hard drive (and I’m not doubting him ’cause he’s smart!), I also know Elements is very greedy.
PD
Peter Duniho
Sep 8, 2003
"Beth Haney" wrote in message
Also, how does the amount of RAM compare in the two computers? While Brent says that isn’t as important as the hard drive (and I’m not doubting him ’cause he’s smart!), I also know Elements is very greedy.

I didn’t read his message to say that. You are correct that having enough RAM is important, and I think Brent’s message implies the same thing. Having more RAM will help avoid having to use the hard drive for temporary storage, and so, to an even greater degree than getting a faster hard drive (or one with a bigger cache), adding RAM can really help.

Of course, if you already have enough RAM, more isn’t going to make things any faster, nor will speeding up disk access (except for things like opening files and starting the program, that sort of thing).

On most computers, you can tell whether the hard drive is the bottleneck simply by listening. If you hear the disk going during an operation like despeckle or any other image filtering, then a faster disk or more RAM will help (more RAM will make a much bigger improvement than a faster disk). If the disks are so quiet that you never hear them, usually there’s a light on the front of the computer that lights up when the disk is accessed.

That said, I’m wondering just how big the "very large file of an old picture" the original poster is talking about actually is. With half a gigabyte of RAM, it’d have to be pretty darn big to cause disk swapping. I’d have to say, under normal circumstances, I’d certainly expect the 2.1 GHz CPU machine to be faster than the 900 MHz one.

For any "normal" picture size, all of the image ought to fit in RAM with plenty of room to spare (i.e. to store undo information), and the 2.1 GHz CPU should have faster memory access (the original poster doesn’t say how fast the FSB/memory is on the 900 MHz machine), not to mention faster processing, all of which should add up to a significant improvement in speed. The only thing that makes sense, given the information given so far, is that the image is so big it IS causing disk activity during despeckling, but that’d have to be a *really* big image.

Pete
CS
Chuck Snyder
Sep 8, 2003
Pete, I have a pretty marginal computer by current standards (850 MHz processor, 512 MB RAM), and there are several filters that really seem to take a long time to do their thing (Despeckle is one, smart noise is another). I also find that if I’ve had a lot of pictures open, even if I close them, the video memory isn’t necessarily freed up. If I try to use some of those ‘slow’ filters after viewing lots of pix, the time it takes for them to complete their tasks is very long indeed – and sometimes results in a freeze-up (should also mention that I have Win 98SE, which has its own set of issues). Do you suppose some of these filters are highly iterative in nature and get ‘compute-bound’ on a resource-limited machine? I have no knowledge whatever of programming or hardware, so I’m diagnosing based on the old ‘what’s changed’ approach I learned as an engineer…

Thanks, Chuck
PD
Peter Duniho
Sep 8, 2003
"Chuck Snyder" wrote in message
[…] If I try to use some of those ‘slow’ filters after viewing lots of pix, the time it takes for them to complete their tasks is very long indeed – and sometimes results in a freeze-up (should also mention that I have Win 98SE, which has its own set of issues).

Indeed Win98SE does. Not as bad as WinME though, so there’s a little Pollyanna opportunity for you there. 🙂

Do you suppose some of these filters are highly iterative in nature and get ‘compute-bound’ on a resource-limited machine?

The biggest question is "is there disk activity when running the filters?" A second question is "what happens if you run the same filter on the same image twice?"

You wrote "I also find that if I’ve had a lot of pictures open, that even if I close them, the video memory isn’t necessarily freed up." What are you using to determine whether memory has been freed or not? Also, do you really mean "video memory"? In PC jargon, "video memory" refers to the RAM that’s installed on the video card. It’s not used by programs like Elements, except inasmuch as Elements — like any other program — indirectly causes data to be written to the video memory when it asks the operating system to display something on the monitor.

Some thoughts on what you may be seeing (without a very detailed description of the problem, all I can do is guess, and even with a detailed description, without being there, I still can’t make any really precise suggestions)…

Generally speaking, the number of images you have open should not affect the performance of any given filter. The exception to this is that if you use a filter on an image after you have spent some time working on *different* images, you may find that some, most, or even all of the data for that image has been "swapped out" to the hard disk while you were working on those other images.

The consequence of this would be that, as the filter works its way through the image, all of that data needs to be read back into the RAM so that the CPU can do the filter’s processing on the data. To make matters worse, the data for the other images (the ones you were working on before using the filter on the active image) may have to be written back out to the disk to make room for the image being processed. Finally, generally speaking the operating system won’t just start reading all of the image back in all at once; it’ll do it piecemeal as the filter works its way through the image, which is about the slowest way to read data from the disk.

So, going back to those first two questions: if there is disk activity while the filter is running, then you are almost certainly seeing a disk swapping issue. If you can run the filter two times in a row and get different results — poor performance the first time, better performance the second — then this would reinforce the theory that you’re low on RAM (especially if there’s a noticeable decrease in disk activity the second time). The reason being that, the first time through the image, you bring as much of the image into RAM as possible (possibly all of it), and so the second time through, the only memory usage that image is competing with is itself (assuming it doesn’t all fit at once), rather than the other images you were working on (since they all got swapped out to disk the first time you ran the filter).

Now, this could apply even if you’ve closed the other images. It really just depends on what order you do things. Even when you close the other images, that will not cause the operating system to read the data for the remaining image back into RAM. The only thing that will do that is to actually work on the image again. Basically, if something’s been written to disk, the operating system won’t read it back in until the very last minute; that is, not until it actually needs that data again.

So working on those other images could push the first image’s data out to disk, where it will sit until you start editing that image again. Even closing the images will not change that state of affairs.

Okay, so that’s my "don’t have a lot of information, but here’s a thought" reply. A few other comments before I go:

* All of the above assumes a bug-free program. There’s a programming error called a "memory leak" that happens when the user of the program is finished with a particular chunk of data, but the program does not actually tell the operating system that it’s done with it. There’s a variety of ways to make this error, from a programming point of view, but broadly it falls into two classes: in some cases, the program still knows about the data and is hanging on to it for no good reason, whereas in others, the program itself has actually forgotten about the data, even though it didn’t tell the OS it’s done with it. My experience has been that the quality of Adobe software is reasonably high, but even they are not immune to accidents. So, there could be a memory leak in Elements that is causing the problem you’re seeing. Depending on the nature of the bug, you may or may not see disk swapping along with it. The second class is actually a little better (if you’re going to have such a bug, that is): since the program has forgotten about the memory, it will eventually get written to the disk, and at that point will not cause any further performance problems (assuming you have enough disk space, of course 🙂 ).

* You ask about whether "some of these filters are highly iterative in nature and get ‘compute-bound’ on a resource-limited machine". I’m not really sure what you’re asking there, since I don’t know your definition of "resource-limited". However, there are two main bottlenecks with respect to these filters: CPU/memory; and disk. If the entire image can fit into RAM *and* is already resident in RAM before running the filter (that is, there haven’t been other competing pieces of data in use), the disk should not be a bottleneck. If you’ve solved the RAM issue (which is mostly what I was writing about in this message), there is still the question of how fast the CPU can get through the image. This depends on two different things, either of which could be the bottleneck: memory access speed (which determines how quickly the CPU can read a piece of data from the system RAM); and CPU speed (which determines how long the CPU will take to process each pixel once it’s gotten the data from RAM).

Pretty much *all* of the filters I see in Elements (and I assume Photoshop) are indeed "highly iterative in nature" (by definition since, after all, they iterate through the entire image one pixel at a time 🙂 ). As such, they are definitely "compute-bound" (that is, dependent on the processor’s ability to process data), except when they are "disk-bound" (which is what happens when the image is not entirely in RAM when the filtering starts).

Bottom line, things that will affect how long it takes to filter an image:

* The size of the image. This is a major factor. Each time the image height and width double, the amount of data to process quadruples. Data size grows with the square of the image’s linear dimensions, and data size affects the time it takes to filter an image linearly (thank goodness it’s only linear…there are some computational problems where increases in data cause the time to increase with the square or cube of the data size 🙂 ). Of course, normally you don’t have much control over this. You selected the size of the image because that’s the number of pixels you need, and you have to live with that.

* Amount of RAM. This is a major factor, and in fact has the potential to completely swamp any other factors if your amount of RAM is small enough. A faster hard drive will help a little, but the real solution is to have enough RAM installed in the machine that the data doesn’t have to visit the hard drive much, if at all. Keeping all the data in RAM rather than swapping it to disk can improve processing time by several orders of magnitude in some cases.

* Disk fragmentation. This is related to the disk swapping and disk speed. If you have enough RAM and aren’t swapping, disk fragmentation doesn’t matter. But if you ARE swapping, having a fragmented disk can make an already bad situation MUCH worse. Just for example: I do a bit of video editing, and the difference in time for transcoding (changing from one format to another) a video file can be along the lines of 3 minutes for an unfragmented file versus 30 minutes for a fragmented one. In the same way that having enough RAM makes the difference between being CPU-bound and being disk-bound, the state of fragmentation on your disk makes the difference between waiting on data to be transferred from disk to RAM, versus waiting on the disk head to move back and forth around the disk (the former is MUCH faster than the latter; even though moving data to and from the disk is still a lot slower than moving data between RAM and the CPU, it’s WAY faster than the physical motion of the read/write head on a disk drive).

* CPU speed. Once you’ve solved the disk/data size issues, the CPU speed is probably the next big factor. However, it takes a large change in CPU speed to see a difference in processing speed. A 2 GHz CPU isn’t going to get through your image in half the time a 1 GHz CPU will. Best case, you would see a 30-40% decrease in time spent, and depending on a variety of other issues, you may not even see that much. Even so, a faster CPU is always helpful.

* Memory access speed. Most computers are designed so that the RAM access is appropriate for the speed of the CPU. However, it certainly is possible to find a computer built with components that aren’t exactly right for each other, where even though the CPU is fast, the components that handle moving data between the CPU and the RAM cannot keep up, leaving the CPU starved for something to do when dealing with computationally intensive tasks like image editing. "Front side bus" speed is what you care about here, along with the type of memory interface (right now "double data rate" combined with "dual channel" will provide the fastest memory access for a given front-side bus speed). Unless I had a computer that was under-performing by a ridiculous amount (e.g. a 2 GHz machine that’s actually slower than a comparable 1 GHz machine), I wouldn’t spend much time worrying about memory access speed, since chances are the computer is put together right.
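The image-size arithmetic above is easy to sketch in code (a minimal Python illustration with made-up dimensions, just to show the scaling; Elements itself isn’t involved):

```python
def image_bytes(width, height, bytes_per_pixel=3):
    """Uncompressed size of a 24-bit RGB image: width x height x 3 bytes."""
    return width * height * bytes_per_pixel

small = image_bytes(1500, 1000)  # 4,500,000 bytes (~4.3 MB)
big = image_bytes(3000, 2000)    # same picture, doubled in each dimension

# Doubling both height and width quadruples the data a filter must touch:
print(big // small)  # 4
```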

Sorry, I guess even the bottom line got kind of long. Hopefully somewhere in there is an answer to your question. 🙂

Pete
CS
Chuck Snyder
Sep 8, 2003
Pete, thanks very much. I’m going to print this one out so I can absorb it piece by piece!

To pursue one thought a little further, and it’s a little OT to the filter question, the phenomenon I’m seeing regarding opening and closing pictures works like this: I open pictures, either with Elements or more often Irfanview, and then close them. If I do this enough times, especially with Irfanview, the pictures will eventually start coming up all black in the Irfanview viewer – as if there wasn’t enough room left to view the whole picture. If I keep doing that, the display will eventually freeze – and it’s more than the display, because the computer won’t respond to CTRL-ALT-DEL or any other keyboard command. At that point, I have to shut down and restart the computer – at which time the pictures will view fine for a while. Does this sound to you like RAM not being freed up?

Chuck
CS
Chuck Snyder
Sep 8, 2003

p.s. to last message: no discernible hard drive activity associated with the phenomenon.
PD
Peter Duniho
Sep 8, 2003
"Chuck Snyder" wrote in message
[…] I open pictures, either with Elements or more often Irfanview, and then close them. If I do this enough times, especially with Irfanview, the pictures will eventually start coming up all black in the Irfanview viewer […] Does this sound to you like RAM not being freed up?

Yes, that sounds like a memory leak to me. If you can reproduce the same problem on an NT-based version of Windows (Windows 2000, XP), I’d say it’s in the programs you’re using. However, with a Win9x version (Windows 95, Windows 98, Windows ME), there’s a small possibility that the leak is actually in the operating system. That possibility moves from "small" to "probable" if you are actually seeing exactly the same behavior in both Elements and Irfanview.

Pete
CS
Chuck Snyder
Sep 8, 2003
Pete, thanks. One of these days I may try the upgrade to XP…although I dread it.

Chuck
JD
Juergen D
Sep 8, 2003
Chuck,

What are you dreading? XP itself or the procedures involved in moving files and set-ups etc? XP is quite friendly and has many little features that come in handy, such as Preview of pictures or the display of thumbnails in Explorer.
Moving files and set-ups is a different story. What I wound up doing is networking the old 98SE machine with the new XP unit. Now file transfers are no longer an issue. And I just left this set-up in place, in other words, I use both machines for certain tasks. For example, I still scan on the 98 machine and subsequently transfer the file to XP. My 2 cents.

Juergen
R
Ray
Sep 8, 2003
Chuck,

If you ever want to upgrade to XP, make sure to draw up a list of all your computer’s peripherals and check with every manufacturer or Microsoft itself to know if they support XP for your particular equipment. I’ve found out after the fact that my scanner wasn’t XP certified and I had to buy a new one.

And processor power is the key to XP. I had a PII 400 and had to change my computer because at that speed, I’d still be celebrating Xmas 2001 this year 🙂 Finally, there’s the memory issue (no, no, not the leaking..!) Several apps running under Win98 will require double the amount of RAM under XP.

All this being said, I encourage you to consider it. XP is far more stable than any previous OS Microsoft has ever made. Or better, get a Mac 😉

Ray
R
rjphoto
Sep 9, 2003
Back to the first question. Disk speed – both 7200 RPM hard drives; FSB 133 on the slow CPU, 333 with the fast CPU; file size 23 MB. Yes, there is hard drive activity. PC133 memory on the slow CPU, 400 DDR memory with the fast CPU. Re the memory leak – both machines were tested after a fresh reboot with the only program open being PSE2 (& all the regulars like NAV – both machines similar). Both machines have had a recent defrag.

Yes, I would expect the faster CPU with the faster memory to do the task faster, but it didn’t!!! Even opening the JPEG file took the same time, with the file on the faster machine and the slower machine getting the file off a 100 Mb network.
PD
Peter Duniho
Sep 9, 2003
"rjphoto" wrote in message
Back to the first question. Disk speed – both 7200 RPM hard drives; FSB 133 on the slow CPU, 333 with the fast CPU; file size 23 MB. Yes, there is hard drive activity

What is the actual resolution of the image? 23 MB is pretty big for a JPEG, and the uncompressed image (which is what winds up in memory) is likely MUCH larger than that. You can either do the math yourself (for a 24-bit image, total bytes is the width times the height times three), or just reply here with the size and I’ll figure it out. 🙂
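Pete’s rule of thumb can be written out (a Python sketch, purely illustrative; the 8000 x 10000 resolution is a hypothetical stand-in, since the poster hadn’t yet given the real one):

```python
def uncompressed_mb(width, height, bytes_per_pixel=3):
    """RAM needed for a 24-bit image: width * height * 3 bytes, in MB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

# A hypothetical 8000 x 10000 pixel scan:
print(f"{uncompressed_mb(8000, 10000):.0f} MB")  # 229 MB
```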

The fact that you get hard drive activity (i.e. there’s disk swapping going on) leads me to believe that the uncompressed image winds up over 256 MB or so. You’ve got 512 MB of RAM in each machine, but the operating system and Photoshop Elements both will occupy a large chunk of that. Combined, it IS possible they exceed 256 MB, which would leave less RAM available than the image takes. Any editing at all just makes things worse, because to support "undo", copies of any changed portions of the image need to also be kept in memory. Any full-image filtering will require a complete copy of the image to be kept. Even if the image only decompressed to 128 MB, that would add up fast.

There are things one can do to reduce the memory usage of both the operating system, and possibly of Elements, but frankly, RAM is cheap these days and the easiest fix is to just add RAM. If you’re opening 23 MB JPEG files and then editing them, I think you could easily justify adding another 512 MB of RAM to either machine, or both.

Pete
R
Ray
Sep 9, 2003
I don’t know if it’s been suggested yet (I lost track of this thread…), but have you adjusted Elements’ memory usage in the preferences? It might be a good thing to give it more room to breathe, sort of..

Ray
PD
Peter Duniho
Sep 9, 2003
"Ray" wrote in message
I don’t know if it’s been suggested yet (I lost track of this thread…), but have you adjusted Elements’ memory usage in the preferences? It might be a good thing to give it more room to breathe, sort of.

I don’t think it has come up, so thanks for mentioning it. I didn’t even realize you could adjust Elements’ memory usage. I’ll have to go take a look at what that does. Depending on what it actually controls, it may or may not improve things. One thing that might be useful is to reduce the depth of the undo buffer, so if that’s one of the things it controls, that could be handy.

Pete
JF
Jodi Frye
Sep 10, 2003
Yes, the History States setting is a memory hog, and the default of 20 is set for a reason. This is why it’s so important to use layers every time ya want to try something new on your image, even if it’s just a pinch of sky. Trashing a layer works much better than choking the ‘undo’.
R
rjphoto
Sep 15, 2003
I have continued the experiment, & it was an excuse to buy more RAM. The faster machine, with faster memory, now has 1 GB RAM (333 FSB, 400 MHz DDR RAM). I opened this very large JPEG (that I got from a mistake during scanning).
Times for each task – I’ll put the 512 MB results in parentheses.
Open: 10 sec (70 sec); Despeckle: 55 sec (1 min 55 sec); Dust & Scratches filter: 2 min 20 sec (3 min 50 sec). I also set preferences to 75% RAM, so PSE had about 750 MB of RAM to use.
My conclusion – the limiting factor between the two machines was the cache (scratch) file, and since the hard drives had the same performance, I got the same performance even though the CPU, memory, and benchmarks were much better on the faster machine. Oh, BTW, the pic was about 12,000 by 12,000 pixels. Science progresses by serendipitous observations (only wish this was science).
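Plugging the reported 12,000 x 12,000 pixels into the width-times-height-times-three rule from earlier in the thread shows why 512 MB wasn’t enough (a rough sketch; it ignores undo copies and the overhead of the OS and Elements themselves, which only make things worse):

```python
width = height = 12_000  # the scanned image's pixel dimensions
bytes_per_pixel = 3      # 24-bit RGB

size_mb = width * height * bytes_per_pixel / (1024 * 1024)
print(f"{size_mb:.0f} MB uncompressed")  # 412 MB

# ~412 MB can't fit in what's left of 512 MB once the OS and PSE2
# take their share, so both machines hit the scratch disk; with 1 GB
# (and ~750 MB granted to PSE) the image finally fits in RAM.
```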
rj
