Photoshop PC won’t save to fibre channel RAID array (Mac Photoshop will)

AK
Posted By
Andrew_Kinnie
Jul 27, 2004
Views
452
Replies
21
Status
Closed
Greetings,

We are running Photoshop 7 on Windows XP SP1 (all updates done).

We have a fibre channel based RAID system for our online storage, and it works fine in every application except Photoshop for Windows. Our server is running Windows 2000 Server SP4, and the drive is 1.37 TB (with over 1 TB free). When we try to save in Photoshop, we get a disk full error.

We have tried four different XP boxes. Our Mac Powerbooks (two of them) can save to the drive fine.

Any ideas?

Andrew


AK
Andrew_Kinnie
Jul 27, 2004
Hmm. That’s a bit nutty. OK, well are there issues with InDesign 2 and Photoshop CS that I should know about before upgrading?
B
BobLevine
Jul 27, 2004
You do mean ID CS right?

There are plenty of issues with every piece of software, but if you’re only referring to the terabyte problem, then no, there is nothing I can think of.

Bob
AK
Andrew_Kinnie
Jul 27, 2004
No I mean ID 2. We have ID 2, and would like to know if we’re going to have any silly problems with ID 2 working with Photoshop CS files, or will we need to upgrade that to CS too. A bit obnoxious, as this is a ridiculous problem.
B
BobLevine
Jul 27, 2004
It’s not ridiculous. How many terabyte drives were out there when PS 7.0 was released?

No, you shouldn’t have any problems working with ID 2 with PS CS files. But I would strongly encourage you to upgrade to ID CS. There are some excellent enhancements.

Bob
AK
Andrew_Kinnie
Jul 27, 2004
Hmmm. Maybe. There WERE multiple hundred gigabyte drives, so it’s hardly a stretch to imagine that companies would have RAID storage and as there is no operating system problem, I would disagree. Yes, it’s ridiculous.

In any event, hopefully upgrading photoshop alone won’t cause other silliness.
DM
dave_milbut
Jul 27, 2004
That’s a bit nutty

from what I understand it was a problem with a microsoft api. adobe worked around the issue in CS.
B
BobLevine
Jul 27, 2004
Well, agree or disagree, it doesn’t matter. It is what it is.

Bob
PC
Pierre_Courtejoie
Jul 27, 2004
Andrew, PS 7 shipped in April 2002. At the time, the biggest hard drive was the IBM 120GXP, which held 120 GB. You needed 9 or 10(!) of them to reach a terabyte. I don’t think terabyte arrays were as common then as they are now, when three 400 GB disks are enough…
And as you were told and noticed yourself, the problem is windows specific. It does not come from Adobe.
AK
Andrew_Kinnie
Jul 27, 2004
no, they weren’t so common, but they were certainly something that should have been anticipated. RAID systems have been around for a while, and as Photoshop is an app that regularly deals with massive files, it is silly for them to have programmed into Photoshop a limitation that does not exist in the rest of the OS, or for any other app.

Is this really unforeseeable? When the app was released, individual drives were already big enough that 9 of them would exceed the arbitrary maximum set in the app? When RAID systems were available that already exceeded it?

If windows couldn’t handle it, that would be one thing (also dumb IMHO), but for an app that uses huge chunks of space to set an arbitrary maximum that close to what is already available (even without raid systems that existed at that time beyond that maximum), when there is no such hardware or OS limitation warranting it, is…well…ridiculous.
B
BobLevine
Jul 27, 2004
And now go back a year or more before it was released. Were terabyte capacities common then? Do you think that this software is magically created the day before it’s released?

Bob
DM
dave_milbut
Jul 27, 2004
the api calls were in place (in windows) for accessing the large drives. photoshop took advantage of them. they had bugs in them that weren’t discovered until multi-terabyte disks (or arrays) became more common. what’s ridiculous is you going on about it. just upgrade or forget it. it’s old news.
AK
Andrew_Kinnie
Jul 27, 2004
any more ridiculous than you going on about it?
DM
dave_milbut
Jul 27, 2004
erm, yea.

um, I know you are but what am I? :Þ
AK
Andrew_Kinnie
Jul 28, 2004
heh. I don’t know how to do a raspberry.
G
graffiti
Jul 28, 2004
but they were certainly something that should have been anticipated.

Hard to test an anticipation ain’t it?
DM
dave_milbut
Jul 28, 2004
I don’t know how to do a raspberry.

it’s a thorn.

<http://www.plexoft.com/cgi-bin/thorn.cgi>

as far as i know, microsoft hasn’t patented that one yet.
AK
Andrew_Kinnie
Jul 28, 2004
Hard to test an anticipation ain’t it?

Sigh. You’re right. Two years ago no one could possibly have imagined that there would be > 1 TB drives, and tested on RAID arrays that already existed at that time. Certainly not a company with resources as limited as Adobe. I don’t know what I was thinking.

In any event, IMHO, they should at the very least have a patch available for PS 7… You know, as every other app on windows has no problems whatsoever with these files.

Regardless, I made a statement, and have defended it despite repeated attacks and opposing opinions. My opinion (and that of anyone else who would get annoyed at having to buy an upgrade for numerous seats for such a silly reason) will not change anything, and I am more than willing to agree to disagree (as the first counter poster said).
AK
Andrew_Kinnie
Jul 28, 2004
it’s a thorn

Actually, I wasn’t talking about what you did, but I have to admit I now have less certainty as to your meaning. In any event, I still don’t know how to do a raspberry.
DM
dave_milbut
Jul 28, 2004
Two years ago

two years ago ps7 was nearing the middle of its product life cycle, not its planning stages. we were already hearing rumors of "Dark Matter" (PS CS) 2 years ago. think man. the TB boundary problem was brought up within a few months of ps7’s release, tested and acknowledged by the adobe team. unfortunately the fix was so extensive they couldn’t put it out in a patch. they gave a couple workarounds: partition so that the contiguous space is < 1TB, or temporarily fill the partition with a trash file to keep it less than 1TB. As the drive fills, reduce the size of the "fill file". they fixed it in cs. so any of the above are your options for solving your problem.

let me list them one more time:
1) partition
2) fill with a garbage file
3) upgrade to cs

my advice is upgrade for that and all the great features in CS.

but I have to admit I now have less certainty as to your meaning.

that’s ok, i meant it as a raspberry! 🙂
RH
r_harvey
Jul 28, 2004
Hard to test an anticipation ain’t it?

That’s what programmers do. Most programmers were writing code that was Y2K compliant, fifteen years ahead of schedule.

There were at least two Windows APIs available at the time for file opening; pick the one you like. It’s not really comfortable using a call named CreateFile() to open an existing file; in the DOS days, that would’ve truncated the file to zero-length. Now, about GetDiskFreeSpaceEx()… sigh.
X
Xalinai
Jul 28, 2004
Bob_Levine wrote:

And now go back a year or more before it was released. Were terabyte capacities common then? Do you think that this software is magically created the day before it’s released?

There were filesystems available with multi terabyte capacities, like NTFS.

There was a specification for the variables that return the free space on a disk.

Michael

