Unsolved
22 Posts
0
37197
Delayed write failure with firewire hard drive
I've been sifting through the myriad of posts describing the Delayed Write Failure that occurs on several laptops - including the Inspiron 8200 and 8500 - and even desktops. The symptoms appear when transferring files to a firewire hard drive, whatever the brand (Maxtor, AcomData, Western Digital): the transfer stops, the delayed write failure message pops up, and the firewire port becomes disabled.
None of the proposed solutions, whether GAP SP2 or disabling write caching, works. The problem isn't that it's third-party hardware and "their" fault; it's Dell's fault. Why can't Dell own up to the mistake - at least admit it's a problem, and maybe even do what good customer service should do and fix it?
Entropy42
529 Posts
0
September 5th, 2003 12:00
1) I've noticed it SOMETIMES happens under Linux too, although less severely. I've occasionally had a pause of 5-15 seconds the first time I write to the drive after mounting it. But unlike Windows, Linux recovers from the error and the write succeeds after those 5-15 seconds. So far it has never happened more than once per mounting of the drive. Most likely it's an unrelated bug in the Linux SBP2 implementation.
2) Not necessarily. The only time I had it happen that I remember what I was doing, all I did was rename a directory.
3) Not ATI-specific, my desktop has a GeForce.
6) I'll have to try fiddling with this, but the problem is so rare that it would be hard to tell.
7) I also use one huge FAT32 partition, but this is probably unrelated.
8) PCI devices are (theoretically) supposed to be able to share IRQs without problems. 99.9% of the time, they do. You might notice that on Win2k systems, often *every single PCI device in your system* will be on IRQ 11. This is the case on my desktop, and has never been a problem except for with a not-quite-PCI-compliant PCI-to-PCMCIA bridge.
I8KOwner
2 Posts
0
September 9th, 2003 13:00
Sorry for the delayed response. I've been busy in the hospital.
To answer the questions of ghostdunks and entropy:
When I looked at the entries in the Event Viewer, I found that the predominant error was of type 51, with the following description:
"An error was detected on device \Device\Harddisk2\D during a paging operation.
For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp."
I also had a few scattered errors of types 9, 50, and 57, but the vast majority were type 51.
The problem does not appear to be related to the video card type, as I have a GeForce-based GPU. I am running WinXP Pro SP1, with multiple NTFS partitions on my internal drive. My external firewire drive is formatted as 2 NTFS partitions. The IRQ issue is still a possibility, as I remember that under W2K I could not connect my firewire drive and my 802.11b wireless ethernet PCMCIA card simultaneously. The computer would generate an error saying there were not enough free IRQ resources. Under WinXP Pro, however, I am able to connect 2 PCMCIA cards and my firewire drive all at the same time, without any such errors.
Of note, I have not had a single delayed write failure since I manually increased my page file size as described in my initial post:
"(Right-click My Computer, select Properties, Advanced tab, Settings button under Performance, Advanced tab under Performance Options, Change button under Virtual Memory.) Click the Custom Size option and multiply both the Initial and Maximum size values by 1.5. (For my system with 384 MB RAM, the new values became 576 and 1152.)"
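For anyone who wants to check the arithmetic, here is a minimal sketch of I8KOwner's rule of thumb. The multipliers (initial = 1.5 x RAM, maximum = 2 x initial) are an inference from the 576/1152 figures quoted above for a 384 MB system, not an official Microsoft recommendation:

```python
def pagefile_sizes(ram_mb):
    """Suggested pagefile sizes in MB, following the rule of thumb above.

    initial = 1.5 x installed RAM; maximum = 2 x initial. These multipliers
    are inferred from the 384 MB -> 576/1152 example, so treat them as a
    starting point, not a hard rule.
    """
    initial = int(ram_mb * 1.5)
    maximum = initial * 2
    return initial, maximum

print(pagefile_sizes(384))  # -> (576, 1152), matching the values quoted above
```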
Hope this helps.
Good luck, all.
I8KOwner
ghostdunks
33 Posts
0
September 10th, 2003 08:00
I am now convinced that my problem was caused by my virtual memory settings as well (similar to I8KOwner). After switching from one fixed-size page file of 984 MB to a system-managed page file, I have not had a single "delayed write failure". I have noticed, however, that the system seems a lot more sluggish and internal HD access has increased (probably due to the new page file growing and shrinking), regardless of whether I have the external firewire drive connected or not.
chrishelps
59 Posts
0
September 14th, 2003 02:00
FYI, I too had the delayed write problem, and increasing the page file seems to have fixed it.
Chris
ghostdunks
33 Posts
0
September 16th, 2003 13:00
So as far as I know, setting the pagefile to be system-managed seems to have fixed the problem for me. For other people, just increasing the size of the pagefile fixed theirs. When I had increased the pagefile to a fixed 984 MB, the error still occurred, but that might have something to do with the IOPAGELOCKLIMIT as well.
As I like to have fixed-size pagefiles (they don't fragment the HD as much and the system seems to work faster), I'm going to try a larger fixed pagefile without the IOPAGELOCKLIMIT tweak and see if the error still occurs.
joriki
5 Posts
0
October 14th, 2003 02:00
I have this problem with a CMS ABSplus 40 GB firewire drive - 4-to-6-pin adapter and 9 V external power. The drive disconnects itself during formats and backups (after the occasional successful format). I've tried two drives, and the support and customer service at CMS have been great (I bought from them directly), but I'm swapping for a USB 2.0 drive with their USB 2.0 CardBus adapter that is *supposed* to give me USB 2.0 speed through the PCMCIA CardBus slot. Has anybody else gone this route? I'd love to hear that USB 2.0 works through PCMCIA, especially since I thought the PCMCIA bus was the limitation. CMS says it will run at full USB 2.0 speed - guess I'll find out soon.
Shame on Dell for once again refusing to take ownership of an issue with their laptops.
-joriki
jeh14
22 Posts
0
October 14th, 2003 16:00
> Tweaking my system further, it seems that when I have set the IOPAGELOCKLIMIT setting in the registry,
> the error will appear again, regardless of the setting the page file. When I delete the IOPAGELOCKLIMIT
> setting, it seems to be OK.
Unless you're running Win2K RTM (no service packs) or earlier, this is a complete red herring. Windows 2000 SP1 and later, and Windows XP and later, pay no attention to the IoPageLockLimit value. You can define it if you want to, but the system doesn't even know it's there.
> As I like to have fixed size pagefiles(doesn't defragment the hd as much and the system seems to work
> faster),
As long as you keep the default size large enough, there is no issue of fragmenting the HD, no time is spent "expanding and contracting the pagefile", etc., even with dynamically sized pagefiles.
jeh14
22 Posts
0
October 14th, 2003 16:00
> The IRQ issue is still a possibility, as I remember under W2K that I could not connect my firewire drive
> and 802.11b wireless ethernet PCMCIA card simultaneously. The computer would generate an error,
> saying there were not enough free IRQ resources.
That's because PCMCIA cards are basically ISA cards. They CAN'T share IRQs with other devices.
keyboarduy
22 Posts
0
October 14th, 2003 16:00
eugalaka
1 Message
0
October 21st, 2003 20:00
My pagefile has always been set to 512 MB for both min and max. I can't quite pin down when the problem went away, but in my case it was never related to the pagefile, since that's been set to a static 512 from the first day I bought the machine.
The only thing I can think of is that I no longer have 2 other devices chained to the HD. When I did have those other drives on there, I would get the delayed write error, along with the firewire drives and devices disappearing from My Computer until the chain was completely unplugged and reconnected.
Sorry to add to the confusion...
dbrooksmusic
2 Posts
0
October 22nd, 2003 23:00
Hey everybody, I've been having the same problem with my firewire drive (Wiebetech with Oxford chip & Maxtor 120 GB). When it happens to me, it corrupts the master boot record and the data is hosed. Is anyone else seeing similar results?
David
i8100, 1.13gHz, 512Mb, GeForce
kknd1967
10 Posts
0
October 31st, 2003 17:00
Big problems when i am using WD2000JB in an external enclosure (Bytecc ME-350F) with my I8100
problems:
1. After partitioning and formatting a 187 GB (binary) partition as NTFS, I can use the drive. However, when the data reaches about 105-120 GB (binary) (I am not sure of the exact point), Win2000 SP3 starts complaining about a file system problem on the drive ("Windows cannot write temp data...") and the "Unplug or Eject Hardware" feature stops responding.
2. Sometimes after rebooting everything is fine: I can still read data on the drive. But everything goes bad as soon as I try to write to it. The drive is not overheating. I used to think the problem came from trying to back up 200 GB at once, but even if I just connect the drive with 110 GB already on it, writing a few files can trigger the problem.
3. The WD2000JB drive itself is good: whenever I put it in my desktop connected to an ATA card, it works fine. My laptop is good: whenever I unplug the 200 GB drive, my I8100 is very stable with two WD1200JB external firewire drives working together. The enclosure is good: whenever I put my spare Maxtor 40 GB 7200 rpm drive inside, it works fine.
4. The WD diagnostic tool reports "cable error" when checking the 200JB whenever it is in the firewire enclosure. My other external drives are fine, and the Maxtor 40 GB in the same enclosure is fine too.
From all this I conclude that my BIOS is too old. Even though I have the ME-350F (supposed to support ATA-7 with 48-bit LBA), it still needs the BIOS to support large drives. Do you think the newest BIOS, A15, would help? Has anyone had a similar experience and found the key to it?
jeh14
22 Posts
0
October 31st, 2003 19:00
First: I am not a moderator - but I'd think this should have gone in a separate thread; it's unrelated to the "delayed write failure" problem.
You are correct to be thinking in terms of 48-bit LBA issues. The boundary is at 128 GB (binary G, 137 GB in "decimal gigabytes"). It is the ATA/ATAPI-6 standard, by the way, that first added support for 48-bit LBA, not ATA-7.
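The 128 GiB / 137 GB boundary falls directly out of the older 28-bit LBA scheme: 2^28 addressable sectors of 512 bytes each. A quick sketch of the arithmetic:

```python
SECTOR_BYTES = 512            # standard ATA sector size
LBA28_SECTORS = 2 ** 28       # sectors addressable with 28-bit LBA

limit_bytes = LBA28_SECTORS * SECTOR_BYTES
print(limit_bytes)            # 137438953472 bytes
print(limit_bytes / 2 ** 30)  # 128.0 "binary" GB (GiB)
print(limit_bytes / 10 ** 9)  # about 137.4 "decimal" GB

# 48-bit LBA (introduced in ATA/ATAPI-6) raises the ceiling to 2**48 sectors,
# which is why drives larger than ~137 GB need support for it at every link
# in the chain - here, in the enclosure's bridge board firmware.
```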
I see several places on the net that claim your enclosure does support large hard drives. However if your particular enclosure was purchased some time ago it might need a firmware upgrade to add this support.
Another possibility is that the enclosure isn't compatible with Western Digital large hard drives. When 48-bit LBA first started to be an issue I saw enclosures sold that worked with one manufacturer's large hard drives but not another's. It looked very confusing.
Your laptop's BIOS has utterly nothing to do with accessing 1394 hard drives, so BIOS upgrades are not an issue.
Similarly for operating system drivers: for a large hard drive on the motherboard's IDE controller, you have to be running a version of the OS/SP that supports 48-bit LBA in the IDE driver. But a hard drive in a FW enclosure is not an IDE drive as far as the host OS is concerned; the only place in the chain to that drive where IDE commands exist is within the enclosure, between the bridge board and the drive. In XP, at least, 1394 hard drives are handled by the regular "disk.sys" driver and the driver for the 1394 host controller; the driver for the IDE controller isn't involved.
I realize that this doesn't give you a definitive answer, but at least you know where not to look (laptop BIOS).
ghostdunks
33 Posts
0
November 3rd, 2003 04:00
I agree with jeh14 regarding kknd1967's problems. Based on the results he's been getting with the enclosure, it sure points to the enclosure not supporting drives over 130 GB.
With regards to my own delayed write failure problems:
I finally figured out that my original enclosure (a New Motion USB2/Firewire external enclosure) was the cause of most of the delayed write errors. I ran the enclosure without the top on, and whenever the drive (a Seagate 120 GB) started beeping softly, I would get a delayed write error. The drive tested fine in a normal desktop, and when I put a brand new Seagate HD into the enclosure, the same thing happened. The enclosure has gone back to be replaced now, so I bought another one.
This time it's a Constar USB2/Firewire combo external (also using the Oxford 911 chipset, same as the New Motion). I still get delayed write errors, but much less often now. And when I do get the errors, the drive doesn't beep, so it must be something else.
I've actually got a reliable way of triggering the delayed write error now. I've got a RAR archive of MSDN for July 2003. The RAR is just over 2.0 GB in size and contains thousands of files; it's currently sitting on the external HD. When I extract all the files from the archive into a directory on the external drive, I inevitably get the Delayed Write error. When I extract them into a directory on my internal HD, it's fine.
Also, I've recently bought an external Sony DVD burner capable of burning at 4x (over Firewire/USB2). So far, when I burn data files at 4x from the external HD, I will usually also get the Delayed Write failure on the HD, which obviously causes an error in the burn. The DVD writer itself is still fine, though. If I burn data files at 4x from the internal HD, it's fine, so it seems to be related to the firewire HD. If I burn at 1x, it's fine regardless of whether the files come from the external or internal HD.
Any more ideas?
ryri
360 Posts
0
November 3rd, 2003 05:00
hi folks
i just picked up on this topic, after coming into this forum to respond to another.
I used to have an Oxford 911 drive and had similar problems. It would work fine for a while, but almost always in the middle of a large transfer the drive would spontaneously shut itself off and then turn back on, causing the heads to make a clunk sound. WinXP would then lose the connection to the drive, and the transfer would fail. Oddly, the drive worked for about 2 months before Windows started complaining.
None of you have mentioned the sounds or resets of the drive that I get, but I add this in case it's helpful in diagnosing the problem.
Someone should alert MS to the problem, and to the solution. Increasing the page file size to correct a firewire I/O error is ridiculous. The page file only stores data that won't fit in RAM or isn't currently needed there. It shouldn't have anything to do with file transfers, unless Windows is somehow trying to cache the entire file before transferring it (which would fill RAM, and then the page file). So it seems like a bug somewhere.
Actually, when I think about it, it was usually with large files that I had the problem. Has anyone who increased the pagefile size tried transferring a humongous file (>5 GB)?