LinuxQuestions.org


bodyofabanshee 03-14-2012 01:09 PM

data push from box to removable usb drive dies, then dies, then dies again.
 
An Oracle DBA playing SysAdmin, so not a total idiot, but creeping up on it.

I have a USB drive mounted, and when I use cp to try to get about 70G of data from a staged drive (no RAID) sitting on my Red Hat box onto it, it dies, every damn time.

Any of you brilliant boys have any genius for a girl geek? I'm dying here...

lisle2011 03-14-2012 01:28 PM

Copying to USB Drive
 
Try cpio.
BUT read the man or info file first.
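Something like this for a pass-through copy, just to sketch the idea (I'm assuming the files live under /staging and the USB drive is mounted at /mnt/usb; substitute your real paths):

cd /staging                                          # placeholder source path
find . -depth -print0 | cpio --null -pdmv /mnt/usb   # copy everything to the USB mount

The -p flag puts cpio in pass-through (copy) mode; -d creates directories as needed, -m preserves modification times, and --null pairs with find's -print0 so odd file names don't break things.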

michaelk 03-14-2012 01:42 PM

What type of data: single file, archive, etc.? Any idea how much data is being copied before it dies?
What is the make/model of the drive?
Do you know how it is formatted, i.e. the file system type (NTFS, ext3, etc.)?
Any error messages?

bodyofabanshee 03-14-2012 02:13 PM

Lisle,
Will do, but a cursory glance tells me there may be a problem, since I don't want to untar these files: they then become 300G and my USB drive is 250G.
If nothing else, I'm now exposed to cpio. Thank you for that.

bodyofabanshee 03-14-2012 02:28 PM

What type of data: single file, archive, etc.?

tar'd and gzipped

Any idea how much data is being copied before it dies?

It varies. The last "death" was two files, one a couple hundred M and one about 30G, which died after 5.7.

What is the make/model of the drive?
Vendor: WD Model: 2500BEV External Rev: 1.75

Do you know how it is formatted, i.e. the file system type (NTFS, ext3, etc.)?
It was NTFS, but I ran mke2fs and mounted it with no problem; small stuff goes off and on without a single hitch.

Any error messages?

Yep, all about out of memory, after which the kernel kills the process. (This is a puzzle, because the data is actually a backup that goes from our 0+1 box to the non-RAID staged drive with no problems, night after night. I just can't get it from that staged drive to the USB by tarring/gzipping them to this drive.)

Clearly cp isn't the ticket.
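(For the record, here's roughly how I'm digging those messages out of the kernel log; this assumes the stock syslog setup on a Red Hat box:)

dmesg | grep -i 'out of memory'
grep -i oom /var/log/messages    # standard syslog location on Red Hat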

elfenlied 03-14-2012 10:56 PM

How you formatted the drive with mke2fs determines the maximum file size the partition will support.

For example:

Ext2
Block size:           1 KB   2 KB    4 KB   8 KB
Max. file size:       16 GB  256 GB  2 TB   2 TB
Max. filesystem size: 4 TB   8 TB    16 TB  32 TB

Ext3 has similar limits; see these links for more info:
http://en.wikipedia.org/wiki/Ext2
http://en.wikipedia.org/wiki/Ext3

I'm going to guess you've formatted the disk with a 1 KB block size, and that's why it's dying. NTFS would have worked fine, but you might not have been able to mount the NTFS partition read-write.
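You can check what block size mke2fs actually used without reformatting; tune2fs will report it (I'm assuming the USB partition shows up as /dev/sdb1, so substitute your real device):

tune2fs -l /dev/sdb1 | grep -i 'block size'    # /dev/sdb1 is a stand-in for the USB partition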

bodyofabanshee 03-15-2012 05:06 AM

elfenlied,
Since there's nothing on it, is it possible to rerun mke2fs? The first time, I ran it as follows: mke2fs -j -m 1
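(If I do rerun it, I'm guessing the command would look something like this, forcing 4 KB blocks with -b; /dev/sdb1 is just a stand-in for the real device, and of course this wipes the drive:)

mke2fs -j -m 1 -b 4096 /dev/sdb1    # -b 4096 = 4 KB blocks; device name is a guess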
Also, if I understand the layout of the following:

Block size:           1 KB   2 KB    4 KB   8 KB
Max. file size:       16 GB  256 GB  2 TB   2 TB
Max. filesystem size: 4 TB   8 TB    16 TB  32 TB


and it is a matter of the block size being too small, why would it die at 1G sometimes and other times get all the way to 6G? And would that account for syslog saying the process ran out of memory and was killed?

Thank you for responding to this question.

Zetec 03-15-2012 06:33 AM

Plan "B" could include putting the USB drive onto a workstation and trying to copy the data from the server to a USB drive on another workstation using SCP or something? Just thinking out loud.

bodyofabanshee 03-15-2012 09:53 AM

Zetec,
It may come to that...

Reuti 03-15-2012 10:03 AM

If it's a limit because of the file size, you can try the split command with the -b option and a suitable piece size to write pieces of the file to the USB disk. On the target side you can use cat to concatenate them again, and maybe check the result later with md5sum.
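For example, with invented file names and 1 GB pieces:

split -b 1024m backup.tar.gz /mnt/usb/backup.part.   # writes backup.part.aa, backup.part.ab, ...
# later, on the target machine:
cat backup.part.* > backup.tar.gz
md5sum backup.tar.gz    # compare against the md5sum of the original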

bodyofabanshee 03-15-2012 11:30 AM

Update, or more questions than answers, depending on your point of view:

What is literally happening is that my system is running out of memory. We have plenty of memory for daily use, for moving the backup to the staged drive, etc., but it gets maxed out EVERY time I use cp to try to put these files on the USB drive.

Is there a way to keep cp from journaling or whatever it's doing... just pipe the dang thing?
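(Something like dd instead of cp might be one way to "just pipe it"; this is only a sketch with an invented file name. From what I've read, oflag=direct tells GNU dd to bypass the page cache, which is my guess at what's eating the memory:)

dd if=backup.tar.gz of=/mnt/usb/backup.tar.gz bs=1M oflag=direct   # file name is a placeholder; oflag=direct skips the page cache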

michaelk 03-15-2012 11:34 AM

Post the output of the command:

free -m

