#1 |
Subversive filth of the hedonistic decadent West
Join Date: Mar 2003
Location: Southeast Florida
Posts: 27,936
Help with tar: anyone know the correct command to make a segmented tar file?
I'm moving a bit over 10 gigs from server to server.
Problem is, when I go to tar the directory it seems to get stuck at 2 gigs. Seems like there should be a way to have it make segments. I tried tar -L2000000 -cf archive.tar directoryname, but this seems to want me to insert the next tape after it finishes the first segment. Anyone know the correct command to make a segmented tar file?
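One approach that might do it here, assuming GNU tar and split are on the box (directoryname and the archive.tar.part- names are just placeholders), is to pipe tar into split so each piece stays under the 2 gig mark:

# write the archive to stdout and let split chop it into ~1.9 gig pieces
tar -cf - directoryname | split -b 1900m - archive.tar.part-

# on the other end, stitch the pieces back together and unpack
cat archive.tar.part-* | tar -xf -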
#2 |
Bonged
Join Date: Mar 2003
Location: BrisVegas, AUSTRALIA
Posts: 4,882
not sure on that one Cleo...
maybe the -k modifier limits the size in K? DD
__________________
Old Dollars >>>> Now with over 90 Hosted Free Sites <<<< DangerDave.com.au - Adult Links to Free Porn |
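For what it's worth, GNU tar's length option is -L (in kilobytes), and pairing it with -M plus several -f arguments is supposed to write each volume to the next file instead of prompting for a new tape. A rough sketch, assuming GNU tar (the part names are placeholders):

# multi-volume archive split at roughly 1.9 gigs per file
tar -c -M -L 1900000 -f part1.tar -f part2.tar -f part3.tar directoryname

The catch is that the pieces only make sense together: extracting needs -M and the same list of -f files again.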
#3 |
Subversive filth of the hedonistic decadent West
Join Date: Mar 2003
Location: Southeast Florida
Posts: 27,936
I tried the -k but it didn't seem to work.
It sure would be easier if it would just let me create a 10 gig tarball. LOL
#4 |
Subversive filth of the hedonistic decadent West
Join Date: Mar 2003
Location: Southeast Florida
Posts: 27,936
Amazing what a night's sleep does for one's mind. LOL
df -k showed me the reason that I can't make a tarball big enough: there is only just over 2 gigs free on the drive. Now what…
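If the drive is the bottleneck, one way around it (just a sketch, assuming ssh works between the two boxes; USER, otherhost and the paths are placeholders) is to skip the local tarball completely and pipe tar straight over ssh:

# check what's actually free first (-h prints human-readable sizes)
df -h /home

# stream the directory across without ever writing an archive to disk
tar -cf - directoryname | ssh USER@otherhost 'cd /destination && tar -xf -'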
#5 |
Bonged
Join Date: Mar 2003
Location: BrisVegas, AUSTRALIA
Posts: 4,882
DD
__________________
Old Dollars >>>> Now with over 90 Hosted Free Sites <<<< DangerDave.com.au - Adult Links to Free Porn |
#6 |
Subversive filth of the hedonistic decadent West
Join Date: Mar 2003
Location: Southeast Florida
Posts: 27,936
This seems to be working. Amber found a page on the scp command and posted it in this morning's coffee thread.
scp -r USERNAME@foxyangel.com:DIRECTORYTOGET DIRECTORYONMYSERVER/NEWHOME

It has been going since this morning. On a 2 gig file it was doing just under 6 megs per second, but on these little html and jpg files I'm only getting a bit over 1 meg per second. Still, it is transferring the whole domain to one of my servers for safekeeping until M3 Server gets her new dedicated server online. The whole site was done in fucking FrontPage and has duplicates of duplicates of duplicates, with files and directories thrown all over the place. It looks like someone got drunk and puked up a web site. Luna is going to be doing the web work on this domain once the new server is up and running sometime next week. I'm glad that I'm not her. LOL
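If the small-file speed is the bother, scp's -C flag turns on compression, which can sometimes help with piles of little html files (same placeholder names as above, just a sketch):

scp -rC USERNAME@foxyangel.com:DIRECTORYTOGET DIRECTORYONMYSERVER/NEWHOME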
#7 |
Whatever don't kill ya makes ya stronger...
Glad I could help..
Hope all is well with it. PS: HomeSite does a very nice job fixing FrontPage html..not that anything short of hand-coding will totally fix it, but it's a start..it gets rid of a lot of the junk..
#8 |
Subversive filth of the hedonistic decadent West
Join Date: Mar 2003
Location: Southeast Florida
Posts: 27,936
The pages aren't so bad, other than they all have no-right-click code in them. The real problem is the way FrontPage manages a site and leaves copies and copies of stuff all over the place.
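For what it's worth, a lot of that clutter lives in FrontPage's _vti_* metadata directories, so something like this (a sketch, run from the site root) at least shows where they all are before deciding what to delete:

# list FrontPage's _vti_* metadata directories
find . -type d -name '_vti_*'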