I have the last inum that was being written before the failure, so I am attempting to run a find in the directory it was working on. However, that fails too.
The Veritas guys were not clear on how this command should be run. I am currently using a mounted directory on a Unix client. Will this be able to find a specific inode on the NetApp?
Simon
-----Original Message-----
From: Steve Losen [mailto:scl@sasha.acc.virginia.edu]
Sent: 01 May 2002 14:55
To: Clawson, Simon
Cc: 'toasters@mathworks.com'
Subject: Re: File name length and veritas
Hi!
I have been told by Veritas that I have a file name or path that is longer than 1024 characters. Apparently this will cause problems with NDMP.
They have told me to do the following:

    find . -inum 5942788 -print
But this just gives an error due to the value being too great. Has anyone seen this before? I know this might seem like Unix 101, so sorry...
Unix typically limits a path name (filename/filename/filename/ ...) to 1024 bytes.
However, it is possible and entirely legal to create directory structures that are so deep that the absolute path name to a file is longer than 1024. To get around this, you usually "cd" down into the directory tree and use relative path names.
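For example, assuming the deep tree lives somewhere under a hypothetical mount point like /mnt/filer/home/user1, you can descend partway with cd first so that every path the tools handle stays short:

    # Hypothetical paths; substitute the real location of the deep tree.
    cd /mnt/filer/home/user1/projects    # descend partway with cd
    cd build/output/nested               # further relative cd's keep each path short
    find . -print                        # results are now printed relative to "."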
But many dump utilities insist on listing each file as an absolute path name, and this is usually limited to 1024 bytes.
Evidently "find" is also unhappy about your long file name.
If you have any deep directory trees, or any directories with long names, then I suggest that you cd into the directory and run find . -inum ... from there.
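As a rough sketch (the inode number is the one from your Veritas error; the directory path is hypothetical):

    cd /mnt/filer/home/user1/suspect_dir   # hypothetical: the directory dump was working on
    find . -inum 5942788 -print            # relative output keeps each path under 1024 bytes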
It's possible that someone may have inadvertently created a very deep directory tree due to a programming error of some sort. It's very easy to do:
    while :
    do
        mkdir trouble
        cd trouble
    done
This will create trouble/trouble/trouble/trouble/ ... until the user hits a disk quota or the file system runs out of resources. But before that happens, this could go thousands of directories deep.
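If you want to check whether something like this has happened, one rough approach (assuming a reasonably standard find and awk) is to count the slashes in each directory path and report the deepest one. Depending on the find implementation it may still complain once the paths get extremely long, but the output up to that point should show where the runaway tree starts.

    # Print the depth and name of the deepest directory under the current one.
    find . -type d -print | awk -F/ 'NF > max { max = NF; deepest = $0 } END { print max, deepest }'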
Steve Losen scl@virginia.edu phone: 434-924-0640
University of Virginia ITC Unix Support
I have the last inum that was being written before the failure, so I am attempting to run a find in the directory it was working on. However, that fails too.
Maybe you have a very deep directory tree, possibly created by mistake. I would cd to the last directory dump was working on and run "find . -print" and look for something fishy like a long string of directories. You'll probably have to rm -r the whole mess. Talk to the owner first, of course.
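One quick way to spot the fishy entries (again assuming standard find, awk, and sort) is to sort the output by path name length and look at the longest ones:

    # List the five longest path names under the current directory.
    find . -print | awk '{ print length($0), $0 }' | sort -n | tail -5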
Steve Losen scl@virginia.edu phone: 434-924-0640
University of Virginia ITC Unix Support